The BBC is Testing an Experimental Neural Interface for Television Remote Control

"Brain Power", Image by Allan Arifo

“Brain Power”, Image by Allan Arifo

Experimental research into using human brainwaves as an alternative form of input to control computers and prosthetic devices has been underway for a number of years. This technology is often referred to as a neural interface or brain-computer interface. The results thus far have generally been promising; a roundup of reports on this research is available on ExtremeTech.com.

Another early-phase neural interface project has been undertaken by the BBC to develop a system that enables a user to mentally select a program from an onscreen television guide. This was reported in a most interesting article entitled The BBC Wants You to Use Your Brain as a Remote Control by Abhimanyu Ghoshal, posted on TheNextWeb.com on June 18, 2015. While still using my keyboard for now, I will sum up, annotate and pose a few questions.

This endeavor, called the Mind Control TV Project, is a joint effort between the BBC’s digital unit and a user experience (“UX”) firm called This Place. In its current form, the input hardware is a headset that can read human brainwave signals. The article contains three pictures of the hardware and software (which is a customized version of the BBC’s iPlayer app normally used for viewing TV shows on the network).

To choose from among the options presented onscreen, the user must “concentrate” on one of them for a few seconds while wearing the headset. A meter in the interface indicates the level of brain activity the user is generating and the “threshold” that must be reached in order to register the selection.
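For readers curious about how such a threshold-and-dwell mechanism might work in software, here is a minimal sketch. This is purely illustrative and not the BBC's actual implementation; the function names, the normalized 0–1 concentration scale, and the specific threshold and dwell values are all assumptions.

```python
import random
import time

# Hypothetical sketch of a threshold-based selection loop like the one
# described above: a concentration score is read from the headset, and if it
# stays above a threshold for a sustained "dwell" period, the highlighted
# programme is selected. Values below are assumptions, not the BBC's.

CONCENTRATION_THRESHOLD = 0.7   # assumed normalized 0-1 scale
DWELL_SECONDS = 3.0             # "for a few seconds"
POLL_INTERVAL = 0.1             # seconds between headset readings

def read_concentration():
    """Stand-in for a real headset reading; returns a value in [0, 1]."""
    return random.random()

def wait_for_selection(option_name):
    """Block until concentration stays above threshold long enough,
    then return the selected option."""
    time_above_threshold = 0.0
    while time_above_threshold < DWELL_SECONDS:
        level = read_concentration()
        if level >= CONCENTRATION_THRESHOLD:
            time_above_threshold += POLL_INTERVAL
        else:
            time_above_threshold = 0.0   # the meter resets if focus drops
        time.sleep(POLL_INTERVAL)
    return option_name

if __name__ == "__main__":
    print("Selected:", wait_for_selection("EastEnders"))
```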

The BBC hopes that this research will, in the future, benefit people with physical and neural disabilities that restrict their movements.

My questions are as follows:

  • Could this system eventually be so miniaturized that it could be integrated into an ordinary pair of glasses, perhaps Google Glass or something else?
  • Notwithstanding the significant benefits mentioned in this article, what other types of apps and systems might also find advantages in adapting neural interfaces?
  • What entrepreneurial opportunities might be waiting out there as a result of this technology?
  • How might neural interfaces be integrated with the current wave of virtual and augmented reality systems (covered in these seven recent Subway Fold posts), which are about to enter the consumer market?
