Brain Computer Interfaces
December 13, 2020

Back in the early days of computing, data input was done using punch cards, switches or reels of rapidly moving paper tape. Since that time, we have developed increasingly fluid methods for human-computer interaction, with keyboards, mice, monitors and touchscreens now dominating our digital lives. But what if we could link the computational wetware in our own heads directly with a computer?


Brain computer interfaces, or BCIs, can include either input or output devices. Input BCIs consist of sensors that acquire electrical brainwave signals, together with processing hardware that extracts distinctive features and translates them into useful commands. Conversely, output BCIs translate digital information into electrical signals that are fed into the user’s brain by electrode arrays or other stimulator hardware.
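
As a rough illustration of that input pipeline, the sketch below band-pass filters one window of EEG samples, extracts a simple band-power feature and translates it into a command. The sampling rate, frequency band and threshold are assumptions chosen purely for illustration, not values from any particular device.

```python
# Minimal sketch of an input BCI pipeline: filter a raw EEG channel,
# extract a band-power feature, and translate it into a command.
# All numbers (sampling rate, band, threshold) are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 256            # assumed sampling rate in Hz
BAND = (8.0, 12.0)  # assumed frequency band of interest (alpha)
THRESHOLD = 5e-11   # assumed power threshold, device dependent

def extract_feature(eeg_window: np.ndarray) -> float:
    """Band-pass filter one window of EEG samples and return its band power."""
    b, a = butter(4, BAND, btype="bandpass", fs=FS)
    filtered = filtfilt(b, a, eeg_window)
    return float(np.mean(filtered ** 2))

def translate(feature: float) -> str:
    """Map the extracted feature onto a simple command."""
    return "SELECT" if feature > THRESHOLD else "IDLE"

# Example with synthetic data standing in for one second of a real recording.
window = np.random.randn(FS) * 1e-5
print(translate(extract_feature(window)))
```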


BCIs may be constructed using one of two approaches.


The first is non-invasive and may employ electroencephalography, or EEG, to acquire brainwave signals from sensors positioned on the outside of the skull. Alternatively, non-invasive BCIs may feed electrical signals into the brain using a transcranial magnetic stimulation (TMS) device.


The second way to build a BCI is invasive and currently requires surgery to position sensors or stimulators inside the skull. Two implant technologies have so far been trialled. The first, known as electrocorticography or ECoG, places a strip or grid of electrodes on the surface of the brain. Alternatively, intracortical microelectrodes can be implanted a few millimetres into the brain, where they may serve as either sensors or stimulators.


For obvious reasons, most BCI research today is non-invasive. This limits its scope and application, but even so some interesting results have been achieved. Perhaps most notably, many researchers have used EEG to acquire P300 brainwaves, which are then processed to allow a user to input letters or commands into a computer. Such interfaces are painstakingly slow, but without doubt they can work.
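
A P300 speller works by flashing rows and columns of characters and looking for the brain's involuntary P300 response to the item the user is attending to. The sketch below shows the core idea in a minimal form: average the EEG epochs time-locked to repeated flashes of one target and score the amplitude in an assumed post-stimulus window. The timings are illustrative assumptions rather than values from any published speller.

```python
# Minimal sketch of P300 detection by epoch averaging, as used in P300 spellers.
# The sampling rate and window are assumptions for illustration only.
import numpy as np

FS = 256                    # assumed sampling rate in Hz
P300_WINDOW = (0.25, 0.45)  # assumed post-stimulus window where the P300 peaks, in seconds

def average_epochs(eeg: np.ndarray, stimulus_samples: list, length: int) -> np.ndarray:
    """Average EEG epochs time-locked to repeated flashes of one target."""
    epochs = [eeg[s:s + length] for s in stimulus_samples if s + length <= len(eeg)]
    return np.mean(epochs, axis=0)

def p300_score(avg_epoch: np.ndarray) -> float:
    """Mean amplitude inside the assumed P300 window; the flashed row or column
    with the highest score is taken to be the one the user attended to."""
    start, stop = int(P300_WINDOW[0] * FS), int(P300_WINDOW[1] * FS)
    return float(np.mean(avg_epoch[start:stop]))
```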


Researchers at the University of Washington have even used non-invasive BCIs to send commands from one person's brain to another over the Internet. Specifically, in a set of experiments in 2014, one test subject wore an EEG skullcap while watching a monitor on which a computer game was being displayed. But the fire button was in the hand of a second subject, seated half a mile away in another building, who had a TMS coil placed over their brain. To shoot down rockets, the first user had to think about pressing the fire button. When that happened, brainwave signals were acquired by EEG, processed and transmitted over the Internet. In turn, the TMS device induced an electrical pulse in the receiving subject’s brain, causing them to experience visual stimuli and press the fire button. Amazingly, the split-second accuracy required to shoot down rockets was achieved. In fact, across three pairs of test subjects, success rates of between 25% and 83% were obtained.
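
The networking side of such a demo can be pictured as a very small relay: when the sender's decoded EEG feature crosses a threshold, a trigger is sent over the Internet to the receiver's machine, which fires the stimulator. The sketch below is a purely hypothetical illustration of that relay; the address, port and fire_tms() hook are invented, and this is not the University of Washington team's actual software.

```python
# Hypothetical sketch of a brain-to-brain relay: the sender's machine emits a
# trigger over the network; the receiver's machine listens and calls a
# stimulator hook. Host, port and fire_tms() are invented for illustration.
import socket

RECEIVER = ("receiver.example.org", 9000)  # hypothetical address and port

def send_trigger() -> None:
    """Called on the sender's machine when the decoded EEG feature crosses threshold."""
    with socket.create_connection(RECEIVER, timeout=1.0) as conn:
        conn.sendall(b"FIRE")

def receiver_loop(fire_tms) -> None:
    """Listen for a trigger and call the (hypothetical) TMS firing hook."""
    with socket.create_server(("", 9000)) as server:
        conn, _ = server.accept()
        with conn:
            if conn.recv(4) == b"FIRE":
                fire_tms()
```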


In time, University of Washington researchers hope to use BCIs to transmit more complex ideas and thoughts from one brain to another. This could even allow for what they call brain tutoring, in which knowledge is transferred directly from the brain of a teacher to the brain of a student. The team at the University of Washington uses highly sophisticated sensors, stimulators and processing hardware.


However, basic EEG BCIs are now available to consumers, including the Emotiv EPOC, the InteraXon Muse and the NeuroSky MindWave. Such low-end hardware may actually record electromyographic signals from muscles in addition to, or even in place of, electroencephalographic signals from the brain.
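
One crude way to spot muscle contamination in such recordings is to look at the power spectrum: EMG carries far more power above roughly 30 Hz than cortical EEG does. The sketch below flags windows with a high ratio of high-frequency power; the 30 Hz split and the 0.5 ratio are assumptions, not specifications of any of these headsets.

```python
# Rough sketch of flagging EMG (muscle) contamination in a consumer EEG window
# by comparing high-frequency power to total power. Thresholds are assumptions.
import numpy as np
from scipy.signal import welch

def looks_like_emg(signal: np.ndarray, fs: float = 256.0) -> bool:
    """Return True if a suspiciously large share of power sits above ~30 Hz."""
    freqs, psd = welch(signal, fs=fs, nperseg=min(len(signal), 256))
    high_power = psd[freqs >= 30.0].sum()
    return high_power / psd.sum() > 0.5
```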


For many years, we've been learning how to interface electronic hardware with the human body. Since 1982, over 300,000 people have had their hearing restored with a cochlear implant. These feature a surgically inserted electrode array that delivers electrical pulses to its host’s auditory nerve. The electrode array is connected to a surgically inserted receiver-stimulator, which receives wireless signals from an external sound processor connected to a microphone. Some external sound processors can also receive audio directly from a computer. So already, some people have a computer feeding sound directly into their auditory nerves.
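
The sound processor's job can be loosely sketched as a filterbank: split the incoming audio into frequency bands, take each band's envelope and map those envelopes onto per-electrode stimulation levels. The sketch below illustrates that idea only; the band edges and eight-channel count are assumptions, not any manufacturer's design.

```python
# Loose sketch of cochlear-implant-style sound processing: filterbank, envelope
# extraction, one stimulation level per electrode. Bands and channel count are
# illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

FS = 16000                                 # assumed audio sampling rate in Hz
BAND_EDGES = np.geomspace(200, 7000, 9)    # 8 assumed analysis bands

def electrode_levels(audio: np.ndarray) -> np.ndarray:
    """Return one stimulation level per (assumed) electrode channel."""
    levels = []
    for lo, hi in zip(BAND_EDGES[:-1], BAND_EDGES[1:]):
        b, a = butter(2, (lo, hi), btype="bandpass", fs=FS)
        band = filtfilt(b, a, audio)
        envelope = np.abs(hilbert(band))   # band envelope
        levels.append(envelope.mean())
    return np.array(levels)
```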


A company called Second Sight is also developing visual implants. Since 2011, more than 75 patients have been fitted with a Second Sight retinal prosthesis system called the Argus II. This is implanted on and in the eye, where its 60-electrode array stimulates the retina of somebody who has gone blind due to retinitis pigmentosa. Images are fed to the implant from an externally worn camera, which is connected to a visual processing unit and an antenna. The vision so provided is very rudimentary, but it does allow patients to identify shapes and even large letters.
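
Conceptually, the processing chain reduces each camera frame to one stimulation level per electrode. The sketch below illustrates this by averaging a grayscale frame down to a 6 x 10 grid, matching the Argus II's 60 electrodes, and mapping brightness to a normalised amplitude; the mapping itself is an assumption made purely for illustration.

```python
# Illustrative sketch of turning a camera frame into per-electrode stimulation
# levels for a small retinal grid. The 6 x 10 shape matches 60 electrodes, but
# the brightness-to-amplitude mapping is an assumption.
import numpy as np

GRID = (6, 10)  # assumed electrode rows and columns

def frame_to_stimulation(gray_frame: np.ndarray) -> np.ndarray:
    """Average a grayscale frame down to GRID blocks and normalise to [0, 1]."""
    rows, cols = GRID
    h, w = gray_frame.shape
    trimmed = gray_frame[: h - h % rows, : w - w % cols]
    blocks = trimmed.reshape(rows, trimmed.shape[0] // rows, cols, -1)
    brightness = blocks.mean(axis=(1, 3))   # 6 x 10 block averages
    return brightness / 255.0               # normalised stimulation level
```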


Since 2016, Second Sight has also been developing a visual cortical prosthesis called the Orion 1. This tiny electrode array is implanted into the visual cortex of the patient's brain, bypassing their eyes and optic nerves. In January 2018, the first Orion 1 was implanted into a test patient. Like the Argus II, the Orion 1 implant receives signals from an external video camera and currently allows its user to perceive patterns of light.


Another developer of brain implant technology is BrainGate. The first BrainGate neural interface system was created in 2002 by a Brown University spin-off company called Cyberkinetics. By 2008, the hardware had been upgraded to BrainGate 2 and was being developed by academics from five universities supported by a score of institutions and foundations.


The BrainGate 2 implant is a 4 mm array of 100 electrodes that is surgically implanted. A decoder processes signals from the implant, in turn allowing the user to control an external electrical device.
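
A decoder of this kind can be pictured, in greatly simplified form, as a linear map from binned neural firing rates to a two-dimensional cursor or arm velocity, fitted from calibration data. The sketch below shows that simplified version; real systems such as BrainGate use far more sophisticated decoders (for example Kalman filters), so this is only an illustration of the idea.

```python
# Simplified sketch of a neural decoder: a least-squares linear map from binned
# firing rates to a 2-D velocity command. Not BrainGate's actual algorithm.
import numpy as np

def fit_decoder(firing_rates: np.ndarray, velocities: np.ndarray) -> np.ndarray:
    """firing_rates: (n_bins, n_channels); velocities: (n_bins, 2).
    Returns a (n_channels, 2) weight matrix fitted by least squares."""
    weights, *_ = np.linalg.lstsq(firing_rates, velocities, rcond=None)
    return weights

def decode(rates_bin: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Map one bin of firing rates onto a cursor velocity command."""
    return rates_bin @ weights
```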


In 2008, a team from the University of Pittsburgh implanted a BrainGate sensor into the motor cortex of a monkey. Over time, the monkey learned to feed itself by using its brain to control a robot arm.


In May 2012, a team from Brown University implanted BrainGate electrode arrays into two paralyzed human patients. One of these was a 58-year-old woman who learned how to use a robot arm to pick up a bottle, raise it to her mouth and take a drink.


Clearly, today's BCI implants like the Orion 1 and the BrainGate 2 are being developed to assist people with medical conditions. Nevertheless, the possibility exists for more advanced forms of such technology to develop into sophisticated non-medical human-computer interfaces.


It is clear that BCIs are on the rise, and in the future this could lead to some kind of cyborg fusion between human beings and machines.


Disclaimer: The views and opinions expressed in the article belong solely to the author, and not necessarily to the author's employer, organisation, committee or other group or individual.



