
Autonomous Cars and the Brain-Computer Interface


The future is here! As CES draws near, see how Nissan and other orgs are bridging the gap between brain and machine with revolutionary new interfaces and autonomous cars.

· IoT Zone ·

In preparation for a jam-packed CES, Nissan this week unveiled research that will enable vehicles to interpret signals from the driver’s brain, redefining how people interact with their cars. The company’s Brain-to-Vehicle (B2V) technology promises to speed up drivers’ reaction times and to yield cars that keep adapting to make driving more enjoyable.

In the world’s first system of its kind, the driver wears a device that measures brain wave activity, which is then analyzed by autonomous systems. By anticipating intended movement, the systems can take actions – such as turning the steering wheel or slowing the car – 0.2 to 0.5 seconds faster than the driver, while remaining largely imperceptible.
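The anticipation loop described above can be sketched in a few lines. This is an illustrative toy, not Nissan's B2V system: the decoder, its threshold, and the "mean amplitude predicts braking" rule are all invented stand-ins for a trained movement-intent model.

```python
# Toy sketch of an intent-anticipation loop (NOT Nissan's actual B2V code):
# decode a window of brain-wave samples, and begin the predicted action
# early only when the decoder is confident enough.
from dataclasses import dataclass

@dataclass
class Prediction:
    action: str        # e.g. "brake", "steer_left", or "none"
    confidence: float  # 0.0 - 1.0

def decode_intent(eeg_window: list[float]) -> Prediction:
    """Invented stand-in for a trained movement-intent decoder.
    Here a strongly negative mean amplitude 'predicts' braking."""
    mean = sum(eeg_window) / len(eeg_window)
    if mean < -0.5:
        return Prediction("brake", min(1.0, -mean))
    return Prediction("none", 0.0)

def assist(eeg_window: list[float], threshold: float = 0.7) -> str:
    """Act 0.2-0.5 s before the driver would, but only above threshold,
    so assistance remains largely imperceptible."""
    p = decode_intent(eeg_window)
    if p.action != "none" and p.confidence >= threshold:
        return f"assist:{p.action}"
    return "no_action"
```

The confidence threshold is the key design choice in any such system: acting on a low-confidence prediction would make the car intervene when the driver intended nothing.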


This breakthrough from Nissan is the result of research into using brain decoding technology to predict a driver’s actions and detect discomfort. By catching signs that the driver’s brain is about to initiate a movement – such as turning the steering wheel or pushing the accelerator pedal – driver-assist technologies can begin the action more quickly. This can improve reaction times and enhance manual driving. By detecting and evaluating driver discomfort, artificial intelligence can change the driving configuration or driving style when in autonomous mode.

Tech Interest in the BCI Is High

A brain-computer interface (BCI) is a direct communication pathway between an enhanced or wired brain and an external device. Nissan is not the only company interested in the ability to sense, control, and communicate with the outside world through the power of thought. Microsoft has applied for a patent on electromyography (EMG)-controlled computing, perhaps signaling the future development of a wearable that detects muscle movements and interprets them as commands. This is not Microsoft's first foray into the area: the company shared a prototype of an EMG controller in 2010 and has previously filed an EMG-controlled gesture patent.

Over the last few years, we've also seen Elon Musk's Neuralink recruiting scientists and “developing ultra-high bandwidth brain-machine interfaces to connect humans and computers.” Bryan Johnson, founder of OS Fund and co-founder of Braintree, created Kernel in 2017, committing $100 million of his own money. Both companies are relatively discreet about the specifics of their work, but Johnson presented a keynote at a conference in Berlin last year and told the audience:

"I believe that we are about to enter into the most consequential revolution in the history of the human race and specifically, that we are going to build the tools to read and write our neural code, so that we can take control of our cognitive evolution."

Research Is Bringing BCI Closer to Reality Through Implants

A group of Melbourne scientists funded by DARPA has brought the notion of wirelessly controlled limbs closer to reality through the development of a tiny, matchstick-sized device called a stentrode. Once implanted into a blood vessel next to the motor cortex, the brain’s control center, it picks up strong electrical signals emitted by the brain, which are decoded by a computer. The computer then sends commands to a robotic exoskeleton attached to the arms or legs, enabling patients to move simply by thinking about it. The stentrode’s promise is heightened by its location – a blood vessel – which eschews the need for complex brain surgery. The team is preparing for a trial in which paralyzed patients will be implanted with the stentrode.


Similarly, a team of leading neurologists, neuroscientists, engineers, computer scientists, neurosurgeons, mathematicians, and other researchers at BrainGate in the US is working on developing BCI technologies to restore the communication, mobility, and independence of people with neurologic disease, injury, or limb loss. Using an array of micro-electrodes implanted into the brain, their research has shown that the neural signals associated with the intent to move a limb can be “decoded” by a computer in real time and used to operate external devices. This has allowed people with spinal cord injury, brainstem stroke, and ALS to control a computer cursor simply by thinking about the movement of their own paralyzed hand and arm.
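A common way to picture this kind of decoding is a linear map from per-electrode firing rates to a two-dimensional cursor velocity. The sketch below is a hand-built illustration of that general idea, not BrainGate's actual model; the weights and baseline values are invented.

```python
# Illustrative linear decoder (invented example, not BrainGate's model):
# cursor velocity v = W (r - b), where r is a vector of per-electrode
# firing rates and b is each electrode's resting baseline.

def decode_velocity(rates, weights, baseline):
    """Each output dimension (x, y) is a weighted sum of
    baseline-subtracted firing rates."""
    centered = [r - b for r, b in zip(rates, baseline)]
    return [sum(w * c for w, c in zip(row, centered)) for row in weights]

# Toy 3-electrode setup with hand-picked weights.
weights = [[0.5, 0.0, -0.5],   # x-velocity weights
           [0.0, 1.0,  0.0]]   # y-velocity weights
baseline = [10.0, 10.0, 10.0]  # resting firing rates (spikes/s)

v = decode_velocity([14.0, 12.0, 10.0], weights, baseline)
```

In real systems the weights are fit from recorded data while the person imagines moving, which is what lets the computer translate intent into cursor motion in real time.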

In early clinical research, the technology has provided intuitive control over advanced prosthetic limbs and given people with paralysis easy control over powerful assistive movement and communication devices. An exciting goal is to enable naturally controlled movements of paralyzed limbs.

From the Brain to the Internet of Things

Last year, researchers at Wits University in Johannesburg, South Africa, created a project called 'Brainternet' that streams brainwaves onto the internet. Essentially, it turns the brain into an Internet of Things node on the Web. It works by converting electroencephalogram (EEG) signals (brain waves) into an open-source live stream of brain activity. A person wears a powered, mobile, internet-accessible Emotiv EEG device for an extended period. During this time, the Emotiv transmits the EEG signals to a Raspberry Pi, which live-streams them to an application programming interface (code that allows software programs to communicate) and displays the data on a website that acts as a portal. This is currently an open website where the public can observe the individual's brain activity.
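The Pi-to-web leg of a pipeline like this is ordinary IoT plumbing: batch a window of samples into JSON and POST it to a web API. The sketch below shows that step under stated assumptions; the endpoint, field names, and device ID are invented for illustration and are not the Brainternet project's actual API.

```python
# Hedged sketch of the Raspberry Pi -> web API step of an EEG live stream.
# The payload shape and endpoint are invented; only the pattern
# (batch samples as JSON, POST to an API) mirrors the description above.
import json
import urllib.request

def make_payload(device_id, samples):
    """Bundle one window of EEG samples with its source device as JSON bytes."""
    return json.dumps({"device": device_id, "eeg": samples}).encode()

def post_window(url, device_id, samples, opener=urllib.request.urlopen):
    """POST one window of samples. 'opener' is injectable so the
    function can be exercised without a live server."""
    req = urllib.request.Request(
        url,
        data=make_payload(device_id, samples),
        headers={"Content-Type": "application/json"})
    return opener(req)
```

A portal website would then poll the same API and render the incoming windows, which is all that is needed to make the brain behave like any other IoT sensor node.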

"Ultimately, we're aiming to enable interactivity between the user and their brain so that the user can provide a stimulus and see the response. Brainternet can be further improved to classify recordings through a smartphone app that will provide data for a machine-learning algorithm. In future, there could be information transferred in both directions – inputs and outputs to the brain," says project supervisor Adam Pantanowitz.

While the notion of a brain-controlled car like Nissan's is probably some way off, for some it is something to hope for. A couple of years ago, I came across design plans for a car that could be controlled through BCI by a driver with quadriplegia, and I interviewed its designer, Rajshekhar Dass, about the aim of controlling technical devices through brain waves, facial gestures, and infinitesimal movements. I received a number of emails from people with quadriplegia who said they missed the pleasure of driving and loved the idea. Technological advancements in this area are expensive, and they move slowly. But their progress will be life-changing for many.

Topics:
brain computer interface, ces2018, autonomous cars, iot

Opinions expressed by DZone contributors are their own.
