Brain Computer Interface enables people with disabilities to operate autonomous cars
A dark room. A screen. And Cornelia, whose head is adorned with a futuristic headset. Her eyes bore into the display. A dot speeds across the screen. Suddenly, music starts to play out of nowhere. What’s going on? Is this a trick? No! Cornelia started the music – purely with the power of her thoughts. How did she do it? To answer this question, we need to go back in time: it all started with Cornelia’s cousin Markus Burkhart, who was diagnosed with multiple sclerosis several years ago and is now paralyzed from the neck down. Despite this limitation, he runs his own car repair shop and, with the help of eye tracking and special computer programs, even carries out the administrative work himself. Although this is somewhat tedious and time-consuming, it does work.
Brain Computer Interface in the car: With the Audi Aicon, a vision becomes reality
“I find it fascinating to observe him,” Cornelia explains. “Markus would give anything to be able to drive a car again and to be independent. I wondered if it would be possible to operate a car with an integrated eye-tracking system in combination with a brain-computer interface for him.” And so, she found a topic for her thesis at Audi in the Design Interior Interface area.
During her studies at the Mediadesign University of Applied Sciences in Munich, Cornelia Engel got to know the Brain Computer Interface (BCI) from EMOTIV. The “brain-computer interface” enables cognitively and motorically impaired people to communicate with their environment via mental commands. The mobile EEG device registers the electrical activity of the nerve cells. A computer then translates these signals into commands and passes them on to a device – for example a computer program, a wheelchair, a light switch or even a car.
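The pipeline described above – signal in, command out, action on a device – can be sketched as a simple dispatcher. This is an illustrative sketch only; the class and its methods are invented for demonstration and do not reflect EMOTIV’s actual API.

```python
# Hypothetical sketch of the BCI pipeline: a recognized mental command
# (already classified from the EEG signal) is routed to a device action.
# All names here are illustrative assumptions, not EMOTIV's real API.

from typing import Callable, Dict


class CommandDispatcher:
    """Routes recognized mental commands to registered device actions."""

    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[], str]] = {}

    def register(self, command: str, handler: Callable[[], str]) -> None:
        """Associate a mental command with a device action."""
        self._handlers[command] = handler

    def dispatch(self, command: str) -> str:
        """Forward a command to its handler; unknown commands are ignored."""
        handler = self._handlers.get(command)
        return handler() if handler else "ignored"


dispatcher = CommandDispatcher()
# The "tap" command could, for instance, start the music – as in the
# opening scene of this article.
dispatcher.register("tap", lambda: "music started")

print(dispatcher.dispatch("tap"))  # music started
```

The same dispatcher could just as well drive a wheelchair controller or a light switch: only the registered handlers change, not the pipeline.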
With this approach of reading brain activity to translate thoughts into actions, Cornelia has not only optimized usability but also made autonomous driving accessible to more people. Physically impaired people like her cousin in particular can benefit from this.
This is the reality so far: in the autonomous concept car Audi Aicon, which was presented at the IAA 2017, and its successor, the Audi AI:ME, the passenger can operate the graphical interface with an eye-tracking system in addition to touch and voice control. Several infrared sensors detect which display area the passenger is looking at; the function shown there is then displayed at a larger scale. To activate it, the passenger needs to tap on the touch-sensitive wooden screen. However, Markus is no longer able to tap. But the Brain Computer Interface offers a solution.
How to control thoughts with the Brain Computer Interface
That was the theory. Cornelia reveals how difficult it was in reality: “I learned one command – tapping – in around two weeks. It usually takes that long because the BCI must first be calibrated for the user, which in turn requires a lot of practice and maximum concentration. A concise and stable idea must form for a clear signal to emerge.” Cornelia compares this procedure to a baby learning to grab things or to speak.
As a first step, she began to meditate so that her brain activity became calm and balanced. Only then could she assign impulses to a command in the second step. “I imagine myself singing ‘forward’ – that’s my impulse for the ‘tap’ command,” explains Cornelia. “It works differently for everyone. For example, my friend thought of the color green.”
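The need for a “concise and stable idea” can be illustrated with a toy detector: a command is only accepted once the classifier’s confidence stays above a threshold for several consecutive readings, so a fleeting stray thought does not trigger anything. The threshold, hold length, and values below are invented for illustration.

```python
# Toy illustration of why calibration demands a stable mental image:
# a command only fires after `hold` consecutive readings exceed
# `threshold`. All numbers are made up for demonstration.

def detect_command(confidences, threshold=0.8, hold=3):
    """Return True once `hold` consecutive readings reach `threshold`."""
    streak = 0
    for c in confidences:
        streak = streak + 1 if c >= threshold else 0
        if streak >= hold:
            return True
    return False


# A stable, sustained impulse is accepted:
print(detect_command([0.5, 0.9, 0.85, 0.95]))  # True
# A flickering, unstable signal is rejected:
print(detect_command([0.9, 0.4, 0.9, 0.4]))    # False
```

This also hints at why meditation helped: a calm baseline makes the deliberate impulse stand out clearly against the background activity.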
In order to use the eye tracker and Brain Computer Interface in a car’s interior, both systems must be integrated into the Audi Graphical User Interface (GUI). Cornelia put this into practice in her bachelor’s thesis. She developed seven commands for this purpose and worked them out graphically and conceptually: left, right, up, down, clockwise rotation, counter-clockwise rotation, and tapping.
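The seven commands above can be sketched against a minimal GUI stub: the four directions move the focus on a tile grid, the rotations turn a dial (e.g. volume), and tapping confirms the focused tile. Only the command set comes from the thesis; the class, grid, and dial semantics are assumptions for illustration.

```python
# Illustrative wiring of the seven mental commands into a GUI stub.
# The command names come from the article; everything else is assumed.

COMMANDS = {"left", "right", "up", "down",
            "rotate_cw", "rotate_ccw", "tap"}


class GuiStub:
    """Focus moves on a tile grid, a dial rotates, tap confirms."""

    def __init__(self, cols=3, rows=2):
        self.cols, self.rows = cols, rows
        self.col, self.row = 0, 0   # currently focused tile
        self.dial = 0               # e.g. volume level
        self.activated = None       # last confirmed tile

    def handle(self, cmd):
        if cmd not in COMMANDS:
            return  # unrecognized signals are ignored
        if cmd == "left":
            self.col = max(0, self.col - 1)
        elif cmd == "right":
            self.col = min(self.cols - 1, self.col + 1)
        elif cmd == "up":
            self.row = max(0, self.row - 1)
        elif cmd == "down":
            self.row = min(self.rows - 1, self.row + 1)
        elif cmd == "rotate_cw":
            self.dial += 1
        elif cmd == "rotate_ccw":
            self.dial -= 1
        elif cmd == "tap":
            self.activated = (self.col, self.row)


gui = GuiStub()
for cmd in ["right", "down", "rotate_cw", "tap"]:
    gui.handle(cmd)
print(gui.activated, gui.dial)  # (1, 1) 1
```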
Brain Computer Interface: mental commands are, at some point, “just like riding a bike”
For all this to work, a different area of the brain must be activated for each command. Controlling several commands directly one after the other requires maximum concentration. “But when it becomes second nature, it’s like riding a bike,” says Cornelia.
Once it’s learned, the Brain Computer Interface offers a high level of operating reliability. If the system is additionally linked to an eye tracker, there’s no need to wait until the desired control panel lights up – it can be targeted immediately with the gaze. This increases the speed of operation. Just the right user experience for people with motor disabilities.
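The speed advantage of combining the two systems can be sketched in a few lines: the eye tracker supplies the target panel, the BCI supplies the confirmation, so a single mental command selects any panel directly instead of stepping through them one by one. The function and its return values are hypothetical.

```python
# Sketch of the combined gaze + BCI selection described above.
# Function name and return values are assumptions for illustration.

def select_with_gaze_and_bci(gaze_target, mental_command):
    """Activate the panel the user is looking at when a 'tap' is thought."""
    if mental_command == "tap" and gaze_target is not None:
        return f"activated:{gaze_target}"
    return "idle"


print(select_with_gaze_and_bci("music", "tap"))  # activated:music
print(select_with_gaze_and_bci(None, "tap"))     # idle
```

Without gaze input, reaching the Nth panel would take up to N directional commands before the tap; with gaze input, one tap suffices regardless of where the panel sits.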
The big moment: How Markus will regain mobility despite his disability
Little did Cornelia know how the Brain Computer Interface would create outstanding new opportunities for her cousin Markus. An example from everyday life: he orders an Audi to his home via app and eye tracking. He controls his wheelchair with the BCI. The car recognizes his smartphone via Bluetooth, opens the door and lowers the ramp, and the seats fold back so that Markus can drive his wheelchair in. He is welcomed by PIA, Audi’s personal voice assistant. The air conditioning adjusts to his favorite ambience. In the meantime, his Brain Computer Interface connects to the Audi so that he can control the applications on the display. Music selection, volume, even a stopover – everything is possible. When he arrives at his destination, the doors open, the ramp extends and Markus can drive out.
A vision of the future? So far, yes. There is still a lot to do before this can become reality: workshops to learn the commands, even more mature technology and, of course, clarification of the legal framework. But the way has been paved. And that would mean a better quality of life for Markus and many others with disabilities – simply through the power of thought, and Cornelia’s work.