Study demonstrates that a brain-computer interface can improve your performance
Our state of arousal—being fearful, agitated, or calm—can significantly affect our ability to make optimal decisions, judgments, and actions in real-world dynamic environments. Imagine, for instance, walking across a balance beam. Your performance—your speed across the beam and your odds of making it across without falling off—is dramatically better if the beam sits a mere six inches off the ground and you are relaxed than if it sits 60 feet in the air and you are terror-stricken. To stay in the zone of maximum performance, your arousal needs to be at a moderate level, not so high that it pushes you over the edge.
Biomedical engineers at Columbia Engineering have shown—for the first time—that they can use online neurofeedback to modify an individual's arousal state and improve performance in a demanding sensory-motor task, such as flying a plane or driving in suboptimal conditions. The researchers used a brain-computer interface (BCI) to monitor, through electroencephalography (EEG) in real time, the arousal states of study participants engaged in a virtual reality aerial navigation task. The system generated a neurofeedback signal that helped participants decrease their arousal in particularly difficult flight situations, which in turn improved their performance. The study was published today in Proceedings of the National Academy of Sciences.
"The whole question of how you can get into the zone, whether you're a baseball hitter or a stock trader or a fighter pilot, has always been an intriguing one," says Paul Sajda, professor of biomedical engineering (BME), electrical engineering, and radiology, who led the study. "Our work shows that we can use feedback generated from our own brain activity to shift our arousal state in ways that significantly improve our performance in difficult tasks—so we can hit that home run or land on a carrier deck without crashing."
The 20 subjects in the study were immersed in a virtual reality scenario in which they had to navigate a simulated airplane through rectangular boundaries. Known as a boundary avoidance task, this demanding sensory-motor challenge created escalating cognitive stress (the boundaries narrowed every 30 seconds) that heightened arousal and quickly led to task failure: missing or crashing into a boundary. But when the researchers used neurofeedback, the subjects did better, flying longer while performing the difficult maneuvers that required high levels of visual and motor coordination.
There were three feedback conditions (BCI, sham, and silence), randomly assigned for every new flight attempt. In the BCI condition, subjects heard a low-rate synthetic heartbeat whose loudness was continuously modulated as a function of the level of task-dependent arousal decoded from the EEG: the higher the inferred arousal, the louder the feedback, and vice versa. Task performance in the BCI condition, measured as the time and distance over which a subject could navigate before failure, improved by around 20 percent.
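The core of this feedback loop, mapping a decoded arousal level onto heartbeat loudness, can be sketched in a few lines. This is only a rough illustration of the mapping described above, not the study's actual decoding pipeline; the function name, the 0-to-1 arousal scale, and the linear gain mapping are all assumptions for the sake of the example.

```python
def arousal_to_gain(arousal, min_gain=0.0, max_gain=1.0):
    """Map an inferred arousal level in [0, 1] to a feedback loudness gain.

    Higher decoded arousal -> louder synthetic heartbeat, and vice versa,
    mirroring the modulation scheme described in the BCI condition.
    (Illustrative sketch; the real system decodes arousal from EEG.)
    """
    arousal = min(max(arousal, 0.0), 1.0)  # clamp to the valid range
    return min_gain + (max_gain - min_gain) * arousal
```

In a closed-loop system, a function like this would run on every decoding update, so the participant hears their own arousal level reflected back continuously as they fly.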
The Yerkes-Dodson law is a well-established and intensively studied principle in behavioral psychology describing the relationship between arousal and performance. Developed in 1908, it posits an inverted U-shaped relationship between arousal and task performance: performance rises with arousal up to a point and then declines, so there is an intermediate level of arousal that is optimal for a given task. In this new study, the researchers showed that they could use neurofeedback in real time to move an individual's arousal from the right side of the Yerkes-Dodson curve toward the left, toward a state of improved performance.
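The inverted-U idea can be made concrete with a toy curve. The Gaussian shape and the parameter values below are purely illustrative (the law specifies the qualitative inverted-U form, not a particular equation), but they show why nudging an over-aroused subject leftward along the curve improves performance.

```python
import math

def performance(arousal, optimum=0.5, width=0.2):
    """Toy inverted-U (Yerkes-Dodson-style) curve.

    Performance peaks at an intermediate arousal level (`optimum`) and
    falls off on either side. Shape and parameters are illustrative only.
    """
    return math.exp(-((arousal - optimum) ** 2) / (2 * width ** 2))

# An over-aroused subject on the right side of the curve performs better
# as neurofeedback moves their arousal back toward the optimum:
assert performance(0.9) < performance(0.7) < performance(0.5)
```

Under this picture, the neurofeedback intervention does not minimize arousal outright; it steers arousal toward the curve's peak.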
"What's exciting about our new approach is that it is applicable to different task domains," Sajda adds. "This includes clinical applications that use self-regulation as a targeted treatment, such as mental illness."
The researchers are now studying how neurofeedback can be used to regulate arousal and emotion in clinical conditions such as PTSD. They are also exploring how online monitoring of arousal and cognitive control might inform human-agent teaming, in which a robot and a human work together in a high-stress situation such as a rescue. If the robot has information on the human's arousal state, it could choose its tasks in a way that reduces its teammate's arousal, pushing them into an ideal performance zone.
"Good human-agent teams, like the Navy SEALS, do this already, but that is because the human-agents can read facial expressions, voice patterns, etc., of their teammates to infer arousal and stress levels," Sajda says. "We envision our system being a better way to communicate not just this type of information, but much more to a robot-agent."