Computer interface helps disabled patients set tone of musical performance
Pioneering technology has been used to unite a string quartet and four people living with severe disability for a world first in musical performance.
The Paramusical Ensemble saw patients from the Royal Hospital for Neuro-disability (RHN) in London interacting with musicians through a Brain Computer Music Interface (BCMI).
The system, developed at Plymouth University, allows a person to control musical systems through brainwave signals detected by electrodes placed on the scalp.
This performance was one of the first showcases of the technology, which researchers believe could have a transformative impact on people being treated for medical conditions such as Locked-In Syndrome.
The initiative is led by Professor of Computer Music Eduardo Miranda and PhD student Joel Eaton, from Plymouth University's Interdisciplinary Centre for Computer Music Research (ICCMR), in collaboration with Dr Julian O'Kelly and Dr Sophie Duport from the RHN.
Professor Miranda, Director of the ICCMR, said: "We have been working with the RHN for around four years, and our collaboration is having a hugely positive impact on everyone involved and changing perceptions at the same time. Our work is giving people an opportunity to put their physical impediments aside, and use music to communicate in ways that would not normally be possible because of their medical conditions. It is an amazing example of research being taken out of the laboratory and into the real world, with both inspiring and very emotional results."
Steve Thomas, one of the patient musicians involved in the project, said: "This is a truly magical experience. It is a chance to play with other severely disabled musicians, and it actually sounds impressive."
For the Paramusical Ensemble, each of the four patients connected to the BCMI generates a musical part that is performed in real time by a different member of the string quartet.
Each participant is shown four musical phrases on a panel and selects one by staring at a light flashing beside it.
The BCMI detects which phrase has been selected by each participant - by reading the electrical activity of their visual cortex - and sends the phrases to the string quartet to perform. The resulting piece lasts for up to 20 minutes.
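The article does not describe the detection method in detail, but flashing lights combined with a visual-cortex readout is characteristic of steady-state visually evoked potential (SSVEP) interfaces, where each option flickers at a distinct frequency and the attended frequency dominates the EEG spectrum. The sketch below illustrates that general idea; the flicker frequencies, sampling rate, and window length are assumptions for illustration, not the values used in the project.

```python
import numpy as np

# Hypothetical flicker frequencies (Hz) for the four phrase options;
# the actual frequencies used in the Paramusical Ensemble are not published here.
FLICKER_HZ = [6.0, 7.5, 8.57, 10.0]
FS = 250          # assumed EEG sampling rate (samples per second)
WINDOW_S = 4      # assumed analysis window length in seconds

def band_power(signal, fs, target_hz, half_width=0.2):
    """Spectral power within +/- half_width Hz of target_hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = np.abs(freqs - target_hz) <= half_width
    return spectrum[mask].sum()

def classify_selection(eeg_window, fs=FS, candidates=FLICKER_HZ):
    """Return the index of the candidate flicker frequency with the
    strongest response, i.e. the phrase the participant is attending to."""
    powers = [band_power(eeg_window, fs, f) for f in candidates]
    return int(np.argmax(powers))

if __name__ == "__main__":
    # Simulate a participant attending to option 2 (8.57 Hz flicker):
    # a sinusoid at that frequency buried in random noise.
    t = np.arange(WINDOW_S * FS) / FS
    rng = np.random.default_rng(0)
    eeg = np.sin(2 * np.pi * 8.57 * t) + 0.5 * rng.standard_normal(t.size)
    print(classify_selection(eeg))  # prints 2
```

In a real system the detected index would then trigger dispatch of the corresponding phrase to the relevant member of the string quartet; robust SSVEP classifiers typically also use harmonics and multi-channel methods rather than a single-channel FFT.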
Dr Julian O'Kelly, RHN Research Fellow, said: "It's still early days in terms of the widespread application of the technology, but for many of the people that we work with every day at the RHN, this could be a very exciting development. It has the potential to really enhance their ability to get involved in the live composition and performance of music. As a practising music therapist working in brain injury rehabilitation, I'm really looking forward to seeing where this technology will take us."