Once again with feeling: Australian science tugs heart-strings

Do humans really wear their hearts on their sleeve? An ambitious Australian neuroscience project aiming to translate emotional impulses directly into music is hoping to find out.

Canadian artist Erin Gee describes it as "human voices in electronic bodies", and there is a definite futuristic feel to her collaboration with the University of Western Sydney.

A fingerprint scan is required to gain entry to the labs where her first subject, Ben Schultz, 27, is strapped to a bed, connected via a complex of wires to monitors not unlike those seen in a hospital.

Neurophysiologist Vaughan Macefield adjusts a needle attached to a wire feeding directly into Schultz's leg, listening carefully for changes in the crackling from a speaker in the corner.

"That's the sound that's being picked up from the ," Gee explains. "That's the translation of what's happening electrically."

Schultz said the needle was uncomfortable when it was moved but was not painful.

Tapping into a very precise part of the nerve will allow Macefield to eavesdrop directly on the brain's signals to the body as Schultz is shown a series of images designed to elicit emotion, such as mutilation and erotica.

And that is where the music begins.

"While we cannot read Ben's mind and tell you why he's feeling emotions, the technology exists today that we can actually definitively tell you that he is feeling emotions, and we can tell you exactly how much emotion he's feeling," Gee told AFP.

"I can bottle Ben's emotions and save them for later."

Along with the nerve reading, Schultz's breathing rate, skin sweat and heart activity are being recorded and fed into Gee's computer, where custom-made software converts them into a chorus of chimes and bells.
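The article gives no technical detail of how Gee's custom software turns these readings into sound. Purely as an illustration, and with the signal names, ranges and mapping rules below being assumptions rather than a description of the actual system, a simple biosignal-to-music mapping could look something like this Python sketch:

    # Hypothetical sketch: mapping physiological readings to simple musical
    # parameters. The signals, ranges and rules are illustrative assumptions,
    # not a description of Gee's software.

    def scale(value, lo, hi, out_lo, out_hi):
        """Linearly rescale a reading from [lo, hi] to [out_lo, out_hi]."""
        value = max(lo, min(hi, value))  # clamp to the expected range
        return out_lo + (value - lo) * (out_hi - out_lo) / (hi - lo)

    def biosignals_to_music(heart_rate, breath_rate, skin_conductance):
        """Convert one snapshot of readings into pitch, tempo and loudness."""
        pitch = int(scale(skin_conductance, 1.0, 20.0, 48, 84))  # MIDI note
        tempo = scale(heart_rate, 50, 120, 60, 160)              # beats per minute
        gain = scale(breath_rate, 8, 30, 0.2, 1.0)               # 0..1 loudness
        return {"pitch": pitch, "tempo_bpm": tempo, "gain": gain}

    if __name__ == "__main__":
        # Example snapshot: elevated heart rate and sweat response.
        print(biosignals_to_music(heart_rate=95, breath_rate=18, skin_conductance=12.5))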

The experiment will be repeated with several other subjects so Gee and Macefield can fine-tune their methods and sounds for a live symphony performance that promises to be unlike any attempted before.

Two actors attached to the various monitors will perform an "emotional score" -- Gee is not quite sure what it will look like yet, but it will require them to summon a series of emotions.

The music their feelings produce -- "what happiness sounds like", for instance -- will be performed by small robotic pianos that will also flash lights as different moods are detected.

The team has chosen actors as subjects because they routinely need to manifest emotion on demand.

"It will be like seeing someone expertly playing their emotions as they would play a cello," said Gee, whose first show is scheduled for Montreal next year.

Macefield said the research would feed into the field of "affective computing", which deals with machines that can understand and respond to human inputs.

Computers that can connect directly to the brain, allowing users to search for information simply by thinking about it, are currently in development, and Macefield said he was interested in how such machines could help people.

Many mental illnesses and disorders are associated with heightened or blunted emotional responses, and Macefield said the technology could have therapeutic benefits.

Children with autism spectrum disorders, for example, struggle to understand the emotions of others or to express themselves, and Macefield said Gee's robotic technology could be used to teach them to identify feelings by externalising and exaggerating them into forms such as music.

"It may well be that by amplifying people's emotions they can read them better; it may be that by amplifying their own emotions that people can read them better," he said.

(c) 2012 AFP
