Training computers to understand the human brain

October 8, 2012, Tokyo Institute of Technology
Image caption: Activation maps of the two contrasts (hot color: mammal > tool; cool color: tool > mammal) computed from the 10 datasets of the participants.

Understanding how the human brain categorizes information through signs and language is a key part of developing computers that can 'think' and 'see' in the same way as humans. Hiroyuki Akama at the Graduate School of Decision Science and Technology, Tokyo Institute of Technology, together with co-workers in Yokohama, the USA, Italy and the UK, has completed a study using fMRI datasets to train a computer to predict the semantic category of an image originally viewed by five different people.

The participants were asked to look at pictures of animals and hand tools together with an auditory or written (orthographic) description, and to silently 'label' each pictured object with certain properties while undergoing an fMRI brain scan. The resulting scans were analyzed using algorithms that identified patterns relating to the two separate semantic groups (animal or tool).

After 'training' the algorithms in this way on some of the auditory session data, the computer correctly identified the remaining scans 80-90% of the time. Similar results were obtained with the orthographic session data. A cross-modal approach, namely training the computer using auditory data but testing it on orthographic data, reduced performance to 65-75%. Continued research in this area could lead to systems that allow people to speak through a computer simply by thinking about what they want to say.

Understanding how the human brain categorizes information through signs and language is a key part of developing computers that can 'think' and 'see' in the same way as humans. It is only in recent years that the field of semantics has been explored through the analysis of brain scans and brain activity in response to both language-based and visual inputs. Teaching computers to read brain scans and interpret the language encoded in brain activity could have a variety of uses in medical science and beyond.

Now, Hiroyuki Akama at the Graduate School of Decision Science and Technology, Tokyo Institute of Technology, together with co-workers in Yokohama, the USA, Italy and the UK, has completed a study using fMRI datasets to train a computer to predict the semantic category of an image originally viewed by five different people.

The five participants in the project were shown two sets of forty randomly arranged pictures during the experiment. The pictures came from two distinct categories – either an animal or a hand tool. In the first session, twenty images of animals and twenty of hand tools were accompanied by the spoken Japanese name of each object (auditory). In the second session – shown to the participants several days later – the same forty images, again randomly ordered, were accompanied by Japanese written characters (orthographic). Each participant was asked to silently 'label' each image with properties they associated with that object in their mind.

During each session, the participants were scanned using fMRI technology. This provided Akama and his team with 240 individual scans showing brain activity for each session. The researchers analyzed the brain scans using a technique called multi-voxel pattern analysis (MVPA), which uses computer algorithms to identify repeating patterns of activity across voxels, the cube-shaped elements that make up the 3D scan images. Interestingly, animal pictures tended to induce activity in the visual part of the brain, whereas tool pictures triggered a response more from sensory-motor areas – a phenomenon reported in previous studies.
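To make the idea concrete, here is a minimal sketch of how scans become inputs for such pattern-finding algorithms, assuming made-up array shapes and random data rather than the study's actual preprocessing:

```python
# Sketch of the data representation behind MVPA (synthetic, illustrative):
# each 3D scan is flattened into one row of voxel intensities, producing a
# (scans x voxels) matrix that standard pattern classifiers can work with.
import numpy as np

rng = np.random.default_rng(0)
n_scans, nx, ny, nz = 240, 10, 10, 5              # 240 scans; toy 10x10x5 voxel grid
volumes = rng.normal(size=(n_scans, nx, ny, nz))  # fake fMRI volumes

X = volumes.reshape(n_scans, -1)  # one voxel-pattern vector per scan
print(X.shape)                    # (240, 500)
```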

The MVPA results were then used to find out whether the computer could predict, by looking at the patterns in the scans, if the participants were viewing an animal or a hand tool image.

Several different tests were given to the computer. For example, after training the machine to recognize patterns related to 'animals' and 'tools' in some of the auditory session data, the computer correctly identified the remaining auditory scans as animal or tool 80-90% of the time. The computer found the auditory data easier to predict, although it had a very similar success rate when identifying the orthographic session data.
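A rough sketch of this train-then-test setup, assuming synthetic voxel patterns in place of the study's real data:

```python
# Within-session decoding sketch: train a linear classifier on part of a
# session's scans and score it on the held-out remainder. The voxel
# patterns and labels here are synthetic placeholders, not the study's data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(240, 500))   # one voxel-pattern row per scan
y = rng.integers(0, 2, size=240)  # 0 = animal, 1 = tool

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

On random data like this the score sits near chance (about 50%); genuinely informative voxel patterns are what supported the 80-90% rates quoted above.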

Akama and his team then decided to try a cross-modal approach, namely training the computer using one session's data set but testing it using the other. As might be expected, the activation patterns for the auditory and orthographic sessions differed, as people think in different ways when listening and reading. However, the computer suffered an even stronger performance penalty than anticipated, with success rates down to 65-75%. The exact reasons for this are unclear, although the researchers point to a combination of timing differences (the time taken for the participants to respond to written as opposed to auditory information) and spatial differences (the anatomy of the individuals' brains differing slightly and thereby affecting the voxel distributions).
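The cross-modal variant can be sketched the same way, fitting on one session and scoring on the other; the arrays below are again placeholders, and with real data the two sessions' scans would first have to be aligned to a shared voxel space:

```python
# Cross-modal decoding sketch: train on the 'auditory' session's patterns,
# then test on the 'orthographic' session's. All data are synthetic
# placeholders standing in for preprocessed, spatially aligned scans.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
X_aud = rng.normal(size=(240, 500))   # auditory-session voxel patterns
y_aud = rng.integers(0, 2, size=240)  # 0 = animal, 1 = tool
X_ort = rng.normal(size=(240, 500))   # orthographic-session voxel patterns
y_ort = rng.integers(0, 2, size=240)

clf = LogisticRegression(max_iter=1000).fit(X_aud, y_aud)
print(f"cross-modal accuracy: {clf.score(X_ort, y_ort):.2f}")
```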

One future application of experiments such as this could be the development of real-time brain-computer interfaces. Such devices could allow patients with communication impairments to speak through a computer simply by thinking about what they want to say.

More information: H. Akama et al., "Decoding semantics across fMRI sessions with different stimulus modalities: a practical MVPA study," Frontiers in Neuroinformatics 6:24 (2012). DOI: 10.3389/fninf.2012.00024
