Todd Kuiken (L), Director of the Center for Bionic Medicine and Director of Amputee Services at The Rehabilitation Institute of Chicago, explains the bionic arm worn by Glen Lehman (R), a retired sergeant first class in the United States Army.

A bionic prosthetic arm that is controlled by its operator's thoughts and feels like the amputee's lost limb went on display Thursday at a major US science conference.

More than 50 amputees worldwide, many of them military veterans whose limbs were lost in combat, have received such devices since they were first developed by US doctor Todd Kuiken in 2002.

The arm uses technology called Targeted Muscle Reinnervation (TMR), which works by rerouting brain signals from nerves that were severed in the injury to muscles that are still intact and working.

"What we do is use the nerves that are still left," Kuiken said. "Muscle becomes the biological amplifier."

Glen Lehman, a retired US military sergeant who lost his arm in Iraq, demonstrated the latest technology at the annual conference of the American Association for the Advancement of Science in Washington.

"It feels great, if feels intuitive. It is a lot better than the other prosthetic I have now," said Lehman, whose forearm and elbow were blown off in a Baghdad grenade attack in 2008.

"The other one is still controlled by muscle impulse, you just flex muscle to make it move, it is not intuitive. This arm is more trained to me, whereas the other arm I had to train to it," he said.

"It does feel like my own hand."

Lehman demonstrated for reporters how he could pinch his finger and thumb together, lift his forearm and bend his elbow, and turn his wrist just by thinking about those actions.

Kuiken said more advances, such as the ability to transfer some sensation to the limb, are being studied in the lab but have not yet made it to patients.

Other drawbacks include the inability to sense how hard the battery-powered prosthetic hand is squeezing, but Kuiken said scientists are working on ways to improve the technology with added sensors.

"Our goal would be to put sensors in the prosthesis to, for example, know how hard you are squeezing and then bring that up and have a device squeeze on this area (of the bicep) so the patient has an idea of how hard he is squeezing."

Glen Lehman (L), a retired sergeant first class in the United States Army who received Targeted Muscle Reinnervation (TMR) surgery after he lost his arm in Iraq, stands with his bionic arm next to LTC Martin Baechler, M.D., a surgeon at Walter Reed Army Medical Center, during a presentation of the latest in TMR, a bionic limb technology.

Kuiken said the team has encountered some technological "challenges" that have slowed progress but is "excited about moving forward."

A series of other efforts to test and improve on this kind of mind-reading technology, known as brain-computer interfaces, was also showcased at the conference.

Among them: how researchers can now place computer chips on the surface of the brain to interpret neural activity, potentially allowing spinal cord injury patients to control a range of devices, from computer games to prosthetics.

Someday, bedridden patients may be able to wear a special electronic cap that lets them maneuver a rolling robot carrying a video camera, so they can join the dinner conversation without leaving the bedroom.

But the stunning technology is anything but easy work for the patients.

According to José del R. Millán and his team at the École Polytechnique Fédérale de Lausanne in Switzerland, in a "typical brain-computer interface (BCI) set-up," users send mental messages of either left, right, or no-command.

"But it turns out that no-command is very taxing to maintain and requires extreme concentration. After about an hour, most users are spent. Not much help if you need to maneuver that wheelchair through an airport," his team said in a statement.

So now researchers are figuring out how to hook up a machine to interpret a user's brain signals and read their intent.

Users are asked to read or speak aloud while delivering as many left, right or no-command signals as possible. The technology learns to sift through the fray and figure out when a command has actually been delivered.

The result "makes multitasking a reality while at the same time allows users to catch a break."