
Saturday, 9 February 2008

Brain signals make robot walk!

Monkey’s Thoughts Propel Robot, a Step That May Help Humans

On Thursday, Idoya, a 12-pound, 32-inch monkey, made a 200-pound, 5-foot humanoid robot walk on a treadmill using only her brain activity.

She was in North Carolina, and the robot was in Japan.

It was the first time that brain signals had been used to make a robot walk, said Dr. Miguel A. L. Nicolelis, a neuroscientist at Duke University whose laboratory designed and carried out the experiment.

In 2003, Dr. Nicolelis’s team proved that monkeys could use their thoughts alone to control a robotic arm for reaching and grasping.

These experiments, Dr. Nicolelis said, are the first steps toward a brain machine interface that might permit paralyzed people to walk by directing devices with their thoughts. Electrodes in the person’s brain would send signals to a device worn on the hip, like a cell phone or pager, that would relay those signals to a pair of braces, a kind of external skeleton, worn on the legs.

“When that person thinks about walking,” he said, “walking happens.”

A brain machine interface is any system that allows people or animals to use their brain activity to control an external device. But until ways are found to safely implant electrodes into human brains, most research will remain focused on animals.

In preparing for the experiment, Idoya was trained to walk upright on a treadmill. She held onto a bar with her hands and got treats — raisins and Cheerios — as she walked at different speeds, forward and backward, for 15 minutes a day, 3 days a week, for 2 months.

Meanwhile, electrodes implanted in the so-called leg area of Idoya’s brain recorded the activity of 250 to 300 neurons that fired while she walked. Some neurons became active when her ankle, knee and hip joints moved. Others responded when her feet touched the ground. And some fired in anticipation of her movements.

To obtain a detailed model of Idoya’s leg movements, the researchers also painted her ankle, knee and hip joints with fluorescent stage makeup and, using a special high speed camera, captured her movements on video.

The video and brain cell activity were then combined and translated into a format that a computer could read. The resulting model can predict, with 90 percent accuracy, all permutations of Idoya’s leg movements three to four seconds before the movement takes place.
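(For anyone curious, here is a rough, purely illustrative sketch of how a decoder like this might look: a linear model, fit by ridge regression, that maps binned firing rates of the recorded neurons to joint angles a fixed number of bins into the future. The article does not describe the lab’s actual algorithm, so the data, dimensions and regression approach below are all assumptions made for illustration.)

```python
import numpy as np

# Toy sketch of a linear decoder: map binned firing rates of N neurons
# to leg-joint angles a short time into the future. This is NOT the
# Duke lab's actual algorithm, just an illustration of the general idea.

rng = np.random.default_rng(0)

n_neurons = 280   # roughly the 250-300 neurons the article mentions
n_samples = 5000  # time bins of recorded walking (made up)
n_joints = 6      # e.g. hip, knee, ankle for both legs (assumed)
lag = 30          # predict this many bins ahead (stand-in for the 3-4 s horizon)

# Fake training data standing in for recorded firing rates and motion capture.
rates = rng.poisson(5.0, size=(n_samples, n_neurons)).astype(float)
true_weights = rng.normal(size=(n_neurons, n_joints))
joint_angles = rates @ true_weights + rng.normal(scale=2.0, size=(n_samples, n_joints))

# Align firing rates at time t with joint angles at time t + lag,
# so the fitted model predicts movement before it happens.
X = rates[:-lag]
Y = joint_angles[lag:]

# Ridge-regularized least squares: W = (X^T X + lambda*I)^-1 X^T Y
lam = 1.0
W = np.linalg.solve(X.T @ X + lam * np.eye(n_neurons), X.T @ Y)

# Decode: given the latest bin of firing rates, predict future joint angles.
latest_rates = rates[-1]
predicted_angles = latest_rates @ W
print(predicted_angles)
```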

On Thursday, an alert and ready-to-work Idoya stepped onto her treadmill and began walking at a steady pace with electrodes implanted in her brain. Her walking pattern and brain signals were collected, fed into the computer and transmitted over a high-speed Internet link to a robot in Kyoto, Japan.
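(Again just as an illustration: the article does not say how the decoded commands were packaged or which protocol the Duke-to-Kyoto link used. The sketch below streams fake joint-angle frames over UDP to a placeholder address; the host, port, message format and update rate are all invented for the example.)

```python
import json
import socket
import time

# Toy sketch of the streaming step: send freshly decoded joint commands
# across the network to a remote robot controller. Protocol, address and
# message format are placeholders, not the real Duke-ATR setup.

ROBOT_HOST = "127.0.0.1"  # placeholder; the real link went to a robot in Kyoto
ROBOT_PORT = 9000         # hypothetical port


def stream_commands(decoded_frames, rate_hz=50):
    """Send one decoded frame of joint angles per tick to the robot."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    period = 1.0 / rate_hz
    for frame in decoded_frames:
        packet = json.dumps({"timestamp": time.time(), "joint_angles": frame})
        sock.sendto(packet.encode("utf-8"), (ROBOT_HOST, ROBOT_PORT))
        time.sleep(period)  # crude pacing; a real system would follow the decoder clock
    sock.close()


if __name__ == "__main__":
    # Example: stream a few fake frames (hip, knee, ankle angles for both legs).
    fake_frames = [[0.1, 0.2, -0.1, 0.1, 0.25, -0.05]] * 10
    stream_commands(fake_frames)
```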

The robot, called CB for Computational Brain, has the same range of motion as a human. It can dance, squat, point and “feel” the ground with sensors embedded in its feet, and it will not fall over when shoved.

Designed by Gordon Cheng and colleagues at the ATR Computational Neuroscience Laboratories in Kyoto, the robot was chosen for the experiment because of its extraordinary ability to mimic human locomotion.

As Idoya’s brain signals streamed into CB’s actuators, her job was to make the robot walk steadily via her own brain activity. She could see the back of CB’s legs on an enormous movie screen in front of her treadmill and received treats if she could make the robot’s joints move in synchrony with her own leg movements.

As Idoya walked, CB walked at exactly the same pace. Recordings from Idoya’s brain revealed that her neurons fired each time she took a step and each time the robot took a step.

When Idoya’s brain signals made the robot walk, some neurons in her brain controlled her own legs, whereas others controlled the robot’s legs. The latter set of neurons had basically become attuned to the robot’s legs after about an hour of practice and visual feedback.

Idoya cannot talk, but her brain signals revealed that after the treadmill stopped, she was able to make CB walk for three full minutes by attending to its legs and not her own.

Vision is a powerful, dominant signal in the brain, Dr. Nicolelis said. Idoya’s motor cortex, where the electrodes were implanted, had started to absorb the representation of the robot’s legs — as if they belonged to Idoya herself.
Source: The New York Times

My comment: I felt like crying on this one, it's so nice. The most amazing part is that they made a monkey move the robot's legs, a monkey that most probably has limited visualization abilities. And what if it were a human? This technology is absolutely amazing. Of course, it will require knowledge of the right spots in the human brain, but in the end, it could give so much to humankind. Well, actually I would prefer it if we managed to grow body parts for replacement, but until then, it's absolutely necessary to give people with severed or missing limbs a way to regain their use. And I guess this could lead to brain-controlled technology, which may have some nice applications in factories. After all, the delicate movement of hands controlling a robot arm may not be delicate enough, while the brain has immense capabilities in that direction.
