Building a Future on Science; February 2008; Scientific American Magazine; by Christine Soares
In a tiny, darkened room on the Duke University campus, Miguel Nicolelis looks on approvingly while a pair of students monitors data streaming across computer screens. The brightly colored dashes and spikes reflect the real-time brain activity of a rhesus macaque named Clementine, who is walking at a leisurely pace on a little treadmill in the next room. Staticky pops coming from a speaker on a back wall are the amplified sound of one of her neurons firing. "This is the most beautiful music you can hear from the brain," Nicolelis declares with a smile.
The run-through is preparation for the next big demonstration of work toward mind-controlled human prosthetics, work that first garnered worldwide headlines for Nicolelis and his team in 2003. Back then, the group showed that it could listen in on the brain signals generated by a monkey using a joystick to play a video game and translate that biological code into commands for a mechanical arm to perform the same motions. Now the group intends to make robotic legs walk under commands from the motor cortex of a monkey strolling along like Clementine. This time the scientists also want to feed sensor data from the robot's feet into the monkey's brain, so she can "feel" the mechanical legs' strides as though they were her own. To raise the stakes still further, the monkey will be at Duke in North Carolina, but the robotic legs will be half a world away at the Advanced Telecommunications Research Institute International in Kyoto, Japan.