Source: The New York Times
Monkey’s Thoughts Propel Robot, a Step That May Help Humans
By SANDRA BLAKESLEE
If Idoya could talk, she would have plenty to boast about.
On Thursday, the 12-pound, 32-inch monkey made a 200-pound, 5-foot humanoid robot walk on a treadmill using only her brain activity.
She was in North Carolina, and the robot was in Japan.
It was the first time that brain signals had been used to make a robot walk, said Dr. Miguel A. L. Nicolelis, a neuroscientist at Duke University whose laboratory designed and carried out the experiment.
In 2003, Dr. Nicolelis’s team proved that monkeys could use their thoughts alone to control a robotic arm for reaching and grasping.
These experiments, Dr. Nicolelis said, are the first steps toward a brain machine interface that might permit paralyzed people to walk by directing devices with their thoughts. Electrodes in the person’s brain would send signals to a device worn on the hip, like a cell phone or pager, that would relay those signals to a pair of braces, a kind of external skeleton, worn on the legs.
“When that person thinks about walking,” he said, “walking happens.”
Richard A. Andersen, an expert on such systems at the California Institute of Technology in Pasadena who was not involved in the experiment, said that it was “an important advance to achieve locomotion with a brain machine interface.”
Another expert, Nicho Hatsopoulos, a professor at the University of Chicago, said that the experiment was “an exciting development. And the use of an exoskeleton could be quite fruitful.”
A brain machine interface is any system that allows people or animals to use their brain activity to control an external device. But until ways are found to safely implant electrodes into human brains, most research will remain focused on animals.
In preparing for the experiment, Idoya was trained to walk upright on a treadmill. She held onto a bar with her hands and got treats — raisins and Cheerios — as she walked at different speeds, forward and backward, for 15 minutes a day, three days a week, for two months.
Meanwhile, electrodes implanted in the so-called leg area of Idoya’s brain recorded the activity of 250 to 300 neurons that fired while she walked. Some neurons became active when her ankle, knee and hip joints moved. Others responded when her feet touched the ground. And some fired in anticipation of her movements.
To obtain a detailed model of Idoya’s leg movements, the researchers also painted her ankle, knee and hip joints with fluorescent stage makeup and, using a special high speed camera, captured her movements on video.
The video and brain cell activity were then combined and translated into a format that a computer could read. The resulting model was able to predict, with 90 percent accuracy, all permutations of Idoya’s leg movements three to four seconds before they took place.
On Thursday, an alert and ready-to-work Idoya stepped onto her treadmill and began walking at a steady pace with electrodes implanted in her brain. Her walking pattern and brain signals were collected, fed into the computer and transmitted over a high-speed Internet link to a robot in Kyoto, Japan.
The robot, called CB for Computational Brain, has the same range of motion as a human. It can dance, squat, point and “feel” the ground with sensors embedded in its feet, and it will not fall over when shoved.
Designed by Gordon Cheng and colleagues at the ATR Computational Neuroscience Laboratories in Kyoto, the robot was chosen for the experiment because of its extraordinary ability to mimic human locomotion.
As Idoya’s brain signals streamed into CB’s actuators, her job was to make the robot walk steadily via her own brain activity. She could see the back of CB’s legs on an enormous movie screen in front of her treadmill and received treats if she could make the robot’s joints move in synchrony with her own leg movements.
As Idoya walked, CB walked at exactly the same pace. Recordings from Idoya’s brain revealed that her neurons fired each time she took a step and each time the robot took a step.
“It’s walking!” Dr. Nicolelis said. “That’s one small step for a robot and one giant leap for a primate.”
The signals from Idoya’s brain sent to the robot, and the video of the robot sent back to Idoya, were relayed in less than a quarter of a second, he said. That was so fast that the robot’s movements meshed with the monkey’s experience.
An hour into the experiment, the researchers pulled a trick on Idoya. They stopped her treadmill. Everyone held their breath. What would Idoya do?
“Her eyes remained focused like crazy on CB’s legs,” Dr. Nicolelis said.
She got treats galore. The robot kept walking. And the researchers were jubilant.
When Idoya’s brain signals made the robot walk, some neurons in her brain controlled her own legs, whereas others controlled the robot’s legs. The latter set of neurons had basically become attuned to the robot’s legs after about an hour of practice and visual feedback.
Idoya cannot talk but her brain signals revealed that after the treadmill stopped, she was able to make CB walk for three full minutes by attending to its legs and not her own.
Vision is a powerful, dominant signal in the brain, Dr. Nicolelis said. Idoya’s motor cortex, where the electrodes were implanted, had started to absorb the representation of the robot’s legs — as if they belonged to Idoya herself.
In earlier experiments, Dr. Nicolelis found that 20 percent of cells in a monkey’s motor cortex were active only when a robotic arm moved. He said it meant that tools like robotic arms and legs could be assimilated via learning into an animal’s body representation.
In the near future, Idoya and other bipedal monkeys will be getting more feedback from CB in the form of microstimulation to neurons that specialize in the sense of touch related to the legs and feet. When CB’s feet touch the ground, sensors will detect pressure and calculate balance. When that information goes directly into the monkeys’ brains, Dr. Nicolelis said, they will have the strong impression that they can feel CB’s feet hitting the ground.
At that point, the monkeys will be asked to make CB walk across a room by using just their thoughts.
“We have shown that you can take signals across the planet in the same time scale that a biological system works,” Dr. Nicolelis said. “Here the target happens to be a robot. It could be a crane. Or any tool of any size or magnitude. The body does not have a monopoly for enacting the desires of the brain.”
To prove this point, Dr. Nicolelis and his colleague, Dr. Manoel Jacobsen Teixeira, a neurosurgeon at the Sirio-Lebanese Hospital in São Paulo, Brazil, plan to demonstrate by the end of the year that humans can operate an exoskeleton with their thoughts.
It is not uncommon for people to have their arms ripped from their shoulder sockets during a motorcycle or automobile accident, Dr. Nicolelis said. All the nerves are torn, leaving the arm paralyzed but in chronic pain.
Dr. Teixeira is implanting electrodes on the surface of these patients’ brains and stimulating the underlying region where the arm is represented. The pain goes away.
By pushing the same electrodes slightly deeper in the brain, Dr. Nicolelis said, it should be possible to record brain activity involved in moving the arm and intending to move the arm. The patients’ paralyzed arms will then be placed into an exoskeleton or shell equipped with motors and sensors.
“They should be able to move the arm with their thoughts,” he said. “This is science fiction coming to life.”
1/15/2008
Thought Experiment - Monkey Brain Makes Robot Walk
Posted by Robofriend at 4:52 AM 0 comments
1/14/2008
2007 in Robots
Article source: Scientific American
In 2007, our artificially intelligent companions moved closer to replacing us on the battlefield, improving healthcare (on Earth and in space) and even befriending our children
By Larry Greenemeier
Last week's announcement of Japan's "Robot of the Year" for 2007—a mechanical arm capable of grabbing 120 items per minute from a conveyor belt—marked an anticlimactic end to what has otherwise been a good year in the advancement of artificial intelligence.
The three Fanuc Ltd. assembly-line mechanical arms—which beat out competitors such as Fujitsu's 24-inch-tall (61-centimeter) dancing humanoid HOAP and Komatsu Ltd.'s tank-shaped, fire-extinguishing robot—won for their practicality; they are optimized to work efficiently and accurately on food and pharmaceutical manufacturing lines.
Still, 2007 offered plenty of other significant, if less heralded (and less immediately useful), developments that pushed robotic technology to new levels, or at least promised to in the near future.
As part of NASA's plans to send manned missions back to the moon (and then on to Mars), the space agency, in September, performed a series of tests to determine if robotic technology could be used to provide medical care for astronauts during extended spaceflights. On board a military C-9 aircraft flying in parabolic arcs over the Gulf of Mexico, four surgeons and four astronauts performed simulated surgery both by hand and using a robotic device developed by SRI International to determine if the robot's software could compensate for errors in movement caused by turbulence and varying gravitational conditions.
The U.S. Department of Defense continued its quest to develop autonomous robotic technology that will eventually take the place of human soldiers in battle. In November, the Defense Advanced Research Projects Agency (DARPA) hosted its 2007 DARPA Urban Challenge, a competition that tested the driving prowess of experimental driverless autos. "Boss," an SUV put together by a team including gearheads from Carnegie Mellon University, General Motors Corporation, Caterpillar and Continental AG, drove away with the $2 million grand prize. (The second- and third-place finishers were, respectively, built by groups at Stanford University and Virginia Polytechnic Institute and State University.) Boss maintained an average speed of 14 miles per hour throughout the 55-mile course at the former George Air Force Base in Victorville, Calif., which was built to resemble an urban layout. The autonomous vehicles demonstrated their abilities by changing lanes, merging onto roadways amidst fast-moving traffic and traversing busy intersections—using only sensors, global positioning systems and computers.
During the tragedy at Utah's Crandall Canyon mine in August, when six miners and three rescuers perished in a mine collapse and subsequent rescue attempt, rescuers learned valuable lessons about the capabilities and limitations of robotic equipment. A robot crawler was sent 1,500 feet (457 meters) through a borehole into the mine, located about 120 miles (193 kilometers) south of Salt Lake City, after it crumbled in a cave-in so powerful that it registered a magnitude of 3.9 on the Richter scale. Workers, handicapped by time constraints and the continued shifting of the mountain's mass, managed to get the crawler to the mine's floor but were bogged down by debris and unable to retrieve the device, which remains trapped 52 feet (16 meters) below the mountain's surface.
Other robots helped us learn about ourselves. In November, University of California, San Diego, researchers reported in Proceedings of the National Academy of Sciences USA that "current robot technology is surprisingly close to achieving autonomous bonding and socialization with human toddlers for significant periods of time." QRIO, another two-foot (61-centimeter) humanoid, was placed in UC San Diego's Early Childhood Education Center and programmed to wave, dance, sit and stand, among other functions. Children aged 18 to 24 months quickly warmed to the machine and began to treat it more like a peer than an object.
Earlier this month, Toyota unveiled the latest in its line of "partner robots": the aptly named Violin-playing Robot, which can hold the string instrument in place with its left hand and move the bow with its right hand to produce music. The roughly 5-foot, 123-lb. (1.5-meter, 56-kg) humanoid joins walking and rolling robots the company introduced in 2004, which are capable of playing the trumpet. Toyota rival Honda also this month introduced advancements to its humanoid, ASIMO (first introduced in 2005), that allow it to perform tasks such as carrying a tray and pushing a cart while simultaneously employing an eye camera to detect the speed and direction of humans and other ASIMOs to avoid collisions. The new ASIMO also knows when its battery levels are low and will automatically return to its base to recharge.
When not serving tea, robots aided scientists in understanding insect behavior. Researchers at the Free University of Brussels (U.L.B.) in Belgium, École Polytechnique Fédérale de Lausanne (E.P.F.L.) in Switzerland and the University of Rennes in France, along with other European educational institutions, reported in Science that they successfully introduced a few autonomous robots (boxy in design, but made to smell like the real deal) into cockroach communities. The robots were able to alter the collective decision-making process of the group and trigger new behavior patterns in the bugs. The findings show that it may be possible to use robots to study and control how groups of animals from insects to vertebrates interact.
And further proving there is no limit to what robotic technology can accomplish, a Web video recently surfaced featuring a mechanical device that can not only open a beer bottle but can follow that feat with a proper pour. (Note how the cup is held at an angle; many humans have yet to master this technique.)
This sampling merely scratches the surface of the past year's advances in robotics that whet the appetite for what's to come: Early next year, for instance, researchers at the University of Colorado at Boulder will test robotic devices that precisely mix and measure medications used in treatments such as chemotherapy. The robotic Mars rovers Opportunity and Spirit are currently hunkering down in anticipation of the harsh Martian winter season but will soon resume their exploration of the Red Planet. And Scandinavian research firm Sintef is developing artificially intelligent equipment to help offshore oil and gas drilling platforms run more safely and efficiently.
In all, 2008 promises continued progress in the area of artificial intelligence, although it will still be a while before humankind reaches the point where it cannot live without the robots it has created.
Posted by Robofriend at 1:38 PM 0 comments
1/13/2008
Fast-Growing Robot Market
Educational and Entertainment Robot Market Strategy, Market Shares, and Market Forecasts, 2008-2014 - a report by Electronics.ca Publications (electronics industry market research and knowledge network)
Robots used in education and entertainment provide modular motion and remote sensor systems that offer flexibility. The educational kits are designed both for pure fun and for educational competitions in which students put modules together in innovative ways to create designs that work.
Robots are set to provide more variety in entertainment. The robotic ability to sing, dance and fight offers endless new modes of entertainment as people organize their robots creatively. According to the new report, educational robots stimulate innovation, and the modular systems available to students in the robotics community are expected to stimulate creativity.
The modularity of robot kits makes them versatile and flexible. Modules can be put together in a variety of ways, giving users choices about what functionality the robot will have. Robot competitions are team efforts where groups join together to build a robot that performs in a particular manner meeting the requirements of the particular competition.
Educational robots are used by students at every level, with kits geared to various age and skill levels, and robotics competitions are being held for every age level. Students do not yet receive formal education on robots; they are more likely to enter competitions as clubs representing different educational institutions.
Humanoid robots and innovatively shaped robots are finding a place in homes and offices, providing information and communications as well as automated locomotion. The automated modules from which these robots are constructed provide elaborate capability. Markets for educational robotic kits, at 541,000 units in 2007, are anticipated to reach 35.8 million units by 2014. As prices come down and schools begin to institutionalize robotics programs, very fast growth is anticipated. Growth in low-end robotic kits starts to level off as demand increases for robots with more components and more functionality. Markets for educational robotic kits, at $27.5 million in 2007, are anticipated to reach $1.69 billion by 2014.
Robot entertainment and educational markets, at $184.9 million in 2007, are anticipated to reach $2.985 billion by 2014. Market growth is spurred by the evolution of a new technology useful in a range of industry segments. Educational and entertainment robots represent a first step in the evolution of the robotic markets because they provide the teaching aspect of the market that precedes any other market evolution in the services and mobility segments of consumer robotics.
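The report's 2007 and 2014 figures imply some aggressive growth rates. As a quick check, the compound annual growth rates can be computed directly from the numbers above (the rates are derived here for illustration; the report itself does not state them):

```python
def cagr(start, end, years):
    """Compound annual growth rate implied by start and end values."""
    return (end / start) ** (1 / years) - 1

# Figures from the report, 2007 -> 2014 (seven years of growth).
print(f"kit units:    {cagr(541_000, 35.8e6, 7):.0%}")   # roughly 82% a year
print(f"kit revenue:  {cagr(27.5e6, 1.69e9, 7):.0%}")    # roughly 80% a year
print(f"total market: {cagr(184.9e6, 2.985e9, 7):.0%}")  # roughly 49% a year
```

Note that unit growth slightly outpaces revenue growth, consistent with the report's expectation that prices come down as schools adopt robotics programs.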
Details of the new report can be found on Electronics.ca Publications' website.
Posted by Robofriend at 1:57 PM 0 comments
Hey chicks, check this: WowWee FemiSapien
WowWee is targeting women and girls. FemiSapien is a female humanoid robot that dances to music. She hears via an onboard microphone and can control other robots in WowWee's lineup. She also reacts to voice commands and even human touch.
Maybe RoboSapien will be dating soon. :)
FemiSapien will be available in late summer 2008 at a friendly price of $100.
Video:
Posted by Robofriend at 7:59 AM 0 comments