Welcome to Robot Info!

This blog is dedicated to robotics in general, offering easy access to a wide range of information on the topic.

1/15/2008

Thought Experiment - monkey brain makes robot walk


Source: The New York Times

Monkey’s Thoughts Propel Robot, a Step That May Help Humans
By SANDRA BLAKESLEE

If Idoya could talk, she would have plenty to boast about.

On Thursday, the 12-pound, 32-inch monkey made a 200-pound, 5-foot humanoid robot walk on a treadmill using only her brain activity.

She was in North Carolina, and the robot was in Japan.

It was the first time that brain signals had been used to make a robot walk, said Dr. Miguel A. L. Nicolelis, a neuroscientist at Duke University whose laboratory designed and carried out the experiment.

In 2003, Dr. Nicolelis’s team proved that monkeys could use their thoughts alone to control a robotic arm for reaching and grasping.

These experiments, Dr. Nicolelis said, are the first steps toward a brain machine interface that might permit paralyzed people to walk by directing devices with their thoughts. Electrodes in the person’s brain would send signals to a device worn on the hip, like a cell phone or pager, that would relay those signals to a pair of braces, a kind of external skeleton, worn on the legs.

“When that person thinks about walking,” he said, “walking happens.”

Richard A. Andersen, an expert on such systems at the California Institute of Technology in Pasadena who was not involved in the experiment, said that it was “an important advance to achieve locomotion with a brain machine interface.”

Another expert, Nicho Hatsopoulos, a professor at the University of Chicago, said that the experiment was “an exciting development. And the use of an exoskeleton could be quite fruitful.”

A brain machine interface is any system that allows people or animals to use their brain activity to control an external device. But until ways are found to safely implant electrodes into human brains, most research will remain focused on animals.

In preparing for the experiment, Idoya was trained to walk upright on a treadmill. She held onto a bar with her hands and got treats — raisins and Cheerios — as she walked at different speeds, forward and backward, for 15 minutes a day, 3 days a week, for 2 months.

Meanwhile, electrodes implanted in the so-called leg area of Idoya’s brain recorded the activity of 250 to 300 neurons that fired while she walked. Some neurons became active when her ankle, knee and hip joints moved. Others responded when her feet touched the ground. And some fired in anticipation of her movements.

To obtain a detailed model of Idoya’s leg movements, the researchers also painted her ankle, knee and hip joints with fluorescent stage makeup and, using a special high speed camera, captured her movements on video.

The video and brain cell activity were then combined and translated into a format that a computer could read. The resulting model could predict, with 90 percent accuracy, all permutations of Idoya’s leg movements three to four seconds before the movement took place.
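The article does not describe the decoding step in detail. As a rough illustration only, the kind of linear decoder commonly used in brain-machine interface research can be sketched as follows; the neuron count, joint set, and synthetic data are all assumptions, not the actual Duke/ATR pipeline.

```python
import numpy as np

# Illustrative sketch: a linear least-squares decoder mapping neural
# firing rates to joint angles. This is in the spirit of BMI decoders,
# not the actual implementation used in the experiment.

def fit_decoder(rates, kinematics):
    """Fit weights W so that kinematics ~ [rates, 1] @ W."""
    X = np.hstack([rates, np.ones((rates.shape[0], 1))])  # append bias column
    W, *_ = np.linalg.lstsq(X, kinematics, rcond=None)
    return W

def predict(rates, W):
    """Decode joint angles from firing rates using fitted weights."""
    X = np.hstack([rates, np.ones((rates.shape[0], 1))])
    return X @ W

# Toy demo: 60 "neurons", 3 joint angles (ankle, knee, hip), 400 samples.
rng = np.random.default_rng(0)
rates = rng.poisson(5.0, size=(400, 60)).astype(float)
true_map = rng.normal(size=(60, 3))
kinematics = rates @ true_map            # perfectly linear toy kinematics
W = fit_decoder(rates, kinematics)
max_err = np.abs(predict(rates, W) - kinematics).max()
```

In the real experiment the decoder also had to predict movement a few seconds ahead; with this kind of model that would be done by lagging the kinematics relative to the firing rates before fitting.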

On Thursday, an alert and ready-to-work Idoya stepped onto her treadmill and began walking at a steady pace with electrodes implanted in her brain. Her walking pattern and brain signals were collected, fed into the computer and transmitted over a high-speed Internet link to a robot in Kyoto, Japan.

The robot, called CB for Computational Brain, has the same range of motion as a human. It can dance, squat, point and “feel” the ground with sensors embedded in its feet, and it will not fall over when shoved.

Designed by Gordon Cheng and colleagues at the ATR Computational Neuroscience Laboratories in Kyoto, the robot was chosen for the experiment because of its extraordinary ability to mimic human locomotion.

As Idoya’s brain signals streamed into CB’s actuators, her job was to make the robot walk steadily via her own brain activity. She could see the back of CB’s legs on an enormous movie screen in front of her treadmill and received treats if she could make the robot’s joints move in synchrony with her own leg movements.

As Idoya walked, CB walked at exactly the same pace. Recordings from Idoya’s brain revealed that her neurons fired each time she took a step and each time the robot took a step.

“It’s walking!” Dr. Nicolelis said. “That’s one small step for a robot and one giant leap for a primate.”

The signals from Idoya’s brain sent to the robot, and the video of the robot sent back to Idoya, were relayed in less than a quarter of a second, he said. That was so fast that the robot’s movements meshed with the monkey’s experience.

An hour into the experiment, the researchers pulled a trick on Idoya. They stopped her treadmill. Everyone held their breath. What would Idoya do?

“Her eyes remained focused like crazy on CB’s legs,” Dr. Nicolelis said.

She got treats galore. The robot kept walking. And the researchers were jubilant.

When Idoya’s brain signals made the robot walk, some neurons in her brain controlled her own legs, whereas others controlled the robot’s legs. The latter set of neurons had basically become attuned to the robot’s legs after about an hour of practice and visual feedback.

Idoya cannot talk but her brain signals revealed that after the treadmill stopped, she was able to make CB walk for three full minutes by attending to its legs and not her own.

Vision is a powerful, dominant signal in the brain, Dr. Nicolelis said. Idoya’s motor cortex, where the electrodes were implanted, had started to absorb the representation of the robot’s legs — as if they belonged to Idoya herself.

In earlier experiments, Dr. Nicolelis found that 20 percent of cells in a monkey’s motor cortex were active only when a robotic arm moved. He said it meant that tools like robotic arms and legs could be assimilated via learning into an animal’s body representation.

In the near future, Idoya and other bipedal monkeys will be getting more feedback from CB in the form of microstimulation to neurons that specialize in the sense of touch related to the legs and feet. When CB’s feet touch the ground, sensors will detect pressure and calculate balance. When that information goes directly into the monkeys’ brains, Dr. Nicolelis said, they will have the strong impression that they can feel CB’s feet hitting the ground.

At that point, the monkeys will be asked to make CB walk across a room by using just their thoughts.

“We have shown that you can take signals across the planet in the same time scale that a biological system works,” Dr. Nicolelis said. “Here the target happens to be a robot. It could be a crane. Or any tool of any size or magnitude. The body does not have a monopoly for enacting the desires of the brain.”

To prove this point, Dr. Nicolelis and his colleague, Dr. Manoel Jacobsen Teixeira, a neurosurgeon at the Sirio-Lebanese Hospital in São Paulo, Brazil, plan to demonstrate by the end of the year that humans can operate an exoskeleton with their thoughts.

It is not uncommon for people to have their arms ripped from their shoulder sockets during a motorcycle or automobile accident, Dr. Nicolelis said. All the nerves are torn, leaving the arm paralyzed but in chronic pain.

Dr. Teixeira is implanting electrodes on the surface of these patients’ brains and stimulating the underlying region where the arm is represented. The pain goes away.

By pushing the same electrodes slightly deeper in the brain, Dr. Nicolelis said, it should be possible to record brain activity involved in moving the arm and intending to move the arm. The patients’ paralyzed arms will then be placed into an exoskeleton or shell equipped with motors and sensors.

“They should be able to move the arm with their thoughts,” he said. “This is science fiction coming to life.”


1/14/2008

2007 in Robots

Article source: Scientific American

In 2007, our artificially intelligent companions moved closer to replacing us on the battlefield, improving healthcare (on Earth and in space) and even befriending our children
By Larry Greenemeier

Last week's announcement of Japan's "Robot of the Year" for 2007—a mechanical arm capable of grabbing 120 items per minute from a conveyor belt—marked an anticlimactic end to what has otherwise been a good year in the advancement of artificial intelligence.

The three Fanuc Ltd. assembly-line mechanical arms—which beat out competitors such as Fujitsu's 24-inch-tall (61-centimeter) dancing humanoid HOAP and Komatsu Ltd.'s tank-shaped, fire-extinguishing robot—won for their practicality; they are optimized to work efficiently and accurately on food and pharmaceutical manufacturing lines.

Still, 2007 offered plenty of other significant, if less heralded (and immediately useful), developments and pushed robotic technology to new levels, or at least promised to in the near future.

As part of NASA's plans to send crewed missions back to the moon (and then on to Mars), the space agency in September performed a series of tests to determine if robotic technology could be used to provide medical care for astronauts during extended spaceflights. On board a military C-9 aircraft flying in parabolic arcs over the Gulf of Mexico, four surgeons and four astronauts performed simulated surgery both by hand and using a robotic device developed by SRI International, to determine if the robot's software could compensate for errors in movement caused by turbulence and varying gravitational conditions.

The U.S. Department of Defense continued its quest to develop autonomous robotic technology that will eventually take the place of human soldiers in battle. In November, the Defense Advanced Research Projects Agency (DARPA) hosted its 2007 DARPA Urban Challenge, a competition that tested the driving prowess of experimental driverless autos. "Boss," an SUV put together by a team including gearheads from Carnegie Mellon University, General Motors Corporation, Caterpillar and Continental AG drove away with the $2 million grand prize. (The second- and third-place finishers were, respectively, built by groups at Stanford University and Virginia Polytechnic Institute and State University.) Boss maintained an average speed of 14 miles per hour throughout the 55-mile course at the former George Air Force Base in Victorville, Calif., which was built to resemble an urban layout. The autonomous vehicles demonstrated their abilities by changing lanes, merging onto roadways amidst fast-moving traffic and traversing busy intersections—using only sensors, global positioning systems and computers.

During the tragedy at Utah's Crandall Canyon mine in August, when six miners and three rescuers perished in a mine collapse and subsequent rescue attempt, rescuers learned valuable lessons about the capabilities and limitations of robotic equipment. A robot crawler was sent 1,500 feet (457 meters) through a borehole into the mine, located about 120 miles (193 kilometers) south of Salt Lake City, after it crumbled due to a cave-in so powerful that it registered a magnitude of 3.9 on the Richter scale. Workers, handicapped by time constraints and the continued shifting of the mountain's mass, managed to get the crawler to the mine's floor but were bogged down by debris and unable to retrieve the device, which remains trapped 52 feet (16 meters) below the mountain's surface.

Other robots helped us learn about ourselves. In November, University of California, San Diego, researchers reported in Proceedings of the National Academy of Sciences USA that "current robot technology is surprisingly close to achieving autonomous bonding and socialization with human toddlers for significant periods of time." QRIO, another two-foot (61-centimeter) humanoid, was placed in UC San Diego's Early Childhood Education Center and programmed to wave, dance, sit and stand, among other functions. Children aged 18 to 24 months quickly warmed to the machine and began to treat it more like a peer than an object.

Earlier this month, Toyota unveiled the latest in its line of "partner robots": the aptly named Violin-playing Robot, which can hold the string instrument in place with its left hand and move the bow with its right hand to produce music. The roughly 5-foot, 123 lb. (1.5-meter, 56 kg) humanoid joins walking and rolling robots the company introduced in 2004, which are capable of playing the trumpet. Toyota rival Honda also this month introduced advancements to its humanoid, ASIMO (first introduced in 2005), that allow it to perform tasks such as carrying a tray and pushing a cart while simultaneously employing an eye camera to detect the speed and direction of humans and other ASIMOs to avoid collisions. The new ASIMO also knows when its battery levels are low and will automatically return to its base to recharge.



When not serving tea, robots aided scientists in understanding insect behavior. Researchers at the Free University of Brussels (U.L.B.) in Belgium, École Polytechnique Fédérale de Lausanne (E.P.F.L.) in Switzerland and the University of Rennes in France, along with other European educational institutions, reported in Science that they successfully introduced a few autonomous robots (boxy in design, but made to smell like the real deal) into cockroach communities. The robots were able to alter the collective decision-making process of the group and trigger new behavior patterns in the bugs. The findings show that it may be possible to use robots to study and control how groups of animals from insects to vertebrates interact.

And further proving there is no limit to what robotic technology can accomplish, a Web video recently surfaced featuring a mechanical device that can not only open a beer bottle but can follow that feat with a proper pour. (Note how the cup is held at an angle; many humans have yet to master this technique.)



This sampling merely scratches the surface of the past year's advances in robotics that whet the appetite for what's to come: Early next year, for instance, researchers at the University of Colorado at Boulder will benchmark robotic devices to precisely mix and measure medications used in treatments such as chemotherapy. The robotic Mars rovers Opportunity and Spirit are currently hunkering down in anticipation of the harsh Martian winter season but will soon resume their exploration of the Red Planet. And Scandinavian research firm Sintef is developing artificially intelligent equipment to help offshore oil and gas drilling platforms run more safely and efficiently.

In all, 2008 promises continued progress in the area of artificial intelligence, although it will still be a while before humankind reaches the point where it cannot live without the robots it has created.


1/13/2008

Fast Growing Robot Market

Educational and Entertainment Robot Market Strategy, Market Shares, and Market Forecasts, 2008-2014 - a report by Electronics.ca Publications (electronics industry market research and knowledge network)

Robots used in education and entertainment provide modular motion and remote sensing systems that offer flexibility. The educational kits are designed both for pure fun and for educational competitions where students put together modules in innovative ways to create designs that work.

Robots are set to bring more variety to entertainment. Their ability to sing, dance and fight provides endless new modalities of entertainment as people organize their robots in creative ways. According to the new report, educational robots stimulate innovation. Creativity is set to be stimulated by the modular systems available to students in the robotics community.

The modularity of robot kits makes them versatile and flexible. Modules can be put together in a variety of ways, giving users choices about what functionality the robot will have. Robot competitions are team efforts where groups join together to build a robot that performs in a particular manner meeting the requirements of the particular competition.

Educational robots are used by every level of student. Kits are geared to various age and skill levels. Robotics competitions are being held for every age level. Students do not yet receive formal education on robots and are more likely to enter competitions as clubs competing against each other representing different educational institutions.

Humanoid robots and innovatively shaped robots are finding a place in homes and offices, providing information and communications as well as automated locomotion. The automated modules constructed as robots provide elaborate capability. The market for educational robotic kits, at 541,000 units in 2007, is anticipated to reach 35.8 million units by 2014. As prices come down and schools begin to institutionalize robotics programs, very fast growth is anticipated. Growth at the low end of robotic kits starts to level off as demand increases for robots with more components and more functionality. The market for educational robotic kits, at $27.5 million in 2007, is anticipated to reach $1.69 billion by 2014.

The combined robot entertainment and educational markets, at $184.9 million in 2007, are anticipated to reach $2.985 billion by 2014. Market growth is spurred by the evolution of a new technology useful in a range of industry segments. Educational and entertainment robots represent a first step in the evolution of the robotic markets because they provide the teaching aspect of the market that precedes any other market evolution in the services and mobility segments of consumer robotics.
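The forecasts above imply a very steep growth curve. As a quick back-of-the-envelope check (not part of the report itself), the implied compound annual growth rate (CAGR) over the seven-year 2007-2014 horizon can be computed directly from the figures quoted:

```python
# Implied compound annual growth rate (CAGR) of the report's forecasts,
# computed over the seven years from 2007 to 2014.

def cagr(start, end, years):
    """CAGR as a fraction, e.g. 0.80 means 80% growth per year."""
    return (end / start) ** (1 / years) - 1

kit_units = cagr(541_000, 35_800_000, 7)   # educational kits, unit volume
kit_value = cagr(27.5e6, 1.69e9, 7)        # educational kits, revenue
total     = cagr(184.9e6, 2.985e9, 7)      # entertainment + education combined

print(f"{kit_units:.0%} {kit_value:.0%} {total:.0%}")  # roughly 82% 80% 49%
```

In other words, the report is effectively forecasting the educational-kit market to nearly double every year through 2014.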

Details of the new report can be found on Electronics.ca Publications' website.

Hey chicks, check this: WowWee FemiSapien

WowWee is targeting women and girls. FemiSapien is a female humanoid robot that dances to music. She hears via an onboard microphone and can control other robots in WowWee's lineup. She also reacts to voice commands and even human touch.

Maybe RoboSapien will be dating soon. :)

FemiSapien will be available in late summer 2008 at a friendly price of $100.

Video:

WowWee Tribot

WowWee presented its new robot, named Tribot, at CES 2008.

Tribot is a funny, joke-telling robo-friend. It plays games that require you to move it around in certain patterns, and its three wheels give it a decent range of movement. It also has a motion-sensing controller, allowing you to move it forwards and backwards simply by tilting the controller in the direction you want it to go.

Tribot will be available in the USA in September 2008 for $99.





1/12/2008

Robot lion by WowWee

WowWee has expanded its Alive range of robot animals beyond Elvis and a chimpanzee to include lion, panda, polar bear and white tiger cubs.

The new friendly looking robots feature realistic fur, an animated face, and recorded sounds activated by touch and tilt sensors.

Offering fully interactive features, WowWee says the robots will even have lifelike responses. Pick the lion cub up by the scruff of the neck, for example, and its legs will go limp, like a real cub being carried by its mother; leave it alone for five minutes and it will purr itself into sleep mode.

Video:

Rovio by WowWee - An alien in Your home?

At CES 2008, WowWee introduced Rovio, a robot with omni-directional wheels and a self-charging station, designed to be used as a surveillance robot with Skype capabilities. Even Microsoft is promoting this little machine.

Rovio is a “telepresence” robot that’s outfitted with the latest in micro-GPS technology from Evolution Robotics. Rovio has a Wi-Fi-enabled Web-cam that allows you to patrol your home while you’re away, via an Internet-enabled PC, console or mobile phone.

This one is an interesting looking robot. With the new GPS system you can drive the robot with shortcut “go-to” commands. For instance, “go to the dining room”. This makes it easy to control the robot remotely over the Web or even from your mobile phone. It also has what they call a “patrol mode” where it automatically sends you pictures of locations you want to check.

Among its other cool features, it has the ability to guide itself back on course if, say, your kids pick it up and move it, or if the dog wants to play with it. When it gets thirsty for energy, it can navigate to its charging station from anywhere in the home and dock with pinpoint accuracy. It's expected to launch in the second half of the year.


WowWee Rovio will be available in fall 2008 for an estimated price of $299.


Video



Pleo


Pleo is a robotic dinosaur designed to emulate the appearance and behavior of a week-old baby Camarasaurus.

It was designed by Caleb Chung, the co-creator of the Furby, and is manufactured by Ugobe. Chung selected this species of dinosaur because its body shape, stocky head, and relatively large cranium made it ideal for concealing the sensors and motors needed for lifelike animation. According to Ugobe, each Pleo will "learn" from its experiences and environment through a sophisticated artificial intelligence and develop an individual personality.

Pleo was unveiled on February 7, 2006 at the DEMO Conference in Scottsdale, Arizona, and was expected to reach the Indian and American markets around fall 2007.

Pleo shipments started on December 5, 2007. The robot is software-upgradeable via SD card or USB interfaces and costs $349 USD.

Ugobe will encourage user modifications of the robot's firmware, providing a graphical interface for home users and an API for programmers.

Ugobe develops robotic technology to animate robots into lifelike creatures with organic movement and adaptable behaviors. Ugobe calls its robots "Life Forms," powered by its "Life OS".

Design

Ugobe's designs combined sensory, articulation, and neuronetics to create the lifelike appearance of its robots. In developing Pleo, Ugobe paid particular attention to the biological and neurological systems of the Camarasaurus, and "re-interpreted" those elements through hardware and software. Pleo was engineered by a group of robotics specialists, animators, technologists, scientists, biologists, and programmers. Pleo is intended to be a fun experience for everyone.

However, potential buyers should note that, according to many Pleo owners, the paint on the skin has been seen to wear off within the first 24 hours, and the paint on the teeth has shown similar flaking (http://forums.pleoworld.com/showthread.php?t=503, retrieved 20 Dec 2007).

Features

  • camera-based vision system (for light detection and navigation)
  • two microphones with binaural hearing
  • beat detection (allows Pleo to dance and listen to music; this feature was removed but may be added back later)
  • eight touch sensors (head, chin, shoulders, back, feet)
  • four foot switches (surface detection)
  • fourteen force-feedback sensors, one per joint
  • orientation tilt sensor for body position
  • infrared mouth sensor for detection of objects in the mouth
  • infrared transmit and receive for communication with other Pleos
  • Mini-USB port for online downloads
  • SD card slot for Pleo add-ons
  • infrared detection of external objects
  • 32-bit Atmel ARM 7 microprocessor (main processor for Pleo)
  • 32-bit NXP ARM 7 sub-processor (dedicated to the camera system and audio input)
  • four 8-bit processors (low-level motor control)
Official Pleo site: www.pleoworld.com

Video: Pleo hatches!



Video: Taking Pleo out of the box



1st Robo Expo, 2008, New Delhi

SOURCE: http://machinist.in/

The 1st Robo Expo 2008 organized by Confederation of Indian Industry (CII) concurrent to the 9th Auto Expo, has witnessed the launch of a number of new Robots.

The Robo Expo, being held from 10-17 January at the Andhra Pavilion, Pragati Maidan, showcases world leaders in Robotics & Automation such as ABB Ltd, Hi-Tech Robotic Systemz, Kuka, Motoman Motherson Robotics Ltd, Panasonic, Precision Automation & Robotics India Ltd (PARI), Rhythmsoft and Rockwell, amongst others.

While speaking at the inaugural of the Robo Expo, Dr. V. Krishnamurthy, Chairman, National Manufacturing Competitiveness Council (NMCC), complimented CII’s effort in organising the Robo Expo. “Personal safety and enhanced productivity are becoming increasingly important for the manufacturing sector. These require enhanced effort in the area of robotics and automation,” said Dr. Krishnamurthy. Mr. V. Govindarajan, Member Secretary, NMCC, said, “The immense range of application possibilities that robotics offer can be seen at the Robo Expo”. There are robots with applications in entertainment, sports, productivity improvement and industrial use here at the Expo, he added.

The 1st Robo Expo 2008 is set to become an annual event of the CII Mission for Innovation. The Expo has received an overwhelming response from the industry, with most of the world's large manufacturers participating in the event. A Robo Conference, to be held on January 14-15, will provide a platform for participants to network with and learn from experts and practitioners in Robotics and Automation.

Sony Rolly outside Japan

Sony's dancing robot, named Rolly, is a combination of digital-music player and portable speaker set.

As the Rolly plays a song stored in its 2 gigabytes of flash memory, it can also spin itself around and flap the covers of its speakers in tune to the beat.

You can also remotely control Rolly from a Bluetooth phone, and you can program it with new dance steps. It's supposed to go on sale sometime in the first half of this year in the US; Sony won't name a price, but a publicist suggested that $500 could be a reasonable estimate.


Rolly is coming to the United States this spring. The little egg is packed with all kinds of abilities, not the least of which is playing MP3 or AAC files, via Bluetooth with A2DP, from its 2GB of flash memory (an improvement on the Japanese version's 1GB). No price yet.

Specs

  • 2GB internal memory
  • Connect/charge via USB 2.0
  • Bluetooth 2.0 (to receive music)
  • Editable Motion (software is provided)
  • Battery 3.7V 1560 mAh (5:00 of music playback, 4:30 with streaming Bluetooth music, 4:00 of music and motion, 3:30 Bluetooth streaming and motion)
  • Music format: MP3, ATRAC, AAC up to 300+ kbps
  • 104×65×65mm, 300g

First post


Robot Info's coming soon!