It can speak, sing and dance. It can imitate human emotions. In appearance it resembles a spaceman and goes by the cutesy name of "Fei Fei."
Man-sized Fei Fei, an entertainment robot recently developed by a research team at the Institute of Automation under the Chinese Academy of Sciences, aims to present to the public a true picture of what the robots of today can actually do.
At the command "Say hello," spoken through a microphone wired to the robot, Fei Fei raises an arm and waves.
His head has 12 joints that move freely, enabling him to imitate expressions of all kinds of human emotion: joy, anger, surprise, disgust, sadness and even fear. Each time he finishes an imitation, he giggles and boasts of his excellent mimicking abilities.
When he laughs, his mouth opens wide and his eyebrows rise. When he is sad, he lowers his brows and eyelids.
On his chest is a screen that displays a menu of options, including emotion and movement displays, dialogue, an Olympic knowledge quiz and sound imitation, among others. Programmed to obey, he readily responds to human commands.
Embedded on the right side of Fei Fei's display screen is a digital camera that allows him to snap photos and process prints in moments.
In a corner, a few meters away, stands Fei Fei's granddad, a first-generation entertainment robot developed by the institute. Compared to Fei Fei, he is shorter and much heavier.
In 2000, commissioned by the China Science and Technology Museum, Li Furong led a research team to develop a robot that could show schoolchildren how machines and people communicate, to arouse their interest in the science. "The first entertainment robot we designed was quite simple and could only hold a basic everyday dialogue," said Li.
The project fired a strong interest in robotics in Li. In the past, his research had focused mainly on sound recognition, and his impressions of robots were confined to the stuff of science-fiction novels and films, much like the average person's.
In the process of designing the robot, Li increasingly realized that artificial intelligence could be an ideal entry point for his earlier sound recognition research.
Sound recognition technology is one of the foundations of robot development, says Li. "Our research in the field is competitive even by world standards. Our robots can recognize a good deal of vocabulary across different fields."
However, Li admits it is still difficult to break through the bottleneck of so-called fuzzy logic, because of the complexity of the Chinese language system. "Even a tiny difference of one word in a sentence can denote a totally different meaning," he said.
So sometimes, no matter how loud one shouts at him, Fei Fei remains unresponsive to words that sound even slightly different from the instructions programmed in advance by the researchers.
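The brittleness described above can be illustrated with a small sketch. Nothing here is the institute's actual software; the command table, action names and similarity threshold are all hypothetical. An exact-match lookup rejects any utterance that differs even slightly from the programmed phrase, while a tolerance-based lookup accepts the closest command if it is similar enough:

```python
from difflib import SequenceMatcher

# Hypothetical command table: spoken phrase -> action name.
COMMANDS = {"say hello": "wave", "sing a song": "sing"}

def exact_match(utterance):
    # Fails for anything not programmed in advance, verbatim.
    return COMMANDS.get(utterance.lower())

def fuzzy_match(utterance, threshold=0.8):
    # Accept the closest known command if its similarity score
    # clears the threshold, instead of demanding an exact match.
    best, score = None, 0.0
    for cmd, action in COMMANDS.items():
        s = SequenceMatcher(None, utterance.lower(), cmd).ratio()
        if s > score:
            best, score = action, s
    return best if score >= threshold else None

print(exact_match("say hallo"))   # None: one letter off, no response
print(fuzzy_match("say hallo"))   # "wave": close enough to "say hello"
```

This only sketches the matching step; as the article notes, in Chinese a one-word difference can legitimately change the meaning, so a real system cannot simply loosen the threshold.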
To realize a more natural and intimate dialogue with the robot, Li's team is working on another technology, one that will let the robot understand free, unscripted speech on a particular topic such as entertainment or sports.
Another technical basis of robot development is image recognition, which Li's team plans to apply to the next generation of entertainment robots.
At present, several departments at the institute are working together on facial recognition. Similar to fingerprint identification, the robot recognizes an individual by analyzing a frontal facial image captured by a color camera.
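The recognition-by-comparison idea the article likens to fingerprint identification can be sketched as follows. This is a hypothetical illustration only: the enrolled names, the three-number feature vectors and the distance cutoff are made up, and the hard part, extracting features from the camera image, is left out entirely.

```python
import math

# Made-up feature vectors standing in for processed face images.
ENROLLED = {
    "Li":   [0.12, 0.80, 0.33],
    "Wang": [0.90, 0.10, 0.55],
}

def distance(a, b):
    # Euclidean distance between two feature vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(features, max_distance=0.3):
    # Return the closest enrolled person, or None if nobody is
    # close enough -- an unknown face should not be forced to match.
    name, best = None, float("inf")
    for person, vec in ENROLLED.items():
        d = distance(features, vec)
        if d < best:
            name, best = person, d
    return name if best <= max_distance else None

print(identify([0.11, 0.82, 0.30]))  # "Li": very close to his vector
print(identify([0.50, 0.50, 0.50]))  # None: no enrolled face is close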
Intelligent control is another important area the scientists are exploring. Compared with the series of entertainment robots developed by the Sony Corporation, which have already achieved advanced bipedal movement on irregular and tilted surfaces, Chinese scientists have much to do to improve their robots' movement coordination.
"Imitating human body movements more vividly requires more joints. However, the lack of precision in robotic parts manufactured domestically means our robot is as yet incapable of complex movement," said Li.
Sci-fi vs reality
The response to Li and his team's robots at exhibitions ranges from incredulity that the robot can seemingly understand visitors' words to disappointment that it cannot do the stuff of sci-fi novels and movies.
"People's expectations of robots have gone far beyond what they can really do. It is still rather unrealistic to expect a robot to prepare dinner," said Li.
But he hopes their generation of entertainment robots will stimulate a general interest among the public, while at the same time giving them a true picture of robotic science and technology as it currently stands.
"Only when people have a proper understanding will they not be overly critical of robot functions. This will surely benefit the role of robots in our everyday life," he continued.
Li is confident that entertainment robots will gradually undertake more functions in the future.
A robot could be a teacher and friend to children, help busy couples clean the floor, adjust the air-conditioner's temperature, water the flowers when they are out of town, even call the emergency services in the event of a break-in or fire. "All these scenarios are not daydreams; they are attainable," he said.
At present, on their research agenda are robots that can preside over celebration ceremonies, replace the models currently used to show off new automobiles, or even take on the role of image spokesman for a product.
The intelligence of a robot, in fact, lies in its ability to imitate that of a human being: it collects external sound and visual information, processes it, and calculates an appropriate response.
"The growth of robot intelligence still lags far behind public expectations, because progress is hindered by our limited understanding of human cognitive mechanisms," explained Li. "How to improve our robots' ability to distinguish and differentiate signals against a background of normal external interference is one of our major priorities."
(China Daily June 6, 2005)