Tomorrow’s home-care robots are more likely to adopt a human-like appearance than Paro’s pinniped form factor. Even first-generation products include rudimentary attempts to offer emotional comfort. The head of the Kompaï robot from Robosoft features a “face,” for example, while the Giraff robot from Sweden’s Giraff Technologies AB displays a screen image of a human head and shoulders that speaks to the elderly person.
The Personal Robots Group at MIT’s Media Lab has taken things a step further with research focused on robots that can make stronger social and emotional connections with people. The lab has developed a range of robots, including one called Nexi that can blink, shrug, and make facial expressions.
Challenges remain because there is only so far this can be taken before the robot enters the so-called uncanny valley, where a machine’s face is very humanlike yet not quite convincing enough to be mistaken for a real person. The result is typically creepy or even the stuff of nightmares (in the same way that zombies are frightening). To develop a meaningful “relationship” with a human, a robot needs to be able to detect a person’s mood and react accordingly. This is particularly important in the home-care sector, where a senior may be frequently unhappy due to discomfort or loneliness. The robot will need to pick up on this negative demeanor through visual cues such as tears and a downturned mouth, audio signals such as extended silence or barely audible speech, and even physiological signs such as lowered temperature and raised blood pressure.
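The fusion of those cues might be sketched as follows. This is a purely illustrative example, not any shipping robot’s logic: the cue scores, weights, function names, and threshold are all assumptions, with each score standing in for the output of a separate vision, audio, or physiological sensing pipeline.

```python
# Hypothetical sketch of multimodal mood detection. Each cue score is
# assumed to arrive from a separate pipeline (vision, audio, physiological),
# normalized to 0.0 (neutral) through 1.0 (strongly negative). The weights
# and threshold are illustrative assumptions.

def mood_score(visual: float, audio: float, physiological: float) -> float:
    """Combine normalized cue scores into a single distress estimate."""
    weights = {"visual": 0.5, "audio": 0.3, "physiological": 0.2}
    return (weights["visual"] * visual
            + weights["audio"] * audio
            + weights["physiological"] * physiological)

def should_comfort(visual: float, audio: float, physiological: float,
                   threshold: float = 0.6) -> bool:
    """Trigger a comforting behavior when combined distress is high."""
    return mood_score(visual, audio, physiological) >= threshold
```

A real system would of course learn these weightings from data rather than hard-code them, but the structure — several noisy cues reduced to one behavioral decision — is the essence of the task.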
Christopher G. Atkeson, a professor in the Robotics Institute and Human-Computer Interaction Institute at Carnegie Mellon University, is conducting pioneering work in adapting robotic technology to meet the needs of the elderly. Atkeson’s key contribution is to suggest that robots should be soft, not made of metal. He argues that people are more likely to accept a soft robot (with an inflatable body constructed from materials similar to a bounce house) as a companion rather than a conventional “metal monster.” In fact, his vision partly influenced the making of Big Hero 6, a Disney movie about a boy whose closest companion is Baymax, an endearing inflatable robot that comforts the sick. The film showed how a health-care robot could interact with humans.
Aldebaran, a French company primarily owned by Japanese telecom firm SoftBank, is among the first to develop a cyberconscious robot. The firm created Pepper, an interactive robot that, according to the manufacturer, is able to detect human emotions and choose the ideal way to communicate with the person. The robot listens to voices and analyzes body language and can modify language and gestures to adapt to a given situation. Pepper includes a screen on his (apparently the robot is male) chest, where he “shows his emotions and what’s happening in his inner world.”
Enhancing Robot Sensors
Robots such as Pepper and its descendants will need to take advantage of the full range of modern sensors in order to pick up the gamut of cues that indicate a human’s state of mind. Like humans themselves, robots will use vision to take in much of this information.
OMRON’s global research and development group has developed the OKAO Vision system, which can recognize different faces through analysis of facial features. OKAO Vision comprises Human Vision Components (HVC) and advanced software that the company says will help robotic machines “understand people visually in much the same way as humans do.” The software can recognize faces and facial attributes to estimate a person’s gender, age, and ethnicity, as well as determine whether a person is happy, surprised, angry, sad, or neutral. These products are already finding a niche in robot-vision applications.
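The final step of such a pipeline — turning per-class confidence scores into a single emotion label — can be sketched in a few lines. To be clear, this is not OKAO Vision’s actual interface; only the five class names come from the article, and the dictionary-of-scores API shape is an assumption.

```python
# Hypothetical last stage of a face-analysis pipeline: pick the dominant
# emotion from per-class confidence scores. The class names mirror those
# the article lists; the scores and API shape are assumptions.

EMOTIONS = ("happy", "surprised", "angry", "sad", "neutral")

def dominant_emotion(scores: dict) -> str:
    """Return the emotion class with the highest confidence score."""
    return max(EMOTIONS, key=lambda e: scores.get(e, 0.0))
```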
In addition to charge-coupled device (CCD)-based vision systems such as OKAO’s HVC, future robots will benefit from enhanced vision through the use of sensors that extend sensitivity to a wider range of electromagnetic radiation than just visible light. For example, infrared (IR) sensors such as Amphenol’s ZTP Thermopile IR Sensors and Lumex’s MicronSensIR™ will enable robots to “see” in the dark and locate their charges, as well as determine whether a person’s temperature is high or low.
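Reading a person’s temperature from a thermopile works because the sensor’s output voltage tracks the difference between the fourth powers of the target and ambient temperatures, per the Stefan–Boltzmann law. A minimal sketch of inverting that model follows; the sensitivity constant `k` and the fever threshold are illustrative assumptions, and a real design would take both from the specific part’s datasheet and calibration.

```python
# Hedged sketch: convert a thermopile IR sensor's output voltage into a
# skin-temperature estimate using the radiative model
#   V = k * (T_obj**4 - T_amb**4)   (temperatures in kelvin).
# The sensitivity constant k is an illustrative assumption.

def object_temperature_k(voltage: float, ambient_k: float,
                         k: float = 4.0e-14) -> float:
    """Invert the thermopile model to recover the target temperature."""
    return (voltage / k + ambient_k ** 4) ** 0.25

def fever_check(voltage: float, ambient_k: float) -> bool:
    """Flag a skin reading above ~38 degC (311.15 K) as a possible fever."""
    return object_temperature_k(voltage, ambient_k) > 311.15
```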
Complementing vision sensors, future robots will also employ audio sensors based on microphones that convert sound-wave vibrations into voltages. Multiple microphones situated around the robot’s head will be used to determine the direction and intensity of a person’s voice. But that’s the simple part; in order for a home-care robot to use the sound of its charge’s voice to determine emotional status, the machine will need analog-to-digital converter (ADC) and digital signal processing (DSP) electronics allied to a powerful microprocessor and some clever software. Companies such as Analog Devices, Cirrus Logic, STMicroelectronics, and Texas Instruments provide the electronics that will underpin robot audio-analysis applications.
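One standard way a microphone pair yields direction is time difference of arrival (TDOA): the voice reaches one microphone slightly before the other, the lag is found by cross-correlating the two sampled signals, and simple geometry converts that lag to an angle. The sketch below assumes an idealized two-microphone setup; the sample rate, microphone spacing, and brute-force correlation search are illustrative, not from the article.

```python
import math

# Illustrative TDOA direction finding with two microphones: find the
# sample lag that best aligns the two signals via cross-correlation,
# then convert the implied path difference to an arrival angle.

SPEED_OF_SOUND = 343.0  # m/s in air at room temperature

def best_lag(left, right, max_lag):
    """Return the sample lag that maximizes the cross-correlation."""
    def corr(lag):
        return sum(left[i] * right[i - lag]
                   for i in range(len(left))
                   if 0 <= i - lag < len(right))
    return max(range(-max_lag, max_lag + 1), key=corr)

def arrival_angle(left, right, sample_rate, mic_spacing, max_lag=8):
    """Source angle in radians; 0 means straight ahead (broadside)."""
    delay = best_lag(left, right, max_lag) / sample_rate  # seconds
    # Path difference between mics, clamped to the physically valid range.
    ratio = max(-1.0, min(1.0, SPEED_OF_SOUND * delay / mic_spacing))
    return math.asin(ratio)
```

In a deployed system the correlation would run on the DSP hardware the paragraph above describes, typically in the frequency domain for efficiency, but the geometry is the same.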
A next-generation home-care robot’s vision system will be supplemented by touch sensors that help the machine determine a person’s emotional state. Today, developers can choose from a range of touch sensors that use technologies such as a change in capacitance to detect touch. Although the technology is currently used for products such as touch-screen displays, it’s not difficult to envision the process being reversed so that a robot could use a capacitive touch sensor to “feel” human skin, gaining insight into the person’s emotional state by measuring the changes in capacitance that occur as the skin’s temperature changes. Companies such as Freescale Semiconductor, Maxim Integrated, Cypress Semiconductor, and ON Semiconductor are strong in touch-sensor technology.
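A common way capacitance is actually measured is to charge the sense electrode through a known resistor and time how long the voltage takes to rise: skin contact adds capacitance, which lengthens the charge time. The sketch below shows that detection logic; the resistor value, baseline capacitance, and touch threshold are illustrative assumptions, not values from any vendor’s part.

```python
# Rough sketch of RC-timing capacitive touch detection. Charging a sense
# electrode through resistor R to 63.2% of the supply takes t = R * C,
# so the measured time reveals the capacitance. Component values and
# thresholds are illustrative assumptions.

R_SENSE = 1.0e6         # ohms, charge resistor (assumed)
BASELINE_C = 10.0e-12   # farads, untouched electrode capacitance (assumed)

def capacitance_from_charge_time(t_seconds: float) -> float:
    """Invert t = R * C to recover the electrode capacitance."""
    return t_seconds / R_SENSE

def is_touched(t_seconds: float, threshold_pf: float = 5.0) -> bool:
    """Report touch when capacitance rises threshold_pf above baseline."""
    delta_pf = (capacitance_from_charge_time(t_seconds) - BASELINE_C) * 1e12
    return delta_pf > threshold_pf
```

Tracking slow drifts in the measured capacitance, rather than just the touch threshold, is where the article’s temperature-sensing idea would come in.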
Tomorrow’s developments include a touch sensor that uses gold nanoparticles linked by organic connector molecules. When the sensor touches skin and flexes slightly, the distance between the nanoparticles changes and the sensor’s electrical characteristics alter, allowing it to measure not only pressure but also the temperature and humidity of the skin being touched.
Robots will likely leverage technology beyond their own in their role as caregivers. For example, wearable wireless medical sensors enabled by technologies such as Bluetooth Low Energy could be employed to relay information such as heart rate, blood pressure, and temperature to the robot in order to complement its own sensors in determining a person’s mood. Companies that specialize in Bluetooth technology include Broadcom, Nordic Semiconductor, Panasonic, and Texas Instruments.
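Heart-rate data from such a wearable would typically arrive as notifications of the standard Bluetooth Heart Rate Measurement characteristic (GATT 0x2A37), whose first byte is a flags field indicating, among other things, whether the heart-rate value is 8 or 16 bits. The sketch below parses just that field; the radio transport (scanning, connecting, subscribing) is omitted.

```python
# Parse the heart-rate field of a Bluetooth Heart Rate Measurement
# characteristic (GATT 0x2A37) payload. Flags bit 0 selects the value
# format: clear = uint8 in byte 1, set = uint16 little-endian in bytes 1-2.

def parse_heart_rate(payload: bytes) -> int:
    """Return beats per minute from a Heart Rate Measurement payload."""
    flags = payload[0]
    if flags & 0x01:  # bit 0 set: 16-bit little-endian heart-rate value
        return int.from_bytes(payload[1:3], "little")
    return payload[1]  # bit 0 clear: 8-bit heart-rate value
```

The same payload can carry optional energy-expended and RR-interval fields (flags bits 3 and 4), which a fuller implementation would parse as well.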
Robots have some way to go before they can match the quality of home care routinely provided by humans. But the lucrative rewards promised for manufacturers targeting the rapidly expanding over-65 age group are encouraging rapid development. When asked by Not Impossible Now, a technology website, whether robots such as Disney’s Baymax would ever be possible, Professor Atkeson replied, “Not only will they be possible, but it will happen very soon.”
This is the second and concluding part of a series of articles on robots and electronics. A version of this article was first published by Mouser Electronics.
Steven Keeping gained a BEng (Hons.) degree at Brighton University, U.K., before working in the electronics divisions of Eurotherm and BOC for seven years. He then joined Electronic Production magazine and subsequently spent 13 years in senior editorial and publishing roles on electronics manufacturing, test, and design titles including What’s New in Electronics and Australian Electronics Engineering for Trinity Mirror, CMP and RBI in the U.K. and Australia. In 2006, Steven became a freelance journalist specializing in electronics. He is based in Sydney.