New ultra-sensitive, wearable sensors that help human-like robots to read people's expressions have been developed by scientists.
Nae-Eung Lee and colleagues note that one way to make interactions between people and robots more intuitive would be to endow machines with the ability to read their users' emotions and respond with a computerized version of empathy. Most current efforts toward this goal analyze a person's feelings using visual sensors that can, for example, tell a smile from a frown.
But these systems are expensive, highly complex and don't pick up on subtle eye movements, which are important in human expression. Lee's team wanted to make simple, low-cost sensors to detect facial movements, including slight changes in gaze.
In addition to applications in robotics, the sensors could be used to monitor heartbeats, breathing, dysphagia (difficulty swallowing) and other health-related cues.
The study is published in the journal ACS Nano.