Making robots more human by detecting human emotions

Stretchable, ultrasensitive strain sensors could provide a simple, low-cost way for robots to detect emotions
April 30, 2015

Stretchable transparent ultrasensitive strain sensors attached to the forehead, near the mouth, under the eye, and on the neck to sense skin strains induced by muscle movements during expression of emotions and daily activities (credit: Eun Roh et al./ACS Nano)

If robots could detect human emotions, it might make them more “human.” That’s the premise of new research by Korean scientists, who have developed simple, low-cost, ultrasensitive wearable strain sensors that can detect facial expressions.

This kind of detection is normally done with vision sensors connected to a computer running facial-analysis algorithms, but such systems are expensive, complex, and not very portable, the researchers note in a paper published in ACS Nano.

Schematic illustration of the cross-section of the strain sensor consisting of the three-layer stacked nanohybrid structure (credit: Eun Roh et al./ACS Nano)

Instead, the researchers created a stretchable, transparent sensor by sandwiching a carbon-nanotube film between two layers of an electrically conductive elastomer composite. They found that characteristic patterns of resistance change in the sensors could indicate whether subjects were laughing or crying and where they were looking.

Laughing has a characteristic pattern that can be inferred from signals from sensors that measure changes in resistance on the forehead and near the mouth (credit: Eun Roh et al./ACS Nano)
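The paper does not spell out the classification rule, but the idea of reading an expression from the pattern of resistance changes across sensors can be illustrated with a minimal sketch. The Python example below assumes two sensor channels (forehead and near-mouth), a hypothetical detection threshold, and a toy decision rule; none of these details come from the paper.

    import numpy as np

    def relative_resistance_change(trace, r0):
        """Fractional resistance change (R - R0) / R0 for a resistance trace."""
        return (np.asarray(trace, dtype=float) - r0) / r0

    def classify_expression(forehead, mouth, r0_forehead, r0_mouth,
                            threshold=0.05):
        """Toy rule (illustrative only): a laugh stretches the skin near the
        mouth more than the forehead; strong forehead strain alone is read
        as frowning/crying. The 5% threshold is a hypothetical value."""
        dr_forehead = relative_resistance_change(forehead, r0_forehead).max()
        dr_mouth = relative_resistance_change(mouth, r0_mouth).max()
        if dr_mouth > threshold and dr_mouth > dr_forehead:
            return "laughing"
        if dr_forehead > threshold:
            return "frowning/crying"
        return "neutral"

    # Synthetic resistance traces (ohms): the mouth sensor stretches during a laugh
    forehead_trace = [1000, 1002, 1005, 1003]
    mouth_trace = [1000, 1040, 1090, 1060]
    print(classify_expression(forehead_trace, mouth_trace,
                              r0_forehead=1000, r0_mouth=1000))  # -> laughing

A real system would presumably use recorded signal templates or a trained classifier rather than fixed thresholds; the sketch only shows how multi-channel resistance changes map to an expression label.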

The sensors could also have applications in monitoring heartbeats, breathing, dysphagia (difficulty swallowing), and other health-related cues, the researchers suggest.

The work was funded by the National Research Foundation of Korea.


Abstract of Stretchable, Transparent, Ultrasensitive, and Patchable Strain Sensor for Human–Machine Interfaces Comprising a Nanohybrid of Carbon Nanotubes and Conductive Elastomers

Interactivity between humans and smart systems, including wearable, body-attachable, or implantable platforms, can be enhanced by realization of multifunctional human–machine interfaces, where a variety of sensors collect information about the surrounding environment, intentions, or physiological conditions of the human to which they are attached. Here, we describe a stretchable, transparent, ultrasensitive, and patchable strain sensor that is made of a novel sandwich-like stacked piezoresistive nanohybrid film of single-wall carbon nanotubes (SWCNTs) and a conductive elastomeric composite of polyurethane (PU)-poly(3,4-ethylenedioxythiophene) polystyrenesulfonate (PEDOT:PSS). This sensor, which can detect small strains on human skin, was created using environmentally benign water-based solution processing. We attributed the tunability of strain sensitivity (i.e., gauge factor), stability, and optical transparency to enhanced formation of percolating networks between conductive SWCNTs and PEDOT phases at interfaces in the stacked PU-PEDOT:PSS/SWCNT/PU-PEDOT:PSS structure. The mechanical stability, high stretchability of up to 100%, optical transparency of 62%, and gauge factor of 62 suggested that when attached to the skin of the face, this sensor would be able to detect small strains induced by emotional expressions such as laughing and crying, as well as eye movement, and we confirmed this experimentally.
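
For reference, the gauge factor quoted in the abstract is the standard ratio of fractional resistance change to applied strain, GF = (ΔR/R0)/ε. The short Python sketch below uses the reported GF of 62 with illustrative resistance values (not measurements from the paper) to show how a strain estimate follows from a resistance reading.

    GAUGE_FACTOR = 62  # value reported in the abstract

    def strain_from_resistance(r, r0, gauge_factor=GAUGE_FACTOR):
        """Estimate mechanical strain from a resistance reading r relative to
        the unstrained baseline r0, using GF = (dR/R0) / strain."""
        return ((r - r0) / r0) / gauge_factor

    # Example (illustrative numbers): a 3.1% resistance increase at GF = 62
    # corresponds to roughly 0.05% strain.
    print(f"{strain_from_resistance(1031.0, 1000.0):.4%}")

The high gauge factor is what makes the sensor sensitive enough to register the small skin strains produced by facial expressions.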