The sense of touch can contribute to people's ability to perceive faces, a new study claims.
Lead researcher Kazumichi Matsumiya of Tohoku University in Japan said that, in daily life, people usually recognize faces through sight and almost never explore them through touch. Yet people routinely combine information from multiple sensory modalities to perceive everyday non-face objects and events, as in speech perception and object recognition. These new findings suggest that even face processing is essentially multisensory.
In a series of studies, Matsumiya took advantage of a phenomenon called the "face aftereffect" to investigate whether our visual system responds to nonvisual signals for processing faces.
Matsumiya hypothesized that if the visual system really does respond to signals from another modality, then face aftereffects should transfer from one modality to the other: adaptation to a face explored by touch should produce visual face aftereffects.
To test this, Matsumiya had participants explore face masks concealed below a mirror by touching them. After this adaptation period, the participants were visually presented with a series of faces that had varying expressions and were asked to classify the faces as happy or sad. The visual faces and the masks were created from the same exemplar.
In line with his hypothesis, Matsumiya found that exploring the face masks by touch shifted participants' perception of the visually presented faces relative to participants who had no adaptation period: the visual faces were perceived as having the facial expression opposite to that of the touched mask.
According to Matsumiya, current views on face processing assume that the visual system receives facial signals only from the visual modality, but these experiments suggest that face perception is truly crossmodal.
The new research has been published in Psychological Science, a journal of the Association for Psychological Science.