Haptics and Emotion: How Touch Communicates Feelings

When we think about the content of a conversation, it’s easy to focus on just the verbal information exchanged through spoken words; however, there are many other factors that color our interpretation of conversations and, in turn, the information they communicate. One such consideration is the context provided by prosody—the intonation, stress, tempo, rhythm, and pauses in a person’s speech, all of which lend their voice a unique texture. The brain also employs detailed mappings that link different kinds of facial expressions and gestures with the emotions and nuances that they convey. In fact, up to 65% of the raw information in a conversation is exchanged nonverbally [1]. As we continue to investigate human communication, we uncover a highly complex, multi-modal system that comprises many of our senses—including our sense of touch.

Though modern messaging systems are efficient at transmitting text and images, they provide only a limited channel for expressing emotion and fail to capture the nuances of face-to-face conversation. To express emotion, people often resort to emoticons, emoji, or shorthand abbreviations (such as the ubiquitous ‘lol’). While 😀, 😃, and 😂 all convey different “happy” emotions, none of these pictographs can convey what it feels like to see a subtle smile creep across someone’s face and cascade into a joyous laugh.

Researchers around the world have developed new systems that add channels of interactivity to online conversations. Tsetserukou and Neviarouskaya have proposed a series of devices that send simulations of physical sensations to a remote partner using haptic feedback. The HaptiHug, HaptiHeart, and HaptiButterfly send simulations of hugs, heartbeats, and “butterflies in the stomach,” respectively. HaptiShiver and HaptiTemper stimulate a person’s spine and transmit temperature changes. The HaptiTickler simulates tickling through random vibrotactile stimulation on the ribs [2]. While this work has provided several innovative approaches to augmenting digital communication, it falls short of providing a direct mapping between the emotional state of a conversational partner and the haptic feedback generated.
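To make this concrete, here is a minimal sketch of how a device in the spirit of the HaptiHeart might render a heartbeat as a timed sequence of vibration pulses. The actuator class, its vibrate method, and the timing values are illustrative assumptions, not the actual iFeel_IM! hardware interface.

import time

# Hypothetical actuator interface; a real driver would pulse a vibration motor
# at the requested strength instead of printing.
class VibrotactileActuator:
    def vibrate(self, intensity, duration_s):
        print(f"vibrate at intensity {intensity:.1f} for {duration_s * 1000:.0f} ms")
        time.sleep(duration_s)

# Render a heartbeat as a repeating "lub-dub" pair of pulses at a given rate.
def play_heartbeat(actuator, bpm=70, beats=5):
    period = 60.0 / bpm                      # seconds per beat
    for _ in range(beats):
        actuator.vibrate(1.0, 0.10)          # "lub": strong, short pulse
        time.sleep(0.08)
        actuator.vibrate(0.6, 0.12)          # "dub": softer follow-up pulse
        time.sleep(max(0.0, period - 0.30))  # rest until the next beat

play_heartbeat(VibrotactileActuator())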

It may be possible to use sensory substitution to provide such a mapping. Sensory substitution is the stimulation of one sense (like hearing) to provide information normally acquired through another (like vision). Sonification systems offer one potential method for sensory substitution. These systems use non-speech audio to convey information; a Geiger counter, for example, emits audible clicks at a rate that rises with the surrounding radiation level. Striem-Amit and Amedi have used sonification systems to stimulate “visual” perception in blind subjects. After being trained with such a system, subjects were able not only to perceive the presence of another person but also to recognize and imitate their exact body posture [3].
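As a rough illustration of the idea (not the system used in the study), the following sketch sonifies a stream of sensor readings the way a Geiger counter does: the stronger the reading, the faster the clicks. The readings and the printed “click” stand in for real sensor input and audio output.

import time

# Toy Geiger-counter-style sonification: higher readings produce faster clicks.
# Printing stands in for audio synthesis; readings are assumed to lie in [0, 1].
def sonify(readings, max_rate_hz=20.0):
    for value in readings:
        rate = max(0.5, value * max_rate_hz)   # clicks per second
        print(f"click  (reading={value:.2f}, {rate:.1f} clicks/s)")
        time.sleep(1.0 / rate)                 # wait until the next click

# Simulated readings rising toward a "source" and falling away again.
sonify([0.05, 0.2, 0.5, 0.9, 0.5, 0.2, 0.05])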

In related work, Lemmens et al. used haptics to augment the experience of watching a movie. They constructed a body-conforming jacket embedded with electronics to provide vibrotactile cues, using 64 haptic actuators spread across the user’s torso [4]. The jacket played rich spatiotemporal patterns, conveying additional context for the interactions between characters in a given scene. While the device offered very expressive vibrotactile feedback, the researchers explored only abstract haptic patterns and approximations of human touch behaviors, without examining the affective content conveyed by the sensations.
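The sketch below illustrates what a spatiotemporal pattern on such a jacket could look like, assuming the 64 actuators are arranged as an 8-by-8 grid (the actual layout and driver interface in the paper may differ): a band of active motors sweeps across the torso one column at a time.

import numpy as np

ROWS, COLS = 8, 8   # assumed 8x8 layout of the 64 actuators

# One "frame" assigns an intensity in [0, 1] to every actuator; a pattern is a
# sequence of frames played back over time.
def horizontal_sweep(width=2):
    pattern = []
    for t in range(COLS):
        frame = np.zeros((ROWS, COLS))
        frame[:, t:t + width] = 1.0   # a vertical band of active motors
        pattern.append(frame)
    return pattern

for i, frame in enumerate(horizontal_sweep()):
    print(f"frame {i}: {int(frame.sum())} actuators active")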

Haptic feedback and our sense of touch reinforce and mediate our perception of emotions in further subtle ways. The exchange of touch with a conversational partner can express sympathy, anger, gratitude, fear, love, disgust, happiness, and sadness [5, 6]. Likewise, even incidental or unrelated haptic feedback has been shown to affect an individual’s emotional response to an interaction [7]. In one experiment, Arafsha et al. explored the emotional response to stimulation on the torso and arms. Heartbeat-like pulses on the abdomen and chest appeared to elicit “joy” when the stimulation was soft and “anger” when it was more intense [8].
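One way to turn a finding like this into the direct mapping discussed earlier is a simple lookup from an emotion label to stimulation parameters. The sketch below is loosely inspired by the soft-versus-intense result; the field names and numeric values are assumptions, not values reported in the study.

# Illustrative lookup from an emotion label to heartbeat-style stimulation
# parameters. The numeric values are assumptions, not measurements from [8].
EMOTION_TO_STIMULUS = {
    "joy":   {"location": "chest", "intensity": 0.3, "bpm": 70},   # soft, calm heartbeat
    "anger": {"location": "chest", "intensity": 0.9, "bpm": 110},  # hard, rapid heartbeat
}

def stimulus_for(emotion):
    params = EMOTION_TO_STIMULUS.get(emotion)
    if params is None:
        raise ValueError(f"no haptic mapping defined for '{emotion}'")
    return params

print(stimulus_for("joy"))   # {'location': 'chest', 'intensity': 0.3, 'bpm': 70}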

Touch is a powerful way to communicate emotion. This innate link between touch and emotion offers exciting potential for affective haptics—devices that leverage the sense of touch to evoke emotion. Even more exciting is the potential use of affective haptics in sensory substitution—that is, conveying emotion as a new “sense” through touch. Such a device might help a blind person or someone on the autism spectrum understand the emotions of the person they’re communicating with. The ability to convey subtle variations in emotion presents a further opportunity—any information conveyed through haptic feedback can have emotional associations built in. And what better way to build intuition for something than tapping into how we feel about it?


[1] Knapp, M., J. Hall, and T. Horgan, Nonverbal communication in human interaction. 2013: Cengage Learning.
[2] Tsetserukou, D. and A. Neviarouskaya, iFeel_IM!: augmenting emotions during online communication. IEEE Computer Graphics and Applications, 2010(5): p. 72-80.
[3] Striem-Amit, E. and A. Amedi, Visual cortex extrastriate body-selective area activation in congenitally blind people “seeing” by using sounds. Current Biology, 2014. 24(6): p. 687-92.
[4] Lemmens, P., et al., A body-conforming tactile jacket to enrich movie viewing. in World Haptics 2009: Third Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems. 2009. IEEE.
[5] Hertenstein, M.J., et al., The communication of emotion via touch. Emotion, 2009. 9(4): p. 566.
[6] Hertenstein, M.J., et al., Touch communicates distinct emotions. Emotion, 2006. 6(3): p. 528.
[7] Ackerman, J.M., C.C. Nocera, and J.A. Bargh, Incidental haptic sensations influence social judgments and decisions. Science, 2010. 328(5986): p. 1712-1715.
[8] Arafsha, F., K.M. Alam, and A. El Saddik, EmoJacket: Consumer centric wearable affective jacket to enhance emotional immersion. in 2012 International Conference on Innovations in Information Technology (IIT). 2012. IEEE.

Author: Ajay Karpur

Ajay graduated from Arizona State University in 2016 with a B.S.E. in Electrical Engineering. He has experience in signal processing, hardware design, and software development.
