Sensory Synesthesia and Human Perception

Sensory synesthesia is a neurological phenomenon in which stimulation of one sensory modality leads to involuntary and automatic experiences in another. There is some debate regarding the classification of synesthetic phenomena, but several striking observations reveal that at least a small percentage of people experience a heightened interconnectedness between their senses.

Kiki, Bouba, and Visual Perception

In 1929, the German-American psychologist Wolfgang Köhler observed what is now known as the bouba-kiki effect [1]. In 2001, Vilayanur S. Ramachandran replicated Köhler's experiment with college students in the United States and India, and found a strong consensus among participants asked to assign spoken names to visual shapes [2]. These findings suggest that sensory information carries a predictable and consistent scaffolding of associations with other modalities of stimuli: participants used their visual perception of the shapes printed on the page to judge which sounds ought to be associated with those shapes. Ramachandran and his colleague Edward Hubbard suggest that the evolution of language may not be entirely arbitrary; instead, the naming of objects in space may reflect a natural association of auditory stimuli with the visual, tactile, olfactory, and overall perception of an object's nature. Sounds (and, by extension, all sensory information) may automatically convey some degree of symbolic meaning in relation to experiences from other senses.

Auditory-Tactile Synesthesia

Brain imaging of a patient with a localized lesion in the right ventrolateral nucleus of the thalamus revealed changes to the individual's perception. "Initially, the patient was more likely to detect events on the contralesional side when a simultaneous ipsilesional event was presented within the same, but not different sensory modality." This eventually developed into a form of synesthesia "in which auditory stimuli produce tactile percepts." The study suggests that sensory synesthesia can be acquired after a brain injury [3].

Visual-Tactile Synesthesia

Mirror-touch synesthesia is a condition in which watching another person being touched activates neural circuits similar to those for actual touch. In studies of individuals who experience mirror-touch synesthesia, their empathic responses to the experiences of other people appear to be heightened [4]. This form of synesthesia also appears to augment an individual's ability to recognize and interpret the facial expressions of an interaction partner [5]. Although a thorough empirical explanation of the phenomenon has not yet been developed, several theoretical accounts are being investigated in more detail. The Threshold Theory explains it "in terms of hyper-activity within a mirror system for touch and/or pain," while the Self-Other Theory explains it "in terms of disturbances in the ability to distinguish the self from others" [6]. The two theories carry different implications: the Threshold Theory implies a localized phenomenon affecting the mirror system, while the Self-Other Theory implies a more general difference that may be reflected in other cognitive processes as well.

Enhanced Sensory Perception

Some scholars argue that artistic experimentation may be rooted in sensory synesthesia, which allows an artist to describe a sensory experience with a wider range of detail [7]. Although scientists have developed methods of testing and profiling synesthetes [8], much of the theoretical framework used to understand cross-modal sensory perception remains speculative. VS Ramachandran mentions a possible relationship between synesthesia and enhanced sensory perception [9], but it remains unclear exactly how this enhancement manifests itself in a person's ability to perform different activities or pursue artistic endeavors. In a preliminary study exploring the perceptual processing abilities of synaesthetes [10], "there was a relationship between the modality of synaesthetic experience and the modality of sensory enhancement." In other words, a synesthete whose synesthetic experiences involve color also has enhanced color perception, and one who experiences tactile sensations has enhanced tactile perception. Further research is required to understand exactly how these enhanced perceptual abilities manifest themselves in common tasks.

Artists with Synesthesia

Wikipedia maintains a large list of notable individuals with synesthesia. The list includes several famous artists:

  • Lorde
  • Billy Joel
  • Vincent Van Gogh
  • Eddie Van Halen
  • Stevie Wonder
  • Kanye West
  • Hans Zimmer

Although synesthesia is by no means required to make art, the process of linking different sensory modalities appears to help some artists produce their most notable works.


[1] Köhler, W (1929). Gestalt Psychology. New York: Liveright.
[2] Ramachandran, V. S., & Hubbard, E. M. (2001). Synaesthesia — A Window Into Perception, Thought and Language. Journal of Consciousness Studies, 8(12), 3–34.
[3] Ro, T., Farnè, A., Johnson, R. M., Wedeen, V., Chu, Z., Wang, Z. J., … & Beauchamp, M. S. (2007). Feeling sounds after a thalamic lesion. Annals of neurology, 62(5), 433-441.
[4] Banissy, M. J., & Ward, J. (2007). Mirror-touch synesthesia is linked with empathy. Nature neuroscience, 10(7), 815-816.
[5] Banissy, M. J., Garrido, L., Kusnir, F., Duchaine, B., Walsh, V., & Ward, J. (2011). Superior facial expression, but not identity recognition, in mirror-touch synesthesia. The Journal of Neuroscience, 31(5), 1820-1824.
[6] Ward, J., & Banissy, M. J. (2015). Explaining mirror-touch synesthesia. Cognitive neuroscience, 6(2-3), 118-133.
[7] Van Campen, C. (1997). Synesthesia and artistic experimentation. Psyche, 3(6).
[8] Van Campen, C., & Froger, C. (2003). Personal profiles of color synesthesia: developing a testing method for artists and scientists. Leonardo, 36(4), 291-294.
[9] Ramachandran, V. S. (2003). The emerging mind: the Reith Lectures 2003 (p. 867). London: Profile.
[10] Banissy, M. J., Walsh, V., & Ward, J. (2009). Enhanced sensory perception in synaesthesia. Experimental brain research, 196(4), 565-571.

Macaron and the Future of Haptic Editors

A screenshot of the Macaron haptic effects editor.

Earlier this week, we had the pleasure of talking to Oliver Schneider, a graduate student and researcher at the University of British Columbia. Working at the Sensory Perception & Interaction Research Group, Oliver spends most of his days developing new software and hardware interfaces that engage our sense of touch. He described several techniques he has used to build development tools and interfaces for authoring rich tactile effects, including Haptic Jazz – a system that takes improvisational input on a tablet and translates it in real time into a vibrotactile sensation. Continue reading “Macaron and the Future of Haptic Editors”

Electrovibration and Touchscreens: Creating Virtual Textures

Graph of perceived friction by voltage: a higher voltage results in a higher perceived friction.

In 1950, Edward Mallinckrodt, a researcher at Washington University in St. Louis, accidentally discovered the phenomenon of electrovibration (also known as electrostatic vibration). He noticed that a brass electric light socket had a different texture when a light was burning than it did when the light was turned off. Along with a team of researchers, he began exploring the phenomenon in more detail by running experiments using an aluminum plate with insulating varnish. They wrote:

If the dry skin of one’s finger is moved gently over a smooth metal surface covered with a thin insulating layer, and the metal is connected to the ungrounded side of an 110-v power line, the surface has a characteristic feeling that disappears when the alternating voltage is disconnected.
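Mallinckrodt's observation is commonly explained with a simple parallel-plate model: the finger and the metal act as capacitor plates separated by the thin insulating layer, so the applied voltage pulls the finger toward the surface, and that extra normal force is felt as added friction while sliding. The sketch below illustrates the idea; the contact area, insulator thickness, permittivity, and friction coefficient are all assumed values chosen for illustration, not figures from the original experiments.

```python
# Sketch of the parallel-plate model of electrovibration (illustrative
# assumptions only). An applied voltage V produces an electrostatic
# attraction F_e between finger and plate, and the finger feels the
# extra load as added sliding friction f = mu * F_e.

EPS_0 = 8.854e-12  # vacuum permittivity, F/m

def electrostatic_force(voltage, area=1e-4, eps_r=3.0, gap=50e-6):
    """Attractive force (N) between finger and plate.

    area  : contact area in m^2 (assumed ~1 cm^2)
    eps_r : relative permittivity of the insulating layer (assumed)
    gap   : effective insulator thickness in m (assumed)
    """
    return EPS_0 * eps_r * area * voltage**2 / (2 * gap**2)

def added_friction(voltage, mu=0.5, **kwargs):
    """Extra sliding friction (N) from the electrostatic load."""
    return mu * electrostatic_force(voltage, **kwargs)

for v in (0, 55, 110):
    print(f"{v:>3} V -> added friction {added_friction(v) * 1000:.2f} mN")
```

Because the force scales with the square of the voltage, doubling the voltage roughly quadruples the added friction, consistent with higher perceived friction at higher voltages.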

Continue reading “Electrovibration and Touchscreens: Creating Virtual Textures”

Haptics and Emotion: How Touch Communicates Feelings

When we think about the content of a conversation, it’s easy to focus on just the verbal information exchanged through spoken words; however, there are many other factors that color our interpretation of conversations and, in turn, the information they communicate. One such consideration is the context provided by prosody—the intonation, stress, tempo, rhythm, and pauses in a person’s speech, all of which lend their voice a unique texture. The brain also employs detailed mappings that link different kinds of facial expressions and gestures with the emotions and nuances that they convey. In fact, up to 65% of the raw information in a conversation is exchanged nonverbally [1]. As we continue to investigate human communication, we uncover a highly complex, multi-modal system that comprises many of our senses—including our sense of touch.

Continue reading “Haptics and Emotion: How Touch Communicates Feelings”

Adding Senses to the Human Body

Expressive aphasia, also known as Broca’s aphasia, is a neurological condition characterized by an individual’s inability to produce grammatically correct speech, often caused by damage to anterior regions of the brain that impairs the neurons which would otherwise help construct vocalizations of grammatically correct sentences [1]. In the suburbs of Paris in 1861, Paul Broca identified the region responsible for expressive aphasia after conducting an autopsy of a patient who had been incapable of uttering any word other than “tan” [2]. Though speculation about the structure and function of human consciousness had existed for centuries, Broca’s discovery provided a new framework for understanding the brain’s role in producing conscious experience. The patient’s brain had incurred a lesion from injury, yet only a small subset of his cognitive function was impaired, so psychologists naturally concluded that different parts of the brain mediate different cognitive processes. In 1874, Carl Wernicke described receptive aphasia (Wernicke’s aphasia), which results from damage to posterior regions of the brain [3], and increasing numbers of scientists began exploring which regions of the brain were responsible for different aspects of cognition. Brain science adopted a new goal: mapping the locations of the brain corresponding to each observable function in human consciousness.

Continue reading “Adding Senses to the Human Body”

Slip Pad: Simulated Lateral and Rotational Slip

Researchers at the University of California, Berkeley, have developed a new tactile interface for simulating the slip of a surface using interleaved belts. The source for the project is available on GitHub.

We introduce a novel haptic display designed to reproduce the sensation of both lateral and rotational slip on a user’s fingertip. The device simulates three degrees of freedom of slip by actuating four interleaved tactile belts on which the user’s finger rests. We present the specifications for the device, the mechanical design considerations, and initial evaluation experiments. We conducted experiments on user discrimination of tangential lateral and rotational slip. Initial results from our preliminary experiments suggest the device design has potential to simulate both tangential lateral and rotational slip.

Ho, C., Kim, J., Patil, S., & Goldberg, K. The Slip-Pad: A Haptic Display Using Interleaved Belts to Simulate Lateral and Rotational Slip.

PianoTouch: A Haptic Interface for Learning Music

Researchers at Georgia Tech have created a haptic interface that augments the experience of learning to play the piano.

We present PianoTouch, a wearable, wireless haptic piano instruction system, composed of (1) five small vibration motors, one for each finger, fitted inside a glove, (2) Bluetooth module mounted on the glove, and (3) piano music output from a laptop. With this system, users hear the piano music, and feel the vibrations indicating which finger is used to play the note. In this paper, we investigate the system’s potential for passive learning, i.e. learning piano playing automatically, while engaged in everyday activities. In a preliminary study, subjects learned two songs initially, and then wore the PianoTouch glove for 30 minutes while listening to the songs repeated. One of the songs included tactile sensations and the other did not. The study found that after 30 minutes, the PianoTouch subjects were able to play the song accompanied by tactile sensations better than the non-tactile song.
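The note-to-finger cueing idea behind PianoTouch can be sketched in a few lines. Everything here is a hypothetical stand-in: the real system drives glove-mounted motors over Bluetooth from live piano output, while this sketch hard-codes a five-note fingering table and prints the cues instead of driving hardware.

```python
# Assumed fingering for a five-note span: map each pitch to the finger
# (0 = thumb ... 4 = pinky) that should play it. The real system derives
# this from the song being taught.
FINGERING = {"C4": 0, "D4": 1, "E4": 2, "F4": 3, "G4": 4}

def cue_for_note(note):
    """Return which motor to pulse for a note, or None if the note is
    outside the assumed five-finger span."""
    return FINGERING.get(note)

def play_with_cues(song, pulse=print):
    """Walk through a song, emitting one vibration cue per note.

    pulse is a stand-in for the Bluetooth command that drives a glove
    motor; by default it just prints which finger to vibrate.
    """
    for note in song:
        finger = cue_for_note(note)
        if finger is not None:
            pulse(f"note {note}: vibrate finger {finger}")

play_with_cues(["C4", "E4", "G4"])
```

Passing a different `pulse` callable is where real motor commands would plug in; the study's passive-learning setup would simply run this alongside audio playback of the song.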

Huang, K., Do, E. L., & Starner, T. (2008, September). PianoTouch: A wearable haptic piano instruction system for passive learning of piano skills. In Wearable Computers, 2008. ISWC 2008. 12th IEEE International Symposium on (pp. 41-44). IEEE.