Haptics and Emotion: How Touch Communicates Feelings

When we think about the content of a conversation, it’s easy to focus on just the verbal information exchanged through spoken words; however, there are many other factors that color our interpretation of conversations and, in turn, the information they communicate. One such consideration is the context provided by prosody—the intonation, stress, tempo, rhythm, and pauses in a person’s speech, all of which lend their voice a unique texture. The brain also employs detailed mappings that link different kinds of facial expressions and gestures with the emotions and nuances that they convey. In fact, up to 65% of the raw information in a conversation is exchanged nonverbally [1]. As we continue to investigate human communication, we uncover a highly complex, multi-modal system that comprises many of our senses—including our sense of touch.

Continue reading “Haptics and Emotion: How Touch Communicates Feelings”

A Brief History of Haptic Feedback in Video Games

The original Fonz arcade cabinet

Almost every modern video game console includes some form of vibrotactile feedback, but this was not always the case. As more video games were made for computers and at-home entertainment systems, arcade manufacturers sought ways to make their cabinet games more immersive. Arcade controls were already customized to each individual game, and the growing availability of games outside of arcades pushed companies to offer visitors experiences that could only be had at a branded cabinet. In 1976, Sega’s Moto-Cross (rebranded as Fonz) became the first game to feature vibrotactile feedback, letting the player feel the rumble of their motorcycle when it collided with another bike on screen. The control scheme was a success.

Continue reading “A Brief History of Haptic Feedback in Video Games”

How do devices provide haptic feedback?

Video game controllers, cell phones, wearables, and dozens of other consumer electronic devices make use of vibrotactile feedback to increase user engagement. Three types of hardware are most frequently used to provide haptic feedback: eccentric rotating mass motors, linear resonant actuators, and piezoelectric actuators.

Continue reading “How do devices provide haptic feedback?”

Adding Senses to the Human Body

Expressive aphasia, also known as Broca’s aphasia, is a neurological condition characterized by an inability to produce grammatically correct speech, typically resulting from damage to anterior regions of the brain that impairs the neurons which would otherwise help construct grammatical spoken sentences [1]. In the suburbs of Paris, France in 1861, Paul Broca identified the brain region responsible for expressive aphasia after conducting an autopsy on a patient incapable of uttering any word other than “tan” [2]. Though speculation about the structure and function of human consciousness had existed for centuries, Broca’s discovery provided a new framework for understanding the brain’s role in producing conscious experience. The patient’s brain bore a localized lesion, yet only a small subset of his cognitive function was impaired. Naturally, psychologists concluded that different parts of the brain mediate different cognitive processes. In 1874, Carl Wernicke described receptive aphasia (Wernicke’s aphasia), which results from damage to posterior regions of the brain [3], and a growing number of scientists began exploring which regions of the brain were responsible for different aspects of cognition. Brain science adopted a new goal: mapping the locations of the brain corresponding to each observable function in human consciousness.

Continue reading “Adding Senses to the Human Body”

The First Haptic Wrist Band Design

In 1995, inventor Geir Jensen designed an attachment for a watch strap that could identify callers through distinct haptic rhythms. The device was to be based on a frequency-controlled actuator providing tactile feedback, but the hardware was never built. Jensen submitted the idea to a technology competition, where it was rejected by the Norwegian Industrial and Regional Development Fund. He was thinking two decades ahead of his time.

The Texas Instruments DRV2603: A Simpler Chip

In the past, we’ve discussed driving a linear resonant actuator using the DRV2605 haptic driver chip from Texas Instruments. Though the DRV2605 provides plenty of features (audio-to-haptics conversion, licensed effects from Immersion, and flexible I2C or PWM input), it also requires a more complicated integration with an existing circuit. An extra capacitor or two doesn’t introduce much complexity, but the DRV2605 isn’t well suited to circuits that need to drive multiple motors at the same time. The chip’s I2C address is fixed and identical for every DRV2605, so it cannot be modified. As a result, integrating multiple DRV2605 chips on a single bus requires an I2C switch or multiplexer – devices that share the same address cannot be addressed individually by the same master.
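
To make that complication concrete, here is a minimal Arduino-style sketch for driving two DRV2605 chips behind a TCA9548A I2C multiplexer. The multiplexer address (0x70), the use of the Adafruit_DRV2605 library, and the channel count are assumptions for illustration, not part of the original write-up.

```cpp
#include <Wire.h>
#include <Adafruit_DRV2605.h>

const uint8_t TCA9548A_ADDR = 0x70;  // assumed multiplexer address
Adafruit_DRV2605 drv;                // both DRV2605s answer at the same fixed I2C address

// Route the I2C bus to one downstream channel of the multiplexer.
void selectMuxChannel(uint8_t channel) {
  Wire.beginTransmission(TCA9548A_ADDR);
  Wire.write(1 << channel);          // one bit per downstream channel
  Wire.endTransmission();
}

void setup() {
  Wire.begin();
  // Initialize the DRV2605 behind each channel with a licensed effect library.
  for (uint8_t channel = 0; channel < 2; channel++) {
    selectMuxChannel(channel);
    drv.begin();
    drv.selectLibrary(1);
    drv.setMode(DRV2605_MODE_INTTRIG);
  }
}

void loop() {
  // Fire a click effect on each motor in turn.
  for (uint8_t channel = 0; channel < 2; channel++) {
    selectMuxChannel(channel);
    drv.setWaveform(0, 1);   // slot 0: effect 1 (strong click)
    drv.setWaveform(1, 0);   // slot 1: end of sequence
    drv.go();
    delay(500);
  }
}
```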

There must be a simpler way! Texas Instruments offers one in the DRV2603. The DRV2603 haptic driver forgoes the licensed effects and audio input in favor of a simple PWM-only interface: each motor needs only a single digital output pin from a signal source capable of producing a 10 kHz to 250 kHz PWM signal, so a microcontroller with several PWM outputs can drive multiple motors simultaneously.

For haptics projects that rely on multiple actuators to produce feedback, the DRV2603 provides a simpler way to get started using linear resonant actuators.
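
As a rough illustration, here is a minimal sketch for a Teensy board, whose analogWriteFrequency() call can place the PWM carrier inside the 10 kHz to 250 kHz window. The pin assignments, the 20 kHz carrier, and the assumption that a 50% duty cycle corresponds to an idle motor are illustrative only; check the DRV2603 datasheet for the exact duty-cycle-to-amplitude mapping for your actuator.

```cpp
// Sketch for a Teensy board driving one DRV2603 over its PWM input.
// Pin choices and the idle-at-50%-duty assumption are illustrative only.

const int DRV2603_EN_PIN = 2;   // enable pin on the DRV2603
const int DRV2603_PWM_PIN = 3;  // PWM input to the DRV2603

void setup() {
  pinMode(DRV2603_EN_PIN, OUTPUT);
  digitalWrite(DRV2603_EN_PIN, HIGH);             // wake the driver

  analogWriteFrequency(DRV2603_PWM_PIN, 20000);   // 20 kHz carrier, inside 10-250 kHz
  analogWriteResolution(8);                       // duty expressed as 0-255
  analogWrite(DRV2603_PWM_PIN, 128);              // ~50% duty: assumed idle
}

void loop() {
  // Ramp the drive level up and back down to produce a swelling buzz.
  for (int duty = 128; duty <= 230; duty += 2) {
    analogWrite(DRV2603_PWM_PIN, duty);
    delay(10);
  }
  for (int duty = 230; duty >= 128; duty -= 2) {
    analogWrite(DRV2603_PWM_PIN, duty);
    delay(10);
  }
  delay(500);
}
```

Driving additional motors is just a matter of wiring each DRV2603 to its own PWM-capable pin and repeating the same pattern.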

Audio to Haptics

Now, it’s common to use an audio signal to drive a haptic actuator and produce tactile effects that correspond to sounds. The Apple Watch’s Taptic Engine uses this technique to render haptic feedback.

For most consumer electronics devices, the best approach is a haptic driver chip that accepts an audio signal directly as input. The DRV2605 from Texas Instruments can provide this form of feedback with either an eccentric rotating mass motor or a linear resonant actuator.
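
To give a feel for what “audio to haptics” means, here is a small sketch of the general idea in software: track the envelope (loudness) of an incoming audio signal and map it to a vibration drive level. This is only a concept sketch, not the DRV2605’s internal algorithm, and the pin choices and scaling constants are stand-ins.

```cpp
// Concept sketch: follow the envelope of an audio signal and use it as a
// vibration intensity. Pin choices and scaling constants are illustrative.

const int AUDIO_IN_PIN = A0;   // audio signal biased around mid-scale (512)
const int MOTOR_PWM_PIN = 9;   // PWM output to a motor driver

float envelope = 0.0;          // smoothed loudness estimate

void setup() {
  pinMode(MOTOR_PWM_PIN, OUTPUT);
}

void loop() {
  // Rectify the sample around the mid-scale bias, then low-pass filter it.
  int sample = analogRead(AUDIO_IN_PIN);
  float magnitude = abs(sample - 512);
  envelope = 0.95 * envelope + 0.05 * magnitude;

  // Map the envelope (0..512) to a PWM duty cycle (0..255) for the motor.
  int duty = constrain((int)(envelope / 2.0), 0, 255);
  analogWrite(MOTOR_PWM_PIN, duty);

  delayMicroseconds(125);  // aim for roughly 8 kHz sampling; actual rate depends on analogRead speed
}
```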

Another option, less suited to most portable electronic devices, is a surface transducer. Surface transducers are frequently used as speakers: they vibrate an enclosure to produce sound. They consume a lot of power, but the vibrations they produce propagate across a hard surface quickly and consistently.

The Lilypad Vibe Board

The Lilypad Vibe Board is an excellent way to quickly integrate haptic effects into a wearable project. It uses an eccentric rotating mass (ERM) motor and can be driven directly from a general-purpose IO pin of an Arduino board. It relies on 5-volt logic and places a 33-ohm resistor in series with the DC motor to keep the current draw below the 40 mA maximum that an Arduino output pin can supply. It also contains a protective diode to prevent damage to the connected IC.

Although the Lilypad Vibe Board can be integrated into a project very quickly and uses a well-designed circular PCB with sewable, solderable pads, it cannot easily produce advanced haptic effects. Because its current draw is limited to what an Arduino output pin can supply, its vibration never reaches the motor’s rated maximum. It also contains no pre-programmed effects, and the reduced drive strength limits the effectiveness of pulse-width modulation for creating custom effects. We recommend it for projects and prototypes that require simple alerts – its ease of integration makes it very useful when advanced effects aren’t necessary.
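
For reference, a simple-alert pattern like the one we recommend needs nothing more than digitalWrite(). The pin number below is an assumption; connect the board’s positive pad to that pin and its negative pad to ground.

```cpp
// Pulse a Lilypad Vibe Board connected to a digital pin to signal an alert.
// The pin choice is illustrative; any digital output pin will work.

const int VIBE_PIN = 3;

void setup() {
  pinMode(VIBE_PIN, OUTPUT);
}

// Produce `count` short buzzes as a notification.
void buzzAlert(int count) {
  for (int i = 0; i < count; i++) {
    digitalWrite(VIBE_PIN, HIGH);
    delay(150);
    digitalWrite(VIBE_PIN, LOW);
    delay(150);
  }
}

void loop() {
  buzzAlert(2);    // double buzz
  delay(5000);     // wait before the next alert
}
```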

PianoTouch: A Haptic Interface for Learning Music

Researchers at Georgia Tech have created a haptic interface that augments the experience of learning to play the piano.

We present PianoTouch, a wearable, wireless haptic piano instruction system, composed of (1) five small vibration motors, one for each finger, fitted inside a glove, (2) Bluetooth module mounted on the glove, and (3) piano music output from a laptop. With this system, users hear the piano music, and feel the vibrations indicating which finger is used to play the note. In this paper, we investigate the system’s potential for passive learning, i.e. learning piano playing automatically, while engaged in everyday activities. In a preliminary study, subjects learned two songs initially, and then wore the PianoTouch glove for 30 minutes while listening to the songs repeated. One of the songs included tactile sensations and the other did not. The study found that after 30 minutes, the PianoTouch subjects were able to play the song accompanied by tactile sensations better than the non-tactile song.

Huang, K., Do, E. L., & Starner, T. (2008, September). PianoTouch: A wearable haptic piano instruction system for passive learning of piano skills. In Proceedings of the 12th IEEE International Symposium on Wearable Computers (ISWC 2008) (pp. 41–44). IEEE.