Vibrotactile Feedback Simulator: Haptic Waveform Design


A 150 Hz vibration at 1.5 g with a 100 ms envelope produces a clearly perceptible mid-range tactile sensation, primarily engaging the Pacinian channel (whose sensitivity spans roughly 40-500 Hz) for a strong, crisp haptic pulse.

Formulas

Displacement: x = a / (2πf)², where a is the acceleration amplitude
Perceptual threshold: ~0.01 g at 250 Hz, ~0.1 g at 30 Hz
Energy: E ∝ A² × f² × τ × D, where A is the displacement amplitude, τ the pulse duration, and D the duty cycle (proportional to actuator power)
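The displacement formula is easy to evaluate directly. A minimal sketch (function name is illustrative) computing the peak displacement for the simulator's default operating point of 150 Hz at 1.5 g:

```python
import math

def displacement_amplitude(accel_g: float, freq_hz: float) -> float:
    """Peak displacement x = a / (2*pi*f)^2 for sinusoidal vibration,
    given the peak acceleration in units of g."""
    a = accel_g * 9.81  # convert g to m/s^2
    return a / (2 * math.pi * freq_hz) ** 2

x = displacement_amplitude(1.5, 150.0)
print(f"{x * 1e6:.1f} um")  # prints 16.6 um
```

Note how steeply displacement falls with frequency: at a fixed acceleration, doubling f cuts the required travel by a factor of four, which is why high-frequency actuators can be physically tiny.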

Engineering Touch

When your phone buzzes in your pocket, you are experiencing vibrotactile feedback — mechanical vibrations transmitted through the skin to specialized mechanoreceptors. Despite its simplicity, vibrotactile feedback is remarkably expressive. By varying frequency, amplitude, duration, and temporal pattern, designers can create tactile alphabets of dozens of distinguishable signals, enabling eyes-free communication and enhancing interaction with digital interfaces.

The Psychophysics of Vibration

Human vibrotactile perception is governed by four mechanoreceptor types, each with distinct frequency sensitivity and receptive field size. The Pacinian corpuscle, buried deep in the dermis, is exquisitely sensitive to vibrations around 250 Hz — detecting displacements as small as 10 nanometers. Meissner corpuscles in the fingertips respond to lower frequencies (10-50 Hz) with high spatial resolution. This simulation shows how frequency and amplitude map onto receptor activation.

Waveform Design

A haptic effect is defined by its waveform — the time-varying amplitude of vibration. The envelope (attack-sustain-release) shapes the perceived sharpness: short attack times create crisp clicks, while gradual ramps feel smooth and gentle. Duty cycle modulates the perceived intensity and determines power consumption, a critical constraint for battery-powered wearables. The frequency spectrum determines which receptor channels are activated.
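The attack-sustain-release structure can be generated directly as a sampled waveform. A minimal sketch (parameter defaults chosen to match the 150 Hz, 100 ms example; the function name and sample rate are assumptions):

```python
import numpy as np

def haptic_waveform(freq_hz=150.0, attack_ms=5.0, sustain_ms=80.0,
                    release_ms=15.0, sample_rate=8000):
    """Sine carrier shaped by a linear attack-sustain-release envelope."""
    total_s = (attack_ms + sustain_ms + release_ms) / 1000.0
    t = np.arange(int(total_s * sample_rate)) / sample_rate
    carrier = np.sin(2 * np.pi * freq_hz * t)
    # Piecewise-linear envelope: 0 -> 1 over the attack, hold at 1
    # through the sustain, 1 -> 0 over the release.
    env = np.interp(t * 1000.0,
                    [0.0, attack_ms, attack_ms + sustain_ms,
                     attack_ms + sustain_ms + release_ms],
                    [0.0, 1.0, 1.0, 0.0])
    return t, carrier * env

t, w = haptic_waveform()  # 100 ms pulse: 5 ms attack feels crisp
```

Shortening `attack_ms` toward zero produces the click-like sharpness described above; stretching it produces the smooth, gentle ramp.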

From Phones to Surgical Robots

Vibrotactile feedback has evolved from simple on/off phone notifications to rich, high-definition haptics. Modern smartphones use wideband actuators to simulate button clicks, texture scrolling, and notification urgency levels. Game controllers create immersive environmental effects. Surgical robots restore tactile sensation lost through teleoperation. The frontier is full-body haptic suits for VR that map vibrotactile arrays across the skin surface.

FAQ

What is vibrotactile feedback?

Vibrotactile feedback uses mechanical vibration applied to the skin to convey information or create sensations. It is the most common form of haptic feedback, found in smartphones, game controllers, wearables, and automotive interfaces. The vibration frequency, amplitude, and temporal pattern determine the perceived quality.

Why does frequency matter for vibrotactile perception?

Human skin contains four types of mechanoreceptors, each tuned to different frequency ranges. Merkel cells respond to static pressure, Meissner corpuscles to 10-50 Hz flutter, Pacinian corpuscles to 40-500 Hz vibration (peak at 250 Hz), and Ruffini endings to skin stretch. Designing for the right receptor channel maximizes perceptual impact.

What actuator types are used?

Eccentric rotating mass (ERM) motors provide broad-frequency vibration but have slow response. Linear resonant actuators (LRA) are tuned to a specific frequency with fast rise times. Piezoelectric actuators offer the widest bandwidth and fastest response but require high voltage drivers.

How do you design a good haptic effect?

Start by choosing the target receptor channel (frequency range), then shape the amplitude envelope for the desired temporal profile (sharp click, smooth ramp, pulsing rhythm). Duty cycle controls perceived intensity and power consumption. User testing is essential because haptic perception is highly subjective.
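The duty-cycle tradeoff mentioned above can be made concrete: for a rectangular on/off envelope, the RMS drive level (a rough proxy for actuator power) scales with the square root of the duty cycle. A sketch, with illustrative pulse-train parameters:

```python
import numpy as np

def pulse_train(duty, period_ms=200.0, n_periods=3,
                freq_hz=150.0, sample_rate=8000):
    """150 Hz bursts gated by a rectangular duty-cycle envelope."""
    n = int(n_periods * period_ms / 1000.0 * sample_rate)
    t = np.arange(n) / sample_rate
    gate = ((t * 1000.0) % period_ms) < duty * period_ms
    return np.sin(2 * np.pi * freq_hz * t) * gate

for duty in (0.25, 0.5, 1.0):
    w = pulse_train(duty)
    print(f"duty {duty:.2f}: RMS = {np.sqrt(np.mean(w ** 2)):.3f}")
```

Halving the duty cycle roughly halves average power while the bursts remain individually strong, which is one reason pulsed patterns are popular on battery-powered wearables.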
