Haptic Texture Rendering Simulator: Surface Feel & Roughness Perception

A 2 mm wavelength texture scanned at 50 mm/s produces a 25 Hz temporal signal, placing it in the Meissner corpuscle sensitivity range. The sensation resembles fine sandpaper or woven fabric.

Formulas

f_temporal = v_scan / λ_spatial (Hz)
Perceived roughness: R ∝ particle spacing × amplitude^0.5 (Lederman model)
Friction force: F_lat = μ × F_normal × texture_factor
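The sketch below implements these three formulas directly, assuming millimeter and newton units throughout. The roughness scaling constant k_rough and the texture_factor argument are illustrative placeholders, not values taken from the simulation itself.

```python
def temporal_frequency(v_scan_mm_s: float, wavelength_mm: float) -> float:
    """f_temporal = v_scan / lambda_spatial, in Hz."""
    return v_scan_mm_s / wavelength_mm

def perceived_roughness(spacing_mm: float, amplitude_mm: float,
                        k_rough: float = 1.0) -> float:
    """Lederman-style power law: R ~ spacing * amplitude^0.5.
    k_rough is an arbitrary scaling constant (assumption)."""
    return k_rough * spacing_mm * amplitude_mm ** 0.5

def lateral_force(mu: float, f_normal_n: float, texture_factor: float) -> float:
    """F_lat = mu * F_normal * texture_factor."""
    return mu * f_normal_n * texture_factor

# Default example from the text: a 2 mm wavelength scanned at 50 mm/s
# yields 25 Hz, in the Meissner corpuscle flutter range.
print(temporal_frequency(50.0, 2.0))  # 25.0
```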

The Science of Surface Feel

Run your finger across sandpaper, silk, or wood grain — each produces a distinct tactile sensation that you recognize instantly. This remarkable perceptual ability arises from the interplay between skin mechanics, mechanoreceptor responses, and cortical processing. Haptic texture rendering aims to recreate these sensations artificially, enabling virtual reality environments where you can feel the weave of fabric or the grain of marble.

Spatial and Temporal Codes

The duplex theory of texture perception holds that coarse features (>1 mm spacing) are encoded spatially — the pattern of skin deformation across the fingerpad's mechanoreceptor array. Fine features (<1 mm) generate vibrations as the finger scans across them, encoded temporally by vibration-sensitive Pacinian corpuscles. The temporal frequency equals scanning velocity divided by spatial wavelength, linking hand movement to perceived texture.
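As a rough illustration of the duplex split, the hypothetical helper below uses the ~1 mm boundary from the text to report the dominant coding channel, and always computes the scan-induced vibration frequency, since both channels can contribute near the boundary.

```python
def duplex_channels(wavelength_mm: float, v_scan_mm_s: float) -> str:
    """Classify a texture per the duplex theory: features coarser than
    ~1 mm are resolved spatially by the mechanoreceptor array; finer
    features are encoded as vibration at f = v / lambda."""
    f = v_scan_mm_s / wavelength_mm
    channel = ("spatial (skin deformation pattern)" if wavelength_mm > 1.0
               else "temporal (Pacinian vibration)")
    return f"dominant channel: {channel}; scan vibration: {f:.0f} Hz"

print(duplex_channels(2.0, 50.0))  # spatial channel, 25 Hz vibration
print(duplex_channels(0.5, 50.0))  # temporal channel, 100 Hz vibration
```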

Rendering Approaches

Modern haptic texture displays use several strategies. Ultrasonic friction modulation creates a thin air film between finger and screen, reducing friction in proportion to an applied signal — mimicking texture bumps as the finger moves. Electrostatic displays apply voltage to attract the finger, modulating friction electrically. Pin arrays physically push the skin into texture profiles. Each approach has bandwidth and resolution trade-offs that this simulation helps visualize.
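To make friction modulation concrete, here is a minimal sketch of one common mapping: render a virtual sinusoidal bump profile by modulating friction with the local surface slope as the finger moves, so the fingertip feels resistance on up-slopes and release on down-slopes. The profile, gain, and clamping limits are assumptions for illustration, not parameters of any specific device.

```python
import math

def friction_command(x_mm: float, wavelength_mm: float = 2.0,
                     amplitude_mm: float = 0.1, mu_base: float = 0.8,
                     gain: float = 3.0) -> float:
    """Friction level at finger position x for a virtual texture
    h(x) = A * sin(2*pi*x / lambda), modulated by the slope dh/dx.
    All constants here are illustrative assumptions."""
    slope = amplitude_mm * (2 * math.pi / wavelength_mm) * math.cos(
        2 * math.pi * x_mm / wavelength_mm)
    mu = mu_base * (1 + gain * slope)
    # The display can only vary friction within physical limits.
    return max(0.0, min(mu, 2 * mu_base))

# Sample the friction command along a short stroke:
for x in [0.0, 0.5, 1.0, 1.5, 2.0]:
    print(f"x = {x} mm -> mu = {friction_command(x):.2f}")
```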

Roughness Perception Model

Perceived roughness scales with texture amplitude and spatial frequency according to psychophysical power laws. For fine textures, roughness correlates with the spectral energy of skin vibration in the Pacinian frequency range (40-500 Hz). For coarse textures, roughness correlates with the physical depth of surface features. This simulation computes temporal frequency, estimated roughness, and lateral force as you adjust texture parameters and scanning velocity.
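A minimal way to compute the fine-texture cue is to band-limit the vibration spectrum to the Pacinian range quoted above. The NumPy sketch below does this with an FFT; the synthetic test signals are assumptions for demonstration.

```python
import numpy as np

def pacinian_band_energy(vibration: np.ndarray, fs: float,
                         band=(40.0, 500.0)) -> float:
    """Sum the spectral power of a skin-vibration signal inside the
    Pacinian band (40-500 Hz, per the text). The signal and sampling
    rate fs (Hz) are user-supplied."""
    spectrum = np.fft.rfft(vibration)
    freqs = np.fft.rfftfreq(len(vibration), d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return float(np.sum(np.abs(spectrum[mask]) ** 2) / len(vibration))

# Example: a 25 Hz scan signal (below the band) vs. a 100 Hz fine texture.
fs = 2000.0
t = np.arange(0.0, 1.0, 1.0 / fs)
print(pacinian_band_energy(np.sin(2 * np.pi * 25 * t), fs))   # ~0: below band
print(pacinian_band_energy(np.sin(2 * np.pi * 100 * t), fs))  # large: in band
```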

FAQ

How do we perceive texture through touch?

Texture perception uses two complementary mechanisms: spatial coding (the pattern of skin deformation resolving individual features larger than ~1 mm) and temporal coding (vibrations generated as the finger scans across fine features). The duplex theory holds that coarse textures use spatial codes while fine textures rely on vibratory cues.

What is haptic texture rendering?

Haptic texture rendering creates the sensation of touching textured surfaces through actuator-driven skin stimulation. Methods include vibrotactile replay (playing recorded texture vibrations), friction modulation (using electrostatics or ultrasonics to vary surface friction), and direct force rendering (using robotic devices to push the finger along texture profiles).

How does scanning speed affect texture perception?

Scanning velocity converts spatial texture features into temporal vibrations: f_temporal = velocity / wavelength. Faster scanning increases the vibration frequency. Remarkably, the brain largely compensates for velocity changes, maintaining stable roughness perception across a range of scanning speeds.

What devices render haptic textures?

Ultrasonic friction displays (e.g., TPad, TeslaTouch) modulate surface friction using a thin air film. Electrostatic displays vary friction by applying voltage to the fingertip. Mechanical pin arrays physically deform the skin. Each technology has different bandwidth, resolution, and form factor trade-offs.
