The Building Blocks of Visual Effects
William Reeves at Lucasfilm pioneered particle systems to create the Genesis effect in Star Trek II: The Wrath of Khan (1982) — a wall of fire sweeping across a planet — and published the technique in 1983. The idea was revolutionary: instead of modeling complex shapes, spawn thousands of simple points and let statistical variation create organic-looking fire, smoke, water, and explosions. Four decades later, particle systems remain the foundation of real-time visual effects.
Anatomy of a Particle
Each particle is surprisingly simple: a position, velocity, color, size, opacity, and remaining lifetime. The emitter spawns particles with randomized initial values within configured ranges. Each frame, physics (gravity, wind, drag) updates the velocity, velocity updates the position, and age-based curves modify color and opacity. When lifetime reaches zero, the particle dies and its slot is recycled.
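The per-particle state and update step can be sketched in a few lines. This is a minimal illustration, not any particular engine's API; the field names, the drag model, and the linear opacity fade are all assumptions.

```python
# Minimal sketch of particle state and the per-frame update described above.
# Fields, the drag model, and the linear fade are illustrative assumptions.
from dataclasses import dataclass

GRAVITY = (0.0, -9.8)  # assumed world-space gravity, units/s^2

@dataclass
class Particle:
    x: float
    y: float
    vx: float
    vy: float
    life: float       # remaining lifetime in seconds
    max_life: float   # starting lifetime, for age-based curves
    opacity: float = 1.0

def update(p: Particle, dt: float, drag: float = 0.1) -> bool:
    """Advance one particle by dt seconds. Returns False when it dies."""
    p.life -= dt
    if p.life <= 0.0:
        return False  # dead: the emitter recycles this slot
    # Physics updates velocity; velocity updates position.
    p.vx += GRAVITY[0] * dt - drag * p.vx * dt
    p.vy += GRAVITY[1] * dt - drag * p.vy * dt
    p.x += p.vx * dt
    p.y += p.vy * dt
    # Age-based curve: fade opacity linearly over the particle's life.
    p.opacity = p.life / p.max_life
    return True
```

An emitter would hold an array of these, spawning into dead slots with randomized initial values each frame.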
Fire, Smoke, and Sparks
Different effects emerge from different parameter combinations. Fire uses fast upward velocity, short lifetime, warm-to-black color ramp, and growing size. Smoke uses slow velocity, long lifetime, gray tones, and large final size. Sparks use high initial speed, gravity, point rendering, and bright white-to-orange colors. This simulation lets you switch between presets and tweak parameters to see how each effect is built from the same underlying system.
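The three presets above can be expressed as parameter tables fed to one shared spawn routine. The specific ranges, color ramps, and dictionary layout here are illustrative guesses, not values from the simulation.

```python
# Hypothetical preset tables: the same emitter, different parameters.
# All numeric ranges and colors are illustrative assumptions.
import random

PRESETS = {
    "fire":   dict(speed=(2.0, 4.0),  lifetime=(0.3, 0.8), gravity=0.0,
                   colors=["#ffcc33", "#ff6600", "#110000"],  # warm-to-black
                   start_size=0.5, end_size=1.5),              # grows
    "smoke":  dict(speed=(0.2, 0.6),  lifetime=(2.0, 5.0), gravity=0.0,
                   colors=["#888888", "#bbbbbb"],              # gray tones
                   start_size=1.0, end_size=4.0),              # large final size
    "sparks": dict(speed=(5.0, 9.0),  lifetime=(0.4, 1.2), gravity=-9.8,
                   colors=["#ffffff", "#ffaa44"],              # white-to-orange
                   start_size=0.1, end_size=0.1),              # point rendering
}

def spawn_params(preset: str) -> dict:
    """Randomize initial values within the preset's configured ranges."""
    p = PRESETS[preset]
    return {
        "speed":    random.uniform(*p["speed"]),
        "lifetime": random.uniform(*p["lifetime"]),
        "gravity":  p["gravity"],
        "colors":   p["colors"],
    }
```

Switching effects is then just swapping which table the emitter reads, which is why one system can produce all three looks.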
From Points to Production
Modern particle systems go far beyond simple points. Mesh particles emit geometry instead of camera-facing quads. Ribbon particles connect consecutive positions into trails. Vector fields guide particles along artist-authored flow patterns. Collision modules let particles bounce off scenery. The humble particle — born as a single pixel in a 1982 film — has evolved into one of the most versatile tools in the graphics programmer's arsenal.
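Of these extensions, vector-field steering is the easiest to sketch: each frame the particle's velocity is blended toward the field's direction at its current position. The swirl field and the blend weight below are illustrative assumptions, not a production implementation.

```python
# Sketch of vector-field steering: blend velocity toward the field's
# direction at the particle's position. Field and weights are assumptions.
import math

def swirl_field(x: float, y: float) -> tuple[float, float]:
    """An artist-authored flow pattern: counter-clockwise rotation about the origin."""
    r = math.hypot(x, y) or 1e-6
    return (-y / r, x / r)  # unit tangent vector

def steer(pos, vel, dt: float, influence: float = 2.0):
    """One frame of steering: pull velocity toward the field, then integrate."""
    fx, fy = swirl_field(*pos)
    vx = vel[0] + (fx - vel[0]) * influence * dt
    vy = vel[1] + (fy - vel[1]) * influence * dt
    return (pos[0] + vx * dt, pos[1] + vy * dt), (vx, vy)
```

Run over many particles and frames, this produces the coherent swirling motion that pure per-particle randomness cannot.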