HBT Experiment Simulator: Photon Bunching & Antibunching

Simulator · Intermediate · ~10 min
g⁽²⁾(0) = 2.00 — photon bunching (thermal light)

Thermal light with 10 ns coherence time shows maximum bunching at zero delay: g⁽²⁾(0) = 2. The probability of detecting two photons simultaneously is twice that expected for random arrivals.

Formulas

g⁽²⁾(τ) = 1 + exp(-2|τ|/τ_c) (thermal)
g⁽²⁾(τ) = 1 (coherent)
g⁽²⁾(τ) = 1 - exp(-|τ|/τ_life) (single photon)
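
For reference, here is a minimal Python sketch (not taken from the simulator's own code) that evaluates these three model curves; the 10 ns values for τ_c and τ_life are illustrative assumptions, not fixed parameters of the experiment.

```python
# Minimal sketch of the three g2(tau) models; parameter values are illustrative.
import numpy as np

def g2_thermal(tau, tau_c):
    """Thermal (chaotic) light: bunching peak, g2(0) = 2."""
    return 1.0 + np.exp(-2.0 * np.abs(tau) / tau_c)

def g2_coherent(tau):
    """Coherent (laser) light: flat, Poissonian arrivals."""
    return np.ones_like(tau)

def g2_single(tau, tau_life):
    """Single emitter: antibunching dip, g2(0) = 0."""
    return 1.0 - np.exp(-np.abs(tau) / tau_life)

tau = np.linspace(-50e-9, 50e-9, 1001)   # delays in seconds; index 500 is tau = 0
print(g2_thermal(tau, 10e-9)[500])       # 2.0 at zero delay
print(g2_single(tau, 10e-9)[500])        # 0.0 at zero delay
```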

A Controversial Experiment

When Hanbury Brown and Twiss announced in 1956 that photons from a thermal source tend to arrive in pairs (bunching), many physicists were skeptical — how could independent photons know about each other? The controversy spurred the development of quantum coherence theory by Roy Glauber and launched quantum optics as a discipline. This simulation recreates the HBT measurement for three fundamentally different light sources.

Measuring g⁽²⁾(τ)

The experiment splits light onto two single-photon detectors and counts coincidences as a function of the time delay τ between detection events. For thermal light, excess coincidences at τ = 0 reveal bunching — photons prefer to arrive together. The characteristic timescale of bunching equals the coherence time τ_c of the source, typically nanoseconds for filtered thermal light.
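
As a rough illustration of this counting scheme, the hypothetical sketch below (not the simulator's internals) generates Poissonian arrivals for coherent light, routes each photon to one of two detectors at a 50/50 beam splitter, histograms the detection-time differences, and normalizes by the accidental-coincidence rate. For coherent light the result is flat at g⁽²⁾(τ) ≈ 1; the rate, run time, and window are arbitrary choices.

```python
# Monte Carlo sketch of HBT coincidence counting with coherent light,
# for which exact Poissonian arrivals are easy to generate.
import numpy as np

rng = np.random.default_rng(0)
rate, T = 1e5, 1.0                        # 100 kHz photon rate, 1 s run
times = np.sort(rng.uniform(0.0, T, rng.poisson(rate * T)))

# 50/50 beam splitter: each photon goes to detector A or B at random.
to_a = rng.random(times.size) < 0.5
ta, tb = times[to_a], times[~to_a]

# Histogram of arrival-time differences tb - ta within a +/-10 us window.
window, nbins = 10e-6, 41
edges = np.linspace(-window, window, nbins + 1)
counts = np.zeros(nbins)
for t in ta:
    lo = np.searchsorted(tb, t - window)
    hi = np.searchsorted(tb, t + window)
    counts += np.histogram(tb[lo:hi] - t, bins=edges)[0]

# Normalize by the accidental rate N_A * N_B * bin_width / T,
# so that uncorrelated detectors give g2 = 1.
bin_width = edges[1] - edges[0]
g2 = counts / (ta.size * tb.size * bin_width / T)
print(g2.round(2))                        # ~1.0 in every bin, up to shot noise
```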

Three Quantum Signatures

The value of g⁽²⁾(0) cleanly separates three regimes: thermal light gives g⁽²⁾(0) = 2 (bunching from Bose-Einstein statistics), coherent laser light gives g⁽²⁾(0) = 1 (random Poisson arrivals), and a single quantum emitter gives g⁽²⁾(0) → 0 (antibunching — one photon cannot be split). The antibunching dip below 1 has no classical wave explanation and is the definitive signature of the particle nature of light.
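
A toy classifier makes the separation concrete. The g⁽²⁾(0) < 0.5 bound is the standard certification criterion for a single-photon source (an ideal two-photon Fock state already gives g⁽²⁾(0) = 0.5); the upper threshold used for "coherent" below is an arbitrary allowance for measurement noise, not a standard value.

```python
# Toy classification of a source from its measured g2(0).
def classify_source(g2_zero: float) -> str:
    if g2_zero < 0.5:          # standard single-photon certification bound
        return "single emitter (antibunched)"
    if g2_zero <= 1.2:         # illustrative noise allowance around 1
        return "coherent / laser (Poissonian)"
    return "thermal (bunched)"

for value in (0.05, 1.00, 2.00):
    print(f"g2(0) = {value:.2f} -> {classify_source(value)}")
```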

Modern Applications

HBT measurements are now routine in quantum optics labs worldwide. They certify single-photon sources for quantum cryptography, characterize quantum dots and nitrogen-vacancy centers, and probe photon statistics in cavity QED and circuit QED systems. In astronomy, revived intensity interferometry using modern detectors promises sub-milliarcsecond stellar imaging with arrays of optical telescopes.

FAQ

What is the Hanbury Brown-Twiss experiment?

The HBT experiment, first performed in 1956 by Robert Hanbury Brown and Richard Q. Twiss, measures intensity correlations of light by splitting a beam onto two detectors and recording coincidence counts as a function of time delay. It revealed photon bunching in thermal light and founded the field of quantum optics.

What is g⁽²⁾(τ)?

The second-order correlation function g⁽²⁾(τ) = ⟨I(t)I(t+τ)⟩/⟨I(t)⟩² measures the normalized probability of detecting two photons separated by time τ. g⁽²⁾(0) > 1 indicates bunching (classical), g⁽²⁾(0) = 1 is random (coherent), and g⁽²⁾(0) < 1 indicates antibunching (quantum).
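
To connect this definition to the thermal formula given earlier, the following sketch (an assumed model, not the simulator's code) simulates a chaotic field as a complex Ornstein-Uhlenbeck process, whose field correlation decays as exp(-|τ|/τ_c), and evaluates g⁽²⁾(τ) directly from the intensity trace; the estimates should track 1 + exp(-2|τ|/τ_c).

```python
# Chaotic (thermal) field as a complex Ornstein-Uhlenbeck process, then
# g2(tau) = <I(t) I(t+tau)> / <I>^2 computed from the intensity trace.
import numpy as np

rng = np.random.default_rng(1)
tau_c, dt, n = 10e-9, 0.5e-9, 400_000     # 10 ns coherence time (assumed)
a = np.exp(-dt / tau_c)
kick = np.sqrt((1.0 - a**2) / 2.0)        # stationary variance 1/2 per quadrature
xi = (rng.normal(size=n) + 1j * rng.normal(size=n)) * kick

E = np.empty(n, dtype=complex)
E[0] = xi[0]
for i in range(1, n):                     # field correlation decays as exp(-|tau|/tau_c)
    E[i] = a * E[i - 1] + xi[i]

intensity = np.abs(E) ** 2
for lag in (0, 10, 20, 40, 80):           # lags in 0.5 ns samples
    g2 = np.mean(intensity[: n - lag] * intensity[lag:]) / np.mean(intensity) ** 2
    tau = lag * dt
    print(f"tau = {tau * 1e9:4.1f} ns: g2 = {g2:.2f} "
          f"(model {1 + np.exp(-2 * tau / tau_c):.2f})")
```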

What is photon antibunching?

Photon antibunching (g⁽²⁾(0) < 1) means photons tend to arrive one at a time rather than in pairs. It has no classical explanation and is the gold standard for certifying single-photon sources. First observed by Kimble, Dagenais and Mandel in 1977 using resonance fluorescence from sodium atoms.

How did HBT change astronomy?

Hanbury Brown and Twiss originally developed intensity interferometry to measure stellar angular diameters, achieving results impossible with conventional amplitude interferometry due to atmospheric turbulence. Their Narrabri stellar intensity interferometer measured dozens of star diameters in the 1960s and 1970s.
