Earthquake Clocks
Faults accumulate tectonic stress between earthquakes at a roughly constant rate, then release it suddenly during rupture. This cycle suggests quasi-periodic recurrence — not perfectly regular, but not random either. Paleoseismic data from trench excavations provide the dated earthquake sequences needed to estimate the mean recurrence interval and its variability, the foundation for time-dependent seismic hazard assessment.
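The estimation step can be sketched in a few lines: difference the dated ruptures to get inter-event times, then compute their mean and coefficient of variation. The event dates below are hypothetical, and real paleoseismic dates carry dating uncertainties that this sketch ignores.

```python
import statistics

# Hypothetical trench-derived rupture dates (years CE) for one fault;
# real paleoseismic dates have large uncertainties omitted here.
event_years = [640, 980, 1250, 1580, 1857]

# Inter-event times between successive ruptures
intervals = [b - a for a, b in zip(event_years, event_years[1:])]

mean_interval = statistics.mean(intervals)        # estimate of the mean recurrence
cv = statistics.stdev(intervals) / mean_interval  # coefficient of variation (aperiodicity)

print(f"intervals: {intervals}")
print(f"mean recurrence: {mean_interval:.0f} yr, CV: {cv:.2f}")
```

With only a handful of dated events, the CV estimate itself is highly uncertain — one reason hazard models treat it as a tunable parameter rather than a measured constant.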
Probability Models
The simplest model treats earthquakes as a Poisson process: the probability of rupture in any year is the same regardless of when the last earthquake occurred. But real fault records cluster around a mean interval, motivating time-dependent models. The Brownian Passage Time (BPT) distribution models the earthquake cycle as steady stress accumulation perturbed by random fluctuations, producing a hazard rate that increases as elapsed time grows — capturing the intuition that a fault quiet longer than its mean interval is 'overdue'.
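The BPT distribution is the inverse Gaussian with mean μ and aperiodicity (CV) α, so its density and CDF have closed forms, and the rising hazard rate can be checked numerically. A minimal sketch, with μ = 300 yr and α = 0.5 chosen purely for illustration:

```python
import math

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def bpt_cdf(t, mu, alpha):
    """CDF of the Brownian Passage Time (inverse Gaussian) distribution
    with mean mu and aperiodicity alpha."""
    u1 = (math.sqrt(t / mu) - math.sqrt(mu / t)) / alpha
    u2 = (math.sqrt(t / mu) + math.sqrt(mu / t)) / alpha
    return phi(u1) + math.exp(2.0 / alpha**2) * phi(-u2)

def bpt_pdf(t, mu, alpha):
    """BPT probability density."""
    return math.sqrt(mu / (2.0 * math.pi * alpha**2 * t**3)) * \
        math.exp(-(t - mu)**2 / (2.0 * alpha**2 * mu * t))

def hazard(t, mu, alpha):
    """Instantaneous rupture rate at elapsed time t: f(t) / (1 - F(t))."""
    return bpt_pdf(t, mu, alpha) / (1.0 - bpt_cdf(t, mu, alpha))

mu, alpha = 300.0, 0.5  # illustrative values, not from the text
for t in (100, 200, 300, 400):
    print(f"t = {t:3d} yr  hazard = {hazard(t, mu, alpha):.5f} /yr")
```

Running this shows the hazard climbing with elapsed time — unlike a Poisson process, whose hazard would print the same constant 1/μ on every line.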
Conditional Probability
The key output for hazard planning is the conditional probability: given that the last earthquake was t years ago, what is the probability of another in the next W years? For a fault with 300-year mean recurrence and 200 years elapsed, this probability depends critically on the coefficient of variation. Regular faults (CV=0.3) show sharply increasing hazard as elapsed time approaches the mean; irregular faults (CV=0.8) behave nearly like Poisson processes.
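The conditional probability follows directly from the BPT CDF: P(rupture in the next W years | quiet for t years) = (F(t+W) − F(t)) / (1 − F(t)). A sketch using the scenario above (μ = 300 yr, t = 200 yr) with an assumed W = 30 yr forecast window, comparing both CV values against the memoryless Poisson baseline:

```python
import math

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def bpt_cdf(t, mu, alpha):
    """BPT (inverse Gaussian) CDF with mean mu and aperiodicity alpha."""
    u1 = (math.sqrt(t / mu) - math.sqrt(mu / t)) / alpha
    u2 = (math.sqrt(t / mu) + math.sqrt(mu / t)) / alpha
    return phi(u1) + math.exp(2.0 / alpha**2) * phi(-u2)

def conditional_prob(t, w, mu, alpha):
    """P(rupture within the next w years | no rupture in the last t years)."""
    survival = 1.0 - bpt_cdf(t, mu, alpha)
    return (bpt_cdf(t + w, mu, alpha) - bpt_cdf(t, mu, alpha)) / survival

mu, t, w = 300.0, 200.0, 30.0   # w = 30 yr window is an assumed example value
for alpha in (0.3, 0.8):
    print(f"CV = {alpha}: P = {conditional_prob(t, w, mu, alpha):.3f}")

# Memoryless baseline: elapsed time is irrelevant for a Poisson process
print(f"Poisson: P = {1.0 - math.exp(-w / mu):.3f}")
```

Rerunning with larger t makes the contrast vivid: the Poisson line never moves, while the CV = 0.3 probability rises steeply once the elapsed time passes the mean.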
From Probability to Policy
Recurrence statistics drive building codes, insurance rates, and emergency preparedness. The USGS earthquake probability maps for California combine recurrence intervals from dozens of faults, weighting time-dependent and Poisson models. This simulation lets you explore how the statistical parameters — mean interval, variability, and elapsed time — shape the probability landscape that guides billion-dollar infrastructure decisions.
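The model weighting the paragraph mentions can be sketched as a simple convex combination of the two forecasts. The 50/50 weights, window, and fault parameters below are hypothetical placeholders — actual USGS forecasts assign logic-tree weights per fault:

```python
import math

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def bpt_cdf(t, mu, alpha):
    """BPT (inverse Gaussian) CDF with mean mu and aperiodicity alpha."""
    u1 = (math.sqrt(t / mu) - math.sqrt(mu / t)) / alpha
    u2 = (math.sqrt(t / mu) + math.sqrt(mu / t)) / alpha
    return phi(u1) + math.exp(2.0 / alpha**2) * phi(-u2)

def bpt_conditional(t, w, mu, alpha):
    """Time-dependent probability of rupture in the next w years."""
    s = 1.0 - bpt_cdf(t, mu, alpha)
    return (bpt_cdf(t + w, mu, alpha) - bpt_cdf(t, mu, alpha)) / s

def poisson_prob(w, mu):
    """Time-independent probability of rupture in the next w years."""
    return 1.0 - math.exp(-w / mu)

# Hypothetical weights and fault parameters for illustration only
w_bpt, w_poisson = 0.5, 0.5
mu, alpha, t, horizon = 300.0, 0.5, 200.0, 30.0

p_td = bpt_conditional(t, horizon, mu, alpha)
p_p = poisson_prob(horizon, mu)
blended = w_bpt * p_td + w_poisson * p_p
print(f"time-dependent: {p_td:.3f}  Poisson: {p_p:.3f}  blended: {blended:.3f}")
```

The blend always lands between the two component probabilities, so the weighting acts as a dial between "the clock matters" and "the clock is unreadable."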