Quantifying Earthquakes
Earthquake magnitude is a logarithmic measure of the energy released at the source. Charles Richter's 1935 local magnitude scale ML, commonly called 'the Richter scale', was the first quantitative measure of earthquake size, relating the maximum trace amplitude recorded on a standard Wood-Anderson seismograph to the earthquake's size after correcting for distance. Though conceptually simple, ML saturates above about magnitude 6.5: the high-frequency waves it measures cannot grow indefinitely, so progressively larger earthquakes register nearly the same amplitude.
From Amplitude to Magnitude
The local magnitude formula ML = log10(A) + 2.56×log10(Δ) - 1.67 converts the maximum displacement amplitude A (in mm) at epicentral distance Δ (in km) into a magnitude value. Because the scale is logarithmic, each unit increase in magnitude corresponds to a tenfold increase in measured amplitude. Body-wave magnitude mb instead uses the ratio of amplitude to period (A/T) of teleseismic P-waves, mb = log10(A/T) + Q(Δ, h), where the empirical Q function corrects for epicentral distance Δ and focal depth h.
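A minimal sketch of both conversions in Python, using only the constants given above; the function names are illustrative, and the Q correction is passed in as a number because in practice it must be read from empirical tables:

```python
import math

def local_magnitude(amplitude_mm: float, distance_km: float) -> float:
    """ML = log10(A) + 2.56*log10(delta) - 1.67, with A in mm, delta in km."""
    return math.log10(amplitude_mm) + 2.56 * math.log10(distance_km) - 1.67

def body_wave_magnitude(amplitude: float, period_s: float, q_correction: float) -> float:
    """mb = log10(A/T) + Q(delta, h); Q is looked up from empirical tables."""
    return math.log10(amplitude / period_s) + q_correction

# A 1 mm peak trace amplitude recorded 100 km from the epicentre:
# ML = 0 + 2.56*2 - 1.67 = 3.45
print(local_magnitude(1.0, 100.0))
```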
Energy and Intensity
The Gutenberg-Richter energy-magnitude relation log10(E) = 1.5M + 4.8 (E in joules) reveals that each magnitude unit represents 10^1.5 ≈ 31.6 times more radiated energy. A magnitude 5 releases about 2×10¹² joules, equivalent to roughly 500 tons of TNT, while a magnitude 9 releases 2×10¹⁸ joules, comparable to 480 megatons. Modified Mercalli Intensity (MMI), by contrast, describes the perceived shaking and damage at a given location, which depends not only on magnitude but also on depth, distance, and local soil conditions.
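To make the arithmetic concrete, here is a short Python sketch of the energy relation and the TNT equivalence; the helper name is mine, and the conversion uses the standard value of 4.184×10⁹ joules per ton of TNT:

```python
TNT_TON_JOULES = 4.184e9  # energy released by one ton of TNT

def radiated_energy(magnitude: float) -> float:
    """Gutenberg-Richter relation: log10(E) = 1.5*M + 4.8, E in joules."""
    return 10 ** (1.5 * magnitude + 4.8)

for m in (5.0, 9.0):
    e = radiated_energy(m)
    print(f"M{m:.0f}: {e:.1e} J  ~ {e / TNT_TON_JOULES:,.0f} tons of TNT")
# M5: 2.0e+12 J  ~ 477 tons of TNT
# M9: 2.0e+18 J  ~ 476,837,144 tons of TNT (about 480 megatons)
```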
Modern Moment Magnitude
Moment magnitude Mw, introduced by Kanamori in 1977, is derived from the seismic moment M₀ = μAD, where μ is the rigidity of the faulted rock, A is the fault rupture area, and D is the average slip. Because the moment captures the work done over the entire rupture rather than the amplitude of a single wave, Mw never saturates, and it is now the worldwide standard for reporting earthquake size. In practice the moment is determined from long-period seismic waveforms or from GPS geodetic measurements of ground deformation.
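A Python sketch tying the pieces together: M₀ = μAD as defined above, then converted to Mw with the standard form Mw = (2/3)(log10(M₀) - 9.1) for M₀ in N·m. That conversion constant comes from Hanks and Kanamori (1979) rather than the text above, and the fault parameters below are purely illustrative:

```python
import math

def seismic_moment(rigidity_pa: float, area_m2: float, slip_m: float) -> float:
    """Seismic moment M0 = mu * A * D, in newton-metres."""
    return rigidity_pa * area_m2 * slip_m

def moment_magnitude(m0_newton_metres: float) -> float:
    """Mw = (2/3) * (log10(M0) - 9.1), M0 in N·m (Hanks & Kanamori, 1979)."""
    return (2.0 / 3.0) * (math.log10(m0_newton_metres) - 9.1)

# Illustrative rupture: mu = 30 GPa (a typical crustal rigidity),
# a 100 km x 20 km fault plane, and 2 m of average slip.
m0 = seismic_moment(30e9, 100e3 * 20e3, 2.0)  # 1.2e20 N·m
print(moment_magnitude(m0))                   # ~7.3
```

Because Mw grows with the logarithm of the full rupture's moment, doubling the slip or the rupture area raises the magnitude by only about 0.2, which is why great earthquakes are distinguishable on this scale where amplitude-based scales saturate.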