The Ultimate Speed Limit
In 1948, Claude Shannon published a theorem that stunned the engineering world: every communication channel has a maximum data rate, determined solely by its bandwidth and signal-to-noise ratio. Below that rate, transmission with arbitrarily low error probability is theoretically possible; above it, it is not. This capacity limit, C = B·log₂(1 + S/N), known as the Shannon–Hartley theorem, is arguably the most important equation in telecommunications, guiding the design of every modern wireless and wired system.
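As a quick illustration, the capacity formula is a one-liner; the channel parameters below (a 20 MHz channel at 30 dB SNR) are arbitrary example values, not taken from any particular system:

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: 20 MHz of bandwidth at 30 dB SNR (30 dB = a linear ratio of 1000)
capacity = shannon_capacity(20e6, 10 ** (30 / 10))
print(f"{capacity / 1e6:.1f} Mbit/s")  # → 199.3 Mbit/s
```

Note that SNR enters the formula as a linear power ratio, so decibel values must be converted first via 10^(dB/10).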
Bandwidth vs. Power
The Shannon formula reveals two paths to higher capacity: increase bandwidth or increase signal power. In the power-rich, bandwidth-limited regime (like crowded urban spectrum), capacity grows only logarithmically with power: each additional 3 dB of SNR buys just one more bit/s/Hz. In the power-limited, bandwidth-rich regime (like deep-space links), capacity grows roughly linearly with signal power, while extra bandwidth yields diminishing returns toward the asymptote S/(N₀·ln 2), where N₀ is the noise power spectral density. Real systems operate somewhere between these extremes.
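The two regimes can be checked numerically. A small sketch, assuming room-temperature thermal noise (N₀ ≈ 4×10⁻²¹ W/Hz, i.e. about -174 dBm/Hz) and hypothetical power and bandwidth values:

```python
import math

def capacity_bps(b_hz: float, p_w: float, n0: float = 4e-21) -> float:
    """Shannon capacity C = B * log2(1 + P / (N0 * B)).
    n0 is the noise power spectral density (~thermal noise at 290 K)."""
    return b_hz * math.log2(1 + p_w / (n0 * b_hz))

# Bandwidth-limited (high SNR): doubling power adds ~1 bit/s/Hz
b = 20e6
print(capacity_bps(b, 1e-9) / b)   # spectral efficiency at P
print(capacity_bps(b, 2e-9) / b)   # ~1 bit/s/Hz higher at 2P

# Power-limited (low SNR): doubling power roughly doubles capacity
b = 1e9
print(capacity_bps(b, 1e-15))      # capacity at P
print(capacity_bps(b, 2e-15))      # ~2x the capacity at 2P
```

The contrast is the whole point: the same 3 dB of extra power is worth a fixed 1 bit/s/Hz when SNR is high, but a full doubling of throughput when SNR is low.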
Approaching the Limit
For decades after Shannon's theorem, practical codes fell far short of the limit. The breakthrough came in 1993 with turbo codes and in 1996 with the rediscovery of LDPC codes, both achieving performance within a fraction of a decibel of capacity. Today, 5G NR and Wi-Fi 6 use LDPC codes that operate within about 0.5 dB of the Shannon limit; engineering has essentially closed the gap that information theory showed could be closed.
From Theory to Practice
This simulation computes the Shannon capacity for realistic parameters: you set the bandwidth, signal power, receiver noise figure, and temperature, and see the theoretical maximum throughput alongside the SNR and spectral efficiency. Compare the results to real-world systems — Wi-Fi, LTE, satellite links — to see how close modern technology comes to the fundamental bound.
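A minimal sketch of the computation such a simulation performs, assuming the standard noise model N = k·T·B·F (thermal noise raised by the receiver's linear noise factor F); the example input values are hypothetical, not measurements of any real link:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def link_capacity(bandwidth_hz: float, signal_power_w: float,
                  noise_figure_db: float, temp_k: float = 290.0) -> dict:
    """Compute SNR, Shannon capacity, and spectral efficiency for a link."""
    noise_factor = 10 ** (noise_figure_db / 10)          # noise figure, dB -> linear
    noise_power_w = K_B * temp_k * bandwidth_hz * noise_factor  # N = k*T*B*F
    snr = signal_power_w / noise_power_w
    capacity_bps = bandwidth_hz * math.log2(1 + snr)
    return {
        "snr_db": 10 * math.log10(snr),
        "capacity_bps": capacity_bps,
        "spectral_efficiency": capacity_bps / bandwidth_hz,  # bit/s/Hz
    }

# Hypothetical Wi-Fi-like link: 80 MHz channel, -60 dBm received power, 7 dB NF
result = link_capacity(80e6, 10 ** (-60 / 10) * 1e-3, 7.0)
print(result)
```

With these inputs the model gives an SNR near 28 dB and a theoretical ceiling of several hundred Mbit/s; comparing such numbers against a real device's achieved throughput is exactly the exercise the simulation invites.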