Hebb's Postulate: A Simple Rule for Learning
In 1949, Canadian psychologist Donald Hebb proposed a deceptively simple idea: 'When an axon of cell A is near enough to excite cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased.' This sentence became one of the most influential in neuroscience. Popularly shortened to 'neurons that fire together wire together,' Hebb's postulate provided the first concrete mechanism for how experience could modify brain connectivity, the basis of learning and memory.
The Mathematics of Correlation-Based Learning
The basic Hebbian learning rule is mathematically simple: the change in synaptic weight w_ij between neurons i and j is proportional to the product of their activities: Δw_ij = η · x_i · x_j. When both neurons are active (positive values), the weight increases. When either neuron is inactive (activity zero), the product is zero and the weight is unchanged. This rule naturally detects correlations in neural activity: connections between frequently co-active neurons strengthen, while connections between uncorrelated neurons remain weak. The learning rate η controls how quickly connections change, balancing plasticity (the ability to learn new patterns) against stability (retaining old memories).
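The rule above can be sketched in a few lines of NumPy. This is a minimal illustration, not a biological simulation; the learning rate and activity values are arbitrary choices for the demo.

```python
import numpy as np

def hebbian_update(w, x_pre, x_post, eta=0.1):
    """Basic Hebbian rule: Δw_ij = η · x_i · x_j.

    w      -- weight matrix, shape (post, pre)
    x_pre  -- presynaptic activity vector
    x_post -- postsynaptic activity vector
    eta    -- learning rate (illustrative value)
    """
    return w + eta * np.outer(x_post, x_pre)

# One postsynaptic neuron, two inputs: input 0 fires with it, input 1 is silent.
w = np.zeros((1, 2))
x_pre = np.array([1.0, 0.0])
x_post = np.array([1.0])
for _ in range(10):
    w = hebbian_update(w, x_pre, x_post)
print(w)  # the co-active connection grows; the silent one stays at zero
```

After ten pairings the weight from the co-active input has grown to roughly 1.0, while the weight from the silent input remains exactly 0, which is the correlation-detecting behavior described above.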
Cell Assemblies and Associative Memory
Hebb predicted that repeated co-activation would create 'cell assemblies' — groups of neurons with strong mutual connections that function as a unit. Modern neuroscience has confirmed these assemblies exist throughout the brain. When you see a familiar face, a specific assembly of neurons activates together. If you then hear that person's name, overlapping assemblies link the visual and auditory representations. The power of cell assemblies lies in pattern completion: activating just part of the assembly (a partial cue) causes the rest to fire, completing the memory. This is why a smell can trigger a vivid childhood memory — the olfactory neurons activate part of a stored assembly.
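Pattern completion can be demonstrated with a Hopfield-style network, a standard computational model of Hebbian associative memory (the model is an established one, but this particular toy example is constructed for illustration). A pattern of ±1 activities is stored via outer-product Hebbian weights, and a corrupted cue is allowed to settle:

```python
import numpy as np

# Store one "cell assembly" pattern with Hebbian outer-product weights.
pattern = np.array([1, 1, -1, -1, 1, -1, 1, -1])
W = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(W, 0)                      # no self-connections

# Present a partial cue: the second half of the pattern is corrupted.
cue = pattern.copy()
cue[4:] = 1

# Let the network settle: each neuron takes the sign of its weighted input.
state = cue
for _ in range(5):
    state = np.sign(W @ state)

print(np.array_equal(state, pattern))  # → True: the full memory is recovered
```

Activating only part of the stored assembly drives the remaining neurons toward their stored values, just as a partial sensory cue (a smell, a glimpse of a face) can reinstate a full memory.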
The Stability-Plasticity Dilemma
Pure Hebbian learning has a fatal flaw: weights grow without bound. Once a connection strengthens, the correlated activity increases further, driving even more strengthening — a positive feedback loop that saturates all synapses. Biological brains solve this through multiple mechanisms: synaptic decay (forgetting), homeostatic plasticity (neurons adjust their excitability to maintain stable firing rates), and competitive learning (strengthening some connections weakens others). The Oja rule (Δw = η·x·y − η·y²·w, where y is the postsynaptic output) adds a normalization term that prevents saturation while preserving the correlation-detecting property. Modern spike-timing-dependent plasticity (STDP) further refines Hebb's rule by showing that the precise timing of pre- and postsynaptic spikes — not just their co-occurrence — determines whether connections strengthen or weaken.
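The stabilizing effect of Oja's normalization term can be seen directly: trained on random inputs, the weight vector's norm settles near 1 instead of diverging as the pure Hebbian rule would. The input covariance matrix and learning rate below are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def oja_update(w, x, eta):
    """Oja's rule: Δw = η·x·y − η·y²·w, with output y = w·x."""
    y = w @ x
    return w + eta * (y * x - (y ** 2) * w)

# Draw inputs with an assumed 2-D covariance structure.
C = np.array([[3.0, 1.0],
              [1.0, 2.0]])
L = np.linalg.cholesky(C)

w = rng.normal(size=2) * 0.1        # small random initial weights
for _ in range(20000):
    x = L @ rng.normal(size=2)      # sample an input with covariance C
    w = oja_update(w, x, eta=0.002)

print(np.linalg.norm(w))            # stays near 1: the y²·w decay term bounds growth
```

The same loop with the decay term removed (i.e. Δw = η·x·y) sends the norm toward infinity, which is exactly the saturation problem the paragraph describes.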