The Evolution of Cooperation

Axelrod's Tournament & Spatial Prisoner's Dilemma — from Beinhocker's "The Origin of Wealth"

In the 1980s, Robert Axelrod invited game theorists to submit strategies for an iterated Prisoner's Dilemma tournament. The winner was Tit-for-Tat — the simplest strategy imaginable: cooperate first, then copy what your opponent did last round. Watch as strategies compete, evolve, and form spatial patterns on the grid below. Cooperation isn't naive — it's an emergent, evolutionarily stable strategy.

Beinhocker's Key Insight

Beinhocker uses Axelrod's tournaments to show that cooperation is not a mystery requiring altruism — it emerges naturally from repeated interactions and evolutionary selection. In a world of pure one-shot encounters, defection dominates. But add memory, repetition, and spatial structure, and cooperative strategies like Tit-for-Tat form clusters that resist invasion by defectors. The shadow of the future makes cooperation rational.
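The iterated game described above is easy to sketch in code. Below is a minimal Python version (not the simulation's actual implementation) of Tit-for-Tat facing Always-Defect, using the payoffs given later on this page (T=5, R=3, P=1, S=0):

```python
# Payoff table: PAYOFF[(my_move, their_move)] -> (my_points, their_points)
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(history):
    # Cooperate on the first move, then copy the opponent's last move.
    return "C" if not history else history[-1]

def always_defect(history):
    return "D"

def play(strategy_a, strategy_b, rounds=10):
    hist_a, hist_b = [], []   # each side's record of the *opponent's* moves
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(hist_a)
        move_b = strategy_b(hist_b)
        pa, pb = PAYOFF[(move_a, move_b)]
        score_a += pa
        score_b += pb
        hist_a.append(move_b)   # A remembers what B did
        hist_b.append(move_a)
    return score_a, score_b

print(play(tit_for_tat, always_defect))  # → (9, 14)
print(play(tit_for_tat, tit_for_tat))    # → (30, 30)
```

Note what the scores show: Tit-for-Tat is exploited only once, then punishes every defection, while two Tit-for-Tat players lock into full mutual cooperation. That is why it thrives in a population of repeated interactions despite never outscoring any single opponent.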


Spatial Strategy Map

Each cell is an agent colored by strategy. Watch cooperative clusters form and defectors try to invade their borders.
Axelrod Fun Fact: In Axelrod's first tournament (1980), 14 strategies competed. Tit-for-Tat, submitted by Anatol Rapoport, won despite never "beating" any individual opponent. It won by eliciting cooperation from others.

Cooperation Rate Over Time

The fraction of all actions that were cooperative. Rising cooperation = strategies learning to work together.
Axelrod Fun Fact: The key properties of successful strategies in Axelrod's tournament were: nice (cooperate first), retaliatory (punish defection), forgiving (return to cooperation), and clear (easy for opponents to model).

Strategy Population Over Time

How each strategy's share of the population changes. Watch for competitive exclusion and coexistence.
Evolution Fun Fact: Nowak & May (1992) showed that spatial structure alone can sustain cooperation. Cooperators survive by forming clusters where they interact mostly with each other, earning mutual cooperation payoffs that defectors at the edges cannot match.
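The Nowak & May mechanism can be sketched in a few lines of Python. This is a simplified stand-in for the simulation above: the payoffs are the ones quoted on this page (T=5, R=3, P=1, S=0), while the grid size, the wrap-around boundary, and synchronous "copy your best-scoring neighbour" updating are simplifying assumptions.

```python
import random

N = 8  # small wrap-around grid of "C" (cooperate) / "D" (defect) cells
PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

def neighbours(i, j):
    # The four orthogonal neighbours, with wrap-around edges.
    return [((i - 1) % N, j), ((i + 1) % N, j),
            (i, (j - 1) % N), (i, (j + 1) % N)]

def step(grid):
    # 1. Every cell plays the one-shot game against each neighbour.
    score = {}
    for i in range(N):
        for j in range(N):
            score[(i, j)] = sum(PAYOFF[(grid[i][j], grid[x][y])]
                                for x, y in neighbours(i, j))
    # 2. Every cell adopts the strategy of the best scorer among
    #    itself and its neighbours (Nowak & May's imitation rule).
    new = [[None] * N for _ in range(N)]
    for i in range(N):
        for j in range(N):
            best = max(neighbours(i, j) + [(i, j)], key=lambda c: score[c])
            new[i][j] = grid[best[0]][best[1]]
    return new

random.seed(0)
grid = [[random.choice("CD") for _ in range(N)] for _ in range(N)]
for _ in range(20):
    grid = step(grid)
coop = sum(row.count("C") for row in grid) / N**2
print(f"cooperator share after 20 steps: {coop:.2f}")
```

The point of the sketch is the geometry: an interior cooperator earns mutual-cooperation payoffs from most of its neighbours, while a defector only profits along a cluster's border, so compact cooperative clusters can persist without memory or reputation.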

Average Payoff

Higher payoff means the population is collectively doing better. Maximum = R (mutual cooperation).
Game Theory Fun Fact: The Prisoner's Dilemma payoff matrix satisfies T > R > P > S. If both cooperate, each gets R=3. But the temptation T=5 makes defection individually rational — even though mutual cooperation (R=3) beats mutual defection (P=1). For the iterated game, the standard definition also requires 2R > T + S, so a pair cannot do better by taking turns exploiting each other than by cooperating every round.
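The dilemma in the fun fact above can be verified directly with the page's numbers (T=5, R=3, P=1, S=0):

```python
# payoff[my_move][their_move] for T=5, R=3, P=1, S=0.
payoff = {"C": {"C": 3, "D": 0}, "D": {"C": 5, "D": 1}}

for their_move in ("C", "D"):
    # Whatever the opponent plays, defecting strictly pays more...
    assert payoff["D"][their_move] > payoff["C"][their_move]
# ...yet both players would rather land on (C, C) than (D, D).
assert payoff["C"]["C"] > payoff["D"]["D"]
print("defection dominates the one-shot game, but mutual cooperation pays more")
```

This is the whole trap in four lines: each player's dominant strategy leads both to the outcome both prefer least.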

Strategy Diversity (Shannon Entropy)

Higher entropy means more diverse strategies coexist. Low entropy means one strategy dominates.
Beinhocker Fun Fact: Beinhocker argues that economies, like ecologies, exist in a state of "perpetual novelty" — never reaching equilibrium but constantly creating new strategies and niches. Diversity is the engine of innovation.
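The diversity measure plotted in this panel is standard Shannon entropy over strategy shares. A minimal Python version (the strategy names are illustrative, not the simulation's):

```python
from collections import Counter
from math import log2

def strategy_entropy(population):
    # Shannon entropy, in bits, of the strategy shares in a population.
    counts = Counter(population)
    total = len(population)
    return sum(-(n / total) * log2(n / total) for n in counts.values())

# A 50/50 mix of two strategies gives the two-strategy maximum: 1 bit.
print(strategy_entropy(["TFT", "TFT", "AllD", "AllD"]))  # → 1.0
# A monoculture has zero entropy: one strategy dominates completely.
print(strategy_entropy(["TFT"] * 4))                     # → 0.0
```

With k strategies the maximum is log2(k) bits, reached only when all shares are equal, so a falling entropy curve is exactly the "one strategy takes over" signal described above.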