In the 1980s, Robert Axelrod invited game theorists to submit strategies for an iterated Prisoner's Dilemma tournament. The winner was Tit-for-Tat, submitted by Anatol Rapoport and the simplest entry in the field: cooperate on the first round, then copy whatever your opponent did on the previous one. Watch as strategies compete, evolve, and form spatial patterns on the grid below. Cooperation isn't naive; it emerges on its own and, once established, persists under evolutionary selection.
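The logic of the tournament fits in a few lines. The sketch below is an illustration, not Axelrod's actual tournament code: the payoffs (5, 3, 1, 0) are the standard ones he used, but the opponent pool here (always-defect, always-cooperate, a grudger) is a hypothetical stand-in for the real entries.

```python
# Minimal iterated Prisoner's Dilemma round-robin (a sketch, not
# Axelrod's tournament code). Standard payoffs: mutual cooperation 3,
# mutual defection 1, lone defector 5, exploited cooperator 0.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(opp_history):
    # cooperate first, then mirror the opponent's last move
    return "C" if not opp_history else opp_history[-1]

def always_defect(opp_history):
    return "D"

def always_cooperate(opp_history):
    return "C"

def grudger(opp_history):
    # cooperate until the opponent defects once, then defect forever
    return "D" if "D" in opp_history else "C"

def play(a, b, rounds=200):
    hist_a, hist_b = [], []   # each side sees the opponent's past moves
    score_a = score_b = 0
    for _ in range(rounds):
        ma, mb = a(hist_a), b(hist_b)
        pa, pb = PAYOFF[(ma, mb)]
        score_a += pa
        score_b += pb
        hist_a.append(mb)
        hist_b.append(ma)
    return score_a, score_b

strategies = {"TFT": tit_for_tat, "ALL-D": always_defect,
              "ALL-C": always_cooperate, "GRUDGER": grudger}
totals = {name: 0 for name in strategies}
names = list(strategies)
for i, x in enumerate(names):
    for y in names[i:]:       # each pairing once, self-play included
        sx, sy = play(strategies[x], strategies[y])
        totals[x] += sx
        if x != y:
            totals[y] += sy
print(totals)
```

Note the famous twist: Tit-for-Tat never outscores any single partner within a match (at best it ties), yet it finishes at the top of this toy field, because it elicits cooperation from every strategy willing to reciprocate.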
Beinhocker uses Axelrod's tournaments to show that cooperation is not a mystery requiring altruism — it emerges naturally from repeated interactions and evolutionary selection. In a world of pure one-shot encounters, defection dominates. But add memory, repetition, and spatial structure, and cooperative strategies like Tit-for-Tat form clusters that resist invasion by defectors. The shadow of the future makes cooperation rational.
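The cluster effect shows up even in the simplest spatial variant, the one-shot lattice game of Nowak and May; this is an assumption for illustration, since the grid above may use different rules. Each cell cooperates or defects against its Moore neighbors, then imitates the best-scoring cell in its neighborhood. With a temptation payoff of b = 1.85, a 3x3 block of cooperators in a sea of defectors holds its ground:

```python
# One-shot spatial Prisoner's Dilemma on a lattice, in the style of
# Nowak & May -- a sketch; the interactive grid may use other rules.
# 1 = cooperate, 0 = defect.
# Payoffs per encounter: C vs C -> 1, D exploiting C -> b, else 0.
B = 1.85  # temptation to defect

def neighbors(i, j, n):
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            if (di, dj) != (0, 0) and 0 <= i + di < n and 0 <= j + dj < n:
                yield i + di, j + dj

def step(grid):
    n = len(grid)
    # score every cell against its Moore neighbors
    score = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for ni, nj in neighbors(i, j, n):
                if grid[i][j] == 1 and grid[ni][nj] == 1:
                    score[i][j] += 1.0      # mutual cooperation
                elif grid[i][j] == 0 and grid[ni][nj] == 1:
                    score[i][j] += B        # exploit a cooperator
    # imitation: adopt the strategy of the highest-scoring cell in the
    # neighborhood (strict improvement only, so ties keep the old move)
    new = [row[:] for row in grid]
    for i in range(n):
        for j in range(n):
            best, choice = score[i][j], grid[i][j]
            for ni, nj in neighbors(i, j, n):
                if score[ni][nj] > best:
                    best, choice = score[ni][nj], grid[ni][nj]
            new[i][j] = choice
    return new

# 7x7 sea of defectors with a 3x3 cooperator cluster in the middle
grid = [[0] * 7 for _ in range(7)]
for i in range(2, 5):
    for j in range(2, 5):
        grid[i][j] = 1

after = step(grid)
print(after == grid)  # the cluster neither shrinks nor grows: True
```

A lone cooperator in the same sea is wiped out in one step; the protection comes entirely from cooperators scoring off each other, the spatial analogue of the shadow of the future.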