The Zimbabwe Land Reform Program was only a program in the loosest sense of the word. But given enough time, it is highly likely that we will one day look back on it as a success story: because people eventually adapt.
An illustrative example: the land grab that took place in the 2000s was very similar to the land grab that took place in the 1930s (after the Land Apportionment Act was passed by the Southern Rhodesian Legislative Assembly) – only that time, it was the newly arrived white farmers doing the seizing. And by the end of the 1990s, those commercial farms had earned Zimbabwe the “Breadbasket of Africa” nickname.
I’m not saying that the process was just. I’m just saying that one day, the most recent land crisis will probably look like adaptation in retrospect too.
Which is the whole point of “Adaptation” as a business philosophy: good businesses start with failure; and with enough failures, you eventually iterate toward success.
The summarised theory: in a world of great complexity, you have to learn by trial and error.
Tim Harford describes the process as natural and evolutionary. Something about guppies in a pond: the guppy-eating cichlids eat all the brightly coloured ones, which leaves you with gravel-coloured guppies. But if there are no cichlids, then the females only want to mate with the brightly coloured males. And within three generations, the ponds with cichlids have only boring guppies, while the ponds without cichlids are a fiesta of colour.
However.
I have a fairly strong distaste for books based on anecdote. After all, for every anecdote, there’s an equal and opposite reanecdote. And mine goes like this:
“No matter how complex it gets, I’m pretty sure that a mother lioness doesn’t encourage her cubs to trial and error it with a black mamba. She teaches them to stay the bugger away from snakes.”
But I guess that both principles have to apply concurrently. Some truths are self-evident (avoid the snake), and some truths are established empirically (how to hunt).
Nevertheless, here are some good reasons to read the book:
- Harford points out that you really can’t plan for everything – and sometimes, overplanning itself can lead to disaster when a system is “tightly coupled” (for example, the financial system: large banks sell products to investors, then insure themselves with large insurers, who in turn reinsure themselves with other insurers – meaning that the whole banking and insurance system is a well-primed set of dominoes).
- And I love a good criticism of the God Complex (the delusion that you can control everything by putting enough health and safety procedures in place).
- Once you’re done, you can watch this clip:
Which, frankly, is far more concise and convincing.