You know what I think about when I hear about the epic failure of all these fancy financial models that were designed to calculate risk? I think about the Atlantic cod. These fish used to be everywhere. (Once upon a time, they were considered the cash crop of the ocean. Spanish fishing vessels would trek across the Atlantic just to fish the abundant cod off the coast of Canada.) Now the Newfoundland cod fishery is gone, yet another victim of overfishing.
The story of cod is usually told as the tragedy of trawlers. A trawler is a boat designed to drag a massive net behind it. These nets are weighted so that they hug the ocean floor, sweeping up everything for miles and miles. Most of the haul is trash – trawlers leave a trail of dead, unwanted fish – but they can also capture thousands of cod in a single pass. The use of radar made trawlers even more efficient; now they knew exactly where to drop their nets. The result was a boom in caught cod: by the late 1960s, fishermen were hauling in more than 800,000 tons of cod every year.
But trawlers aren’t entirely to blame. Their catch was still within the legal limits. (Cheating, of course, remained a big problem. Many fishing boats caught way too many fish, just as fraudulent lending helped implode the subprime market.) In fact, the Canadian government had been concerned about the cod population for decades. In the 1970s, the government instituted strict regulations that limited the catch to just 16 percent of the total cod population. The tricky part, of course, was coming up with the population estimates in the first place. It’s hard to know how many fish to catch if you don’t know how many fish there are. But fishery scientists were confident that their sophisticated models were accurate. They randomly selected areas of the ocean to sample and then, through the use of a complicated algorithm, arrived at their total estimate of the cod population. They predicted that the new regulations would allow the cod stock to steadily increase. Fish and the fishing industry would both thrive.
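To see how this kind of estimate can go wrong, here is a toy sketch (not the actual Canadian stock-assessment model; all names and numbers are invented for illustration). It samples survey blocks at random, counts what the gear catches, and scales up by an *assumed* gear efficiency. If the gear really catches a larger share of the fish present than assumed, every estimate is inflated by the same factor, and a "16 percent" quota quietly becomes a much larger one:

```python
import random

random.seed(0)

# Toy ocean: 1,000 equal-sized survey blocks with varying cod density.
N_BLOCKS = 1_000
true_densities = [max(0.0, random.gauss(500, 150)) for _ in range(N_BLOCKS)]
true_population = sum(true_densities)

def estimate_population(n_sampled, assumed_catchability, actual_catchability):
    """Sample blocks at random, record what the survey gear actually catches,
    then scale up using the *assumed* gear efficiency."""
    sampled = random.sample(true_densities, n_sampled)
    caught = sum(d * actual_catchability for d in sampled)
    mean_density_est = (caught / assumed_catchability) / n_sampled
    return mean_density_est * N_BLOCKS

# Hypothetical bias: gear assumed to catch 20% of the fish present,
# but it really catches 75%, so estimates run about 75/20 = 3.75x too high.
est = estimate_population(100, assumed_catchability=0.20, actual_catchability=0.75)
quota = 0.16 * est  # a "16% of the population" quota, set from the inflated estimate
print(f"true share harvested: {quota / true_population:.0%}")
```

Run with these invented numbers, the "16 percent" quota works out to roughly 60 percent of the true stock, which is the same factor-of-four gap the real assessments turned out to have. The model's arithmetic is flawless; the hidden assumption is what sinks it.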
The models were all wrong. The cod population never grew. By the late 1980s, even the trawlers couldn’t find cod. It was now clear that the scientists had made some grievous errors. The fishermen hadn’t been catching 16 percent of the cod population; they had been catching 60 percent of it. The models were off by nearly a factor of four. “For the cod fishery,” write Orrin Pilkey and Linda Pilkey-Jarvis in their excellent book Useless Arithmetic: Why Environmental Scientists Can’t Predict the Future, “as for most of earth’s surface systems, whether biological or geological, the complex interaction of huge numbers of parameters make mathematical modeling on a scale of predictive accuracy that would be useful to fishers a virtual impossibility.”
People love models, especially when they’re big, complex and quantitative. Models make us feel safe. They take the uncertainty of the future and break it down into neat, bite-sized equations. But here’s the problem with models, which is really a problem with the human mind: we become so focused on the predictions of the model – be it the cod population or the risk of mortgage derivatives – that we stop questioning its basic assumptions. (Instead, confirmation bias seeps in, and we devote way too much mental energy to proving the model true.) It’s not just about black swans or random outliers. After all, no black swan event triggered this most recent financial mess. There was simply an exquisite model, churning out extremely profitable predictions, that happened to rest on a false premise. Hopefully, the markets will recover more quickly than the Atlantic cod.