Sunday, February 16, 2014

It's not the knowledge that sinks you, it's the discounting.

Learning From Iraq, Katrina and Other Policy Disasters by Megan McArdle: an interview with Steve Teles, a professor who teaches a course on policy failures.
ST: First, I want to reshape their instincts … I want them to just instinctually look for the things that can go wrong and equip them with a set of cases and mechanisms for failure that are right at the front of their brain.

Second, I want to help them reason from history. Policy makers reason from analogy all the time -- it is one of the most fundamental ways they make decisions. But they do so sloppily, opportunistically, and without a very rich set of analogies.

Third, I want them to be able to communicate, in that form of analogical reasoning, to a boss who doesn't know anything.

Mainly, though, I want them to be able to apply skepticism back at themselves. A big theme in the class is that policy mistakes are caused by selective information processing, our tendency to filter out information that is uncomfortable to our beliefs … I'm trying to equip them with some mental habits to apply skepticism MOST to things they want to believe will work.

MM: I feel like most of us are pretty good at figuring out why policies we don't agree with won't work. The problem is applying that skepticism to your own side, presumably. Is that right?

ST: Well, you'd think so, but no. I mean, it certainly helps to be motivated to find things that will go wrong. But in predicting the mechanism that will cause failure, being motivated doesn't always help. That's why you need a rich set of mechanisms extracted from past experience to know where to look.

MM: It's interesting with the financial crisis and the Iraq war. The people who "predicted the crisis," or said the war was a bad idea, were, by and large, not correct about what happened, or why it was a bad idea.

ST: Yes, that's true. Although, just to be clear, in almost all cases of major policy mistakes, there were people who predicted what would happen. That is, the information that could have allowed you to know what was going to happen was available, but policy makers ignored it or discounted it.

MM: So last question: What have you learned from teaching this class? What has surprised you most?

ST: I think mainly I've learned just how hard it is to reason from history, even though we do it all the time. We all say that the "lessons" of X or Y are whatever, but a "lesson" involves extracting something from one case and applying it to a very different one. That's hard, and easy to do very badly, with terrible effects. So I think the main thing I've learned is a bit more modesty … we have no choice but to reason analogically, but we need to apply a lot of skepticism back on our own reasoning.
I like the counsel on humility.

I would take exception to, or rather I would distinguish, the comment: "Although, just to be clear, in almost all cases of major policy mistakes, there were people who predicted what would happen. That is, the information that could have allowed you to know what was going to happen was available, but policy makers ignored it or discounted it."

I don't think the problem is that the knowledge was there and simply needed incorporating. The discounting is the issue. The facts might occasionally have been overlooked, but far more often people were aware of the facts and discounted them anyway.

This goes all the way back to Troy: Laocoön warned the Trojans against Greeks bearing gifts, and he was discounted.
