
Almost no one who thinks about bias — what forms it takes, how it trips up effective decision-making, and so on — does so more often or more carefully than behavioral economists do. So it’s always interesting to hear them talk about the subject. Back in July, for example, Melissa Dahl flagged a conversation between Danny Kahneman and The Guardian’s David Shariatmadari in which Kahneman explained that if he could rid the world of one human bias, it would be overconfidence.
In an interview with Richard Thaler, another behavioral-econ godfather (can you have more than one godfather?) and the author of Misbehaving: The Making of Behavioral Economics, Katherine Milkman of the Wharton School of Business mentioned that exchange and asked Thaler if he agreed that overconfidence would be the best bias to axe.
His response is worth excerpting at length:
It’s never a good idea to disagree with Danny. I think that would also be at the top of my list. Let’s add some related biases that contribute to overconfidence, like the confirmation bias. One of the reasons we’re overconfident is that we actively seek evidence that supports our views. That’s true of everybody, that’s part of human nature, so that’s one reason we’re overconfident; we’re out there looking for support that we’re right. We rarely go out of our way to seek evidence that would contradict us. If people want to make a New Year’s resolution, it would be to test their strong beliefs by asking what would convince them that they were wrong, then looking around and seeing whether they might find some evidence for that. [emphasis mine]
The other one would be hindsight bias, a notion that was first introduced by Baruch Fischhoff, who was a graduate student of [University of Minnesota psychology professor] Paul E. Meehl. Hindsight bias is the [inclination to believe], after the fact, that things were obvious all along. Now, if you had asked people 10 years ago, “What is the prospect that we’ll have an African-American president before we have a woman president?” people would have said, “Oh, yeah, well, that could have happened. All you needed was the right guy to come along at the right time.” Or some people, of course, would say the wrong guy, but in any case…. In truth, no one thought that back then. The evidence for hindsight bias is overwhelming, and this has huge managerial implications, because when managers evaluate the decisions of their employees, they do so with hindsight.
So some project failed, and after the fact it’s obvious why it failed and obvious that the employee should have thought of it. Whereas before the fact, it wasn’t obvious to anybody; otherwise, we wouldn’t have done it. The advice I always give my students in dealing with hindsight bias is, before big decisions, get everybody to write stuff down — including the boss — and agree on what the criteria are for good and bad decisions. That will help at least a little bit — after the fact — when things blow up. We’ll have it on record that nobody anticipated the fact that our competitor was going to introduce a better version of our great idea two months before the launch, and we had no way of knowing that was going to happen.
This is all very useful information, of course. But what’s most interesting, to me at least, is the bolded part. Think about some belief you’re certain of. Try to come up with a list of the pieces of evidence that could convince you that you’re wrong about it. It’s hard, isn’t it? And it gets harder the more closely held the belief, the more it feels like an important part of who you are and how you see the world. For a lot of people and a lot of subjects, an honest answer would be “Well, there’s nothing that would convince me I’m wrong about this.” Which is fine! We’re human. And no new piece of evidence is going to come along to prove that, say, murdering an innocent person is okay, or that rain is actually produced by the earth and flies up into the sky, where it forms clouds.
But we know that human history is nothing but closely held beliefs being disproven. It takes pioneers to shoot down these ideas, and it’s always pioneers who have specific sorts of cognitive tendencies that allow them to see through widely embraced illusions. I wonder whether and to what extent we can inculcate these tendencies in ourselves. I wonder if — and I honestly can’t point to any studies that support or debunk this idea — an exercise like the one Thaler is describing might, in the long run and if done consistently and rigorously, help our brains help us see the world more accurately.

So maybe, as Thaler suggests, this is all pointing to yet another New Year’s resolution: In 2016, sit down and think really hard about why you might be wrong.