One of the most common refrains in the aftermath of the Brexit vote was that the British electorate had acted irrationally in rejecting the advice and ignoring the predictions of economic experts. But economic experts have a truly remarkable history of getting things wrong. And it turns out, as Daniel Kahneman explains in Thinking, Fast and Slow, that there is a close causal relationship between being an expert and getting things wrong:

People who spend their time, and earn their living, studying a particular topic produce poorer predictions than dart-throwing monkeys who would have distributed their choices evenly over the options. Even in the region they knew best, experts were not significantly better than nonspecialists. Those who know more forecast very slightly better than those who know less. But those with the most knowledge are often less reliable. The reason is that the person who acquires more knowledge develops an enhanced illusion of her skill and becomes unrealistically overconfident. “We reach the point of diminishing marginal predictive returns for knowledge disconcertingly quickly,” [Philip] Tetlock writes. “In this age of academic hyperspecialization, there is no reason for supposing that contributors to top journals—distinguished political scientists, area study specialists, economists, and so on—are any better than journalists or attentive readers of The New York Times in ‘reading’ emerging situations.” The more famous the forecaster, Tetlock discovered, the more flamboyant the forecasts. “Experts in demand,” he writes, “were more overconfident than their colleagues who eked out existences far from the limelight.”

So in what sense would it be rational to trust the predictions of experts? We all need to think more about what conditions produce better predictions — and what skills and virtues produce better predictors. Tetlock and Gardner have certainly made a start on that:

The humility required for good judgment is not self-doubt – the sense that you are untalented, unintelligent, or unworthy. It is intellectual humility. It is a recognition that reality is profoundly complex, that seeing things clearly is a constant struggle, when it can be done at all, and that human judgment must therefore be riddled with mistakes. This is true for fools and geniuses alike. So it’s quite possible to think highly of yourself and be intellectually humble. In fact, this combination can be wonderfully fruitful. Intellectual humility compels the careful reflection necessary for good judgment; confidence in one’s abilities inspires determined action….

What’s especially interesting here is the emphasis not on knowledge but on character — what’s needed is a certain kind of person, and especially the kind of person who is humble.

Now ask yourself this: Where does our society teach, or even promote, humility?

6 Comments

  1. heh. and there's an amusing regress lurking in the wings. because inevitably at some point someone will come out of the woodwork and publish a very serious piece about what an expert with his algorithm says about when you can feel quite confident in trusting which experts about what.

  2. I find it hard to imagine someone who thinks highly of themselves and is intellectually humble. If thinking highly of yourself means (as Gardner and Tetlock suggest) confidence in one's abilities, then wouldn't that confidence extend to one's predictive abilities? Where's the intellectual humility in that?

  3. Nick, I think it's a matter of being confident in the abilities you do have but not in the abilities you don't have. So, for instance, a really smart economist could have a proper confidence in his ability to come up with a superior theory of market fluctuation, but when asked what the stock market is going to do in the next three months would reply, "I don't know. Too many variables, too much noise in the signal. Your guess is as good as mine."

  4. The Templeton-funded "intellectual humility" project at Fuller, St Louis, and Edinburgh is doing interesting interdisciplinary work on this topic (psychology, theology, and philosophy).

  5. Interesting as Kahneman’s thesis may be, the central question is not really about expertise at all but the predictability of complex systems, specifically those derived from human culture. We should all recognize by now that we can’t agree on interpretations of our own history even with the advantage of hindsight, so the irrationality and unpredictability of most historical flows is a given. One can certainly frame questions that are easier to answer (easier than, say, the direction of financial markets this week) with greater knowledge and understanding rather than punting and throwing darts, and the broader flows are also relatively easy to spot (are we heading toward greater collectivism — globalization — or less?), but accurate and actionable predictions are few and far between. This year’s election cycle is a case in point.

    When prediction is applied to complex systems that have less to do with irrational human behavior, prophesying the future becomes (somewhat) more reliable. Population demographics might be a good example, so long as energy and the ecosphere continue to provide the supports that allow life (e.g., industrial civilization) to flourish. Once those ebb away, of course, all bets are off.

    Also, one could obviously read Kahneman (the part you quoted, anyway) as an attack on expertise: why bother to study, learn, synthesize, and understand anything when it’s all in flux anyway? That’s a rather poor outlook, but it comes directly out of the instrumental mind that apprehends things only in terms of their usefulness. Plenty of people seek to game systems for financial advantage, anticipating movements in the culture and supplying goods and/or services before someone else gets there to corner the market. That does not in any way reflect the robust varieties of human learning and endeavor that create expertise. It’s just one slice, and a craven one at that.
