Beware Of Experts Bearing Predictions
For the investment adviser seeking publicity, the value of a bold and unequivocal prediction is tough to overstate. A newsletter editor promising a surefire winner set to double will get attention, as will one predicting a drop in the Dow Industrials below 1,000.
By comparison, recommending a diversified portfolio along with some cash as a hedge against market weakness may not attract much notice. After all, you don’t need a diversified portfolio or a cash position if you know how events will unfold — and many see no point in paying an adviser who can’t see the future any better than they can.
Two flaws plague this line of reasoning. First, as legendary baseball player Yogi Berra said, “It’s tough to make predictions, especially about the future.” If your investment approach is based entirely on making predictions and betting on those specific outcomes, your returns will only be as good as your forecasts. History suggests that is a tough way to make a buck.
Second, the perceived certainty that a prediction can bring is not only illusory, it is also dangerous to your fiscal health. Once our minds latch on to a prediction, dislodging that conclusion can be difficult — even in the face of subsequent contradictory information.
The expert problem
In one of the best-known investigations into the value of expert opinions, University of California psychologist Philip Tetlock solicited predictions from 284 professional economic and political forecasters, asking them to rate the probability of several potential outcomes.
Most of the questions had only three possible outcomes, writes Jonah Lehrer, author of 2009’s “How We Decide.” Yet the pundits picked the right answer less than one-third of the time. In other words, writes Lehrer, “A dart-throwing chimp would have beaten the vast majority of professionals.”
What’s more, Tetlock found that the most famous pundits tended to be the least accurate, consistently making overblown forecasts. Those with doctorates were no more accurate than those with only undergraduate degrees. Journalists were just as accurate as professors.
The problem, Tetlock concluded, was the certainty of the experts. Experts, especially prominent ones, were so confident in the accuracy of their particular worldview that they imposed a “top-down solution on their decision-making process,” writes Lehrer. Instead of weighing the evidence and trusting their gut, the experts disregarded evidence that contradicted their worldview.
To be fair, we all tend to be overconfident, with a majority of survey respondents consistently answering they are above-average drivers. In dozens of experiments covering a wide range of professions and cultures, researchers have asked people to estimate a range of possible values for a number (such as the circumference of the earth) so that they have a 98% chance of being right. Instead of the 2% error rate one would expect if the respondents provided a sufficiently wide range, the error rate is typically between 15% and 30%.
We all like a good story, much preferring a cause-and-effect explanation to a mere sequence of events. In “The Black Swan,” Nassim Taleb explains our preference for narrative as an ingrained biological need; we want to see patterns and rules because our mind can better handle information that way. As a result, writes Taleb, we remember “not the true sequence of events but a reconstructed one,” which makes history “appear in hindsight to be far more explainable than it actually was — or is.”
Finally, we all tend to underestimate the impact of outliers, the extreme events that few predict, like the sudden breakup of the Soviet Union or the Sept. 11 terror attacks. Particularly with investments, as Taleb rightly points out, what matters is not how often you are right, but how large your cumulative errors are. Even if you predicted the market flawlessly from year-end 2003 to September 2008, all your work would have been for naught if you failed to predict the sell-off that followed the failure of Lehman Brothers.
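Taleb’s point about cumulative errors can be shown with simple compounding. The returns below are hypothetical numbers chosen for illustration, not market data: a forecaster who is “right” five years out of six still ends up underwater because of one large miss.

```python
# Hypothetical annual returns: five correct calls earning 10% each,
# then one missed sell-off costing 50%.
returns = [0.10, 0.10, 0.10, 0.10, 0.10, -0.50]

wealth = 1.0
for r in returns:
    wealth *= 1 + r  # compound each year's gain or loss

print(f"final wealth per $1 invested: ${wealth:.2f}")  # -> $0.81
```

Despite being right in five of six years, the portfolio ends roughly 19% below where it started — how often the forecaster was right mattered far less than the size of the single error.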
In addition to guarding against the human frailties described above, you can take concrete steps to protect yourself from the dangers of certainty and expert opinion:
• First, take expert predictions with a big grain of salt. Prognosticators are paid to predict, and that’s what they’ll do whether they have any particular insight or not. Be particularly wary of pundits making bombastic, ironclad predictions. The best pundits realize their predictions might be wrong, so commentators who seem most confident are the most dangerous.
• Second, realize that we are all predisposed toward certainty and precise predictions. It feels good to be certain. Because we don’t like being pulled by our mind in different directions, we sometimes rush to judgment. The best antidote for this is to actively encourage some inner dissonance. Force yourself to consider alternative conclusions and information that contradicts your ideas.
• Third, pay attention to the way you arrive at decisions. As much as possible, make your decisions part of a process, a repeatable system you can amend based on your results. Learn from your mistakes.