Got One Way Of Looking At The World? It's Wrong


It's difficult, but not impossible, to predict the future. In Superforecasting, Philip Tetlock profiles a small group of people who were able to successfully predict geopolitical events. More on this in a minute.

Tetlock puts wannabe visionaries into two categories:

Hedgehogs: folks who rely on one or two big ideas to understand the world and where it is going

Foxes: people who scoff at the idea of using one model to understand the world, and who instead seek out whichever approach best fits the question at hand

Who do you think does best?

I hope you said foxes, because they consistently win.

And now for the backstory. The Intelligence Advanced Research Projects Activity (IARPA) identifies and supports "high-risk, high-payoff" research. In 2010, they told Tetlock of their plans to sponsor a massive tournament to see who could develop the best methods for making intelligence forecasts such as:

  • Will the president of Tunisia flee to a cushy exile in the next month?
  • Will the euro fall below $1.20 in the next twelve months?

3,200 people passed through the initial stage of psychometric tests and started forecasting. To be clear, a somewhat random group of people from numerous walks of life set out to beat the forecasts of intelligence professionals. Many did.

Doug Lorch typifies the successful approach. He is actively open-minded (AOM). People like Doug actively seek out evidence and opinions that run counter to their own. As Tetlock writes, "Beliefs are hypotheses to be tested, not treasures to be guarded."

To take this idea a step further, Superforecasting makes the case that teams can also engage in AOM practices. Tetlock argues that in a team, actively open-minded thinking is "an emergent property of the group itself, a property of the communication patterns among group members."

I particularly like this passage from Tetlock's book, which talks about the humility required to be a good forecaster:

"The humility required for good judgment is not self-doubt - the sense that you are untalented, unintelligent, or unworthy. It is intellectual humility. It is a recognition that reality is profoundly complex, that seeing things clearly is a constant struggle when it can be done at all, and that human judgment must, therefore, be riddled with mistakes."

Now, you may not care about your ability to predict geopolitical events, but I'd argue that many of Tetlock's findings have applications to any professional's career. If you rely too much on one or two mental models, you will make bad decisions.

We see this all the time with leaders of previously successful firms who can't recognize how much their industry has changed. They keep trying to use the strategy that got them here, rather than a new strategy capable of advancing the company to the next level. Instead of success, they sink into stagnation.

By all means, test a strategy that has worked for you in the past, but be prepared to replace it with one that works better.

After all, wouldn't it be great if you could come up with an even better approach than the one that worked so brilliantly for you, say, last year?
