When I think back on the 16 years I spent in market research and consulting for the life sciences industry, much of my time was spent forecasting. Forecasting informs a multitude of activities and decisions, both strategic and tactical, within the industry and I was lucky enough to be immersed in many of these:

  • assessing the market potential of a new product launch
  • quantifying commercial opportunity within a new geography
  • estimating the size of a therapy area or indication
  • evaluating the impact of a new competitor on a market incumbent
  • predicting the likely returns of a product at different price points
  • gauging the commercial potential of an acquisition or divestment, be it an asset, a portfolio of assets or an entire company
  • forecasting the size of a salesforce for a new product launch, to name just a few.

All are highly important and many are instrumental in determining the commercial success of a life sciences company.

Forecasting is a complex undertaking that requires consideration of current and future events, and their likely timings and impact. One would expect the most accurate forecasts to be developed by experts with significant experience of a particular industry or market, or by those skilled in mathematical modelling techniques that can assimilate a plethora of datasets to replicate likely market scenarios.

In this week's Deloitte Monday Briefings (see http://blogs.deloitte.co.uk/mondaybriefing/), our Chief Economist, Ian Stewart, wrote a fascinating piece on "How to be a better forecaster", in an attempt to shed some light on why events often take pollsters, regulators, intelligence experts and, indeed, economists by surprise. As a forecaster myself, the contents resonated well with me and I thought that others would also find them intriguing and entertaining. Below are some extracts that I found particularly insightful.

The 'experts' aren't always the most accurate!

  • One of the leading academics in the field of forecasting, Philip Tetlock, measured the forecasting abilities of 284 experts (government officials, academics, journalists and economists) over twenty years to 2003. He famously concluded that the average expert was "roughly as accurate as a dart-throwing chimpanzee".
  • In his latest book, Superforecasting: The Art and Science of Prediction, published in 2015 and co-written with Dan Gardner, a Canadian journalist, he draws on another forecasting tournament that he ran with a research agency within the US intelligence community, the Intelligence Advanced Research Projects Activity (IARPA), between 2011 and 2015. Tetlock assembled a group of more than 2,000 curious, non-expert volunteers under the banner of the Good Judgement Project. Working in teams and individually, the volunteers were asked to forecast the likelihood of various events of the sort intelligence analysts try to predict every day, for example: "Will Saudi Arabia agree to OPEC production cuts by the end of 2014?"
  • Tetlock found that the most accurate forecasts were made by two per cent of his volunteers – a small group of so-called "superforecasters". Their performance was consistently impressive, beating everything from financial markets to trained intelligence analysts. Crucially, it was the way these people made decisions and learned, rather than deep subject knowledge, which gave them the edge over specialist intelligence analysts.

Natural forecasters 'think like a fox'

  • In their thought process the superforecasters were almost exclusively what the philosopher Isaiah Berlin described in a famous 1953 essay as "foxes". Foxes have a broad and eclectic perspective, an approach which contrasts with what Berlin described as "hedgehogs", who have depth and expertise in a narrow field.
  • To Tetlock hedgehogs are confident in their deep knowledge and are often guided by one or two theories: Keynesianism, post-liberalism, communism and so on. Foxes are sceptical about such theories, open-minded, cautious in their forecasts and quick to adjust their ideas as events change.
  • Rather than rely on one or two simplifying ideas to explain events, Tetlock's superforecaster foxes embraced complexity and were comfortable with a sense of doubt. Indeed, the best forecasters continuously sought ways to make better predictions, displayed a mix of determination, self-reflection and willingness to learn from mistakes, while keeping an honest tally of successes and failures. Bad forecasters ignore their mistakes; good ones acknowledge and learn from them.
  • Tetlock improved accuracy by having the forecasters work in teams. In year one the researchers randomly assigned some forecasters to work in teams and provided them with tips on how to work together effectively. Others worked alone. Their results showed that teams were on average 23 per cent more accurate than individuals.
  • Sharing and debating ideas can significantly improve forecast scores. But some teams did poorly: free riding by individual members was one problem; another was a tendency to groupthink, which weakened the very scepticism and openness that Tetlock finds essential to good forecasting.
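That "honest tally of successes and failures" was not informal: Tetlock's tournaments scored volunteers with Brier scores, the mean squared error between a probability forecast and what actually happened. A minimal sketch of the idea (the questions, probabilities and outcomes below are invented purely for illustration):

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between probability forecasts and binary
    outcomes (0 = event did not happen, 1 = it did).
    Lower is better: 0 is perfect, and a constant 50% forecast
    always earns 0.25."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# Five hypothetical yes/no questions and their eventual outcomes.
outcomes = [1, 0, 0, 1, 0]

# A fox-like forecaster commits to varied, calibrated probabilities;
# a maximally cautious one never moves off 50%.
fox      = [0.8, 0.1, 0.3, 0.7, 0.2]
cautious = [0.5, 0.5, 0.5, 0.5, 0.5]

print(brier_score(fox, outcomes))       # 0.054 -- rewarded for informative calls
print(brier_score(cautious, outcomes))  # 0.25  -- the coin-flip benchmark
```

Note the asymmetry this metric creates: confident forecasts are rewarded only when they are right, which is why keeping score punishes both overconfidence and permanent hedging.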

We can learn to think more like a fox

  • Perhaps most encouragingly, Tetlock found that people can be trained to be better forecasters. Indeed, with just one hour of training, forecast accuracy could be improved by ten per cent.
  • For individuals, the training focused on thinking in terms of probabilities and removing thinking biases (for instance, focussing on the limitations of one's own knowledge and being open to alternative views). For groups the training aimed to strike a balance between conflict and harmony, given that too much conflict destroys the cooperation that is essential to teamwork and too much consensus leads to groupthink.
  • Sadly, humility and a willingness to change your mind are not always the hallmarks of those who prosper in the world of forecasting. Strong, well-articulated views often matter more. Tetlock's work shows that for those who want to be better at forecasting, as opposed to merely being entertaining, new ways of thinking can yield significant rewards.

When I think back to my time as a forecaster in life sciences I couldn't begin to predict whether Philip Tetlock's experiment would yield the same results, or how compatible a quick-to-change 'fox-like' mind-set would be with the long term game of drug development and commercialisation. However, in my experience, the process you take a team or client through to get to an answer is often just as important, if not more important, than the answer itself, so varying this experience by encouraging people to think more like a fox could indeed lead to significant rewards.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.