As crowdsourcing has grown in scale and effectiveness, appreciation of the fallibility of experts has grown with it. Alas, that appreciation starts from a very low base, and most policy discussions still revolve around a so-called expert panel that will shape and guide things.
A paper published recently in Nature highlights the risks of placing too much weight on expert advice. The authors suggest that experts are susceptible to a wide range of subjective influences, many of which the experts themselves are oblivious to.
How reliable are experts?
I’ve written previously on the tendency for senior leaders to rely more on gut instinct than on hard data, and the Nature article reminds us of the need to balance this instinct with less biased sources of insight.
“Policy makers use expert evidence as though it were data. So they should treat expert estimates with the same critical rigour that must be applied to data,” they write.
“Experts must be tested, their biases minimised, their accuracy improved, and their estimates validated with independent evidence. Put simply, experts should be held accountable for their opinions.”
With expert judgements often no better than those of apparent novices, what can improve our use of expert opinion? The authors offer eight suggestions to help.
Getting the most from experts
- Groups of experts are better than lone individuals, as outlandish estimates tend to cancel each other out
- Select members carefully, as value declines dramatically once people step outside their specialism
- Judge experts on their merits rather than on reputation, qualifications or experience
- Try to build groups that are as diverse as possible. Homogeneity is your enemy
- Interestingly, those who are less self-assured yet able to pull in information from diverse sources usually make better judges
- Try to gauge expertise with some test questions, and use the results to weight the opinions of your experts (a minimal sketch of this weighting follows the list)
- Train your experts in horizon-scanning activities so they can better ascribe probabilities to their predictions
- Make sure you provide regular feedback on the success (or otherwise) of predictions. Try to make the feedback as immediate and unambiguous as possible
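To make the sixth suggestion concrete, here is a minimal Python sketch of performance weighting: score each expert on test questions with known answers, then combine their estimates as a weighted average. The expert names, numbers and helper functions are all hypothetical illustrations, not taken from the paper, and real methods (such as Cooke's classical model) are considerably more sophisticated.

```python
# Hypothetical example: weight experts by accuracy on test questions,
# then pool their estimates on the real question of interest.

def accuracy_weight(estimates, truths):
    """Inverse mean absolute error on test questions -> a simple skill weight."""
    errors = [abs(e - t) for e, t in zip(estimates, truths)]
    mae = sum(errors) / len(errors)
    return 1.0 / (mae + 1e-9)  # small epsilon avoids division by zero

def weighted_pool(estimates, weights):
    """Combine one estimate per expert into a weighted average."""
    return sum(e * w for e, w in zip(estimates, weights)) / sum(weights)

# Test questions with known answers, answered by three hypothetical experts.
truths = [10.0, 50.0, 2.0]
test_answers = {
    "expert_a": [12.0, 48.0, 2.5],   # fairly accurate
    "expert_b": [30.0, 90.0, 10.0],  # consistently off
    "expert_c": [10.5, 52.0, 1.8],   # most accurate
}
weights = {name: accuracy_weight(ans, truths) for name, ans in test_answers.items()}

# Each expert's estimate on the real question, pooled by skill weight.
question_estimates = {"expert_a": 100.0, "expert_b": 400.0, "expert_c": 120.0}
names = list(question_estimates)
pooled = weighted_pool([question_estimates[n] for n in names],
                       [weights[n] for n in names])
print(f"Performance-weighted estimate: {pooled:.1f}")
```

Run on these made-up numbers, the pooled estimate lands near 120, pulled towards the two experts who did well on the test questions, whereas a naive unweighted average would be dragged to roughly 207 by the inaccurate outlier.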
The authors don’t advocate binning experts altogether; they are clear about how valuable experts can still be, but caution that they need to be used in the right way to get the most out of them.
“The cost of ignoring these techniques – of using experts inexpertly – is less accurate information and so more frequent, and more serious, policy failures,” they conclude.
How many of the eight steps do you or your organization currently use?
I think the problem is that we tend to regard expert opinion as the definitive answer to a problem rather than as one of many possible solutions. A conclusion rather than a beginning, if you like.