
How Reliable Are Experts?

If you're bringing in or working with any self-proclaimed "experts," follow these steps to make sure you're using their expertise effectively.



As crowdsourcing has grown in scale and effectiveness, there has been a growing appreciation of the fallibility of experts. Alas, that growth has been from a very low base, and most policy discussions continue to revolve around a so-called expert panel that will shape and guide things.

A paper, published recently in Nature, highlights the risks involved in placing too much emphasis on expert advice.  The authors suggest that experts are susceptible to a wide range of subjective influences, which the experts themselves are often oblivious to.


I’ve written previously on the tendency for senior leaders to rely more on gut instinct than on hard data, and the Nature article reminds us of the need to balance this instinct with less biased sources of insight.

“Policy makers use expert evidence as though it were data. So they should treat expert estimates with the same critical rigour that must be applied to data,” the authors write. “Experts must be tested, their biases minimised, their accuracy improved, and their estimates validated with independent evidence. Put simply, experts should be held accountable for their opinions.”

With expert judgements often no better than those of apparent novices, what can improve our use of expert opinion? The authors offer eight suggestions to help.

Getting the Most From Experts

  1. Groups of experts are better than individuals on their own, as outlandish suggestions tend to even themselves out.
  2. Select members carefully, as an expert's value declines dramatically once they step outside their specialism.
  3. Judge experts on their merits rather than on reputation, qualifications, or experience.
  4. Try to build groups that are as diverse as possible. Homogeneity is your enemy.
  5. Interestingly, those who are less self-assured, yet can pull in information from diverse sources, are usually better judges.
  6. Try to gauge expertise with some test questions, and use the results to weight the opinions of your experts.
  7. Train your experts in horizon-scanning activities so they can better ascribe probabilities to their predictions.
  8. Make sure you provide regular feedback on the success (or otherwise) of predictions. Try to make the feedback as instant and as unambiguous as possible.
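The weighting idea in steps 1 and 6 can be sketched in code. The following is a hypothetical illustration (not from the Nature paper): each expert is scored on test questions with known answers, and the group's numeric estimates are then pooled as a calibration-weighted average.

```python
def calibration_weight(answers, truths):
    """Score an expert by their accuracy on test questions with known answers."""
    correct = sum(1 for a, t in zip(answers, truths) if a == t)
    return correct / len(truths)

def weighted_estimate(estimates, weights):
    """Pool the experts' numeric estimates, weighted by calibration score."""
    total = sum(weights)
    if total == 0:
        return sum(estimates) / len(estimates)  # fall back to a simple mean
    return sum(e * w for e, w in zip(estimates, weights)) / total

# Three hypothetical experts answer five seed questions with known answers.
seed_truths = [True, False, True, True, False]
expert_answers = [
    [True, False, True, True, False],   # 5/5 correct -> weight 1.0
    [True, True, True, False, False],   # 3/5 correct -> weight 0.6
    [False, True, False, True, True],   # 1/5 correct -> weight 0.2
]
weights = [calibration_weight(a, seed_truths) for a in expert_answers]

# Their probability estimates for the question of interest are then pooled;
# the best-calibrated expert dominates the combined estimate.
estimates = [0.70, 0.50, 0.10]
print(weighted_estimate(estimates, weights))
```

This is deliberately minimal: a real elicitation protocol would use proper scoring rules (e.g. Brier scores) on probabilistic seed questions rather than right/wrong answers, but the principle of testing experts before trusting them is the same.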

The authors don’t advocate abandoning experts altogether; they acknowledge that experts can still be very valuable, but caution that they need to be used in the right way to get the most out of them.

“The cost of ignoring these techniques – of using experts inexpertly – is less accurate information and so more frequent, and more serious, policy failures,” they conclude.

How many of the eight steps do you or your organization currently use?



Published at DZone with permission of Adi Gaskell, DZone MVB. See the original article here.


