October 2025
We often face decisions where the answer isn't clear-cut. Whether it's estimating the impact of a new policy, forecasting the effects of climate change or assessing the potential of a new technology, we routinely rely on expert judgement to fill the gaps in our data. But how often do we pause to consider the limitations of these judgements and the potential for bias, guesswork or simply outdated information creeping in? For too long, we’ve accepted best guesses as sufficient, and it’s time we moved beyond that: expert knowledge elicitation (EKE) provides an answer.
For years, I've worked on bridging the divide between expert knowledge and analytical application via EKE, and, time after time, I have seen its value to decision makers. It's a crucial step that's often missed: how do we reliably transform the deep, contextual knowledge held by experts into actionable intelligence that drives better decisions? The disconnect between expert intuition and data-driven modelling can lead to missed opportunities and to flawed, potentially biased, strategies.
EKE is the systematic process of drawing out, documenting and quantifying the knowledge held by experts. It's more than just asking a few questions; it involves carefully designed protocols, techniques to avoid common biases (like anchoring and confirmation bias), and a focus on transparency. The Sheffield Elicitation Framework (SHELF) is a good example of a structured and robust approach to this, where multiple experts can be brought together to explore the range of views on a topic and make judgements about what a reasonable level of uncertainty should be for the decision maker.
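To make the quantification step concrete, here is a minimal sketch of one common elicitation task: converting an expert's judged quartiles for an uncertain quantity into a probability distribution that a model can use. This is illustrative only, not the SHELF software itself; the choice of a normal distribution and the example numbers are my own assumptions for the sketch.

```python
from scipy import stats

def fit_normal_to_quartiles(q25, q50, q75):
    """Fit a normal distribution to an expert's elicited quartiles.

    The median fixes the mean, and the interquartile range fixes the
    standard deviation (for a normal, IQR = 2 * 0.6745 * sigma).
    A real elicitation would compare several candidate distributions
    and feed the fit back to the expert for validation.
    """
    mu = q50
    sigma = (q75 - q25) / (2 * stats.norm.ppf(0.75))
    return stats.norm(loc=mu, scale=sigma)

# Hypothetical example: an expert judges an uncertain cost (in £m)
# to have quartiles 8, 10 and 13. The slight asymmetry means a
# normal can only approximate the judgements.
dist = fit_normal_to_quartiles(8, 10, 13)
print(dist.mean())          # median becomes the mean by construction
print(dist.cdf(13) - dist.cdf(8))  # should be close to 0.5 as feedback
```

The feedback step at the end mirrors good elicitation practice: showing the expert what probability the fitted distribution assigns to their stated interval, so they can revise their judgements if the fit misrepresents them.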
The benefits are significant. EKE, when done well, can:
Improve Accuracy: By actively probing assumptions and challenging estimates, we can arrive at more reliable values.
Increase Buy-in: Experts involved in the process are far more likely to support decisions based on the outcomes. Participation gives them ownership and a voice.
Uncover New Insights: Engaging with experts often reveals areas of uncertainty we hadn’t considered or points us towards new avenues for investigation. They bring contextual knowledge that’s impossible to capture in a purely data-driven approach.
Create a Transparent Record: A well-documented EKE process provides a clear record of how estimates were derived, which is invaluable for auditing, future revisions, and explaining decisions to stakeholders. It moves away from “we just thought it was…” to a documented and defensible reasoning process.
However, it's vital to acknowledge that EKE is not a simple undertaking. It requires meticulous preparation and a keen awareness of potential psychological pitfalls, and capturing representative judgements demands a significant investment of time and effort from everyone involved: dedicated facilitator time, expert availability and subsequent analysis. While these costs may seem substantial, they are outweighed by the improved accuracy and reduced risk associated with more informed decisions. That's also why transparency isn't just a nice-to-have: it's essential. Any potential user needs to understand the basis on which the judgements were made and to trust the process. This means well-structured questions, unambiguous definitions of the quantities of interest, and open opportunities for experts to share their expertise and reasoning.
Crucially, the most effective EKE methods borrow heavily from the social sciences, particularly fields like cognitive psychology and decision theory. Understanding how experts think, how biases affect their judgements and how to structure questioning to avoid pitfalls is essential. We’re not just extracting information; we’re managing a complex cognitive process.
We need to shift our mindset. Estimating values shouldn't be a 'best guess' exercise, but a structured, transparent, and rigorously challenged process. Expert knowledge elicitation provides a powerful framework for doing just that, and, with careful implementation, can unlock significant value for both policymakers and industry leaders. It's time to embrace a more sophisticated approach to decision-making, one that places human expertise at the core and intelligently supports it with the tools of the future (my blog on LLMs in EKE explores some of those "future" tools).
Here are some published examples that I have been involved with (and am able to talk about):
Government policy
Gosling, J.P., Hart, A., Mouat, D., Sabirovic, M., Scanlon, S. and Simmons, A. (2012). Quantifying experts' uncertainty about the future cost of exotic diseases. Risk Analysis, 32, 881-93.
Pina-Sánchez, J. and Gosling, J.P. (2022). Enhancing the measurement of sentence severity through expert knowledge elicitation. Journal of Legal Research Methodology, 2, 26-45.
Environmental modelling
Astfalck, L., Cripps, E., Gosling, J.P., Hodkiewicz, M. and Milne, I. (2018). Expert elicitation of directional metocean constituents. Ocean Engineering, 161, 268-76.
Dessai, S., Bhave, A., Birch, C., Conway, D., Garcia-Carreras, L., Gosling, J.P., Mittal, N. and Stainforth, D. (2018). Building narratives to characterise uncertainty in regional climate change through expert elicitation. Environmental Research Letters, 13(7).
Toxicology
Gosling, J.P., Hart, A., Owen, H., Davies, M., Li, J. and MacKay, C. (2013). A Bayes linear approach to weight-of-evidence risk assessment for skin allergy. Bayesian Analysis, 8, 169-86.
Boobis, A., Flari, V., Gosling, J.P., Hart, A., Craig, P., Rushton, L. and Idahosa-Taylor, E. (2013). Interpretation of the margin of exposure for genotoxic carcinogens - elicitation of expert knowledge about the form of the dose response curve at human relevant exposures. Food and Chemical Toxicology, 57, 106-18.
Holzhauer, B., Hampson, L.V., Gosling, J.P., Bornkamp, B., Kahn, J., Lange, M.R., Luo, W.L., Brindicci, C., Lawrence, D., Ballerstedt, S. and O'Hagan, A. (2022). Eliciting judgements about dependent quantities of interest: the SHELF extension and copula methods illustrated using an asthma case study. Pharmaceutical Statistics, 21, 1005-21.