Wikipedia
https://en.wikipedia.org/wiki/Expert_elicitation
https://en.wikipedia.org/wiki/Delphi_method
Several books and articles provide suitable background reading.
Aspinall, Willy, and Roger Cooke. 2013. Quantifying scientific uncertainty from expert judgement elicitation. Risk and Uncertainty Assessment for Natural Hazards, ed. Jonathan Rougier, Steve Sparks and Lisa Hill. Cambridge University Press.
Ayyub, Bilal M. 2001. Elicitation of Expert Opinions for Uncertainty and Risks. CRC Press.
Cooke, Roger M. 1991. Experts in Uncertainty: Opinion and Subjective Probability in Science. Oxford University Press.
Dias, Luis C., Alec Morton, and John Quigley (editors). 2018. Elicitation: The Science and Art of Structuring Judgement. Springer. https://www.springer.com/gb/book/9783319650517
Expert Elicitation Task Force (Hets et al.). 2011. Expert Elicitation Task Force White Paper. United States Environmental Protection Agency.
Hubbard, Douglas W. 2007. How to Measure Anything: Finding the Value of “Intangibles” in Business. John Wiley & Sons.
Meyer, Mary A., and Jane M. Booker. 2001. Eliciting and Analyzing Expert Judgment: A Practical Guide. Society for Industrial and Applied Mathematics (SIAM), Philadelphia.
Morgan, M. Granger. 2014. Use (and abuse) of expert elicitation in support of decision making for public policy. Proceedings of the National Academy of Sciences (United States) 111: 7176-7184.
O’Hagan, Anthony. 2012. Probabilistic uncertainty specification: overview, elaboration techniques and their application to a mechanistic model of carbon flux. Environmental Modelling and Software 36: 35-48.
References with an engineering background, focused on expert elicitation for risk assessment of industrial installations
(added by Caroline Morais)
Mosleh, A., Bier, V.M. and Apostolakis, G., 1988. A critique of current practice for the use of expert opinions in probabilistic risk assessment. Reliability Engineering & System Safety, 20(1), pp.63-85.
This reference states that:
Experts are individuals with recognised knowledge or skill in a specific domain.
Expert opinion is sometimes the only available source when data are missing.
Experts are usually seen as the least credible source of data, because their judgments can be influenced by various sources of bias.
Droguett, E.L. and Mosleh, A., 2008. Bayesian methodology for model uncertainty using model performance data. Risk Analysis: An International Journal, 28(5), pp.1457-1476.
The structure of a model can be a source of uncertainty.
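As a rough illustration of structural (model) uncertainty only, and not of the Bayesian methodology developed in the paper: two candidate models with different structures can be weighted by how well they reproduce past observations, and their remaining disagreement is the structural uncertainty. A minimal Python sketch, with made-up data:

    # Two candidate model structures, weighted by (hypothetical) past performance data.
    import math

    past_x = [1.0, 2.0, 3.0]
    past_y = [1.2, 1.9, 3.1]          # past observations (made up)

    def model_a(x):                   # linear structure
        return 1.0 * x

    def model_b(x):                   # power-law structure
        return 0.5 * x ** 1.5

    def likelihood(model, sigma=0.5):
        # Gaussian error model around each past observation
        return math.prod(math.exp(-0.5 * ((y - model(x)) / sigma) ** 2)
                         for x, y in zip(past_x, past_y))

    la, lb = likelihood(model_a), likelihood(model_b)
    wa, wb = la / (la + lb), lb / (la + lb)   # posterior model weights (equal priors)

    x_new = 5.0
    print(f"weights A={wa:.2f}, B={wb:.2f}; predictions at x=5: A={model_a(x_new):.1f}, B={model_b(x_new):.1f}")
    # The gap between the two weighted predictions reflects model-structure uncertainty.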
Mosleh’s keynote lecture at ESREL 2018: https://www.ntnu.edu/documents/1272224149/0/keynote-lecture-ali-mosleh.pdf/c324fe37-ab05-4f8c-8a77-fd23a1c7d3af (try to find all the references cited in the presentation)
Mosleh has shown that, in the engineering field, experts’ performance varies with career stage (early- and late-career experts performed better than mid-career experts).
Lin, S.W. and Bier, V.M., 2008. A study of expert overconfidence. Reliability Engineering & System Safety, 93(5), pp.711-721.
Experts can be systematically overconfident about the accuracy of their judgments; a minimal calibration check is sketched below.
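A minimal sketch of how such overconfidence can be checked, assuming (hypothetically) that an expert gave 90% credible intervals for quantities whose true values were later observed; Lin and Bier's own analysis is considerably more elaborate:

    # Hypothetical calibration check for an expert's 90% credible intervals.
    # A well-calibrated expert's intervals should contain the truth ~90% of the time;
    # a much lower hit rate signals overconfidence (intervals that are too narrow).
    intervals = [(2.0, 5.0), (0.1, 0.4), (10.0, 15.0), (1.0, 2.0), (3.0, 8.0)]  # (low, high), made up
    truths = [5.8, 0.2, 16.3, 2.5, 4.1]                                          # realised values, made up

    hits = sum(low <= t <= high for (low, high), t in zip(intervals, truths))
    print(f"90% intervals contained the truth in {hits}/{len(truths)} cases")
    # Here only 2 of 5 intervals contain the truth, far below the stated 90%.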
Taylor-Adams, S. and Kirwan, B., 1995. Human reliability data requirements. International Journal of Quality & Reliability Management, 12(1), pp.24-46.
Kirwan, B., 1997. Validation of human reliability assessment techniques: part 1—validation issues. Safety Science, 27(1), pp.25-41.
Mkrtchyan, L., Podofillini, L. and Dang, V.N., 2016. Methods for building conditional probability tables of Bayesian belief networks from limited judgment: an evaluation for human reliability application. Reliability Engineering & System Safety, 151, pp.93-112.
Shirazi, C.H., 2009. Data-informed calibration and aggregation of expert judgment in a Bayesian framework (Doctoral dissertation).
Laumann, K., Blackman, H. and Rasmussen, M., 2018. Challenges with data for human reliability analysis. In Safety and Reliability–Safe Societies in a Changing World (pp. 315-321). CRC Press.
Bailey, R.T., 1997. Estimation from zero‐failure data. Risk Analysis, 17(3), pp.375-380.
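The problem in Bailey's title, estimating a failure probability when no failures have been observed, can be illustrated with two standard textbook results (a sketch only, not necessarily the estimators developed in the paper):

    # Hypothetical: n demands observed with zero failures.
    n = 300

    # Classical "rule of three": approximate 95% upper confidence bound on the failure probability.
    upper_95 = 3.0 / n

    # Simple Bayesian alternative: a uniform Beta(1, 1) prior updated with 0 failures in n trials
    # gives a Beta(1, n + 1) posterior, with mean 1 / (n + 2).
    posterior_mean = 1.0 / (n + 2)

    print(f"rule-of-three 95% upper bound ~ {upper_95:.4f}; Bayesian posterior mean ~ {posterior_mean:.4f}")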
Gustafson, D.H., Sainfort, F., Eichler, M., Adams, L., Bisognano, M. and Steudel, H., 2003. Developing and testing a model to predict outcomes of organizational change. Health services research, 38(2), pp.751-776.
Swain, A.D. and Guttmann, H.E., 1983. Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications, NUREG/CR-1278. US Nuclear Regulatory Commission.
Budnitz, R.J., Apostolakis, G. and Boore, D.M., 1997. Recommendations for probabilistic seismic hazard analysis: guidance on uncertainty and use of experts (NUREG/CR-6372, Vol. 1; UCRL-ID-122160). Nuclear Regulatory Commission, Washington, DC (United States). Div. of Engineering Technology; Lawrence Livermore National Lab., CA (United States); Electric Power Research Inst., Palo Alto, CA (United States); USDOE, Washington, DC (United States).
Antonucci, A., Huber, D., Zaffalon, M., Luginbühl, P., Chapman, I. and Ladouceur, R., 2013. CREDO: a military decision-support system based on credal networks. In Proceedings of the 16th International Conference on Information Fusion (FUSION 2013), pp. 1942-1949. IEEE.
Tolo, S., Patelli, E. and Beer, M., 2018. An open toolbox for the reduction, inference computation and sensitivity analysis of Credal Networks. Advances in Engineering Software, 115, pp.126-148.