Project III (MATH3382) 2017-18


Quantifying Expert Uncertainty

Peter Craig

Description

Evidence-based decision making is very much in vogue, as it should be. But what constitutes evidence? Usually people think of data. But data are often unavailable or only partially relevant. Expert judgement is crucial for filling data gaps and for interpreting the data that are available. Judgement is always subject to uncertainty.

This project will be concerned with methods for quantifying expert uncertainty, for combining judgements about different sources of uncertainty and for mixing expert judgements with data. Key tools are expert knowledge elicitation, mathematical models, Monte Carlo simulation and Bayesian statistics.
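As a flavour of how these tools fit together, here is a minimal sketch (not part of the project materials, and with every number invented for illustration) of quantifying an expert judgement as a probability distribution, mixing it with data via Bayesian conjugacy, and propagating the result by Monte Carlo simulation:

```python
import random

# Hypothetical workflow, assuming:
#  1. An expert's uncertainty about a proportion p has been elicited and
#     quantified as a Beta(2, 38) prior (elicitation would fix these numbers).
#  2. Data arrive: 3 "successes" in 60 trials. Beta-binomial conjugacy
#     mixes judgement and data: posterior = Beta(2 + 3, 38 + 57).
#  3. Monte Carlo simulation propagates posterior uncertainty through a
#     simple downstream model: events expected among 100 future trials.

random.seed(1)

a, b = 2, 38                 # elicited Beta prior (assumed for illustration)
successes, trials = 3, 60    # hypothetical data
a_post = a + successes
b_post = b + (trials - successes)

N = 20_000                   # Monte Carlo sample size
draws = [random.betavariate(a_post, b_post) for _ in range(N)]
events = [100 * p for p in draws]   # downstream quantity of interest

events.sort()
mean = sum(events) / N
lo, hi = events[int(0.025 * N)], events[int(0.975 * N)]
print(f"posterior mean ~ {mean:.1f} events, 95% interval [{lo:.1f}, {hi:.1f}]")
```

Students would replace each assumed ingredient: a genuine elicitation exercise to choose the prior, a model appropriate to their chosen problem, and possibly non-conjugate Bayesian computation.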

After developing an understanding of basic concepts, students will be expected to choose a problem to study, to explore suitable tools for the problem, to learn about the underlying theory and to apply the tools.

Prerequisites

Statistical Concepts II is nearly essential.

Monte Carlo II would be helpful.

Taking Decision Theory III and/or Bayesian Statistics III would enrich the project.

Resources

A few references (books chosen because they are in the main library):

Wikipedia article on expert elicitation

Uncertain Judgements: Eliciting Experts' Probabilities, edited by O'Hagan et al., Wiley, 2006 (ISBN 0470029994)

Risk Analysis: A Quantitative Guide, Vose, Wiley, 2000 (ISBN 047199765X)

Risk of Introduction of Rift Valley Fever into the Southern Mediterranean Area through Movement of Infected Animals, European Food Safety Authority, Supporting Publications 2013:EN-416

Guidance on Expert Knowledge Elicitation in Food and Feed Safety Risk Assessment, European Food Safety Authority, 2014.

email: P S Craig
