Show simple item record

dc.contributor.author: ICASP14
dc.contributor.author: Hartford, Desmond
dc.contributor.author: Baecher, Gregory
dc.date.accessioned: 2023-08-03T10:42:06Z
dc.date.available: 2023-08-03T10:42:06Z
dc.date.issued: 2023
dc.identifier.citation: Hartford, Desmond, Baecher, Gregory, "Poorly-informative priors in geotechnical risk analysis", 14th International Conference on Applications of Statistics and Probability in Civil Engineering (ICASP14), Dublin, Ireland, 2023.
dc.description: PUBLISHED
dc.description.abstract: Quantitative risk analysis has become a common part of geotechnical engineering. In domains such as dam safety, seismic hazard assessment, and flood damage reduction, it has come to depend largely on personalized probabilities in the Ramsey-de Finetti-Savage sense, derived by quantifying engineering judgment. From a Bayesian view, quantified judgments manifest principally in prior probabilities, which may be informed by prior information but may also be poorly or wholly uninformed, resting on subjective experience. The choice of likelihood function within the Bayesian context also introduces personalistic uncertainty, though this is infrequently considered. We use the term poorly-informative to distinguish such priors from the non-informative prior in the Jeffreys sense. As the field becomes more receptive to risk-informed thinking, the question of how to quantify and calibrate judgment has become more pressing. How do we quantify priors in a way that is aligned with reality? How much difference does vagueness in the prior make in engineering predictions? Should different experts' probabilities be weighted differently? We now have four decades of experience in attempting to quantify geotechnical judgment, both in the aleatory domain, where chance dominates, and in the epistemic domain, where inadequate knowledge dominates. We reflect on this experience to draw lessons and to offer workable suggestions for practice; the paper draws principally on experience with risk analysis in dam safety. How well calibrated is an expert when assigning probabilities to parameters or to events in the world? Since probabilities in the Bayesian sense are degrees of belief, an assignment of probability is always correct to the extent that it accurately reflects the expert's belief: two people can assign different probabilities and both be "right." Yet if a consultant is hired to contribute information from which decisions are made, one would like to know whether that expert's beliefs are consistent with frequencies in the world. Is he or she calibrated? How can quantified expert opinion be validated against ex post observations of engineering performance, especially failures? A quantitative Bayesian validation procedure is proposed, based on the concept of expert-as-information in the sense of Morris, and is used to assess the credibility of experts. This is applied to how a decision-maker should ascribe credibility to an expert's judgments when attempting to predict the performance of engineering designs.
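The calibration question posed in the abstract, whether an expert's stated probabilities are consistent with observed frequencies, can be illustrated with a minimal sketch. This is not the validation procedure proposed in the paper; it only shows the standard empirical check (observed frequency per probability bin, plus a Brier score) on entirely hypothetical data.

```python
# Illustrative sketch only: checking an expert's calibration against outcomes.
# The assessment data are hypothetical, not from the paper.
from collections import defaultdict

# Each pair: (probability the expert assigned, 1 if the event occurred, else 0)
assessments = [
    (0.1, 0), (0.1, 0), (0.1, 1),
    (0.5, 0), (0.5, 1),
    (0.9, 1), (0.9, 1), (0.9, 0),
]

# Brier score: mean squared difference between stated probability and outcome
# (0 is perfect; lower is better).
brier = sum((p - x) ** 2 for p, x in assessments) / len(assessments)

# Calibration table: within each probability bin, compare the stated
# probability with the observed frequency of the event.
bins = defaultdict(list)
for p, x in assessments:
    bins[p].append(x)

for p in sorted(bins):
    freq = sum(bins[p]) / len(bins[p])
    print(f"stated p = {p:.1f}  observed frequency = {freq:.2f}")

print(f"Brier score = {brier:.3f}")
```

A well-calibrated expert's observed frequencies track the stated probabilities; a Bayesian decision-maker in the Morris framework would go further and treat the expert's stated probability itself as data to update on.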
dc.language.iso: en
dc.relation.ispartofseries: 14th International Conference on Applications of Statistics and Probability in Civil Engineering (ICASP14)
dc.rights: Y
dc.title: Poorly-informative priors in geotechnical risk analysis
dc.title.alternative: 14th International Conference on Applications of Statistics and Probability in Civil Engineering (ICASP14)
dc.type: Conference Paper
dc.type.supercollection: scholarly_publications
dc.type.supercollection: refereed_publications
dc.rights.ecaccessrights: openAccess
dc.identifier.uri: http://hdl.handle.net/2262/103192


This item appears in the following Collection(s)

  • ICASP14
    14th International Conference on Applications of Statistics and Probability in Civil Engineering
