Abstract
Event trees are a common framework for quantitative risk analysis in which a joint probability distribution is displayed in a tree format. They are closely related to Bayesian networks, which utilise conditional independencies in their graph representation. In this thesis we are mainly concerned with the connection between the two.
In the first part we propose a sequential algorithm for translating event trees into Bayesian networks. Tools from information theory are exploited to quantify the strength of dependencies within the network, which allows us to simplify the model by removing weak conditional dependencies. We apply the algorithm to small artificial data sets and to a real-world example. The algorithm can also be used to simplify an existing Bayesian network or to identify its weakest links; one benefit of a simpler Bayesian network is faster computations.
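The abstract does not name the specific information-theoretic measure, so the following is a hedged sketch rather than the thesis implementation: a natural way to score the strength of a dependence in a discrete network is conditional mutual information, with arcs scoring near zero treated as candidates for removal. The joint table, variable ordering and the pruning threshold below are all illustrative assumptions.

```python
# Minimal sketch (not the thesis code): score a dependence X -> Y given a
# conditioning variable Z by conditional mutual information I(X; Y | Z).
import numpy as np

def conditional_mutual_information(pxyz):
    """I(X; Y | Z) in bits for a joint table p(x, y, z) with axes (X, Y, Z)."""
    pz = pxyz.sum(axis=(0, 1), keepdims=True)   # p(z),    shape (1, 1, |Z|)
    pxz = pxyz.sum(axis=1, keepdims=True)       # p(x, z), shape (|X|, 1, |Z|)
    pyz = pxyz.sum(axis=0, keepdims=True)       # p(y, z), shape (1, |Y|, |Z|)
    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = np.where(pxyz > 0, pxyz * pz / (pxz * pyz), 1.0)
    return float(np.sum(pxyz * np.log2(ratio)))

# Toy joint distribution over three binary variables (X, Y, Z); purely illustrative.
rng = np.random.default_rng(0)
pxyz = rng.random((2, 2, 2))
pxyz /= pxyz.sum()

cmi = conditional_mutual_information(pxyz)
print(f"I(X; Y | Z) = {cmi:.4f} bits")
if cmi < 0.01:  # illustrative threshold for declaring a conditional dependence "weak"
    print("X -> Y given Z looks weak; the arc is a candidate for removal")
```

An arc whose conditional mutual information is close to zero contributes little to the joint distribution, so dropping it yields a sparser network with smaller conditional probability tables and hence faster computations.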
Next, we discuss two types of model extension that can be incorporated within the same algorithm. By construction, event trees can only model discrete variables, so a derived Bayesian network inherits this limitation. Given our application background in safety risk, variables such as ignition time appear in the event tree only in discretised form. We show how the ignition-time variable can be modelled continuously in time and how direct consequence variables can be included when they depend on this continuous quantity.
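The abstract does not fix a particular continuous-time model. As one hedged illustration of the idea, an exponential ignition-time distribution shows how a discretised node (say early / late / no ignition) relates to an underlying continuous variable; the rate, cut-point and horizon below are hypothetical.

```python
# Illustrative only: replace a discretised ignition-time node with a continuous
# waiting time T ~ Exponential(lam) and recover the original categories from it.
import math

lam = 0.5     # assumed ignition rate per minute (hypothetical)
t_cut = 2.0   # cut-point separating "early" from "late" ignition (hypothetical)
t_max = 10.0  # horizon after which ignition is treated as "no ignition" (hypothetical)

p_early = 1 - math.exp(-lam * t_cut)                      # P(T <= t_cut)
p_late = math.exp(-lam * t_cut) - math.exp(-lam * t_max)  # P(t_cut < T <= t_max)
p_none = math.exp(-lam * t_max)                           # P(T > t_max)
print(p_early, p_late, p_none)  # sums to 1

# A direct consequence variable can then be conditioned on the continuous time T
# itself rather than on the coarse category.
```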
Lastly, we demonstrate how loss data for an event tree can be used to make the translation and simplification context-dependent. Weighted entropy serves as the basis for a similarity measure that accounts for both quantitative and qualitative differences. We contrast the results of the entropy and weighted-entropy approaches in the sequential algorithm on an artificial data set.
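For reference, weighted entropy in the sense of Belis and Guiaşu attaches a nonnegative weight w(x) to each outcome, H_w(X) = -Σ_x w(x) p(x) log p(x). How the loss data define the weights is a matter for the thesis itself; the probabilities and weights in the sketch below are purely illustrative.

```python
# Weighted entropy H_w = -sum_x w(x) p(x) log2 p(x); with w ≡ 1 it reduces to
# ordinary Shannon entropy. Probabilities and loss-based weights are illustrative.
import numpy as np

def weighted_entropy(p, w):
    p, w = np.asarray(p, dtype=float), np.asarray(w, dtype=float)
    nz = p > 0
    return float(-np.sum(w[nz] * p[nz] * np.log2(p[nz])))

p = np.array([0.7, 0.2, 0.1])    # outcome probabilities (illustrative)
w = np.array([1.0, 5.0, 20.0])   # weights derived from losses (illustrative)
print("entropy          :", weighted_entropy(p, np.ones_like(p)))
print("weighted entropy :", weighted_entropy(p, w))
```

With equal weights the measure reduces to ordinary Shannon entropy, which is why the two approaches can be contrasted directly within the same sequential algorithm.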
Date of Award | 22 Jul 2020
---|---
Original language | English
Awarding Institution |
Supervisor | Simon Shaw (Supervisor) & Vangelis Evangelou (Supervisor)