Tuesday 14:00-15:30
Part 4 Examples of imprecise probability in engineering
by Edoardo Patelli
Examples of imprecise probability in engineering: Outline

Uncertainty Management in Engineering
- Motivation
- Method of analysis
- Limitation of traditional approaches
- Random Set
The NASA UQ Challenge Problem
- The model
- Uncertainty Characterization (Subproblem A)
- Sensitivity Analysis (Subproblem B)
- Uncertainty Propagation (Subproblem C)
- Extreme Case Analysis (Subproblem D)
- Robust Design (Subproblem E)
Conclusions
Problem Statement: Mathematical and numerical models

- Adopted to simulate, study and forecast future outcomes
- Models must capture the essentials of the real response
- Models contain a fairly large set of parameters whose "true" values are not precisely known, i.e. they are uncertain
Current challenges in Engineering: Uncertainty quantification and management

- Complex systems often must be designed to operate in harsh domains with a wide array of operating conditions
- Quantitative data is either very sparse or prohibitively expensive to collect
Current challenges in Engineering: High-consequence and safety-critical systems

- Risk is often misestimated
- Models are deterministic, without incorporating any measure of uncertainty (Columbia accident report)
- Inadequate assessment of uncertainties, unjustified assumptions (NASA-STD-7009)
- Looking for the "black swan" (e.g. Fukushima)
System failures: Accidents triggered by natural events I

- 2 October 2002, Gulf of Mexico
- Hurricane Lili, BP Eugene Island 322 Platform A
System failures: Accidents triggered by natural events II

- Fukushima Daiichi nuclear disaster (Japan) on 11 March 2011
- Magnitude 9.0 Tohoku earthquake
- Maximum ground accelerations of 0.56 g (design tolerance of 0.45 g)
- 15 m tsunami arriving 41 minutes later (design seawall of 5.7 m)
System failures: Accidents triggered by natural events III

Source: repubblica.it

- Amatrice (Italy) earthquake, 24 August 2016 (6.0 magnitude)
System failures: Accidents triggered by natural events IV

Source: tt.com

- Landslide (Matrei, Austria, 14 May 2013)
System failures: Accidents caused by poor maintenance

- Southwest B733 near Yuma on 1 April 2011: hole in the fuselage, sudden decompression
- Viareggio (Italy) derailment on 29 June 2009: failure of an axle on a wagon led to derailment and a subsequent fire
Unavoidable uncertainties I: Irreducible (aleatory) uncertainties

- Parameters that are intrinsically uncertain
- Value varies at each experiment
- Future environmental conditions, chaotic and stochastic processes
Unavoidable uncertainties II: Reducible (epistemic) uncertainties

- Quantities that could be determined in theory
- In practice they are not measured
- Field properties, simplified models
Unavoidable uncertainties III: "Lack of knowledge" and mixed information

- Statistical information often not available (unique structure)
- Few or missing data (expensive to collect)
- Qualitative information (expert judgements)
- Heterogeneous information from different sources and in different formats
How to deal with uncertainties?

- Effects of uncertainty need to be included at the design stage
- From "intuitive analysis" to "quantitative analysis"
Dealing with uncertainty: Requirements

Quantifying
- Modelling and refinement of uncertainty based on experimental data, simulations and/or expert opinion
- Propagation of uncertainties through system models
- Parameter ranking and sensitivity analysis in the presence of uncertainty

Managing
- Identification of the parameters whose uncertainty is the most/least consequential
- Worst-case system performance assessment
- Design in the presence of uncertainty
Traditional modelling of uncertainty

Probability: a real number P(A) assigned to every event A ⊂ Ω.
P must satisfy some natural properties, called the axioms of probability:
1. P(A) ≥ 0 (Non-negativity)
2. P(Ω) = 1 (Normalisation)
3. If A ∩ B = ∅ then P(A ∪ B) = P(A) + P(B) (Additivity)
Subjective probability: Bayesian approach

Combination of rare data and prior expert knowledge:

P(θ|x) = P(x|θ) P(θ) / P(x) = P(x|θ) P(θ) / Σ_{i=1}^n P(x|θi) P(θi)

- Expert knowledge is updated by means of the data, yielding P(θ|x)
- The influence of the subjective assessment (the prior distribution P(θ)) decays quickly as the sample size increases
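A minimal numerical sketch of this update (a hypothetical discrete parameter with three candidate values and Bernoulli data; all values are illustrative): it shows the prior's influence fading as the sample size grows.

```python
import numpy as np

# Hypothetical epistemic parameter theta (a success probability) with three
# candidate values and a strong subjective prior P(theta).
thetas = np.array([0.3, 0.5, 0.7])
prior = np.array([0.6, 0.3, 0.1])

rng = np.random.default_rng(0)
data = rng.random(200) < 0.7               # data generated with true theta = 0.7

def posterior(prior, thetas, x):
    """P(theta|x) = P(x|theta) P(theta) / sum_i P(x|theta_i) P(theta_i)."""
    k, n = x.sum(), x.size
    log_lik = k * np.log(thetas) + (n - k) * np.log(1 - thetas)
    w = prior * np.exp(log_lik - log_lik.max())   # numerically stabilised
    return w / w.sum()

post_small = posterior(prior, thetas, data[:5])   # few data: prior still matters
post_large = posterior(prior, thetas, data)       # many data: data dominate
```

With 5 observations the prior still pulls the posterior away from 0.7; with 200 observations the posterior concentrates on the true value.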
Statistical analysis of imprecise and rare data: Open questions

- Evaluation of the remaining subjective uncertainty
- Consideration of imprecision (data and expert knowledge)
- Very small sample size or no data
Naive approach: Model epistemic uncertainty as aleatory uncertainty

- Information available only as an interval (i.e. bounds) → uniform distribution
- This is a BIG and UNJUSTIFIED (wrong) assumption!
- It gives a false sense of confidence

(Figure: CDF of the performance, showing the safe and failure regions.)
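A small sketch of why the uniform assumption gives false confidence (the performance function, interval and distributions are hypothetical): treating the interval as epistemic yields bounds on the failure probability, while the uniform assumption collapses them into a single number that hides the spread.

```python
import numpy as np
from math import erf, sqrt

# Toy performance g = 4 - X with X ~ Normal(b, 1); the epistemic parameter b
# is known only as the interval [1, 3] (all numbers are illustrative).
def p_fail(b):
    """Exact P(g < 0) = P(X > 4) for X ~ Normal(b, 1)."""
    return 1 - 0.5 * (1 + erf((4 - b) / sqrt(2)))

# Interval treatment: p_fail is monotone in b, so the endpoints give bounds.
p_lower, p_upper = p_fail(1.0), p_fail(3.0)

# Naive treatment: b ~ Uniform(1, 3) collapses the interval to one number.
rng = np.random.default_rng(1)
p_naive = np.mean([p_fail(b) for b in rng.uniform(1, 3, 10_000)])
```

The single naive estimate sits strictly inside bounds that differ by two orders of magnitude.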
Interval analysis: Epistemic uncertainty

- Intervals represent the epistemic uncertainty of parameter values
- Uncertainty propagation becomes an optimization problem
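A minimal sketch of interval propagation as an optimization problem (the model and interval are hypothetical; a brute-force grid search stands in for a global optimizer): the output interval is [min f, max f] over the input box.

```python
import numpy as np

# Propagate the interval x in [-1, 2] through the non-monotone model
# y = (x - 0.5)**2 by solving a minimisation and a maximisation problem.
def model(x):
    return (x - 0.5) ** 2

lo, hi = -1.0, 2.0
grid = np.linspace(lo, hi, 100_001)      # brute-force global search
y = model(grid)
y_interval = (y.min(), y.max())          # output interval [min f, max f]
```

Note that naive endpoint substitution would miss the interior minimum at x = 0.5, which is why each bound must be found by optimization.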
Dempster-Shafer structure: Collection of intervals

- Different probability masses are associated to distinct intervals (e.g. weighting expert opinions or assumptions)
- UQ by sampling intervals (solving an optimization problem for each sample)
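A small sketch of a finite Dempster-Shafer structure (the focal intervals and masses are hypothetical, e.g. three weighted expert opinions): belief sums the masses of focal elements contained in the event, plausibility those that intersect it.

```python
# Hypothetical Dempster-Shafer structure: (interval, probability mass) pairs.
focal = [((0.0, 2.0), 0.5), ((1.0, 3.0), 0.3), ((2.5, 4.0), 0.2)]

def bel_pl(event, structure):
    """Belief and plausibility of the event interval [a, b]."""
    a, b = event
    bel = sum(m for (lo, hi), m in structure if a <= lo and hi <= b)  # subset
    pl = sum(m for (lo, hi), m in structure if hi >= a and lo <= b)   # overlap
    return bel, pl

bel, pl = bel_pl((0.0, 2.0), focal)   # event A = [0, 2]
```

Here bel = 0.5 (only the first focal element is inside A) and pl = 0.8 (the first two intersect A), bracketing the unknown probability of A.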
Probability box: Epistemic + aleatory uncertainty

- Distributional probability box (Figure: lower and upper bounds on the CDF F_X)
- Distribution-free probability box (Figure: lower and upper bounds on the CDF F_Y)

References
- S. Ferson, V. Kreinovich, L. Ginzburg, D. S. Myers and K. Sentz, Constructing Probability Boxes and Dempster-Shafer Structures, Sandia National Laboratories, 2003
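A minimal sketch of a distributional p-box (the family of distributions is an assumption for illustration): X ~ Normal(mu, 1) with the epistemic mean mu known only to lie in [0, 1]; since the normal CDF is monotone decreasing in mu, the bounding CDFs come from the interval endpoints.

```python
import numpy as np
from math import erf, sqrt

def norm_cdf(x, mu):
    """CDF of Normal(mu, 1)."""
    return 0.5 * (1 + erf((x - mu) / sqrt(2)))

xs = np.linspace(-3, 4, 71)
F_upper = np.array([norm_cdf(x, 0.0) for x in xs])  # mu = 0 gives the upper CDF
F_lower = np.array([norm_cdf(x, 1.0) for x in xs])  # mu = 1 gives the lower CDF
```

Any normal CDF with mean inside the interval lies entirely between F_lower and F_upper, which is exactly the p-box envelope.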
Definition of a random set: Succinct introduction

- A random set Γ̃ is like a random variable whose realizations γ are sets in F, not numbers
- F is the focal set
- γ := Γ̃(α) ∈ F is a focal element
- When all focal elements of F are singletons, Γ̃ becomes a random variable X

References
- Alvarez, D. A., On the calculation of the bounds of probability of events using infinite random sets, International Journal of Approximate Reasoning, Vol. 43, 2006, pp. 241-267.
- Alvarez, D. A., Infinite random sets and applications in uncertainty analysis, Ph.D. thesis, Unit of Applied Mathematics, University of Innsbruck, 2007.
- Alvarez, D. A., A Monte Carlo-based method for the estimation of lower and upper probabilities of events using infinite random sets of indexable type, Fuzzy Sets and Systems, Vol. 160, 2009, pp. 384-401.
Representation of the uncertainty: Possibility distributions, probability boxes and families of intervals

Using intervals and d-dimensional boxes as elements of F is enough to model (without making any supposition at all):
- Possibility distributions (also known as normalized fuzzy sets)
- Cumulative distribution functions (CDFs)
- Intervals and families of intervals (Dempster-Shafer structures)
- Distribution-free and distributional probability boxes (p-boxes) and their joint combinations (using copulas)

Definition: a copula is a multivariate CDF C : [0, 1]^d → [0, 1] such that each of its marginal CDFs is uniform on the interval [0, 1].
Relationship between random sets, CDFs and finite families of intervals

(Figure 1: Focal element of a CDF. Figure 2: Focal element of a family of intervals.)
Relationship between random sets, p-boxes and fuzzy sets

A focal element is obtained by generating an α from a uniform distribution on (0, 1] and then obtaining the corresponding focal element γ = Γ̃(α).

(Figure 3: Distribution-free probability box. Figure 4: Possibility distribution.)
Relationship between random sets and distributional probability boxes

Distributional p-boxes do not have a graphical representation. Consider a CDF with m uncertain and independent parameters (e.g. intervals Ii for i = 1, 2, ..., m). Using the random set representation, a focal element of the probability box can be represented as the input intervals {Ii : i = 1, 2, ..., m} together with a sample of α, a uniform random variable on (0, 1] ≡ Ω. This is represented as the random set Γ̃ : Ω → F, α ↦ Γ(α) (i.e. (F, P_Γ)) defined on R, where F is the system of focal elements {α × I1 × ... × Im : α ∈ Ω}.
Uncertainty propagation of a random set

A focal element is obtained by generating an α from a uniform distribution on (0, 1] and then obtaining the corresponding focal element γ = Γ̃(α).

(Figure 5: Distribution-free p-box. Figure 6: A focal set as a d-dimensional box γ = ×_{i=1}^d γi in X.)
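A minimal sketch of this sampling step for a distribution-free p-box (the bounding CDFs, those of Uniform(0, 1) and Uniform(1, 2), and the model g(x) = x² are assumptions for illustration): each α yields the focal element γ(α) = [F̄⁻¹(α), F̲⁻¹(α)], an interval that is then propagated through the model.

```python
import numpy as np

# Sample alpha ~ Uniform(0, 1] and build the focal elements of a toy p-box
# whose upper/lower CDFs are those of Uniform(0, 1) and Uniform(1, 2).
rng = np.random.default_rng(42)
alphas = rng.random(1000)

inv_upper = alphas          # inverse of the upper CDF: left endpoint of gamma
inv_lower = 1.0 + alphas    # inverse of the lower CDF: right endpoint of gamma

# g(x) = x**2 is increasing on these intervals, so each output focal element
# is simply [g(left), g(right)]; a non-monotone g would need an optimisation.
out_lo, out_hi = inv_upper ** 2, inv_lower ** 2
```

The collection (out_lo, out_hi) is the sampled output random set, from which output bounds can be read off.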
Focal sets: Lower and upper probability measures

The probability measure of a set F ∈ P(X) is bounded:

LP_(F,PΓ)(F) ≤ P(F) ≤ UP_(F,PΓ)(F)

where

UP_(F,PΓ)(F) = PΓ{γ : γ ∩ F ≠ ∅}
LP_(F,PΓ)(F) = PΓ{γ : γ ⊆ F}
(Diagram: aleatory space Ω with coordinates αi, αj; epistemic space Θ with intervals Ii, the realization θ*j and the reduced epistemic space; system mappings W : Ω × Θ → X, G : X → R and H : Ω × Θ → R; θ is a realization of the input of the system for a given α and θ ∈ Θ.)
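A sketch of the Monte Carlo estimator of LP and UP (in the spirit of Alvarez's sampling method; the focal elements reuse the toy p-box γ(α) = [α, 1 + α], an assumption for illustration): sample focal elements and count containment versus intersection with the event.

```python
import numpy as np

# Sample focal elements gamma(alpha) = [alpha, 1 + alpha] of a toy random set.
rng = np.random.default_rng(7)
alphas = rng.random(100_000)
lo, hi = alphas, 1.0 + alphas

# Event F = [0.5, 2.5]
a, b = 0.5, 2.5
LP = np.mean((lo >= a) & (hi <= b))        # fraction with gamma subset of F
UP = np.mean((hi >= a) & (lo <= b))        # fraction with gamma intersecting F
```

Every focal element intersects F here (UP = 1), while only about half are fully contained in it (LP ≈ 0.5), so the probability of F is only known to lie in [LP, UP].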
Combination of focal sets

A focal set can be represented either as a d-dimensional box in X or as a point in (0, 1]^d.

(Figure 7: A focal set as a d-dimensional box γ = ×_{i=1}^d γi in X. Figure 8: A focal set as a point α := [α1, α2, ..., αd] in (0, 1]^d.)
Challenges: Computational aspects

Need for improved methods for quantifying and managing uncertainty.
NASA Langley UQ challenge problem: Motivations and timeline

Aim
- Determine limitations and ranges of applicability of existing UQ methodologies
- Develop new discipline-independent UQ methods
- Advance the state of the practice in UQ problems of direct interest to NASA

Timeline
- January 2013: 100+ UQ experts were invited to participate
- January 2014: 11 groups presented at SciTech 2014
- January 2015: Special edition of the AIAA Journal of Aerospace Information Systems
Physical System: NASA Langley Generic Transport Model (GTM)

- 5.5% dynamically scaled, remotely piloted, twin-turbine research aircraft
- Aircraft motion outside the normal flight envelope
- Dynamics are driven by nonlinearities and coupling, with oscillatory and divergent behaviour
Generic Transport Model (GTM): Dynamically scaled, highly instrumented flight test article

Generic Transport Model (GTM): AIRSTAR, pilot commands
The mathematical model: Simulation model

- A high-fidelity aerodynamic mathematical model
- Matches closely the dynamics of the test article (based on system identification experiments and wind-tunnel data)
The mathematical model: Black-box model

- Specialized aircraft knowledge is not required
- 5 tasks: Uncertainty Characterization, Sensitivity Analysis, Uncertainty Quantification, Extreme Case Analysis and Robust Design
The mathematical model: Black-box model

- 21 uncertain parameters p: loss in control effectiveness, actuator failure, icing, deadzone, and desired range of operating conditions
- 5 intermediate variables x (e.g. control effectiveness of the elevator, time delay due to telemetry and communications)
- 14 design variables d: controller gains
- 8 performance metrics g (e.g. longitudinal stability, lateral/directional stability, elevator actuation)

x = h(p),   g = f(p, x)
Uncertain Parameters (p): Challenges

Uncertainty models for p are given:
- Cat. I: random variables (aleatory uncertainty)
- Cat. II: intervals (epistemic uncertainty)
- Cat. III: probability boxes (aleatory + epistemic uncertainty)

Some of these uncertainty models can be reduced/improved.
The propagation of p for a fixed d makes g a p-box.

Objectives: Tasks
- Uncertainty Characterization: advance methods for the refinement of uncertainty models using limited experimental data
- Sensitivity Analysis: develop methods for the identification of critical parameters from within a multidimensional parameter space
- Uncertainty Propagation: deploy approaches for propagating uncertainties in multidisciplinary systems subject to both aleatory and epistemic uncertainty
- Extreme Case Analysis: identify the combinations of uncertainties that lead to best- and worst-case outcomes according to two probabilistic measures of interest
- Robust Design: determine design points that provide optimal worst-case probabilistic performance (added as an optional element)
NASA Langley UQ challenge problem: Proposed strategy

Solving each problem with at least two different approaches:
- Cross-validate results
- Increase confidence
- Test different hypotheses
- Different numerical implementations

- Theoretical framework: generalized probabilistic approach (random set theory)
- Computational framework: OpenCossan

Reference: Patelli et al., Uncertainty management in multidisciplinary design of critical safety systems, Journal of Aerospace Information Systems, 2014 (DOI: 10.2514/1.I010273)
Uncertainty Characterization: Reduce uncertainty

Sub-model: x1 = h1(p1, p2, p3, p4, p5)

- Refine the uncertainty model of p given n observations of x1
- 2 sets of 25 realizations of x1
- What is the effect of the number of observations n on the quality of the resulting uncertainty model?
Uncertainty Characterization: Reduce uncertainty

Variable | Category | Aleatory component         | Epistemic component | Description
p1       | III      | Unimodal Beta (α1)         | I1 = [3/5, 4/5]     | Interval of E[p1]
         |          |                            | I2 = [1/50, 1/25]   | Interval of V[p1]
p2       | II       |                            | I3 = [0, 1]         | Interval
p3       | I        | Uniform(0,1] (α2)          |                     | Random variable
p4, p5   | III      | Multivariate Gaussian      | I4 = [-5, 5]        | Interval of E[p4]
         |          | (α3, α4)                   | I5 = [1/400, 4]     | Interval of V[p4]
         |          |                            | I6 = [-5, 5]        | Interval of E[p5]
         |          |                            | I7 = [1/400, 4]     | Interval of V[p5]
         |          |                            | I8 = [-1, 1]        | Interval of ρ(p4, p5)

4 aleatory terms and 8 epistemic components.
Uncertainty Characterization: Aim and strategy

- Refine the uncertainty model of p given n observations
- What is the effect of the number of observations n?

(Diagram: aleatory space Ω = (0, 1]^4, the support of the copula which models the dependence between the variables p1, p3, p4 and p5; epistemic space Θ = ×_{i=1}^8 Ii, containing the "true" parameter vector θ*; when additional information is available, the epistemic uncertainty is reduced from the original to a reduced epistemic space.)

Strategies
- Bayesian updating on the epistemic space
- Non-parametric statistical methods
Available Data: Observations of x1

- 2 sets of 25 realizations of x1
- Observations come from a stochastic model (aleatory uncertainty)
- The eCDFs of the two data sets are quite different
- Gaussian smoother approach: each realization is associated with a Gaussian distribution, giving the smoothed density

f̂(x) = (1 / (n σ √(2π))) Σ_{k=1}^n exp(−(x − xk)² / (2σ²))

(Figure: eCDFs and kernel-smoothed densities of the two sample sets.)

Reference: Pradlwarter, H. and Schuëller, G., The Use of Kernel Densities and Confidence Intervals to Cope with Insufficient Data in Validation Experiments, Computer Methods in Applied Mechanics and Engineering, 197(29-32), 2008, pp. 2550-2560.
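A minimal sketch of this Gaussian smoother (the five data points and the bandwidth σ are illustrative): each sample is replaced by a Gaussian kernel, and integrating the kernels gives a smooth CDF that copes better with very small samples than the raw eCDF.

```python
import numpy as np
from math import erf, sqrt

# Toy small-sample data set and an assumed bandwidth sigma.
data = np.array([0.08, 0.11, 0.15, 0.21, 0.30])
sigma = 0.05

def f_hat(x):
    """Smoothed density: (1/(n*sigma*sqrt(2*pi))) * sum_k exp(-(x-x_k)^2/(2*sigma^2))."""
    return np.mean(np.exp(-(x - data) ** 2 / (2 * sigma ** 2))) / (sigma * sqrt(2 * np.pi))

def F_hat(x):
    """Smoothed CDF: average of the Gaussian CDFs centred at the samples."""
    return np.mean([0.5 * (1 + erf((x - xk) / (sigma * sqrt(2)))) for xk in data])
```

F_hat is a strictly increasing, smooth surrogate for the staircase eCDF of the five points.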
Bayesian updating on the epistemic space

The updated belief about θ* after observing the given dataset Dn is modelled by the posterior PDF p(θ|Dn):

p(θ|Dn) = p(Dn|θ) p(θ) / P(Dn)

Likelihood function:

p(Dn|θ) = ∏_{i=1}^n p(xi; θ)

which is the probability of observing the samples Dn = {xi : i = 1, 2, ..., n}, assuming that the true parameter underlying the model PDF p(x; θ) is θ, when Dn is a set of independent and identically distributed observations.
Bayesian updating on the epistemic space: Likelihood estimation through kernel density

Assume that the samples Dn were drawn from p(x; θ*), with θ* in the epistemic space; the likelihood p(Dn|θi) is defined in the following way:
1. draw M = 100 points from the aleatory space Ω using the copula C; call these samples {ωj : j = 1, ..., M}
2. calculate x1^(j) := h1(ωj, θi) for j = 1, ..., M
3. using kernel density estimation and the samples {x1^(j) : j = 1, ..., M}, estimate the PDF p(x|θi) ≡ p(x; θi)
4. calculate the likelihood function as p(Dn|θi) = ∏_{k=1}^n p(xk|θi)
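The four steps above can be sketched as follows. The sub-model, the "copula" sampling and all numbers are toy stand-ins (here h1(ω, θ) = θ + ω with ω ~ Uniform(0, 1]), not the challenge-problem model; the structure of the likelihood evaluation is what matters.

```python
import numpy as np

rng = np.random.default_rng(3)
obs = 0.6 + rng.random(25)                      # dataset D_n from theta* = 0.6

def log_likelihood(theta, M=100, sigma=0.1):
    omega = rng.random(M)                        # 1. sample the aleatory space
    x = theta + omega                            # 2. model evaluations x1^(j)
    def pdf(z):                                  # 3. Gaussian KDE of p(x|theta)
        return np.mean(np.exp(-(z - x) ** 2 / (2 * sigma ** 2))) / (sigma * np.sqrt(2 * np.pi))
    # 4. log of the product likelihood over the observations
    return sum(np.log(pdf(z) + 1e-300) for z in obs)

candidates = [0.0, 0.6, 2.0]
best = max(candidates, key=log_likelihood)       # candidate best supported by D_n
```

The candidate matching the data-generating value receives an overwhelmingly larger likelihood, which is what drives the posterior concentration on the reduced epistemic space.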
Reduced epistemic space: Bayesian updating (25 observations)

Reduced epistemic space: Bayesian updating (50 observations)

Reduced epistemic space: Bayesian updating
(Figure: full-range and reduced-range updated p-boxes of X1, compared with the eCDFs of the calibration data, the validation data and the combined calibration + validation data.)
Uncertainty Characterization: Non-parametric approach

1. Generate ni realizations θi in the epistemic space
2. Evaluate the model x1^(i) := h1(αj; θi) for j = 1, ..., nj
3. Estimate the empirical CDF F̂(·|θi)
4. Compute the Kolmogorov-Smirnov test statistic (measure of similarity Di)
5. If Di < D̃v, collect θi
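The five steps above can be sketched with a toy model (x1 = θ + α with α ~ Uniform(0, 1); the observations, grid of candidates and acceptance threshold D̃v are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)
obs = np.sort(0.5 + rng.random(25))              # observations from theta* = 0.5

def ks_distance(theta, n=200):
    """Two-sample KS statistic between model samples and the observations."""
    x = np.sort(theta + rng.random(n))           # steps 1-3: model eCDF samples
    grid = np.concatenate([x, obs])
    F_model = np.searchsorted(x, grid, side='right') / x.size
    F_obs = np.searchsorted(obs, grid, side='right') / obs.size
    return np.abs(F_model - F_obs).max()         # step 4: similarity measure D_i

D_v = 0.35                                        # acceptance threshold (assumed)
accepted = [t for t in np.linspace(0, 1, 11) if ks_distance(t) < D_v]  # step 5
```

Candidates whose simulated eCDF stays close to the observed one are collected; they form the reduced epistemic set.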
General remarks

Aim
- Rank the 4 category II & III parameters affecting (x1, ..., x5)
- Rank the 17 category II & III parameters affecting (J1, J2)

Two strategies were employed:
- Nonspecificity techniques
- Global sensitivity analysis
Nonspecificity techniques: Procedure and definitions

Pinching: replace the interval Ii = [Ii⁻, Ii⁺] by the point Ii⁻ + p · (Ii⁺ − Ii⁻), with p ∈ {0.1, 0.3, 0.5, 0.7, 0.9}.
Nonspecificity: a measure of epistemic uncertainty; the nonspecificity of the output random set after pinching was estimated.

References
- Klir, G. J. and Wierman, M. J., Uncertainty-Based Information: Elements of Generalized Information Theory, Vol. 15 of Studies in Fuzziness and Soft Computing, Physica-Verlag, Heidelberg, Germany, 1998.
- Klir, G. J., Uncertainty and Information: Foundations of Generalized Information Theory, John Wiley and Sons, New Jersey, 2006.
- Alvarez, D. A., Reduction of uncertainty using sensitivity analysis methods for infinite random sets of indexable type, International Journal of Approximate Reasoning, Vol. 50, No. 5, 2009, pp. 750-762.
Nonspecificity approach: Some examples

(Figures: probability boxes of x1 after setting E[p1] = 0.7, of x2 after setting p6 = 0.1, of x3 after setting p12 = 0.5, and of x4 after setting p16 = 0.5; dashed lines show the original and continuous lines the pinched output probability boxes.)

Tools:
- 50 samples for each pinching
- GA for each focal element (30000 individuals and 10 generations)
Sensitivity Analysis: Background

Local sensitivity analysis
- Obtained by varying input factors one at a time, holding the others fixed at a nominal value zi*
- Involves partial derivatives:

s_zi = (∂f(Z)/∂zi)|_{zi = zi*} · σ_zi
Sensitivity Analysis: Background

Global sensitivity analysis
- Obtained by averaging the local sensitivity over the distribution of the uncertain factors, E(V(Y|Zi = zi*))
- The variation of the output is taken globally (the whole supp(Z) is explored)
- Identifies "active" input factors at a low computational cost
- Sensitivity measures (Sobol' indices) can be estimated via Monte Carlo methods
Global sensitivity analysis: General remarks

Compute Sobol' indices and total indices:

Si = Var_Xi(E_X~i(Y|Xi)) / Var(Y)        Ti = 1 − Var_X~i(E_Xi(Y|X~i)) / Var(Y)

Requirements:
- Exact knowledge of the PDFs
- Variance of a measurable output

Strategy:
- Redefined model h*: no p-boxes and a scalar output
- δi = ∫_{−∞}^{+∞} |Fi(x) − F_ref(x)| dx (Problem B1)
- extended-FAST method and Saltelli's method
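A minimal sketch of the Monte Carlo estimation of first-order Sobol' indices (the toy model Y = X1 + 0.5·X2 with independent Uniform(0, 1) inputs is an assumption; the estimator is the standard two-matrix Saltelli-style form, not the challenge-problem implementation):

```python
import numpy as np

rng = np.random.default_rng(11)
N, d = 200_000, 2
A, B = rng.random((N, d)), rng.random((N, d))   # two independent sample matrices

def model(X):
    return X[:, 0] + 0.5 * X[:, 1]              # analytic S1 = 0.8, S2 = 0.2

yA, yB = model(A), model(B)
var_y = yA.var()

S = []
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                          # A with column i taken from B
    yABi = model(ABi)
    S.append(np.mean(yB * (yABi - yA)) / var_y)  # first-order index estimator
```

For this linear model the analytic first-order indices are 0.8 and 0.2, which the estimator recovers to within Monte Carlo error.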
Redefined model for sensitivity analysis
Global Sensitivity Analysis: Results of task B1
(Figure: normalised sensitivity measures of x1 obtained with the extended-FAST method (main effect, with 5%-95% confidence intervals) and with Saltelli's method (main and total effects) for E[p1], Var[p1], p2, E[p4], Var[p4], E[p5], Var[p5] and ρ(p4, p5).)
Uncertainty Propagation: General remarks

Aim: find the range of the metrics with the reduced and improved uncertainty models

J1 = E[w(p, d_baseline)]    and    J2 = 1 − P[w(p, d_baseline) < 0]

Two strategies were employed:
- Approach A: propagate intervals obtained from the given distribution-free p-boxes and construct a Dempster-Shafer structure
- Approach B: global optimization in the epistemic space (search domain), with Monte Carlo simulation to estimate J1 and J2
Uncertainty Quantification: Epistemic + aleatory uncertainty

Double-loop approach
- Sampling intervals, with a reliability analysis for each sample
- Difficult to treat distribution-free p-boxes
- Huge computational effort
- Allows the identification of "extreme realizations"

(Diagram: global optimization over the epistemic search domain; Line Sampling simulation of the output f(x).)
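A minimal sketch of the double loop (the toy limit state g = 2 − (X + θ) with X ~ Normal(0, 1), the epistemic interval θ ∈ [0, 1] and the grid search are assumptions; here the inner reliability analysis is plain Monte Carlo rather than Line Sampling):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=100_000)                  # aleatory sample, reused per theta

def p_fail(theta):
    """Inner loop: Monte Carlo estimate of P[g < 0] for a fixed epistemic theta."""
    return np.mean(2.0 - (X + theta) < 0.0)

thetas = np.linspace(0.0, 1.0, 21)            # outer loop: epistemic search domain
pf = np.array([p_fail(t) for t in thetas])
pf_bounds = (pf.min(), pf.max())              # bounds on the failure probability
```

The output is an interval of failure probabilities (here roughly [P(Z > 2), P(Z > 1)]) rather than a single number, which is exactly what the slide's "huge computational effort" refers to: one full reliability analysis per epistemic candidate.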
Uncertainty Quantification: Propagation of focal elements

- Sample focal elements in the α-space
- Propagate intervals and construct a Dempster-Shafer structure
- Limitation: treats p-boxes as distribution-free p-boxes

(Diagram: propagation of focal elements: sampling α ∈ (0, 1], global optimization (interval propagation) over the search domain, and construction of the Dempster-Shafer structure of the output.)
Extreme Case Analysis: Aim and strategy

Aim
- Identify the epistemic realizations that yield extreme values of

J1 = E[w(p, d_baseline)]    and    J2 = 1 − P[w(p, d_baseline) < 0]

- Qualitatively describe the relation between the sub-disciplines x and the failure modes g

Strategies:
- Identify realizations from Subproblem C, Approach B
- Analyse the range of variability of the performance functions g = f(x)
Extreme Case Analysis: Identify realizations

(Diagram: global optimization over the feasible search domain combined with Monte Carlo simulation through h(p) and f(x); empirical CDFs of w used to estimate J1 and J2 from N samples. Figure: extreme case realizations of J2 (moments): max J2 = 0.38, min J2 = 0.24.)
181 Extreme Case Analysis Analysis of the failure modes
[Figure: model workflow — the epistemic parameters p1:5, p6:10, p11:15, p16:20 and p21 parametrize the sub-models h1-h5; samples from the aleatory space Ω (α) and the epistemic space Θ (θ) feed x = h(p), g = f(x), w = max_{i=1..8} g_i; J1 = E[w], J2 = 1 − P[w < 0]]
[Figure: normalized parallel coordinates of the p_i leading to high values of w (w > 1000), for p1-p21, with p4 and p5 starred]
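The qualitative analysis behind such a parallel-coordinates plot — evaluate w = max_i g_i over many sampled parameter vectors, retain the realizations with the largest w, and normalize each coordinate to [0, 1] — can be sketched as follows; the 21 uniform parameters and the toy performance functions are illustrative assumptions, not the challenge model.

```python
import numpy as np

rng = np.random.default_rng(4)

# Sample hypothetical epistemic parameters p and evaluate w = max_i g_i.
n, n_p = 10_000, 21
p = rng.uniform(0, 1, (n, n_p))
# toy performance functions: here w is driven by p4 and p5 (indices 3, 4)
w = np.max(np.stack([5 * p[:, 3] + 5 * p[:, 4] - 2,
                     p[:, 0] - 0.5]), axis=0)

extreme = p[w > np.quantile(w, 0.99)]            # realizations with high w
# normalize each coordinate to [0, 1] for the parallel-coordinates plot
coords = (extreme - p.min(0)) / (p.max(0) - p.min(0))
print(coords.shape, coords[:, 3].mean())
```

Coordinates that cluster near one edge of the plot for the retained realizations (here p4 and p5) are the ones driving the failure mode, which is exactly what the starred axes in the figure convey.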
182 Examples of imprecise probability in engineering Outline
Uncertainty Management in Engineering Motivation Method of analysis Limitation of traditional approaches Random Set The NASA UQ Challenge Problem The model Uncertainty Characterization (Subproblem A) Sensitivity Analysis (Subproblem B) Uncertainty Propagation (Subproblem C) Extreme Case Analysis (Subproblem D) Robust Design (Subproblem E) Conclusions
183 Robust Design Aim and strategy
Robust design in the presence of aleatory and epistemic uncertainty:
I Design variables: 14 control parameters (d)
I Objective function: minimize max(J1) (expected value)
I Objective function: minimize max(J2) (failure probability)
I Sensitivity analysis of the obtained designs
Computational challenges
I Running time per candidate solution: ≈ 3 days (to evaluate max(J1) and max(J2))
Strategies:
I Surrogate model (Artificial Neural Networks) I Optimizer: Genetic Algorithms (and BOBYQA)
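The optimization strategy can be sketched with a deliberately simplified genetic algorithm (elitist truncation selection plus Gaussian mutation), where a cheap quadratic stands in for the ANN surrogate of sup(J1); this is an illustration of the approach, not the actual optimizer or surrogate used in the study.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical surrogate for sup(J1) as a function of the 14 design
# parameters d (a cheap quadratic stands in for the ANN here).
def surrogate_supJ1(d):
    return np.sum((d - 0.1) ** 2, axis=-1)

n_pop, n_dim, n_gen = 100, 14, 50
pop = rng.uniform(-1, 1, (n_pop, n_dim))
history = []
for _ in range(n_gen):
    fitness = surrogate_supJ1(pop)
    history.append(fitness.min())                 # best-so-far (elitist)
    parents = pop[np.argsort(fitness)[: n_pop // 2]]   # truncation selection
    children = parents + rng.normal(0, 0.05, parents.shape)  # Gaussian mutation
    pop = np.vstack([parents, children])

best = pop[np.argmin(surrogate_supJ1(pop))]
print(surrogate_supJ1(best))
```

Because the parents survive each generation, the best fitness is monotonically non-increasing, mirroring the converging sup(J1) curves in the GA figures that follow.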
184 Robust Design Surrogate model (Artificial Neural Networks)
Approximation of the most computationally expensive part
I Inputs: x and d
I Outputs: g (and hence w)
Results:
I max(J1) = 0.0044 (baseline 3.05)
I max(J2) = 0.34 (baseline 0.41)
Interval/Random Predictor Model: Friday 09:00 - 11:00, Part 12
185 Surrogate model (Artificial Neural Networks) Training
Training data issues I Positive and negative values of g concentrated around zero I Few extremely high values
Non-linear transformation: z(g_j) := 1/(200 |min(g_j)|) − 1/(100 (g_j + 2 |min(g_j)|))

j   min(g_j)   max(g_j)
1   -1.4353    5.57
2   -5.3903    15.90
3   -0.0533    1036.2
4   -0.0015    1159.2
5   -0.4375    1018.8
6   -0.0182    1080.1
7   -0.0002    1137.2
8   -0.0129    1072.5

[Figure: histograms of the original and transformed values of g3]
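Assuming the garbled formula reads z(g) = 1/(200 |min g|) − 1/(100 (g + 2 |min g|)) — a reconstruction consistent with the transformed range of roughly ±0.094 shown for g3 — the transformation maps [min g, ∞) into a bounded interval with z(0) = 0, compressing the few extreme values toward the upper bound:

```python
# Reconstructed (assumed) output transformation applied before ANN training:
#   z(g) = 1/(200 |min g|) - 1/(100 (g + 2 |min g|))
# It maps [min g, +inf) into (-1/(200 |min g|), 1/(200 |min g|)) with z(0) = 0.
def z(g, g_min):
    m = abs(g_min)
    return 1.0 / (200.0 * m) - 1.0 / (100.0 * (g + 2.0 * m))

g3_min, g3_max = -0.0533, 1036.2   # values for g3 from the table above
print(z(g3_min, g3_min))            # lower bound of the transformed range
print(z(0.0, g3_min))               # the failure threshold g = 0 maps to z = 0
print(z(g3_max, g3_min))            # extreme value compressed near the bound
```

Preserving the sign of g at zero matters because the failure indicator w < 0 must survive the transformation; the compression of the long right tail is what makes the ANN training data well-conditioned.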
186 Optimization: Genetic Algorithms
Task E1: minimization of max(J1)
[Figure: GA convergence for Task E1 — evolution of sup(J1) and of the design variables d1-d14 over 50 generations]
dE1 = [ 0.0140, -0.2568, -0.0944, -0.4405, -0.1508, -0.1029, -0.0713, 0.2002, -0.4431, 0.2579, 0.0044, -0.2086, 0.6330, -0.0166 ]
max(J1) = 0.0044 (baseline 3.05)
187 Optimization: Genetic Algorithms
Task E2: minimization of max(J2)
[Figure: GA convergence for Task E2 — evolution of sup(J2) and of the design variables d1-d14 over 60 generations]
dE2 = [ 0.0102, -0.2409, -0.0844, -2.1800, 0.8192, -0.1642, -0.0981, -0.4674, -0.5958, 0.3230, 0.0015, -0.1975, 0.6249, 0.0052 ]
max(J2) = 0.34 (baseline 0.41)
188 Examples of imprecise probability in engineering Outline
Uncertainty Management in Engineering Motivation Method of analysis Limitation of traditional approaches Random Set The NASA UQ Challenge Problem The model Uncertainty Characterization (Subproblem A) Sensitivity Analysis (Subproblem B) Uncertainty Propagation (Subproblem C) Extreme Case Analysis (Subproblem D) Robust Design (Subproblem E) Conclusions
189 Uncertainty quantification in Engineering
Generalised probabilistic method
I Rigorous framework to deal with scarce data, imprecision and vagueness without introducing unjustified assumptions
I Deals with different representations of uncertainty
I Reuses traditional stochastic methods and techniques
I Modelling epistemic uncertainty as if it were aleatory may lead to severe over- or underestimation
I Provides bounds on the estimates (the traditional probabilistic result is always included)
Challenges
I Still a computationally intensive analysis
190 Summary of the approaches Drawbacks and limitations
Uncertainty Characterization
I Bayesian: > 10^6 model evaluations and difficult to implement
I Non-Parametric: very fast and easy to implement
Sensitivity method
I Global sensitivity method: 5 × 10^5 model evaluations (using traditional tools)
I Nonspecificity method: 5 × 10^7 model evaluations (requires dedicated tools)
Uncertainty propagation
I 250 × 10^6 model evaluations (Nα = 1000 focal elements + GA with 10000 individuals and 50/55 iterations)
Robust Design
I Requires a surrogate model (see Friday for further details)
I Computationally demanding approach (global optimization)