Surrogate minimisation in high dimensions
Fabrizia Guglielmetti (ESO)
In collaboration with: Torsten Enßlin (MPA), Henrik Junklewitz (AIfA), Theo Steininger (MPA)
Bayes Forum Seminar - 24.02.2017

Outline
❖ Motivation
• Performance enhancement of optimisation schemes
❖ Global optimisation as a Bayesian decision problem
• Two connected statistical layers
❖ Surrogates: basic principles
❖ Application of a Kriging surrogate within the NIFTY framework (Selig, M. et al., 2013, A&A, 554, 26)
Introduction
❖ Complex computer codes are essential
A 48-hour computer simulation of Typhoon Mawar using the Weather Research and Forecasting model (credit: Wikipedia)
See, e.g., Cox, D.D., Park, J.S., Singer, C.E. (2001); Preuss, R. & von Toussaint, U. (2016)
❖ Complex computer codes are essential:

x → Computer Code (Simulator) → z(x)

Minimisation of a complex objective function: for reasonable data sizes, current estimation techniques are
a. SLOW if the full objective function is accounted for
b. FAST if they do not account for the full objective function

Replacing the simulator with an emulator:

x → Computer Code (Emulator) → z(x)

• a statistical representation of x
• expresses knowledge about z(x) at any given x
• built using prior information and a training set of model runs

A Kriging surrogate sampler is used to emulate the original complex objective function.
Surrogates (or metamodels, emulators, …)
• A map is a surrogate that predicts terrain elevation
• The map's prediction allows one to locate the summit without climbing it
• Searching for the highest peak is a form of global optimisation
• … in the same fashion, we can use metamodels to find the optimal signal configuration
Bayesian inference, parametric model
❖ Data: x, y
❖ Model: $y = f_k(x) + \epsilon$

Gaussian likelihood: $p(y\,|\,x, k, M_i) \propto \prod_j \exp\!\big(-(y_j - f_k(x_j))^2 / 2\sigma^2\big)$

Prior over the parameters: $p(k\,|\,M_i)$

Posterior parameter distribution: $p(k\,|\,x, y, M_i) = \dfrac{p(y\,|\,x, k, M_i)\, p(k\,|\,M_i)}{p(y\,|\,x, M_i)}$

Make predictions: $p(y^*\,|\,x^*, x, y, M_i) = \displaystyle\int p(y^*\,|\,k, x^*, M_i)\, p(k\,|\,x, y, M_i)\, dk$
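These steps can be sketched numerically for a toy conjugate linear model (the model $f_k(x) = k_0 + k_1 x$ and all numbers here are illustrative assumptions, not from the talk); the posterior over the parameters and the predictive are then Gaussian and available in closed form:

```python
import numpy as np

rng = np.random.default_rng(0)

# toy data drawn from y = k0 + k1*x + eps (hypothetical ground truth k = (1, 2))
x = np.linspace(0.0, 1.0, 20)
y = 1.0 + 2.0 * x + rng.normal(0.0, 0.1, x.size)

Phi = np.stack([np.ones_like(x), x], axis=1)   # design matrix for f_k(x) = k0 + k1*x
sigma2 = 0.1 ** 2                              # known noise variance
prior_cov = np.eye(2)                          # Gaussian prior p(k | M_i) = N(0, I)

# posterior p(k | x, y, M_i) is Gaussian for this conjugate model
post_prec = np.linalg.inv(prior_cov) + Phi.T @ Phi / sigma2
post_cov = np.linalg.inv(post_prec)
post_mean = post_cov @ (Phi.T @ y / sigma2)

# predictive p(y* | x*, x, y, M_i): the integral over k is analytic here
x_star = 0.5
phi_star = np.array([1.0, x_star])
pred_mean = phi_star @ post_mean
pred_var = phi_star @ post_cov @ phi_star + sigma2
```

The predictive variance is the parameter uncertainty projected onto the new input plus the noise variance.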
IFT for signal inference
❖ Information Field Theory (Enßlin, T. et al. 2009): information theory for fields
❖ Signal field (s) estimation:

$P(s\,|\,d) = \dfrac{P(d\,|\,s)\,P(s)}{P(d)} \equiv \dfrac{e^{-H(d,s)}}{Z_d}$

with $H(d,s)$ the information Hamiltonian and $Z_d$ the partition function.

$d = (d_1, d_2, \dots, d_n)^T,\; n \in \mathbb{N}$: finite dataset

IFT exploits known or inferred correlation structures of the field of interest $s = s(x)$ over some domain $\Omega = \{x\}$ to regularise the ill-posed inverse problem of determining an infinite number of degrees of freedom from a finite dataset.

Most probable signal configuration: $s_{\rm MAP} = \operatorname{argmin}_s\, H(d,s)$
Minimization of (complex) energy function
Expensive computer code → Optimizer / Adjoint simulation ↔ Response Surface Model (RSM) → find optimal image reconstruction

• The RSM is used to increase speed: a cheap-to-run surrogate model of the original model function is combined with the full objective function (the same model is used throughout the optimisation process).
• Kriging (Wiener, 1938; Krige, 1951; Matheron, 1962; Kitanidis, 1986; More & Halvorsen, 1989; Sacks et al., 1989; Cressie, 1989, 1993; Jones et al., 1989; Handcock & Stein, 1993; Morris, 1993; Kennedy & O'Hagan, 2000; Mira & Sanchez, 2002; Bayarri & Berger, 2007; Kleijnen, 2009; Razavi et al., 2012; Preuss et al., 2016)
• Partially converged simulations: the positions of individual surface points are moved in the direction of the gradients predicted by the adjoint simulation; the gradients take the correct direction early in the convergence.
• A proper shape of the RSM is provided.
• Maximum shape control over image variations is afforded to the optimiser for a minimum number of variables.
• The optimization is diverted away from infeasible regions.

Surrogate recipe

The simulator
• Target function z = f(x): the quantity of interest (the response); computationally intensive
• Input x, output z; x and z are multi-dimensional

The emulator: the aim is to learn about f(x), assuming we can afford only n evaluations of f(x).

1. Build a sampling plan with a set of experimental inputs $X = \{x^{(1)}, \dots, x^{(n)}\}^T$ (the field samples of f(x)).
2. Use the measurement apparatus (a numerical solver) to obtain the observations, i.e. calculate the responses $y = \{y^{(1)}, \dots, y^{(n)}\}^T$.
3. Fit a surrogate model $\hat{f}$ to the data: from the observations (X, y) we make a prediction of the response.
4. Assuming $\hat{f}$ stands in for f, find $x_0$ as close as possible to the true minimum by minimising $\hat{f}(x)$.
5. Validate $\hat{f}(x_0)$ against the true, expensive f: compute $f(x_0)$.

Surrogate recipe, enhanced:
• Improve the inference by including observations of the GRADIENT; this reduces the computational cost of higher-dimensional optimisation problems.
• KRIGING WEIGHTING SCHEME: weights are assigned to sample points according to a data-driven weighting function; sample points closer to the new sample point to be predicted receive more weight.
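The five-step recipe above can be sketched end-to-end. As a stand-in for a kriging model, this hypothetical example uses SciPy's radial-basis-function interpolator, and the target f is a made-up cheap function playing the role of the expensive simulator:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import minimize

def f(x):
    """Stand-in for the expensive simulator z = f(x) (made-up test function)."""
    return float(np.sum((x - 0.3) ** 2) + 0.1 * np.sum(np.cos(5.0 * x)))

rng = np.random.default_rng(1)
dim, n = 2, 40

# 1. sampling plan: n experimental inputs X
X = rng.uniform(-1.0, 1.0, size=(n, dim))
# 2. run the "measurement apparatus" to obtain the responses y
y = np.array([f(xi) for xi in X])
# 3. fit a cheap surrogate f_hat to the observations (X, y)
f_hat = RBFInterpolator(X, y)
# 4. minimise the surrogate instead of the expensive f
res = minimize(lambda x: f_hat(x[None, :])[0], x0=np.zeros(dim),
               bounds=[(-1.0, 1.0)] * dim)
x0_opt = res.x
# 5. validate: a single expensive call at the proposed minimum
true_val = f(x0_opt)
```

In a real application, the validation point (x0_opt, true_val) would be appended to the training set and the loop repeated.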
Surrogate challenges
1. Multimodal and multidimensional landscape
2. The surrogate function has to emulate the real function well, at least in the locations of the optima

Kriging surrogate - Gaussian process
Empirical modelling of high-dimensional data:
• y(x) is assumed to underlie the data $\{x^{(n)}, t_n\}$: adapting the model to the data corresponds to inferring the function given the data.
• A Gaussian process is the generalisation of a (multivariate) Gaussian distribution to a function space of infinite dimension.
• It is specified by mean and covariance functions: $y(x) \sim \mathcal{GP}\big(\mu(x), C(x,x')\big)$
• the mean is a function of x (here the zero function)
• the covariance is a function $C(x,x')$: the expected covariance between the values of the function y at the points x and x'

$P\big(y(x)\,|\,\mu(x), A\big) = \dfrac{1}{Z}\,\exp\!\Big(-\dfrac{1}{2}\,\big[y(x)-\mu(x)\big]^T A\,\big[y(x)-\mu(x)\big]\Big)$

y(x) lives in the infinite-dimensional space of all continuous functions of x.
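On a finite grid a GP is just a multivariate Gaussian, so prior draws of y(x) can be sketched directly; the squared-exponential covariance and the length scale below are illustrative choices, not prescribed by the talk:

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0.0, 1.0, 100)

# squared-exponential covariance C(x, x') -- one common, illustrative choice
ell = 0.2
C = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / ell ** 2)
C += 1e-8 * np.eye(x.size)            # jitter for numerical stability

# y(x) ~ GP(0, C): on this grid, a 100-dimensional Gaussian
L = np.linalg.cholesky(C)
samples = (L @ rng.normal(size=(x.size, 3))).T   # three prior draws of y(x)
```

Each row of `samples` is one smooth random function; the length scale `ell` controls how quickly the draws wiggle.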
Kriging surrogate
❖ Gaussian process regression, Wiener-Kolmogorov prediction
❖ Kriging is employed within the Bayesian formalism
❖ Assumptions:
• a prior distribution and covariance of the unknown function f(x)
• the observed data (x) are normally distributed → the responses y(x) form a Gaussian process

Posterior pdf of the emulated signal:

$P\big(y(x)\,|\,t_i, X_i, I\big) = \dfrac{P\big(t_i\,|\,y(x), X_i, I\big)\,P\big(y(x)\big)}{P(t_i)} \equiv \dfrac{e^{-H(y(x),\,t_i)}}{Z_{t_i}}$

with $t = \{t_i\}_{i=1}^{n}$ the target values.
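With Gaussian assumptions this posterior is again Gaussian, and its mean is the classic kriging predictor. A minimal sketch with the standard GP-regression formulas (the kernel, design points, and targets are illustrative):

```python
import numpy as np

def sq_exp(a, b, ell=0.3):
    """Squared-exponential covariance between point sets a and b (toy kernel)."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

X_train = np.array([0.1, 0.4, 0.6, 0.9])       # hypothetical design points X_i
t = np.sin(2.0 * np.pi * X_train)              # hypothetical target values t_i
nugget = 1e-6                                  # tiny noise term for stability

K = sq_exp(X_train, X_train) + nugget * np.eye(X_train.size)
X_new = np.linspace(0.0, 1.0, 50)
k_star = sq_exp(X_new, X_train)

# kriging predictor: posterior mean and variance of y(x) at the new points
alpha = np.linalg.solve(K, t)
post_mean = k_star @ alpha
post_var = 1.0 - np.einsum('ij,ij->i', k_star, np.linalg.solve(K, k_star.T).T)
```

The posterior variance shrinks near the training points and grows away from them, which is exactly the behaviour the weighting scheme exploits.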
Minimization challenge

$d = Rs + n$
$\mathcal{G}(s, S) = \dfrac{1}{\sqrt{|2\pi S|}}\,\exp\!\big(-\tfrac{1}{2}\, s^\dagger S^{-1} s\big)$

$H(d\,|\,s) = -\ln P(d\,|\,s) = \tfrac{1}{2}\,(d - Rs)^\dagger N^{-1} (d - Rs)$

$H(d, s) = -\ln P(d, s) = \tfrac{1}{2}\, s^\dagger S^{-1} s + H(d\,|\,s)$

$m = \operatorname{argmin}_s\, H(d, s)$

H and dH/ds are calculated at several positions to slide towards the minimum → computationally expensive.
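For this linear Gaussian model the minimiser of H(d,s) is known in closed form: the Wiener filter $m = (S^{-1} + R^\dagger N^{-1} R)^{-1} R^\dagger N^{-1} d$. A small dense-matrix sketch (the covariances and response are toy choices):

```python
import numpy as np

rng = np.random.default_rng(4)
npix = 64

# toy linear Gaussian setup d = R s + n (all covariances are toy choices)
S = 2.0 * np.eye(npix)                     # signal covariance
N = 0.1 * np.eye(npix)                     # noise covariance
R = np.eye(npix)                           # trivial response for the demo

s_true = rng.multivariate_normal(np.zeros(npix), S)
d = R @ s_true + rng.multivariate_normal(np.zeros(npix), N)

# minimiser of H(d,s) = 1/2 s'S^-1 s + 1/2 (d-Rs)'N^-1 (d-Rs):
Si, Ni = np.linalg.inv(S), np.linalg.inv(N)
m = np.linalg.solve(Si + R.T @ Ni @ R, R.T @ Ni @ d)   # Wiener filter
```

In realistic dimensions these matrices are never built explicitly; the point of the surrogate is to avoid repeated evaluations of H and dH/ds when no closed form is available.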
Surrogate minimization

$d = Rs + n, \qquad x = S^{-1/2} s$
$H(x) = H\big(d, s = S^{1/2} x\big) = \tfrac{1}{2}\, x^\dagger x + \underbrace{H\big(d\,|\,s = S^{1/2} x\big)}_{E(x)}$

Approximating $E(x) \approx F_{I_n}(x)$ gives

$H(x) \approx \tfrac{1}{2}\, x^\dagger x + F_{I_n}(x) = H_{I_n}(x)$

with the information from the n previous evaluations

$I_n = \big(x_i,\; E_i = E(x_i),\; v_i = \partial_x E(x)\big|_{x_i}\big)_{i=1}^{n}$

$F_{I_n}(x) = \sum_{i=1}^{n} w_i(x) \begin{cases} E_i\, e^{v_i^\dagger (x - x_i)/E_i} & \text{if } v_i^\dagger (x - x_i) > 0 \\ E_i + v_i^\dagger (x - x_i) & \text{if } v_i^\dagger (x - x_i) \le 0 \end{cases}$

$w_i(x) = \dfrac{\big(|x - x_i|^2 + \epsilon^2\big)^{-\alpha}}{\sum_{j=1}^{n} \big(|x - x_j|^2 + \epsilon^2\big)^{-\alpha}}$

Gradient of the surrogate Hamiltonian:

$\partial_x H_{I_n}(x) = x + \sum_{i=1}^{n} w_i(x) \begin{cases} \big(v_i - 2\alpha A_i E_i\big)\, e^{v_i^\dagger (x - x_i)/E_i} & \text{if } v_i^\dagger (x - x_i) > 0 \\ v_i - 2\alpha A_i\,\big[E_i + v_i^\dagger (x - x_i)\big] & \text{if } v_i^\dagger (x - x_i) \le 0 \end{cases}$

where $A_i = \dfrac{x - x_i}{|x - x_i|^2 + \epsilon^2} - \sum_{j=1}^{n} w_j(x)\, \dfrac{x - x_j}{|x - x_j|^2 + \epsilon^2}$
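The weighting scheme and the piecewise surrogate F_In(x) translate directly into code. The sample energies below come from a hypothetical toy energy E(x) = ½|x|² + 1, chosen only so that all E_i > 0:

```python
import numpy as np

def weights(x, Xs, alpha=2.0, eps=1e-3):
    """Inverse-distance weights w_i(x): sample points near x get more weight."""
    r2 = np.sum((Xs - x) ** 2, axis=1)
    w = (r2 + eps ** 2) ** (-alpha)
    return w / w.sum()

def surrogate(x, Xs, Es, Vs, alpha=2.0, eps=1e-3):
    """Gradient-enhanced surrogate F_In(x) built from (x_i, E_i, v_i)."""
    w = weights(x, Xs, alpha, eps)
    proj = np.einsum('ij,ij->i', Vs, x - Xs)     # v_i^dagger (x - x_i)
    uphill = Es * np.exp(proj / Es)              # exponential branch, proj > 0
    downhill = Es + proj                         # linear branch, proj <= 0
    return float(np.sum(w * np.where(proj > 0.0, uphill, downhill)))

# toy information I_n from the hypothetical energy E(x) = 1/2 |x|^2 + 1
rng = np.random.default_rng(5)
Xs = rng.normal(size=(30, 2))                    # sample positions x_i
Es = 0.5 * np.sum(Xs ** 2, axis=1) + 1.0         # energies E_i > 0
Vs = Xs.copy()                                   # gradients v_i of the toy E
```

By construction the surrogate reproduces E_i at each sample point, since the weight there dominates and both branches reduce to E_i.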
Application
Simulated sky signal employing the NIFTY package: $1.6 \times 10^4$ dimensions.

Simulated dataset: $d = R(s) + n$

• find the optimal solution for s given d
• assumption: s is a random field following some statistics and being constrained by the data

NIFTY = Numerical Information Field Theory (M. Selig, T. Enßlin et al.)
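A plain-NumPy stand-in for this setup (not actual NIFTY code): a correlated random field s drawn via the FFT from an illustrative power spectrum, observed through a masking response with additive Gaussian noise:

```python
import numpy as np

rng = np.random.default_rng(6)
npix = 128

# correlated signal field s drawn via the FFT from a power-law spectrum
k = np.fft.fftfreq(npix) * npix
power = 1.0 / (1.0 + np.abs(k)) ** 3                 # illustrative spectrum
s = np.fft.ifft(np.sqrt(power) * np.fft.fft(rng.normal(size=npix))).real

# response R: a mask that removes part of the domain (illustrative choice)
mask = np.ones(npix)
mask[40:60] = 0.0

noise_sigma = 0.05
d = mask * s + rng.normal(0.0, noise_sigma, npix)    # d = R(s) + n
```

The assumed correlation structure (the power spectrum) is what lets the reconstruction interpolate s across the masked region.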
Application - Results
The surrogate solution coincides with the Wiener filter solution.

Conclusion & Summary
❖ Speeding up the minimisation of a complex energy function is desired.
❖ A Kriging surrogate sampler is developed to emulate the behaviour of the complex energy function:
• a powerful way to perform Bayesian inference about functions in high-dimensional spaces
• a non-intrusive approach, but still an approximation
• reduces the number of function evaluations
• includes sensitivity analysis
❖ An application of the Kriging surrogate in high dimensions within the NIFTY framework is shown:
• it can speed up other optimisation schemes by a factor of ~100