Machine Learning, Lecture 5: Kernel Machines

Outline lecture 5  2(39)

1. Summary of lecture 4
2. Introductory GP example
3. Stochastic processes
4. Gaussian processes (GP)
   • Construct a GP from a Bayesian linear regression model
   • GP regression
   • Examples where we have made use of GPs in recent research
5. Support vector machines (SVM)

Chapter 6.4 – 7.2 (Chapter 12 in HTF; GPs are not covered in HTF).

Thomas Schön, Division of Automatic Control, Linköping University, Linköping, Sweden.
Email: [email protected], Phone: 013 - 281373, Office: House B, Entrance 27.
AUTOMATIC CONTROL, REGLERTEKNIK, LINKÖPINGS UNIVERSITET

Summary of lecture 4 (I/II)  3(39)

A kernel function k(x, z) is defined as an inner product

    k(x, z) = φ(x)^T φ(z),

where φ(x) is a fixed mapping.

Introduced the kernel trick (a.k.a. kernel substitution): in an algorithm where the input data x enters only in the form of scalar products, we can replace this scalar product with another choice of kernel.

The use of kernels allows us to implicitly use basis functions of high, even infinite, dimension (M → ∞).

Summary of lecture 4 (II/II)  4(39)

A neural network is a nonlinear function (as a function expansion) from a set of input variables to a set of output variables, controlled by adjustable parameters w.

This function expansion is found by formulating the problem as usual, which results in a (non-convex) optimization problem. This problem is solved using numerical methods.

Backpropagation refers to a way of computing the gradients by making use of the chain rule, combined with clever reuse of information that is needed for more than one gradient.

Introductory example (I/IV)  5(39)

Introductory example (II/IV)  6(39)
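The identity k(x, z) = φ(x)^T φ(z) can be checked numerically. A minimal sketch, using the standard degree-2 polynomial kernel in two dimensions and its explicit feature map (a textbook example, not taken from the slides):

```python
import numpy as np

def phi(x):
    # Explicit feature map for the degree-2 polynomial kernel in 2D,
    # chosen so that (x^T z)^2 = phi(x)^T phi(z).
    x1, x2 = x
    return np.array([x1**2, np.sqrt(2.0) * x1 * x2, x2**2])

def k(x, z):
    # The kernel evaluated directly, without ever forming phi.
    return np.dot(x, z) ** 2

x = np.array([1.0, 2.0])
z = np.array([3.0, 0.5])

print(k(x, z))           # 16.0
print(phi(x) @ phi(z))   # 16.0 -- same value via the feature map
```

The point of the kernel trick is the left-hand computation: the inner product in feature space is obtained without ever constructing φ(x), which is what makes infinite-dimensional feature maps usable.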
Introductory example (III/IV)  7(39)

Introductory example (IV/IV)  8(39)

Stochastic processes  9(39)

Definition (stochastic process): A stochastic process can be defined as a family of random variables {y(x), x ∈ X}.

Property: For a fixed x ∈ X, y(x) is a random variable.

Examples: Wiener process, Chinese restaurant process, Dirichlet process, Poisson process, Gaussian process, Markov process.

Åström, K. J. (2006). Introduction to Stochastic Control Theory. Dover Publications, Inc., NY, USA.

Linear regression model on matrix form  10(39)

Write the linear regression model (without noise) as

    y_n = w^T φ(x_n),    n = 1, ..., N,

where w = (w_0  w_1  ...  w_{M−1})^T and φ(x_n) = (1  φ_1(x_n)  ...  φ_{M−1}(x_n))^T. On matrix form,

    Y = Φ w,

where

    Y = (y_1  y_2  ...  y_N)^T,

    Φ = ( φ_0(x_1)  φ_1(x_1)  ...  φ_{M−1}(x_1)
          φ_0(x_2)  φ_1(x_2)  ...  φ_{M−1}(x_2)
          ...
          φ_0(x_N)  φ_1(x_N)  ...  φ_{M−1}(x_N) ).

Gram matrix made up of kernels  11(39)

The matrix K formed from covariance functions (kernels) k(x_n, x_m),

    K_{n,m} = k(x_n, x_m),

is referred to as the Gram matrix.

Definition (covariance function (kernel)): Given any collection of points x_1, ..., x_N, a covariance function k(x_n, x_m) defines the elements of an N × N matrix

    K_{n,m} = k(x_n, x_m),

such that K is positive semidefinite.

A Gaussian Process (GP)  12(39)

Definition (Gaussian process): A Gaussian process is a collection of random variables, any finite number of which have a joint Gaussian distribution.

What does this mean?

Samples from a GP  13(39)

Let y(x) be a Gaussian process with mean function m = 0 and covariance function Cov(y(x), y(x′)) = k(x, x′) = e^{−(x−x′)²/l}. Let x = 1:20.
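Drawing samples from this GP amounts to building the Gram matrix on the grid x = 1:20 and sampling from the corresponding multivariate Gaussian. A minimal numpy sketch (the small jitter added to the diagonal is a standard numerical device, not part of the slides):

```python
import numpy as np

def k(x, xp, l):
    # Squared-exponential covariance from the slides: k(x, x') = exp(-(x - x')^2 / l)
    return np.exp(-(x - xp) ** 2 / l)

x = np.arange(1, 21, dtype=float)      # x = 1:20
rng = np.random.default_rng(0)

for l in (1, 10, 20):
    K = k(x[:, None], x[None, :], l)   # 20 x 20 Gram matrix
    K += 1e-9 * np.eye(len(x))         # jitter so K is numerically positive definite
    sample = rng.multivariate_normal(np.zeros(len(x)), K)
    print(f"l = {l}: drew one sample y(x) of length {len(sample)}")
```

Each draw is a whole vector (y(1), ..., y(20)), i.e. one "function" evaluated on the grid; larger l makes nearby points more strongly correlated.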
Samples from this GP are shown below.

(Figure: three panels of samples y(x) against x ∈ [0, 20], for l = 1, l = 10 and l = 20.)

Note that each sample is a function. In this sense a GP can be interpreted as giving a distribution over functions.

GP as a distribution over functions  14(39)

It is commonly said that the GP defined by

    p(Y | X) = N(Y | 0, K)

specifies a distribution over functions. The term "function" is potentially confusing, since it merely refers to a set of output values y_1, ..., y_N that corresponds to a set of input variables x_1, ..., x_N. Hence, there is no explicit functional form for the input-output map.

Gaussian Process Regression (GPR)  15(39)

So how do we use this for regression? Assume that we are given the training data {(y_n, x_n)}_{n=1}^N and that we seek an estimate of y(x*). If we assume that y(x) can be modeled by a GP, we have

    ( y     )       ( ( m(x)  )   ( k(x, x)    k(x, x*)  ) )
    ( y(x*) )  ~  N ( ( m(x*) ) , ( k(x*, x)   k(x*, x*) ) ),

and using standard Gaussian identities (lecture 1) we obtain the predictive (or conditional) density

    y(x*) | y  ~  N( m(x*) + k(x*, x) k(x, x)^{-1} (y − m(x)),
                     k(x*, x*) − k(x*, x) k(x, x)^{-1} k(x, x*) ).

Let us try this.
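The predictive density above translates almost line-by-line into code. A minimal sketch of zero-mean GP prediction (the helper name `gp_predict` and the test data are illustrative, not from the slides):

```python
import numpy as np

def k(a, b, l=1.0):
    # Squared-exponential kernel: k(x, x') = exp(-(x - x')^2 / l)
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / l)

def gp_predict(x_train, y_train, x_star, l=1.0, jitter=1e-9):
    """Zero-mean GP predictive density: mean and covariance of y(x*) | y."""
    K = k(x_train, x_train, l) + jitter * np.eye(len(x_train))
    K_s = k(x_star, x_train, l)             # k(x*, x)
    K_ss = k(x_star, x_star, l)             # k(x*, x*)
    alpha = np.linalg.solve(K, y_train)     # k(x, x)^{-1} y  (solve, not inverse)
    mean = K_s @ alpha
    cov = K_ss - K_s @ np.linalg.solve(K, K_s.T)
    return mean, cov

x_train = np.array([1.0, 2.0, 3.0])
y_train = np.array([0.5, -0.2, 0.3])
mean, cov = gp_predict(x_train, y_train, x_train)
```

At the training inputs the noiseless predictive mean reproduces the observed values and the predictive variance collapses to (numerically) zero, which is exactly the interpolation behaviour seen in the slide figures.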
GP predictive distribution  16(39)

Given:

    x = (9.8  15.4  7.9  5.4  0.7)^T,
    y = (0.1  −2.1  −1.3  1.7  0.01)^T.

Assume: data generated by a GP with m = 0 and k(x, x′) = e^{−(x−x′)²/l}.

Predictive GP:

    y(x*) | y  ~  N( k(x*, x) k(x, x)^{-1} y,
                     k(x*, x*) − k(x*, x) k(x, x)^{-1} k(x, x*) ).

(Figures: the predictive mean and uncertainty over x* ∈ [0, 20] for l = 1, l = 10 and l = 20.)

Adding measurement noise with variance σ², the covariance function becomes

    k(x, x′) = e^{−(x−x′)²/l} + σ² δ_{x,x′},

and the predictive GP for the noisy observations t is

    t(x*) | t  ~  N( k(x*, x) k(x, x)^{-1} t,
                     k(x*, x*) − k(x*, x) k(x, x)^{-1} k(x, x*) ).

(Figure: the predictive distribution for l = 20 with measurement noise.)

Finding the hyperparameters  17(39)

The parameters of the kernels (e.g. l) are often referred to as hyperparameters and they are typically found using empirical Bayes (lecture 2).
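Empirical Bayes selects hyperparameters by maximizing the log marginal likelihood of the training data, which for a zero-mean GP is log p(y | x, l) = −½ yᵀK⁻¹y − ½ log|K| − (N/2) log 2π. A minimal sketch with a grid search over l (the noise variance σ² = 0.01 is a chosen value, and the signs in y were partly garbled in extraction, so both are illustrative rather than from the slides):

```python
import numpy as np

def log_marginal_likelihood(x, y, l, sigma2=0.01):
    # log p(y | x, l) for a zero-mean GP with squared-exponential kernel
    # plus measurement-noise variance sigma2 (sigma2 is an assumed value).
    N = len(x)
    K = np.exp(-(x[:, None] - x[None, :]) ** 2 / l) + sigma2 * np.eye(N)
    sign, logdet = np.linalg.slogdet(K)
    return -0.5 * y @ np.linalg.solve(K, y) - 0.5 * logdet - 0.5 * N * np.log(2 * np.pi)

x = np.array([9.8, 15.4, 7.9, 5.4, 0.7])
y = np.array([0.1, -2.1, -1.3, 1.7, 0.01])   # signs partly garbled in the source

grid = [0.5, 1.0, 2.0, 5.0, 10.0, 20.0]
scores = {l: log_marginal_likelihood(x, y, l) for l in grid}
l_hat = max(scores, key=scores.get)
print(f"empirical-Bayes choice from the grid: l = {l_hat}")
```

In practice the marginal likelihood is maximized with a gradient-based optimizer rather than a grid, but the grid makes the trade-off visible: the data term favours flexible fits (small l) while the log-determinant term penalizes complexity.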
