Belief Propagation and MRFs

Bill Freeman, 6.869, November 1, 2012

Undirected graphical models
• A set of nodes joined by undirected edges.
• The graph makes conditional independencies explicit: if two nodes are not linked, and we condition on every other node in the graph, then those two nodes are conditionally independent.

Undirected graphical models: cliques
• A clique is a fully connected set of nodes.
• A maximal clique is a clique that can't include more nodes of the graph without losing the clique property.

Undirected graphical models: probability factorization
• The Hammersley-Clifford theorem addresses the pdf factorization implied by a graph: a distribution has the Markov structure implied by an undirected graph iff it can be represented in the factored form

  P(x) = (1/Z) ∏_{c ∈ C} Ψ_c(x_c)

  where Z is a normalizing constant, C is the set of maximal cliques, and each potential function Ψ_c depends on the states of the variables in maximal clique c.

Graphical models: Markov random fields
• A set of nodes, with edges connecting the nodes.
• Nodes are associated with hidden variables.
• Potential functions may depend on observations.

Markov random fields (MRFs)
• Oftentimes MRFs have a regular, grid-like structure (but they don't need to).
• MRF nodes can represent pixels (Winkler, 1995, p. 32) or image patches: each observed image patch y_i connects to a hidden scene patch x_i through a compatibility function Φ(x_i, y_i), and neighboring scene patches connect through Ψ(x_i, x_j).

Network joint probability

  P(x, y) = (1/Z) ∏_{(i,j)} Ψ(x_i, x_j) ∏_i Φ(x_i, y_i)

where Ψ is the scene-scene compatibility function over neighboring scene nodes, and Φ is the image-scene compatibility function over the local observations.

Energy formulation

  E(x, y) = k + Σ_{(i,j)} β(x_i, x_j) + Σ_i α(x_i, y_i)

where β is the scene-scene compatibility, α is the image-scene compatibility, and P(x, y) ∝ exp(-E(x, y)).

In order to use MRFs:
• Given observations y, and the parameters of the MRF, how do we infer the hidden variables x?
• How do we learn the parameters of the MRF?

Outline of MRF section
• Inference in MRFs:
  – Gibbs sampling, simulated annealing
  – Iterated conditional modes (ICM)
  – Belief propagation (application example: super-resolution)
  – Graph cuts (application example: stereo)
  – Variational methods (application example: blind deconvolution)
• Learning MRF parameters:
  – Iterative proportional fitting (IPF)

Gibbs sampling
• A way to generate random samples from a (potentially very complicated) probability distribution.
• Fix all dimensions except one; draw from the resulting 1-d conditional distribution. Repeat for all dimensions, and repeat many times.
• Take an average of a subset of those samples to estimate the posterior mean.
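The network joint probability and the energy formulation above describe the same distribution, with Ψ = exp(-β) and Φ = exp(-α). A minimal brute-force check on a three-node chain of binary variables, with made-up compatibility tables (not from the lecture):

```python
import math
from itertools import product

# Made-up compatibility tables for binary states (illustration only).
# psi[a][b]: scene-scene compatibility; phi[a][c]: image-scene compatibility.
psi = [[1.0, 0.2], [0.2, 1.0]]   # favors neighboring scene nodes agreeing
phi = [[0.9, 0.1], [0.1, 0.9]]   # favors the scene matching its observation

y = [0, 1, 1]  # fixed observations at the three nodes

def joint_unnormalized(x):
    """prod_{(i,j)} Psi(x_i, x_j) * prod_i Phi(x_i, y_i) on a 3-node chain."""
    p = phi[x[0]][y[0]] * phi[x[1]][y[1]] * phi[x[2]][y[2]]
    p *= psi[x[0]][x[1]] * psi[x[1]][x[2]]
    return p

def energy(x):
    """k + sum beta(x_i, x_j) + sum alpha(x_i, y_i), with beta = -log Psi,
    alpha = -log Phi, and k = 0."""
    e = -math.log(psi[x[0]][x[1]]) - math.log(psi[x[1]][x[2]])
    e += sum(-math.log(phi[x[i]][y[i]]) for i in range(3))
    return e

# The two formulations assign identical probabilities to every state.
Z = sum(joint_unnormalized(x) for x in product([0, 1], repeat=2 + 1))
for x in product([0, 1], repeat=3):
    assert abs(joint_unnormalized(x) / Z - math.exp(-energy(x)) / Z) < 1e-12
```

With these tables the most probable scene is (0, 1, 1), matching the observations: the unary terms dominate the smoothness terms here.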
• Wait for a "burn in" period before including samples, and then subsample the Gibbs sampler outputs to obtain independent draws from the joint probability.
(Reference: Geman and Geman, IEEE PAMI 1984.)

Sampling from a 1-d function
1. Discretize the density function.
2. Compute the (cumulative) distribution function from the density function.
3. To draw a sample, draw α ~ U(0,1); then step through the bins k = 1 to n and stop at the first bin whose cumulative distribution value exceeds α, returning the corresponding value x_k.

Gibbs sampling (illustration; slide by Ce Liu)
• Starting from some initial point (x_1, x_2), alternately sample x_1 from P(x_1 | x_2) and x_2 from P(x_2 | x_1). The resulting sequence of axis-aligned moves explores the joint distribution.

Gibbs sampling and simulated annealing
• Gibbs sampling: a way to generate random samples from a (potentially very complicated) probability distribution, by fixing all dimensions except one, drawing from the resulting 1-d conditional distribution, and repeating for all dimensions, many times.
• Simulated annealing: a schedule for modifying the probability distribution so that, at "zero temperature", you draw samples only from the MAP solution.
(Reference: Geman and Geman, IEEE PAMI 1984.)
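The two ingredients above (the discretize/CDF/inverse-draw recipe and the alternating conditional draws) can be sketched together on a made-up 3x3 discrete joint distribution; the table and all names here are illustrative, not from the lecture:

```python
import random

# Made-up unnormalized 2-d discrete distribution p[x1][x2] (illustration only).
p = [[1.0, 2.0, 1.0],
     [2.0, 8.0, 2.0],
     [1.0, 2.0, 1.0]]

def draw_1d(weights, rng):
    """Inverse-CDF draw from a discretized density: build the cumulative
    distribution, then return the first bin whose cumulative value exceeds
    a uniform draw alpha."""
    total = sum(weights)
    alpha = rng.random()
    cum = 0.0
    for k, w in enumerate(weights):
        cum += w / total
        if alpha < cum:
            return k
    return len(weights) - 1  # guard against floating-point round-off

def gibbs(n_samples, burn_in=100, rng=None):
    """Alternately resample x1 ~ P(x1 | x2) and x2 ~ P(x2 | x1),
    discarding a burn-in period before collecting samples."""
    rng = rng or random.Random(0)  # seeded for reproducibility
    x1, x2 = 0, 0
    samples = []
    for t in range(burn_in + n_samples):
        x1 = draw_1d([p[i][x2] for i in range(3)], rng)  # column x2 of p
        x2 = draw_1d(p[x1], rng)                         # row x1 of p
        if t >= burn_in:
            samples.append((x1, x2))
    return samples

samples = gibbs(20000)
# Empirical marginal of x1 should approach the true marginal (0.2, 0.6, 0.2).
freq = [sum(1 for s in samples if s[0] == i) / len(samples) for i in range(3)]
```

Averaging the collected samples, as the slides suggest, gives the posterior-mean (MMSE) estimate; here the empirical marginal frequencies converge to the true row marginals of the table.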
Gibbs sampling and simulated annealing
• In simulated annealing you gradually lower the "temperature" of the probability distribution, ultimately giving zero probability to all but the MAP estimate.
• What's good about it: it finds the global MAP solution.
• What's bad about it: it takes forever (Gibbs sampling is in the inner loop).
• So you can find the mean value (MMSE estimate) of a variable by doing Gibbs sampling and averaging over the values that come out of your sampler.
• You can find the MAP estimate of a variable by doing Gibbs sampling and gradually lowering the temperature parameter to zero.

Iterated conditional modes (ICM)
Described in Winkler (1995); introduced by Besag in 1986.
• For each node:
  – Condition on all the neighbors.
  – Find the mode.
  – Repeat.
• Compare with Gibbs sampling: ICM takes the mode of each 1-d conditional instead of sampling from it.
• The result is a local maximum, and only over a very small region.
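The ICM loop above can be sketched on a toy 1-d binary denoising problem; the unary and pairwise costs below are made up for illustration and are not from the lecture:

```python
# Toy 1-d binary denoising with ICM (illustrative costs, hypothetical values).
def icm(y, unary_cost=1.0, pair_cost=0.8, sweeps=10):
    """Visit each node in turn, condition on its current neighbors, and set
    the node to its mode (the label of minimum local energy). Sweep until
    no node changes."""
    x = list(y)  # initialize the hidden scene at the observations
    for _ in range(sweeps):
        changed = False
        for i in range(len(x)):
            best_label, best_e = x[i], float("inf")
            for label in (0, 1):
                e = unary_cost * (label != y[i])          # alpha(x_i, y_i)
                if i > 0:
                    e += pair_cost * (label != x[i - 1])  # beta(x_i, x_{i-1})
                if i < len(x) - 1:
                    e += pair_cost * (label != x[i + 1])  # beta(x_i, x_{i+1})
                if e < best_e:
                    best_label, best_e = label, e
            if best_label != x[i]:
                x[i] = best_label
                changed = True
        if not changed:
            break
    return x

noisy = [1, 1, 0, 1, 1, 1, 0, 0, 0, 0]
denoised = icm(noisy)  # the isolated 0 gets smoothed away by its neighbors
```

Unlike Gibbs sampling, every update here is deterministic and greedy, which is why ICM converges quickly but only to a local maximum of the posterior.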
