Deep Learning Essentials: Deep Generative Models
Russ Salakhutdinov
Machine Learning Department, Carnegie Mellon University

Statistical Generative Models
Prior knowledge: model family, loss function, optimization algorithm, etc.
Learning: combine the data with this prior knowledge to fit a probability distribution p(x) over inputs x (e.g. images).
Sampling from p(x) generates new images.
Grover and Ermon, DGM Tutorial
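As a minimal sketch of this recipe (not from the slides; the 2-D data and the Gaussian model family are illustrative assumptions), one can pick a model family, fit its parameters to data, and then sample new points from the learned p(x):

```python
import numpy as np

rng = np.random.default_rng(0)

# "Data": 500 samples from an unknown 2-D distribution (illustrative).
data = rng.normal(loc=[2.0, -1.0], scale=[0.5, 1.5], size=(500, 2))

# Prior knowledge / model family: a single multivariate Gaussian.
# "Learning" here is maximum-likelihood estimation, which is closed form
# for this family (no iterative optimizer needed).
mu = data.mean(axis=0)
cov = np.cov(data, rowvar=False)

# Sampling from the learned p(x) generates new points resembling the data.
new_points = rng.multivariate_normal(mu, cov, size=5)
print(new_points.shape)  # (5, 2)
```

Richer model families (Boltzmann machines, VAEs, GANs, covered below) follow the same learn-then-sample pattern, but require iterative optimization rather than a closed-form fit.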
Unsupervised Learning

Non-probabilistic Models
- Sparse Coding
- Autoencoders
- Others (e.g. k-means)

Probabilistic (Generative) Models

Explicit Density p(x):
- Tractable Models: Mixture of Gaussians, Autoregressive Models, Normalizing Flows, many others
- Non-Tractable Models: Boltzmann Machines, Variational Autoencoders, Helmholtz Machines, many others

Implicit Density:
- Generative Adversarial Networks
- Moment Matching Networks

Talk Roadmap
- Fully Observed Models
- Undirected Deep Generative Models: Restricted Boltzmann Machines (RBMs), Deep Boltzmann Machines (DBMs)
- Directed Deep Generative Models: Variational Autoencoders (VAEs), Normalizing Flows
- Generative Adversarial Networks (GANs)

Autoencoder
Input Image → Encoder → Feature Representation → Decoder → Reconstruction

- Encoder: feed-forward, bottom-up path from the input image to the feature representation.
- Decoder: feed-back, generative, top-down path from the features back to the input.

Details of what goes inside the encoder and decoder matter.
Need constraints to avoid learning an identity mapping.

Autoencoder
Binary Features z
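The encoder/decoder structure above can be sketched in a few lines of numpy (an illustrative toy, not the lecture's implementation; the data, dimensions, and learning rate are all assumptions). The sigmoid encoder squashes features z into (0, 1), matching the binary-feature view, and plain gradient descent on squared reconstruction error trains both paths:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 100 points in 20 dimensions (stand-ins for flattened images).
X = rng.normal(size=(100, 20))

d, k = 20, 5                        # input dim, bottleneck (feature) dim
W_enc = rng.normal(scale=0.1, size=(d, k))   # encoder weights (bottom-up)
W_dec = rng.normal(scale=0.1, size=(k, d))   # decoder weights (top-down)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

lr = 0.01
losses = []
for _ in range(200):
    z = sigmoid(X @ W_enc)          # encoder: features z in (0, 1)
    X_hat = z @ W_dec               # decoder: linear reconstruction
    err = X_hat - X
    losses.append((err ** 2).mean())
    # Backpropagate the squared-error loss through both paths.
    g_dec = z.T @ err / len(X)
    g_z = (err @ W_dec.T) * z * (1 - z)   # through the sigmoid
    g_enc = X.T @ g_z / len(X)
    W_dec -= lr * g_dec
    W_enc -= lr * g_enc

print(losses[0], losses[-1])        # reconstruction error shrinks
```

Note that with k < d the bottleneck itself is the constraint that prevents an identity mapping; wider autoencoders need extra constraints such as sparsity or noise.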