Lecture 11: Generative Models
Fei-Fei Li & Justin Johnson & Serena Yeung
May 9, 2019

Administrative
● A3 is out. Due May 22.
● Milestone is due next Wednesday.
○ Read the Piazza post for milestone requirements.
○ You need to finish data preprocessing and initial results by then.
● Don't discuss the exam yet, since people are still taking it.

Overview
● Unsupervised Learning
● Generative Models
○ PixelRNN and PixelCNN
○ Variational Autoencoders (VAE)
○ Generative Adversarial Networks (GAN)

Supervised vs Unsupervised Learning

Supervised Learning
Data: (x, y), where x is data and y is a label.
Goal: Learn a function to map x -> y.
Examples: Classification, regression, object detection, semantic segmentation, image captioning, etc.
[Figures: classification ("Cat"), object detection ("DOG, DOG, CAT"), semantic segmentation ("GRASS, CAT, TREE, SKY"), and image captioning ("A cat sitting on a suitcase on the floor", caption generated using neuraltalk2). Images are CC0 public domain.]

Unsupervised Learning
Data: x. Just data, no labels!
Goal: Learn some underlying hidden structure of the data.
Examples: Clustering, dimensionality reduction, feature learning, density estimation, etc.
[Figures: K-means clustering (image is CC0 public domain); Principal Component Analysis projecting 3-d data to 2-d (dimensionality reduction; image from Matthias Scholz is CC0 public domain).]
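To make the dimensionality-reduction example concrete, here is a minimal PCA sketch in NumPy. It is not part of the original slides; the toy data, the choice of k = 2, and the SVD-based formulation are illustrative assumptions.

```python
import numpy as np

# Hypothetical toy data: 200 points in 3-d that lie near a 2-d plane.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2)) @ rng.normal(size=(2, 3)) \
    + 0.05 * rng.normal(size=(200, 3))

# PCA: center the data, then project onto the top-k right-singular
# vectors of the centered data matrix (the principal components).
X_centered = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
k = 2
X_2d = X_centered @ Vt[:k].T  # 3-d points projected down to 2-d

print(X_2d.shape)  # (200, 2)
```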
More unsupervised learning examples:
[Figure: Autoencoders (feature learning).]
[Figures: 1-d density estimation (figure copyright Ian Goodfellow, 2016, reproduced with permission); 2-d density estimation (left and right 2-d density images are CC0 public domain).]

Supervised vs Unsupervised Learning
Supervised: data (x, y), learn a function mapping x -> y.
Unsupervised: data x only, learn hidden structure. Training data is cheap. Holy grail: solve unsupervised learning => understand the structure of the visual world.

Generative Models
Given training data, generate new samples from the same distribution.
Training data ~ p_data(x); generated samples ~ p_model(x).
We want to learn p_model(x) similar to p_data(x).

This addresses density estimation, a core problem in unsupervised learning. Several flavors:
- Explicit density estimation: explicitly define and solve for p_model(x)
- Implicit density estimation: learn a model that can sample from p_model(x) without explicitly defining it
(A code sketch contrasting the two flavors follows the next slide.)

Why Generative Models?
- Realistic samples for artwork, super-resolution, colorization, etc.
- Generative models of time-series data can be used for simulation and planning (reinforcement learning applications!)
- Training generative models can also enable inference of latent representations that can be useful as general features
[Figures, left to right, are copyright: (1) Alec Radford et al. 2016; (2) Phillip Isola et al. 2017, reproduced with the authors' permission; (3) BAIR Blog.]
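The following minimal sketch, not from the slides, contrasts explicit and implicit density estimation on a 1-d toy problem. The Gaussian model and the trivial sampler are illustrative stand-ins; in practice a GAN generator plays the implicit role.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=0.5, size=1000)  # stand-in samples from p_data

# Explicit density estimation: define a parametric p_model (here a 1-d
# Gaussian) and solve for its parameters by maximum likelihood.
mu, sigma = data.mean(), data.std()

def p_model(x):
    """A density we can evaluate at any point x."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

print(p_model(2.0))

# Implicit density estimation: a model we can only *sample* from; it
# never exposes a density we can evaluate.
def sample_model(n):
    return mu + sigma * rng.normal(size=n)

print(sample_model(5))
```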
Taxonomy of Generative Models
Today: discuss the 3 most popular types of generative models.
- Explicit density
  - Tractable density: Fully Visible Belief Nets (NADE, MADE, PixelRNN/CNN, NICE / RealNVP, Glow, Ffjord)
  - Approximate density
    - Variational: Variational Autoencoder
    - Markov Chain: Boltzmann Machine
- Implicit density
  - Direct: GAN
  - Markov Chain: GSN
[Figure copyright and adapted from Ian Goodfellow, Tutorial on Generative Adversarial Networks, 2017.]

PixelRNN and PixelCNN

Fully visible belief network
Explicit density model. Use the chain rule to decompose the likelihood of an image x into a product of 1-d distributions:

p(x) = \prod_{i=1}^{n} p(x_i \mid x_1, \ldots, x_{i-1})

Here p(x) is the likelihood of image x, and each factor is the probability of the i-th pixel value given all previous pixels. Then maximize the likelihood of the training data.

The distribution over pixel values is complex => express it using a neural network! We will also need to define an ordering of the "previous pixels".

PixelRNN [van den Oord et al. 2016]
Generate image pixels starting from the corner. Dependency on previous pixels is modeled using an RNN (LSTM).
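To illustrate generation under this chain-rule factorization, here is a toy autoregressive sampler; it is a sketch, not the PixelRNN architecture. The 4x4 binary image, the raster ordering, and the fixed random "model" weights are all illustrative assumptions (PixelRNN conditions on previous pixels with an LSTM, PixelCNN with a masked convolution).

```python
import numpy as np

# Toy autoregressive sampler over a 4x4 binary image, following
# p(x) = prod_i p(x_i | x_<i). The "model" below is a hypothetical
# stand-in: a logit computed from fixed random weights over the
# already-sampled prefix of pixels.
rng = np.random.default_rng(0)
H = W = 4
n = H * W
weights = rng.normal(scale=0.5, size=(n, n))  # row i: dependence of pixel i on x_<i

def sample_image():
    x = np.zeros(n)
    for i in range(n):  # raster order: start at the corner, row by row
        logit = weights[i, :i] @ x[:i] if i > 0 else 0.0
        p = 1.0 / (1.0 + np.exp(-logit))  # p(x_i = 1 | all previous pixels)
        x[i] = rng.random() < p           # sample the i-th pixel value
    return x.reshape(H, W)

print(sample_image())
```

Note that generation is inherently sequential: each pixel must be sampled before the next can be predicted, which is why sampling from these models is slow.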