This post is my summary of Pieter Abbeel's Deep Unsupervised Learning 2020 course.
This lecture is about how to estimate the data distribution, i.e., the most primitive form of a generative model. The simplest likelihood-based model is the histogram, which assigns a probability to each possible value by counting occurrences in the training data; the neural counterpart of this idea is the autoregressive model.
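As a quick illustration of the histogram idea, here is a minimal sketch (my own, not taken from the lecture materials) of fitting a histogram to a discrete variable and sampling from it; the function names and the toy data are assumptions for illustration.

```python
# Minimal sketch: a histogram as a likelihood-based generative model over
# a discrete variable x in {0, ..., K-1}. Training = normalized counts
# (the maximum-likelihood estimate for a categorical), sampling = drawing
# from the empirical distribution.
import numpy as np

def fit_histogram(data, num_values):
    """Estimate p(x) by normalized counts."""
    counts = np.bincount(data, minlength=num_values).astype(np.float64)
    return counts / counts.sum()

def sample_histogram(probs, num_samples, seed=0):
    """Draw samples from the fitted categorical distribution."""
    rng = np.random.default_rng(seed)
    return rng.choice(len(probs), size=num_samples, p=probs)

# Toy usage: 1-D discrete data with 10 possible values.
data = np.array([1, 1, 2, 3, 3, 3, 7, 9])
probs = fit_histogram(data, num_values=10)
samples = sample_histogram(probs, num_samples=5)
print(probs, samples)
```

This works for a single low-dimensional variable, but the number of bins explodes exponentially with dimensionality, which is exactly why the lecture moves on to parameterized, neural likelihood models.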
🫠 Likelihood-based models
🫠 Sampling-based: Histogram
🫠 Likelihood-based Generative models
🫠 Autoregressive models: Recurrent Neural Nets
🫠 Autoregressive models: Masking-based Models
🫠 Masking-based Models: MADE
🫠 Masking-based Models: Masked Convolutions
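As a companion to the autoregressive sections listed above, the sketch below (my own illustration, not the course code) shows the chain-rule factorization p(x) = ∏ᵢ p(xᵢ | x₍<ᵢ₎) realized with a small recurrent net over binary sequences; the class name, hidden size, and toy input are assumptions.

```python
# Minimal sketch of an autoregressive RNN likelihood model:
# each step i sees only x_{<i} and outputs the logit of p(x_i = 1 | x_{<i}).
import torch
import torch.nn as nn

class AutoregressiveRNN(nn.Module):
    def __init__(self, hidden_size=32):
        super().__init__()
        self.rnn = nn.GRU(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, 1)  # logit of p(x_i = 1 | x_{<i})

    def log_likelihood(self, x):
        # x: (batch, length) binary sequence; shift right so step i only sees x_{<i}.
        inp = torch.cat([torch.zeros_like(x[:, :1]), x[:, :-1]], dim=1).unsqueeze(-1)
        h, _ = self.rnn(inp)
        logits = self.out(h).squeeze(-1)
        # Per-example log p(x) = sum_i log p(x_i | x_{<i}).
        return -nn.functional.binary_cross_entropy_with_logits(
            logits, x, reduction="none").sum(dim=1)

x = torch.bernoulli(torch.full((4, 8), 0.5))
model = AutoregressiveRNN()
print(model.log_likelihood(x))  # log-likelihood under the (untrained) model
```

Training would simply maximize this log-likelihood over the data; masking-based models such as MADE and masked convolutions enforce the same x₍<ᵢ₎ dependency structure in a single parallel forward pass instead of a recurrent one.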