Deep Generative Models
Jakub M. Tomczak, AMLAB, UvA, 2017/11/29

Pages 2-10: Do we need generative modeling?
For new data: high probability of the blue label = a highly probable decision!
But: high probability of the blue label x low probability of the object = an uncertain decision!

Page 11: Generative Modeling
● Providing a decision is not enough. How do we evaluate uncertainty? The distribution of y is only a part of the story.
● Generalization problem: without knowing the distribution of x, how can we generalize to new data?
● Understanding the problem is crucial ("What I cannot create, I do not understand", Richard P. Feynman). Properly modeling the data is essential to make better decisions.

Page 12: Generative Modeling
● Semi-supervised learning: use unlabeled data to train a better classifier.

Page 13: Generative Modeling
● Handling missing or distorted data: reconstruct and/or denoise data.

Page 14: Generative Modeling (image generation)
Real and generated samples.
CHEN, Xi, et al. Variational lossy autoencoder. arXiv preprint arXiv:1611.02731, 2016.

Page 15: Generative Modeling (sequence generation)
Generated sentences.
BOWMAN, Samuel R., et al. Generating sentences from a continuous space. arXiv preprint arXiv:1511.06349, 2015.

Pages 16-20: How to formulate a generative model?
Modeling in a high-dimensional space is difficult → modeling all dependencies among pixels is very inefficient!
A possible solution? → Models with latent variables.

Pages 21-26: Boltzmann distribution
A joint distribution of random variables:
p(x) = exp(-E(x)) / Z,
where E(x) is the energy function and Z = Σ_x exp(-E(x)) is the partition function. It is called the Boltzmann (or Gibbs) distribution.
Advantages:
– The joint distribution is fully described by E(x).
– It is a well-studied distribution in statistical physics.
Drawbacks:
– Calculating the partition function requires enumerating all x, e.g., if the x's are binary → 2^D combinations for D variables.
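
To make the drawback concrete, here is a small NumPy sketch (mine, not from the slides) that computes the Boltzmann distribution by brute force; the explicit sum over all 2^D binary states is exactly what becomes infeasible for high-dimensional x.

```python
import itertools
import numpy as np

def boltzmann_distribution(energy_fn, D):
    """Brute-force p(x) = exp(-E(x)) / Z over all 2^D binary vectors."""
    states = np.array(list(itertools.product([0, 1], repeat=D)))   # 2^D rows
    energies = np.array([energy_fn(x) for x in states])
    unnormalized = np.exp(-energies)
    Z = unnormalized.sum()              # partition function: a sum over 2^D terms
    return states, unnormalized / Z

# Toy quadratic energy; D = 20 already means ~10^6 states,
# and D = 784 (a 28x28 binary image) is hopeless.
rng = np.random.default_rng(0)
D = 10
W = rng.normal(scale=0.1, size=(D, D)); W = (W + W.T) / 2
b = rng.normal(scale=0.1, size=D)
states, p = boltzmann_distribution(lambda x: -x @ W @ x - b @ x, D)
print(p.sum())   # 1.0
```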

Pages 27-31: Boltzmann Machines
Let us consider the following energy function, where the x's are binary:
E(x) = -x^T W x - b^T x   (W symmetric with a zero diagonal),
where the second-order term x^T W x models correlations and the first-order term b^T x models a "prior".
Very complicated… Can we do better? Latent variables!

Pages 32-33: Restricted Boltzmann Machines
Let us consider the following energy function, where x (the image) and h (the hidden units) are binary:
E(x, h) = -x^T W h - b^T x - c^T h,
with W the weights (receptive fields).

Pages 34-35: Restricted Boltzmann Machines
Conditional dependencies:
p(h_j = 1 | x) = sigm(c_j + Σ_i W_ij x_i),
p(x_i = 1 | h) = sigm(b_i + Σ_j W_ij h_j),
where sigm(a) = 1 / (1 + exp(-a)).
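
These conditionals are cheap to compute. A minimal NumPy sketch (my own illustration, assuming the energy E(x, h) = -x^T W h - b^T x - c^T h from the previous slide):

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def p_h_given_x(x, W, c):
    """p(h_j = 1 | x) for every hidden unit j."""
    return sigmoid(c + x @ W)

def p_x_given_h(h, W, b):
    """p(x_i = 1 | h) for every visible unit i."""
    return sigmoid(b + W @ h)

def sample_bernoulli(p, rng):
    """Draw a binary vector with the given Bernoulli probabilities."""
    return (rng.random(p.shape) < p).astype(np.float64)
```
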
Pages 36-37: RBM: Training
We train RBMs by maximizing the log-likelihood function:
log p(x) = -F(x) - log Z,
where the free energy is
F(x) = -b^T x - Σ_j softplus(c_j + Σ_i W_ij x_i),
with softplus(a) = log(1 + exp(a)).
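
Assuming the same parameterization as above, the free energy is a few lines of NumPy (a sketch, not the lecture's code):

```python
import numpy as np

def free_energy(x, W, b, c):
    """F(x) = -b^T x - sum_j softplus(c_j + x^T W[:, j])."""
    pre_activation = c + x @ W
    softplus = np.logaddexp(0.0, pre_activation)   # log(1 + exp(a)), numerically stable
    return -x @ b - softplus.sum()
```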

Pages 38-39: RBM: Training
Let us calculate the gradient of the log-likelihood with respect to the parameters:
∂ log p(x) / ∂θ = -∂F(x)/∂θ + E_{p(x')}[ ∂F(x')/∂θ ].
The first (data) term is easy; the second (model expectation) is difficult and requires approximate inference (e.g., MCMC).

Page 40: RBM: Contrastive Divergence
Apply k steps of the Gibbs sampler and approximate the gradient as follows:
∂ log p(x) / ∂θ ≈ -∂F(x)/∂θ + ∂F(x̃_k)/∂θ,
where x̃_k is obtained from x by k steps of Gibbs sampling.
Hinton, G. E. (2002). Training products of experts by minimizing contrastive divergence. Neural Computation, 14(8), 1771-1800.
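
Putting the pieces together, a hedged sketch of one CD-k update for the weights and biases, reusing the p_h_given_x / p_x_given_h / sample_bernoulli helpers sketched earlier (the learning rate and in-place updates are illustrative choices, not the author's code):

```python
import numpy as np

def cd_k_update(x, W, b, c, k=1, lr=0.01, rng=None):
    """One Contrastive Divergence step on a single binary example x."""
    rng = np.random.default_rng() if rng is None else rng

    # Positive phase: statistics on the data.
    ph_data = p_h_given_x(x, W, c)

    # Negative phase: k steps of Gibbs sampling started from the data.
    x_neg = x.copy()
    for _ in range(k):
        h_neg = sample_bernoulli(p_h_given_x(x_neg, W, c), rng)
        x_neg = sample_bernoulli(p_x_given_h(h_neg, W, b), rng)
    ph_neg = p_h_given_x(x_neg, W, c)

    # Approximate gradient of log p(x): positive minus negative statistics.
    W += lr * (np.outer(x, ph_data) - np.outer(x_neg, ph_neg))
    b += lr * (x - x_neg)
    c += lr * (ph_data - ph_neg)
    return W, b, c
```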

Page 41: RBM with Gaussian inputs

Page 42: Subspace RBM

Tomczak, J. M., & Gonczarek, A. (2017). Learning invariant features using Subspace Restricted Boltzmann Machine. Neural Processing Letters, 45(1), 173-182.

Page 43: Deep Belief Network

Page 44: Deep Boltzmann Machines

Page 45: Training DBM

Pages 46-48: Latent Variable Models
● Latent variable model: p(x) = ∫ p(x|z) p(z) dz.
First, sample z. Second, sample x for the given z.

Pages 49-52: Latent Variable Models
● If p(z) = N(z | 0, I) and p(x|z) = N(x | Wz + b, Ψ) with diagonal Ψ, then → Factor Analysis. Convenient but limiting!
● What if we take a non-linear transformation of z (a neural network)? → an infinite mixture of Gaussians.
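
To illustrate the two bullets ("first sample z, then x given z"; linear vs. non-linear transformation), a small NumPy sketch with toy dimensions of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(1)
D_z, D_x, noise_std = 2, 5, 0.1

# Factor Analysis: a linear map of z plus Gaussian noise.
W = rng.normal(size=(D_x, D_z))
mu = rng.normal(size=D_x)

def sample_factor_analysis():
    z = rng.standard_normal(D_z)                                 # first, sample z ~ N(0, I)
    return W @ z + mu + noise_std * rng.standard_normal(D_x)     # then x | z

# Non-linear decoder: a tiny neural network -> an infinite mixture of Gaussians.
W1, W2 = rng.normal(size=(16, D_z)), rng.normal(size=(D_x, 16))

def sample_nonlinear():
    z = rng.standard_normal(D_z)
    mean = W2 @ np.tanh(W1 @ z)                                  # non-linear transformation of z
    return mean + noise_std * rng.standard_normal(D_x)

print(sample_factor_analysis(), sample_nonlinear())
```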

Page 53: Deep Generative Models (DGM): Density Network
MacKay, D. J., & Gibbs, M. N. (1999). Density networks. Statistics and Neural Networks: Advances at the Interface. Oxford University Press, Oxford, 129-144.

Pages 54-55: DGM: Density Network
A neural network maps z to the parameters of p(x|z). How to train this model?!

Pages 56-58: DGM: Density Network
● MC approximation:
p(x) ≈ (1/K) Σ_{k=1..K} p(x | z_k), where z_k ~ p(z).
Sample z many times, apply the log-sum-exp trick, and maximize the log-likelihood.
It scales badly in high-dimensional cases!
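
A sketch of this Monte Carlo estimate of log p(x) with the log-sum-exp trick, assuming (for illustration only) a Gaussian decoder with a fixed noise level:

```python
import numpy as np
from scipy.special import logsumexp

def log_px_mc(x, decoder_mean, K=1000, D_z=2, sigma=1.0, rng=None):
    """log p(x) ≈ log (1/K) Σ_k p(x | z_k),  z_k ~ N(0, I)."""
    rng = np.random.default_rng() if rng is None else rng
    z = rng.standard_normal((K, D_z))
    means = np.array([decoder_mean(z_k) for z_k in z])            # (K, D_x)
    log_px_given_z = (-0.5 * np.sum((x - means) ** 2, axis=1) / sigma**2
                      - 0.5 * x.size * np.log(2 * np.pi * sigma**2))
    return logsumexp(log_px_given_z) - np.log(K)
```

For high-dimensional x the number of samples K needed for a usable estimate explodes, which is exactly the scaling problem stated above.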

Pages 59-60: DGM: Density Network
PROS: log-likelihood approach; easy sampling; training using gradient-based methods.
CONS: requires explicit models; fails in high-dimensional cases.
Can we do better?

Pages 61-66: DGM: so far we have
Density Network: works only for low-dimensional cases... inefficient training...
Generative Adversarial Net: works for high-dimensional cases! ...but it doesn't train a distribution... unstable training...
QUESTION: Can we stick to the log-likelihood approach but with a simple training procedure?

Pages 67-68: DGM: Variational Auto-Encoder
Density Network, Generative Adversarial Net → Variational Auto-Encoder: an encoder and a decoder.
Kingma, D. P., & Welling, M. (2013). Auto-Encoding Variational Bayes. arXiv preprint arXiv:1312.6114.

Pages 69-71: DGM: Variational Auto-Encoder
Introduce a variational posterior q(z|x) and maximize the evidence lower bound:
log p(x) ≥ E_{q(z|x)}[ log p(x|z) ] - KL( q(z|x) || p(z) ),
i.e., a reconstruction error term plus a regularization term.
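
For completeness, the standard one-line derivation of this bound via Jensen's inequality (not spelled out in the transcript):
log p(x) = log ∫ p(x|z) p(z) dz = log E_{q(z|x)}[ p(x|z) p(z) / q(z|x) ] ≥ E_{q(z|x)}[ log p(x|z) ] - KL( q(z|x) || p(z) ).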

Pages 72-73: DGM: Variational Auto-Encoder
Our objective is the evidence lower bound. We can approximate it using an MC sample.
How do we properly calculate gradients (i.e., train the model)?

Pages 74-76: DGM: Variational Auto-Encoder
PROBLEM: calculating the gradient with respect to the parameters of the variational posterior (i.e., the sampling process).
SOLUTION: use a non-centered parameterization (a.k.a. the reparameterization trick):
z = μ(x) + σ(x) ⊙ ε, with ε ~ N(0, I),
where μ(x) and σ(x) are outputs of a neural network.
Kingma, D. P., & Welling, M. (2013). Auto-Encoding Variational Bayes. arXiv preprint arXiv:1312.6114.
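
A hedged PyTorch sketch of the trick (shapes and names are illustrative assumptions): the sample becomes a deterministic function of (μ, log σ²) and external noise ε, so gradients flow back to the encoder parameters.

```python
import torch

def reparameterize(mu, log_var):
    """z = mu + sigma * eps with eps ~ N(0, I); differentiable w.r.t. mu and log_var."""
    std = torch.exp(0.5 * log_var)
    eps = torch.randn_like(std)
    return mu + std * eps

mu = torch.zeros(4, 2, requires_grad=True)
log_var = torch.zeros(4, 2, requires_grad=True)
z = reparameterize(mu, log_var)
z.sum().backward()        # gradients reach mu and log_var through the sample
print(mu.grad.shape)      # torch.Size([4, 2])
```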

Pages 77-80: DGM: Variational Auto-Encoder
– A deep neural net that outputs the parameters of the variational posterior (the encoder).
– A deep neural net that outputs the parameters of the generator (the decoder), e.g., a normal distribution or a Bernoulli distribution.
– A prior p(z) that regularizes the encoder and takes part in the generative process.
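
Putting the three components together, a minimal PyTorch sketch of a Gaussian-encoder / Bernoulli-decoder VAE with a standard-normal prior (dimensions, architecture, and names are my own assumptions for illustration, not the lecture's implementation):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VAE(nn.Module):
    def __init__(self, x_dim=784, h_dim=300, z_dim=40):
        super().__init__()
        # Encoder: outputs the parameters of the variational posterior q(z|x).
        self.enc = nn.Sequential(nn.Linear(x_dim, h_dim), nn.ReLU())
        self.enc_mu = nn.Linear(h_dim, z_dim)
        self.enc_log_var = nn.Linear(h_dim, z_dim)
        # Decoder: outputs the parameters (logits) of a Bernoulli distribution over pixels.
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, x_dim))

    def forward(self, x):
        h = self.enc(x)
        mu, log_var = self.enc_mu(h), self.enc_log_var(h)
        z = mu + torch.exp(0.5 * log_var) * torch.randn_like(mu)   # reparameterization
        logits = self.dec(z)
        # Negative ELBO = reconstruction error + KL(q(z|x) || N(0, I)).
        recon = F.binary_cross_entropy_with_logits(logits, x, reduction='sum')
        kl = -0.5 * torch.sum(1 + log_var - mu.pow(2) - log_var.exp())
        return (recon + kl) / x.shape[0]

# One optimization step on dummy binary data.
model = VAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.bernoulli(torch.rand(8, 784))
loss = model(x)
opt.zero_grad(); loss.backward(); opt.step()
```

The pieces map directly onto the slides above: enc_mu / enc_log_var parameterize q(z|x), dec parameterizes p(x|z), and N(0, I) plays the role of the prior p(z).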

Pages 81-85: DGM: Variational Auto-Encoder
Decoders: feedforward nets, convolutional nets, PixelCNN, Gated PixelCNN.
Variational posteriors: normalizing flows, volume-preserving flows, Gaussian processes, Stein Particle Descent, Operator VI.
Priors: auto-regressive prior, objective prior, stick-breaking prior, VampPrior.
Objectives: Importance Weighted AE, Rényi divergence, Stein divergence.

Page 86: DGM: VAE
PROS: log-likelihood framework; easy sampling; training using gradient-based methods; stable training; discovers a latent representation; can easily be combined with other probabilistic frameworks.
CONS: only explicit models; produces blurry images(?).

Page 87: In order to make better decisions, we need a better understanding of reality. = generative modeling

Page 88:
Web page: https://jmtomczak.github.io
Code on GitHub: https://github.com/jmtomczak
Contact: [email protected]@gmail.com

Part of the presented research was funded by the European Commission within the Marie Skłodowska-Curie Individual Fellowship (Grant No. 702666, ''Deep learning and Bayesian inference for medical imaging'').