Variational Autoencoders

Transcript of slides from znu.ac.ircv.znu.ac.ir/afsharchim/DeepL (posted 20-May-2020)

  • Variational Autoencoders

  • Autoencoders vs Variational AE

  • Autoencoders vs Variational AE

  • Variational Autoencoders

  • Latent Variable

  • Latent Variable

  • Sampling Latent Variable

  • Sampling Latent Variable

  • VAE

  • KL-Divergence

    ENTROPY: if we use log2 for our calculation, we can interpret
    entropy as "the minimum number of bits it would take us to encode
    our information".

    Essentially, what we're looking at with the KL divergence is the
    expectation of the log difference between the probability of the
    data under the original distribution and under the approximating
    distribution. Again, if we think in terms of log2, we can interpret
    this as "how many bits of information we expect to lose".

    https://www.countbayesie.com/blog/2015/3/19/expectation-and-variance-from-high-school-to-grad-school
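The "bits we expect to lose" reading can be made concrete with a small pure-Python sketch (not from the slides; the function name is mine): KL(p || q) in base 2 for two discrete distributions.

```python
import math

def kl_divergence_bits(p, q):
    """KL(p || q) in bits: the expected log2 difference between the
    probability an event gets under p versus under q."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# A distribution diverges from itself by exactly 0 bits.
uniform = [0.25, 0.25, 0.25, 0.25]
skewed  = [0.70, 0.10, 0.10, 0.10]
print(kl_divergence_bits(uniform, uniform))  # 0.0
print(kl_divergence_bits(skewed, uniform))   # positive: bits lost approximating skewed by uniform
```

Note KL is asymmetric: swapping `p` and `q` generally gives a different value, which is why the VAE loss fixes the direction as KL(q(z|x) || p(z)).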

  • Deriving Loss Function

  • Deriving Loss Function
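The derivation on these slides presumably follows the standard evidence lower bound (ELBO) argument; a sketch in LaTeX notation:

```latex
\log p(x)
  = \mathbb{E}_{q(z|x)}\!\left[\log \tfrac{p(x,z)}{q(z|x)}\right]
    + \mathrm{KL}\big(q(z|x)\,\|\,p(z|x)\big)
  \ge \underbrace{\mathbb{E}_{q(z|x)}\big[\log p(x|z)\big]
    - \mathrm{KL}\big(q(z|x)\,\|\,p(z)\big)}_{\text{ELBO}}
```

The inequality holds because KL divergence is non-negative. Maximizing the ELBO therefore yields the VAE loss: a reconstruction term plus the KL term between the encoder distribution and the prior.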

  • Loss Function

  • The easiest choice is N(0,1)
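With the prior fixed at N(0,1) and a Gaussian encoder q(z|x) = N(mu, sigma^2), the KL term has a well-known closed form, 0.5 * sum(sigma^2 + mu^2 - 1 - log sigma^2) over latent dimensions. A minimal sketch (pure Python; the function name is mine, not from the slides):

```python
import math

def kl_gaussian_to_standard_normal(mu, log_var):
    """Closed-form KL( N(mu, sigma^2) || N(0, 1) ), summed over latent
    dimensions: 0.5 * sum(sigma^2 + mu^2 - 1 - log sigma^2).
    Parameterized by log_var = log sigma^2, as VAE encoders usually are."""
    return 0.5 * sum(math.exp(lv) + m * m - 1.0 - lv
                     for m, lv in zip(mu, log_var))

# An encoder that matches the prior exactly incurs zero KL penalty.
print(kl_gaussian_to_standard_normal([0.0, 0.0], [0.0, 0.0]))  # 0.0
```

Parameterizing by log-variance keeps sigma^2 positive without constraints, which is one reason N(0,1) is "the easiest choice": both the sampling and the penalty stay in closed form.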

  • Keras-VAE

  • Keras-VAE
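The Keras code itself is not reproduced in this transcript. The step that makes a Keras VAE trainable is the reparameterization trick, z = mu + sigma * eps with eps ~ N(0,1); a framework-free sketch of that step (names are illustrative, not from the slides):

```python
import math
import random

def reparameterize(mu, log_var, rng=random):
    """Sample z = mu + sigma * eps with eps ~ N(0, 1).
    Pushing the randomness into eps keeps z a deterministic function of
    mu and log_var, which is what lets a Keras VAE backpropagate
    through the sampling step (typically done in a custom layer)."""
    return [m + math.exp(0.5 * lv) * rng.gauss(0.0, 1.0)
            for m, lv in zip(mu, log_var)]

z = reparameterize([0.0, 1.0], [0.0, 0.0], random.Random(0))
print(z)  # two latent samples centred on mu with unit variance
```

In a Keras implementation this sampling would sit between the encoder outputs (`mu`, `log_var`) and the decoder input, with the closed-form KL term added to the reconstruction loss.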