This paper introduces auto-encoding variational Bayes (AEVB), a generative modeling technique that allows for efficient and scalable approximate inference. The method applies variational inference within the framework of autoencoders to learn a distribution over latent variables: the intractable true posterior is approximated by a recognition model conditioned on the observations. The parameters are estimated by maximizing an evidence lower bound (ELBO) derived using Jensen's inequality. A reparameterization of the latent variables makes the lower-bound estimator differentiable, so the generative and recognition models can be trained jointly by backpropagation. The technique was demonstrated on density estimation tasks with MNIST data.
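The two core computations described above can be illustrated with a minimal sketch. This is not the paper's implementation, only an assumed-Gaussian special case: the recognition model outputs a mean `mu` and log-variance `logvar` for each latent dimension, sampling uses the reparameterization `z = mu + sigma * eps` with `eps ~ N(0, 1)`, and the KL term of the ELBO against a standard-normal prior has the closed form `-0.5 * sum(1 + logvar - mu^2 - exp(logvar))`.

```python
import math
import random

def reparameterize(mu, logvar, eps=None):
    """Sample z = mu + sigma * eps, where sigma = exp(0.5 * logvar).

    Drawing eps from N(0, 1) moves the randomness outside the
    parameters mu and logvar, so gradients can flow through them.
    """
    if eps is None:
        eps = [random.gauss(0.0, 1.0) for _ in mu]
    return [m + math.exp(0.5 * lv) * e for m, lv, e in zip(mu, logvar, eps)]

def kl_to_standard_normal(mu, logvar):
    """Closed-form KL( N(mu, sigma^2) || N(0, 1) ), summed over dimensions.

    This is the regularization term of the ELBO for a Gaussian
    recognition model and a standard-normal prior.
    """
    return -0.5 * sum(1.0 + lv - m * m - math.exp(lv)
                      for m, lv in zip(mu, logvar))

# When the approximate posterior already equals the prior, the KL term is 0.
print(kl_to_standard_normal([0.0, 0.0], [0.0, 0.0]))  # → 0.0
# With a fixed eps the reparameterized sample is deterministic:
print(reparameterize([1.0], [0.0], eps=[0.5]))  # → [1.5]
```

The full objective would add a reconstruction log-likelihood term (estimated from the sampled `z`) to the negative of this KL term; only the KL part is shown here because it is available in closed form.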