Approximate inference via variational sampling
2013-10-09 https://doi.org/10.14419/ijasp.v1i3.1293
Abstract
A new method called "variational sampling" is proposed for estimating integrals under probability distributions that can be evaluated up to a normalizing constant. The key idea is to fit the target distribution with an exponential family model by minimizing a strongly consistent empirical approximation to the Kullback-Leibler divergence, computed using either deterministic or random sampling. It is shown how variational sampling differs conceptually from both quadrature and importance sampling, and it is established that, in the case of random independence sampling, it can achieve much faster stochastic convergence than importance sampling under mild conditions. The implementation presented in this paper requires a rough initial approximation to the target distribution, which may be obtained, for example, by the Laplace method; it is then shown to have the potential to substantially improve over several existing approximate inference techniques for estimating moments of order up to two of nearly-Gaussian distributions, which arise frequently in Bayesian analysis. In particular, an application of variational sampling to Bayesian logistic regression in moderate dimension is presented.
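
As a concrete illustration of the key idea described above, the following is a minimal sketch in Python of the fitting step, under assumed simplifications: a one-dimensional target, a scaled-Gaussian exponential family, a fixed Gaussian sampling distribution standing in for the Laplace-based initialization, and the generalized Kullback-Leibler divergence appropriate for unnormalized measures. The target density `q` and all names here are illustrative, not taken from the paper.

```python
# Hedged sketch of variational sampling: fit a scaled Gaussian to an
# unnormalized target by minimizing an empirical KL divergence estimate.
# The target q and the initial N(0, 1) approximation are hypothetical.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def q(x):
    # Unnormalized, nearly-Gaussian example target (hypothetical).
    return np.exp(-0.5 * x ** 2) / (1.0 + np.exp(-2.0 * x))

rng = np.random.default_rng(0)
n = 10_000
xs = rng.normal(0.0, 1.0, size=n)      # draw once from the initial fit
pi = norm.pdf(xs, 0.0, 1.0)            # sampling (proposal) density
qx = q(xs)

def empirical_kl(theta):
    # Importance-sampling estimate of the divergence between unnormalized
    # measures, D(q, p) = E_pi[(q log(q/p) - q + p) / pi], where
    # p(x) = exp(c) * N(x; m, s^2) is a Gaussian scaled by a free mass
    # exp(c), i.e. an exponential family model.
    c, m, log_s = theta
    px = np.exp(c) * norm.pdf(xs, m, np.exp(log_s))
    return np.mean((qx * np.log(qx / px) - qx + px) / pi)

res = minimize(empirical_kl, x0=np.zeros(3))
c_hat, m_hat, s_hat = res.x[0], res.x[1], np.exp(res.x[2])
print(f"normalizing constant ~ {np.exp(c_hat):.4f}, "
      f"mean ~ {m_hat:.4f}, std ~ {s_hat:.4f}")
```

Because the fitted family carries a free mass parameter, minimizing the divergence yields an estimate of the normalizing constant together with the first two moments, which is consistent with the abstract's stated goal of estimating moments of order up to two.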
Received date: 2013-09-05
Accepted date: 2013-10-06
Published date: 2013-10-09