Simulated Tempering Langevin Monte Carlo II: An Improved Proof using Soft Markov Chain Decomposition
A key task in Bayesian machine learning is sampling from distributions that are only specified up to a partition function (i.e., a constant of proportionality). One prevalent example of this is sampling posteriors in parametric distributions, such as latent-variable generative models. However, sampling (even very approximately) can be #P-hard. Classical results on sampling, going back to Bakry and Émery (1985), focus on log-concave distributions, and show that a natural Markov chain called Langevin diffusion mixes in polynomial time. However, all log-concave distributions are unimodal, while in practice it is very common for the distribution of interest to have multiple modes. In this case, Langevin diffusion suffers from torpid mixing. We address this problem by combining Langevin diffusion with simulated tempering. The result is a Markov chain that mixes more rapidly by transitioning between different temperatures of the distribution. We analyze this Markov chain for a mixture of (strongly) log-concave distributions of the same shape. In particular, our technique applies to the canonical multimodal distribution: a mixture of Gaussians (of equal variance). Our algorithm efficiently samples from these distributions given only access to the gradient of the log-pdf. For the analysis, we introduce novel techniques for proving spectral gaps based on decomposing the action of the generator of the diffusion. Previous approaches rely on decomposing the state space as a partition of sets, while our approach can be thought of as decomposing the stationary measure as a mixture of distributions (a "soft partition"). Additional materials for the paper can be found at http://tiny.cc/glr17. The proof and results have been improved and generalized from the precursor at www.arxiv.org/abs/1710.02736.
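To make the algorithm concrete, here is a minimal sketch of simulated tempering combined with discretized Langevin dynamics on a 1-D mixture of two equal-variance Gaussians. It is an illustration only, not the paper's algorithm as analyzed: the names (`st_lmc`, `betas`, `MEANS`), the temperature ladder, and the step size are illustrative choices, and the temperature-swap Metropolis filter here assumes equal partition functions across levels, whereas the actual algorithm must estimate these normalizing constants. Note the Langevin update uses only the gradient of the log-pdf.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy multimodal target: mixture of two unit-variance Gaussians.
MEANS = np.array([-4.0, 4.0])

def log_pdf(x, beta=1.0):
    """Unnormalized log-density at inverse temperature beta."""
    return beta * np.logaddexp(-(x - MEANS[0]) ** 2 / 2,
                               -(x - MEANS[1]) ** 2 / 2)

def grad_log_pdf(x, beta=1.0):
    """Gradient of the tempered log-density (all the sampler needs)."""
    d2 = (x - MEANS) ** 2 / 2
    w = np.exp(-(d2 - d2.min()))   # stabilized mixture responsibilities
    w /= w.sum()
    return beta * np.sum(w * (MEANS - x))

def st_lmc(n_steps=20000, eta=0.05, betas=(0.1, 0.3, 1.0), swap_every=5):
    """Simulated tempering Langevin Monte Carlo (illustrative sketch)."""
    x, k = 0.0, 0            # position and current temperature index
    samples = []
    for t in range(n_steps):
        # Discretized Langevin step at the current inverse temperature.
        x = x + eta * grad_log_pdf(x, betas[k]) \
              + np.sqrt(2 * eta) * rng.standard_normal()
        if t % swap_every == 0:
            # Propose moving to an adjacent temperature level.
            k2 = k + rng.choice([-1, 1])
            if 0 <= k2 < len(betas):
                # Metropolis filter; assumes equal partition functions
                # across levels (a simplification of the real algorithm).
                if np.log(rng.random()) < log_pdf(x, betas[k2]) - log_pdf(x, betas[k]):
                    k = k2
        if k == len(betas) - 1:  # record samples only at the target temperature
            samples.append(x)
    return np.array(samples)
```

At high temperature (small `beta`) the tempered density is nearly flat, so the chain crosses between modes easily; the swap moves then carry that global exploration back down to the target temperature, where plain Langevin diffusion alone would mix torpidly.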