
We introduce Langevin Monte Carlo methods for estimating expectations of observables under high-dimensional probability measures. We discuss discretization strategies and Metropolization techniques that remove the resulting discretization bias.
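To illustrate the Metropolization idea, the following sketch implements the Metropolis-adjusted Langevin algorithm (MALA): an Euler–Maruyama discretization of overdamped Langevin dynamics whose discretization bias is removed by a Metropolis–Hastings accept/reject step. The standard Gaussian target and step size are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Sketch of MALA: discretized Langevin proposal + Metropolis correction.
# Target (standard Gaussian) and step size h are illustrative assumptions.

rng = np.random.default_rng(0)
d, h, n_steps = 2, 0.5, 20000

def log_pi(x):              # log density of the target, up to a constant
    return -0.5 * x @ x

def grad_log_pi(x):
    return -x

def log_q(y, x):            # log density of the Langevin proposal y given x
    diff = y - x - 0.5 * h * grad_log_pi(x)
    return -(diff @ diff) / (2.0 * h)

x = np.zeros(d)
samples = []
for _ in range(n_steps):
    # Euler-Maruyama step of overdamped Langevin dynamics as a proposal
    y = x + 0.5 * h * grad_log_pi(x) + np.sqrt(h) * rng.standard_normal(d)
    # Metropolis-Hastings ratio removes the discretization bias
    log_alpha = log_pi(y) + log_q(x, y) - log_pi(x) - log_q(y, x)
    if np.log(rng.uniform()) < log_alpha:
        x = y
    samples.append(x)

samples = np.array(samples)
print(samples.mean(axis=0))  # posterior mean estimate, near zero here
```

Dropping the accept/reject step recovers the unadjusted Langevin algorithm, which is cheaper per step but carries an O(h) bias in its stationary distribution.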
We then present a new unbiased method for computing Bayesian posterior means based on kinetic Langevin dynamics, combining advanced splitting methods with enhanced gradient approximations. Our approach avoids Metropolis correction by coupling Markov chains at different discretization levels within a multilevel Monte Carlo framework. We prove that the proposed estimator is unbiased, attains finite variance, and satisfies a central limit theorem. We establish analogous results for both approximate and stochastic gradients and show that our method’s computational cost is independent of the dataset size. Numerical experiments demonstrate that our unbiased algorithm outperforms the “gold-standard” randomized Hamiltonian Monte Carlo.
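The coupling idea underlying multilevel debiasing can be sketched as follows: run discretizations of kinetic (underdamped) Langevin dynamics at step sizes h and h/2, driven by the same Brownian increments, so the two chains stay close and the variance of their difference is small. This is a minimal illustration of the coupling mechanism only, not the paper's splitting scheme; the quadratic potential, friction coefficient, and Euler discretization are assumptions made for the example.

```python
import numpy as np

# Coupled Euler discretizations of kinetic Langevin dynamics:
#   dq = p dt,  dp = -grad_U(q) dt - gamma p dt + sqrt(2 gamma) dW.
# Potential, friction gamma, and step size are illustrative assumptions.

rng = np.random.default_rng(1)
gamma, h, n_coarse = 1.0, 0.05, 2000

def grad_U(q):                    # U(q) = q^2 / 2, a standard Gaussian target
    return q

def euler_step(q, p, dt, dW):
    q_new = q + dt * p
    p_new = p - dt * grad_U(q) - dt * gamma * p + np.sqrt(2 * gamma) * dW
    return q_new, p_new

qc = pc = qf = pf = 0.0           # coarse (step h) and fine (step h/2) chains
for _ in range(n_coarse):
    dW1 = np.sqrt(h / 2) * rng.standard_normal()
    dW2 = np.sqrt(h / 2) * rng.standard_normal()
    qf, pf = euler_step(qf, pf, h / 2, dW1)    # two fine steps share their
    qf, pf = euler_step(qf, pf, h / 2, dW2)    # noise with one coarse step
    qc, pc = euler_step(qc, pc, h, dW1 + dW2)

print(abs(qf - qc))               # coupled chains remain close
```

Because the level difference has small variance, telescoping such differences across levels yields an estimator whose expectation matches the continuous-time limit without a Metropolis correction.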