Exploration of the (non-)asymptotic bias and variance of stochastic gradient Langevin dynamics

Abstract

Applying standard Markov chain Monte Carlo (MCMC) algorithms to large data sets is computationally infeasible. The recently proposed stochastic gradient Langevin dynamics (SGLD) method circumvents this problem in three ways: it generates proposed moves using …
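The SGLD update described in the abstract replaces the full-data gradient with a rescaled minibatch estimate, injects Gaussian noise, and skips the Metropolis-Hastings accept-reject step. A minimal sketch on a toy model (posterior over a Gaussian mean with known unit variance and an assumed N(0, 10) prior; all names and parameters here are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

N = 1000                          # full data set size
data = rng.normal(2.0, 1.0, N)    # observations x_i ~ N(theta_true, 1)

def grad_log_prior(theta):
    # assumed N(0, 10) prior on theta
    return -theta / 10.0

def grad_log_lik(theta, batch):
    # Gaussian likelihood N(x | theta, 1): sum of per-point gradients
    return np.sum(batch - theta)

def sgld(n_iter=5000, batch_size=50, eps=1e-3):
    """One SGLD chain with a fixed step size eps."""
    theta = 0.0
    samples = np.empty(n_iter)
    for t in range(n_iter):
        batch = rng.choice(data, batch_size, replace=False)
        # unbiased full-data gradient estimate: rescale the minibatch
        # likelihood term by N / batch_size
        grad = grad_log_prior(theta) + (N / batch_size) * grad_log_lik(theta, batch)
        # Langevin step: drift plus injected noise, no accept-reject
        theta = theta + 0.5 * eps * grad + rng.normal(0.0, np.sqrt(eps))
        samples[t] = theta
    return samples

samples = sgld()
print(samples[1000:].mean())      # should be near the posterior mean, ~2.0
```

With a fixed step size the chain carries a discretization bias (the subject of the paper); the decreasing step-size schedule mentioned in the abstract trades that bias against variance.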

Type
Publication
J. Mach. Learn. Res.
Sebastian Vollmer
Professor for Applications of Machine Learning

My research interests lie at the interface of applied probability, statistical inference and machine learning.