Proximal MCMC for Bayesian Inference of Constrained and Regularized Estimation.

Journal: The American Statistician
Published Date:

Abstract

This paper advocates proximal Markov Chain Monte Carlo (ProxMCMC) as a flexible and general Bayesian inference framework for constrained or regularized estimation. Originally introduced in the Bayesian imaging literature, ProxMCMC employs the Moreau-Yosida envelope to construct a smooth approximation of the total-variation regularization term, fixes the variance and regularization strength parameters at constants, and uses the Langevin algorithm for posterior sampling. We extend ProxMCMC to be fully Bayesian by providing data-adaptive estimation of all parameters, including the regularization strength parameter. More powerful sampling algorithms such as Hamiltonian Monte Carlo are employed to scale ProxMCMC to high-dimensional problems. Analogous to proximal algorithms in optimization, ProxMCMC offers a versatile and modular procedure for conducting statistical inference on constrained and regularized problems. The power of ProxMCMC is illustrated on various statistical estimation and machine learning tasks whose inference is traditionally considered difficult from both frequentist and Bayesian perspectives.
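To make the abstract's core idea concrete, the following is a minimal sketch of a Moreau-Yosida-smoothed unadjusted Langevin sampler (often called MYULA) for a Gaussian likelihood with an l1 prior, a standard example of the proximal-MCMC recipe the abstract describes. This is a hypothetical illustration under assumed settings (fixed smoothing parameter `lam`, step size `gamma`, and prior strength `alpha`), not the authors' implementation; in particular, the paper's fully Bayesian extension would sample `alpha` rather than fix it.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1 (componentwise soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def myula_lasso(X, y, alpha, lam=0.01, gamma=1e-4, n_iter=5000, seed=0):
    """MYULA sketch: target is proportional to
    exp(-0.5 ||y - X beta||^2 - alpha ||beta||_1),
    with the nonsmooth l1 term replaced by its Moreau-Yosida envelope.
    lam is the smoothing parameter, gamma the Langevin step size.
    Hypothetical illustration, not the authors' code."""
    rng = np.random.default_rng(seed)
    p = X.shape[1]
    beta = np.zeros(p)
    samples = []
    for _ in range(n_iter):
        # Gradient of the smooth likelihood term 0.5 ||y - X beta||^2.
        grad_f = X.T @ (X @ beta - y)
        # Gradient of the Moreau-Yosida envelope of alpha * ||.||_1:
        # (beta - prox_{lam * alpha * ||.||_1}(beta)) / lam.
        grad_env = (beta - soft_threshold(beta, lam * alpha)) / lam
        # Unadjusted Langevin step with injected Gaussian noise.
        beta = beta - gamma * (grad_f + grad_env) \
               + np.sqrt(2.0 * gamma) * rng.standard_normal(p)
        samples.append(beta.copy())
    return np.array(samples)
```

As `lam` shrinks, the envelope gradient approaches a subgradient of the l1 term, trading approximation bias for gradient smoothness; the step size `gamma` must shrink accordingly for stability.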

Authors

  • Xinkai Zhou
    Department of Biostatistics, UCLA.
  • Qiang Heng
    Department of Computational Medicine, UCLA.
  • Eric C Chi
    Department of Statistics, Rice University.
  • Hua Zhou
    Department of Biostatistics, UCLA.
