
Statistics Seminar

Extending the Scope of Nonparametric Empirical Bayes presented by Bodhisattva Sen

Event Details

Date
Monday, October 2, 2023
Time
4-5 p.m.
Description

Abstract: In this talk we will describe two applications of empirical Bayes (EB) methodology. EB procedures estimate the prior probability distribution (in a Bayesian statistical model) from the data. In the first part we study the (Gaussian) signal-plus-noise model with multivariate, heteroscedastic errors. This model arises in many large-scale denoising problems (e.g., in astronomy). We consider the nonparametric maximum likelihood estimator (NPMLE) in this setting and study its characterization, uniqueness, and computation; the NPMLE estimates the unknown (arbitrary) prior by solving an infinite-dimensional convex optimization problem. The EB posterior means based on the NPMLE have low regret, meaning they closely target the oracle posterior means one would compute with the true prior in hand. We demonstrate the adaptive and near-optimal properties of the NPMLE for density estimation, denoising, and deconvolution.
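
For context, a standard formulation of this NPMLE (in our notation, which may differ from the talk's): given independent observations X_i ~ N_d(theta_i, Sigma_i) with known covariances Sigma_i and the means theta_i drawn i.i.d. from an unknown prior G, the NPMLE solves

    \hat{G} \in \arg\max_{G} \sum_{i=1}^{n} \log \int_{\mathbb{R}^d} \phi_{\Sigma_i}(X_i - \theta) \, dG(\theta),

where \phi_{\Sigma} denotes the N_d(0, \Sigma) density and the maximum runs over all probability distributions G on \mathbb{R}^d. The corresponding EB denoising estimate is the plug-in posterior mean \hat{\theta}_i = E_{\hat{G}}[\theta \mid X_i].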

In the second part of the talk, we consider the problem of Bayesian high-dimensional regression where the regression coefficients are drawn i.i.d. from an unknown prior. To estimate this prior distribution, we propose and study a "variational empirical Bayes" approach: it combines EB inference with a variational approximation (VA). The idea is to approximate the intractable marginal log-likelihood of the response vector (also known as the "evidence") by the evidence lower bound (ELBO) obtained from a naive mean field (NMF) approximation. We then maximize this lower bound over a suitable class of prior distributions in a computationally feasible way. We show that the marginal log-likelihood function can be (uniformly) approximated by its mean field counterpart. More importantly, under suitable conditions, we establish that this strategy yields a consistent approximation of the true posterior and provides asymptotically valid posterior inference for the regression coefficients.
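
As a rough sketch of the objective involved (again in our notation, under the standard assumptions y = X\beta + \varepsilon, \varepsilon ~ N(0, \sigma^2 I_n), and \beta_1, ..., \beta_p i.i.d. from an unknown prior g): the evidence \log p(y; g) is bounded below by the naive mean field ELBO,

    \log p(y; g) \ge \mathrm{ELBO}(Q, g) = E_Q[\log p(y \mid \beta)] - \mathrm{KL}(Q \,\|\, g^{\otimes p}),  where  Q = \prod_{j=1}^{p} q_j,

and the variational EB procedure maximizes this bound jointly over the product measure Q and over g in a suitable class of priors; the optimized factors q_j then serve as approximate marginal posteriors for the coefficients.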

Cost
Free
