Statistics Seminar
In this talk, a set of scalable Bayesian inference procedures is developed for a general class of nonparametric regression models based on embarrassingly parallel MCMC. Specifically, we first perform independent nonparametric Bayesian inference on each subset of a partitioned massive dataset, and then aggregate those local results into their global counterparts. This aggregation step is explicit and incurs essentially no additional computational cost. Under a careful partition of the data, we show that our aggregated inference results obey the oracle rule, in the sense that they are equivalent to those obtained directly from the entire dataset (which is computationally prohibitive in practice). For example, aggregated credible balls attain the desirable credibility level and frequentist coverage of their oracle counterparts, with comparable radii. This oracle matching phenomenon arises from a delicate geometric structure of the infinite-dimensional parameter space under consideration. This theoretical talk is based on http://arxiv.org/abs/1508.04175
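To illustrate the split-then-aggregate workflow described above, here is a minimal sketch in Python. It uses a toy conjugate Gaussian-mean model in place of the talk's nonparametric regression setting, and a generic consensus-style (precision-weighted average) combination of subset draws as a stand-in for the talk's explicit aggregation rule; all function names and parameters are illustrative assumptions, not from the paper.

```python
# Sketch of embarrassingly parallel Bayesian inference on a toy conjugate model.
# Assumptions: Gaussian mean with known variance, tempered prior across shards,
# and consensus-style aggregation of subset draws (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

def subset_posterior_draws(y, sigma2=1.0, prior_var=100.0, n_draws=5000, K=1):
    """Draw from the subset posterior of a Gaussian mean.

    The prior is raised to the power 1/K (variance inflated by K) so that the
    product of the K subset posteriors matches the full-data posterior, a
    common device in embarrassingly parallel MCMC.
    """
    n = len(y)
    prior_prec = 1.0 / (prior_var * K)           # tempered prior precision
    post_prec = prior_prec + n / sigma2
    post_mean = (y.sum() / sigma2) / post_prec
    return rng.normal(post_mean, np.sqrt(1.0 / post_prec), size=n_draws)

def aggregate(draws_list):
    """Consensus-style aggregation: precision-weighted average of the subset
    draws (exact when the subset posteriors are Gaussian)."""
    weights = np.array([1.0 / np.var(d) for d in draws_list])
    stacked = np.stack(draws_list)               # shape (K, n_draws)
    return (weights[:, None] * stacked).sum(axis=0) / weights.sum()

# Simulate a "massive" dataset and split it into K shards.
K = 8
y_full = rng.normal(loc=2.0, scale=1.0, size=80_000)
shards = np.array_split(y_full, K)

# Step 1: independent inference on each shard (trivially parallelizable).
subset_draws = [subset_posterior_draws(s, K=K) for s in shards]

# Step 2: explicit aggregation of the local results into a global posterior.
global_draws = aggregate(subset_draws)

# Oracle comparison: posterior computed directly from the entire dataset.
oracle_draws = subset_posterior_draws(y_full, K=1)
print("aggregated mean/sd:", global_draws.mean(), global_draws.std())
print("oracle mean/sd:    ", oracle_draws.mean(), oracle_draws.std())
```

In this sketch the aggregated and oracle posterior draws should agree closely, mirroring the oracle matching described in the abstract; the weighted-average combination is exact only in the Gaussian case, whereas the talk's aggregation rule is designed for the nonparametric setting.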