Data Analysis Recipes: Products of multivariate Gaussians in Bayesian inferences. (arXiv:2005.14199v1 [stat.CO])
David W. Hogg (NYU) (MPIA) (Flatiron), Adrian M. Price-Whelan (Flatiron), Boris Leistedt (Imperial) (NYU)

A product of two Gaussians (or normal distributions) is another Gaussian.
That’s a valuable and useful fact! Here we use it to derive a refactoring of a
common product of multivariate Gaussians: the product of a Gaussian likelihood
times a Gaussian prior, where some or all of the model parameters enter the
likelihood only through the mean, and only linearly. That is, a linear, Gaussian,
Bayesian model. This product of a likelihood times a prior pdf can be
refactored into a product of a marginalized likelihood (or a Bayesian evidence)
times a posterior pdf, where (in this case) both of these are also Gaussian.
The means and variance tensors of the refactored Gaussians are straightforward
to obtain as closed-form expressions; here we deliver these expressions, with
discussion. The closed-form expressions can be used to speed up and improve the
precision of inferences that contain linear parameters with Gaussian priors. We
connect these methods to inferences that arise frequently in physics and
astronomy.

If all you want is the answer, the question is posed and answered at the
beginning of Section 3. We show two toy examples, in the form of worked
exercises, in Section 4. The solutions, discussion, and exercises in this Note
are aimed at someone who is already familiar with the basic ideas of Bayesian
inference and probability.
