
If I have the following model:

$$y\sim N_n(X\beta, \sigma^2 I_n)$$ with prior distributions:

$$\beta\sim N_n(\beta_0, B_0)$$ and $$\sigma^2 \sim IG(\alpha_0/ 2, \delta_0/2)$$

What would be the posterior of $\theta=(\beta,\sigma^2)$?

user2246905
    I didn't locate the duplicate I was after, but here's a pointer to some related answers ... one of the main parts of the calculation is discussed in the answers [Bayesian regression full conditional distribution](http://stats.stackexchange.com/questions/159990/bayesian-regression-full-conditional-distribution) and [Posterior Distribution for Bayesian Linear Regression](http://stats.stackexchange.com/questions/43903/posterior-distribution-for-bayesian-linear-regression/43907#43907) – Glen_b Dec 14 '15 at 03:38

1 Answer


Under the conjugate setup where the prior on $\beta$ scales with $\sigma^2$, i.e. $\beta\mid\sigma^2\sim N(\beta_0, \sigma^2 B_0)$, the joint posterior is a Normal-Inverse Gamma with parameters $(\mu_n, B_n, \alpha_n, \delta_n)$, so that $\beta\mid\sigma^2, y \sim N(\mu_n, \sigma^2 B_n)$ and $\sigma^2\mid y \sim IG(\alpha_n/2, \delta_n/2)$, where:
$$B_n = (B_0^{-1} + X^TX)^{-1}$$ $$\mu_n = B_n(B_0^{-1}\beta_0 + X^Ty)$$ $$\alpha_n = \alpha_0 + n$$ $$\delta_n = \delta_0 + \beta_0^TB_0^{-1}\beta_0 + y^Ty - \mu_n^TB_n^{-1}\mu_n$$
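
As a quick illustration, here is a small sketch of these updates in Python/NumPy, assuming the conjugate setup above; the function and variable names (`X`, `y`, `beta0`, `B0`, `alpha0`, `delta0`) are mine, not from the original post:

```python
import numpy as np

def nig_posterior(X, y, beta0, B0, alpha0, delta0):
    """Posterior parameters (mu_n, B_n, alpha_n, delta_n), assuming
    beta | sigma^2 ~ N(beta0, sigma^2 * B0) and sigma^2 ~ IG(alpha0/2, delta0/2)."""
    n = len(y)
    B0_inv = np.linalg.inv(B0)
    Bn_inv = B0_inv + X.T @ X          # posterior precision (up to sigma^2)
    B_n = np.linalg.inv(Bn_inv)
    mu_n = B_n @ (B0_inv @ beta0 + X.T @ y)
    alpha_n = alpha0 + n
    delta_n = delta0 + beta0 @ B0_inv @ beta0 + y @ y - mu_n @ Bn_inv @ mu_n
    return mu_n, B_n, alpha_n, delta_n
```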

Deriving these parameters can be beastly (see here for more details). With the independent priors stated in the question, the joint posterior does not reduce to a standard family, so usually people solve for the full conditionals of $\beta$ and $\sigma^{2}$ (each takes a manageable closed-form distribution) and use Gibbs sampling to draw from the joint posterior, as sketched below.
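
For the independent priors actually stated in the question ($\beta\sim N(\beta_0, B_0)$ and $\sigma^2\sim IG(\alpha_0/2,\delta_0/2)$), a minimal Gibbs-sampling sketch in Python could look like the following; again, the function and variable names are hypothetical:

```python
import numpy as np

def gibbs_sampler(X, y, beta0, B0, alpha0, delta0, n_iter=5000, rng=None):
    """Gibbs sampler for beta ~ N(beta0, B0) independent of sigma^2 ~ IG(alpha0/2, delta0/2)."""
    rng = np.random.default_rng() if rng is None else rng
    n, p = X.shape
    B0_inv = np.linalg.inv(B0)
    XtX, Xty = X.T @ X, X.T @ y

    beta, sigma2 = np.zeros(p), 1.0                     # starting values
    beta_draws, sigma2_draws = [], []
    for _ in range(n_iter):
        # beta | sigma^2, y ~ N(m, V)
        V = np.linalg.inv(B0_inv + XtX / sigma2)
        m = V @ (B0_inv @ beta0 + Xty / sigma2)
        beta = rng.multivariate_normal(m, V)
        # sigma^2 | beta, y ~ IG((alpha0 + n)/2, (delta0 + RSS)/2)
        resid = y - X @ beta
        shape, rate = (alpha0 + n) / 2, (delta0 + resid @ resid) / 2
        sigma2 = 1.0 / rng.gamma(shape, 1.0 / rate)     # draw an IG as 1 / Gamma
        beta_draws.append(beta)
        sigma2_draws.append(sigma2)
    return np.array(beta_draws), np.array(sigma2_draws)
```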

ilanman
  • Thanks for your answer. What about if I change the prior of $\beta$ to be a multivariate t distribution $t_\nu(\beta_0, B_0)$? – user2246905 Dec 15 '15 at 04:05
  • I think that's not correct. The above question does not assume that $\beta$ depends on $\sigma^2$. Your answer only works for $\beta\sim N_n(\beta_0, \sigma^2 B_0)$. – Frederik Ziebell Jun 25 '20 at 21:10