
I might be wrong, but I feel that the following case is different from the problem of modelling observations with a conjugate prior:

Suppose I have $n$ different Gaussians each with a different (but known) mean $\mu_i$ and variance $v_i$:

$N(\mu_i, v_i),\quad i = 1, \dots, n$

So here the observations are actually the parameters themselves: $\mu_i, v_i$.

How should I model the prior distribution of these $n$ Gaussians, so that I can generate another "similar" Gaussian from this prior distribution?

Shockley
  • when you said "known", do you intend that the means and variances are known realizations of random variables, or that they are not random? – niandra82 Aug 04 '14 at 07:46
  • By that I simply mean their values are fixed and known, say, $\mu_3 = 0$, $v_3 = 1$, etc. – Shockley Aug 04 '14 at 08:02
  • Then there are no random variables. The distributions are not random (they are normal), and the parameters of the distributions are not random (they are fixed). If you have no random variables, how can you put prior distributions on anything? – niandra82 Aug 04 '14 at 08:05
  • These parameters are random variables (of certain prior I am asking about), the $n$ sets of fixed values I mentioned are just $n$ observations of these random parameters. – Shockley Aug 04 '14 at 08:15
  • It is a bit confusing. Let me see if I understand: you have $n$ values $\mu_i$ and $v_i$, you know these values and they cannot be changed, ok? – niandra82 Aug 04 '14 at 08:21
  • Yes, these values are observations (of the corresponding random variables), so they can't be changed. But I want to use these observations to estimate the distribution they are drawn from. – Shockley Aug 04 '14 at 08:28
  • Let us [continue this discussion in chat](http://chat.stackexchange.com/rooms/16186/discussion-between-shockley-and-niandra82). – Shockley Aug 04 '14 at 08:32

1 Answer


If your $\mu_i$ and $v_i$ are observations, then you have to define a likelihood for them, something like $\mu_i \sim N(M, V)$ and $v_i \sim IG(a, b)$. In this case you assume that all the $\mu_i$ come from a normal distribution with mean $M$ and variance $V$, while the $v_i$ come from an inverse gamma with parameters $a, b$. Then you can put priors on the values $M, V, a, b$, and using MCMC you can obtain posterior samples of $M, V, a, b$. With the posterior samples you can in fact simulate other $\mu$ and $v$ "similar" to the ones observed.
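As a rough illustration of this recipe, here is a minimal sketch using a random-walk Metropolis sampler. Everything concrete in it is an assumption for the sake of the example: the synthetic "observed" $\mu_i, v_i$, the flat priors on $(M, \log V, \log a, \log b)$, and the proposal step size are all arbitrary choices, not something the answer prescribes:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical "observed" parameters of n known Gaussians, drawn from
# mu_i ~ N(M, V), v_i ~ InvGamma(a, b) with true M=2, V=4, a=3, b=4.
n = 200
mu_obs = rng.normal(2.0, 2.0, size=n)
v_obs = stats.invgamma.rvs(3.0, scale=4.0, size=n, random_state=rng)

def log_post(theta):
    """Log-posterior of theta = (M, log V, log a, log b),
    with flat priors on these transformed hyperparameters."""
    M, logV, loga, logb = theta
    V, a, b = np.exp(logV), np.exp(loga), np.exp(logb)
    lp = stats.norm.logpdf(mu_obs, M, np.sqrt(V)).sum()
    lp += stats.invgamma.logpdf(v_obs, a, scale=b).sum()
    return lp

# Random-walk Metropolis over the four hyperparameters.
theta = np.zeros(4)
lp = log_post(theta)
draws = []
for i in range(6000):
    prop = theta + 0.05 * rng.standard_normal(4)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
        theta, lp = prop, lp_prop
    if i >= 1000:                              # discard burn-in
        draws.append(theta.copy())
draws = np.array(draws)

# Simulate a new "similar" Gaussian from one posterior draw.
M, logV, loga, logb = draws[rng.integers(len(draws))]
mu_new = rng.normal(M, np.sqrt(np.exp(logV)))
v_new = stats.invgamma.rvs(np.exp(loga), scale=np.exp(logb), random_state=rng)
```

The pair `(mu_new, v_new)` then defines a new Gaussian $N(\mu_\text{new}, v_\text{new})$ drawn from the fitted population of Gaussians; in practice one would use a proper MCMC package and informative priors rather than this hand-rolled sampler.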

niandra82
  • Thanks, I believe this is what I'm looking for. However, how to choose the priors for $M,V,a,b$ (there is [an unanswered post](http://stats.stackexchange.com/questions/82185/prior-elicitation-with-normal-gamma-or-normal-inverse-gamma)) is the next hard problem to handle. – Shockley Aug 04 '14 at 20:23
  • 1
    The prior should represent your prior information, so it is up to you. If you instead want distributions that make you easy the implementation of the gibbs sampler, i will suggest $M \sim N()$, $V \sim IG()$ and maybe $b \sim G()$, but for the last I am not sure... – niandra82 Aug 04 '14 at 20:35