
When using the "non-informative" prior $\pi(\mu,\sigma^2)\propto\frac{1}{\sigma^2}$, i.e. $\pi(\mu)\propto 1$ and $\pi(\sigma^2)\propto\frac{1}{\sigma^2}$:

Where is the "no information" about the parameters?

And if it were an informative prior, where would one see the 'information' given?

I saw the definition on Wikipedia (https://en.wikipedia.org/wiki/Prior_probability), which says: 'An informative prior expresses specific, definite $\color{blue}{\text{information}}$ about a variable.'

My question is: where is this $\color{blue}{\text{information}}$ given in my particular example, for instance?

Could you help please?
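To make the question concrete, here is a minimal sketch of what I mean, assuming (just for illustration) an i.i.d. normal sample $x_1,\dots,x_n \sim N(\mu,\sigma^2)$. Under the prior above, the posterior is

$$p(\mu,\sigma^2 \mid x) \;\propto\; \frac{1}{\sigma^2}\,(\sigma^2)^{-n/2}\exp\!\left(-\frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i-\mu)^2\right),$$

so the prior contributes no location or scale of its own and the shape of the posterior comes from the data. If instead I used an informative conjugate prior, say $\mu\mid\sigma^2 \sim N(\mu_0,\sigma^2/\kappa_0)$ and $\sigma^2 \sim \text{Inv-Gamma}(a_0,b_0)$ with hyperparameters $\mu_0,\kappa_0,a_0,b_0$ chosen by me, the posterior would be

$$p(\mu,\sigma^2 \mid x) \;\propto\; (\sigma^2)^{-1/2}\exp\!\left(-\frac{\kappa_0}{2\sigma^2}(\mu-\mu_0)^2\right)(\sigma^2)^{-(a_0+1)}e^{-b_0/\sigma^2}\,(\sigma^2)^{-n/2}\exp\!\left(-\frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i-\mu)^2\right),$$

where the hyperparameters look like the place the 'information' enters. Is that the right way to see it, or is the 'information' something else?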

kjetil b halvorsen
user208618

  • I don't think this question is a duplicate. The proposed duplicate is asking about the rationale for using informative vs. uninformative priors, whereas this question seems to be asking about how to identify informative vs. uninformative priors. – user20160 Feb 11 '19 at 00:02
  • @user20160 I don't think this is a duplicate neither. Though I didn't mean to ask how to identify informative vs. uninformative priors. I meant the current questions actually :) . I saw the definitions on Wikipedia https://en.wikipedia.org/wiki/Prior_probability and there mentions _An informative prior expresses specific, definite information about a variable._ where is this information given? – user208618 Feb 11 '19 at 01:17
  • @user20160 Your question 'how to identify informative vs. uninformative priors' is an interesting one btw. – user208618 Feb 11 '19 at 01:20
  • @Isa I see. You might consider editing your question to include this. This could help clear up the duplicate issue, and help people understand exactly what you're asking. – user20160 Feb 11 '19 at 01:42
  • @user20160 I've edited – user208618 Feb 11 '19 at 02:03
  • There are several entries on X validated that discuss why there is no such thing as a no-information prior: [What is the point of non-informative priors?](https://stats.stackexchange.com/q/27813/7224), [History of uninformative prior theory](https://stats.stackexchange.com/q/246736/7224), [Why are Jeffreys priors considered noninformative?](https://stats.stackexchange.com/q/7519/7224), and [What is an “uninformative prior”? Can we ever have one with truly no information?](https://stats.stackexchange.com/q/20520/7224). – Xi'an Feb 11 '19 at 06:09
  • @Xi'an Notice that I didn't ask 'why there is no such thing as a no-information prior'. My question is: where is this $\color{blue}{\text{information}}$ given in my particular example, for instance? – user208618 Feb 11 '19 at 06:16
  • I still think this large collection of detailed and pertinent answers addresses the most elusive notion of _information_ that your question seeks. Rather than asking _where_ the information is, you should consider _what_ information means in this context. – Xi'an Feb 11 '19 at 08:35
  • @Xi'an OK. I think the question of what _information_ means in this context implicitly answers my original question. I'll read the 5 linked questions with their many, many answers, then. – user208618 Feb 11 '19 at 19:44
  • @Xi'an In your answer to this question https://stats.stackexchange.com/questions/27813/what-is-the-point-of-non-informative-priors you say 'they represent an input from the statistician, hence are informative about something!' and then 'Those priors indeed give a reference against which one can compute either the reference estimator/test/prediction'. So by _informative about something_, do you mean the arbitrary distribution given to the prior by the statistician? – user208618 Feb 12 '19 at 23:52
  • When choosing a prior distribution, whether proper or not, one sets a measure over the parameter space. This measure $\mu$ defines the volumes $\mu(A)$ of measurable sets and as such is informative about which measurable sets are weighted more than others, etc. (see the worked illustration after these comments). – Xi'an Feb 13 '19 at 08:35
  • @Xi'an Is there another way to explain this without measures? I don't understand what you mean in your comment. Also, I read the other 4 links that you mentioned in your previous comment and I couldn't find the answer to my question. – user208618 Feb 13 '19 at 18:39
  • The shortest possible answer is that information is not a mathematically well-defined concept and that many versions have been proposed in the literature, differing from one field to the next and from one era to the next. – Xi'an Feb 13 '19 at 18:55
  • @Xi'an I am not looking for a mathematically well-defined concept, just an understandable answer. You said that I was going to find an answer in the links, and there's nothing there. It was not fair of you or the others to mark my question as a [duplicate], then. – user208618 Feb 13 '19 at 19:08
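
A small worked illustration of the point in Xi'an's comment above, using my own numbers and only the marginal prior $\pi(\sigma^2)\propto\frac{1}{\sigma^2}$ from the question (a sketch, not anyone's answer). Writing $v=\sigma^2$, the prior weight of an interval $(a,b)$ is

$$\int_a^b \frac{1}{v}\,dv = \ln\frac{b}{a},$$

so, for example, $\sigma^2\in(1,2)$ receives weight $\ln 2\approx 0.69$ while $\sigma^2\in(9,10)$ receives only $\ln\frac{10}{9}\approx 0.11$, even though both intervals have length 1; by contrast, $(0.1,1)$ and $(1,10)$ receive equal weight $\ln 10$. In this sense even the "non-informative" prior encodes an assumption: indifference on the log scale rather than on the raw scale, which is one way to read the statement that it is still informative about which sets are weighted more than others.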

0 Answers