
I was in statistics class today and the professor wrote "consistent estimator". The first thing I thought of was that the range of the estimator should be a subset of the set of plausible population parameters (this was not the definition). Is there a name for estimators whose range is larger than the set of plausible parameters? Does anyone ever use such strange estimators in applications? For example, we could consider $(1-a)X + aV$, where $X$ and $V$ are unbiased estimators of some real parameter. A complex value of $a$ would yield an unbiased estimator with complex outputs! (Note: the usual definition of an estimator would exclude this; it entails loosening the definition of an estimator to simply a function on the possible data sets.)
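
To spell the example out (a quick check using nothing but linearity of expectation): if $E[X] = E[V] = \theta$, then for any fixed $a$, real or complex,

$$E\big[(1-a)X + aV\big] = (1-a)E[X] + aE[V] = (1-a)\theta + a\theta = \theta,$$

so the combination remains unbiased even though its values need not lie in the parameter space.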

Jacob Wakem
  • A conventional definition of an estimator is that it is a function from the set of possible data values to the set of possible parameter values. If by "plausible" you mean "all," then according to this there is no such thing as an estimator with a larger range. Do you perhaps have a broader definition of "estimator" or a narrower definition of "plausible" in mind? – whuber Mar 03 '14 at 20:05
  • @whuber, I guess my definition of estimator is broader (the course skirted the issue). By plausible I mean conforming to the assumptions, if any, about which family of distributions you are in. I don't say possible because we work with a particular population with a particular distribution. – Jacob Wakem Mar 03 '14 at 21:22
  • Please explain precisely what your definition of an estimator is, then. Unless you disclose it we won't even know what you're asking! – whuber Mar 03 '14 at 21:40
  • For example, a negative standard deviation is always implausible. If you know you have a normal distribution with positive mean (this is a family, just like the set of all normal distributions), then a negative or zero mean is implausible. Families of distributions are often defined in terms of a parameter with some bound. – Jacob Wakem Mar 03 '14 at 21:41
  • @whuber: a function from the set of possible data sets. – Jacob Wakem Mar 03 '14 at 21:47
  • Of course you might not know the possible data sets, so it's usually expressed in terms of plausible data sets. – Jacob Wakem Mar 03 '14 at 22:01
  • That's too vague, Jacob: the issue here is *what is the codomain of that function*? And once again you use the word "plausible" but you haven't yet defined it. Your examples suggest it doesn't add any information; it's just a synonym for "any." – whuber Mar 03 '14 at 22:14
  • @whuber, it's not too vague. I leave the codomain be. Interesting estimators might have low variance or no bias. – Jacob Wakem Mar 03 '14 at 22:37
  • You start with assumptions like normality. Plausible parameters conform to these and to the requirement that you have a valid distribution (total probability is 1, etc.). – Jacob Wakem Mar 03 '14 at 22:45
  • let us [continue this discussion in chat](http://chat.stackexchange.com/rooms/13356/discussion-between-whuber-and-jacob-wakem) – whuber Mar 03 '14 at 23:24
  • @whuber: actually (from memory) Casella & Berger define an estimator as *any function of the data*, with a comment along the lines that it is difficult to make a better definition without making it too restrictive. (In some cases, in models with multiple variance parameters, some variance estimators can come out negative; see the sketch after these comments.) One should be able to study them without outlawing them at the outset. – kjetil b halvorsen Sep 12 '18 at 20:34
  • @Kjetil It is interesting that you bring up this issue again just now, because recently I posted an answer that ends by pointing out that in constructing an unbiased estimator it was necessary that the image of the estimator include impossible values of the parameter. See https://stats.stackexchange.com/a/366007/919. I imagine that C&B would have insisted all estimators be measurable, though :-). – whuber Sep 12 '18 at 21:14
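
Not part of the thread, just a minimal sketch of the negative-variance point raised in the comments: in a one-way random effects model $y_{ij} = \mu + b_i + e_{ij}$, the classical ANOVA (method-of-moments) estimator of the between-group variance, $(\mathrm{MSB} - \mathrm{MSW})/n$, is unbiased yet can come out negative, i.e. outside the parameter space. The simulation below (group sizes and variance values are arbitrary choices for illustration) shows how often that happens.

```python
import numpy as np

# One-way random effects model: y_ij = mu + b_i + e_ij,
# with b_i ~ N(0, sigma_b^2) and e_ij ~ N(0, sigma_e^2).
# mu is taken as 0; it cancels out of both mean squares.
rng = np.random.default_rng(0)

k, n = 5, 10                     # k groups, n observations per group
sigma_b2, sigma_e2 = 0.05, 1.0   # small between-group variance, unit noise

negatives = 0
reps = 1000
for _ in range(reps):
    b = rng.normal(0.0, np.sqrt(sigma_b2), size=k)                 # group effects
    y = b[:, None] + rng.normal(0.0, np.sqrt(sigma_e2), size=(k, n))
    group_means = y.mean(axis=1)
    msb = n * np.sum((group_means - y.mean()) ** 2) / (k - 1)      # between-group MS
    msw = np.sum((y - group_means[:, None]) ** 2) / (k * (n - 1))  # within-group MS
    sigma_b2_hat = (msb - msw) / n   # unbiased, but not restricted to [0, inf)
    negatives += sigma_b2_hat < 0

print(f"negative variance estimates in {negatives} of {reps} simulations")
```

This is exactly the kind of estimator the comments describe: forcing its codomain into the nonnegative reals (say, by truncating at zero) would introduce bias, so it is usually studied as-is.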

0 Answers