
I need to prove that the expectation of absolute deviation is minimized by the median.

We are given that $$\operatorname{med}(y\mid x) = \beta_0 + \beta_1 x$$ and that $x$ can take on three values with positive probability: $\tau_1, \tau_2, \tau_3$. Here $x$ is a scalar random variable, and $y$ given $x$ is continuously distributed.

So we need to show that $S(b_0, b_1)=E[|y-b_0-b_1x|]$ is minimized at $(b_0, b_1) = (\beta_0, \beta_1)$, for instance by showing that the first-order condition holds there. Also, is this minimizer unique?

I tried to take a conditional expectation using the LIE: $$E[|y-b_0-b_1x|]=E\big[\,E[\,|y-b_0-b_1x|\mid x\,]\,\big],$$ and then split the inner expectation into two terms, one over the region where the expression inside the absolute value is positive and one over the region where it is negative, weighted by the respective probabilities. Doing this, I get the answer I am looking for.
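Concretely, writing $c(x) = b_0 + b_1 x$ and assuming $y$ given $x$ has a density $f_{y|x}$, the split I have in mind is (my own sketch):

$$E\big[\,|y-c(x)|\ \big|\ x\big]=\int_{c(x)}^{\infty}\big(y-c(x)\big)f_{y|x}(y)\,dy+\int_{-\infty}^{c(x)}\big(c(x)-y\big)f_{y|x}(y)\,dy,$$

and differentiating with respect to $c(x)$ (the boundary terms cancel) gives

$$\frac{\partial}{\partial c}\,E\big[\,|y-c|\ \big|\ x\big]=P(y<c\mid x)-P(y>c\mid x),$$

which is zero exactly when $c$ is the conditional median, i.e. when $b_0+b_1x=\beta_0+\beta_1x$.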

However, I was told that I cannot use the LIE when there is an absolute value inside the expectation. So how do I do this? Also, why can't we use the LIE here?

akeenlogician
  • See this post for a proof: http://stats.stackexchange.com/questions/270428/how-do-i-show-that-the-sample-median-minimizes-the-sum-of-absolute-deviations – kjetil b halvorsen Apr 03 '17 at 11:10
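As a quick numerical sanity check of the fact in that linked post, here is a minimal sketch (my own, with made-up data) comparing the sum of absolute deviations at the sample median against a grid of other candidate centers:

```python
import numpy as np

# Minimal sanity check with made-up data: among a grid of candidate centers c,
# the sum of absolute deviations sum |y_i - c| should be smallest near the
# sample median.
rng = np.random.default_rng(0)
y = rng.exponential(scale=2.0, size=1001)          # skewed sample, odd size

candidates = np.linspace(y.min(), y.max(), 2001)   # grid of candidate centers
sad = np.array([np.abs(y - c).sum() for c in candidates])

print("grid minimizer :", candidates[sad.argmin()])
print("sample median  :", np.median(y))            # should essentially coincide
```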

1 Answer


You can certainly use the Law of Iterated Expectations "with an absolute value". If this were not so, it would mean that the LIE could not be used with non-negative random variables, which is wrong; it certainly can. Simply define

$$Z \equiv |Y-b_0-b_1X|$$

and you can of course write

$$E[Z]=E[E(Z|X)]$$
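To sketch how the conditioning finishes the argument under the question's assumptions: for each value $x$ in the support of $X$, the inner expectation $E(Z\mid X=x)$ depends on $(b_0,b_1)$ only through the single number $c=b_0+b_1x$, and

$$E\big(\,|Y-c|\ \big|\ X=x\big)\ \text{is minimized over } c \text{ at } c=\operatorname{med}(Y\mid X=x)=\beta_0+\beta_1 x.$$

Since $E[Z]$ averages these conditional expectations over $X$, the pair $(b_0,b_1)=(\beta_0,\beta_1)$ attains the minimum of every inner term simultaneously and therefore minimizes $S(b_0,b_1)$. If the conditional medians are unique, attaining the minimum at two distinct support points, say $\tau_1\neq\tau_2$, already forces $b_0+b_1\tau_j=\beta_0+\beta_1\tau_j$ for $j=1,2$, hence $(b_0,b_1)=(\beta_0,\beta_1)$, so the minimizer is unique.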

Alecos Papadopoulos