
Suppose $x$ and $y$ are two columns of financial data that have been standardized. If one fits a simple linear regression of $y$ on $x$, is it possible to observe a slope greater than 1?

I ran some numbers in Excel and cannot get the slope to ever exceed 1. Can someone please explain the mathematical reason why this is impossible?

user3138766
  • When both variables are standardized, the slope is the same as Pearson's correlation, which never exceeds 1 in magnitude. See [here](https://stats.stackexchange.com/questions/22718/what-is-the-difference-between-linear-regression-on-y-with-x-and-x-with-y/22721#22721) – Demetri Pananos May 19 '21 at 00:24
  • But why is that the case? – user3138766 May 19 '21 at 00:40
  • It's a consequence of the Cauchy-Schwarz Inequality (or any equivalent inequality). – whuber May 19 '21 at 13:32

1 Answer


It is a well-known result that the slope of a simple linear regression is

$$\hat{\beta_1} = r_{xy} \dfrac{s_y}{s_x}$$

Here $r_{xy}$ is the sample correlation coefficient and $s_x, s_y$ are the sample standard deviations of $x$ and $y$. The result follows immediately once you realize that standardization fixes $s_x = s_y = 1$, so the slope is exactly $r_{xy}$. Since $|r_{xy}| \le 1$ (a consequence of the Cauchy–Schwarz inequality), the slope can never exceed 1 in magnitude.
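
As a quick numerical check, here is a minimal sketch in Python/NumPy (the data are simulated, not the figures from the original Excel experiment): standardize two correlated series, fit a simple linear regression, and compare the slope with the Pearson correlation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated stand-in for two correlated financial columns.
n = 1_000
x = rng.normal(size=n)
y = 0.8 * x + rng.normal(size=n)

# Standardize both columns: mean 0, sample standard deviation 1.
x_std = (x - x.mean()) / x.std(ddof=1)
y_std = (y - y.mean()) / y.std(ddof=1)

# Simple linear regression of y_std on x_std (degree-1 polynomial fit).
slope, intercept = np.polyfit(x_std, y_std, 1)

# Pearson correlation of the original columns.
r = np.corrcoef(x, y)[0, 1]

print(f"slope = {slope:.6f}, r_xy = {r:.6f}")  # the two agree, and |slope| <= 1
```

Whatever correlated data you substitute, the fitted slope matches $r_{xy}$ and therefore stays within $[-1, 1]$.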

Demetri Pananos