
In Wikipedia's definition of a linear regression model:

$y_i = \beta_1 x_{i1} + \cdots + \beta_p x_{ip} + \varepsilon_i = \mathbf{x}^{\rm T}_i\boldsymbol\beta + \varepsilon_i, \qquad i = 1, \ldots, n, $

Then am I correct in saying the following? (A small simulation sketch follows the list.)

  • $\varepsilon_i$ are random variables $\Omega \rightarrow \mathbb{R}$.
  • $y_i$ are random variables $\Omega \rightarrow \mathbb{R}$.
  • $\beta_j$ are constants in $\mathbb{R}$.
  • $x_{ij}$ are constants in $\mathbb{R}$.
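
For concreteness, here is a small simulation sketch of how I picture this setup (the specific numbers, the noise scale, and the use of NumPy are just my own illustrative assumptions): $X$ and $\beta$ are held fixed, and each fresh draw of $\varepsilon$ gives a new realization of $y$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed (non-random) quantities: the design values x_{ij} and coefficients beta_j.
n, p = 5, 2
X = np.arange(n * p, dtype=float).reshape(n, p)   # x_{ij}: constants in R
beta = np.array([1.5, -0.5])                      # beta_j: constants in R

def draw_y():
    """One realization of the random variables y_i for a fresh draw of eps_i."""
    eps = rng.normal(scale=1.0, size=n)           # eps_i: random variables
    return X @ beta + eps                         # y_i = x_i^T beta + eps_i

print(draw_y())   # one realization of y
print(draw_y())   # another realization: y changes, X and beta do not
```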
simonzack

1 Answer


In the classical linear regression framework, you are correct. The only thing I would add is that while the true, unknown coefficients $\beta_j$ are constants, the estimated coefficients $\hat{\beta}_j$ are random variables, since they depend on the random variables $y_i$.
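
To illustrate this point, here is a minimal simulation sketch (the true coefficient values, the noise distribution, and the use of NumPy's `lstsq` are just illustrative assumptions, not part of the question): the true $\beta$ is held fixed, while re-drawing the errors produces a different $\hat{\beta}$ each time.

```python
import numpy as np

rng = np.random.default_rng(1)

n, p = 100, 2
X = rng.normal(size=(n, p))      # design matrix, held fixed across samples
beta = np.array([2.0, -1.0])     # true coefficients: constants

# Redraw the errors many times; each draw gives a new y and hence a new
# OLS estimate beta_hat, while the true beta never changes.
estimates = []
for _ in range(1000):
    y = X @ beta + rng.normal(size=n)
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    estimates.append(beta_hat)

estimates = np.array(estimates)
print("true beta:         ", beta)
print("mean of estimates: ", estimates.mean(axis=0))  # close to beta
print("std of estimates:  ", estimates.std(axis=0))   # sampling variability of beta_hat
```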

TrynnaDoStat