
I'm trying to refer to the coefficients other than the intercept. Is there a word or piece of jargon for the regression coefficients other than the intercept? (I'm currently calling them 'other coefficients', which is mildly descriptive in context, but not ideal.)

Nick Cox
stevec

1 Answer


Consider the multiple linear regression model

$$y=X\beta+\varepsilon$$

Here $y$ is the response vector, $X$ is the design matrix with (say) $p+1$ columns (the first column in this matrix is a vector of all ones corresponding to the intercept), $\beta=(\beta_0,\beta_1,\ldots,\beta_p)^T$ is the vector of regression coefficients and $\varepsilon$ is the random error.

Without resorting to vectors, we can write the model as

$$y=\beta_0+\beta_1 x_1+\beta_2 x_2+\cdots+\beta_p x_p+\varepsilon$$

In this model with $p$ regressors or predictor variables, the parameters $\beta_j,\,j=0,1,\ldots,p$ are simply called the regression coefficients. In fact, $\beta_j$ represents the expected change in the response $y$ per unit change in $x_j$ when all of the remaining regressor variables $x_i\,(i\ne j)$ are held constant. For this reason, the parameters $\beta_j,\,j=1,2,\ldots,p$ are often called partial regression coefficients. The parameter $\beta_0$ is of course separately called the intercept.

In simple linear regression we have $p=1$ and the regression coefficient $\beta_1$ is simply called the slope.
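
As an illustration, here is a minimal sketch (assuming Python with NumPy and simulated data with $p=2$ regressors; the variable names are just for this example) that fits the model by ordinary least squares and separates the estimated intercept $\hat\beta_0$ from the partial regression coefficients $\hat\beta_1,\ldots,\hat\beta_p$:

```python
import numpy as np

# Simulated data: n observations, p = 2 regressors (hypothetical example)
rng = np.random.default_rng(0)
n, p = 100, 2
X = rng.normal(size=(n, p))                 # regressor columns x_1, x_2
beta_true = np.array([1.5, -0.7, 2.0])      # (beta_0, beta_1, beta_2)
y = beta_true[0] + X @ beta_true[1:] + rng.normal(scale=0.5, size=n)

# Design matrix with a leading column of ones corresponding to the intercept
X_design = np.column_stack([np.ones(n), X])

# Ordinary least squares estimate of beta = (beta_0, beta_1, ..., beta_p)
beta_hat, *_ = np.linalg.lstsq(X_design, y, rcond=None)

intercept = beta_hat[0]       # beta_0: the intercept (the "constant")
partial_coefs = beta_hat[1:]  # beta_1, ..., beta_p: the partial regression coefficients

print("intercept:", intercept)
print("partial regression coefficients:", partial_coefs)
```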

StubbornAtom
  • I am happy with anyone referring to all of them -- all the $\beta$ elements -- as coefficients. Another name for the intercept is just the constant. – Nick Cox Mar 24 '19 at 07:47
  • "Gradient" for slope is common also. – Nick Cox Mar 24 '19 at 08:11
  • Related: https://stats.stackexchange.com/questions/92992/whats-the-difference-between-regression-coefficients-and-partial-regression-coe?rq=1 – StubbornAtom Mar 24 '19 at 08:22