Questions tagged [regression-coefficients]

The parameters of a regression model: most commonly, the values by which the independent variables are multiplied to get the predicted value of the dependent variable.

For example, given the following regression models: \begin{align} Y &= \beta_0 + \beta_1X + \varepsilon \\ Y &= \beta_0 X^{\beta_1} + \varepsilon \end{align} $\beta_0$ and $\beta_1$ are the regression coefficients (parameters) in the equations above.
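As a minimal sketch (not part of the tag wiki; simulated data and NumPy are assumptions here), estimating $\beta_0$ and $\beta_1$ for the first, linear model by ordinary least squares:

```python
import numpy as np

# Simulate data from the first (linear) model: Y = beta0 + beta1*X + noise
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 200)
y = 2.0 + 0.5 * x + rng.normal(0, 0.3, 200)

# Ordinary least squares recovers the coefficients
X = np.column_stack([np.ones_like(x), x])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
```

With enough data and modest noise, `beta_hat` lands close to the true values (2.0, 0.5).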

1809 questions
101
votes
18 answers

Including the interaction but not the main effects in a model

Is it ever valid to include a two-way interaction in a model without including the main effects? What if your hypothesis is only about the interaction, do you still need to include the main effects?
Glen • 6,320
79
votes
1 answer

How to interpret coefficients in a Poisson regression?

How can I interpret the main effects (coefficients for dummy-coded factor) in a Poisson regression? Assume the following example: treatment <- factor(rep(c(1, 2), c(43, 41)), levels = c(1, 2), …
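The usual reading is that, with a log link, the exponentiated coefficient of a dummy is a rate ratio between the two groups. A NumPy sketch of that interpretation (simulated data; the original question uses R, so this is only an analogous illustration, fitting the Poisson GLM by iteratively reweighted least squares):

```python
import numpy as np

# Simulated two-group Poisson data; the dummy x codes treatment 2 vs. 1
rng = np.random.default_rng(0)
n = 500
x = rng.integers(0, 2, n).astype(float)
X = np.column_stack([np.ones(n), x])
beta_true = np.array([0.5, 0.7])
y = rng.poisson(np.exp(X @ beta_true)).astype(float)

# Fit the Poisson GLM (log link) by iteratively reweighted least squares
beta = np.zeros(2)
for _ in range(50):
    mu = np.exp(X @ beta)
    z = X @ beta + (y - mu) / mu          # working response
    beta = np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (mu * z))

# exp(slope) is the rate ratio: expected count in group 2 / group 1
rate_ratio = np.exp(beta[1])
```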
54
votes
3 answers

Interpretation of log transformed predictor and/or response

I'm wondering if it makes a difference in interpretation whether only the dependent, both the dependent and independent, or only the independent variables are log transformed. Consider the case of log(DV) = Intercept + B1*IV + Error I can…
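For the log(DV) case in the excerpt, the standard interpretation is multiplicative: a one-unit increase in the IV multiplies the DV by $e^{B_1}$. A simulated check (NumPy, not from the question itself):

```python
import numpy as np

# Simulate a log-linear relationship: log(y) = 1.0 + 0.3*x + noise
rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 1000)
y = np.exp(1.0 + 0.3 * x) * rng.lognormal(0.0, 0.1, 1000)

# Regress log(y) on x; np.polyfit returns (slope, intercept)
b1, b0 = np.polyfit(x, np.log(y), 1)

# A one-unit increase in x multiplies y by exp(b1)
multiplier = np.exp(b1)
```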
49
votes
4 answers

How to interpret coefficients from a polynomial model fit?

I'm trying to create a second order polynomial fit to some data I have. Let's say I plot this fit with ggplot(): ggplot(data, aes(foo, bar)) + geom_point() + geom_smooth(method="lm", formula=y~poly(x, 2)) I get: So, a second order fit…
user13907 • 687
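The subtlety behind this question is that R's `poly(x, 2)` uses orthogonal polynomials, so its coefficients differ from those of the raw basis `1, x, x^2`, while the fitted curve is the same. A NumPy analogue (QR factorization standing in for `poly()`; this is an illustration, not the asker's code):

```python
import numpy as np

# Quadratic data
rng = np.random.default_rng(2)
x = rng.uniform(-2, 2, 100)
y = 1 + 2 * x - 3 * x**2 + rng.normal(0, 0.5, 100)

# Raw basis [1, x, x^2] vs. an orthonormal basis of the same column space
# (QR plays the role of R's poly(x, 2) here)
X_raw = np.column_stack([np.ones_like(x), x, x**2])
Q, _ = np.linalg.qr(X_raw)
b_raw, *_ = np.linalg.lstsq(X_raw, y, rcond=None)
b_orth, *_ = np.linalg.lstsq(Q, y, rcond=None)

# The coefficients differ, but the fitted curves are identical
fit_raw, fit_orth = X_raw @ b_raw, Q @ b_orth
```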
49
votes
3 answers

Derive Variance of regression coefficient in simple linear regression

In simple linear regression, we have $y = \beta_0 + \beta_1 x + u$, where $u \sim iid\;\mathcal N(0,\sigma^2)$. I derived the estimator: $$ \hat{\beta_1} = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sum_i (x_i - \bar{x})^2}\ , $$ where $\bar{x}$…
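For reference, the derivation typically finishes as follows (treating the $x_i$ as fixed). Since $\sum_i (x_i - \bar{x})\,\bar{y} = 0$ and $\sum_i (x_i - \bar{x})\,x_i = \sum_i (x_i - \bar{x})^2$, substituting $y_i = \beta_0 + \beta_1 x_i + u_i$ gives
$$\hat{\beta}_1 = \frac{\sum_i (x_i - \bar{x})\,y_i}{\sum_i (x_i - \bar{x})^2} = \beta_1 + \frac{\sum_i (x_i - \bar{x})\,u_i}{\sum_i (x_i - \bar{x})^2},$$
and, with the $u_i$ iid with variance $\sigma^2$,
$$\operatorname{Var}(\hat{\beta}_1) = \frac{\sigma^2 \sum_i (x_i - \bar{x})^2}{\left(\sum_i (x_i - \bar{x})^2\right)^2} = \frac{\sigma^2}{\sum_i (x_i - \bar{x})^2}.$$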
43
votes
2 answers

Multiple regression or partial correlation coefficient? And relations between the two

I don't even know if this question makes sense, but what is the difference between multiple regression and partial correlation (apart from the obvious differences between correlation and regression, which is not what I am aiming at)? I want to…
33
votes
1 answer

Is there a way to use the covariance matrix to find coefficients for multiple regression?

For simple linear regression, the regression coefficient is calculable directly from the variance-covariance matrix $C$, by $$ C_{d, e}\over C_{e,e} $$ where $d$ is the dependent variable's index, and $e$ is the explanatory variable's index. If one…
David • 433
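The answer is yes: the multivariate analogue of $C_{d,e}/C_{e,e}$ is solving $C_{xx}\,\beta = C_{xy}$ for the slope vector. A NumPy sketch on simulated data (not from the thread):

```python
import numpy as np

# Three explanatory variables and one response
rng = np.random.default_rng(3)
X = rng.normal(size=(500, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(0, 0.1, 500)

# Full variance-covariance matrix of [X, y]
C = np.cov(np.column_stack([X, y]), rowvar=False)

# Slopes solve C_xx @ beta = C_xy (multivariate analogue of C_de / C_ee)
beta_cov = np.linalg.solve(C[:3, :3], C[:3, 3])

# Same slopes as least squares on centered data
Xc, yc = X - X.mean(axis=0), y - y.mean()
beta_ls, *_ = np.linalg.lstsq(Xc, yc, rcond=None)
```

The two slope vectors agree exactly (up to floating point), because both solve the same centered normal equations.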
29
votes
3 answers

Does the order of explanatory variables matter when calculating their regression coefficients?

At first I thought the order didn't matter, but then I read about the Gram-Schmidt orthogonalization process for calculating multiple regression coefficients, and now I'm having second thoughts. According to the Gram-Schmidt process, the later an…
Ryan Zotti • 5,927
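A quick simulated check of the answer (order does not matter: permuting the columns just permutes the coefficients). NumPy sketch, not from the thread:

```python
import numpy as np

# Two correlated regressors plus an intercept
rng = np.random.default_rng(4)
n = 200
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(size=n)
X = np.column_stack([np.ones(n), x1, x2])
y = X @ np.array([1.0, 2.0, 3.0]) + rng.normal(0, 0.5, n)

# Fit with the columns in two different orders
b1, *_ = np.linalg.lstsq(X, y, rcond=None)
perm = [0, 2, 1]
b2, *_ = np.linalg.lstsq(X[:, perm], y, rcond=None)
```

`b2` equals `b1` reindexed by `perm` — the coefficient attached to each variable is unchanged.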
28
votes
3 answers

What does "all else equal" mean in multiple regression?

When we do multiple regressions and say we are looking at the average change in the $y$ variable for a change in an $x$ variable, holding all other variables constant, what values are we holding the other variables constant at? Their mean? Zero? Any…
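One precise way to cash out "holding all other variables constant" is the Frisch-Waugh-Lovell theorem: the multiple-regression slope for $x_1$ equals the simple slope after residualizing both $y$ and $x_1$ on the remaining regressors. A NumPy demonstration (simulated data, not from the thread):

```python
import numpy as np

# Correlated regressors, so "holding x2 constant" actually bites
rng = np.random.default_rng(5)
n = 300
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)
y = 1 + 2 * x1 - x2 + rng.normal(0, 0.5, n)

# Full multiple regression
X = np.column_stack([np.ones(n), x1, x2])
b_full, *_ = np.linalg.lstsq(X, y, rcond=None)

# Frisch-Waugh-Lovell: residualize y and x1 on the other regressors;
# the simple slope of the residuals equals the multiple-regression slope
Z = np.column_stack([np.ones(n), x2])
ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
rx = x1 - Z @ np.linalg.lstsq(Z, x1, rcond=None)[0]
b_partial = (rx @ ry) / (rx @ rx)
```

Notably, the identity holds at whatever values the other variables take — no particular constant (mean, zero) is ever plugged in.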
28
votes
4 answers

Importance of predictors in multiple regression: Partial $R^2$ vs. standardized coefficients

I am wondering what the exact relationship between partial $R^2$ and coefficients in a linear model is and whether I should use only one or both to illustrate the importance and influence of factors. As far as I know, with summary I get estimates of…
27
votes
2 answers

Interpretation of betas when there are multiple categorical variables

I understand the concept that $\hat\beta_0$ is the mean for when the categorical variable is equal to 0 (or is the reference group), giving the end interpretation that the regression coefficient is the difference in mean of the two categories. Even…
25
votes
3 answers

How to interpret main effects when the interaction effect is not significant?

I ran a Generalized Linear Mixed Model in R and included an interaction effect between two predictors. The interaction was not significant, but the main effects (the two predictors) both were. Now many textbook examples tell me that if there is a…
24
votes
3 answers

How to compute the standard errors of a logistic regression's coefficients

I am using Python's scikit-learn to train and test a logistic regression. scikit-learn returns the regression's coefficients of the independent variables, but it does not provide the coefficients' standard errors. I need these standard errors to…
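The standard route is the inverse Fisher information at the fitted coefficients: $\widehat{\operatorname{Cov}}(\hat\beta) = (X^\top W X)^{-1}$ with $W = \operatorname{diag}(p_i(1-p_i))$. A self-contained NumPy sketch (it refits the unpenalized MLE by Newton-Raphson rather than reusing scikit-learn's estimates, since scikit-learn regularizes by default and its coefficients would not match the unpenalized formula):

```python
import numpy as np

# Simulated logistic data
rng = np.random.default_rng(6)
n = 1000
X = np.column_stack([np.ones(n), rng.normal(size=n)])
p_true = 1 / (1 + np.exp(-(X @ np.array([-0.5, 1.0]))))
y = rng.binomial(1, p_true).astype(float)

# Newton-Raphson for the unpenalized logistic MLE
beta = np.zeros(2)
for _ in range(25):
    p = 1 / (1 + np.exp(-(X @ beta)))
    W = p * (1 - p)
    H = X.T @ (W[:, None] * X)            # Fisher information
    beta = beta + np.linalg.solve(H, X.T @ (y - p))

# Standard errors: square roots of the diagonal of the inverse information
se = np.sqrt(np.diag(np.linalg.inv(H)))
```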
24
votes
1 answer

Standard errors for multiple regression coefficients?

I realize that this is a very basic question, but I can't find an answer anywhere. I'm computing regression coefficients using either the normal equations or QR decomposition. How can I compute standard errors for each coefficient? I usually think…
Belmont • 1,273
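The textbook answer is $\widehat{\operatorname{Var}}(\hat\beta) = s^2 (X^\top X)^{-1}$ with $s^2$ the residual variance, and with a QR factorization $X = QR$ one has $X^\top X = R^\top R$. A NumPy sketch on simulated data (not the asker's code):

```python
import numpy as np

# Design matrix with intercept and two regressors
rng = np.random.default_rng(7)
n, p = 200, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([1.0, 2.0, -1.0]) + rng.normal(0, 1.0, n)

# Coefficients via QR: X = QR, solve R @ beta = Q^T y
Q, R = np.linalg.qr(X)
beta = np.linalg.solve(R, Q.T @ y)

# Standard errors: s^2 * diag((X'X)^{-1}), using X'X = R'R
resid = y - X @ beta
s2 = resid @ resid / (n - p)              # unbiased estimate of sigma^2
se = np.sqrt(s2 * np.diag(np.linalg.inv(R.T @ R)))
```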
24
votes
1 answer

How to treat categorical predictors in LASSO

I am running a LASSO that has some categorical variable predictors and some continuous ones. I have a question about the categorical variables. The first step I understand is to break each of them into dummies, standardize them for fair…
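The first step the excerpt describes — dummy-coding each categorical variable against a reference level, then (one debated option) standardizing the dummies like any other column — can be sketched in NumPy (illustrative only; whether dummies should be standardized at all, or handled jointly via a group penalty, is exactly what such questions debate):

```python
import numpy as np

# A categorical predictor with three levels
cat = np.array(["a", "b", "c", "a", "b", "c", "a"])
levels = np.unique(cat)                   # "a" becomes the reference level

# Dummy-code every level except the reference
dummies = (cat[:, None] == levels[1:]).astype(float)

# One (debated) option: standardize the dummies like any other column
Z = (dummies - dummies.mean(axis=0)) / dummies.std(axis=0)
```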