Questions tagged [variance-decomposition]

A decomposition of the variance explained by a model into additive contributions from each predictor.

In Ordinary Least Squares (OLS), one typically decomposes a model's $R^2$.

One common method is to add regressors to the model one at a time and record the increase in $R^2$ at each step. Since each increment depends on which regressors are already in the model, one must repeat this for every possible order in which the regressors can enter the model and then average the increments over orders (the LMG method, which coincides with a Shapley-value decomposition). This is feasible for small models but becomes computationally prohibitive for large ones, since the number of possible orders is $p!$ for $p$ predictors.
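A minimal sketch of this order-averaging scheme in Python (the function names `r_squared` and `lmg_decomposition` and the simulated data are illustrative, not from any particular package):

```python
# Sketch of the order-averaging ("LMG") R^2 decomposition described above.
from itertools import permutations

import numpy as np


def r_squared(X, y):
    """R^2 of an OLS fit of y on the columns of X (with intercept)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))


def lmg_decomposition(X, y):
    """Average each predictor's increase in R^2 over all p! entry orders."""
    p = X.shape[1]
    contrib = np.zeros(p)
    orders = list(permutations(range(p)))
    for order in orders:
        entered, r2_prev = [], 0.0
        for j in order:
            entered.append(j)
            r2_new = r_squared(X[:, entered], y)
            contrib[j] += r2_new - r2_prev  # increment attributable to j
            r2_prev = r2_new
    return contrib / len(orders)


# Illustrative simulated data: three predictors, one of them irrelevant.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, 0.5, 0.0]) + rng.normal(size=200)
shares = lmg_decomposition(X, y)
print(shares, shares.sum(), r_squared(X, y))
```

Within each order the increments telescope to the full $R^2$, so the averaged contributions sum exactly to the model's total $R^2$.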

Grömping (2007, The American Statistician) gives an overview and pointers to the literature in the context of assessing variable importance.

For other models, e.g., logistic regression or zero-inflated regression, one can use a similar approach once one has decided on an appropriate analogue of $R^2$, e.g., one of the various pseudo-$R^2$ measures.
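As a minimal sketch of one such analogue, assuming a Python workflow with statsmodels: McFadden's pseudo-$R^2$ compares the fitted log-likelihood to that of an intercept-only model, and substituting it for `r_squared` in the sketch above yields a pseudo-$R^2$ decomposition (the helper name `mcfadden_r2` is hypothetical):

```python
# Sketch of McFadden's pseudo-R^2 for logistic regression:
# 1 - logL(fitted model) / logL(intercept-only model).
import numpy as np
import statsmodels.api as sm


def mcfadden_r2(X, y):
    """McFadden's pseudo-R^2 from a logit fit (hypothetical helper)."""
    res = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
    return 1.0 - res.llf / res.llnull


# Illustrative simulated binary outcome.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
p = 1.0 / (1.0 + np.exp(-(X @ np.array([1.0, -0.5]))))
y = (rng.random(500) < p).astype(int)
print(mcfadden_r2(X, y))
```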

41 questions
28 votes, 2 answers

Collinearity diagnostics problematic only when the interaction term is included

I've run a regression on U.S. counties, and am checking for collinearity in my 'independent' variables. Belsley, Kuh, and Welsch's Regression Diagnostics suggests looking at the Condition Index and Variance Decomposition…
22 votes, 3 answers

How to split r-squared between predictor variables in multiple regression?

I have just read a paper in which the authors carried out a multiple regression with two predictors. The overall r-squared value was 0.65. They provided a table which split the r-squared between the two predictors. The table looked like this: …
12 votes, 1 answer

Additive vs Multiplicative decomposition

My question is a really simple one, but those are the ones that really get me :) I don't really know how to evaluate whether a specific time series should be decomposed using an additive or a multiplicative decomposition method. I know there are visual…
10 votes, 1 answer

Interpretation of Impulse Response and Variance Decomposition Graphs

I am finding it difficult to interpret the following impulse response and variance decomposition graphs. Basically, I am studying the effect of currencies on each other (I know the results from the Granger causality test, but how do we interpret the graphs…
10 votes, 2 answers

Interpretation of the Law of Total Covariance

Let $X, Y, Z$ be random variables defined on the same probability space, and let the covariance of $X$ and $Y$ be finite; then the law of total covariance / covariance decomposition formula…
5 votes, 1 answer

Meaning of covariance matrix row sums

Say I have an $n \times n$ covariance matrix for a sample set of $n$ random variables. Is there any meaning in the sum of the rows of this matrix? Is it a meaningful measure of the contribution of each random variable to the variance of the sum of…
5 votes, 1 answer

Decomposing $R^2$ into independent variables

Consider a linear regression model: $$y = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + \dots + \beta_k X_k + \varepsilon$$ where $R^2 = 1 - (SSR/SST)$. I would like to determine the contribution of a factor $i$ (call it $R^2_i$) to the total $R^2$ such that $R^2_1 + R^2_2 + \dots$…
4 votes, 1 answer

Sum of Squares decomposition

Question about the Total, Explained, and Residual Sum of Squares. I am working in the simple linear regression model. Could you help me clarify why the residual sum of squares (SSE, where E stands for errors) $$SSE = \sum_{i=1}^n (\hat{Y}_{i}-Y_{i})^{2}…
3 votes, 1 answer

Graphical proof of variance decomposition for linear regression

Suppose we aim to predict $Y$ from $X$ using the linear regression model $Y = mX + b$. There is a standard variance decomposition: $$\operatorname{Var}[Y] = \operatorname{Var}[\widehat{Y}] + \operatorname{Var}[R],$$ where $\widehat{Y}=mX+b$ is the…
3 votes, 1 answer

Question about an expectation

Let $x$ and $\gamma$ be vectors. Here it says that $$E[(y-x'\gamma)^2] = E[(y-E[y|x])^2 + (E[y|x]-x'\gamma)^2].$$ However, I don't see why $$E[(y-E[y|x])(E[y|x]-x'\gamma)] = 0.$$ By the way, $E$ is the same as $E_{x,y}$ for all purposes. I wasn't…
2 votes, 0 answers

Blinder-Oaxaca decomposition, logistic regression and unbalanced dataset: fitted probabilities numerically 0 or 1 occurred

I have a binary outcome y, a dummy variable gender for gender, and a set of covariates x (including some factor variables converted into dummies, omitting one dummy from each factor variable) and a set of control dummies cdummies (they come…
2 votes, 0 answers

Variance decomposition for time series of weighted averages

I face the following problem: I have a time series $x_t,\; t=1,\dots,T$ where each value $x_t$ is a weighted average over various groups, that is, $x_t = \sum_{i=1}^N f_{it}\, x_{it}$ with $\sum_{i=1}^N f_{it}=1$. Now I want to know if I can decompose the…
2 votes, 0 answers

Linear Mixed Regression Variance Decomposition

I'm pretty new to R and was hoping to get some advice on variance decomposition in linear mixed models. Similar to this question: How to estimate variance components with lmer for models with random effects and compare them with lme results, I have…
2 votes, 0 answers

Does there exist a Bayesian analysis of bias-variance decomposition of an estimator?

I was wondering if anyone could spare a moment to help with the answers to the following questions. Suppose we have an estimator $\hat{\theta}:\mathbb{R}^{d}\rightarrow\mathbb{R}$ such that the number of parameters $p\gg d$. For a squared objective…
2 votes, 0 answers

Calculating orthogonalized impulse response functions for vector error correction models

Background: I am working on orthogonalized impulse response functions (OIRFs) for vector error correction models (VECMs). It's an exercise to develop understanding. I am given a bivariate VECM: $$ \Delta y_t = \begin{bmatrix}-0.15 \\ 0.3\end{bmatrix}…