In polynomial regression, estimating the effect of an independent variable on the outcome seems quite tricky (to me). For example, I want to compare the influence of $x$ on $z$ with the influence of $y$ on $z$ (I would be satisfied with a statement like "the influence of $x$ is bigger than that of $y$").
In a linear model I would simply look at standardized regression coefficients (betas). But a plot of residuals against fitted values for the linear model lm1 <- lm(z ~ x + y)
makes me want to include quadratic terms, and an interaction, too. Ergo: lm2 <- lm(z ~ x + y + I(x^2) + I(y^2) + x:y)
Since these terms are collinear, some standardized regression coefficients (betas) become larger than 1 in absolute value ($|\beta| > 1$). They are not easy to interpret: because $x$ and $x^2$ are correlated, their betas are entangled. Thus, I am simply not able to make a statement about the relative size of the influence of $x$ and $y$.
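To make this concrete, here is a small numerical sketch (all correlation values are invented for illustration). With two standardized predictors, the standardized betas solve the normal equations on the correlation scale, which have a closed form for two predictors; strong collinearity alone can push one beta past 1:

```python
# Hypothetical correlations, chosen only to illustrate the effect:
r_xz, r2z = 0.90, 0.95   # correlations of x and x^2 with the outcome z
c = 0.97                 # correlation between x and x^2 (strong collinearity)

# Standardized betas for two predictors (normal equations on the
# correlation scale): beta_i = (r_iz - c * r_jz) / (1 - c^2)
beta_x  = (r_xz - c * r2z) / (1 - c**2)
beta_x2 = (r2z - c * r_xz) / (1 - c**2)

print(beta_x, beta_x2)   # beta_x2 comes out well above 1, beta_x even flips sign
```

With $c = 0$ (no collinearity) each beta would just equal the plain correlation with $z$ and stay inside $[-1, 1]$.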
I can fit the same equation using orthogonal polynomials. This gives me betas with $|\beta| < 1$. But does it make sense to interpret them as if the predictors were orthogonal, given that in reality $x$ and $x^2$ are not independent?
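For intuition about what the orthogonal-polynomial fit does, a pure-Python sketch (sample values and names are my own): the $x^2$ column is replaced by its residual after regressing it on $x$ with an intercept, so the new column is uncorrelated with $x$ by construction; with orthogonal standardized predictors, each standardized beta is just a plain correlation with $z$ and hence bounded by 1 in absolute value:

```python
def mean(v):
    return sum(v) / len(v)

def corr(a, b):
    # Pearson correlation of two equal-length lists
    ma, mb = mean(a), mean(b)
    num = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    den = (sum((ai - ma) ** 2 for ai in a)
           * sum((bi - mb) ** 2 for bi in b)) ** 0.5
    return num / den

x  = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0]   # made-up positive x values
x2 = [v * v for v in x]

c_raw = corr(x, x2)   # raw x and x^2 are almost perfectly collinear here

# Gram-Schmidt step: remove from x^2 the part explained by (1, x)
mx, mx2 = mean(x), mean(x2)
slope = (sum((xi - mx) * (x2i - mx2) for xi, x2i in zip(x, x2))
         / sum((xi - mx) ** 2 for xi in x))
x2_orth = [x2i - (mx2 + slope * (xi - mx)) for xi, x2i in zip(x, x2)]

c_orth = corr(x, x2_orth)   # ~0 by construction
```

This is essentially what R's poly(x, 2) does; the fitted values of the model are identical either way, only the coefficient parameterization changes.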
I read about Cohen's $d$, but it seems to suffer from the same problems.
Now I am thinking about the $\eta^2$ from ANOVA. As I understand it, $\eta^2$ answers: "Out of the total variation in $z$, what proportion can be attributed to a specific predictor $x$?" Moreover, ANOVA and linear regression are closely related. So my questions are:
- Can I use $\eta^2$ in collinear models to estimate the effect sizes of $x$ and $y$ on $z$? Or does $\eta^2$ automatically become "invalid" once there is more than one independent variable?
- Can I use the betas from the linear model (lm1) to estimate the influence of an independent variable on the outcome even if I use the polynomial model (lm2)? (Is the combined influence of $x$ and $x^2$ in lm2 related to the influence of $x$ in lm1?)
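Regarding the first question, here is a sketch of why I suspect $\eta^2$ becomes ambiguous with correlated predictors (the correlation values are invented): with two predictors, the model $R^2$ has a closed form on the correlation scale, and the variance share attributed to $x$ differs depending on whether $x$ is entered before or after $y$ (i.e., Type I sums of squares are order-dependent):

```python
# Invented sample correlations, for illustration only
r_zx, r_zy, r_xy = 0.6, 0.5, 0.4

# R^2 of z ~ x + y, expressed in correlations (two-predictor closed form)
r2_full = (r_zx**2 + r_zy**2 - 2 * r_zx * r_zy * r_xy) / (1 - r_xy**2)

# Share of variance attributed to x, Type-I style (order of entry matters):
eta2_x_first  = r_zx**2             # x entered first: its marginal R^2
eta2_x_second = r2_full - r_zy**2   # x entered after y: increment over y alone

print(r2_full, eta2_x_first, eta2_x_second)
```

Only with uncorrelated predictors ($r_{xy} = 0$) do the two attributions coincide, which is exactly the situation I do not have.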