I saw this post for calculating a z-score between two slopes (Test a significant difference between two slope values), but is there something like this for two quadratics?
- Are you testing equality (all parameters equal) against the alternative of any inequality? – Glen_b Sep 23 '16 at 01:04
- Yeah, given two quadratic models, are they significantly different? – Nitro Sep 23 '16 at 15:09
1 Answer
The discussion there focuses on comparing single parameters (slope estimates) from two separate regressions. That approach is typically used when you only have the regression output for the two fits (rather than the original data), or when you're not prepared to assume constant variance across the populations from which the two sets of data were drawn.
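(For reference, the kind of comparison in the linked post is typically a large-sample z statistic built from the two independent slope estimates and their standard errors, something of the form

$$z = \frac{\hat\beta_1 - \hat\beta_2}{\sqrt{\operatorname{SE}(\hat\beta_1)^2 + \operatorname{SE}(\hat\beta_2)^2}}$$

where the hats denote the estimates from the two separate fits.)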
The aim here is to test for differences between three separate coefficients (the y-intercept, the coefficient of the linear term and the coefficient of the quadratic term). That's a little trickier.
If you are prepared to combine the data sets (which would generally require you to assume constant variance across the two populations), it can be done easily enough by fitting both models at the same time in the usual way: fit the quadratic terms plus a "group" indicator and "interactions" of the group with the linear and quadratic terms. If all the coefficients involving the group indicator are 0, then the two quadratics are identical. This puts the null and alternative in a nested situation, so the comparison is a standard partial F test (a sketch of one way to set this up is below).
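To make the combined fit concrete, here is a minimal sketch in Python with statsmodels (not from the original answer; the simulated x/y values and the group labels "A"/"B" are placeholders for the two actual data sets):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(0)

# Simulated stand-ins for the two data sets (replace with the real x/y values)
x1 = rng.uniform(0, 10, 50)
y1 = 1.0 + 0.5 * x1 + 0.20 * x1**2 + rng.normal(0, 1, 50)
x2 = rng.uniform(0, 10, 50)
y2 = 1.5 + 0.3 * x2 + 0.25 * x2**2 + rng.normal(0, 1, 50)

# Stack the data and add a group indicator
df = pd.DataFrame({
    "x": np.concatenate([x1, x2]),
    "y": np.concatenate([y1, y2]),
    "g": ["A"] * len(x1) + ["B"] * len(x2),
})

# Reduced model: a single quadratic shared by both groups
reduced = smf.ols("y ~ x + I(x**2)", data=df).fit()

# Full model: adds the group indicator and its interactions with the
# linear and quadratic terms, i.e. a separate quadratic for each group
full = smf.ols("y ~ (x + I(x**2)) * g", data=df).fit()

# Partial F test: are the three group-related coefficients jointly zero?
print(anova_lm(reduced, full))
```

anova_lm carries out the partial F test of the three group-related coefficients (the intercept shift and the two interactions) being jointly zero; a small p-value indicates the two quadratics differ in at least one coefficient.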
Otherwise you'll need to do something a bit more complicated.
- I figured it would probably be very complicated if a reduced model can't be used or combining the data sets is not possible. Thanks for the information though. – Nitro Sep 26 '16 at 18:28
- Marcus, if that's the case you need, please say so. It is possible to get somewhere. – Glen_b Sep 26 '16 at 23:07