I have performed a regression analysis and, in addition to the coefficients, confidence intervals, and significance tests, I have the standard errors.
I have been asked to provide a numerical estimate of the goodness of fit of the models to the data. Will these standard errors suffice?
I'm not sure how to interpret them.
EDIT:
Now that I understand the distinction between the SE of the coefficients and the SE of the regression, is there any way of calculating the SE of the regression given the SE of the coefficients?
EDIT:
Regarding the difference between the standard error of the regression and the standard errors of the coefficients, this excerpt from link explains it quite well:
In general, the standard error of the coefficient for variable X is equal to the standard error of the regression times a factor that depends only on the values of X and the other independent variables (not on Y), and which is roughly inversely proportional to the standard deviation of X. Now, the standard error of the regression may be considered to measure the overall amount of "noise" in the data, whereas the standard deviation of X measures the strength of the "signal" in X. Hence, you can think of the standard error of the estimated coefficient of X as the reciprocal of the signal-to-noise ratio for observing the effect of X on Y. The larger the standard error of the coefficient estimate, the worse the signal-to-noise ratio--i.e., the less precise the measurement of the coefficient.
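To make the quoted relationship concrete, here is a minimal NumPy sketch on simulated data (all variable names are my own). It shows that each coefficient's SE equals the SE of the regression times a factor, sqrt of the corresponding diagonal entry of (X'X)^-1, that depends only on the design matrix. This also suggests the answer to my earlier edit: you can recover the SE of the regression from a coefficient's SE only if you also have the X values, not from the coefficient SEs alone.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
# Design matrix with an intercept column and two simulated predictors
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
beta = np.array([1.0, 2.0, -0.5])
y = X @ beta + rng.normal(scale=1.5, size=n)

# Ordinary least squares fit and residuals
b, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ b

# SE of the regression: residual RMSE with n - k degrees of freedom
k = X.shape[1]
s = np.sqrt(resid @ resid / (n - k))

# SEs of the coefficients: s times sqrt of the diagonal of (X'X)^-1
XtX_inv = np.linalg.inv(X.T @ X)
se_b = s * np.sqrt(np.diag(XtX_inv))

# Inverting the relationship: recover s from one coefficient's SE,
# which requires the design-dependent factor from (X'X)^-1
s_recovered = se_b[1] / np.sqrt(XtX_inv[1, 1])
print(np.isclose(s, s_recovered))  # True
```

The design-dependent factor is what the excerpt describes as roughly inversely proportional to the standard deviation of X, which is why a noisier or less variable predictor yields a larger coefficient SE for the same SE of the regression.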