Given that linear models can be solved exactly via calculus, how is it possible to define a variance for the parameters $\mathbf{a}$ that minimize some error function, say $Err=\sum_i (o_i-f(x_i; \mathbf{a}))^2$? There is no distribution of $\mathbf{a}$ to find the variance of, is there?
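For concreteness, here is the exact solution I have in mind, written in matrix form (a sketch in my own notation, which may not match every formulation):

```latex
% Assumed setup (my notation): stack the observations o_i into a vector o and the
% predictors x_i into a design matrix X, so that f(x_i; a) = x_i^T a and
% Err(a) = || o - X a ||^2.  Setting the gradient to zero gives the exact minimizer
\hat{\mathbf{a}} = (X^{\top} X)^{-1} X^{\top} \mathbf{o}
% Once the data are given, this is a single fixed vector, which is why it is unclear
% to me what "the variance of a" could refer to.
```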
- If you think as a frequentist, there is the distribution of the data $x$, as the actual sample is just one possible sample from the population. If you think as a Bayesian, in addition to the above you formalize your knowledge about the parameters by a probability distribution, as if the parameters were random variables. – Richard Hardy Sep 02 '21 at 09:31
- "Solved exactly" is misleading, because unless the fit is perfect, the data do *not* "exactly" fit the model. Random variables are introduced to analyze that lack of fit. This is done explicitly in many formulations, such as https://stats.stackexchange.com/a/148713/919, but in other formulations it is only implicit. – whuber Sep 02 '21 at 12:27