I'm quite new to statistics, so please bear with me :)
I'm trying to estimate the uncertainty of a variable that is predicted from a linear equation. The equation itself is estimated from a series of observations, and those observations have uncertainty of their own.
I have a dependent variable Y, an independent variable X, and an uncertainty u in each Y.
Fitting a linear regression gives me a slope a and an intercept b, each with its own standard error.
Now I use these parameters to predict Y at a new X, say X = 4.5, which gives Y = a*4.5 + b.
Taking the root sum of squares of the standard errors of a and b gives 0.1426.
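For concreteness, here is a minimal Python sketch of what I'm doing (the x and y values are made-up placeholders, not my real data, and it assumes scipy >= 1.6 so that linregress reports the intercept's standard error):

```python
import numpy as np
from scipy import stats

# Placeholder data -- my real x, y (and their uncertainties u) are different,
# but the procedure is the same.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 11.9])

# Ordinary least-squares fit: y = a*x + b
fit = stats.linregress(x, y)
a, b = fit.slope, fit.intercept
se_a, se_b = fit.stderr, fit.intercept_stderr

# Prediction at a new point
x_new = 4.5
y_pred = a * x_new + b

# Root sum of squares of the two standard errors
# (this is where my 0.1426 comes from)
rss_se = np.sqrt(se_a**2 + se_b**2)

print(y_pred, rss_se)
```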
Is this the right way to estimate the uncertainty of a prediction from a linear regression? It also doesn't seem to take into account the original uncertainty u in the Y observations.
Sorry if this is a trivial question, but any help is much appreciated.
Thanks