I'm quite new to statistics, so please bear with me :)

I'm trying to estimate the uncertainty of a variable that is predicted using a linear equation. The linear equation is itself estimated from a series of observations, and those observations have uncertainty in them.

I have a dependent variable Y, an independent variable X, and an uncertainty u in Y:

[table of the observed X and Y values with their uncertainties u]

Doing a linear regression produced the following parameters:

[regression output: slope a and intercept b with their standard errors]

Now I want to use the parameters to predict a Y value for some X, let's say X = 4.5. With the parameters, we get Y = a*4.5 + b.

Taking the root sum of squares of the standard errors of a and b gives 0.1426.
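For concreteness, here is a minimal sketch of the calculation I am describing (the X, Y, and u values below are made up, since my actual data are only in the images above):

```python
import numpy as np
from scipy import stats

# Made-up stand-ins for my data: observed X, Y, and the uncertainty u in Y.
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
Y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 12.0])
u = np.array([0.2, 0.2, 0.3, 0.2, 0.3, 0.2])  # not used below -- part of my question

# Ordinary least-squares fit of Y = a*X + b
fit = stats.linregress(X, Y)
a, b = fit.slope, fit.intercept
se_a, se_b = fit.stderr, fit.intercept_stderr

# Prediction at X = 4.5
x_new = 4.5
y_pred = a * x_new + b

# What I did: root sum of squares of the two standard errors
u_pred = np.sqrt(se_a**2 + se_b**2)
print(y_pred, u_pred)
```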

Is this the right way to estimate the uncertainty of a prediction from a linear regression? Also, this does not seem to take into account the original uncertainties u in Y.

Sorry if this is a trivial question, but any help is much appreciated.

Thanks

leviathan
  • How did you measure $u$ ? Please be precise about what $u$ is. Usually we have $Y$ measured with some error $e$ which is assumed to be a random variable, normally distributed with mean 0 and some variance, and $X$ measured without error and the simple linear model estimates the line of best fit, along with residuals that are estimates of $e$. – Robert Long Jan 12 '20 at 19:28
  • Usually the parameter estimates are correlated, often strongly so, whence the root square sum is incorrect no matter what you are trying to estimate. If you need a standard error for the predicted value of $Y,$ please see the duplicate. – whuber Jan 12 '20 at 20:33
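If I understand whuber's comment correctly, the covariance between the estimates of a and b has to be included, for example along the lines of this sketch (again with made-up numbers; np.polyfit with cov=True returns the covariance matrix of the estimates):

```python
import numpy as np

# Same made-up data as in the sketch above.
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
Y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 12.0])

# Fit Y = a*X + b and get the full covariance matrix of (a, b),
# including the off-diagonal term between slope and intercept.
(a, b), cov = np.polyfit(X, Y, 1, cov=True)

x_new = 4.5
y_pred = a * x_new + b

# Var(a*x + b) = x^2 Var(a) + Var(b) + 2*x*Cov(a, b)
g = np.array([x_new, 1.0])  # gradient of a*x + b with respect to (a, b)
se_pred = np.sqrt(g @ cov @ g)
print(y_pred, se_pred)
```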

0 Answers