
In orthogonal regression both variables are assumed to be measured with noise. I'm interested in the simplest possible case: I have a very large number of data points $(X_1,Y_1), \dots, (X_n,Y_n)$. I know that $Y = aX + b$, and each $(X_i, Y_i)$ is an observation of an underlying pair lying on that line, corrupted by uncorrelated Gaussian noise.
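To pin the notation down (the $\xi_i$ here are just my shorthand for the unobserved true abscissae), the model I have in mind is

$$X_i = \xi_i + \varepsilon_i, \qquad Y_i = a\,\xi_i + b + \eta_i, \qquad \varepsilon_i \sim \mathcal N(0,\sigma_X^2), \quad \eta_i \sim \mathcal N(0,\sigma_Y^2),$$

with the $\varepsilon_i$ and $\eta_i$ independent of each other and across observations.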

The standard approaches for processing these data to estimate $a$ and $b$ assume that the standard deviation of the noise in $X$ is identical to that in $Y$. (Mathworks has one such presentation.)
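For concreteness, here is a minimal NumPy sketch of that equal-noise orthogonal fit (my own illustration, not the Mathworks code; `orthogonal_fit` is just a name I'm using here):

```python
import numpy as np

def orthogonal_fit(x, y):
    """Orthogonal (total least squares) fit of y = a*x + b,
    assuming the noise standard deviations in x and y are equal."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    sxx = np.var(x)                       # sample variance of x
    syy = np.var(y)                       # sample variance of y
    sxy = np.cov(x, y, bias=True)[0, 1]   # sample covariance of x and y
    # Closed-form slope of the orthogonal-regression line
    # (Deming regression with variance ratio delta = 1).
    a = (syy - sxx + np.sqrt((syy - sxx) ** 2 + 4.0 * sxy ** 2)) / (2.0 * sxy)
    b = y.mean() - a * x.mean()
    return a, b
```

The closed form minimizes the summed squared perpendicular distances to the line; it is equivalent to taking the direction of the first principal component of the centered $(X, Y)$ cloud.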

My question: is there a way to estimate $\sigma_X$ and $\sigma_Y$? My current approach is to examine the noise in $X$ and $Y$ after the orthogonal fit and check whether the two are commensurate. That seems like a reasonable approach, but if there is one with a better foundation, I'd be interested in hearing about it.
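To make that check concrete, here is a rough sketch of one way I could compare the two spreads after fitting (illustration only; `a` and `b` come from a fit such as `orthogonal_fit` above):

```python
import numpy as np

def residual_spreads(x, y, a, b):
    """Split each point's orthogonal offset from the fitted line
    y = a*x + b into x- and y-components and report their spreads.
    Caveat: for an exact orthogonal fit these two components are
    proportional by construction (their ratio is |a|), so on its own
    this mainly checks the overall noise scale."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Signed perpendicular distance from each point to the fitted line.
    d = (y - a * x - b) / np.sqrt(1.0 + a ** 2)
    dx = -a * d / np.sqrt(1.0 + a ** 2)   # x-component of the offset
    dy = d / np.sqrt(1.0 + a ** 2)        # y-component of the offset
    return np.std(dx), np.std(dy)
```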

    [This wikipedia page](http://en.wikipedia.org/wiki/Errors-in-variables_models#Simple_linear_model) offers some considerations, even though it does not seem to answer your question specifically. – amoeba Feb 19 '15 at 14:02
  • Try [Numerical Recipes](http://apps.nrbook.com/empanel/index.html?pg=786) pp. 785-787: "starting guess for $b$ ... finding the standard errors $\sigma_a$, $\sigma_b$ is more complicated ..." (A question back: have you found a simple way to plot a, say, 90% error region in XY?) – denis Jun 17 '15 at 08:49

0 Answers