I'm working on a model, and at the moment I prefer to work solely in Excel. I've been double-checking the results of the linear model in JMP, Minitab, and Statistica, and have (more or less) been getting the same answers.
One thing that's coming out oddly, though, is the standardized residuals: I'm getting very different answers from Excel's regression routine, and I know it has to do with how I'm calculating them:
The standard deviation of our population varies with the magnitude of the output, so we work in terms of the relative standard deviation. We have an assumed %RSD of 5% (based on a lot of previous work; we also have reason to assume normality). From this I standardize the residuals as $\frac{x - u}{u \cdot \text{RSD}}$, where $x$ is the observed value and $u$ is the predicted value, so $x - u$ is the residual.
Note that $u \cdot \text{RSD} = s$, so this is a simple z-score. The problem is that the values Excel gives me for the standardized residuals are very different from mine. That isn't exactly surprising, since I'm using a varying standard deviation, but Excel's values don't seem to be tied to the reality of the data: one observation can be off by as much as 50% (around 6 standard deviations away), and the standardized residual I'm given is only about 2 or 3.
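In case it's clearer as code, here's a minimal Python sketch of what I'm doing. The numbers are made up purely for illustration, and numpy with a straight-line fit stands in for my actual Excel model:

```python
import numpy as np

# Made-up numbers for illustration; my real data live in Excel.
x = np.array([10.0, 20.0, 30.0, 40.0, 50.0])    # predictor
obs = np.array([12.1, 19.5, 31.8, 38.9, 52.4])  # observed response (my "x")

# Ordinary least-squares fit: obs ~ b0 + b1 * x
b1, b0 = np.polyfit(x, obs, 1)
pred = b0 + b1 * x          # predicted values (my "u")
resid = obs - pred          # residuals, x - u

# My standardization: constant 5% relative standard deviation,
# so the standard deviation at each point is 5% of the predicted value.
rsd = 0.05
z = resid / (rsd * pred)    # (x - u) / (u * RSD)

print(np.round(z, 2))
```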
Anyway, I'm having a really hard time finding out exactly how the residuals are standardized in a linear regression. Any help would be appreciated.