I use MATLAB a lot to fit functions to datasets. I then need to determine some kind of error estimate for the fitted parameters.
Say I have a dataset of $N$ points $(x_i,y_i),\ i=1,\ldots,N$. I need to fit a function $f(x,P)$ where $P$ is the parameter vector
$$P = \left[\begin{array}{c} a_1\\ a_2\\\vdots\\a_k\end{array}\right]$$
where $k$ is the number of parameters to be fitted. I estimate the parameters $a_1,\ldots,a_k$ by minimizing the root-mean-square error. I build an error function:
$$\varepsilon(P,x,y)= \frac{1}{N}\left(\sum\limits_{i=1}^N (f(x_i,P)-y_i)^2\right)^{\frac{1}{2}}$$
and then use the MATLAB function `fminsearch` to find the minimum of this function by optimizing $P$. (I know there are better MATLAB functions, but bear with me.)
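To make the setup concrete, here is a NumPy/SciPy sketch of that step (the MATLAB version is analogous; `scipy.optimize.fmin` uses the same Nelder-Mead simplex method as `fminsearch`). The model `f` and the synthetic data are made up purely for illustration:

```python
import numpy as np
from scipy.optimize import fmin  # Nelder-Mead simplex, like MATLAB's fminsearch

# Hypothetical model f(x, P) with parameters P = [a1, a2] (illustration only)
def f(x, P):
    a1, a2 = P
    return a1 * (1 - np.exp(-x / a2))

# Synthetic data, standing in for a real dataset
rng = np.random.default_rng(0)
x = np.linspace(0.1, 5.0, 50)
y = f(x, [2.0, 1.5]) + 0.01 * rng.normal(size=x.size)

# Error function eps(P) = (1/N) * sqrt(sum of squared residuals)
def eps(P):
    r = f(x, P) - y
    return np.sqrt(np.sum(r**2)) / x.size

# Minimize eps over P, starting from an initial guess
P_fit = fmin(eps, x0=[1.0, 1.0], disp=False)
```

Minimizing $\varepsilon$ and minimizing the plain sum of squared residuals give the same $P'$, since the square root and the $1/N$ factor are monotone.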
To get the error estimates I calculate the Jacobian matrix $J_f$, with one row per data point and one column per parameter:
$$(J_f)_{ij} = \frac{\partial f}{\partial a_j}(P',x_i),\qquad i=1,\ldots,N,\ j=1,\ldots,k,$$
where $P'$ denotes the fitted parameter vector (the values that minimize the error function $\varepsilon$). I also calculate the variance of the residuals:
$$ \sigma_r = \frac{1}{N-k}\sum\limits_{i=1}^N r_i^2$$
with
$$ r_i = y_i - f(x_i,P'), $$
the residuals. Now I can compute the covariance matrix:
$$ \operatorname{cov}(P) = \sigma_r(J_f^TJ_f)^{-1} $$
where the superscript $T$ stands for the transpose and the superscript $-1$ for the matrix inverse. To get the error estimates I then take the element-wise square root of the diagonal of the covariance matrix:
$$s_P = \left[ \begin{array}{c} s_{a_1} \\ s_{a_2} \\ \vdots \end{array}\right] = \sqrt{\operatorname{diag}(\operatorname{cov}(P))}$$
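The whole error-estimate pipeline can be sketched in NumPy as follows. To keep the example self-contained I use a hypothetical linear model and obtain $P'$ by exact least squares rather than by the minimization step; the Jacobian is built by forward finite differences, which is one common choice when no analytic derivatives are available:

```python
import numpy as np

# Hypothetical linear model (illustration only): f(x, P) = a1 + a2*x
def f(x, P):
    return P[0] + P[1] * x

rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 40)
y = f(x, [1.0, 0.5]) + 0.05 * rng.normal(size=x.size)

# Stand-in for the fitted P' from the minimization step (here: exact least squares)
A = np.column_stack([np.ones_like(x), x])
P_fit, *_ = np.linalg.lstsq(A, y, rcond=None)

N = x.size
k = P_fit.size

# Jacobian J[i, j] = d f(x_i, P') / d a_j, by forward finite differences
J = np.empty((N, k))
h = 1e-6
for j in range(k):
    dP = np.zeros(k)
    dP[j] = h
    J[:, j] = (f(x, P_fit + dP) - f(x, P_fit)) / h

r = y - f(x, P_fit)                       # residuals r_i
sigma_r = np.sum(r**2) / (N - k)          # residual variance
cov = sigma_r * np.linalg.inv(J.T @ J)    # cov(P) = sigma_r * (J^T J)^{-1}
s_P = np.sqrt(np.diag(cov))               # standard errors s_{a_j}
```

Note that only the diagonal of $\operatorname{cov}(P)$ is used for the standard errors; the off-diagonal entries describe correlations between the parameter estimates.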
The Question
Given that the above method is at least roughly correct (I assume it is): is there a similar approach for the case where not one function $f$ is fitted, but multiple functions $f$ and $f'$ simultaneously?
I'll explain what I mean by this. Say I have two datasets $(x_i,y_i)$ and $(x'_i,y'_i)$ which follow the functions $f(x,P)$ and $f'(x',P)$ respectively, where $P$ is the parameter vector. While $f$ and $f'$ are different functions, $P$ should be the same for both fits. E.g. $f$ might be:
$$f(x,P)=a_1\left(1-\exp(-x/a_2)\right)\exp(x/a_3)$$
while $f'$ is:
$$f'(x',P)=a_4\left(1+\exp(x'/a_2)\right)\exp(x'/a_4)$$
I would therefore fit both datasets simultaneously with the combined error function:
$$\varepsilon(P,x,y,x',y') = \frac{1}{N}\left(\sum\limits_{i=1}^N (f(x_i,P)-y_i)^2\right)^{\frac{1}{2}} + \frac{1}{N'}\left(\sum\limits_{i=1}^{N'} (f'(x'_i,P)-y'_i)^2\right)^{\frac{1}{2}}$$
and use `fminsearch` again. This gives me the best estimate for $P$ using all the available data (I actually have 4 datasets and 2 functions). How would I estimate the error/certainty bounds on the parameters in this case? (Preferably using the Jacobian again.)
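One natural extension, sketched below under the assumption that $P'$ is approximately the least-squares solution of the combined problem: stack the residuals of all datasets into one long vector, stack the corresponding Jacobians the same way, and then apply the same formulas with $N+N'$ total points. The two models here are made up for illustration and share the parameter $a_2$; this is not the author's exact pair of functions:

```python
import numpy as np
from scipy.optimize import fmin  # Nelder-Mead, like fminsearch

# Two hypothetical models sharing a2 (illustration only); P = [a1, a2, a3]
def f(x, P):
    return P[0] * np.exp(-x / P[1])

def f2(x, P):
    return P[2] * (1 - np.exp(-x / P[1]))

rng = np.random.default_rng(2)
x1 = np.linspace(0.1, 5.0, 40)
y1 = f(x1, [2.0, 1.5, 0.7]) + 0.01 * rng.normal(size=x1.size)
x2 = np.linspace(0.1, 5.0, 30)
y2 = f2(x2, [2.0, 1.5, 0.7]) + 0.01 * rng.normal(size=x2.size)

# Combined error function: one RMS-style term per dataset
def eps(P):
    r1 = f(x1, P) - y1
    r2 = f2(x2, P) - y2
    return np.sqrt(np.sum(r1**2)) / x1.size + np.sqrt(np.sum(r2**2)) / x2.size

P_fit = fmin(eps, x0=[1.0, 1.0, 1.0], disp=False)

# Stack the residuals of both datasets into one vector
def stacked_resid(P):
    return np.concatenate([y1 - f(x1, P), y2 - f2(x2, P)])

k = P_fit.size
N_tot = x1.size + x2.size

# Stacked Jacobian J[i, j] = d f(x_i) / d a_j over ALL points, by finite differences
J = np.empty((N_tot, k))
h = 1e-6
base = stacked_resid(P_fit)
for j in range(k):
    dP = np.zeros(k)
    dP[j] = h
    J[:, j] = -(stacked_resid(P_fit + dP) - base) / h  # d r/d a_j = -d f/d a_j

sigma_r = np.sum(base**2) / (N_tot - k)   # residual variance, N+N' points, k params
cov = sigma_r * np.linalg.inv(J.T @ J)    # same covariance formula as before
s_P = np.sqrt(np.diag(cov))               # standard errors for the shared parameters
```

One caveat worth flagging: the minimizer of this sum-of-RMS objective is not exactly the least-squares minimizer of the stacked residual vector (the two datasets are effectively weighted by $1/N$ and $1/N'$), so the covariance formula is an approximation here; if the datasets have different noise levels, a weighted least-squares treatment would be the more principled route.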
My English isn't top-notch, nor is my knowledge of statistics, so I may be mixing up nomenclature a bit. Forgive me for this.