
I've fit a system of non-linear ODEs to some experimental data using Levenberg-Marquardt. After the algorithm converged, I estimated the Hessian matrix of the system using:

$H = (J^TJ)$

The covariance matrix is then the inverse of H:

$cov = H^{-1}$

To get an unbiased estimate, I rescaled cov like so:

$cov_{scaled} = cov * (RSS / (m - n))$

Where $m$ is the number of measurements, and $n$ is the number of parameters.

The square roots of the diagonal entries of $cov_{scaled}$ give me the uncertainties in the parameters.
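The procedure above could be sketched with SciPy's `least_squares` (a minimal sketch; the exponential model, data, and parameter values here are hypothetical stand-ins, not the asker's actual ODE system):

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical stand-in model: y = p0 * exp(-p1 * t); residuals = model - data.
def residuals(p, t, y):
    return p[0] * np.exp(-p[1] * t) - y

rng = np.random.default_rng(0)
t = np.linspace(0, 5, 50)
y = 2.0 * np.exp(-0.7 * t) + 0.01 * rng.standard_normal(t.size)

fit = least_squares(residuals, x0=[1.0, 1.0], args=(t, y))

m, n = t.size, fit.x.size
rss = np.sum(fit.fun ** 2)               # residual sum of squares at the optimum
H = fit.jac.T @ fit.jac                  # Gauss-Newton approximation H = J^T J
cov_scaled = np.linalg.inv(H) * rss / (m - n)
param_sd = np.sqrt(np.diag(cov_scaled))  # parameter standard errors
```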

However, if I am interested in the uncertainty of a 'meta parameter', such as:

$p_{meta} = p_1 + p_2$

How do I estimate that from $cov_{scaled}$?

What if $p_{meta}$ is a slightly more complex function, such as:

$p_{meta} = p_1/p_2$

Is there a generic approach?

Unfortunately, I cannot really re-parametrize the system and fit $p_{meta}$ directly.

Fede_v
  • I suggest writing the equations in latex. – IcannotFixThis Apr 10 '14 at 12:47
  • Fixed, thanks for the suggestion, didn't realize you could use mathjax. – Fede_v Apr 10 '14 at 13:10
  • 1
    Heard of the delta method? – probabilityislogic Apr 10 '14 at 13:13
  • I've heard of it, but I'm not very familiar with how it would apply to this case while taking into account the covariance of parameters p1 and p2. – Fede_v Apr 11 '14 at 07:32
  • http://en.wikipedia.org/wiki/Taylor_expansions_for_the_moments_of_functions_of_random_variables ... but check the various threads here on the subject for plenty of caveats – Glen_b Jan 13 '15 at 06:15
  • The Jacobian approximation to the Hessian is often not very good. You should use software which will calculate the actual Hessian at the optimum and then calculate the desired estimates of the standard deviations via the delta method automatically for you. AD Model builder will do this. – dave fournier Oct 10 '15 at 15:50

2 Answers


This question is about "propagation of error" (or "error propagation") and is a very common problem. If the variables are correlated, the problem is fairly complicated; if not, you may simply sum the squares of the errors multiplied by the partial derivatives of the function: http://en.wikipedia.org/wiki/Propagation_of_uncertainty#Simplification

Wikipedia has a very nice page on it: http://en.wikipedia.org/wiki/Propagation_of_uncertainty
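The general (correlated) case is a one-liner with the full covariance matrix: for $f(p)$ with gradient $g$, first-order propagation gives $\operatorname{Var}[f] \approx g^T C g$. A minimal sketch, with purely hypothetical parameter estimates and covariance values:

```python
import numpy as np

# First-order error propagation (delta method): Var[f(p)] ≈ g^T C g,
# where g is the gradient of f at the estimates and C the parameter covariance.
def propagate(grad, cov):
    g = np.asarray(grad, dtype=float)
    return g @ cov @ g

# Hypothetical estimates and scaled covariance matrix from a fit.
p1, p2 = 2.0, 0.5
C = np.array([[0.04, 0.01],
              [0.01, 0.09]])

# For p_meta = p1 / p2, the gradient is (1/p2, -p1/p2**2).
grad = [1.0 / p2, -p1 / p2**2]
var_ratio = propagate(grad, C)
sd_ratio = np.sqrt(var_ratio)
```

The same `propagate` call handles the linear case too: for $p_1 + p_2$ the gradient is simply `(1.0, 1.0)`.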

Mark

The linear one is easy. Suppose $p_1$, $p_2$, and $p_3$ are all parameters of your model (call the vector of these parameters $P$), and let $C$ be the 3x3 variance-covariance matrix of the three parameters (your $cov_{scaled}$, not the Hessian itself). Then to get Var($p_1 + p_2$), which is Var($(1,1,0)*P$), just compute $(1,1,0)*C*(1,1,0)'$.
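In code this is a single quadratic form (a minimal sketch; the 3x3 covariance values below are hypothetical):

```python
import numpy as np

# Variance of a linear combination a^T P is a^T C a, where C is the
# parameter covariance matrix (cov_scaled from the fit).
C = np.array([[0.04, 0.01, 0.00],   # hypothetical 3x3 covariance matrix
              [0.01, 0.09, 0.02],
              [0.00, 0.02, 0.16]])

a = np.array([1.0, 1.0, 0.0])       # picks out p1 + p2
var_sum = a @ C @ a                 # = Var(p1) + Var(p2) + 2*Cov(p1, p2)
sd_sum = np.sqrt(var_sum)
```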

AlaskaRon