
I am confused about some of the derivations of the kriging equations in the Wikipedia article. $\newcommand{\Var}{\rm Var}$

It says that kriging error is given by: \begin{align} \Sigma_k^2(x_0) &= \Var\big(\hat{Z}(x_0)-Z(x_0)\big) \tag{1} \\ &= E\left(\left(Z(x_0) - \hat{Z}(x_0)\right)^2\right) \tag{2} \\ &= \sum_{i=1}^n\sum_{j=1}^nw_i(x_0)w_j(x_0)c(x_i,x_j) + \Var\big(Z(x_0)\big) - \\ &\quad\ 2\sum_{i=1}^{n}w_i(x_0)c(x_i,x_0) \tag{3} \end{align}

I am confused about how the equivalent expression $(3)$ was derived from $(2)$.

gung - Reinstate Monica
user34790
  • The exposition in that article is in a logically reversed sequence. If you look further down you will see that the expectation of $\hat{Z}(x_0) - Z(x_0)$ is constrained to $0$, whence the variance (on the right of your first equation) reduces to the expectation of $(\hat{Z}(x_0) - Z(x_0))^2$. At this point, the closely related thread at http://stats.stackexchange.com/questions/30643 should fully answer your question. – whuber Jul 11 '12 at 20:54
  • I got it, thanks. But my question was how expression (1) was derived. I mean, referring to your article is fine, but how do you get the summations in the expression? I want to know how it is derived. – user34790 Jul 11 '12 at 21:02
  • Well, "derived" means starting from something and proceeding by logical steps. Given that *some* derivation is shown on the Wikipedia page and that (essentially the same) derivation is shown in my answer to that other thread, it's really hard to figure out what kind of answer would be useful to you. Why don't you indicate (1) where you want to start from and (2) at what step you begin not to follow. Then we might be able to help you out. – whuber Jul 11 '12 at 22:04
  • OK, I have clarified that I am asking how expression 3 was derived from 2. – user34790 Jul 11 '12 at 22:17
  • Yes, and I directed you to a step-by-step derivation of expression 3 from expression 2 in an answer to a related question. At what point in that answer would you like some clarification? – whuber Jul 11 '12 at 22:19
  • I found this in your post: "This, again, is a computation. It relies on the bilinearity and symmetry of covariance, whose application is responsible for the summations in the second line." Actually, I didn't get where the summations come from when we calculate the expectation of the square of the difference between the predicted and actual values of $Z(x_0)$. – user34790 Jul 12 '12 at 03:51
  • let us [continue this discussion in chat](http://chat.stackexchange.com/rooms/4094/discussion-between-whuber-and-user34790) – whuber Jul 12 '12 at 12:34

1 Answer

I read that you are confused about how to arrive at $(3)$ from $(2)$. It is convenient to think in terms of vectors. Collect the kriging weights $w_i$ into a vector $\mathbf{w}$, so the kriging estimate can be written as $$ \newcommand{\Cov}{\rm Cov} \newcommand{\E}{\rm E} \hat{Z}(x_0)=\mathbf{w}^T\mathbf{Z}, $$ where $\mathbf{Z}$ is the vector of observed values and the kriging weights are defined by $\mathbf{w}^T=\mathbf{\Sigma_0}^T\mathbf{\Sigma}^{-1}$, with $\mathbf{\Sigma}=\Cov[\mathbf{Z},\mathbf{Z}]$ and $\mathbf{\Sigma_0}=\Cov[\mathbf{Z},Z(x_0)]$.

Then, assuming a stationary, isotropic covariance model $$\Cov[Z(x_1),Z(x_2)]=C(\|x_1-x_2\|)=C(r),$$ we can derive the kriging variance as follows: \begin{align*} \E[(Z(x_0)-\hat{Z}(x_0))^2]&=\E[(Z(x_0)-\mathbf{w}^T\mathbf{Z})^2]\\ &=\Cov[Z(x_0)-\mathbf{w}^T\mathbf{Z},Z(x_0)-\mathbf{w}^T\mathbf{Z}]\\ &=\Cov[Z(x_0),Z(x_0)]-2\Cov[\mathbf{w}^T\mathbf{Z},Z(x_0)]+\Cov[\mathbf{w}^T\mathbf{Z},\mathbf{w}^T\mathbf{Z}]\\ &=C(0)-2\mathbf{w}^T\Cov[\mathbf{Z},Z(x_0)]+\mathbf{w}^T\Cov[\mathbf{Z},\mathbf{Z}]\mathbf{w}\\ &=C(0)-2\mathbf{w}^T\mathbf{\Sigma_0}+\mathbf{w}^T\mathbf{\Sigma}\mathbf{w} \end{align*} The step from expectation to covariance uses the unbiasedness constraint $\E[Z(x_0)-\hat{Z}(x_0)]=0$, so the second moment of the error equals its variance. After that, the trick is pulling the kriging weights out of the covariances, which relies on the bilinearity and symmetry of covariance (look up the properties of covariance if those steps are not clear). What you are seeing on Wikipedia is the sum notation for the matrix operations presented above.
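As a quick sanity check, here is a small numerical sketch showing that the matrix form above agrees with Wikipedia's double-sum form. The exponential covariance model and the point locations are made up purely for illustration:

```python
import numpy as np

# Sketch verifying that the matrix form of the kriging variance,
#   C(0) - 2 w^T Sigma_0 + w^T Sigma w,
# equals the double-sum form from the article:
#   sum_i sum_j w_i w_j c(x_i, x_j) + Var(Z(x_0)) - 2 sum_i w_i c(x_i, x_0).
# The covariance model and locations below are illustrative assumptions.

def cov(r, sill=1.0, rng=2.0):
    """Isotropic exponential covariance C(r) = sill * exp(-r / rng)."""
    return sill * np.exp(-np.asarray(r, dtype=float) / rng)

x = np.array([0.0, 1.0, 2.5, 4.0])  # observation locations (1-D for simplicity)
x0 = 1.7                            # prediction location
n = len(x)

Sigma = cov(np.abs(x[:, None] - x[None, :]))  # Cov[Z, Z]
Sigma0 = cov(np.abs(x - x0))                  # Cov[Z, Z(x0)]
w = np.linalg.solve(Sigma, Sigma0)            # simple-kriging weights

# Matrix form of the kriging variance
var_matrix = cov(0.0) - 2 * w @ Sigma0 + w @ Sigma @ w

# Equivalent double-sum form, term by term as on Wikipedia
var_sum = (sum(w[i] * w[j] * Sigma[i, j] for i in range(n) for j in range(n))
           + cov(0.0)
           - 2 * sum(w[i] * Sigma0[i] for i in range(n)))

assert np.isclose(var_matrix, var_sum)
```

The double loop over `i` and `j` is exactly the summation you were asking about: it is what the quadratic form $\mathbf{w}^T\mathbf{\Sigma}\mathbf{w}$ expands into when written element by element.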

medley56