In principal component analysis, the correlation between the $i$-th principal component $y_i$ and the random vector of observed variables $\mathbf{x}$ (of dimension $p$) is defined as:
$$\operatorname{corr}(y_i, \mathbf{x}) = \lambda_i^{-1/2} \, \operatorname{cov}(y_i, \mathbf{x}) \, \Delta^{-1/2} \tag{#}$$
with $\lambda_i$ the $i$-th eigenvalue of the covariance matrix $\Sigma$ of the observed variables, and $\Delta$ the $p \times p$ diagonal matrix with the variances of $\mathbf{x}$ on its main diagonal:
$$\Delta = \operatorname{diag}(\Sigma).$$
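The formula does seem to hold numerically. Here is a minimal sketch (assuming the usual convention $y_i = e_i^{\top} \mathbf{x}$, with $e_i$ the $i$-th eigenvector of $\Sigma$; the example $\Sigma$ is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# An arbitrary positive-definite covariance matrix Sigma (p = 3).
A = rng.standard_normal((3, 3))
Sigma = A @ A.T

# Eigendecomposition: lam[i] = lambda_i, columns of E are eigenvectors e_i.
lam, E = np.linalg.eigh(Sigma)

i = 2
e_i = E[:, i]

# cov(y_i, x) = e_i' Sigma  (a row vector; it equals lambda_i * e_i').
cov_yi_x = e_i @ Sigma

# Formula (#): corr(y_i, x) = lambda_i^{-1/2} * cov(y_i, x) * Delta^{-1/2}.
D_inv_sqrt = np.diag(np.diag(Sigma) ** -0.5)   # Delta^{-1/2}
corr_formula = lam[i] ** -0.5 * cov_yi_x @ D_inv_sqrt

# Entry-wise check: corr(y_i, x_j) = cov(y_i, x_j) / (sd(y_i) * sd(x_j)),
# using var(y_i) = lambda_i and var(x_j) = Sigma[j, j].
corr_entrywise = cov_yi_x / (np.sqrt(lam[i]) * np.sqrt(np.diag(Sigma)))

print(np.allclose(corr_formula, corr_entrywise))  # True
```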
Why does this hold? Moreover, I only know two ways to compute a correlation:
1) correlation between two scalar random variables: $\operatorname{corr}(x, y) = \dfrac{\operatorname{cov}(x, y)}{\sqrt{\operatorname{var}(x)} \, \sqrt{\operatorname{var}(y)}}$
2) correlation "within" a random vector $\mathbf{x}$ of dimension $p$: $\operatorname{corr}(\mathbf{x}) = \Delta^{-1/2} \, \operatorname{cov}(\mathbf{x}) \, \Delta^{-1/2}$ (both are illustrated in the sketch below)
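For concreteness, a small sketch of both computations (the covariance matrix and sample size are just an example), confirming that the matrix form 2) is definition 1) applied entry-wise:

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw a sample of a p = 3 random vector with a known covariance.
Sigma = np.array([[2.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 1.0]])
X = rng.multivariate_normal(np.zeros(3), Sigma, size=10_000)

# 1) scalar definition, applied to the first two coordinates.
x, y = X[:, 0], X[:, 1]
corr_scalar = np.cov(x, y)[0, 1] / (x.std(ddof=1) * y.std(ddof=1))

# 2) matrix definition: corr(x) = Delta^{-1/2} cov(x) Delta^{-1/2}.
S = np.cov(X, rowvar=False)                  # sample cov(x)
D_inv_sqrt = np.diag(np.diag(S) ** -0.5)     # Delta^{-1/2}
corr_matrix = D_inv_sqrt @ S @ D_inv_sqrt

print(np.isclose(corr_scalar, corr_matrix[0, 1]))  # True: 2) is 1) entry-wise
```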
How can I extend these definitions to the case of two random vectors (or, as in (#), of a scalar random variable and a random vector)? Is this extension perhaps what is called "cross-correlation"?
Thank you in advance.