Say $X \in \mathbb{R}^n$ is a random vector with covariance matrix $\Sigma \in \mathbb{R}^{n\times n}$. By definition, the entries of the covariance matrix are covariances: $$ \Sigma_{ij} = \operatorname{Cov}(X_i, X_j). $$ It is also known that the entries of the precision matrix $\Sigma^{-1}$ encode partial correlations: $$ -\frac{(\Sigma^{-1})_{ij}}{\sqrt{(\Sigma^{-1})_{ii}\,(\Sigma^{-1})_{jj}}} = \operatorname{Corr}\bigl(X_i, X_j \mid \{X_k\}_{k \neq i,j}\bigr), $$ i.e. the off-diagonal entries determine the correlation of $X_i$ with $X_j$ conditioned on all the other variables.
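For concreteness, the precision-matrix identity above can be checked numerically (a NumPy sketch on a random covariance, not part of the question; the Gaussian conditional-covariance formula is used to compute the partial correlation directly):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
Sigma = A @ A.T + 4 * np.eye(4)   # a well-conditioned covariance matrix
P = np.linalg.inv(Sigma)          # precision matrix

# Partial correlation of X_0 and X_1 given X_2, X_3, computed directly
# from the conditional covariance (Schur complement):
i, j, rest = 0, 1, [2, 3]
S11 = Sigma[np.ix_([i, j], [i, j])]
S12 = Sigma[np.ix_([i, j], rest)]
S22 = Sigma[np.ix_(rest, rest)]
C = S11 - S12 @ np.linalg.inv(S22) @ S12.T   # Cov((X_i, X_j) | rest)
rho_direct = C[0, 1] / np.sqrt(C[0, 0] * C[1, 1])

# ... and from the entries of the precision matrix:
rho_precision = -P[i, j] / np.sqrt(P[i, i] * P[j, j])

assert np.isclose(rho_direct, rho_precision)
```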
Is there a statistical interpretation of the entries of a square root of $\Sigma$ or of $\Sigma^{-1}$? By a square root of a square matrix $A$ I mean any matrix $M$ such that $M^T M = A$. An eigenvalue decomposition of these matrices does not yield such an entry-wise interpretation, as far as I can see.
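Note that with this definition the square root is not unique; for example, both the (transposed) Cholesky factor and the symmetric square root from the eigendecomposition satisfy $M^T M = \Sigma$ (a NumPy sketch, not part of the question):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
Sigma = A @ A.T + 3 * np.eye(3)   # a positive definite covariance matrix

# Square root 1: transposed Cholesky factor (upper triangular).
L = np.linalg.cholesky(Sigma)     # lower triangular with L @ L.T == Sigma
M = L.T                           # then M^T M == Sigma

# Square root 2: symmetric square root from the eigendecomposition.
w, V = np.linalg.eigh(Sigma)
S = V @ np.diag(np.sqrt(w)) @ V.T # symmetric, so S^T S == S @ S == Sigma

assert np.allclose(M.T @ M, Sigma)
assert np.allclose(S.T @ S, Sigma)
```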