One must be careful when asking questions about the relationships between the elements of a complex random vector.
The short answer to your question is that you cannot say much in either case from the covariance (or correlation) matrix alone.
Indeed, the covariance (correlation) matrix is not enough to capture all the relationships that exist between the elements of a complex random vector. For that we also need to consider the so-called pseudo-covariance (pseudo-correlation) matrix.
The idea is to consider real and imaginary parts of the complex random vector elements and all possible relationships between them.
Explanation
Let us consider, for simplicity, a zero-mean complex random vector:
\begin{align}
\mathbf{z}=\mathbf{x}+j\mathbf{y} &=
\begin{bmatrix}
z_1 \\
\vdots \\
z_n
\end{bmatrix}=
\begin{bmatrix}
x_1+jy_1 \\
\vdots \\
x_n+jy_n
\end{bmatrix}.
\end{align}
Now take two distinct elements $z_k$ and $z_l$ of $\mathbf{z}$:
\begin{matrix}
z_k = x_k+jy_k \\
z_l = x_l+jy_l,
\end{matrix}
then there are four types of relationships we need to consider:
- Auto (A) relationships: between $x_k, y_k, x_l, y_l$ and themselves, respectively.
- Horizontal (H) relationships: between $x_k$ and $y_k$, and between $x_l$ and $y_l$.
- Vertical (V) relationships: between $x_k$ and $x_l$, and between $y_k$ and $y_l$.
- Diagonal (D) relationships: between $x_k$ and $y_l$, and between $x_l$ and $y_k$.
Note that together these cover every pairwise relationship between the real and imaginary parts.
The covariance matrix of $\mathbf{z}$ is defined as $\mathbf{C}_{zz}=\mathbb{E}[(\mathbf{z}-\boldsymbol{\mu})(\mathbf{z}-\boldsymbol{\mu})^H]$, which is equal to the correlation matrix $\mathbf{R}_{zz}=\mathbb{E}[\mathbf{z}\mathbf{z}^H]$ since we supposed $\boldsymbol{\mu}=\boldsymbol{0}$.
Expanding, we get:
$$\mathbf{R}_{zz}=\mathbb{E}[\mathbf{z}\mathbf{z}^H]=\mathbb{E}[(\mathbf{x}+j\mathbf{y})(\mathbf{x}+j\mathbf{y})^H] = \mathbb{E}[(\mathbf{x}+j\mathbf{y})(\mathbf{x}-j\mathbf{y})^T] \\ = \mathbb{E}[\mathbf{x}\mathbf{x}^T]+ \mathbb{E}[\mathbf{y}\mathbf{y}^T] + j(\mathbb{E}[\mathbf{y}\mathbf{x}^T] - \mathbb{E}[\mathbf{x}\mathbf{y}^T])
$$
$$
= \mathbf{R}_{xx}+\mathbf{R}_{yy} + j(\mathbf{R}_{xy}^T-\mathbf{R}_{xy}) \tag{I}
$$
$$
=\mathbb{E}\begin{bmatrix}
x_1^2 & \cdots & x_1x_n\\
\vdots & \ddots & \vdots\\
x_nx_1 & \cdots & x_n^2
\end{bmatrix}
+
\mathbb{E}\begin{bmatrix}
y_1^2 & \cdots & y_1y_n\\
\vdots & \ddots & \vdots\\
y_ny_1 & \cdots & y_n^2
\end{bmatrix}
+j\left(
\mathbb{E}\begin{bmatrix}
x_1y_1 & \cdots & x_ny_1\\
\vdots & \ddots & \vdots\\
x_1y_n & \cdots & x_ny_n
\end{bmatrix}
-
\mathbb{E}\begin{bmatrix}
x_1y_1 & \cdots & x_1y_n\\
\vdots & \ddots & \vdots\\
x_ny_1 & \cdots & x_ny_n
\end{bmatrix}
\right )
\\
=\mathbb{E}\begin{bmatrix}
x_1^2 & \cdots & x_1x_n\\
\vdots & \ddots & \vdots\\
x_nx_1 & \cdots & x_n^2
\end{bmatrix}
+
\mathbb{E}\begin{bmatrix}
y_1^2 & \cdots & y_1y_n\\
\vdots & \ddots & \vdots\\
y_ny_1 & \cdots & y_n^2
\end{bmatrix}
+j
\mathbb{E}\begin{bmatrix}
0 & \cdots & (x_ny_1-x_1y_n)\\
\vdots & \ddots & \vdots\\
(x_1y_n-x_ny_1) & \cdots & 0
\end{bmatrix}.
$$
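As a quick numerical sanity check of identity $(I)$, here is a small NumPy sketch of mine (the dimension, sample count, and mixing matrix are made-up illustrative choices). Since $(I)$ is an algebraic identity, it holds exactly for the sample moments as well, not just in expectation:

```python
import numpy as np

rng = np.random.default_rng(0)
n, N = 3, 100_000  # illustrative dimension and sample count

# Draw correlated zero-mean real and imaginary parts.
A = rng.standard_normal((2 * n, 2 * n))
s = rng.standard_normal((N, 2 * n)) @ A.T   # each row: one realization of [x; y]
x, y = s[:, :n], s[:, n:]
z = x + 1j * y

# Sample correlation matrices (zero mean, so correlation == covariance).
R_zz = z.T @ z.conj() / N          # E[z z^H]
R_xx = x.T @ x / N                 # E[x x^T]
R_yy = y.T @ y / N                 # E[y y^T]
R_xy = x.T @ y / N                 # E[x y^T]

# Identity (I): R_zz = R_xx + R_yy + j (R_xy^T - R_xy)
assert np.allclose(R_zz, R_xx + R_yy + 1j * (R_xy.T - R_xy))
```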
What we end up with is: the (A) and (V) relationships from $\mathbf{R}_{xx}$ and $\mathbf{R}_{yy}$, and the (D) relationships from $\mathbf{R}_{xy}$, BUT we lose the (H) relationships (the diagonal terms $x_ky_k$ cancel out).
To compensate for that, and for the fact that both $\mathbf{R}_{xx}$ and $\mathbf{R}_{yy}$ define the real-part of $\mathbf{R}_{zz}$, we define the pseudo-covariance matrix of $\mathbf{z}$ as $\mathbf{\bar{C}}_{zz}=\mathbb{E}[(\mathbf{z}-\boldsymbol{\mu})(\mathbf{z}-\boldsymbol{\mu})^T]$, and similarly the pseudo-correlation matrix $\mathbf{\bar{R}}_{zz}=\mathbb{E}[\mathbf{z}\mathbf{z}^T]$, by simply replacing the Hermitian transpose with a transpose.
With a development similar to the one above, we end up with:
$$
\mathbf{\bar{R}}_{zz}
= \mathbf{R}_{xx}-\mathbf{R}_{yy} + j(\mathbf{R}_{xy}^T+\mathbf{R}_{xy}) \tag{II}\\
=\mathbb{E}\begin{bmatrix}
x_1^2 & \cdots & x_1x_n\\
\vdots & \ddots & \vdots\\
x_nx_1 & \cdots & x_n^2
\end{bmatrix}
-
\mathbb{E}\begin{bmatrix}
y_1^2 & \cdots & y_1y_n\\
\vdots & \ddots & \vdots\\
y_ny_1 & \cdots & y_n^2
\end{bmatrix}
+j
\mathbb{E}\begin{bmatrix}
2x_1y_1 & \cdots & (x_ny_1+x_1y_n)\\
\vdots & \ddots & \vdots\\
(x_1y_n+x_ny_1) & \cdots & 2x_ny_n
\end{bmatrix}.
$$
So here we get all the relationships (A), (H), (V), and (D). Nonetheless, we cannot extract them unless we use the covariance and pseudo-covariance matrices jointly. For example, to get the (A) and (V) relationships of $\mathbf{x}$ we add $(I)$ and $(II)$, take the real part, and halve it: $\mathbf{R}_{xx}=\frac{1}{2}\operatorname{Re}(\mathbf{R}_{zz}+\mathbf{\bar{R}}_{zz})$.
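The following NumPy sketch (same made-up setup as before) checks identity $(II)$ and the joint extraction: combining the sample correlation and pseudo-correlation matrices recovers $\mathbf{R}_{xx}$, $\mathbf{R}_{yy}$, and $\mathbf{R}_{xy}$ exactly:

```python
import numpy as np

rng = np.random.default_rng(1)
n, N = 3, 100_000  # illustrative dimension and sample count

# Correlated zero-mean real and imaginary parts.
A = rng.standard_normal((2 * n, 2 * n))
s = rng.standard_normal((N, 2 * n)) @ A.T
x, y = s[:, :n], s[:, n:]
z = x + 1j * y

R_xx, R_yy, R_xy = x.T @ x / N, y.T @ y / N, x.T @ y / N
R_zz  = z.T @ z.conj() / N        # correlation        E[z z^H]
Rp_zz = z.T @ z / N               # pseudo-correlation E[z z^T]

# Identity (II): R̄_zz = R_xx - R_yy + j (R_xy^T + R_xy)
assert np.allclose(Rp_zz, R_xx - R_yy + 1j * (R_xy.T + R_xy))

# Joint extraction: Re((I)+(II))/2 gives R_xx, Re((I)-(II))/2 gives R_yy,
# and Im((I)+(II))/2 gives R_xy^T.
assert np.allclose(R_xx, (R_zz + Rp_zz).real / 2)
assert np.allclose(R_yy, (R_zz - Rp_zz).real / 2)
assert np.allclose(R_xy.T, (R_zz + Rp_zz).imag / 2)
```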
The special, simple case of circular complex random vectors
A simple case to deal with is the case of a circular complex random vector $\mathbf{z}$ for which we have $\mathbf{\bar{C}_{zz}}=\boldsymbol{0}$, otherwise we say that the complex random vector is noncircular.
From $(II)$ (replacing $\mathbf{R}$ with $\mathbf{C}$) we can see that $\mathbf{\bar{C}_{zz}}=\boldsymbol{0}$ implies:
$$\mathbf{C}_{xx}=\mathbf{C}_{yy} \tag{*}$$
$$\mathbf{C}_{xy}=-\mathbf{C}_{xy}^T \tag{**}$$
Equation $(*)$ indicates (looking at the diagonal elements) that for each element of $\mathbf{z}$ the variance of the real part $x_i$ equals the variance of the imaginary part $y_i$, $i=1,\dots, n$, i.e. $\mathbb{E}[(x_i-\mu_{x_i})^2]=\mathbb{E}[(y_i-\mu_{y_i})^2]$ (it helps to relate the name circular to equal variance along the real and imaginary axes, which spreads the points over a circular region!). Moreover, the covariances are equal, i.e. $\mathbb{E}[(x_i-\mu_{x_i})(x_j-\mu_{x_j})]=\mathbb{E}[(y_i-\mu_{y_i})(y_j-\mu_{y_j})]$, $i\ne j$.
Equation $(**)$ shows that $\mathbf{C}_{xy}$ is skew-symmetric, which implies null diagonal elements, i.e. $\mathbb{E}[(x_i-\mu_{x_i})(y_i-\mu_{y_i})]=0$, indicating that $x_i$ and $y_i$ are uncorrelated.
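Properties $(*)$ and $(**)$ are easy to observe numerically. In my sketch below (the mixing matrix `B` and sample count are made-up choices), a circular vector is built by linearly transforming iid circular Gaussian entries; its sample pseudo-covariance is then close to zero, and $(*)$, $(**)$ follow:

```python
import numpy as np

rng = np.random.default_rng(2)
n, N = 3, 200_000  # illustrative dimension and sample count

# iid circular complex Gaussian entries: independent real and imaginary
# parts with equal variance 1/2 each.
w = (rng.standard_normal((N, n)) + 1j * rng.standard_normal((N, n))) / np.sqrt(2)

# A linear transform of a circular vector is still circular.
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
z = w @ B.T                       # rows are realizations of z = B w

C_zz  = z.T @ z.conj() / N        # covariance, ≈ B B^H
Cp_zz = z.T @ z / N               # pseudo-covariance, ≈ 0 by circularity

assert np.allclose(C_zz, B @ B.conj().T, atol=0.15)
assert np.allclose(Cp_zz, 0, atol=0.15)

# Consequences (*) and (**): C_xx ≈ C_yy and C_xy ≈ skew-symmetric,
# so the real and imaginary parts of each z_i are uncorrelated.
x, y = z.real, z.imag
C_xx, C_yy, C_xy = x.T @ x / N, y.T @ y / N, x.T @ y / N
assert np.allclose(C_xx, C_yy, atol=0.15)
assert np.allclose(C_xy, -C_xy.T, atol=0.15)
```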
Back to the OP questions
What relationships exist between elements of $\mathbf{z}$ if $\mathbf{C}_{zz}$ is (complex) diagonal?
By now we know that we also need some information about $\mathbf{\bar{C}}_{zz}$.
If we suppose circular complex random variables then we have from $(I), (*)$, and $(**)$:
$$\mathbf{C}_{zz}=2\mathbf{C}_{xx}-2j\mathbf{C}_{xy}=2\mathbf{C}_{xx}=\text{diag}(\alpha_1, \dots, \alpha_n),$$
where $\mathbf{C}_{xy}=\boldsymbol{0}$: the diagonal structure forces its off-diagonal elements to vanish, and its diagonal elements are already zero by skew-symmetry.
Finally, we have:
- (H) relationships of $\mathbf{x}$ and $\mathbf{y}$ reduce to uncorrelatedness (circular case).
- (D) relationships of $\mathbf{x}$ and $\mathbf{y}$ reduce to uncorrelatedness ($\mathbf{C}_{xy}=\boldsymbol{0}$).
- (A) relationships of $\mathbf{x}$ are the same as those of $\mathbf{y}$ (circular case) and are such that $\mathbb{E}[(x_i-\mu_{x_i})^2]=\alpha_i/2$ (diagonal matrix).
- (V) relationships of $\mathbf{x}$ are the same as those of $\mathbf{y}$ (circular case) and reduce to uncorrelatedness (diagonal matrix).
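These four conclusions can be checked numerically. A minimal sketch of mine (the variances $\alpha_i$ are made-up values) builds a circular $\mathbf{z}$ with $\mathbf{C}_{zz}=\text{diag}(\alpha_1,\dots,\alpha_n)$:

```python
import numpy as np

rng = np.random.default_rng(3)
n, N = 3, 200_000
alpha = np.array([1.0, 2.0, 3.0])   # made-up diagonal of C_zz

# Circular z with C_zz = diag(alpha): independent entries whose real and
# imaginary parts each carry half the variance, alpha_i / 2.
x = rng.standard_normal((N, n)) * np.sqrt(alpha / 2)
y = rng.standard_normal((N, n)) * np.sqrt(alpha / 2)
z = x + 1j * y

C_zz = z.T @ z.conj() / N
assert np.allclose(C_zz, np.diag(alpha), atol=0.1)

# (A): var(x_i) = var(y_i) = alpha_i / 2
assert np.allclose((x ** 2).mean(axis=0), alpha / 2, atol=0.1)
assert np.allclose((y ** 2).mean(axis=0), alpha / 2, atol=0.1)

# (H), (D): all cross-correlations between x and y vanish;
# (V): x (and y) has uncorrelated components.
assert np.allclose(x.T @ y / N, 0, atol=0.1)
assert np.allclose(x.T @ x / N, np.diag(alpha / 2), atol=0.1)
```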
If we suppose noncircular complex random variables, then all we know (with $\mathbf{C}_{zz}=\text{diag}(\alpha_1, \dots, \alpha_n)$) is:
- $\mathbf{C}_{xx}+\mathbf{C}_{yy} = \text{diag}(\alpha_1, \dots, \alpha_n)$: the (V) relationships cancel each other, $\mathbb{E}[(x_i-\mu_{x_i})(x_j-\mu_{x_j})]=-\mathbb{E}[(y_i-\mu_{y_i})(y_j-\mu_{y_j})]$, $i\ne j$, but need not vanish individually.
- $\mathbf{C}_{xy}^T-\mathbf{C}_{xy}$ is both diagonal and skew-symmetric, hence $\boldsymbol{0}$, i.e. $\mathbf{C}_{xy}$ is symmetric: the (D) relationships are pairwise equal, $\mathbb{E}[(x_i-\mu_{x_i})(y_j-\mu_{y_j})]=\mathbb{E}[(x_j-\mu_{x_j})(y_i-\mu_{y_i})]$, but again need not vanish.
- In particular, the diagonal of $\mathbf{C}_{zz}$ is real (the $\alpha_i$ are the variances $\mathbb{E}[|z_i-\mu_i|^2]$), so a diagonal $\mathbf{C}_{zz}$ is necessarily real diagonal.
Hence, to know more about these relationships we need to compute the pseudo-covariance matrix.
What relationships exist between elements of $\mathbf{z}$ if $\mathbf{R}_{zz}$ is (complex) diagonal?
If $\boldsymbol{\mu}\ne \boldsymbol{0}$, we have:
$$\mathbf{C}_{zz}=\mathbf{R}_{zz}-\boldsymbol{\mu}\boldsymbol{\mu}^H.$$
If at least two elements of $\boldsymbol{\mu}$ are non-zero, then $\boldsymbol{\mu}\boldsymbol{\mu}^H$ is non-diagonal (its off-diagonal elements $\mu_i\mu_j^*$ are products of non-zero complex numbers, hence non-zero), and so is $\mathbf{C}_{zz}$. Then, considering this non-diagonal structure and, again, using $\mathbf{\bar{C}}_{zz}$, we can carry out an analysis similar to the one above.
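The relation $\mathbf{C}_{zz}=\mathbf{R}_{zz}-\boldsymbol{\mu}\boldsymbol{\mu}^H$ is easy to confirm on sample moments; in my sketch below the mean vector is a made-up value, and the identity holds exactly when the sample mean is used for centering:

```python
import numpy as np

rng = np.random.default_rng(4)
n, N = 3, 100_000
mu = np.array([1 + 1j, 2 - 1j, 0.5j])   # made-up non-zero mean

z = (rng.standard_normal((N, n)) + 1j * rng.standard_normal((N, n))) + mu

R_zz = z.T @ z.conj() / N               # correlation E[z z^H]
m = z.mean(axis=0)                      # sample mean (≈ mu)
C_zz = (z - m).T @ (z - m).conj() / N   # covariance

# C_zz = R_zz - m m^H holds exactly for sample moments.
assert np.allclose(C_zz, R_zz - np.outer(m, m.conj()))

# mu mu^H has non-zero off-diagonal entries whenever two means are non-zero,
# so a diagonal R_zz cannot yield a diagonal C_zz here.
assert np.all(np.abs(np.outer(mu, mu.conj())) > 0)
```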