I am working with some imaging data: let $X$ be a $t\times v$ matrix, where $t$ is a time dimension and $v$ is the number of pixels in my images, centered and scaled to unit variance.
I am interested in the first SVD mode of this matrix. When I compute the SVD of $X$,
$$X = U\Sigma V^T,$$
I find that the first left-singular vector (the first column of $U$) is 99.9% correlated with the row-wise mean of $X$. I suspect that they are in fact exactly proportional and that the small difference is due to floating-point error. Is this an artefact of my data, or can this be proven in general?
I am aware of this question, but its answer assumes the correlation matrix of $X$ has equal off-diagonal elements, which is not the case for my data.
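For concreteness, here is a minimal NumPy sketch of the comparison I am doing, on synthetic data (the dominant temporal component and the noise level are assumptions for illustration, not my actual data):

```python
import numpy as np

rng = np.random.default_rng(0)
t, v = 50, 400

# Synthetic data: one strong shared temporal component,
# each pixel loading on it with a positive weight, plus noise.
common = rng.standard_normal(t)
weights = rng.uniform(0.5, 1.5, v)
X = np.outer(common, weights) + 0.1 * rng.standard_normal((t, v))

# Center and scale each pixel (column) to unit variance.
X = (X - X.mean(axis=0)) / X.std(axis=0)

# Compare the first left-singular vector with the row-wise mean.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
u1 = U[:, 0]
row_mean = X.mean(axis=1)

# Absolute value because the sign of a singular vector is arbitrary.
corr = abs(np.corrcoef(u1, row_mean)[0, 1])
print(corr)
```

With one mode dominating as above, `corr` comes out very close to 1, which is exactly the behaviour I observe on my real data.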