Let's say I have an SVD of a matrix $A = U \Sigma V^T$, $A \in \mathbb{R}^{n \times m}$, and I'm using the top-$k$ components corresponding to $\sigma_1, \dots, \sigma_k$, the $k$ largest values on the diagonal of $\Sigma$. These are also the square roots of the $k$ largest eigenvalues of $A^T A$.
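For concreteness, here is a minimal NumPy check of that relationship (the matrix and its dimensions are arbitrary, just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))  # arbitrary n x m matrix, n=6, m=4

# Singular values of A, returned in descending order
sigma = np.linalg.svd(A, compute_uv=False)

# Eigenvalues of A^T A, sorted descending (eigvalsh returns ascending order)
eigvals = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]

print(np.allclose(sigma, np.sqrt(eigvals)))  # True
```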
I encountered the ratio $$\frac{\sum_{i=1}^{k} \sigma_i^2}{\sum_{i=1}^{r} \sigma_i^2}, \qquad r = \min(n, m),$$ being interpreted as "preserved variability". This would make sense if $A^T A$ were (proportional to) the covariance matrix of the data, but it is not, unless each column of $A$ has zero mean. What am I missing here? Is there an implicit assumption that the SVD is only applied to a centered (column-mean-zero) $A$?
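To illustrate the issue I mean, here is a small numerical sketch, assuming rows of $A$ are observations and columns are variables (the data and the helper `top_k_ratio` are made up for this example). The ratio of squared singular values matches the fraction of variance captured by the top-$k$ principal components only after each column is centered:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, k = 200, 5, 2
A = rng.standard_normal((n, m)) + 10.0  # large nonzero column means

def top_k_ratio(M, k):
    """Ratio of the top-k squared singular values to the sum of all of them."""
    s = np.linalg.svd(M, compute_uv=False)
    return (s[:k] ** 2).sum() / (s ** 2).sum()

Ac = A - A.mean(axis=0)  # center each column

# Fraction of total variance carried by the top-k eigenvalues
# of the covariance matrix (columns treated as variables)
cov = np.cov(A, rowvar=False)
eig = np.sort(np.linalg.eigvalsh(cov))[::-1]
var_ratio = eig[:k].sum() / eig.sum()

print(top_k_ratio(A, k))   # dominated by the mean direction; close to 1
print(top_k_ratio(Ac, k))  # matches var_ratio
print(var_ratio)
```

On the uncentered $A$, the leading singular vector mostly captures the mean offset, so the ratio is nearly 1 regardless of the actual spread of the data; after centering, $A_c^T A_c = (n-1)\,\mathrm{Cov}(A)$, and the two ratios coincide.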