
This question is a generalisation of Eigenvalues of $AB$ and $BA$ where $A$ and $B$ are rectangular matrices, which itself is a generalisation of Eigenvalues of $AB$ and $BA$ where $A$ and $B$ are square matrices.

Let $A$ be an $m \times n$ matrix and $B$ an $n \times m$ matrix. Obviously, the matrix products $AB$ and $BA$ are possible. Assume $n \leq m$, such that $AB$ is a weakly larger matrix than $BA$.

Facts:

  1. The rank of both $AB$ and $BA$ is at most $n$ (link 1)
  2. The number of non-zero eigenvalues of both $AB$ and $BA$ is at most $n$ (link 2)
  3. If the eigenvalues of $AB$ are $\lambda_1, \ldots, \lambda_n$, the eigenvalues of $BA$ are also $\lambda_1, \ldots, \lambda_n$ (link 3).

Questions:

  1. If the singular values of $AB$ are $\sigma_1, \ldots, \sigma_n$, what can be said about the singular values of $BA$?
  2. What does Fact 3, compared with the answer to Question 1, say about the differences and the similarities between eigenvalues and singular values?
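For concreteness, here is a small numerical sketch (my own illustration, not part of the facts above; the sizes and seed are arbitrary) that checks Fact 3 and prints the singular values of $AB$ and $BA$ side by side:

```python
import numpy as np

# Illustrative sketch only: a random rectangular pair A (m x n), B (n x m).
rng = np.random.default_rng(0)
m, n = 5, 3
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, m))

# Fact 3: the nonzero eigenvalues of AB and BA coincide.
eig_AB = np.linalg.eigvals(A @ B)                   # m eigenvalues, at most n nonzero
eig_BA = np.linalg.eigvals(B @ A)                   # n eigenvalues
top_AB = eig_AB[np.argsort(-np.abs(eig_AB))[:n]]    # drop the (numerically) zero ones
print(np.allclose(np.sort_complex(top_AB), np.sort_complex(eig_BA)))  # True

# Question 1: how do the singular values of AB and BA compare?
print(np.linalg.svd(A @ B, compute_uv=False))
print(np.linalg.svd(B @ A, compute_uv=False))
```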
LBogaardt
  • @Ian Are the eigenvalues of $(AB)(B^TA^T)$ related to those of $(A^TB^T)(BA)$? I'm not seeing the connection. – Erick Wong Sep 28 '17 at 08:08
  • @ErickWong You're right, I made a simple mistake. – Ian Sep 28 '17 at 15:42
  • There is a relation between the singular values but a subtle one: Singular values of both $AB$ and $BA$ satisfy the same set of linear inequalities, called Horn inequalities, in terms of singular values of $A$ and $B$. I can tell you more if you are interested. – Moishe Kohan Oct 01 '17 at 21:55

1 Answer


There is almost no relationship. For example, if we take $$A = \begin{bmatrix}x & 1 \\ 0 & 0\end{bmatrix}, \quad B = \begin{bmatrix}0 & 0 \\ 1 & y \end{bmatrix}$$ then the singular values of $AB$ are the square roots of the eigenvalues of $$(AB)^{\mathsf T} AB = \begin{bmatrix}1 & y \\ y & y^2\end{bmatrix}$$ so they are $\sqrt{1+y^2}$ and $0$. Similarly, the singular values of $BA$ are $\sqrt{1+x^2}$ and $0$. Even in this simple example, the nonzero singular value in one case can vary pretty much independently of the other case. (They must both be at least $1$, but we can tweak that by changing the $1$ in the matrices to some small $\epsilon>0$.)
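As a quick sanity check, here is a numerical sketch of this example (the values chosen for $x$ and $y$ are arbitrary):

```python
import numpy as np

# Verify the 2x2 example with arbitrary sample values for x and y.
x, y = 3.0, 7.0
A = np.array([[x, 1.0], [0.0, 0.0]])
B = np.array([[0.0, 0.0], [1.0, y]])

print(np.linalg.svd(A @ B, compute_uv=False), np.sqrt(1 + y**2))  # [7.0711 0.]  7.0711
print(np.linalg.svd(B @ A, compute_uv=False), np.sqrt(1 + x**2))  # [3.1623 0.]  3.1623
```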

By taking determinants, we can conclude that the product of the singular values of $AB$ is $|\det(AB)|$, while the product of the singular values of $BA$ is $|\det(BA)|$. So if $A$ and $B$ are both square matrices, the singular values in both cases have an equal product $|\det(A)\det(B)|$, which is some amount of dependence.
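A brief numerical check of that determinant observation (random square matrices, sizes arbitrary):

```python
import numpy as np

# For square A, B the singular values of AB and of BA each multiply to |det(A) det(B)|.
rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

target = abs(np.linalg.det(A) * np.linalg.det(B))
print(np.isclose(np.prod(np.linalg.svd(A @ B, compute_uv=False)), target))  # True
print(np.isclose(np.prod(np.linalg.svd(B @ A, compute_uv=False)), target))  # True
```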

On the other hand, by taking direct sums (block-diagonal copies) of the construction above, as sketched in the code below, we can start with $2n \times 2n$ square matrices $A$ and $B$ where

  • $AB$ has singular values $\sigma_1, \sigma_2, \dots, \sigma_n, 0, 0, \dots, 0$,
  • $BA$ has singular values $\sigma'_1, \sigma'_2, \dots, \sigma'_n, 0, 0, \dots, 0$,
  • and these are free to vary independently of each other.
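Here is a minimal sketch of that block-diagonal construction (the parameter values are arbitrary, and `scipy.linalg.block_diag` is used only for convenience):

```python
import numpy as np
from scipy.linalg import block_diag

# Stack n copies of the 2x2 blocks above along the diagonal, with
# independent parameters x_i (controlling BA) and y_i (controlling AB).
xs = [2.0, 5.0, 11.0]   # arbitrary sample values
ys = [0.5, 3.0, 8.0]
A = block_diag(*[np.array([[x, 1.0], [0.0, 0.0]]) for x in xs])
B = block_diag(*[np.array([[0.0, 0.0], [1.0, y]]) for y in ys])

print(np.linalg.svd(A @ B, compute_uv=False))  # sqrt(1 + y_i^2), then zeros
print(np.linalg.svd(B @ A, compute_uv=False))  # sqrt(1 + x_i^2), then zeros
```

The nonzero singular values of $AB$ depend only on the $y_i$ and those of $BA$ only on the $x_i$, which makes the independence explicit.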
Misha Lavrov
  • Thank you. This is somewhat surprising to me as I previously thought of singular values as just a generalisation of eigenvalues. You've shown that, although $AB$ and $BA$ have identical eigenvalues, their singular values are independent. Is there an intuitive/visual way to understand the difference between singular values and eigenvalues? – LBogaardt Oct 01 '17 at 08:24
  • The eigenvalue decomposition involves finding a single basis (of any kind) on which the matrix plays nicely. The singular value decomposition involves finding two different *orthonormal* bases on which the matrix plays nicely. In the end, it turns out that $A$ maps the eigenbasis of $BA$ to multiples of the eigenbasis of $AB$, hence your fact 3. For the equivalent to hold for SVD, $A$ would have to map orthonormal vectors to other orthonormal vectors, which would require special properties of $A$. – Misha Lavrov Oct 01 '17 at 16:31
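(To spell out the claim in the last comment: if $BAv = \lambda v$ with $\lambda \neq 0$, then $(AB)(Av) = A(BAv) = \lambda (Av)$, and $Av \neq 0$ because $BAv \neq 0$, so $A$ does indeed send eigenvectors of $BA$ to eigenvectors of $AB$ with the same eigenvalue.)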