
If $A$ is a symmetric matrix whose eigenvalues all have absolute value less than $1$, then $$\det(I-A)\neq 0,$$ where $I$ is the identity matrix.

Why is that inequality correct?

Ben Grossmann
mokebe

2 Answers


If you see "symmetric" (even normal is enough), diagonalize! Write $A = U\Lambda U^*$ with $U$ unitary and $\Lambda = \operatorname{diag}(\lambda_1,\dots,\lambda_n)$. Then
\begin{align}
\det(I - A) &= \det(UIU^* - U\Lambda U^*) \\
&= \det(U(I-\Lambda)U^*) \\
&= \det(U)\det(I-\Lambda)\det(U^*) \\
&= \det(I-\Lambda) \\
&= \prod_{j=1}^n (1-\lambda_j),
\end{align}
so $\det(I - A) = 0$ if and only if $\lambda_j = 1$ for some $j$. Since every eigenvalue satisfies $|\lambda_j| < 1$, no factor vanishes, and hence $\det(I-A) \neq 0$.
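For concreteness, here is a minimal NumPy sanity check of the product formula above; the matrix $A$ is just an arbitrary random symmetric example, rescaled so that its eigenvalues lie strictly inside $(-1,1)$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random symmetric matrix, rescaled so every eigenvalue lies in (-0.9, 0.9).
n = 5
B = rng.standard_normal((n, n))
A = (B + B.T) / 2
A = 0.9 * A / np.max(np.abs(np.linalg.eigvalsh(A)))

lam = np.linalg.eigvalsh(A)
print("eigenvalues:        ", lam)                            # all satisfy |lambda_j| < 1
print("prod(1 - lambda_j) =", np.prod(1 - lam))
print("det(I - A)         =", np.linalg.det(np.eye(n) - A))   # agrees, and is nonzero
```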

Eman Yalpsid
  • Now I feel bad for not including that the last line is naturally true for any $A$, but I don't want to bump this post. – Eman Yalpsid Mar 28 '17 at 17:03

Mosquito-nuking solution: since $A$ is symmetric, its spectral radius is $\rho(A) = \max_j |\lambda_j| < 1$, so $$\sum_{n=0}^{\infty}A^n \text{ converges}$$ (see Neumann series and spectral radius) and $$(I - A)^{-1} = \sum_{n=0}^{\infty}A^n \text{ exists}.$$ In particular $I - A$ is invertible, so $\det(I-A)\neq 0$.
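A minimal NumPy sketch of the same idea (the test matrix is an arbitrary random symmetric one, rescaled to spectral radius $1/2$): the partial sums $S_N = \sum_{n=0}^{N}A^n$ converge to $(I-A)^{-1}$, which in particular exists.

```python
import numpy as np

rng = np.random.default_rng(1)

# Random symmetric matrix with spectral radius 1/2, so the Neumann series converges.
n = 5
B = rng.standard_normal((n, n))
A = (B + B.T) / 2
A = 0.5 * A / np.max(np.abs(np.linalg.eigvalsh(A)))

# Partial sums S_N = I + A + A^2 + ... + A^N.
S = np.eye(n)
P = np.eye(n)
for _ in range(200):
    P = P @ A
    S = S + P

err = np.max(np.abs(S - np.linalg.inv(np.eye(n) - A)))
print("max |S_N - (I - A)^(-1)| =", err)   # essentially zero
```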