
Here is the question I am having difficulty solving.

I know I have to prove that the determinant of a matrix is nonzero, but I am not sure how to proceed.

Please guide me.

Thanks a lot in advance.

Sitanshu

3 Answers


A Hermitian matrix $A$ admits the following decomposition (diagonalisation): $$A V = V \Lambda\tag{*}$$ where $\Lambda=\textrm{diag}(\lambda_1,\ldots,\lambda_n)$ and $V=[X_1,\ldots,X_n]$. We also know that the columns of $V$ are linearly independent (proof below), i.e. $\det(V)\neq 0$.

Because $A$ is Hermitian, $V$ can be chosen unitary: $V^*V=I$, where $I$ is the identity matrix and $V^*$ is the conjugate transpose. In particular $\det(V^*)=1/\det(V)$. From $(*)$, $$\det(A) = \det(V\Lambda V^*)=\det(\Lambda)=\prod_{i=1}^n{\lambda_i}.$$ Therefore $A$ is nonsingular if and only if $\lambda_i\neq 0$ for all $i$.


To prove that $V$ is nonsingular, you can show that when $A$ is Hermitian, the equations $$AV=V\Lambda \qquad UA=\Lambda U$$ force $U=V^*$, and therefore the matrix $V$ is unitary.
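The identities above can be checked on a small concrete example. Here is a sketch in plain Python (my own illustration, not part of the original answer); the $2\times 2$ Hermitian matrix, its eigenvalues $4$ and $1$, and the eigenvectors are hand-computed assumptions:

```python
# Illustrative example: A = [[2, 1-1j], [1+1j, 3]] is Hermitian with
# trace 5 and determinant 4, so its eigenvalues are lam1 = 4, lam2 = 1
# (real and distinct). The eigenvectors below were computed by hand.
A = [[2 + 0j, 1 - 1j],
     [1 + 1j, 3 + 0j]]
lam1, lam2 = 4, 1

v1 = [1 - 1j, 2 + 0j]      # eigenvector for lam1 = 4
v2 = [-1 + 1j, 1 + 0j]     # eigenvector for lam2 = 1

def matvec(M, x):
    """Multiply a square matrix (list of rows) by a vector."""
    return [sum(M[i][k] * x[k] for k in range(len(x))) for i in range(len(M))]

# A v = lam v for both eigenpairs.
assert matvec(A, v1) == [lam1 * c for c in v1]
assert matvec(A, v2) == [lam2 * c for c in v2]

# det(V) for V = [v1 | v2] is nonzero, and det(A) = lam1 * lam2.
detV = v1[0] * v2[1] - v2[0] * v1[1]
detA = A[0][0] * A[1][1] - A[0][1] * A[1][0]
assert detV != 0
assert detA == lam1 * lam2
```

All entries are Gaussian integers, so the complex arithmetic is exact and the equality checks are safe here.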

HBR

Let $i\ne j$. We have: $\newcommand\inner[2]{\langle #1, #2 \rangle}$

$$\lambda_i\inner{X_i}{X_j} = \inner{\lambda_i X_i}{X_j} = \inner{AX_i}{X_j} = \inner{X_i}{AX_j} = \inner{X_i}{\lambda_j X_j} = \overline{\lambda_j}\inner{X_i}{X_j}$$

Assume $\inner{X_i}{X_j} \ne 0$. Cancelling $\inner{X_i}{X_j}$ gives $\lambda_i = \overline{\lambda_j}$, and since the eigenvalues of a Hermitian matrix are real, $\lambda_i = \lambda_j$. But this contradicts the assumption that all the eigenvalues are distinct.

$\inner{X_i}{X_j} = 0$ follows, so the set $\{X_1, \ldots, X_n\}$ is orthogonal and thus linearly independent. This implies that $C$, the matrix whose columns are $X_1, \ldots, X_n$, is nonsingular.
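A concrete instance of this orthogonality argument, sketched in plain Python (my own illustration, not part of the original answer). The Hermitian matrix $A = \begin{pmatrix} 2 & 1-i \\ 1+i & 3 \end{pmatrix}$ has distinct eigenvalues $4$ and $1$; the hand-computed eigenvectors below must therefore be orthogonal under $\inner{x}{y} = \sum_k \overline{x_k}\, y_k$:

```python
# Eigenvectors of the Hermitian matrix A = [[2, 1-1j], [1+1j, 3]]
# for the distinct eigenvalues 4 and 1 (computed by hand).
v1 = [1 - 1j, 2 + 0j]      # eigenvector for eigenvalue 4
v2 = [-1 + 1j, 1 + 0j]     # eigenvector for eigenvalue 1

def inner(x, y):
    """Complex inner product, conjugate-linear in the first argument."""
    return sum(xk.conjugate() * yk for xk, yk in zip(x, y))

# Distinct eigenvalues of a Hermitian matrix -> orthogonal eigenvectors.
assert inner(v1, v2) == 0
```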

mechanodroid

The columns are linearly independent, since they are eigenvectors that correspond to distinct eigenvalues. Therefore, the matrix is non-singular.

José Carlos Santos
  • How? I am having difficulty understanding this concept. Could you please explain a bit more? I understand that a nonsingular matrix has nonzero determinant, but how is nonsingularity proven from $$ \lambda_1 X_1 + \lambda_2 X_2 + \cdots = 0 $$ implying $\lambda_1 = \lambda_2 = \cdots = 0$? – Sitanshu Sep 02 '17 at 11:20
  • @Sitanshu That's a standard Linear Algebra theorem. You'll find a proof [here](https://math.stackexchange.com/questions/29371/how-to-prove-that-eigenvectors-from-different-eigenvalues-are-linearly-independe). – José Carlos Santos Sep 02 '17 at 11:28
  • I think the OP is asking how linear independence of matrix columns implies nonsingularity of the matrix. – mechanodroid Sep 02 '17 at 11:31
  • Ok. Thanks a lot. Yes that's what I was intending to ask. – Sitanshu Sep 02 '17 at 11:32
  • You can find a proof here: https://math.stackexchange.com/questions/529833/how-to-prove-invertibility-of-a-linear-independent-column-matrix – mechanodroid Sep 02 '17 at 11:38
  • @Sitanshu Because that's one of the properties of determinants: the determinant is $0$ if and only if the columns are linearly dependent (and the same holds for rows). See [here](https://math.stackexchange.com/questions/79356/using-the-determinant-to-verify-linear-independence-span-and-basis), for instance. – José Carlos Santos Sep 02 '17 at 11:39
  • Yes, I studied that if $\det A \neq 0$ then the columns are linearly independent. But is the converse true, that is, if the column vectors are linearly independent, is $\det A \neq 0$ and the matrix thus nonsingular? – Sitanshu Sep 02 '17 at 11:48
  • @Sitanshu Yes, it is. – José Carlos Santos Sep 02 '17 at 12:34
  • Ok. Thanks. This helped me in understanding linear algebra better :) – Sitanshu Sep 03 '17 at 04:27
  • @Sitanshu Please mark one of the answers as the accepted one. – José Carlos Santos Sep 03 '17 at 08:46