I have been looking into ridge regression as a method to address multicollinearity in data.
I am aware that multicollinearity can cause high variance in coefficient estimates. I have seen equations such as this:
$\operatorname{Var}(\hat{\beta}) = \sigma^2(X'X)^{-1}$
I have read that when perfect multicollinearity is present, $X'X$ is singular and hence has no inverse. When multicollinearity is present (but not perfect), the matrix becomes ill-conditioned. This apparently causes the entries of $(X'X)^{-1}$ to become very large, inflating the variance of $\hat{\beta}$.
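To convince myself of this I put together a small NumPy sketch (the sample size and noise level are arbitrary choices of mine, just for illustration): with two nearly collinear predictors, the diagonal of $(X'X)^{-1}$, and hence the coefficient variances, blows up compared to an independent design.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# x2 is almost a copy of x1, so the design is nearly collinear
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)   # tiny independent noise
X_collinear = np.column_stack([x1, x2])

# An uncorrelated design of the same size, for comparison
X_independent = rng.normal(size=(n, 2))

for name, X in [("nearly collinear", X_collinear), ("independent", X_independent)]:
    XtX_inv = np.linalg.inv(X.T @ X)
    # var(beta_hat) = sigma^2 * diag((X'X)^-1); with sigma^2 = 1 the diagonal
    # entries are the coefficient variances
    print(name, "diag of (X'X)^-1:", np.diag(XtX_inv))
```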
Seeing as the condition number of a matrix is the ratio $\sqrt{\frac{\lambda_{max}}{\lambda_{min}}}$, this suggests that multicollinearity causes a larger gap between the eigenvalues of $X'X$.
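To check this, I computed $\sqrt{\lambda_{max}/\lambda_{min}}$ for designs where the second column is increasingly close to a copy of the first (again just a toy sketch; the noise levels are arbitrary):

```python
import numpy as np

def condition_score(X):
    # Eigenvalues of X'X (symmetric, so eigvalsh is appropriate)
    eigvals = np.linalg.eigvalsh(X.T @ X)
    return np.sqrt(eigvals.max() / eigvals.min())

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)

# The condition score explodes as the second column approaches a copy of the first
for noise in [1.0, 0.1, 0.01]:
    x2 = x1 + rng.normal(scale=noise, size=n)
    X = np.column_stack([x1, x2])
    print(f"noise sd = {noise}: condition score = {condition_score(X):.1f}")
```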
Based on the above, I have two questions:
1) Why, when $X'X$ is ill-conditioned, do the entries of $(X'X)^{-1}$ become very large?
2) Can you explain how multicollinearity causes the eigenvalues of $X'X$ to change, and why the gap between their magnitudes grows? (A toy example of the behaviour I mean is below.)
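For reference, this is the kind of numerical behaviour I am asking about in question 2 (another rough sketch; the sample size and correlation values are arbitrary): as the correlation between the two predictors approaches 1, the smallest eigenvalue of $X'X$ shrinks towards zero while the largest grows.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Draw two standardised predictors with a chosen correlation rho and
# look at the eigenvalues of X'X as rho approaches 1
for rho in [0.0, 0.5, 0.9, 0.99]:
    cov = np.array([[1.0, rho], [rho, 1.0]])
    X = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n)
    eigvals = np.linalg.eigvalsh(X.T @ X)
    print(f"rho = {rho}: eigenvalues of X'X ~ {np.round(eigvals, 1)}")
```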