Case 1. If your $(p \times n)$ design (data) matrix $X$ has more observations $n$ than variables $p$ (a landscape shape), then $XX'$ will be invertible unless some variables are linearly dependent. This means that all eigenvalues of $XX'$ will be positive.
If you do have linearly dependent variables, then $XX'$ will have rank less than $p$, $\operatorname{rank}(XX')<p$, so at least one eigenvalue will be zero, i.e. not positive.
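Here is a minimal sketch of Case 1 (NumPy is my choice here, not part of the question): with independent variables all eigenvalues of $XX'$ come out positive, and duplicating a variable as a linear combination of others drops one eigenvalue to zero.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n = 3, 10
X = rng.standard_normal((p, n))     # landscape: more observations than variables

# Independent variables: all p eigenvalues of XX' are strictly positive
# (with probability 1 for random data).
print(np.linalg.eigvalsh(X @ X.T))

# Make the third variable a linear combination of the first two:
# rank(XX') drops below p, so one eigenvalue becomes (numerically) zero.
X[2] = X[0] + X[1]
print(np.linalg.eigvalsh(X @ X.T))
```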
Case 2. If there are more variables than observations, i.e. $p>n$ (a portrait shape), then $\operatorname{rank}(XX')\le n<p$, so again at least one eigenvalue will be zero, i.e. not positive.
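The same check for Case 2 (again a NumPy sketch of my own): the $p \times p$ matrix $XX'$ can have at most $n$ positive eigenvalues, so the remaining $p-n$ must be zero.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n = 10, 3
X = rng.standard_normal((p, n))          # portrait: more variables than observations

eigvals = np.linalg.eigvalsh(X @ X.T)    # XX' is p x p, but rank(XX') <= n < p
print((eigvals > 1e-10).sum())           # prints n (here 3) for generic random data;
                                         # the other p - n eigenvalues are zero
```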
Conceptually, what regularization does is try to bring you from Case 2 to Case 1 by imposing a constraint that effectively "reduces" $p$ to some $\tilde p$. Although it still holds that $p>n$, we now have $\tilde p<n$. In this regard, again conceptually, regularization changes the effective shape of your $X$ matrix from portrait (Case 2) to landscape (Case 1). And we saw above that the shape determines whether all eigenvalues are positive or not.
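As one concrete illustration (ridge/Tikhonov regularization is my choice here; the paragraph above speaks only conceptually): adding $\lambda I$ to $XX'$ shifts every eigenvalue up by $\lambda$, so the regularized matrix is invertible even when $p>n$.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, lam = 10, 3, 0.1
X = rng.standard_normal((p, n))          # portrait case: XX' alone is singular

# eig(XX' + lam*I) = eig(XX') + lam, so even the zero eigenvalues of XX'
# are lifted to lam > 0 and the regularized matrix becomes invertible.
print(np.linalg.eigvalsh(X @ X.T + lam * np.eye(p)).min())  # >= lam > 0
```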