I believe your colleague meant that a column is highly correlated with another variable. If the design matrix is rank deficient then the regression parameters are underdetermined: there are infinitely many parameter vectors which explain the data equally well.
High correlation is problematic for inference, but often the model can still be fit.
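A minimal sketch of the under-determination, assuming a design matrix that repeats the same predictor twice (hypothetical names):

```r
# With a duplicated predictor the design matrix is rank deficient,
# and different coefficient vectors produce identical fitted values.
x = rnorm(100)
X = cbind(1, x, x)       # intercept plus the same column twice
b1 = c(0, 2, 0)
b2 = c(0, 0, 2)
all.equal(X %*% b1, X %*% b2)   # TRUE: the data cannot tell b1 from b2
qr(X)$rank                      # 2, not 3: the design matrix is rank deficient
```

Any split of the total coefficient between the two copies of x gives the same fit, which is exactly why the parameters are underdetermined.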
EDIT:
R will still fit a model with perfect collinearity:
x = rnorm(100)
y = x
z = 2*x + rnorm(100)
lm(z~y+x)
Call:
lm(formula = z ~ y + x)
Coefficients:
(Intercept)            y            x
    0.05419      1.92536           NA
You will notice that R returns NA for one of the collinear columns. Under the hood, lm detects the linear dependence via a pivoted QR decomposition and drops the later of the aliased terms in the formula (see ?lm and ?alias).
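You can ask R directly which terms it found to be dependent with alias(); a sketch using the same setup as above:

```r
# alias() reports the linear dependencies lm detected; the aliased
# term is the one whose coefficient is set to NA.
x = rnorm(100)
y = x
z = 2*x + rnorm(100)
fit = lm(z ~ y + x)
alias(fit)              # reports that x is a linear combination of y
is.na(coef(fit)["x"])   # TRUE: the later aliased term was dropped
```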
R will also fit a model with high (but not perfect) collinearity:
sig = matrix(c(1, 0.99, 0.99, 1), nrow = 2)
X = MASS::mvrnorm(1000, c(0,0), Sigma = sig)
beta = c(2, 2)
y = X %*% beta + rnorm(1000)
z = X[,1]
w = X[,2]
lm(y~z+w)
Call:
lm(formula = y ~ z + w)
Coefficients:
(Intercept)            z            w
   -0.04606      2.25132      1.73306
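Note that the individual estimates (2.25 and 1.73) land noticeably away from the true value of 2; that inflated variance is the inference problem mentioned above. One way to quantify it is the variance inflation factor, computed by hand here as 1 / (1 - R^2) from regressing one predictor on the other (a sketch reusing the simulated data above):

```r
# Hand-computed variance inflation factor (VIF) for two predictors:
# regress one on the other and take 1 / (1 - R^2).
sig = matrix(c(1, 0.99, 0.99, 1), nrow = 2)
X = MASS::mvrnorm(1000, c(0, 0), Sigma = sig)
z = X[, 1]
w = X[, 2]
r2 = summary(lm(z ~ w))$r.squared
1 / (1 - r2)   # roughly 50 when the true correlation is 0.99
```

A VIF that large means the standard error of each coefficient is about sqrt(50), or 7 times, what it would be with uncorrelated predictors.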
So it doesn't matter how adamant your colleague is; the proof is in the pudding, so to speak.