I've been applying PCA to a number of data sets, and until now the number of principal components returned by prcomp() has always equalled the number of original variables (m). However, I've stumbled upon a case where the number of PCs computed is less than m.
In this case, my data set has 40 variables (including the class label in the m-th column) and 36 observations. My code for applying PCA is:
data = read.csv("datasets/chemical/chocolate.csv", sep = ",", dec = ".")
m = ncol(data)                             # number of columns; the class label is in column m
n = nrow(data)                             # number of observations
pca = prcomp(data[, -m], scale. = FALSE)   # drop the class label, no scaling
loadings = pca$rotation                    # loadings (one column per PC)
scores = pca$x                             # scores (projected observations)
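A quick check of the output dimensions (dim() and length() here are just the obvious ways to look at what prcomp() returned):

dim(pca$rotation)    # size of the loadings matrix: variables x computed PCs
length(pca$sdev)     # one standard deviation per computed PC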
But this time only 36 principal components were computed: the loadings matrix returned by pca$rotation is 39 x 36 rather than the 39 x 39 I expected.
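For what it's worth, the same thing happens with simulated data of the same shape, so it doesn't look specific to my file (the matrix below is just random numbers, purely for illustration):

set.seed(1)                                        # illustrative random data, not my real data set
X = matrix(rnorm(36 * 39), nrow = 36, ncol = 39)   # 36 observations, 39 variables
pca_sim = prcomp(X, scale. = FALSE)
dim(pca_sim$rotation)                              # 39 36 -- again only 36 PCs
length(pca_sim$sdev)                               # 36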
Why does this happen? Why did prcomp() compute fewer principal components than the number of original variables in this particular case? Is this an issue with my code or with my data?