Based on my understanding from Google and other posts, like What is cross-validation for? and What is cross-validation?, I understand that $k$-fold cross-validation means splitting the data into $k$ folds, training on $k-1$ of them and testing on the remaining one. In other words, it is used for estimating the model's accuracy.
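To check my mental model of the splitting step, here is a minimal pure-Python sketch (my own illustration of the index bookkeeping, not what caret actually does internally):

```python
def kfold_indices(n, k):
    """Yield (train_idx, test_idx) pairs for k-fold cross-validation.

    Each of the k folds serves as the test set exactly once, while the
    remaining k-1 folds together form the training set.
    """
    # Distribute n samples as evenly as possible across k folds.
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    indices = list(range(n))
    start = 0
    for size in fold_sizes:
        test_idx = indices[start:start + size]
        # Training indices are everything outside the current test fold.
        train_idx = indices[:start] + indices[start + size:]
        yield train_idx, test_idx
        start += size

# With n = 10 and k = 5, each sample lands in exactly one test fold.
splits = list(kfold_indices(10, 5))
```

So every observation is used for testing exactly once and for training $k-1$ times, which is why no separate hold-out split seems necessary to me.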
So, does that mean that if I do cross-validation, I don't need to split my data into training and testing sets? Also, say I do (in R):
model <- train(target ~ ., data = data, method = "glmnet",
               trControl = trainControl("cv", number = 10),
               tuneLength = 10)
Will this code find me the best model over those 10 candidate tuning values, so that when I do the prediction, I am using that best model?