Over the last few years I've been running cross-validation procedures (and sometimes nested CV) to get a performance estimate for my model while also doing hyperparameter search.
The usual procedure people mention online is that, after obtaining an estimate of predictive power from the CV procedure, we just refit on the whole dataset to obtain the final model.
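For a model without early stopping this step is straightforward. Here is a minimal sketch of what I mean, using scikit-learn with placeholder data and an arbitrary estimator:

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Placeholder data just to illustrate the procedure
X, y = make_regression(n_samples=1000, n_features=20, random_state=0)

model = RandomForestRegressor(random_state=0)
scores = cross_val_score(model, X, y, cv=5)  # CV estimate of performance
print(scores.mean())

final_model = model.fit(X, y)  # refit on the whole dataset -> final model
```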
However, how can one do that with a neural network? This is a problem for me because I usually don't keep the model produced by the last epoch's backpropagation. Instead, I select a model based on some criterion evaluated on a validation set (for example, if I train a NN for 50 epochs, I keep the checkpoint with the lowest validation loss). So how can we refit a neural network on the whole dataset if the choice of where to stop, or of which checkpoint to select across epochs, depends on a validation set to detect/avoid overfitting? I mean, even if I fix some number of epochs in the inner loop of a CV procedure, I don't know whether that number will lead to overfitting when training on the whole dataset...
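To make my current selection procedure concrete, here is a minimal sketch (PyTorch, with synthetic data and a placeholder model; both are just illustrations) of how I pick the model across epochs:

```python
import copy
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset, random_split

# Placeholder data and model, only to illustrate the selection procedure
X, y = torch.randn(1000, 20), torch.randn(1000, 1)
train_set, val_set = random_split(TensorDataset(X, y), [800, 200])
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)
val_loader = DataLoader(val_set, batch_size=32)

model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 1))
optimizer = torch.optim.Adam(model.parameters())
loss_fn = nn.MSELoss()

best_val_loss, best_state = float("inf"), None
for epoch in range(50):
    model.train()
    for xb, yb in train_loader:
        optimizer.zero_grad()
        loss_fn(model(xb), yb).backward()
        optimizer.step()

    # Evaluate on the held-out validation set after each epoch
    model.eval()
    with torch.no_grad():
        val_loss = sum(loss_fn(model(xb), yb).item() for xb, yb in val_loader)

    # Keep the checkpoint with the lowest validation loss so far
    if val_loss < best_val_loss:
        best_val_loss, best_state = val_loss, copy.deepcopy(model.state_dict())

model.load_state_dict(best_state)  # the model I actually select
```

This is exactly the step that breaks down when refitting on the whole dataset: there is no validation set left to drive the checkpoint selection.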
Has anyone run into this sort of issue before? How do you usually do CV with a final model selection when the final model is a neural network?