
I've read this regarding the difference between an epoch and a mini-batch.

To clarify:

  • With an epoch value of 1000 and a batch size of 50, does that mean the model will use each data point exactly 1000 times, in a random order where each iteration uses only 50 data points for optimization? (meaning a total of 50*1000 calculations?)
  • Is every data point used exactly 1000 times?

1 Answer


Yes, each iteration uses one mini-batch of points (here 50), and an epoch ends once every training sample has been visited exactly once. With a batch size of 50 and 20 iterations per epoch, that implies a training set of 20 × 50 = 1000 samples. Since each epoch uses every training point exactly once, running 1000 epochs means every training point is used exactly 1000 times.
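As a sanity check, here is a minimal NumPy sketch of that bookkeeping. The training-set size of 1000 is an assumption implied by "20 iterations per epoch" with batches of 50: each epoch reshuffles the data, slices it into 20 mini-batches of 50, and after 1000 epochs every sample has been used exactly 1000 times.

```python
import numpy as np

n_samples = 1000   # assumed training-set size (implied by 20 iterations of 50 per epoch)
batch_size = 50
n_epochs = 1000

usage_count = np.zeros(n_samples, dtype=int)

for epoch in range(n_epochs):
    order = np.random.permutation(n_samples)          # reshuffle the data each epoch
    for start in range(0, n_samples, batch_size):     # 1000 / 50 = 20 iterations per epoch
        batch_idx = order[start:start + batch_size]   # one mini-batch of 50 distinct points
        usage_count[batch_idx] += 1
        # the gradient step on the corresponding 50 training points would go here

# every sample appears in exactly one mini-batch per epoch,
# so after 1000 epochs each point has been used exactly 1000 times
assert (usage_count == n_epochs).all()
```

In real training the counting line would be replaced by the parameter update; it is only there to verify that each point lands in exactly one mini-batch per epoch.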

gunes