
I am a beginner in machine learning and I came across these terms while going through the code of a CNN. I want to know what exactly they mean, and my other question is: when do the weights of the network change? After each batch or after every epoch? Thank you.

thisisbhavin

1 Answer


In the context of Convolutional Neural Networks (CNNs), the batch size is the number of examples that are fed to the algorithm at a time. It is normally a small power of 2, such as 32, 64, or 128. During training, the optimization algorithm computes the average cost over a batch and then runs backpropagation to update the weights, so the weights are updated after every batch rather than once per epoch. In a single epoch the algorithm therefore runs $n_{batches} = \frac{n_{examples}}{\text{batch size}}$ times. The network generally needs to train for several epochs before the weight values converge. Each batch is normally sampled randomly from the whole example set.
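To make the batch/epoch distinction concrete, here is a minimal sketch of a mini-batch training loop on a toy linear-regression problem in plain NumPy (the data, learning rate, and model are made-up illustrative values, not from your CNN code). Notice that the weight update happens inside the inner loop, once per batch, so the weights change many times within a single epoch:

```python
import numpy as np

# Toy data: 1,000 examples with 10 features (illustrative values only)
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))
true_w = rng.normal(size=10)
y = X @ true_w + 0.1 * rng.normal(size=1000)

w = np.zeros(10)          # model weights
batch_size = 32
n_epochs = 5
lr = 0.01                 # learning rate

n_examples = X.shape[0]
n_batches = n_examples // batch_size   # batches per epoch

for epoch in range(n_epochs):
    # Shuffle so each batch is a random sample of the whole example set
    order = rng.permutation(n_examples)
    for b in range(n_batches):
        idx = order[b * batch_size:(b + 1) * batch_size]
        X_batch, y_batch = X[idx], y[idx]

        # Average gradient of the squared error over this batch
        preds = X_batch @ w
        grad = 2 * X_batch.T @ (preds - y_batch) / batch_size

        # Weights are updated here -- once per batch, not once per epoch
        w -= lr * grad

    mse = np.mean((X @ w - y) ** 2)
    print(f"epoch {epoch + 1}: mse = {mse:.4f}")
```

The same structure applies to a CNN trained with an optimizer like SGD or Adam: only the model and the gradient computation change, while the "update per batch, repeat for several epochs" loop stays the same.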

farhanhubble