I am a beginner in machine learning and I came across these terms while going through CNN code. I want to know what exactly they mean, and my other question is: when do the weights of the network change? After each batch, or after every epoch? Thank you.
1also [What are the differences between 'epoch', 'batch', and 'minibatch'?](http://stats.stackexchange.com/q/117919/12359) – Franck Dernoncourt Mar 26 '17 at 16:18
1 Answer
In the context of Convolutional Neural Networks (CNNs), the batch size is the number of examples fed to the algorithm at a time. It is normally a small power of 2, such as 32, 64, or 128. During training, the optimization algorithm computes the average cost over a batch and then runs backpropagation to update the weights. In a single epoch the algorithm runs $n_{batches} = {n_{examples} \over batch\ size}$ times, so the weights are updated $n_{batches}$ times per epoch. Generally the network needs to train for several epochs for the weight values to converge. Each batch is normally sampled randomly from the whole example set.
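The training loop above can be sketched in plain NumPy. This is a minimal illustration, not a real CNN: a linear model stands in for the network, and the names (`n_examples`, `batch_size`, `n_epochs`) and values are assumptions chosen for the example. The point is only to show that one weight update happens per batch, and the batches are reshuffled every epoch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not from the question).
n_examples, batch_size, n_epochs = 256, 32, 3
n_batches = n_examples // batch_size  # 256 / 32 = 8 batches per epoch

# Synthetic regression data; a linear model stands in for the CNN.
X = rng.normal(size=(n_examples, 4))
true_w = np.array([1.0, -2.0, 0.5, 3.0])
y = X @ true_w + 0.01 * rng.normal(size=n_examples)

w = np.zeros(4)   # the model's weights
lr = 0.1          # learning rate
updates = 0       # counts how many times the weights change

for epoch in range(n_epochs):
    perm = rng.permutation(n_examples)  # resample batch membership each epoch
    for b in range(n_batches):
        idx = perm[b * batch_size:(b + 1) * batch_size]
        Xb, yb = X[idx], y[idx]
        # Average gradient of the squared-error cost over this batch.
        grad = Xb.T @ (Xb @ w - yb) / batch_size
        w -= lr * grad  # one weight update per batch, not per epoch
        updates += 1

print(updates)  # n_batches * n_epochs = 8 * 3 = 24 updates
```

Running this prints `24`: the weights changed $n_{batches}$ times in each of the 3 epochs, which is the behaviour the answer describes.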

farhanhubble
-
So the weights will be updated $n_{batches}$ times in one epoch, right? – thisisbhavin Mar 26 '17 at 14:45
-
Yes, all weights are updated $n_{batches}$ times in a single epoch. – farhanhubble Mar 26 '17 at 14:56