Dropout is a good way to reduce overfitting in neural networks, and I found an article on dropout.
My question is: for each epoch, should we randomly pick half of the neurons in the network, set their weights to zero, and then update the weights of the network? Is that right?
Here is my process:
    for each epoch:
        randomly deactivate some percentage of the neurons in each hidden layer
        learn from the mini-batch data and update the weights of the whole network
        obtain the updated weights as the output
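
To make my process concrete, here is a minimal NumPy sketch of dropout as I currently understand it. The layer shapes, the keep probability p_keep, and the function names are just illustrative choices of mine, not taken from the article:

    import numpy as np

    rng = np.random.default_rng(0)
    p_keep = 0.5  # probability of keeping a neuron active (half, as in my question)

    # toy two-layer network; the sizes here are just illustrative
    W1 = rng.standard_normal((4, 8)) * 0.1
    W2 = rng.standard_normal((8, 2)) * 0.1

    def forward_train(x):
        h = np.maximum(0.0, x @ W1)                     # hidden activations (ReLU)
        mask = (rng.random(h.shape) < p_keep) / p_keep  # random 0/1 mask, rescaled
        h = h * mask          # zero the *activations*, scaled by 1/p_keep ("inverted dropout")
        return h @ W2

    def forward_test(x):
        return np.maximum(0.0, x @ W1) @ W2             # no dropout at test time

    x = rng.standard_normal((1, 4))                     # one toy input
    print(forward_train(x))  # here a fresh mask is drawn on every forward pass / mini-batch;
    print(forward_test(x))   # whether it should instead be fixed per epoch is part of my question

In this sketch the mask zeroes the hidden activations rather than the stored weight matrices, and the rescaling by 1/p_keep means nothing changes at test time; I am not sure whether that matches what the article intends, which is exactly what I am asking.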