Questions tagged [dropconnect]
DropConnect is a generalization of Hinton's Dropout for regularizing large fully-connected layers within neural networks.
5 questions
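For readers new to the tag, here is a minimal NumPy sketch of the distinction: Dropout zeroes whole units (activations), while DropConnect zeroes individual weights. The layer shape and drop rate below are illustrative assumptions, not values from any of the questions.

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.standard_normal(8)          # input activations
    W = rng.standard_normal((4, 8))     # fully-connected layer weights
    p = 0.5                             # drop probability (illustrative)

    # Dropout: zero whole units of the activation vector
    unit_mask = (rng.random(8) >= p).astype(float)
    y_dropout = W @ (x * unit_mask)

    # DropConnect: zero individual entries of the weight matrix
    weight_mask = (rng.random((4, 8)) >= p).astype(float)
    y_dropconnect = (W * weight_mask) @ x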
9 votes, 2 answers
What if all the nodes are dropped when using dropout?
When implementing dropout (or DropConnect), do you need to account for the case where every node in a layer is dropped? Even though the chance of this is very small, what is the correct approach in this scenario? Pick a new random set to drop…

asked by Dan (1,288)
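A sketch of the edge case the question above describes, using inverted dropout; the resampling branch is the "pick a new random set" option from the question, while the default simply accepts the all-zero mask, which occurs with probability p**n and is harmless, just uninformative for that step. Names and sizes here are illustrative.

    import numpy as np

    rng = np.random.default_rng(1)

    def dropout_mask(n, p, resample_if_empty=False):
        """Bernoulli keep-mask with inverted 1/(1-p) scaling."""
        mask = (rng.random(n) >= p).astype(float)
        # P(all n units dropped) = p**n, e.g. 0.5**100 for a 100-unit layer
        while resample_if_empty and mask.sum() == 0:
            # note: resampling slightly biases the mask distribution
            mask = (rng.random(n) >= p).astype(float)
        return mask / (1.0 - p)

    a = rng.standard_normal(5)
    a_dropped = a * dropout_mask(5, p=0.5)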
4 votes, 1 answer
Should I use DropConnect and L2 regularization?
I've just learned about the DropConnect technique for neural networks. It is my understanding that it is intended to reduce overfitting and is referred to as regularization. Is there additional benefit to using both DropConnect and L2…

asked by piRSquared (251)
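Whatever the answer on whether combining them helps, the two regularizers compose without any mechanical conflict; a sketch of one SGD step applying both to a squared-error loss (the learning rate, decay strength, and shapes are illustrative assumptions):

    import numpy as np

    rng = np.random.default_rng(2)
    W = 0.1 * rng.standard_normal((4, 8))
    x = rng.standard_normal(8)
    t = rng.standard_normal(4)              # regression target
    p, lr, lam = 0.5, 0.1, 1e-4             # drop prob, learning rate, L2 strength

    M = (rng.random(W.shape) >= p).astype(float)   # DropConnect mask for this step
    y = (W * M) @ x                                # forward pass with masked weights
    g = np.outer(y - t, x) * M                     # grad of 0.5*||y - t||^2 w.r.t. W
    W -= lr * (g + lam * W)                        # lam*W is the L2 (weight decay) gradient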
3 votes, 1 answer
Rescaling weights after DropConnect
From what I understand, you're supposed to rescale your activations after applying dropout in proportion to how much you dropped, essentially making up for the lost signal. …

asked by piRSquared (251)
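The scaling logic behind the question above, assuming independent Bernoulli masks with drop probability p: since E[M * W] = (1 - p) W, dividing the masked weights by 1 - p at train time keeps the expected pre-activation unchanged, so no rescaling is needed at test time. A sketch with a Monte Carlo sanity check (shapes and p are illustrative):

    import numpy as np

    rng = np.random.default_rng(3)
    W = rng.standard_normal((4, 8))
    x = rng.standard_normal(8)
    p = 0.4                                  # drop probability

    # Train time, "inverted" convention: divide by the keep probability
    M = (rng.random(W.shape) >= p).astype(float)
    y_train = ((W * M) / (1.0 - p)) @ x

    # Test time: plain forward pass, no mask and no extra scaling
    y_test = W @ x

    # Averaging over many masks recovers the unmasked output in expectation
    ys = [((W * (rng.random(W.shape) >= p)) / (1.0 - p)) @ x for _ in range(20000)]
    print(np.allclose(np.mean(ys, axis=0), y_test, atol=0.1))   # expected: True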
2 votes, 2 answers
Can I use DropConnect with sigmoid activation?
The literature discusses tanh and ReLU activations. Does DropConnect not work with sigmoid activation?

asked by piRSquared (251)
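Nothing in the weight-masking step itself depends on the choice of nonlinearity, so a sigmoid layer can be sketched the same way as a tanh or ReLU one; shapes and drop rate below are illustrative, and whether sigmoid trains as well in practice is the substance of the question.

    import numpy as np

    rng = np.random.default_rng(4)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    W = rng.standard_normal((4, 8))
    b = np.zeros(4)
    x = rng.standard_normal(8)
    p = 0.5

    M = (rng.random(W.shape) >= p).astype(float)   # mask the weights, not the activations
    h = sigmoid(((W * M) / (1.0 - p)) @ x + b)     # nonlinearity applied after the masked affine map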
1 vote, 1 answer
DropConnect Backpropagation
I'm trying to implement DropConnect. Am I supposed to use the same drop masks during backpropagation?

asked by piRSquared (251)
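A sketch of the point at issue, with a squared-error loss as an illustrative assumption: the mask drawn for the forward pass must also gate the gradients in the backward pass of the same step, since only the kept weights contributed to the output; a fresh mask is then drawn for the next step.

    import numpy as np

    rng = np.random.default_rng(5)
    W = rng.standard_normal((4, 8))
    x = rng.standard_normal(8)
    t = rng.standard_normal(4)
    p = 0.5

    # Forward: one mask per training step
    M = (rng.random(W.shape) >= p).astype(float)
    y = (W * M) @ x
    loss = 0.5 * np.sum((y - t) ** 2)

    # Backward: the SAME mask gates the gradients
    dy = y - t
    dW = np.outer(dy, x) * M        # masked weight gradient
    dx = (W * M).T @ dy             # input gradient also uses the masked weights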