
Does it make any sense to have two (or more) neurons in a neural network with the same weights? (Intuitively it seems to make no sense, since such neurons would all behave the same way.)

Please consider both the input and hidden layers. What if the weights are equal to 0?

elmes
  • If this is homework, and it really sounds like it from the "please consider" remark, then you should do it yourself, or failing that, say exactly what you have thought about and use the homework tag as stated in the FAQ. – Douglas Zare Jan 12 '13 at 23:43
  • Same weights before the training phase, or during the training phase? – Franck Dernoncourt Jan 10 '17 at 17:59

1 Answer


The weights are updated during training and can take many different values, so it is entirely possible for two or more weights to end up with the same value.

If a weight is equal to zero, it simply means that the corresponding input or hidden neuron contributes nothing to the neuron of the next layer through that connection.
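
For example (a tiny hypothetical two-input neuron, just to make the point concrete):

    import numpy as np

    x = np.array([3.0, 7.0])   # two inputs feeding a neuron in the next layer
    w = np.array([0.5, 0.0])   # the second weight is zero
    print(w @ x)               # 1.5 -- the second input contributes nothing to the sum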

One more thing to know: you should not initialize all your weights to the same value at the start of training, because identically initialized neurons compute the same outputs, receive the same gradient updates, and therefore never become different from each other. See my previous answer here: stats.stackexchange.com/questions/45087/backpropagation/45092
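
As a quick illustration (a minimal NumPy sketch, not code from the linked answer; the 3-2-1 network shape, learning rate, and data are made up for the demonstration), here is what happens when two hidden units start with identical weights:

    # Identically initialized hidden units receive identical gradients
    # and therefore stay identical no matter how long you train.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(8, 3))      # 8 samples, 3 input features
    y = rng.normal(size=(8, 1))      # regression targets

    W1 = np.full((3, 2), 0.5)        # every input-to-hidden weight starts at the same value
    W2 = np.full((2, 1), 0.5)        # same for the hidden-to-output weights

    for _ in range(100):
        h = np.tanh(X @ W1)          # hidden activations
        out = h @ W2                 # linear output
        err = out - y

        # Backpropagation for mean squared error (biases omitted for brevity)
        grad_W2 = h.T @ err / len(X)
        grad_h = err @ W2.T * (1 - h ** 2)
        grad_W1 = X.T @ grad_h / len(X)

        W1 -= 0.1 * grad_W1
        W2 -= 0.1 * grad_W2

    # The two hidden units (the two columns of W1) are still identical after training:
    print(np.allclose(W1[:, 0], W1[:, 1]))   # True

Breaking this symmetry is exactly why weights are usually initialized with small random values.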

ThiS