Is it correct to assume that a neural network can have infinitely many different combinations of connection weights that all produce the same specific output for a given set of inputs?
In other words, when we train a neural network using backpropagation, does that mean we are not searching for the single ideal configuration, but rather for one out of infinitely many ideal configurations?
Is there a proof for either case?
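To illustrate what I mean, here is a minimal NumPy sketch of my own (the network size, activation, and values are arbitrary assumptions, not from any particular source): swapping two hidden neurons, together with their outgoing weights, gives a second weight configuration that is numerically different yet computes exactly the same function.

```python
import numpy as np

# Toy 2-2-1 network: input -> hidden (tanh) -> output (linear).
rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 2))   # input-to-hidden weights
b1 = rng.normal(size=2)        # hidden biases
W2 = rng.normal(size=(1, 2))   # hidden-to-output weights
b2 = rng.normal(size=1)        # output bias

def forward(x, W1, b1, W2, b2):
    h = np.tanh(W1 @ x + b1)
    return W2 @ h + b2

# A different configuration: swap the two hidden units
# (permute the rows of W1 and b1, and the matching columns of W2).
perm = [1, 0]
W1p, b1p, W2p = W1[perm, :], b1[perm], W2[:, perm]

x = rng.normal(size=2)
print(forward(x, W1, b1, W2, b2))     # output of the original weights
print(forward(x, W1p, b1p, W2p, b2))  # identical output, different weights
```

This only shows that at least two distinct configurations exist for this toy case; my question is whether, in general, there are infinitely many and whether that has been proven.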