I've seen several ways to normalize data (features or even images) before using it as input to a NN or CNN.
The most common ones I've seen are (a quick code sketch of both follows the list):
- [0, 1]: (data - min(data)) / (max(data) - min(data))
- z-score: (data - mean(data)) / std.dev(data)
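
To be concrete, here is a minimal sketch of what I mean by both, using NumPy (the array shape and variable names are just my own example):

```python
import numpy as np

# e.g. a small batch of 8-bit RGB images: (batch, height, width, channels)
data = np.random.randint(0, 256, size=(100, 32, 32, 3)).astype(np.float32)

# min-max scaling to [0, 1]
minmax = (data - data.min()) / (data.max() - data.min())

# z-score standardization (zero mean, unit standard deviation)
zscore = (data - data.mean()) / data.std()
```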
Which one would be best/recommended? Does the choice of normalization really affect the training of the model?
I'm really lost with so many opinions on this topic; it would be great if you could provide a reference such as a paper or book.