Say I have a mathematical function that I want a neural network to learn (a realistic and helpful example would be the trigonometric sine() function). Since a mathematical function is deterministic and noise-free, as opposed to a data set with human-related variability (such as handwritten letters), would a neural network reach a higher degree of accuracy in fewer epochs than a network trained for, say, digit recognition?
What would training data for something such as sine() look like? Would it simply be a random input n paired with the desired output sine(n)? Or would the training data resemble the MNIST data set: a fixed, finite set of input/output pairs that is iterated over for several epochs? Are there any known optimizations that help neural networks learn mathematical functions?
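To make the first option concrete, here is a minimal sketch of the kind of training data I have in mind (the function name, the sample count, and the input range [-pi, pi] are all my own assumptions for illustration):

```python
import math
import random

def make_dataset(size, seed=0):
    """Hypothetical training set: random inputs n in [-pi, pi],
    each paired with the target value sine(n)."""
    rng = random.Random(seed)
    data = []
    for _ in range(size):
        n = rng.uniform(-math.pi, math.pi)
        data.append((n, math.sin(n)))  # (input, desired output) pair
    return data

train = make_dataset(1000)  # 1000 freshly sampled (n, sin(n)) pairs
```

Unlike MNIST, such a generator could produce unlimited fresh samples every epoch instead of cycling over a fixed set.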
Thanks