
Say I have a mathematical function that I want a neural network to learn (a realistic and helpful example would be the trigonometric sine()). Because a mathematical function is deterministic and noise-free, unlike a data set with human-generated variability (such as handwritten letters), will a neural network train to a higher degree of success in fewer epochs than a network for, say, digit recognition?

What would training data for something such as sine() look like? Would it simply be a random input n with the desired output sine(n)? Or would the training data resemble the MNIST data set: a fixed, limited collection of input/output pairs iterated over for several epochs? Are there any known optimizations for neural networks learning mathematical functions?
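As a concrete illustration of the "random input n, desired output sine(n)" idea, here is a minimal sketch in NumPy: a one-hidden-layer network fit to sin(x) on [-π, π] with plain gradient descent. All choices (32 tanh units, learning rate 0.1, batch size 256) are arbitrary assumptions for the sketch, not tuned recommendations. Note that, unlike a fixed data set such as MNIST, a fresh random batch can be drawn at every step, since the target function can be evaluated on demand.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_batch(n=256):
    """Training pairs (x, sin(x)) with x drawn uniformly from [-pi, pi]."""
    x = rng.uniform(-np.pi, np.pi, size=(n, 1))
    return x, np.sin(x)

# One hidden layer of 32 tanh units, linear output (sizes are arbitrary).
H = 32
W1 = rng.normal(0.0, 1.0, (1, H)); b1 = np.zeros(H)
W2 = rng.normal(0.0, 0.1, (H, 1)); b2 = np.zeros(1)

lr = 0.1
for step in range(3000):
    x, y = make_batch()
    h = np.tanh(x @ W1 + b1)            # forward pass
    pred = h @ W2 + b2
    err = pred - y                      # gradient of 0.5 * MSE w.r.t. pred
    # Backpropagation, averaged over the batch.
    gW2 = h.T @ err / len(x); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h**2)    # tanh' = 1 - tanh^2
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Evaluate on a fresh batch the network has never seen.
x_test, y_test = make_batch(1000)
mse = np.mean((np.tanh(x_test @ W1 + b1) @ W2 + b2 - y_test) ** 2)
print(f"test MSE: {mse:.4f}")
```

Because sine is smooth and noise-free, even this tiny network drives the error down quickly; with noisy human-generated data like MNIST, the irreducible noise and far higher input dimensionality make training slower and the attainable error higher.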

Thanks

iLearning
  • It isn't clear what you mean by a "mathematical function," except that you appear to have in mind something different from the [usual definition.](https://en.wikipedia.org/wiki/Function_(mathematics)) Could you explain your meaning? – whuber Jan 29 '17 at 22:31
  • Sorry, I may be wrong, but the question actually seems clear: can mathematical functions be emulated by a neural network? And the answer is yes: sine() is just one-dimensional, with a corresponding 1D output. The general problem is multi-dimensional, forming a hypersurface, correct? – Peter Teoh Jan 30 '17 at 06:19
  • same as this: http://stats.stackexchange.com/questions/158348/can-a-neural-network-learn-a-functional-and-its-functional-derivative?rq=1 – Peter Teoh Jan 30 '17 at 06:19

0 Answers