
I am relatively new to Neural Networks. To understand the basics, I converted the MNIST database into a format that I liked and wrote a single-layer NN with 784 input neurons from scratch, without using any NN library.

I trained the NN on 600 samples and tested it on 10,000 samples (I accept that the reverse would be much better). I can see that as the NN trains on more and more samples, the training error decreases almost exponentially with the training sample size. However, in the end, the test error was 90%.
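For reference, the kind of single-layer network described above can be sketched from scratch with NumPy. This is a hedged illustration, not the asker's actual code: it uses synthetic random data in place of MNIST, and all names and hyperparameters here are assumptions.

```python
import numpy as np

# Sketch of a single-layer (784 -> 10) softmax network trained from
# scratch. Synthetic random data stands in for MNIST; names, learning
# rate, and epoch count are illustrative assumptions.

rng = np.random.default_rng(0)

n_train, n_in, n_out = 600, 784, 10
X = rng.standard_normal((n_train, n_in))
y = rng.integers(0, n_out, n_train)
Y = np.eye(n_out)[y]                      # one-hot labels

W = np.zeros((n_in, n_out))
b = np.zeros(n_out)
lr = 0.1

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(P, Y):
    return -np.mean(np.sum(Y * np.log(P + 1e-12), axis=1))

losses = []
for epoch in range(50):
    P = softmax(X @ W + b)
    losses.append(cross_entropy(P, Y))
    grad = (P - Y) / n_train              # gradient of softmax cross-entropy
    W -= lr * X.T @ grad
    b -= lr * grad.sum(axis=0)

print(losses[0] > losses[-1])             # training loss decreases
```

The training loss falling with each epoch mirrors the "error decreases with training" behaviour described above, but says nothing about test error, which is the gap the question is about.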

Is this normal for such a simple NN? Where can I find the performance of different types of NNs with different parameters, to compare against my own NN?

Of course, this is a very simple NN, but before making it more complex, I would like to know whether the results I am getting are in agreement with others', and I would like to get an idea of what kinds of different methods and schemes could be used in other situations.

1 Answer


MNIST is a pretty simple, "toy" dataset, and you can get high performance ($>90\%$ accuracy) even with simple classifiers like logistic regression, a single-layer neural network, or $k$-NN. You can find a summary of the results on the page maintained by Yann LeCun.
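To illustrate the point about simple classifiers, here is a hedged sketch of a 1-nearest-neighbour baseline of the kind mentioned above, using tiny synthetic "digit-like" blobs in place of MNIST (the class separation, sample counts, and variable names are all assumptions for illustration):

```python
import numpy as np

# 1-NN baseline sketch: classify each test point by the label of its
# closest training point. Two well-separated Gaussian classes in 784
# dimensions stand in for MNIST digits.

rng = np.random.default_rng(1)

n_per_class, dim = 50, 784
X0 = rng.standard_normal((n_per_class, dim))         # "class 0" blob
X1 = rng.standard_normal((n_per_class, dim)) + 3.0   # "class 1" blob
X_train = np.vstack([X0, X1])
y_train = np.array([0] * n_per_class + [1] * n_per_class)

X_test = np.vstack([rng.standard_normal((20, dim)),
                    rng.standard_normal((20, dim)) + 3.0])
y_test = np.array([0] * 20 + [1] * 20)

# Squared Euclidean distance from every test point to every train point.
d = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=-1)
y_pred = y_train[d.argmin(axis=1)]
accuracy = (y_pred == y_test).mean()
print(accuracy)
```

On real MNIST the same nearest-neighbour idea (with more neighbours and proper train/test splits) is one of the simple baselines that already clears 90% accuracy.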

Tim