
I am currently exploring the training of neural networks. I have some toy data, and after training a NN with 2 hidden layers on it I get 99% accuracy on the test set. The problem is that if I add an additional layer, my training set accuracy drops drastically, to around 56%. So I wanted to ask if anybody might know why that is. Is it likely that I've coded something wrong (although I had no problem going from 1 to 2 hidden layers), or could it be that training a simple data set on a very complex model (at least for that set) gives such erroneous results?

Thank you very much.

  • What is the training error? – Lucas Apr 19 '15 at 16:56
  • Like I said, the training set accuracy drops to 56%. – user118837 Apr 19 '15 at 16:57
  • No, you said *test set accuracy*. If the training *and* test accuracy drops you have an underfitting problem. If the training accuracy is still at 99%, you have an overfitting problem. – Lucas Apr 19 '15 at 19:28
  • Sorry, completely missed that. I have to correct that. I did not actually create a test set, just wanted to see how the training will change as I increase the size of the data set. – user118837 Apr 19 '15 at 22:30
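One common cause of training accuracy dropping as depth increases (i.e. underfitting, as the comments above distinguish it from overfitting) is vanishing gradients with saturating activations such as the sigmoid. The sketch below is purely illustrative and is not the asker's code: it stacks hypothetical random sigmoid layers and measures how the backpropagated gradient norm shrinks as more layers are added.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def grad_norm_through_depth(depth, width=32):
    """Illustrative only: push a vector through `depth` random sigmoid
    layers, then backpropagate a dummy gradient and return its norm.
    The derivative sigmoid'(z) = a*(1-a) is at most 0.25, so each
    extra layer tends to shrink the gradient."""
    rng = np.random.default_rng(0)  # fixed seed for a repeatable demo
    x = rng.standard_normal(width)
    layers = []
    for _ in range(depth):
        W = rng.standard_normal((width, width)) / np.sqrt(width)
        x = sigmoid(W @ x)
        layers.append((W, x))
    g = np.ones(width)  # dummy upstream gradient
    for W, a in reversed(layers):
        g = W.T @ (g * a * (1.0 - a))  # chain rule through the sigmoid
    return np.linalg.norm(g)

for d in (2, 3, 6):
    print(f"depth {d}: gradient norm {grad_norm_through_depth(d):.6f}")
```

With this setup the gradient norm falls off as depth grows, which can make the early layers of a deeper network learn very slowly and leave training accuracy low. It is only one possible explanation; a bug introduced when wiring up the third layer is also worth ruling out.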

0 Answers