
I trained a binary classifier.

The architecture is five Conv2D layers (32, 64, 64, 128, 128 filters) with dropout, and the last layer is a softmax.

Should I keep training after the 200th epoch, or is this leading to overfitting (I am at ~84%)?

My images: [training/validation accuracy and loss plots]

1 Answer


There seems to be a lot of noise, especially in your validation loss. However, judging by the slight upward trend in the validation accuracy, I'd suggest letting your model run for a few more epochs.

Despite that, your model is clearly overfitting. I'd suggest doing something to regularize it (e.g. Dropout, data augmentation). You can read this answer for more information on that.
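As a concrete illustration, here is a minimal tf.keras sketch of the stack described in the question with the two suggested regularizers added (Dropout and augmentation layers). The input size, dropout rates, and layer placement are my assumptions, since the question doesn't specify them:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Sketch only: input shape and dropout rates are assumed, not from the question.
model = models.Sequential([
    layers.Input(shape=(128, 128, 3)),        # assumed input size
    layers.RandomFlip("horizontal"),          # data augmentation (active in training only)
    layers.RandomRotation(0.1),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Dropout(0.25),                     # regularization between conv stages
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Dropout(0.25),
    layers.Conv2D(128, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(128, 3, activation="relu"),
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.5),                      # heavier dropout before the classifier head
    layers.Dense(2, activation="softmax"),    # binary classifier with softmax, as described
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```

The augmentation layers are no-ops at inference time, so they only act as regularization during training.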

Djib2011
  • "clearly overfitting" by looking at the accuracy chart or loss chart? I feel like it improves in terms of val acc ("slight upward trend"), that is why I thought it might not overfitting. – dl_best_DLL Mar 21 '20 at 18:15
  • https://imgur.com/a/SPSpcye These yellow dots, the big fluctuations, do they also indicate overfitting? – dl_best_DLL Mar 21 '20 at 18:20
  • Since your training accuracy is much higher than the validation, the model is overfitting (despite the improvement in the model's performance). The fluctuation of your model's performance is another issue altogether. Some common culprits are a small validation set, a high learning rate, etc. – Djib2011 Mar 21 '20 at 23:40
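The criterion in the last comment, that a large train/validation gap signals overfitting even while validation accuracy still creeps upward, can be sketched with hypothetical accuracy histories (the numbers below are made up for illustration):

```python
# Hypothetical per-epoch accuracies, loosely matching the question's ~84% val acc.
train_acc = [0.80, 0.90, 0.95, 0.98]
val_acc = [0.78, 0.82, 0.84, 0.84]

# The overfitting signal is the widening gap between the two curves,
# not the direction of val_acc on its own.
gap = [round(t - v, 2) for t, v in zip(train_acc, val_acc)]
print(gap)  # → [0.02, 0.08, 0.11, 0.14]
```

Here val_acc is still (slightly) improving, yet the growing gap shows the model is increasingly fitting the training set specifically.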