I trained a binary classifier. The architecture is five Conv2D layers with 32, 64, 64, 128, and 128 filters, plus dropout, and the last layer is a softmax. Should I keep training past the 200th epoch, or is this leading to overfitting (I am at ~84%)?
There seems to be a lot of noise, especially in your validation loss. However, judging by the slight upward trend in the validation accuracy, I'd suggest letting your model run for a few more epochs.
That said, your model is clearly overfitting. I'd suggest doing something more to regularize it (e.g. stronger dropout, data augmentation). You can read this answer for more information on regularization.
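Of the two suggestions, dropout is usually the cheapest to try first. As a minimal illustration (not your actual setup), here is the inverted-dropout mechanic in plain NumPy; framework layers such as Keras `Dropout` apply the same masking and rescaling internally during training:

```python
import numpy as np

def dropout(x, rate, rng, training=True):
    """Inverted dropout: zero units with probability `rate` and rescale
    the survivors so the expected activation is unchanged."""
    if not training or rate == 0.0:
        return x
    keep = rng.random(x.shape) >= rate   # Boolean mask of surviving units
    return x * keep / (1.0 - rate)       # rescale to preserve the mean

rng = np.random.default_rng(0)
x = np.ones((1000, 128))                 # a batch of activations
y = dropout(x, rate=0.5, rng=rng)

print((y == 0).mean())                   # roughly half the units are zeroed
print(y.mean())                          # mean stays close to the original 1.0
```

Because of the rescaling, no change is needed at inference time; you simply pass `training=False`. Raising the rate (or adding dropout after more layers) increases the regularization strength, which is one lever to try against the train/validation gap you are seeing.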