
I have a neural network that outputs a continuous value, and I need to classify each output as class 0 or class 1. I am currently applying a sigmoid to this continuous output, but after the sigmoid the values for both class 0 and class 1 are greater than 0.5. So when I use the condition:

if sigmoid_output > 0.5:
  assigned_class = 1
else:
  assigned_class = 0

then everything is assigned the same class. But when I change this threshold from 0.5 to 0.5055, chosen after manually reviewing the outputs, I get a better distribution of outputs across class 0 and class 1. So I was wondering whether there is an optimal way to determine this threshold value. I was thinking along the lines of logistic regression, since a sigmoid output unit is essentially a logistic regression on the last hidden layer, but I could not work anything out on my own.
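
For concreteness, here is a rough sketch of the kind of threshold search I have in mind, tuned on a held-out validation set. The placeholder data and the two scoring criteria (Youden's J and F1) are just for illustration, not part of my actual setup:

import numpy as np
from sklearn.metrics import roc_curve, f1_score

# Placeholder validation predictions: labels in {0, 1} and sigmoid scores
# that all sit in a narrow band just above 0.5, as in my setting.
rng = np.random.default_rng(0)
val_labels = rng.integers(0, 2, size=1000)
val_scores = np.clip(0.505 + 0.01 * (val_labels - 0.5)
                     + 0.01 * rng.standard_normal(1000), 0.0, 1.0)

# Option 1: Youden's J statistic, i.e. the threshold that maximises TPR - FPR.
fpr, tpr, thresholds = roc_curve(val_labels, val_scores)
youden_threshold = thresholds[np.argmax(tpr - fpr)]

# Option 2: the threshold that maximises F1 on the validation set.
grid = np.linspace(val_scores.min(), val_scores.max(), 200)
f1s = [f1_score(val_labels, (val_scores >= t).astype(int), zero_division=0)
       for t in grid]
f1_threshold = grid[int(np.argmax(f1s))]

print(f"Youden's J threshold: {youden_threshold:.4f}")
print(f"F1-optimal threshold: {f1_threshold:.4f}")

# The chosen threshold then replaces the hard-coded 0.5 above:
# assigned_class = 1 if sigmoid_output > chosen_threshold else 0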

Alex
  • That link is a good read, but I am concerned that you made a mistake in setting up your neural network if you are getting two probabilities greater than $0.5$. If the probability of being a picture of a dog is $0.51$, then the probability of being a cat cannot also be greater than $0.5$ (unless we have a weird situation like a [CatDog](https://en.wikipedia.org/wiki/CatDog)). – Dave Dec 04 '21 at 16:36
  • Are the classes mutually exclusive? Or is it possible for an input to have both classes present? For instance, an image could have a cat and a dog, or just a dog, or just a cat. – Sycorax Dec 04 '21 at 17:49
  • @Sycorax yes the classes are mutually exclusive – Alex Dec 04 '21 at 18:00
  • Your first paragraph could be read either as describing the predictions for examples labeled 0 or 1, or as describing the predictions of 2 sigmoid units; it’s not clear which. But for a mutually exclusive 2-class problem, you only need 1 sigmoid neuron. – Sycorax Dec 04 '21 at 18:14
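
For reference, a minimal sketch of the single-sigmoid-output setup described in the last comment; the framework (PyTorch) and the layer sizes here are assumptions for illustration only:

import torch
import torch.nn as nn

# One output unit is enough for a mutually exclusive 2-class problem:
# the network models P(class = 1) directly, and P(class = 0) = 1 - P(class = 1).
model = nn.Sequential(
    nn.Linear(16, 8),   # hypothetical input and hidden sizes
    nn.ReLU(),
    nn.Linear(8, 1),    # single logit output
)
criterion = nn.BCEWithLogitsLoss()  # applies the sigmoid internally for numerical stability

x = torch.randn(4, 16)                          # dummy batch
y = torch.tensor([[0.0], [1.0], [1.0], [0.0]])  # binary targets
loss = criterion(model(x), y)
loss.backward()

probs = torch.sigmoid(model(x))  # P(class = 1), compared against the chosen threshold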

0 Answers