
We have a VGG16 network trained from scratch with a sigmoid output layer. We have 6 classes, and a typical output looks like this:

'scores': [6.494849458249519e-08, 1.8738395510808914e-06, 3.010111981893715e-07, 0.0, 0.0, 0.8633317947387695]

The problem is that the output value is very low for most classes. I would like a normalized output that sums to 1.0. Thanks.

kambi

2 Answers


Sigmoid outputs will each vary between 0 and 1, but if you have $k$ sigmoid units, then the total can vary between 0 and $k$. By contrast, a softmax function sums to 1 and has non-negative values.
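To illustrate the difference, here is a minimal sketch in NumPy (the logit values are made up for the example). The softmax exponentiates the logits and divides by their sum, so the result is non-negative and sums to exactly 1, whereas independent sigmoids carry no such constraint:

```python
import numpy as np

def sigmoid(logits):
    return 1.0 / (1.0 + np.exp(-logits))

def softmax(logits):
    # Subtract the max for numerical stability; it cancels in the ratio.
    z = logits - np.max(logits)
    e = np.exp(z)
    return e / e.sum()

# Hypothetical raw network outputs (logits) for 6 classes.
logits = np.array([-2.0, 0.5, 1.0, -1.0, 0.0, 3.0])

s = sigmoid(logits)   # each value in (0, 1), but the total can be anywhere in (0, 6)
p = softmax(logits)   # non-negative values that sum to 1

print(s.sum())  # generally not 1
print(p.sum())  # 1.0
```

In practice this means swapping the final-layer activation from sigmoid to softmax (and typically training with a categorical cross-entropy loss) when the classes are mutually exclusive.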

Sycorax

If you are concerned about the outputs being too low, try re-scaling them. It isn't entirely clear what you mean by a normalized output summing to 1.

However, if you do want the outputs to sum to 1, one possibility is a log-likelihood or logit transformation. The log function properly accounts for values spanning many negative powers of 10.

Roger Vadim