I am trying to learn about naive Bayes by implementing a simple model to classify the Titanic dataset (so a binary classification). To keep things simple for now I am only including the two features most strongly correlated with survival: sex and the fare paid for the ticket. Sex is a binary variable (0 or 1 for male or female), whereas I am treating the fare as a discrete count. For the fare I have therefore used a Poisson distribution (after checking with a histogram that it follows this shape reasonably well), with the mean of the data as lambda.
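For context, here is roughly how I am fitting the per-class parameters. This is just a sketch on a tiny made-up array (the real data comes from the Titanic CSV); `sex`, `fare`, and `survived` stand in for the actual columns:

```python
import numpy as np

# Toy stand-in for the real Titanic columns (made-up values):
# sex: 0 = male, 1 = female; fare in whole dollars; survived: 0/1
sex      = np.array([0, 1, 1, 1, 0, 1, 0, 0])
fare     = np.array([7, 10, 30, 25, 9, 40, 26, 8])
survived = np.array([0, 0, 1, 1, 0, 1, 1, 0])

params = {}
for c in (0, 1):
    mask = survived == c
    params[c] = {
        "prior":    mask.mean(),       # P(class = c)
        "p_female": sex[mask].mean(),  # Bernoulli parameter for sex
        "lam":      fare[mask].mean(), # Poisson lambda for fare
    }
```

So each class ends up with its own prior, its own Bernoulli parameter for sex, and its own Poisson lambda for the fare.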
Here is my question: given a new data point that I want to classify, let's say a female passenger who paid $25. For the sex variable I get a probability (one for being female given survival, one for being female given death), whereas for the fare I get a likelihood from the Poisson PMF (the likelihood of paying $25 given survival, and the likelihood of paying $25 given death). Normally we take the log of likelihood values, but not of probabilities. How should I combine the two here? Do I take the log of the likelihood, add it to the raw probability, and then see which class (death or survival) gets the highest overall score?
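To make the question concrete, here is the scoring I currently have, where I just take the log of everything (prior, sex probability, and Poisson likelihood) and sum; the fitted numbers are placeholders, not the real Titanic fits, and I am not sure whether logging all three terms like this is actually valid:

```python
import numpy as np
from scipy.stats import poisson

# Placeholder fitted parameters (not the real Titanic estimates)
params = {
    0: {"prior": 0.62, "p_female": 0.32, "lam": 15.0},  # died
    1: {"prior": 0.38, "p_female": 0.68, "lam": 30.0},  # survived
}

def log_score(sex, fare, p):
    # log P(class) + log P(sex | class) + log P(fare | class)
    log_sex = np.log(p["p_female"] if sex == 1 else 1.0 - p["p_female"])
    log_fare = poisson.logpmf(fare, p["lam"])  # Poisson log-likelihood of fare
    return np.log(p["prior"]) + log_sex + log_fare

# Female passenger (sex = 1) who paid $25
scores = {c: log_score(1, 25, params[c]) for c in (0, 1)}
prediction = max(scores, key=scores.get)  # class with the highest log score
```

With these placeholder numbers the female $25 passenger scores higher for survival, but my question is whether this log-everything approach, or the mixed log/raw version, is the right one.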