Questions tagged [bayes]

Combining probabilities with Bayes' Theorem, especially as used for conditional inference.

Bayes' theorem is a basic result about the manipulation of conditional probabilities. For two events $A$ and $B$, the theorem reads:

$$P(A|B) = \frac{P(B|A)P(A)}{P(B)}$$

The theorem is taken as a starting point for Bayesian inference, with $A$ taking the role of parameters, and $B$ taking the role of data.
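As a concrete sketch of the formula above, here is the classic diagnostic-test calculation, with $A$ as "has the disease" and $B$ as "tests positive". All the numbers (1% prevalence, 95% sensitivity, 5% false-positive rate) are made up for illustration:

```python
# Hypothetical numbers for illustration only.
p_a = 0.01              # P(A): prior probability of disease
p_b_given_a = 0.95      # P(B|A): test positive given disease
p_b_given_not_a = 0.05  # P(B|~A): test positive given no disease

# P(B) by the law of total probability
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# Bayes' theorem: P(A|B) = P(B|A) P(A) / P(B)
p_a_given_b = p_b_given_a * p_a / p_b
print(round(p_a_given_b, 4))  # -> 0.161
```

Even with a fairly accurate test, the posterior is only about 16% because the prior is so low.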

226 questions
94
votes
12 answers

Who Are The Bayesians?

As one becomes interested in statistics, the dichotomy "Frequentist" vs. "Bayesian" soon becomes commonplace (and who hasn't read Nate Silver's The Signal and the Noise, anyway?). In talks and introductory courses, the point of view is…
Antoni Parellada
27
votes
3 answers

Why is a Normalizing Factor Required in Bayes' Theorem?

Bayes theorem goes $$ P(\textrm{model}|\textrm{data}) = \frac{P(\textrm{model}) \times P(\textrm{data}|\textrm{model})}{P(\textrm{data})} $$ This is all fine. But, I've read somewhere: Basically, P(data) is nothing but a normalising constant,…
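The point of the snippet can be shown numerically: P(data) is just the sum of prior-times-likelihood over all models, and dividing by it makes the posterior sum to 1. A minimal sketch with two hypothetical models and made-up numbers:

```python
# Two hypothetical models with made-up priors and likelihoods.
priors = {"m1": 0.5, "m2": 0.5}
likelihoods = {"m1": 0.2, "m2": 0.6}  # P(data | model)

# Unnormalized posteriors: prior * likelihood
unnorm = {m: priors[m] * likelihoods[m] for m in priors}

# P(data) is simply the sum of the unnormalized values...
p_data = sum(unnorm.values())

# ...so dividing by it turns them into a proper distribution.
posterior = {m: v / p_data for m, v in unnorm.items()}
print(posterior)                 # -> {'m1': 0.25, 'm2': 0.75}
print(sum(posterior.values()))   # -> 1.0
```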
16
votes
7 answers

What do/did you do to remember Bayes' rule?

I think a good way to remember the formula is to think of the formula like this: The probability that some event A has a particular outcome given an independent event B's outcome = the probability of both outcomes occurring simultaneously / whatever…
moonman239
14
votes
1 answer

Bayes Theorem with multiple conditions

I don't understand how this equation was derived. $P(I|M_{1}\cap M_{2}) \leq \frac{P(I)}{P(I')}\cdot \frac{P(M_{1}|I)P(M_{2}|I)}{P(M_{1}|I')P(M_{2}|I')}$ This equation was from the paper "Trial by Probability" where the case of OJ Simpson was given…
Sakurabe
13
votes
2 answers

Why did Thomas Bayes find Bayes' theorem so challenging?

This is more of a history of science question, but I hope it's on-topic here. I've read that Thomas Bayes only managed to discover Bayes' theorem for the special case of a uniform prior, and even then he struggled with it, apparently. Considering…
MWB
13
votes
2 answers

Why is Bayes Classifier the ideal classifier?

It is considered the ideal case in which the probability structure underlying the categories is known perfectly. Why is that with Bayes classifier we achieve the best performance that can be achieved ? What is the formal proof/explanation for this?…
DoOrDoNot
12
votes
2 answers

Linear discriminant analysis and Bayes rule: classification

What is the relation between Linear discriminant analysis and Bayes rule? I understand that LDA is used in classification by trying to minimize the ratio of within group variance and between group variance, but I don't know how Bayes' rule is used in it.
zca0
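One way to see the connection asked about here: LDA's classification step is Bayes' rule applied to Gaussian class densities with a shared variance, which makes the decision boundary linear. A minimal one-dimensional sketch with made-up means and priors:

```python
import math

# Each class k is modelled as N(mu_k, sigma^2) with a shared variance;
# classification applies Bayes' rule to these densities.
# All numbers below are made up for illustration.
mus = {"A": 0.0, "B": 2.0}
priors = {"A": 0.5, "B": 0.5}
sigma = 1.0

def gaussian_pdf(x, mu, s):
    return math.exp(-(x - mu) ** 2 / (2 * s**2)) / (s * math.sqrt(2 * math.pi))

def posterior(x):
    unnorm = {k: priors[k] * gaussian_pdf(x, mus[k], sigma) for k in mus}
    z = sum(unnorm.values())
    return {k: v / z for k, v in unnorm.items()}

# With equal priors and a shared variance, the decision boundary is the
# midpoint of the means (x = 1): both posteriors equal 0.5 there,
# which is exactly the linear rule LDA produces.
print(posterior(1.0))
```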
11
votes
1 answer

What is the difference between probability and fuzzy logic?

I have been working with fuzzy logic (FL) for years and I know there are differences between FL and probability, especially concerning the way FL deals with uncertainty. However, I would like to ask what more differences exist between FL and…
a.desantos
11
votes
2 answers

What is a "Unit Information Prior"?

I've been reading Wagenmakers (2007) A practical solution to the pervasive problem of p values. I'm intrigued by the conversion of BIC values into Bayes factors and probabilities. However, so far I don't have a good grasp of what exactly a unit…
Matt Albrecht
10
votes
4 answers

Why does Bayes' Theorem work graphically?

From a mathematical standpoint Bayes' Theorem makes perfect sense to me (i.e., deriving and proving), but what I do not know is whether or not there is a nice geometric or graphical argument that can be shown to explain Bayes' Theorem. I tried…
user25658
10
votes
2 answers

Locomotive problem with various size companies

I'm working through Think Bayes (free here: http://www.greenteapress.com/thinkbayes/) and I'm on exercise 3.1. Here's a summary of the problem: "A railroad numbers its locomotives in order 1..N. One day you see a locomotive with the number 60.…
Justin Bozonier
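The core calculation from the locomotive problem can be sketched directly: with a uniform prior over fleet sizes $N$, the likelihood of seeing locomotive 60 is $1/N$ for $N \geq 60$ and zero otherwise. The cap of 1000 below is an arbitrary assumption of the sketch, mirroring the book's setup:

```python
# Uniform prior over fleet sizes N (the prior cancels out of the
# posterior), likelihood 1/N of observing locomotive 60 if N >= 60.
max_n = 1000  # arbitrary upper bound, an assumption of this sketch
unnorm = {n: (1.0 / n if n >= 60 else 0.0) for n in range(1, max_n + 1)}
z = sum(unnorm.values())
posterior = {n: p / z for n, p in unnorm.items()}

# Posterior mean as a point estimate of the fleet size
mean_n = sum(n * p for n, p in posterior.items())
print(round(mean_n, 1))
```

The estimate is sensitive to the chosen upper bound, which is exactly what the exercise's "various size companies" variation probes.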
10
votes
2 answers

Naive Bayes on continuous variables

Please allow me to ask a basic question. I understand the mechanics of Naive Bayes for discrete variables, and can redo the calculations "by hand". (code of HouseVotes84 all the way per below). However - I am struggling to see how the mechanics…
Wouter
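The usual answer to this question is that a continuous feature is modelled per class with a fitted density, most commonly a Gaussian. A minimal sketch with made-up training values (not the HouseVotes84 data the asker mentions):

```python
import math

# Made-up training values for one continuous feature, per class.
train = {"yes": [1.0, 1.2, 0.8], "no": [3.0, 3.5, 2.5]}

def mean_var(xs):
    m = sum(xs) / len(xs)
    v = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance
    return m, v

params = {c: mean_var(xs) for c, xs in train.items()}
total = sum(len(xs) for xs in train.values())
priors = {c: len(xs) / total for c, xs in train.items()}

def gauss(x, m, v):
    return math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)

def predict(x):
    # Bayes' rule up to the shared normalizing constant
    scores = {c: priors[c] * gauss(x, *params[c]) for c in params}
    return max(scores, key=scores.get)

print(predict(1.1), predict(3.2))  # -> yes no
```

With several features, naive Bayes would multiply one such density per feature, which is where the "naive" independence assumption enters.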
10
votes
1 answer

Is this a correct way to continually update a probability using Bayes Theorem?

Let's say I'm trying to find out the probability that someone's favorite ice cream flavor is vanilla. I know that the person also enjoys horror movies. I want to find out the probability that the person's favorite ice cream is vanilla given that…
user1626730
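The mechanics the question asks about are just this: the posterior after one observation becomes the prior for the next. A minimal sketch for a binary hypothesis ("favorite flavor is vanilla"), with made-up likelihoods:

```python
# Sequential updating: yesterday's posterior is today's prior.
# All numbers are made up for illustration.
prior = 0.3  # initial P(vanilla)
# Pairs of (P(evidence | vanilla), P(evidence | not vanilla))
evidence = [(0.8, 0.4), (0.6, 0.5)]

p = prior
for like_h, like_not_h in evidence:
    # One application of Bayes' theorem per observation
    p = like_h * p / (like_h * p + like_not_h * (1 - p))
print(round(p, 4))
```

This is valid when the observations are conditionally independent given the hypothesis; otherwise the later likelihoods must condition on the earlier evidence too.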
9
votes
2 answers

Applying Bayes: Estimating a bimodal distribution

I'm trying to estimate a bunch of bimodal distributions, i.e. two means and two standard deviations, based on a variable number of inputs. If no input is present, a constant value should be returned. From my anecdotal knowledge of what Bayes is…
user979
9
votes
1 answer

What would be an example of when L2 is a good loss function for computing a posterior loss?

L2 loss, together with L0 and L1 loss, are three very common "default" loss functions used when summarising a posterior by the minimum posterior expected loss. One reason for this is perhaps that they are relatively easy to compute (at least for…
Rasmus Bååth
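The standard fact behind this question is that the posterior mean minimizes expected L2 loss (just as the median minimizes L1 and the mode L0). A numerical sketch on a made-up discrete posterior:

```python
# Made-up discrete posterior over parameter values.
posterior = {0.0: 0.2, 1.0: 0.5, 3.0: 0.3}

def expected_l2(a):
    """Posterior expected squared-error loss of point estimate a."""
    return sum(p * (x - a) ** 2 for x, p in posterior.items())

mean = sum(x * p for x, p in posterior.items())  # posterior mean = 1.4

# Expected loss is smaller at the mean than at nearby candidates.
for a in (mean - 0.5, mean, mean + 0.5):
    print(round(a, 2), round(expected_l2(a), 4))
```

The same check with absolute loss would single out the posterior median instead, which is one way to frame when L2 is or is not a good default.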