Questions tagged [conditional-probability]

The probability that an event A will occur given that another event B is known to occur or to have occurred. It is commonly denoted by P(A|B).

From stat.yale.edu:

"The conditional probability of an event B is the probability that the event will occur given the knowledge that an event A has already occurred. This probability is written P(B|A), notation for the probability of B given A. In the case where events A and B are independent (where event A has no effect on the probability of event B), the conditional probability of event B given event A is simply the probability of event B, that is P(B). If events A and B are not independent, then the probability of the intersection of A and B (the probability that both events occur) is defined by P(A and B) = P(A)P(B|A)."

Excerpt reference: Wikipedia.
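To make the multiplication rule in the excerpt concrete, here is a small worked example with illustrative numbers (not taken from the excerpt): if $P(A) = 0.5$ and $P(B \mid A) = 0.3$, then

$$P(A \cap B) = P(A)\,P(B \mid A) = 0.5 \times 0.3 = 0.15, \qquad P(B \mid A) = \frac{P(A \cap B)}{P(A)} = \frac{0.15}{0.5} = 0.3.$$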

2137 questions
159
votes
2 answers

Deriving the conditional distributions of a multivariate normal distribution

We have a multivariate normal vector ${\boldsymbol Y} \sim \mathcal{N}(\boldsymbol\mu, \Sigma)$. Consider partitioning $\boldsymbol\mu$ and ${\boldsymbol Y}$ into $$\boldsymbol\mu = \begin{bmatrix} \boldsymbol\mu_1 \\ …
Flying pig
  • 5,689
  • 11
  • 32
  • 31
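For reference, the standard result this question asks to derive, stated here only as a sketch and assuming the block partition $\Sigma = \begin{bmatrix}\Sigma_{11} & \Sigma_{12}\\ \Sigma_{21} & \Sigma_{22}\end{bmatrix}$ matches the question's partition of $\boldsymbol\mu$ and $\boldsymbol Y$:

$$\boldsymbol Y_1 \mid \boldsymbol Y_2 = \boldsymbol y_2 \;\sim\; \mathcal{N}\!\left(\boldsymbol\mu_1 + \Sigma_{12}\Sigma_{22}^{-1}(\boldsymbol y_2 - \boldsymbol\mu_2),\; \Sigma_{11} - \Sigma_{12}\Sigma_{22}^{-1}\Sigma_{21}\right).$$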
147
votes
15 answers

Amazon interview question—probability of 2nd interview

I got this question during an interview with Amazon:
  • 50% of all people who receive a first interview receive a second interview
  • 95% of your friends that got a second interview felt they had a good first interview
  • 75% of your friends that DID NOT…
Rick
  • 1,431
  • 2
  • 11
  • 9
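The excerpt above is cut off, so the exact wording of the third figure is not visible here. Assuming it refers to friends who did not get a second interview but still felt the first one went well, and taking the 50% figure as the prior, Bayes' rule gives a sketch of the usual calculation:

$$P(\text{2nd} \mid \text{good}) = \frac{0.95 \times 0.5}{0.95 \times 0.5 + 0.75 \times 0.5} \approx 0.56.$$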
98
votes
3 answers

Can someone explain Gibbs sampling in very simple words?

I'm doing some reading on topic modeling (with Latent Dirichlet Allocation), which makes use of Gibbs sampling. As a newbie in statistics (well, I know things like binomials, multinomials, priors, etc.), I find it difficult to grasp how Gibbs sampling…
Thea
  • 983
  • 1
  • 7
  • 4
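As a minimal illustration of the idea (not tied to LDA), here is a sketch of a Gibbs sampler for a bivariate normal with correlation $\rho$, where each full conditional is itself normal; the names and values are invented for the example:

    import numpy as np

    rng = np.random.default_rng(0)
    rho = 0.8                 # target correlation of the bivariate normal
    x, y = 0.0, 0.0           # arbitrary starting point
    samples = []

    for _ in range(5000):
        # Alternately draw each coordinate from its full conditional,
        # holding the other coordinate at its current value.
        x = rng.normal(rho * y, np.sqrt(1 - rho**2))  # X | Y=y ~ N(rho*y, 1-rho^2)
        y = rng.normal(rho * x, np.sqrt(1 - rho**2))  # Y | X=x ~ N(rho*x, 1-rho^2)
        samples.append((x, y))

    samples = np.array(samples)
    print(np.corrcoef(samples[:, 0], samples[:, 1])[0, 1])  # should be near rho

The same alternating-conditional idea is what a collapsed Gibbs sampler for LDA applies to the topic assignments.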
62
votes
3 answers

A generalization of the Law of Iterated Expectations

I recently came across this identity: $$E \left[ E \left(Y|X,Z \right) |X \right] =E \left[Y | X \right]$$ I am of course familiar with the simpler version of that rule, namely that $E \left[ E \left(Y|X \right) \right]=E \left(Y\right) $ but I was…
JohnK
  • 18,298
  • 10
  • 60
  • 103
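A sketch of why the identity holds: it is the tower property $E\big[E(Y\mid\mathcal{G}_2)\mid\mathcal{G}_1\big]=E(Y\mid\mathcal{G}_1)$ for nested information sets $\mathcal{G}_1\subseteq\mathcal{G}_2$, applied with $\mathcal{G}_1=\sigma(X)$ and $\mathcal{G}_2=\sigma(X,Z)$:

$$\sigma(X)\subseteq\sigma(X,Z)\;\Longrightarrow\;E\big[\,E(Y\mid X,Z)\mid X\,\big]=E(Y\mid X).$$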
60
votes
4 answers

How to generate correlated random numbers (given means, variances and degree of correlation)?

I'm sorry if this seems a bit too basic, but I guess I'm just looking to confirm understanding here. I get the sense I'd have to do this in two steps, and I've started trying to grok correlation matrices, but it's just starting to seem really…
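One common two-step recipe, shown as a sketch with invented target values and not necessarily what the answers recommend: build the covariance matrix from the desired variances and correlation, then transform independent standard normals with its Cholesky factor.

    import numpy as np

    rng = np.random.default_rng(0)
    mu = np.array([1.0, 5.0])      # desired means (illustrative)
    sd = np.array([2.0, 0.5])      # desired standard deviations (illustrative)
    rho = 0.7                      # desired correlation (illustrative)

    # Covariance matrix implied by the variances and correlation.
    cov = np.diag(sd) @ np.array([[1.0, rho], [rho, 1.0]]) @ np.diag(sd)
    L = np.linalg.cholesky(cov)

    # If z is i.i.d. standard normal, mu + L z has mean mu and covariance cov.
    z = rng.standard_normal((10_000, 2))
    samples = mu + z @ L.T

    print(samples.mean(axis=0), np.corrcoef(samples.T)[0, 1])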
38
votes
14 answers

What is the intuition behind the formula for conditional probability?

The formula for the conditional probability of $\text{A}$ happening given that $\text{B}$ has happened is:$$ P\left(\text{A}~\middle|~\text{B}\right)=\frac{P\left(\text{A} \cap \text{B}\right)}{P\left(\text{B}\right)}. $$ My textbook explains the…
WorldGov
  • 705
  • 7
  • 14
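A worked single-die example of the formula (numbers chosen here for illustration): let $B=\{4,5,6\}$ and $A=\{\text{even}\}=\{2,4,6\}$. Restricting the sample space to $B$ leaves three equally likely outcomes, two of which are even, and the formula reproduces exactly that count:

$$P(A\mid B)=\frac{P(A\cap B)}{P(B)}=\frac{2/6}{3/6}=\frac{2}{3}.$$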
32
votes
8 answers

What is the probability that this person is female?

There is a person behind a curtain - I do not know whether the person is female or male. I know the person has long hair, and that 90% of all people with long hair are female. I know the person has a rare blood type AX3, and that 80% of all people…
ProbablyWrong
  • 333
  • 3
  • 7
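One answer-style sketch, under two strong and debatable assumptions (a 50/50 prior for the person's sex and conditional independence of hair length $H$ and blood type $B$ given sex): with prior odds of 1, $P(F\mid H)=0.9$ corresponds to a likelihood ratio of $9$ and $P(F\mid B)=0.8$ to a ratio of $4$, so

$$\frac{P(F\mid H,B)}{P(M\mid H,B)} = 1\times 9\times 4 = 36 \;\Longrightarrow\; P(F\mid H,B)=\frac{36}{37}\approx 0.97.$$

Whether those assumptions are warranted is exactly what the discussion under this question is about.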
32
votes
4 answers

Can anyone explain conjugate priors in simplest possible terms?

I have been trying to understand the idea of conjugate priors in Bayesian statistics for a while but I simply don't get it. Can anyone explain the idea in the simplest possible terms, perhaps using the "Gaussian prior" as an example?
Jenna Maiz
  • 779
  • 7
  • 17
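As a concrete instance of conjugacy with the Gaussian example the question mentions: for data $x_1,\dots,x_n \sim \mathcal{N}(\mu,\sigma^2)$ with $\sigma^2$ known and prior $\mu \sim \mathcal{N}(\mu_0,\tau_0^2)$, the posterior is again normal, $\mu\mid x \sim \mathcal{N}(\mu_n,\tau_n^2)$, with

$$\frac{1}{\tau_n^2}=\frac{1}{\tau_0^2}+\frac{n}{\sigma^2},\qquad \mu_n=\tau_n^2\left(\frac{\mu_0}{\tau_0^2}+\frac{n\bar{x}}{\sigma^2}\right).$$

The prior and posterior having the same functional form is what makes the prior conjugate.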
31
votes
4 answers

Intuition for Conditional Expectation of $\sigma$-algebra

Let $(\Omega,\mathscr{F},\mu)$ be a probability space, given a random variable $\xi:\Omega \to \mathbb{R}$ and a $\sigma$-algebra $\mathscr{G}\subseteq \mathscr{F}$ we can construct a new random variable $E[\xi|\mathscr{G}]$, which is the…
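For reference, the defining property behind that construction: $E[\xi\mid\mathscr{G}]$ is the ($\mu$-a.s. unique) $\mathscr{G}$-measurable random variable satisfying

$$\int_G E[\xi\mid\mathscr{G}]\,d\mu=\int_G \xi\,d\mu\qquad\text{for every }G\in\mathscr{G}.$$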
31
votes
5 answers

Wikipedia entry on likelihood seems ambiguous

I have a simple question regarding "conditional probability" and "likelihood". (I have already surveyed this question here but to no avail.) It starts from the Wikipedia page on likelihood. They say this: The likelihood of a set of parameter…
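The distinction at issue, in one line: the likelihood is numerically the same quantity as the sampling density but is read as a function of the parameter with the data held fixed,

$$\mathcal{L}(\theta\mid x)=f(x;\theta),$$

and as a function of $\theta$ it need not integrate (or sum) to 1.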
30
votes
8 answers

Two dice rolls - same number in sequence

I am currently studying the Statistical Inference class on Coursera. In one of the assignments, the following question comes up: "Suppose you rolled the fair die twice. What is the probability of rolling the same number two times in a row?" 1:…
Rishabh Sagar
  • 403
  • 1
  • 4
  • 5
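The arithmetic here is $6\times\frac{1}{6}\times\frac{1}{6}=\frac{1}{6}$, since the second roll just has to match whatever the first roll was. A quick simulation sketch (names and sample size invented for the example):

    import numpy as np

    rng = np.random.default_rng(0)
    rolls = rng.integers(1, 7, size=(1_000_000, 2))   # two fair-die rolls per row
    print((rolls[:, 0] == rolls[:, 1]).mean())        # close to 1/6 ≈ 0.167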
27
votes
4 answers

Problem with proof of Conditional expectation as best predictor

I have an issue with the proof of $E(Y|X) \in \arg \min_{g(X)} E\Big[\big(Y - g(X)\big)^2\Big]$, which very likely reveals a deeper misunderstanding of expectations and conditional expectations. The proof I know goes as follows (another version…
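A sketch of the key step of that proof: adding and subtracting $E(Y\mid X)$ and expanding gives

$$E\big[(Y-g(X))^2\big]=E\big[(Y-E(Y\mid X))^2\big]+E\big[(E(Y\mid X)-g(X))^2\big],$$

because the cross term $2\,E\big[(Y-E(Y\mid X))(E(Y\mid X)-g(X))\big]$ vanishes after conditioning on $X$; the second term is then minimized by taking $g(X)=E(Y\mid X)$.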
27
votes
3 answers

Why Is a Normalizing Factor Required in Bayes' Theorem?

Bayes theorem goes $$ P(\textrm{model}|\textrm{data}) = \frac{P(\textrm{model}) \times P(\textrm{data}|\textrm{model})}{P(\textrm{data})} $$ This is all fine. But, I've read somewhere: Basically, P(data) is nothing but a normalising constant,…
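The short answer, as a sketch: $P(\textrm{data})$ is what makes the posterior sum (or integrate) to 1 over all models, since by the law of total probability

$$P(\textrm{data})=\sum_{\textrm{models}}P(\textrm{data}\mid\textrm{model})\,P(\textrm{model}),$$

or the corresponding integral for continuous parameters; it does not depend on any particular model, which is why it can often be dropped up to proportionality.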
25
votes
2 answers

Definition of Conditional Probability with multiple conditions

Specifically, say I have two events, A and B, and some distribution parameters $ \theta $, and I'd like to look at $P(A | B,\theta)$. So, the simplest definition of conditional probability is, given some events A and B, then $P(A|B) = \frac{P(A \cap…
Splanky222
  • 251
  • 1
  • 4
  • 3
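The usual resolution, stated as a sketch: treat everything to the right of the bar as one conditioning block, so the two-event definition applies inside the measure already conditioned on $\theta$,

$$P(A\mid B,\theta)=\frac{P(A\cap B\mid\theta)}{P(B\mid\theta)}.$$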
25
votes
3 answers

Is there any difference between Frequentist and Bayesian on the definition of Likelihood?

Some sources say the likelihood function is not a conditional probability, some say it is. This is very confusing to me. According to most sources I have seen, the likelihood of a distribution with parameter $\theta$ should be a product of probability…
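For a sample $x_1,\dots,x_n$ drawn i.i.d. from a density $f(\cdot;\theta)$, both camps compute the same function,

$$\mathcal{L}(\theta; x_1,\dots,x_n)=\prod_{i=1}^n f(x_i;\theta);$$

the disagreement in terminology is that a frequentist treats $\theta$ as a fixed unknown, so this is not a conditional probability given a random $\theta$, whereas a Bayesian, who does model $\theta$ as random, may legitimately write it as $p(x\mid\theta)$.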