Questions tagged [bayes-rule]
13 questions
5
votes
1 answer
Why ignore the denominator of Bayes' rule?
I am a beginner in statistics. I have turned my attention to this topic because I wish to understand the concept of Deep Bayesian Learning, so I am starting with the basics. The question is:
The Bayes rule equation is given by
$P(X |…

Animesh Karnewar
- 153
- 1
- 5
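A note on the question above: the denominator $P(X)$ does not depend on the quantity being inferred, so it only normalizes the posterior. A minimal worked form, using generic symbols $\theta$ for the parameter and $X$ for the data (the excerpt cuts off before fixing notation):
$$ P(\theta|X) = \frac{P(X|\theta)\,P(\theta)}{P(X)} \propto P(X|\theta)\,P(\theta), \qquad P(X) = \int P(X|\theta)\,P(\theta)\,d\theta. $$
For tasks such as maximum a posteriori estimation or MCMC sampling, only the shape of the posterior in $\theta$ matters, so the constant $P(X)$ can be dropped.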
4
votes
1 answer
Bayesian Updating from Two Perspectives
Suppose there is a game of luck with chance of winning $p_w = .01$. You can attempt to cheat with success probability $p_c = .005$. If you successfully cheat, your win probability is $p_{w|c} = .3$. Players always try to cheat, and if you try to…

Jason
- 595
- 4
- 14
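For the game above, under one hedged reading of the truncated excerpt (taking $p_w$ as the win probability when the cheat does not succeed), the law of total probability and Bayes' rule give
$$ P(\text{win}) = p_c\,p_{w|c} + (1 - p_c)\,p_w = 0.005 \cdot 0.3 + 0.995 \cdot 0.01 = 0.01145, $$
$$ P(\text{cheat succeeded}\,|\,\text{win}) = \frac{p_c\,p_{w|c}}{P(\text{win})} = \frac{0.0015}{0.01145} \approx 0.131. $$
The full question may intend a different conditioning, so treat these numbers only as an illustration of how the update would run.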
2
votes
1 answer
Bayes rule when data $D$ is split into two independent parts: $D_a, D_b$
In the machine learning paper Overcoming catastrophic forgetting in neural networks, they present equation 1, the log of Bayes' rule:
$$ \log p(\theta|D) = \log p(D|\theta) + \log p(\theta) - \log p(D) \quad (1)$$
Where $\theta$ are the…

clam
- 198
- 2
- 6
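A derivation sketch of how the split enters, assuming the data decompose as $D = (D_a, D_b)$ with $D_a$ and $D_b$ independent given $\theta$ (the setting the excerpt describes): applying equation (1) with $p(D|\theta) = p(D_a|\theta)\,p(D_b|\theta)$ and $p(D) = p(D_a)\,p(D_b|D_a)$, then folding $p(D_a|\theta)\,p(\theta)/p(D_a)$ into the posterior given $D_a$, yields
$$ \log p(\theta|D) = \log p(D_b|\theta) + \log p(\theta|D_a) - \log p(D_b|D_a). $$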
1
vote
1 answer
Explanation of Equation 5.3 from Gaussian Processes for Machine Learning
I am currently reading C. E. Rasmussen & C. K. I. Williams' Gaussian Processes for Machine Learning and am working through chapter 5. I could not quite understand the derivation of equation 5.3. It would be helpful if any of you could explain…

Deepak Narayanan
- 43
- 3
1
vote
1 answer
Posterior distribution from piecewise likelihood
Consider a hierarchical Bayesian model for analysing data from an inhomogeneous Poisson process that we observe in discrete time. Let $Y_i, i = 1,...,n$, be the number of events occurring in the time interval $[i−1,i]$ and assume that
$P(Y_i=y_i |…

Grautus
- 11
- 1
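The excerpt cuts off before the likelihood, but for an inhomogeneous Poisson process with intensity $\lambda(t)$ observed over unit intervals, the counts are Poisson with interval-specific means:
$$ P(Y_i = y_i \,|\, \lambda) = \frac{\Lambda_i^{y_i} e^{-\Lambda_i}}{y_i!}, \qquad \Lambda_i = \int_{i-1}^{i} \lambda(t)\,dt. $$
The posterior over the intensity (or its hierarchical parameters) is then proportional to this likelihood times the prior; the actual piecewise form in the question may differ.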
1
vote
0 answers
Conditional Probability Question $(X \cup Y)$
I cannot seem to solve this conditional probability question.
Suppose $X$ and $Y$ are two events from a sample space with $\Pr(X) = 0.25$, $\Pr(Y) = 0.5$ and $\Pr(X|X \cup Y) = 0.5.$ Find $\Pr(X \cup Y)$.
I know that the union of $X$ and $Y$ will…

Joe Ademo
- 75
- 5
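One route for the question above, using only the fact that $X \subseteq X \cup Y$, so that $X \cap (X \cup Y) = X$:
$$ \Pr(X|X \cup Y) = \frac{\Pr(X \cap (X \cup Y))}{\Pr(X \cup Y)} = \frac{\Pr(X)}{\Pr(X \cup Y)} \quad\Rightarrow\quad \Pr(X \cup Y) = \frac{\Pr(X)}{\Pr(X|X \cup Y)} = \frac{0.25}{0.5} = 0.5. $$
(The value $\Pr(Y) = 0.5$ is not needed for this step.)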
0
votes
0 answers
Interpreting a Table for Bayes Rule
The question and the related table are given in the picture. I applied Bayes' rule as:
$P(C_1|X_1=-1,X_2=1) = \cfrac{P(X_1=-1|C_1)P(X_2=1|C_1)P(C_1)}{P(X_1=-1|C_1)P(X_2=1|C_1)P(C_1) + P(X_1=-1|C_2)P(X_2=1|C_2)P(C_2)}$
From the table,…

kursat
- 13
- 3
0
votes
1 answer
Marginal likelihood: Why is it difficult to compute in this case?
I have been reading up a bit on generative models, particularly trying to understand the math behind VAEs. While watching a talk online, I heard the speaker mention the following definition of the marginal likelihood, where we integrate out the latent…

Luca
- 4,410
- 3
- 30
- 52
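The definition the excerpt refers to is presumably the standard one, with observation $x$ and latent variable $z$:
$$ p_\theta(x) = \int p_\theta(x|z)\,p(z)\,dz. $$
When $p_\theta(x|z)$ is parameterized by a neural-network decoder, this integral has no closed form and runs over a high-dimensional $z$, which is why VAEs optimize a variational lower bound (the ELBO) instead of the marginal likelihood itself.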
0
votes
0 answers
Number of parameters to calculate in Naive Bayes with and without independence assumption
I am just getting started with understanding the theory behind Naive Bayes.
$Y$ = Boolean-valued random variable
$X_i$ = Boolean-valued random variable (a component of the random vector $\vec{X}$).
From what I understand, we want this:
$$
P(Y|X_1,X_2,X_3)
$$
It can be…

user3629892
- 151
- 4
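For the three-feature Boolean case in the excerpt, a standard count (assuming this is what the question is after) is: without the independence assumption, $P(X_1,X_2,X_3|Y)$ needs $2^3 - 1 = 7$ free parameters for each value of $Y$, plus one for $P(Y)$; with the Naive Bayes assumption $P(X_1,X_2,X_3|Y) = \prod_i P(X_i|Y)$, each $X_i$ needs one parameter per value of $Y$:
$$ 2\,(2^3 - 1) + 1 = 15 \quad \text{versus} \quad 2 \cdot 3 + 1 = 7. $$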
0
votes
1 answer
Question about some basics
I have been wondering about an issue connected with prevalence and have come to some conclusions which I would like to verify. It's not complicated. Let's assume we have two prevalences: (A) the percentage of the population with cancer - 10% - and (B)…

Ady
- 1
0
votes
1 answer
Computing posterior based on sum of multivariate normal distribution
Currently I am exploring topics for my undergrad thesis. Although I took a course in Bayesian statistics, I am not yet sure how to proceed in finding the posterior in the following case.
I have a $d$-dimensional prior distribution $\theta \sim \Pi…

J. Dekker
- 108
- 1
- 9
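The model in the excerpt is truncated, but if the setting is the usual conjugate one, with a Gaussian prior $\theta \sim \mathcal{N}(\mu_0, \Sigma_0)$ and a linear-Gaussian observation $y\,|\,\theta \sim \mathcal{N}(A\theta, \Sigma)$ (both assumptions here, not taken from the question), the posterior is Gaussian with
$$ \Sigma_n = \left(\Sigma_0^{-1} + A^\top \Sigma^{-1} A\right)^{-1}, \qquad \mu_n = \Sigma_n\left(\Sigma_0^{-1}\mu_0 + A^\top \Sigma^{-1} y\right). $$
If the observation is literally a sum of components of $\theta$ plus Gaussian noise, $A$ is the corresponding summing matrix.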
0
votes
1 answer
Conditional Independence
I have a joint probability, which factors as follows:
$P(A,B,C,D) = P(A,B) \cdot P(C|A) \cdot P(D|B)$
So I know that $C$ and $D$ are independent given $A$ and $B$, right?
I want to infer $P(A,B|C,D)$.
I use Bayes' Rule:
$P(A,B,C,D) = P(A,B)…

user3429986
- 317
- 1
- 6
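Given the stated factorization, one way to get the target posterior is to normalize the joint over $A$ and $B$:
$$ P(A,B|C,D) = \frac{P(A,B)\,P(C|A)\,P(D|B)}{\sum_{A',B'} P(A',B')\,P(C|A')\,P(D|B')}, $$
since the denominator is just $P(C,D)$ obtained by marginalizing the same factorization (sums become integrals for continuous variables).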
0
votes
1 answer
Conditional probabilities involving random variables and functions of these variables
I have that $Z = X + 2Y$, where $X, Y$ are independent. I know $f_X(x), f_Y(y), f_{X,Y}(x,y)$ and $f_Z(z)$. How can I find $f(x,y|z)$?
I know that $f(x,y|z) = f(x, y, z)/f(z) = f(z|x, y)\,f(x, y)/f(z)$, but I'm stuck because I can't find $f(x, y, z)$…

Bruce Kane
- 3
- 2
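For the last question, note that $Z$ is a deterministic function of $(X, Y)$, so the triple $(X, Y, Z)$ has no ordinary joint density; given $Z = z$, all the mass lies on the line $x + 2y = z$. A sketch of the standard resolution, parameterizing that line by $x$:
$$ f_{X|Z}(x|z) = \frac{f_X(x)\,f_Y\!\left(\tfrac{z - x}{2}\right)\tfrac{1}{2}}{f_Z(z)}, \qquad f_Z(z) = \int f_X(x)\,f_Y\!\left(\tfrac{z - x}{2}\right)\tfrac{1}{2}\,dx, $$
with $Y = (z - X)/2$ then determined exactly.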