Joint probability distribution of several random variables gives the probability that all of them simultaneously lie in a particular region.
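As a quick illustration of the definition (a minimal sketch, not tied to any particular question below): for two independent standard normal variables, the joint probability of the region $\{X \le 0, Y \le 0\}$ factors into the product of the marginal probabilities, $0.5 \times 0.5 = 0.25$, and a Monte Carlo estimate agrees.

```python
import numpy as np

# Illustrative sketch: estimate the joint probability
# P(X <= 0 and Y <= 0) for two independent standard normals
# by Monte Carlo. Independence lets the joint probability
# factor, so the exact value is 0.5 * 0.5 = 0.25.
rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
y = rng.standard_normal(1_000_000)
estimate = np.mean((x <= 0) & (y <= 0))
print(round(estimate, 3))  # close to 0.25
```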
Questions tagged [joint-distribution]
747 questions
32 votes · 3 answers
Shouldn't the joint probability of 2 independent events be equal to zero?
If the joint probability is the intersection of 2 events, then shouldn't the joint probability of 2 independent events be zero since they don't intersect at all? I'm confused.
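The confusion in this question is between *independent* and *disjoint* (mutually exclusive) events. A small concrete check with a fair die, using hypothetical events chosen for illustration, shows independent events can overlap with positive joint probability:

```python
from fractions import Fraction

# Independent is NOT the same as disjoint. For one fair die roll,
# let A = "even" and B = "at most 4". These events overlap, and
# they happen to be independent: P(A ∩ B) = P(A) * P(B) != 0.
outcomes = range(1, 7)
A = {n for n in outcomes if n % 2 == 0}   # {2, 4, 6}
B = {n for n in outcomes if n <= 4}       # {1, 2, 3, 4}

def p(event):
    return Fraction(len(event), 6)

print(p(A & B))        # 1/3 -- the joint probability, not zero
print(p(A) * p(B))     # 1/3 -- equals P(A)P(B), so A, B are independent
```

Disjoint events with positive probability are never independent, since $P(A \cap B) = 0 \ne P(A)P(B)$.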

gaston · 511
16 votes · 5 answers
Difference between the terms 'joint distribution' and 'multivariate distribution'?
I am writing about using a 'joint probability distribution' for an audience that would be more likely to understand 'multivariate distribution', so I am considering using the latter. However, I do not want to lose meaning while doing this.
Wikipedia…

David LeBauer · 7,060
16 votes · 1 answer
Upper bounds for the copula density?
The Fréchet–Hoeffding upper bound applies to the copula distribution function and it is given by
$$C(u_1,\ldots,u_d)\leq \min\{u_1,\ldots,u_d\}.$$
Is there a similar (in the sense that it depends on the marginal densities) upper bound for the copula…
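The bound the excerpt states is for copula *distribution functions*, and it is easy to verify numerically. As a sketch (using the independence copula purely for illustration), every value of $C(u_1,\ldots,u_d) = u_1 \cdots u_d$ sits below $\min\{u_1,\ldots,u_d\}$:

```python
import numpy as np

# Numerical illustration of the Fréchet–Hoeffding upper bound for
# copula distribution functions: C(u) <= min(u). Here C is the
# independence copula, C(u_1, ..., u_d) = u_1 * ... * u_d; since
# each u_i <= 1, the product can never exceed the smallest factor.
rng = np.random.default_rng(1)
U = rng.uniform(size=(10_000, 3))           # random points in [0,1]^3
independence = U.prod(axis=1)
upper = U.min(axis=1)
print(bool(np.all(independence <= upper)))  # True
```

Whether a comparable bound exists for copula *densities* is exactly what the question asks; note that the comonotone copula $\min\{u_1,\ldots,u_d\}$ itself has no bounded density.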

Coppola · 161
15 votes · 3 answers
Why don't we see Copula Models as much as Regression Models?
Is there any reason that we don't see copula models as much as we see regression models (e.g. https://en.wikipedia.org/wiki/Vine_copula, https://en.wikipedia.org/wiki/Copula_(probability_theory))?
I have spent the last few months casually reading…

stats_noob · 5,882
15 votes · 1 answer
How to find marginal distribution from joint distribution with multi-variable dependence?
One of the problems in my textbook is posed as follows. A two-dimensional continuous random vector has the following density function:
$$
f_{X,Y}(x,y)=
\begin{cases}
15xy^2 & \text{if } 0 < x < 1 \text{ and } 0 < y < x\\
0 &…
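The key step for densities of this kind (a worked sketch of the technique, assuming the support is $0 < y < x < 1$ as in the excerpt) is that the integration limits of one variable depend on the other: integrate $y$ over $(0, x)$ to get the marginal of $X$, and $x$ over $(y, 1)$ to get the marginal of $Y$.

```python
import sympy as sp

# Sketch: obtain marginals from a joint density with a dependent
# support, f(x, y) = 15*x*y**2 on 0 < y < x < 1.
x, y = sp.symbols("x y", positive=True)
f = 15 * x * y**2

f_X = sp.integrate(f, (y, 0, x))     # y ranges over (0, x) for fixed x
f_Y = sp.integrate(f, (x, y, 1))     # x ranges over (y, 1) for fixed y

print(f_X)                           # 5*x**4
print(sp.integrate(f_X, (x, 0, 1)))  # 1, so f_X is a valid density
```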

soren.qvist · 251
12 votes · 1 answer
Necessary and sufficient condition on joint MGF for independence
Suppose I have a joint moment generating function $M_{X,Y}(s,t)$ for a joint distribution with CDF $F_{X,Y}(x,y)$. Is $M_{X,Y}(s,t)=M_{X,Y}(s,0)\cdot M_{X,Y}(0,t)$ both a necessary and sufficient condition for independence of $X$ and $Y$? I checked a…

Silverfish · 20,678
12 votes · 3 answers
Maximum likelihood estimator of joint distribution given only marginal counts
Let $p_{x,y}$ be a joint distribution of two categorical variables $X,Y$, with $x,y\in\{1,\ldots,K\}$. Say $n$ samples were drawn from this distribution, but we are only given the marginal counts, namely for $j=1,\ldots,K$:
$$
S_j =…

R S · 507
11 votes · 3 answers
How to compare joint distribution to product of marginal distributions?
I have two finite-sampled signals, $x_1$ and $x_2$, and I want to check for statistical independence.
I know that for two statistically independent signals, their joint probability distribution is a product of the two marginal distributions.
I have…
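One practical route for finite samples (a sketch, not the only valid test): bin both signals, estimate the joint histogram, and compare it against the outer product of its own marginals, e.g. via an empirical mutual information, which is zero exactly when the binned joint factorizes.

```python
import numpy as np

# Sketch: compare a binned joint distribution to the product of its
# marginals via plug-in mutual information. The signals here are
# synthetic independent data, purely for illustration.
rng = np.random.default_rng(2)
x1 = rng.standard_normal(100_000)
x2 = rng.standard_normal(100_000)        # independent of x1

joint, _, _ = np.histogram2d(x1, x2, bins=20)
p_joint = joint / joint.sum()
p1 = p_joint.sum(axis=1)                 # marginal of x1
p2 = p_joint.sum(axis=0)                 # marginal of x2
p_prod = np.outer(p1, p2)                # joint IF independent

mask = p_joint > 0
mi = np.sum(p_joint[mask] * np.log(p_joint[mask] / p_prod[mask]))
print(mi)  # near 0 for independent signals (small positive bias from binning)
```

Note the plug-in estimate is biased upward roughly by $(K-1)(L-1)/(2N)$ for $K \times L$ bins and $N$ samples, so "near zero" should be judged against that baseline or a permutation test.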

Rachel · 571
11 votes · 2 answers
What is the moment of a joint random variable?
Simple question, yet surprisingly difficult to find an answer online.
I know that for an RV $X$, we define the $k$th moment as $$\int X^k \, dP = \int x^k f(x) \, dx$$
where the equality follows if $p = f \cdot m$, for a density $f$ and Lebesgue measure…

Charac · 111
10 votes · 4 answers
Product of 2 Uniform random variables is greater than a constant with convolution
I am trying to formulate the following question. $X$ and $Y$ are i.i.d. uniform random variables, $X, Y \sim U(0,1)$.
What is $P(XY > 0.5)$?
Here $0.5$ is a constant and could be any other value.
I do respect the geometrical solutions but what I would like to see…
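A sketch of the non-geometric route the asker wants: the product $Z = XY$ of independent $U(0,1)$ variables has density $f_Z(z) = \int_z^1 \frac{1}{x}\,dx = -\ln z$ on $(0,1)$, so $P(XY > c) = \int_c^1 -\ln z \, dz = 1 - c + c\ln c$. A Monte Carlo check agrees:

```python
import numpy as np

# Product of two independent U(0,1) variables: Z = XY has density
# f_Z(z) = -ln(z) on (0,1), hence P(XY > c) = 1 - c + c*ln(c).
c = 0.5
exact = 1 - c + c * np.log(c)

rng = np.random.default_rng(3)
z = rng.uniform(size=1_000_000) * rng.uniform(size=1_000_000)
mc = np.mean(z > c)
print(round(exact, 4))   # 0.1534
print(round(mc, 3))
```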

math_law · 231
10 votes · 2 answers
Does the multivariate Central Limit Theorem (CLT) hold when variables exhibit perfect contemporaneous dependence?
The title sums up my question, but for clarity consider the following simple example. Let $X_i \overset{iid}{\sim} \mathcal{N}(0, 1)$, $i = 1, \ldots, n$. Define:
\begin{equation}
S_n = \frac{1}{n} \sum_{i=1}^n…

Colin T Bowers · 745
10 votes · 1 answer
Is the maximum entropy distribution consistent with given marginal distributions the product distribution of the marginals?
There are generally many joint distributions $P(X_1 = x_1, X_2 = x_2, ..., X_n = x_n)$ consistent with a known set of marginal distributions $f_i(x_i) = P(X_i = x_i)$.
Of these joint distributions, is the product formed by taking the product of the…
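The answer is yes, and the reason is one identity: $H(X,Y) = H(X) + H(Y) - I(X;Y)$ with $I(X;Y) \ge 0$, so any dependence only lowers the joint entropy below that of the product. A numerical illustration (not a proof), with two small hypothetical joints sharing the same marginals:

```python
import numpy as np

# Illustration: among joints with fixed marginals, the product of the
# marginals has the largest entropy, since H(X,Y) = H(X)+H(Y)-I(X;Y)
# and I(X;Y) >= 0, with equality only under independence.
def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

product = np.outer([0.5, 0.5], [0.7, 0.3])   # independent joint
correlated = np.array([[0.5, 0.0],           # same marginals,
                       [0.2, 0.3]])          # but dependent

assert np.allclose(product.sum(axis=1), correlated.sum(axis=1))
assert np.allclose(product.sum(axis=0), correlated.sum(axis=0))
print(entropy(product) > entropy(correlated))  # True
```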

wnoise · 515
9 votes · 2 answers
Problem calculating joint and marginal distribution of two uniform distributions
Suppose we have random variable $X_1$ distributed as $U[0,1]$ and $X_2$ distributed as $U[0,X_1]$, where $U[a,b]$ means uniform distribution in interval $[a,b]$.
I was able to compute the joint pdf of $(X_1,X_2)$ and the marginal pdf of $X_1$.
$$…
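A sketch of the standard route for this construction: the joint pdf is $f(x_1, x_2) = f_{X_1}(x_1)\, f_{X_2 \mid X_1}(x_2 \mid x_1) = 1 \cdot \tfrac{1}{x_1}$ on $0 < x_2 < x_1 < 1$, so the marginal of $X_2$ is $\int_{x_2}^1 \tfrac{1}{x_1}\, dx_1 = -\ln x_2$, with CDF $F_{X_2}(t) = t - t\ln t$. A simulation agrees:

```python
import numpy as np

# X1 ~ U(0,1) and X2 | X1 ~ U(0, X1): joint pdf 1/x1 on 0 < x2 < x1 < 1,
# so the marginal of X2 is -ln(x2) with CDF F(t) = t - t*ln(t).
rng = np.random.default_rng(4)
x1 = rng.uniform(size=1_000_000)
x2 = rng.uniform(size=1_000_000) * x1   # draw X2 | X1 ~ U(0, X1)

t = 0.25
empirical = np.mean(x2 <= t)
analytic = t - t * np.log(t)
print(round(empirical, 3))
print(round(analytic, 3))               # 0.597
```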
user1102
9 votes · 1 answer
Mahalanobis distance on non-normal data
Mahalanobis distance, when used for classification purposes, typically assumes a multivariate normal distribution, and the distances from the centroid should then follow a $\chi^2$ distribution (with $d$ degrees of freedom equal to the number of…

jmilloy · 163
9 votes · 1 answer
Spacings between discrete uniform random variables
Let $U_1, \ldots, U_n$ be $n$ i.i.d. discrete uniform random variables on $(0,1)$ and their order statistics be $U_{(1)}, \ldots, U_{(n)}$.
Define $D_i=U_{(i)}-U_{(i-1)}$ for $i=1, \ldots, n$ with $U_0=0$.
I am trying to figure out the joint…

user13154 · 793