Questions tagged [random-variable]

A random variable or stochastic variable is a quantity whose value is subject to chance variation (i.e., randomness in a mathematical sense).

As opposed to other mathematical variables, a random variable conceptually does not have a single, fixed value (even if unknown); rather, it can take on a set of possible values, each with an associated probability.

Reference: Wikipedia

2176 questions
98
votes
8 answers

Generate a random variable with a defined correlation to an existing variable(s)

For a simulation study I have to generate random variables that show a predefined (population) correlation to an existing variable $Y$. I looked into the R packages copula and CDVine which can produce random multivariate distributions with a given…
Felix S
  • 4,432
  • 4
  • 26
  • 34
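A minimal sketch of one standard construction for the question above (in Python rather than R, with illustrative names, and assuming Gaussian noise is acceptable): mix the standardized existing variable with independent noise so that the population correlation equals a target $\rho$.

```python
import numpy as np

rng = np.random.default_rng(0)

def correlated_with(y, rho, rng):
    """Return a new variable whose correlation with `y` is approximately `rho`
    (mixes the standardized y with independent Gaussian noise)."""
    y_std = (y - y.mean()) / y.std()
    noise = rng.standard_normal(len(y))
    return rho * y_std + np.sqrt(1 - rho**2) * noise

y = rng.standard_normal(10_000)
x = correlated_with(y, rho=0.6, rng=rng)
print(np.corrcoef(x, y)[0, 1])  # close to 0.6 for large samples
```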
95
votes
6 answers

Convergence in probability vs. almost sure convergence

I've never really grokked the difference between these two measures of convergence. (Or, in fact, any of the different types of convergence, but I mention these two in particular because of the Weak and Strong Laws of Large Numbers.) Sure, I can…
raegtin
  • 9,090
  • 12
  • 48
  • 53
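For reference, the two definitions being contrasted in the question above:

$$X_n \xrightarrow{P} X \;\iff\; \lim_{n\to\infty} \Pr(|X_n - X| > \varepsilon) = 0 \ \text{ for every } \varepsilon > 0,$$

$$X_n \xrightarrow{\text{a.s.}} X \;\iff\; \Pr\!\left(\lim_{n\to\infty} X_n = X\right) = 1.$$

Almost sure convergence implies convergence in probability but not conversely, which is why the Strong Law of Large Numbers is a strictly stronger statement than the Weak Law.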
90
votes
9 answers

What is meant by a "random variable"?

What do they mean when they say "random variable"?
Baltimark
  • 2,028
  • 3
  • 19
  • 20
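The standard measure-theoretic answer, for reference: a random variable is a measurable function from a sample space to the real numbers, and its distribution is inherited from the underlying probability measure,

$$X : (\Omega, \mathcal{F}, \Pr) \to \mathbb{R}, \qquad \Pr(X \in B) = \Pr\big(\{\omega \in \Omega : X(\omega) \in B\}\big).$$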
72
votes
6 answers

What are i.i.d. random variables?

How would you go about explaining i.i.d. (independent and identically distributed) to non-technical people?
user333
  • 6,621
  • 17
  • 44
  • 54
70
votes
3 answers

How is the minimum of a set of IID random variables distributed?

If $X_1, ..., X_n$ are independent identically-distributed random variables, what can be said about the distribution of $\min(X_1, ..., X_n)$ in general?
Simon Nickerson
  • 811
  • 1
  • 8
  • 9
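The general answer follows directly from independence: the minimum exceeds $x$ only if every draw does, so with common CDF $F$,

$$\Pr\big(\min(X_1,\dots,X_n) > x\big) = \prod_{i=1}^n \Pr(X_i > x) = (1 - F(x))^n, \qquad F_{\min}(x) = 1 - (1 - F(x))^n.$$

For example, the minimum of $n$ i.i.d. Exponential($\lambda$) variables is Exponential($n\lambda$).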
67
votes
1 answer

Variance of product of multiple independent random variables

We know the answer for two independent variables: $$ {\rm Var}(XY) = E(X^2Y^2) − (E(XY))^2={\rm Var}(X){\rm Var}(Y)+{\rm Var}(X)(E(Y))^2+{\rm Var}(Y)(E(X))^2$$ However, if we take the product of more than two variables, ${\rm Var}(X_1X_2 \cdots…
damla
  • 791
  • 1
  • 7
  • 5
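The two-variable identity in the excerpt extends by induction to any number of mutually independent factors, since independence gives $E\big(\prod X_i^2\big) = \prod E(X_i^2)$:

$${\rm Var}\!\left(\prod_{i=1}^n X_i\right) = \prod_{i=1}^n \big({\rm Var}(X_i) + (E(X_i))^2\big) - \prod_{i=1}^n (E(X_i))^2.$$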
52
votes
4 answers

Why does the correlation coefficient between the random variables X and X-Y tend to be 0.7?

Taken from Practical Statistics for Medical Research where Douglas Altman writes in page 285: ...for any two quantities X and Y, X will be correlated with X-Y. Indeed, even if X and Y are samples of random numbers we would expect the…
nostock
  • 1,337
  • 4
  • 15
  • 22
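A one-line derivation of the value Altman reports, assuming $X$ and $Y$ are uncorrelated with equal variance $\sigma^2$:

$${\rm Corr}(X, X - Y) = \frac{{\rm Cov}(X, X - Y)}{\sqrt{{\rm Var}(X)\,{\rm Var}(X - Y)}} = \frac{\sigma^2}{\sqrt{\sigma^2 \cdot 2\sigma^2}} = \frac{1}{\sqrt{2}} \approx 0.707.$$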
44
votes
3 answers

Intuitive explanation for density of transformed variable?

Suppose $X$ is a random variable with pdf $f_X(x)$. Then the random variable $Y=X^2$ has the pdf $$f_Y(y)=\begin{cases}\frac{1}{2\sqrt{y}}\left(f_X(\sqrt{y})+f_X(-\sqrt{y})\right) & y \ge 0 \\ 0 & y \lt 0\end{cases}$$ I understand the calculus…
lowndrul
  • 2,057
  • 1
  • 18
  • 20
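A quick numerical check of the quoted density, sketched under the assumption that $X$ is standard normal (so $Y = X^2$ is chi-square with one degree of freedom):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
y = rng.standard_normal(1_000_000) ** 2     # simulated Y = X^2 with X ~ N(0, 1)

def f_Y(t):
    """Density from the change-of-variables formula for t > 0."""
    return (stats.norm.pdf(np.sqrt(t)) + stats.norm.pdf(-np.sqrt(t))) / (2 * np.sqrt(t))

print(f_Y(1.5))                                   # formula
print(stats.chi2.pdf(1.5, df=1))                  # chi-square(1) density: identical
print(np.mean((y > 1.45) & (y < 1.55)) / 0.10)    # simulated density estimate near 1.5
```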
41
votes
9 answers

How can I efficiently model the sum of Bernoulli random variables?

I am modeling a random variable ($Y$) which is the sum of some ~15-40k independent Bernoulli random variables ($X_i$), each with a different success probability ($p_i$). Formally, $Y=\sum X_i$ where $\Pr(X_i=1)=p_i$ and $\Pr(X_i=0)=1-p_i$. I am…
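One way to make the question above concrete, sketched with made-up success probabilities: the exact distribution of $Y$ is the Poisson binomial, but at ~15-40k terms a normal approximation with mean $\sum p_i$ and variance $\sum p_i(1-p_i)$ is usually adequate, and a Monte Carlo check is cheap.

```python
import numpy as np

rng = np.random.default_rng(2)
p = rng.uniform(0.01, 0.20, size=20_000)    # hypothetical p_i, not from the question

# Monte Carlo: each replicate draws every Bernoulli(p_i) and sums them
samples = np.array([(rng.random(p.size) < p).sum() for _ in range(2_000)])

# Normal approximation to Y = sum X_i
mu, sigma = p.sum(), np.sqrt((p * (1 - p)).sum())
print(samples.mean(), mu)      # both near sum(p_i)
print(samples.std(), sigma)    # both near sqrt(sum p_i (1 - p_i))
```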
41
votes
6 answers

Intuitive explanation of convergence in distribution and convergence in probability

What is the intuitive difference between a random variable converging in probability versus a random variable converging in distribution? I've read numerous definitions and mathematical equations, but that does not really help. (Please keep in mind,…
nicefella
  • 1,153
  • 2
  • 13
  • 18
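Complementing the definitions given earlier, convergence in distribution only constrains the marginal CDFs, not the joint behaviour of $X_n$ and $X$ on a common sample space:

$$X_n \xrightarrow{d} X \;\iff\; \lim_{n\to\infty} F_{X_n}(x) = F_X(x) \ \text{ at every continuity point } x \text{ of } F_X.$$

Convergence in probability implies convergence in distribution; the converse holds only when the limit is a constant.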
40
votes
2 answers

What is meant by the $\sigma$-algebra generated by a random variable?

Often, in the course of my (self-)study of statistics, I've met the terminology "$\sigma$-algebra generated by a random variable". I don't understand the definition on Wikipedia, but most importantly I don't get the intuition behind it. Why/when do…
DeltaIV
  • 15,894
  • 4
  • 62
  • 104
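For reference, the definition in question: $\sigma(X)$ is the collection of preimages of Borel sets, i.e. the coarsest $\sigma$-algebra with respect to which $X$ is measurable,

$$\sigma(X) = \left\{ X^{-1}(B) : B \in \mathcal{B}(\mathbb{R}) \right\}.$$

Intuitively, it contains exactly those events whose occurrence or non-occurrence can be decided by observing $X$ alone.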
39
votes
3 answers

Brain-teaser: What is the expected length of an iid sequence that is monotonically increasing when drawn from a uniform [0,1] distribution?

This is an interview question for a quantitative analyst position, reported here. Suppose we are drawing from a uniform $[0,1]$ distribution and the draws are iid; what is the expected length of a monotonically increasing sequence? I.e., we…
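A short simulation of one natural reading of the puzzle (the length of the initial strictly increasing run); under that reading the analytic answer is $\sum_{k\ge 1} 1/k! = e - 1 \approx 1.718$, and the sketch below only checks it empirically.

```python
import numpy as np

rng = np.random.default_rng(3)

def increasing_run_length(rng):
    """Draw uniforms until one fails to exceed its predecessor; return the run length."""
    length, last = 1, rng.random()
    while True:
        u = rng.random()
        if u <= last:
            return length
        length, last = length + 1, u

runs = [increasing_run_length(rng) for _ in range(200_000)]
print(np.mean(runs))   # close to e - 1 ≈ 1.718
```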
38
votes
8 answers

Simple examples of uncorrelated but not independent $X$ and $Y$

Any hard-working student is a counterexample to "all students are lazy". What are some simple counterexamples to "if random variables $X$ and $Y$ are uncorrelated then they are independent"?
Clare Brown
  • 51
  • 1
  • 2
  • 3
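A numerical illustration of the classic counterexample, $X \sim N(0,1)$ and $Y = X^2$: the covariance is $E(X^3) = 0$, yet $Y$ is a deterministic function of $X$.

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.standard_normal(1_000_000)
y = x ** 2                                 # completely determined by x

print(np.corrcoef(x, y)[0, 1])             # near 0: uncorrelated
# Not independent: the conditional mean of y changes drastically with |x|
print(y[np.abs(x) > 2].mean(), y[np.abs(x) < 1].mean())
```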
38
votes
2 answers

Variance of a function of one random variable

Let's say we have a random variable $X$ with known variance and mean. The question is: what is the variance of $f(X)$ for some given function $f$? The only general method that I'm aware of is the delta method, but it gives only an approximation. Now I'm…
Tomek Tarczynski
  • 3,854
  • 7
  • 29
  • 37
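For reference, the first-order delta method mentioned in the question: expand $f(X) \approx f(\mu) + f'(\mu)(X - \mu)$ around the mean $\mu = E(X)$, which gives

$${\rm Var}\big(f(X)\big) \approx \big(f'(\mu)\big)^2 \, {\rm Var}(X).$$

With only the mean and variance of $X$ known, approximations of this kind are generally the best available; an exact answer requires the full distribution of $X$.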
37
votes
4 answers

Functions of Independent Random Variables

Is the claim that functions of independent random variables are themselves independent, true? I have seen that result often used implicitly in some proofs, for example in the proof of independence between the sample mean and the sample variance of…
JohnK
  • 18,298
  • 10
  • 60
  • 103
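The precise statement used in such proofs: if $X$ and $Y$ are independent and $g, h$ are measurable, then $g(X)$ and $h(Y)$ are independent, because every event about $g(X)$ is an event about $X$:

$$\Pr\big(g(X) \in A,\ h(Y) \in B\big) = \Pr\big(X \in g^{-1}(A),\ Y \in h^{-1}(B)\big) = \Pr\big(g(X) \in A\big)\,\Pr\big(h(Y) \in B\big).$$

Note that the claim concerns functions of different independent variables (or of disjoint groups of them); two functions of the same variable need not be independent.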