What do "stochastically dependent" and "functionally dependent" mean, and what is the difference? (I saw these terms used in the paper "Maximum entropy sampling and optimal Bayesian experimental design".)
-
I discuss such a distinction in a reply (to an unrelated question) at http://stats.stackexchange.com/a/17148. – whuber Jan 05 '12 at 13:35
-
Thanks a lot. I got the intuition. Is there a place where I can find the formal definitions? – Jian Jan 06 '12 at 05:34
-
1"Stochastic" dependence is just the standard definition of lack of independence of probabilities or random variables; see almost any textbook on probability. For a clear definition of functional dependence see the [second sentence here.](http://mathdl.maa.org/images/upload_library/22/Ford/WFNewns.pdf) For even more information, Google "functional dependence mathematics definition". – whuber Jan 06 '12 at 14:17
-
@whuber The link you have given is broken. – Dilip Sarwate Apr 11 '17 at 15:58
-
@Dilip That's a shame--one would expect a national organization like the MAA to manage its site better than that. I searched their site and could not find that document again. – whuber Apr 11 '17 at 16:16
2 Answers
Let $A$ and $B$ denote Bernoulli random variables representing the inputs to an Exclusive-OR gate, and let the Bernoulli random variable $C = A\oplus B = A + B \bmod 2$ represent the output of the gate. We assume that the two inputs are functionally independent or physically independent random variables because the two input signals come from different circuits and we have no reason to believe that there is any dependence between them. We capture this notion in probability by insisting that $A$ and $B$ are stochastically independent random variables, that is, for $i,j \in \{0,1\}$, $$P(A = i, B = j) = P(A=i)P(B=j).\tag{1}$$

Now, the output of the Exclusive-OR gate is very much functionally dependent on its inputs. Notice that $$P(C = 1) = P(A=1,B=0) + P(A=0,B=1).\tag{2}$$ Note also that \begin{align} P(C=1,A=1) &= P(A=1,B=0) = P(A=1)P(B=0)\tag{3}\\ P(C=1,A=0) &= P(A=0,B=1) = P(A=0)P(B=1)\tag{4}\\ P(C=0,A=1) &= P(A=1,B=1) = P(A=1)P(B=1)\tag{5}\\ P(C=0,A=0) &= P(A=0,B=0) = P(A=0)P(B=0)\tag{6} \end{align}
Now for the fun part. $A$ and $B$ have already been assumed to be independent Bernoulli random variables, but now assume further that they both have parameter $\frac 12$. Then, from $(1)$ and $(2)$, we readily determine that $C$ also has parameter $\frac 12$. Furthermore, the right sides of $(3)$-$(6)$ all have value $\frac 14 = P(C=i)P(A=j)$, that is, $C$ and $A$ are stochastically independent random variables even though $C$ and $A$ are very much functionally dependent random variables. Similarly, $C$ and $B$ are stochastically independent but functionally dependent random variables. In fact, $A,B,C$ are an example of pairwise independent random variables that are not mutually independent random variables.
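A quick numerical sanity check (my own sketch, not part of the original argument): enumerating the four input pairs with $p = \frac 12$ confirms both the pairwise independence of $C$ and $A$ and the failure of mutual independence, since $P(A=1,B=1,C=1) = 0$ while the product of the three marginals is $\frac 18$.

```python
from itertools import product

p = 0.5  # P(A=1) = P(B=1) = 1/2

# Build the joint pmf of (A, B, C) with C = A XOR B.
joint = {}
for a, b in product([0, 1], repeat=2):
    c = a ^ b
    pr = (p if a else 1 - p) * (p if b else 1 - p)
    joint[(a, b, c)] = joint.get((a, b, c), 0.0) + pr

def marginal(idx, val):
    """P(V = val) where V is the idx-th coordinate of (A, B, C)."""
    return sum(pr for key, pr in joint.items() if key[idx] == val)

# Pairwise independence: P(C=i, A=j) == P(C=i) P(A=j) for all i, j.
for i, j in product([0, 1], repeat=2):
    lhs = sum(pr for (a, b, c), pr in joint.items() if c == i and a == j)
    assert abs(lhs - marginal(2, i) * marginal(0, j)) < 1e-12

# Mutual independence fails: P(A=1, B=1, C=1) = 0, yet the product of
# the marginals is (1/2)^3 = 1/8.
print(joint.get((1, 1, 1), 0.0))                         # 0.0
print(marginal(0, 1) * marginal(1, 1) * marginal(2, 1))  # 0.125
```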
Functionally independent random variables are assumed to be stochastically independent random variables; indeed, the notion of functional independence is the source of the assumption of stochastic independence. However, stochastic independence should not be assumed to imply functional independence: stochastically independent random variables can very well be functionally dependent.
Note that if in the example above we assume that independent Bernoulli random variables $A$ and $B$ have parameters distinct from $\frac 12$, then $C$ and $A$ are no longer stochastically independent random variables. On the other hand, the independence of $A$ and $B$ survives the departure from $\frac 12$ because that independence is based on the functional independence of $A$ and $B$.
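To see this numerically, rerunning the same check with an illustrative parameter such as $p = 0.3$ (my choice; any value other than $\frac 12$ works) breaks the factorization for $C$ and $A$:

```python
p = 0.3  # P(A=1) = P(B=1) = 0.3, no longer 1/2

# From (2): P(C=1) = P(A=1)P(B=0) + P(A=0)P(B=1).
p_c1 = p * (1 - p) + (1 - p) * p       # 0.42

# From (3): P(C=1, A=1) = P(A=1)P(B=0).
p_c1_a1 = p * (1 - p)                  # 0.21

# 0.21 != 0.42 * 0.3 = 0.126, so C and A are now dependent.
print(p_c1_a1, p_c1 * p)
```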

I would say that stochastic dependence between two random variables X and Y is just (given by) the conditional distributions of Y given X and of X given Y (and similarly for the dependence between any two random events, plus obvious generalizations of this problem). These conditional distributions are directly obtainable from the joint distribution of the random vector (X, Y). The task of finding the dependence is thus the task of finding the joint distribution of (X, Y), which may often be hard. For some examples, see the "parameter dependence method" for constructing conditional distributions, or the concept of "weak [i.e., stochastic] dependence".
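As a concrete sketch of "the conditional distributions are obtainable from the joint" (the joint table below is made up for illustration), conditioning on X = x just renormalizes the slice of the joint pmf at x:

```python
# Hypothetical joint pmf of (X, Y) on {0,1} x {0,1,2}.
joint = {
    (0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
    (1, 0): 0.05, (1, 1): 0.15, (1, 2): 0.40,
}

def conditional_y_given_x(x):
    """P(Y = y | X = x): renormalize the slice of the joint at X = x."""
    p_x = sum(pr for (xi, _), pr in joint.items() if xi == x)
    return {y: pr / p_x for (xi, y), pr in joint.items() if xi == x}

print(conditional_y_given_x(0))  # {0: 0.25, 1: 0.5, 2: 0.25}
print(conditional_y_given_x(1))  # {0: 0.083.., 1: 0.25, 2: 0.666..}
```

The two conditional distributions differ, so in this made-up table Y is stochastically dependent on X.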
Edit
Let me add the following to my previous answer. When the variable Y depends functionally (strongly) on X, any deterministic value x (when X = x) determines an exact deterministic value y (Y = y) according to a functional rule y = f(x). When y depends stochastically (weakly) on x, a particular value x does not exactly determine a value y, but instead affects the probability (or probability density, or hazard rate) of the random event Y = y occurring. Among other ways, x affects the "probability of y" by affecting a parameter of G(y; A), where G( ) is a cdf and A is its parameter, so that A becomes (deterministically) determined by x (A → A(x)). Thus, through the assignment A → A(x), we obtain the conditional distribution G*(y | x) = G*(y; A(x)), which describes the stochastic dependence of Y given X = x. For more, see the "parameter dependence method".
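Here is a minimal sketch of this parameter-dependence idea; the choice of G as a normal cdf and the rule A(x) = 2x + 1 are my own illustrative assumptions, not part of the method's definition:

```python
import random

def A(x):
    """Deterministic parameter map A -> A(x); here the mean of a normal."""
    return 2.0 * x + 1.0  # illustrative functional rule for the parameter

def sample_y_given_x(x, sigma=0.5):
    """Draw Y from G*(y | x) = Normal(A(x), sigma): the value x fixes the
    distribution of Y, not the value of Y itself."""
    return random.gauss(A(x), sigma)

# Functional dependence would give y = f(x) exactly; here x = 1.0 only
# shifts where the draws of Y concentrate (around A(1.0) = 3.0).
print([round(sample_y_given_x(1.0), 2) for _ in range(5)])
```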
