
I have the following recursive linear model:

$X = \varepsilon_1$

$Z = \beta_2 X + \varepsilon_2$

$Y = \beta_3 Z + \varepsilon_3$
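
Solving by substitution (and assuming the errors are mutually independent):

$E[Y|X, Z] = \beta_3 Z + E[\varepsilon_3|X, Z] = \beta_3 Z + E[\varepsilon_3]$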

So $E[Y|X, Z]$ is linear too. But what happens in the nonlinear case? Say I assume Probit models (or Logit, Poisson, etc.) for $Z$ and $Y$:

$X = \varepsilon_1$

$Pr(Z = 1|X) = \Phi(\beta_2 X)$

$Pr(Y = 1|Z) = \Phi(\beta_3 Z)$

Would $Pr(Y = 1|X, Z)$ be Probit too in this case? In preliminary simulations I've run, this seems to work, but I don't know what kind of theoretical result I could apply here. It would be great if you could also point me to some literature. Thanks for your help!
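
For concreteness, here is a minimal sketch of such a simulation (Python; numpy, scipy, and statsmodels assumed; the coefficient values are arbitrary and the errors are taken to be independent, i.e. no confounding):

```python
import numpy as np
from scipy.stats import norm
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 100_000
beta2, beta3 = 0.8, 1.5  # arbitrary illustrative values

# Simulate from the recursive Probit model
X = rng.standard_normal(n)                             # X = eps_1
Z = (rng.random(n) < norm.cdf(beta2 * X)).astype(int)  # Pr(Z = 1 | X) = Phi(beta2 * X)
Y = (rng.random(n) < norm.cdf(beta3 * Z)).astype(int)  # Pr(Y = 1 | Z) = Phi(beta3 * Z)

# Fit a Probit of Y on (X, Z): since Y depends on X only through Z,
# the coefficient on X should be close to 0 and the coefficient on Z
# close to beta3, i.e. Pr(Y = 1 | X, Z) is itself of Probit form.
exog = sm.add_constant(np.column_stack([X, Z]))
fit = sm.Probit(Y, exog).fit(disp=0)
print(fit.params)  # [intercept, X, Z] -> roughly [0, 0, 1.5]
```

Note that this sketch only covers the unconfounded case; with the confounder $U$ discussed in the comments below (correlated $\varepsilon_1$ and $\varepsilon_3$), $P(Y|X,Z) \neq P(Y|Z)$ and the coefficient on $X$ would generally not be zero.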

  • 1. Linear in what? Every random variable is a linear combination of itself. 1a. $E(Y|X, Z) = E(Y|Z)$ here, and that's equal to $\beta_3 Z + E(\varepsilon_3)$. 2. The probability of Y given Z and X is the same as the probability of Y given only Z, which you have specified. – Gijs Feb 16 '18 at 09:57
  • My question is motivated by the case where there is an unobserved confounder between X and Y. In a directed acyclic graph this would mean $X \leftarrow U \rightarrow Y$, or, in the linear case with additive errors, $Cov(\varepsilon_1, \varepsilon_3) \neq 0$. Then I can compute the causal effect of X on Y by front-door adjustment (Pearl 2009, p. 81-3) as $\sum_Z P(Z|X) \sum_{X'} P(Y|X',Z) P(X')$ (a small numerical sketch of this computation is included after the comments). Now I would like to make parametric assumptions for Y given its parents $(Z, \varepsilon_3)$. – user2746008 Feb 16 '18 at 11:56
  • But then $X$ is not independent of $Y$, given $Z$, because of the confounder $U$. So $P(Y|X,Z) \neq P(Y|Z)$, or am I missing something? – user2746008 Feb 16 '18 at 12:06
  • Where is $U$ in your calculations above? Or the covariance between $\varepsilon_1$ and $\varepsilon_2$? – Gijs Feb 16 '18 at 12:41
  • Not mentioned, I agree. – user2746008 Feb 16 '18 at 12:53
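
Following up on the front-door adjustment mentioned in the comments, here is a minimal sketch of that computation for binary $X$, $Z$, $Y$ (Python; the probability tables are made up purely for illustration, not fitted to anything):

```python
# Front-door formula: P(Y=1 | do(X=x)) = sum_z P(z|x) * sum_x' P(Y=1|x',z) * P(x')
# Hypothetical observational distributions for binary X, Z, Y.
p_x = {0: 0.6, 1: 0.4}                        # P(X = x')
p_z_given_x = {0: {0: 0.7, 1: 0.3},           # P(Z = z | X = x), indexed as [x][z]
               1: {0: 0.2, 1: 0.8}}
p_y1_given_xz = {(0, 0): 0.10, (0, 1): 0.50,  # P(Y = 1 | X = x', Z = z)
                 (1, 0): 0.20, (1, 1): 0.70}

def frontdoor_y1_do_x(x):
    """P(Y = 1 | do(X = x)) via the front-door adjustment."""
    total = 0.0
    for z in (0, 1):
        inner = sum(p_y1_given_xz[(xp, z)] * p_x[xp] for xp in (0, 1))
        total += p_z_given_x[x][z] * inner
    return total

print(frontdoor_y1_do_x(1) - frontdoor_y1_do_x(0))  # causal risk difference of X on Y
```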
