In the case of three or more variables, you can use the same line of reasoning as in the thread you linked to.
To address your other question, if I understand it correctly: you can define transformations in which each "new" random variable depends on both "old" random variables, and still end up with independent results.
Consider the following well-known example. Assume $X_1$ and $X_2$ are independent, each distributed $\text{Gamma}(\alpha, \beta)$ with $\beta$ a scale parameter. Then the original joint density is
$$
f_{X_1,X_2}(x_1,x_2) = \frac{1}{\Gamma(\alpha)^2 \beta^{2\alpha}}x_1^{\alpha-1}x_2^{\alpha-1}\exp\left[-\frac{x_1+x_2}{\beta}\right].
$$
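As a quick sanity check on the parameterization (this is just my own sketch, not part of the derivation): because $X_1$ and $X_2$ are independent, the joint density above should equal the product of two `scipy.stats.gamma` pdfs with shape $\alpha$ and scale $\beta$. The values of $\alpha$, $\beta$, $x_1$, $x_2$ below are arbitrary.

```python
import numpy as np
from math import gamma as G, exp
from scipy import stats

alpha, beta = 2.5, 1.7   # arbitrary shape and scale
x1, x2 = 0.8, 3.1        # arbitrary evaluation point

# Joint density as written above
joint = (x1 ** (alpha - 1) * x2 ** (alpha - 1)
         * exp(-(x1 + x2) / beta) / (G(alpha) ** 2 * beta ** (2 * alpha)))

# Product of the two Gamma(alpha, scale=beta) marginal densities
product = (stats.gamma(alpha, scale=beta).pdf(x1)
           * stats.gamma(alpha, scale=beta).pdf(x2))

print(np.isclose(joint, product))  # True
```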
Now define $Y_1 = X_1/(X_1 + X_2)$ and $Y_2 = X_1 + X_2$. The inverse transformation is $x_1 = y_1 y_2$, $x_2 = (1-y_1)y_2$, with Jacobian determinant $y_2$, so the new joint density is
$$
f_{Y_1,Y_2}(y_1,y_2) = \left[\frac{\Gamma(2\alpha)}{\Gamma(\alpha)^2 }y_1^{\alpha-1}(1-y_1)^{\alpha-1} \right]\left[\frac{1}{\beta^{2\alpha}\Gamma(2\alpha)}y_2^{2\alpha-1}\exp\left[-\frac{y_2}{\beta}\right] \right].
$$
Reading off the two factors, $Y_1 \sim \text{Beta}(\alpha,\alpha)$ and $Y_2 \sim \text{Gamma}(2\alpha,\beta)$, and since the joint density factors into the product of these marginals, $Y_1$ and $Y_2$ are independent.
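If it helps, here is a small Monte Carlo sketch of the claim (the choices of $\alpha$, $\beta$, the seed, and the sample size are arbitrary, and the correlation check is only a necessary condition for independence, not a proof):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(123)
alpha, beta, n = 2.0, 3.0, 100_000

# Independent Gamma(alpha, scale=beta) draws
x1 = rng.gamma(shape=alpha, scale=beta, size=n)
x2 = rng.gamma(shape=alpha, scale=beta, size=n)

y1 = x1 / (x1 + x2)
y2 = x1 + x2

# Kolmogorov-Smirnov tests against the claimed marginals (expect large p-values)
print(stats.kstest(y1, stats.beta(alpha, alpha).cdf))
print(stats.kstest(y2, stats.gamma(2 * alpha, scale=beta).cdf))

# Sample correlation between Y1 and Y2 should be close to 0
print(np.corrcoef(y1, y2)[0, 1])
```

Running something like this, the KS tests fail to reject the Beta$(\alpha,\alpha)$ and Gamma$(2\alpha,\beta)$ marginals, and the sample correlation sits near zero, consistent with the factorization above.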