The context of this question is ordinary least squares. $X$ denotes the design matrix.
I would like a proof of the claim – or a corrected version thereof – made in this other question that the exogeneity assumption $E[\epsilon|X] = 0$ implies that $E\!\left[h(X)^T\epsilon\right] = 0$ for any function $h$ on $X$. (By "any function" I guess what is meant is an arbitrary function $\mathbb{R}^{n\times p}\to \mathbb{R}^{n\times p}$, where $n\times p$ is the shape of the matrix $X$.)
The claim for the special case $h(X)=X$ says that the errors are uncorrelated with the regressors, which we can prove using the law of iterated expectations (i.e., the formula for the expected value of the product of two random variables):
\begin{align} E\!\left[X^T\epsilon\right] &= E_X\!\left[\vphantom{\sum}X^T E[\epsilon|X]\right] \\ &= E_X\!\left[X^T 0\right] \\ &= 0 \end{align}
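As a sanity check on this special case (my own illustration, not part of the claim I am quoting), here is a small Monte Carlo simulation in which $\epsilon$ is drawn independently of $X$, so that $E[\epsilon|X]=0$ holds by construction; the sample average of $X^T\epsilon$ should come out close to zero, as the proof predicts:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 1000, 3      # shape of the design matrix X
n_rep = 2000        # number of Monte Carlo replications

# Accumulate X^T eps over many independent draws of (X, eps).
total = np.zeros(p)
for _ in range(n_rep):
    X = rng.normal(size=(n, p))
    eps = rng.normal(size=n)   # independent of X, so E[eps|X] = 0
    total += X.T @ eps

# Per-coordinate sample average of X^T eps; should be close to zero.
print(total / (n_rep * n))
```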
However, if we try the same proof for the general case, then we seem to run into trouble:
\begin{align} E\!\left[h(X)^T\epsilon\right] &= E_{h(X)}\!\left[\vphantom{\sum}h(X)^T E[\epsilon|h(X)]\right] \\ &\overset{?}{=} E_{h(X)}\!\left[\vphantom{\sum}h(X)^T E[\epsilon|X]\right] \\ &= E_{h(X)}[h(X)^T 0] \\ &= 0 \end{align}
Does the equality $E[\epsilon|h(X)] = E[\epsilon|X]$ hold if and only if $h$ is injective?
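For what it's worth, a quick Monte Carlo check (again my own, with $\epsilon$ independent of $X$ and the clearly non-injective choice of $h$ that squares $X$ elementwise, discarding signs) is consistent with the conclusion $E\!\left[h(X)^T\epsilon\right] = 0$ holding even when $h$ is not injective, whatever the status of the questionable step:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 1000, 3      # shape of the design matrix X
n_rep = 2000        # number of Monte Carlo replications

# h(X) = elementwise square of X: not injective, since signs are lost.
total = np.zeros(p)
for _ in range(n_rep):
    X = rng.normal(size=(n, p))
    eps = rng.normal(size=n)   # independent of X, so E[eps|X] = 0
    total += (X**2).T @ eps

# Per-coordinate sample average of h(X)^T eps; should be close to zero.
print(total / (n_rep * n))
```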