Say we are fitting a model to count data (if needed, we can assume it follows a zero-inflated Poisson model).
To be specific, for each sample we observe multivariate covariates $\boldsymbol{x}$; the (unobserved) mean count $y$ depends on $\boldsymbol{x}$, and the observed count $c$ is drawn conditionally on $y$ (e.g. $c \mid y \sim \mathrm{Poisson}(y)$). I want to fit an arbitrary positive function $f(\boldsymbol{x}; \theta)$ with parameters $\theta$ to predict $y$ given $\boldsymbol{x}$.
Mathematically speaking, does the estimate $\hat{\theta}$ fitted to the observed pairs $(c,\boldsymbol{x})$ always converge, as the sample size increases, to the $\theta$ one would obtain by fitting directly to the latent pairs $(y,\boldsymbol{x})$? Is there a theory that guarantees this for some class of models/estimators?
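To make the question concrete, here is a minimal simulation sketch of the special case I have in mind. It assumes a log-linear form $f(\boldsymbol{x};\theta)=\exp(\theta^\top\boldsymbol{x})$ and plain Poisson counts (no zero inflation), both simplifying assumptions on my part, and fits $\theta$ by the same Poisson score equations to either the counts $c$ or the latent means $y$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup: y = f(x; theta) = exp(theta . x), and c | y ~ Poisson(y).
n, d = 20000, 3
theta_true = np.array([0.5, -0.3, 0.2])
X = rng.normal(size=(n, d))
y = np.exp(X @ theta_true)   # latent mean counts
c = rng.poisson(y)           # observed counts

def fit(X, target, iters=50, tol=1e-10):
    """Solve the Poisson score equations X^T (target - exp(X theta)) = 0
    by Newton's method (the log-likelihood is strictly concave here)."""
    theta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = np.exp(X @ theta)
        grad = X.T @ (target - mu)
        hess = X.T @ (X * mu[:, None])
        step = np.linalg.solve(hess, grad)
        theta += step
        if np.max(np.abs(step)) < tol:
            break
    return theta

theta_from_y = fit(X, y)  # fitted to the latent means: recovers theta_true
theta_from_c = fit(X, c)  # fitted to the counts: close for large n
print(theta_from_y, theta_from_c)
```

In this run the two estimates agree to within sampling noise of order $1/\sqrt{n}$, which is what makes me suspect a general consistency result; the question is whether (and under what conditions) that holds beyond this convenient log-linear/Poisson case.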