Consider a set $U$. My signal is a piecewise-constant "function" $Sig: t \mapsto s$, i.e. the signal at time $t$ equals some subset $s \subset U$. One can view $Sig(t)$ as a stochastic process.
For a given sequence of time points $\{t_i\}$, we have a sequence of partial observations $\{ Ob(t_i) \}$. Each observation $Ob(t_i)$ is not a noisy version of $Sig(t_i)$, but rather a "random" part of the "real" signal. That is:
$$ Ob(t_i) \subset Sig(t_i) \,.$$
(For example, $Sig(t_i)$ could be a graph, and $Ob(t_i)$ a random subgraph of $Sig(t_i)$.)
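To make the setup concrete, here is a minimal simulation sketch in Python. All modelling choices in it are my own assumptions, not part of the problem: each element of $U$ is included independently with probability `p_on` whenever the signal jumps, jumps happen with per-step probability `switch_rate`, and each element of the current signal shows up in the observation independently with probability `q_obs`.

```python
import numpy as np

rng = np.random.default_rng(0)

U = list(range(20))        # universe: 20 labelled elements
p_on = 0.4                 # (assumed) inclusion probability when the signal jumps
switch_rate = 0.1          # (assumed) per-step probability that Sig jumps to a new subset
q_obs = 0.5                # (assumed) probability that a present element appears in Ob

def new_signal():
    """Draw a fresh subset of U: each element included independently with prob p_on."""
    return {x for x in U if rng.random() < p_on}

def simulate(T):
    """Piecewise-constant set-valued signal Sig(t_i) and partial observations Ob(t_i)."""
    sig = new_signal()
    signals, observations = [], []
    for _ in range(T):
        if rng.random() < switch_rate:    # Sig stays constant for a while, then jumps
            sig = new_signal()
        obs = {x for x in sig if rng.random() < q_obs}   # Ob(t_i) is a random subset of Sig(t_i)
        signals.append(set(sig))
        observations.append(obs)
    return signals, observations

signals, observations = simulate(200)
print(observations[:3])
```

Here `p_on`, `switch_rate` and `q_obs` play the role of the "parameters of the underlying stochastic process" that I would like to estimate.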
Is there a theory that describes such partially observed stochastic processes? In particular, is there a method to infer (estimate) the parameters of the underlying stochastic process $Sig(t)$ using only a given sequence of observations $\{Ob(t_i)\}$?
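For instance, under the (assumed) independent-elements model from the sketch above, each element of $U$ becomes its own two-state hidden Markov chain (absent/present), and the parameters could in principle be fitted by maximum likelihood with the forward algorithm. The rough sketch below (continuing the simulation above, with a coarse grid search instead of a proper optimizer) illustrates the kind of estimator I have in mind; I am asking whether there is a general theory behind such set-valued observation models.

```python
import numpy as np
from itertools import product

def element_loglik(y, p_on, switch, q):
    """Scaled forward algorithm for one element's 0/1 observation sequence y
    under the two-state hidden chain (absent/present) described above."""
    A = np.array([
        [1 - switch * p_on,   switch * p_on],            # from z = 0 (absent)
        [switch * (1 - p_on), 1 - switch * (1 - p_on)],  # from z = 1 (present)
    ])
    B = np.array([
        [1.0,   0.0],   # z = 0: the element is never observed
        [1 - q, q],     # z = 1: observed with probability q
    ])
    pi = np.array([1 - p_on, p_on])   # stationary distribution of the chain
    alpha = pi * B[:, y[0]]
    ll = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for t in range(1, len(y)):
        alpha = (alpha @ A) * B[:, y[t]]
        ll += np.log(alpha.sum())
        alpha = alpha / alpha.sum()
    return ll

# Y[t, j] = 1 iff element U[j] was seen in Ob(t_i); uses `observations` and `U`
# from the simulation sketch above.
Y = np.array([[int(x in obs) for x in U] for obs in observations])

best_params, best_ll = None, -np.inf
for p, s, q in product([0.2, 0.3, 0.4, 0.5, 0.6],
                       [0.05, 0.1, 0.2],
                       [0.3, 0.5, 0.7]):
    ll = sum(element_loglik(Y[:, j], p, s, q) for j in range(len(U)))
    if ll > best_ll:
        best_params, best_ll = (p, s, q), ll

print("grid MLE (p_on, switch_rate, q_obs):", best_params)
```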
Feel free to add additional properties (constraints) on $Sig(t)$ or on $Ob(t)$ if that helps.
Update:
For example, a very simple problem from this class is considered here: https://stats.stackexchange.com/questions/83998/if-maria-performs-more-observations-per-unit-of-time-than-maximilien-how-can-he