
This question relates to my other question, where I actually noticed a mistake. Suppose I have a random variable $X\in[0,1]$ and a signal $S\in[0,1]$ bearing some info about $X$. I know $f_X(x)$ and $f_X(x|s)$. Can I calculate $f_S(s|x)$ and/or $f_S(s)$?

To be more specific, suppose I have uniform prior and beta posterior: $$f_X(x)=1,\ f_X(x|s)=\frac{1}{B(1+s\phi,1+(1-s)\phi)}x^{s\phi}(1-x)^{(1-s)\phi}.$$

Now, using Bayes' rule for $f_S(s|x)$, I arrive at: $$f_S(s|x)=\frac{f_X(x|s)f_S(s)}{f_X(x)}=\frac{1}{B(1+s\phi,1+(1-s)\phi)}x^{s\phi}(1-x)^{(1-s)\phi}\color{red}{\cdot f_S(s)},$$ (the factor in red is the one I omitted in the aforementioned question).
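A quick numerical sanity check of why the red factor cannot be dropped: with the arbitrary choices $\phi=2$ and $x=0.5$ (my numbers, not from the question), the likelihood ratio $f_X(x|s)/f_X(x)$ alone does not integrate to $1$ over $s$, so it cannot by itself be the conditional density $f_S(s|x)$.

```python
# Check that f_X(x|s)/f_X(x) alone does not integrate to 1 over s,
# so the factor f_S(s) is genuinely needed. Hypothetical phi = 2, x = 0.5.
from scipy.integrate import quad
from scipy.special import beta as B

phi, x = 2.0, 0.5

def likelihood_ratio(s):
    # f_X(x|s) / f_X(x), with f_X(x) = 1 (uniform prior on X)
    return x**(s * phi) * (1 - x)**((1 - s) * phi) / B(1 + s * phi, 1 + (1 - s) * phi)

total, _ = quad(likelihood_ratio, 0.0, 1.0)
print(total)  # noticeably different from 1
```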

Now, how can I get $f_S(s)$ or $f_S(s|x)$ explicitly? What am I missing?

Joanna F

1 Answer


You cannot identify $f(s|x)$ and $f(s)$ from $f(x|s)$ and $f(x)$ in general.

For instance, consider $\mathfrak{X}=\{1,2\}$ and $\mathscr{S}=\{1,2,3\}$. Write $$f(x)=(\beta_1,\beta_2)=(\beta_1,1-\beta_1),$$ $$f(x=i|s=j)=p_{ij},$$ and $$f(s)=(\alpha_1,\alpha_2,\alpha_3)=(\alpha_1,\alpha_2,1-\alpha_1-\alpha_2).$$ Then $f(s)$ solves $$\beta_1=\alpha_1 p_{11}+\alpha_2 p_{12}+\alpha_3 p_{13},$$ while the second equation $$\beta_2=\alpha_1 p_{21}+\alpha_2 p_{22}+\alpha_3 p_{23}$$ is redundant, since $p_{1j}+p_{2j}=1$ and $\beta_1+\beta_2=1$. A single linear equation in the two unknowns $(\alpha_1,\alpha_2)$ has no unique solution.
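To make the non-identifiability concrete, here is a small numerical illustration with hypothetical values of $p_{ij}$ (denoting $p_{ij}=f(x=i|s=j)$, so each column of the matrix sums to one): two different signal distributions $\alpha$ produce exactly the same observable distribution $\beta$.

```python
# Two different signal distributions alpha giving the same marginal beta.
# Hypothetical conditional probabilities p[i, j] = f(x = i+1 | s = j+1).
import numpy as np

p = np.array([[0.2, 0.5, 0.8],
              [0.8, 0.5, 0.2]])  # each column sums to 1 over x

alpha_a = np.array([1/3, 1/3, 1/3])
alpha_b = np.array([0.5, 0.0, 0.5])

beta_a = p @ alpha_a
beta_b = p @ alpha_b
print(beta_a, beta_b)  # both equal (0.5, 0.5)
```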

In the general case, there is no reason for the integral equation $$f(x)=\int_\mathscr{S} f(x|s)f(s)\,\text{d}s$$ to have a unique solution. For instance, if $s=(s_1,s_2)$ and $f(x|s)=f(x|s_1)$, then any marginal distribution for $s_2$ is compatible with the same $f(x)$.
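One way to see this numerically with the question's beta kernel (taking $\phi=2$ as an arbitrary choice): discretizing the integral equation on grids with more signal points than $x$ points yields an underdetermined linear system, so many distinct weight vectors standing in for $f(s)$ reproduce the same $f(x)$.

```python
# Discretized sketch of f(x) = ∫ f(x|s) f(s) ds with the beta kernel
# (assumed phi = 2): more s-grid points than x-grid points makes the
# system underdetermined, so the "solution" f(s) is far from unique.
import numpy as np
from scipy.stats import beta as beta_dist

phi = 2.0
x_grid = np.linspace(0.1, 0.9, 5)    # few observation points
s_grid = np.linspace(0.0, 1.0, 50)   # many signal points

# Kernel matrix A[k, l] = f(x_k | s_l)
A = np.array([[beta_dist.pdf(x, 1 + s * phi, 1 + (1 - s) * phi)
               for s in s_grid] for x in x_grid])

b = np.ones(len(x_grid))                    # target marginal f(x) = 1
f0, *_ = np.linalg.lstsq(A, b, rcond=None)  # one solution

# Any null-space direction of A gives another solution with the same f(x)
_, _, Vt = np.linalg.svd(A)
v = Vt[-1]                                  # satisfies A @ v ≈ 0
f1 = f0 + 0.1 * v

print(np.allclose(A @ f0, A @ f1))          # same f(x), different weights
```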

Xi'an