I am currently working on a question which seems to have an obvious answer, but it seems just impossible for me to find a rigorous proof of this relation (if it is true at all).
Imagine the following scenario:
You are asked to bet on the chance of your favorite soccer team winning their next game. Before you make your bet, however, you are provided with some information on which your probability will be conditioned. All of this information is negative, in the sense that each piece of information individually can only decrease the chance of your team winning. Let's look at an example, where we have the following:
Event A, the team wins
Information x (e.g. that the strongest player is injured), where P(A|x) < P(A)
Information y (e.g. that the team morale is low), where P(A|y) < P(A)
Given this scenario, my assumption would be that both pieces of information together can only push the probability down further, i.e.
P(A|x) > P(A|x&y), and P(A|y) > P(A|x&y)
because each additional (negative) piece of information can only further decrease the chance of your favorite team winning.
So, formally, what I am looking for is a proof that takes
P(A|x) < P(A) and P(A|y) < P(A) as premises and yields P(A|x) > P(A|x&y) and P(A|y) > P(A|x&y).
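In case it is useful for experimenting, here is a small Python sketch that computes the relevant conditional probabilities from a joint distribution over (A, x, y). The particular joint table below is just a hypothetical example I made up; the point is that you can plug in candidate distributions and check the premises and the conjectured conclusion numerically:

```python
# Hypothetical joint distribution over (A, x, y); each of the 8 outcomes
# gets a probability. These numbers are placeholders for experimentation,
# not a proof of anything.
joint = {
    (1, 1, 1): 0.05, (1, 1, 0): 0.10, (1, 0, 1): 0.10, (1, 0, 0): 0.25,
    (0, 1, 1): 0.15, (0, 1, 0): 0.10, (0, 0, 1): 0.10, (0, 0, 0): 0.15,
}
assert abs(sum(joint.values()) - 1.0) < 1e-12  # sanity check: sums to 1

def prob(pred):
    """Probability of the event described by pred(a, x, y)."""
    return sum(p for (a, x, y), p in joint.items() if pred(a, x, y))

def cond(event, given):
    """Conditional probability P(event | given)."""
    return prob(lambda a, x, y: event(a, x, y) and given(a, x, y)) / prob(given)

p_A    = prob(lambda a, x, y: a == 1)
p_A_x  = cond(lambda a, x, y: a == 1, lambda a, x, y: x == 1)
p_A_y  = cond(lambda a, x, y: a == 1, lambda a, x, y: y == 1)
p_A_xy = cond(lambda a, x, y: a == 1, lambda a, x, y: x == 1 and y == 1)

print(f"P(A)     = {p_A:.4f}")
print(f"P(A|x)   = {p_A_x:.4f}")
print(f"P(A|y)   = {p_A_y:.4f}")
print(f"P(A|x&y) = {p_A_xy:.4f}")
# Check premises and the conjectured conclusion for this particular table:
print("premises hold:  ", p_A_x < p_A and p_A_y < p_A)
print("conclusion holds:", p_A_xy < p_A_x and p_A_xy < p_A_y)
```

For this particular table both the premises and the conclusion happen to hold, but of course a single example decides nothing; the script is only meant as a sandbox for hunting a proof idea or a counterexample.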
I have been juggling probabilities for many hours now, and managed to show that this is equivalent to showing that
P(A|x) > P(A|x&y) <=> P(A|x&¬y) > P(A|x&y),
if I am not mistaken.
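To spell out why this equivalence holds (assuming 0 < P(y|x) < 1), one can expand P(A|x) over y:

```latex
P(A \mid x) \;=\; P(A \mid x \land y)\,P(y \mid x) \;+\; P(A \mid x \land \lnot y)\,P(\lnot y \mid x)
```

So P(A|x) is a convex combination of P(A|x&y) and P(A|x&¬y), and therefore P(A|x) > P(A|x&y) exactly when P(A|x&¬y) > P(A|x&y).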
I can also see that from P(A|y) < P(A) it follows that
P(A|y) < P(A|¬y),
and similarly for x.
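For reference, this step can be written out via the law of total probability (assuming 0 < P(y) < 1):

```latex
P(A) = P(A \mid y)\,P(y) + P(A \mid \lnot y)\,\bigl(1 - P(y)\bigr)
\;\Longrightarrow\;
P(A \mid \lnot y) = \frac{P(A) - P(A \mid y)\,P(y)}{1 - P(y)}
\;>\; \frac{P(A) - P(A)\,P(y)}{1 - P(y)} \;=\; P(A) \;>\; P(A \mid y),
```

where the inequality uses the premise P(A|y) < P(A).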
However, I struggle to tie these loose ends together, although I feel like the solution cannot be far away, or must be much simpler than I am making it look.
It would be great if someone knew whether this relation is in fact true or not, and if so, whether someone could provide a few hints for how to get to this conclusion. I am not asking for a full proof, as I would like to work it out myself as much as I can, but any help would be very much appreciated!
Thank you all,
Zito