Here's an example. Suppose that I bet \$10 on whether a coin toss comes up heads. If it comes up heads, I get \$20, and if it comes up tails, I give up \$10. At the same time, I bet \$10 on whether a pair of dice turns up "snake eyes" (two 1's). If it comes up snake eyes, I get \$20, and if it doesn't, I lose \$10.
What is the expectation of my winnings? If $X$ is the random variable for the amount of money I win on the coin toss (this amount will be negative if I lose), and $Y$ is the amount of money I win on the dice toss, the expectation of my winnings is $\mathsf{E}(X + Y) = \mathsf{E}X + \mathsf{E}Y$. The same calculation could also be expressed in terms of subtraction, by defining $X$ and $Y$ differently (for example, as losses rather than winnings).
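The two expectations above can be computed directly from the stated payoffs. A minimal sketch, reading each win or loss as the net change in money:

```python
# Coin bet: heads (probability 1/2) pays $20, tails costs $10.
p_heads = 0.5
E_X = p_heads * 20 + (1 - p_heads) * (-10)       # E(X) = 5.0

# Dice bet: snake eyes (probability 1/36) pays $20,
# any other roll costs $10.
p_snake_eyes = 1 / 36
E_Y = p_snake_eyes * 20 + (1 - p_snake_eyes) * (-10)   # E(Y) = -330/36

# Linearity of expectation: E(X + Y) = E(X) + E(Y).
E_total = E_X + E_Y
print(E_X, E_Y, E_total)
```

So the coin bet is favorable on its own, but the dice bet drags the combined expectation below zero.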
This idea can be used for more than calculating expectations, and it applies to many things other than simple games of chance. For example, suppose that a bird eats a berry, which will either provide the bird with a certain amount of energy or, if the berry is slightly poisonous, reduce the bird's energy. After that, the bird tries to catch an insect. It will gain energy if it catches the insect quickly, but if the insect is difficult to catch, it might lose more energy than it gains. Given certain probabilities for the outcomes of eating the berry and the outcomes of pursuing the insect, what is the expectation of the energy increase?
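The arithmetic is the same as for the bets. A sketch with hypothetical probabilities and energy values (none of these numbers come from the text):

```python
# Hypothetical: the berry adds 4 units of energy with probability 0.9,
# or costs 2 units (mildly poisonous) with probability 0.1.
E_berry = 0.9 * 4 + 0.1 * (-2)      # = 3.4

# Hypothetical: a quick catch (probability 0.3) yields 10 units,
# a long chase (probability 0.7) costs 5 units.
E_insect = 0.3 * 10 + 0.7 * (-5)    # = -0.5

# By linearity, the expected total energy change is the sum.
E_total = E_berry + E_insect        # = 2.9
print(E_berry, E_insect, E_total)
```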
Suppose that there is a need to sum many identical random variables (perhaps many instances of the bird eating berries), but that how many there are depends on another random quantity. Maybe the number of berries eaten depends on when a predator arrives. If each berry contributes the same random amount $X$, the total can be represented by multiplying $X$ by a random variable $Z$ for the number of berries eaten before the predator arrives.
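A small simulation can illustrate the product form. Assuming, as a hypothetical model, that $X$ and $Z$ are independent (the per-berry energy does not depend on when the predator shows up), the expectation of the product factors as $\mathsf{E}(XZ) = \mathsf{E}X \cdot \mathsf{E}Z$:

```python
import random

random.seed(0)

def total_energy():
    # Hypothetical model: Z berries are eaten before the predator
    # arrives (uniform on 1..5), and each contributes the same random
    # energy X (4 with probability 0.9, -2 with probability 0.1).
    z = random.randint(1, 5)
    x = 4 if random.random() < 0.9 else -2
    return x * z

samples = [total_energy() for _ in range(200_000)]
mc_mean = sum(samples) / len(samples)

# Because X and Z are drawn independently here, E(XZ) = E(X) * E(Z).
E_X = 0.9 * 4 + 0.1 * (-2)   # 3.4
E_Z = 3.0                    # mean of uniform {1, ..., 5}
print(mc_mean, E_X * E_Z)    # both close to 10.2
```

Note that independence is doing real work in this factorization; if the predator's arrival were correlated with berry quality, $\mathsf{E}(XZ)$ would not split into a product of expectations.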
There are cases where division matters. Suppose that if the bird is ill, the nutritional value of each berry is reduced by dividing its energy by a random quantity $W$ that reflects how ill the bird is.
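Division deserves some care, because even with independence the expectation of a quotient involves $\mathsf{E}(1/W)$, which is generally not $1/\mathsf{E}W$. A sketch with a hypothetical illness factor:

```python
import random

random.seed(1)

def diluted_energy():
    # Hypothetical model: per-berry energy X as before (4 or -2),
    # divided by an illness factor W drawn uniformly from {1, 2, 3, 4}
    # (1 = healthy, 4 = very ill), independent of X.
    x = 4 if random.random() < 0.9 else -2
    w = random.choice([1, 2, 3, 4])
    return x / w

samples = [diluted_energy() for _ in range(200_000)]
mc_mean = sum(samples) / len(samples)

# With X and W independent, E(X/W) = E(X) * E(1/W), and in general
# E(1/W) != 1 / E(W):
E_X = 3.4
E_inv_W = (1/1 + 1/2 + 1/3 + 1/4) / 4    # about 0.521, not 1/2.5 = 0.4
print(mc_mean, E_X * E_inv_W, E_X / 2.5)
```

The simulated mean matches $\mathsf{E}X \cdot \mathsf{E}(1/W)$, while naively dividing $\mathsf{E}X$ by $\mathsf{E}W$ gives a noticeably smaller (and wrong) answer.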