If two fair coins, a dime and a penny, are flipped and you are told that the dime came up heads, what is the likelihood that the penny comes up heads as well? Obviously 1/2.
On the other hand, if you are told that two coins are flipped and at least one is heads, then there is only a 1/3 chance that the other is heads.
It seems paradoxical. But there is a simple way to understand this. First consider the probabilities before you know anything about the flipped values. Represent a dime landing on heads as Dh; a dime landing on tails as Dt; and similarly for the penny: Ph and Pt.
Initially there are four possibilities:

Dh Ph
Dh Pt
Dt Ph
Dt Pt

All four possibilities have the same probability. If you are told that at least one coin landed heads, then you are left with these possibilities:

Dh Ph
Dh Pt
Dt Ph
All three of those are equally likely, and in only one of them are there two heads. So tails is more likely for the other coin.
But before booking a trip to Vegas, note that if, instead of being told that at least one coin landed heads, you are told that the dime landed on heads, then two possibilities remain:

Dh Ph
Dh Pt

Each has the same likelihood. Knowing the value of the dime doesn't help you guess the value of the penny.
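The elimination argument can be checked by brute force. Here is a short Python sketch that lists the four equally likely outcomes (first coin is the dime, second the penny) and applies each condition in turn:

```python
from itertools import product

# All four equally likely outcomes as (dime, penny) pairs.
outcomes = list(product("HT", repeat=2))

# Condition 1: at least one coin is heads.
at_least_one = [o for o in outcomes if "H" in o]
both_heads = [o for o in at_least_one if o == ("H", "H")]
print(len(both_heads), "/", len(at_least_one))  # 1 / 3

# Condition 2: the dime (first coin) is heads.
dime_heads = [o for o in outcomes if o[0] == "H"]
both_heads_given_dime = [o for o in dime_heads if o == ("H", "H")]
print(len(both_heads_given_dime), "/", len(dime_heads))  # 1 / 2
```

Counting favorable cases over remaining cases reproduces exactly the 1/3 and 1/2 answers above.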
The key to unraveling such apparent paradoxes is to characterize the initial set of possibilities ("initial" meaning before you receive any extra information) and then to eliminate possibilities based on that extra information.
Here is a different sounding problem for which this method works. There are four socks in a drawer: two red and two blue. They all feel the same to the touch. If you choose two without looking at them, what are the chances they will be of the same color?
To understand this, it is best to give a label to each sock: Ra and Rb for the red socks and Ba and Bb for the blue socks. In the end we don't care which red sock or which blue sock we get, but the labels let us lay out the probabilities precisely. Drawing two socks in order (first, then second), we can get:

Ra Rb  Ra Ba  Ra Bb
Rb Ra  Rb Ba  Rb Bb
Ba Ra  Ba Rb  Ba Bb
Bb Ra  Bb Rb  Bb Ba

All twelve of these are equally likely, but only four of them (Ra Rb, Rb Ra, Ba Bb, Bb Ba) give two socks of the same color. So the chance of a match is 4/12 = 1/3.
We can also approach this more abstractly. The probability that the first sock is red is 1/2, and in that case the probability that the second sock is also red is 1/3, because only one of the three remaining socks is red. So the probability of two reds is 1/2 × 1/3 = 1/6. Similarly, the probability of two blues is 1/6. Adding these up (because the two events are mutually exclusive), we get 1/3.
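The sock count can be verified with the same enumeration trick, using the labels Ra, Rb, Ba, Bb from above:

```python
from itertools import permutations

socks = ["Ra", "Rb", "Ba", "Bb"]

# All ordered ways to draw two distinct socks: 4 * 3 = 12, all equally likely.
draws = list(permutations(socks, 2))
# A matching pair means the same color, i.e. the same first letter.
matches = [(a, b) for a, b in draws if a[0] == b[0]]

print(len(matches), "/", len(draws))  # 4 / 12
```

Both routes, exhaustive listing and the multiplied conditional probabilities, land on 1/3.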
Here's a first problem for you:
1. Suppose I tell you the first sock is blue. Then what is the chance you get a pair? Suppose I tell you that at least one sock is blue. Then what is the chance for a pair?
In case socks lack sufficient glitter, let's return to a gambling scenario.
There are five opaque boxes. Two contain $10,000 each and the others contain the same weight in green paper, so from the outside they are indistinguishable. You are allowed to keep a single box, so you want one with $10,000 inside. Your adversary knows which boxes hold the money.
Here are the rules of the game. You point to a box. Your adversary must then open two of the other boxes, both containing no money. Three boxes now remain unopened, including the one you originally pointed to. You then have the option to switch your choice.
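Once you have worked out problem 2 on paper, a simulation is a good way to check your arithmetic. Below is a minimal Python sketch of the game as stated (the function name `play` and the trial count are my own choices); run it for both strategies and compare against your calculation:

```python
import random

def play(switch, trials=100_000, seed=0):
    """Simulate the five-box game; return the fraction of wins.

    Two of five boxes hold money, you point to one, the adversary opens
    two moneyless boxes among the rest, and you may switch to one of the
    two remaining unopened boxes.
    """
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        money = set(rng.sample(range(5), 2))   # the two boxes with $10,000
        choice = rng.randrange(5)              # your initial pick
        # The adversary opens two moneyless boxes other than yours.
        openable = [b for b in range(5) if b != choice and b not in money]
        opened = set(rng.sample(openable, 2))
        if switch:
            # Switch to one of the two remaining unopened boxes.
            others = [b for b in range(5) if b != choice and b not in opened]
            choice = rng.choice(others)
        wins += choice in money
    return wins / trials

print("stay:  ", play(switch=False))
print("switch:", play(switch=True))
```

The same harness adapts to problem 3 by having the adversary open only one moneyless box.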
2. Do you switch or not? Calculate the probabilities and see.
3. Suppose your adversary opens only one moneyless box. Do you switch in that case? The same simple method works.
4. Oh, yes, the following should be easy assuming a society where boys and girls are born in roughly even numbers. If you know a family has two kids and you see one playing outside and the kid is a girl, what is the chance that the other is a girl?