Math Fun

Correct

You decide to play a game with your friend: your friend places a coin under one of three cups, then swaps the positions of two of the cups at a time, several times, so that the coin moves with the cup it is under. You then select the cup that you think the coin is under. If you win, you receive the coin; if you lose, you have to pay.

As the game starts, you realize that you are really tired, and you don’t focus very well on the moving of the cups. When your friend stops moving the cups and asks you where the coin is, you only remember a few things:

He put the coin in the rightmost cup at the start.

He switched two of the cups 3 times, as follows:

The first time he switched two of the cups, the rightmost one was switched with another.

The second time he switched two of the cups, the rightmost one was not touched.

The third time he switched two of the cups, the rightmost one was switched with another. This was the last switch.

Which cup is most likely to hold the coin?

Cup and coin: [tab]1/2, the cup on the right[/tab]

Yes. It is what he COULD think.
And he COULD think, “if everyone were to simply start with the same number then…”
And just as he wasn’t actually brown when he thought of what COULD be the case, to see where it would lead, he wasn’t thinking of an actual number when he thought, “what if everyone simply started with the same number…” so as to see where that would lead. Just as he COULD possibly be brown, he COULD possibly think of a common number.

And very importantly, he MUST eliminate such a possibility before he can ASSUME that everyone isn’t thinking such a thing rather than thinking of whether they are brown (or “not blue”).

Perfect logicians don’t ASSUME, they ELIMINATE alternatives by EXAMINING them (not simply jumping to the first possible assumption of what 200 others might be thinking).

James, I’m happy to keep talking about this, but it seems like everyone else has moved on and I don’t want to monopolize the thread. I’ll send you a PM with my response.

New Math Logic Fun:

Say that a chess board has two diagonally opposite corners removed, so that it contains 62 squares. Is it possible to place 2x1 rectangular tiles on the board to cover every square without the rectangles hanging off the board?

I know!
[tab]No
By my calculations, no it’s not. I’ve seen a similar problem before, and the logic works like this:

First of all, let’s call these ‘2x1 tiles’ dominos, and let’s assume that the dimensions of the dominos are exactly 2x1 relative to the size of each equally sized tile on the chessboard.
Also, we’re assuming that the only way to place them is for each to fully cover 2 tiles, of course, and the only way to do that is to align them with the chess grid (no partial covers, no diagonal pieces, etc).

Now, notice this: whichever way you place the domino, such that it fully covers two squares and is perfectly aligned with the grid, it covers 1 black and 1 white square. There is nowhere you can place it so that it fully covers 2 squares, stays aligned with the grid, and both squares are the same color.

So if you think about it, every domino must be covering 1 black and 1 white square, so the number of blacks must be equal to the number of whites covered.

Now think back to the problem statement, in which opposite corners of the chess board were removed. You start out with a chessboard of 64 squares, 32 black and 32 white. Equal black and white. When you remove 2 corners that are opposite each other, you remove 2 corners of the same color. This means that you’re left with 30 black and 32 white, or 32 black and 30 white. In either case, you’re left with an unequal number of black and white squares.

Since each domino, to be placed correctly, must be covering 1 black and 1 white square, you can only fully cover 60 of the squares with even the best strategy, as the remaining 2 will be of the same color.[/tab]
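The counting step in that argument is easy to check with a short Python sketch (not from the original post; it just tallies the colours after removing two opposite corners):

```python
# Colour each square of an 8x8 board: (row + col) % 2 == 0 -> "black".
squares = [(r, c) for r in range(8) for c in range(8)]

# Remove two diagonally opposite corners; note they share a colour.
for corner in [(0, 0), (7, 7)]:
    squares.remove(corner)

black = sum(1 for (r, c) in squares if (r + c) % 2 == 0)
white = len(squares) - black

# Each 2x1 domino covers one black and one white square,
# so a perfect tiling would need black == white.
print(black, white)  # 30 32 -> a perfect tiling is impossible
```

Since 30 ≠ 32, at most 30 dominoes (60 squares) can ever be covered, exactly as argued above.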

Correct, and thorough!

Awesome explanation. I couldn’t put my finger on why I couldn’t do it.

I know the explanation to the cup and coin:
[tab]The cup that initially covers the coin is swapped with one of the other two, giving a probability of 1/2 that it’s in either. The probability that it’s in the initial position is zero at this point.
Next, swapping the two not in the initial position does not change these probabilities.
Finally, swapping one of the two cups not in the initial position with the cup in the initial position means there’s a probability of 1/2 for each of them that it was swapped back to the initial position, and 1/2 that it was not.
For each of these two, the probability of 1/2 that the coin was swapped into that position AND the probability of 1/2 that it was NOT swapped back out makes a probability of 1/4 that it’s still there.
Two probabilities of 1/4 (one for each non-initial position) leave a probability of 1/2 that it’s back in the original position.[/tab]
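A quick Monte Carlo sketch of the cup-and-coin reasoning (assuming the friend picks each allowed swap partner uniformly at random, which "most likely" seems to imply):

```python
import random

def play():
    # Coin starts under the rightmost cup (index 2).
    cups = [0, 0, 1]
    # Swap 1: rightmost swapped with one of the other two.
    i = random.choice([0, 1])
    cups[i], cups[2] = cups[2], cups[i]
    # Swap 2: rightmost untouched, so the other two swap.
    cups[0], cups[1] = cups[1], cups[0]
    # Swap 3: rightmost swapped with one of the other two again.
    i = random.choice([0, 1])
    cups[i], cups[2] = cups[2], cups[i]
    return cups.index(1)  # where the coin ended up

trials = 100_000
counts = [0, 0, 0]
for _ in range(trials):
    counts[play()] += 1

# Expect roughly 1/4, 1/4, 1/2: the rightmost cup wins half the time.
print([c / trials for c in counts])
```

The rightmost cup comes out at about 0.5, matching the 1/2 answer above.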
And the explanation of the stick one:
[tab]The average position of the first snap is bang in the middle.
The average position of the second snap is bang in the middle of either of the 2 pieces resulting from the first snap.
So the average length of (either of the 2) the shorter piece(s) is a quarter of the original length of 1m.[/tab]
I can intuitively work out the baby one from long-unused conditional probability knowledge that sits somewhere in my subconscious, but as for the proper explanation I can’t quite explain it right.

I think you misunderstood that one.
There aren’t 2 snaps. There’s 1 snap, into 2 pieces.
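Under that reading (one snap at a uniformly random point along the stick; this simulation is just an illustrative sketch), the average length of the shorter piece works out to a quarter of the metre:

```python
import random

trials = 200_000
total = 0.0
for _ in range(trials):
    x = random.random()      # single snap point on a 1 m stick
    total += min(x, 1 - x)   # length of the shorter piece

# E[min(X, 1 - X)] for X ~ Uniform(0, 1) is 1/4.
print(total / trials)
```

The printed average sits very close to 0.25 m.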

The baby one:
[tab]With two children, there are 4 possible combinations:
B B
B G
G B
G G
If you know that one child is a boy, then that eliminates one possibility, leaving three:
B B
B G
G B
Thus, in only one third of cases where there is at least one boy will the other child be a boy.[/tab]
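The same enumeration can be done mechanically; a small Python sketch of the four equally likely birth orders:

```python
from itertools import product

families = list(product("BG", repeat=2))          # all 4 equally likely orderings
at_least_one_boy = [f for f in families if "B" in f]
both_boys = [f for f in at_least_one_boy if f == ("B", "B")]

print(len(both_boys), "/", len(at_least_one_boy))  # 1 / 3
```

Knowing "at least one is a boy" leaves 3 of the 4 cases, and only 1 of those is BB.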

Bam. Thanks.

Oh yeah. And now Carleas’ explanation makes a lot more sense.

Incorrect explanation to the chess board. The coloring has nothing to do with it.

If all you have to say is that it’s incorrect, without any explanation of what’s incorrect about it, while everyone else gets it and thinks it’s a good explanation (and even mathematicians agree: this is a well-solved problem), then I’ll just assume you don’t get it.

If you’d like to expound, however, please do.

I’ve always found this puzzle quite interesting:

There are 5 pirates, of different ages. They are all, as luck would have it, perfect logicians. They also have just found some treasure: 100 coins. They are deciding how to split it. The rules for deciding the splitting are as follows:

The oldest proposes how to split the treasure.
Everyone except him votes on whether it’s a good split.
Each pirate, having a strong sense of self-interest, bases their vote on the question, “Would I get less if this proposal failed?”
If the answer to that question is “yes,” they vote for it, else they vote against it.
If at least 50% of voting pirates vote for it, it passes; else it fails, the oldest is thrown overboard, and the new oldest makes a new proposal.

So, again, they’re perfect logicians, and each one has the explicit goals of
a) maximizing the money that gets split to them
b) not dying

So, what happens?

There’s an alternate version of the puzzle, in which all pirates vote, including the one who made the proposal, which has a different solution methinks.

[tab]So, when they get to the last two pirates, the youngest pirate will reject anything, because then she’ll end up with all the treasure.

When there are three, then, the second youngest will accept anything, possibly even zero, assuming being thrown overboard is worse than just getting nothing*. The third youngest will then offer something like “I get everything, and neither of you gets anything.” The vote will split, and the proposal will pass.

So the fourth youngest, knowing this, then knows that anything better than nothing will be accepted by the two youngest. He’ll propose something like “1 for the youngest, 1 for the second youngest, none for the third oldest, and the rest for me” The third will vote against, but the others will vote for, and it will pass.

The fifth, then, only has to give the two youngest 2 gold each, the third youngest 1 gold, and the fourth youngest 0 gold, and keep the rest for himself. The fourth will vote against, and the other three will vote for.

*If this assumption is bad, then the second youngest pirate gets 1 additional gold each round.[/tab]

here we go again #-o

My solution (I’ve tabbed each step because I’ve made it very detailed - you have been warned - answer is underlined in the final tab):

[tab]The only pirate who has no risk of dying (b) is the youngest pirate.
In the most extreme case, where he was the only one left, he would get the maximum of 100 coins all for himself, fulfilling (a) perfectly as well as (b).

In the next most extreme case, with 2 pirates left, the youngest would be the only voter, resulting in either 100% vote in favour of the proposal made by the oldest or 0%, which would get the oldest thrown overboard.
So to survive at all, the oldest has to offer an acceptable proposal for the youngest, who has nothing to lose. The only proposal that the youngest will accept is all 100 coins, or he will vote against it, be the only pirate left, and get all 100 coins anyway, due to (a).
So due to (b), the oldest of the two would offer 100 coins to the other pirate and have to put up with nothing for himself.[/tab]
[tab]With 3 pirates left, there are two voters. The oldest only has to satisfy 1 of them to get “at least 50%” of the vote, so he has to work out which one to satisfy and how much will satisfy them. The other doesn’t need to be offered anything, in order to maximise how much the other 2 get, as with (a) - it’s fine if just one of them votes against the proposal.
He can actually offer just 1 coin to the 2nd youngest, and nothing to the youngest, and take 99 for himself.

It is in the interests of the 2nd youngest pirate to vote for the oldest pirate of 3 to be kept alive (accepting his proposal) as long as the proposal involves 1 coin or more for him, because even accepting 1 coin would be better than the 0 that he would get if the oldest pirate of 3 is thrown overboard (as in the scenario above with only the 2 youngest pirates). To ensure the life of the oldest of 3, the 2nd youngest needs to either be sure that the youngest will vote for him (in order for him to get at least 50% of the vote) whether he himself votes for the proposal or not, or he needs to vote for him himself (in order to give at least 50% of the vote) whether the youngest follows suit or not.
As pointed out, the youngest is ensured (b) and at least 0 coins, so only has to worry about getting as many more coins as he can more than 0, as with (a). He will have worked out the dilemma of the 2nd youngest, and that he’s in danger of getting 0 coins if he votes for any proposal. His concern is to get at least 1 coin, and the best scenario is that both vote against whatever the proposal is, so he can secure 100 coins - but that’s only a possibility if he can be sure that the 2nd youngest will vote against the proposal, then he can too, and he’s in the money.

At this point, he can’t actually be sure that the 2nd youngest will vote against the proposal if 0 coins are offered to him, because he’ll get 0 coins whether he votes for or against. Needing to be sure in order to secure his 100 coins by also voting against any proposal, this possibility goes out the window. So he will also accept anything more than 0 coins, even 1 coin, in order to escape the 0 coins he’s in danger of being offered.

Having worked this out, the oldest of 3 could actually offer both of them 1 coin, or offer one of them 1 and the other 0. The former option leaves him with only 98, instead of the 99 from the latter two options. He can be more sure of his proposal being accepted with at least 1 vote if he offers the 1 coin to the 2nd youngest, who is in the worse position, having nothing to lose.[/tab]
[tab]With 4 pirates left, there are 3 voters. The eldest needs to please 2 of these 3 to get at least 50% of their vote.
The 2nd youngest would do better to ensure the survival of the oldest of 4 pirates if he offers him 2 or more coins, else if he is thrown overboard, he’s in danger of getting 1 or less (as in the above scenario with only 3 pirates). Being offered 2 coins or more will get the oldest pirate 1 of the 2 votes he needs to stay alive.
Assuming the 2nd youngest is offered 2 coins, this would leave a maximum of 98 for the 3rd youngest, which is less than he would get if the oldest of 4 pirates was thrown overboard. It’s in his best interest to vote against the eldest. He can ensure this if the eldest offers less than 2 coins to the 2nd youngest, by being the 2nd vote against any proposal, or at least if he can be sure that the youngest will vote against the oldest pirate whether 2 coins are going to the 2nd youngest or not. The youngest is going to vote for whatever he is offered, as long as it is more than the 0 he’s going to get if the oldest of 4 pirates is thrown overboard.
So if the oldest of 4 proposes 2 to the 2nd youngest, 1 to the youngest, and nothing to the 3rd youngest, he can take 97 for himself and still have his proposal accepted.
However, since the 3rd youngest will have worked this out, he knows he is in danger of getting nothing, so will also accept as little as 1 coin (or more). The oldest of 4 will deduce this and question giving 2 to the 2nd youngest. If he offers him nothing, losing his vote, can he guarantee the 2 votes of the other two? No, because if the other two work out that the 2nd youngest won’t vote for the proposal, the 3rd youngest stands to get 98 coins if he also votes against the proposal and gets the oldest of 4 thrown overboard, which he will prefer.
So the oldest of 4 has to give 2 or more to the 2nd youngest, and 1 can go to the youngest and nothing to the 3rd youngest. This is because the youngest will fear the possibility of the oldest of 4 pirates being thrown overboard and getting 0 coins. He is loss averse and the 2nd youngest will be swayed by the gain of 1 coin. The 3rd youngest will vote against but this won’t matter with 2/3 of the vote passing the proposal. Without giving the 1 coin to the youngest, the 3rd youngest has a chance of his vote being combined with the vote of the youngest to move them over to the 3 pirate scenario where the 3rd youngest gets 98 coins. So the 1 coin to the youngest secures that this possibility doesn’t happen. The oldest of 4 gets his 97 coins.[/tab]
[tab]Now the Grande Finale:
With all 5 pirates, there are 4 voters, and so only 2 of these 4 need to be satisfied.
According to 4 pirate scenario above, the pirates most easily pleased will be the 3rd youngest who would stand to get no coins, and the youngest who will get only 1 coin. Offer the youngest 2, the 3rd youngest 1, and they’ll have to accept. The other 2 are going to vote against getting nothing, but that doesn’t affect the proposal of 2 coins to the youngest, 1 coin to the 3rd youngest, nothing for the other two, and the remaining 97 coins for the oldest pirate. It will still pass with at least 50% of the vote in favour.

Perhaps interestingly, this may all change considering the possibility of finding coins in future. This might turn the pirates against each other, especially the youngest one against the others since he will never be thrown overboard if they keep the same rules, and especially since the 2nd oldest and the 2nd youngest are going to want more than nothing in future, and their primary target is going to be the oldest pirate who stands to rip them off in future in a similar way, and he’s the first in line to be thrown overboard.[/tab]

I think, between Carleas and Sil, my original answer was closest to Sil’s. But, Carleas said something interesting that I hadn’t thought of.
[tab]My first Solution Attempt (proposer can’t vote problem):

If there is only 1, he will take 100 coins.
If there are 2, the only way the oldest will survive is if he offers #1 all 100 coins. (Remember this step; this is the step that I think is very likely wrong in my original reasoning, based on something Carleas said.)
So, if there are 2, #1 gets 100 and #2 gets 0.
If there are 3, the oldest has to only earn 1 vote. He gives 1 coin to #2 and 0 to #1, and 99 for himself.
If there are 4, the oldest has to earn 2 votes.
He gives 1 to #1, 2 to #2, and 0 to #3, leaving 97 for himself.
If there are 5, the oldest has to earn 2 votes.
He gives 1 to #3, 2 to #1, and 0 to #2 and #4, leaving him with 97

see the second tab for my revised reasoning[/tab]
[tab]If there is only 1, he takes 100 coins. This is still true.
If there are 2, I thought before that 2 could offer 100 and survive. However…this isn’t the case, according to the way I’ve worded the problem.
“Each pirate, having a strong sense of self-interest, bases their vote on the question, “Would I get less if this proposal failed?”
If the answer to that question is “yes,” they vote for it, else they vote against it.”

If #2 offers #1 all 100 coins, the answer to the question is “no”, so he votes against it.
#2 dies no matter what if there are only 2.

And Carleas was right, then:
If there are only 3, the oldest doesn’t have to offer #2 anything; #2 will still vote for the proposal in order to not die.
So if there are 3, #3 offers #2 and #1 nothing, gets #2’s vote, and keeps 100 for himself.
If there are 4, then, the oldest needs 2 votes.
He can give 1 to #1 and 1 to #2, 0 to #3 and keep 98 for himself.
If there are 5, oldest needs 2 votes.
He gives 2 to (either of #1 or #2) and 1 to #3, and keeps 97 for himself.[/tab]
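That revised backward induction can be sketched as a small Python solver. It encodes the thread’s interpretation of the rules: a voter says yes only if the proposal gives them strictly more than they’d get were it to fail, except that a pirate who would die if it failed votes yes for anything; ties between equally cheap voters are broken arbitrarily, so only the proposer’s share is really pinned down:

```python
import math

DEAD = "dead"

def solve(n, coins=100):
    """Payoffs for pirates #1 (youngest, index 0) .. #n (oldest, the proposer)."""
    if n == 1:
        return [coins]
    # What each pirate gets if this proposal fails: the (n-1)-pirate
    # outcome, and the current proposer is thrown overboard.
    fail = solve(n - 1, coins) + [DEAD]
    # Bribe needed for voter i: 0 if they'd die on failure, else one coin
    # more than their failure payoff. Buy the cheapest votes first.
    def bribe(i):
        return 0 if fail[i] == DEAD else fail[i] + 1
    order = sorted(range(n - 1), key=bribe)
    needed = math.ceil((n - 1) / 2)      # at least 50% of the voters
    offer = [0] * n
    spent = 0
    for i in order[:needed]:
        offer[i] = bribe(i)
        spent += offer[i]
    if spent > coins:
        return fail                      # can't buy enough votes: he dies
    offer[n - 1] = coins - spent
    return offer

print(solve(5))  # [2, 0, 1, 0, 97]
```

It reproduces the steps above: with 2 pirates the proposer dies, with 3 he keeps 100, with 4 he keeps 98, and with 5 he keeps 97, bribing #3 with 1 coin and one of the two youngest with 2.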

Aren’t our solutions the same?

I don’t think this needs to be tabbed: it’s a good observation, but are we to give it priority over “the explicit goal” of “b) not dying” when they conflict?

[tab]Perhaps it doesn’t matter hugely, since the solutions are identical if the revised version had ended up with “#1” being offered the 2 coins rather than “#2”.

However, my solution that has disregarded the stalemate procedure in favour of considering “b)” and certainty, would mean that “the other way around” would make all the difference. Perhaps I’ll re-evaluate my solution in light of this procedure, but give “not dying” precedence when they conflict - see if it makes a difference.[/tab]

" So, if there are 2, #1 gets 100 and #2 gets 0.
If there are 3, the oldest has to only earn 1 vote. He gives 1 coin to #2 and 0 to #1, and 99 for himself.
If there are 4, the oldest has to earn 2 votes.
He gives 1 to #1, 2 to #2, and 0 to #3, leaving 97 for himself. "

Why does he have to give #2 more than 1 coin?
I don’t see a problem with your solution of the chessboard, it seems indefeatable. Not so with the aryan guru and her so called logical necessity. Obviousness doesn’t figure into logic, it’s too far short of certainty. But the riddle is ‘ambiguous’ - it’s not meant to be taken literally.