
It's Math-Puzzle Time

1356 Posts

  • SithDrummer Registered User regular
    edited April 2008
    Savant wrote: »
    @ Savant: That only works if you maintain that X is a variable, but once you've opened the case, it's not.

    Opening the case has no effect on how much money is in them, or what the probability of choosing the bigger case is. It just changes how much you know about the possible amounts of money in the cases.

    If you started out with a briefcase with X dollars, and then were offered to switch with equal chances of doubling or halving your money, then you would always want to switch. But you don't have equal chances of doubling or halving in this problem. It is set with probability 1 what you will gain or lose by switching.
    Agreed; the probability is in whether or not your decision is correct.

    But you can't say that you risk Y in either direction without making Y fluctuate based on your chosen case:
    1) I won "Y" because I doubled my original case value of Y!
    2) I lost "Y" because I halved my original case value of 2Y!

    The two Ys you've won/lost here are not equal when you know the value of your original case.

    SithDrummer on
  • Smasher Starting to get dizzy Registered User regular
    edited April 2008
    Savant wrote: »
    @ Savant: That only works if you maintain that X is a variable, but once you've opened the case, it's not.

    Opening the case has no effect on how much money is in them, or what the probability of choosing the bigger case is. It just changes how much you know about the possible amounts of money in the cases.

    If you started out with a briefcase with X dollars, and then were offered to switch with equal chances of doubling or halving your money, then you would always want to switch. But you don't have equal chances of doubling or halving in this problem. It is set with probability 1 what you will gain or lose by switching.

    You're correct that the amount of money in the cases is fixed. However, you don't know what's in the other case, so you have to make your decision with the information you do have. You know that if you switch you have a 50% chance of gaining $100 and a 50% chance of losing $50. The fact that which one is true is determined before you make your choice is irrelevant.

    Consider it another way: let's say you're given this choice multiple times with different pairs of briefcases, and the choices aren't rigged (i.e., for any given trial you're equally likely to start with the larger or the smaller case). Statistically you will get more money by switching every time than by staying every time, even though all of the briefcases are prepared beforehand.

    Smasher on
  • Savant Simply Barbaric Registered User regular
    edited April 2008
    Savant wrote: »
    @ Savant: That only works if you maintain that X is a variable, but once you've opened the case, it's not.

    Opening the case has no effect on how much money is in them, or what the probability of choosing the bigger case is. It just changes how much you know about the possible amounts of money in the cases.

    If you started out with a briefcase with X dollars, and then were offered to switch with equal chances of doubling or halving your money, then you would always want to switch. But you don't have equal chances of doubling or halving in this problem. It is set with probability 1 what you will gain or lose by switching.
    Agreed; the probability is in whether or not your decision is correct.

    But you can't say that you risk Y in either direction without making Y fluctuate based on your chosen case:
    1) I won "Y" because I doubled my original case value of Y!
    2) I lost "Y" because I halved my original case value of 2Y!

    The two Ys you've won/lost here are not equal when you know the value of your original case.

    Er, the two Ys are equal. One briefcase has twice as much money as the other from the start, regardless of your choice. The first one has Y in it, the second has 2Y. Opening the first one to see that there are X dollars in it doesn't change anything about the states of the briefcases, or that switching between them will always increase or decrease the value by exactly Y.

    Would you want to switch if you didn't get to open the first one? The expected value would be zero, as you don't know if you took the big or small one to start. And opening it doesn't change that.
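
    A minimal sketch of that point in Python (the amounts are arbitrary): fix one pair of cases and average over the two equally likely first picks.
    # Fix one pair of cases (Y and 2Y) and enumerate both possible first picks.
    Y = 100                       # arbitrary smaller amount
    cases = [Y, 2 * Y]

    stay_total = 0.0
    switch_total = 0.0
    for first_pick in (0, 1):     # the two first picks are equally likely
        stay_total += cases[first_pick]
        switch_total += cases[1 - first_pick]

    print("average if you stay:   {}".format(stay_total / 2))
    print("average if you switch: {}".format(switch_total / 2))
    # Both come out to 150.0: switching neither gains nor loses on average.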

    Savant on
  • Gooey (\/)┌¶─¶┐(\/) pinch pinch Registered User regular
    edited April 2008
    The amount of money in the case is not important. When you choose your case you have a 50/50 chance to multiply that case by .5 and to multiply that case by 2 by switching. Removing all emotions, it logically makes sense to switch as you have no way of knowing if your case is y or 2y. You do know that the potential gain from switching is greater than the potential loss, and therefore the logical choice - given you're adhering to the rules of opportunity cost.

    Gooey on
  • Smug Duckling Registered User regular
    edited April 2008
    I'm of the camp that it doesn't matter if you switch, and I wrote a script to test it. The results of switching and not switching are virtually identical (this is inspired by the thought experiment given earlier)

    If you have Perl you can try it out.

    Tell me if I am not doing something that I should be doing.
    my $result1 = 0;
    my $result2 = 0;
    my @arr = (10, 5);
    # Switching
    foreach(1..10000) {
    	# Randomly generate 0 or 1
    	my $rand = int(rand()+0.5);
    	my $val = $arr[abs($rand-1)];
    	$result1 += $val;
    }
    
    # Not switching
    foreach(1..10000) {
            # Randomly generate 0 or 1
    	my $rand = int(rand()+0.5);
    	my $val = $arr[$rand];
    	$result2 += $val;
    }
    
    print("Result 1: $result1\nResult 2: $result2\n");
    

    Suppose the briefcases have Y and 2Y dollars.

    If you see X dollars and you switch, there are two possibilities, each with 1/2 probability:

    1: X = Y, and you lose Y dollars and gain 2Y dollars (total gain of Y dollars)

    2: X = 2Y, and you lose 2Y dollars and gain Y dollars (total loss of Y dollars).

    Then the expected value of switching is:

    (1/2)(Y) + (1/2)(-Y) = 0

    Thus there is no benefit (or penalty) to switching.

    Smug Duckling on
  • Aroduc regular
    edited April 2008
    What's wrong is that you're not using X, you're using preset variables. There are only two possible numbers in your program, where the problem presents three.

    Aroduc on
  • localh77 Registered User regular
    edited April 2008
    I think this is actually a pretty complicated problem, based on what I've read here:

    http://members.aol.com/kiekeben/envelope.html

    localh77 on
  • Savant Simply Barbaric Registered User regular
    edited April 2008
    Gooey wrote: »
    The amount of money in the case is not important. When you choose your case you have a 50/50 chance to multiply that case by .5 and to multiply that case by 2 by switching. Removing all emotions, it logically makes sense to switch as you have no way of knowing if your case is y or 2y. You do know that the potential gain from switching is greater than the potential loss, and therefore the logical choice - given you're adhering to the rules of opportunity cost.

    You are applying a noninformative prior in a situation where it doesn't make sense.

    This isn't a matter of flipping a coin and doubling or halving your money. If you could flip a coin repeatedly to do that, it would behoove you to do so as you always stand to gain more than you lose. Read the problem statement again carefully:
    Aroduc wrote: »
    One little counterintuitive riddle that I always find interesting to see how people react is this:

    You're offered two briefcases. One contains twice as much money as the other. You pick briefcase A, and it's shown to have $X in it. You're then offered the chance to switch to briefcase B. Should you?

    Upfront one case has twice as much money as the other. Making your first choice of the briefcase locks in whether or not you will gain by switching to the second case. There is no coin flip on the switching choice, it is just an opportunity to remake your first choice. To make this even more obvious, would you then want to switch again if you were given the opportunity to without opening the other case? You are going to either gain or lose Y dollars by switching, and gain or lose back that Y dollars if you switch back.

    Savant on
  • Gooey (\/)┌¶─¶┐(\/) pinch pinch Registered User regular
    edited April 2008
    Savant wrote: »
    Gooey wrote: »
    The amount of money in the case is not important. When you choose your case you have a 50/50 chance to multiply that case by .5 and to multiply that case by 2 by switching. Removing all emotions, it logically makes sense to switch as you have no way of knowing if your case is y or 2y. You do know that the potential gain from switching is greater than the potential loss, and therefore the logical choice - given you're adhering to the rules of opportunity cost.

    You are applying a noninformative prior in a situation where it doesn't make sense.

    This isn't a matter of flipping a coin and doubling or halving your money. If you could flip a coin repeatedly to do that, it would behoove you to do so as you always stand to gain more than you lose. Read the problem statement again carefully:
    Aroduc wrote: »
    One little counterintuitive riddle that I always find interesting to see how people react is this:

    You're offered two briefcases. One contains twice as much money as the other. You pick briefcase A, and it's shown to have $X in it. You're then offered the chance to switch to briefcase B. Should you?

    Upfront one case has twice as much money as the other. Making your first choice of the briefcase locks in whether or not you will gain by switching to the second case. There is no coin flip on the switching choice, it is just an opportunity to remake your first choice. To make this even more obvious, would you then want to switch again if you were given the opportunity to? You are going to either gain or lose Y dollars by switching, and gain or lose back that Y dollars if you switch back.

    No.

    You have no way to know whether or not the case you have picked is y or 2y. It's that simple. Given the rules of the problem it presents you with 3 values, not 2.

    x, .5x and 2x

    x is the case you pick. The number in the second case has equal probability of being .5x and 2x, given the fact that you have no idea if the second case is y or 2y.

    Gooey on
  • Smasher Starting to get dizzy Registered User regular
    edited April 2008
    Python:
    #!/usr/bin/python
    
    import random
    
    def main():
    	trials = 10000
    	knownMoney = 100 # value of the suitcase of money we're shown
    	noSwitch = 0
    	switch = 0
    	for n in range(trials):
    		noSwitch += knownMoney # always stay, so always get the same amount
    		# randomly choose whether known suitcase is the smaller or bigger one
    		if random.random() >=.5: # if it's the bigger one
    			switch += knownMoney / 2
    		else: #it's the smaller one
    			switch += knownMoney * 2
    	print 'no switching:', noSwitch
    	print 'switching:', switch
    			
    if __name__ == "__main__":
    	main()
    

    And the results are:

    no switching: 1000000
    switching: 1250450

    Smasher on
  • Savant Simply Barbaric Registered User regular
    edited April 2008
    Gooey wrote: »
    Savant wrote: »
    Gooey wrote: »
    The amount of money in the case is not important. When you choose your case you have a 50/50 chance to multiply that case by .5 and to multiply that case by 2 by switching. Removing all emotions, it logically makes sense to switch as you have no way of knowing if your case is y or 2y. You do know that the potential gain from switching is greater than the potential loss, and therefore the logical choice - given you're adhering to the rules of opportunity cost.

    You are applying a noninformative prior in a situation where it doesn't make sense.

    This isn't a matter of flipping a coin and doubling or halving your money. If you could flip a coin repeatedly to do that, it would behoove you to do so as you always stand to gain more than you lose. Read the problem statement again carefully:
    Aroduc wrote: »
    One little counterintuitive riddle that I always find interesting to see how people react is this:

    You're offered two briefcases. One contains twice as much money as the other. You pick briefcase A, and it's shown to have $X in it. You're then offered the chance to switch to briefcase B. Should you?

    Upfront one case has twice as much money as the other. Making your first choice of the briefcase locks in whether or not you will gain by switching to the second case. There is no coin flip on the switching choice, it is just an opportunity to remake your first choice. To make this even more obvious, would you then want to switch again if you were given the opportunity to? You are going to either gain or lose Y dollars by switching, and gain or lose back that Y dollars if you switch back.

    No.

    You have no way to know whether or not the case you have picked is y or 2y. It's that simple. Given the rules of the problem it presents you with 3 values, not 2.

    x, .5x and 2x

    x is the case you pick. The number in the second case has equal probability of being .5x and 2x, given the fact that you have no idea if the second case is y or 2y.

    There are only 2 cases. So of the three values x, .5x, and 2x, one of them has zero probability of being in the case, and the other two have equal probability as there is one case of each. Your first choice of the case and measurement of x zeros out the probability of either .5x or 2x; you just don't know which one it is.

    Savant on
  • localh77 Registered User regular
    edited April 2008
    Gooey wrote: »
    Savant wrote: »
    Gooey wrote: »
    The amount of money in the case is not important. When you choose your case you have a 50/50 chance to multiply that case by .5 and to multiply that case by 2 by switching. Removing all emotions, it logically makes sense to switch as you have no way of knowing if your case is y or 2y. You do know that the potential gain from switching is greater than the potential loss, and therefore the logical choice - given you're adhering to the rules of opportunity cost.

    You are applying a noninformative prior in a situation where it doesn't make sense.

    This isn't a matter of flipping a coin and doubling or halving your money. If you could flip a coin repeatedly to do that, it would behoove you to do so as you always stand to gain more than you lose. Read the problem statement again carefully:
    Aroduc wrote: »
    One little counterintuitive riddle that I always find interesting to see how people react is this:

    You're offered two briefcases. One contains twice as much money as the other. You pick briefcase A, and it's shown to have $X in it. You're then offered the chance to switch to briefcase B. Should you?

    Upfront one case has twice as much money as the other. Making your first choice of the briefcase locks in whether or not you will gain by switching to the second case. There is no coin flip on the switching choice, it is just an opportunity to remake your first choice. To make this even more obvious, would you then want to switch again if you were given the opportunity to? You are going to either gain or lose Y dollars by switching, and gain or lose back that Y dollars if you switch back.

    No.

    You have no way to know whether or not the case you have picked is y or 2y. It's that simple. Given the rules of the problem it presents you with 3 values, not 2.

    x, .5x and 2x

    x is the case you pick. The number in the second case has equal probability of being .5x and 2x, given the fact that you have no idea if the second case is y or 2y.

    Exactly. Which means that you should switch regardless of the case you initially picked, which is confusing. In fact, you don't even need to open the case; you know before you do that you should switch. The resolution seems to lie in it being impossible to fill up the cases in the first place (i.e., picking a random number between 0 and infinity doesn't seem possible).

    localh77 on
  • Savant Simply Barbaric Registered User regular
    edited April 2008
    See, if the problem said "pick a briefcase, then the second briefcase is set to be either double or half the value of that briefcase and you are offered to switch" you would want to switch. But since he isn't talking about opening up a briefcase and stuffing it or emptying it of money, I somewhat doubt that is what the problem is.

    Savant on
  • localh77 Registered User regular
    edited April 2008
    Savant wrote: »
    See, if the problem said "pick a briefcase, then the second briefcase is set to be either double or half the value of that briefcase and you are offered to switch" you would want to switch. But since he isn't talking about opening up a briefcase and stuffing it or emptying it of money, I somewhat doubt that is what the problem is.

    But I think the problem could be restated that way. It wasn't really clear how the briefcases are filled, which is the problem. "You're offered two briefcases. One contains twice as much money as the other."

    Edit: nevermind, that's not really what I meant. What I meant was that the host could have filled the briefcases that way (that is, put a random amount of money in one briefcase. Then put either double or half that in the other briefcase).

    localh77 on
  • Savant Simply Barbaric Registered User regular
    edited April 2008
    localh77 wrote: »
    Savant wrote: »
    See, if the problem said "pick a briefcase, then the second briefcase is set to be either double or half the value of that briefcase and you are offered to switch" you would want to switch. But since he isn't talking about opening up a briefcase and stuffing it or emptying it of money, I somewhat doubt that is what the problem is.

    But I think the problem could be restated that way. It wasn't really clear how the briefcases are filled, which is the problem. "You're offered two briefcases. One contains twice as much money as the other."

    Er, since he isn't talking about changing the values of the briefcases after you choose the first one, I think it is pretty safe to say that he isn't doing that. The problem would be very stupidly worded and simply wrong if he meant that but did not say it.

    If there are just two briefcases at the start and their values are constant, then the expected value is zero. The outcome of your choice to switch is directly dependent upon your first choice of briefcase.

    Savant on
  • ElJeffe Not actually a mod. Roaming the streets, waving his gun around. Moderator, ClubPA mod
    edited April 2008
    Gooey wrote: »
    You have no way to know whether or not the case you have picked is y or 2y. It's that simple. Given the rules of the problem it presents you with 3 values, not 2.

    x, .5x and 2x

    x is the case you pick. The number in the second case has equal probability of being .5x and 2x, given the fact that you have no idea if the second case is y or 2y.

    This is the error you're making. It makes no sense to say that you need three values to model the values in two briefcases.

    Okay, here's another way of looking at it. Let's assume that we have two sets of values rather than an infinite number of sets. The briefcases either contain 50 and 100 dollars, or 100 and 200 dollars. If you sum up all the possible amounts you can win or lose by switching, you'll find that switching yields an expected return of 0. The net expected gain when you gain is $75, and the net expected loss when you lose is $75, and you'll gain 50% of the time, while losing 50% of the time. You'll gain as often as you'll lose, and you'll gain or lose the same amount on average.

    But there are more than two possible sets of values! you say. Fine, add more. Add a set containing 200 and 400 dollars, and recalculate the results. It'll be the same result in the end. Add another containing 25 and 50, and another containing 30 and 60, and another containing 1M and 2M. Run the numbers after each addition, you'll always get the same result - no net expected gain from switching. As the number of variations approaches infinity, you'll never get a situation in which the net expected gain deviates from zero.
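
    Here's a quick enumeration of that argument in Python (the dollar figures are the ones above; append more pairs and the result stays the same):
    # Enumerate every equally likely (pair, first pick) combination and average
    # the gain from switching.
    pairs = [(50, 100), (100, 200)]          # add (200, 400), (25, 50), ... if you like

    total_gain = 0.0
    outcomes = 0
    for small, big in pairs:
        for picked, other in ((small, big), (big, small)):
            total_gain += other - picked     # what switching wins or loses here
            outcomes += 1

    print("expected gain from switching: {}".format(total_gain / outcomes))
    # Prints 0.0 for these two pairs, and keeps printing 0.0 as you add more.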

    ElJeffe on
  • Marty81 Registered User regular
    edited April 2008
    Right, there's no coinflip, as the amount of money in the suitcases is determined ahead of time. I'm not being offered X dollars and a coinflip to double or halve it, because I'd take that coinflip any time. I'm being offered the choice between two boxes. One has X and the other 2X. If I pick X and switch, I gain X. If I pick 2X and switch, I lose X. That's it.

    In any case, even if it doesn't help, at least switching suitcases doesn't hurt you! :P

    edit: incomplete thought

    Marty81 on
  • zakkiel Registered User regular
    edited April 2008
    Smasher wrote: »
    Moridin wrote: »
    Shamelessly stolen from http://www.xkcd.com/blue_eyes.html

    No one ever believes the reasoning behind the answer to this at first. It's like the whole "which door do you look behind" game-show puzzle in terms of the answer being counterintuitive. Speaking of...I should look that one up, too.

    That one's called the Monty Hall problem. I'll save you the trouble and post it:

    You're asked to choose among three closed doors. Behind one door is a car, and behind the other two are goats. After you make your pick (though the door you picked is not opened, so you don't know what's behind it), the host opens one of the two doors you didn't pick. The host knows which door the car is behind, and he will always open a door with a goat behind it. He then gives you the option of switching your pick to the other unopened door. Assuming you want the car rather than a goat, should you switch, and why or why not?

    Argument start! Though simulations prove the following is correct:
    You should always switch.

    Reasoning: When you choose an original door the probability of choosing the car is 1/3. So the probability of the car being behind the other two doors is 2/3. When one of the goats is revealed, that probability doesn't change so there are now two doors with a 2/3 probability of having a car behind it.

    If you think I'm wrong, consider the following: 1000 doors, the host reveals 998 goats after you choose a door. You're gonna want to switch.

    Interestingly to me at least, Deal or No Deal is built on a similar principle, but allows you to open the biggest money case. I've never decided on an optimum strategy for that show, though it might be an interesting problem in itself.

    This makes no sense to me at all. It seems to violate basic combinatorics. If you have one car and two unknown doors, there can't be a 2/3 chance for each door, regardless of what history there is or how many doors have been eliminated. History is irrelevant. By eliminating all options except two, you change the odds to 50% for each door. There's no benefit to switching.

    zakkiel on
  • Smug Duckling Registered User regular
    edited April 2008
    Smasher wrote: »
    Python:
    #!/usr/bin/python
    
    import random
    
    def main():
    	trials = 10000
    	knownMoney = 100 # value of the suitcase of money we're shown
    	noSwitch = 0
    	switch = 0
    	for n in range(trials):
    		noSwitch += knownMoney # always stay, so always get the same amount
    		# randomly choose whether known suitcase is the smaller or bigger one
    		if random.random() >=.5: # if it's the bigger one
    			switch += knownMoney / 2
    		else: #it's the smaller one
    			switch += knownMoney * 2
    	print 'no switching:', noSwitch
    	print 'switching:', switch
    			
    if __name__ == "__main__":
    	main()
    

    And the results are:

    no switching: 1000000
    switching: 1250450

    I'm pretty sure this is wrong because you're choosing things in the wrong order. You're saying that in a situation with a $50 and $100 briefcase, you ALWAYS choose the $100 briefcase, and in a situation with a $100 and $200 briefcase, you ALWAYS choose the $100 briefcase, when in reality this is not true. Sometimes in each of these situations you will choose the $50 briefcase or the $200 briefcase.

    Smug Duckling on
  • Smug Duckling Registered User regular
    edited April 2008
    zakkiel wrote: »
    Smasher wrote: »
    Moridin wrote: »
    Shamelessly stolen from http://www.xkcd.com/blue_eyes.html

    No one ever believes the reasoning behind the answer to this at first. It's like the whole "which door do you look behind" game-show puzzle in terms of the answer being counterintuitive. Speaking of...I should look that one up, too.

    That one's called the Monty Hall problem. I'll save you the trouble and post it:

    You're asked to choose among three closed doors. Behind one door is a car, and behind the other two are goats. After you make your pick (though the door you picked is not opened, so you don't know what's behind it), the host opens one of the two doors you didn't pick. The host knows which door the car is behind, and he will always open a door with a goat behind it. He then gives you the option of switching your pick to the other unopened door. Assuming you want the car rather than a goat, should you switch, and why or why not?

    Argument start! Though simulations prove the following is correct:
    You should always switch.

    Reasoning: When you choose an original door the probability of choosing the car is 1/3. So the probability of the car being behind the other two doors is 2/3. When one of the goats is revealed, that probability doesn't change so there are now two doors with a 2/3 probability of having a car behind it.

    If you think I'm wrong, consider the following: 1000 doors, the host reveals 998 goats after you choose a door. You're gonna want to switch.

    Interestingly to me at least, Deal or No Deal is built on a similar principle, but allows you to open the biggest money case. I've never decided on an optimum strategy for that show, though it might be an interesting problem in itself.

    This makes no sense to me at all. It seems to violate basic combinatorics. If you have one car and two unknown doors, there can't be a 2/3 chance for each door, regardless of what history there is or how many doors have been eliminated. History is irrelevant. By eliminating all options except two, you change the odds to 50% for each door. There's no benefit to switching.

    There is a benefit to switching for the same reason as there would be a benefit to switching if instead of showing the goat he pointed at the correct door and said "This is the correct door - choose it!". In that case I think you agree that history matters.

    Intuitively, by showing you a goat and leaving the other door closed, the host is providing indirect evidence (not conclusive, but evidence nonetheless) that the door he chose not to open has the car behind it.

    Hope that makes sense.

    Smug Duckling on
  • Marty81 Registered User regular
    edited April 2008
    3 doors explanation:

    Think about playing the "never switch" strategy. You win 1/3 of the time, because you had a 1/3 chance to pick the right door at the beginning.

    Think about playing the "always switch" strategy. Your goal here is to pick the WRONG door at the beginning, therefore getting the right door when you switch. You have a 2/3 chance of picking the wrong door the first time, so you have a 2/3 chance of winning with this strategy.

    Marty81 on
  • Savant Simply Barbaric Registered User regular
    edited April 2008
    zakkiel wrote: »
    Smasher wrote: »
    Moridin wrote: »
    Shamelessly stolen from http://www.xkcd.com/blue_eyes.html

    No one ever believes the reasoning behind the answer to this at first. It's like the whole "which door do you look behind" game-show puzzle in terms of the answer being counterintuitive. Speaking of...I should look that one up, too.

    That one's called the Monty Hall problem. I'll save you the trouble and post it:

    You're asked to choose among three closed doors. Behind one door is a car, and behind the other two are goats. After you make your pick (though the door you picked is not opened, so you don't know what's behind it), the host opens one of the two doors you didn't pick. The host knows which door the car is behind, and he will always open a door with a goat behind it. He then gives you the option of switching your pick to the other unopened door. Assuming you want the car rather than a goat, should you switch, and why or why not?

    Argument start! Though simulations prove the following is correct:
    You should always switch.

    Reasoning: When you choose an original door the probability of choosing the car is 1/3. So the probability of the car being behind the other two doors is 2/3. When one of the goats is revealed, that probability doesn't change so there are now two doors with a 2/3 probability of having a car behind it.

    If you think I'm wrong, consider the following: 1000 doors, the host reveals 998 goats after you choose a door. You're gonna want to switch.

    Interestingly to me at least, Deal or No Deal is built on a similar principle, but allows you to open the biggest money case. I've never decided on an optimum strategy for that show, though it might be an interesting problem in itself.

    This makes no sense to me at all. It seems to violate basic combinatorics. If you have one car and two unknown doors, there can't be a 2/3 chance for each door, regardless of what history there is or how many doors have been eliminated. History is irrelevant. By eliminating all options except two, you change the odds to 50% for each door. There's no benefit to switching.

    Here's a simple way to think of it:
    2 goats and a car. So you have 2/3 chance of getting a goat on the first try, and 1/3 on getting the car. He then reveals a goat, reducing the pool of goats by one. So if you started with a goat and switch, you always get the car (the other goat is gone), and if you start with the car and switch you get the other goat. So switching just reverses your first choice to be a car if you had a goat and a goat if you had the car, and since there is 2/3 chance you picked a goat first you always want to switch.

    As for smasher's code, he built in the assumption that when you open the first suitcase with 100 dollars, then you have a coin flip for the second suitcase being $50 or $200. That's not what the problem says, the amount of money in the second suitcase stays constant when you choose the first one.
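
    To make that disagreement concrete, here's a small side-by-side sketch (amounts and trial counts arbitrary). "Model B" mirrors the assumption Smasher's script makes; "Model A" fixes the pair of cases before the pick.
    import random

    trials = 100000

    # Model A: the pair (Y and 2Y) is fixed first, then a case is picked at random.
    Y = 100
    a_stay = a_switch = 0.0
    for _ in range(trials):
        cases = [Y, 2 * Y]
        pick = random.randrange(2)
        a_stay += cases[pick]
        a_switch += cases[1 - pick]

    # Model B: the opened case always shows $100, and only then is the other
    # case decided by a coin flip between $50 and $200.
    b_stay = b_switch = 0.0
    for _ in range(trials):
        shown = 100
        other = 200 if random.random() < 0.5 else 50
        b_stay += shown
        b_switch += other

    print("model A: stay {:.1f}, switch {:.1f}".format(a_stay / trials, a_switch / trials))
    print("model B: stay {:.1f}, switch {:.1f}".format(b_stay / trials, b_switch / trials))
    # Model A's two averages agree; model B's switch average is about $125 vs $100.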

    Savant on
  • Adrien Registered User regular
    edited April 2008
    zakkiel wrote: »
    This makes no sense to me at all. It seems to violate basic combinatorics. If you have one car and two unknown doors, there can't be a 2/3 chance for each door, regardless of what history there is or how many doors have been eliminated. History is irrelevant. By eliminating all options except two, you change the odds to 50% for each door. There's no benefit to switching.

    The n = 100 case really helps in getting it intuitively. I like to use a deck of cards. Your goal is to pick the ace of spades. I'll shuffle the deck and let you choose one card. Then I'll go through the remaining deck and remove fifty cards that are not the ace of spades. Would you like to switch to the one remaining card, or go with your bet that your original choice was the ace of spades?

    Adrien on
  • Apothe0sis Have you ever questioned the nature of your reality? Registered User regular
    edited April 2008
    zakkiel wrote: »
    Smasher wrote: »
    Moridin wrote: »
    Shamelessly stolen from http://www.xkcd.com/blue_eyes.html

    No one ever believes the reasoning behind the answer to this at first. It's like the whole "which door do you look behind" game-show puzzle in terms of the answer being counterintuitive. Speaking of...I should look that one up, too.

    That one's called the Monty Hall problem. I'll save you the trouble and post it:

    You're asked to choose among three closed doors. Behind one door is a car, and behind the other two are goats. After you make your pick (though the door you picked is not opened, so you don't know what's behind it), the host opens one of the two doors you didn't pick. The host knows which door the car is behind, and he will always open a door with a goat behind it. He then gives you the option of switching your pick to the other unopened door. Assuming you want the car rather than a goat, should you switch, and why or why not?

    Argument start! Though simulations prove the following is correct:
    You should always switch.

    Reasoning: When you choose an original door the probability of choosing the car is 1/3. So the probability of the car being behind the other two doors is 2/3. When one of the goats is revealed, that probability doesn't change so there are now two doors with a 2/3 probability of having a car behind it.

    If you think I'm wrong, consider the following: 1000 doors, the host reveals 998 goats after you choose a door. You're gonna want to switch.

    Interestingly to me at least, Deal or No Deal is built on a similar principle, but allows you to open the biggest money case. I've never decided on an optimum strategy for that show, though it might be an interesting problem in itself.

    This makes no sense to me at all. It seems to violate basic combinatorics. If you have one car and two unknown doors, there can't be a 2/3 chance for each door, regardless of what history there is or how many doors have been eliminated. History is irrelevant. By eliminating all options except two, you change the odds to 50% for each door. There's no benefit to switching.

    Try again.

    This time, imagine that there are 100 doors, with 99 goats and one car.

    Do you switch?

    Apothe0sis on
  • Smasher Starting to get dizzy Registered User regular
    edited April 2008
    Just so it doesn't get lost:
    Aroduc wrote: »
    One little counterintuitive riddle that I always find interesting to see how people react is this:

    You're offered two briefcases. One contains twice as much money as the other. You pick briefcase A, and it's shown to have $X in it. You're then offered the chance to switch to briefcase B. Should you?

    There are four possible outcomes to the scenario: you pick the smaller briefcase and keep (Sk), pick the smaller briefcase and switch (Ss), pick the larger briefcase and keep (Lk), or pick the larger briefcase and switch (Ls).

    First, let's look at the problem where we don't know any of the values of the cases (ie, we don't look in the first one). Assuming y is the value in the smaller case, we calculate the expected values of switching (0 if we keep):

    Sk: 0
    Ss: y
    Lk: 0
    Ls: -y

    We don't know whether we have the larger or smaller briefcase, so the expected value of either strategy is the average of its two equally likely outcomes, which comes to 0 for both switching and keeping. All is well.

    Now let's look at where we know the values of both cases, but don't know which one we picked initially. Let's go with 100 and 200:

    Sk: 0
    Ss: 100
    Lk: 0
    Ls: -100

    Again, our expected value is 0 for both keeping and switching. No arguments here.

    Finally, we look at the originally posed problem, where we know the first suitcase has $X (where X is an actual value and not a variable. I'll use $100).

    Sk: 0
    Ss: 100
    Lk: 0
    Ls: -50

    This is where the three value thing other people were talking about comes in. Yes, there are only two values, but you don't know which one the other value is. Using the information you have available to you, your expected value is greater for switching than it is for staying. The reason this problem is different from the other two above is that you're using two different values for the smaller suitcase when calculating the gain or loss, as opposed to the other two problems where you're using only one.
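
    Spelling out the arithmetic behind that last table in Python (this takes as given the 50/50 assumption about the observed case, which is the assumption being disputed elsewhere in the thread):
    X = 100.0                                 # the amount we saw
    ev_keep   = 0.5 * 0 + 0.5 * 0             # Sk and Lk: keeping changes nothing
    ev_switch = 0.5 * X + 0.5 * (-X / 2)      # Ss: gain $100, Ls: lose $50

    print("expected gain if we keep:   {}".format(ev_keep))     # 0.0
    print("expected gain if we switch: {}".format(ev_switch))   # 25.0
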
    Smasher wrote: »
    Python:
    #!/usr/bin/python
    
    import random
    
    def main():
        trials = 10000
        knownMoney = 100 # value of the suitcase of money we're shown
        noSwitch = 0
        switch = 0
        for n in range(trials):
            noSwitch += knownMoney # always stay, so always get the same amount
            # randomly choose whether known suitcase is the smaller or bigger one
            if random.random() >=.5: # if it's the bigger one
                switch += knownMoney / 2
            else: #it's the smaller one
                switch += knownMoney * 2
        print 'no switching:', noSwitch
        print 'switching:', switch
                
    if __name__ == "__main__":
        main()
    
    And the results are:

    no switching: 1000000
    switching: 1250450

    I'm pretty sure this is wrong because you're choosing things in the wrong order. You're saying that in a situation with a $50 and $100 briefcase, you ALWAYS choose the $100 briefcase, and in a situation with a $100 and $200 briefcase, you ALWAYS choose the $100 briefcase, when in reality this is not true. Sometimes in each of these situations you will choose the $50 briefcase or the $200 briefcase.

    You're looking at it in the wrong order. We know we have a given amount in the first suitcase; that's given in the problem. I picked 100 arbitrarily, but it works with any value. You're correct that I won't always pick the $100 suitcase for either of the two scenarios, but the problem states that I did as a premise (technically $X, but yeah), and so I work from that. What isn't specified is whether the $100 is the larger or smaller of the two, and so that varies randomly.

    Smasher on
  • Aroduc regular
    edited April 2008
    ANNNNNYWAY, for the briefcases... my take on it has always been that it's just a deceptive framing problem. You're not given two choices. You're not given any control or choice at all until you're asked to switch.

    If you change the problem to just:
    Flip a coin: if it's heads, halve the money you've got; if it's tails, double it. Take the bet?

    Then it's a trivially obvious answer, and that's exactly what the situation is when you're first given control. All the stuff before is just sleight of hand to trick you into thinking there's an extra choice to make and more strategy than just the one choice. You're given X up front no matter what happens. From there, it just gets halved or doubled.

    Aroduc on
  • Smasher Starting to get dizzy Registered User regular
    edited April 2008
    Savant wrote: »
    As for smasher's code, he built in the assumption that when you open the first suitcase with 100 dollars, then you have a coin flip for the second suitcase being $50 or $200. That's not what the problem says, the amount of money in the second suitcase stays constant when you choose the first one.

    What the problem is implicitly asking for is the course of action with the best expected value. The idea of the expected value is that it's the average amount you would gain (or lose) if you repeated the experiment/problem a large number of times. The problem states we find $X (X again being a value rather than a variable), so for each trial that should remain constant. The only unknown variable we have left is whether the case with $X in it was the larger or smaller one, so we decide that randomly for each trial and set the value of the other briefcase accordingly.

    Here's another way of looking at it. Let's rephrase the problem so that one briefcase contains $50 more than the other, and that the first one you choose has $100. The expected value of switching for that problem is 0, as you either go from 100 to 50 or 100 to 150. Compare that to the original problem where one briefcase has twice as much money as the other briefcase, and the first briefcase you choose has $100. You lose the same amount as before if you switch to the smaller one, but you gain more ($100 compared to $50) if you switch to the bigger one. Clearly the expected values can't be the same, and they aren't.
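
    The same comparison in a few lines of Python (again assuming a 50/50 chance that the $100 case is the smaller one):
    X = 100.0
    additive_switch = 0.5 * (X + 50) + 0.5 * (X - 50)   # other case holds $50 more or less
    doubling_switch = 0.5 * (2 * X) + 0.5 * (X / 2)     # other case holds double or half

    print("additive framing, average after switching: {}".format(additive_switch))  # 100.0
    print("doubling framing, average after switching: {}".format(doubling_switch))  # 125.0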

    Smasher on
  • JamesKeenan Registered User regular
    edited April 2008
    Well... On the cases thing, I'm of the school that it's always better to switch. Mind you, it's a variant of the ".5X vs. 2X" train of thought which I like to call, "Might as well."

    Say you open your case and you see 50 dollars. Now, there are two options here, if we think of it in terms of Y and 2Y. Either Y is 25, or Y is 50. In either case we have already automatically won Y. It's whether we win that other Y that counts.

    If Y is 25, we stand to lose 25 dollars, whereas if Y is 50, we stand to gain 50. I figur' that it's better to try and win 50 than lose 25.

    JamesKeenan on
  • zakkiel Registered User regular
    edited April 2008
    Savant wrote: »
    zakkiel wrote: »
    This makes no sense to me at all. It seems to violate basic combinatorics. If you have one car and two unknown doors, there can't be a 2/3 chance for each door, regardless of what history there is or how many doors have been eliminated. History is irrelevant. By eliminating all options except two, you change the odds to 50% for each door. There's no benefit to switching.

    Here's a simple way to think of it:
    2 goats and a car. So you have 2/3 chance of getting a goat on the first try, and 1/3 on getting the car. He then reveals a goat, reducing the pool of goats by one. So if you started with a goat and switch, you always get the car (the other goat is gone), and if you start with the car and switch you get the other goat. So switching just reverses your first choice to be a car if you had a goat and a goat if you had the car, and since there is 2/3 chance you picked a goat first you always want to switch.
    This makes sense. The n=100 cases don't do anything for me without this.

    zakkiel on
  • zakkiel Registered User regular
    edited April 2008
    Well... Of the cases thing, I'm of the school that it's always better to switch. Mind you, it's a variant of the ".5X vs. 2X" train of thought which I like to call, "Might as well."

    Say you open your case and you see 50 dollars. Now, there are two options here, if we think of it in terms of Y and 2Y. Either Y is 25, or Y is 50. In either case we have already automatically won Y. It's whether we win that other Y that counts.

    If Y is 25, we stand to lose 25 dollars, whereas if Y is 50, we stand to gain 50. I figur' that it's better to try and win 50 than lose 25.

    The problem I have here is that the amount of money he sees in the briefcase is supposed to be irrelevant to his decision. Therefore, actually opening the briefcase does not give him any information, and his decision to switch should be the same even if the case is never opened. But if that's the case, you're advocating an infinite loop of switching, as already observed.

    zakkiel on
  • Savant Simply Barbaric Registered User regular
    edited April 2008
    Aroduc wrote: »
    ANNNNNYWAY, for the briefcases... my take on it has always been that it's just a deceptive framing problem. You're not given two choices. You're not given any control or choice at all until you're asked to switch.

    If you change the problem to just:
    Flip a coin: if it's heads, halve the money you've got; if it's tails, double it. Take the bet?

    Then it's a trivially obvious answer, and that's exactly what the situation is when you're first given control. All the stuff before is just sleight of hand to trick you into thinking there's an extra choice to make and more strategy than just the one choice. You're given X up front no matter what happens. From there, it just gets halved or doubled.

    How does that make any sense? You are operating on the probabilities of how much money the guy first put into the cases, which is totally unknown. It ain't no coin flip, at least not in any way you stated the problem.

    When you say "always switch", you are betting that the guy put more money in both cases and gave you the smaller one. But you have given no indication as to what mechanism he uses to decide how much money to put in the cases. If he tells you "there's X dollars in this bag, and an even shot of being double or half in the other bag" then you switch.

    In other words, you are assuming a uniform prior distribution over all possible amounts, which can't be normalized, or at the very least a prior under which having {X, 2X} in the bags is exactly as likely as having {X/2, X} for the value of X you observe. The first one is impossible, and the second one is possible, but you have given no reason to believe that assumption holds.

    Thus, if the problem is as you stated it, it is unsolvable for lack of sufficient information.
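
    To make the prior issue concrete, here's a sketch with one explicitly chosen (and entirely made-up) proper prior: the smaller amount is a whole number of dollars, uniform on $1 to $100. The "even shot" intuition only survives for some observed amounts.
    import random
    from collections import defaultdict

    trials = 200000
    seen = defaultdict(int)           # how often each amount was observed
    held_smaller = defaultdict(int)   # ...and how often it was the smaller case

    for _ in range(trials):
        small = random.randint(1, 100)
        cases = [small, 2 * small]
        pick = random.randrange(2)
        observed = cases[pick]
        seen[observed] += 1
        if pick == 0:
            held_smaller[observed] += 1

    for x in (75, 80, 150):           # three illustrative observations
        if seen[x]:
            print("P(holding the smaller case | saw ${}) ~ {:.2f}".format(
                x, held_smaller[x] / float(seen[x])))
    # Under this prior, $75 (odd) means you certainly hold the smaller case,
    # $80 really is a 50/50, and $150 (over $100) means you certainly hold the larger.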

    Savant on
  • Doc Registered User, ClubPA regular
    edited April 2008
    To clarify the reason that switching is good:

    There are one million doors. You pick door #1, then the host opens all the doors except for your door and door #345,329.

    Would you switch to that door?

    Doc on
  • Premier kakos Registered User, ClubPA regular
    edited April 2008
    Apothe0sis wrote: »
    This is most assuredly not my series, and math geeks will probably see why this is funny.

    2, 5, 877, 27644437, 35742549198872617291353508656626642567, 359334085968622831041960188598043661065388726959079837 ... ?

    93074010508593618333390737086787716686243541818180888374481714163260295083047249735772767217242983302809460000977534180207136267393493843565165244803160631468142310993331802362163022244196698256971348116848613873420484831145693123069522851464977918752008824682368616575925219950816540476437905078816850961411910271333677033484348376563240090830825402255766897769433849999097072825800009667464269609454294177226626947696038219790689936750529759962907392539903707559051710294385488230399560792411004667389541080494330079820301708455191991301691046952805571310530593988681849292364334866037961772856489592631353387119588912930029104717108771889838640129969152323291892894440849105218746915343681917125357252705291568270795558661856036155452894768239527100657674295540074496397200222788808730895367566036148544193391411249930638389221671262805683765652287931774606379446475019704511886714677938481928235170230529782702798198071512791277092287199903418895014552315221588641882843663507160334370905336400728991988818738787812880619370936400288087393153584463921744028247514969200055227428251657073540384943632840780163289976513840862197690165020628387737123529546265113690620682595841836568269584610937749225174156188272089587049752042320435999387410834519188271208577279962713103121177117736246578720020849615208138290410547459210639313481871280957180797480718884005158357183543875383933081640708958426808475868980596063805203682907241542158286516965706176501691352009055980316953619941361963586164642762338959226194401591549258894070494114321789561253423874645767485673563010694616837670389191021116993326818985640677682311168596513135927575792933795897024983955212555699886813758658727223213225641249054291854713271825236198768288749577317015750567399596468873488940152346191448776708760260862506238255653851554400298770502418390469037927740196990922130058457344538461597268140533944714325634938884545914139335512028740689585456916586292846456683229763623263845961927185120666686527368190661471902546889836939907242929408922820078908112593178663177685220525268101383971283991711468187276352738607911284318470208480309880183721371226886168592890172034997639285024092759687525920453573640538777106302852351553054823775813450680545320959747676102527895283952321119565456131914284468837228528467270883859016854414206071054324686179724452435704506155435210031925383788518515000655319634148229746788564381020601053143272002310437607878237640840123305186361012402650803349859965081202294515347182519213721197040915413263249473539740781608786907273923065127196445526332443113542957189094428043671781635432417130645135281343627517544700098433529452127971455501702330453614487341357174977756767117068294356318437149385243962447271471217433312351733571281240293461665450829761335596591586210391312306966997773285594528746729346018039023115838145677882687881461094746947227827301981448949646394994521319966023728976458814160934241105908961406982946398028919136619104849916909167562570774608666807683843343671704615600840389419697202833796957720397144242132316427467001808219485482456195736463596111707485715440237594384459161928415836077852237526665117484048997247449275300584446550437546119923676017959462712581196976718470946270331842562972612728361652280030892982127111700793139354703946990580256780069816918913085639153753984131390468635302755088889211367474268779633561838363953846659135906229513613392267266814066735012127403702413192430883159093465043866796901005656737875251268652602552279882927553368913466086109551491491194789740567321879833676289767448527
915219283173310873247606366284048111931661775107155492303602877951956085944593035383609134038714354896277016656832069877486297785906138808032199478041298446201040604532790336475388815136237012786896885113359098836781297206766936652939896086173174247047512753636594129042150203504101570396300673678733698741829932012118685174241471375329706399365116190852969742529040156369682356942527779947968734604866975128678544682420679340574499610003971519618440516937488305570847146550169657044672599070091974948167777874966859847710614503223797831808567277923192280169244649411883773069412702823257335336231095595728578499725749559860832154398909223367972897328124149325331610872832467828260416056336532880372285600592198569202140923550067859152359527561994066586118059826706360048699302434574822753790260890021587852244744860631603541791133619526793181058747657744949691213833240225173150813910220165362694840945664849801045511812349221956400744150897341864900858146835458095842131465890241777553970152159303024640985017795077066167761624753595558912106613186934139850039921207148473490125950756237204950896105932772825720856228894276720028932601843556239266022962890064643933856257123158863877848475127602406307792588595211703986239432550726691852534321783665511020801887786921172059700879026067285947359724825015339666089008466190352993533122497861078203104758579483775098260468006288071847722122648015073671866043577071451595398504208603083753549564030349663312899219714349854025049280827748345322558570768577632728362536039844811568342727553594911765165120515649387783590754617053981056103148444168056157453359284719489933160382315541998163668080430170189604432196012778454138678438477287617931797344197371492016925294293920435571230125441856377635430992383031641860714124008747101118765506416821101339484788073020563327420998407651377506476214794114648110563115377659800929141531398790434303752231082196206662416811943482851130965329365467377939976152662541912142094277951396022236086513406015986868896776219944649030822428734842983667486613881443827330912807645197812117423741459096670510731379104448554954932379389781897741126092080837369956421197209552100624952564640377427157704473125717924200923183638927529737419861594670798574290635462323151713380155501912570516986807486808850943703276543637772748688396151956171036621635411599055947803963844239542311343491664913052539486422164914206103599970496248266628264533316702997562527874219219368930835654767520116901325628008414215779935053300527454338095585904167331992744976932786018872605231337251856220370815003679385877026959961549970646582412358020805216001234519254278426271698879069271009841356762362776355079842217812045389559364458319251495759620406596953099358508401125247456305868316439298039940486165859806267316803843769250751290909537174237439362050585335391392533062430185171682628621992342320221279940728553592588293913252900669869333819652288398187260320155987924359450740883499506500844712757333449557083885253703080601131

    Fuck you and your prime Bell numbers.

    Premier kakos on
  • Aroduc regular
    edited April 2008
    Savant wrote: »
    If he tells you "there's X dollars in this bag, and an even shot of being double or half in the other bag" then you switch.

    I'm glad you agree with me.

    Aroduc on
  • Savant Simply Barbaric Registered User regular
    edited April 2008
    Aroduc wrote: »
    Savant wrote: »
    If he tells you "there's X dollars in this bag, and an even shot of being double or half in the other bag" then you switch.

    I'm glad you agree with me.

    That's not how you stated your problem though. If you are going to go Bayesian on me you can't assume a uniform prior without stating it (ignoring that it is improper).

    Savant on
  • FunkyWaltDogg Columbia, SC Registered User regular
    edited April 2008
    My gut insists that it doesn't matter whether you switch briefcases, but I can't seem to work up a proof that's not dependent on assuming a normal or bounded uniform distribution for X (where the briefcases contain X and 2X).
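
    For what it's worth, here's a quick numerical check (not a proof) under exactly that kind of assumption: X, the smaller amount, drawn from a bounded uniform distribution, with bounds and trial count picked arbitrarily.
    import random

    trials = 200000
    stay_total = 0.0
    switch_total = 0.0
    for _ in range(trials):
        x = random.uniform(1, 1000)   # smaller case holds X dollars, the other 2X
        cases = [x, 2 * x]
        pick = random.randrange(2)
        stay_total += cases[pick]
        switch_total += cases[1 - pick]

    print("always stay:   {:.1f}".format(stay_total / trials))
    print("always switch: {:.1f}".format(switch_total / trials))
    # The two averages agree (both near 1.5 times the mean of X), so blanket
    # switching gains nothing under this distribution.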

    FunkyWaltDogg on
  • SithDrummer Registered User regular
    edited April 2008
    Savant wrote: »
    Aroduc wrote: »
    Savant wrote: »
    If he tells you "there's X dollars in this bag, and an even shot of being double or half in the other bag" then you switch.

    I'm glad you agree with me.

    That's not how you stated your problem though. If you are going to go Bayesian on me you can't assume a uniform prior without stating it (ignoring that it is improper).
    You're offered two briefcases. One contains twice as much money as the other. You pick briefcase A, and it's shown to have $X in it. You're then offered the chance to switch to briefcase B. Should you?
    :^:

    SithDrummer on
  • Adrien Registered User regular
    edited April 2008
    Savant wrote: »
    If he tells you "there's X dollars in this bag, and an even shot of being double or half in the other bag" then you switch.

    What if he tells you, "there's X dollars in that bag, and an even shot of being double or half in the one you have"?

    Adrien on
  • KingAgamemnon Registered User regular
    edited April 2008
    Absurdist wrote: »
    Here is another tricky one!

    What's the next number in this sequence?

    10, 0, 3, 20, 16, 51, 32, 67, 74, ?

    That is not a math question. You can't know what the next number is.

    KingAgamemnon on
  • Savant Simply Barbaric Registered User regular
    edited April 2008
    Adrien wrote: »
    Savant wrote: »
    If he tells you "there's X dollars in this bag, and an even shot of being double or half in the other bag" then you switch.

    What if he tells you, "there's X dollars in that bag, and an even shot of being double or half in the one you have"?

    X dollars in the one you didn't take? With an even shot of it being double or half of X in the one you did (of course not showing you what is in your bag before you decide)? If that's the case, then you don't switch.

    Savant on