
It's Math-Puzzle Time


  • FunkyWaltDogg (Columbia, SC), Registered User regular
    edited April 2008
    The difference between that question and the original statement is that the odds of the other bag being half or double aren't even. X must be chosen from some distribution. If that distribution is known, you can make an educated guess at whether you've opened the X bag or the 2X bag. If you don't know the distribution you are just guessing, but it's still there, affecting the outcome.
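    A quick simulation makes the point concrete. This is only a sketch: the prior for X (uniform on 1..100) and the threshold rule are illustrative assumptions, not part of the original problem.

```python
# Sketch: if the distribution of X (the smaller amount) is known, the observed
# amount is informative. Assumed prior: X uniform on 1..100, so any observed
# amount over 100 must be the 2X bag.
import random

random.seed(0)
trials = 100_000
always_keep = always_switch = informed = 0
for _ in range(trials):
    x = random.randint(1, 100)      # smaller amount, drawn from a KNOWN prior
    cases = [x, 2 * x]
    picked = random.choice([0, 1])
    seen, other = cases[picked], cases[1 - picked]
    always_keep += seen
    always_switch += other
    informed += seen if seen > 100 else other  # over 100 it must be 2X: keep

print(always_keep / trials, always_switch / trials, informed / trials)
```

    Always keeping and always switching average out the same (about 75.75 here), while the informed rule averages noticeably more (about 94.6), which is exactly the "educated guess" the known distribution buys you.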

    FunkyWaltDogg on
  • Apothe0sis (Have you ever questioned the nature of your reality?), Registered User regular
    edited April 2008
    Apothe0sis wrote: »
    This is most assuredly not my series, and math geeks will probably see why this is funny.

    2, 5, 877, 27644437, 35742549198872617291353508656626642567, 359334085968622831041960188598043661065388726959079837 ... ?
    9307401050859361833339073708678771668624354181818088837448171416326029508304724973577276721724298330280946000097753418020713626739349384356516524480316063146814231099333180236216302224419669825697134811684861387342048483114569312306952285146497791875200882468236861657592521995081654047643790507881685096141191027133367703348434837656324009083082540225576689776943384999909707282580000966746426960945429417722662694769603821979068993675052975996290739253990370755905171029438548823039956079241100466738954108049433007982030170845519199130169104695280557131053059398868184929236433486603796177285648959263135338711958891293002910471710877188983864012996915232329189289444084910521874691534368191712535725270529156827079555866185603615545289476823952710065767429554007449639720022278880873089536756603614854419339141124993063838922167126280568376565228793177460637944647501970451188671467793848192823517023052978270279819807151279127709228719990341889501455231522158864188284366350716033437090533640072899198881873878781288061937093640028808739315358446392174402824751496920005522742825165707354038494363284078016328997651384086219769016502062838773712352954626511369062068259584183656826958461093774922517415618827208958704975204232043599938741083451918827120857727996271310312117711773624657872002084961520813829041054745921063931348187128095718079748071888400515835718354387538393308164070895842680847586898059606380520368290724154215828651696570617650169135200905598031695361994136196358616464276233895922619440159154925889407049411432178956125342387464576748567356301069461683767038919102111699332681898564067768231116859651313592757579293379589702498395521255569988681375865872722321322564124905429185471327182523619876828874957731701575056739959646887348894015234619144877670876026086250623825565385155440029877050241839046903792774019699092213005845734453846159726814053394471432563493888454591413933551202874068958545691658629284645668322976362326384596192718512066668652736819066147190254
68898369399072429294089228200789081125931786631776852205252681013839712839917114681872763527386079112843184702084803098801837213712268861685928901720349976392850240927596875259204535736405387771063028523515530548237758134506805453209597476761025278952839523211195654561319142844688372285284672708838590168544142060710543246861797244524357045061554352100319253837885185150006553196341482297467885643810206010531432720023104376078782376408401233051863610124026508033498599650812022945153471825192137211970409154132632494735397407816087869072739230651271964455263324431135429571890944280436717816354324171306451352813436275175447000984335294521279714555017023304536144873413571749777567671170682943563184371493852439624472714712174333123517335712812402934616654508297613355965915862103913123069669977732855945287467293460180390231158381456778826878814610947469472278273019814489496463949945213199660237289764588141609342411059089614069829463980289191366191048499169091675625707746086668076838433436717046156008403894196972028337969577203971442421323164274670018082194854824561957364635961117074857154402375943844591619284158360778522375266651174840489972474492753005844465504375461199236760179594627125811969767184709462703318425629726127283616522800308929821271117007931393547039469905802567800698169189130856391537539841313904686353027550888892113674742687796335618383639538466591359062295136133922672668140667350121274037024131924308831590934650438667969010056567378752512686526025522798829275533689134660861095514914911947897405673218798336762897674485279152192831733108732476063662840481119316617751071554923036028779519560859445930353836091340387143548962770166568320698774862977859061388080321994780412984462010406045327903364753888151362370127868968851133590988367812972067669366529398960861731742470475127536365941290421502035041015703963006736787336987418299320121186851742414713753297063993651161908529697425290401563696823569425277799479687346048669751286785446824206793405744996100039715196
18440516937488305570847146550169657044672599070091974948167777874966859847710614503223797831808567277923192280169244649411883773069412702823257335336231095595728578499725749559860832154398909223367972897328124149325331610872832467828260416056336532880372285600592198569202140923550067859152359527561994066586118059826706360048699302434574822753790260890021587852244744860631603541791133619526793181058747657744949691213833240225173150813910220165362694840945664849801045511812349221956400744150897341864900858146835458095842131465890241777553970152159303024640985017795077066167761624753595558912106613186934139850039921207148473490125950756237204950896105932772825720856228894276720028932601843556239266022962890064643933856257123158863877848475127602406307792588595211703986239432550726691852534321783665511020801887786921172059700879026067285947359724825015339666089008466190352993533122497861078203104758579483775098260468006288071847722122648015073671866043577071451595398504208603083753549564030349663312899219714349854025049280827748345322558570768577632728362536039844811568342727553594911765165120515649387783590754617053981056103148444168056157453359284719489933160382315541998163668080430170189604432196012778454138678438477287617931797344197371492016925294293920435571230125441856377635430992383031641860714124008747101118765506416821101339484788073020563327420998407651377506476214794114648110563115377659800929141531398790434303752231082196206662416811943482851130965329365467377939976152662541912142094277951396022236086513406015986868896776219944649030822428734842983667486613881443827330912807645197812117423741459096670510731379104448554954932379389781897741126092080837369956421197209552100624952564640377427157704473125717924200923183638927529737419861594670798574290635462323151713380155501912570516986807486808850943703276543637772748688396151956171036621635411599055947803963844239542311343491664913052539486422164914206103599970496248266628264533316702997562527874219219368930
835654767520116901325628008414215779935053300527454338095585904167331992744976932786018872605231337251856220370815003679385877026959961549970646582412358020805216001234519254278426271698879069271009841356762362776355079842217812045389559364458319251495759620406596953099358508401125247456305868316439298039940486165859806267316803843769250751290909537174237439362050585335391392533062430185171682628621992342320221279940728553592588293913252900669869333819652288398187260320155987924359450740883499506500844712757333449557083885253703080601131

    Fuck you and your prime Bell numbers.

    Well, I did NOT expect that.

    Apothe0sis on
  • AresProphet, Registered User regular
    edited April 2008
    This won't induce as many math headaches as some of the other stuff in this thread.

    _, _, _, 7, 7, 9, 13, 10, 9, 1, 4, 9, 16, 16, 9, 13 ...

    What are the first three numbers in this series?

    AresProphet on
    (image attachment: ex9pxyqoxf6e.png)
  • Clipse, Registered User regular
    edited April 2008
    Here's one of my favorite math puzzles/riddles:

    Mr. P knows the product of two numbers from the set {2, 3, ..., 99}.
    Mr. S knows the sum of those same two numbers.

    They have the following (truthful) dialog:
    Mr. P: I don't know the two numbers.
    Mr. S: I already knew that you didn't know the two numbers.
    Mr. P: Ah, now I know the two numbers.
    Mr. S: Ah, now I know the two numbers as well.

    What are the two numbers?

    Very slight hint:
    Some brute forcing is required, but if you do it right it is pen-and-paper friendly.
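    The brute force the hint alludes to also fits in a few lines of Python. A sketch, assuming the two numbers are distinct (some tellings bound the sum instead of the numbers; the classic answer is the same):

```python
# Brute force of the dialog above. Assumes two distinct numbers,
# each drawn from {2, ..., 99}.
from itertools import combinations
from collections import Counter

pairs = list(combinations(range(2, 100), 2))
prod_count = Counter(a * b for a, b in pairs)

def p_unsure(a, b):
    # Mr. P can't deduce the pair: the product factors more than one way.
    return prod_count[a * b] > 1

def s_knew(s):
    # Mr. S already knew P couldn't: every split of the sum is ambiguous.
    return all(p_unsure(a, s - a) for a in range(2, (s + 1) // 2) if s - a <= 99)

# Statements 1 + 2: P is unsure, and S knew he would be.
step2 = [(a, b) for a, b in pairs if p_unsure(a, b) and s_knew(a + b)]

# Statement 3: P now knows, so his product is unique among step2 pairs.
prod2 = Counter(a * b for a, b in step2)
step3 = [(a, b) for a, b in step2 if prod2[a * b] == 1]

# Statement 4: S now knows, so his sum is unique among step3 pairs.
sum3 = Counter(a + b for a, b in step3)
answer = [(a, b) for a, b in step3 if sum3[a + b] == 1]
print(answer)  # the classic solution (4, 13) appears here
```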


    On a side note, the xkcd blue eyes/brown eyes riddle reminded me of an apparent paradox related to pop quizzes. One day a teacher says to his class, "We're going to have a surprise quiz next week; you won't know what day it is on until you show up to class on the day of." One student realizes that this implies the quiz can't possibly be on Friday, as if that were the case he would know on Thursday night that the quiz was on Friday. So, the quiz must be on Monday, Tuesday, Wednesday, or Thursday. However, if the quiz was on Thursday, the student would know that to be true on Wednesday night. Continuing, we see that the quiz must be on Monday, which means that it cannot possibly be a surprise. Thus, the quiz must not exist (as stated).

    Clipse on
  • SithDrummer, Registered User regular
    edited April 2008
    Clipse wrote: »
    On a side note, the xkcd blue eyes/brown eyes riddle reminded me of an apparent paradox related to pop quizzes. One day a teacher says to his class, "We're going to have a surprise quiz next week; you won't know what day it is on until you show up to class on the day of." One student realizes that this implies the quiz can't possibly be on Friday, as if that were the case he would know on Thursday night that the quiz was on Friday. So, the quiz must be on Monday, Tuesday, Wednesday, or Thursday. However, if the quiz was on Thursday, the student would know that to be true on Wednesday night. Continuing, we see that the quiz must be on Monday, which means that it cannot possibly be a surprise. Thus, the quiz must not exist (as stated).
    That was from one of the Sideways Stories from Wayside School math books.

    SithDrummer on
  • Seguer of the Void (Sydney, Australia), Registered User regular
    edited April 2008
    The briefcases:

    The way I read the problem is that the two cases are filled from the start, one with $100, one with $200 (for example).

    You choose a case. It is either the $100 or $200 one, but you don't know what the values of the cases are (you don't know about the $100 and $200, just the double or half thing). You know the amount in the case you've picked.

    If you get the $100 case, you think the other case could hold $50 or $200.
    If you get the $200 case, you think the other case could hold $100 or $400.

    In reality there are only the two cases: $100 and $200. Logically, you can't know whether the other case is half or double the one you've picked, because you don't know the values of both cases, only the one you chose initially.

    In this case, it would be up to the person to decide whether they want to risk a loss for a gain (whether they are greedy, risky, cautious, etc), as you can't logically determine the best course of action.

    Is this not the original intent of the problem as stated?

    Seguer on
  • ElJeffe, Moderator, ClubPA mod
    edited April 2008
    Smasher wrote: »
    Just so it doesn't get lost:
    Aroduc wrote: »
    One little counterintuitive riddle that I always find interesting to see how people react is this:

    You're offered two briefcases. One contains twice as much money as the other. You pick briefcase A, and it's shown to have $X in it. You're then offered the chance to switch to briefcase B. Should you?
    There are four possible outcomes to the scenario: you pick the smaller briefcase and keep (Sk), pick the smaller briefcase and switch (Ss), pick the larger briefcase and keep (Lk), or pick the larger briefcase and switch (Ls).

    First, let's look at the problem where we don't know any of the values of the cases (i.e., we don't look in the first one). Assuming y is the value in the smaller case, we calculate the expected values of switching (0 if we keep):

    Sk: 0
    Ss: y
    Lk: 0
    Ls: -y

    We don't know whether we have the larger or smaller briefcase, so the expected value of switching is the average of the two equally likely outcomes, y and -y, which is 0, the same as keeping. All is well.

    Now let's look at where we know the values of both cases, but don't know which one we picked initially. Let's go with 100 and 200:

    Sk: 0
    Ss: 100
    Lk: 0
    Ls: -100

    Again, our expected value is 0 for both keeping and switching. No arguments here.

    Finally, we look at the originally posed problem, where we know the first suitcase has $X (where X is an actual value and not a variable. I'll use $100).

    Sk: 0
    Ss: 100
    Lk: 0
    Ls: -50

    This is where the three value thing other people were talking about comes in. Yes, there are only two values, but you don't know which one the other value is. Using the information you have available to you, your expected value is greater for switching than it is for staying. The reason this problem is different from the other two above is that you're using two different values for the smaller suitcase when calculating the gain or loss, as opposed to the other two problems where you're using only one.
    Smasher wrote: »
    Python:
    #!/usr/bin/python

    import random

    def main():
        trials = 10000
        knownMoney = 100  # value of the suitcase of money we're shown
        noSwitch = 0
        switch = 0
        for n in range(trials):
            noSwitch += knownMoney  # always stay, so always get the same amount
            # randomly choose whether the known suitcase is the smaller or bigger one
            if random.random() >= .5:  # it's the bigger one
                switch += knownMoney // 2
            else:  # it's the smaller one
                switch += knownMoney * 2
        print('no switching:', noSwitch)
        print('switching:', switch)

    if __name__ == "__main__":
        main()
    
    And the results are:

    no switching: 1000000
    switching: 1250450

    I'm pretty sure this is wrong because you're choosing things in the wrong order. You're saying that in a situation with a $50 and $100 briefcase, you ALWAYS choose the $100 briefcase, and in a situation with a $100 and $200 briefcase, you ALWAYS choose the $100 briefcase, when in reality this is not true. Sometimes in each of these situations you will choose the $50 briefcase or the $200 briefcase.

    You're looking at it in the wrong order. We know we have a given amount in the first suitcase; that's given in the problem. I picked 100 arbitrarily, but it works with any value. You're correct that I won't always pick the $100 suitcase for either of the two scenarios, but the problem states that I did as a premise (technically $X, but yeah), and so I work from that. What isn't specified is whether the $100 is the larger or smaller of the two, and so that varies randomly.

    Do you realize what you're arguing there? I mean, you're conceding that if you don't look in the case, then your expected return is 0 if you always switch. You're thus saying that if you look in the case before always switching - that is, if you don't alter your actual actions one bit - that your new knowledge somehow bends reality in such a way as to give you a positive expected return. That makes no sense.

    Also, please read my prior post, in which I start with a small number of varying monetary amounts and demonstrate that for any arbitrarily large finite set of amounts, the expected return is always 0. Respond to that, respond to my point above, and then get back to me.

    I think part of the problem is that people keep comparing the knowledge gained in this puzzle to the knowledge gained in the blue-eyes puzzle, and deciding that trivial knowledge can change an outcome. In the blue-eyes example, the knowledge is affecting a person's behavior. People's behavior != deterministic events.
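    ElJeffe's "fill the cases first, then choose" framing is easy to simulate. A sketch; the candidate amounts in the list are arbitrary stand-ins:

```python
# Fill the cases first, then pick blindly; compare always-keep vs always-switch.
import random

random.seed(1)
trials = 100_000
keep_total = switch_total = 0
for _ in range(trials):
    y = random.choice([25, 50, 100, 200])  # smaller amount, fixed before you pick
    cases = [y, 2 * y]
    picked = random.choice([0, 1])         # blind pick
    keep_total += cases[picked]
    switch_total += cases[1 - picked]

print(keep_total / trials, switch_total / trials)
```

    Both totals converge to the same average: once the pair is fixed before the choice, blind switching is a wash. The earlier simulation got a different answer because it held the observed $100 fixed and re-rolled the pair around it, which is a different experiment.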

    ElJeffe on
    I submitted an entry to Lego Ideas, and if 10,000 people support me, it'll be turned into an actual Lego set! If you'd like to see and support my submission, follow this link.
  • ElJeffe, Moderator, ClubPA mod
    edited April 2008
    Clipse wrote: »
    On a side note, the xkcd blue eyes/brown eyes riddle reminded me of an apparent paradox related to pop quizzes. One day a teacher says to his class, "We're going to have a surprise quiz next week; you won't know what day it is on until you show up to class on the day of." One student realizes that this implies the quiz can't possibly be on Friday, as if that were the case he would know on Thursday night that the quiz was on Friday. So, the quiz must be on Monday, Tuesday, Wednesday, or Thursday. However, if the quiz was on Thursday, the student would know that to be true on Wednesday night. Continuing, we see that the quiz must be on Monday, which means that it cannot possibly be a surprise. Thus, the quiz must not exist (as stated).
    That was from one of the Sideways Stories from Wayside School math books.

    I always liked that one. The solution, as it is:
    Because the quiz cannot "logically" take place on any given day while still being a surprise, it basically breaks the logic of the puzzle. The mere existence of the impossible surprise quiz becomes the surprise, and thus the quiz can take place on any day - including Friday - and it will still be a surprise.

    ElJeffe on
  • Dance Commander, Registered User regular
    edited April 2008
    I agree with ElJeffe that switching can't be the correct strategy: if you could somehow make people forget what was in the first case, then show them the contents of the new one and offer them the same choice, they would keep switching, and the expected value would keep going up.
    But I can't figure out where anyone has done any math wrong.

    Dance Commander on
  • Apothe0sis (Have you ever questioned the nature of your reality?), Registered User regular
    edited April 2008
    ElJeffe wrote: »
    Clipse wrote: »
    On a side note, the xkcd blue eyes/brown eyes riddle reminded me of an apparent paradox related to pop quizzes. One day a teacher says to his class, "We're going to have a surprise quiz next week; you won't know what day it is on until you show up to class on the day of." One student realizes that this implies the quiz can't possibly be on Friday, as if that were the case he would know on Thursday night that the quiz was on Friday. So, the quiz must be on Monday, Tuesday, Wednesday, or Thursday. However, if the quiz was on Thursday, the student would know that to be true on Wednesday night. Continuing, we see that the quiz must be on Monday, which means that it cannot possibly be a surprise. Thus, the quiz must not exist (as stated).
    That was from one of the Sideways Stories from Wayside School math books.

    I always liked that one. The solution, as it is:
    Because the quiz cannot "logically" take place on any given day while still being a surprise, it basically breaks the logic of the puzzle. The mere existence of the impossible surprise quiz becomes the surprise, and thus the quiz can take place on any day - including Friday - and it will still be a surprise.
    Paradoxes are not what I would refer to as "good solutions". :P

    Also, I have no idea why people are arguing over the two-envelope problem. Especially the arguments over probabilities and distributions.

    Apothe0sis on
  • Apothe0sis (Have you ever questioned the nature of your reality?), Registered User regular
    edited April 2008
    Dance Commander wrote: »
    I agree with ElJeffe that switching can't be the correct strategy: if you could somehow make people forget what was in the first case, then show them the contents of the new one and offer them the same choice, they would keep switching, and the expected value would keep going up.
    But I can't figure out where anyone has done any math wrong.

    I do not follow this criticism. Can you explain it for me please?

    Apothe0sis on
  • Dance Commander, Registered User regular
    edited April 2008
    Basically I sit down and look at the expected values and say, "oh, switching is a good idea." So you switch from A to B. If case A had $100 in it, then the expected value of case B is (200+50)/2 = $125
    Now you do the exact same thing again, only you pick case B initially. Now once again you pick case A, and its expected value is now (250+75)/2 = $162.5.
    Etc. etc. I realize this doesn't work if you remember...
    oh, I'm being dumb. You don't get the expected value of the case, you get either 200 or 50, not 125. Nevermind.

    Dance Commander on
  • Apothe0sis (Have you ever questioned the nature of your reality?), Registered User regular
    edited April 2008
    So...

    Does that mean you don't like that criticism anymore?

    Apothe0sis on
  • Smasher (Starting to get dizzy), Registered User regular
    edited April 2008
    ElJeffe wrote: »
    Smasher wrote: »
    Just so it doesn't get lost:
    Aroduc wrote: »
    One little counterintuitive riddle that I always find interesting to see how people react is this:

    You're offered two briefcases. One contains twice as much money as the other. You pick briefcase A, and it's shown to have $X in it. You're then offered the chance to switch to briefcase B. Should you?
    There are four possible outcomes to the scenario: you pick the smaller briefcase and keep (Sk), pick the smaller briefcase and switch (Ss), pick the larger briefcase and keep (Lk), or pick the larger briefcase and switch (Ls).

    First, let's look at the problem where we don't know any of the values of the cases (i.e., we don't look in the first one). Assuming y is the value in the smaller case, we calculate the expected values of switching (0 if we keep):

    Sk: 0
    Ss: y
    Lk: 0
    Ls: -y

    We don't know whether we have the larger or smaller briefcase, so the expected value of switching is the average of the two equally likely outcomes, y and -y, which is 0, the same as keeping. All is well.

    Now let's look at where we know the values of both cases, but don't know which one we picked initially. Let's go with 100 and 200:

    Sk: 0
    Ss: 100
    Lk: 0
    Ls: -100

    Again, our expected value is 0 for both keeping and switching. No arguments here.

    Finally, we look at the originally posed problem, where we know the first suitcase has $X (where X is an actual value and not a variable. I'll use $100).

    Sk: 0
    Ss: 100
    Lk: 0
    Ls: -50

    This is where the three value thing other people were talking about comes in. Yes, there are only two values, but you don't know which one the other value is. Using the information you have available to you, your expected value is greater for switching than it is for staying. The reason this problem is different from the other two above is that you're using two different values for the smaller suitcase when calculating the gain or loss, as opposed to the other two problems where you're using only one.
    Smasher wrote: »
    Python:
    #!/usr/bin/python

    import random

    def main():
        trials = 10000
        knownMoney = 100  # value of the suitcase of money we're shown
        noSwitch = 0
        switch = 0
        for n in range(trials):
            noSwitch += knownMoney  # always stay, so always get the same amount
            # randomly choose whether the known suitcase is the smaller or bigger one
            if random.random() >= .5:  # it's the bigger one
                switch += knownMoney // 2
            else:  # it's the smaller one
                switch += knownMoney * 2
        print('no switching:', noSwitch)
        print('switching:', switch)

    if __name__ == "__main__":
        main()
    
    And the results are:

    no switching: 1000000
    switching: 1250450

    I'm pretty sure this is wrong because you're choosing things in the wrong order. You're saying that in a situation with a $50 and $100 briefcase, you ALWAYS choose the $100 briefcase, and in a situation with a $100 and $200 briefcase, you ALWAYS choose the $100 briefcase, when in reality this is not true. Sometimes in each of these situations you will choose the $50 briefcase or the $200 briefcase.

    You're looking at it in the wrong order. We know we have a given amount in the first suitcase; that's given in the problem. I picked 100 arbitrarily, but it works with any value. You're correct that I won't always pick the $100 suitcase for either of the two scenarios, but the problem states that I did as a premise (technically $X, but yeah), and so I work from that. What isn't specified is whether the $100 is the larger or smaller of the two, and so that varies randomly.

    Do you realize what you're arguing there? I mean, you're conceding that if you don't look in the case, then your expected return is 0 if you always switch. You're thus saying that if you look in the case before always switching - that is, if you don't alter your actual actions one bit - that your new knowledge somehow bends reality in such a way as to give you a positive expected return. That makes no sense.

    I made a mistake. In the first of the three examples I used y to represent the value in the smaller suitcase, but that value is different between Ss and Ls and so the two don't cancel out. That means that you'll still gain expected value whether you know the value of neither of the suitcases or only one of them. The case where you know both of the values (but not which one you have) has an expected value of zero though.
    ElJeffe wrote:
    Also, please read my prior post, in which I start with a small number of varying monetary amounts and demonstrate that for any arbitrarily large finite set of amounts, the expected return is always 0. Respond to that, respond to my point above, and then get back to me.
    (Quoted for reference, spoilered for size)
    ElJeffe wrote: »
    Gooey wrote: »
    You have no way to know whether or not the case you have picked is y or 2y. It's that simple. Given the rules of the problem it presents you with 3 values, not 2.

    x, .5x and 2x

    x is the case you pick. The number in the second case has equal probability of being .5x and 2x, given the fact that you have no idea if the second case is y or 2y.

    This is the error you're making. It makes no sense to say that you need three values to model the values in two briefcases.

    Okay, here's another way of looking at it. Let's assume that we have two sets of values rather than an infinite number of sets. The briefcases either contain 50 and 100 dollars, or 100 and 200 dollars. If you sum up all the possible amounts you can win or lose by switching, you'll find that switching yields an expected return of 0. The net expected gain when you gain is $75, and the net expected loss when you lose is $75, and you'll gain 50% of the time, while losing 50% of the time. You'll gain as often as you'll lose, and you'll gain or lose the same amount on average.

    But there are more than two possible sets of values! you say. Fine, add more. Add a set containing 200 and 400 dollars, and recalculate the results. It'll be the same result in the end. Add another containing 25 and 50, and another containing 30 and 60, and another containing 1M and 2M. Run the numbers after each addition, you'll always get the same result - no net expected gain from switching. As the number of variations approaches infinity, you'll never get a situation in which the net expected gain deviates from zero.

    In the second paragraph I'm pretty sure your math is wrong. You're including the gain of switching from $50 to $100 and the loss of switching from $200 to $100 in your averages, but neither of those can happen because you know you have $100! The only gain that can happen is $100 and the only loss is $50, and since they do have equal odds that means the expected value of switching is positive.

    I don't know what you're referring to in the third paragraph with the multiple sets of values. You know the value of one of the briefcases, so there's only two sets of values they could be. Where are these other sets coming from?

    Smasher on
  • Apothe0sis (Have you ever questioned the nature of your reality?), Registered User regular
    edited April 2008
    Regarding the envelopes:

    Which does everyone think is the correct formulation? Does anyone think there IS a correct formulation? I get my "I think there's something wrong" neck prickle in the second formulation, but then I recognise that Smullyan is a logic freak who's devoted far more time to it than I have...

    1. Let the amount in the envelope you chose be A. Then by swapping, if you gain you gain A but if you lose you lose A/2. So the amount you might gain is strictly greater than the amount you might lose.
    2. Let the amounts in the envelopes be Y and 2Y. Now by swapping, if you gain you gain Y but if you lose you also lose Y. So the amount you might gain is equal to the amount you might lose.
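    One way to see where formulation 1 slips, sketched as a tiny ledger with Y fixed at an arbitrary $100:

```python
# Ledger for a fixed envelope pair (Y, 2Y), with Y = $100 chosen arbitrarily.
Y = 100
gains = []  # what swapping nets in each of the two equally likely picks
for picked, other in [(Y, 2 * Y), (2 * Y, Y)]:
    gains.append(other - picked)

print(gains)  # [100, -100]: gain Y or lose Y, as formulation 2 says
# Formulation 1 is true sentence by sentence, but its "A" is $100 in the
# "gain A" branch and $200 in the "lose A/2" branch, two different dollar
# amounts, so averaging A and A/2 as if A were a single number is the slip.
```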

    Apothe0sis on
  • localh77, Registered User regular
    edited April 2008
    Apothe0sis wrote: »
    Regarding the envelopes:

    Which does everyone think is the correct formulation? Does anyone think there IS a correct formulation? I get my "I think there's something wrong" neck prickle in the second formulation, but then I recognise that Smullyan is a logic freak who's devoted far more time to it than I have...

    1. Let the amount in the envelope you chose be A. Then by swapping, if you gain you gain A but if you lose you lose A/2. So the amount you might gain is strictly greater than the amount you might lose.
    2. Let the amounts in the envelopes be Y and 2Y. Now by swapping, if you gain you gain Y but if you lose you also lose Y. So the amount you might gain is equal to the amount you might lose.

    #2. If you gain, you gain Y, but if you lose you also lose Y? No. If you gain, you gain Y; if you lose, you lose Y/2 (which is the same as #1).

    localh77 on
  • localh77, Registered User regular
    edited April 2008
    Apothe0sis wrote: »
    Regarding the envelopes:

    Which does everyone think is the correct formulation? Does anyone think there IS a correct formulation? I get my "I think there's something wrong" neck prickle in the second formulation, but then I recognise that Smullyan is a logic freak who's devoted far more time to it than I have...

    1. Let the amount in the envelope you chose be A. Then by swapping, if you gain you gain A but if you lose you lose A/2. So the amount you might gain is strictly greater than the amount you might lose.
    2. Let the amounts in the envelopes be Y and 2Y. Now by swapping, if you gain you gain Y but if you lose you also lose Y. So the amount you might gain is equal to the amount you might lose.

    I think it all depends on how the values are initially chosen, which is not simple. According to http://www.u.arizona.edu/~chalmers/papers/envelope.html, "It is a consequence of the fact that given infinite expectations, any given finite value will be disappointing."

    localh77 on
    ElJeffeElJeffe Moderator, ClubPA mod
    edited April 2008
    Apothe0sis wrote: »
    Regarding the envelopes:

Which does everyone think is the correct formulation? Does anyone think there IS a correct formulation? I get my "I think there's something wrong" neck prickle in the second formulation, but then I recognise that Smullyan is a logic freak who's devoted far more time to it than I have...

    1. Let the amount in the envelope you chose be A. Then by swapping, if you gain you gain A but if you lose you lose A/2. So the amount you might gain is strictly greater than the amount you might lose.
    2. Let the amounts in the envelopes be Y and 2Y. Now by swapping, if you gain you gain Y but if you lose you also lose Y. So the amount you might gain is equal to the amount you might lose.

There's a correct formulation, and it's 2. The first formulation incorrectly ignores the probabilities associated with the two scenarios in which you can be shown $100 in your chosen case. Yes, in the case where you're shown $100, the other case can have either $50 or $200, which means that you can stand to lose $50 or gain $100. BUT the first situation there - where you lose $50 - is tied to the situation in which you're shown $50 and stand to gain $50 by choosing the other case. The second situation - where you can gain $100 - is tied to the situation in which you're shown $200 but stand to lose $100 by choosing the other case.

    Hey, I tell you what. To anyone who's convinced the expected value for always switching is higher than that for always keeping the first case, I have a proposition. I will run a simulation in which I "fill" two cases with money, a la the puzzle. I will pick a value Y at random, with the stipulation that the number will be far less than 100 (this keeps the amount from hitting infinity). The virtual cases will contain Y and 2Y dollars.

You pick a case, I tell you the amount, and you decide whether to swap or not. If it's less, you lose that amount, and if it's more, you gain that amount. Basically, your cumulative winnings are the amount you gain based on your strategy of gaming the expected values. But there's a catch - you must pay 10% of the value of your initially chosen case for the privilege of swapping. Since your argument is that you increase your expected winnings by 25% using your "always swap" strategy, you'd still stand to come out ahead.

    We'll run the simulation 100 times using randomly selected values. You come out ahead, I pay you. I come out ahead, you pay me. Deal or no deal?

    In all seriousness, I would like someone to actually model this scenario correctly by picking values at random and using the logic inherent in the actual puzzle, rather than using Smasher's demonstrably flawed coin-flip model, where the contents of the second case is determined after the first case is picked. I'd do it myself if I could code worth a shit.
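
For what it's worth, the proposed bet is easy to sketch in code. A minimal sketch, with two assumptions not in the post: Y is drawn uniformly from $1-$30 to stand in for "far less than 100", and the 10% swap fee is charged on the value of the first case.

```python
import random

# Sketch of the proposed bet. Assumptions (not from the post): Y is drawn
# uniformly from $1-$30 to stand in for "far less than 100", and the 10%
# swap fee is charged on the value of the first case.
def run_bet(trials, rng):
    keep_total = 0.0    # strategy: keep the first case
    switch_total = 0.0  # strategy: always pay the 10% fee and swap
    for _ in range(trials):
        y = rng.uniform(1, 30)
        cases = [y, 2 * y]
        first = rng.randrange(2)
        keep_total += cases[first]
        switch_total += cases[1 - first] - 0.1 * cases[first]
    return keep_total / trials, switch_total / trials

keep_avg, switch_avg = run_bet(200_000, random.Random(0))
```

Both strategies collect 1.5 * E[Y] per game on average before fees, so the always-swapper loses exactly the fee on average and ElJeffe wins the bet.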

    ElJeffe on
I submitted an entry to Lego Ideas, and if 10,000 people support me, it'll be turned into an actual Lego set! If you'd like to see and support my submission, follow this link.
    localh77localh77 Registered User regular
    edited April 2008
    ElJeffe wrote: »
    Apothe0sis wrote: »
    Regarding the envelopes:

Which does everyone think is the correct formulation? Does anyone think there IS a correct formulation? I get my "I think there's something wrong" neck prickle in the second formulation, but then I recognise that Smullyan is a logic freak who's devoted far more time to it than I have...

    1. Let the amount in the envelope you chose be A. Then by swapping, if you gain you gain A but if you lose you lose A/2. So the amount you might gain is strictly greater than the amount you might lose.
    2. Let the amounts in the envelopes be Y and 2Y. Now by swapping, if you gain you gain Y but if you lose you also lose Y. So the amount you might gain is equal to the amount you might lose.

There's a correct formulation, and it's 2. The first formulation incorrectly ignores the probabilities associated with the two scenarios in which you can be shown $100 in your chosen case. Yes, in the case where you're shown $100, the other case can have either $50 or $200, which means that you can stand to lose $50 or gain $100. BUT the first situation there - where you lose $50 - is tied to the situation in which you're shown $50 and stand to gain $50 by choosing the other case. The second situation - where you can gain $100 - is tied to the situation in which you're shown $200 but stand to lose $100 by choosing the other case.

    Hey, I tell you what. To anyone who's convinced the expected value for always switching is higher than that for always keeping the first case, I have a proposition. I will run a simulation in which I "fill" two cases with money, a la the puzzle. I will pick a value Y at random, with the stipulation that the number will be far less than 100 (this keeps the amount from hitting infinity). The virtual cases will contain Y and 2Y dollars.

You pick a case, I tell you the amount, and you decide whether to swap or not. If it's less, you lose that amount, and if it's more, you gain that amount. Basically, your cumulative winnings are the amount you gain based on your strategy of gaming the expected values. But there's a catch - you must pay 10% of the value of your initially chosen case for the privilege of swapping. Since your argument is that you increase your expected winnings by 25% using your "always swap" strategy, you'd still stand to come out ahead.

    We'll run the simulation 100 times using randomly selected values. You come out ahead, I pay you. I come out ahead, you pay me. Deal or no deal?

    In all seriousness, I would like someone to actually model this scenario correctly by picking values at random and using the logic inherent in the actual puzzle, rather than using Smasher's demonstrably flawed coin-flip model, where the contents of the second case is determined after the first case is picked. I'd do it myself if I could code worth a shit.

    "the number will be far less than 100" - what does that mean?

    localh77 on
    SmasherSmasher Starting to get dizzy Registered User regular
    edited April 2008
    ElJeffe wrote: »
    Apothe0sis wrote: »
    Regarding the envelopes:

Which does everyone think is the correct formulation? Does anyone think there IS a correct formulation? I get my "I think there's something wrong" neck prickle in the second formulation, but then I recognise that Smullyan is a logic freak who's devoted far more time to it than I have...

    1. Let the amount in the envelope you chose be A. Then by swapping, if you gain you gain A but if you lose you lose A/2. So the amount you might gain is strictly greater than the amount you might lose.
    2. Let the amounts in the envelopes be Y and 2Y. Now by swapping, if you gain you gain Y but if you lose you also lose Y. So the amount you might gain is equal to the amount you might lose.

There's a correct formulation, and it's 2. The first formulation incorrectly ignores the probabilities associated with the two scenarios in which you can be shown $100 in your chosen case. Yes, in the case where you're shown $100, the other case can have either $50 or $200, which means that you can stand to lose $50 or gain $100. BUT the first situation there - where you lose $50 - is tied to the situation in which you're shown $50 and stand to gain $50 by choosing the other case. The second situation - where you can gain $100 - is tied to the situation in which you're shown $200 but stand to lose $100 by choosing the other case.

Those situations are irrelevant because the problem states we chose the envelope with $X (i.e., $100) in it. The case where the two envelopes were $50 and $100 and we chose the $50 has as much impact as the case where the envelopes have $3 and $6: none, because we know neither of them happened.

    The only cases that can be true are the ones where we ended up choosing the $100 envelope first, and there are only two of those. I could randomly choose numbers until I got ones that fit the parameters of the problem, but not only would that take forever but I'd get exactly the same 50/50 distribution of cases anyway.
    ElJeffe wrote:
    Hey, I tell you what. To anyone who's convinced the expected value for always switching is higher than that for always keeping the first case, I have a proposition. I will run a simulation in which I "fill" two cases with money, a la the puzzle. I will pick a value Y at random, with the stipulation that the number will be far less than 100 (this keeps the amount from hitting infinity). The virtual cases will contain Y and 2Y dollars.

You pick a case, I tell you the amount, and you decide whether to swap or not. If it's less, you lose that amount, and if it's more, you gain that amount. Basically, your cumulative winnings are the amount you gain based on your strategy of gaming the expected values. But there's a catch - you must pay 10% of the value of your initially chosen case for the privilege of swapping. Since your argument is that you increase your expected winnings by 25% using your "always swap" strategy, you'd still stand to come out ahead.

    We'll run the simulation 100 times using randomly selected values. You come out ahead, I pay you. I come out ahead, you pay me. Deal or no deal?

    In all seriousness, I would like someone to actually model this scenario correctly by picking values at random and using the logic inherent in the actual puzzle, rather than using Smasher's demonstrably flawed coin-flip model, where the contents of the second case is determined after the first case is picked. I'd do it myself if I could code worth a shit.

    The main issue with this is that the trials would have highly uneven weights due to the random money values involved. That defeats the purpose of having multiple trials, as it makes the outcome almost entirely dependent on the top 1-3 or so results and is thus unacceptably subject to luck/chance/the Force. We could narrow down the range of random values, but the more we do that the more it would seem to defeat the purpose of having random values in the first place.

    Incidentally, what method would you use to pick the values in the envelopes?

    Smasher on
    SavantSavant Simply Barbaric Registered User regular
    edited April 2008
    localh77 wrote: »
    Apothe0sis wrote: »
    Regarding the envelopes:

Which does everyone think is the correct formulation? Does anyone think there IS a correct formulation? I get my "I think there's something wrong" neck prickle in the second formulation, but then I recognise that Smullyan is a logic freak who's devoted far more time to it than I have...

    1. Let the amount in the envelope you chose be A. Then by swapping, if you gain you gain A but if you lose you lose A/2. So the amount you might gain is strictly greater than the amount you might lose.
    2. Let the amounts in the envelopes be Y and 2Y. Now by swapping, if you gain you gain Y but if you lose you also lose Y. So the amount you might gain is equal to the amount you might lose.

I think it all depends on how the values are initially chosen, which is not simple. According to http://www.u.arizona.edu/~chalmers/papers/envelope.html, "It is a consequence of the fact that given infinite expectations, any given finite value will be disappointing."

That link shows pretty quickly what is being swept under the rug here: the probability distribution of X. I called that the prior distribution but maybe that wasn't the proper terminology. Namely, that the value of X cannot simply be pulled out of the air, and that not all values of X are equally likely. If you have the improper distribution where all positive values of X have equal probability (any amount of money in the bag is equally likely), then that results in an infinite integral and thus is not a proper probability distribution. On that page g(x) is the probability density for Y that we were talking about (the amount of money in the smaller bag) and h(x) is the probability density for X, the value in the bag that we opened. Whether or not you would want to switch for a particular value of X (say $100) is heavily dependent upon g(x).

    The paradox (that you should always switch with positive expectation) does hold in one situation: when E[X] is infinite. But weird things tend to happen with infinite expectations.

The relevant expectation for switching is E[K-A] on that page. That is the average value that you expect to gain from switching. If E[X] is finite, then E[K-A] = 0. What this means is that for some values of X that you get out of the bag it may be positive expectation to switch, but then there would be other values of X where you would get negative expectations, such that on average you get no benefit.

    If we know about the underlying distribution g(x), then we can figure out that it may be a good idea to switch when we see X dollars in the first bag. But since there is no indication that we know that, the expectations of gains are zero. Which is consistent with symmetry of the situation.
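
Savant's point can be illustrated with a quick simulation. A sketch, with one loud assumption: the prior on Y is uniform over $1-$100 (the puzzle specifies no distribution). Blind switching gains nothing on average, but a threshold rule that exploits knowledge of the prior does.

```python
import random

# Sketch: with a known prior (here Y uniform on [1, 100] -- an assumption,
# the puzzle specifies none), blind switching gains nothing on average, but
# a threshold rule that uses the prior does.
def simulate(trials, rng):
    keep = always = threshold = 0.0
    for _ in range(trials):
        y = rng.uniform(1, 100)
        envelopes = [y, 2 * y]
        i = rng.randrange(2)
        seen, other = envelopes[i], envelopes[1 - i]
        keep += seen
        always += other                              # always switch
        threshold += other if seen < 100 else seen   # switch only below $100
    return keep / trials, always / trials, threshold / trials

keep_avg, always_avg, thresh_avg = simulate(500_000, random.Random(1))
```

Keeping and always-switching both average 1.5 * E[Y]; the threshold strategy comes out strictly ahead of both, which is exactly what "if you know g(x) you can strategize" means.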

    Savant on
    hesthefastesthesthefastest Registered User regular
    edited April 2008
    My two cents on the briefcase problem is this:
It comes down to whether you think you have the case once you pick it. If I were to pick case A and receive $50, then the guy asks me to either double or halve what I already have, I'd do it. It's like the coin flip scenario mentioned previously. But if I choose Case A, then have to choose to switch (before I have the money from Case A), then the choice is the same; either Case A or B, it makes no difference. If it were better to switch, it would then be better to switch back, and then back and forth forever.

It depends on whether you treat the two decisions as separate problems or as one whole problem. Thinking on it now, I'd go with the latter.

    hesthefastest on
    Apothe0sisApothe0sis Have you ever questioned the nature of your reality? Registered User regular
    edited April 2008
    I don't like the envelope question.

    It's too hard to think of a model that makes sense for computer simulation. Smasher's seems ok to me, but it also feels a little wrong.

    Apothe0sis on
    SavantSavant Simply Barbaric Registered User regular
    edited April 2008
    Apothe0sis wrote: »
    I don't like the envelope question.

    It's too hard to think of a model that makes sense for computer simulation. Smasher's seems ok to me, but it also feels a little wrong.

The page mentioned earlier (http://www.u.arizona.edu/~chalmers/papers/envelope.html) is pretty rigorous about solving it. The important detail that is easily overlooked is that the X dollars you see in the first bag depends on some probability distribution. If you know what that distribution is then you can figure out when to switch and when not to. So if you know how likely different amounts of money are to be in the bags then you can strategize, but if you don't then the expected value of switching is zero (barring degenerate distributions of X).

    The problem with Smasher's simulation is that he assumes things about the distribution of X that are not in the problem statement (what's g(x)?), and that even with those assumptions he only evaluates it for one specific value of X (known money isn't always going to be 100).

    Savant on
    MatrijsMatrijs Registered User regular
    edited April 2008
    Savant wrote: »
    localh77 wrote: »
    Apothe0sis wrote: »
    Regarding the envelopes:

Which does everyone think is the correct formulation? Does anyone think there IS a correct formulation? I get my "I think there's something wrong" neck prickle in the second formulation, but then I recognise that Smullyan is a logic freak who's devoted far more time to it than I have...

    1. Let the amount in the envelope you chose be A. Then by swapping, if you gain you gain A but if you lose you lose A/2. So the amount you might gain is strictly greater than the amount you might lose.
    2. Let the amounts in the envelopes be Y and 2Y. Now by swapping, if you gain you gain Y but if you lose you also lose Y. So the amount you might gain is equal to the amount you might lose.

I think it all depends on how the values are initially chosen, which is not simple. According to http://www.u.arizona.edu/~chalmers/papers/envelope.html, "It is a consequence of the fact that given infinite expectations, any given finite value will be disappointing."

That link shows pretty quickly what is being swept under the rug here: the probability distribution of X. I called that the prior distribution but maybe that wasn't the proper terminology. Namely, that the value of X cannot simply be pulled out of the air, and that not all values of X are equally likely. If you have the improper distribution where all positive values of X have equal probability (any amount of money in the bag is equally likely), then that results in an infinite integral and thus is not a proper probability distribution. On that page g(x) is the probability density for Y that we were talking about (the amount of money in the smaller bag) and h(x) is the probability density for X, the value in the bag that we opened. Whether or not you would want to switch for a particular value of X (say $100) is heavily dependent upon g(x).

    The paradox (that you should always switch with positive expectation) does hold in one situation: when E[X] is infinite. But weird things tend to happen with infinite expectations.

The relevant expectation for switching is E[K-A] on that page. That is the average value that you expect to gain from switching. If E[X] is finite, then E[K-A] = 0. What this means is that for some values of X that you get out of the bag it may be positive expectation to switch, but then there would be other values of X where you would get negative expectations, such that on average you get no benefit.

    If we know about the underlying distribution g(x), then we can figure out that it may be a good idea to switch when we see X dollars in the first bag. But since there is no indication that we know that, the expectations of gains are zero. Which is consistent with symmetry of the situation.

    The original problem description reads,
    You're offered two briefcases. One contains twice as much money as the other. You pick briefcase A, and it's shown to have $X in it. You're then offered the chance to switch to briefcase B. Should you?

    I don't see anything in there about a finite limit on the dollar value in the briefcases. In fact, that's key to the whole setup. The idea here is that, unlike any problem where the setup involves a finite limit on the amount that could be in a briefcase (or a finite limit for X, they're the same thing in practice), we can never know whether we have the higher value briefcase or not. In the case where X has a finite limit, we will inevitably know which briefcase we have, for certain, in 25% of the cases, which makes the whole problem very silly. The point of the problem is the case where you don't know.

    To illustrate why this is important, let me present the following two examples, based on the following edited problem:
    You're offered two briefcases. One contains twice as much money as the other. You pick briefcase A, and it's shown to have $X in it. You're then offered the chance to switch to briefcase B. Should you?

ADDENDUM: Y, the value of the money in the smaller briefcase, is equally likely to be any value between 0 and 100, but cannot exceed 100

    Example 1:
    You open a briefcase. The briefcase contains $198.

    Example 2:
    You open a briefcase. The briefcase contains $20.

    In example 1, you know for certain that you're holding the larger briefcase. Therefore, your expected value looks like this:
    Do not switch: 1*198 = 198 (probability 1 that you will receive 198 dollars)
    Switch: 1*99 + 0*396 = 99 (we know that the other briefcase is the smaller one, and therefore holds $99, and we know that the other briefcase cannot be the larger value, which, in this case, would be $396)

    But in example 2, you don't know which briefcase you're holding. Therefore, your expected value looks like this:
    Do not switch: 1*20 = 20 (probability 1 that you will receive 20 dollars)
    Switch: .5*10 + .5*40 = 25 (probability .5 that you are holding the smaller case, which means the other case will hold 40, and probability .5 that you are holding the larger case, which means that the other case will hold 10)

    What you should take away from these two examples is that switching is a universally better tactic provided that you do not know which briefcase is which. That's the effect of the finite limit - it tells you, in some cases, which briefcase is which.

    Now let's consider the actual problem once more:
    You're offered two briefcases. One contains twice as much money as the other. You pick briefcase A, and it's shown to have $X in it. You're then offered the chance to switch to briefcase B. Should you?

    What is a reasonable upper bound on X? Hard to say, isn't it? Without further information, the only perfectly certain conclusion we can make about X's upper bound is that it is less than the complete supply of currency times two thirds (since they'd have to fill the other briefcase as well).

    This penny-ante $100/$200/$50 stuff we've been talking about is certainly nowhere near the upper bound, though. If somebody's giving you significant sums of money for free, it's pretty hard to say how much they're likely to give.

    Therefore, for almost every case we are likely to see, we cannot know for certain which briefcase is which. This means that, in almost every case, switching is the best strategy.
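
The two worked examples above can be checked numerically. A sketch, with one assumption of mine: Y is a whole-dollar amount, uniform on $1-$100, so that conditioning on an exact observed amount makes sense.

```python
import random

# Assumption (mine, for easy conditioning): Y is a whole-dollar amount,
# uniform on $1..$100; the other case holds 2Y. We estimate the average
# value of the other case, conditioned on what the opened case shows.
def avg_other_given_seen(seen, trials, rng):
    others = []
    for _ in range(trials):
        y = rng.randrange(1, 101)
        cases = [y, 2 * y]
        i = rng.randrange(2)
        if cases[i] == seen:
            others.append(cases[1 - i])
    return sum(others) / len(others)

rng = random.Random(2)
ev20 = avg_other_given_seen(20, 2_000_000, rng)    # about $25: switch
ev198 = avg_other_given_seen(198, 200_000, rng)    # exactly $99: keep
```

Seeing $20 leaves two equally likely worlds ($10 or $40 in the other case), so switching is worth about $25 on average; seeing $198 pins you to the larger case, so the other case always holds $99, exactly as in the examples.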

    Matrijs on
    Apothe0sisApothe0sis Have you ever questioned the nature of your reality? Registered User regular
    edited April 2008
    I think I agree with Matrij, and also note that this whole thing reminds me of the discussion of the "I have two children and one is a girl" thing.

    Apothe0sis on
    GorakGorak Registered User regular
    edited April 2008
    ElJeffe wrote: »
    Do you realize what you're arguing there? I mean, you're conceding that if you don't look in the case, then your expected return is 0 if you always switch. You're thus saying that if you look in the case before always switching - that is, if you don't alter your actual actions one bit - that your new knowledge somehow bends reality in such a way as to give you a positive expected return. That makes no sense.


    It's clearly a quantum suitcase.


    Don't you know anything Jeff? :P

    Gorak on
    ElJeffeElJeffe Moderator, ClubPA mod
    edited April 2008
    Smasher wrote: »
    The main issue with this is that the trials would have highly uneven weights due to the random money values involved. That defeats the purpose of having multiple trials, as it makes the outcome almost entirely dependent on the top 1-3 or so results and is thus unacceptably subject to luck/chance/the Force. We could narrow down the range of random values, but the more we do that the more it would seem to defeat the purpose of having random values in the first place.

    We could always throw out the X highest and lowest values, or something. The problem isn't insurmountable, if you want to take me up on it. You're clearly right - what do you have to lose? ;)
    Incidentally, what method would you use to pick the values in the envelopes?

    RNG applied over the given range that I pick in the beginning.

    ElJeffe on
    ElJeffeElJeffe Moderator, ClubPA mod
    edited April 2008
    Matrijs wrote: »
    The original problem description reads,
    You're offered two briefcases. One contains twice as much money as the other. You pick briefcase A, and it's shown to have $X in it. You're then offered the chance to switch to briefcase B. Should you?

    I don't see anything in there about a finite limit on the dollar value in the briefcases. In fact, that's key to the whole setup. The idea here is that, unlike any problem where the setup involves a finite limit on the amount that could be in a briefcase (or a finite limit for X, they're the same thing in practice), we can never know whether we have the higher value briefcase or not. In the case where X has a finite limit, we will inevitably know which briefcase we have, for certain, in 25% of the cases, which makes the whole problem very silly. The point of the problem is the case where you don't know.

    To illustrate why this is important, let me present the following two examples, based on the following edited problem:
    You're offered two briefcases. One contains twice as much money as the other. You pick briefcase A, and it's shown to have $X in it. You're then offered the chance to switch to briefcase B. Should you?

ADDENDUM: Y, the value of the money in the smaller briefcase, is equally likely to be any value between 0 and 100, but cannot exceed 100

    Example 1:
    You open a briefcase. The briefcase contains $198.

    Example 2:
    You open a briefcase. The briefcase contains $20.

    In example 1, you know for certain that you're holding the larger briefcase. Therefore, your expected value looks like this:
    Do not switch: 1*198 = 198 (probability 1 that you will receive 198 dollars)
    Switch: 1*99 + 0*396 = 99 (we know that the other briefcase is the smaller one, and therefore holds $99, and we know that the other briefcase cannot be the larger value, which, in this case, would be $396)

    But in example 2, you don't know which briefcase you're holding. Therefore, your expected value looks like this:
    Do not switch: 1*20 = 20 (probability 1 that you will receive 20 dollars)
    Switch: .5*10 + .5*40 = 25 (probability .5 that you are holding the smaller case, which means the other case will hold 40, and probability .5 that you are holding the larger case, which means that the other case will hold 10)

    This was why I stipulated in my offer that the maximum value of Y would be "far below 100". If you pick a random number between 1 and infinity, you're going to invariably wind up with an obscenely large number. If you have a weighted probability distribution, that's going to affect the results. And if you tell the person that the values will never exceed X, then the person knows they have the case with the most money whenever it's found to exceed X/2.

    By picking a maximum value for X but keeping it secret, you can have a linear probability distribution without giving away any information (assuming either the player doesn't play a large enough number of games to be able to deduce the maximum, or that all simulations are run at once.) If the maximum value is 30, that's "far below 100", but the player doesn't have enough information to deduce that getting a case with $16+ in it is definitely the larger.

    ElJeffe on
    TechnicalityTechnicality Registered User regular
    edited April 2008
    I'm on the "doesn't matter if you switch" side of the fence for this reason:

    Add an imaginary second person and give them the case the player rejects and then do the whole thing a million times with the same cash amounts as the first time, but new pairs of real+imaginary people.

    There is no way the real people are going to come out significantly ahead of the imaginary ones by using the strategy of always switching (like they do in the 3 doors thing).

    Technicality on

    DaenrisDaenris Registered User regular
    edited April 2008
    Well, I just wrote up a quick program that picks a random value between 50 and 250 as the low value and randomly assigns it to case 1 or 2, then assigns twice that value to the other case.

Running 1,000 trials in which I always pick case 1 and always switch, the total value was 229,949 and the total value of the other cases was 224,467. Running 10,000 trials yielded winnings of 2,255,928 with the other cases totaling 2,252,136. And running 100,000, the winnings were 22,451,448 and the other cases totaled 22,484,589.

    Let me know what changes you want made to the program if any to better model the idea, but I think this does it pretty well. If switching was always the better option, we should see a clear separation in the winnings vs the other cases because the simulation always switches.
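
For anyone who wants to replicate it, a minimal re-creation of the program described above (a sketch; I've assumed integer dollar amounts, which the post doesn't specify):

```python
import random

# Low value uniform on 50..250, doubled in the other case; always take
# case 1 ("kept") and compare against always switching to case 2.
def run(trials, rng):
    kept = switched = 0
    for _ in range(trials):
        low = rng.randint(50, 250)
        cases = [low, 2 * low]
        rng.shuffle(cases)
        kept += cases[0]
        switched += cases[1]
    return kept, switched

kept, switched = run(100_000, random.Random(3))
```

The two totals land within a fraction of a percent of each other, matching the numbers reported above: always switching confers no advantage.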

    Daenris on
    Premier kakosPremier kakos Registered User, ClubPA regular
    edited April 2008
    Apothe0sis wrote: »
    Apothe0sis wrote: »
    This is most assuredly not my series, and math geeks will probably see why this is funny.

    2, 5, 877, 27644437, 35742549198872617291353508656626642567, 359334085968622831041960188598043661065388726959079837 ... ?
    9307401050859361833339073708678771668624354181818088837448171416326029508304724973577276721724298330280946000097753418020713626739349384356516524480316063146814231099333180236216302224419669825697134811684861387342048483114569312306952285146497791875200882468236861657592521995081654047643790507881685096141191027133367703348434837656324009083082540225576689776943384999909707282580000966746426960945429417722662694769603821979068993675052975996290739253990370755905171029438548823039956079241100466738954108049433007982030170845519199130169104695280557131053059398868184929236433486603796177285648959263135338711958891293002910471710877188983864012996915232329189289444084910521874691534368191712535725270529156827079555866185603615545289476823952710065767429554007449639720022278880873089536756603614854419339141124993063838922167126280568376565228793177460637944647501970451188671467793848192823517023052978270279819807151279127709228719990341889501455231522158864188284366350716033437090533640072899198881873878781288061937093640028808739315358446392174402824751496920005522742825165707354038494363284078016328997651384086219769016502062838773712352954626511369062068259584183656826958461093774922517415618827208958704975204232043599938741083451918827120857727996271310312117711773624657872002084961520813829041054745921063931348187128095718079748071888400515835718354387538393308164070895842680847586898059606380520368290724154215828651696570617650169135200905598031695361994136196358616464276233895922619440159154925889407049411432178956125342387464576748567356301069461683767038919102111699332681898564067768231116859651313592757579293379589702498395521255569988681375865872722321322564124905429185471327182523619876828874957731701575056739959646887348894015234619144877670876026086250623825565385155440029877050241839046903792774019699092213005845734453846159726814053394471432563493888454591413933551202874068958545691658629284645668322976362326384596192718512066668652736819066147190254
68898369399072429294089228200789081125931786631776852205252681013839712839917114681872763527386079112843184702084803098801837213712268861685928901720349976392850240927596875259204535736405387771063028523515530548237758134506805453209597476761025278952839523211195654561319142844688372285284672708838590168544142060710543246861797244524357045061554352100319253837885185150006553196341482297467885643810206010531432720023104376078782376408401233051863610124026508033498599650812022945153471825192137211970409154132632494735397407816087869072739230651271964455263324431135429571890944280436717816354324171306451352813436275175447000984335294521279714555017023304536144873413571749777567671170682943563184371493852439624472714712174333123517335712812402934616654508297613355965915862103913123069669977732855945287467293460180390231158381456778826878814610947469472278273019814489496463949945213199660237289764588141609342411059089614069829463980289191366191048499169091675625707746086668076838433436717046156008403894196972028337969577203971442421323164274670018082194854824561957364635961117074857154402375943844591619284158360778522375266651174840489972474492753005844465504375461199236760179594627125811969767184709462703318425629726127283616522800308929821271117007931393547039469905802567800698169189130856391537539841313904686353027550888892113674742687796335618383639538466591359062295136133922672668140667350121274037024131924308831590934650438667969010056567378752512686526025522798829275533689134660861095514914911947897405673218798336762897674485279152192831733108732476063662840481119316617751071554923036028779519560859445930353836091340387143548962770166568320698774862977859061388080321994780412984462010406045327903364753888151362370127868968851133590988367812972067669366529398960861731742470475127536365941290421502035041015703963006736787336987418299320121186851742414713753297063993651161908529697425290401563696823569425277799479687346048669751286785446824206793405744996100039715196
18440516937488305570847146550169657044672599070091974948167777874966859847710614503223797831808567277923192280169244649411883773069412702823257335336231095595728578499725749559860832154398909223367972897328124149325331610872832467828260416056336532880372285600592198569202140923550067859152359527561994066586118059826706360048699302434574822753790260890021587852244744860631603541791133619526793181058747657744949691213833240225173150813910220165362694840945664849801045511812349221956400744150897341864900858146835458095842131465890241777553970152159303024640985017795077066167761624753595558912106613186934139850039921207148473490125950756237204950896105932772825720856228894276720028932601843556239266022962890064643933856257123158863877848475127602406307792588595211703986239432550726691852534321783665511020801887786921172059700879026067285947359724825015339666089008466190352993533122497861078203104758579483775098260468006288071847722122648015073671866043577071451595398504208603083753549564030349663312899219714349854025049280827748345322558570768577632728362536039844811568342727553594911765165120515649387783590754617053981056103148444168056157453359284719489933160382315541998163668080430170189604432196012778454138678438477287617931797344197371492016925294293920435571230125441856377635430992383031641860714124008747101118765506416821101339484788073020563327420998407651377506476214794114648110563115377659800929141531398790434303752231082196206662416811943482851130965329365467377939976152662541912142094277951396022236086513406015986868896776219944649030822428734842983667486613881443827330912807645197812117423741459096670510731379104448554954932379389781897741126092080837369956421197209552100624952564640377427157704473125717924200923183638927529737419861594670798574290635462323151713380155501912570516986807486808850943703276543637772748688396151956171036621635411599055947803963844239542311343491664913052539486422164914206103599970496248266628264533316702997562527874219219368930
835654767520116901325628008414215779935053300527454338095585904167331992744976932786018872605231337251856220370815003679385877026959961549970646582412358020805216001234519254278426271698879069271009841356762362776355079842217812045389559364458319251495759620406596953099358508401125247456305868316439298039940486165859806267316803843769250751290909537174237439362050585335391392533062430185171682628621992342320221279940728553592588293913252900669869333819652288398187260320155987924359450740883499506500844712757333449557083885253703080601131

    Fuck you and your prime bell numbers.

    Well, I did NOT expect that.

    Please, give me some credit. A sequence of prime numbers that grows that fast pretty much narrows it down to a very small set of sequences, the most famous of which is the prime Bell numbers.

    Premier kakos on
    Premier kakos Registered User, ClubPA regular
    edited April 2008
    This won't induce as many math headaches as some of the other stuff in this thread.

    _, _, _, 7, 7, 9, 13, 10, 9, 1, 4, 9, 16, 16, 9, 13 ...

    What are the first three numbers in this series?

    I'll give you the first FOUR numbers.

    0, 1, 4, 9

    Premier kakos on
    Premier kakos Registered User, ClubPA regular
    edited April 2008
    1, 4, 24, 192, 1728, 17280, 207360, 2903040, 43545600,...

    What is the next number in this sequence?

    Premier kakos on
    SithDrummer Registered User regular
    edited April 2008
    1, 4, 24, 192, 1728, 17280, 207360, 2903040, 43545600,...

    What is the next number in this sequence?
    696,729,600 - you're multiplying by the composite integers in increasing order (4, 6, 8, 9, 10, 12, 14, 15, so 16 comes next).
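    The rule is easy to check with a quick sketch (the function names here are mine, not from the thread):

    ```python
    def is_composite(n):
        """True if n has a divisor other than 1 and itself."""
        return n > 3 and any(n % d == 0 for d in range(2, int(n**0.5) + 1))

    def sequence(count):
        """Start at 1 and multiply by successive composites: 4, 6, 8, 9, 10, ..."""
        terms = [1]
        k = 3
        while len(terms) < count:
            k += 1
            if is_composite(k):
                terms.append(terms[-1] * k)
        return terms

    print(sequence(10))
    # [1, 4, 24, 192, 1728, 17280, 207360, 2903040, 43545600, 696729600]
    ```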

    SithDrummer on
    ElJeffe Moderator, ClubPA mod
    edited April 2008
    1, 4, 24, 192, 1728, 17280, 207360, 2903040, 43545600,...

    What is the next number in this sequence?

    Your mother is a sequence.

    ElJeffe on
    Matrijs Registered User regular
    edited April 2008
    Daenris wrote: »
    Well, I just wrote up a quick program that picks a random value between 50 and 250 as the low value and randomly assigns it to case 1 or 2, then assigns twice that value to the other case.

    Running 1,000 trials in which I always pick case 1 and always switch, the total value was 229,949 and the total value of the other cases was 224,467. Running 10,000 trials yielded winnings of 2,255,928 with the other cases totaling 2,252,136. And running 100,000, the winnings were 22,451,448 and the other cases totaled 22,484,589.

    Let me know what changes you want made to the program, if any, to better model the idea, but I think this does it pretty well. If switching were always the better option, we should see a clear separation between the winnings and the other cases, because the simulation always switches.
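    The described program can be sketched roughly like this (the original code wasn't posted, so the names and structure here are my guesses):

    ```python
    import random

    def run_trials(n, low=50, high=250):
        """Always pick case 1, always switch, as described above."""
        winnings = other = 0
        for _ in range(n):
            x = random.randint(low, high)  # the low value
            cases = [x, 2 * x]
            random.shuffle(cases)          # randomly assign to case 1 or 2
            winnings += cases[1]           # the case we switched to
            other += cases[0]              # the case we gave up
        return winnings, other
    ```

    Over many trials the two totals stay close (each case averages 1.5 times the mean low value per trial), which matches the numbers reported above.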

    The problem with your model is the upper bound. In the actual problem, there is no upper bound on the amount of money that could be in the briefcase. If you add a fixed upper bound, as I note in my post on the previous page, there are situations like example two where, given complete information about the method of random selection, the player can know which briefcase he has after opening only one.

    Matrijs on
    Premier kakos Registered User, ClubPA regular
    edited April 2008
    What is the next number in this sequence?

    1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 3, 1, 1, 1, 1, 1, 8, 1,...

    Premier kakos on
    Matrijs Registered User regular
    edited April 2008
    ElJeffe wrote: »
    Matrijs wrote: »
    The original problem description reads,
    You're offered two briefcases. One contains twice as much money as the other. You pick briefcase A, and it's shown to have $X in it. You're then offered the chance to switch to briefcase B. Should you?

    I don't see anything in there about a finite limit on the dollar value in the briefcases. In fact, that's key to the whole setup. The idea here is that, unlike any problem where the setup involves a finite limit on the amount that could be in a briefcase (or a finite limit for X, they're the same thing in practice), we can never know whether we have the higher value briefcase or not. In the case where X has a finite limit, we will inevitably know which briefcase we have, for certain, in 25% of the cases, which makes the whole problem very silly. The point of the problem is the case where you don't know.

    To illustrate why this is important, let me present the following two examples, based on the following edited problem:
    You're offered two briefcases. One contains twice as much money as the other. You pick briefcase A, and it's shown to have $X in it. You're then offered the chance to switch to briefcase B. Should you?

    ADDENDUM: Y, the value of the money in the smaller briefcase, is equally likely to be any value between 0 and 100, and cannot exceed 100

    Example 1:
    You open a briefcase. The briefcase contains $198.

    Example 2:
    You open a briefcase. The briefcase contains $20.

    In example 1, you know for certain that you're holding the larger briefcase. Therefore, your expected value looks like this:
    Do not switch: 1*198 = 198 (probability 1 that you will receive 198 dollars)
    Switch: 1*99 + 0*396 = 99 (we know that the other briefcase is the smaller one, and therefore holds $99, and we know that the other briefcase cannot be the larger value, which, in this case, would be $396)

    But in example 2, you don't know which briefcase you're holding. Therefore, your expected value looks like this:
    Do not switch: 1*20 = 20 (probability 1 that you will receive 20 dollars)
    Switch: .5*10 + .5*40 = 25 (probability .5 that you are holding the smaller case, which means the other case will hold 40, and probability .5 that you are holding the larger case, which means that the other case will hold 10)
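    The two examples can be written out as a small sketch (`ev_switch` is my name for it; the 50/50 split for cases under the bound is the assumption used in the examples above):

    ```python
    def ev_switch(x, bound=100):
        """Expected value of switching when the opened case holds x."""
        if x > bound:
            # The opened case must be the larger one (2Y), so the other holds x/2.
            return x / 2
        # Otherwise, treat smaller/larger as equally likely, as in example 2.
        return 0.5 * (x / 2) + 0.5 * (2 * x)

    print(ev_switch(198))  # 99.0  (example 1: switching loses)
    print(ev_switch(20))   # 25.0  (example 2: switching gains)
    ```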

    This was why I stipulated in my offer that the maximum value of Y would be "far below 100". If you pick a random number between 1 and infinity, you're going to invariably wind up with an obscenely large number. If you have a weighted probability distribution, that's going to affect the results. And if you tell the person that the values will never exceed X, then the person knows they have the case with the most money whenever it's found to exceed X/2.

    By picking a maximum value for X but keeping it secret, you can have a linear probability distribution without giving away any information (assuming either that the player doesn't play enough games to deduce the maximum, or that all simulations are run at once). If the maximum value is 30, that's "far below 100", but the player doesn't have enough information to deduce that a case with $16 or more in it is definitely the larger one.

    The problem is still there even if you don't tell the player that there's an upper bound on X. The trick of the problem is in the "random selection" of X. Truly random selection results in an equal chance in all cases that the other briefcase contains .5X or 2X, whereas any situation with an upper bound alters that probability (whether the player knows it or not), and thus fundamentally alters the problem.

    You can say that that's unrealistic, and that there is always an upper bound on X, and I would agree with you. My point, though, is that a sufficiently high upper bound of X results in the player wanting to switch in most cases - namely, in the 75% of cases where he cannot deduce, based on the upper bound of X, which briefcase he's opened.

    I will agree to this: if you provide an upper bound for X, but do not tell the player what it is and do not give him enough trials to deduce it (this is actually a pretty nasty sticking point, as even one trial lets him start weighting his switch/not-switch choice to beat the average), then he can't do better by strategically switching or not switching.
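    That point can be tested numerically: with a bounded uniform distribution, a player who knows the bound (and keeps any case over it) beats a player who always switches. A sketch, with names of my own choosing:

    ```python
    import random

    def play(should_switch, n=200_000, bound=100):
        """Average winnings per trial for a given switching strategy."""
        total = 0.0
        for _ in range(n):
            y = random.uniform(0, bound)   # the smaller value
            cases = [y, 2 * y]
            random.shuffle(cases)
            picked, other = cases
            total += other if should_switch(picked, bound) else picked
        return total / n

    def always_switch(x, b):
        return True

    def knows_bound(x, b):
        return x <= b  # any case over the bound must already be the larger one

    # knows_bound averages roughly 94 per trial, always_switch roughly 75.
    ```

    The always-switcher ends up no better than a never-switcher, while the informed player captures the 25% of trials where the opened case gives itself away.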

    Matrijs on
    Gooey (\/)┌¶─¶┐(\/) pinch pinch Registered User regular
    edited April 2008
    Matrijs wrote: »
    Daenris wrote: »
    Well, I just wrote up a quick program that picks a random value between 50 and 250 as the low value and randomly assigns it to case 1 or 2, then assigns twice that value to the other case.

    Running 1,000 trials in which I always pick case 1 and always switch, the total value was 229,949 and the total value of the other cases was 224,467. Running 10,000 trials yielded winnings of 2,255,928 with the other cases totaling 2,252,136. And running 100,000, the winnings were 22,451,448 and the other cases totaled 22,484,589.

    Let me know what changes you want made to the program, if any, to better model the idea, but I think this does it pretty well. If switching were always the better option, we should see a clear separation between the winnings and the other cases, because the simulation always switches.

    The problem with your model is the upper bound. In the actual problem, there is no upper bound on the amount of money that could be in the briefcase. If you add a fixed upper bound, as I note in my post on the previous page, there are situations like example two where, given complete information about the method of random selection, the player can know which briefcase he has after opening only one.

    This. If there is a finite upper bound to the amount that can be in the cases it changes the problem entirely.

    Gooey on