Our new Indie Games subforum is now open for business in G&T. Go and check it out; you might land a code for a free game. If you're developing an indie game and want to post about it, follow these directions. If you don't, he'll break your legs! Hahaha! Seriously though.

Our rules have been updated and given their own forum. Go and look at them! They are nice, and there may be new ones that you didn't know about! Hooray for rules! Hooray for The System! Hooray for Conforming!

## Posts

If my three were a four, and my one were a three, what I am would be nine less than half what I'd be.

I am a three digit, whole number. What am I?

Speaking of, wouldn't anyone with an Erdős number of 0 be dead, and thus unable to solve the problem?

More complex explanation:

Let the three digits be a, b, and c, so that x = 100a + 10b + c. y is the "new" number created by changing two digits in x and leaving the third the same. In changing one digit from 1 to 3, for example, we are essentially just adding 2 to it; likewise, changing the 3 to a 4 adds 1 in whatever place it sits. If the 1-to-3 change happens to land on a, then we are adding 200 to x; if it is b, we are adding 20, and so on. There are six combinations of changes, resulting in six possible values to ultimately add to x in order to get y: 210, 201, 021, 120, 102, and 012. We also know that x = (1/2)y - 9. Solving for y, we get y = 2x + 18.

So let's simplify. Our options are as follows:

x + 210 = y = 2x + 18 ---> x = 210 - 18
x + 201 = y = 2x + 18 ---> x = 201 - 18
x + 021 = y = 2x + 18 ---> x = 021 - 18
x + 120 = y = 2x + 18 ---> x = 120 - 18
x + 102 = y = 2x + 18 ---> x = 102 - 18
x + 012 = y = 2x + 18 ---> x = 012 - 18

Solving for x, we get six answers: 192, 183, 3, 102, 84, and -6. Only one has both a 1 and a 3 among its digits: 183.
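If you'd rather make the computer do the whittling, here's a quick brute-force sketch in Python. It assumes the reading above (the 3 becomes a 4 and the 1 becomes a 3, with the change applied to the first occurrence of each digit):

```python
def solve():
    """Brute-force the riddle over all three-digit numbers."""
    hits = []
    for x in range(100, 1000):
        s = str(x)
        if '1' in s and '3' in s:
            # "my three were a four, and my one were a three"
            y = int(s.replace('3', '4', 1).replace('1', '3', 1))
            # "what I am would be nine less than half what I'd be"
            if x == y / 2 - 9:
                hits.append(x)
    return hits

print(solve())  # → [183]
```

Same answer as the algebra: only 183 survives.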

Also, Paul Erdős is the only person with an Erdős number of 0, and he is dead. There are a lot of people still alive with an Erdős number of 1, however. And Natalie Portman has an Erdős number of 5.

(143 + 9) * 2 = 304

(153 + 9) * 2 = 324

(163 + 9) * 2 = 344

(173 + 9) * 2 = 364

(183 + 9) * 2 = 384

Those all fit, right?

Edit: Oh, I didn't get that the 3rd digit had to stay the same.

I always like math problems that favor reason over brute force.

I make tweet.

Shamelessly stolen from http://www.xkcd.com/blue_eyes.html

No one ever believes the reasoning behind the answer to this at first. It's like the whole "which door do you look behind" game-show puzzle in terms of the answer being counterintuitive. Speaking of...I should look that one up, too.

Well, I used a little brute force. But only after I'd whittled down the pool of options to a few possibilities. The alternative was figuring out how to set up a system of equations and solving for X, Y and Z. My way took about 3 minutes.


That one's called the Monty Hall problem. I'll save you the trouble and post it:

You're asked to choose among three closed doors. Behind one door is a car, and behind the other two are goats. After you make your pick (though the door you picked is not opened, so you don't know what's behind it), the host opens one of the two doors you didn't pick. The host knows which door the car is behind, and he will always open a door with a goat behind it. He then gives you the option of switching your pick to the other unopened door. Assuming you want the car rather than a goat, should you switch, and why or why not?

Argument start! Though simulations prove the following is correct:

Reasoning: When you choose your original door, the probability of choosing the car is 1/3, so the probability of the car being behind one of the other two doors is 2/3. When one of the goats is revealed, that probability doesn't change, so the single remaining unchosen door now carries the whole 2/3 probability of having the car behind it.

If you think I'm wrong, consider the following: 1000 doors, the host reveals 998 goats after you choose a door. You're gonna want to switch.
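For the skeptics, the simulations mentioned above are easy to run yourself. A minimal Monte Carlo sketch in Python (door numbering is arbitrary):

```python
import random

def monty_hall(switch, trials=100_000):
    """Estimate the win probability when you always stay or always switch."""
    wins = 0
    for _ in range(trials):
        doors = [0, 1, 2]
        car = random.choice(doors)
        pick = random.choice(doors)
        # host opens a goat door that isn't the player's pick
        opened = next(d for d in doors if d != pick and d != car)
        if switch:
            # switch to the one door that is neither picked nor opened
            pick = next(d for d in doors if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials
```

Running it, staying wins about a third of the time and switching about two thirds, matching the 1/3 vs. 2/3 argument above.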

Interestingly to me at least, Deal or No Deal is built on a similar principle, but allows you to open the biggest money case. I've never decided on an optimum strategy for that show, though it might be an interesting problem in itself.

Loose: about to slip, to release (Ex: "That knot is loose." "Loose arrows.")

Eh, Deal or No Deal is an exercise in expected value. And I'm pretty sure what the banker offers as a deal is always a bit less than the expected value, too.

It's actually not. I did that math at one point, trying to figure out how they calculate it, figuring as you did. The offer is weighted by how many cases are left: with more cases it sits over the expected value, and as the number decreases it drops below.



In other news, I need to know why that eye problem is tricky...

Can't he see 100 blue-eyed people? All the time? That doesn't help them know their own eye color. I don't think I understand the problem.

Wow, that's a killer; it must be like "C(x) = C(x-1) + C(x-2)" or some weird sequence.

Good Puzzle!

I reduced it to one person with blue eyes. They'd obviously see nobody else with blue eyes, know their eyes are blue, and leave on the first night.

2 people. They both see one person with blue eyes. That person doesn't leave the first night and so they conclude there must be somebody else with blue eyes. They both leave on night #2.

3 people. After two nights they conclude there must be three, and that they are one of the three, so all three leave on night #3.

... and so on.

Edit: Yep, I was right. You just have to approach it the right way and it falls out.

I find it amazing that people will get an offer of 250,000 or whatever and turn it down because they are convinced they have the 1,000,000 case. If I were on a show like that I would go long enough to get rid of my debt, if not take a huge chunk out of it, and then quit.

Most people probably think the same thing, before being wooed by HUGE MONIES (that they probably won't get, but whatever, details).


You're offered two briefcases. One contains twice as much money as the other. You pick briefcase A, and it's shown to have $X in it. You're then offered the chance to switch to briefcase B. Should you?

If you are trying to maximize expected value, then yes, you always switch: (2X + 0.5X) / 2 = 1.25X > X. You have an equal chance of losing 0.5X as of gaining X.

This is true whether you open the case or not. You could just say, "Given this unopened case, I stand to either gain X or lose X/2 by switching."

I understand the "logic", but I would like to see a simulation run to empirically demonstrate this, because it's not merely counterintuitive, it's nonsensical.

edit: I guess the problem is that if it's true that you're better off swapping to case B after picking case A, it's also true that you're better off swapping to case A after picking case B. Thus, you're better off ending up with case B and also better off ending up with case A which are mutually exclusive statements.


Actually, now that I think about it I was wrong.

This actually seems similar to the blue-eyes problem in one respect: you begin with a catalyst action already fulfilled. In the blue-eyes problem, the guru has declared the existence of blue eyes, beginning the problem. In this one, you've already picked one case, giving you a starting position and dilemma - stay or switch. If you hadn't chosen anything yet, then the two cases are functionally identical and your probability is 1/2 either way.

Savant, you're saying that an equivalent value Y is at risk in either direction. The problem is that it depends on whether or not your case is the more valuable one. If X is the high case, X = 2Y and you're risking Y for a perceived chance at 4Y. If X is the low case, X = Y and you're risking Y/2 for a perceived chance at 2Y. But no matter what, you're risking X/2 for 2X, and how Y and X are related is ultimately irrelevant. You've simply changed the variable's name.

Let's contrast this problem with a similar one that has a different answer. Suppose you're told the briefcases have Y and 2Y dollars in them. In this case you'll either gain or lose Y dollars by switching, and so your expected value for switching is 0 (i.e., it doesn't matter if you do or not). This holds whether you know the value of Y or not.

The difference between the two scenarios is that in the latter one you know the value of both briefcases, while in the former you don't. This frequently causes confusion when people treat the first one as if it were the second one, which it is not.

BTW, don't look at ratios, they'll mess you up. All that matters is the absolute gain or loss you get from switching.

Yeah, this problem is the second case. There are two briefcases at the start, and one has twice as much money as the other. So they start at Y and 2Y; you just don't know which is which, or what Y is. When you open one, you know that X = Y or X = 2Y, but that has no effect on how much you stand to gain or lose by switching. You will either gain X or lose X/2 by switching, but you don't have an equal chance of those two outcomes: it is set that it is one or the other, you just don't know which.

Functionally, it'll work as follows:

Case I:

- Computer chooses A or B.

- Computer asked if he wants to switch.

- Computer keeps initial choice.

Case II:

- Computer chooses A or B.

- Computer asked if he wants to switch.

- Computer changes.

I think it's pretty clear here that picking A then switching to B is indistinguishable from picking B at the outset, and picking B then switching to A is indistinguishable from picking A at the outset - there's no reason why it wouldn't be. If we ran this simulation, we would find no statistical discrepancy in how much money each computer person expected to get.

Okay, now let's recreate the situation in the puzzle. We add a step in there whereby we "show" the computer the contents of his initial choice. It's easy to see that adding such a step would have no effect on the internal logic of the first simulation - if nothing functionally changes in the simulation, then the simulation would have to yield identical results. I.e., there would be no statistical change between keeping the initial choice and switching.
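That thought experiment is easy to actually run. A minimal version in Python, with the smaller amount drawn uniformly (an arbitrary prior; the puzzle doesn't specify one):

```python
import random

def play(switch, trials=200_000):
    """Average winnings when you always stay or always switch."""
    total = 0.0
    for _ in range(trials):
        y = random.uniform(1, 100)   # smaller amount (assumed prior)
        cases = [y, 2 * y]
        random.shuffle(cases)
        # cases[0] is the initial pick; its contents being "shown"
        # changes nothing about what's in either case
        total += cases[1] if switch else cases[0]
    return total / trials
```

Both averages come out around 1.5 times the mean of Y, with no statistical edge for switching.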


@ ElJeffe: Your thought experiment begins with the choice of A or B, which has already been made in the original question. Choosing to switch or stay when given a tangible piece of the Y/2Y is the dilemma.

Opening the case has no effect on how much money is in them, or what the probability of choosing the bigger case is. It just changes how much you know about the possible amounts of money in the cases.

If you started out with a briefcase with X dollars, and then were offered to switch with equal chances of doubling or halving your money, then you would always want to switch. But you don't have equal chances of doubling or halving in this problem. It is set with probability 1 what you will gain or lose by switching.

Didn't read through yet, so I don't know if someone's done this already:

Proof by induction. Suppose that there are n blue-eyed people on the island.

Claim: all n people will leave the island on the nth night.

Base case: n = 1

The one blue-eyed person will notice on the 1st day that he can see no blue-eyed people on the island. However, there must be at least one, so he deduces that he must have blue eyes and leaves on the 1st night.

Induction:

Suppose we have n blue-eyed people on the island (n > 1), and assume that the claim is true for n - 1 blue-eyed people.

Take an arbitrary blue-eyed person. He sees n - 1 blue-eyed people. He knows that if he does not have blue eyes, then by the induction hypothesis, the n - 1 blue-eyed people will leave on the (n - 1)th night. However, every blue-eyed person is thinking the same thing, so none of them will leave on the (n - 1)th night, leading this person to conclude that there must be more than n - 1 blue-eyed people on the island.

Thus this blue-eyed person realizes that he must have blue eyes and so leaves the island the next night, the nth night. So all the blue-eyed people leave the island on the nth night, completing the induction.

Also noteworthy is that if there were only blue-eyed and brown-eyed people on the island, then after all the blue-eyed people had left on the nth night, all the brown-eyed people would leave the next night, since logically none of them could have blue eyes - otherwise the blue-eyed people would have waited. However, this doesn't happen, since they know of at least two other possible colours (brown and green).
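The induction can also be phrased as a tiny possible-worlds program. This is a toy model from one blue-eyed islander's point of view (not the full common-knowledge analysis): he considers two worlds possible, and each uneventful night eliminates one.

```python
def departure_night(n_blue):
    """Night on which the blue-eyed islanders leave, per the induction."""
    seen = n_blue - 1
    # Two possible worlds: "my eyes aren't blue" (count = seen) or
    # "they are" (count = seen + 1); the guru rules out a count of zero.
    worlds = {c for c in (seen, seen + 1) if c >= 1}
    night = 0
    while len(worlds) > 1:
        night += 1
        # If the true count were `night`, those islanders would leave
        # tonight; when nobody leaves, that world is eliminated.
        worlds.discard(night)
    # Only one world remains: everyone leaves on the night equal to it.
    return min(worlds)

print(departure_night(100))  # → 100
```

The base case (a lone blue-eyed person leaves on night 1) and the general case (n leave on night n) both fall out of the same loop.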

:^: