
# It's Math-Puzzle Time

## Posts

• Registered User regular
edited April 2008
Absurdist wrote: »
Nobody can solve my puzzlers. Do I win something?

10, 0, 3, 20, 16, 51, 32, 67, 74, ?
Seventy three is one answer, but any number with twelve letters would do. The number of letters increases by one each time.
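The letter-count pattern above is easy to check mechanically. A quick Python sketch (the name spellings are hardcoded by hand here, not from the thread):

```python
# Verify: each term's English name has one more letter than the previous one.
# Number names are hardcoded for clarity; spaces are not counted as letters.
names = {
    10: "ten", 0: "zero", 3: "three", 20: "twenty", 16: "sixteen",
    51: "fifty one", 32: "thirty two", 67: "sixty seven",
    74: "seventy four", 73: "seventy three",
}

def letters(n):
    """Count the letters in n's English name, ignoring spaces."""
    return sum(ch.isalpha() for ch in names[n])

seq = [10, 0, 3, 20, 16, 51, 32, 67, 74, 73]
print([letters(n) for n in seq])  # [3, 4, 5, 6, 7, 8, 9, 10, 11, 12]
```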

Absurdist wrote: »

1, 4, 7, 12, 15, 18, 21, 24, 27, ?
Isn't this the same problem?

Suckafish on
• Registered User regular
edited April 2008
Here is a fairly easy one.

If my three were a four, and my one were a three, what I am would be nine less than half what I'd be.
I am a three digit, whole number. What am I?

Suckafish on
• Registered User regular
edited April 2008
Suckafish wrote: »
Here is a fairly easy one.

If my three were a four, and my one were a three, what I am would be nine less than half what I'd be.
I am a three digit, whole number. What am I?
183

Savant on
• Registered User regular
edited April 2008
It has multiple answers, doesn't it?

Scooter on
• Registered User regular
edited April 2008
Suckafish wrote: »
Absurdist wrote: »
Nobody can solve my puzzlers. Do I win something?

10, 0, 3, 20, 16, 51, 32, 67, 74, ?
Seventy three is one answer, but any number with twelve letters would do. The number of letters increases by one each time.

Absurdist wrote: »

1, 4, 7, 12, 15, 18, 21, 24, 27, ?
Isn't this the same problem?
And either way, it's not a math puzzle.

SithDrummer on
It's an easy game to hate
• Registered User regular
edited April 2008
Absurdist wrote: »
Apothe0sis wrote: »
Absurdist wrote: »
Apothe0sis wrote: »
This is most assuredly not my series, and math geeks will probably see why this is funny.

2, 5, 877, 27644437, 35742549198872617291353508656626642567, 359334085968622831041960188598043661065388726959079837 ... ?

Usually when they jump right from small numbers into enormous numbers it means that the series is a list of numbers with an unusual property.
The first three are prime, and I will bet that the rest are too. But if I am right about the "unusual property" thing (i.e. if the series is not derived from performing operations on its members) then it's just a case of either recognizing these numbers or not recognizing them. I don't recognize them. Will you tell me if the series is operationally derived or not? Because if it is, then I will try to solve it. If it isn't, then I give up.
You probably have to recognise the series. It's conceivably solvable, but you'd have to have an Erdos number of 0 or something to do so.

REAL SPOILER
You do not want to solve it anyway. I gave it as a joke, because the next number has 6539 digits in it.

I'm just happy that I know what an Erdos number is.

Speaking of, wouldn't anyone with an Erdos number of 0 be dead, and thus unable to solve the problem?

FunkyWaltDogg on
Burnage wrote:
FWD is very good at this game.
• Registered User regular
edited April 2008
Scooter wrote: »
It has multiple answers, doesn't it?
Not for 3 digits, as far as I can tell.

More complex explanation:
x is our number to find, and it's got 3 digits a, b, and c. ---> x = 100a+10b+c.

y is the "new" number created by changing two digits in x, and leaving the third the same. In changing one digit from 1 to 3, for example, we are essentially just adding 2 to it. If that digit happens to be a, then we are adding 200 to x; if it is b, we are adding 20, and so on. There are six combinations of changes, resulting in six possible values to ultimately add to x in order to get y: 210, 201, 021, 120, 102, and 012.

We also know that x = (1/2)y-9. Solving for y, we get y = 2x+18.

So let's simplify. Our options are as follows:
x + 210 = y = 2x + 18 ---> x = 210 - 18
x + 201 = y = 2x + 18 ---> x = 201 - 18
x + 021 = y = 2x + 18 ---> x = 021 - 18
x + 120 = y = 2x + 18 ---> x = 120 - 18
x + 102 = y = 2x + 18 ---> x = 102 - 18
x + 012 = y = 2x + 18 ---> x = 012 - 18

Solving for x, we get six answers: 192, 183, 3, 102, 84, and -6. Only one has both a 1 and a 3 in its digits: 183.
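The algebra above can be confirmed by brute force. A short Python sketch (variable names are mine) that tries every placement of the 1 and the 3:

```python
# Try every 3-digit x containing a 1 and a 3; change the 1 to a 3 and the
# 3 to a 4 (the remaining digit stays fixed) to get y; keep x if x = y/2 - 9.
solutions = set()
for x in range(100, 1000):
    s = str(x)
    for i in range(3):          # position of the digit 1
        for j in range(3):      # position of the digit 3
            if i != j and s[i] == "1" and s[j] == "3":
                d = list(s)
                d[i], d[j] = "3", "4"
                y = int("".join(d))
                if x == y / 2 - 9:
                    solutions.add(x)
print(solutions)  # {183}
```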

Also, Paul Erdös is the only person with an Erdös number of 0, and he is dead. There are a lot of people still alive with an Erdös number of 1, however. And Natalie Portman has an Erdös number of 5.

SithDrummer on
It's an easy game to hate
• Registered User regular
edited April 2008
I just did it by plugging in numbers in different spots, and:

(143 + 9) * 2 = 304
(153 + 9) * 2 = 324
(163 + 9) * 2 = 344
(173 + 9) * 2 = 364
(183 + 9) * 2 = 384

Those all fit, right?

Edit: Oh, I didn't get that the 3rd number had to stay the same.

Scooter on
• Registered User regular
edited April 2008
Scooter wrote: »
I just did it by plugging in numbers in different spots, and:

(143 + 9) * 2 = 304
(153 + 9) * 2 = 324
(163 + 9) * 2 = 344
(173 + 9) * 2 = 364
(183 + 9) * 2 = 384

Those all fit, right?
I don't think that the third digit is allowed to change. That's the limitation that keeps it down to six possible additions.

SithDrummer on
It's an easy game to hate
• Moderator, ClubPA mod
edited April 2008
Scooter wrote: »
It has multiple answers, doesn't it?
Not for 3 digits, as far as I can tell.

More complex explanation:
x is our number to find, and it's got 3 digits a, b, and c. ---> x = 100a+10b+c.

y is the "new" number created by changing two digits in x, and leaving the third the same. In changing one digit from 1 to 3, for example, we are essentially just adding 2 to it. If that digit happens to be a, then we are adding 200 to x; if it is b, we are adding 20, and so on. There are six combinations of changes, resulting in six possible values to ultimately add to x in order to get y: 210, 201, 021, 120, 102, and 012.

We also know that x = (1/2)y-9. Solving for y, we get y = 2x+18.

So let's simplify. Our options are as follows:
x + 210 = y = 2x + 18 ---> x = 210 - 18
x + 201 = y = 2x + 18 ---> x = 201 - 18
x + 021 = y = 2x + 18 ---> x = 021 - 18
x + 120 = y = 2x + 18 ---> x = 120 - 18
x + 102 = y = 2x + 18 ---> x = 102 - 18
x + 012 = y = 2x + 18 ---> x = 012 - 18

Solving for x, we get six answers: 192, 183, 3, 102, 84, and -6. Only one has both a 1 and a 3 in its digits: 183.

Also, Paul Erdös is the only person with an Erdös number of 0, and he is dead. There are a lot of people still alive with an Erdös number of 1, however. And Natalie Portman has an Erdös number of 5.
I just used reason. The second number has to be roughly twice the first. Changing the first digit from a 3 to a 4 can't accomplish that, and leaving the first digit the same definitely can't accomplish that, so the first digit has to be a 1 in the first number and a 3 in the second. If the first digit goes from 1 to 3, the second digit will probably have to be fairly high, such that the first number is close to 200 and the second is close to 400 (since 400 is twice 200). 3->4 won't accomplish that, so that pair must go in the final digit, which means the second digit has to be both high and constant between the two numbers. I guessed 9 first - that didn't work, so I tried 8. Bingo. 183 -> 384.

I always like math problems that favor reason over brute force.

ElJeffe on
Maddie: "I named my feet. The left one is flip and the right one is flop. Oh, and also I named my flip-flops."

I make tweet.
• Registered User regular
edited April 2008
A group of people with assorted eye colors live on an island. They are all perfect logicians -- if a conclusion can be logically deduced, they will do it instantly. No one knows the color of their eyes. Every night at midnight, a ferry stops at the island. If anyone has figured out the color of their own eyes, they [must] leave the island that midnight. Everyone can see everyone else at all times and keeps a count of the number of people they see with each eye color (excluding themselves), but they cannot otherwise communicate. Everyone on the island knows all the rules in this paragraph.

On this island there are 100 blue-eyed people, 100 brown-eyed people, and the Guru (she happens to have green eyes). So any given blue-eyed person can see 100 people with brown eyes and 99 people with blue eyes (and one with green), but that does not tell him his own eye color; as far as he knows the totals could be 101 brown and 99 blue. Or 100 brown, 99 blue, and he could have red eyes.

The Guru is allowed to speak once (let's say at noon), on one day in all their endless years on the island. Standing before the islanders, she says the following:

"I can see someone who has blue eyes."

Who leaves the island, and on what night?

There are no mirrors or reflecting surfaces, nothing dumb. It is not a trick question, and the answer is logical. It doesn't depend on tricky wording or anyone lying or guessing, and it doesn't involve people doing something silly like creating a sign language or doing genetics. The Guru is not making eye contact with anyone in particular; she's simply saying "I count at least one blue-eyed person on this island who isn't me."

And lastly, the answer is not "no one leaves."

Shamelessly stolen from http://www.xkcd.com/blue_eyes.html

No one ever believes the reasoning behind the answer to this at first. It's like the whole "which door do you look behind" game-show puzzle in terms of the answer being counterintuitive. Speaking of...I should look that one up, too.

Moridin on
• Registered User regular
edited April 2008
Sounds like you still used some form of brute force, Jeffe - testing the 9, then the 8. I only tested a few more, for thoroughness.

SithDrummer on
It's an easy game to hate
• Moderator, ClubPA mod
edited April 2008
Sounds like you still used some form of brute force, Jeffe - testing the 9, then the 8. I only tested a few more, for thoroughness.

Well, I used a little brute force. But only after I'd whittled down the pool of options to a few possibilities. The alternative was figuring out how to set up a system of equations and solving for X, Y and Z. My way took about 3 minutes.

ElJeffe on
Maddie: "I named my feet. The left one is flip and the right one is flop. Oh, and also I named my flip-flops."

I make tweet.
• Registered User regular
edited April 2008
I love math!

SithDrummer on
It's an easy game to hate
• Starting to get dizzy Registered User regular
edited April 2008
Moridin wrote: »
Shamelessly stolen from http://www.xkcd.com/blue_eyes.html

No one ever believes the reasoning behind the answer to this at first. It's like the whole "which door do you look behind" game-show puzzle in terms of the answer being counterintuitive. Speaking of...I should look that one up, too.

That one's called the Monty Hall problem. I'll save you the trouble and post it:

You're asked to choose among three closed doors. Behind one door is a car, and behind the other two are goats. After you make your pick (though the door you picked is not opened, so you don't know what's behind it), the host opens one of the two doors you didn't pick. The host knows which door the car is behind, and he will always open a door with a goat behind it. He then gives you the option of switching your pick to the other unopened door. Assuming you want the car rather than a goat, should you switch, and why or why not?

Smasher on
• Registered User regular
edited April 2008
Smasher wrote: »
Moridin wrote: »
Shamelessly stolen from http://www.xkcd.com/blue_eyes.html

No one ever believes the reasoning behind the answer to this at first. It's like the whole "which door do you look behind" game-show puzzle in terms of the answer being counterintuitive. Speaking of...I should look that one up, too.

That one's called the Monty Hall problem. I'll save you the trouble and post it:

You're asked to choose among three closed doors. Behind one door is a car, and behind the other two are goats. After you make your pick (though the door you picked is not opened, so you don't know what's behind it), the host opens one of the two doors you didn't pick. The host knows which door the car is behind, and he will always open a door with a goat behind it. He then gives you the option of switching your pick to the other unopened door. Assuming you want the car rather than a goat, should you switch, and why or why not?

Argument start! Though simulations prove the following is correct:
You should always switch.

Reasoning: When you choose your original door, the probability of choosing the car is 1/3, so the probability of the car being behind one of the other two doors is 2/3. When one of the goats is revealed, that 2/3 probability doesn't change; it now rests entirely on the single remaining unopened door.

If you think I'm wrong, consider the following: 1000 doors, the host reveals 998 goats after you choose a door. You're gonna want to switch.
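The simulations mentioned above take only a few lines. A minimal Python sketch (trial count is an arbitrary choice), using the fact that when the host opens every goat door but one, switching wins exactly when your first pick was wrong:

```python
import random

def play(switch, doors=3, trials=100_000):
    """Estimate the win rate for always-switch vs. never-switch."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(doors)
        pick = random.randrange(doors)
        # The host opens all remaining goat doors but one, so switching
        # wins exactly when the original pick was a goat.
        wins += (pick != car) if switch else (pick == car)
    return wins / trials

print(f"stay:   {play(switch=False):.3f}")  # ~0.333
print(f"switch: {play(switch=True):.3f}")   # ~0.667
```

The 1000-door version works the same way: `play(True, doors=1000)` comes out near 0.999.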

Interestingly to me at least, Deal or No Deal is built on a similar principle, but allows you to open the biggest money case. I've never decided on an optimum strategy for that show, though it might be an interesting problem in itself.

enlightenedbum on
• Registered User regular
edited April 2008

Interestingly to me at least, Deal or No Deal is built on a similar principle, but allows you to open the biggest money case. I've never decided on an optimum strategy for that show, though it might be an interesting problem in itself.

Eh, deal or no deal is an exercise in expected value. And I'm pretty sure what the banker offers as a deal is always a bit less than the expected value, too.

Moridin on
• Registered User regular
edited April 2008
Moridin wrote: »

Interestingly to me at least, Deal or No Deal is built on a similar principle, but allows you to open the biggest money case. I've never decided on an optimum strategy for that show, though it might be an interesting problem in itself.

Eh, deal or no deal is an exercise in expected value. And I'm pretty sure what the banker offers as a deal is always a bit less than the expected value, too.

It's actually not. I did do that math at one point, trying to figure out how they were calculating it, figuring as you did. It's weighted by how many cases are left: more cases means the offer is over expected value, and as the number decreases it goes below.
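For what it's worth, the expected value itself is trivial to compute; whether the banker's offer sits above or below it is the empirical part. A sketch (the dollar amounts here are made up, not the show's actual board):

```python
# Fair value of your case = the mean of the amounts still in play.
def expected_value(remaining):
    return sum(remaining) / len(remaining)

remaining = [0.01, 100, 1_000, 50_000, 1_000_000]
print(expected_value(remaining))  # about 210220.00
```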

enlightenedbum on
• Moderator, ClubPA mod
edited April 2008
I started a thread on Deal Or No Deal shortly after it debuted to discuss optimum strategies. It didn't get much interest.

ElJeffe on
Maddie: "I named my feet. The left one is flip and the right one is flop. Oh, and also I named my flip-flops."

I make tweet.
• Registered User regular
edited April 2008
I taught the monty hall problem today in Algebra.

In other news, I need to know why that eye problem is tricky...

Can't they see 100 blue-eyed people, all the time? That doesn't help them know their own eye color. I don't think I understand the problem.

musanman on
• Registered User
edited April 2008
Absurdist wrote: »
Here is another tricky one!

What's the next number in this sequence?

10, 0, 3, 20, 16, 51, 32, 67, 74, ?

Wow, that's a killer. It must be something like "C(x) = C(x-1) + C(x-2)" or some weird sequence.

Good Puzzle!

Laudani on
• Registered User regular
edited April 2008
Moridin wrote: »
...
"I can see someone who has blue eyes."

Who leaves the island, and on what night?
I think I have it. On the 100th night after the guru speaks (including the night of the day she does speak), all people with blue eyes will leave.

I reduced it to one person with blue eyes. They'd obviously see nobody else with blue eyes, know their eyes are blue, and leave on the first night.

2 people. They both see one person with blue eyes. That person doesn't leave the first night and so they conclude there must be somebody else with blue eyes. They both leave on night #2.

3 people. After two nights they conclude there must be three, and that they are one of the three, so all three leave on night #3.

... and so on.

Suckafish on
• Registered User regular
edited April 2008
I really enjoyed the blue-eyes problem - if you want to follow other people's trains of thought on it, I'd check this thread out first.

SithDrummer on
It's an easy game to hate
• Registered User regular
edited April 2008
I think I just figured out the blue eyes one. It's pretty weird and unintuitive.

Edit: Yep, I was right. You just have to approach it the right way and it falls out.

Savant on
• (\/)┌¶─¶┐(\/) pinch pinch Registered User regular
edited April 2008
ElJeffe wrote: »
I started a thread on Deal Or No Deal shortly after it debuted to discuss optimum strategies. It didn't get much interest.

I find it amazing that people will get an offer of 250,000 or whatever and turn it down because they are convinced they have the 1,000,000 case. If I were on a show like that I would go long enough to get rid of my debt, if not take a huge chunk out of it, and then quit.

Gooey on
• Moderator, ClubPA mod
edited April 2008
Gooey wrote: »
I find it amazing that people will get an offer of 250,000 or whatever and turn it down because they are convinced they have the 1,000,000 case. If I were on a show like that I would go long enough to get rid of my debt, if not take a huge chunk out of it, and then quit.

Most people probably think the same thing, before being wooed by HUGE MONIES (that they probably won't get, but whatever, details).

ElJeffe on
Maddie: "I named my feet. The left one is flip and the right one is flop. Oh, and also I named my flip-flops."

I make tweet.
• regular
edited April 2008
One little counterintuitive riddle that I always find interesting to see how people react is this:

You're offered two briefcases. One contains twice as much money as the other. You pick briefcase A, and it's shown to have $X in it. You're then offered the chance to switch to briefcase B. Should you?

Aroduc on
• Registered User regular
edited April 2008
Logically, you should. You stand to lose 50% of your winnings with an equivalent chance to increase them by 100%.

SithDrummer on
It's an easy game to hate
• Registered User regular
edited April 2008
Aroduc wrote: »
One little counterintuitive riddle that I always find interesting to see how people react is this:

You're offered two briefcases. One contains twice as much money as the other. You pick briefcase A, and it's shown to have $X in it. You're then offered the chance to switch to briefcase B. Should you?
edit: I'm pretty sure this is wrong.

If you are trying to maximize expected value, then yes you always switch. $(2X + .5X) / 2 = $1.25X > $X. You have an equal chance of losing $.5X as gaining $X.

Savant on
• Moderator, ClubPA mod
edited April 2008
Logically, you should. You stand to lose 50% of your winnings with an equivalent chance to increase them by 100%.
This is true whether you open the case or not. You could just say, "Given this unopened case, I stand to either gain X or lose X/2 by switching."

I understand the "logic", but I would like to see a simulation run to empirically demonstrate this, because it's not merely counterintuitive, it's nonsensical.

edit: I guess the problem is that if it's true that you're better off swapping to case B after picking case A, it's also true that you're better off swapping to case A after picking case B. Thus, you're better off ending up with case B and also better off ending up with case A, which are mutually exclusive statements.

ElJeffe on
Maddie: "I named my feet. The left one is flip and the right one is flop. Oh, and also I named my flip-flops."

I make tweet.
• Registered User regular
edited April 2008
ElJeffe wrote: »
This is true whether you open the case or not. You could just say, "Given this unopened case, I stand to either gain X or lose X/2 by switching."

I understand the "logic", but I would like to see a simulation run to empirically demonstrate this, because it's not merely counterintuitive, it's nonsensical.

edit: I guess the problem is that if it's true that you're better off swapping to case B after picking case A, it's also true that you're better off swapping to case A after picking case B. Thus, you're better off ending up with case B and also better off ending up with case A, which are mutually exclusive statements.
Actually, now that I think about it I was wrong. The cases contain Y and 2Y dollars.

Switching between the two always risks Y dollars. If you see X in one of them, you are never going to have a chance at getting Y/2 or 4Y; Y dollars is always what is at stake. If X=Y then you are always going to gain Y dollars by switching, and if X=2Y you will always lose Y dollars. So switching is even money.

Savant on
• Registered User regular
edited April 2008
ElJeffe wrote: »
edit: I guess the problem is that if it's true that you're better off swapping to case B after picking case A, it's also true that you're better off swapping to case A after picking case B. Thus, you're better off ending up with case B and also better off ending up with case A, which are mutually exclusive statements.
This seems to be a problem with wording. You are not truly "better off" switching if you already have the high case, but with the limited information available you have a better risk-benefit ratio by switching, even if ultimately it is not a good choice.

This actually seems similar to the blue-eyes problem in one respect: you begin with a catalyst action already fulfilled. In the blue-eyes problem, the guru has declared the existence of blue eyes, beginning the problem. In this one, you've already picked one case, giving you a starting position and dilemma - stay or switch. If you hadn't chosen anything yet, then the two cases are functionally identical and your probability is 1/2 either way.

Savant, you're saying that an equivalent value Y is at risk in either direction. The problem is that it depends on whether or not your case is the more valuable one. If X is the high case, X = 2Y and you're risking Y for a perceived chance at 4Y. If X is the low case, X = Y and you're risking Y/2 for a perceived chance at 2Y. But no matter what, you're risking X/2 for 2X, and how Y and X are related is ultimately irrelevant. You've simply changed the variable's name.

SithDrummer on
It's an easy game to hate
• Starting to get dizzy Registered User regular
edited April 2008
It seems counterintuitive because there are two separate situations you could be in. Let's say the briefcase has $100 in it. In one situation the other briefcase has $200, while in the other situation it has $50. Since the two situations are presumably equally likely, your expected value is higher for switching than it is for staying.

Let's contrast this problem with a similar one that has a different answer. Suppose you're told the briefcases have Y and 2Y dollars in them. In this case you'll either gain or lose Y dollars by switching, and so your expected value for switching is 0 (ie, it doesn't matter if you do or not). This holds whether you know the value of Y or not.

The difference between the two scenarios is that in the latter one you know the value of both briefcases, while in the former you don't. This frequently causes confusion when people treat the first one as if it were the second one, which it is not.

BTW, don't look at ratios, they'll mess you up. All that matters is the absolute gain or loss you get from switching.

Smasher on
• regular
edited April 2008
Hee hee... I told you people would give interesting answers. Some completely wrong ones in there, but it makes the brain do strange things.

Aroduc on
• Registered User regular
edited April 2008
Smasher wrote: »
It seems counterintuitive because there are two separate situations you could be in. Let's say the briefcase has $100 in it. In one situation the other briefcase has $200, while in the other situation it has $50. Since the two situations are presumably equally likely, your expected value is higher for switching than it is for staying.

Let's contrast this problem with a similar one that has a different answer. Suppose you're told the briefcases have Y and 2Y dollars in them. In this case you'll either gain or lose Y dollars by switching, and so your expected value for switching is 0 (ie, it doesn't matter if you do or not). This holds whether you know the value of Y or not.

The difference between the two scenarios is that in the latter one you know the value of both briefcases, while in the former you don't. This frequently causes confusion when people treat the first one as if it were the second one, which it is not.

BTW, don't look at ratios, they'll mess you up. All that matters is the absolute gain or loss you get from switching.

Yeah, this problem is the second case. There are two briefcases at the start, and one has twice as much money as the other. So they start at Y and 2Y, but you just don't know which is which or what Y is. When you open one you know that X=Y or X=2Y, but that has no effect on how much you stand to gain or lose by switching between them. By switching you will either gain X (if X=Y) or lose X/2 (if X=2Y), but you don't have an equal chance of those two outcomes. It is set that it is one or the other; you just don't know which.

Savant on
• Moderator, ClubPA mod
edited April 2008
I'm going to go all thought-experiment on this one. Pretend you write a computer simulation to model the act of picking a case, then deciding whether or not to switch. In the first case, we write it to not show the computer person how much is in the case. Since the argument is that switching is statistically better, we'll run two simulations - one in which the computer always switches, and one in which the computer never switches.

Functionally, it'll work as follows:

Case I:
- Computer chooses A or B.
- Computer asked if he wants to switch.
- Computer keeps initial choice.

Case II:
- Computer chooses A or B.
- Computer asked if he wants to switch.
- Computer changes.

I think it's pretty clear here that picking A then switching to B is indistinguishable from picking B at the outset, and picking B then switching to A is indistinguishable from picking A at the outset - there's no reason why it wouldn't be. If we ran this simulation, we would find no statistical discrepancy in how much money each computer person expected to get.

Okay, now let's recreate the situation in the puzzle. We add a step in there whereby we "show" the computer the contents of his initial choice. It's easy to see that adding such a step would have no effect on the internal logic of the first simulation - if nothing functionally changes in the simulation, then the simulation would have to yield identical results. ie, there would be no statistical change between keeping the initial choice and switching.
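Here is one way the simulation described above might look in Python (the dollar range and trial count are arbitrary assumptions, not part of the puzzle):

```python
import random

def trial(switch):
    """One round: two cases hold y and 2y dollars; return what you end with."""
    y = random.uniform(1, 100)       # the smaller amount, unknown to the player
    cases = [y, 2 * y]
    random.shuffle(cases)
    return cases[1] if switch else cases[0]

n = 100_000
stay = sum(trial(False) for _ in range(n)) / n
swap = sum(trial(True) for _ in range(n)) / n
print(stay, swap)  # the two averages come out the same: switching gains nothing
```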

ElJeffe on
Maddie: "I named my feet. The left one is flip and the right one is flop. Oh, and also I named my flip-flops."

I make tweet.
• Registered User regular
edited April 2008
@ Savant: That only works if you maintain that X is a variable, but once you've opened the case, it's not.

@ ElJeffe: Your thought experiment begins with the choice of A or B, which has already been made in the original question. Choosing to switch or stay when given a tangible piece of the Y/2Y is the dilemma.

SithDrummer on
It's an easy game to hate
• Registered User regular
edited April 2008
@ Savant: That only works if you maintain that X is a variable, but once you've opened the case, it's not.

Opening the case has no effect on how much money is in them, or what the probability of choosing the bigger case is. It just changes how much you know about the possible amounts of money in the cases.

If you started out with a briefcase with X dollars, and then were offered to switch with equal chances of doubling or halving your money, then you would always want to switch. But you don't have equal chances of doubling or halving in this problem. It is set with probability 1 what you will gain or lose by switching.

Savant on
• Registered User regular
edited April 2008
Moridin wrote: »
Shamelessly stolen from http://www.xkcd.com/blue_eyes.html

Didn't read through yet, so I don't know if someone's done this already:
All the blue-eyed people will leave on the 100th night.

Proof by induction. Suppose that there are n blue-eyed people on the island.

Claim: all n people will leave the island on the nth night.

Base case: n = 1
The one blue-eyed person will notice on the 1st day that he can see no blue-eyed people on the island. However, there must be at least one, so he deduces that he must have blue eyes and leaves on the 1st night.

Induction:
Suppose we have n blue-eyed people on the island (n > 1), and assume that the claim is true for n - 1 blue-eyed people.

Take an arbitrary blue-eyed person. He sees n - 1 blue-eyed people. He knows that if he does not have blue eyes, then by the induction hypothesis, the n - 1 blue-eyed people will leave on the (n - 1)th night. However, every blue-eyed person is thinking the same thing, so none of them will leave on the (n - 1)th night, leading this person to conclude that there must be more than n - 1 blue-eyed people on the island.

Thus this blue-eyed person realizes that he must have blue eyes and so leaves the island the next night, the nth night. So all the blue-eyed people leave the island on the nth night, completing the induction.

Also noteworthy is that if there were only blue-eyed and brown-eyed people on the island, then after all the blue-eyed people had left on the nth night, all the brown-eyed people would then leave the next night, since logically none of them could have blue eyes since otherwise the blue-eyed people would have waited. However, this doesn't happen since they know of at least two other possible colours (brown and green).
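The induction can also be sanity-checked mechanically. A small Python sketch (my own encoding of the deduction rule, not from the thread): an islander who can see b blue-eyed people, and has seen nobody leave so far, concludes on day b+1 that their own eyes must be blue.

```python
def first_departure(colors):
    """Return (night, how many leave, whether every leaver is blue-eyed).

    Deduction rule from the induction above: an islander who can see b
    blue-eyed people, with no departures through night b, learns on day
    b+1 that they themselves must have blue eyes."""
    n = len(colors)
    night = 0
    while True:
        night += 1
        leavers = [i for i in range(n)
                   if sum(colors[j] == "blue" for j in range(n) if j != i)
                   == night - 1]
        if leavers:
            return night, len(leavers), all(colors[i] == "blue" for i in leavers)

island = ["blue"] * 100 + ["brown"] * 100 + ["green"]
print(first_departure(island))  # (100, 100, True): all 100 blue-eyed leave night 100
```

The brown-eyed islanders each see 100 blue pairs, so their deduction night would be 101; the blues' departure on night 100 arrives first, exactly as the proof says.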

Smug Duckling on
• Registered User regular
edited April 2008
Apothe0sis wrote: »
2, 5, 877, 27644437, 35742549198872617291353508656626642567, 359334085968622831041960188598043661065388726959079837 ... ?

They're the primes in the Bell sequence. http://en.wikipedia.org/wiki/Bell_number
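The small terms are easy to reproduce: Bell numbers grow via the Bell triangle. A Python sketch (verifying the primality of the two giant quoted terms would need more machinery, so only the first four are matched here):

```python
def bell_numbers(n):
    """First n Bell numbers B_0..B_{n-1}, built with the Bell triangle:
    each row starts with the previous row's last entry, and each later
    entry adds the entry above-left."""
    row, bells = [1], [1]
    for _ in range(n - 1):
        new = [row[-1]]
        for x in row:
            new.append(new[-1] + x)
        row = new
        bells.append(row[0])
    return bells

bells = bell_numbers(20)
print(bells[:8])  # [1, 1, 2, 5, 15, 52, 203, 877]
print([bells.index(b) for b in (2, 5, 877, 27644437)])  # [2, 3, 7, 13]
```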

Marty81 on