Okay, so I'm reviewing for an Algebra exam and there is a question on the review packet that I really do not understand how to do. It's probably ridiculously simple, but I really can't seem to figure it out.
Michelle wants to build a swimming pool surrounded by a sidewalk of uniform width. She wants the dimensions of the pool and sidewalk to be 16 meters by 20 meters. If the pool must have an area of 192 square meters, how wide should the sidewalk be?
I'm going to try not to do the problem for you, but just give you a hint.
Think of it as two rectangles, one (the pool) inside the other, with the border between them forming the sidewalk. You know the sidewalk has a constant width, you know the maximum dimensions, and you know how big the area of the pool is, so you should be able to write a formula expressing the relationship between all the given information and solve for the width of the sidewalk.
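Then label what you know.
Label what you don't know with variables.
Express what you don't know in terms of what you do know.
Solve.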
Three (at least) equations, three unknowns. You know the length of the pool plus twice the width of the sidewalk is 20 meters, you know the width of the pool plus twice the width of the sidewalk is 16 meters, and you know the width of the pool times the length of the pool is 192 square meters. So:
L + 2d = 20
W + 2d = 16
L * W = 192
You know the pool is 4 meters longer than it is wide, so you can solve the last equation for either L or W and substitute that value into whichever of the first two equations is appropriate to solve for d.
Ah, I got it, thanks! The part I forgot about was coming up with the three equations. I was trying to find a way to get to a quadratic equation, but the way the question was written doesn't make it as directly obvious as I'm used to.
Here's what I got:
L + 2d = 20
W + 2d = 16
L * W = 192
In which L is the Length of the Pool, W is the Width of the Pool, and d is the Width of the Sidewalk
L = 20 - 2d
W = 16 - 2d
L * W = 192, therefore
(20-2d)(16-2d)=192
Expanding gives me
4d^2 - 72d + 128 = 0
Factoring gives me
(2d - 32)(2d - 4) = 0
Therefore, d can equal either 16 or 2
Plugging those values in, we find that L = -12 when d = 16, therefore d cannot be 16 as you cannot have a negative length
Therefore, as d is equal to 2, the width of the sidewalk is 2 meters.
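Is this correct?
L + 2d = 20
W + 2d = 16
L * W = 192
d = 2, so...
L + 4 = 20 -> L = 16
W + 4 = 16 -> W = 12
16 * 12 = 192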
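If you want to double-check the algebra numerically, here is a minimal Python sketch (mine, not from the thread) that solves 4d^2 - 72d + 128 = 0 with the quadratic formula and tests both roots:

import math

a, b, c = 4.0, -72.0, 128.0
disc = math.sqrt(b * b - 4 * a * c)
roots = [(-b + disc) / (2 * a), (-b - disc) / (2 * a)]
print(roots)  # [16.0, 2.0]

for d in roots:
    L, W = 20 - 2 * d, 16 - 2 * d  # pool length and width for this sidewalk width
    print(d, L, W, L * W)  # d = 16 gives negative sides; only d = 2 gives a real 16 x 12 pool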
You are given a choice of two envelopes. One contains twice the amount of money as the other, but you don't know their values. You select one envelope. Should you switch to the other if given the opportunity?
e = statistically expected value of the other envelope
v = value of the envelope you have now
e = 0.5(2v) + 0.5(0.5v)
That is, there is a 0.5 probability that the other envelope contains twice what yours does, and a 0.5 probability that it contains half of what yours does. Multiply each probability by the corresponding amount and add the results to get the expected value of the other envelope.
It simplifies to this:
e = 1.25v
So the expected value of the other envelope is 1.25v, or 1.25 times the value of the envelope you have now.
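(For example, if v = $100, then e = 0.5($200) + 0.5($50) = $125.)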
Should you switch? If so, you are given the opportunity to switch back. Should you do that?
PhD student in statistics here. I know my shit.
There's a problem in your formula for e. It should read e = 0.5(v) + 0.5(2v).
Also, this is similar to the famous question found in the movie 21: if the grand prize is behind one door (door 1, 2, or 3), and you pick one, should you switch after the host reveals an incorrect door (one that you didn't choose)?
In any case, use a Bayesian approach to solve the door problem. After some math kung fu, you see that the new information shown to you by the host means you should switch doors.
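EDIT: This is the Monty Hall Problem for those who are interested.
In Doc's case, he only has 2 envelopes, and revealing more information ends the game. So in any case, his chances are only 50/50.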
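For the door game, here is a quick Monte Carlo sketch (my own code, not from the thread, assuming the standard rules: the host knows where the prize is and always opens a losing door you didn't pick):

import random

def play(switch):
    doors = [0, 0, 0]
    doors[random.randrange(3)] = 1  # 1 marks the prize door
    pick = random.randrange(3)
    # The host opens a door that is neither your pick nor the prize.
    host = next(d for d in range(3) if d != pick and doors[d] == 0)
    if switch:
        pick = next(d for d in range(3) if d != pick and d != host)
    return doors[pick] == 1

trials = 100_000
print(sum(play(True) for _ in range(trials)) / trials)   # ~0.667 when switching
print(sum(play(False) for _ in range(trials)) / trials)  # ~0.333 when staying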
You pretty much outlined the statistical answer of yes to both switching and switching back.
Another reason to favour switching is that the most you can lose is v/2, while you stand to gain v and double your money.
edit: folken, I do not think he made a mistake in his formula. He has a 1/2 chance to double his money, 1/2(2v), and a 1/2 chance to lose half his money, so 1/2(v/2).
Well, imagine there is no initial choice. You just have v dollars and some dude offers to trade you an envelope that has a 50% chance of 2v dollars and a 50% chance of 0.5v dollars. You should switch. But you shouldn't switch back, because the expected value of the unknown envelope is larger than the original v dollars.
Actually after reading a little about this problem I realize that I might be dumb.
It's a paradox; it's supposed to make you feel dumb.
No, if you are holding the envelope that has the bigger value, switching would yield 0.5v.
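I am too stupid to start it.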
No.
If one envelope has twice as much as the other, one envelope has v, the other has 2v.
Example: I have an envelope that has $100. Let v = 100. You want to know the probability of switching for twice that much, i.e. 2v = 200.
In your equation, you are comparing 0.5v to 2v, one of which is 4 times the other.
v is the value of the envelope you currently have in your hand.
If it happens to be the larger of the two envelopes, how much is in the other envelope?
You have an envelope. The amount in said envelope is v. The other envelope has either double (2v) or half (1/2v) in it. The probability of each case is 1/2.
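1/2(0.5v) + 1/2(2v) = (5/4)v, or 1.25v
There are 2 envelopes, so if you want the expected value of the two envelopes, it is (0.5*v) + (0.5*2v).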
Yes, but this is in respect to the envelope you have. I'm not saying "you have an envelope, and there are two more," I am (as is Doc) saying that you have an envelope in your hand. It has a value of 'v' money in it. There is ONE more envelope on the table. You know it has either DOUBLE the amount of money or HALF the amount of money in it. Therefore, in relation to the amount 'v' which is currently in your envelope, the ONE other envelope on the table has EITHER 1/2v or 2v worth of money in it. I never said there were 3 envelopes.
Edit: to clarify more, what the formula does is say: okay, I have a 1/2 chance to get 1/2v, and a 1/2 chance to get 2v. So 1/2(1/2v) + 1/2(2v).
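No, you want the expected value of the other envelope.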
Yes, but this is still incorrect with respect to calculating expected values: the sum of the possible outcomes times their probabilities. You can't define the possibilities one way and then change them just because you picked one up.
Also, given this logic, your expected value increases by switching the envelope over and over again.
I think you're confused about the premise of this problem, first and foremost. There are two envelopes. Envelope 1 contains $v. Envelope 2 contains either $0.5v or $2v with each case being equally likely. You are currently in possession of envelope 1.
The expected value of envelope 1 is $v trivially, as you know there is $v in it.
The expected value of envelope 2 is $1.25v by computation (0.5 * $0.5v + 0.5 * $2v).
Both of these can easily be verified replacing v with an actual number.
However, the "twin envelope paradox" (not actually a paradox) stems from a faulty continuation of this reasoning through multiple iterations. Essentially the argument goes:
Expected value of envelope 2 is $1.25v, so trade to get envelope 2.
Call the amount in envelope 2 $x; then envelope 1 has expected value $1.25x.
Trade again to get expected value $1.25x = $1.5625v.
Repeat to get arbitrarily large expected value.
The problem comes in saying that envelope 1 has expected value $1.25x. This is "computed" by treating x as a fixed value, which it is not. There is a 50% chance it is 2v, and a 50% chance that it is 0.5v. Envelope 1 will have $2x in it only if x is 0.5v, and $0.5x in it only if x is 2v, giving a real expected value of (gasp) $v.
Actually, now that I think of it, this should be approached in terms of net gain. No matter the outcome, we gain money by picking up an envelope. When we pick up the first envelope, we hold $v. Should we switch to a second envelope, we stand to either lose 0.5v or gain v, with equal probability.
By switching, the expected gain is (0.5)(-0.5v) + (0.5)(v) = (1/4)v. This matches the 1.25v Doc arrived at, but I still can't wrap my mind around his reasoning.
This is going to sound awfully disagreeable of me, but my last post was incorrect and yours (previous to this one) were, in fact, correct (at least in end result; I won't go over them thoroughly again). My last post was a great explanation (in my opinion, at least) of a situation that Doc did not post. I saw that his was similar to the oft-mentioned twin envelope paradox and assumed it was the standard statement of such. Unfortunately for me, it isn't.
The problem with your analysis of the net gain is treating the amount gained or lost as dependent on $v, when in fact it is a fixed amount. Call the smaller amount of money $n and the larger amount $2n. There are two possible cases:
$v = $n, in which case switching yields a total of $2n for a net gain of $n.
$v = $2n, in which case switching yields a total of $n for a net loss of $n.
Consequently total net gain from switching is $0.
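Here is a minimal simulation sketch of that argument (my own code; the fixed pair n and 2n mirrors the setup above). Averaged over many trials, keeping and switching pay the same:

import random

n = 100.0  # the smaller amount; any fixed value works
trials = 1_000_000
keep = switch = 0.0
for _ in range(trials):
    pair = (n, 2 * n)
    pick = random.choice(pair)               # you grab one envelope at random
    other = pair[0] if pick == pair[1] else pair[1]
    keep += pick
    switch += other
print(keep / trials, switch / trials)        # both come out near 1.5n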
This is one version of the paradox that I think is a little clearer. The game works like this:
-There are two envelopes, and you are told that one of them contains double the value of the other
-You pick one of the envelopes *and open it*. You observe it has some fixed value v.
-You are then given the choice between accepting the value v, or switching to the other envelope
So after opening the first envelope, you figure that it's equally likely to have been the higher value or the lower value, so you compute the expected value of the other to be v/2*1/2 + 2v*1/2 = 1.25v. So the best thing to do is to switch to the other envelope. Doesn't appear to be a problem so far.
However, a problem shows up when you think one step back, before you make your first choice: you look at envelope A and say "If I picked A, I would surely switch to B". Likewise, "If I picked B, I would surely switch to A". The paradox here is that you can never come to a conclusion about which envelope to pick in the first place.
One proposed resolution to this paradox is as follows:
The reasoning that the expected value is 1.25v relies on an impossible assumption. It assumes that no matter what fixed value v you see when you open the first envelope, you believe there is a 50-50 chance that the other envelope is 2v, and a 50-50 chance that the other envelope is v/2. So, if you knew one of the possible values was $1, for example, then the pairs ($0.5, $1) and ($1, $2) have to be equally likely. By the same reasoning, since $2 is possible, the pairs ($1, $2) and ($2, $4) are equally likely, and so on. (Likewise, ($0.25, $0.5) and ($0.5, $1) are equally likely, and we assume that there's some way to transfer fractions of cents, though that's not really important.)
So this says that the values $1, $2, $4, $8, $16, ... are all equally likely to show up. Infinitely many values all have to be equally likely to show up. And "1/infinity" = 0 (blah blah limits blah blah blah). So the probability of any particular value showing up has to be 0. This is an impossible probability distribution, so the game cannot possibly work like this.
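(I should mention you can modify the probability distribution to make this paradox still more or less work. The "resolution" to that is basically that the expected value of an envelope is infinite, and making decisions when the payoffs are infinite doesn't really make any sense anyway.)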
The answer (as stated above) is that the question is ambiguous about probability distributions. It presumes that you can just choose a number uniformly at random from 0 to infinity, which is not true at all.
Aw, I was trying to remember the answer to that envelope paradox and you went and answered it. But yeah, there is no proper uniform distribution that goes out to infinity.
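Sweet, everyone's wrong!
Yeah, that's the best part. The real question is "where is the flaw in that reasoning?"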