

robotsintheskies
Registered User

Okay, so I'm reviewing for an Algebra exam and there is a question on the review packet that I really do not understand how to do. It's probably ridiculously simple, but I really can't seem to figure it out.

Michelle wants to build a swimming pool surrounded by a sidewalk of uniform width. She wants the dimensions of the pool and sidewalk together to be 16 meters by 20 meters. If the pool must have an area of 192 square meters, how wide should the sidewalk be?

Thanks for any help!


## Posts

Think of it as two rectangles, one (the pool) inside the other, with the border between them forming the sidewalk. You know the sidewalk has a constant width, you know the overall dimensions, and you know how big the area of the pool is, so you should be able to write equations expressing those relationships and solve for the width of the sidewalk.

Then label what you know.

Label what you don't know with variables.

Express what you don't know in terms of what you do know.

Solve.

L + 2d = 20

W + 2d = 16

L * W = 192

Subtracting the second equation from the first shows the pool is 4 meters longer than it is wide (L = W + 4), so you can substitute that into the last equation, solve for L or W, and then plug that value into whichever of the first two equations is appropriate to solve for d.

Here's what I got:

L + 2d = 20

W + 2d = 16

L * W = 192

In which L is the Length of the Pool, W is the Width of the Pool, and d is the Width of the Sidewalk

L=20-2d

W=16-2d

L * W = 192, therefore

(20-2d)(16-2d)=192

Expanding gives me

4d^2 - 72d + 128 = 0

Factoring gives me

(2d - 32) (2d - 4) = 0

Therefore, d can equal either 16 or 2

Plugging those values in, we find that L = -12 when d = 16, therefore d cannot be 16 as you cannot have a negative length

Therefore, as d is equal to 2, the width of the sidewalk is 2 meters.

Is this correct?

L + 2d = 20

W + 2d = 16

L * W = 192

d=2 so...

L + 4 = 20 -> L = 16

W + 4 = 16 -> W = 12

16*12=192
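The algebra above is easy to double-check numerically. Here is a minimal Python sketch (the variable names are mine, not from the thread) that solves the quadratic 4d^2 - 72d + 128 = 0 and discards the root that gives a negative pool dimension:

```python
import math

# Coefficients of 4d^2 - 72d + 128 = 0, from expanding (20 - 2d)(16 - 2d) = 192.
a, b, c = 4, -72, 128
disc = math.sqrt(b * b - 4 * a * c)
roots = [(-b - disc) / (2 * a), (-b + disc) / (2 * a)]  # 2.0 and 16.0

# Keep only roots that give a positive pool length and width.
valid = [d for d in roots if 20 - 2 * d > 0 and 16 - 2 * d > 0]
d = valid[0]
L, W = 20 - 2 * d, 16 - 2 * d
print(d, L, W, L * W)  # 2.0 16.0 12.0 192.0
```

This confirms the hand calculation: d = 16 is rejected because it would make the pool 20 - 32 = -12 meters long, leaving d = 2 as the sidewalk width.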

You are given a choice of two envelopes. One contains twice the amount of money as the other, but you don't know their values. You select one envelope. Should you switch to the other if given the opportunity?

e = statistically expected value of the other envelope

v = value of the envelope you have now

e = 0.5 (2v) + 0.5 (0.5v)

That is, there is a 0.5 probability that the other envelope contains twice what yours does, and a 0.5 probability that it contains half of what yours does. Add up the probabilities times the multiplier and the value to get the expected value of the other envelope.

It simplifies to this:

e = 1.25v

So the expected value of the other envelope is 1.25v, or 1.25 times the value of the envelope you have now.

Should you switch? If so, you are given the opportunity to switch back. Should you do that?

PhD student in statistics here. I know my shit.

There's a problem in your formula for e. It should read e = 0.5(v) + 0.5(2v).

Also, this is similar to the famous question from the movie 21: if the grand prize is behind one of three doors (door 1, 2, or 3) and you pick one, should you switch after the host reveals a losing door (one that you didn't choose)?

In any case, use a Bayesian approach to solve the door problem. After some math kung fu, you see that the new information shown to you by the host means you should switch doors.

EDIT: This is the Monty Hall Problem for those who are interested.

In Doc's case, he only has 2 envelopes, and revealing more information ends the game. So in any case, his chances are only 50/50.
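The Monty Hall claim above ("you should switch doors") can be verified by simulation. A quick sketch, not from the thread; the function name and trial count are mine:

```python
import random

def monty_hall(switch, trials=100_000):
    """Fraction of games won when always switching (or always staying)."""
    wins = 0
    for _ in range(trials):
        prize = random.randrange(3)
        pick = random.randrange(3)
        # Host opens a door that is neither your pick nor the prize.
        opened = next(d for d in range(3) if d != pick and d != prize)
        if switch:
            # Move to the one remaining closed door.
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == prize)
    return wins / trials

print(monty_hall(switch=True))   # ~0.667
print(monty_hall(switch=False))  # ~0.333
```

Switching wins whenever your first pick was wrong, which happens 2/3 of the time, matching the Bayesian answer.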

You pretty much outlined the statistical answer of yes to both switching and switching back. Another reason to favour switching is that the most you can lose is 1/2 v, but you also stand to gain v and double your money.

edit: folken, I do not think he made a mistake in his formula. He has a 1/2 chance to double his money, 1/2(2v), and he has a 1/2 chance to lose HALF his money, so 1/2(v/2).

Actually after reading a little about this problem I realize that I might be dumb.

It's a paradox; it's supposed to make you feel dumb.

No, if you are holding the envelope that has the bigger value, switching would yield 0.5v.

I am too stupid to start it.

No.

If one envelope has twice as much as the other, one envelope has v, the other has 2v.

Example: I have an envelope that has $100. Let v = 100. You want to know the probability of switching for twice that much, i.e. 2v = 200.

In your equation, you are comparing 0.5v to 2v, one of which is 4 times the other.

v is the value of the envelope you currently have in your hand.

If it happens to be the larger of the two envelopes, how much is in the other envelope?

You have an envelope. The amount in said envelope is v. The other envelope has either double (2v) or half (1/2 v) in it. The probability of each is 1/2.

1/2 (0.5v) + 1/2 (2v) = 5/4 (v) or 1.25v

There are 2 envelopes, so if you want the expected value of the two envelopes, it is (0.5*v) + (0.5*2v).

It has EITHER 1/2 v or 2v worth of money in it. I never said there were 3 envelopes.

Edit: to clarify more: what the formula says is: okay, I have a 1/2 chance to get 1/2 v, and a 1/2 chance to get 2v. So 1/2(1/2v) + 1/2(2v).

No, you want the expected value of the other envelope.

Also, given this logic, your expected value increases by switching the envelope over and over again.

I think you're confused about the premise of this problem, first and foremost. There are two envelopes. Envelope 1 contains $v. Envelope 2 contains either $0.5v or $2v with each case being equally likely. You are currently in possession of envelope 1.

The expected value of envelope 1 is $v trivially, as you know there is $v in it.

The expected value of envelope 2 is $1.25v by computation (0.5 * $0.5v + 0.5*$2v).

Both of these can easily be verified replacing v with an actual number.

However, the "twin envelope paradox" (not actually a paradox) stems from a faulty continuation of this reasoning through multiple iterations. Essentially the argument goes:

Expected value of envelope 2 is $1.25v, so trade to get envelope 2.

Call the amount in envelope 2 $x; then envelope 1 has expected value $1.25x.

Trade again to get expected value $1.25x = $1.5625v.

Repeat to get arbitrarily large expected value.

The problem comes in saying that envelope 1 has expected value $1.25x. This is "computed" by treating x as a fixed value, which it is not. There is a 50% chance it is 2v, and a 50% chance that it is 0.5v. Envelope 1 will have $2x in it only if x is 0.5v, and $0.5x in it only if x is 2v - giving a real expected value of (gasp) $v.

By switching, the expected gain is (0.5)(-0.5v) + (0.5)(v) = (1/4)v. This is consistent with the 1.25v Doc arrived at (v plus a gain of 0.25v), but I still can't wrap my mind around his reasoning.

This is going to sound awfully disagreeable of me, but my last post was incorrect and yours (previous to this one) were, in fact, correct (at least in end result; I won't go over them thoroughly again). My last post was a great explanation (in my opinion, at least) of a situation that Doc did not post. I saw his was similar to the oft-mentioned twin envelope paradox and assumed it was the standard statement of such. Unfortunately for me, it isn't.

The problem with your analysis of the net gain is treating the amount gained or lost as dependent on $v, when in fact it is a fixed amount. Call the smaller amount of money $n and the larger amount $2n. There are two possible cases:

$v = $n, in which case switching yields a total of $2n for a net gain of $n.

$v = $2n, in which case switching yields a total of $n for a net loss of $n.

Consequently total net gain from switching is $0.
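That resolution can also be checked with a quick Monte Carlo run. A sketch under the assumption that the smaller amount $n is fixed in advance (the specific n = 100 and trial count are mine):

```python
import random

n = 100                 # smaller amount; the other envelope holds 2n
pair = (n, 2 * n)
trials = 100_000

keep_total = switch_total = 0
for _ in range(trials):
    first = random.choice(pair)        # the envelope you happened to pick
    other = pair[0] + pair[1] - first  # the envelope you'd switch to
    keep_total += first
    switch_total += other

# Both averages converge to (n + 2n) / 2 = 1.5n, so switching gains nothing.
print(keep_total / trials, switch_total / trials)  # both ~150.0
```

The symmetry is the whole point: whichever envelope you hold, the other one is just the remaining member of the same fixed pair, so both strategies have the same expected value of $1.5n.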

This is one version of the paradox that I think is a little more clear. The game works like this:

-There are two envelopes, and you are told that one of them contains double the value of the other

-You pick one of the envelopes *and open it*. You observe it has some fixed value v.

-You are then given the choice between accepting the value v, or switching to the other envelope.

So after opening the first envelope, you figure that it's equally likely to have been the higher value or the lower value, so you compute the expected value of the other to be v/2*1/2 + 2v*1/2 = 1.25v. So the best thing to do is to switch to the other envelope. There doesn't appear to be a problem so far.

However, a problem shows up when you think one step back, before you make your first choice: You look at envelope A and say "If I picked A, I would surely switch to B". Likewise, "If I picked B, I would surely switch to A". The paradox here is that you can never come to a conclusion about which envelope to pick in the first place.

One proposed resolution to this paradox is as follows:

The reasoning that the expected value is 1.25v relies on an impossible assumption. It assumes that no matter what fixed value v you see when you open the first envelope, you believe there is a 50-50 chance that the other envelope contains 2v and a 50-50 chance that it contains v/2. So, if you knew one of the possible values was $1, for example, then the pairs ($0.5, $1) and ($1, $2) would have to be equally likely. By the same reasoning, since $2 is possible, the pairs ($1, $2) and ($2, $4) are equally likely, and so on. (Likewise, ($0.25, $0.5) and ($0.5, $1) are equally likely, and we assume there's some way to transfer fractions of cents, though that's not really important.)

So this says that the values $1, $2, $4, $8, $16, ... are all equally likely to show up. Infinitely many values all have to be equally likely to show up. And "1/infinity" = 0 (blah blah limits blah blah blah). So the probability of any particular value showing up has to be 0. This is an impossible probability distribution, so the game cannot possibly work like this.

(I should mention you can modify the probability distribution to make this paradox still more or less work. The "resolution" to that is basically that the expected value of an envelope is infinite, and making decisions when the payoffs are infinite doesn't really make any sense anyways)

Sweet, everyone's wrong!

Yeah, that's the best part. The real question is "where is the flaw in that reasoning?"