Trusty Old Rock (Logic Problem)

What value do you submit?

  • 2

    Votes: 2 3.8%
  • high 90s-100

    Votes: 43 81.1%
  • other strategy (explain)

    Votes: 0 0.0%
  • Joke about Sarah Palin's daughter

    Votes: 8 15.1%

  • Total voters
    53
Probability dictates picking $2.

If you pick $2, you have a 50% chance of getting $2 and a 50% chance of getting $4. If you pick $99, you have only a 1/97 chance of getting $99, a 1/97 chance of getting $101, and a 95/97 chance of getting less than your opponent.

My math is probably off, but it made sense in my head. I'll post a better report later.
 
Probability dictates picking $2.

If you pick $2, you have a 50% chance of getting $2 and a 50% chance of getting $4. If you pick $99, you have only a 1/97 chance of getting $99, a 1/97 chance of getting $101, and a 95/97 chance of getting less than your opponent.

If you're assuming your opponent picks randomly, then you actually have just as low a chance of getting $2 as you do of getting $100. However, you could pick, say, $20. Your risk there is higher, but still low, and you get a higher, but still low, payback.
 
"Rationality" in mathematics has a specific meaning that is slightly different from the daily use of the word. Choosing $2 is called "rational" because "rational" means a deviation from the $2 strategy, provided that the opponent has chosen that and does not change his mind, will result in a net loss of expected payoff, not because of an assumption of competitiveness. Also, being "rational" does not mean getting the best possible payoff, or the best chance at a good payoff. It simply means if the two players happen to be already stuck at the $2 strategy, it's the best choice for them to stay at that strategy.

Congratulations, you can misdefine things! What you have defined above is actually the definition of a Nash equilibrium, not of "rational". I'm also aware there can be multiple Nash equilibria, etc...

I don't believe that rationality has ever been defined in game theory as "minimizing risk of loss"; that would simply be a wrong assumption. If the point of a scenario is to minimize loss, then such an action may be rational, just as it may be if the point is to maximize gain - neither is the default.
With this rule in mind, if both player start reasoning with the assumption that both will pick $100, they will get to the equilibrium point of $2 again.
Right, except you have no proof of that. The only argument seems to be "you assume the other player will play $2, therefore it is best for you to do so" (which, if they do, is true - I'm not denying that; that's why it is the Nash equilibrium). But that is not the case here: the problem, in every telling of it, specifically says that the other person, like you, wants to get as much as he or she can, so I don't see why it's rational to assume they'll choose $2 in the first place.

Your assumption of versus random is very different from the assumption used in game theory, that is, your opponent will try to minimise any loss to his payoff. Different assumptions give different results. Also note that minimising loss is subtly different from maximising payoff, which is one of the reasons simple TD does not model human behaviour well.
I have never read anything on game theory that said minimizing risk of loss is the only goal; at any rate, if that is so, it only means those game theories are in fact irrational.

Also, Gooblah

An understandable mistake, but I'd recommend reconsidering more of what others have posted, not just me, as this is a poor argument - even if you try to "add it up" that way, 1/97 * 99 + 1/97 * 101 + 1/97 * x + ... will add up to a lot more than $2-4.
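To put rough numbers on it (assuming the other player picks uniformly at random between $2 and $100, and that the higher pick gets $2 less than the lower pick): picking $2 pays either $2 or $4, for an average of (2 + 98*4)/99, about $3.98, while picking $99 averages (0 + 1 + ... + 96 + 99 + 101)/99, about $49.05. Even against a coin-flipping opponent, the high pick comes out way ahead.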
 
Seeing as it's uncompetitive, why wouldn't you both choose $100? What kind of moron would care about winning $2?
 
Well, $99 could possibly get you $101. But yeah, $2 or $4 is nothing. You'd have to be really petty for someone getting $4 more than you to matter.

I wouldn't care, just give me my 97 dollars man (other guy chooses 99, I choose 100)
 
Flip a coin for 99 or 100.

99 offers the highest possible outcome of 101 should my opponent pick 100, I get 99 if s/he picks 99, and s/he decides how little s/he wants otherwise.

OTOH, picking 100 ensures that I either get 100, 97, or my counterpart will feel silly lowballing his/her winnings.

Every other number is a rehashed version of the first argument at a lower prize point, and since I've only money to gain by participating (nothing to lose), there's not much of a point going below 99 in a dismal attempt to outdo my counterpart.

Since both of us going 100 gives out the most total money, I see 99 and 100 as essentially equal in justification, and without some other prejudice, I would not mind random chance deciding between those two numbers.

Spoiler :
I have thought of rephrasing the problem in this way:

Suppose that one contestant has assumed the other will pick n. There are three basic options: Attempt to go over n, attempt to go exactly one under n, or match n exactly.

If that contestant matches n, then n dollars are won, otherwise the contestant has accidentally/intentionally guessed higher or lower. If the contestant goes higher, then n-2 dollars are won regardless of how much higher the guess is. If the contestant goes lower, then anywhere between 4 and n+1 dollars are won.

If n is random, then the contestant has a 1 in 99 chance of guessing it exactly, a 49 in 99 chance of going low, and a 49 in 99 chance of going high when trying to guess it exactly. If the contestant is going high, the contestant automatically picks 100 and takes either 100 or n-2, without caring about the reasoning for n. If going low, the contestant has a 1 in 98 chance to take n+1, a 2 in 98 chance to take n, a 1 in 98 chance to take n-1, and a 94 in 98 chance to take n-2 or worse if n is random. Without a better-than-random guess on n, the contestant has only a 4 in 98 chance to beat the picking higher strategy by trying to go low.

E[going higher]: (100 +98(n-2))/99 = (98n - 96)/99
E[going equal, n>=3]: (sum[i=2,i=n-1]{i+2} + n +(100-n)(n-2))/99 = 2n - 4 +(1/2)n^2 - (1/2)n -1 + n + 100n - 200 - n^2 +2n
= (-(1/2)n^2 + 104.5n - 205)/99
E[going equal/low, n=2]: (n+98(n-2))/99 = (99n - 96)/99= (198 - 96)/99 = 102/99
E[going lower, n>=3]: (sum[i=2,i=n-1]{i+2} + n +(99-n)(n-2))/99 = 2n - 4 +(1/2)n^2 - (1/2)n -1 + n + 99n - 198 - n^2 +2n
= (-(1/2)n^2 + 103.5n - 203)/99

For random n>=3, going higher and thus picking 100 is the best choice, but n is not likely to be random, so the perceived expected values would likely be different.
 
I can't believe nobody's noticed that it's a game show! So the effect on your reputation totally outweighs any monetary gain, even if you are the most selfish bastard in history!

Which just goes to prove, in spades, the point I made in a whole 'nother thread - game theory as usually applied is insane.

Also, in closing:
100.

For the same reason that you do not torture, you do not negotiate with terrorists, you do not betray your principles, you do not break the seal of the confessional.

The ideal may be difficult, the execution imperfect. But the right answer is and remains 100 until the end of time.

What. He. Said.
 
EDIT: Rashiminos pointed out I'd misunderstood the question. I'd still pick either 99 or 100, but not for the specific value calculations done here.

Alright, let's look at this differently. If you pick 100 and the other player does not, you get $98. If you were to pick 96 and the other player picks more, you get the same $98. If you pick any number lower than 96, you will receive less than you did when you picked 100 and lost anyway. That is why this does not create a classical race to the bottom, and why the only rational choice lies in the 96-100 range. Now, let's do some more math.

You pick 96.
Value = (chance of win * 98) + (chance of tie * 96) + (chance of loss * 94)
Value = ( 4/100 * 98 ) + ( 1/100 * 96) + (95/100 * 94) = $94.18

Similarly:
Pick 96 = $94.18
Pick 97 = $95.14
Pick 98 = $96.10
Pick 99 = $97.06
Pick 100 = $98.02

Now, this assumes that the choice your opponent makes is independent, which some might argue isn't strictly true. However, if the other player is trying to engage in silly head games as well - and the other player clearly must not know your guess (in which case you would always lose, and 100 would still be the better bet) - then their guess is, to you, completely random, as you can hardly replicate their thought process. Thus 100 is always the strongest choice. If you assume the other player is just as rational, 100 is an even stronger choice than simple probability would suggest.
 
Probability dictates picking $2.

How so? With quick calculations (my math may be a bit rusty but I think I'm correct) the best payoff is given by 96 and 97 ($49.08) and the worst by 2 ($3.98) if going purely by the probabilities.

Spoiler :
2 -> $3.98
3 -> $4.93
4 -> $5.87
5 -> $6.80
6 -> $7.72
7 -> $8.63
8 -> $9.53
9 -> $10.41
10 -> $11.29
11 -> $12.16
12 -> $13.02
13 -> $13.87
14 -> $14.71
15 -> $15.54
16 -> $16.35
17 -> $17.16
18 -> $17.96
19 -> $18.75
20 -> $19.53
21 -> $20.29
22 -> $21.05
23 -> $21.80
24 -> $22.54
25 -> $23.26
26 -> $23.98
27 -> $24.69
28 -> $25.38
29 -> $26.07
30 -> $26.75
31 -> $27.41
32 -> $28.07
33 -> $28.72
34 -> $29.35
35 -> $29.98
36 -> $30.60
37 -> $31.20
38 -> $31.80
39 -> $32.38
40 -> $32.96
41 -> $33.53
42 -> $34.08
43 -> $34.63
44 -> $35.16
45 -> $35.69
46 -> $36.20
47 -> $36.71
48 -> $37.20
49 -> $37.69
50 -> $38.16
51 -> $38.63
52 -> $39.08
53 -> $39.53
54 -> $39.96
55 -> $40.38
56 -> $40.80
57 -> $41.20
58 -> $41.60
59 -> $41.98
60 -> $42.35
61 -> $42.72
62 -> $43.07
63 -> $43.41
64 -> $43.75
65 -> $44.07
66 -> $44.38
67 -> $44.69
68 -> $44.98
69 -> $45.26
70 -> $45.54
71 -> $45.80
72 -> $46.05
73 -> $46.29
74 -> $46.53
75 -> $46.75
76 -> $46.96
77 -> $47.16
78 -> $47.35
79 -> $47.54
80 -> $47.71
81 -> $47.87
82 -> $48.02
83 -> $48.16
84 -> $48.29
85 -> $48.41
86 -> $48.53
87 -> $48.63
88 -> $48.72
89 -> $48.80
90 -> $48.87
91 -> $48.93
92 -> $48.98
93 -> $49.02
100 -> $49.02
94 -> $49.05
99 -> $49.05
95 -> $49.07
98 -> $49.07
96 -> $49.08
97 -> $49.08


For any sane person (excepting those who'd pick $2 just to annoy their opponent) there's never a reason to pick a number less than 99. The obvious reason being that by mutually taking 100 you'd still get at least the same amount. In a perfect world I'd pick 100, but depending on my opponent my real pick would be either 100 or 99. All the other picks would be just insane in my opinion.
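If anyone wants to double-check those numbers, here's a rough Python sketch of the same calculation (assuming the opponent's pick is uniformly random over $2-$100, and the rule that equal picks pay face value, the lower pick gets its value plus $2, and the higher pick gets the lower value minus $2):

Spoiler :
# Expected payoff of each pick against a uniformly random opponent (2..100).
def payoff(mine, theirs):
    if mine == theirs:
        return mine
    if mine < theirs:
        return mine + 2   # lower pick: your own value plus the $2 bonus
    return theirs - 2     # higher pick: the lower value minus $2

picks = range(2, 101)
expected = {m: sum(payoff(m, t) for t in picks) / 99 for m in picks}

for m in (2, 50, 96, 97, 99, 100):
    print(m, round(expected[m], 2))
# 2 -> 3.98, 50 -> 38.16, 96 -> 49.08, 97 -> 49.08, 99 -> 49.05, 100 -> 49.02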
 
If I pick $100, my payout is determined by what my opponent chooses. The insignificant potential difference of $4 does not justify lowering my pick and squandering possible value far in excess of $4. However, lowering my pick to $98 has no possible adverse outcome for me, but also provides no benefit. Lowering it to $99 also has no adverse outcome, but includes the possibility of gaining $1 over the $100 bet, so unless I'm trying to help the other guy, I should do that.
 
You pick 96.
Value = (chance of win * 98) + (chance of tie * 96) + (chance of loss * 94)
Value = ( 4/100 * 98 ) + ( 1/100 * 96) + (95/100 * 94) = $94.18

You'd be better off giving 100. Either way you get $98 or $100, which is pretty sweet.

Have to say it again since people are still missing it, when you pick higher than the other person, you get 2 less than the number S/HE PICKED.
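To see how much that changes things: assuming a uniformly random opponent, picking 96 under the corrected rule pays (0 + 1 + ... + 93 + 96 + 4*98)/99, which is about $49.08 - the same figure as in the table a few posts up - rather than $94.18.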
 
By the way, for those of you arguing for low numbers, you ONLY actually benefit from aiming below your opponent rather than shooting high if you guess zero, one, two, or three dollars less than he does.
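Quick check: if your opponent picks n and you pick some m below it, you take m + 2, whereas shooting high would have gotten you n - 2. Undercutting only pays when m + 2 > n - 2, i.e. m > n - 4, so you have to tie his pick or land within three dollars of it to come out ahead.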
 
Have to say it again since people are still missing it, when you pick higher than the other person, you get 2 less than the number S/HE PICKED.

Ah, yes, totally missed that. That does make things less obvious.

Though in that case I think the drive to pick higher just becomes stronger. I would still choose high because it doesn't really make sense for the other person to pick lower, as they too want to maximize their gain without respect to my gain. Undercutting my 'opponent' only really hurts me.

The issue here is that the $2 penalty, so to speak, is so small relative to the potential $101 payoff.
 
I wouldn't play the game. I'd pick up my chair and throw it through the reflective glass window and then beat up the experimenters watching me.
 
I would choose a few $ less than $100. Unless you assume for some odd reason that your opponent is an idiot and wants to win even though there's no incentive, he'll probably choose about the same number. Maybe I lose a few $, maybe I gain a few extra $. The chance of winning a few extra $ is nice, but if not, meh.

Put me down as one of those who think that rationality is maximising gain, since that's how it's defined for game theory anyway.


EDIT: I'll hazard a guess and say that the pure strategy Nash Equilibrium is at 2, but there is probably at least one mixed strategy Nash Equilibrium at a high value that would justify most people's hunch for choosing something close to 100.
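For the pure-strategy half of that guess, here's a rough brute-force sketch in Python (assuming the rule discussed in this thread: equal picks pay face value, the lower pick gets its value plus $2, the higher pick gets the lower value minus $2). It only turns up (2, 2), though it says nothing about mixed strategies, and with just 99 x 99 profiles brute force is cheap:

Spoiler :
# Brute-force search for pure-strategy Nash equilibria of the game.
def payoff(mine, theirs):
    if mine == theirs:
        return mine
    return mine + 2 if mine < theirs else theirs - 2

picks = range(2, 101)
equilibria = []
for a in picks:
    for b in picks:
        a_best = all(payoff(a, b) >= payoff(x, b) for x in picks)  # a is a best response to b
        b_best = all(payoff(b, a) >= payoff(y, a) for y in picks)  # b is a best response to a
        if a_best and b_best:
            equilibria.append((a, b))

print(equilibria)  # [(2, 2)]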

PS: I wonder if the choices people make offer any clue to what their personalities are like. I think I wouldn't really want to get to know those who choose $2.
 
You guys looking at the opponent picking randomly are ridiculous :p Your opponent isn't going to pick randomly, so your model is worthless. Sorry.
 
I'd go either 99 or 100. My money will probably not depend on my choice at all so I'll just trust that the other guy isn't an idiot. 99 for a little bit of potential gain, but not much risk.
 