## Interview Question

Sales Strat Intern Interview, New York, NY

`Goldman Sachs`

Suppose we hire you, and you and the rest of the new interns decide to go buy a cup of coffee. Each intern purchases one cup. One of the interns suggests everyone play a game: everyone flips a fair coin, dividing the group into two subgroups, those who got heads and those who got tails. Whichever group is smaller evenly splits the cost of everyone's coffee (e.g., if there are 5 interns and 3 get H while 2 get T, the two interns who got tails each pay for 2.5 cups). However, nothing says you need to play this game; you can choose to buy your own cup of coffee and not play at all. The question: should you play this game? (Note: you may assume there is an odd number of interns, so there are no ties, and that if everyone gets H or everyone gets T, then everyone loses and just buys their own cup of coffee.)
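Before reading the answers below, it can help to just simulate the game. The sketch below (not part of the original thread; it assumes $1 per cup and tracks a single intern's cost) runs the game many times and averages what that intern pays:

```python
import random

def play_round(n_interns, price=1.0):
    """Simulate one round of the coffee game; return the cost paid by intern 0."""
    flips = [random.random() < 0.5 for _ in range(n_interns)]  # True = heads
    heads = sum(flips)
    tails = n_interns - heads
    if heads == 0 or tails == 0:
        return price  # everyone flipped the same side: each buys their own cup
    minority_is_heads = heads < tails
    if flips[0] == minority_is_heads:
        # intern 0 is in the smaller group: split the whole bill
        return price * n_interns / min(heads, tails)
    return 0.0  # intern 0 is in the majority and drinks for free

random.seed(0)
trials = 200_000
avg = sum(play_round(5) for _ in range(trials)) / trials
print(f"average cost over {trials} trials: {avg:.3f}")  # close to 1.0
```

The average hovers around $1, the same as just buying your own coffee, which is exactly the point the discussion below works toward.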

## Interview Answer


I don't get it. Would you please provide more hints?

Assume each coffee costs $1, for simplicity. This is effectively a choice between two outcomes: paying $1 with probability 100%, or paying $0 with some probability and more than $0 otherwise. So ask yourself: what is your expected cost in the second case? Give that a try and see if you can figure it out.

However, I want to remind you that the question is "should you play this game?" The answer isn't just a math question; if you only work out expected values, you've missed the point. For example, here is a separate question with the same flavor as the direction I'm trying to lead you: suppose I give you a choice of two outcomes. Either you get $1 with 100% probability, or I give you $500,000,000 with probability 1/100,000,000 and $0 otherwise. Which would you pick? Now what if the first choice stayed the same, but the second was $50 with probability 1/10 and $0 otherwise? Which would you pick now? These are the kinds of things you want to think about while answering this kind of question. Let me know if you have any more questions, and if you want me to post the answer, just say so.

I think I get it. It's about an investor's risk appetite. Investors are likely to take guaranteed gains; here, that's the $1.

Well, yes and no. This is indeed a risk-aversion question. If you work it out, you'll find that the EV in each case is exactly the same (your EV is -1 cup of coffee in both scenarios), but that's not the end of it. It's also a question for them to test your risk aversion. You can support either answer, and *should* comment on the validity of either answer.

My answer was to buy my own cup of coffee, and I followed it up with a story about a friend who once tried to get us to play credit card roulette (which is similar in spirit to this game); I did, in fact, say no in that instance, and I explained why. However, people are traditionally risk-averse when faced with gains and risk-seeking when faced with losses, so many would probably choose to play the game. But this is as much a question about your psychology as it is about your math skills. That's why this is such an awesome question, and probably a question that kills most people they ask it to.

Would you please explain how you get -1 for the second scenario?

There are 50% H and 50% T, so each player has a 50% chance of being in the winning group. Given 5 interns, there are two possible losing-group splits (4 vs. 1 and 3 vs. 2). EV = 0.5*0 + 0.5*(0.5*(-5) + 0.5*(-2.5)) = -1.875.

The probability of winning isn't 50%. It's actually slightly above 50%, but that's not the right way to look at it. The total number of coffees that need to be bought is n, where n is the number of interns. Going into this game, every intern is in the same position, so they each have the same expected value, and the sum of the expected values must equal -n. So everyone has an EV of -1, as claimed.
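The symmetry argument can be verified by brute force (a sketch, not from the thread; it assumes $1 per cup). Enumerating all 2^n equally likely flip patterns and averaging one intern's exact cost gives 1 for every odd n:

```python
from fractions import Fraction
from itertools import product

def expected_cost(n, price=1):
    """Exact expected cost for intern 0, enumerating all 2**n flip patterns."""
    total = Fraction(0)
    for flips in product((0, 1), repeat=n):  # 1 = heads
        heads = sum(flips)
        tails = n - heads
        if heads == 0 or tails == 0:
            cost = Fraction(price)  # everyone flipped the same side
        elif flips[0] == (1 if heads < tails else 0):
            cost = Fraction(price * n, min(heads, tails))  # intern 0 splits the bill
        else:
            cost = Fraction(0)  # intern 0 is in the majority
        total += cost
    return total / 2**n  # each pattern has probability 1/2**n

for n in (3, 5, 7):
    print(n, expected_cost(n))  # prints 1 for each n
```

Exact rational arithmetic via `Fraction` confirms the EV is exactly one cup of coffee, not merely close to it.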

Thank you very much. I finally got it.

I'm still not sure why you said no, if the EV is the same?

It's a matter of risk preference. See the example I gave in my Mar 19 posting: "Suppose I give you a choice of two outcomes. Either you get $1 with 100% probability or I give you $500,000,000 with probability 1/100,000,000 and $0 otherwise. Which would you pick? Now what if it was the same first choice, but the second choice was $50 with probability 1/10 and $0 otherwise? Now which would you pick?" You would probably not play in the first instance and consider possibly playing in the second. (And in those instances the risky game even has a HIGHER EV.) This coffee game is the same kind of game: even though the EV is the same in each case, the volatility is not. It is usually a good rule of thumb to take the lower-volatility outcome if the EV is the same (think Sharpe ratio here! Or think efficient frontier! Both get the point across, I think). Does that help?
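The volatility point can be made concrete (again a sketch, not from the thread, assuming $1 per cup): buying your own coffee costs exactly $1 with zero variance, while playing the game has the same $1 mean but a strictly positive variance.

```python
from fractions import Fraction
from itertools import product

def cost_moments(n, price=1):
    """Exact mean and variance of intern 0's cost, over all 2**n flip patterns."""
    mean = Fraction(0)
    second = Fraction(0)  # E[cost**2]
    for flips in product((0, 1), repeat=n):  # 1 = heads
        heads = sum(flips)
        tails = n - heads
        if heads == 0 or tails == 0:
            cost = Fraction(price)  # everyone flipped the same side
        elif flips[0] == (1 if heads < tails else 0):
            cost = Fraction(price * n, min(heads, tails))  # in the minority: pay up
        else:
            cost = Fraction(0)
        mean += cost
        second += cost * cost
    mean /= 2**n
    second /= 2**n
    return mean, second - mean * mean

mean, var = cost_moments(5)
print(mean, var)  # 1 35/16
```

Same expected cost as not playing, but a standard deviation of roughly $1.48 per round with 5 interns, which is the rule-of-thumb case for preferring the sure thing.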

I say yes, play the game, but only because I'm prepared to lose. The fact that I can afford to lose makes me want to try my chance at winning.

I would play the game. Consider the expected price with n total interns splitting a total cost of P. That is (P/2) * E[1/N | you're paying], which is simply equal to (P/2) * E[1/(N+1)], where N is now a binomial corresponding to the other n-1 interns. Notice that 1/(n+1) is a concave function. That means (P/2) * E[1/(N+1)] <= (P/2) * 1/(E[N]+1) = (P/2) * 1/((n-1)/2 + 1) = P/(n+1). So it pays to play, on average.

Apologies, but both of the answers above are incorrect. The EV of playing is the same as the EV of not playing.


Hint: Despite its look, this is not a math question.