# Interview Questions

Interview questions shared by candidates.


### Assistant Trader at Jane Street was asked...

What is the expected number of flips of a fair coin needed to simulate one roll of a 6-sided die?

12 answers:

- E(x) = 0.75·3 + 0.25·E(x+1). From there it simplifies to E(x) = 10/3.
- E(x) = 3 + 0.25·E(x), so E(x) = 4.
- Hi, can you please explain this in detail? I don't seem to follow how you did this. Thanks.
- 11/3. E(x) = 0.75·3 + 0.25·E(x+2).
- The probability of each die face is 1/6, while the probability of a coin face is 1/2, so to simulate the die the coin must reach the same probability: (1/2)^x = 1/6, or more simply 2^x = 6. Without guess-and-check, rewrite this as x = log2(6) = log(6)/log(2) ≈ 2.585 (2^2.585 ≈ 6.000). So the coin would have to be flipped 2.585 times to simulate one roll of the die. x = 2.585.
- 3 flips. The coin gives 1 of 2 possibilities per flip, so 3 flips give 8 combinations, which you can read as a 3-bit binary number (0 = tails, 1 = heads). That would be perfect for an 8-sided die, but since you want only 6 results, and 2 flips is too few while 3 is too many, adjust as follows: 000 = null (scratch the result and flip three more times), 001 = 1, 010 = 2, 011 = 3, 100 = 4, 101 = 5, 110 = 6, 111 = null. The nulls (and the numbering) can be assigned arbitrarily; this gives a 1-in-6 chance every time.
- Wsc is on the right track. You have to flip at least 3 coins, but the probability of a null result is 1/4 each round, so you may need 3 more flips. This is a geometric series: 3 + 3·(1/4) + 3·(1/4)² + ⋯, so the answer is 3/(1 − 1/4) = 4 expected coin flips.
- 4
- The correct answer is 11/3, as written by "Someone" on July 9, 2011. William's and P's answers are close but not quite accurate, because they assume it takes 3 flips to discard a null result. It only takes two flips to discard a null result (HHT or HHH), since we don't have to flip the third coin once the first two are already heads (HH) — a third flip there would be a wasted throw. That's why the answer is slightly less than 4. We have a 3/4 chance of reaching one of the 6 good triples (HTH, HTT, THH, THT, TTH, TTT), each yielding a number from 1 to 6, and a 1/4 chance of having to re-flip, which is detected after only 2 flips. Hence 3/4 of the time the number of flips required is 3, and 1/4 of the time it is 2 + E[X]: E[X] = (3/4)·3 + (1/4)·(2 + E[X]). Alternatively, flip the coin twice; 3/4 of the time (HT, TH, TT) one more flip finishes, and 1/4 of the time (HH) we restart: E[X] = 2 + (3/4)·1 + (1/4)·E[X]. Either way, E[X] = 11/4 + (1/4)·E[X], which yields the desired result, E[X] = 11/3.
- 4.5
- It's 11/3. Let HHH and TTT be the triples that force a restart, so that after the first flip the remaining two flips are unbiased. Make the first flip: it's H or T. After that, there is a 1/4 probability that the next two flips force a restart, so a 3/4 chance you won't have to restart. Using 1/p for the expected number of attempts: 1 + 2·(4/3) = 1 + 8/3 = 11/3.
- You can't flip a die because you only have a coin. DUH
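The 11/3 result is easy to check empirically. Below is a quick Monte Carlo sketch (Python, arbitrary seed) of the optimal scheme described above: flip twice, restart on HH, otherwise one more flip completes the roll.

```python
import random

def flips_to_roll_d6(rng):
    """One simulated die roll via coin flips; returns the number of flips used.

    Scheme from the thread: flip twice; on HH both flips are wasted
    (HHH and HHT are the discarded triples), otherwise one more flip
    selects one of the six usable triples.
    """
    flips = 0
    while True:
        first = rng.random() < 0.5   # True = heads
        second = rng.random() < 0.5
        flips += 2
        if first and second:         # HH -> discard and retry
            continue
        flips += 1                   # third flip completes a valid triple
        return flips

rng = random.Random(0)
trials = 200_000
mean = sum(flips_to_roll_d6(rng) for _ in range(trials)) / trials
print(round(mean, 3))  # close to 11/3 ≈ 3.667
```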

### Junior Trader at Jane Street was asked...

During the third interview: imagine an infinite chessboard. If a knight starts on one square, on how many squares can it possibly end up after 10 moves? You don't have to give an exact number — a 95% confidence interval is enough. Pen and paper allowed.

13 answers:

- 751
- I don't really understand the question... the knight from one square? And what's the intuition behind the answer given?
- I think it's much more than 751. I found 7,938,126 cases, and there are more.
- The answer is actually 741. I probably wouldn't have thought of this super quickly during an interview, but think about it this way: in 10 moves, the farthest the knight can move to the right (or to the left) is 20 squares, so it is confined to a 41-by-41 box. Now label each column (or row, if you prefer — this is symmetric) 0–40, so the knight begins on the center square of column 20. Then ask yourself: how many squares in column 0 can the knight land on in 10 moves? How about column 1? And so on. Thought of this way, the answer becomes clear.
- To "candidate": I suspect you are solving the wrong problem. If you're counting the possible paths a knight might take, then maybe I'd believe your answer, but for squares to land on it is definitely wrong. By the way, since only a confidence interval is asked for, my friend had a very good way of estimating this: each knight move travels along the diagonal of a triangle with legs 1 and 2, i.e. √5 away from the center per move, so the reachable squares lie roughly inside a circle of radius 10√5, which has area about 1,500. But the knight can only land on half of those squares, because after an even number of moves it must be on a square of the same color it started on. So the estimate is 750; take a reasonable interval around that.
- «wlfgngpck» I did it the way you said, taking half the area of the circle of radius 10√5, so 750 is not bad against a true answer of 741 (or 751, I don't remember).
- Solution visualised: http://twitpic.com/9jegtb. Formula (no idea why, but it works), with x = number of moves: ((1 + 4x)² + 1)/2 − (4x)²/16. The range as a square is (2x + 1 + 2x)² = 41·41 = 1681 squares; adding the center point and halving (only half the fields are possible): 1682/2 = 841. The corners are not reachable — 1/8 of each quadrant, excluding the center lines: (2x·2x/2)/8 = 25 per corner, 100 in all. Result: 841 − 100 = 741.
- 741 is too big: if the knight is at (0,0) it can get to (20,10) in 10 moves but not to (19,9)... try it out with 2 moves: it's not 37 cases but only 33, because it can't reach (2,2), (−2,−2), (−2,2), (2,−2) in 2 moves.
- Sorry, I meant to write that it can't reach (14,14) — of course it can reach (19,9). So my answer would be 741 − 4 = 737, since it can't reach (14,14), (14,−14), (−14,14), (−14,−14).
- No, my bad — those can be reached. 741 is correct; it's only in the 2-move case that you have to subtract those diagonals.
- The knight can get 20 squares in all 4 directions: 41×41 = 1681. It cannot reach the corners, each composed of 10+9+8+⋯+2+1 = 55 squares, so the four corners are 220 squares, leaving 1461. Subtract one for the square it currently occupies → 1460, and divide by 2 because it can't land on the color opposite its start. Answer: 731.
- 7n² + 4n + 1 for any n > 1.
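The exact count is small enough to brute-force. A sketch (Python; assumes standard knight moves and reads "end after 10 moves" as the set of squares reachable in exactly 10 moves):

```python
def squares_after(n_moves):
    """Set of squares a knight can occupy after exactly n_moves moves
    on an infinite board, starting from the origin."""
    moves = [(1, 2), (2, 1), (-1, 2), (-2, 1),
             (1, -2), (2, -1), (-1, -2), (-2, -1)]
    frontier = {(0, 0)}
    for _ in range(n_moves):
        frontier = {(x + dx, y + dy)
                    for x, y in frontier
                    for dx, dy in moves}
    return frontier

print(len(squares_after(1)))   # 8
print(len(squares_after(2)))   # 33 (not 37: e.g. (2, 2) is unreachable in 2 moves)
print(len(squares_after(10)))  # 741
```

This confirms the thread's consensus of 741 and the 2-move count of 33 used in the objection above.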

Flip a coin until either HHT or HTT appears. Is one more likely to appear first? If so, which one, and with what probability?

13 answers:

- Let A be the event that HTT comes before HHT. P(A) = P(A|H)P(H) + P(A|T)P(T) = 0.5·P(A|H) + 0.5·P(A|T). Since P(A|T) = P(A), it follows that P(A|H) = P(A|T) = P(A). Next, P(A|H) = P(A|HH)P(H) + P(A|HT)P(T) = (0)(0.5) + P(A|HT)(0.5), so 2·P(A|H) = P(A|HT). Then P(A|HT) = P(A|HTT)P(T) + P(A|HTH)P(H) = (1)(0.5) + P(A|H)(0.5), so 2·P(A|H) = 0.5 + 0.5·P(A|H), giving P(A|H) = 1/3, and since P(A|H) = P(A), we get P(A) = 1/3. So HHT is more likely to appear first, and it appears first 2/3 of the time.
- Need help — why is P(A|HH) = 0?
- P(A|HH) = 0 because after a run of consecutive heads you can no longer achieve HTT first: the moment you get a tail, you will have completed the sequence HHT. This is the reason HHT is more likely to occur first than HTT.
- HHT is more likely to appear first than HTT: the probability of HHT appearing first is 2/3, and thus the probability of HTT appearing first is 1/3. Indeed, both sequences need an H first. Once that H has appeared, the probability of completing HHT is 1/2 (because all you need is one more H), while the probability of completing HTT is 1/4 (because you need TT). Thus HHT is twice as likely to appear first: if the probability that HTT appears first is x, then the probability that HHT appears first is 2x, and since these are disjoint and exhaust the probability space, x + 2x = 1, so x = 1/3.
- You guys seem to be mixing up order being relevant and order being irrelevant. If order is relevant (HHT is not the same as HTH), then each of HHT and HTT has a 1/8 chance of occurring in the first 3 tosses, making them equally likely. If order is irrelevant (HHT and THH are the same), then each has a (3 choose 2)·(1/8) probability of occurring in the first 3 tosses. Either way they come out equally likely — please comment on my mistake if I am doing something wrong.
- Ending up with HHT is more likely (with probability 2/3).
- HHT is more likely (2/3 probability). People with wrong answers: did you not Monte Carlo this? It takes 5 minutes to write a program, and you can then easily see that 2/3 is correct empirically.
- I don't get it — shouldn't P(A|HH) = P(A|H), in the same sense that P(A|HTH) = P(A|H)? From both HH and HTH we have the first H of HTT, so shouldn't P(A|HH) = P(A|HTH) = P(A|H)? Am I wrong?
- This link is the best solution I have seen for this problem: http://dicedcoins.wordpress.com/2012/07/19/flip-hhh-before-htt/
- Here's my answer. Let x = the probability HHT wins from the start (or after a tail), y = the probability after exactly one head, z = after two heads, w = after HT. Then x = (1/2)x + (1/2)y, y = (1/2)z + (1/2)w, z = 1/2 + (1/2)z, w = (1/2)y. Therefore z = 1, y = 2/3, w = 1/3, x = 2/3. We wanted x at the beginning, so HHT comes up first with probability 2/3.
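As one answer suggests, the 2/3 figure takes minutes to verify by simulation. A minimal Monte Carlo sketch (Python, arbitrary seed):

```python
import random

def race(rng):
    """Flip a fair coin until the last three flips read HHT or HTT;
    return whichever pattern appeared first."""
    window = ""
    while True:
        window = (window + ("H" if rng.random() < 0.5 else "T"))[-3:]
        if window in ("HHT", "HTT"):
            return window

rng = random.Random(0)
trials = 100_000
hht_first = sum(race(rng) == "HHT" for _ in range(trials)) / trials
print(hht_first)  # close to 2/3
```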

### Trader Intern at Jane Street was asked...

Russian roulette: a six-chamber revolver holds 4 blanks and 2 bullets, all in a row (the two bullets in adjacent chambers). Someone next to you fires and gets a blank. Would you take the next shot as-is, or spin the cylinder first?

12 answers:

- Take another shot: 3/4 chance of surviving, vs 2/3 if you re-spin.
- Here is my answer: the probability of survival after a re-spin is 3/5; with no re-spin it is 1/15 · 6/4 = 1/10. Correction: survival after a re-spin is 4/6 = 2/3. Final correction: without a spin it is 2/5 = C(4,2)/C(6,2).
- Denote blanks by 0 and live rounds by 1, and join the left and right edges (marked ~) to form a cylinder: ~000011~. Put the chamber that fires next in parentheses, mark a spent chamber with *, and assume the cylinder rotates clockwise when the trigger is pulled. If the first pull was a blank, then before and after that pull the cylinder was in one of these states: 1) before ~(0)00011~, after ~*0001(1)~; 2) before ~0(0)0011~, after ~(0)*0011~; 3) before ~00(0)011~, after ~0(0)*011~; 4) before ~000(0)11~, after ~00(0)*11~. Pulling the trigger again without spinning, only case 1 fires a live round, giving a 3/4 probability of a blank. Spinning first and pulling gives a 4/6 = 2/3 probability of a blank. Clearly, you should not spin.
- Don't spin: 3/4 survival probability without a spin. A re-spin gives 4/6 = 2/3 (obvious); without a spin, you only die if the blank just fired was the "last" one before the bullets, a 1/4 chance, hence 3/4 survival. Since 3/4 > 2/3, don't spin.
- Wouldn't play.
- Spin. Spin: 4/6 ≈ 0.667 survival. Don't spin: C(4,2)/C(5,2) = 3/5 = 0.6.
- Conditioning on the 2 bullets being in consecutive chambers and someone surviving the first shot: spin it → survival rate 4/6 = 2/3 ≈ 67%; don't spin → survival probability 3/4 = 75%. So not spinning gives the bigger chance of survival.
- The question didn't say the bullets are next to each other; in that case you should spin.
- I. Using Bayes' theorem: P(0) = 4/6 (blank), P(1) = 2/6 (bullet); we want P(1|0), the probability of a bullet given the previous chamber was a blank. Since the bullets are adjacent, a bullet is followed by a bullet or a blank equally often, so P(0|1) = 1/2, and P(1|0) = P(0|1)·P(1)/P(0) = 1/4. So there is a 25% chance the next pull is a bullet, i.e. a 75% chance it is a blank. (The first pull had an 8/12 probability of being a blank; given a blank on the first pull, the second pull is a blank with probability 9/12.) II. From a frequentist perspective: when the first pull is a blank, the event space shrinks from 6 possible chamber positions to 4. Of those four, exactly one leaves the cylinder sitting on the blank right before a bullet. So 3 of the 4 are blanks (3/4) and 1 is a bullet (1/4).
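The two conditional probabilities above can be checked by direct enumeration of the cylinder. A sketch (Python; assumes adjacent bullets and a uniform spin):

```python
from fractions import Fraction

# Chambers of the cylinder: 4 blanks (0) and 2 adjacent bullets (1).
cylinder = [0, 0, 0, 0, 1, 1]
n = len(cylinder)

# Condition on the first pull having been a blank.
blanks = [i for i in range(n) if cylinder[i] == 0]

# Don't spin: the next pull fires chamber i+1 (mod 6).
survive_no_spin = Fraction(
    sum(cylinder[(i + 1) % n] == 0 for i in blanks), len(blanks)
)

# Spin: the next pull is uniform over all 6 chambers.
survive_spin = Fraction(cylinder.count(0), n)

print(survive_no_spin, survive_spin)  # 3/4 vs 2/3 -> don't spin
```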

### Assistant Trader at Jane Street was asked...

Simulate a 6-sided die with a coin.

13 answers:

- Split into two sets, {1,2,3,4} and {5,6}, with one assigned to heads and the other to tails. If you land in {5,6}, flip again to pick between them. If in {1,2,3,4}, assign {1,2} to heads and {3,4} to tails, then split again.
- Toss three coins to get 8 outcomes: HHH, HHT, HTH, HTT, THH, THT, TTH, TTT. If one of the first two appears, toss all three again; assign 1–6 to the remaining six outcomes individually.
- I propose another solution using 4 coins. Flip 2 coins together and, separately, another 2 coins together. The sample space of a 2-coin flip is HH, TT, HT, TH (assume HT = TH). Assign die values to the pair of 2-coin flips: HH HH → 1, HH HT → 2, HH TT → 3, TT TT → 4, TT HT → 5, HT HT → 6 (with TH = HT).
- With 1 coin, flip three times: HHH = 1, HHT = 2, HTH = 3, HTT = 4, THH = 5, THT = 6. If the first two are TT, abandon the first T and start from the second T.
- The candidate and William have problems; I think Ey's solution works. With the candidate's solution you will get far too many 5s and 6s, since fully 1/2 of the results will be a 5 or a 6. Similarly, with William's solution, if you toss out results starting with HH, you end up keeping ANY result that starts with a tail but only half of those that start with a head — so you end up with more 3s, 4s, 5s, and 6s than you should. By simply changing what you throw out you could fix William's solution: throwing out all HHH and all TTT would be closer, but a solution that doesn't toss out any data is still better.
- Nope, I'm a moron — can't edit the earlier post, but Ey, sorry, I gave you too much credit. You can't count HT and TH as the same thing, because you get too many of them; no idea why I didn't see that 10 minutes ago. You can't map 4-coin results to three numbers fairly: you end up with too many 2s and 5s and way too many 6s. Now we have no valid answers. I think someone proved elsewhere that it can't be done. Interesting problem, though.
- Toss 1: heads = even (start with 2), tails = odd (start with 1). Toss 2: heads = 0, tails = 2, add to the total. Toss 3: heads = 0, tails = 2, add to the total.
- It cannot be done [with a fixed number of flips]. The probability of any fixed set of coin flips is always a power of 2, and 6 is never a factor of a power of 2, because 6 = 2·3 and 3 is prime.
- Eric, you're right — I thought my solution worked as well. My idea was two independent 2-coin flips, each with possible outcomes HH, TT, HT (= TH). Combining them gives 9 possible outcomes: (1) HH (2) HH; (1) HH (2) HT; (1) HH (2) TT; (1) TT (2) HH; (1) TT (2) HT; (1) TT (2) TT; (1) HT (2) HT; (1) HT (2) HH; (1) HT (2) TT. Even if we merge, say, (1) HH (2) TT with (1) TT (2) HH, and similarly with others, to get down to 6 outcomes, it won't work: they aren't equally likely.
- HHH and TTT: start the process again. Assign the rest the die values: HHT = 1, HTH = 2, HTT = 3, THH = 4, THT = 5, TTH = 6.
- Geez, they should really allow nested responses for better discussion. I agree with the approach first stated by William — in his original post, "until neither HHT or HHT appears" was a typo for "HHH or HHT" (or any two of the eight combinations one chooses not to assign values to); the way "Do three flips" describes it is clearer. The other approaches are problematic in that the chances of getting 1–6 are disproportionate, giving an unfair die, for the reasons others stated. "Binary Thinker"'s approach is also incorrect for similar reasons: there are 8 possible outcomes (2³ = 8), and although it's a neat way to account for the numbers 1 through 6, the 3s and 4s are precisely twice as likely to occur (2/8 = 1/4) as the other numbers (1/8).
- You may ask for the probability of THHHHH given that we got a sequence with five heads and one tail. There are 6 such sequences — THHHHH, HTHHHH, HHTHHH, HHHTHH, HHHHTH, HHHHHT — so each has probability 1/6.
- Not a precise solution, but good enough for many purposes: flip the coin a large number of times, interpret the result as a number in base 2, and take it modulo 6. Some of the six numbers will be slightly disadvantaged, but you can make that difference arbitrarily small by increasing the number of coin flips.
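The correct rejection-sampling scheme (discard two of the eight triples, map the rest to 1–6) takes only a few lines. A sketch (Python, arbitrary seed):

```python
import random

def d6(rng):
    """Fair d6 from fair coin flips via rejection sampling.

    Three flips form a 3-bit number 0..7; values 6 and 7 are discarded
    and the whole triple is redrawn, leaving six equally likely outcomes.
    """
    while True:
        value = (4 * (rng.random() < 0.5)
                 + 2 * (rng.random() < 0.5)
                 + (rng.random() < 0.5))
        if value < 6:
            return value + 1  # uniform on 1..6

rng = random.Random(0)
counts = [0] * 7
for _ in range(60_000):
    counts[d6(rng)] += 1
print(counts[1:])  # six frequencies, each near 10,000
```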

### Quantitative Trader at Jane Street was asked...

N points lie on a circle. You draw lines connecting all the points to each other; these lines divide the circle into a number of regions. How many regions is this? Assume the points are scattered so as to give the maximum number of regions for that N.

12 answers:

- C(n,4) + C(n,2) + 1.
- I don't think the above answer is right. What happens if N is less than 4? Even for N at least four it doesn't seem to work: with 4 lines drawn you can make 11 regions, but Dimitar's formula gives 4C4 + 4C2 + 1 = 8. The way you do it: start with 1 region and 0 lines, then draw 1 line to get 2 regions. When the maximum number of regions is created, each new line crosses every previous line, creating a new region per crossing plus one more just by being drawn. So the 2nd line makes 2 new regions, the 3rd makes 3, the 4th makes 4, and so on. The answer is the sum of the first N natural numbers plus 1: N(N+1)/2 + 1.
- Hi, I also get Dimitar's formula — you can use recursion, which takes some time. Is there an easier way? By the way, with 4 dots on the circle I think it should still give 8 regions; 11 answers a different problem — the maximum number of pieces from N straight cuts of a circle.
- I don't think that is correct. I can't see a way, obeying these rules, to get 11 sections from 4 points. Perhaps the formula is 2^(N−1)?
- I got 2^(N−1) as well: 1 dot → 1 region, 2 dots → 2 regions, 3 dots → 4 regions, 4 dots → 8 regions. 2^(N−1) for N ≥ 1.
- The unanswered question (and the reasoning behind the combinations formula) is whether two adjacent regions may have the line dividing them ignored, so that the pair also counts as a region. Dimitar's way assumes you may erase a line; Mike's way does not. For Dimitar's way, I think the combinations don't just stop at nC4 + nC2 but keep going up to n (Dimitar just stopped while testing base cases). Mike's way assumes a region cannot be made up of sub-regions, which I feel is the more correct reading, since it gives a scalable answer without summation notation — though you should ask the interviewer which they'd prefer. Take the 3-dot example. Mike's count: the middle triangle plus the 3 arc regions, so 4. Dimitar's count: Mike's plus the merged regions, 4 + 3C1 + 3C2 + 3C3 = 4 + 3 + 3 + 1 = 11, which seems to break down into 2·n − 1 plus a sum of binomials. I dislike Dimitar's way because it ends up counting the entire circle as a "region," which goes against the spirit of the problem.
- This is a famous example in mathematics, often used as a warning against naive generalization. The answers for the first six natural numbers (# points : # regions): 1:1, 2:2, 3:4, 4:8, 5:16, 6:31. Yes, 31. See, e.g., Conway and Guy's "The Book of Numbers" for an account.
- It is "Moser's circle problem": 1 + nC2 + nC4.
- Srsly? Draw 4 lines yourself and you can easily find it's 11.
- The answer should be 1 + (n−1)·n·(n² − 5n + 18)/24. At first I thought it was 2^(n−1), because the first 5 terms match, but for n = 6 it is 31. Solution by recursion: let f(n) be the number of regions for n points, and count how many new regions appear when an (n+1)-th point is added. 1) Outside the polygon of the original n points, n new regions are created. 2) Inside the polygon, each time a line from the new point to an original point cuts an existing line, one more region is created. Label the original points clockwise 1, 2, …, n; the line from the new point to point i cuts (i−1)(n−i) lines — the number of points before point i times the number of points after it. Summing (i−1)(n−i) over i (a sum that also appears as a question in "Heard on the Street": guess the closed form and prove it by induction) gives f(n+1) = f(n) + n + n(n−1)(n−2)/6. Unwinding the recursion with the formulas for Σk, Σk², Σk³ yields the closed form above.
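The consensus formula (Moser's circle problem) is easy to tabulate and shows the famous break from the powers of 2 at n = 6. A sketch (Python):

```python
from math import comb

def regions(n):
    """Maximum number of regions a circle is divided into by the chords
    between n boundary points: 1 + C(n,2) + C(n,4) (Moser's circle problem)."""
    return 1 + comb(n, 2) + comb(n, 4)

print([regions(n) for n in range(1, 8)])  # [1, 2, 4, 8, 16, 31, 57]
```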

### Assistant Trader at Jane Street was asked...

You have a box filled with cash. The cash value is uniformly randomly distributed from 1 to 1000. You are trying to win the box in an auction: you win the box if you bid at least the value of the cash in the box; you win nothing if you bid less (but you lose nothing). If you win the box, you can resell it for 150% of its value. How much should you bid to maximize the expected value of your profit (resale of box minus bid)?

11 answers:

- My bad — I meant "0 to 1000," not "1 to 1000." Not that it affects the answer (hint, hint).
- The expectation is always negative?
- Bid 0. The expectation is negative.
- Can someone explain why that is the case, mathematically? I thought the expected payoff is (0 + 1000)/2 × 1.5 = 750, so if you bid 500 you get an expected profit of 250.
- Anon: if you bid 500 and the box is worth 1, you only get paid $1.50, so you lose $498.50. $0 is correct. Generalize this for small ranges (1, 2, 5, 10, say) and find the EV of each; you will be able to "guess" that it isn't getting any better and you should bid 0.
- You should bid 1, the minimum in the box, so you will win 0 or 0.5.
- If you bid 1 you'll never win. If you bid 500, you're missing out whenever the value of the box is above 500 and below 750, since those are +EV too. I believe the answer is 749.
- The answer is to bid $0. Solution: if you bid X dollars, you get the box if it contains Y ≤ X dollars, and otherwise nothing happens, which doesn't hurt you. Now think of the possible values of Y in PAIRS: (0, X), (1, X−1), (2, X−2), … For each pair (a, b), the expected payoff contribution is P(Y ∈ {a, b}) · E(payoff | Y ∈ {a, b}), and the second factor is ½(1.5a − X) + ½(1.5b − X) = ¾(a + b) − X = −¼X, which is always negative. Summing over all pairs gives a negative total. (You need to handle the edge case where X is small and the pairing doesn't quite work.)
- I think that since we are dealing with expected profit, we should ignore the case where n (our bid) is less than X (the cash in the box), as nothing happens then. So condition on X uniformly distributed on [0, n]: E(Profit) = E(1.5X − n) = 1.5·E(X) − n = 1.5·(n/2) − n = 0.75n − n = −0.25n ≤ 0. The expected profit is maximized (= 0) at n = 0.
- Sorry, I seem to have made a mistake in my solution above. Let X = cash value and n = our bid; the profit on a win is 1.5X − n. X is uniform, so each outcome has probability 1/1001, and we sum only over the values 0 to n. Hence the expected resale is 1.5·n(n+1)/(2·1001), while we pay n with probability (n+1)/1001, i.e. expected payment n(n+1)/1001. Therefore E(Profit) = 1.5·n(n+1)/(2·1001) − n(n+1)/1001 = −0.25·n(n+1)/1001 ≤ 0. To maximize profit, we should bid zero.
- The answer is 0 if you ask me, and here is the reason. Let V be the value of the box and x your bid; the profit is P = (1.5V − x)·1{V ≤ x}. Taking expectations and conditioning: E(P) = E(1.5V − x | V ≤ x)·P(V ≤ x) = (1.5·(x/2) − x)·(x/1000) = −0.25·x²/1000.
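A brute-force check over all integer bids confirms that 0 is optimal (Python; uses the 0–1000 range from the poster's correction, so 1001 equally likely values, with exact rational arithmetic):

```python
from fractions import Fraction

def expected_profit(bid):
    """E[profit] for an integer bid, box value uniform on 0..1000:
    you pay `bid` and collect 1.5 * value whenever value <= bid."""
    # Each winning value v contributes (1.5v - bid); scale by 2 to stay integral.
    return Fraction(sum(3 * v - 2 * bid for v in range(bid + 1)), 2 * 1001)

best = max(range(1001), key=expected_profit)
print(best, expected_profit(500))  # bidding 0 is optimal; e.g. EV(500) < 0
```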

You are playing a game where the player draws a number from 1–100 out of a hat, and may replace and redraw as many times as they want, their final number being how many dollars they win from the game. Each redraw costs an extra $1. How much would you charge someone to play this game?

10 answers:

- 10?
- Redraw 10 times and get a payoff around 77?
- The average draw will pay out $50.50.
- Every time you decide whether to redraw, consider the two options: stop now and keep the current number (the $1 already paid is a sunk cost), or continue, with expected benefit $50.50 − 1 = $49.50. So you should stop as soon as you have drawn $50 or more. Suppose the game ends after N rounds — probability (49%)^(N−1) × 51% — and in the last round the expected number is (50 + 100)/2 = 75, so the expected net benefit is 75 − N (which requires N ≤ 74). Taking the sum Σ_{N=1}^{74} (49%)^(N−1) × 51% × (75 − N) gives about 73.
- All of the above answers are way off. For a correct answer, see Yuval Filmus' answer at StackExchange: http://math.stackexchange.com/questions/27524/fair-value-of-a-hat-drawing-game. He proves the value of the game is 1209/14 ≈ 86.36 and the strategy is to stay on 87 and redraw on 86 and below.
- Let x be the expected value and n the smallest number you'll stop at. Set up an equation relating x and n, solve for x in terms of n, take the derivative to find the n that maximizes x, then plug in the ceiling (since n must be an integer) and evaluate. The ceiling comes out to 87 and x ≈ 87.357, so charge $87.36 or more.
- I guess the question asks for the expected value of the game under an optimal strategy. Suppose the strategy is to redraw on 50 or less. The expected value of each round is then: (1) ½ · 1/50 · (51 + 52 + ⋯ + 100); (2) ½ · ½ · 1/50 · (51 + ⋯ + 100) − ½; (3) (½)³ · 1/50 · (51 + ⋯ + 100) − ¼; and so on. Summing to infinity gives 74.50.
- This is all very interesting, and I'm sure it has some application... but to trading? I don't think so. I own a seat on the futures exchange and was one of the largest derivatives traders on the floor. Math skills and reasoning are important, but not to this level. I would compare day trading and scalping to race-car driving: getting a feel for what's going on during the day, the speed at which the market is moving, and the tempo of the up and down moves. If I were the interviewer at one of these firms, I'd throw a baseball at your head and see if you were quick enough to duck; then, if you picked it up and threw it back at my head, I'd know you had the nerve to trade. I know guys who can answer these questions, work at major banks, have a team of quants working for them — and call me up to borrow money because they're not making any. At the end of the day, if you want to be a trader, be a trader; if you want to be a mathematician, be a mathematician. It's cool to multiply a string of numbers in your head — I can do this also — but never in my trading career did I make money because I could instantly compute 87 × 34 or answer Mensa questions. Realistically, the answer is: it depends on the market, as the market will dictate the price. You may want to charge $87 to play that game, but you'd have to be an idiot to pay it. In trading terms: when AAPL was trading at $700 everyone would have loved to buy it at $400; now that it's trading at $400, everyone is afraid it's going to $0. Hope this helps. No offense to the math guys on this page — just want to set the trading record straight.
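The optimal threshold is easy to find by brute force over stopping rules. A sketch (Python; it assumes, per the question's wording, that only *redraws* cost $1 and the first draw is free — charging every draw instead shifts the value down by exactly $1, which reconciles the thread's 87.357 with Yuval Filmus' 1209/14 ≈ 86.36):

```python
from fractions import Fraction

def game_value(threshold):
    """Expected winnings if you keep any draw >= threshold and otherwise
    pay $1 to redraw; draws are uniform on 1..100, first draw free."""
    p_stop = Fraction(101 - threshold, 100)   # chance a given draw ends the game
    mean_stop = Fraction(threshold + 100, 2)  # average value of the final draw
    redraws = 1 / p_stop - 1                  # geometric waiting time, minus the free draw
    return mean_stop - redraws

best = max(range(1, 101), key=game_value)
print(best, float(game_value(best)))  # 87, ≈ 87.357 (= 1223/14)
```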

Design patterns questions.

10 answers:

- Hi, did you get a call for an onsite for this position? I have it on Monday, 3/25. If you went through the onsite interview, please share your experience.
- I didn't get a call for an onsite yet. When was your first interview?
- It was on 3/14; within 2 days they scheduled the onsite. By the way, what design pattern questions did they ask you in the first round? I was asked only about the Observer pattern and the Singleton pattern.
- Yeah, the same ones — I already listed them. I had difficulty answering the Observer pattern question, so I listed it here so that others can prepare well :) Onsite they will ask you to write small pieces of code, I guess, but nothing too complicated. I've heard that the Fidelity interview is not as tough as at tech companies. So prepare well and answer confidently. All the best for the interview.
- Did you guys do the onsite interview? How was the experience?
- Could you share your onsite experience?
- @above: Dude, my onsite is scheduled for next week, so I don't have any experience to share apart from the phone interview questions. Now it's your turn to share your experience so I can do better next week.
- Hi, I had my phone interview this week. It didn't go well, but I have a similar interview with the same company at a different location. Can someone please share the questions that were asked onsite? Thanks.
- Hello, I heard the onsite questions were about MapReduce scenarios — e.g., how to approach a situation the interviewer describes and design the map and reduce tasks (orally, not written) — plus various behavioral questions.
- Has anybody actually done the onsite interview? Please share your experience.
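The phone screen reportedly covered the Observer and Singleton patterns. For preparation, a minimal Observer sketch (in Python here, though the interview may expect another language; the `Subject`/`ticker` names are illustrative, not from the interview):

```python
class Subject:
    """Minimal Observer pattern: observers register callbacks, and every
    registered observer is notified when the subject publishes an event."""

    def __init__(self):
        self._observers = []

    def attach(self, observer):
        self._observers.append(observer)

    def notify(self, event):
        for observer in self._observers:
            observer(event)

log = []
ticker = Subject()
ticker.attach(lambda event: log.append("trader saw " + event))
ticker.attach(lambda event: log.append("auditor saw " + event))
ticker.notify("quote update")
print(log)  # both observers received the event
```

The key design point to articulate in an interview: the subject knows nothing about its observers beyond the callback interface, so observers can be added or removed without changing the subject.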

### Sales Strat Intern at Goldman Sachs was asked...

Suppose we hire you, and you and the rest of the new interns decide to go buy a cup of coffee. Each intern purchases one cup of coffee. One of the interns suggests everyone play a game. Everyone will flip a fair coin, dividing the group of interns into two subgroups: those that got heads and those that got tails. The game is this: whichever group is smaller evenly splits the cost of everyone's cup of coffee (i.e. if there are 5 interns, 3 get H, 2 get T, then the two interns that got tails each buy 2.5 cups of coffee). However, nothing says you need to play this game. You can choose to buy your own cup of coffee and not play the game at all. The question: Should you play this game? (Note: You may assume that there is an odd number of interns, so there are no ties, and that if everyone gets H or everyone gets T, then everyone loses and just buys their own cup of coffee). 18 AnswersHint: Despite its look, this is not a math question. I dont get it, Would you please provide more hints? Assume each coffee costs $1, for simplicity. So this is effectively a choice between two outcomes: paying $1 with probability 100%, or paying $0 with some probability and paying more than $0 with some probability. So you ask yourself: what is your expected cost in the second case? Give that a try and see if you can figure it out. However, I want to remind you that the question is "should you play this game?" The answer to this question isn't just a math question. If you only work out expected values, you've missed the point. For example, a separate question (with the same kind of flavor as the direction I'm trying to lead you) is this: suppose I give you a choice of two outcomes. Either you get $1 with 100% probability or I give you $500,000,000 with probability 1/100,000,000 and 0 otherwise. Which would you pick? Now what if it was the same first choice, but the second choice was $50 with prob 1/10 and 0 otherwise? Now which would you pick? 
These are the kinds of things you want to think about while answering this kind of question. Let me know if you have any more questions. And if you want me to post the answer, just let me know.

I think I get it. It's about the investor's risk appetite. Investors are likely to take guaranteed gains, here $1.

Well, yes and no. This is indeed a risk-aversion question. If you work it out, you'll find that the EV in each case is exactly the same (your EV is -1 cup of coffee in both scenarios), but that's not the end of it. It's also really a question for them to test your risk aversion. You can really support either answer, and *should* comment on the validity of either answer. My answer was to buy my own cup of coffee, and I followed it up with a story in which a friend of mine had tried to get us to play credit card roulette (which is similar in spirit to this game), and I told him that I did, in fact, say no in that instance and why I said no. However, traditionally people are risk-averse when faced with gains and risk-seeking when faced with losses, so many would probably choose to play the game. But this is as much a question about your psychology as it is about your math skills. And that's why this is such an awesome question, and it is probably a question that kills most people they ask it to.

Would you please explain to me how you get -1 for the second scenario? There are 50% H and 50% T, so each player has a 50% chance of being in the winning group. Given 5 interns, there are two configurations of losing group (4 vs. 1 and 3 vs. 2). EV = 0.5*0 + 0.5*(0.5*(-5) + 0.5*(-2.5)) = -1.875.

The probability of winning isn't 50%. It's actually slightly above 50%, but that's not the way to look at it. The total number of coffees that need to be bought is n, where n is the number of interns. Going into this game, every intern is in the same position, so they each have the same expected value, and the sum of the expected values must equal -n.
So everyone has an EV of -1, as claimed.

Thank you very much. I finally got it.

I'm still not sure why you said no if the EV is the same?

It's a matter of risk preference. See the example I gave in my Mar 19 posting: "Suppose I give you a choice of two outcomes. Either you get $1 with 100% probability, or I give you $500,000,000 with probability 1/100,000,000 and $0 otherwise. Which would you pick? Now what if the first choice were the same, but the second choice was $50 with probability 1/10 and $0 otherwise? Now which would you pick? These are the kinds of things you want to think about while answering this kind of question." You would probably not play in the first instance and would consider playing in the second. (And in those instances the risky game even has a HIGHER EV.) This coffee game is the same kind of game, and even if the EV is the same in each case, the volatility is not. It is usually a good rule of thumb to take the lower-volatility outcome when the EV is the same (think Sharpe ratio here! Or think efficient frontier here! Both get the point across, I think). Does that help?

I say yes, to play the game, but only because I'm prepared if I lose. The fact that I can afford to lose makes me want to try my chance at winning.

I would play the game. Consider the expected price with n total interns splitting a total cost of P. That is (P/2)*E[1/N | you're paying]. This is simply equal to (P/2)*E[1/(N+1)], where N is a binomial now corresponding to the other n-1 interns. Notice that 1/(n+1) is a concave function. That means (P/2)*E[1/(N+1)] <= (P/2)*(1/(E[N]+1)) = (P/2)*(1/((n-1)/2 + 1)) = P/(n+1). So it pays to play on average.

Apologies, but both of the above answers are incorrect. The EV of playing is the same as the EV of not playing.

No way would I play! The most money I save is a dollar; the most money I can lose (in the case of five interns) is 4 dollars.
That's a 400 percent downside versus a 100 percent upside (kind of: you cannot technically compute a return on zero dollars invested). Plus, I don't even drink coffee!

Stupid... I don't have time to play games at work.

The EVs in the two scenarios are not the same: it's clear when you think about the case where there are n = 3 interns.

I've already explained why they are the same. In either scenario there will be n cups of coffee bought (3 in your case), so the total EV is -n (-3 in your case). In the game each person is in the same position as any other, so their EVs must all be the same. That is why in the game the EV is -1 (the same as if they each buy their own cup of coffee). To argue otherwise is to argue that either the total EV is not the number of coffees purchased or that someone has an unfair advantage in the game. Incidentally, if you are going to claim that someone who has given you the answer is wrong, you should provide more of a response than "go think about it."

If V = (what you pay in the game) - 1, then by symmetry all five interns have identically distributed V, and 5*E[V] = 0 because the game is zero sum relative to everyone buying their own coffee. So E[V] = 0.
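The symmetry argument in the thread (every intern's expected cost is exactly one coffee) is easy to check by brute force. Below is a minimal sketch, assuming $1 coffees and n = 5 interns; the function names are mine, not from the thread. It enumerates all 2^5 coin-flip outcomes exactly and also computes the variance of one intern's cost, which is the volatility the risk-preference discussion above turns on.

```python
from fractions import Fraction
from itertools import product

def cost_to_first_intern(flips, price=1):
    """Cost paid by intern 0 for one outcome of the coffee game.

    flips is a tuple of 0/1 coin results, one per intern (1 = heads).
    """
    n = len(flips)
    heads = sum(flips)
    tails = n - heads
    if heads == 0 or tails == 0:
        return Fraction(price)  # no minority: everyone buys their own coffee
    minority_side = 1 if heads < tails else 0
    if flips[0] == minority_side:
        # The minority splits the cost of all n coffees evenly.
        return Fraction(price * n, min(heads, tails))
    return Fraction(0)

def exact_stats(n, price=1):
    """Exact expected cost and variance for one intern, over all 2^n outcomes."""
    outcomes = list(product((0, 1), repeat=n))
    ev = sum(cost_to_first_intern(f, price) for f in outcomes) / len(outcomes)
    var = sum((cost_to_first_intern(f, price) - ev) ** 2
              for f in outcomes) / len(outcomes)
    return ev, var

ev, var = exact_stats(5)
print(ev)   # 1: the same expected cost as just buying your own coffee
print(var)  # 35/16: but with substantial variance, unlike the sure thing
```

The expected cost comes out to exactly $1 either way, so the choice really does reduce to the variance: the game trades a certain $1 for a cost that is usually $0 but occasionally $2.50 or $5.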