# Interview Questions

Interview questions shared by candidates.


### Quantitative Trader at Jane Street was asked...

N points lie on a circle. You draw lines connecting all the points to each other. These lines divide the circle into a number of regions. How many regions is this? Assume that the points are scattered in such a way as to give the maximum number of regions for that N.

**12 Answers**

- n choose 4 + n choose 2 + 1.
- I don't think the above answer is right. What happens if N is less than 4? Even if N is at least four, it still doesn't work. For example, when 4 lines are drawn you can make 11 regions, but Dimitar's formula gives 4C4 + 4C2 + 1 = 8. The way you do it: start with 1 region and 0 lines, then draw 1 line to get 2 regions. When the maximum number of regions is created, each new line crosses every previous line and creates a new region each time it crosses a line; it also creates a new region just by being drawn. So the 2nd line makes 2 new regions, the 3rd line makes 3 new regions, the 4th line makes 4 new regions, and so on. So the answer is the sum of the first N natural numbers plus 1; in general, N(N+1)/2 + 1.
- Hi, I also get Dimitar's formula; you can use recursion to derive it, which takes some time. Is there an easier way? By the way, with 4 dots on a circle I think it should still give 8 regions. I think 11 is the answer to a different problem: the maximum number of pieces you can make with n straight cuts of a circle.
- I don't think that is correct. I can't see a way that, while obeying these rules, you can get 11 sections from 4 points. Perhaps the formula is 2^(N-1)?
- I got 2^(N-1) as well: 1 dot gives 1 region, 2 dots give 2, 3 dots give 4, 4 dots give 8. So 2^(N-1) for N greater than or equal to 1.
- The unanswered question (and the reasoning behind the combinations formula) is whether two adjacent regions may have the line dividing them ignored, so that the two counted together form a new region. Dimitar's way assumes you may erase a line and Mike's way does not. However, for Dimitar's way I think the combinations do not stop at nC4 + nC2 but keep going up to n (Dimitar just stopped at the small base cases when testing). Mike's way assumes each region cannot be made up of sub-regions, which I feel is the more correct reading, since it gives a scalable answer without summation/set notation, though you should ask the interviewer which he'd prefer. Take the 3-dot example. Mike's count: the middle triangle plus the 3 arc regions, so 4. Dimitar's count: Mike's 4 plus 3C1 + 3C2 + 3C3 = 4 + 3 + 3 + 1 = 11, which seems to break down into 2^(n-1) plus a summation of the nCk terms. I dislike Dimitar's way because it invariably counts the entire circle as a "region" that has been created, which seems to go against the spirit of the problem.
- This is a famous example in mathematics; it's often used as a warning against naive generalization. Here are the answers for the first six natural numbers, (# points) : (# regions) — 1 : 1, 2 : 2, 3 : 4, 4 : 8, 5 : 16, 6 : 31. Yes, 31. See, e.g., Conway and Guy's "The Book of Numbers" for an account of this. It is "Moser's circle problem": 1 + nC2 + nC4.
- Srsly? Draw 4 lines yourself and you can easily find it's 11.
- The last answer should be 1 + (n-1)*n*(n^2-5n+18)/24. At first I thought it was 2^(n-1), because the first 5 terms agree, but for n = 6 it should be 31. Solution by recursion: let f(n) be the number of regions for n points, and count how many new regions appear when we add a new point. (1) Outside the polygon of the original n points, n new regions are created. (2) Inside the polygon, each time the line from the new point to an original point crosses an existing line, one more region appears. Label the original points clockwise 1, 2, ..., n; the line from the new point to point i crosses (i-1)(n-i) lines — the number of points before point i times the number of points after it. Summing (i-1)(n-i) over i reduces to sums of k and of k(k+1); the latter is in fact a question in the book Heard on the Street (guess the closed form k(k+1)(k+2)/3 and prove it by induction). Altogether, f(n+1) = f(n) + n + n(n-1)(n-2)/6, and unrolling the recursion with the formulas for sum(k^3), sum(k^2), and sum(k) yields the closed form above.
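The closed form quoted in the later answers (Moser's circle problem) is easy to check numerically; a quick sketch in Python, with the function name my own:

```python
from math import comb

def moser_regions(n: int) -> int:
    """Maximum number of regions formed by chords between n points on a circle."""
    return 1 + comb(n, 2) + comb(n, 4)

# The sequence matches 2^(n-1) only up to n = 5, then diverges:
print([moser_regions(n) for n in range(1, 8)])  # [1, 2, 4, 8, 16, 31, 57]
```

This makes the "warning against naive generalization" concrete: the first five terms look like doubling, but the sixth is 31, not 32.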

### Assistant Trader at Jane Street was asked...

You have a box filled with cash. The cash value is uniformly randomly distributed from 1 to 1000. You are trying to win the box in an auction: you win the box if you bid at least the value of the cash in the box; you win nothing if you bid less (but you lose nothing). If you win the box, you can resell it for 150% of its value. How much should you bid to maximize the expected value of your profit (resale of box minus bid)?

**11 Answers**

- My bad, I meant "0 to 1000," not "1 to 1000." Not that it affects the answer (hint, hint).
- The expectation is always negative? Bid 0.
- The expectation is negative.
- Can someone explain why that is the case mathematically? I thought the expected payoff is (0 + 1000)/2 × 1.5 = 750, so if you bid 500 you get an expected profit of 250.
- Anon: if you bid 500 and the box is worth 1, you only get paid $1.50, so you lose $498.50. $0 is correct. Generalize this for ranges like 0–2, 0–5, 0–10, find the EV of each, and you will be able to "guess" that it isn't getting any better and you should bid 0.
- You should bid 1, the minimum in the box, so you win 0 or 0.50.
- If you bid one you'll never win. If you bid 500, you're missing out when value(box) is above 500 and below 750, since those are +EV too. I believe the answer is 749.
- The answer is to bid 0 dollars. Solution: if you bid X dollars, you get the box if it contains Y dollars, where Y <= X; otherwise you get nothing, which doesn't hurt anyway. Now think of the possible values of Y in pairs: (0, X), (1, X-1), (2, X-2), ... For each pair (a, b), the expected payoff is P(Y = a or b) × E(payoff | Y = a or b), and the second factor is 1/2 (3/2 a - X) + 1/2 (3/2 b - X) = 3/4 (a + b) - X = (-1/4) X, which is always negative. Summing over all possible pairs gives a negative number. Of course you need to handle the edge case where X is small and the pairing does not quite work.
- I think that since we are dealing with expected profit, we should ignore the case when n (our bid) is less than X (the cash in the box), as nothing happens then. So we create a new distribution where X is uniform on [0, n]. Then E(Profit) = E(1.5X - n) = 1.5 E(X) - n = 1.5 × n/2 - n = 0.75n - n = -0.25n <= 0. So the expected profit is maximized (= 0) at n = 0.
- Sorry, guys. I seem to have made a mistake in my solution above. Let X be the cash value and n our bid price. Then Profit = 1.5X - n when we win. X is uniformly distributed, so every outcome has probability 1/1001, and we only sum over the values from 0 to n. Hence E(payout) = 1.5 × n(n+1)/(2 × 1001). We pay the amount n with probability (n+1)/1001 and 0 otherwise, so E(cost) = n(n+1)/1001. Therefore E(Profit) = 1.5 × n(n+1)/(2 × 1001) - n(n+1)/1001 = -0.25 × n(n+1)/1001 <= 0. In order to maximize profit, we should bid zero.
- The answer is 0 if you ask me, and here is the reason. Let V be the value of the box and x your bid; then your profit is P = (1.5V - x) 1_{V <= x}. Taking expectations and conditioning gives E(P) = E(1.5V - x | V <= x) P(V <= x), which is negative for any positive bid.

### Software Engineer at Google was asked...

Write a function Brackets(int n) that prints all combinations of well-formed brackets. For Brackets(3) the output would be: ((())) (()()) (())() ()(()) ()()()

**11 Answers**

- A Java solution:

  ```java
  public class Parenth2 {
      static int total = 3;

      private static void Brackets(String output, int open, int close, int pairs) {
          if (open == pairs && close == pairs && output.length() == total * 2) {
              System.out.println(output);
          } else {
              if (open < pairs) Brackets(output + "(", open + 1, close, pairs);
              if (close < open) Brackets(output + ")", open, close + 1, pairs);
          }
      }

      public static void main(String[] args) {
          Brackets("", 0, 0, total);
      }
  }
  ```

- You almost got the answer, just a couple of errors:

  ```java
  public class Parenth4 {
      private static void Brackets(String output, int open, int close, int pairs,
                                   boolean opened, boolean closed, int total) {
          if (open == pairs && close == pairs && output.length() == total * 2) {
              System.out.println(output);
          } else {
              if (open < pairs) Brackets(output + "(", open + 1, close, pairs, true, closed, total);
              if (close < open && opened) Brackets(output + ")", open, close + 1, pairs, opened, closed, total);
          }
      }

      public static void Brackets(int total) {
          Brackets("", 0, 0, total, false, false, total);
      }

      public static void main(String[] args) {
          Brackets(Integer.parseInt(args[0]));
      }
  }
  ```

- This is a DP/memoization question, I believe. The base cases are 0 and 1 brackets (the answer is the empty string or "()"). The recurrence is: bracket(n) = all combinations of results from bracket(n-1) with bracket(1), from bracket(n-2) with bracket(2), ..., from bracket(1) with bracket(n-1); and lastly '(' + bracket(n-1) + ')'. The DP version of this recurrence is straightforward. (This recurrence obviously produces duplicates, but it's not hard to modify it so that it does not.)
- ;)
- Here is an F# implementation:

  ```fsharp
  let rec Br total output openp closep =
      if openp = total && closep = total then
          printfn "%s" output
      else
          if openp < total then Br total (output + "( ") (openp + 1) closep
          if closep < openp then Br total (output + " )") openp (closep + 1)

  let Brackets total = Br total "" 0 0
  Brackets 5
  let read = System.Console.ReadLine()
  ```

- A C++ version:

  ```cpp
  void parens(string ans, int leftCount, int rightCount) {
      if (leftCount == 0 && rightCount == 0) {
          cout << ans << endl;
      }
      if (leftCount > 0) {
          parens(ans + "(", leftCount - 1, rightCount);
      }
      if (rightCount > leftCount) {
          parens(ans + ")", leftCount, rightCount - 1);
      }
  }

  void parens(int nPairs) { parens("", nPairs, nPairs); }
  ```

- To remove duplicates, simply use a Java Set :)

  ```java
  public static String bracket(int a) {
      Set<String> s = new TreeSet<String>();
      bracketIntern(s, a, "", "");
      System.out.println(s);
      return s.toString();
  }

  public static void bracketIntern(Set<String> set, int a, String preFix, String suffix) {
      if (a == 1) {
          set.add(preFix + "()" + suffix);
          return;
      }
      bracketIntern(set, a - 1, preFix + "()", suffix);
      bracketIntern(set, a - 1, preFix + "(", ")" + suffix);
      bracketIntern(set, a - 1, preFix, "()" + suffix);
  }
  ```

- I think you can also build a trie, and then traverse the trie to print out all the combinations.
- Complete Python program to print all combinations of well-formed brackets. How do you calculate its Big-O? The recursive recurrence seems a little complicated. It wasn't all that neat, oh well.

  ```python
  def brackets(n):
      sol = []
      li = [' ' for x in range(n * 2)]

      def recu_brackets(opened, closed):
          if n - opened:
              li[opened + closed] = '('
              recu_brackets(opened + 1, closed)
          if n - closed and opened > closed:
              li[opened + closed] = ')'
              recu_brackets(opened, closed + 1)
          if opened == n and closed == n:
              sol.append(''.join(li))

      recu_brackets(0, 0)
      print ' '.join(sol)

  brackets(3)
  ```

- A Java version built on the Catalan-style recurrence:

  ```java
  public class MakeBrackets {
      static List<String> make(int number) {
          List<String> list = new LinkedList<String>();
          if (number == 0) { list.add(""); return list; }
          if (number == 1) { list.add("()"); return list; }
          for (int i = 0; i < number; i++) {
              for (String item : make(i)) {
                  for (String item2 : make(number - 1 - i)) {
                      list.add("(" + item + ")" + item2);
                  }
              }
          }
          return list;
      }

      public static void main(String[] args) {
          System.out.println(make(Integer.parseInt(args[0])));
      }
  }
  ```

- A counting Python version:

  ```python
  def brackets(n):
      return bracketsPrefix("", n, n)

  def bracketsPrefix(prefix, opens, closes):
      total = 0
      assert opens <= closes
      if opens == 0 and closes == 0:
          print prefix
          total += 1
      if closes > opens:
          total += bracketsPrefix(prefix + ")", opens, closes - 1)
      if opens > 0:
          total += bracketsPrefix(prefix + "(", opens - 1, closes)
      return total

  if __name__ == "__main__":
      for i in range(6):
          n = brackets(i)
          print i, n
      x = raw_input("** ")
  ```

- This question gets asked so often, and I found that this article has a really good explanation: https://goo.gl/E0m0EC
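For reference, the standard backtracking approach from the answers above can be written compactly as a Python 3 generator (the number of outputs is the nth Catalan number); a sketch:

```python
def brackets(n):
    """Yield all well-formed strings of n bracket pairs, via backtracking."""
    def walk(s, opens, closes):
        if len(s) == 2 * n:
            yield s
            return
        if opens < n:                       # can still open a bracket
            yield from walk(s + "(", opens + 1, closes)
        if closes < opens:                  # can only close what is open
            yield from walk(s + ")", opens, closes + 1)
    yield from walk("", 0, 0)

print(" ".join(brackets(3)))  # ((())) (()()) (())() ()(()) ()()()
```

The two guards guarantee every emitted string is well-formed, so no deduplication step is needed.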

### Sales Strat Intern at Goldman Sachs was asked...

Suppose we hire you, and you and the rest of the new interns decide to go buy a cup of coffee. Each intern purchases one cup of coffee. One of the interns suggests everyone play a game. Everyone will flip a fair coin, dividing the group of interns into two subgroups: those that got heads and those that got tails. The game is this: whichever group is smaller evenly splits the cost of everyone's cup of coffee (i.e., if there are 5 interns, 3 get heads and 2 get tails, then the two interns that got tails each buy 2.5 cups of coffee). However, nothing says you need to play this game. You can choose to buy your own cup of coffee and not play the game at all. The question: should you play this game? (Note: you may assume that there is an odd number of interns, so there are no ties, and that if everyone gets heads or everyone gets tails, then everyone loses and just buys their own cup of coffee.)

**18 Answers**

- Hint: despite its look, this is not a math question.
- I don't get it. Would you please provide more hints?
- Assume each coffee costs $1, for simplicity. This is effectively a choice between two outcomes: paying $1 with probability 100%, or paying $0 with some probability and paying more than $0 with some probability. So ask yourself: what is your expected cost in the second case? Give that a try. But remember the question is "should you play this game?" — if you only work out expected values, you've missed the point. A separate question with the same flavor: suppose I give you a choice of two outcomes. Either you get $1 with 100% probability, or I give you $500,000,000 with probability 1/100,000,000 and 0 otherwise. Which would you pick? Now what if the second choice were $50 with probability 1/10 and 0 otherwise? Now which would you pick? These are the kinds of things you want to think about while answering this kind of question.
- I think I get it. It's about risk appetite: investors are likely to take the guaranteed outcome, here paying $1.
- Well, yes and no. This is indeed a risk-aversion question. If you work it out, the EV in each case is exactly the same (your EV is -1 cup of coffee in both scenarios), but that's not the end of it. It's also a question to test your risk aversion. You can support either answer and *should* comment on the validity of both. My answer was to buy my own cup of coffee, followed by a story about a friend who tried to get us to play credit card roulette (similar in spirit to this game) and how I said no in that instance and why. However, traditionally people are risk-averse when faced with gains and risk-seeking when faced with losses, so many would probably choose to play. This is as much a question about your psychology as about your math skills, and that's why it's such an awesome question that probably kills most people it's asked of.
- Would you please explain how you get -1 for the second scenario? There are 50% heads and 50% tails, so a player has a 50% chance of being in the winning group. Given 5 interns, there are two kinds of losing group (4 vs. 1 and 3 vs. 2): EV = 0.5 × 0 + 0.5 × (0.5 × -5 + 0.5 × -2.5) = -1.875.
- The probability of winning isn't 50% (it's actually slightly above 50%), but that's not the way to look at it. The total number of coffees that need to be bought is n, where n is the number of interns. Going into this game, every intern is symmetric, so they each have the same expected value, and the sum of the expected values must equal -n. So everyone has an EV of -1, as claimed.
- Thank you very much. I finally got it. But I'm still not sure why you said no if the EV is the same?
- It's a matter of risk preference. See the lottery examples in my earlier posting: you would probably not play in the first instance and might consider playing in the second, and those risky games even have a *higher* EV. This coffee game is the same kind of game: even though the EV is the same in each case, the volatility is not. It is usually a good rule of thumb to take the lower-volatility outcome when the EVs are equal (think Sharpe ratio, or the efficient frontier — both get the point across, I think). Does that help?
- I say yes, play the game, but only because I'm prepared if I lose. The fact that I can afford to lose makes me want to try my chance at winning.
- I would play the game. Consider the expected price with n total interns splitting a total cost of P: that is P/2 × E[1/N | you're paying], which equals P/2 × E[1/(N+1)] where N is a binomial corresponding to the other n-1 interns. Notice that 1/(N+1) is a concave function. That means P/2 × E[1/(N+1)] <= P/2 × 1/(E[N]+1) = P/2 × 1/((n-1)/2 + 1) = P/(n+1). So it pays to play on average.
- Apologies, but both of the above answers are incorrect. The EV of playing is the same as the EV of not playing.
- No way would I play! The most money I save is a dollar; the most money I can lose (in the case of five interns) is four dollars. That's a 400 percent downside versus a 100 percent upside (kind of — you cannot technically compute your return on zero dollars invested). Plus, I do not even drink coffee!
- Stupid... I don't have time to play games at work.
- The EVs in the two scenarios are not the same: it's clear when you think about the case where there are n = 3 interns.
- I've already explained why they are the same. In either scenario, n cups of coffee are bought (3 in your case), so the total EV is -n (-3 in your case). In the game, every person is symmetric, so their EVs must all be equal; that is why the EV in the game is -1, the same as buying your own cup. To argue otherwise is to argue that either the total EV is not the number of coffees purchased or that someone has an unfair advantage in the game.
- If V = (payment in the game) - 1, then all five interns have the same distribution of V by symmetry, so 5 E[V] = 0.
- It's zero-sum.
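The symmetry argument from the thread (every intern's expected cost is exactly one coffee, whether or not they play) is easy to sanity-check by simulation; a sketch, with function names and parameters my own:

```python
import random

def play_round(n_interns: int, price: float = 1.0) -> list:
    """One round of the coffee game; returns each intern's payment."""
    flips = [random.random() < 0.5 for _ in range(n_interns)]
    heads = sum(flips)
    tails = n_interns - heads
    if heads == 0 or tails == 0:          # degenerate round: everyone buys their own
        return [price] * n_interns
    losers_are_heads = heads < tails      # the smaller group pays for everyone
    share = price * n_interns / min(heads, tails)
    return [share if f == losers_are_heads else 0.0 for f in flips]

random.seed(0)
rounds = 200_000
mean_cost = sum(play_round(5)[0] for _ in range(rounds)) / rounds
print(mean_cost)  # close to 1.0: playing costs the same, on average, as buying your own
```

The simulation only confirms the EV; it says nothing about the variance, which is exactly the point the interviewers are probing.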

You are playing a game where the player gets to draw a number 1–100 out of a hat, and may replace it and redraw as many times as they want, with their final number being how many dollars they win from the game. Each redraw costs an extra $1. How much would you charge someone to play this game?

**10 Answers**

- 10?
- Redraw 10 times and get a payoff around 77?
- The average draw will pay out $50.50.
- Every time you decide whether to draw once more, compare the two options: stop now and keep the current number (the $1 is a sunk cost), or continue, with expected benefit $50.50 - $1 = $49.50. So as long as you have drawn $50 or more, you should stop. Suppose the game ends after N rounds, with probability 0.49^(N-1) × 0.51; in the last round the expected number is (50 + 100)/2 = 75, so the expected net benefit is 75 - N. Taking the sum, Σ_{N=1}^{74} 0.49^(N-1) × 0.51 × (75 - N), gives about 73.
- All of the above answers are way off. For a correct answer, see Yuval Filmus' answer at StackExchange: http://math.stackexchange.com/questions/27524/fair-value-of-a-hat-drawing-game — it proves that the value of the game is 1209/14 = 86.37 and that the strategy is to stay on 87 and throw again on 86 and below.
- Let x be the expected value and n the smallest number you'll stop at. Set up an equation relating x and n, express x in terms of n, take the derivative to find the n that maximizes x, and plug in the ceiling (n must be an integer). The ceiling ends up being 87 and x is 87.357, so charge $87.36 or more.
- I guess the question asks for the expected value of the game under an optimal strategy. Suppose the strategy is to draw again on 50 or less. Then the expected value of each round is: (1) 1/2 × 1/50 × (51 + 52 + ... + 100); (2) 1/2 × 1/2 × 1/50 × (51 + ... + 100) - 1/2; (3) 1/2^3 × 1/50 × (51 + ... + 100) - 1/4; and so on. Summing to infinity gives 74.50.
- This is all very interesting and I'm sure has some application... but to trading? I don't think so. I own a seat on the futures exchange and was one of the largest derivatives traders on the floor. Math skills and reasoning are important, but not to this level. I would compare day trading/scalping to race car driving: getting a feel for what's going on during the day, the speed at which the market is moving, and the tempo of the up and down moves. If I were the interviewer at one of these firms, I'd throw a baseball at your head and see if you were quick enough to duck; if you picked it up and threw it back, I'd know you had the nerve to trade. I know guys who can answer these questions, work at major banks, have a team of quants working for them, and call me up to borrow money because they're not making money. At the end of the day, if you want to be a trader, be a trader; if you want to be a mathematician, be a mathematician. It's cool to multiply a string of numbers in your head, and I can do this also, but never in my trading career did I make money because I could instantly multiply 87 × 34 or answer Mensa questions. Realistically, the answer is: it depends on the market, because the market will dictate the price.
- You may want to charge $87 to play that game, but you'd have to be an idiot to play it. In trading terms: when AAPL was trading at $700, everyone would have loved to buy it at $400; now that it's trading at $400, everyone is afraid it's going to $0. Hope this helps. No offense to the math guys on this page — I just want to set the trading record straight.
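The threshold argument in the answers above can be checked directly. If your strategy is to stop on any draw of t or more, then with stop probability p = (101 - t)/100 and conditional stop mean m = (t + 100)/2, the game value V satisfies V = p·m + (1 - p)(V - 1). A sketch (function name my own):

```python
def game_value(t: int) -> float:
    """Expected winnings when you stop on any draw >= t (each redraw costs $1)."""
    p = (101 - t) / 100          # probability a draw meets the threshold
    m = (t + 100) / 2            # mean draw, given that it meets the threshold
    # Solve V = p*m + (1 - p)*(V - 1) for V:
    return m - (1 - p) / p

best = max(range(1, 101), key=game_value)
print(best, round(game_value(best), 3))  # 87 87.357
```

Scanning all thresholds confirms the "stay on 87, redraw on 86 and below" strategy and a fair value of about $87.36.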

Design patterns questions.

**10 Answers**

- Hi, did you get a call for an onsite for this position? I have mine on Monday, 03-25. If you went through the onsite interview, please share your experience.
- I didn't get a call for an onsite yet. When was your first interview?
- It was on 03-14; within 2 days they scheduled the onsite. By the way, what design pattern questions were you asked in the first round? I was asked only the Observer pattern and the Singleton pattern.
- The same ones I already listed. I had difficulty answering the Observer pattern question, so I listed it here so that others can prepare well. Onsite they will ask you to write small pieces of code, I guess, but nothing too complicated. I heard that the Fidelity interview is not as tough as at the big tech companies. Prepare well and answer confidently. All the best for the interview.
- Did you guys do the onsite interview? How was the experience?
- Could you share your onsite experience?
- Dude, my onsite is scheduled for next week, so I have no experience to share beyond the phone interview questions. Now it's your turn to share your experience so I can do somewhat better next week. Please do share.
- Hi, I had my phone interview this week. It didn't go well, but I have a similar interview with the same company at a different location. Can someone please share the questions that were asked onsite? Thanks.
- Hello, I heard that the onsite questions were related to MapReduce scenarios — how to approach a situation the interviewer describes and develop the map and reduce tasks (orally, not written) — plus various behavioral questions.
- Has anybody actually done the onsite interview? Please share your experience.

### Trader at Jane Street was asked...

Let's play a game: I give you a 12-sided die and will pay you whatever the die lands on. If you are unhappy with the roll, you can choose to roll two 6-sided dice instead, and I will pay you the sum of those two dice. How much are you willing to pay to play this game?

**11 Answers**

- 1/2 × (expected value of rolling one die) + 1/2 × (expected value of rolling two dice) = $8.25.
- The expected value of the second roll is 8.5. I would roll the second pair if the first die lands on 8 or less. The expected value of this strategy is 8.5 × 2/3 + 9/12 + 10/12 + 11/12 + 12/12 = 9.1667.
- That is wrong. The expected value of the second roll is 7, so I would roll the second pair if I get less than 7 on the first die. The expected value of this strategy is 8.25.
- 12-sided die: expected value 13/2 = 6.5. Two 6-sided dice: (2×1 + 3×2 + 4×3 + 5×4 + 6×5 + 7×6 + 8×7 + 9×4 + 10×3 + 11×2 + 12×1) × 1/36 = 268/36 = 77/9. Roll the 12-sided die first and stop if the number is >= 9; if it is <= 8, roll the two dice: (9 + 10 + 11 + 12)/12 + (8/12) × 77/9 = 9.2.
- None of these answers is correct.
- It says that if you roll the second time, whatever comes up is your payment. Strategy: roll the two d6 whenever the d12 shows less than 7, because E(two d6) = 7. This happens half the time, so it will be 0.5 × E(d12) + 0.5 × E(two d6) = 6.75.
- The answer is 0.5 × 9.5 + 0.5 × 7 = 4.75 + 3.5 = 8.25. The way to get it: you roll again if you get less than 7 on the first roll.
- @Christine: your approach is right, but with two 6-sided dice there are 5 outcomes summing to 8 — (2,6), (6,2), (3,5), (5,3), (4,4) — so you get (2×1 + 3×2 + 4×3 + 5×4 + 6×5 + 7×6 + 8×5 + 9×4 + 10×3 + 11×2 + 12×1) × 1/36 = (14×(1+2+3+4+5) + 7×6)/36 = 14×18/36 = 7. Overall: 7 × 7/12 + (8 + 9 + 10 + 11 + 12)/12 = (49 + 50)/12 = 99/12 = 8.25.
- Suppose your strategy is to be "happy" if the first roll is 7 or higher and otherwise go on to the second round. The expected value is 1/2 × [(7 + 8 + 9 + 10 + 11 + 12)/6] + 1/2 × [(1 + 2 + 3 + 4 + 5 + 6)/6 + 3.5] = 8.25.
- Isn't the answer obvious? If you roll 6 or less, roll again; otherwise accept it. So the expected value is 1/2 × 6.5 (if you roll again) + 1/2 × 9.5 (9.5 is the expected value over 7, 8, 9, 10, 11, 12) = 8.
- @sys: the expected value of rolling again is E(d6) + E(d6) = 3.5 + 3.5 = 7, so it's 0.5 × 7 + 0.5 × 9.5 = 8.25. Put differently: reroll only if the d12 gives 6 or less, and with X the payoff, E(X) = P(keep) × E(X | keep) + P(reroll) × E(X | reroll) = 1/2 × (7 + 12)/2 + 1/2 × 7 = 4.75 + 3.5 = 8.25.
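The optimal-strategy value of 8.25 can be confirmed by brute force: for each first roll, keep it only if it beats the reroll's expectation. A sketch (function name my own):

```python
from itertools import product

def optimal_value() -> float:
    """Expected payoff: keep the d12 roll iff it is at least E(two d6) = 7."""
    reroll_ev = sum(a + b for a, b in product(range(1, 7), repeat=2)) / 36
    total = 0.0
    for first in range(1, 13):
        total += max(first, reroll_ev)   # keep the roll or take the reroll's EV
    return total / 12

print(optimal_value())  # 8.25
```

Note that keeping a first roll of exactly 7 and rerolling on 7 tie (both are worth 7), which is why several threshold statements in the thread give the same 8.25.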

### Quantitative Analyst at Jane Street was asked...

You have 2 decks of cards (each deck contains both red and black cards). One deck has twice the number of cards of the other, with the same color ratio (so one deck has 52 cards and the other has 104, both half red and half black). I offer to play a game. First, you get to choose which deck of cards you want to play with. Second, you draw 2 cards at random from your deck of choice. If both are red, then I will give you a Ferrari. Which deck of cards do you choose?

**9 Answers**

- The unalert interviewee would answer, "it doesn't matter, the probability is the same." While this is true for the first card, you have a higher probability of drawing a second red card from the big deck than from the small one. So I chose the big deck, and I was right.
- Yes, he would be right.
- Mathematically: (26/52 × 25/51) vs. (52/104 × 51/103), and 51/206 > 25/102.
- Actually, the probabilities are the same for each deck. Because you have a 50/50 chance of drawing the first card red, there's a 50% chance the numerator of the next fraction is reduced by one, so the probability is 26/52 × (25/51 + 26/51)/2 vs. 52/104 × (51/103 + 52/103)/2, which are the same.
- "G" did the right calculation. To compute the probability of drawing two red cards in a row, you need the first card drawn to be red AND the second card to be red as well. The question asks for the probability of drawing 2 red cards in a row, NOT the probability of drawing a red card followed by either a red or black card.
- Intuition tells us that if you add the same number of red and black cards to the original 52-card problem, the probability goes up. Just imagine adding a billion red cards and a billion black cards.
- You should definitely choose the larger deck if both are 50% red, 50% black. Here's another explanation in addition to the correct ones above: each deck is naturally partitioned into maximal sub-stacks, where each sub-stack consists of cards of a single color, either all red or all black. If it is known ahead of time that half the cards are red and half are black, then the expected size of the stack increases with the number of cards in the deck.
- Correction: the expected size of the **sub-stacks** increases with the number of cards in the deck. Sorry about that.
- Extreme case: deck 1 has 1 red and 1 black card; deck 2 has 2 red and 2 black. With the small deck you can never draw two reds, while with the larger one the probability is (2/4)(1/3) = 1/6. So the more cards, the better your chance.
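The disputed arithmetic in this thread is a one-liner to settle exactly; a sketch (function name my own):

```python
from fractions import Fraction

def p_two_reds(reds: int, total: int) -> Fraction:
    """Probability that two cards drawn without replacement are both red."""
    return Fraction(reds, total) * Fraction(reds - 1, total - 1)

small = p_two_reds(26, 52)    # 25/102, about 0.2451
large = p_two_reds(52, 104)   # 51/206, about 0.2476
print(small, large, large > small)
```

The larger deck wins: removing one red card dents the remaining red fraction less when the deck is bigger.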

Say I take a rubber band and randomly cut it into three pieces. What's the probability that one of the pieces has length greater than half the original circumference of the rubber band?

**9 Answers**

- 3/4.
- Suppose you have two cuts placed randomly on the rubber band. The probability that one segment is greater than half the circumference is the probability that the third cut lands inside the combined range of 90° to either side of the existing cuts. Since the average distance between the first two cuts is also 90°, the combined range is 270°, or 3/4 of the circle.
- You need 3 cuts to end up with 3 pieces. The first cut doesn't matter. The second cut can also be anywhere, and the largest piece will still be at least half the circumference. What matters is the third cut, which should lie in the same half as the second cut. So the probability is actually 1/2.
- The correct answer is 3/4, as this problem is equivalent to the famous three-points-on-a-semicircle problem. Why? If one of the pieces has length greater than half the circumference, then the three cut points must lie in the same semicircle; conversely, if the three cut points lie on the same semicircle, then the longest piece must be at least half the circumference. For reference on the three-points-on-a-semicircle problem, see e.g. http://godplaysdice.blogspot.com/2007/10/probabilities-on-circle.html
- 1/4.
- 1 - 3/4.
- Suppose I have two points whose minor-arc distance is t <= 1/2. Then the range of semicircles covering both points gives an arc length of (1/2 + 1/2) - t = 1 - t. Fixing the first point and tracing the second point around gives minor-arc lengths running from 0 up to 1/2 and back to 0. Therefore the answer is 2 × ∫₀^{1/2} (1 - t) dt = 2(1/2 - 1/8) = 3/4.
- It's 3/4. Cut the band once to make a line, and pretend the length is 100. If you make the first cut at x = 1, then as long as the second cut isn't between x = 50 and x = 51, some piece has length greater than 50%, so there's a 99% chance; if the first cut were infinitely close to the end, it would be about 100%. With the first cut at x = 2 you must avoid x = 50–52; at x = 3, avoid 50–53; and so on, until with a first cut infinitely close to 50 you must avoid essentially all of x = 50–100, a 50% chance. By symmetry the same pattern runs from 50% back up to 100% at the other end, and since each point on the continuous line is equally likely, the answer is clearly 75%.
- This problem is also equivalent to: if you have a line segment from 0 to 1 and you make 2 random cuts on it, what is the probability that the three resulting pieces do NOT make a triangle?
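The 3/4 answer is easy to confirm with a seeded Monte Carlo simulation; a sketch (function name and parameters my own):

```python
import random

def longest_piece_exceeds_half(trials: int = 200_000, seed: int = 1) -> float:
    """Estimate P(some piece > half the circumference) for 3 random cuts."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        a, b, c = sorted(rng.random() for _ in range(3))
        # the three arc lengths around a circle of circumference 1
        pieces = (b - a, c - b, 1 - c + a)
        if max(pieces) > 0.5:
            hits += 1
    return hits / trials

print(longest_piece_exceeds_half())  # close to 0.75
```

The estimate lands near 0.75, matching both the semicircle argument and the integral above.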

### Trader Assistant at DRW was asked...

What is the expected value of rolling two dice?

**9 Answers**

- 7.
- The answer is 7 because the expected value of rolling a single die is 3.5 = (1/6 × 1) + (1/6 × 2) + (1/6 × 3) + (1/6 × 4) + (1/6 × 5) + (1/6 × 6), so for two dice just multiply: 3.5 × 2 = 7.
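By linearity of expectation the answer is 2 × 3.5 = 7, which a brute-force average over the 36 equally likely outcomes confirms:

```python
from itertools import product

# Average the 36 equally likely sums of two fair six-sided dice.
expected = sum(a + b for a, b in product(range(1, 7), repeat=2)) / 36
print(expected)  # 7.0
```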