# Intern Interview Questions in New York, NY

From retail to finance to medicine, every industry needs interns to provide additional support and assistance. Interview questions will vary greatly depending on the industry and role you are looking for. Expect to answer questions about how you work on teams and provide examples of any relevant work experience. To ace your interview, make sure to research the particular position you are applying for.

## Top Interview Questions


### Marketing Intern at L'Oréal was asked...

Feb 16, 2011
They placed two new products on the table and asked me to analyze them and describe why they fit with the specific brand they were identified with.

**1 Answer**

- I made sure to pay attention to the brand image and made some educated guesses as to what type of consumer the products targeted.

### Software Engineer Intern at Goldman Sachs was asked...

Jul 28, 2009
Suppose you had eight identical balls. One of them is slightly heavier and you are given a balance scale. What's the fewest number of times you have to use the scale to find the heavier ball?

**48 Answers**

- 3 times. (2^3 = 8)
- Two. Split into three groups of three, three, and two. Weigh the two groups of three against each other. If equal, weigh the group of two to find the heavier. If one group of three is heavier, pick two of the three and compare them to find the heaviest.
- Brian, this would be correct if you were in fact using a weighing scale, and not a balance scale. The ability to weigh one group against another with a balance scale allows Marty's answer to be correct. Although the question as worded provides a loophole: if it had been worded as "What's the fewest number of times you have to use the scale to CONSISTENTLY find the heavier ball", then Marty's answer would be the only correct one. However, it is possible that you could get lucky and find the heavier ball in the first comparison. Therefore, the answer to the question as stated is ONE.
- This question is from the book "How Would You Move Mount Fuji?" Marty has already got the right answer.
- Actually Bill, by your interpretation of the question the answer is zero, because you could just pick a ball at random. If you get lucky, then you've found the heaviest ball without using the scale at all, so the least possible number of times using the scale would be zero.
- The answer is 2, as @Marty mentioned, because you have to consider the worst case; otherwise, as @woctaog mentioned, it can be zero, you just got lucky picking the first ball.
- None: weigh them in your hands.
- Assuming that the balls cannot be discerned by physical touch, the answer is 3. You first divide the balls into two groups of 4, weigh, and discard the lighter pile. You do the same with the 4 remaining, dividing into two groups of 2, weighing, and discarding the lighter pile. Then you weigh the two remaining balls, and the heavier one is evident.
- 2. Split 3a + 3b + 2 = 8. If wt(3a) == wt(3b), compare the remaining 2 to find the heaviest. If wt(3a) !== wt(3b), ignore the group of 2, discard the lighter group of 3, and divide the remaining group of 3 into 2 + 1. Weigh those 2: if equal, the remaining 1 is the heaviest; if not, the heaviest is on the scale.
- With the systematic approach, the answer is 3. But if you randomly choose 2 balls and weigh them, and by coincidence one of these two is the heavier ball, then the fewest number of times you'd have to use the scale is 1. Although the real question is: are the balls truly identical if one is heavier than the rest?
- Just once. Say you are lucky and pick the heavy ball: one use of the scale will reveal your lucky choice. So once, or the creative answer zero if you allow for weighing by hand.
- Without judging by hand: put 4 balls on one side and 4 on the other. Take the heavier group and divide again: put 2 balls on one side and 2 on the other. Take the 2 that were heavier, and put one on each side. You've now found the heaviest ball. This is using the scale 3 times, and will always find the right ball.
- None. They are identical. None is heavier.
- 2 weighings to find the slightly heavier ball. Step 1: compare 2 groups of three balls. Case 1: if they are equal in weight, compare the last 2 balls; one will be heavier. Case 2: if either group of 3 balls is heavier, take 2 balls from the heavier side and compare 1 ball against the 2nd from the heavy group. Result 1: if one ball is heavier than the other, you have found the slightly heavier ball. Result 2: if both balls are equal weight, the 3rd ball is the slightly heavier ball. Easy shmeezi.
- Fewest: get lucky and pick the heaviest one. But wait! How would you know it is the heaviest one by just weighing one ball? Your logic is flawed. Two groups of four. Split the heavier one, weigh. Split the heavier one, weigh. 3 times.
- I think it's 3. I would take it like this: OOOO OOOO, then OO OO, then OO. Problem solved. I do this every day. Bye. Praise be to Allah. That's it.
- It's 2. Period. If you can't figure it out, look it up online or in "How Would You Move Mount Fuji?" (like somebody else said). This is one of the most basic brainteasers you could be asked in an interview.
- The answer is 2. 1) Divide the balls into 3 groups: 2 piles with 3 balls each, 1 pile with 2 balls. 2) Weigh the 2 piles of 3 balls. If both piles are the same weight, discard all 6 and weigh the last 2 to find the heavier one. 3) If 1 pile of 3 is heavier than the other, discard the lighter pile and the pile of 2 balls. Weigh 2 of the remaining 3 balls from the heavier pile. If both of the weighed balls are equal, the last ball is the heavier one.
- 2: if all the balls are identical and you pick up the first, weigh it, and the second one is lighter or heavier, then you've found the heavier ball in the least number of attempts. 1: if all the balls are identical and you pick up the first, balance it, and the second one is lighter or heavier, then you've found the heavier ball in the least number of attempts.
- Amy is 100% correct for the following reason: everyone (except Amy) is solving the theoretical problem. The practical side of the problem, notwithstanding jimwilliams57's brilliant observation that if one weighs more than the others IT IS NOT IDENTICAL (would have loved to see the interviewer's face on that one), is that in order to 'weigh' them on a scale, one has to pick them up; therefore, you will immediately detect the heavier one without weighing. Pick up three and three: no difference, no need to weigh. Pick up the remaining two to determine the heavier one. Steve
- First off, take yourself through the process visually and forget square roots; that doesn't apply here, and here is why: the question asks for the minimum, not the MAXIMUM. BTW, the max would be 7 (8 - 1); you are comparing 2 objects, so 1 ball is eliminated automatically in the first step. Anyway, you have a fulcrum on which you are placing 2 of 8 objects, one on each end. If by chance you pick the slightly heavier object as one of the two balls, you have in fact found the slightly heavier one in the first round. BTW, don't be a smartass with your interviewer; he is looking for smarts, not smarmy. ;)
- Respectfully, the folks who are answering "3" are mathematically modeling the nature of the balance incorrectly. Performing a measurement on a balance scale is not binary; it is trinary. Each measurement gives you one of three responses: the left is heavier, the right is heavier, or they are equal. So while you do need three binary bits to specify a number from one to eight, you need only two TRINARY digits. Formally, you want the smallest value of n such that 3^n >= 8. The answer is 2. Note that you could add a ninth ball and still need only two measurements. Of course, the smarty-pants answer would be one: just pick two balls at random and be lucky enough to have chosen the heavy one. But you're not guaranteed to be able to do it in just one measurement.
- English isn't my mother tongue. What is a balance scale? Looking it up on Google, I find devices with two bowls on a bar with a bearing in the center. Hence, the answer is once (if I'm lucky enough to select the heavier ball in the first measurement). If a balance scale allows you to measure only one ball at a time, then it would take two measurements, unless you had more information on the weight, which is not listed here and hence doesn't exist in the context of the question.
- 3 times. Not having looked at the other comments, hopefully I am the 26th to get this right. Put the balls 4 and 4 on the scale, take the heavier side and place those balls 2 and 2 on the scale, then take the heavier side and place them 1 and 1, giving the heaviest ball.
- OK, now I read the comments and see that the people, like the question, are divided into two groups: systematic-approach people who say 3 (like I did) and analytic people who say 2. It takes a systematic person (me) a minute to get the answer. I'm guessing it took the analytics 5 minutes just to interpret all the ramifications of the question, i.e. they aren't identical if..., do it by hand..., get lucky.
- Minimum is 1 (if lucky: a 25% chance by picking 2 balls at random) and max is 2 (using the most efficient process to absolutely determine it without luck: the 3/3/2 scenario).
- While Symantec was busy weighing my balls I took a job with NetApp. They need to focus on hiring good, capable security engineers, not weighing their balls.
- The point of these interview questions is both to check your logical brain function and to hear how you think. Most of you are just posting jerk-off answers trying to be funny, or you are really dumb. These answers get you nowhere with me in an interview. Think out loud, go down the wrong path, backtrack, try another logic path, find the answer. None of this "0 if you use your hands". That is fine if you are interviewing for a job in advertising where creativity is desired; nobody wants you writing code like an 8-year-old.
- You have 12 balls, equally big, equally heavy, except for one, which is a little heavier. How would you identify the heavier ball if you could use a pair of balance scales only twice?
- The problem is based on binary search. Split the balls into groups of 4 each. Choose the heavier group. Continue till you get the heavier ball. This can be done in log(8) (base 2) operations, that is, 3.
- Since there is only one scale available to weigh, you first divide the balls in half. Weigh each group and take the heaviest group; this is using the scale twice so far. Now divide the previous heaviest group in half, weigh both groups, and take the heaviest. Divide this last group and take the heaviest. This is the heaviest ball. We have used the scale 5 times.
- Would it be wrong to say that for a sample size as small as 8, we might as well not waste time thinking about an optimal solution and just use the scale 7 times, as this will be more efficient than coming up with an ideal solution prior to using the scale?
- 3.
- I stumbled across this while looking for something else on Google, but I had to answer. It is 2. Split the balls into 2, 3 and 3. Weigh the 2 groups of 3 against each other. If equal, weigh the group of 2 and the heaviest is obvious. If they are not equal, keep the heavy group of 3 and weigh 2 of those balls. If equal, the heaviest ball is the one you didn't weigh. If not equal, the heavy ball is obvious.
- 2 times. 8 balls.
- No idea.
- The fewest number of times to use the scale to find the heavier would be eight to one times?
- It will actually be 1, because the question asks what's the fewest number of times, which is one because you could just get lucky. You can use any method you want; it would still be one, because that is the fewest number of turns you can have.
- It's one. The fewest number of tries using a balance scale would be one. If you put one ball on each side and one is heavier, you have found the heavier ball.
- Use an equilateral triangular lamina of uniform mass throughout, balanced on a pole or a similar structure. Steps: place 2 balls at each corner (6 balls total). (i) If the odd ball is one of those, one side will either go up or go down; now repeat the process with one ball at each corner, including the 2 unbalanced ones. (ii) If the balance is perfect, repeat the process with the remaining two balls and one of the already-weighed balls.
- You would not be able to find a ball heavier than the others. All eight balls are identical; therefore, they must all be the same weight.
- The correct answer has already been posted; I just want to contribute some theoretical analysis. Given N balls, one of them heavier, finding the ball requires log3(N) trits of information (a trit is the base-3 version of a bit). Each weighing may give you one of three outcomes: equal, left heavier, right heavier. So the amount of information given by each weighing is upper-bounded at 1 trit. Therefore, the theoretical lower bound for the number of weighings in the worst case is log3(N), which is actually attainable. So 27 such balls need only 3 weighings, 243 balls need only 5 weighings, etc.
- 3.
- 2, as many have indicated above. The 3 is the knee-jerk reaction, but 2 is the correct answer.
- Marty's answer is correct, but he does not explain why. The logic of the balance scale is three-valued: left heavier, right heavier, or equal. Its most efficient use is the recursive application of the three-valued logic until there is only one item left. The integral ceiling of ln(x)/ln(3) thus gives the fewest number of times you have to use the balance scale to find the uniquely heaviest of x balls. Ceiling(ln(8)/ln(3)) = 2.
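The 3/3/2 strategy described in several answers above can be checked by brute force. This is a minimal sketch (ball indices and the 10% weight difference are arbitrary choices): whichever ball is heavy, exactly two uses of the balance suffice.

```python
# Verify the two-weighing (3/3/2 split) strategy for the eight-ball puzzle.

def find_heavy(weights):
    """Return (index of the heavy ball, number of weighings used)."""
    weighings = 0

    def balance(left, right):
        nonlocal weighings
        weighings += 1
        l = sum(weights[i] for i in left)
        r = sum(weights[i] for i in right)
        return (l > r) - (l < r)   # 1: left heavier, -1: right heavier, 0: equal

    a, b, c = [0, 1, 2], [3, 4, 5], [6, 7]
    first = balance(a, b)
    if first == 0:                        # heavy ball is in the leftover pair
        return (c[0] if balance([c[0]], [c[1]]) > 0 else c[1]), weighings
    g = a if first > 0 else b             # heavier triple
    second = balance([g[0]], [g[1]])
    if second == 0:
        return g[2], weighings
    return (g[0] if second > 0 else g[1]), weighings

for heavy in range(8):
    ws = [1.0] * 8
    ws[heavy] = 1.1                       # "slightly heavier"
    found, used = find_heavy(ws)
    assert found == heavy and used == 2   # always exactly two weighings
```

The loop covers all eight cases, which is why 2 is the worst-case answer (and the 4/4 split needs 3: a binary comparison wastes the balance's third outcome).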

### Intern at Jane Street was asked...

Mar 14, 2011
If you had only 5 and 11 cent stamps, what's the smallest number that would be impossible to make with those stamps?

**12 Answers**

- 49, I think. I don't really remember now.
- I think you phrased that wrong, bud.
- And I think it's 39.
- Can anyone restate the question in a non-ambiguous manner?
- 5*11 - 5 - 11 = 39 (the Frobenius number).
- How about 1?
- The problem seems to be the Postage Stamp Problem (look it up!). In this case, it is "what is the largest number which you cannot obtain by a combination of 5- or 11-value postage stamps?" (Implicit in the question is the fact that after a certain value, you can obtain every value by such a combination.) As someone said above, it is 39 for these particular numbers. 40, 45, 55, etc. are all multiples of 5; 41 is 11 + 8*5, 42 is 2*11 + 4*5, etc. Basically, once you can obtain 40-44, you can obtain 45-49 by adding 5, and then you can obtain 50 or higher by adding 10, and so forth.
- Why doesn't 17 work?
- Starting with 11c, you can make 16c, 21c, 26c, etc. by adding 5c each time, so all numbers congruent to 1 mod 5 are makeable. Do the same starting from 22c and 33c. At this point, all numbers congruent to 0, 1, 2, 3 mod 5 are makeable. Once you start at 44c, all numbers congruent to 4 mod 5 are makeable, so the answer is the largest 4 mod 5 number below 44, which is 39.
- Well, isn't 39 the *largest* number the postage stamps can't make?
- I used a 15 by 15 sieve and got 39 as the answer.
- The question is supposed to be: what is the largest number that cannot be generated by adding fives and elevens? To solve the problem, consider, for each possible last digit, the minimum number with that last digit that can be generated by adding fives and elevens:

  | Last digit | Minimum achievable | Maximum unachievable |
  |---|---|---|
  | 0 | 10 | 0 |
  | 1 | 11 | 1 |
  | 2 | 22 | 12 |
  | 3 | 33 | 23 |
  | 4 | 44 | 34 |
  | 5 | 5 | NA |
  | 6 | 16 | 6 |
  | 7 | 27 | 17 |
  | 8 | 38 | 28 |
  | 9 | 49 | 39 |

  Hence, the maximum unachievable value is 39.
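The Frobenius answer quoted above is easy to sanity-check by brute force: test every amount up to 5*11 for representability as 5a + 11b. A small sketch, not tied to any particular answer:

```python
# Largest amount not representable with 5- and 11-cent stamps.

def representable(n, a=5, b=11):
    """True if n = a*i + b*k for some non-negative integers i, k."""
    return any((n - b * k) % a == 0 for k in range(n // b + 1))

largest_gap = max(n for n in range(1, 5 * 11) if not representable(n))
assert largest_gap == 39 == 5 * 11 - 5 - 11   # matches the Frobenius formula
```

For two coprime denominations a and b, every amount above ab - a - b is representable, which is why the search range can stop at 55.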

### Trader Intern at Jane Street was asked...

Jan 12, 2011
Russian roulette: 4 blanks, 2 bullets, all in a row. If someone shoots a blank next to you, would you take another shot or spin?

**12 Answers**

- Take another shot: 3/4 chance of surviving vs. 2/3 if you re-spin.
- Here is my answer: the prob. of survival after a re-spin is 3/5; the prob. of survival with no re-spin is 1/15 * 6/4 = 1/10.
- Correction: the prob. of survival after a re-spin is 4/6.
- My final correction: the prob. of survival after a re-spin is 4/6 = 2/3. With no spin it is 2/5 = C(4,2) / C(6,2).
- Denoting the blanks by 0's and live bullets by 1's, and adjoining the left and right edges (denoted by ~) so as to make a cylinder, the following diagram illustrates how the bullets are arranged in the cylinder of the revolver: ~000011~. Denote the chamber that is to be fired if the trigger is pulled by enclosing it in parentheses, and denote an empty chamber by *. Assume that when you pull the trigger, the cylinder rotates clockwise (to the right in the diagram above). If when you pulled the trigger the first time the chamber held a blank, then before and after you pulled the trigger, the cylinder was and is in one of the following states: 1) Before: ~(0)00011~ After: ~*0001(1)~. 2) Before: ~0(0)0011~ After: ~(0)*0011~. 3) Before: ~00(0)011~ After: ~0(0)*011~. 4) Before: ~000(0)11~ After: ~00(0)*11~. If you pull the trigger again without spinning the cylinder, then only in case one will the round be live, yielding a 3/4 probability of a blank. If you spin the cylinder and then pull the trigger, you have a 4/6 = 2/3 probability of a blank. Clearly, you should not spin the cylinder.
- Don't spin: 3/4 probability of survival if not spun.
- Re-spin: 4/6 = 2/3 (obvious). Don't spin: you only die if the blank was the "last" one, which is a 1/4 chance, hence a 3/4 chance of living. Since 3/4 > 2/3, don't spin.
- Wouldn't play.
- Spin. Spin: 4/6 = 0.6667 chance to survive. Not spin: C(4,2)/C(5,2) = 3/5 = 0.6.
- Conditioned on the 2 bullets being in consecutive positions and someone surviving the 1st shot, we have: spin it -> survival rate is 4/6 = 2/3 = 67%; not spin it -> survival probability is 3/4 = 75%. So not spinning gives a bigger chance of survival.
- The question did not say that the bullets are next to each other, so in this case you should spin.
- I. Using Bayes' theorem: P(0) = probability of a blank: 4/6. P(1) = probability of a bullet: 2/6. P[1|0] = probability of a bullet given a blank: find this. P[0|1] = probability of a blank given a bullet: if a bullet occurs, then the next pull can be a bullet or a blank, so P[0|1] = 1/2. P[1|0] = P[0|1] * P(1) / P(0) = 1/4. So there is a 25% chance that the next pull is a bullet, or a 75% chance that it is a blank. The first pull had a probability of 8/12 of being a blank; given a blank on the first pull, the second pull has a 9/12 probability of being a blank. II. From a frequentist perspective: when the first pull is made and it is a blank, the event space decreases from 6 possible outcomes to 4 (the first pull was on one of the four blanks). Out of the four possible states, there is one where after the pull the hammer sits on the blank right before a bullet. So 3 of the 4 next pulls are blanks (3/4) and 1 is a bullet (1/4).
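The case enumeration in the answers above fits in a few lines of code. This sketch assumes, as the "all in a row" wording suggests, that the two bullets sit in adjacent chambers; it conditions on the first pull being a blank and compares the two options exactly:

```python
# Exact comparison for the roulette question: pull again vs. re-spin.
from fractions import Fraction

chambers = [0, 0, 0, 0, 1, 1]   # 0 = blank, 1 = bullet; the cylinder is circular
n = len(chambers)

# States consistent with "someone just shot a blank": any blank chamber.
blank_starts = [s for s in range(n) if chambers[s] == 0]
# Of those, count states whose NEXT chamber is also a blank.
survive_next = sum(1 for s in blank_starts if chambers[(s + 1) % n] == 0)

p_no_spin = Fraction(survive_next, len(blank_starts))   # 3/4
p_respin = Fraction(chambers.count(0), n)               # 4/6 = 2/3
assert p_no_spin > p_respin                             # so: don't spin
```

Changing `chambers` to a non-adjacent arrangement such as `[0, 1, 0, 1, 0, 0]` flips the conclusion, which is the point of the "the question did not say the bullets are next to each other" answer.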

### Sales Strat Intern at Goldman Sachs was asked...

Mar 17, 2013
Suppose we hire you, and you and the rest of the new interns decide to go buy a cup of coffee. Each intern purchases one cup of coffee. One of the interns suggests everyone play a game. Everyone will flip a fair coin, dividing the group of interns into two subgroups: those that got heads and those that got tails. The game is this: whichever group is smaller evenly splits the cost of everyone's cup of coffee (i.e. if there are 5 interns, 3 get H, 2 get T, then the two interns that got tails each buy 2.5 cups of coffee). However, nothing says you need to play this game. You can choose to buy your own cup of coffee and not play the game at all. The question: should you play this game? (Note: you may assume that there is an odd number of interns, so there are no ties, and that if everyone gets H or everyone gets T, then everyone loses and just buys their own cup of coffee.)

**18 Answers**

- Hint: despite its look, this is not a math question.
- I don't get it. Would you please provide more hints?
- Assume each coffee costs \$1, for simplicity. So this is effectively a choice between two outcomes: paying \$1 with probability 100%, or paying \$0 with some probability and paying more than \$0 with some probability. So ask yourself: what is your expected cost in the second case? Give that a try and see if you can figure it out. However, I want to remind you that the question is "should you play this game?" The answer isn't just a math question; if you only work out expected values, you've missed the point. For example, a separate question (with the same kind of flavor as the direction I'm trying to lead you in) is this: suppose I give you a choice of two outcomes. Either you get \$1 with 100% probability, or I give you \$500,000,000 with probability 1/100,000,000 and 0 otherwise. Which would you pick? Now what if it was the same first choice, but the second choice was \$50 with prob 1/10 and 0 otherwise? Now which would you pick? These are the kinds of things you want to think about while answering this kind of question. Let me know if you have any more questions, and if you want me to post the answer, just let me know.
- I think I get it. It's about investors' risk appetite. Investors are likely to take guaranteed gains; here that is the \$1.
- Well, yes and no. This is indeed a risk-aversion question. If you work it out, you'll find that the EV in each case is exactly the same (your EV is -1 cup of coffee in both scenarios), but that's not the end of it. It's also really a question for them to test your risk aversion. You can support either answer, and *should* comment on the validity of either answer. My answer was to go with buying my own cup of coffee, and I followed it up with a story where a friend of mine had tried to get us to play credit card roulette (which is similar in spirit to this game), and I told him that I did, in fact, say no in that instance and why. However, traditionally people are risk-averse when faced with gains and risk-seeking when faced with losses, so many would probably choose to play the game. But this is as much a question about your psychology as it is about your math skills. And that's why this is such an awesome question, and is probably a question that kills most people they ask it to.
- Would you please explain how you get -1 for the second scenario? There are 50% H and 50% T, so players have a 50% chance of being in the winning group. Given 5 interns, there are two compositions of the losing group (4 vs. 1 and 3 vs. 2). EV = 0.5*0 + 0.5*(0.5*-5 + 0.5*-2.5) = -1.875.
- The probability of winning isn't 50%. It's actually slightly above 50%, but that's not the way to look at it. The total number of coffees that need to be bought is n, where n is the number of interns. Going into this game, every intern is the same, so they each have the same expected value, and the sum of the expected values must equal -n. So everyone has an EV of -1, as claimed.
- Thank you very much. I finally got it.
- I'm still not sure why you said no if the EV is the same?
- It's a matter of risk preference. See the example I gave in my Mar 19 posting: "Suppose I give you a choice of two outcomes. Either you get \$1 with 100% probability, or I give you \$500,000,000 with probability 1/100,000,000 and 0 otherwise. Which would you pick? Now what if it was the same first choice, but the second choice was \$50 with prob 1/10 and 0 otherwise? Now which would you pick?" You would probably not play in the first instance and consider possibly playing in the second (and those instances even have the risky game with a HIGHER EV). This coffee game is the same kind of game; even if the EV is the same in each case, the volatility is not. It is usually a good rule of thumb to take the lower-volatility outcome if the EV is the same (think Sharpe ratio here, or think efficient frontier; both get the point across, I think). Does that help?
- I say yes, play the game, only because I'm prepared if I lose. The fact that I can afford to lose makes me want to try my chance at winning.
- I would play the game. Consider the expected price with n total interns splitting a total cost of P. That is P/2 * E[1/N | you're paying], which is simply equal to P/2 * E[1/(N+1)], where N is now a binomial corresponding to n-1 interns. Notice that 1/(n+1) is a concave function. That means that P/2 * E[1/(N+1)] <= P/2 * 1/(E[N]+1) = P/2 * 1/((n-1)/2 + 1) = P/(n+1). So it pays to play on average.
- Apologies, but both of the above answers are incorrect. The EV of playing is the same as the EV of not playing.
- No way would I play! The most money I save is a dollar; the most money I can lose (in the case of five interns) is 4 dollars. That's a 400 percent downside versus a 100 percent upside (kind of: you cannot technically compute your return on zero dollars invested). Plus, I do not even drink coffee!
- Stupid... I don't have time to play games at work.
- The EVs in the two scenarios are not the same: it's clear when you think about the case where there are n = 3 interns.
- I've already explained why they are the same. In either scenario there will be n cups of coffee bought (3 in your case), so the total EV is -n (-3 in your case). In the game, each person is the same as any other, so their EVs must all be the same. That is why in the game the EV is -1 (the same as if they buy their own cup of coffee). To argue otherwise is to argue that either the total EV is not the number of coffees purchased or that someone has an unfair advantage in the game. Incidentally, if you are going to claim that someone who has given you the answer is wrong, you should provide more of a response than "go think about it."
- If V = what you pay in the game minus 1, then all five interns have the same distribution of V by symmetry, and 5*E[V] = 0: it's zero-sum.
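The symmetry argument in the thread (every intern's EV is exactly -1 cup either way) can be confirmed by enumerating all coin-flip outcomes. A sketch for 5 interns with a coffee price of 1, using the tie rule stated in the question:

```python
# Exact per-intern expected cost in the coffee game (n = 5, price = 1).
from fractions import Fraction
from itertools import product

n = 5
ev = [Fraction(0)] * n
for flips in product("HT", repeat=n):            # 2^n equally likely outcomes
    heads = [i for i, f in enumerate(flips) if f == "H"]
    tails = [i for i, f in enumerate(flips) if f == "T"]
    if not heads or not tails:                   # tie rule: everyone buys their own
        payers, share = list(range(n)), Fraction(1)
    else:
        payers = heads if len(heads) < len(tails) else tails
        share = Fraction(n, len(payers))         # minority splits all n coffees
    for i in payers:
        ev[i] -= share / 2**n

assert all(e == -1 for e in ev)   # same EV as just buying your own cup
```

Every outcome transfers exactly n cups of cost in total, so by symmetry each intern's EV must be -1; the game changes only the variance, which is why the question is about risk preference rather than arithmetic.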

### Prop Trading Summer Intern at Jane Street was asked...

Mar 11, 2011
You are playing a game where the player gets to draw a number 1-100 out of a hat, replace and redraw as many times as they want, with their final number being how many dollars they win from the game. Each "redraw" costs an extra \$1. How much would you charge someone to play this game?

**10 Answers**

- 10?
- Redraw 10 times and get a payoff around 77?
- The average draw will pay out \$50.50.
- Every time you are deciding whether to play once more, consider the two options: stop now, and you get your current number (the \$1 cost is sunk); continue, and the expected benefit would be \$50.50 - 1 = \$49.50. This means that as long as you have drawn \$50 or more, you should stop the game. Suppose the game ends after N rounds (with probability (49%)^(N-1) x 51%); in the last round, the expected number is (50+100)/2 = 75, and thus the expected net benefit would be 75 - N. This shows N <= 74. Then we take the sum Sigma_{N=1}^{74} (49%)^(N-1) x 51% x (75 - N), which is 73.
- All of the above answers are way off. For a correct answer, see Yuval Filmus' answer at StackExchange: http://math.stackexchange.com/questions/27524/fair-value-of-a-hat-drawing-game
- Yuval Filmus proves that the value of the game is 1223/14 = 87.36, and the strategy is to stay on 87 and throw again on 86 and below.
- Let x be the expected value and n the smallest number you'll stop the game at. Set up an equation with x and n, get x in terms of n, take the derivative to find the n that maximizes x, plug in the ceiling (because n must be an integer), and find the maximum of x. The ceiling ends up being 87, and x is 87.357, so charge \$87.36 or more.
- I guess the question asks for the expected value of the game given an optimal strategy. Suppose the strategy is to go on to the next round if the draw is 50 or less. Then the expected value of each round is: (1) 1/2 * 1/50 * (51 + 52 + ... + 100); (2) 1/2 * 1/2 * 1/50 * (51 + 52 + ... + 100) - 1/2; (3) 1/2^3 * 1/50 * (51 + 52 + ... + 100) - 1/4; ... Sum all these up to infinity and you get 74.50.
- This is all very interesting, and I'm sure it has some application... but to trading? I don't think so. I own a seat on the futures exchange and was one of the largest derivatives traders on the floor. Math skills and reasoning are important, but not to this level. I would associate day trading/scalping more with race car driving, i.e. getting a feel for what's going on during the day, the speed at which the market is moving, and the tempo of the up and down moves. If I were the interviewer at one of these firms, I'd throw a baseball at your head and see if you were quick enough and smart enough to duck. Then if you picked it up and threw it at my head, I'd know that you had the balls to trade. I know guys who can answer these questions, work at major banks, have a team of quants working for them, and call me up to borrow money from me because they're not making money. At the end of the day, if you want to be a trader, then be a trader; if you want to be a mathematician, then be a mathematician. It's cool to multiply a string of numbers in your head, I can do this also, but never in my trading career did I make money because in an instant I could multiply 87*34 or answer Mensa questions. Realistically, the above answer is: it depends on the market, as the market will dictate the price. You may want to charge \$87 to play that game, but you'd have to be an idiot to play it. In trading terms this means that when AAPL was trading at \$700 everyone would have loved to buy it at \$400. Now that it's trading at \$400, everyone is afraid that it's going to \$0. Hope this helps. No offense to the math guys on this page; I just want to set the trading record straight.
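The threshold analysis sketched above has a short closed form. For a stop-at-or-above threshold t, solving V = P(x >= t) * E[x | x >= t] + P(x < t) * (V - 1) gives V(t) = (t + 100)/2 - (t - 1)/(101 - t); scanning all thresholds confirms the stay-on-87 strategy and the value of roughly \$87.36. A sketch assuming draws are uniform on 1..100 and only redraws cost \$1:

```python
# Value of the hat game for each stop-at-or-above threshold t.
from fractions import Fraction

def game_value(t):
    p_stop = Fraction(101 - t, 100)        # P(draw >= t)
    stop_mean = Fraction(t + 100, 2)       # E[draw | draw >= t]
    # V = p*stop_mean + (1-p)*(V - 1)  =>  V = stop_mean - (1-p)/p
    return stop_mean - (1 - p_stop) / p_stop

best_t = max(range(1, 101), key=game_value)
assert best_t == 87
assert abs(float(game_value(87)) - 87.357) < 1e-3   # value = 1223/14
```

The 74.50 answer corresponds to the suboptimal threshold of 51; `game_value(51)` reproduces it approximately, which shows how much the choice of threshold matters.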

### Summer Trading Intern at Jane Street was asked...

Oct 4, 2011
Say I take a rubber band and randomly cut it into three pieces. What's the probability that one of the pieces has length greater than 1/2 of the original circumference of the rubber band?

9 Answers

1. 3/4
2. Suppose you have two cuts on the rubber band placed randomly. The probability of having one segment greater than half the circumference is the probability that the third cut falls within the combined range of 90° to either side of the first two cuts. Since the average distance between the first two cuts is also 90°, the combined range is 270°, or 3/4 of the circle.
3. You need 3 cuts to end up with 3 pieces. The first cut doesn't matter. The second cut can also be anywhere and the largest piece will still be at least half the circumference. What matters is the third cut, which should lie in the same half as the second cut. So the probability is actually 1/2.
4. The correct answer is 3/4, as this problem is equivalent to the famous 3-points-on-a-semicircle problem. Why? If one of the pieces has length greater than 1/2 the circumference, then the three cut points must lie in the same semicircle. Conversely, if the three cut points lie on the same semicircle, then the longest piece must be at least 1/2 of the circumference.
5. For reference on the 3-points-on-the-same-semicircle problem, see e.g. http://godplaysdice.blogspot.com/2007/10/probabilities-on-circle.html
6. 1/4 = 1 − 3/4
7. Suppose I have two points whose minor-arc distance is t ≤ 1/2. Then the range of semicircles covering both points gives an arc length of (1/2 + 1/2) − t = 1 − t. If we fix the first point, tracing the second point around gives minor-arc lengths from 0 to 1/2 and then from 1/2 back to 0. Therefore the answer is 2 × ∫₀^(1/2) (1 − t) dt = 2(1/2 − 1/8) = 3/4.
8. It's 3/4. Cut the band once to make a line, and pretend its length is 100. If you make the second cut at x = 1, then as long as the third cut isn't between x = 50 and 51, some piece will be longer than 50%, so there's a 99% chance. If the second cut were infinitely close to the end, the probability would approach 100%. With the second cut at x = 2, the third can't fall between x = 50 and 52; at x = 3 the forbidden region is 50–53, and so on. As the second cut approaches 50, the forbidden region grows to roughly x = 50–100, leaving only a 50% chance. By symmetry, the pattern runs from 50% back up to 100% at the other end. Since each position of the second cut is equally likely, the answer averages out to 75%.
9. This problem is also equivalent to asking: if you have a line segment from 0 to 1 and make 2 random cuts on it, what is the probability that the three resulting pieces do NOT make a triangle?
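The 3/4 answer is easy to sanity-check numerically. The sketch below (not part of the original thread) cuts a circle of circumference 1 at three uniform points and estimates how often the longest piece exceeds 1/2:

```python
import random

def longest_piece(rng):
    """Cut a circle of circumference 1 at three uniform points; return the longest arc."""
    cuts = sorted(rng.random() for _ in range(3))
    # Arc lengths between consecutive cuts, including the wrap-around arc.
    arcs = [cuts[1] - cuts[0], cuts[2] - cuts[1], 1 - cuts[2] + cuts[0]]
    return max(arcs)

def estimate(trials=200_000, seed=0):
    rng = random.Random(seed)
    hits = sum(longest_piece(rng) > 0.5 for _ in range(trials))
    return hits / trials

print(estimate())  # ≈ 0.75
```

With 200,000 trials the estimate lands within a fraction of a percent of 3/4; the integral argument in the thread gives the same value analytically.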

### Quantitative Researcher Summer Intern at Jane Street was asked...

Apr 17, 2011
Poker. 26 red, 26 black. You take one card at a time; at any point you can choose to guess that the next card is red, and you have only one chance. If you are right, you get 1 dollar. What's the strategy, and what's the expected earning?

12 Answers

1. Expected earning is 25 cents: 1/2 × 1/2 × $1. The probability of choosing to guess is 1/2, the probability of guessing right is 1/2, and the payoff is $1.
2. I would start picking cards without making a decision to reduce the sample size. This is risky because I could just as easily reduce my chances of selecting red by taking more red cards to start as I could increase my chances by picking more black cards first. But with 52 cards I like my chances that, at some point, I will at least get back to 50% if I start off by picking red. Ultimately, I can keep picking cards until there is only 1 red left, but I obviously wouldn't want to find myself in that situation, so I would make a decision earlier rather than later. Best case, I pick more blacks out of the deck right off the bat. My strategy would be to first pick 3 cards without making a decision. If I start off by selecting more than 1 red, so that the probability of guessing red correctly is below 50%, then I will look to make a decision once I get back to the 50% mark (the risk being that I never get back to 50%). If instead I pick more than 1 black card, I will continue drawing without choosing until I reach 51%, hoping to reach a much smaller sample size, with reduced variance, while the odds are in my favor. The expected return, in my opinion, depends entirely on when you decide to guess: guess at a 50% chance of being correct and your expected return is 50 cents (0.5 × $1 + 0.5 × $0); guess at 51% and it becomes 0.51 × $1 + 0.49 × $0 = 51 cents. In other words, your expected return is a direct function of the probability of selecting correctly: 50% = 50 cents, 51% = 51 cents, 75% = 75 cents. Thoughts?
3. There is symmetry between red and black. Each time you pull a card it is equally likely to be red or black (assuming you haven't looked at the previous cards you pulled). Thus no matter when you guess, your odds are 50% and the expected return is 50 cents.
4. Scheme: guess when the first card is black. P(guess) × P(right) × $1 = 1/2 × 26/51 = 13/51.
5. 0.5; just turn the first card to see if it's red. I think it's more about trading psychology: if you don't know where the price is going, get out of the market ASAP. Don't expect anything.
6. The problem should be: draw cards at random without replacement, and on every draw you have one chance to guess. So the strategy is: on the first draw, guess red at random. If you are correct, you get one dollar, and on the next draw you know there are fewer reds than blacks, so you guess black. If your first guess was wrong, guess red on the next round. It's all about conditioning on the information from the previous draws.
7. This should be similar to the brainteaser about picking an optimal place in a queue ("if you are the first person whose birthday matches anyone in front of you, you win a free ticket"). Here we want to find n such that P(first n cards are black) × P((n+1)th card is red | first n cards are black) is maximized, and guess on the (n+1)th card.
8. The problem statement is not very clear. What I understand is: you take one card at a time, and you can either guess or look at it. If you guess and it's red, you gain $1, and whatever the result, the game is over. The answer is then $0.50 under any strategy. Suppose there are x red and y black remaining. If you guess, your chance of winning is x/(x+y). If you instead look at the card and act on the next one, your chance of winning is x/(x+y) × (x−1)/(x+y−1) + y/(x+y) × x/(x+y−1) = x/(x+y), which is the same. A rigorous proof is by induction, starting from x, y = 0, 1.
9. The answer above is not 100% correct. If you don't guess and only look, the total probability of getting red is indeed the same; however, looking at the card means you then know whether the probability of getting red is (x−1)/(x+y−1) or x/(x+y−1). So the argument only holds if you don't get to look at the card or have any knowledge of which cards you passed.
10. It doesn't matter what strategy you use; the probability is 1/2. This is a consequence of the Optional Stopping Theorem: the fraction of red cards left in the deck is a martingale, and choosing when to stop and guess red is a stopping time. The expected value of a martingale at a stopping time equals its initial value, which is 1/2.
11. My strategy was to always pick the colour that has been drawn fewer times in the previous picks; naturally, that colour has a higher probability, because more of it remains in the deck. In the model, n = the number of cards already drawn, k = the number of black cards among them, and m = min(k, n−k), i.e. the count of whichever colour has appeared less. After n draws we can face n+1 different situations, k = 0, 1, 2, …, n. To compute the expected value of the whole game, we need the probability of facing each situation and the probability of winning the next pick. Each situation has probability C(n, m)/2^n, since each outcome can happen in C(n, m) ways and there are 2^n possible outcomes. In that situation, the probability of winning is (26−m)/(52−n), because there are 26−m cards of the chosen colour among the 52−n remaining. Combining them gives [C(n, m)/2^n] × [(26−m)/(52−n)]. Then sum over k from 0 to n, and sum again over n from 0 to 51 (after the 52nd pick there is nothing left to choose, so we only sum to 51). I hope it's not too messy without proper math signs. This is a lot of computation, so I wrote it quickly in Python and got 37.2856419726, a significant improvement over the basic strategy of always choosing the same colour.
12. Dynamic programming: let E(R, B) be the expected gain with R red and B black remaining, with the strategy of guessing whichever colour is more plentiful. E(0, B) = B for all B, E(R, 0) = R for all R, and E(R, B) = [max(R, B) + R·E(R−1, B) + B·E(R, B−1)]/(R + B). I don't know how to estimate E(26, 26) quickly.
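The optional-stopping claim in the thread can be verified exactly with a small dynamic program. This is a sketch, not from the thread: let V(R, B) be the optimal win probability with R red and B black cards remaining, choosing between guessing red now and revealing one more card before deciding.

```python
from fractions import Fraction
from functools import lru_cache

@lru_cache(maxsize=None)
def V(R, B):
    """Optimal probability of winning the single guess with R red, B black left."""
    if R == 0:
        return Fraction(0)   # forced to guess eventually, but no red remains
    if B == 0:
        return Fraction(1)   # every remaining card is red
    p_red = Fraction(R, R + B)
    guess_now = p_red
    # Reveal the next card (red with prob p_red) and continue optimally.
    keep_going = p_red * V(R - 1, B) + (1 - p_red) * V(R, B - 1)
    return max(guess_now, keep_going)

print(V(26, 26))  # 1/2
```

The recursion confirms that V(R, B) = R/(R + B) in every state, so no stopping rule beats guessing immediately: the value of the one-guess game is exactly $0.50, as the symmetry and martingale responses argue.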

### Financial Software Developer Intern at Bloomberg L.P. was asked...

Apr 17, 2012
There are 20 floors in a building. If you're on an elevator trying to get to the 20th floor, what is the probability that the 4 people ahead of you press the button for the 20th floor before you do? Assume you press last.

10 Answers

1. Assume there is one button per floor, so 20 buttons. A person can press any 1 of the 20, with probability 1/20. Since there are 4 people, the probability is (1/20)^4 = 1/160,000.
2. These are independent events, so the chance of any one person before you going to the 20th floor is 1/20. Since this happens 4 times before you, the probability is 4 × (1/20), or 1/5.
3. The above two are close, but wrong. There are 20 buttons, thus 20 choices, sure, but you board at one of the floors, and nobody presses the button for the floor they get on. Thus there are really only 19 choices: P = (1/19)^3, since independent events multiply as (1/19)(1/19)(1/19).
4. 1/19 + 1/18 + 1/17 + 1/16, assuming there are no repeated destinations.
5. As the question is stated: P(all 4 ahead of you want to get off on the 20th floor) = (1/19)^4. In real life (all 4 want to get off on the 20th floor, one of them is the first to press the button for it, and everyone else, including you, stands still): (1/19) × (1/4).
6. About 20% is the right answer. I am surprised by some of the answers; they are all very small probabilities (some less than 1%).
7. I'm quite sure you are all wrong. The real probability is 1 − P(nobody pushes 20) = 1 − (18/19)^3 ≈ 15%.
8. 1 − (19/20)^4
9. If one of the 4 presses the button for the 20th floor, the others won't have to do anything. The chance of one of them pressing 20 is 1/19 + 1/19 + 1/19 + 1/19 = 4/19.
10. The answer is 1 − (19/20)^4.
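Much of the disagreement in the thread comes down to two modeling choices: whether each rider picks among 20 floors or only the 19 they didn't board on, and whether the question asks that all four press 20 or that at least one does. A quick sketch (not from the thread) evaluating the main readings:

```python
# Each of the 4 riders picks a destination uniformly at random.

# Reading A: all four choose floor 20, with 19 possible destinations each
# (nobody presses the floor they boarded on).
all_four = (1 / 19) ** 4

# Reading B: at least one of the four presses 20, 19 choices each.
at_least_one_19 = 1 - (18 / 19) ** 4

# Reading C: at least one presses 20, 20 choices each (as some responses assume).
at_least_one_20 = 1 - (19 / 20) ** 4

print(f"all four:            {all_four:.2e}")        # ≈ 7.7e-06
print(f"at least one (of 19): {at_least_one_19:.4f}")  # ≈ 0.1945
print(f"at least one (of 20): {at_least_one_20:.4f}")  # ≈ 0.1855
```

The "at least one" readings both land near the 20% figure claimed in the thread, while the literal "all four" reading is vanishingly small.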

### Trader Intern at Jane Street was asked...

Sep 20, 2011
What if you could reflip 1 coin of your choice? What would be the expected value then? (From the answers, the base game appears to be: flip four fair coins and receive $1 per head, for an expected value of $2.)

6 Answers

1. So I flip all my coins the first time and my expected value is 2. If all my coins are already heads (1/16 of the time) I stop and get $4: (1/16)(4) = 0.25. If I didn't get all heads the first time, I select any one of the tails and re-flip, giving a 50% chance of increasing my payout by $1, i.e. theoretically increasing my payout by (0.5)($1) = 50 cents. So the final EV is 0.25 + (15/16)(2.5) = 2.59375.
2. Not quite.
3. The probability distribution is (1/16, 4/16, 6/16, 4/16, 1/16) for (4H, 3H, 2H, 1H, 0H), with payoffs at the terminal nodes of (4, 4, 3, 2, 1), so EV (risk-neutral) = 41/16.
4. It's 79/32.
5. 79/32 is correct; the post before was wrong to assume that a reflip gets you a guaranteed heads.
6. The usual expectation is 2; now we have to value the option of flipping one additional coin. 1/16 of the time it's worthless; 15/16 of the time its (conditional) value is 1/2. So the answer is 2 + 15/32 = 79/32.
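The 79/32 figure can be confirmed by exact enumeration. This is a sketch, not from the thread, assuming the base game inferred from the answers (four fair coins, $1 per head): you reflip one tail whenever there is one, which adds 1/2 to the expected payout except in the all-heads case.

```python
from fractions import Fraction
from itertools import product

ev = Fraction(0)
for flips in product([0, 1], repeat=4):   # 1 = head; each outcome has prob 1/16
    heads = sum(flips)
    # If any tail exists, reflip it: worth +1/2 in expectation. All heads: stand pat.
    reflip_bonus = Fraction(1, 2) if heads < 4 else Fraction(0)
    ev += Fraction(1, 16) * (heads + reflip_bonus)

print(ev)  # 79/32
```

79/32 = 2.46875, matching the thread's closing argument: the base EV of 2 plus (15/16) × (1/2) for the reflip option.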