Summer Intern Interview Questions in New York, NY
Summer intern interview questions shared by candidates
Top Interview Questions
What are the methods of valuing a company?

2 Answers

The two main approaches are DCF (discounted cash flow) and relative valuation. Within DCF, FCFE (free cash flow to equity) is the relevant measure for equity shareholders, while FCFF (free cash flow to firm) is the relevant measure for the company as a whole.
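For the DCF part of the answer, the mechanics reduce to discounting projected free cash flows plus a terminal value back to the present. The sketch below uses entirely hypothetical cash flows, discount rate, and terminal growth rate, purely to illustrate the formula:

```python
# Minimal DCF sketch: discount projected free cash flows plus a Gordon-growth
# terminal value back to the present (all figures below are hypothetical).
def dcf_value(free_cash_flows, discount_rate, terminal_growth):
    pv_fcf = sum(fcf / (1 + discount_rate) ** t
                 for t, fcf in enumerate(free_cash_flows, start=1))
    terminal = free_cash_flows[-1] * (1 + terminal_growth) / (discount_rate - terminal_growth)
    pv_terminal = terminal / (1 + discount_rate) ** len(free_cash_flows)
    return pv_fcf + pv_terminal

print(round(dcf_value([100, 110, 120, 130, 140], 0.10, 0.02), 2))
```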
Tell me about a time when you were working in a team and your opinion was challenged.

1 Answer

I asked the team member to explain their opinion. No one is perfect, so keeping an open mind when listening to a team member's response is key. There were times a team member could convince me they were right, and other times when I could explain why my opinion might be the better option.
Flip a coin until either HHT or HTT appears. Is one more likely to appear first? If so, which one and with what probability?

14 Answers

Let A be the event that HTT comes before HHT.
P{A} = P{A|H}P{H} + P{A|T}P{T} = 0.5*P{A|H} + 0.5*P{A|T}.
P{A|T} = P{A}, therefore P{A|H} = P{A|T}.
P{A|H} = P{A|HH}P{H} + P{A|HT}P{T} = (0)(0.5) + P{A|HT}(0.5), therefore 2*P{A|H} = P{A|HT}.
P{A|HT} = P{A|HTT}P{T} + P{A|HTH}P{H} = (1)(0.5) + P{A|H}(0.5).
So 2*P{A|H} = 0.5 + P{A|H}(0.5), which gives P{A|H} = 1/3, and since P{A|H} = P{A}, we get P{A} = 1/3.
So HHT is more likely to appear first, and it appears first 2/3 of the time.

Regarding "P{A|H} = P{A|HH}P{H} + P{A|HT}P{T} = (0)(0.5) + P{A|HT}(0.5)": need help, why is P{A|HH} = 0?

P(A|HH) = 0 because after a sequence of consecutive heads, you can no longer achieve HTT. The moment you get a tail, you will have the sequence HHT. This is the reason HHT is more likely to occur first than HTT.

HHT is more likely to appear first than HTT. The probability of HHT appearing first is 2/3, and thus the probability of HTT appearing first is 1/3. Indeed, both sequences need an H first. Once an H has appeared, the probability of HHT is 1/2 (because all you need is one more H), and the probability of HTT is 1/4 (because you need TT). Thus HHT is twice as likely to appear first. So, if the probability that HTT appears first is x, then the probability that HHT appears first is 2x. Since these are disjoint and together exhaust the whole probability space, x + 2x = 1. Therefore x = 1/3.

You guys seem to be mixing up order being relevant and order being irrelevant. If order is relevant (meaning HHT is not the same as HTH), then HHT has a 1/8 chance of occurring in the first 3 tosses. HTT also has a 1/8 chance of occurring in the first 3 tosses, making them equally likely. Now, if order is not relevant (so HHT and THH are the same), then HHT has a (3 choose 2) * (1/8) probability of happening in the first 3 tosses. The same goes for HTT (which would be the same as THT, etc.), so it also has a (3 choose 2) * (1/8) probability of happening in the first 3 tosses. Either way they come out equally likely; please comment on my mistake if I am doing something wrong.

Ending up with HHT more likely (with probability 2/3).

HHT is more likely, with 2/3 probability.

People with wrong answers: did you not Monte Carlo this? It takes 5 minutes to write a program, and you can then easily see that 2/3 is correct empirically.

I don't get it. Shouldn't P{A|HH} = P{A|H}, in the same sense that P{A|HTH} = P{A|H}? From both HH and HTH we have the first H of HTT, so it should be P{A|HH} = P{A|HTH} = P{A|H}. Am I wrong?

The link below is the best solution I have seen for this problem: http://dicedcoins.wordpress.com/2012/07/19/flip-hhh-before-htt/

Here's my answer. Let x = probability of winning after no heads (or a tail), y = probability after just one head, z = probability after two heads, and w = probability after HT. Then x = (1/2)x + (1/2)y, y = (1/2)z + (1/2)w, z = 1/2 + (1/2)z, and w = (1/2)y. Therefore z = 1, y = 2/3, w = 1/3, and x = 2/3. We wanted x at the beginning, so the probability that HHT comes up first is 2/3.
Think of HHT as H(HT) and HTT as (HT)T. For every occurrence of (HT) in a sequence of flips, there is a 1/2 chance that an H occurred before the (HT) and a 1/2 * 1/2 chance that a T occurred after it. Thus HHT is twice as likely as HTT, and HHT comes first 2/3 of the time.
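Taking up the suggestion above to check this empirically, here is a minimal Monte Carlo sketch (function name mine) that flips a fair coin until one of the two patterns appears:

```python
import random

# Flip a fair coin until HHT or HTT appears; return whichever came first.
def first_pattern():
    last3 = ""
    while True:
        last3 = (last3 + random.choice("HT"))[-3:]
        if last3 in ("HHT", "HTT"):
            return last3

trials = 100_000
wins = sum(first_pattern() == "HHT" for _ in range(trials))
print(wins / trials)  # ~0.667: HHT appears first about 2/3 of the time
```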
3) Poker. 26 red cards, 26 black. You take one card at a time, and you can choose to guess whether it's red. You have only one chance. If you are right, you get $1. What's the strategy, and what are the expected earnings?

13 Answers

Expected earnings are 25 cents: 1/2 * 1/2 * 1, where the probability of choosing to guess is 1/2, the probability of guessing right is 1/2, and the payoff is $1.

I would start picking cards without making a decision, to reduce the sample size. This is risky because I could just as easily reduce my chances of selecting red by taking more red cards to start as I could increase my chances by picking more black cards first. But I like my chances with 52 cards that, at some point, I will at least get back to 50% if I start off by picking red. Ultimately, I can keep picking cards until there is only 1 red left, but I obviously wouldn't want to find myself in that situation, so I would do my best to avoid it by making a decision earlier rather than later. Best case, I pick more blacks out of the deck right off the bat. My strategy would be to first pick 3 cards without making a decision. If I start off by selecting more than 1 red, and thus the probability of guessing red correctly is below 50%, then I will look to make a decision once I get back to the 50% mark. (The risk here is that I never get back to 50%.) However, if I pick more than 1 black card, then I will continue to pick cards without making a choice until I reach 51%, ultimately hoping that I get down to a much smaller sample size, where variance is reduced and the odds are in my favor. The expected return, in my opinion, all depends on when you decide to guess. If you decide to guess when there is a 50% chance of selecting correctly, then your expected return is 50 cents (50% correct wins you $1, 50% incorrect wins $0). If you decide to guess when there is a 51% chance of selecting red correctly, then the expected return adjusts to (0.51 * $1) + (0.49 * $0) = 51 cents. So your expected return is a direct function of the probability of selecting correctly: 50% = 50 cents, 51% = 51 cents, 75% = 75 cents. Thoughts?

There is symmetry between red and black. Each time you pull a card it is equally likely to be red or black (assuming you haven't looked at the previous cards you pulled). Thus no matter when you guess, your odds are 50% and the expected return should be 50 cents.

Scheme: guess when the first card is black. p(guess) x p(right) x 1 = 1/2 x 26/51 = 13/51.

0.5: just turn over the first card to see if it's red. I think it's more about trading psychology. If you don't know where the price is going, just get out of the market ASAP. Don't expect anything.

The problem should be: randomly draw a card and don't put it back, and on every draw you have one chance to guess. So the strategy is: on the first draw you guess red at random. If correct, you get one dollar, and on the next draw you know there are fewer red than black, so you guess black. If your first guess was wrong, you guess red on the next round. It's all about conditioning on the information you know from the previous drawings.

This should be similar to the brainteaser about "picking an optimal place in the queue; if you are the first person whose birthday is the same as anyone in front of you, you win a free ticket." So in this case we want to find n such that P(first n cards are black) * P((n+1)th card is red | first n cards are black) is maximized, and call the (n+1)th card?
The problem statement is not very clear. What I understand is: you take one card at a time, and you can either choose to guess or look at it. If you guess and it's red, you gain $1, and whatever the result, after the guess the game is over. The answer is then $0.50 under whatever strategy you use. Suppose there are x red and y black cards left. If you guess, your chance of winning is x/(x+y). If you don't, and look at the card, and guess the next one, your chance of winning is x/(x+y)*(x-1)/(x+y-1) + y/(x+y)*x/(x+y-1) = x/(x+y), which is the same. A rigorous proof should obviously be done by induction, starting from x, y = 0, 1.

The answer above is not 100% correct. In the second scenario, if you don't guess and only look, the total probability of getting red is indeed the same. However, the fact that you look at the card means you know whether the probability of getting red is (x-1)/(x+y-1) or x/(x+y-1). Therefore, this argument only holds if you don't get to look at the card or have any knowledge of what cards you passed.

It doesn't matter what strategy you use; the probability is 1/2. It's a consequence of the Optional Stopping Theorem. The proportion of red cards left in the deck at each time is a martingale, and choosing when to stop and guess red is a stopping time. The expected value of a martingale at a stopping time is equal to its initial value, which is 1/2.

My strategy was to always pick the colour that has been drawn fewer times in the previous picks; naturally, that colour has a higher probability, because more of it is still in the deck. In the model, n = the number of cards already drawn, k = the number of black cards out of n, and m = min(k, n-k), i.e. the count of whichever colour has been drawn less often. After n draws, we can face n+1 different situations, i.e. k = 0, 1, 2, ..., n. To calculate the expected value of the whole game we need the probability of each situation and the probability of winning the next pick. Each situation has probability (n choose m)/2^n, since every outcome can happen in (n choose m) ways and the number of all possible outcomes is 2^n. In that situation the probability of winning is (26-m)/(52-n), because there are 26-m cards of the chosen colour in a deck of 52-n cards. Combining them: [(n choose m)/2^n] * [(26-m)/(52-n)]. Then we sum over k from 0 to n, and sum again over n from 0 to 51. (After the 52nd pick we don't have to choose, so we only sum to 51.) I hope it's not too messy without proper math signs. After all, this is a bit too much computation, so I wrote it quickly in Python and got 37.2856419726, which is a significant improvement over the basic strategy of always choosing the same colour.

Dynamic programming: let E(R,B) be the expected gain with R red and B black cards remaining, where the strategy is to guess whichever colour has more cards left. E(0,B) = B for all B, E(R,0) = R for all R, and E(R,B) = [max(R,B) + R*E(R-1,B) + B*E(R,B-1)]/(R+B). I don't know how to estimate E(26,26) quickly.

The question, to me, is not clear; perhaps on purpose. If so, the best answers would involve asking for clarification.
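The dynamic program in the second-to-last answer is quick to evaluate with memoization. A minimal sketch (function name mine), under the interpretation that you guess every card and always guess the colour with more cards remaining; note it comes out near 30 correct guesses, noticeably below the 37.29 figure above, presumably because that calculation weights the situations with binomial rather than hypergeometric probabilities:

```python
from functools import lru_cache

# E(r, b) = expected number of correct guesses with r red and b black cards
# left, always guessing the colour with more cards remaining.
@lru_cache(maxsize=None)
def E(r, b):
    if r == 0 or b == 0:
        return r + b          # all remaining guesses are certain
    return (max(r, b) + r * E(r - 1, b) + b * E(r, b - 1)) / (r + b)

print(E(2, 2))    # 2.8333... = 17/6 (the same recursion as the 2-ball game later on this page)
print(E(26, 26))  # ~30.04 expected correct guesses out of 52
```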
You are playing a game where the player draws a number from 1 to 100 out of a hat, and may replace it and redraw as many times as they want, with their final number being how many dollars they win from the game. Each redraw costs an extra $1. How much would you charge someone to play this game?

10 Answers

10?

Redraw 10 times and get a payoff around 77?

The average draw will pay out $50.50.

Every time you are deciding whether to play once more, consider the two options: stop now, and you get the current number (the $1 cost is sunk); continue, and the expected benefit would be $50.50 - 1 = $49.50. This means that as long as you have drawn more than $50 (inclusive), you should stop the game. Suppose the game ends after N rounds (with probability (49%)^(N-1) x 51%); in the last round the expected number is (50+100)/2 = 75, so the expected net benefit would be 75 - N. This shows N <= 74. Then we take the sum Sigma_{N=1}^{74} (49%)^(N-1) x 51% x (75 - N), which is 73.

All of the above answers are way off. For a correct answer, see Yuval Filmus' answer at StackExchange: http://math.stackexchange.com/questions/27524/fair-value-of-a-hat-drawing-game. Yuval Filmus proves that the value of the game is 1209/14 = 86.37, and the strategy is to stay on 87 and throw again on 86 and below.

Let x be the expected value and n be the smallest number you'll stop the game at. Set up an equation with x and n, get x in terms of n, take the derivative to find the n that maximizes x, plug in the ceiling (because n must be an integer), and find the maximum of x. The ceiling ends up being 87, and x is 87.357, so charge $87.36 or more.

I guess the question asks for the expected value of the game given an optimal strategy. I suppose the strategy is to go on to the next round if the draw is 50 or less. Hence, the expected value of each round is: (1) 1/2 * 1/50 * (51 + 52 + ... + 100), (2) 1/2 * 1/2 * 1/50 * (51 + 52 + ... + 100) - 1/2, (3) 1/2^3 * 1/50 * (51 + 52 + ... + 100) - 1/4, and so on. Sum all these up to infinity and you'd get 74.50.

This is all very interesting and I'm sure has some application... but to trading? I don't think so. I own a seat on the futures exchange and was one of the largest derivatives traders on the floor. Math skills and reasoning are important, but not to this level. I would compare day trading/scalping more to race car driving, i.e. getting a feel for what's going on during the day, the speed at which the market is moving, and the tempo of the up and down moves. If I were the interviewer at one of these firms, I'd throw a baseball at your head and see if you were quick enough and smart enough to duck. Then if you picked it up and threw it at my head, I'd know that you had the balls to trade. I know guys who can answer these questions, work at major banks, have a team of quants working for them, and call me up to borrow money from me because they're not making money.
At the end of the day, if you want to be a trader, then be a trader. If you want to be a mathematician, then be a mathematician. It's cool to multiply a string of numbers in your head (I can do this also), but never in my trading career did I make money because in an instant I could multiply 87 * 34 or answer Mensa questions. Realistically, the above answer is: it depends on the market, as the market will dictate the price. You may want to charge $87 to play that game, but you'd have to be an idiot to play it. In trading terms this means that when AAPL is trading at $700 everyone would love to buy it at $400; now that it's trading at $400, everyone is afraid that it's going to $0. Hope this helps. No offense to the math guys on this page, just want to set the trading record straight.
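Setting the trading commentary aside, the two figures quoted above are easy to reconcile numerically. The sketch below assumes the first draw is free and each redraw costs $1, and solves the fixed point V = E[max(X, V - 1)] with X uniform on 1..100; the result 1223/14 is about 87.36, which is exactly 1209/14 plus one dollar, suggesting the 1209/14 figure charges for the first draw as well. The implied threshold matches "stay on 87":

```python
# Value of the game assuming the first draw is free and each redraw costs $1:
# solve V = E[max(X, V - 1)], X uniform on {1, ..., 100}, by fixed-point iteration.
def game_value(n=100, cost=1, iters=1000):
    v = 0.0
    for _ in range(iters):
        v = sum(max(x, v - cost) for x in range(1, n + 1)) / n
    return v

v = game_value()
print(round(v, 3))                                   # 87.357, i.e. 1223/14
print(min(x for x in range(1, 101) if x >= v - 1))   # keep 87 or more, redraw on 86 or less
```

The iteration converges because the update is monotone and contracting once the threshold stabilizes; a few hundred passes are more than enough.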
Summer Analyst at Goldman Sachs was asked...
How many times do the hour and minute hands of a clock form a right angle during one day?

10 Answers

Each hour creates 2 right angles... 2 x 24 = 48 times a right angle is formed in one day.

Wrong. Think about what's happening around 3 and 9 o'clock.

4 times?

Let me show you a mathematical approach. The minute hand gains on the hour hand at a rate of 5.5 degrees a minute (because the hour hand moves 0.5 degrees a minute and the minute hand moves 6 degrees a minute). We start at 12 midnight, with the hands together. For subsequent 90-degree angles to occur, the minute hand must "overtake" the hour hand by 90 degrees, then 270 degrees, then 360 + 90 degrees, then 360 + 270 degrees, then 360 + 360 + 90 degrees, and so on. This can be re-expressed as (1)90, 3(90), 5(90), 7(90), 9(90), 11(90), ..., n(90). The number of minutes this takes can be expressed as (1)90/5.5, 3(90)/5.5, 5(90)/5.5, 7(90)/5.5, 9(90)/5.5, 11(90)/5.5, ..., n(90)/5.5. In one day there are 24 hr * 60 min = 1440 minutes. To find the maximum value of n, set n(90)/5.5 = 1440, giving n = 88; but as seen above, n must be an odd number (by pattern recognition and logic), hence n must be the next smallest odd number, 87. Counting 1, 3, 5, 7, 9, 11, ..., 87, the number of terms is (87-1)/2 + 1 = 44. In other words, the minute hand "overtakes" the hour hand on 44 occasions in 24 hours to give a 90-degree angle. Therefore the answer to your question is 44.

I agree with right_ans, although I got lost in his explanation, though I'm sure it is correct. I found another answer here: http://brainteaserbible.com/

Each hour has 2 occurrences of 90 degrees, so in 12 hours that would be 24 times. BUT hours 2 to 3 and 8 to 9 each have only 1, not 2, so subtract 2 from 24 and you get 22. In half a day (12 hours) you get 22 times, therefore in 1 day, i.e. 24 hours, it happens 22 * 2 = 44 times.

42

Relative speed is 5.5 degrees/min. Time is 24*60 minutes, so the total relative distance is 5.5*24*60 degrees. How many full circles is that? 5.5*24*60/360 = 5.5*4 = 22. Each full circle contains 2 right angles (90 and 270), so the answer is 22*2 = 44.

Calm down, we first must convert time to angle, two different units. The hour hand completes one full revolution every 12 hours (considering a 12-hour clock). So theta_h = 360 x h/12, where theta_h is the angle the hour hand makes with the 12 o'clock mark and h is the number of hours since 0h. So at 0h the angle is 0, and at 12 the angle is 360 = 0. Since h = m/60, where m is the number of minutes since 0h, we have theta_h = 360 x (m/60)/12 = 0.5m. Now, given the number of minutes since 0h, we can tell the angle of the hour hand using theta_h. The minute hand angle is theta_m = 360 x m/60 = 6m. So the difference is |theta_h - theta_m| = 5.5m, and given the minutes since 0h, we can tell the angle between the hour and minute hands. Now, how many minutes do we need to make |theta_h - theta_m| = 5.5m = 90? About 16.36. Since we have 12 x 60 minutes, we have 12 x 60 / 16.36 gaps that satisfy 90 degrees, which is 44.
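For a quick numerical check of the 44 answer, using the 5.5 degrees-per-minute relative speed from the answers above:

```python
# The minute hand gains 5.5 degrees per minute on the hour hand; the hands are
# perpendicular whenever the relative angle hits 90 degrees modulo 180.
times = [(90 + 180 * k) / 5.5 for k in range(100)]   # minutes after midnight
print(sum(1 for t in times if t < 24 * 60))          # 44 right angles per day
```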
Say I take a rubber band and randomly cut it into three pieces. What's the probability that one of the pieces has length greater than 1/2 of the original circumference of the rubber band?

9 Answers

3/4

Suppose you have two cuts on the rubber band placed randomly. The probability of having one segment greater than half the circumference is the probability that the third cut will be inside the combined range of 90 degrees to either side of the cuts. Since the average distance between the first two cuts is also 90 degrees, the combined range is 270 degrees, or 3/4 of the circle.

You need 3 cuts to end up with 3 pieces. The first cut doesn't matter. The second cut can also be anywhere and the largest piece will still be at least half the circumference. What matters is the third cut, which should lie in the same half as the second cut. So the probability is actually 1/2.

The correct answer is 3/4, as this problem is equivalent to the famous 3-points-on-a-semicircle problem. Why? If one of the pieces has length greater than 1/2 the circumference, then the three cut points must lie in the same semicircle. On the other hand, if the three cut points lie on the same semicircle, then the longest piece must be at least 1/2 of the circumference. For reference to the 3-points-on-same-semicircle problem, see e.g. http://godplaysdice.blogspot.com/2007/10/probabilities-on-circle.html

1/4

1 - 3/4

Suppose I have two points whose minor arc distance is t <= 1/2. Then the range of semicircles covering both points gives an arc length of (1/2 + 1/2) - t = 1 - t. Say we fix the first point; tracing the second point around gives minor arc lengths from 0 to 1/2 and then 1/2 back to 0. Therefore the answer is 2 * integral of (1 - t) from 0 to 1/2, which is 2(1/2 - 1/8) = 3/4.

It's 3/4. Cut the band once to make a line and pretend the length is 100. If the first cut is at x = 1, then as long as the second cut isn't between x = 50 and 51, one piece will have length greater than 50%, so there's a 99% chance. You can imagine that if the first cut were infinitely close to the end it would be about 100%. Now with the first cut at x = 2, the second can't be between x = 50 and 52; for x = 3 it's 50 to 53, and so on. So when the first cut gets infinitely close to 50, the forbidden region is pretty much x = 50 to 100, so there is a 50% chance you hit your spot. (Obviously 50-50 is 100%, but since the length is continuous there's little chance it lands exactly on that point.) By symmetry the same pattern runs from 50% back up to 100% at the other end. Since each point on the line is equally likely, the answer is clearly 75%.

This problem is also equivalent to: if you have a line segment from 0 to 1 and you make 2 random cuts on that segment, what is the probability that the three resulting pieces do NOT make a triangle?
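As with the coin-flip question, this one is easy to verify by simulation. A minimal Monte Carlo sketch: cut a loop of circumference 1 at three uniform points and look at the largest arc.

```python
import random

# Cut a unit-circumference loop at three uniform points; the pieces are the
# arcs between consecutive cut points (wrapping around the loop).
def largest_piece():
    a, b, c = sorted(random.random() for _ in range(3))
    return max(b - a, c - b, 1 - c + a)

trials = 200_000
print(sum(largest_piece() > 0.5 for _ in range(trials)) / trials)  # ~0.75
```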
At a party everyone shakes hands; 66 handshakes occur. How many people are at the party?

9 Answers

12

Requires explanation.

It's not 12, it's 11.

12

Sum of the series: 66 = n/2 * (2*1 + (n-1)*1), so n = 12.

You can suppose there are n people in the room and think of them in a row. The first one has to shake hands with (n-1) people (because he doesn't have to shake hands with himself). The second one has already shaken hands with the first, so he has (n-2) shakes remaining, and so on. So you have to sum (n-1) + (n-2) + (n-3) + ... + 1 = (n/2)*(n-1). Then you solve (n/2)*(n-1) = 66 and get n = 12.

n(n-1)/2 = 66, so n = 12.

You can think of this as a combinations problem: (N choose 2) = 66, then solve for N. That is, N!/(2!(N-2)!) = 66. Simplifying this leads to N(N-1)/2 = 66, and the integer solution is N = 12.

66 = x!/((x-2)!*2), which gives 12.
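A trivial brute-force check of n(n-1)/2 = 66, nothing more:

```python
# Find n with n*(n-1)/2 == 66, i.e. n choose 2 handshakes.
print(next(n for n in range(2, 1000) if n * (n - 1) // 2 == 66))  # 12
```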
Summer Intern at Five Rings was asked...
• Is 1027 a prime number? • How would you write an algorithm that identifies prime numbers? • 2 blue and 2 red balls in a box, no replacing. Guess the color of the ball; you receive a dollar if you are correct. What is the dollar amount you would pay to play this game?

8 Answers

An algorithm for testing prime numbers is trial division: test whether the number is divisible by any integer from 2 up to its square root. For the color guessing game, the expected number of dollars you get is the average number of positional matches between a random permutation of rrbb and rrbb, which is 2.

For the prime number testing, only the number 2 and then odd numbers need to be tested: if the number is not divisible by 2, there is no need to test any other even number. So start with 2, then 3, then increment by 2 after that (3, 5, 7, 9, ...) until you are greater than the square root (then it's prime) or you find a divisor (then it's not prime). To test for divisibility, we are looking for a remainder of zero: use a MOD function if available, or take the integer portion of the quotient and subtract it from the actual quotient; if the difference is zero, the remainder is zero and we have a divisor. In this case, we find that dividing by 13 gives 79 with no remainder, so 1027 is not prime. For the guessing game, the minimum winnings are $2 every time with the proper strategy. I'm assuming the rules are that you pay to play and you get to guess until there are no more marbles. Say you guess wrong on the first attempt (you guess blue and it was red). Now you know there are 2 blue, 1 red, and your logical choice is to choose blue again, since there are more of them. Say you guess wrong again; now you know there are 2 blue left, so you will win on both of the last 2 draws. If you were correct on one or both of the first two trials, then you could wind up with an even chance on the third trial, so you would win that some of the time, and you'll always win on the last trial.

David, I think we could pay more than $2 and still come out on top. Your logic seems sound, but looking at the probabilities I see 1/2 + 1/3*(2) + 2/3*(5/2) = 17/6 = ~2.83. Choosing the first ball, we obviously have an expected value of 1/2. Then, WLOG, we are left with RRB. Clearly we then choose R, as this gives us a 2/3 shot at picking correctly. If it is R, then we get that $1, have a 50% shot at the next, and are assured of the last, giving us, on average, $2.50. If it is B, then we know the next two will be R, giving us $2. As you can see, with an optimal strategy we should expect to make ~$2.83 per round.

Take the square root of 1027. You get 32.04, so you only need to check divisibility by the primes from 2 to 32, which are 2, 3, 5, 7, 11, 13, 17, 19, 23, 29, and 31.

For an algorithm, see the Lucas test on Wikipedia, where there is also pseudocode.

1027 = 1000 + 27 = 10^3 + 3^3, and you know you can factor a^3 + b^3.

1027 = 2^10-1 = (2^5-1)(2^5+1), prime number ez. Drawing balls is worth 17/6 dollars: the first draw is worth 0.5, and the rest are worth 2/3 * 2.5 + 1/3 * 2.
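Since the second bullet explicitly asks for an algorithm, here is a minimal trial-division sketch along the lines the answers describe, together with a brute-force check of the 17/6 value for the ball game under the greedy strategy (always guess the colour with more balls remaining); function names are my own:

```python
import math
from itertools import permutations

# Trial division: check 2, then odd numbers up to the square root.
def is_prime(n):
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    for d in range(3, math.isqrt(n) + 1, 2):
        if n % d == 0:
            return False
    return True

print(is_prime(1027))  # False: 1027 = 13 * 79

# Expected winnings for the 2-red/2-blue game when you always guess the colour
# with more balls remaining, averaged over the 6 equally likely orderings.
def winnings(order):
    red, blue, total = 2, 2, 0
    for ball in order:
        guess = 'R' if red >= blue else 'B'
        total += (guess == ball)
        if ball == 'R':
            red -= 1
        else:
            blue -= 1
    return total

orders = set(permutations('RRBB'))
print(sum(winnings(o) for o in orders) / len(orders))  # 2.8333... = 17/6
```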
2) A. 10 ropes, each with one red end and one blue end. Each time, take out a red end and a blue end and tie them together. Repeat 10 times. What is the expected number of loops? B. 10 ropes, no colors; everything else stays the same.

7 Answers

1/10 + 1/9 + ... + 1? B is similar: 1/19 + 1/17 + etc.

In B, E[n] = 1/n + (n-1)/n * E[n-1] = 1/n + E[n-1].

For the case of n = 10, you would sum: 1/10 + 1/9 + 1/8 + 1/7 + ... + 1/2.

Add an extra 1 to the previous answer.

For part A), the answer is 1 + 1/2 + 1/3 + ... + 1/10. For part B), the answer is 1 + 1/3 + 1/5 + ... + 1/19. Explanations: for part A), ctofmtcristo has the right approach but with a typo in the equation for E[n]. To obtain the expected number of loops, note that the first red end has a 1/n chance of connecting with its own rope's blue end (forming a loop) and an (n-1)/n chance of connecting with a different rope's blue end (not yet forming a loop), so E[n] = 1/n*(1 + E[n-1]) + (n-1)/n*E[n-1] = 1/n + E[n-1], with base case E[1] = 1. Then, by induction, E[n] = 1 + 1/2 + 1/3 + ... + 1/n. Part B) is similar: the first end now has 2n-1 possible ends to connect to, of which 1 is its own rope's other end and 2n-2 belong to different ropes. Then E[n] = 1/(2n-1)*(1 + E[n-1]) + (2n-2)/(2n-1)*E[n-1] = 1/(2n-1) + E[n-1], with base case E[1] = 1. By induction, E[n] = 1 + 1/3 + 1/5 + ... + 1/(2n-1).

Ed's answer is not right. Just check the case of 3 pairs. The total number of cases is 3! = 6: 1 case with 3 loops, 2 cases with all wrongly attached, and 3 cases with 1 loop. So the expected value is (3/6)*(1) + (1/6)*(3) = 6/6 = 1, and Ed's answer gives 1 + 1/2 + 1/3 = 11/6, which is clearly wrong.

Timi, you are missing the fact that if they are "all wrongly attached" then they still form a loop. Similarly, the case you are thinking of "with 1 loop" actually has 2 loops. The correct answer is still 11/6.
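Here is a small sketch that evaluates Ed's two closed forms and checks part A by simulation. For the coloured case, joining each red end to a uniformly random blue end is just drawing a random permutation, and the loops are its cycles:

```python
import random

print(sum(1 / k for k in range(1, 11)))            # part A: 1 + 1/2 + ... + 1/10, about 2.93
print(sum(1 / (2 * k - 1) for k in range(1, 11)))  # part B: 1 + 1/3 + ... + 1/19, about 2.13

# Simulation of part A: rope i's red end is tied to the blue end of rope perm[i];
# loops correspond to the cycles of the random permutation perm.
def loops(n=10):
    perm = list(range(n))
    random.shuffle(perm)
    seen, count = [False] * n, 0
    for i in range(n):
        if not seen[i]:
            count += 1
            j = i
            while not seen[j]:
                seen[j] = True
                j = perm[j]
    return count

trials = 100_000
print(sum(loops() for _ in range(trials)) / trials)  # about 2.93, matching part A
```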
See Interview Questions for Similar Jobs
- Investment Banking Analyst
- Summer Analyst
- Intern
- Analyst
- Financial Analyst
- Sales and Trading Summer Analyst
- Associate
- Investment Banking Associate
- Business Analyst
- Equity Research Associate
- Consultant
- Vice President
- Investment Analyst
- Operations Analyst
- Managing Director
- Research Analyst