Research Associate Interview Questions in New York, NY
Research Associate interview questions shared by candidates
Top Interview Questions
Research Analyst at Aksia was asked...
If you had a machine that produced $100 dollars for life, what would you be willing to pay for it today?

75 Answers

- No more than $100 for $100 for life.
- Depends entirely on how fast the machine produces the bills. The current price will be determined by the estimated future yield, which isn't defined here.
- I would pay nothing for it. $100 in a lifetime? I can make that in one day...
- I would not pay anything. Only the Federal Reserve can legally produce $100 bills.
- I would pay for everyone in my family to get a four-year college degree and then their master's degree. I would cover the cost of books, supplies, and dorm rooms or apartments near campus. I would buy a new car if they held a GPA over 2.00. I would buy them nice clothes and shoes so that they looked nice while attending class. I would also buy them a new coat to keep warm and an umbrella to keep the rain off their heads. I would value education more than I can afford to do now.
- I already have the machine, so why would I pay for it?
- Nothing, I already have it.
- Absolutely nothing. $100 for a lifetime is ridiculous.
- What is "$100 dollars"? That reads as "one hundred dollars dollars." What's a dollar dollar? Is it double value? Say $100 or one hundred dollars; do not say both, in order to avoid being redundant.
- As much as I could lay my hands on. It could all be repaid instantly.
- I would pay nothing for a machine that produced dollars at a cost of $100 each. What would you offer me to haul the machine away for you?
- Nothing. Fiat money is either rendered worthless by uncontrollable inflation, or $100 over a lifetime is not worth much.
- Firstly, if it made $100 for life, it wouldn't be worth anything. Secondly, if it's making money, it's illegal.
- Ten to twenty years is what I would have to pay for a machine like that; life is only for murder with special circumstances.
- Does such a machine exist? What? No? Then why are you asking this question... moron.
- No. Not worth it.
- Why is this considered a top-10 odd interview question? It's a basic accounting question that applies to any applicant at a financial institution. Let's assume the proper phrasing is "If you had a machine that produced a free $100 per year for life, what would you be willing to pay for it today?" Given that Aksia is a financial firm, they're basically asking for the present value of a perpetuity with a $100 annual payment: PV = PMT / r, where PV is the present value, PMT is the payment per period, and r is the discount rate. Given the current US Federal Reserve discount rate of 0.75%, the present value of such a device would be $13,333.33. The answer obviously varies if the discount rate changes or if the intended phrasing was $100 over a different time period.
- Nothing. According to the question, I already have it.
- I am assuming the question is stated incorrectly. It should read "$100 per year for life." Let's say I expect to live another 50 years; in nominal terms the machine then produces $50,000. My time-preference rate is 4 percent per year, and let's suppose the Federal Reserve does a good job of keeping inflation in its target range and averages 2 percent, so the total discount rate is 6 percent per year. In present-value terms, the machine produces $100 this year, $94.34 next year (i.e. $100/(1+r)^n), $89.00 in two years, and so on, for a sum of $1,676.19. This is the present value of the machine, and I would pay this much or less to own it. Now suppose the machine actually produces a real $100 (i.e. the amount increases with inflation). In that case the discount rate is 4 percent and I would pay a maximum of $2,248.22. Obviously I'm using a calculator (Excel) to make these calculations, but the idea is the same.
- What are we talking about here? $100 per day? Week? Month? Only once in a lifetime? Oh, and does the IRS have to know about this? Be specific.
- Does it produce $100 every minute of your life or just once in your life? Will it produce enough for me to make bail and hire a lawyer after I'm arrested for counterfeiting?
- Press the NPV button on a calculator.
- 1) As many have stated, this is illegal. But putting that aside, 2) the way the question is phrased, technically "you" already have the machine, so why would you pay anything? There is also a lack of clarity about how often the machine produces $100 - daily, weekly, monthly? The output frequency changes the entire question (if you only receive $100 once, why waste any percentage of that $100 paying for the machine itself?). 3) Assuming the machine provides $100 more than once, does it print a single $100 bill or 100 $1 bills (or other denominations)? Based on that, what are the operating costs of this machine? How much does the paper cost? The ink? The maintenance? The interviewer may or may not answer those questions depending on the interview style and format, so I'd just go with "It's an illegal practice and I'd never do something so horrific and undermine the honest principles this country was founded upon!" :P
- Somebody has been watching too much Twilight Zone.
- Whoever formulated this question does not speak good English. The question has no real discernible meaning. They should be fired and a qualified person should be given the job.
- I will pay with a blank cheque; the seller puts in whatever he or she wants. I just have to print it and deposit it in my bank account.
- I assume I can print the $100 an unlimited number of times a day. I would tell the manufacturer that I can't pay for the machine now, but I'm willing to pay them $1,000 per day for the rest of my life. That comes to just 10 bills for them each day, and I'll print another 100 for myself each day. That's $10,000 for me every day!
- Please define what "$100 dollars" is. Is it $100 or 100 dollars? USD? When does it produce this undefined amount? For my life or its? What are its operating and disposal costs? Are those in $ dollars also? Where do you spend $ dollars? There is nothing to base an answer on.
- Life is worth more than a hundred dollars!
- If I already have it, to whom would I pay anything, and why? Silliness aside, you would need to know at least: the payment frequency, your life expectancy, and some discount yield (subjective or otherwise). Then it's a matter of the present value of an annuity, or of a perpetuity (if you are allowed to pass payments on after death).
- It produces $100 for each life it takes? Probably evil; would avoid.
- I already have it, so I don't need to pay for it. But if I didn't have it and had to buy it, the details mentioned above would need to be answered, such as whether it is $100 a day, a month, or a year, and how many times you can take money out. You are basically putting your own money into a machine to give it back to you when you need it, sort of like an ATM.
- Nothing, I already have it!
- The wording makes no sense.
- Saving money by using a hybrid car, LED bulbs, efficient appliances and HVAC, or solar is easier to calculate and pays far more than $100 a year.
- With that $100 I will buy another such machine. Now I have 2 machines, each generating $100, which is $200. With that $200 I will buy another 2 machines, so now I have 4 machines; then another 4 machines with $400, and so on. I will keep multiplying until I have enough money to solve everyone's problems for their lifetime!
- I would use it to buy a truckload of English usage manuals and distribute them to people who insist on uninformed mash-ups like "$100 dollars". Are they anything like "ATM machines" or "square acres"?
- Charity.
- I'd be really worried that if it put out $100 on the first day, I'd be dead the next day.
- Honestly, most people here haven't truly read the question. If I already had a machine that would make $100 bills for the rest of my life, why would I have to pay any money at all to get it? The majority of people answering this question haven't even comprehended it properly. Seriously.
- I would pay $6,500 for it. I plan to live to about 85, so I'd set up a payment plan and pay it off with the machine, one hundred every year.
- I will not pay anything.
- $100 is the least I could get for a free $100. That's all it's worth if that's what it pays for life.
- In the first place this is a trick question. It wasn't stated that the machine would produce $100 bills; it stated $100 for life. In the second place it would be counterfeiting, so possession of this machine could get you a prison sentence, and no amount of money is worth losing years of your life. I've noticed most of these questions are trick questions, like "how many people flew" - people don't fly, planes and birds do.
- One dollar. Why? Because I know a value when I see one. My deal, my value. Secure that one and you have it made.
- Anything less than $100. I'd want at least a 500% profit, so $20 at most.
- I'd pay someone to stop coming up with these stupid questions. Makes me not want to work for anyone. No wonder so many jobs go unfilled. Who wants to work with or for people that start off our first meeting with stupid head games? Chances are all the job involves is pushing buttons, referring to manuals, taking calls, reading emails, and forwarding and escalating issues to other employees. If I wanted to solve riddles, I'd apply to the Riddle Factory, where such questions would be relevant.
- $100USD Dollars for life.
- Well, if I had the machine, as the question indicates, then I would not have to pay for it, since the beginning of the question indicates I already own it.
- I'd leave the interview with a company that produces such an incredibly poorly worded question. "$100 dollars?" So "one hundred dollars dollars?" OK. Also, it doesn't specify any rate at which it produces money. It doesn't say whether it produces $100 per day, nor whether it produces X dollars per day until it reaches $100 in a lifetime. As written, this question makes no sense, and the writer of it should be ashamed.
- I wouldn't pay anything for it today, or the day I got it. What are "dollars for life" anyway? And why do they cost $100?
- I want to earn money from my work, not from a machine.
- Who is Life, and why would I want to pay for his/her machine?
- I would pay up to $50 for it. Make $60, $70, $80 or $90, then sell it for $50+.
- Why would I pay for something that I already have?
- I would first ask the frequency of distribution - whether it's daily, weekly, monthly, etc.
- I would say that I have a policy of never buying something without testing it first, and then, standing in front of the machine, I would ask: how much do you want for it? Money is no object! :)
- I would first ask how easy it is to reproduce the machine. If it costs less than $100 to reproduce, I would just build as many machines as possible and earn the difference. Then I would do some math to calculate the optimal price to pay for a desired annual income, and maybe in the future invest some of the income to reduce production costs, reuse old machines, etc.
- I think the question has probably been misquoted. It probably meant "How much would you pay for a machine giving you $100 per year for the rest of your life?" It's basically the PV of future cash flows; if it runs into perpetuity, PV = coupon / discount rate, the coupon being the $100. The discount rate is the prevailing interest rate - usually take the yield of the 30-year UST here (say 3% for argument's sake) and add a bit more if you feel there is any risk in holding the machine (like some fool with a loaded gun trying to steal it from you): 100 / 0.03 = $3,333. That is the maximum one should pay.
- I'll pay for another machine that produces money in such an easy way, for sure.
- Nothing. It clearly is a scam.
- Learn to write a coherent question and then get back to me.
- 1) Why would I pay for a machine I already own? 2) $100 for life is just a one-time payment of $100.
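The present-value answers above boil down to two standard formulas. Here is a minimal Python sketch of the figures quoted there, assuming annual payments (the helper names pv_perpetuity and pv_annuity are illustrative, not from any answer):

```python
def pv_perpetuity(payment, rate):
    """Present value of `payment` received at the end of every year, forever."""
    return payment / rate

def pv_annuity(payment, rate, years):
    """Present value of `payment` received at the end of each of the next `years` years."""
    return payment * (1 - (1 + rate) ** -years) / rate

print(pv_perpetuity(100, 0.0075))        # 13333.33 -- the perpetuity answer at a 0.75% rate
print(100 + pv_annuity(100, 0.06, 50))   # 1676.19  -- $100 now plus 50 annual payments discounted at 6%
print(100 + pv_annuity(100, 0.04, 50))   # 2248.22  -- the same cash flows discounted at 4%
```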
3) Poker: 26 red cards and 26 black cards. You draw one card at a time, and before any draw you may choose to use your single guess that the next card is red. You get only one chance; if you are right, you get $1. What is the strategy, and what is the expected gain?

13 Answers

- The expected gain is 25 cents: 1/2 * 1/2 * 1 - the probability of choosing to guess is 1/2, the probability of guessing right is 1/2, and the payoff is $1.
- I would start picking cards without making a decision, to reduce the sample size. This is risky because I could just as easily reduce my chances of selecting red by taking more red cards at the start as I could increase them by picking more black cards first, but I like my chances with 52 cards that at some point I will at least get back to 50% if I start off by picking red. Ultimately, I can keep picking cards until there is only one red left, but I obviously wouldn't want to find myself in that situation, so I would do my best to avoid it by deciding earlier rather than later. Best case, I pick more blacks out of the deck right off the bat. My strategy would be to first pick 3 cards without making a decision. If I start off by selecting more than 1 red, so that the probability of guessing red correctly falls below 50%, I will look to make a decision once I get back to the 50% mark (the risk being that I never get back to 50%). However, if I pick more than 1 black card, I will continue to pick cards without making a choice until I reach 51% - ultimately hoping to get down to a much smaller sample size, with variance reduced, while the odds are in my favor. The expected return, in my opinion, depends entirely on when you decide to guess. If you guess when there is a 50% chance of being correct, your expected return is 50 cents (50% correct wins you $1; 50% incorrect wins $0). If you guess when there is a 51% chance of selecting red correctly, the expected return becomes (0.51 * $1) + (0.49 * $0) = 51 cents. So your expected return is a direct function of the probability of selecting correctly: 50% = 50 cents, 51% = 51 cents, 75% = 75 cents. Thoughts?
- There is symmetry between red and black. Each time you pull a card it is equally likely to be red or black (assuming you haven't looked at the previous cards you pulled). Thus no matter when you guess, your odds are 50% and the expected return should be 50 cents.
- Scheme: guess when the first one is black. p(guess) x p(right) x 1 = 1/2 x 26/51 = 13/51.
- 0.5 - just turn the first card to see if it's red. I think it's more about trading psychology: if you don't know where the price is going, just get out of the market as soon as possible. Don't expect anything.
- The problem should be: randomly draw a card without replacement, and on every draw you have one chance to guess. So the strategy is: after the first draw you randomly guess it's red. If correct you get one dollar; on the next draw you know there are fewer reds than blacks, so you guess black. If your first guess was wrong, you guess red on the next round. It's all about conditioning on the information you have from the previous drawings.
- This should be similar to the brainteaser about picking an optimal place in the queue ("if you are the first person whose birthday matches someone in front of you, you win a free ticket"). So here we want to find n such that P(first n cards are black) * P((n+1)th card is red | first n cards are black) is maximized, and call the (n+1)th card?
- The problem statement is not very clear. What I understand is: you take one card at a time, and you can either choose to guess or look at it. If you guess and it's red, you gain $1, and whatever the result, the game ends after the guess. The answer is then $0.5 under whatever strategy you use. Suppose there are x red and y black remaining: if you guess, your chance of winning is x/(x+y). If you don't, and instead look at the card and consider the next one, your chance of winning is x/(x+y) * (x-1)/(x+y-1) + y/(x+y) * x/(x+y-1) = x/(x+y), which is the same. A rigorous proof should obviously be done by induction, starting from x, y = 0, 1.
- The answer above is not 100% correct. In the second scenario, if you don't guess and only look, the total probability of getting red is indeed the same. However, the fact that you look at the card means you then know whether the probability of getting red is x/(x+y)*(x-1)/(x+y-1) or y/(x+y)*x/(x+y-1). So this argument only holds if you don't get to look at the card, or have no knowledge of which cards you passed.
- It doesn't matter what strategy you use; the probability is 1/2. It's a consequence of the Optional Stopping Theorem. The fraction of red cards left in the deck at each time is a martingale, and choosing when to stop and guess red is a stopping time. The expected value of a martingale at a stopping time is equal to its initial value, which is 1/2.
- My strategy was to always pick the colour that has been drawn fewer times in the previous picks; naturally, that colour has a higher probability, because more of it remains in the deck. In the model, n = the number of cards already chosen, k = the number of black cards out of n, and m = min(k, n-k), i.e. the count of whichever colour has been taken less often. After n takes we can face n+1 different situations, k = 0, 1, 2, ..., n. To compute the expected value of the whole game we need the probability of facing each situation, which can be computed with combinations, and the probability of winning the next pick. Every situation has probability (n choose m)/2^n, since every outcome can happen in (n choose m) ways and there are 2^n possible outcomes. In that situation the probability of winning is (26-m)/(52-n), because there are 26-m cards of the chosen colour among the 52-n remaining. Combining them gives [(n choose m)/2^n] * [(26-m)/(52-n)]; then sum over k from 0 to n, and sum again over n from 0 to 51 (after the 52nd pick there is nothing to choose, so we only sum to 51). This is a bit too much computation by hand, so I wrote it quickly in Python and got 37.2856419726, a significant improvement over the basic strategy of always choosing the same colour.
- Dynamic programming: let E(R, B) be the expected gain with R red and B black remaining, where the strategy is to guess whichever colour is more plentiful in the rest. E(0, B) = B for all B, E(R, 0) = R for all R, and E(R, B) = [max(R, B) + R*E(R-1, B) + B*E(R, B-1)] / (R+B). I don't know how to estimate E(26, 26) quickly.
- The question, to me, is not clear - perhaps on purpose. If so, the best answers would involve asking for clarification.
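For the single-guess reading of the question used in the answers above (you may commit your one "red" guess before any draw), a short dynamic program confirms the optional-stopping conclusion that no strategy beats 1/2. This is a sketch under that assumption; the function name is mine:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def best(r, b):
    """Best win probability with r red and b black cards left, if you may either
    commit your single 'red' guess before the next draw or pass and keep watching."""
    if r == 0:
        return 0.0   # only black cards remain: a red guess can never win
    if b == 0:
        return 1.0   # only red cards remain: guess now and win for sure
    guess_now = r / (r + b)
    keep_watching = r / (r + b) * best(r - 1, b) + b / (r + b) * best(r, b - 1)
    return max(guess_now, keep_watching)

print(best(26, 26))   # 0.5 -- waiting never helps, matching the optional-stopping argument
```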
Quantitative Researcher at Citadel was asked...
Given log X ~ N(0, 1), compute the expectation of X.

13 Answers

- This is a basic probability question.
- Exp[1/4]
- exp(mu + sigma^2/2) = exp(0 + 1/2) = exp(1/2)
- Let Y = log(X), so X = exp(Y) = r(Y). If we call the pdf of X f(X), then E[X] = integral(X f(X) dX). By a change of variables, f(x) = g(r^-1(X)) (r^-1(X))'; plugging this into E[X] = integral(X f(X) dX), we get integral(f(y) dy), which equals 1.
- Suppose the density function of Y is P(y) and the one for X is F(x); then P(y) dy = F(x) dx, and the expectation of X is E(x) = integral(x F(x) dx) = integral(exp(y) P(y) dy). If you plug in the Gaussian density, you find E(x) = integral(exp(1/2) P(y - 1/2) d(y - 1/2)) = exp(1/2). So, mojo's answer is correct.
- I'm not so sure, as I got E(x) = 4. I substituted log X = y, e^y = X, and e^(2y) = t - and please do not forget to change the integration limits.
- Do they care if you explain the theory or not? I just looked at it; it's standard normal, therefore x = 50%... Sorry, misread the problem - ignore.
- X has a log-normal distribution, so yes, the mean is exp(mu + sigma^2/2) = exp(1/2).
- Expanding on the correct answers above: E[X] = E[exp(log X)], and log X is normally distributed, so E[X] is the moment-generating function (MGF) of a standard normal distribution evaluated at 1. The MGF of a normal distribution with mean mu and standard deviation sigma is exp(mu*t + (1/2)*sigma^2*t^2); now set mu = 0, sigma = 1, t = 1 to get exp(1/2).
- Complete the square in the integral.
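A quick numerical check of the exp(1/2) answer, assuming the standard lognormal reading of the question (a sketch, not part of any answer above):

```python
import numpy as np

rng = np.random.default_rng(0)
y = rng.standard_normal(1_000_000)  # Y = log X ~ N(0, 1)
x = np.exp(y)                       # so X is lognormal

print(x.mean())                     # ~1.65 in simulation
print(np.exp(0.5))                  # analytic value exp(mu + sigma^2/2) = exp(1/2) ~ 1.6487
```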
Quantitative Researcher at Jane Street was asked...
If X, Y and Z are three random variables such that X and Y have a correlation of 0.9, and Y and Z have a correlation of 0.8, what are the minimum and maximum correlations that X and Z can have?

8 Answers

- 0.9
- http://www.johndcook.com/blog/2010/06/17/covariance-and-law-of-cosines/
- 0.98 & 0.46
- http://wolfr.am/1i1XT4P
- How'd you get 0.98 and 0.46?
- The correlation matrix must be non-negative definite, so det(Sigma) >= 0, which gives 0.98 and 0.46.
- Minimum: 0.9*0.8 - sqrt(1 - 0.9^2) * sqrt(1 - 0.8^2) = 0.4585. Maximum: 0.9*0.8 + sqrt(1 - 0.9^2) * sqrt(1 - 0.8^2) = 0.9815.
- How do you know this?
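A short check of the 0.4585 to 0.9815 range using the non-negative-definiteness argument mentioned above (a sketch; the helper name is mine):

```python
import numpy as np

def min_eig(rho_xz):
    """Smallest eigenvalue of the 3x3 correlation matrix for a given rho(X, Z)."""
    c = np.array([[1.0, 0.9, rho_xz],
                  [0.9, 1.0, 0.8],
                  [rho_xz, 0.8, 1.0]])
    return np.linalg.eigvalsh(c).min()

lo = 0.9 * 0.8 - np.sqrt(1 - 0.9**2) * np.sqrt(1 - 0.8**2)  # ~0.4585
hi = 0.9 * 0.8 + np.sqrt(1 - 0.9**2) * np.sqrt(1 - 0.8**2)  # ~0.9815
print(lo, hi)
print(min_eig(lo), min_eig(hi))              # ~0 at both endpoints: the matrix becomes singular
print(min_eig(0.40) < 0, min_eig(0.70) > 0)  # outside the range is infeasible, inside is fine
```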
Quantitative Researcher at Jane Street was asked...
You have two decks of cards: a 52-card deck (26 black, 26 red) and a 26-card deck (13 black, 13 red). You randomly draw two cards and win if both are the same color. Which deck would you prefer? What if the 26-card deck was randomly drawn from the 52-card deck - which deck would you prefer then?

11 Answers

- I responded immediately to the first part. The second part took me a bit longer - I immediately said that my intuition was that the third deck and the first deck were equally good, but I couldn't give a rigorous proof very quickly (it took about 30 seconds or so).
- Actually, I think the third deck is better than the first deck. That is because it says to "draw two cards of the same color," not "draw two black cards." Compare the following decks: a deck with 13 black and 13 red, a deck with 26 black, and a deck with 26 red. The chances of drawing two cards of the same color are 6/25, 1, and 1 respectively. You can see with a little math that any distribution of 26 cards is better than or equally as good as a distribution of 13 red and 13 black cards.
- @curious_cat I think that only implies that the third deck is better than the second deck (the second has 13/13 while the first has 26/26).
- 1) P(win | 52-card deck) = 25/51. P(win | even 26-card deck) = 12/25. The 52-card deck is better. 2) P(win | n red cards in a random 26-card deck) = (n/26)*((n-1)/25) + ((26-n)/26)*((26-n-1)/25) = n^2/325 - 2n/25 + 1. Taking the derivative and solving for the root: P' = 2n/325 - 2/25 = 0 -> n = 13, which is a minimum. Interpretation: having equal numbers of red and black cards in the deck MINIMIZES your chance of winning. Because the last deck is the same as the second deck (26 cards, split evenly red/black) except that the split may be uneven, the last (randomly selected) deck is better than the evenly split deck - but is it better than the 52-card deck? For this, we use the hypergeometric distribution (like the binomial distribution, but for trials without replacement) to get the odds of a 26-card deck with n red cards: P(selecting n red cards for the random 26-card deck) = [(52-26) C (26-n)] * [26 C n] / [52 C 26] = (2^43 * 3^17 * 5^12 * 7^4 * 11^4 * 13^4 * 17 * 19^2 * 23^2) / (29 * 31 * 37 * 41 * 43 * 47 * (n!)^2 * ((26-n)!)^2). From here, all that's left is to combine these probabilities with the probability of winning, P(win | n red cards in deck) = n^2/325 - 2n/25 + 1, for each deck containing 0 through 26 red cards. If the result is larger than 25/51, we can say definitively that we would prefer the randomly selected 26-card deck to the even 52-card deck. However, doing this out reveals that the probability of winning with the randomly selected deck is exactly 25/51. Therefore, the odds of winning are THE SAME with either the first deck (even 52-card deck) or the last (26-card deck randomly selected from an even 52-card deck). Imagine, all that math to prove a simple equality! :) Q.E.D.
- The 3rd deck is the same as the 1st deck; we do not need to calculate it by hand. P(I randomly pick 2 cards from a 52-card deck) = P(I always pick the top 2 cards of the 52-card deck) = P(you shuffle the deck, then I pick the top 2 cards) = P(you shuffle the deck, throw away the bottom half, then I pick the top 2 cards) = P(picking a random 26-card deck, then I pick the top 2 cards) = P(picking a random 26-card deck, then I randomly choose 2 cards from it). By this logic, even if you only picked a random 4-card deck from the 52 cards for me to choose 2 cards from, it would be the same probability as choosing 2 cards at random from the 52-card deck directly.
- The 3rd deck is better. Suppose the 3rd deck has k red cards. The probability of getting 2 cards of the same colour is (C(k,2) + C(26-k,2)) / C(26,2). It is easy to see that this is minimized at k = 13, which is the second deck. So essentially any random 26 cards is at least as good as a 13-13 split.
- Splitting a blind draw into two draws doesn't change your distribution.
- These answers are all overkill; the answers are obvious by intuition, which is good enough (perhaps even better) for an interview. 1. Obviously deck 1 is better, because taking away your first card has a smaller impact on the ratio of remaining cards of the same colour. 2. Obviously they're the same. Deck 1 is equivalent to shuffling a deck and taking the top 2 cards; deck 3 is equivalent to shuffling a deck, taking the top 26 cards of that, and then taking the top 2 cards of those.
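A quick Monte Carlo check of the three decks discussed above (a sketch; the names and trial count are mine):

```python
import random

def same_color(deck):
    """Draw two cards from `deck` and report whether they share a colour."""
    a, b = random.sample(deck, 2)
    return a == b

trials = 200_000
full_52 = ["R"] * 26 + ["B"] * 26
even_26 = ["R"] * 13 + ["B"] * 13

p_full = sum(same_color(full_52) for _ in range(trials)) / trials
p_even = sum(same_color(even_26) for _ in range(trials)) / trials
p_rand = sum(same_color(random.sample(full_52, 26)) for _ in range(trials)) / trials

print(p_full, 25 / 51)  # ~0.490 -- the 52-card deck
print(p_even, 12 / 25)  # ~0.480 -- the fixed 13/13 deck
print(p_rand)           # ~0.490 -- a random 26-card half behaves like the full deck
```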
Quantitative Researcher at Jane Street was asked...
Interesting question: from a deck of 52 cards, pick 26 at random. From this set of 26 you pick two cards, and you win if both are the same color. Would you prefer this game over one in which you win by picking two cards (the first two picks) of the same color at random from a deck of 26 with equal numbers of black and red cards?

8 Answers

- Random is better.
- Random is not better; both give equal win rates according to simulation.
- No difference: you can think of the first 26 cards in the shuffled deck as the randomly selected 26 cards, and then you pick the first two. So the winning probability will be exactly the same.
- The first option is slightly better. One way to argue it: if the 26 random cards are split evenly, it's the same as the second situation, but if the split is uneven (say 12R 14B), the probability is 12/26 * 11/25 + 14/26 * 13/25, which is greater than the probability of the second option, and the probability only gets better as the draw gets more skewed. Another way to argue it: the first option amounts to picking from a deck of 52 evenly split cards, and the second option to picking from 26. The first option's probability is 25/51, the second's is 12/25; as you choose from more and more cards, the probability increases and tends towards 1/2.
- Intuitive solution: 26 is an arbitrary choice, so consider the two-card case. Case 2: a deck of two cards, 1 black and 1 white, gives P = 0. Case 1: pick two cards from 52, P > 0. Random is better.
- Here is the solution. For the latter game (a fixed 26 cards with equal red and black), the probability of winning is p2 = 1 - 13*13/C(26,2) = 0.48 (one minus the probability of picking two cards of different colors). For the random game, although the expected numbers of red and black cards are equal, the actual counts may not be. Let R be the number of red cards and B the number of black cards, with R + B = 26; then the probability of winning is p1 = 1 - R*B/C(26,2). With the constraint R + B = 26 we have R*B <= 13*13, so p1 >= p2 always.
- Fixed is better, since in the random case more colors are mixed in, so the probability of hitting a same-color pair is lowered. Keep in mind a deck of cards is composed of 4 colors, each with 13 cards, so a 12+14 split is not happening.
- Oops, I got this wrong. Random is better, and a 12+14 split can happen - we are talking about colors...
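An exact version of the hypergeometric averaging argument sketched in the answers above (assuming 26 red and 26 black in the full deck; the helper names are mine):

```python
from math import comb

def p_same(r, b):
    """Probability that two cards drawn without replacement share a colour."""
    return (comb(r, 2) + comb(b, 2)) / comb(r + b, 2)

# Average the win probability over the hypergeometric distribution of the
# number of red cards r in a random 26-card half of a 26R/26B deck.
p_random_half = sum(comb(26, r) * comb(26, 26 - r) / comb(52, 26) * p_same(r, 26 - r)
                    for r in range(27))

print(p_same(26, 26))   # 25/51 ~ 0.4902 -- drawing straight from the full 52-card deck
print(p_same(13, 13))   # 12/25 = 0.48   -- the fixed, evenly split 26-card deck
print(p_random_half)    # 25/51 again    -- so the random half beats the fixed even deck
```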
2) A. 10 ropes, each with one red end and one blue end. Each time, take out a red end and a blue end at random and tie them together. Repeat 10 times. What is the expected number of loops? B. 10 ropes with uncolored ends; everything else stays the same.

7 Answers

- 1/10 + 1/9 + ... + 1? B is similar: 1/19 + 1/17 + etc.
- In B, E[n] = 1/n + (n-1)/n * E[n-1] = 1/n + E[n-1].
- For the case of n = 10, you would sum 1/10 + 1/9 + 1/8 + 1/7 + ... + 1/2.
- Add an extra 1 to the previous answer.
- For part A the answer is 1 + 1/2 + 1/3 + ... + 1/10. For part B the answer is 1 + 1/3 + 1/5 + ... + 1/19. Explanations: for part A, ctofmtcristo has the right approach but with a typo in the equation for E[n]. To obtain the expected number of loops, note that the first red end has a 1/n chance of connecting to its own rope's blue end (forming a loop) and an (n-1)/n chance of connecting to a different rope's blue end (not yet forming a loop), so E[n] = (1/n)*(1 + E[n-1]) + ((n-1)/n)*E[n-1] = 1/n + E[n-1], with base case E[1] = 1. By induction, E[n] = 1 + 1/2 + 1/3 + ... + 1/n. Part B is similar: the first end now has 2n-1 possible ends to connect to, of which 1 is its own rope's other end and 2n-2 belong to different ropes. Then E[n] = (1/(2n-1))*(1 + E[n-1]) + ((2n-2)/(2n-1))*E[n-1] = 1/(2n-1) + E[n-1], with base case E[1] = 1. By induction, E[n] = 1 + 1/3 + 1/5 + ... + 1/(2n-1).
- Ed's answer is not right. Just check the case of 3 pairs: there are 3! = 6 total cases - 1 case with 3 loops, 2 cases with everything wrongly attached, and 3 cases with 1 loop - so the expected value is (3/6)*1 + (1/6)*3 = 1, while Ed's answer gives 1 + 1/2 + 1/3 = 11/6, which is clearly wrong.
- Timi, you are missing the fact that if they are "all wrongly attached" then they still form a loop; similarly, the case you are thinking of "with 1 loop" actually has 2 loops. The correct answer is still 11/6.
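A quick check of the two closed forms given above, plus a small simulation of the uncoloured case B (a sketch; the union-find bookkeeping is my own illustration, not from any answer):

```python
import random
from fractions import Fraction

n = 10
part_a = sum(Fraction(1, k) for k in range(1, n + 1))          # 1 + 1/2 + ... + 1/10
part_b = sum(Fraction(1, 2 * k - 1) for k in range(1, n + 1))  # 1 + 1/3 + ... + 1/19
print(float(part_a), float(part_b))                            # ~2.929 and ~2.133

def loops_no_colour(n):
    """Simulate case B: repeatedly tie two uniformly chosen free ends together."""
    free_ends = list(range(2 * n))     # ends 2i and 2i+1 belong to rope i
    parent = list(range(n))            # union-find over ropes/chains
    def root(i):
        while parent[i] != i:
            i = parent[i]
        return i
    loops = 0
    while free_ends:
        a, b = random.sample(free_ends, 2)
        free_ends.remove(a)
        free_ends.remove(b)
        ra, rb = root(a // 2), root(b // 2)
        if ra == rb:
            loops += 1                 # tied a chain to itself: closed a loop
        else:
            parent[ra] = rb            # merged two chains into a longer one
    return loops

trials = 100_000
print(sum(loops_no_colour(n) for _ in range(trials)) / trials)  # ~2.13
```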
1) Two coins with P(head) = 1/3 and P(tail) = 2/3: design a way to get the effect of a fair coin.

6 Answers

- I guess: play 2 games; TH or HT = outcome 1, TT = outcome 2. Both have probability 4/9; disregard HH.
- Manipulate the payouts. P(tails) = 2/3, so if it lands on tails I get $1. P(heads) = 1/3, so if it lands on heads you get $2. 2/3 * 1 = 1/3 * 2.
- We need an unbiased decision out of a biased coin. Throw the coin twice; classify it as "heads" if we get HT and "tails" if we get TH, and disregard the other two outcomes, i.e. HH and TT.
- It's like you need to give heads another "chance" (to double its probability to match tails): if you get a tail, stop; if you get a head, flip again and take the second result.
- Swift and anon are both correct, but Swift's solution is twice as efficient: 8/9 of the time Swift requires only 2 flips, while only 4/9 of the time does anon finish in two flips. Indeed, for Swift the expected number of flips is 2.25, while for anon it is double that, 4.5. Let X be the expected number of flips. For Swift, EX = 2 + (1/9)*EX, so EX = 18/8 = 2.25; for anon, EX = 2 + (5/9)*EX, so EX = 18/4 = 4.5.
- All problems of this type can be solved by symmetry. Run two processes: flip each coin until it comes up heads.
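A short sketch of the flip-twice (HT/TH) scheme described above; the function names are mine:

```python
import random

def biased_flip():
    """The biased coin from the question: heads with probability 1/3."""
    return "H" if random.random() < 1 / 3 else "T"

def fair_flip():
    """Flip twice; keep HT as heads and TH as tails; discard HH/TT and retry."""
    while True:
        a, b = biased_flip(), biased_flip()
        if a != b:
            return a   # P(HT) = P(TH) = 2/9, so the two kept outcomes are equally likely

trials = 200_000
print(sum(fair_flip() == "H" for _ in range(trials)) / trials)  # ~0.5
```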
Quantitative Researcher at Jane Street was asked...
Flip a fair coin repeatedly; what is the expected number of flips to get the sequence HHT?

6 Answers

- Let x_i be the probability that a random infinite binary sequence has its first HHT ending at position i. Clearly x_i = 0 for i = 1, 2 and x_i = 1/8 for i = 3, 4, 5, and we can deduce the recurrence x_i = 1/8 * (1 - x_{i-3} - ... - x_1) for i >= 3. Since the sum of all x_i is 1, we can write x_i = 1/8 * (x_{i-2} + x_{i-1} + x_i + ...), so x_3 + x_4 + x_5 + ... = 1/8 * (x_1 + 2*x_2 + 3*x_3 + ...); multiplying both sides by 8 shows that the expected number of flips is 8.
- This is incorrect; you cannot bracket like that. Condition on the first 2 steps and you will see that the expected number of steps to get HHT is 14.
- Both are wrong; the answer is 10. Let S_i be the outcome of the i-th flip; the sequence we need is HHT. Let X = the number of flips required to get HHT. If S_1 = T, we must start from the beginning: (1 + X) * 0.5 (as we have a 1/2 probability of seeing T). If S_1 = H, move on and evaluate the 2nd position. If S_2 = H, we are basically in the same state, so no flips were wasted; just wait until T comes: X * 0.5. If S_2 = T, proceed to the next flip. If S_3 = H, we've reached the final spot; it took 3 steps to get here. If S_3 = T, the whole sequence breaks down and we start from the beginning: 0.5 * (X + 3). So the entire equation looks like X = 0.5*(X+1) + 0.5*{ 0.5*X + 0.5*[ 0.5*3 + 0.5*X ] }; expanding the above gives X = 10.
- Here is another approach using conditional expectation. Let X be the minimum number of flips needed to get the first run of HHT, and let HH denote the event that the first two flips land heads (similarly define TH, HT, TT). Then, by a well-known theorem, E[X] = E[X | HT] P(HT) + E[X | TT] P(TT) + E[X | HH] P(HH) + E[X | TH] P(TH), where E[X | Z] is the conditional expectation of X given Z. It is easy to see that E[X | HT] = E[X | TT] = E[X] + 2, and P(HH) = P(HT) = P(TT) = P(TH) = 0.25. Using the same theorem for E[X | HH] and E[X | TH], a little observation gives E[X | HH] = 4 and E[X | TH] = 0.5 * E[X] + 4. The first equation then implies E[X] = 8.
- 8 is the correct answer. This can be solved by viewing it as a renewal reward process. Suppose we get a reward of 1 each time we see HHT; then intuitively, in the long run, we earn 1/8 (= p(H)p(H)p(T)) units of reward per unit time. By the renewal reward theorem, this equals the expected reward per cycle over the expected cycle length; the expected reward per cycle is just 1, so the expected hitting time is 8. This generalizes to HTH, which is more interesting because the reward per cycle is no longer 1: if we got an HTH in the previous cycle and the new cycle starts with TH, we actually get HTHTH, so the reward should be 2. The expected reward in that case is 1 + p(T)p(H) = 5/4, and thus the expected hitting time is 10.
- HHT = HH + T. After 2 consecutive heads (with E[HH] = 6), if we get H we continue, since we still have 2 consecutive heads; otherwise we get the desired HHT. So E[T] = 2 and therefore E[HHT] = E[HH + T] = E[HH] + E[T] = 6 + 2 = 8.
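A quick simulation of both patterns, consistent with the 8-for-HHT and 10-for-HTH results derived above (a sketch; the function name is mine):

```python
import random

def flips_until(pattern):
    """Flip a fair coin until `pattern` appears; return the number of flips."""
    seq = ""
    while not seq.endswith(pattern):
        seq += random.choice("HT")
    return len(seq)

trials = 200_000
print(sum(flips_until("HHT") for _ in range(trials)) / trials)  # ~8
print(sum(flips_until("HTH") for _ in range(trials)) / trials)  # ~10
```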
Five babies are in a room: 2 boys and 3 girls. One baby of unknown sex is added. You randomly choose one baby and it turns out to be a boy. What is the probability that the added baby is a boy?

6 Answers

- Bayesian.
- Bayes' theorem: P(A|B) = P(B|A)*P(A)/P(B). P(A) = 1/2 by assumption; if the newborn is a boy, there is a P(B|A) = 3/6 chance of selecting a boy. Now P(B) = P(B|A)*P(A) + P(B|A')*P(A'), where A' is the event that the newborn is a girl. Similarly, P(A') = 1/2 by assumption and P(B|A') = 2/6, hence the answer is (1/2)*(3/6) / [(1/2)*(3/6) + (1/2)*(2/6)] = 0.6.
- I think there is a typo in the last line, and the answer is (1/2)*(3/6) / [(1/2)*(3/6) + (1/2)*(4/6)] = 3/7.
- 3C1 / (3C1 + 2C1) = 60%
- Using Bayes' theorem with: scenario A1 - the added baby is a boy, P(A1) = 1/2; scenario A2 - the added baby is a girl, P(A2) = 1/2 (A1 and A2 form a complete set of scenarios); P(B|A1) = 3/6; P(B|A2) = 2/6. Then P(A1|B) = [P(B|A1)*P(A1)] / [P(B|A1)*P(A1) + P(B|A2)*P(A2)] = [3/6 * 1/2] / [3/6 * 1/2 + 2/6 * 1/2] = 3/5 = 0.6.
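A one-screen Bayes' theorem check of the 0.6 answer, under the stated assumption that the added baby is equally likely to be a boy or a girl (a sketch):

```python
from fractions import Fraction

p_boy_added = Fraction(1, 2)                # prior, per the assumption in the answers
p_draw_boy_if_boy_added = Fraction(3, 6)    # 3 boys among the 6 babies
p_draw_boy_if_girl_added = Fraction(2, 6)   # still only 2 boys among the 6 babies

posterior = (p_draw_boy_if_boy_added * p_boy_added) / (
    p_draw_boy_if_boy_added * p_boy_added
    + p_draw_boy_if_girl_added * (1 - p_boy_added)
)
print(posterior)  # 3/5
```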
See Interview Questions for Similar Jobs
- Analyst
- Associate
- Business Analyst
- Research Associate
- Financial Analyst
- Consultant
- Investment Banking Analyst
- Intern
- Data Analyst
- Equity Research Associate
- Vice President
- Equity Research Analyst
- Senior Analyst
- Associate Consultant
- Software Engineer
- Senior Research Analyst
- Project Manager
- Investment Analyst
- Research Assistant
- Quantitative Analyst