Assistant trader Interview Questions
Assistant trader interview questions shared by candidates
Top Interview Questions
Here is an example of a brainteaser from the interview: You have five pirates, ranked 5 down to 1. The top pirate has the right to propose how 100 gold coins should be divided among them, but the others get to vote on his plan, and if fewer than half agree with him, he gets killed. How will the coins end up being divided, assuming all the pirates are rational and want to end up alive?

15 Answers

If the pirates are rational, they could all agree to take 20 coins apiece and split the pot evenly — that is the short and sweet answer. Unfortunately, interviewers may prefer a long, drawn-out answer that demonstrates mathematical reasoning, in which case you might answer as follows: give one coin each to the two lowest-ranked pirates and split the rest among the top three (say 32 for the top pirate and 33 each for pirates 2 and 3). The bottom two will certainly vote against you, but you increase the odds of the other two pirates agreeing with the plan — the two votes the top pirate needs at minimum to survive.

To spell out the voting rule: with 5 pirates, one proposes and the other 4 vote. If fewer than half of the 4 agree, the proposer is killed, so he needs at least 2 votes to live.

The answer isn't that obvious: give the 3rd- and 1st-ranked pirates one coin each and keep 98. Start from the base case. With 2 pirates, it is 100 for pirate 2 and 0 for pirate 1 (which pirate 1 doesn't like). With 3 pirates, pirate 1 will be happy with a single coin, because he does not want it to go down to 2. With 4 pirates, pirate 2 will be happy with a single coin, because he does not want it to get down to 3 pirates, where he would receive 0 — so he gets 1 and pirate 4 gets 99. At 5 it changes a bit: here pirates 1 and 3 will be happy with single coins, because if it goes down to 4 pirates they will receive 0 coins.
So pirate 5 takes 98, and pirates 1 and 3 take one coin each.

If there are 4 pirates and you give 1 coin to pirate 2 and 99 to yourself, surely pirates 1 and 3 will vote against you and it will go down to 3 pirates?

Give 0 to the lowest two and a third of the 100 to each of the three others; half would disagree, but the top pirate would live.

The highest rank keeps 98, two pirates get 1 each, and two receive nothing. It is then up to the first two to be happy with 1 coin each — and if they were not, the other two would be happier with 1 coin instead of 0. One could argue that if all four are rational, they would demand 25 each or be unhappy altogether.

lol, I hope you are kidding — otherwise it's better to find a manual farm job. By the way, Sean is right.

Sean is not right. The top pirate does NOT vote. If there are two pirates and the top decides to take the 100 coins for himself, the other one will vote against and the top pirate will be killed.

98, 0, 1, 0, 1. This is a classic game theory question, and you have to work backward through it, starting with the base case, to understand the pirates' motivation. With two pirates, p1 and p2, if p1 is the final voter then he will always vote no unless p2 gives him all $100, since he can kill him anyway and take the whole loot. This means p2 is in a compromised position: he does not want the game to get down to 2 pirates and will take any value greater than 0 from any other pirate, i.e., he will vote yes if he receives at least $1. When p3 is introduced, he knows p2 needs only $1 to vote for the plan, so he keeps $99 and gives the last dollar to p2. This puts p3 in a dominant position: he will vote down any plan that grants him less than $99. When p4 is introduced, he needs two of the three voters. Since p1 will decline unless he receives all of it and p3 will decline unless he receives at least $99, p4 gives p3 exactly that and p2 $1 — otherwise he is killed.
p4 is in a compromised position, so he will accept any offer where he receives something greater than 0. When p5 is introduced, he knows p4 and p2 are stuck: the maximum they can earn if the game bypasses him is $1. Granting them each that amount guarantees their votes, leaving the remaining $98 for himself; half the votes are positive, so he is not killed and keeps $98. So the distribution for p5, p4, p3, p2, p1 should be 98, 1, 0, 1, 0.

If we instead assume that once the top pirate is killed the next one takes his place and decides how the money is allocated, we can solve it like this.

Five pirates: 34 | 33 | 33 | 0 | 0 — pirates 4 and 3 make more than the 20 they would earn if the money were distributed equally, so they should vote to pass while 2 and 1 vote against; split 50/50, it should pass. BUT — could pirates 4 and 3 make more money by voting against and splitting the pot among fewer people?

Four pirates: 34 | 33 | 33 | 0 — again, and we assume pirates 3 and 2 vote in favor, outweighing pirate 1. BUT — pirate 3 sees an opportunity.

Three pirates: 50 | 50 | 0 — with only one vote to sway, pirate 3 only has to give pirate 2 half of the total. So at best, pirate 3 will vote no unless he gets his 50 up front, and pirate 4 will vote no unless he gets his 34 up front. In order to survive, pirate 5 should pay those pirates accordingly and take home a meager 16. Though 16 < 20, the other greedy pirates will always vote no unless they get the amount they know they can otherwise receive. If pirate 5 wants to stay alive, that's what he'll do. TL;DR: pirate 5 — 16, pirate 4 — 34, pirate 3 — 50, pirate 2 — 0, pirate 1 — 0.

Wait, that's wrong, I just realized. The buck stops with pirate 3: he can get 50 and he will, so pay him 50 first. In the four-pirate scenario, pirate 4 would also have to pay out 50 to pirate 3 or be killed, so the most he can allot to himself is 17, which is all pirate 5 should give him.
New tally: pirate 5 — 33, pirate 4 — 17, pirate 3 — 50, pirate 2 — 0, pirate 1 — 0. Pirate 2 is always going to be looking for that $50 payday in the three-pirate scenario, so it's important to neutralize his vote immediately by having pirates 4 and 3 agree with pirate 5 in the first place.

If the top pirate gets killed, the remaining pirates divide the 100 coins, which is 25 coins per person — though the higher-ranking pirates 3 and 4 will probably get more, so the best case for pirates 1 and 2 is 25 coins each. If the top pirate gives pirates 1 and 2 26 coins each, they will vote for him and he gets to keep the remaining 48 coins.

We have the following fundamental assumption: a pirate will not vote a plan down as long as his payout is greater than or equal to what his payout would be with one fewer pirate. We then build the following table, working up from one pirate, where each entry is (payout, expected payout with one fewer pirate), and a vote requires payout >= expectation:

# pirates | Rank 1 | Rank 2 | Rank 3 | Rank 4 | Rank 5
1 | (100, -) | | | |
2 | (100, 100) | (0, 0) | | |
3 | (0, 100) | (1, 0) | (99, 0) | |
4 | (1, 0) | (2, 1) | (0, 99) | (97, 0) |
5 | (1, 1) | (0, 2) | (1, 0) | (0, 97) | (98, 0)

20 each.
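The backward induction several answers describe can be checked mechanically. The sketch below uses one common convention for this puzzle (every pirate votes, including the proposer; a plan passes with at least half the votes; a voter approves only if strictly better off than in the subgame where the proposer dies) — the thread itself disagrees on the voting rules, so treat this as illustrating that convention, which reproduces the 98, 0, 1, 0, 1 answer. The function name `pirate_split` is chosen here.

```python
def pirate_split(n, coins=100):
    """Backward induction for the pirate game.

    Convention: all n pirates vote (proposer included), a plan passes
    with at least half the votes, and a voter accepts only an offer
    strictly better than his payoff in the game with one fewer pirate.
    Returns payoffs indexed from the lowest-ranked pirate (index 0)
    up to the proposer (index n-1).
    """
    payoffs = [coins]                      # with one pirate, he keeps everything
    for m in range(2, n + 1):
        votes_needed = (m + 1) // 2        # proposer's own vote counts
        bribes_needed = votes_needed - 1
        # buy the cheapest voters: those with the smallest continuation payoff
        order = sorted(range(m - 1), key=lambda i: payoffs[i])
        plan = [0] * m
        for i in order[:bribes_needed]:
            plan[i] = payoffs[i] + 1       # one coin more than waiting is worth
        plan[m - 1] = coins - sum(plan[:-1])
        payoffs = plan
    return payoffs

print(pirate_split(5))  # [1, 0, 1, 0, 98]: pirates 1 and 3 get a coin, pirate 5 keeps 98
```

Under a different convention (proposer barred from voting, or voters accepting ties), the bribed pirates shift, which is exactly why the thread's answers differ.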
Flip a coin until either HHT or HTT appears. Is one more likely to appear first? If so, which one, and with what probability?

14 Answers

Let A be the event that HTT comes before HHT. P(A) = P(A|H)P(H) + P(A|T)P(T) = 0.5·P(A|H) + 0.5·P(A|T). Since P(A|T) = P(A), it follows that P(A|H) = P(A|T) = P(A). Next, P(A|H) = P(A|HH)P(H) + P(A|HT)P(T) = (0)(0.5) + P(A|HT)(0.5), so 2·P(A|H) = P(A|HT). Also P(A|HT) = P(A|HTT)P(T) + P(A|HTH)P(H) = (1)(0.5) + P(A|H)(0.5), so 2·P(A|H) = 0.5 + 0.5·P(A|H), giving P(A|H) = 1/3. Since P(A|H) = P(A), we get P(A) = 1/3. So HHT is more likely to appear first, and it appears first 2/3 of the time.

Need help — why is P(A|HH) = 0?

P(A|HH) = 0 because after a sequence of consecutive heads you can no longer achieve HTT first: the moment you get a tail, you will have completed HHT. This is the reason HHT is more likely to occur first than HTT.

HHT is more likely to appear first than HTT: the probability of HHT appearing first is 2/3, and thus the probability of HTT appearing first is 1/3. Indeed, both sequences need an H first. Once that H has appeared, the chance of completing HHT next is 1/2 (you only need one more H before a T), while the chance of completing HTT is 1/4 (you need TT), so HHT is twice as likely to appear first. If the probability that HTT appears first is x, then the probability that HHT appears first is 2x, and since these are disjoint and together exhaust the probability space, x + 2x = 1, so x = 1/3.

You seem to be mixing up order being relevant and order being irrelevant. If order is relevant (meaning HHT is not the same as HTH), then each of HHT and HTT has a 1/8 chance of occurring in the first 3 tosses, making them equally likely. Now, if order is not relevant
(so HHT and THH are the same), then each has a (3 choose 2) × 1/8 probability of happening in the first 3 tosses, so either way they come out equally likely — please comment on my mistake if I am doing something wrong.

Ending up with HHT is more likely, with probability 2/3.

People with wrong answers: did you not Monte Carlo this? It takes 5 minutes to write a program, and you can then easily see empirically that 2/3 is correct.

I don't get it — shouldn't P(A|HH) = P(A|H), in the same sense that P(A|HTH) = P(A|H)? From both HH and HTH we have the first H of HTT, so it should be P(A|HH) = P(A|HTH) = P(A|H). Am I wrong?

The link below is the best solution I have seen for this problem: http://dicedcoins.wordpress.com/2012/07/19/flip-hhh-before-htt/

Here's my answer. Let x be the probability of winning (HHT first) after no heads (or a tail), y the probability after exactly one head, z the probability after two heads, and w the probability after HT. Then x = (1/2)x + (1/2)y, y = (1/2)z + (1/2)w, z = 1/2 + (1/2)z, w = (1/2)y. Therefore z = 1, y = 2/3, w = 1/3, x = 2/3. We wanted x at the beginning, so HHT comes up first with probability 2/3.

Think of HHT as H(HT) and HTT as (HT)T. For every occurrence of (HT) in a sequence of flips, there is a 1/2 chance an H occurred before the (HT) and a 1/2 × 1/2 chance a T occurred after. Thus HHT is twice as likely as HTT: HHT is first 2/3 of the time.
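As one responder suggests, this is easy to settle empirically. A minimal Monte Carlo sketch (the helper name `first_pattern` is chosen here):

```python
import random

def first_pattern(patterns, rng):
    """Flip a fair coin until one of the patterns appears; return the winner."""
    seq = ""
    while True:
        seq += rng.choice("HT")
        for p in patterns:
            if seq.endswith(p):
                return p

rng = random.Random(0)
trials = 100_000
wins = sum(first_pattern(["HHT", "HTT"], rng) == "HHT" for _ in range(trials))
print(wins / trials)  # ≈ 2/3
```

The empirical frequency lands close to 2/3 for HHT, matching the conditional-probability derivation above.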
If you flip a coin until you decide to stop, and you want to maximize the ratio of heads to total flips, what is that expected ratio?

14 Answers

One approach: say we get paid an amount equal to the ratio of heads to total flips. If we flip once, the expected amount is (1/1)·P(H) + (0/1)·P(T) = 0.5. If we flip twice, it is (2/2)·P(HH) + (1/2)·P(one H) + (0/2)·P(TT) = (1)(0.25) + (0.5)(0.5) + 0 = 0.5, and so on: for each fixed number of flips the expected value is 0.5, so the answer should be 1/2.

Your strategy should be to stop once you see a head: 1/2 probability of ratio 1/1, (1/2)² probability of ratio 1/2 (first flip must be a tail, second a head), (1/2)³ probability of ratio 1/3, and so on. I don't know the closed form of the infinite sum, though.

Since the question asks you to maximize the ratio of heads to total flips, the answer should be −1/2 + exp(1/2) − exp(−1/2) ≈ 0.54.

Suppose the event follows a Poisson distribution with rate lambda. P(event happens in one hour) = 1 − e^(−lambda) = 0.84, so e^(−lambda) = 0.16, and P(event happens in half an hour) = 1 − e^(−lambda/2) = 1 − sqrt(0.16) = 0.6.

The strategy would be to stop when you are above a 1:2 ratio, because that's what it will be in the long run; stopping at a 1:2 ratio makes you neutral. It's not stopping at your first heads — what if your first heads comes 5 flips in? Then the ratio is 1:5. In the long run you can expect that ratio to tend to 1:2. The answer should be 3/4, because there's a 1/2 chance you stop at 1:1, the maximum ratio; the other half of the time you play until the ratio approaches 1:2. So (1/2)(1/1) + (1/2)(1/2) = 3/4.

Stop as soon as you've seen more heads than tails, so the winning sequences are H, THH, THTHH, THTHTHH, ...; E = (0.5) + (0.5)³ + (0.5)⁵ + ... ≈ 0.66. This is a random walk, and I think it is recurrent.
If you have infinite time you should at some point get back close to 1:1... hmm, I don't like my answer.

Infinity: if you get a head on the first flip, which has p = 0.5, the ratio is 1:0, which is infinity, and 0.5 × infinity = infinity... though perhaps this position should not be considered.

The answer is 3/4. It is NOT infinity, because the question asks for the ratio of heads to total coin flips, not heads to tails. The strategy is to stop whenever the total number of heads is greater than or equal to the total number of tails. If we get a head at the beginning, stop: the ratio is 1. Otherwise, continue until we have equal numbers of heads and tails, ending with ratio 1/2. Consider the cases where we never stop — the subset of infinite toss sequences in which, at every point, there are more tails than heads. That subset has probability measure zero, so if we keep tossing we are almost surely guaranteed to reach equal counts. Thus the expected ratio is 1 × 1/2 + 1/2 × 1/2 = 3/4.

The number should be greater than 3/4, since if you get back to a 1/2 ratio you should keep flipping. The StackExchange discussion gives the true answer of about 78%. I wonder what the interviewer considers correct — probably saying "a bit bigger than 3/4" suffices.
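The simple 3/4 strategy (stop as soon as heads ≥ tails) can be checked by simulation; the true optimum uses a more aggressive stopping rule and is slightly higher, so this is only a lower bound. A sketch, with a cap on flips (an assumption made here to truncate the rare very long walks, which biases the estimate only negligibly):

```python
import random

def stop_when_even(rng, cap=1000):
    """Play: stop as soon as heads >= tails, capped at `cap` flips."""
    heads = tails = 0
    for _ in range(cap):
        if rng.random() < 0.5:
            heads += 1
        else:
            tails += 1
        if heads >= tails:
            break
    return heads / (heads + tails)

rng = random.Random(1)
trials = 100_000
avg = sum(stop_when_even(rng) for _ in range(trials)) / trials
print(avg)  # ≈ 0.75
```

Half the time the first flip is a head (ratio 1), otherwise the walk almost surely returns to even (ratio 1/2), so the average comes out near 0.75, consistent with the 3/4 answer.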
Assistant Trader at Jane Street was asked...
Simulate a 6-sided die with a coin.

13 Answers

Split the numbers into two sets, {1,2,3,4} and {5,6}, with one assigned to heads and the other to tails. If you land in {5,6}, flip again to choose between them. If you land in {1,2,3,4}, assign {1,2} to heads and {3,4} to tails, then split again.

Toss three coins to get 8 outcomes: HHH, HHT, HTH, HTT, THH, THT, TTH, TTT. If either of the first two appears, toss again until neither HHH nor HHT appears; assign the remaining 6 outcomes to the numbers 1 through 6.

I propose another solution using 4 coins. Flip 2 coins together, and separately another 2 coins together. The sample space of each 2-coin flip is HH, TT, HT (assuming HT = TH). Assign die values to the pairs of results, for example: HH HH → 1, HH HT → 2, HH TT → 3, TT TT → 4, TT HT → 5, HT HT → 6, with TH = HT.

With 1 coin, flip three times: HHH = 1, HHT = 2, HTH = 3, HTT = 4, THH = 5, THT = 6. If the first two flips are TT, abandon the first T and start over from the second T.

The first candidate and William have problems; I think Ey's solution works. With the candidate's solution you get far too many 5s and 6s, since fully 1/2 of the results are a 5 or a 6. Similarly, with William's solution, if you toss out results starting with HH, you keep ANY result that starts with a tail but only half of those starting with a head, so you end up with too many 3s, 4s, 5s, and 6s. I think simply changing what you throw out could fix William's solution: throwing out all HHH and all TTT would be closer, but a solution that doesn't toss out any data would still be better.

Nope, I'm a moron — can't edit my earlier post. Ey, sorry, I gave you too much credit: you can't count HT and TH as the same thing, because you get too many of them. You can't map 4-coin results to three equally likely values fairly; you end up with too many 2s and 5s and way too many 6s with your solution. Now we have no valid answers.
I think someone proved elsewhere that it can't be done. Interesting problem, though.

Toss 1: heads = even (start with 2), tails = odd (start with 1). Toss 2: heads = 0, tails = 2, add to the total. Toss 3: heads = 0, tails = 2, add to the total.

It cannot be done: the number of outcomes of any set of coin flips is always a power of 2, and no power of 2 has 6 as a factor, because 6 has the prime factor 3, which no power of 2 contains.

Eric, you're right — I thought my solution worked as well. My idea was to do 2 independent 2-coin flips, each with possible outcomes HH, TT, HT (= TH). Combining the two independent flips gives nine outcomes: (1) HH (2) HH; (1) HH (2) HT; (1) HH (2) TT; (1) TT (2) HH; (1) TT (2) HT; (1) TT (2) TT; (1) HT (2) HT; (1) HT (2) HH; (1) HT (2) TT. Even if we treat, say, (1) HH (2) TT as equivalent to (1) TT (2) HH, and similarly collapse others to get 6 outcomes, it won't work — the probabilities aren't equal.

On HHH or TTT, start the process again; assign the rest the values of the die: HHT = 1, HTH = 2, HTT = 3, THH = 4, THT = 5, TTH = 6.

Geez, they should really allow nested responses for better discussion. I agree with the approach first stated by William (except he had a typo — "HHH or HHT appears," not "HHT or HHT"; he was just referring to the first two, or any two of the eight combinations one chooses not to assign values to). The way "Do three flips" describes it is clearer. Other approaches are problematic in that the chances of getting 1 through 6 are disproportionate, leading to an unfair die. The "Binary Thinker" approach is also incorrect for similar reasons: there are 8 possible outcomes (3 flips, 2^3 = 8), and although it's a neat way to account for the numbers 1 through 6, the 3s and 4s are precisely twice as likely to occur (2/8 = 1/4 chance) as the other numbers (1/8 chance).
You may also use the position of a tail: given a sequence of six flips with exactly 5 heads and one tail, the 6 possible arrangements — THHHHH, HTHHHH, HHTHHH, HHHTHH, HHHHTH, HHHHHT — are equally likely, so each has probability 1/6.

Not a precise solution, but good enough for many purposes: flip the coin a large number of times, interpret the result as a number in base 2, and take it modulo 6. Some of the six numbers will be slightly disadvantaged, but you can make that difference arbitrarily small by increasing the number of coin flips.
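The three-flip rejection scheme (read the flips as a 3-bit number, keep 0–5, reflip on 6–7) can be checked for uniformity with a quick simulation — a sketch, with `d6` a name chosen here:

```python
import random

def d6(rng):
    """Roll a fair d6 with coin flips: flip three coins, read them as a
    3-bit number in 0..7, and reflip whenever the result is 6 or 7."""
    while True:
        r = 4 * rng.randint(0, 1) + 2 * rng.randint(0, 1) + rng.randint(0, 1)
        if r < 6:
            return r + 1

rng = random.Random(2)
counts = [0] * 6
for _ in range(60_000):
    counts[d6(rng) - 1] += 1
print(counts)  # each face ≈ 10,000
```

Each face comes up about 10,000 times in 60,000 rolls, as a fair die should — unlike the {1,2,3,4}/{5,6} split, where 5 and 6 would each appear about 15,000 times.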
Assistant Trader at Jane Street was asked...
You have a box filled with cash. The cash value is uniformly randomly distributed from 1 to 1000. You are trying to win the box in an auction: you win the box if you bid at least the value of the cash in it; you win nothing if you bid less (but you lose nothing). If you win the box, you can resell it for 150% of its value. How much should you bid to maximize the expected value of your profit (resale of box minus bid)?

13 Answers

My bad — I meant "0 to 1000," not "1 to 1000." Not that it affects the answer (hint, hint).

The expectation is always negative? Bid 0.

The expectation is negative.

Can someone explain why that is, mathematically? I thought the expected payoff was (0 + 1000)/2 × 1.5 = 750, so if you bid 500 you'd expect 250 of profit.

Anon: if you bid 500 and the box is worth 1, you only get paid $1.50, so you lose $498.50. $0 is correct. Generalize this for maximum values of 1, 2, 5, 10 and find the EV of each — you will be able to see the pattern and that you should bid 0.

You should bid 1, the minimum in the box, so you will win 0 or 0.5.

If you bid 1 you'll never win. If you bid 500, you're missing out when value(box) > 500 and < 750, since those are +EV too. I believe the answer is 749.

The answer is: bid 0 dollars. Solution: if you bid X dollars, you get the box if it contains Y dollars, where Y < X; otherwise you get nothing, which doesn't hurt anyway. Now think of the possible values of Y in PAIRS: (0, X), (1, X−1), (2, X−2), ... For each pair (a, b), the expected payoff contribution is P(Y ∈ {a, b}) × E(payoff | Y ∈ {a, b}), and the second term is (1/2)(3/2·a − X) + (1/2)(3/2·b − X) = (3/4)(a + b) − X = (3/4)X − X = −(1/4)X, which is always negative. Summing over all pairs gives a negative number. Of course, you need to handle the edge case where X is small and the pairing does not quite work.
I think that since we are dealing with expected profit, we should ignore the case when n (our bid) is less than X (the cash in the box), as nothing happens then. So we create a new distribution where X is uniform on [0, n]. Thus E(profit) = E(1.5X − n) = 1.5·E(X) − n = 1.5·n(n+1)/(2(n+1)) − n = 0.75n − n = −0.25n ≤ 0, so expected profit is maximized (= 0) at n = 0.

Sorry, guys — I made a mistake in my solution above. Let X be the cash value and n our bid, so profit = 1.5X − n. X is uniformly distributed, so every outcome has probability 1/1001, and we only sum over the values from 0 to n: E(X contribution) = n(n+1)/(2·1001). We pay the amount n with probability (n+1)/1001 and 0 otherwise, so E(payment) = n(n+1)/1001. Therefore E(profit) = 1.5·n(n+1)/(2·1001) − n(n+1)/1001 = −0.25·n(n+1)/1001 ≤ 0. In order to maximize profit, we should bid zero.

The answer is 0 if you ask me, and here is the reason: let V be the value of the box and x your bid; then your profit is P = (1.5V − x)·1{V ≤ x}. Taking the expected value via conditional expectation, E(P) = E(1.5V − x | V ≤ x)·P(V ≤ x), which is negative for every x > 0.

Answer is 2 — in the variant where the amount of cash X is uniform on (1, 100) rather than (0, 100). If we bid y, the expected profit is E[P | bid y] = (∫₁^y (1.5x − y) dx)/99 = −(y² − 4y + 3)/(4·99). The limits of integration arise because P is 0 when X > y, and the profit is 1.5x − y when X = x ≤ y. Maximizing this function gives y = 2, with an expected profit of 1/(4·99) ≈ 0.0025. If the value were in fact uniform on (0, 100), the expected value for any y would just be −y²/(4·99), so we wouldn't want to play.
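The −0.25n(n+1)/1001 formula is easy to confirm by direct enumeration over the discrete uniform values. A sketch (the function name `expected_profit` is chosen here):

```python
def expected_profit(bid, top=1000):
    """E[1.5*X - bid] over X uniform on the integers 0..top,
    counting only the winning cases X <= bid (losses pay nothing)."""
    total = sum(1.5 * x - bid for x in range(0, min(bid, top) + 1))
    return total / (top + 1)

profits = {b: expected_profit(b) for b in (0, 1, 100, 500, 749, 1000)}
best = max(profits, key=profits.get)
print(best, profits[best])  # best bid is 0, with expected profit 0.0
```

Every positive bid has strictly negative expectation, so bidding 0 (i.e., not really playing) is optimal.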
Assistant Trader at Jane Street was asked...
What is the expected number of flips of a coin to simulate a 6-sided die?

12 Answers

E(X) = 0.75·3 + 0.25·E(X + 1). From there it simplifies to E(X) = 10/3.

E(X) = 3 + 0.25·E(X), so E(X) = 4.

Hi — can you please explain this in detail? I don't seem to follow how you did this. Thanks.

11/3. E(X) = 0.75·3 + 0.25·E(X + 2).

The probability of a die face is 1/6, while the probability of a coin flip outcome is 1/2, so to simulate a die the coin must reach probability 1/6: (1/2)^x = 1/6, or 2^x = 6. Rather than guess and check, rewrite this as x = log2(6) = log(6)/log(2) ≈ 2.585 (2^2.585 ≈ 6.000). So the coin would have to be flipped about 2.585 times to simulate one die roll.

3 flips. The coin gives 1 of 2 possibilities per flip, so 3 flips give you 8 combinations; it is easiest to read them as a 3-bit binary number. That would be perfect for an 8-sided die, but since you want only 6 results — 2 flips is too few and 3 is too many — you adjust: 0 = tails, 1 = heads; 000 = null (scratch the result and flip three more times); 001 = 1; 010 = 2; 011 = 3; 100 = 4; 101 = 5; 110 = 6; 111 = null. The nulls can be made to be any outcomes, as can the numbers — it is all relative. This gives a 1-in-6 chance every time.

Wsc is on the right track. You have to flip at least 3 coins, but the probability of a null result is 1/4 each time, so you may need to flip 3 more. This is a geometric series, 3 + 3·(1/4) + 3·(1/4)² + ..., so the answer is 3/(1 − 1/4) = 4 expected coin flips.

4.

The correct answer is 11/3, as written by "Someone" on July 9, 2011.
Both William's and P's answers are close, but not quite accurate, because they assumed it takes 3 flips to discard a null result. In fact it only takes two flips to discard a null result (HHT or HHH), since we don't have to flip the third coin if the first two are already HH. That's why the answer is slightly less than 4. Note that we have a 3/4 chance of reaching one of the 6 good results (HTH, HTT, THH, THT, TTH, TTT), which yields a number from 1 to 6, and a 1/4 chance of having to reflip (HHH or HHT) — and for those two results to be discarded, we only needed 2 flips; flipping a third coin after HH would waste a throw. Hence, 3/4 of the time the number of flips required is three, and 1/4 of the time the expected number is 2 + E[X]: E[X] = (3/4)·3 + (1/4)·(2 + E[X]). Alternatively, flip the coin twice; 3/4 of the time (HT, TH, or TT) we flip exactly one more coin, and 1/4 of the time (HH) we have to reflip: E[X] = 2 + (3/4)·1 + (1/4)·E[X]. Either way, E[X] = 11/4 + (1/4)·E[X], which yields the desired result, E[X] = 11/3.

4.5.

It's 11/3. Say you want HHH and TTT to be the rolls that start over, so that the last flip is 50-50 to be H or T and carries no bias; then after the first flip there is a 1/4 probability you'll have to flip the next two again, i.e., a 3/4 chance you won't have to reroll. Using 1/p for the expected number of two-flip rounds: 1 + 2·(4/3) = 1 + 8/3 = 11/3.

You can't flip a die — you only have a coin. DUH.
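The early-abort scheme behind the 11/3 answer (reflip immediately on HH instead of wasting a third flip) can be verified by simulation. A sketch, with `flips_for_d6` a name chosen here:

```python
import random

def flips_for_d6(rng):
    """Count flips for one die roll with early abort: flip two coins;
    on HH start over without a third flip, otherwise one more flip
    picks among the 6 equally likely outcomes."""
    flips = 0
    while True:
        a, b = rng.randint(0, 1), rng.randint(0, 1)
        flips += 2
        if (a, b) == (1, 1):       # HH -> reject after only two flips
            continue
        flips += 1                  # third flip resolves one of 6 outcomes
        return flips

rng = random.Random(3)
trials = 100_000
avg = sum(flips_for_d6(rng) for _ in range(trials)) / trials
print(avg)  # ≈ 11/3 ≈ 3.667
```

The average comes out near 3.667, below the 4 flips of the naive always-flip-three scheme.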
There is a 91% chance of seeing a shooting star in the next hour. What is the probability of seeing a shooting star in the next half hour?

12 Answers

70%.

How could this be the answer? Surely it is half of the 91% probability — sorry, I meant 45.5%.

Let x be the probability of a shooting star in a given half hour. The probability of NOT seeing one in 1 hour is (1 − x)², so the probability of seeing one in 1 hour is 1 − (1 − x)², and we solve 1 − (1 − x)² = 0.91 for x.

According to the question, suppose X is a random variable giving the time when we see a shooting star, with CDF F(x). Then F(now + 1h) − F(now) = 0.91, and this is unfortunately the only thing we know about F. Any valid F satisfying that requirement could be the CDF of X, so F(now + 0.5h) − F(now) could be any value between 0 and 0.91. Answer: unknown — it depends on the distribution.

The second answer is incorrect. Take an extreme case: if the probability of seeing a shooting star in 1 hour is 1, the root of that equation is 1, meaning a star is certain in the first 30 minutes. An interesting follow-up: what about the first (or second) 15 minutes? By the same method, the probability of a shooting star in the first 15 minutes is still 1, and this goes on forever, concluding that P(shooting star) is 1 on every interval — which is obviously mistaken.

Let p be the probability of seeing the star in half an hour; (1 − p) is the chance of not seeing it, and (1 − p)² is the chance of not seeing one in an hour, which equals 1 − 0.91 = 0.09. Solving gives p = 0.7.

The answer should be 0.7, but most of the explanations here make no sense as stated. It is reasonable to assume an exponential (memoryless) distribution here and solve for the probability.
P(no star in an hour) = 1 − 0.91 = 0.09. P(no star in half an hour)² = 0.09, so P(no star in half an hour) = 0.3 and P(star in half an hour) = 1 − 0.3 = 0.7 = 70%.

The probability of NOT seeing one is 30%. The arrival of stars follows a Poisson process: the probability of seeing K arrivals in time T is P = (λT)^K · e^(−λT)/K!. We know there is no arrival in 1 hour with T = 1, K = 0: P = e^(−λ) = 1 − 0.91 = 0.09. Now with T = 0.5 (half an hour), P = e^(−λ/2) = sqrt(0.09) = 0.3. The probability of seeing a star is 1 − 0.3 = 0.7, or 70%.

I would argue for 91%/2 = 45.5%. The wording of the problem implies this is not a Poisson process; rather, I would interpret it as the weather channel telling residents there is a 91% chance of seeing the shooting star between 7pm and 8pm (an unusual occurrence). Hence, with 9% probability you see no star in the next hour; in the 91% of cases where there is a shooting star in the next hour, one could reasonably assume its arrival time is uniformly distributed over the hour. So, 45.5%. Think about it in the real world — it is rather unreasonable to model seeing shooting stars as a Poisson process with parameter 0.91 stars per hour.
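Under the independent-halves assumption most answers use, the arithmetic is one line:

```python
import math

p_hour = 0.91
# Assume the two half-hours are independent with equal probability,
# so P(no star in an hour) = P(no star in a half hour)^2.
p_no_half = math.sqrt(1 - p_hour)
p_half = 1 - p_no_half
print(p_half)  # ≈ 0.7
```

Note this bakes in the memorylessness assumption; under the "uniform arrival within the hour" reading, the answer would instead be 0.455.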
Trader Assistant at DRW was asked...
What is the expected value of rolling two dice?

9 Answers

7.

The answer is 7 because the expected value of rolling a single die is 3.5 = (1/6)(1) + (1/6)(2) + (1/6)(3) + (1/6)(4) + (1/6)(5) + (1/6)(6), so for two dice just multiply: 3.5 × 2 = 7.
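Linearity of expectation gives 7 immediately; brute-force enumeration of all 36 outcomes confirms it:

```python
from itertools import product

# Average the sum over every equally likely pair of faces.
ev = sum(a + b for a, b in product(range(1, 7), repeat=2)) / 36
print(ev)  # 7.0
```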
Assistant Trader at Jane Street was asked...
Toss 4 coins: the probability of more than 2 heads; expected winnings if you get $1 for each head; expected winnings if you may re-toss 1 coin of your choice. A 20-sided die: players A and B both choose a number, and whoever's number is closest to the value rolled wins the amount shown on the die. What is the optimal strategy? Do you want to go first? What are the expected winnings?

10 Answers

1) (a) 3/8, (b) $1.875. 2) Is it a fair die? There might be a slight weight-distribution error due to the markings of the dots. If it's fair, I would stay close to the expectation (10.5); otherwise I would choose a number on one side of the die based on how it was made. It doesn't matter who goes first; the expectation is 10.5.

The answer above is not totally right: it is 5/16 for more than 2 heads, and the expected value is 2. If a re-flip is allowed, you get 2 + 15/16.

I think the re-flip figure above isn't quite right. It should be 2 + (15/16)(1/2), since you have to account for what the re-flipped coin may yield, not just the likelihood of having a tail. I'd pick 14 for the die question: the sum of the bottom range needs to equal the sum of the top range, and the sum of 1:14 is 105, as is the sum of 15:20.

You should choose to go first; depending on your risk aversion, choose either 15 or 14.

Odds of more than 2 heads: 5/16. Expected value is $2 with no re-flips. If allowed 1 re-flip, this adds $0.50 of EV in the 15/16 of situations where you have a tail to re-flip (you add no value if you got 4 heads the first time), so EV = $2 + 15/32 = $2.46875. For the die question, the EV is the same ($5.25) whether you choose 14 or 15; therefore, I would let my opponent choose first, to give myself the opportunity to take advantage of a poor choice. If they don't pick 14 or 15, I can choose the number one away from theirs on the side closer to 14.5, and my EV exceeds $5.25 due to their sub-optimal choice.
The expected value is not the same: for the range 1:14 it is (14/20)·105 = 73.5 and for 15:20 it is (6/20)·105 = 31.5. The sums are equal, but you will be winning a lot more if you choose 14 and your opponent chooses 15.

Wrong calculation in the above answer — Rance is right.
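The coin part of the question is quick to verify by simulation. A sketch of both the base game and the re-toss-one-tail strategy:

```python
import random

rng = random.Random(4)
trials = 200_000
base = retoss = 0
for _ in range(trials):
    flips = [rng.randint(0, 1) for _ in range(4)]  # 1 = heads
    heads = sum(flips)
    base += heads
    if heads < 4:               # re-toss one of the tails, if any
        heads += rng.randint(0, 1)
    retoss += heads
print(base / trials, retoss / trials)  # ≈ 2.0 and ≈ 2.469
```

The averages land near $2 and $2.46875, matching the 2 + 15/32 answer rather than 2 + 15/16.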
Assistant Trader at Five Rings was asked...
You are standing beside a road, watching cars pass by. The probability that you see a car pass by in 1 minute is 1/4. What is the probability that you see a car pass by in 30 seconds?

8 Answers

1 − sqrt(3)/2.

Why is this?

Let p be the probability that you see a car in a given 30 seconds. P(car in the first 30 seconds) = p and P(car in the second 30 seconds) = p, so P(car in the first 60 seconds) = 2p − p² = 1/4. Solving, you get p = 1 − sqrt(3)/2 (rejecting 1 + sqrt(3)/2, since it exceeds 1).

I have a question about this solution: 1 − (1 − p)² is the probability of seeing at least one car, not the probability of seeing exactly one car.

You can also solve this using the exponential distribution. From the question you can deduce that the distribution has to be memoryless, hence there has to be a constant rate per unit time. Let that rate be p; then 1/4 = 1 − e^(−pT), and the required answer is 1 − e^(−pT/2), which gives the same value as reported above.

Answer 1/2: if we divide the minute into 4 parts of 15 seconds each, the person sees a car in one 15-second part; the cars keep coming at a rate of 1 per 15 seconds, so dividing 30 seconds into 2 parts the answer would be 1/2. The tricky part is which 30 seconds are being asked about — the 30 seconds in which the car is seen, or the 30 in which it is not.

1 − sqrt(3)/2 is the wrong answer.
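Under the assumption the top answer makes (the two half-minutes are independent with the same probability p), the value is:

```python
import math

# Solve 1 - (1 - p)^2 = 1/4 for the per-half-minute probability p,
# assuming independent, identically distributed half-minute windows.
p = 1 - math.sqrt(1 - 0.25)
print(p)  # ≈ 0.134, i.e. 1 - sqrt(3)/2
```

Plugging p back in, 1 − (1 − p)² recovers the given 1/4 for the full minute.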
See Interview Questions for Similar Jobs
- Trader
- Intern
- Analyst
- Software Engineer
- Quantitative Analyst
- Junior Trader
- Associate
- Quantitative Trader
- Quantitative Researcher
- Software Developer
- Financial Analyst
- Investment Banking Analyst
- Business Analyst
- Quantitative Research Analyst
- Research Analyst
- Consultant
- Summer Analyst
- Trading Assistant