# Assistant Trader Interview Questions

### 37 times 37

24 Answers

Two ways. First, 37 * 37 = (40 - 3)*(40 - 3) = 1600 - 240 + 9 = 1369. Second, 37 * 37 = (35 + 2)*(35 + 2) = 35*35 + 4*35 + 4 = 1225 + 140 + 4 = 1369. (Any two-digit number ending in 5 can be squared by multiplying the tens digit by one more than itself and appending '25', so 35*35 = 1225.)

This can be answered in well under 30 seconds by a simple expansion of the square: 37^2 = (30 + 7)*(30 + 7) = 30^2 + 7^2 + 2*30*7 = 900 + 49 + 420 = 1369.

There is a mathematical shortcut for squaring any two-digit number that is probably the easiest to use here:
1) Square the individual digits and place the results next to each other: 3^2 = 9 and 7^2 = 49, giving 949.
2) Multiply the two digits, double the result, and append a zero: 3*7 = 21, 2*21 = 42, 10*42 = 420.
3) Add the two numbers: 949 + 420 = 1,369.
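As a sanity check, the shortcut above is just the expansion (10a + b)^2 = 100a^2 + 20ab + b^2. A few lines of Python (my own illustration, not from the answers) confirm it for every two-digit number:

```python
def square_two_digit(n):
    """Square a two-digit number via the shortcut above:
    (10a + b)^2 = 100*a^2 + 20*a*b + b^2."""
    a, b = divmod(n, 10)                # tens digit, units digit
    return 100 * a * a + 20 * a * b + b * b

print(square_two_digit(37))             # 1369
```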

### Here is an example of a brainteaser during the interview: You have five pirates, ranked from 5 to 1 in descending order. The top pirate has the right to propose how 100 gold coins should be divided among them. But the others get to vote on his plan, and if fewer than half agree with him, he gets killed. How will the coins end up being divided, assuming all the pirates are rational and want to end up alive?

15 Answers

This is a classic game theory question: work backwards from the base case to understand each pirate's motivation.

With two pirates, p1 and p2, p1 is the only voter, and he will vote no unless p2 gives him all $100, since he can kill p2 and take the whole loot anyway. So p2 is in a compromised position: he never wants the game to get down to two pirates, and he will vote yes for any offer of at least $1 from any other pirate.

When p3 proposes, he knows p2 needs only $1 to vote for the plan, so he keeps $99 and gives the last dollar to p2. That puts p3 in a dominant position: he will vote down any plan that grants him less than $99.

When p4 proposes, he needs two of the three voters. p1 will decline unless he receives everything, and p3 will decline unless he receives at least $99, so p4 gives p3 exactly that and p2 $1, or else he is killed. p4 is therefore in a compromised position and will later accept any offer greater than $0.

When p5 proposes, he knows p4 and p2 are stuck: the most either can earn if the game bypasses him is $1. Granting each of them that amount guarantees their votes, leaving $98 for himself; half the votes are positive, so he is not killed and keeps $98. So the distribution for p5, p4, p3, p2, p1 is 98, 1, 0, 1, 0.

The answer isn't that obvious... you give the 3rd and 1st pirates one coin each and keep 98. Start at the beginning: with 2 pirates, it's 100 for pirate 2 and 0 for pirate 1 (which pirate 1 doesn't like). With three, pirate 1 will be happy with a single coin, because he does not want it to get down to 2. With four pirates, pirate 2 will be happy with a single coin, because he does not want it to get down to 3 pirates, where he would receive 0; so he gets 1 and pirate 4 gets 99. At five it changes a bit: here pirates 1 and 3 will be happy with single coins, because if it goes down to 4 they receive 0. So pirate 5 takes 98, and pirates 1 and 3 take one coin each.

Sean is not right. The top pirate DOES NOT vote: if there are two pirates and the top one decides to take all 100 coins for himself, the other will vote against, and the top pirate will be killed.
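The disagreement above is mostly about tie-breaking. A small backward-induction sketch (my own illustration, not from the answers) makes the assumptions explicit: a pirate votes yes only when offered strictly more than his continuation payoff, a pirate who would die in the continuation accepts anything, and the proposer does not vote and survives when at least half of the other pirates vote yes. Under these exact rules the proposer keeps 97; shifting a coin here or there via different tie-breaking conventions is exactly where the answers above diverge.

```python
DEAD = "dead"

def split(n, coins=100):
    """Backward induction for the pirate game. Pirates are ranked 1..n
    and pirate n proposes; returns each pirate's payoff (or DEAD for a
    proposer who cannot buy enough votes to survive)."""
    if n == 1:
        return [coins]
    cont = split(n - 1, coins)          # payoffs if this proposer is killed
    # Price of each vote: one coin above the continuation payoff,
    # or free if the voter dies in the continuation game.
    price = sorted((0 if cont[i] == DEAD else cont[i] + 1, i)
                   for i in range(n - 1))
    need = n // 2                       # = ceil((n-1)/2) votes required
    bought = price[:need]
    cost = sum(c for c, _ in bought)
    if cost > coins:
        return cont + [DEAD]            # no affordable winning proposal
    alloc = [0] * n
    for c, i in bought:
        alloc[i] = c
    alloc[-1] = coins - cost
    return alloc

print(split(5))   # [2, 0, 1, 0, 97] under these tie-breaking rules
```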

### Flip a coin until either HHT or HTT appears. Is one more likely to appear first? If so, which one and with what probability?

15 Answers

HHT is more likely to appear first than HTT: the probability that HHT appears first is 2/3, so the probability that HTT appears first is 1/3. Both sequences need an H first. Once that H appears, the chance of being on the way to HHT is 1/2 (all you need is one more H), while the chance for HTT is 1/4 (you need TT). So HHT is twice as likely to appear first: if the probability that HTT appears first is x, then the probability that HHT appears first is 2x. These events are disjoint and together exhaust the probability space, so x + 2x = 1 and x = 1/3.

Let A be the event that HTT comes before HHT.
P(A) = P(A|H)P(H) + P(A|T)P(T) = 0.5 P(A|H) + 0.5 P(A|T).
A leading T restarts the problem, so P(A|T) = P(A), and therefore P(A|H) = P(A|T) = P(A).
P(A|H) = P(A|HH)P(H) + P(A|HT)P(T) = (0)(0.5) + P(A|HT)(0.5), so 2 P(A|H) = P(A|HT).
P(A|HT) = P(A|HTT)P(T) + P(A|HTH)P(H) = (1)(0.5) + P(A|H)(0.5).
So 2 P(A|H) = 0.5 + 0.5 P(A|H), giving P(A|H) = 1/3, and since P(A|H) = P(A), P(A) = 1/3.
So HHT is more likely to appear first, and it appears first 2/3 of the time.

This link has the best solution I have seen for this problem: http://dicedcoins.wordpress.com/2012/07/19/flip-hhh-before-htt/
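A quick Monte Carlo check of the 2/3 figure (illustrative sketch; function name is my own):

```python
import random

def first_pattern(rng):
    """Flip a fair coin until the last three flips read HHT or HTT;
    return whichever pattern completed first."""
    last3 = ""
    while True:
        last3 = (last3 + rng.choice("HT"))[-3:]
        if last3 in ("HHT", "HTT"):
            return last3

rng = random.Random(0)
trials = 100_000
hht_first = sum(first_pattern(rng) == "HHT" for _ in range(trials))
print(hht_first / trials)   # close to 2/3
```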

### In the World Series (baseball) there are two teams, A and B, each of which wins any given game 50% of the time (1:1 odds). You also know how the series works, i.e. whoever wins 4 games first wins. What is the probability of getting to game 7 (i.e. each team wins 3 of the first 6 games)?

14 Answers

Assuming no ties, it's the probability of throwing 6 coins and getting 3 heads and 3 tails, which is (6!/(3!3!)) / 2^6 = 20/64 = 5/16.

To clear up any misconceptions, since there seem to be multiple answers here: 5/16 is the correct result.

Whoever wins the first game does not matter, since either team can hold the lead after one game; it's the next 5 games that count. There are 2^5 = 32 possible outcomes for those games. The outcomes for the team that won game 1 that force a game 7 are WWLLL, WLWLL, WLLWL, WLLLW, LWWLL, LWLWL, LWLLW, LLWWL, LLWLW and LLLWW. That's 10/32, or 5/16.
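The counting in these answers is just a binomial coefficient:

```python
from math import comb

# The first six games must split 3-3; each arrangement has probability 1/2^6.
p_game7 = comb(6, 3) / 2 ** 6
print(p_game7)   # 0.3125, i.e. 5/16
```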

### If you flip a coin until you decide to stop and you want to maximize the ratio of heads to total flips, what is that expected ratio?

13 Answers

The answer should be 3/4, because there's a 1/2 chance you stop at 1:1, the maximum ratio. The other half of the time you play until the ratio comes back up to 1:2. So (1/2)(1/1) + (1/2)(1/2) = 3/4.

The strategy is to stop whenever the total number of heads is greater than or equal to the total number of tails. If we get a head on the first flip, stop: the ratio is 1. Otherwise, continue until we have equal numbers of heads and tails, ending with ratio 1/2. The cases where we never stop correspond to infinite sequences of tosses in which every prefix has more tails than heads; that set has probability measure zero (the symmetric random walk is recurrent), so if we keep tossing we are almost surely guaranteed to reach equal numbers of heads and tails. Thus the expected ratio is 1*(1/2) + (1/2)*(1/2) = 3/4.

The number should be greater than 3/4, since if you get back to a 1/2 ratio you should keep flipping. The StackExchange discussion of this problem shows the true answer is ~78%. I wonder what the interviewer considers the correct answer; probably saying "a bit bigger than 3/4" suffices.
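A simulation of the simple stopping rule described above (stop as soon as heads >= tails; my own sketch, with a flip cap to keep runs finite) lands on the 3/4 figure; per the last answer, the fully optimal adaptive strategy does a few percent better.

```python
import random

def ratio_one_game(rng, cap=10_000):
    """Flip until heads >= tails (or the cap is hit); return heads/flips."""
    heads = tails = 0
    for _ in range(cap):
        if rng.random() < 0.5:
            heads += 1
        else:
            tails += 1
        if heads >= tails:
            break
    return heads / (heads + tails)

rng = random.Random(42)
games = 20_000
avg = sum(ratio_one_game(rng) for _ in range(games)) / games
print(avg)   # close to 0.75
```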

### Simulate a 6 sided die with a coin.

13 Answers

Toss the coin three times. On HHH or TTT, start the process again. Assign the remaining outcomes to the die values: HHT = 1, HTH = 2, HTT = 3, THH = 4, THT = 5, TTH = 6.

Toss three coins; we get 8 outcomes: HHH, HHT, HTH, HTT, THH, THT, TTH, TTT. For the first and last (HHH and TTT) we toss again until something else appears; to the remaining 6 we assign 1 through 6.

Nope, I'm a moron. Can't edit the earlier post, but Ey, sorry, I gave you too much credit. You can't count HT and TH as the same thing, because you get too many of them; no idea why I didn't see that 10 minutes ago. You can't map 4 coin results to three numbers fairly: you end up with too many 2s and 5s and way too many 6s with your solution. So now we have no valid answers. I think someone proved elsewhere that it can't be done in a fixed number of flips. Interesting problem though.
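Despite the pessimism at the end, the rejection-sampling scheme in the first two answers does work (what can't be done is a bounded number of flips, since no power of 2 is divisible by 6). A sketch, with my own choice of outcome-to-face mapping:

```python
import random

# Three flips give 8 equally likely outcomes; HHH and TTT are rejected,
# so each of the six mapped outcomes has probability 1/6.
FACES = {"HHT": 1, "HTH": 2, "HTT": 3, "THH": 4, "THT": 5, "TTH": 6}

def roll_die(rng):
    while True:
        flips = "".join(rng.choice("HT") for _ in range(3))
        if flips in FACES:
            return FACES[flips]

rng = random.Random(7)
rolls = [roll_die(rng) for _ in range(60_000)]
print({f: rolls.count(f) for f in range(1, 7)})   # each count near 10,000
```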

### Flip a coin four times. If you at first flip a head, you win $1. If you flip a consecutive head, you win double your previous winnings. What is the expected value of your winnings?

12 Answers

0.5(1) + 0.25(2) + 0.125(4) + 0.0625(8) = 2

Assuming that your game ends when you get a T:
$1: 1/4 * 1 = 1/4
$2: 1/8 * 2 = 1/4
$4: 1/16 * 4 = 1/4
$8: 1/16 * 8 = 1/2
EV = $1.25. If you can keep going after a tail, it is 19/16; draw out the tree.

The problem doesn't say you stop playing when you get tails; you just toss 4 times and see what you get.
For HHHH you get 1+2+4+8 dollars, with probability 0.5^4.
For HHHT or THHH you get 1+2+4 dollars, with probability 2*0.5^4.
For HTHH or HHTH you get 4 dollars (1+1+2 or 1+2+1), with probability 2*0.5^4.
For HHTT, THHT or TTHH you get 1+2 dollars, with probability 3*0.5^4.
For HTTH, HTHT or THTH you get 1+1 dollars, with probability 3*0.5^4.
For TTTH, TTHT, THTT or HTTT you get 1 dollar, with probability 4*0.5^4.
The expected payout is the sum of payouts times probabilities: $3.50.
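Under this "no stopping" reading, brute-force enumeration of all 16 sequences (my own sketch; the payout rule assumed is $1 for a head that follows a tail or starts the game, doubling on each consecutive head) confirms $3.50:

```python
from itertools import product

def winnings(seq):
    """$1 for a head after a tail (or at the start); each further
    consecutive head doubles the previous win and adds it to the total."""
    total = current = 0
    for flip in seq:
        if flip == "H":
            current = 1 if current == 0 else 2 * current
            total += current
        else:
            current = 0
    return total

ev = sum(winnings(s) for s in product("HT", repeat=4)) / 16
print(ev)   # 3.5
```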

### You have a box filled with cash. Cash value is uniformly randomly distributed from 1 to 1000. You are trying to win the box in an auction: you win the box if you bid at least the value of the cash in the box; you win nothing if you bid less (but you lose nothing). If you win the box, you can resell it for 150% of its value. How much should you bid to maximize the expected value of your profit (resale of box minus bid)?

12 Answers

Sorry, guys, I seem to have made a mistake in my solution above. Let X = cash value and n = our bid. Profit = 1.5X - n when we win. Treating X as uniform on the integers 0 to 1000, every outcome has probability 1/1001, and we only sum over the values from 0 to n. The expected resale value is then 1.5 * n(n+1)/(2*1001). We pay n with probability (n+1)/1001 and nothing otherwise, so the expected payment is n(n+1)/1001. Therefore E(Profit) = 1.5*n(n+1)/(2*1001) - n(n+1)/1001 = -0.25*n(n+1)/1001 <= 0. To maximize expected profit, we should bid zero.

The answer is 0 if you ask me, and here is the reason. Let V be the value of the box and x your bid; your profit is then P = (1.5V - x) * 1_{V <= x}. Taking expectations via conditional expectation gives E(P) = E(1.5V - x | V <= x) * P(V <= x), and working this out shows the maximizing bid is 0.

The answer is 2. The key is that the amount of cash X is uniform on (1,100) rather than (0,100) (rescaling the question's 1-to-1000 range). If we bid y, our expected profit is E[P | bid y] = (1/99) \int_1^y (1.5x - y) dx = -(y^2 - 4y + 3)/(4*99); the limits of integration are 1 to y because P is 0 when X exceeds y, and we profit 1.5x - y when X = x <= y. Maximizing this function gives y = 2, with an expected profit of 1/(4*99) ≈ 0.0025. If the value were instead uniform on (0,100), the expected profit would be negative for every positive y, so we wouldn't want to play.
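Reading the question literally as a discrete uniform on 1..1000, a brute-force check (my own sketch) agrees with the point above: the floor at 1 rather than 0 gives a tiny positive edge to a small bid, while large bids lose money.

```python
def expected_profit(bid, lo=1, hi=1000):
    """Expected profit when the cash value is uniform on the integers
    lo..hi: we win (and resell at 1.5x) exactly when the value <= our bid."""
    n = hi - lo + 1
    return sum(1.5 * v - bid for v in range(lo, min(bid, hi) + 1)) / n

best = max(range(0, 1001), key=expected_profit)
print(best, expected_profit(best))   # a bid of 1 (or 2) nets 0.0005
```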

### Given a 12-sided die, you roll repeatedly until the cumulative sum is odd. Which value of the cumulative sum has the highest probability?

11 Answers

Let p(1) = p(2) = ... = p(12) = 1/12 be the probability of each face on a single roll, and let Q(n) be the probability that the game ends on the odd sum n.
Q(1) = p(1) = 1/12.
Q(3) = p(3) + p(2)Q(1) = Q(1)*(13/12).
Q(5) = p(5) + p(2)Q(3) + p(4)Q(1) = Q(3)*(13/12) = Q(1)*(13/12)^2, and so on, up to
Q(11) = p(11) + p(2)Q(9) + ... + p(10)Q(1) = Q(1)*(13/12)^5.
Q(13) = p(2)Q(11) + p(4)Q(9) + ... + p(12)Q(1) = Q(1)*[(13/12)^6 - 1].
For odd n >= 15, Q(n) = p(2)Q(n-2) + ... + p(12)Q(n-12) = Q(n-2) + (1/12)*[Q(n-2) - Q(n-14)] < Q(n-2), so the probabilities keep decreasing.
The largest probability is Q(11) = (13/12)^5 / 12 ≈ 0.124.

So maybe this isn't how they want you to solve this, but here goes. For a 6-sided die, the most probable total after 2 rolls is 7, which is really just the EV of one die (3.5) x 2. So the most probable total for two 12-sided dice would be 6.5 x 2 = 13, sitting atop a nice, almost bell-curve-like symmetric distribution. My gut reaction would be 13!

But wait: we don't HAVE to roll twice. If we hit an odd number on the first roll, we stop immediately. That knocks out a lot of 13's odds: sure, if we make it to two rolls it's the most likely total, but we never make it to two rolls if we first roll a 1, 3, 5, 7, 9 or 11, so literally half of 13's chances are gone. The next most common totals are 12 and 14, but both are even. Okay, fine, then we have 11 and 15. 15 suffers from the same issue as 13: it needs two rolls to ever be realized. 11 has no such limitation, being available right from the start.

Therefore, without doing virtually any math, I'd say the answer is 11. And then they'd ask me what that probability actually was, and I'd be like "how do you like them apples?" and hang up the phone.

Of course: 13 comes from the fact that a martingale stopped at a stopping time is still a martingale, so E[sum] = E[one roll] * E[number of rolls] = 6.5 * 2 = 13.
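A simulation (my own sketch) confirms the mode is 11, at roughly the 0.124 computed in the first answer, even though the expected stopping sum is 13:

```python
import random
from collections import Counter

rng = random.Random(1)
counts = Counter()
for _ in range(200_000):
    total = 0
    while total % 2 == 0:       # keep rolling until the running sum is odd
        total += rng.randint(1, 12)
    counts[total] += 1

mode, hits = counts.most_common(1)[0]
print(mode, hits / 200_000)     # 11, close to 0.124
```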

### There is a 91% chance of seeing a shooting star in the next hour. What is the probability of seeing a shooting star in the next half hour?

12 Answers

70%

Let x be the probability of seeing a shooting star in a given half hour. Assuming the two half hours are independent, the probability of NOT seeing one in a full hour is (1-x)^2, so the probability of seeing one in the hour is 1 - (1-x)^2. Solve 1 - (1-x)^2 = 0.91 for x.

P(no star in an hour) = 1 - 0.91 = 0.09
P(no star in a half hour)^2 = 0.09, so P(no star in a half hour) = 0.3
P(star in a half hour) = 1 - 0.3 = 0.7 = 70%
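The arithmetic, assuming the two half hours are independent and equally likely to produce a star:

```python
p_hour = 0.91
p_no_half = (1 - p_hour) ** 0.5   # P(no star in a half hour) = sqrt(0.09)
p_half = 1 - p_no_half
print(p_half)                     # ≈ 0.7
```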