Goldman Sachs

  www.goldmansachs.com

Interview Question

Quantitative Analyst Interview

What does a distribution with maximal variance look like if it is only defined between 0 and 1? Give a proof.
Interview Answer

3 Answers


You'd want all the mass at either 0 or 1, since anything in between only pulls values closer to the mean (not fully rigorous, but intuitive enough). Let p be the probability mass at 0, so (1 - p) is the mass at 1. Then the mean is (1 - p), and the variance is:

p*(1 - p)^2 + (1 - p)*p^2 = p - 2p^2 + p^3 + p^2 - p^3 = p - p^2

We want to maximize the variance, so take the first derivative and set it to zero:
1 - 2p = 0
p = 1/2

Hence, half the probability mass is at 0 and half at 1.

There may be a more elegant argument.

DW on May 3, 2011
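
A quick numerical check of DW's formula (my own sketch in Python, not part of the original answer): sweep p over [0, 1] and confirm that the variance p*(1 - p) of the two-point distribution with mass p at 0 and (1 - p) at 1 peaks at p = 1/2 with value 1/4.

  # Sketch: variance of the two-point distribution on {0, 1} as a function of p
  import numpy as np

  p = np.linspace(0.0, 1.0, 1001)
  mean = 1.0 - p                 # E[X] = 0*p + 1*(1 - p)
  second_moment = 1.0 - p        # E[X^2] = 0^2*p + 1^2*(1 - p)
  var = second_moment - mean**2  # equals p*(1 - p)

  i = np.argmax(var)
  print(p[i], var[i])            # prints 0.5 0.25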

Basically the same as DW: ( delta(0) + delta(1) )/2

xeesus on Aug 13, 2012

It is harder than that to prove the result when the choice ranges over all probability distributions.

You must find the density f(x) that maximises

Var(f) = int{0->1} x^2 f(x) dx - ( int{0->1} x f(x) dx )^2

under the constraint int{0->1} f(x) dx = 1.

Writing the optimality conditions, you find that f must concentrate all of its mass at 0 and 1 (the density blows up at those endpoints), and the symmetry argument then gives DW's result.

Behe on Nov 10, 2012
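
To illustrate Behe's point numerically (my own sketch in Python, not part of the original answer), one can sample many random discrete distributions supported on a grid in [0, 1] and check that none of them exceeds the variance 1/4 attained by ( delta(0) + delta(1) )/2. A fully rigorous bound also follows from x^2 <= x on [0, 1]: Var(X) = E[X^2] - E[X]^2 <= E[X] - E[X]^2 = E[X](1 - E[X]) <= 1/4, with equality exactly for the half-half mass at 0 and 1.

  # Sketch: random discrete distributions on [0, 1] never beat variance 1/4
  import numpy as np

  rng = np.random.default_rng(0)
  x = np.linspace(0.0, 1.0, 101)   # support grid in [0, 1]

  best = 0.0
  for _ in range(10_000):
      w = rng.random(x.size)
      w /= w.sum()                 # random probability weights
      mean = (w * x).sum()
      var = (w * x**2).sum() - mean**2
      best = max(best, var)

  print(best)                      # stays below 0.25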
