Member of the Research Staff was asked...April 28, 2017

↳

I think we need to use the Generalized Method of Moments (GMM) to get the estimates. Since E[e|x] = 0, the law of iterated expectations gives E[h(x)*e] = 0 for any given function h(x). The task is then to find the function h*(x) that yields the efficient GMM estimator.

↳

Actually, you will get the least squares estimate as the best estimator in the following sense. The model is y = a*x + b + e with E[e|x] = 0. For any h(x), E[h(x)*e] = E[E[h(x)*e | x]] (the outer expectation is over x), and E[h(x)*e | x] = h(x)*E[e|x] = 0, so E[h(x)*e] = 0 for every h. Taking h(x) = 1 and h(x) = x gives the moment conditions E[y - a*x - b] = 0 and E[x*(y - a*x - b)] = 0, whose sample analogues are exactly the normal equations of least squares.
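The argument above can be checked numerically. Below is a minimal sketch (with made-up data; the true coefficients a = 2, b = 1 are assumptions for illustration) showing that solving the two sample moment conditions reproduces the ordinary least squares fit:

```python
import numpy as np

# Hypothetical data for the model y = a*x + b + e with E[e | x] = 0.
rng = np.random.default_rng(0)
x = rng.normal(size=1000)
e = rng.normal(size=1000)
y = 2.0 * x + 1.0 + e

# Sample analogues of E[y - a*x - b] = 0 and E[x*(y - a*x - b)] = 0
# form a 2x2 linear system in (b, a):
#   [ 1        mean(x)   ] [b]   [ mean(y)   ]
#   [ mean(x)  mean(x^2) ] [a] = [ mean(x*y) ]
A = np.array([[1.0, x.mean()], [x.mean(), (x * x).mean()]])
rhs = np.array([y.mean(), (x * y).mean()])
b_hat, a_hat = np.linalg.solve(A, rhs)

# The same numbers fall out of ordinary least squares on [1, x]:
design = np.column_stack([np.ones_like(x), x])
b_ls, a_ls = np.linalg.lstsq(design, y, rcond=None)[0]
print(np.allclose([b_hat, a_hat], [b_ls, a_ls]))  # True
```

The two solves agree because the sample moment conditions with h(x) = 1 and h(x) = x are exactly the least-squares normal equations.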

↳

I believe the true model was y = a*x + b + sigma*x^2. You can fit it with least squares (which defines the Gaussian likelihood) or add an L1 penalty to regularize the coefficients.
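A quick sketch of the least-squares route for that quadratic model (the data and the true coefficients a = 1.5, b = 0.5, sigma = 0.8 are made up for illustration):

```python
import numpy as np

# Hypothetical data for the proposed model y = a*x + b + sigma*x**2 + noise.
rng = np.random.default_rng(1)
x = rng.uniform(-2.0, 2.0, size=500)
y = 1.5 * x + 0.5 + 0.8 * x**2 + rng.normal(scale=0.1, size=500)

# Least squares on the design matrix [1, x, x^2]; under Gaussian noise this
# maximizes the likelihood.  An L1 penalty on the coefficients (the lasso)
# would instead minimize ||y - X w||^2 + lambda * ||w||_1, shrinking terms
# the data do not support.
X = np.column_stack([np.ones_like(x), x, x**2])
b_hat, a_hat, sigma_hat = np.linalg.lstsq(X, y, rcond=None)[0]
```

Including the x^2 column is what distinguishes this fit from the plain linear model discussed in the previous answers.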

Research Staff Member was asked...October 13, 2016

↳

I was prepared to talk about these things and had no trouble.

Research Staff Member was asked...January 5, 2013

Research Staff Member was asked...December 24, 2015

↳

How will you handle skewed datasets?

Research Staff Member was asked...June 30, 2010

Research Staff Member was asked...May 30, 2019

Research Staff Member was asked...March 20, 2009

Member (Research Staff) was asked...January 17, 2016

↳

I answered 60% of the questions correctly and was selected.

Member of Research Staff was asked...July 6, 2020

↳

I came up with something depth-first-search based.
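The answer only names the technique, not the original question, so here is a generic recursive depth-first search over a hypothetical adjacency-list graph, the kind of routine such an answer would typically build on:

```python
def dfs(graph, start, visited=None):
    """Return the set of nodes reachable from `start` via depth-first search."""
    if visited is None:
        visited = set()
    visited.add(start)
    for neighbor in graph.get(start, []):
        if neighbor not in visited:
            dfs(graph, neighbor, visited)
    return visited

# Example graph (hypothetical): a -> b, a -> c, b -> d.
graph = {"a": ["b", "c"], "b": ["d"], "c": [], "d": []}
print(sorted(dfs(graph, "a")))  # ['a', 'b', 'c', 'd']
```

The same skeleton adapts to most DFS interview problems (cycle detection, connected components, topological order) by changing what is recorded at each visit.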