
Edward Kennedy - Refined Doubly Robust Estimation With Undersmoothing and Double Cross-Fitting

February 24, 2020 - 1:10pm
Graduate School of Business, Gunn Building, Rm G102

Information regarding parking: http://www.gsb.stanford.edu/visit

LUNCH IS PROVIDED AND WILL BE SERVED AT 12:45 PM. 


About this Event

Abstract: Classically, undersmoothing has been used to construct plug-in estimators that are asymptotically efficient. In this work we consider using undersmoothing with doubly robust and other bias-corrected semiparametric estimators, in the hope of reaching minimax optimality. We show the surprising result that, without a specialized form of cross-fitting, doubly robust-style estimators can have provably worse performance than undersmoothed plug-in estimators. However, this can be repaired with the double cross-fitting procedure proposed by Newey & Robins (2018) for spline estimators. We add to this work in several directions. First, we show how undersmoothing a doubly robust estimator leads to a simple minimax optimal estimator of the expected density. We then prove that double cross-fitting allows for improved rates via undersmoothing beyond splines, and in fact for any linear smoother, including kernel, RKHS, Gaussian process, and some random forest estimators. Our analysis has a special focus on the non-root-n setting that arises when nuisance functions are very non-smooth or high-dimensional. Finally, we derive a new minimax lower bound for functional estimation in fixed designs, showing that undersmoothing and double cross-fitting can be minimax optimal in this setup. We study the proposed estimators' performance in simulations, exploring several practical methods for appropriate undersmoothing.
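To illustrate the double cross-fitting idea from the abstract, here is a minimal numerical sketch for the expected density functional ψ = E[p(X)] = ∫ p(x)² dx, the example the abstract mentions. This is our own illustration, not code from the talk: the function names, bandwidth choice, and three-fold split are assumptions. The key point is that the two nuisance density estimates are fit on separate folds, so their errors multiply (rather than add) in the bias of the corrected estimator, which is what allows undersmoothing to help.

```python
import numpy as np
from itertools import permutations

def kde(train, pts, h):
    """Gaussian kernel density estimate at pts, fit on train (bandwidth h)."""
    z = (pts[:, None] - train[None, :]) / h
    return np.exp(-0.5 * z**2).mean(axis=1) / (h * np.sqrt(2 * np.pi))

def dcf_expected_density(x, h, grid_size=2000):
    """Doubly cross-fit, bias-corrected estimate of E[p(X)] = ∫ p(x)^2 dx.

    Two nuisance KDEs are fit on two separate folds and evaluated on a
    third; the bias-corrected estimator is
        mean_{i in fold c}[ p1(X_i) + p2(X_i) ] - ∫ p1 p2,
    averaged over fold rotations. Fold scheme and bandwidth are
    illustrative choices, not prescriptions from the talk.
    """
    rng = np.random.default_rng(0)
    folds = np.array_split(rng.permutation(x), 3)
    grid = np.linspace(x.min() - 4 * h, x.max() + 4 * h, grid_size)
    dx = grid[1] - grid[0]
    ests = []
    for a, b, c in permutations(range(3)):  # average over fold rotations
        p1g, p2g = kde(folds[a], grid, h), kde(folds[b], grid, h)
        correction = (p1g * p2g).sum() * dx  # numerical ∫ p1(x) p2(x) dx
        plug = kde(folds[a], folds[c], h) + kde(folds[b], folds[c], h)
        ests.append(plug.mean() - correction)
    return float(np.mean(ests))

# For X ~ N(0, 1), the truth is 1 / (2 * sqrt(pi)) ≈ 0.2821
x = np.random.default_rng(1).standard_normal(3000)
print(dcf_expected_density(x, h=0.2))
```

Because the correction term removes the first-order bias, the bandwidth h can be taken smaller (undersmoothed) than would be optimal for estimating the density itself, which is the practical question the simulations in the talk explore.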


Event Sponsor: 
Institute for Research in the Social Sciences and Graduate School of Business
