Abstract
We develop a finite-sample optimal estimator for regression discontinuity (RD) designs when the outcomes are bounded, including binary outcomes as a leading case. RD designs exploit observations around the policy cutoff. However, if the sample size local to the cutoff is small, standard estimation and inference methods are unreliable because their large-sample approximations are poor. We provide an optimal estimator that achieves the exact minimax mean squared error when the regression function lies in a Lipschitz class. We show that the estimation problem reduces to a sequence of finite-dimensional convex optimization problems. We also propose a uniformly valid inference procedure that does not rely on a large-sample approximation. In a simulation exercise, our estimates have smaller mean squared errors and shorter confidence intervals than those of large-sample techniques. We apply our method to a multi-cutoff design in which the sample size at each cutoff is small. Our method yields informative confidence intervals, unlike existing large-sample approaches.