Abstract

Financial and macroeconomic time-series data often exhibit infrequent but large jumps. Such jumps may be regarded as outliers that are independent of the underlying data-generating process and contaminate inference on its model. In this study, we investigate the effects of such jumps on asymptotic inference for large-dimensional common factor models. We first derive an upper bound on jump magnitudes under which standard asymptotic inference remains valid. Second, we propose a jump-correction method based on a series-by-series outlier detection algorithm that does not require knowledge of the factor structure. This method restores standard asymptotic normality for the factor model unless outliers occur at common dates. Finally, we propose a test of whether jumps at a common date are independent outliers or are attributable to the factors. A Monte Carlo experiment confirms that the proposed jump-correction method achieves good finite-sample properties, and that the proposed test shows good size and power. Two small empirical applications illustrate the usefulness of the proposed methods.