Abstract
Outlying observations are often disregarded at the sacrifice of degrees of freedom, or downweighted via robust loss functions (e.g., Huber's loss), to reduce their undesirable impact on data analysis. In this article, we treat the outlying status of each observation as a parameter and propose a penalization method to automatically adjust the outliers. The proposed method shifts the outliers towards the fitted values, while preserving the non-outlying observations. We also develop a generally applicable iterative algorithm to estimate the model parameters and demonstrate its connection with the maximum likelihood-based estimation procedure in the case of least squares estimation. We establish asymptotic properties of the resulting parameter estimators under the condition that the proportion of outliers does not vanish as the sample size increases. We apply the proposed outlier adjustment method to ordinary least squares and a lasso-type penalization procedure and demonstrate its empirical value via numeric studies. Furthermore, we study the applicability of the proposed method to two robust estimators, Huber's robust estimator and the Huberized lasso, and demonstrate that it noticeably improves model fit in the presence of extremely large outliers.
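The case-specific-parameter idea described above can be sketched for least squares as follows. This is a minimal illustration, not the authors' implementation: it assumes the penalized criterion ||y - Xb - g||² + λ||g||₁, where each g_i is a case-specific parameter capturing the outlying status of observation i, and alternates between soft-thresholding the residuals (the g-update) and refitting the regression on the adjusted responses (the b-update). The function names and the choice of tuning constant are illustrative.

```python
import numpy as np

def soft_threshold(r, t):
    # Componentwise soft-thresholding: shrink each residual toward zero by t.
    return np.sign(r) * np.maximum(np.abs(r) - t, 0.0)

def adjusted_ols(X, y, lam, n_iter=100, tol=1e-8):
    """Illustrative sketch (not the paper's code): minimize
    ||y - X b - g||^2 + lam * ||g||_1 over (b, g) by alternating updates.
    Nonzero g_i flags observation i as outlying and shifts it toward its
    fitted value; non-outlying observations (g_i = 0) are preserved."""
    Xpinv = np.linalg.pinv(X)      # reused for each OLS refit
    b = Xpinv @ y                  # start from the plain OLS fit
    g = np.zeros(len(y))
    for _ in range(n_iter):
        r = y - X @ b
        g_new = soft_threshold(r, lam / 2.0)   # g-update given b
        b_new = Xpinv @ (y - g_new)            # b-update on adjusted responses
        if np.max(np.abs(b_new - b)) < tol:
            b, g = b_new, g_new
            break
        b, g = b_new, g_new
    return b, g

# Hypothetical usage: one gross outlier in an otherwise clean linear model.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.normal(size=50)])
y = X @ np.array([1.0, 2.0]) + 0.1 * rng.normal(size=50)
y[0] += 10.0                       # contaminate one observation
b, g = adjusted_ols(X, y, lam=2.0)
# b stays close to (1, 2); g is nonzero essentially only at index 0.
```

For fixed b, the exact minimizer in g is the soft-thresholding of the residuals at λ/2, which is what links this adjustment scheme to Huber's loss in the least squares case.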
Original language | English |
---|---|
Pages (from-to) | 1-23 |
Number of pages | 23 |
Journal | Statistical Modelling |
Volume | 16 |
Issue number | 1 |
Publication status | Published - 2016 Feb 1 |
Externally published | Yes |
Keywords
- Huber's estimator
- case-specific parameter
- extreme outliers
- robust lasso
- robust linear model
ASJC Scopus subject areas
- Statistics and Probability
- Statistics, Probability and Uncertainty