In this thesis, I propose the Repeated Kernel Density-based Regression Estimator (RKDRE) for the linear regression model. The intuition is that the unknown error distribution can be approximated by applying kernel density estimation to the residuals of an initial estimator. Maximizing this estimated density yields a new parameter estimate, and the process of estimating the parameters and re-estimating the density is repeated until convergence. RKDRE can be regarded as a multi-step version of the KDRE estimator proposed by Yao and Zhao (2013). For computational convenience, I develop a constrained EM algorithm to perform the maximization. I show under relatively weak conditions that both KDRE and RKDRE converge almost surely to the true parameter. Moreover, I prove that under the conditions for which KDRE is adaptive (i.e., asymptotically normal and efficient), RKDRE is adaptive as well. Even though the asymptotic properties of the two estimators coincide, a numerical study shows that RKDRE generally attains higher mean-squared-error efficiency. The overall performance of RKDRE is also arguably better than that of the wide range of other adaptive estimators considered in the numerical study. The practical relevance of RKDRE is illustrated with an application to the experimental data of Andrabi et al. (2017).
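The iterative scheme described above can be sketched as follows. This is a minimal illustration, not the thesis's constrained EM implementation: it assumes a Gaussian kernel, a Silverman rule-of-thumb bandwidth, and a generic numerical optimizer for the density-maximization step; the function name `rkdre` and all tuning choices are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def rkdre(X, y, n_iter=10, tol=1e-6):
    """Sketch of repeated KDE-based regression estimation.

    Alternates between (1) a kernel density estimate of the error
    distribution built from current residuals and (2) maximizing the
    resulting pseudo-log-likelihood over the regression coefficients.
    """
    # Step 0: initial estimate via ordinary least squares.
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(n_iter):
        resid = y - X @ beta
        n = len(resid)
        # Silverman's rule-of-thumb bandwidth (assumed choice).
        h = 1.06 * resid.std(ddof=1) * n ** (-1 / 5)

        def neg_loglik(b):
            # Gaussian-kernel density of the errors, estimated from the
            # residuals of the *current* fit, evaluated at the residuals
            # implied by the candidate coefficients b.
            r = y - X @ b
            z = (r[:, None] - resid[None, :]) / h
            dens = np.exp(-0.5 * z**2).mean(axis=1) / (h * np.sqrt(2 * np.pi))
            return -np.log(dens + 1e-300).sum()

        beta_new = minimize(neg_loglik, beta, method="Nelder-Mead").x
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta
```

With non-Gaussian errors (e.g. Laplace noise), the KDE-based refinement can exploit the error shape that OLS ignores, which is the source of the efficiency gains the thesis studies.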

Zhelonkin, M.
hdl.handle.net/2105/38903
Econometrie
Erasmus School of Economics

Reichardt, H.A. (Hugo). (2017, August 29). Adaptive Estimation in Linear Regression Using Repeated Kernel Error Density Estimation. Econometrie. Retrieved from http://hdl.handle.net/2105/38903