Sparse Bandwidth-Regularised Local Polynomial Regression
Local polynomial regression is widely used to model nonlinear relationships between a response and predictors, but its performance degrades as the dimension of the predictors grows. We propose a two-step bandwidth-regularised local polynomial regression optimisation problem that selects variables in high dimensions. In this method, the bandwidths in the kernel are penalised and tend to increase: unimportant variables are identified by exponentially large bandwidths, while the bandwidths of important variables converge to zero asymptotically. A simple greedy algorithm, based on a transformation of the bandwidths, is developed to approximate the solution of the optimisation problem. We extend the method to quantile regression and generalised linear models, and show that it is selection consistent in all of these settings. Simulations for quantile regression and logistic regression compare our method with existing approaches.
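The intuition behind bandwidth-based selection can be illustrated with a toy sketch (this is not the authors' two-step algorithm, only a minimal demonstration of the underlying idea). Assuming a Nadaraya-Watson estimator with a product Gaussian kernel and per-coordinate bandwidths, driving the bandwidth of an irrelevant coordinate towards infinity flattens its kernel factor to a constant, effectively removing that variable; the simulated design below (one relevant predictor, one pure-noise predictor) is hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
X = rng.uniform(-1.0, 1.0, size=(n, 2))
# Only the first coordinate matters; the second is pure noise.
y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(n)

def loo_cv_error(X, y, h):
    """Leave-one-out CV error of a Nadaraya-Watson estimator with a
    product Gaussian kernel and per-coordinate bandwidth vector h."""
    D = (X[:, None, :] - X[None, :, :]) / h   # scaled pairwise differences
    W = np.exp(-0.5 * np.sum(D ** 2, axis=2))  # product Gaussian kernel
    np.fill_diagonal(W, 0.0)                   # leave each point out
    y_hat = W @ y / W.sum(axis=1)
    return np.mean((y - y_hat) ** 2)

# A huge bandwidth on the noise coordinate flattens its kernel factor,
# which amounts to dropping that variable from the local average.
err_select = loo_cv_error(X, y, np.array([0.15, 1e6]))
err_both = loo_cv_error(X, y, np.array([0.15, 0.15]))
print(f"CV error, noise variable dropped: {err_select:.4f}")
print(f"CV error, both variables kept:    {err_both:.4f}")
```

On this simulated example the configuration with an exponentially large bandwidth on the irrelevant coordinate typically attains a lower cross-validation error, mirroring how the penalised bandwidths in the proposed method separate important from unimportant variables.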