Local Reweighted Kernel Regression

pp. 20-26



Weiwei Han 1

1. Guangdong University of Business Studies, Guangzhou, 510320, China

* Corresponding author.

DOI: https://doi.org/10.5815/ijem.2011.01.04

Received: 6 Oct. 2010 / Revised: 29 Nov. 2010 / Accepted: 30 Dec. 2010 / Published: 8 Feb. 2011

Index Terms

Irregular function, statistical learning, multiple kernel learning


Abstract

Estimating an irregular function with multiscale structure is a hard problem. Results from traditional kernel learning are often unsatisfactory, since underfitting and overfitting cannot be avoided simultaneously, and performance near the boundary is often poor. In this paper, we investigate a data-based localized reweighted regression model under the kernel trick and propose an iterative method to solve the kernel regression problem. The new kernel learning framework has two parts: first, an improved Nadaraya-Watson estimator based on a blockwise approach is constructed; second, an iterative kernel learning method over a decreasing sequence of active sets is introduced to choose kernels. Experiments on simulated and real data sets demonstrate that the proposed method avoids underfitting and overfitting simultaneously and reduces the boundary effect.
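For readers unfamiliar with the baseline the paper builds on, the following is a minimal sketch of the classical Nadaraya-Watson estimator [8], [9] with a Gaussian kernel. It is the plain locally weighted average, not the paper's improved blockwise-reweighted variant; the function name, bandwidth value, and toy data are illustrative assumptions.

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_query, bandwidth=0.1):
    """Classical Nadaraya-Watson estimator with a Gaussian kernel.

    Each prediction is a kernel-weighted mean of the training
    responses. This is the baseline estimator referenced in the
    abstract; the paper's blockwise reweighting and iterative
    kernel selection are not reproduced here.
    """
    # Pairwise squared distances between query and training points.
    d2 = (x_query[:, None] - x_train[None, :]) ** 2
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))  # Gaussian kernel weights
    return (w @ y_train) / w.sum(axis=1)      # locally weighted average

# Toy example: recover a noisy sine curve on interior points.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 2 * np.pi, 200))
y = np.sin(x) + 0.1 * rng.normal(size=x.size)
x_new = np.linspace(0.5, 2 * np.pi - 0.5, 50)
y_hat = nadaraya_watson(x, y, x_new, bandwidth=0.3)
```

Note that the estimate degrades near the edges of the design interval (the boundary effect the paper targets), since the kernel window there is truncated on one side.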

Cite This Paper

Weiwei Han, "Local Reweighted Kernel Regression", IJEM, vol. 1, no. 1, pp. 20-26, 2011. DOI: 10.5815/ijem.2011.01.04


References

[1] G. R. G. Lanckriet, T. De Bie, N. Cristianini, M. I. Jordan and W. S. Noble, "A statistical framework for genomic data fusion," Bioinformatics, vol. 20, pp. 2626-2635, 2004.

[2] D. Zheng, J. Wang and Y. Zhao, "Non-flat function estimation with a multi-scale support vector regression," Neurocomputing, vol. 70, pp. 420-429, 2006.

[3] B. Schölkopf and A. J. Smola, Learning with Kernels. Cambridge, Massachusetts: The MIT Press, 2002.

[4] M. Gonen and E. Alpaydin, "Localized multiple kernel learning," in Proceedings of the 25th International Conference on Machine Learning, 2008.

[5] M. Szafranski, Y. Grandvalet and A. Rakotomamonjy, "Composite kernel learning," in Proceedings of the 25th International Conference on Machine Learning, 2008.

[6] G. R. G. Lanckriet, “Learning the kernel matrix with semidefinite programming,” Journal of Machine Learning Research, vol. 5, pp. 27-72, 2004.

[7] A. Rakotomamonjy, F. Bach, S. Canu and Y. Grandvalet, "More efficiency in multiple kernel learning," in Proceedings of the 24th International Conference on Machine Learning, vol. 227, pp. 775-782, 2007.

[8] E. A. Nadaraya, "On estimating regression," Theory of Probability and Its Applications, vol. 9, no. 1, pp. 141-142, 1964.

[9] G. S. Watson, “Smooth regression analysis,” Sankhya, Ser. A, vol. 26, pp. 359-372, 1964.

[10] Y. Kim, J. Kim and Y. Kim, “Blockwise sparse regression,” Statistica Sinica, vol. 16, pp. 375-390, 2006.

[11] L. Lin, Y. Fan and L. Tan, “Blockwise bootstrap wavelet in nonparametric regression model with weakly dependent processes,” Metrika, vol. 67, pp. 31-48, 2008.

[12] A. Tikhonov and V. Arsenin, Solutions of Ill-Posed Problems. Washington: W. H. Winston, 1977.

[13] A. Rakotomamonjy, X. Mary and S. Canu, "Non-parametric regression with wavelet kernels," Applied Stochastic Models in Business and Industry, vol. 21, pp. 153-163, 2005.