Rule Based Ensembles Using Pair Wise Neural Network Classifiers

Full Text (PDF, 237 KB), pp. 34-40



Moslem Mohammadi Jenghara 1,2,*, Hossein Ebrahimpour-Komleh 2

1. Department of Information Technology, Payam Noor University, Miandoab, Iran

2. Electronic and Computer Faculty, Kashan University, Kashan, Iran

* Corresponding author.


Received: 11 Jun. 2014 / Revised: 11 Sep. 2014 / Accepted: 17 Nov. 2014 / Published: 8 Mar. 2015

Index Terms

Classifier Ensemble, Pair Wise Classifiers, Rule Based Ensemble, Neural Network, Classifier Combination


Abstract

In value estimation, the average of inexperienced people's estimates is a good approximation to the true value, provided that the answers of these individuals are independent. Classifier ensembles implement this principle in classification tasks and are investigated from two perspectives. In the first, the feature space is divided into several local regions and each region is assigned a highly competent classifier; in the second, the base classifiers are applied in parallel and weighted in some way to reach a group consensus. In this paper, a combination of the two approaches is used. An important consideration in classifier combination is that much better results can be achieved if diverse, rather than similar, classifiers are combined. To achieve diversity in the classifier outputs, a symmetric pairwise weighted feature space is used, and the outputs of the classifiers trained over the weighted feature spaces are combined to infer the final result. MLP classifiers are used as the base classifiers. Experimental results show that the applied method is promising.
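The pairwise-classifier idea described in the abstract can be sketched as a one-vs-one ensemble: one base classifier is trained per pair of classes, and the final label is inferred by combining their votes. The sketch below is only an illustration under simplifying assumptions, not the authors' implementation: a trivial nearest-centroid learner stands in for the paper's MLP base classifiers, and plain majority voting replaces the rule-based combination over the weighted feature spaces.

```python
from itertools import combinations
from collections import Counter

def centroid(points):
    # Mean vector of a list of equal-length feature vectors.
    n = len(points)
    return [sum(p[i] for p in points) / n for i in range(len(points[0]))]

def dist2(a, b):
    # Squared Euclidean distance between two feature vectors.
    return sum((x - y) ** 2 for x, y in zip(a, b))

class PairwiseEnsemble:
    """One-vs-one ensemble: one base classifier per class pair, combined by
    majority vote. (The paper uses MLPs over pairwise weighted feature
    spaces; here a nearest-centroid rule stands in as the base learner.)"""

    def train(self, X, y):
        classes = sorted(set(y))
        self.models = {}
        for a, b in combinations(classes, 2):
            Xa = [x for x, lbl in zip(X, y) if lbl == a]
            Xb = [x for x, lbl in zip(X, y) if lbl == b]
            # Each pairwise "classifier" is just the two class centroids.
            self.models[(a, b)] = (centroid(Xa), centroid(Xb))
        return self

    def predict(self, x):
        # Every pairwise model votes for one of its two classes.
        votes = Counter()
        for (a, b), (ca, cb) in self.models.items():
            votes[a if dist2(x, ca) <= dist2(x, cb) else b] += 1
        return votes.most_common(1)[0][0]

# Toy 3-class example with well-separated clusters.
X = [[0, 0], [0, 1], [5, 5], [5, 6], [10, 0], [10, 1]]
y = [0, 0, 1, 1, 2, 2]
model = PairwiseEnsemble().train(X, y)
print(model.predict([0, 0.5]))   # -> 0
print(model.predict([5, 5.5]))   # -> 1
```

A K-class problem yields K(K-1)/2 such pairwise classifiers; because each one sees only two classes, its decision problem is simpler, and diversity among members comes from the different class pairs (and, in the paper, from the pairwise feature weighting).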

Cite This Paper

Moslem Mohammadi Jenghara, Hossein Ebrahimpour-Komleh, "Rule Based Ensembles Using Pair Wise Neural Network Classifiers", International Journal of Intelligent Systems and Applications (IJISA), vol. 7, no. 4, pp. 34-40, 2015. DOI: 10.5815/ijisa.2015.04.05

