Optimization of SVM Multiclass by Particle Swarm (PSO-SVM)

Full Text (PDF, 145KB), pp. 32-38



Fatima Ardjani 1,*, Kaddour Sadouni 1

1. Computer Science Department, Laboratory LAMOSI, University of Sciences and Technology Mohamed Boudiaf (USTO), Oran, Algeria

* Corresponding author.

DOI: https://doi.org/10.5815/ijmecs.2010.02.05

Received: 26 Aug. 2010 / Revised: 15 Oct. 2010 / Accepted: 3 Nov. 2010 / Published: 8 Dec. 2010

Index Terms

SVM multiclass, PSO, TIMIT, evolutionary method, optimization


In many classification problems, the performance of a classifier is often evaluated by a single factor, the error rate. This factor is not well suited to complex real-world problems, in particular multiclass problems. Our contribution consists in adapting an evolutionary method to optimize this factor. Among the available optimization methods, we chose PSO (Particle Swarm Optimization), which makes it possible to optimize the performance of the SVM (Support Vector Machine) classifier. The experiments are carried out on the TIMIT corpus. The results obtained show that the PSO-SVM approach yields better classification accuracy, even though the execution time increases.
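The PSO optimizer referred to above can be sketched generically as follows. This is a minimal, self-contained PSO minimizer, not the authors' implementation; the inertia weight and acceleration coefficients (w, c1, c2) are illustrative defaults from the PSO literature, and the sphere function stands in for the real objective. In a PSO-SVM setting, the objective would instead be the SVM's cross-validation error as a function of its hyperparameters (e.g. C and the kernel width).

```python
import random

random.seed(0)  # for a reproducible run of this sketch

def pso(objective, dim, n_particles=20, iters=200,
        w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Minimal particle swarm minimizer (illustrative sketch)."""
    lo, hi = bounds
    # Random initial positions, zero initial velocities.
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    # Personal bests and global best.
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Standard velocity update: inertia + cognitive + social terms.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Move the particle, clamped to the search bounds.
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Stand-in objective: the 2-D sphere function, minimized at the origin.
best, best_val = pso(lambda x: sum(v * v for v in x), dim=2)
```

Replacing the stand-in objective with a function that trains an SVM at the candidate hyperparameters and returns its validation error gives the PSO-SVM hybrid evaluated in the paper.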

Cite This Paper

Fatima Ardjani, Kaddour Sadouni, "Optimization of SVM Multiclass by Particle Swarm (PSO-SVM)", International Journal of Modern Education and Computer Science (IJMECS), vol. 2, no. 2, pp. 32-38, 2010. DOI: 10.5815/ijmecs.2010.02.05

