IJISA Vol. 5, No. 1, 8 Dec. 2012


Index Terms: Artificial Neural Network, Neural Network Training, Neural Network Pruning, Optimal Brain Damage, Swarm Intelligence, Cat Swarm Optimization

An Artificial Neural Network (ANN) is an abstract representation of the biological nervous system that has the ability to solve many complex problems. The interesting attributes it exhibits make an ANN capable of "learning". ANN learning is achieved by training the neural network with a training algorithm. Aside from choosing a training algorithm, the ANN structure can also be optimized by applying pruning techniques that reduce network complexity. The Cat Swarm Optimization (CSO) algorithm, a swarm intelligence-based optimization algorithm that mimics the behavior of cats, is used as the training algorithm, and the Optimal Brain Damage (OBD) method as the pruning algorithm. This study suggests an approach to ANN training through the simultaneous optimization of the connection weights and the ANN structure. Experiments performed on benchmark datasets taken from the UCI machine learning repository show that the proposed CSONN-OBD is an effective tool for training neural networks.
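The OBD pruning step referenced in the abstract ranks each connection by a saliency score, s_i = h_ii * w_i^2 / 2, where h_ii is the diagonal Hessian term of the error with respect to weight w_i, and removes the least salient connections. A minimal sketch of that ranking-and-pruning step is shown below; the function names are illustrative, a diagonal Hessian approximation is assumed to be available, and this is not the paper's exact implementation.

```python
import numpy as np

def obd_saliencies(weights, hessian_diag):
    """OBD saliency of each connection: s_i = h_ii * w_i^2 / 2."""
    return 0.5 * hessian_diag * weights ** 2

def prune_by_saliency(weights, hessian_diag, prune_fraction=0.2):
    """Zero out the given fraction of weights with the lowest OBD saliency.

    Returns the pruned weight vector and a boolean mask of surviving
    connections (False = pruned).
    """
    s = obd_saliencies(weights, hessian_diag)
    n_prune = int(len(weights) * prune_fraction)
    idx = np.argsort(s)[:n_prune]          # least-salient connections first
    pruned = weights.copy()
    pruned[idx] = 0.0
    mask = np.ones_like(weights, dtype=bool)
    mask[idx] = False
    return pruned, mask

# Illustrative data: with a unit Hessian diagonal, saliency reduces to w^2/2,
# so the two smallest-magnitude weights are removed at prune_fraction=0.5.
w = np.array([0.1, -2.0, 0.5, 0.01])
pruned, mask = prune_by_saliency(w, np.ones(4), prune_fraction=0.5)
```

In the combined CSONN-OBD setting described by the abstract, a step like this would be interleaved with CSO weight updates, so that structure and weights are optimized together.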

John Paul T. Yusiong, "Optimizing Artificial Neural Networks using Cat Swarm Optimization Algorithm", International Journal of Intelligent Systems and Applications (IJISA), vol. 5, no. 1, pp. 69-80, 2013. DOI: 10.5815/ijisa.2013.01.07

[1]Z. Huanping, L. Congying, Y. Xinfeng. Optimization research on Artificial Neural Network Model. Proceedings of the 2011 International Conference on Computer Science and Network Technology, (2011), pp. 1724-1727.

[2]H. Shi. Evolving Artificial Neural Networks Using GA and Momentum. Proceedings of the 2009 Second International Symposium on Electronic Commerce and Security, ISECS '09, (2009), (1): pp. 475-478.

[3]M. Paliwal and U. A. Kumar. Neural Networks and Statistical Techniques: A Review of Applications. Expert Systems with Applications, (2009), 36(1), pp. 2-17.

[4]H. Shi and W. Li. Artificial Neural Networks with Ant Colony Optimization for Assessing Performance of Residential Buildings. Proceedings of the International Conference on Future BioMedical Information Engineering, FBIE 2009, (2009), pp. 379-382.

[5]B. A. Garro, H. Sossa and R. A. Vázquez. Artificial Neural Network Synthesis by means of Artificial Bee Colony (ABC) Algorithm. Proceedings of the IEEE Congress on Evolutionary Computation, CEC 2011, (2011), pp. 331-338.

[6]Y. Wang, Z. Xia and Y. Huo. Neural Network Research Using Particle Swarm Optimization. Proceedings of the 2011 International Conference on Internet Computing and Information Services, ICICIS '11, (2011), pp. 407-410.

[7]S-C. Chu and P-W. Tsai. Computational Intelligence based on the Behavior of Cats. International Journal of Innovative Computing, Information and Control, (2007), 3(1), pp. 163-173.

[8]S-C. Chu, P-W. Tsai and J-S. Pan. Cat Swarm Optimization. Proceedings of the 9th Pacific Rim International Conference on Artificial Intelligence, LNAI 4099, Guilin, (2006), pp. 854-858.

[9]J.-C. Hwang, J.-C. Chen and J.-S. Pan. CSO and PSO to Solve Optimal Contract Capacity for High Tension Customers. Proceedings of 8th International Conference on Power Electronics and Drive Systems, PEDS-2009, (2009), pp. 76-81.

[10]Y. Le Cun, J. S. Denker and S. A. Solla. Optimal Brain Damage. Advances in Neural Information Processing Systems, D. S. Touretzky (ed.), Morgan Kaufmann, San Mateo, (1990), (2): pp. 598-605.

[11]M. Gethsiyal Augasta and T. Kathirvalavakumar. A Novel Pruning Algorithm for Optimizing Feedforward Neural Network of Classification Problems. Neural Processing Letters, (2011), 34(3), pp. 241-258.

[12]L. Li and B. Niu. Designing Artificial Neural Networks Using MCPSO and BPSO, Proceedings of the 2008 International Conference on Computational Intelligence and Security, CIS 2008, (2008), pp. 176-179.

[13]J. Tu, Y. Zhan and F. Han. A Neural Network Pruning Method Optimized with PSO Algorithm. In Proceedings of the 2010 Second International Conference on Computer Modeling and Simulation, ICCMS '10, (2010), (3), pp. 257-259.

[14]T. Orlowska-Kowalska and M. Kaminski. Effectiveness of Saliency-Based Methods in Optimization of Neural State Estimators of the Drive System with Elastic Couplings. IEEE Transactions on Industrial Electronics, (2009), 56(10), pp. 4043-4051.

[15]T. Orlowska-Kowalska and M. Kaminski. Optimization of Neural State Estimators of the Two-mass System using OBD method. Proceedings of the IEEE International Symposium on Industrial Electronics, ISIE 2008, (2008), pp. 461-466.

[16]I. Sansa, N. B. Mrabet and M. Bouzid Ben Khader. Effectiveness of the Saliency-Based Methods in Optimization of NN Structure for Induction Motor Fault Diagnosis. Proceedings of the 8th International Multi-Conference on Systems, Signals & Devices, (2011), pp. 1-7.

[17]I. A. Basheer and M. Hajmeer. Artificial Neural Networks: Fundamentals, Computing, Design, and Application, Journal of Microbiological Methods, (2000), 43, pp. 3-31.

[18]X. Yao. Evolving Artificial Neural Networks. Proceedings of the IEEE, (1999), 87(9), pp. 1423-1447.

[19]C. Ozturk and D. Karaboga. Hybrid Artificial Bee Colony Algorithm for Neural Network Training. Proceedings of the IEEE Congress on Evolutionary Computation, CEC 2011, (2011), pp. 84-88.

[20]E. Alba and J. Chicano. Training Neural Networks with GA Hybrid Algorithms, K. Deb (ed.). Proceedings of GECCO ’04, Seattle, Washington, LNCS 3102, (2004), pp. 852-863.

[21]Y. Liu and X. Yao. A Population-Based Learning Algorithm Which Learns Both Architectures and Weights of Neural Networks. Chinese J. Advanced Software Res., (1996), 3(1), pp. 54-65.

[22]E. Cantú-Paz. Pruning Neural Networks with Distribution Estimation Algorithms. Proceedings of the 2003 International Conference on Genetic and Evolutionary Computation, GECCO'03, (2003), 1: pp. 790-800.

[23]J. Sum and C-S. Leung. On the Error Sensitivity Measure for Pruning RBF Networks. In Proceedings of the Second International Conference on Machine Learning and Cybernetics, (2003), pp. 1162-1167.

[24]J. Yang, A. Bouzerdoum and S. Phung. A Neural Network Pruning Approach based on Compressive Sampling. Proceedings of International Joint Conference on Neural Networks 2009, (2009), pp. 3428-3435.

[25]M. Shahin, M. Jaksa and H. Maier. Application of Neural Networks in Foundation Engineering. Theme paper to the International e-Conference on Modern Trends in Foundation Engineering: Geotechnical Challenges and Solutions, Theme No. 5: Numerical Modelling and Analysis, Chennai, India, (2004).

[26]B. Hassibi and D. G. Stork. Second Order Derivatives for Network Pruning: Optimal Brain Surgeon. Advances in Neural Information Processing Systems 5, [NIPS Conference], Stephen Jose Hanson, Jack D. Cowan, and C. Lee Giles (Eds.). Morgan Kaufmann Publishers Inc., San Francisco, CA, USA, (1992), pp. 164-171.

[27]S. Samarasinghe. Neural Networks for Applied Sciences and Engineering, Auerbach Publications, Boston, MA, (2006).

[28]J. Ding, J. Shao, Y. Huang, L. Sheng, W. Fu and Y. Li. Swarm Intelligence Based Algorithms for Data Clustering. Proceedings of the 2011 International Conference on Computer Science and Network Technology, (2011), pp. 577-581.

[29]J. P. T. Yusiong and P. C. Naval, Jr. Training Neural Networks Using Multiobjective Particle Swarm Optimization, Lecture Notes in Computer Science, ICNC 2006, (2006), (1): pp. 879-888.

[30]D. Newman, S. Hettich, C. Blake and C. Merz. UCI Repository of machine learning databases. Irvine, CA: University of California, Department of Information and Computer Science, (1998).

[31]P. Palmes, T. Hayasaka and S. Usui. Mutation-based Genetic Neural Network. IEEE Transactions on Neural Networks, (2005), 16(3), pp. 587-600.