Real Time Multiple Hand Gesture Recognition System for Human Computer Interaction



Siddharth S. Rautaray 1,*, Anupam Agrawal 1

1. Indian Institute of Information Technology, Allahabad, India

* Corresponding author.


Received: 4 May 2011 / Revised: 20 Sep. 2011 / Accepted: 9 Dec. 2011 / Published: 8 May 2012

Index Terms

Real time, gesture recognition, human computer interaction, tracking


With the increasing use of computing devices in everyday life, the need for user-friendly interfaces has led to the evolution of different types of interfaces for human computer interaction. Real-time vision-based hand gesture recognition affords users the ability to interact with computers in more natural and intuitive ways. Direct use of the hand as an input device is attractive because the hand by itself can convey far more information than mice, joysticks, and similar devices, enabling recognition systems that serve a variety of human computer interaction applications. The gesture recognition system consists of three main modules: hand segmentation, hand tracking, and gesture recognition from hand features. The designed system is further integrated with different applications, such as an image browser and a virtual game, to demonstrate its possibilities for human computer interaction. Computer-vision-based systems have the potential to provide more natural, non-contact solutions. The present research work focuses on designing and developing a practical framework for real-time hand gesture recognition.
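The three modules named above can be sketched as a minimal pipeline. This is an illustrative sketch only, not the authors' implementation: the RGB skin-colour rule, the centroid tracker, and the area-based classify() rule are all assumptions chosen to keep the example self-contained on a tiny synthetic frame.

```python
# Hedged sketch of a segmentation -> tracking -> recognition pipeline.
# All thresholds and the gesture rule are illustrative assumptions.

def segment_skin(image):
    """Hand segmentation: binary mask via a simple RGB skin rule."""
    return [[1 if (r > 95 and g > 40 and b > 20 and r > g and r > b) else 0
             for (r, g, b) in row]
            for row in image]

def track_centroid(mask):
    """Hand tracking: locate the hand as the centroid of mask pixels."""
    pts = [(x, y) for y, row in enumerate(mask)
           for x, v in enumerate(row) if v]
    if not pts:
        return None
    return (sum(x for x, _ in pts) / len(pts),
            sum(y for _, y in pts) / len(pts))

def classify(mask):
    """Toy gesture recognition from a hand feature (region area)."""
    area = sum(sum(row) for row in mask)
    return "open" if area >= 4 else "fist"

# Tiny synthetic 3x3 frame: a 2x2 skin-coloured patch in the top-left.
skin, bg = (200, 120, 90), (10, 10, 10)
frame = [[skin, skin, bg],
         [skin, skin, bg],
         [bg,   bg,   bg]]

mask = segment_skin(frame)
print(track_centroid(mask))  # -> (0.5, 0.5), centroid of the 2x2 patch
print(classify(mask))        # -> "open"
```

A practical system would replace each stage with the kinds of techniques surveyed in the hand-gesture literature, e.g. colour-space segmentation, CamShift or feature-point tracking, and Haar-like-feature classifiers, but the module boundaries stay the same.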

Cite This Paper

Siddharth S. Rautaray, Anupam Agrawal, "Real Time Multiple Hand Gesture Recognition System for Human Computer Interaction", IJISA, vol.4, no.5, pp.56-64, 2012. DOI:10.5815/ijisa.2012.05.08

