Difference of the Absolute Differences – A New Method for Motion Detection

Full Text (PDF, 857KB), PP.1-14



Khalid Youssef 1,*, Peng-Yung Woo 1

1. Department of Electrical Engineering, Northern Illinois University, DeKalb, IL 60115, USA

* Corresponding author.

DOI: https://doi.org/10.5815/ijisa.2012.09.01

Received: 4 Dec. 2011 / Revised: 2 Mar. 2012 / Accepted: 11 May 2012 / Published: 8 Aug. 2012

Index Terms

Bare-Hand Motion, Computer Vision, Difference of the Absolute Differences Method, Human-Machine Interaction, Image Processing, Spatial Object Motion Detection


This article presents a new method that reduces cost and processing time for spatial object motion detection, focusing on bare-hand motion that mimics computer mouse functions: the user moves the mouse pointer in real time with the motion of his/her hand, without wearing gloves, carrying an object, or pressing any key. The topic is studied from the viewpoint of computer vision and image processing. The principles of the difference of the absolute differences (DAD) are investigated. A new method based on the DAD principles, conceptually different from all existing approaches to spatial object motion detection, is developed and applied successfully to bare-hand motion. A real-time implementation of bare-hand motion detection demonstrates the accuracy and efficiency of the DAD method.
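The abstract does not spell out the DAD computation, so the following is only a minimal sketch of one plausible reading of the idea, assuming three consecutive grayscale frames: take the absolute difference of each adjacent frame pair, then take the difference of those two absolute-difference images to flag pixels whose inter-frame change itself changed. The function name `dad_motion_map` and the threshold value are illustrative, not taken from the paper.

```python
import numpy as np

def dad_motion_map(f1, f2, f3, threshold=20):
    """Hypothetical difference-of-absolute-differences motion map.

    f1, f2, f3 : three consecutive grayscale frames (uint8 arrays).
    Returns a boolean mask marking pixels where the absolute
    inter-frame difference itself changed, suggesting motion.
    This is an assumed interpretation, not the paper's exact algorithm.
    """
    # Widen to a signed type so subtraction cannot wrap around.
    a, b, c = (x.astype(np.int16) for x in (f1, f2, f3))
    d1 = np.abs(b - a)      # absolute difference between frames 1 and 2
    d2 = np.abs(c - b)      # absolute difference between frames 2 and 3
    dad = np.abs(d2 - d1)   # difference of the absolute differences
    return dad > threshold
```

On a static scene the two absolute-difference images coincide, so the DAD map is zero everywhere; only pixels whose change pattern varies between the frame pairs survive the threshold, which is what makes such a map cheap to compute and attractive for real-time pointer tracking.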

Cite This Paper

Khalid Youssef, Peng-Yung Woo, "Difference of the Absolute Differences – A New Method for Motion Detection", International Journal of Intelligent Systems and Applications (IJISA), vol. 4, no. 9, pp. 1-14, 2012. DOI: 10.5815/ijisa.2012.09.01

