A Gesture-based Recognition System for Augmented Reality

Vinothini Kasinathan, Aida Mustapha, Asti Amalia Nur Fajrillah


With the geometric growth of Information Technology, conventional input devices are becoming increasingly obsolete and inadequate. Experts in Human-Computer Interaction (HCI) are convinced that input devices remain the bottleneck of information acquisition, specifically when using Augmented Reality (AR) technology. Current input mechanisms cannot keep pace with the trend towards naturalness and expressivity, in which users perform natural gestures or operations that are converted into input. Hence, a more natural and intuitive input device is imperative, specifically gestural input, which HCI experts widely perceive as the next major input modality. To address this gap, this project develops a prototype hand gesture recognition system based on computer vision for modeling basic human-computer interactions. The main motivation of this work is a technology that requires users to wear no additional equipment whatsoever. The gesture-based hand recognition system was implemented using the Rapid Application Development (RAD) methodology and was evaluated in terms of its usability and performance through five levels of testing: unit testing, integration testing, system testing, recognition accuracy testing, and user acceptance testing. The unit, integration, and system tests, as well as the user acceptance test, produced favorable results. In conclusion, conventional input devices will continue to bottleneck this advancement in technology; therefore, a better alternative input technique should be investigated, in particular a gesture-based input technique, which offers users more natural and intuitive control.
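As an illustration of the recognition accuracy testing level mentioned above, the sketch below computes overall and per-gesture accuracy from expected versus predicted labels for a test run. This is a minimal, hypothetical example: the gesture names and data are invented for illustration, and the paper's actual computer-vision recognition pipeline is not reproduced here.

```python
from collections import defaultdict

def recognition_accuracy(expected, predicted):
    """Overall and per-gesture accuracy for one recognition test run.

    `expected` and `predicted` are parallel lists of gesture labels,
    one pair per test sample (hypothetical data, not from the paper).
    """
    assert len(expected) == len(predicted) and expected
    hits = defaultdict(int)    # correct recognitions per gesture
    totals = defaultdict(int)  # test samples per gesture
    for exp, pred in zip(expected, predicted):
        totals[exp] += 1
        if exp == pred:
            hits[exp] += 1
    per_gesture = {g: hits[g] / totals[g] for g in totals}
    overall = sum(hits.values()) / len(expected)
    return overall, per_gesture

# Example run with made-up labels for three basic gestures.
expected = ["point", "point", "swipe", "swipe", "grab", "grab"]
predicted = ["point", "point", "swipe", "grab", "grab", "grab"]
overall, per_gesture = recognition_accuracy(expected, predicted)
print(overall)                # 5 of 6 samples recognized correctly
print(per_gesture["swipe"])   # one of two "swipe" samples recognized
```

In a real evaluation, `predicted` would come from running the vision-based recognizer over recorded test gestures; per-gesture accuracy then highlights which gestures the system confuses.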


hand gesture; e-learning; human-computer interaction.


DOI: http://dx.doi.org/10.18517/ijaseit.9.6.6267

Published by INSIGHT - Indonesian Society for Knowledge and Human Development