Head Gesture Recognition and Interaction Techniques in Virtual Reality: A Review

  • Authors

    • Nurul Nasuha Zolkefly
    • Ismahafezi Ismail
    • Suhailan Safei
    • Syadiah Nor Wan Shamsuddin
    • Mohd Azhar M. Arsad
    2018-12-09
    https://doi.org/10.14419/ijet.v7i4.31.23725
  • Keywords: Head Gesture Recognition, Mobility Impairments, Virtual Reality Interaction Technique, Virtual Reality
  • This paper presents a review of head gesture recognition models, organized chronologically. The related topics discussed are (a) virtual reality interaction techniques, (b) head gesture interaction in virtual reality, and (c) the use of virtual reality (VR) systems by people with mobility impairments. This study contributes an exploration of a less common body part for gestural input, the head, as the main interaction approach in VR. The review also offers new insights into how existing models theoretically recognize head gestures and how those gestures are implemented as an input modality and interaction technique in VR environments.


  • References

      [1] Oculus Rift. (2017). Retrieved on 19 January 2018 from http://www.oculus.com/

      [2] Putting the 'real' into virtual reality: the teams building virtual worlds on HTC Vive. (2018). Retrieved on 15 April 2018 from https://www.techradar.com/news/the-apps-and-those-behind-them-that-make-the-virtual-a-reality-for-the-htc-vive

      [3] Simulation & Training Practical Engagement of Learning Activities: Industrial. (2018). Retrieved on 14 February 2018 from https://www.vrstudios.com/simulations/

      [4] Virtual Reality Surgery: The Future of Healthcare. (2017). Retrieved on 14 February 2018 from https://appreal-vr.com/blog/understanding-vr-surgery/

      [5] Zhang, M., Zhang, Z., Chang, Y., Aziz, E. S., Esche, S., & Chassapis, C. (2018). Recent developments in game-based virtual reality educational laboratories using the Microsoft kinect. International Journal of Emerging Technologies in Learning, 13(1), 138–159

      [6] Morimoto, C., Yacoob, Y., & Davis, L. (1996). Recognition of head gestures using hidden Markov models. In Proceedings – International Conference on Pattern Recognition (Vol. 3, pp.461–465)

      [7] Kapoor, A., & Picard, R. W. (2001). A real-time head nod and shake detector. In Proceedings of the 2001 Workshop on Perceptive User Interfaces (pp. 1–5)

      [8] Tan, W., & Rong, G. (2003). A real-time head nod and shake detector using HMMs. Expert Systems with Applications, 25(3), 461–466

      [9] Terven, J. R., Salas, J., & Raducanu, B. (2014). Robust head gestures recognition for assistive technology. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8495 LNCS, pp. 152–161)

      [10] Zhao, J., & Allison, R. S. (2017). Real-time head gesture recognition on head-mounted displays using cascaded hidden Markov models. 2017 IEEE International Conference on Systems, Man, and Cybernetics (SMC), 2361–2366

      [11] Lowe, D. G. (2004). Distinctive image features from scale-invariant keypoints. International Journal of Computer Vision, 60(2), 91–110

      [12] Leap Motion. (2017). Retrieved on 20 January 2018 from https://www.leapmotion.com/product/vr/

      [13] Piumsomboon, T., Lee, G., Lindeman, R. W., & Billinghurst, M. (2017). Exploring natural eye-gaze-based interaction for immersive virtual reality. In 2017 IEEE Symposium on 3D User Interfaces, 3DUI 2017 - Proceedings (pp. 36–39)

      [14] Pai, Y. S., Outram, B., Vontin, N., & Kunze, K. (2016). Transparent Reality: Using Eye Gaze Focus Depth as Interaction Modality. Proceedings of the 29th Annual Symposium on User Interface Software and Technology –UIST ’16 Adjunct, 171–172

      [15] Pfeuffer, K., Mayer, B., Mardanbegi, D., & Gellersen, H. (2017). Gaze + pinch interaction in virtual reality. In Proceedings of the 5th Symposium on Spatial User Interaction - SUI ’17 (pp. 99–108)

      [16] Pai, Y. S., Outram, B. I., Tag, B., Isogai, M., Ochi, D., & Kunze, K. (2017). GazeSphere: Navigating 360-degree-video environments in VR using head rotation and eye gaze. ACM SIGGRAPH 2017 Posters, SIGGRAPH 2017, 2016–2017

      [17] Sargunam, S. P., Moghadam, K. R., Suhail, M., & Ragan, E.D. (2017). Guided head rotation and amplified head rotation: Evaluating semi-natural travel and viewing techniques in virtual reality. In Proceedings - IEEE Virtual Reality (pp. 19–28)

      [18] Gandrud, J., & Interrante, V. (2016). Predicting destination using head orientation and gaze direction during locomotion in VR. In Proceedings of the ACM Symposium on Applied Perception - SAP ’16 (pp. 31–38)

      [19] Physical and Mobility Impairment Factsheet. (2018). Retrieved on 23 January 2018 from https://web.stanford.edu/class/engr110/factsheet.html

      [20] World Health Organization & The World Bank. (2011). World report on disability (p. 261)

      [21] Disability and Virtual Reality Technology. (2016). Retrieved on 21 January 2018 from https://www.disabled-world.com/assistivedevices/computer/vr-tech.php

      [22] Jabeen, F., Tao, L., & Linlin, T. (2017). One bit mouse for virtual reality. In Proceedings - 2016 International Conference on Virtual Reality and Visualization, ICVRV 2016 (pp. 442–446)

  • How to Cite

    Nasuha Zolkefly, N., Ismail, I., Safei, S., Nor Wan Shamsuddin, S., & Azhar M. Arsad, M. (2018). Head Gesture Recognition and Interaction Techniques in Virtual Reality: a Review. International Journal of Engineering & Technology, 7(4.31), 437-440. https://doi.org/10.14419/ijet.v7i4.31.23725