Non-Vision Based Sensors for Dynamic Hand Gesture Recognition Systems: A Comparative Study

  • Authors

    • Panduranga H T
    • Mani. C
    2018-07-20
    https://doi.org/10.14419/ijet.v7i3.12.17782
  • Human Computer Interaction, Motion Capture, Hand Gesture Recognition, Machine Learning
  • Gestures are configurations and motions of a body part that convey meaningful information, express an action, or serve to command and control. A wide range of sensors based on different technologies is available on the market. The gesture recognition process involves steps such as data acquisition from a sensor, segmentation of the gesture data, extraction of parameters (features), and classification of hand gestures. Three-dimensional hand gestures have been widely adopted for advanced applications such as virtual worlds, where users can interact naturally, for example playing a musical instrument without any physical device being present. Techniques for dynamic finger gesture recognition can be classified as vision based or wearable-sensor based. The purpose of this paper is to compare various non-vision based sensors with different tracking technologies, summarizing their advantages and drawbacks to help investigators and researchers working in this area.
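The recognition steps listed in the abstract (acquisition, segmentation, parameter extraction, classification) can be sketched as a minimal pipeline. This is an illustrative assumption, not the paper's method: it segments a stream of 3-axis sensor samples by magnitude, summarizes the segment with simple statistics, and classifies by nearest template.

```python
import numpy as np

def segment(signal, threshold=0.5):
    """Crude activity segmentation: keep samples whose magnitude exceeds a threshold."""
    mags = np.linalg.norm(signal, axis=1)
    return signal[mags > threshold]

def extract_features(segment_data):
    """Parameter extraction: per-axis mean and standard deviation of the segment."""
    return np.concatenate([segment_data.mean(axis=0), segment_data.std(axis=0)])

def classify(features, templates):
    """Nearest-template classification over feature vectors."""
    names = list(templates)
    dists = [np.linalg.norm(features - templates[n]) for n in names]
    return names[int(np.argmin(dists))]

def recognize(signal, templates):
    """Full pipeline: segmentation -> feature extraction -> classification."""
    return classify(extract_features(segment(signal)), templates)
```

A real system would replace each stage with sensor-specific processing (e.g. an HMM or SVM classifier, as in several of the surveyed works), but the stage boundaries are the same.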


  • References

      [1] Arpita Ray Sarkar, G. Sanyal, S. Majumder, “Hand Gesture Recognition Systems: A Survey”, International Journal of Computer Applications, Vol. 71, No. 15, ISSN 0975-8887, May 2013.

      [2] Shuai Yang, “Robust Human Computer Interaction Using Dynamic Hand Gesture Recognition”, Doctor of Philosophy thesis, University of Wollongong, Research Online, July 2016.

      [3] Aarti Malik, Ruchika, “Gesture Technology: A Review”, International Journal of Electronics and Computer Science Engineering, ISSN 2277-1956, pp. 2324-2327.

      [4] G. Simion, V. Gui and M. Otesteanu, “Vision Based Hand Gesture Recognition: A Review”, International Journal of Circuits, Systems and Signal Processing, Issue 4, Vol. 6, 2012, pp. 275-282.

      [5] Ying Wu, Thomas S. Huang, “Vision-Based Gesture Recognition: A Review”, Beckman Institute, University of Illinois at Urbana-Champaign, Lecture Notes in Computer Science, Gesture Workshop, 1999.

      [6] Joseph J. LaViola Jr., “3D Gestural Interaction: The State of the Field”, Review Article, ISRN Artificial Intelligence, Vol. 2013, October 2013.

      [7] G. R. S. Murthy & R. S. Jadon, “A Review of Vision Based Hand Gesture Recognition System”, International Journal of Information Technology and Knowledge Management, July-Dec 2009, Vol. 2, No. 2, pp. 405-410.

      [8] Sigal Berman, Helman Stern, “Sensors for Gesture Recognition Systems”, IEEE Transactions on Systems, Man and Cybernetics, Part C (Applications and Reviews), Vol. 42, No. 3, May 2012.

      [9] Motion Capture – Chapter.

      [10] Yuri De Pra, Fausto Spoto, Federico Fontana and Linmi Tao, “Infrared vs. Ultrasonic Finger Detection on a Virtual Piano Keyboard”, Proceedings ICMC|SMC|2014, 14-20 September 2014, Athens, Greece, pp. 654-658.

      [11] Cheng Zhang et al., “SoundTrak: Continuous 3D Tracking of a Finger Using Active Acoustics”, PACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT), Vol. V, No. N, Article, Jan. YY.

      [12] Sangki Yun et al., “Strata: Fine-Grained Acoustic-based Device-Free Tracking”, MobiSys '17, June 19-23, 2017, ISBN 978-1-4503-4928-4/17/06.

      [13] Rajalakshmi Nandakumar et al., “FingerIO: Using Active Sonar for Fine-Grained Finger Tracking”, CHI '16, May 7-12, 2016, ISBN 978-1-4503-3362-7/16/05.

      [14] Wenguang Mao, Jian He and Lili Qiu, “CAT: High-Precision Acoustic Motion Tracking”, MobiCom '16, October 3-7, 2016, ISBN 978-1-4503-4226-1/16/10.

      [15] Ke-Yu Chen, Shwetak Patel, Sean Keller, “Finexus: Tracking Precise Motions of Multiple Fingertips Using Magnetic Sensing”, CHI '16, May 7-12, 2016, San Jose, ISBN 978-1-4503-3362-7/16/05.

      [16] Stale A. Skogstad et al., “Using IR Optical Marker Based Motion Capture for Exploring Musical Interaction”, Proceedings of the 2010 Conference on New Interfaces for Musical Expression (NIME 2010), Sydney, Australia.

      [17] Xiaochi Gu et al., “Dexmo: An Inexpensive and Lightweight Mechanical Exoskeleton for Motion Capture and Force Feedback in VR”, www.unity3d.com.

      [18] Laura Dipietro et al., “A Survey of Glove-Based Systems and Their Applications”, IEEE Transactions on Systems, Man and Cybernetics, Part C: Applications and Reviews, Vol. 38, No. 4, July 2008.

      [19] Joao Lourenco, “Literature Review: Glove Based Input and Three Dimensional Vision Based Interaction”, Rhodes University, B.Sc. (Honours), 29th October 2010.

      [20] Eider C. P. Silva et al., “Sensor Data Fusion for Full Arm Tracking Using Myo Armband and Leap Motion”, SBC Proceedings of SBGames 2015, ISSN 2179-2259.

      [21] Xu Zhang et al., “A Framework for Hand Gesture Recognition Based on Accelerometer and EMG Sensors”, IEEE Transactions on Systems, Man and Cybernetics, Part A: Systems and Humans, Vol. 41, No. 6, November 2011.

      [22] Dom Brown et al., “Leimu: Gloveless Music Interaction Using a Wrist Mounted Leap Motion”, NIME '16, July 11-15, 2016, Griffith University, Brisbane, Australia.

      [23] Xiaoyan Wang et al., “Hidden-Markov-Models-Based Dynamic Hand Gesture Recognition”, Research Article, Mathematical Problems in Engineering, Vol. 2012, Article ID 986134.

      [24] Mario Ganzeboom, “How Hand Gestures Are Recognized Using a Dataglove”, Human Media Interaction MSc, University of Twente, The Netherlands.

      [25] Deepali N. Kakade, Prof. Dr. J. S. Chitode, “Dynamic Hand Gesture Recognition: A Literature Review”, IJERT, ISSN 2278-0181, Vol. 1, Issue 9, November 2012.

  • How to Cite

    H T, P., & C, M. (2018). Non-Vision Based Sensors for Dynamic Hand Gesture Recognition Systems: A Comparative Study. International Journal of Engineering & Technology, 7(3.12), 1175-1181. https://doi.org/10.14419/ijet.v7i3.12.17782