Interactive Dance Guidance Using Example Motions

  • Authors

    • Yejin Kim
    • . .
    2018-09-01
    https://doi.org/10.14419/ijet.v7i3.34.19372
  • Keywords: Dance motion, dance training, example motion, interactive guidance, online lesson.
  • Abstract

    Background/Objectives: Human dance movements are difficult to learn without taking an actual class. In this paper, an interactive dance guidance system is proposed to teach dance motions using example motions.

    Methods/Statistical analysis: In the proposed system, a set of example motions is captured from experts through a marker-free motion capture method that uses multiple Kinect cameras. The captured motions are calibrated and optimally reconstructed into a motion database. For efficient exchange of motion data between a student and an instructor, a posture-based motion search and multi-mode views are provided for online lessons.
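The posture-based motion search mentioned above can be sketched as a nearest-neighbor query over per-frame posture descriptors. The descriptor (root-relative, scale-normalized joint positions) and the `MotionIndex` class below are illustrative assumptions for this sketch, not the paper's actual implementation:

```python
import numpy as np
from scipy.spatial import cKDTree

def posture_descriptor(joints):
    """Hypothetical posture descriptor: joints is a (J, 3) array of joint
    positions for one frame; returns a flat, body-size-normalized vector."""
    centered = joints - joints[0]                  # make root-relative
    scale = np.linalg.norm(centered, axis=1).max() or 1.0
    return (centered / scale).ravel()              # normalize for body size

class MotionIndex:
    """Index every frame of every clip; query with a single posture."""

    def __init__(self, clips):
        # clips: list of (clip_id, frames), frames shaped (T, J, 3)
        descs, self.labels = [], []
        for cid, frames in clips:
            for f, joints in enumerate(frames):
                descs.append(posture_descriptor(joints))
                self.labels.append((cid, f))
        self.tree = cKDTree(np.array(descs))

    def query(self, joints, k=5):
        """Return the k (clip_id, frame) pairs closest to the given posture."""
        _, idx = self.tree.query(posture_descriptor(joints), k=k)
        return [self.labels[i] for i in np.atleast_1d(idx)]
```

A student's current pose, converted to the same descriptor, would then retrieve candidate clips to practice; the real system may use a different descriptor or distance measure.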

    Findings: To capture accurate example motions, the proposed system solves the joint occlusion problem by using multiple Kinect cameras. An iterative closest point (ICP) method unifies the data from the multiple cameras into a single coordinate system, generating an output motion in real time. Compared to a commercial system, our system captures various dance motions with an average accuracy of over 85%, as shown in the experimental results. Using touch-screen devices, a student can browse the database for a desired motion to start a dance practice and send his or her own motion to an instructor for feedback. By conducting online dance lessons in ballet, K-pop, and traditional Korean dance, our experimental results show that the participating students can improve their dance skills over a given period.
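The ICP registration step described above can be illustrated with a minimal point-to-point sketch: nearest-neighbor correspondences followed by an SVD-based (Kabsch) rigid alignment, repeated until the error stops improving. This is a generic textbook ICP, not the paper's implementation, and the function names are assumptions:

```python
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Kabsch/SVD solution for the rigid (R, t) aligning src onto dst,
    assuming src[i] corresponds to dst[i]."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:       # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_d - R @ mu_s
    return R, t

def icp(src, dst, iters=50, tol=1e-9):
    """Iteratively align point cloud src onto dst; returns the total (R, t)."""
    cur = src.copy()
    tree = cKDTree(dst)
    prev_err = np.inf
    for _ in range(iters):
        d, idx = tree.query(cur)                      # closest-point matches
        R, t = best_rigid_transform(cur, dst[idx])
        cur = cur @ R.T + t
        err = d.mean()
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    # cur is a rigid transform of src, so this recovers the composition
    return best_rigid_transform(src, cur)
```

In the paper's setting, `src` and `dst` would be point sets observed by two Kinect cameras; running ICP pairwise yields the transforms that unify all cameras into one coordinate system.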

    Improvements/Applications: Our system is applicable to any student who wants to learn dance motions without taking an actual class and to receive online feedback from a distant instructor.




  • How to Cite

    Kim, Y. (2018). Interactive Dance Guidance Using Example Motions. International Journal of Engineering & Technology, 7(3.34), 521-526. https://doi.org/10.14419/ijet.v7i3.34.19372

    Received date: 2018-09-09

    Accepted date: 2018-09-09

    Published date: 2018-09-01