Designing an Expressive Virtual Kompang on Mobile Device with Tri-Axial Accelerometer
2018-12-09 https://doi.org/10.14419/ijet.v7i4.31.23721
Keywords: Gesture Recognition, Mobile Music, Music Interaction, Natural User Interface.
Abstract
This paper presents an expressive virtual percussion instrument for the Kompang on mobile devices that closely replicates the actual instrument. Most currently available applications lack expressive control, as they rely only on trigger-type events to play the corresponding sound. This paper therefore implements a simple extraction method that derives percussive features from embedded sensors and maps them to the output sound with minimum delay. Multiple features related to the shape of the drum hit are extracted using the tri-axial accelerometer of the mobile device. These features provide an expressive percussion experience that closely imitates playing the actual instrument. An application of the virtual instrument for the Kompang is described, together with an evaluation of the system and ideas for future developments. Results from the study showed that the feature extraction algorithm had an accuracy of 86.78% at detecting a drum hit at its peak acceleration value. The questionnaire results also indicated that the participants were satisfied with the system overall.
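The abstract describes detecting a drum hit at its peak acceleration value from the tri-axial accelerometer. As a rough illustration of that idea only, and not the authors' published algorithm, the Kotlin sketch below combines the three axes into a single magnitude and reports a hit when the magnitude rises above a threshold and then begins to fall. The Sample and HitDetector names and the threshold value are illustrative assumptions.

```kotlin
import kotlin.math.sqrt

// One accelerometer reading (m/s^2 on each axis).
data class Sample(val x: Float, val y: Float, val z: Float)

// Minimal hit detector: report a hit at the local peak of the
// acceleration magnitude once it has exceeded a threshold.
class HitDetector(private val threshold: Float = 15f) {
    private var prevMag = 0f
    private var rising = false

    /** Returns the peak magnitude when a hit is detected, or null otherwise. */
    fun onSample(s: Sample): Float? {
        // Combine the three axes into an orientation-independent magnitude.
        val mag = sqrt(s.x * s.x + s.y * s.y + s.z * s.z)
        var peak: Float? = null
        if (mag > threshold && mag > prevMag) {
            rising = true                 // acceleration still climbing
        } else if (rising && mag < prevMag) {
            peak = prevMag                // previous sample was the local peak
            rising = false
        }
        prevMag = mag
        return peak
    }
}

fun main() {
    // Simulated stream: quiet, a sharp strike, then quiet again.
    val stream = listOf(
        Sample(0.1f, 0.2f, 9.8f), Sample(1f, 2f, 12f),
        Sample(5f, 8f, 20f), Sample(2f, 3f, 14f), Sample(0.2f, 0.1f, 9.8f)
    )
    val detector = HitDetector()
    stream.forEach { s ->
        detector.onSample(s)?.let { println("Hit detected, peak magnitude = $it") }
    }
}
```

On a real device the same onSample logic would be fed from the platform's sensor callback; the sampling rate and threshold would need tuning against actual Kompang strikes.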
How to Cite
Yong Leng, H., Mohd. Norowi, N., & Hazri Jantan, A. (2018). Designing an Expressive Virtual Kompang on Mobile Device with Tri-Axial Accelerometer. International Journal of Engineering & Technology, 7(4.31), 414-419. https://doi.org/10.14419/ijet.v7i4.31.23721
Received date: 2018-12-12
Accepted date: 2018-12-12
Published date: 2018-12-09