Hand Gesture Interface for Smart Operation Theatre Lighting

2018-05-03 · https://doi.org/10.14419/ijet.v7i2.25.12358

Keywords: Accelerometer, hand gesture, nRF module, Operation Theatre
Abstract
The lighting system is a critical component of the operation theatre. In currently available systems, focusing the light on the surgical area and varying its intensity are done manually. Manual control has two major drawbacks: it disrupts the surgeon's attention, and it risks infection from non-sterilized hands. It also usually requires assistants to adjust the position and intensity of the light on the surgeon's verbal commands. An intelligent operating-theatre lighting system that maintains the required intensity without casting shadows on the operation area is therefore an important requirement. The proposed system uses accelerometers in gloves worn by the surgeon to track hand movements. Predefined hand movements can be configured to move the lighting head up, down, left, or right. The accelerometer input is interpreted as its corresponding movement by a controller on the transmitter side and sent wirelessly to the receiver side over an nRF module. On the receiver side, the controller takes the incoming command from the nRF module and generates a suitable response at the motor driver to move the lighting head. In the same way, predefined hand signals can vary the light intensity, with the microcontroller driving an LED driver.
How to Cite

Joseph, J., & D S, D. (2018). Hand Gesture Interface for Smart Operation Theatre Lighting. International Journal of Engineering & Technology, 7(2.25), 20-23. https://doi.org/10.14419/ijet.v7i2.25.12358

Received date: 2018-05-03
Accepted date: 2018-05-03
Published date: 2018-05-03