Retina: A Real-Time Interactive Solution for the Visually Impaired

 
 
 
  • Abstract


    The main objective of this project is to provide an application that acts as an all-in-one tool for people with disabilities. Most activities are now performed digitally; online shopping platforms, for example, have largely replaced traditional means. In this era of modern technology, people rely heavily on electronic gadgets and use many of their features to run their daily lives. To make such features more usable for people with disabilities, this application has been developed to take care of their basic needs. While the general public is well served by mobile applications, the same cannot be said for people with disabilities, so this application has been designed specifically to cater to them. This proposal covers only the “Retina App” and its breath-analyzer module. The differentiating feature of this application is its ability to perform various functions such as booking an Uber, reserving tables at restaurants, and calling ambulances and fire trucks, in addition to detecting diseases. It can also detect the level of alcohol in a person’s breath; when this surpasses a certain threshold, the app displays a notification that lets the user book a cab directly to return home.
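    The alcohol-detection behaviour described above can be sketched as a simple threshold check. This is a minimal illustrative sketch, not the app's actual implementation: the function name, threshold value, and action strings are assumptions for illustration.

    ```python
    # Hypothetical sketch of the breath-analyzer logic described in the
    # abstract: a breath sample above a configured alcohol threshold
    # triggers a notification that offers to book a cab home.

    BAC_THRESHOLD = 0.08  # assumed limit (g/dL); illustrative, not from the paper

    def handle_breath_sample(bac_reading: float) -> str:
        """Decide the app's action for one breath-analyzer reading."""
        if bac_reading > BAC_THRESHOLD:
            # Level surpassed: notify the user and offer a direct cab booking.
            return "notify_and_offer_cab"
        # Below the threshold: no intervention needed.
        return "no_action"
    ```

    In the app itself this decision would be wired to the breath-analyzer hardware and to a cab-booking service such as the Uber API mentioned in the keywords.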

     

     


  • Keywords


    API; Breath Analyzer; GPS; Mobile Application; Online Services; Phone Calls; Uber App

  • References


      [1] Suvarna Bhoir, Ajeesh Abraham, Krupa Wadhaiya, "Camera based product identification for the visually impaired", International Journal of Engineering Research and General Science, vol. 4, no. 2, pp. 413-417, March-April 2016.

      [2] V. Kulyukin and A. Kutiyanawala, “Accessible Shopping Systems for Blind and Visually Impaired Individuals: Design Requirements and the State of the Art,” Open Rehabilitation J., vol. 3, 2010, pp. 158−168.

      [3] D. Dakopoulos and N.G. Bourbakis, “Wearable Obstacle Avoidance Electronic Travel Aids for Blind: A Survey,” IEEE Trans. Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 40, no. 1, 2010, pp. 25−35.

      [4] R. Brenner, J. Priyadarshi, and L. Itti, “Perfect Accuracy with Human-in-the-Loop Object Detection,” Proc. European Conf. Computer Vision (ECCV 16), 2016, pp. 360−374.

      [5] K. Rajendran, A. Samraj, and M. Rajavel, "Gesture and Hand Activity Based Emergency Response Communication by Patients, Elderly and Disabled While Using Data Gloves," in Proc. 4th Int. Conf. Intelligent Systems Modelling & Simulation (ISMS), 2013, pp. 264-269.

      [6] V. Naresh, B. Venkataramani, A. Karan, and J. Manikandan, "PSoC based isolated speech recognition system," IEEE International Conference on Communication and Signal Processing, pp. 693-697, April 3-5, 2013, India.

      [7] Y. Yin, H. Yu, B. Chu, and Y. Xiao, "A sensor array optimization method of electronic nose based on elimination transform of Wilks statistic for discrimination of three kinds of vinegars," J. Food Eng., 2013.

      [8] D. Guo, D. Zhang, and L. Zhang, "An LDA based sensor selection approach used in breath analysis system," Sens. Actuators: B. Chem., vol. 157, pp. 265-274, 2011.

      [9] D. L. Edyburn, "Assistive technology and mild disabilities," Special Education Technology Practice, vol. 8, no. 4, pp. 18-28, 2006.

      [10] L. Chen, A. Mislove, and C. Wilson, "Peeking Beneath the Hood of Uber," 2015.

      [11] T. Gulzar, A. Singh, D. K. Rajoriya, and N. Farooq, "A Systematic Analysis of Automatic Speech Recognition: An Overview," International Journal of Current Engineering and Technology, vol. 4, no. 3, June 2014.

      [12] D. Mourtzis, M. Doukas, and C. Vandera, "Mobile apps for product customisation and design of manufacturing networks," Manufacturing Letters, vol. 2, no. 2, pp. 30-34, 2014.

      [13] M. M. Martins, C. P. Santos, A. Frizera-Neto, and R. Ceres, "Assistive mobility devices focusing on Smart Walkers: Classification and review," Robotics and Autonomous Systems, vol. 60, no. 4, pp. 548-562, 2012.

      [14] J. Nicholson, V. Kulyukin, and D. Coster, "ShopTalk: Independent Blind Shopping through Verbal Route Directions."


 

Article ID: 14843
 
DOI: 10.14419/ijet.v7i2.33.14843




Copyright © 2012-2015 Science Publishing Corporation Inc. All rights reserved.