A novel audio based human interaction proof for visually challenged users
2018-03-18 https://doi.org/10.14419/ijet.v7i2.7.10599
Bots, CAPTCHA, Human Interaction Proof, Instrumental Music, Web Security
Abstract
CAPTCHAs are techniques for automatically telling human users and computer programs apart. They protect various kinds of online services against brute-force and denial-of-service attacks mounted by automated programs. Most CAPTCHAs consist of images containing distorted text. Unfortunately, such visual CAPTCHAs restrict access for the many visually impaired people who use the Web. Audio CAPTCHAs were created to address this accessibility problem, but the currently available audio CAPTCHAs have been broken with varying degrees of success by exploiting weaknesses in the techniques they rely on. Our system presents the user with an interface that plays an instrumental (non-vocal) song selected at random from a language of the user's choice. The user is then asked to type the name of the music composer, and the system decides whether the respondent is human by analysing the response. A user study was conducted to investigate the performance of the proposed mechanism.
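The abstract describes the challenge only at a high level. As a rough illustration of such a challenge/response loop, the Python sketch below issues a random instrumental clip for the user's chosen language and checks whether the typed composer matches the expected answer. The track catalogue, composer names, and normalisation rules are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of the challenge/verification flow described in the abstract.
# Catalogue contents, field names, and matching rules are hypothetical.
import random
import secrets
import unicodedata
from dataclasses import dataclass

@dataclass
class Track:
    clip_path: str   # path to a non-vocal (instrumental) audio clip
    composer: str    # expected answer
    language: str    # language/tradition the user selected

# Hypothetical track catalogue, keyed by the user's chosen language.
CATALOGUE = {
    "telugu": [Track("clips/t1.mp3", "Ilaiyaraaja", "telugu")],
    "hindi":  [Track("clips/h1.mp3", "A. R. Rahman", "hindi")],
}

PENDING = {}  # challenge_id -> Track, held server-side between the two steps

def issue_challenge(language: str) -> tuple[str, str]:
    """Pick a random instrumental clip in the chosen language and return
    (challenge_id, clip_path) so the client can play the audio."""
    track = random.choice(CATALOGUE[language])
    challenge_id = secrets.token_urlsafe(16)
    PENDING[challenge_id] = track
    return challenge_id, track.clip_path

def normalise(text: str) -> str:
    """Case-fold and strip accents/punctuation so minor typing
    differences do not penalise a human user."""
    text = unicodedata.normalize("NFKD", text)
    return "".join(c for c in text.casefold() if c.isalnum())

def verify(challenge_id: str, answer: str) -> bool:
    """Return True if the typed composer matches the expected one."""
    track = PENDING.pop(challenge_id, None)
    if track is None:
        return False
    return normalise(answer) == normalise(track.composer)
```

A real deployment would also need to expire pending challenges, rate-limit attempts, and tolerate common spelling variants of composer names; none of that is shown here.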
How to Cite
Korrapolu, R., Sai. N, M., & Rao.M, K. (2018). A novel audio based human interaction proof for visually challenged users. International Journal of Engineering & Technology, 7(2.7), 289-292. https://doi.org/10.14419/ijet.v7i2.7.10599
Received date: 2018-03-25
Accepted date: 2018-03-25
Published date: 2018-03-18