The Application of Eye-Tracking in Consumer Behaviour

 
 
 
  • Abstract


    Eye tracking is an important technology used to identify an individual's interest by recording and analysing his or her eye movements. The identified attention and interest can then be used in various applications such as marketing and education. In this research, we utilise this technology and apply it to a consumer-behaviour application, namely retail item display and shelf organisation. The gaze-point data obtained from eye trackers are analysed, and consumer interest is discussed based on that analysis. The analyses show that human attention can be attracted by adding some irregularity in colour, shape, and size to the scene.
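    A common way to analyse gaze-point data of the kind described above is to define areas of interest (AOIs) over the shelf items and count how many gaze samples fall inside each one. The sketch below is purely illustrative and is not taken from the paper: the AOI names, coordinates, and gaze samples are hypothetical, and it uses simple sample counts rather than any particular fixation-detection algorithm.

    ```python
    # Illustrative sketch (not the paper's method): ranking shelf items by
    # visual attention, using raw gaze points and rectangular AOIs.

    def attention_by_aoi(gaze_points, aois):
        """Count gaze samples falling inside each area of interest.

        gaze_points: iterable of (x, y) screen coordinates.
        aois: dict mapping item name -> (x_min, y_min, x_max, y_max).
        Returns a list of (item, sample_count) sorted by attention, descending.
        """
        counts = {name: 0 for name in aois}
        for x, y in gaze_points:
            for name, (x0, y0, x1, y1) in aois.items():
                if x0 <= x <= x1 and y0 <= y <= y1:
                    counts[name] += 1
        return sorted(counts.items(), key=lambda kv: kv[1], reverse=True)

    # Hypothetical example: an item with an irregular colour draws more samples.
    aois = {
        "regular_item": (0, 0, 100, 100),
        "irregular_item": (120, 0, 220, 100),
    }
    gaze = [(130, 50), (150, 60), (160, 40), (20, 30), (140, 70)]
    print(attention_by_aoi(gaze, aois))
    # -> [('irregular_item', 4), ('regular_item', 1)]
    ```

    In practice the gaze stream would first be segmented into fixations (e.g. by a dispersion or velocity threshold) before AOI counting, but the counting step itself looks much like this.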

     

     


  • Keywords


    eye tracking, consumer behaviour, marketing, irregularity.


 


Article ID: 29469
 
DOI: 10.14419/ijet.v8i1.12.29469




Copyright © 2012-2015 Science Publishing Corporation Inc. All rights reserved.