Eye Gaze Relevance Feedback Indicators for Information Retrieval

Full Text (PDF, 461KB), PP.57-65



Stephen Akuma 1,*

1. Department of Mathematics and Computer Science, Benue State University, Makurdi, PMB102119, Nigeria

* Corresponding author.

DOI: https://doi.org/10.5815/ijisa.2022.01.05

Received: 3 Jun. 2021 / Revised: 11 Aug. 2021 / Accepted: 16 Oct. 2021 / Published: 8 Feb. 2022

Index Terms

Eye gaze, information retrieval, user experiment, implicit feedback, eye tracking, intelligent system


Abstract

There is growing interest in research on interactive information retrieval, particularly in the study of eye gaze-enhanced interaction. Feedback generated from users' gaze features is important for developing an interactive information retrieval system, and extracting these features has become easier with advances in eye-tracking technology over the years. In this work, eye movement was examined as a source of relevance feedback. A controlled user experiment was carried out in which participants read a set of documents in front of an eye tracker and rated each document according to how relevant it was to a given task. Gaze features such as fixation duration, fixation count and heat maps were captured. The results showed a medium linear relationship between fixation count and users' explicit ratings. In further analysis, three classifiers were compared on the task of predicting document relevance from gaze features, and the J48 decision tree classifier produced the highest accuracy.
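The analysis described above can be sketched in code. The snippet below uses synthetic data (the paper's dataset is not available here), and scikit-learn's `DecisionTreeClassifier` stands in for Weka's J48; feature names and thresholds are illustrative assumptions, not the paper's actual values.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Synthetic per-document gaze features: fixation count and total
# fixation duration (seconds). The binary relevance label is loosely
# tied to fixation count to mimic a medium linear relationship.
fixation_count = rng.integers(5, 120, size=200)
fixation_duration = fixation_count * rng.uniform(0.15, 0.35, size=200)
relevant = (fixation_count + rng.normal(0, 25, size=200) > 60).astype(int)

# Pearson correlation between fixation count and the relevance signal
r = np.corrcoef(fixation_count.astype(float), relevant)[0, 1]

# Decision tree predicting relevance from the two gaze features
# (CART here, as a stand-in for the J48/C4.5 algorithm used in the paper)
X = np.column_stack([fixation_count, fixation_duration])
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, relevant)
accuracy = clf.score(X, relevant)
```

In practice the features would come from eye-tracker logs aggregated per document, and classifier accuracy would be estimated with cross-validation rather than on the training set as done here for brevity.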

Cite This Paper

Stephen Akuma, "Eye Gaze Relevance Feedback Indicators for Information Retrieval", International Journal of Intelligent Systems and Applications(IJISA), Vol.14, No.1, pp.57-65, 2022. DOI: 10.5815/ijisa.2022.01.05

