AI-Powered Eye Tracking for Bias Detection in Online Course Reviews: A Udemy Case Study
Figure 1. (a): Research flow chart. (b): A detailed research roadmap derived from project development, setup, and execution.

Figure 2. (a–c): Focus score differences between negative and positive reviews, based on the video data analysis with heat maps selected on reviews. The heat map illustrates the areas that garnered the most attention, with attention evaluated frame by frame throughout the entire video. (d–f): Cognitive Demand score differences between negative and positive reviews, based on the video data analysis with fog maps selected on reviews. The fog map reveals areas not discernible to the human eye, recording cognitive demand frame by frame; consequently, the figure appears illegible.

Figure 3. (a): Total Attention-derived focus heat map of the negative (2-star) review category, based on the image data analysis. (b): Total Attention-derived heat map of the positive (5-star) review category, based on the image data analysis. The 'both' panel shows the AOIs selected for each review, which were needed to obtain more insightful findings.

Figure 4. Correlation matrix for the review view of the negative (2-star) review category from the image data analysis.
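Focus heat maps of the kind shown in Figures 2 and 3 are conventionally rendered by accumulating gaze points into a pixel grid and applying Gaussian smoothing. A minimal illustrative sketch follows; the function name, parameters, and fixation coordinates are hypothetical and do not reflect the commercial software's internals:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def focus_heatmap(fixations, width, height, sigma=30):
    """Accumulate (x, y) fixation points into a 2D grid and blur into a heat map."""
    grid = np.zeros((height, width))
    for x, y in fixations:
        if 0 <= x < width and 0 <= y < height:
            grid[int(y), int(x)] += 1  # one count per fixation sample
    # Gaussian blur spreads each count into a smooth attention "hot spot"
    return gaussian_filter(grid, sigma=sigma)

# Hypothetical fixation samples clustered near the top-left of a 640x480 frame
heat = focus_heatmap([(100, 50), (102, 52), (300, 200)], width=640, height=480)
```

Regions near the fixation cluster receive higher heat values than distant pixels, which is what the colour overlay in the figures visualises.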
Abstract
1. Introduction
1.1. Literature Review
1.2. Discrepancies in Data Labelling
1.3. Restricted Adoption of Cutting-Edge AI Technologies
1.4. Requirement for All-Encompassing Frameworks
1.5. Ethical and Methodological Considerations
1.6. Marketing and Advertising Applications of AI Eye Tracking
1.7. Constraints of AI Eye-Tracking Technology
2. Materials and Methods
2.1. Dynamic Testing: Experimental Design and Methodology
2.2. Static Testing: Experimental Design and Methodology
2.3. Dynamic and Static Research Methodology: Final Framework
3. Results
3.1. Preliminary Results from the Video Analysis
3.2. Statistical Differences from the Image Analysis
3.2.1. Total Attention
3.2.2. Engagement
3.2.3. Clarity
3.2.4. Cognitive Demand
3.2.5. Start and End Attentions
3.2.6. Time Spent and Percentage Seen
3.3. Correlations from the Image Analysis
| View | Metrics | Pearson's Correlation Score | p-Value |
|---|---|---|---|
| Overall | Total Attention and Engagement | 0.46 | 0.03 |
| | Total Attention and Cognitive Demand | 0.79 | <0.001 |
| | Total Attention and Start Attention | 0.86 | <0.001 |
| | Total Attention and End Attention | 0.94 | <0.001 |
| | Total Attention and Time Spent | 0.99 | <0.001 |
| | Total Attention and Percentage Seen | 0.995 | <0.001 |
| | Engagement and Clarity | 0.43 | 0.04 |
| | Engagement and Cognitive Demand | 0.78 | <0.001 |
| | Engagement and Time Spent | 0.43 | 0.04 |
| | Engagement and Percentage Seen | 0.43 | 0.04 |
| | Clarity and Start Attention | 0.5 | 0.02 |
| | Cognitive Demand and Start Attention | 0.65 | <0.001 |
| | Cognitive Demand and End Attention | 0.73 | <0.001 |
| | Cognitive Demand and Time Spent | 0.77 | <0.001 |
| | Cognitive Demand and Percentage Seen | 0.77 | <0.001 |
| | Start Attention and End Attention | 0.93 | <0.001 |
| | Start Attention and Time Spent | 0.84 | <0.001 |
| | Start Attention and Percentage Seen | 0.87 | <0.001 |
| | End Attention and Time Spent | 0.94 | <0.001 |
| | End Attention and Percentage Seen | 0.94 | <0.001 |
| | Time Spent and Percentage Seen | 0.99 | <0.001 |
| Review | Total Attention and Cognitive Demand | 0.55 | 0.02 |
| | Total Attention and Start Attention | 0.8 | <0.001 |
| | Total Attention and End Attention | 0.93 | <0.001 |
| | Total Attention and Time Spent | 0.99 | <0.001 |
| | Total Attention and Percentage Seen | 0.996 | <0.001 |
| | Engagement and Cognitive Demand | 0.57 | 0.02 |
| | Clarity and Cognitive Demand | 0.6 | 0.01 |
| | Cognitive Demand and End Attention | 0.6 | 0.01 |
| | Cognitive Demand and Time Spent | 0.59 | 0.01 |
| | Cognitive Demand and Percentage Seen | 0.55 | 0.02 |
| | Start Attention and End Attention | 0.91 | <0.001 |
| | Start Attention and Time Spent | 0.77 | <0.001 |
| | Start Attention and Percentage Seen | 0.81 | <0.001 |
| | End Attention and Time Spent | 0.92 | <0.001 |
| | End Attention and Percentage Seen | 0.92 | <0.001 |
| | Time Spent and Percentage Seen | 0.98 | <0.001 |
| Other | Total Attention and Engagement | 0.99 | <0.001 |
| | Total Attention and Cognitive Demand | 0.99 | <0.001 |
| | Total Attention and Start Attention | 0.98 | <0.001 |
| | Total Attention and End Attention | 0.99 | <0.001 |
| | Total Attention and Time Spent | 0.998 | <0.001 |
| | Total Attention and Percentage Seen | 0.999 | <0.001 |
| | Engagement and Cognitive Demand | 0.98 | <0.001 |
| | Engagement and Start Attention | 0.99 | <0.001 |
| | Engagement and Time Spent | 0.996 | <0.001 |
| | Engagement and Percentage Seen | 0.99 | <0.001 |
| | Cognitive Demand and Start Attention | 0.96 | 0.002 |
| | Cognitive Demand and End Attention | 0.99 | <0.001 |
| | Cognitive Demand and Time Spent | 0.99 | <0.001 |
| | Cognitive Demand and Percentage Seen | 0.99 | <0.001 |
| | Start Attention and End Attention | 0.99 | <0.001 |
| | Start Attention and Time Spent | 0.99 | <0.001 |
| | Start Attention and Percentage Seen | 0.98 | <0.001 |
| | End Attention and Time Spent | 0.996 | <0.001 |
| | End Attention and Percentage Seen | 0.99 | <0.001 |
| | Time Spent and Percentage Seen | 0.998 | <0.001 |
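Pearson correlations and p-values of the kind tabulated above can be computed with `scipy.stats.pearsonr`; the metric arrays below are hypothetical per-stimulus values for illustration only, not the study's data:

```python
from scipy.stats import pearsonr

# Hypothetical per-stimulus metric values (illustrative only)
total_attention = [12.1, 8.4, 15.0, 9.7, 11.2, 14.3]
time_spent      = [30.5, 21.0, 38.2, 24.1, 28.9, 36.0]

# Pearson's r measures linear association; p tests r against the null of r = 0
r, p = pearsonr(total_attention, time_spent)
print(f"r = {r:.2f}, p = {p:.4f}")
```

With the near-proportional values above, r is close to 1 with a very small p-value, mirroring the strong Total Attention–Time Spent correlations reported in the table.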
4. Discussion
5. Limitations
- Limited metrics for video data: the AI eye-tracking software can generate only two metrics ('Focus' and 'Cognitive Demand') for video data sets, restricting the depth of analysis possible for this format. For that reason, the video data could serve only as the basis for this study. This limitation led us to use image screenshots for a more comprehensive examination (adding metrics such as 'Engagement', 'Clarity', etc.), potentially missing some nuances of the user experience captured in video format.
- Inability to measure emotional valence: PREDICT cannot assess the impact of emotional valence on purchasing behaviours. This is a significant limitation, as research has shown that discrete emotions play a crucial role in consumer decision-making beyond simple positive or negative effects [55,56]. Valence, an event's intrinsic attractiveness or aversiveness, plays a significant role in these processes [29,30]; however, the complexity and discrete nature of emotions can influence purchasing behaviours in ways that valence alone does not fully capture.
- Lack of consideration for specific emotions: the study's focus on general metrics may overlook the influence of discrete emotions on consumer behaviour. Research has demonstrated that specific emotions, such as gratitude, regret, and disappointment, can significantly affect consumer judgments and behaviours in ways not fully captured by valence-based measures [57,58]. This suggests that marketers should target specific emotions rather than general valence to better predict and influence consumer behaviour [31,32].
- Absence of emotional priming effects: the study does not account for the potential influence of emotional priming on purchase decisions. Positive emotional primes can increase purchase intentions, while negative primes can decrease them, suggesting that emotional context and priming can be powerful tools in marketing strategies [33]. These limitations highlight the need for future research to address these gaps and expand on the current findings. Incorporating multi-method approaches, considering discrete emotions, and conducting longitudinal studies for static as well as dynamic testing could provide a more comprehensive understanding of consumer behaviour in digital environments. While future researchers are encouraged to expand this scope, the metrics used here laid solid foundations for the research question, given that the AI eye-tracking tool is trained on a data set of 180,000 participants, with prerecorded eye-tracking and EEG studies at Stanford University.
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
AOI | Areas of interest |
ADT | Accessibility–diagnosticity theory |
AI | Artificial intelligence |
EEG | Electroencephalography |
PREDICT | AI eye-tracking consumer behaviour prediction software |
References
- Silva-Torres, J.-J.; Martínez-Martínez, L.; Cuesta-Cambra, U. Diseño de un modelo de atención visual para campañas de comunicación. El caso de la COVID-19. Prof. Inf. 2020, 29, e290627. [Google Scholar] [CrossRef]
- Stracke, C.M.; Sharma, R.C.; Bozkurt, A.; Burgos, D.; Cassafieres, C.S.; dos Santos, A.I.; Mason, J.; Ossiannilsson, E.; Santos-Hermosa, G.; Shon, J.G.; et al. Impact of COVID-19 on Formal Education: An International Review of Practices and Potentials of Open Education at a Distance. Int. Rev. Res. Open Distrib. Learn. 2022, 23, 1–18. [Google Scholar] [CrossRef]
- Lee, P.-C.; Liang, L.-L.; Huang, M.-H.; Huang, C.-Y. A comparative study of positive and negative electronic word-of-mouth on the SERVQUAL scale during the COVID-19 epidemic—Taking a regional teaching hospital in Taiwan as an example. BMC Health Serv. Res. 2022, 22, 1568. [Google Scholar] [CrossRef] [PubMed]
- Ho, N.T.T.; Pham, H.-H.; Sivapalan, S.; Dinh, V.-H. The adoption of blended learning using Coursera MOOCs: A case study in a Vietnamese higher education institution. Australas. J. Educ. Technol. 2022, 38, 121–138. [Google Scholar] [CrossRef]
- Dong, W.; Liu, Y.; Zhu, Z.; Cao, X. The Impact of Ambivalent Attitudes on the Helpfulness of Web-Based Reviews: Secondary Analysis of Data From a Large Physician Review Website. J. Med. Internet Res. 2023, 25, e38306. [Google Scholar] [CrossRef]
- Merle, A.; St-Onge, A.; Sénécal, S. Does it pay to be honest? The effect of retailer-provided negative feedback on consumers’ product choice and shopping experience. J. Bus. Res. 2022, 147, 532–543. [Google Scholar] [CrossRef]
- Ai, J.; Gursoy, D.; Liu, Y.; Lv, X. Effects of offering incentives for reviews on trust: Role of review quality and incentive source. Int. J. Hosp. Manag. 2022, 100, 103101. [Google Scholar] [CrossRef]
- Zhu, Q.; Lo, L.Y.-H.; Xia, M.; Chen, Z.; Ma, X. Bias-Aware Design for Informed Decisions: Raising Awareness of Self-Selection Bias in User Ratings and Reviews. Proc. ACM Hum. Comput. Interact. 2022, 6, 1–31. [Google Scholar] [CrossRef]
- Bilal, M.; Almazroi, A.A. Effectiveness of Fine-tuned BERT Model in Classification of Helpful and Unhelpful Online Customer Reviews. Electron. Commer. Res. 2022, 23, 2737–2757. [Google Scholar] [CrossRef]
- Kastrati, Z.; Imran, A.S.; Kurti, A. Weakly Supervised Framework for Aspect-Based Sentiment Analysis on Students’ Reviews of MOOCs. IEEE Access 2020, 8, 106799–106810. [Google Scholar] [CrossRef]
- Campos, J.D.S.; Campos, J.R. Evaluating The Impact of Online Product Review Credibility and Online Product Review Quality on Purchase Intention of Online Consumers. Appl. Quant. Anal. 2024, 4, 12–28. [Google Scholar] [CrossRef]
- Heesook, H.; Hye-Shin, K.; Sharron, L. The Effects of Perceived Quality and Usefulness of Consumer Reviews on Review Reading and Purchase Intention. J. Consum. Satisf. Dissatisf. Complain. Behav. 2019, 31, 1–19. [Google Scholar]
- Mahdi, A. Impact of Online Reviews on Consumer Purchase Decisions. Int. J. Financ. Adm. Econ. Sci. 2023, 2, 19–31. [Google Scholar] [CrossRef]
- Dipankar, D. Measurement of Trustworthiness of the Online Reviews. arXiv 2023, arXiv:2210.00815. [Google Scholar] [CrossRef]
- Putri, Y.A.; Lukitaningsih, A.; Fadhilah, M. Analisis online consumer reviews dan green product terhadap purchase decision melalui trust sebagai variabel intervening. J. Pendidik. Ekon. (JURKAMI) 2023, 8, 334–346. [Google Scholar] [CrossRef]
- Sharma, S.; Kumar, S. Insights into the Impact of Online Product Reviews on Consumer Purchasing Decisions: A Survey-based Analysis of Brands’ Response Strategies. Scholedge Int. J. Manag. Dev. 2023, 10, 1. [Google Scholar] [CrossRef]
- KMall, G.; Pandey, A.C.; Tiwari, A.S.; Chauhan, A.R.; Agarwal, D.A.; Asrani, K.A. E-Commerce customer behavior using machine learning. Int. J. Innov. Res. Comput. Sci. Technol. (IJIRCST) 2024, 12, 324–330. [Google Scholar] [CrossRef]
- Kumaran, T.E.; Lokesh, B.; Arunkumar, P.; Thirumeni, M. Forecasting Customer Attrition using Machine Learning. In Proceedings of the 2024 10th International Conference on Communication and Signal Processing (ICCSP), Melmaruvathur, India, 12–14 April 2024; IEEE: Piscataway, NJ, USA, 2024; pp. 801–806. [Google Scholar] [CrossRef]
- Liu, Z. Analysis of Key Economic Factors in Consumer Behavior and Purchase Decisions in Online Markets. Adv. Econ. Manag. Political Sci. 2024, 77, 26–32. [Google Scholar] [CrossRef]
- Liu, D.; Huang, H.; Zhang, H.; Luo, X.; Fan, Z. Enhancing customer behaviour prediction in e-commerce: A comparative analysis of machine learning and deep learning models. Appl. Comput. Eng. 2024, 55, 190–204. [Google Scholar] [CrossRef]
- Nuradina, K. Psychological factors affects online buying behaviour. J. Bus. Manag. Ina. 2022, 1, 112–123. [Google Scholar] [CrossRef]
- Pokhrel, L. Factor That Influence Online Consumer Buying Behavior with Reference to Nepalgunj city. Acad. Res. J. 2023, 2, 60–69. [Google Scholar] [CrossRef]
- Su, Y.; Zhao, L. Research on online education consumer choice behavior path based on informatization. China Commun. 2021, 18, 233–252. [Google Scholar] [CrossRef]
- Noor, N.M.; Thanakodi, S.; Fadzlah, A.F.A.; Wahab, N.A.; Talib, M.L.; Manimaran, K. Factors influencing online purchasing behaviour: A case study on Malaysian university students. AIP Conf. Proc. 2022, 2617, 060004. [Google Scholar] [CrossRef]
- Ayalew, M.; Zewdie, S. What Factors Determine the Online Consumer Behavior in This Digitalized World? A Systematic Literature. Hum. Behav. Emerg. Technol. 2022, 2022, 1–18. [Google Scholar] [CrossRef]
- Freya, Z.A.; Heike, K.S.; Christina, P.; Teresa, K.N.; Rinaldo, K. Measuring selective exposure to online information: Combining eye-tracking and content analysis of users’ actual search behaviour. In ZORA (Zurich Open Repository and Archive), 14th ed.; Halem: Köln, Germany, 2019; Available online: https://www.zora.uzh.ch/id/eprint/176070/ (accessed on 18 October 2024).
- Silva, B.B.; Orrego-Carmona, D.; Szarkowska, A. Using linear mixed models to analyse data from eye-tracking research on subtitling. Transl. Spaces 2022, 11, 60–88. [Google Scholar] [CrossRef]
- Sharova, T.; Bodyk, O.; Kravchenko, V.; Zemlianska, A.; Nisanoglu, N. Quantitative Analysis of MOOC for Language Training. Int. J. Inf. Educ. Technol. 2022, 12, 421–429. [Google Scholar] [CrossRef]
- Floh, A.; Koller, M.; Zauner, A. Taking a deeper look at online reviews: The asymmetric effect of valence intensity on shopping behaviour. J. Mark. Manag. 2013, 29, 646–670. [Google Scholar] [CrossRef]
- Yang, J.; Sarathy, R.; Walsh, S.M. Do review valence and review volume impact consumers’ purchase decisions as assumed? Nankai Bus. Rev. Int. 2016, 7, 231–257. [Google Scholar] [CrossRef]
- Kranzbühler, A.-M.; Zerres, A.; Kleijnen, M.H.P.; Verlegh, P.W.J. Beyond valence: A meta-analysis of discrete emotions in firm-customer encounters. J. Acad. Mark. Sci. 2020, 48, 478–498. [Google Scholar] [CrossRef]
- Zeelenberg, M.; Pieters, R. Beyond valence in customer dissatisfaction. J. Bus. Res. 2004, 57, 445–455. [Google Scholar] [CrossRef]
- Bello, E. Unravelling the Consumer Brain: The Role of Emotion in Purchase Behavior. Bachelor’s Thesis, William & Mary, Williamsburg, VA, USA, 2014. Available online: https://scholarworks.wm.edu/honorstheses/48 (accessed on 1 September 2024).
- Matzen, L.E.; Stites, M.C.; Gastelum, Z.N. Studying visual search without an eye tracker: An assessment of artificial foveation. Cogn. Res. Princ. Implic. 2021, 6, 45. [Google Scholar] [CrossRef] [PubMed]
- Šola, H.M.; Qureshi, F.H.; Khawaja, S. Predicting Behaviour Patterns in Online and PDF Magazines with AI Eye-Tracking. Behav. Sci. 2024, 14, 677. [Google Scholar] [CrossRef] [PubMed]
- Chen, T.; Samaranayake, P.; Cen, X.; Qi, M.; Lan, Y.-C. The Impact of Online Reviews on Consumers’ Purchasing Decisions: Evidence From an Eye-Tracking Study. Front. Psychol. 2022, 13, 865702. [Google Scholar] [CrossRef] [PubMed]
- Sun, R. Applications of Machine Learning Algorithms in Predicting User’s Purchasing Behavior. Sci. Technol. Eng. Chem. Environ. Prot. 2024, 1, 2–6. [Google Scholar] [CrossRef]
- Berger, J.; Sorensen, A.T.; Rasmussen, S.J. Positive Effects of Negative Publicity: When Negative Reviews Increase Sales. Mark. Sci. 2010, 29, 815–827. [Google Scholar] [CrossRef]
- Ramachandran, R.; Sudhir, S.; Unnithan, A.B. Exploring the relationship between emotionality and product star ratings in online reviews. IIMB Manag. Rev. 2021, 33, 299–308. [Google Scholar] [CrossRef]
- Qu, L.; Chau, P.Y.K. Nudge with interface designs of online product review systems—Effects of online product review system designs on purchase behaviour. Inf. Technol. People 2023, 36, 1555–1579. [Google Scholar] [CrossRef]
- Hernandez-Bocanegra, D.C.; Ziegler, J. Effects of Interactivity and Presentation on Review-Based Explanations for Recommendations. In Proceedings of the Human-Computer Interaction—INTERACT 2021, Bari, Italy, 30 August–3 September 2021; pp. 597–618. [Google Scholar] [CrossRef]
- Liu, R.; Ford, J.B.; Raajpoot, N. Theoretical investigation of the antecedent role of review valence in building electronic customer relationships. Int. J. Electron. Cust. Relatsh. Manag. 2022, 13, 187. [Google Scholar] [CrossRef]
- Du, X.; Zhao, Z.; Cui, X. The Effect of Review Valence, New Product Types and Regulatory Focus on New Product Online Review Usefulness. Acta Psychol. Sin. 2015, 47, 555. [Google Scholar] [CrossRef]
- Li, Y.; Geng, L.; Chang, Y.; Ning, P. Research online and purchase offline: The disruptive impact of consumers’ online information on offline sales interaction. Psychol. Mark. 2023, 40, 2642–2652. [Google Scholar] [CrossRef]
- Meftah, M.; Ounacer, S.; Azzouazi, M. Enhancing Customer Engagement in Loyalty Programs Through AI-Powered Market Basket Prediction Using Machine Learning Algorithms. In Engineering Applications of Artificial Intelligence; Springer: Cham, Switzerland, 2024; pp. 319–338. [Google Scholar] [CrossRef]
- Munde, A.; Kaur, J. Predictive Modelling of Customer Sustainable Jewelry Purchases Using Machine Learning Algorithms. Procedia Comput. Sci. 2024, 235, 683–700. [Google Scholar] [CrossRef]
- Su, X.; Niu, M. Too obvious to ignore: Influence of popular reviews on consumer online purchasing decisions. Hum. Syst. Manag. 2021, 40, 211–222. [Google Scholar] [CrossRef]
- Kassab, S.E.; Al-Eraky, M.; El-Sayed, W.; Hamdy, H.; Schmidt, H. Measurement of student engagement in health professions education: A review of literature. BMC Med. Educ. 2023, 23, 354. [Google Scholar] [CrossRef]
- Tonbuloglu, B. An Evaluation of the Use of Artificial Intelligence Applications in Online Education. J. Educ. Technol. Online Learn. 2023, 6, 866–884. [Google Scholar] [CrossRef]
- Shafique, R.; Aljedaani, W.; Rustam, F.; Lee, E.; Mehmood, A.; Choi, G.S. Role of Artificial Intelligence in Online Education: A Systematic Mapping Study. IEEE Access 2023, 11, 52570–52584. [Google Scholar] [CrossRef]
- Dogan, M.E.; Dogan, T.G.; Bozkurt, A. The Use of Artificial Intelligence (AI) in Online Learning and Distance Education Processes: A Systematic Review of Empirical Studies. Appl. Sci. 2023, 13, 3056. [Google Scholar] [CrossRef]
- Durso, S.D.O.; Arruda, E.P. Artificial intelligence in distance education: A systematic literature review of Brazilian studies. Probl. Educ. 21st Century 2022, 80, 679–692. [Google Scholar] [CrossRef]
- Šola, H.M.; Qureshi, F.H.; Khawaja, S. AI Eye-Tracking Technology: A New Era in Managing Cognitive Loads for Online Learners. Educ. Sci. 2024, 14, 933. [Google Scholar] [CrossRef]
- Mansor, A.A.; Isa, M.S. Development of Neuromarketing Model in Branding Service. In Proceedings of the 8th International Conference on Education and Information Management (ICEIM-2015), Penang, Malaysia, 16–17 May 2015; pp. 1–10. Available online: https://www.researchgate.net/publication/306396646 (accessed on 3 September 2024).
- Armengol-Urpi, A.; Salazar-Gómez, A.F.; Sarma, S.E. Brainwave-Augmented Eye Tracker: High-Frequency SSVEPs Improves Camera-Based Eye Tracking Accuracy. In Proceedings of the 27th International Conference on Intelligent User Interfaces, Helsinki, Finland, 22–25 March 2022; ACM: New York, NY, USA, 2022; pp. 258–276. [Google Scholar] [CrossRef]
- Lescroart, M.; Binaee, K.; Shankar, B.; Sinnott, C.; Hart, J.A.; Biswas, A.; Nudnou, I.; Balas, B.; Greene, M.R.; MacNeilage, P. Methodological limits on sampling visual experience with mobile eye tracking. J. Vis. 2022, 22, 3201. [Google Scholar] [CrossRef]
- Alateyyat, S.; Soltan, M. Utilizing Artificial Intelligence in Higher Education: A Systematic Review. In Proceedings of the 2024 ASU International Conference in Emerging Technologies for Sustainability and Intelligent Systems (ICETSIS), Manama, Bahrain, 28–29 January 2024; IEEE: Piscataway, NJ, USA, 2024; pp. 371–374. [Google Scholar] [CrossRef]
- Tlili, A.; Huang, R.; Mustafa, M.Y.; Zhao, J.; Bozkurt, A.; Xu, L.; Wang, H.; Salha, S.; Altinay, F.; Affouneh, S.; et al. Speaking of transparency: Are all Artificial Intelligence (AI) literature reviews in education transparent? J. Appl. Learn. Teach. 2023, 6, 45–49. [Google Scholar] [CrossRef]
| Review Category | Metrics | Results |
|---|---|---|
| Negative (1-star) Review | Engagement | 28 |
| (2-star) Review | Engagement | 29 |
| (3-star) Review | Engagement | 28 |
| (4-star) Review | Engagement | 26 |
| Positive (5-star) Review | Engagement | 30 |
| (2-star) Review | Clarity | 45 |
| (3-star) Review | Clarity | 57 |
| (4-star) Review | Clarity | 47 |
| Positive (5-star) Review | Clarity | 63 |
| Negative (1-star) Review | Cognitive Demand | 24 |
| (2-star) Review | Cognitive Demand | 25 |
| (3-star) Review | Cognitive Demand | 24 |
| (4-star) Review | Cognitive Demand | 24 |
| Positive (5-star) Review | Cognitive Demand | 24 |
Disclaimer/Publisher's Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Šola, H.M.; Qureshi, F.H.; Khawaja, S. AI-Powered Eye Tracking for Bias Detection in Online Course Reviews: A Udemy Case Study. Big Data Cogn. Comput. 2024, 8, 144. https://doi.org/10.3390/bdcc8110144