DOI: 10.1145/3631700.3665195
short-paper
Open access

Assessing Human Visual Attention in Retail Human-Robot Interaction: A YOLOv8-Nano and Eye-Tracking Approach

Published: 28 June 2024

Abstract

Objectives: This research examines the dynamics of human-robot interaction (HRI) in retail environments, focusing on robot detection in videos captured with an eye-tracking system. Methods: The study employs the YOLOv8-nano model for real-time robot detection during grocery shopping tasks. All videos were processed with the YOLOv8 model to test inference speed, and the eye-tracking data were analyzed as a case study. Results: The YOLOv8 model demonstrated high precision in robot detection, achieving a mean average precision (mAP) of approximately 97.3% for Intersection over Union (IoU), 100% precision, and 99.87% recall for box detection. Its ability to process an average of 160.36 frames per second (FPS) confirmed its suitability for real-time applications. In the case study on the impact of a robot’s presence on human eye movements, the robot’s presence contributed to greater consistency in gaze fixation behavior, potentially leading to more predictable patterns of visual attention. Conclusion: These findings contribute to the design of safer and more efficient cobot systems and provide a deeper understanding of human responses in real-world scenarios, which is crucial for developing effective HRI systems.
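The detection metrics reported above (mAP over IoU, precision, recall) follow standard object-detection definitions. As an illustrative, hedged sketch (not the authors' evaluation code), the core quantities can be computed as follows for axis-aligned boxes in (x1, y1, x2, y2) format; the function names are hypothetical.

```python
def iou(box_a, box_b):
    """Intersection over Union of two axis-aligned boxes (x1, y1, x2, y2)."""
    # Coordinates of the intersection rectangle.
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    # Clamp to zero when the boxes do not overlap.
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0


def precision_recall(tp, fp, fn):
    """Box-level precision and recall from true/false positive and false negative counts."""
    precision = tp / (tp + fp) if (tp + fp) > 0 else 0.0
    recall = tp / (tp + fn) if (tp + fn) > 0 else 0.0
    return precision, recall
```

A predicted box is typically counted as a true positive when its IoU with a ground-truth box exceeds a threshold (commonly 0.5); mAP then averages precision over recall levels and, in COCO-style evaluation, over several IoU thresholds.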



Published In

UMAP Adjunct '24: Adjunct Proceedings of the 32nd ACM Conference on User Modeling, Adaptation and Personalization
June 2024
662 pages
ISBN:9798400704666
DOI:10.1145/3631700
This work is licensed under a Creative Commons Attribution International 4.0 License.

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Applied Computing
  2. Human Robot Interaction
  3. Object Detection
  4. Physiological Patterns

Qualifiers

  • Short-paper
  • Research
  • Refereed limited

Funding Sources

  • National Science Foundation (NSF)

Conference

UMAP '24

Acceptance Rates

Overall Acceptance Rate 162 of 633 submissions, 26%
