DOI: 10.1145/3027063.3053174

Enhancing Zoom and Pan in Ultrasound Machines with a Multimodal Gaze-based Interface

Published: 06 May 2017

Abstract

We present the first iteration of a user-centred design that integrates an eye-gaze tracker with the ultrasound (US) machine interfaces used in routine diagnostic sonography. The aim is to speed up the sonographer's interaction with the machine and to offer a more ergonomic solution, while minimizing cognitive load and maintaining the quality of the produced images. In this iteration we target the pan and zoom functions. Field studies and observations led to two design alternatives, whose feasibility we evaluated with a functional prototype. Results from six sonographers provide evidence for the potential of a multimodal gaze-based interface for US machines, and have informed a second design iteration that combines the advantages of both gaze-based designs.
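The abstract describes gaze-supported pan and zoom but does not spell out the mechanics. As an illustration only (this is not the authors' published algorithm), a common building block for gaze-supported zooming is to anchor the zoom at the gaze point, so that the image feature the user is looking at stays fixed on screen as magnification changes. The function name and the coordinate convention (screen = offset + scale × image) below are assumptions made for this sketch:

```python
# Illustrative sketch of gaze-anchored zoom; not taken from the paper.
# Convention (assumed): a screen point is offset + scale * image_point.

def gaze_anchored_zoom(gaze_x, gaze_y, offset_x, offset_y, old_scale, new_scale):
    """Return the pan offset that keeps the image point currently under
    the gaze location (gaze_x, gaze_y) stationary when the zoom level
    changes from old_scale to new_scale."""
    # Image-space point currently under the gaze cursor.
    img_x = (gaze_x - offset_x) / old_scale
    img_y = (gaze_y - offset_y) / old_scale
    # Choose the new offset so the same image point maps back to the gaze.
    new_offset_x = gaze_x - new_scale * img_x
    new_offset_y = gaze_y - new_scale * img_y
    return new_offset_x, new_offset_y
```

In a gaze-based interface, `gaze_x`/`gaze_y` would come from the eye tracker's current fixation estimate, while the zoom step itself would be triggered by a second modality (e.g. a button or knob), matching the multimodal approach the abstract describes.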

Supplementary Material

suppl.mov (lbw0455.mp4)
Supplemental video


Cited By

  • (2019) Multimodal enactive interface: design principles grounded on cognitive neuroscientific basis. 2019 10th IEEE International Conference on Cognitive Infocommunications (CogInfoCom), 397-402. Online publication date: Oct 2019. DOI: 10.1109/CogInfoCom47531.2019.9089912
  • (2019) A novel gaze-supported multimodal human–computer interaction for ultrasound machines. International Journal of Computer Assisted Radiology and Surgery 14, 7 (2019), 1107-1115. Online publication date: 11 Apr 2019. DOI: 10.1007/s11548-019-01964-8


    Published In

CHI EA '17: Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems
May 2017, 3954 pages
ISBN: 9781450346566
DOI: 10.1145/3027063

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. eye tracking
    2. gaze input
    3. gaze-based interaction
    4. multimodal interaction
    5. pan
    6. sonography
    7. ultrasound machines
    8. zoom

    Qualifiers

    • Abstract

Conference

CHI '17

    Acceptance Rates

CHI EA '17 Paper Acceptance Rate: 1,000 of 5,000 submissions, 20%
Overall Acceptance Rate: 6,164 of 23,696 submissions, 26%


