CHI Conference Proceedings · Research Article
DOI: 10.1145/3313831.3376196

Multimodal Gaze Interaction for Creative Design

Published: 23 April 2020

Abstract

We present a new application ("Sakura") that enables people with physical impairments to produce creative visual design work using a multimodal gaze approach. The system integrates multiple features tailored for gaze interaction including the selection of design artefacts via a novel grid approach, control methods for manipulating canvas objects, creative typography, a new color selection approach, and a customizable guide technique facilitating the alignment of design elements. A user evaluation (N=24) found that non-disabled users were able to utilize the application to complete common design activities and that they rated the system positively in terms of usability. A follow-up study with physically impaired participants (N=6) demonstrated they were able to control the system when working towards a website design, rating the application as having a good level of usability. Our research highlights new directions in making creative activities more accessible for people with physical impairments.
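
The abstract summarizes rather than specifies the techniques, but a rough, hypothetical Python sketch may help illustrate how grid-based gaze selection with dwell refinement can work in principle: the canvas is divided into a coarse grid, and the active region zooms into whichever cell the gaze rests on until it is small enough to map to a single design artefact. The class names, grid dimensions, and dwell threshold below are illustrative assumptions, not the Sakura implementation.

import time
from dataclasses import dataclass


@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def cell(self, rows: int, cols: int, row: int, col: int) -> "Rect":
        # Sub-rectangle at (row, col) of a rows x cols grid over this rect.
        cw, ch = self.w / cols, self.h / rows
        return Rect(self.x + col * cw, self.y + row * ch, cw, ch)


class GridSelector:
    """Refine a selection region by zooming into the grid cell the user
    dwells on, until the region is small enough to pick one artefact.
    (Hypothetical sketch; parameters are assumptions, not the paper's values.)"""

    def __init__(self, canvas: Rect, rows: int = 3, cols: int = 3,
                 dwell_s: float = 0.8, min_size: float = 40.0):
        self.region = canvas
        self.rows, self.cols = rows, cols
        self.dwell_s = dwell_s        # seconds the gaze must rest in one cell
        self.min_size = min_size      # stop refining below this cell size (px)
        self._current_cell = None
        self._dwell_start = 0.0

    def _cell_at(self, gx: float, gy: float):
        # Map a gaze point to a (row, col) cell of the current region, if inside it.
        col = int((gx - self.region.x) / self.region.w * self.cols)
        row = int((gy - self.region.y) / self.region.h * self.rows)
        return (row, col) if 0 <= row < self.rows and 0 <= col < self.cols else None

    def update(self, gx: float, gy: float):
        # Feed one gaze sample; returns the final region once selection completes.
        cell = self._cell_at(gx, gy)
        if cell != self._current_cell:
            self._current_cell, self._dwell_start = cell, time.monotonic()
            return None
        if cell is None or time.monotonic() - self._dwell_start < self.dwell_s:
            return None
        # Dwell threshold reached: zoom into the dwelled-on cell, or finish.
        self.region = self.region.cell(self.rows, self.cols, *cell)
        self._current_cell = None
        return self.region if max(self.region.w, self.region.h) <= self.min_size else None

In practice a loop like this would be fed by the eye tracker's gaze or fixation stream and rendered as an on-screen grid overlay; the dwell time and grid density are the parameters a user would most likely need to tune for comfort and accuracy.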

Supplementary Material

MP4 File (a69-creed-presentation.mp4)





    Published In

    CHI '20: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems
    April 2020, 10688 pages
    ISBN: 9781450367080
    DOI: 10.1145/3313831


    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 23 April 2020


    Author Tags

    1. eye gaze design
    2. eye gaze tracking
    3. gaze interaction
    4. interface design

    Qualifiers

    • Research-article

    Conference

    CHI '20

    Acceptance Rates

    Overall Acceptance Rate 6,199 of 26,314 submissions, 24%

    Upcoming Conference

    CHI 2025
    ACM CHI Conference on Human Factors in Computing Systems
    April 26 - May 1, 2025
    Yokohama , Japan


