
Partially-indirect Bimanual Input with Gaze, Pen, and Touch for Pan, Zoom, and Ink Interaction

Published: 07 May 2016

Abstract

Bimanual pen and touch UIs are mainly based on the direct manipulation paradigm. Alternatively, we propose partially-indirect bimanual input, where direct pen input is used with the dominant hand and indirect-touch input with the non-dominant hand. As direct and indirect inputs do not overlap, users can interact in the same space without interference. We investigate two indirect-touch techniques combined with direct pen input: the first redirects touches to the user's gaze position, and the second redirects touches to the pen position. In this paper, we present an empirical user study comparing both partially-indirect techniques to direct pen and touch input in bimanual pan, zoom, and ink tasks. Our experimental results show that users are comparably fast with the indirect techniques, but more accurate, as they can dynamically change the zoom target during indirect zoom gestures. Further, our studies reveal that direct and indirect zoom gestures have distinct characteristics regarding spatial use, gestural use, and bimanual parallelism.
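
The two techniques hinge on a single mechanism: touch gestures from the non-dominant hand keep their motion (pinch scale factor, drag delta), but their point of effect is redirected away from the fingers, either to the user's gaze position or to the pen tip, while the dominant hand inks directly with the pen. The sketch below is a minimal, hypothetical Python illustration of that mechanism, not the authors' implementation; the names (Viewport, indirect_pinch) and the per-update structure are assumptions made for clarity.

from dataclasses import dataclass
from typing import Tuple

Point = Tuple[float, float]

@dataclass
class Viewport:
    """Simple pannable/zoomable canvas state (hypothetical example class)."""
    offset_x: float = 0.0
    offset_y: float = 0.0
    scale: float = 1.0

    def zoom_about(self, pivot: Point, factor: float) -> None:
        """Zoom by `factor`, keeping the canvas point under `pivot` fixed on screen."""
        px, py = pivot
        # Canvas coordinates of the pivot before zooming.
        cx = (px - self.offset_x) / self.scale
        cy = (py - self.offset_y) / self.scale
        self.scale *= factor
        # Re-anchor so the same canvas point stays under the pivot.
        self.offset_x = px - cx * self.scale
        self.offset_y = py - cy * self.scale

    def pan(self, dx: float, dy: float) -> None:
        self.offset_x += dx
        self.offset_y += dy


def indirect_pinch(viewport: Viewport, pinch_factor: float,
                   pan_delta: Point, target: Point) -> None:
    """
    Partially-indirect zoom sketch: the scale factor and pan delta come from
    the non-dominant hand's fingers (wherever they rest on the surface), but
    the zoom pivot is redirected to `target` -- the current gaze estimate
    (gaze-redirected variant) or the pen-tip position (pen-redirected variant).
    """
    viewport.zoom_about(target, pinch_factor)
    viewport.pan(*pan_delta)


# Usage: fingers pinch near the screen edge, yet zooming is centred on the
# gaze point (or pen tip), which can move mid-gesture to retarget the zoom.
vp = Viewport()
gaze_point = (640.0, 360.0)  # hypothetical gaze estimate in screen pixels
indirect_pinch(vp, pinch_factor=1.2, pan_delta=(0.0, 0.0), target=gaze_point)

Because the redirection target is re-read while the gesture is in progress, moving the eyes (or the pen) mid-pinch retargets the zoom; this is the behaviour the abstract contrasts with direct pinch-to-zoom, where the pivot stays between the fingers.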

Supplementary Material

Supplemental video: suppl.mov (pn939.mp4)
MP4 File: p2845-pfeuffer.mp4





    Published In

    CHI '16: Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems
    May 2016
    6108 pages
    ISBN:9781450333627
    DOI:10.1145/2858036

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. bimanual input
    2. direct and indirect input
    3. gaze
    4. pan and zoom
    5. pen and touch

    Qualifiers

    • Research-article

    Funding Sources

    • Google Faculty Research award

    Conference

CHI '16: CHI Conference on Human Factors in Computing Systems
May 7 - 12, 2016
San Jose, California, USA

    Acceptance Rates

CHI '16 Paper Acceptance Rate: 565 of 2,435 submissions, 23%
Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%



