Research Article · DOI: 10.1145/2788940.2788949

Combining Direct and Indirect Touch Input for Interactive Workspaces using Gaze Input

Published: 08 August 2015

Abstract

Interactive workspaces combine horizontal and vertical touch surfaces into a single digital workspace. Explorations of these systems have shown that direct interaction on the vertical surface is cumbersome and less accurate than on the horizontal one. To overcome these problems, indirect touch systems turn the horizontal touch surface into an input device for manipulating objects on the vertical display. If the horizontal touch surface also acts as a display, however, the system must be told which screen is currently in use, which requires a mode switch. We investigate the use of gaze tracking to perform these mode switches. In three user studies, we compare absolute and relative gaze-augmented selection techniques with the traditional direct-touch approach. Our results show that our relative gaze-augmented selection technique outperforms the other techniques for simple tapping tasks alternating between horizontal and vertical surfaces, and for dragging on the vertical surface. When tasks involve dragging across surfaces, however, the findings are more complex. We provide a detailed description of the proposed interaction techniques, a statistical analysis of their performance, and a discussion of how they can be applied to systems that combine multiple horizontal and vertical touch surfaces.
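
The gaze-based mode switch summarized in the abstract can be illustrated with a minimal sketch: touches on the horizontal surface either act directly on it, or are remapped as indirect input to the vertical display, depending on where the eye tracker says the user is looking. This is purely illustrative; the names (`Touch`, `route_touch`) and the absolute mapping are assumptions for the sketch, not the paper's actual implementation.

```python
# Illustrative sketch only (not the paper's implementation): gaze decides
# whether a touch on the horizontal surface is direct input to that surface
# or indirect input mapped onto the vertical display.

from dataclasses import dataclass


@dataclass
class Touch:
    x: float  # normalized [0, 1] position on the horizontal surface
    y: float


def route_touch(touch: Touch, gaze_on_vertical: bool,
                vertical_w: int = 1920, vertical_h: int = 1080):
    """Return (target_display, coordinates) for a touch event.

    If the eye tracker reports that the user is looking at the vertical
    display, the touch is remapped (here: simple absolute mapping) onto
    that display's pixel space; otherwise it stays direct.
    """
    if gaze_on_vertical:
        # Indirect mode: map the horizontal surface onto the vertical display.
        return ("vertical", (touch.x * vertical_w, touch.y * vertical_h))
    # Direct mode: the touch acts where the finger lands.
    return ("horizontal", (touch.x, touch.y))
```

Note that the gaze signal only selects the target display (the mode); the touch surface still delivers all positional input, which is the division of labor the abstract describes.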

Supplementary Material

suppl.mov (sui0140-file3.mp4)
Supplemental video


Cited By

  • (2024) Exploiting Physical Referent Features as Input for Multidimensional Data Selection in Augmented Reality. ACM Transactions on Computer-Human Interaction 31(4), 1-40. DOI: 10.1145/3648613
  • (2023) Guiding Visual Attention on 2D Screens: Effects of Gaze Cues from Avatars and Humans. Proceedings of the 2023 ACM Symposium on Spatial User Interaction, 1-9. DOI: 10.1145/3607822.3614529
  • (2022) NotiBike: Assessing Target Selection Techniques for Cyclist Notifications in Augmented Reality. Proceedings of the ACM on Human-Computer Interaction 6(MHCI), 1-24. DOI: 10.1145/3546732

        Published In

        SUI '15: Proceedings of the 3rd ACM Symposium on Spatial User Interaction
        August 2015
        152 pages
        ISBN:9781450337038
        DOI:10.1145/2788940
        Publisher

        Association for Computing Machinery

        New York, NY, United States

        Badges

        • Honorable Mention

        Author Tags

        1. gaze-based interaction
        2. indirect touch
        3. interactive surfaces and tabletops
        4. tabletop interaction
        5. touch
        6. workspaces

        Qualifiers

        • Research-article

        Funding Sources

        • German B-IT Foundation

        Conference

        SUI '15: Symposium on Spatial User Interaction
        August 8-9, 2015
        Los Angeles, California, USA

        Acceptance Rates

        SUI '15 Paper Acceptance Rate: 17 of 48 submissions, 35%
        Overall Acceptance Rate: 86 of 279 submissions, 31%

        Article Metrics

        • Downloads (last 12 months): 41
        • Downloads (last 6 weeks): 1
        Reflects downloads up to 11 Jan 2025
