Research article
DOI: 10.1145/2785830.2785838

Gluey: Developing a Head-Worn Display Interface to Unify the Interaction Experience in Distributed Display Environments

Published: 24 August 2015

Abstract

Distributed display environments (DDEs) allow use of various specialized devices but challenge designers to provide a clean flow of data across multiple displays. Upcoming consumer-ready head-worn displays (HWDs) can play a central role in unifying the interaction experience in such ecosystems. In this paper, we report on the design and development of Gluey, a user interface that acts as a 'glue' to facilitate seamless input transitions and data movement across displays. Based on requirements we refine for such an interface, Gluey leverages inherent head-worn display attributes such as field-of-view tracking and an always-available canvas to redirect input and migrate content across multiple displays, while minimizing device-switching costs. We implemented a functional prototype integrating Gluey's numerous interaction possibilities. From our experience in this integration and from user evaluation results, we identify the open challenges in using HWDs to unify the interaction experience in DDEs.

Supplementary Material

serrano (p161-serrano-suppl.zip)
Supplemental movie and image files for Gluey: Developing a Head-Worn Display Interface to Unify the Interaction Experience in Distributed Display Environments




Published In

MobileHCI '15: Proceedings of the 17th International Conference on Human-Computer Interaction with Mobile Devices and Services
August 2015
611 pages
ISBN: 9781450336529
DOI: 10.1145/2785830

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States

Author Tags

1. Content migration
2. Distributed displays
3. Head-worn display
4. Input redirection
5. Multi-display environments

Qualifiers

• Research article
• Refereed limited

Conference

MobileHCI '15

Acceptance Rates

Overall acceptance rate: 202 of 906 submissions, 22%

Article Metrics

• Downloads (last 12 months): 65
• Downloads (last 6 weeks): 8

Reflects downloads up to 20 Dec 2024.

Cited By

• (2024) I've Got the Data in My Pocket! Exploring Interaction Techniques with Everyday Objects for Cross-Device Data Transfer. Proceedings of Mensch und Computer 2024, 242–255. DOI: 10.1145/3670653.3670778
• (2024) Investigating the Impact of Multiple View Layouts on Users' Visual Task Performance in Extended Reality. Proceedings of the 2024 International Conference on Advanced Visual Interfaces, 1–3. DOI: 10.1145/3656650.3656756
• (2024) SwitchSpace: Understanding Context-Aware Peeking Between VR and Desktop Interfaces. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1–16. DOI: 10.1145/3613904.3642358
• (2023) A Design Space of Multi-Display Spatial Interactions for Visualization Tasks. Proceedings of the 2023 ACM Symposium on Spatial User Interaction, 1–13. DOI: 10.1145/3607822.3614516
• (2023) Cross-Device Shortcuts: Seamless Attention-guided Content Transfer via Opportunistic Deep Links between Apps and Devices. Proceedings of the 25th International Conference on Multimodal Interaction, 125–134. DOI: 10.1145/3577190.3614145
• (2023) InDe: An Inline Data Deduplication Approach via Adaptive Detection of Valid Container Utilization. ACM Transactions on Storage 19(1), 1–27. DOI: 10.1145/3568426
• (2023) End-to-end I/O Monitoring on Leading Supercomputers. ACM Transactions on Storage 19(1), 1–35. DOI: 10.1145/3568425
• (2023) A Human-in-the-Loop Segmented Mixed-Effects Modeling Method for Analyzing Wearables Data. ACM Transactions on Management Information Systems 14(2), 1–17. DOI: 10.1145/3564276
• (2023) WebJump: AR-facilitated Distributed Display of Web Pages. Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, 1–6. DOI: 10.1145/3544549.3585669
• (2023) User-Driven Constraints for Layout Optimisation in Augmented Reality. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1–16. DOI: 10.1145/3544548.3580873
