
Vicarious: Context-aware Viewpoints Selection for Mixed Reality Collaboration

Published: 09 October 2023

Abstract

Mixed-perspective experiences, which combine egocentric (first-person) and exocentric (third-person) viewpoints, have been shown to improve collaboration in remote settings. Such experiences allow remote users to switch between viewpoints to gain alternative perspectives of the remote space. However, existing systems lack seamless selection of, and transition between, the multiple perspectives that best fit the task at hand. To address this, we present a new approach called Vicarious, which simplifies and automates the selection between egocentric and exocentric viewpoints. Vicarious employs a context-aware method for dynamically switching to, or highlighting, the optimal viewpoint based on user actions and the current context. To evaluate the effectiveness of the viewpoint selection method, we conducted a user study (n = 27) using an asymmetric AR-VR setup in which users performed remote collaboration tasks under four conditions: No-view, Manual, Guided, and Automatic selection. The results showed that Guided and Automatic viewpoint selection improved users' understanding of the task space and their task performance, and reduced cognitive load, compared to Manual or No-view selection. The results also suggest that the asymmetric setup had minimal impact on spatial and social presence, apart from differences in task load and preference. Based on these findings, we provide design implications for future research in mixed reality collaboration.

Supplemental Material

MP4 File
Vicarious: Context-aware Viewpoints Selection for Mixed Reality Collaboration


Cited By

  • (2024) EEG-driven eye blink-controlled smart interface for physically challenged. Universal Access in the Information Society. DOI: 10.1007/s10209-024-01182-3. Online publication date: 28-Nov-2024.


      Published In

      VRST '23: Proceedings of the 29th ACM Symposium on Virtual Reality Software and Technology
      October 2023
      542 pages
      ISBN:9798400703287
      DOI:10.1145/3611659
      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. 360-degree Panoramic Video
      2. Mixed Reality
3. Perspective Sharing
      4. Remote Collaboration
      5. Telepresence
      6. Viewpoint Sharing

      Qualifiers

      • Research-article
      • Research
      • Refereed limited

      Conference

      VRST 2023

      Acceptance Rates

Overall Acceptance Rate: 66 of 254 submissions, 26%

      Article Metrics

      • Downloads (Last 12 months): 211
      • Downloads (Last 6 weeks): 23
      Reflects downloads up to 12 Dec 2024

