Abstract
Remote guidance on physical tasks is a form of collaboration in which a remote helper guides a local worker in operating on a set of physical objects. It has many applications in industrial sectors such as remote maintenance, and how to support this type of remote collaboration has been researched for almost three decades. Although a range of modern computing tools and systems have been proposed, developed and used to support remote guidance in different application scenarios, it remains essential to provide communication cues in a shared visual space to establish common ground for effective communication and collaboration. In this paper, we conduct a selective review summarizing communication cues, the approaches that implement them, and their effects on augmented reality based remote guidance. We also discuss challenges and propose possible directions for future research and development.
Cite this article
Huang, W., Wakefield, M., Rasmussen, T.A. et al. A review on communication cues for augmented reality based remote guidance. J Multimodal User Interfaces 16, 239–256 (2022). https://doi.org/10.1007/s12193-022-00387-1