Abstract
Industrial manual assembly is a relatively established use case for emerging head-mounted Augmented Reality (AR) platforms: operators receive visual support for placing pieces depending on where they are in the assembly process. However, is vision the only suitable sensory modality for such guidance? We present a systematic review of previous work on multimodal guidance and subtle guidance approaches, confirming that explicit visual cues dominate. We then outline a three-step method for generating multisensory guidance ideas for real-world task support: (1) task observation, which identified 18 steps in truss assembly; (2) brainstorming of AR guidance approaches related to assembly and maintenance; and (3) mapping of the brainstorming results to the observed task. We illustrate the method by applying it to our ongoing work of producing AR guidance approaches for an industrial partner that designs and assembles wooden trusses. In this work, we go beyond standard visual AR guidance in two ways: (1) by opening up for guidance through the auditory, tactile, and olfactory sensory channels, and (2) by considering subtle guidance as an alternative or complement to explicit information presentation. We present the resulting set of multisensory guidance ideas, each tied to one of the 18 steps in the observed truss assembly task. Among those we intend to investigate further are: smell for gradual warning about potentially hazardous but non-imminent situations; 3D sound for guiding operators to the locations of different tools; and thermo-haptics for subtle notifications about contextual events (e.g., events happening at other assembly stations). The method helped us to explore all modalities and to identify new possibilities. More work is needed to understand how different modalities can be combined and how distractions in different modalities affect task performance.
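To make the 3D-sound idea concrete, the following is a minimal illustrative sketch, not taken from the paper: it maps an operator's head pose and a tool's position to a direction, and then to constant-power left/right gains for an audio cue. A deployed system would instead use HRTF-based spatialization in the AR platform's audio engine; all function names and coordinate conventions below are our own assumptions.

```python
import math


def azimuth_to_target(head_x, head_y, head_yaw_deg, target_x, target_y):
    """Signed angle (degrees) from the operator's facing direction to the target.

    Top-down plan view; yaw is measured counterclockwise from the +x axis.
    Positive azimuth means the target lies to the operator's left.
    """
    bearing = math.degrees(math.atan2(target_y - head_y, target_x - head_x))
    return (bearing - head_yaw_deg + 180.0) % 360.0 - 180.0


def stereo_pan(azimuth_deg):
    """Map an azimuth to constant-power (left, right) gains for a stereo cue."""
    a = max(-90.0, min(90.0, azimuth_deg))  # clamp to the frontal hemisphere
    theta = math.radians((90.0 - a) / 2.0)  # 0 (full left) .. 90 deg (full right)
    return math.cos(theta), math.sin(theta)


if __name__ == "__main__":
    # Operator at the origin facing +x; the needed tool is 2 m ahead, 2 m left.
    az = azimuth_to_target(0.0, 0.0, 0.0, 2.0, 2.0)  # ~ +45 deg
    left, right = stereo_pan(az)
    print(f"azimuth {az:+.1f} deg -> gains L={left:.2f} R={right:.2f}")
```

The constant-power pan keeps the summed energy of the two channels fixed (cos² + sin² = 1), so the cue's perceived loudness stays stable while its apparent direction tracks the tool as the operator turns.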
Change history
23 August 2022
In the originally published version of this chapter, the heading contained a typographical error. This has now been corrected.
Acknowledgments
The authors would like to thank Derome AB for valuable discussions and for the opportunity to conduct observations in their production facilities. This work was carried out within the project "Tillverka I Trä" (ID no. 20201948), funded by Region Västra Götaland and the Swedish Agency for Economic and Regional Growth.
Copyright information
© 2022 Springer Nature Switzerland AG
About this paper
Cite this paper
Tobisková, N., Malmsköld, L., Pederson, T. (2022). Multimodal Augmented Reality and Subtle Guidance for Industrial Assembly – A Survey and Ideation Method. In: Chen, J.Y.C., Fragomeni, G. (eds) Virtual, Augmented and Mixed Reality: Applications in Education, Aviation and Industry. HCII 2022. Lecture Notes in Computer Science, vol 13318. Springer, Cham. https://doi.org/10.1007/978-3-031-06015-1_23
DOI: https://doi.org/10.1007/978-3-031-06015-1_23
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-06014-4
Online ISBN: 978-3-031-06015-1