
Multimodal Augmented Reality and Subtle Guidance for Industrial Assembly – A Survey and Ideation Method

  • Conference paper
Virtual, Augmented and Mixed Reality: Applications in Education, Aviation and Industry (HCII 2022)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13318)

  • The original version of this chapter was revised: the chapter contained a typographical error, “Quidance” instead of “Guidance”. The correction to this chapter is available at https://doi.org/10.1007/978-3-031-06015-1_26

Abstract

Industrial manual assembly is a relatively established use case for emerging head-mounted Augmented Reality (AR) platforms: operators get visual support in placing pieces depending on where they are in the assembly process. However, is vision the only suitable sensory modality for such guidance? We present a systematic review of previous work on multimodal guidance and subtle guidance approaches, confirming that explicit visual cues dominate. We then outline a three-step method for generating multisensory guidance ideas for real-world task support: (1) task observation, which in our case led to the identification of 18 steps in truss assembly; (2) brainstorming AR guidance approaches related to assembly and maintenance; and (3) mapping the brainstorming results onto the observed task. We illustrate the method by applying it to our ongoing work on AR guidance approaches for an industrial partner that designs and assembles wooden trusses. In this work, we go beyond standard visual AR guidance in two ways: 1) by opening up for guidance through auditory, tactile, and olfactory sensory channels, and 2) by considering subtle guidance as an alternative or complement to explicit information presentation. We present a resulting set of multisensory guidance ideas, each tied to one of the 18 steps in the observed truss assembly task. A few that we intend to investigate further: smell for gradual warning about non-imminent, potentially hazardous situations; 3D sound to guide operators to the locations of different tools; thermo-haptics for subtle notifications about contextual events (e.g., events happening at other assembly stations). The method helped us explore all modalities and identify new possibilities. More work is needed to understand how different modalities can be combined and how distractions in different modalities affect task performance.
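As a concrete illustration of step three of the method, the minimal sketch below shows one way the resulting idea set could be recorded: each observed assembly step paired with candidate guidance ideas, tagged by sensory modality and by whether the cue is subtle or explicit. The step numbers, step names, and data layout are our own assumptions for illustration; only the three example ideas are taken from the abstract.

```python
# Hypothetical sketch (not from the paper): one way to record the output
# of the three-step ideation method, pairing observed assembly steps with
# candidate guidance ideas per sensory modality. Step numbers and names
# are invented; only the three example ideas come from the abstract.
from dataclasses import dataclass, field

@dataclass
class GuidanceIdea:
    modality: str      # "visual", "auditory", "tactile", or "olfactory"
    subtle: bool       # subtle cue vs. explicit information presentation
    description: str

@dataclass
class AssemblyStep:
    number: int        # 1..18 in the observed truss-assembly task
    name: str          # invented label, for illustration only
    ideas: list = field(default_factory=list)

steps = [
    AssemblyStep(3, "fetch pressing tool", [
        GuidanceIdea("auditory", False,
                     "3D sound guiding the operator to the tool location")]),
    AssemblyStep(9, "position nail plates", [
        GuidanceIdea("olfactory", True,
                     "smell as gradual warning of non-imminent hazards")]),
    AssemblyStep(14, "move truss to buffer", [
        GuidanceIdea("tactile", True,
                     "thermo-haptic notification of events at other stations")]),
]

# Query the idea set: all subtle, non-visual guidance across the task.
for step in steps:
    for idea in step.ideas:
        if idea.subtle and idea.modality != "visual":
            print(f"step {step.number} ({step.name}): {idea.description}")
```

Keeping modality and subtlety as separate fields reflects the two dimensions the paper varies independently: the sensory channel used, and whether the cue is subtle or explicit.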


Change history

  • 23 August 2022

    In the version of this chapter that was originally published, the heading contained a typographical error. This has now been corrected.


Acknowledgments

The authors would like to thank Derome AB for valuable discussions and for the opportunity to carry out observations in their production facilities. This work is part of the project “Tillverka I Trä” (ID no. 20201948), funded by Region Västra Götaland and the Swedish Agency for Economic and Regional Growth.

Author information


Corresponding author

Correspondence to Nicole Tobisková.



Copyright information

© 2022 Springer Nature Switzerland AG

About this paper


Cite this paper

Tobisková, N., Malmsköld, L., Pederson, T. (2022). Multimodal Augmented Reality and Subtle Guidance for Industrial Assembly – A Survey and Ideation Method. In: Chen, J.Y.C., Fragomeni, G. (eds) Virtual, Augmented and Mixed Reality: Applications in Education, Aviation and Industry. HCII 2022. Lecture Notes in Computer Science, vol 13318. Springer, Cham. https://doi.org/10.1007/978-3-031-06015-1_23

  • DOI: https://doi.org/10.1007/978-3-031-06015-1_23

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-06014-4

  • Online ISBN: 978-3-031-06015-1

  • eBook Packages: Computer Science, Computer Science (R0)
