Abstract
Mobile robots are becoming increasingly common in everyday living environments, so it is important that people can easily interpret a robot's intentions. This is especially true when a robot drives down a crowded corridor: people in its vicinity need to understand which way it will go next. To explore which signals best convey a robot's intention to turn, we implemented three lighting schemes and evaluated them in an online experiment. We found that signals resembling automotive turn indicators also work best for logistic mobile robots. We further found that participants' opinions of these signaling methods were influenced by their demographic background (gender, age).
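The paper does not detail the robot's hardware, but an automotive-style turn indicator on an LED strip can be sketched as alternating lit/dark frames on the turning side. The strip length, frame representation, and function name below are illustrative assumptions, not the authors' implementation:

```python
# Hypothetical sketch of an automotive-style turn signal for a mobile
# robot's LED strip. Strip length, blink cadence, and the 0/1 frame
# encoding are assumptions for illustration only.

def blink_frames(n_leds: int, side: str, n_frames: int) -> list[list[int]]:
    """Generate on/off frames for a blinking directional indicator.

    Odd-numbered frames light the half of the strip on the turning
    side (like a car's indicator); even-numbered frames are dark.
    """
    half = n_leds // 2
    if side == "left":
        lit = set(range(half))
    elif side == "right":
        lit = set(range(half, n_leds))
    else:
        raise ValueError("side must be 'left' or 'right'")

    frames = []
    for t in range(n_frames):
        on = t % 2 == 1  # blink: dark frame, lit frame, dark frame, ...
        frames.append([1 if (on and i in lit) else 0 for i in range(n_leds)])
    return frames

# Example: a left-turn signal on an 8-LED strip over 4 frames.
for frame in blink_frames(n_leds=8, side="left", n_frames=4):
    print(frame)
```

A real controller would push each frame to the strip at a fixed interval (car indicators blink at roughly 1-2 Hz) rather than printing it.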
Notes
1. See a video of the implemented conditions here: https://youtu.be/J6jtDH6ZSuw.
Acknowledgements
This work was supported by the project Health-CAT, funded by the European Regional Development Fund.
Copyright information
© 2022 Springer Nature Switzerland AG
Cite this paper
Palinko, O., Ramirez, E.R., Krüger, N., Bodenhagen, L. (2022). Intention Understanding for Human-Aware Mobile Robots: Comparing Cues and the Effect of Demographics. In: Bouatouch, K., et al. Computer Vision, Imaging and Computer Graphics Theory and Applications. VISIGRAPP 2020. Communications in Computer and Information Science, vol 1474. Springer, Cham. https://doi.org/10.1007/978-3-030-94893-1_4
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-94892-4
Online ISBN: 978-3-030-94893-1
eBook Packages: Computer Science, Computer Science (R0)