CHI Conference Proceedings · Research Article · DOI: 10.1145/3491102.3517719

Augmented Reality and Robotics: A Survey and Taxonomy for AR-enhanced Human-Robot Interaction and Robotic Interfaces

Published: 29 April 2022

Abstract

This paper contributes to a taxonomy of augmented reality and robotics based on a survey of 460 research papers. Augmented and mixed reality (AR/MR) have emerged as a new way to enhance human-robot interaction (HRI) and robotic interfaces (e.g., actuated and shape-changing interfaces). Recently, an increasing number of studies in HCI, HRI, and robotics have demonstrated how AR enables better interactions between people and robots. However, often research remains focused on individual explorations and key design strategies, and research questions are rarely analyzed systematically. In this paper, we synthesize and categorize this research field in the following dimensions: 1) approaches to augmenting reality; 2) characteristics of robots; 3) purposes and benefits; 4) classification of presented information; 5) design components and strategies for visual augmentation; 6) interaction techniques and modalities; 7) application domains; and 8) evaluation strategies. We formulate key challenges and opportunities to guide and inform future research in AR and robotics.
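The eight dimensions enumerated above can be read as a coding scheme for classifying surveyed papers. As a purely illustrative sketch (the field names paraphrase the abstract's dimensions and are not the authors' actual coding instrument), one might represent a single paper's classification like this:

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch: encoding the survey's eight taxonomy dimensions
# as a classification record. Names and example values are illustrative,
# not the authors' published coding scheme.
@dataclass
class ARRoboticsClassification:
    augmentation_approach: str   # 1) approach to augmenting reality (e.g., HMD, projection)
    robot_type: str              # 2) characteristics of the robot
    purpose: str                 # 3) purpose and benefit of the augmentation
    information_class: str       # 4) classification of presented information
    design_components: List[str] = field(default_factory=list)       # 5) visual design strategies
    interaction_modalities: List[str] = field(default_factory=list)  # 6) interaction techniques
    application_domain: str = ""    # 7) application domain
    evaluation_strategy: str = ""   # 8) evaluation strategy

# Example: a hypothetical projector-based robot-intent study
example = ARRoboticsClassification(
    augmentation_approach="projector-based spatial AR",
    robot_type="mobile service robot",
    purpose="communicate robot intent",
    information_class="motion intent",
    design_components=["projected directional arrows"],
    interaction_modalities=["gesture"],
    application_domain="domestic assistance",
    evaluation_strategy="controlled user study",
)
```

A record like this makes cross-paper comparison mechanical: grouping records by any one field recovers the per-dimension breakdowns the survey reports.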

Supplementary Material

MP4 File (3491102.3517719-talk-video.mp4)
Talk Video

References

[1]
2014. List of Physical Visualizations and Related Artifacts. Retrieved on January 5, 2022 from http://dataphys.org/list/
[2]
2015. The Mercedes-Benz F 015 luxury in motion. Retrieved on January 5, 2022 from https://www.mercedes-benz.com/en/innovation/autonomous/research-vehicle-f-015-luxury-in-motion/
[3]
2015. Microsoft Hololens Robot Demo at Build 2015. Retrieved on January 5, 2022 from https://www.youtube.com/watch?v=mSCrviBGTeQ
[4]
2016. Boeing: UAVs. Holograms. Wildfire. Retrieved on January 5, 2022 from https://www.youtube.com/watch?v=omGoz66xHU8
[5]
2017. Personal Fabrication Research in HCI and Graphics: An Overview of Related Work. Retrieved on January 5, 2022 from https://hcie.csail.mit.edu/fabpub/
[6]
2018. MorphUI. Retrieved on January 5, 2022 from http://morphui.com/
[7]
2019. Jaguar land rover lights up the road ahead for self-driving vehicles of the future. Retrieved on January 5, 2022 from https://media.jaguarlandrover.com/news/2019/01/jaguar-land-rover-lights-road-ahead-self-driving-vehicles-future
[8]
2020. Nintendo Mario Kart Live: Home Circuit. Retrieved on January 5, 2022 from https://mklive.nintendo.com/
[9]
Syed Mohsin Abbas, Syed Hassan, and Jongwon Yun. 2012. Augmented reality based teaching pendant for industrial robot. In 2012 12th International Conference on Control, Automation and Systems. IEEE, 2210–2213.
[10]
Jong-gil Ahn, Gerard J Kim, Hyemin Yeon, Eunja Hyun, and Kyoung Choi. 2013. Supporting augmented reality based children’s play with pro-cam robot: three user perspectives. In Proceedings of the 12th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and Its Applications in Industry. 17–24. https://doi.org/10.1145/2534329.2534342
[11]
Yuya Aikawa, Masayoshi Kanoh, Felix Jimenez, Mitsuhiro Hayase, Takahiro Tanaka, and Hitoshi Kanamori. 2018. Comparison of gesture inputs for robot system using mixed reality to encourage driving review. In 2018 Joint 10th International Conference on Soft Computing and Intelligent Systems (SCIS) and 19th International Symposium on Advanced Intelligent Systems (ISIS). IEEE, 62–66. https://doi.org/10.1109/scis-isis.2018.00020
[12]
Batu Akan, Afshin Ameri, Baran Cürüklü, and Lars Asplund. 2011. Intuitive industrial robot programming through incremental multimodal language and augmented reality. In 2011 IEEE International Conference on Robotics and Automation. IEEE, 3934–3939. https://doi.org/10.1109/icra.2011.5979887
[13]
Takintope Akinbiyi, Carol E Reiley, Sunipa Saha, Darius Burschka, Christopher J Hasser, David D Yuh, and Allison M Okamura. 2006. Dynamic augmented reality for sensory substitution in robot-assisted surgical systems. In 2006 International Conference of the IEEE Engineering in Medicine and Biology Society. IEEE, 567–570. https://doi.org/10.1109/iembs.2006.259707
[14]
Samer Al Moubayed, Jonas Beskow, Gabriel Skantze, and Björn Granström. 2012. Furhat: a back-projected human-like robot head for multiparty human-machine interaction. In Cognitive behavioural systems. Springer, 114–130. https://doi.org/10.1007/978-3-642-34584-5_9
[15]
Jacopo Aleotti, Giorgio Micconi, Stefano Caselli, Giacomo Benassi, Nicola Zambelli, Manuele Bettelli, and Andrea Zappettini. 2017. Detection of nuclear sources by UAV teleoperation using a visuo-haptic augmented reality interface. Sensors 17, 10 (2017), 2234. https://doi.org/10.3390/s17102234
[16]
Jason Alexander, Anne Roudaut, Jürgen Steimle, Kasper Hornbæk, Miguel Bruns Alonso, Sean Follmer, and Timothy Merritt. 2018. Grand challenges in shape-changing interface research. In Proceedings of the 2018 CHI conference on human factors in computing systems. 1–14. https://doi.org/10.1145/3173574.3173873
[17]
Omri Alon, Sharon Rabinovich, Chana Fyodorov, and Jessica R Cauchard. 2021. Drones in Firefighting: A User-Centered Design Perspective. In Proceedings of the 23rd International Conference on Mobile Human-Computer Interaction. 1–11. https://doi.org/10.1145/3447526.3472030
[18]
Malek Alrashidi, Ahmed Alzahrani, Michael Gardner, and Vic Callaghan. 2016. A pedagogical virtual machine for assembling mobile robot using augmented reality. In Proceedings of the 7th Augmented Human International Conference 2016. 1–2. https://doi.org/10.1145/2875194.2875229
[19]
Malek Alrashidi, Michael Gardner, and Vic Callaghan. 2017. Evaluating the use of pedagogical virtual machine with augmented reality to support learning embedded computing activity. In Proceedings of the 9th International Conference on Computer and Automation Engineering. 44–50. https://doi.org/10.1145/3057039.3057088
[20]
Alborz Amir-Khalili, Masoud S Nosrati, Jean-Marc Peyrat, Ghassan Hamarneh, and Rafeef Abugharbieh. 2013. Uncertainty-encoded augmented reality for robot-assisted partial nephrectomy: A phantom study. In Augmented Reality Environments for Medical Imaging and Computer-Assisted Interventions. Springer, 182–191. https://doi.org/10.1007/978-3-642-40843-4_20
[21]
Rasmus S Andersen, Ole Madsen, Thomas B Moeslund, and Heni Ben Amor. 2016. Projecting robot intentions into human environments. In 2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN). IEEE, 294–301. https://doi.org/10.1109/ROMAN.2016.7745145
[22]
Sean Andrist, Tomislav Pejsa, Bilge Mutlu, and Michael Gleicher. 2012. Designing effective gaze mechanisms for virtual agents. In Proceedings of the SIGCHI conference on Human factors in computing systems. ACM, 705–714. https://doi.org/10.1145/2207676.2207777
[23]
Takafumi Aoki, Takashi Matsushita, Yuichiro Iio, Hironori Mitake, Takashi Toyama, Shoichi Hasegawa, Rikiya Ayukawa, Hiroshi Ichikawa, Makoto Sato, Takatsugu Kuriyama, 2005. Kobito: virtual brownies. In ACM SIGGRAPH 2005 emerging technologies. 11–es. https://doi.org/10.1145/1187297.1187309
[24]
Dejanira Araiza-Illan, Alberto De San Bernabe, Fang Hongchao, and Leong Yong Shin. 2019. Augmented reality for quick and intuitive robotic packing re-programming. In 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI). IEEE, 664–664. https://doi.org/10.1109/hri.2019.8673327
[25]
Stephanie Arévalo Arboleda, Tim Dierks, Franziska Rücker, and Jens Gerken. 2020. There’s More than Meets the Eye: Enhancing Robot Control through Augmented Visual Cues. In Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction. 104–106. https://doi.org/10.1145/3371382.3378240
[26]
Stephanie Arévalo Arboleda, Tim Dierks, Franziska Rücker, and Jens Gerken. 2021. Exploring the Visual Space to Improve Depth Perception in Robot Teleoperation Using Augmented Reality: The Role of Distance and Target’s Pose in Time, Success, and Certainty. In IFIP Conference on Human-Computer Interaction. Springer, 522–543. https://doi.org/10.1007/978-3-030-85623-6_31
[27]
Stephanie Arevalo Arboleda, Franziska Rücker, Tim Dierks, and Jens Gerken. 2021. Assisting Manipulation and Grasping in Robot Teleoperation with Augmented Reality Visual Cues. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. 1–14. https://doi.org/10.1145/3411764.3445398
[28]
Michael Argyle and Mark Cook. 1976. Gaze and mutual gaze.(1976). https://doi.org/10.1017/S0007125000073980
[29]
Pasquale Arpaia, Carmela Bravaccio, Giuseppina Corrado, Luigi Duraccio, Nicola Moccaldi, and Silvia Rossi. 2020. Robotic Autism Rehabilitation by Wearable Brain-Computer Interface and Augmented Reality. In 2020 IEEE International Symposium on Medical Measurements and Applications (MeMeA). IEEE, 1–6. https://doi.org/10.1109/MeMeA49120.2020.9137144
[30]
Doris Aschenbrenner, Jonas SI Rieder, Daniëlle Van Tol, Joris Van Dam, Zoltan Rusak, Jan Olaf Blech, Mohammad Azangoo, Salo Panu, Karl Kruusamäe, Houman Masnavi, 2020. Mirrorlabs-creating accessible Digital Twins of robotic production environment with Mixed Reality. In 2020 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR). IEEE, 43–48. https://doi.org/10.1109/aivr50618.2020.00017
[31]
Doris Aschenbrenner, Michael Rojkov, Florian Leutert, Jouke Verlinden, Stephan Lukosch, Marc Erich Latoschik, and Klaus Schilling. 2018. Comparing different augmented reality support applications for cooperative repair of an industrial robot. In 2018 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct). IEEE, 69–74. https://doi.org/10.1109/ismar-adjunct.2018.00036
[32]
Ronald T Azuma. 1997. A survey of augmented reality. Presence: teleoperators & virtual environments 6, 4(1997), 355–385. https://doi.org/10.1162/pres.1997.6.4.355
[33]
Daniel Bambuŝek, Zdeněk Materna, Michal Kapinus, Vítězslav Beran, and Pavel Smrž. 2019. Combining interactive spatial augmented reality with head-mounted display for end-user collaborative robot programming. In 2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN). IEEE, 1–8. https://doi.org/10.1109/RO-MAN46459.2019.8956315
[34]
Kim Baraka, Ana Paiva, and Manuela Veloso. 2016. Expressive lights for revealing mobile service robot state. In Robot 2015: Second Iberian Robotics Conference. Springer, 107–119. https://doi.org/10.1007/978-3-319-27146-0_9
[35]
Zoltán Bárdosi, Christian Plattner, Yusuf Özbek, Thomas Hofmann, Srdjan Milosavljevic, Volker Schartinger, and Wolfgang Freysinger. 2020. CIGuide: in situ augmented reality laser guidance. International journal of computer assisted radiology and surgery 15, 1(2020), 49–57. https://doi.org/10.1007/s11548-019-02066-1
[36]
Patrick Baudisch, Stefanie Mueller, 2017. Personal fabrication. Foundations and Trends® in Human–Computer Interaction 10, 3–4(2017), 165–293. https://doi.org/10.1561/1100000055
[37]
Philipp Beckerle, Claudio Castellini, and Bigna Lenggenhager. 2019. Robotic interfaces for cognitive psychology and embodiment research: a research roadmap. Wiley Interdisciplinary Reviews: Cognitive Science 10, 2 (2019), e1486. https://doi.org/10.1002/wcs.1486
[38]
William Bentz, Sahib Dhanjal, and Dimitra Panagou. 2019. Unsupervised learning of assistive camera views by an aerial co-robot in augmented reality multitasking environments. In 2019 International Conference on Robotics and Automation (ICRA). IEEE, 3003–3009. https://doi.org/10.1109/icra.2019.8793587
[39]
Lorenzo Bianchi, Francesco Chessa, Andrea Angiolini, Laura Cercenelli, Simone Lodi, Barbara Bortolani, Enrico Molinaroli, Carlo Casablanca, Matteo Droghetti, Caterina Gaudiano, 2021. The use of augmented reality to guide the intraoperative frozen section during robot-assisted radical prostatectomy. European Urology 80, 4 (2021), 480–488. https://doi.org/10.1016/j.eururo.2021.06.020
[40]
Mark Billinghurst and Michael Nebeling. 2021. Rapid prototyping for XR. In SIGGRAPH Asia 2021 Courses. 1–178. https://doi.org/10.1145/3476117.3483444
[41]
Oliver Bimber and Ramesh Raskar. 2006. Modern approaches to augmented reality. In ACM SIGGRAPH 2006 Courses. 1–es. https://doi.org/10.1145/1185657.1185796
[42]
Sebastian Blankemeyer, Rolf Wiemann, Lukas Posniak, Christoph Pregizer, and Annika Raatz. 2018. Intuitive robot programming using augmented reality. Procedia CIRP 76(2018), 155–160. https://doi.org/10.1016/J.PROCIR.2018.02.028
[43]
Andrew Boateng and Yu Zhang. 2021. Virtual Shadow Rendering for Maintaining Situation Awareness in Proximal Human-Robot Teaming. In Companion of the 2021 ACM/IEEE International Conference on Human-Robot Interaction. 494–498. https://doi.org/10.1145/3434074.3447221
[44]
Gabriele Bolano, Christian Juelg, Arne Roennau, and Ruediger Dillmann. 2019. Transparent robot behavior using augmented reality in close human-robot interaction. In 2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN). IEEE, 1–7. https://doi.org/10.1109/ro-man46459.2019.8956296
[45]
Gabriele Bolano, Arne Roennau, and Ruediger Dillmann. 2020. Planning and Evaluation of Robotic Solutions in a Logistic Line Through Augmented Reality. In 2020 Fourth IEEE International Conference on Robotic Computing (IRC). IEEE, 422–423. https://doi.org/10.1109/irc.2020.00075
[46]
Jean Botev and Francisco J Rodríguez Lera. 2021. Immersive Robotic Telepresence for Remote Educational Scenarios. Sustainability 13, 9 (2021), 4717. https://doi.org/10.3390/SU13094717
[47]
Gustavo Caiza, Pablo Bonilla-Vasconez, Carlos A Garcia, and Marcelo V Garcia. 2020. Augmented Reality for Robot Control in Low-cost Automation Context and IoT. In 2020 25th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA), Vol. 1. IEEE, 1461–1464. https://doi.org/10.1109/etfa46521.2020.9212056
[48]
Davide Calandra, Alberto Cannavò, and Fabrizio Lamberti. 2021. Evaluating an Augmented Reality-Based Partially Assisted Approach to Remote Assistance in Heterogeneous Robotic Applications. In 2021 IEEE 7th International Conference on Virtual Reality (ICVR). IEEE, 380–387. https://doi.org/10.1109/icvr51878.2021.9483849
[49]
Daniel Calife, João Luiz Bernardes Jr, and Romero Tori. 2009. Robot Arena: An augmented reality platform for game development. Computers in Entertainment (CIE) 7, 1 (2009), 1–26. https://doi.org/10.1145/1486508.1486519
[50]
Laura Cancedda, Alberto Cannavò, Giuseppe Garofalo, Fabrizio Lamberti, Paolo Montuschi, and Gianluca Paravati. 2017. Mixed reality-based user interaction feedback for a hand-controlled interface targeted to robot teleoperation. In International Conference on Augmented Reality, Virtual Reality and Computer Graphics. Springer, 447–463. https://doi.org/10.1007/978-3-319-60928-7_38
[51]
Yuanzhi Cao, Tianyi Wang, Xun Qian, Pawan S Rao, Manav Wadhawan, Ke Huo, and Karthik Ramani. 2019. GhostAR: A time-space editor for embodied authoring of human-robot collaborative task with augmented reality. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology. 521–534. https://doi.org/10.1145/3332165.3347902
[52]
Yuanzhi Cao, Zhuangying Xu, Terrell Glenn, Ke Huo, and Karthik Ramani. 2018. Ani-Bot: A Modular Robotics System Supporting Creation, Tweaking, and Usage with Mixed-Reality Interactions. In Proceedings of the Twelfth International Conference on Tangible, Embedded, and Embodied Interaction. 419–428. https://doi.org/10.1145/3173225.3173226
[53]
Yuanzhi Cao, Zhuangying Xu, Fan Li, Wentao Zhong, Ke Huo, and Karthik Ramani. 2019. V. Ra: An in-situ visual authoring system for robot-IoT task planning with augmented reality. In Proceedings of the 2019 on Designing Interactive Systems Conference. 1059–1070. https://doi.org/10.1145/3322276.3322278
[54]
Irvin Steve Cardenas, Kaleb Powlison, and Jong-Hoon Kim. 2021. Reducing Cognitive Workload in Telepresence Lunar-Martian Environments Through Audiovisual Feedback in Augmented Reality. In Companion of the 2021 ACM/IEEE International Conference on Human-Robot Interaction. 463–466. https://doi.org/10.1145/3434074.3447214
[55]
Jon Carroll and Fabrizio Polo. 2013. Augmented reality gaming with sphero. In ACM Siggraph 2013 Mobile. 1–1. https://doi.org/10.1145/2503512.2503535
[56]
Giandomenico Caruso and Paolo Belluco. 2010. Robotic arm for car dashboard layout assessment in mixed reality environment. In 19th International Symposium in Robot and Human Interactive Communication. IEEE, 62–68. https://doi.org/10.1109/ROMAN.2010.5598685
[57]
Jessica Cauchard, Woody Gover, William Chen, Stephen Cartwright, and Ehud Sharlin. 2021. Drones in Wonderland–Disentangling Collocated Interaction Using Radical Form. IEEE Robotics and Automation Letters(2021). https://doi.org/10.1109/lra.2021.3103653
[58]
Jessica R Cauchard, Alex Tamkin, Cheng Yao Wang, Luke Vink, Michelle Park, Tommy Fang, and James A Landay. 2019. Drone. io: A gestural and visual interface for human-drone interaction. In 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI). IEEE, 153–162. https://doi.org/10.1109/HRI.2019.8673011
[59]
Elizabeth Cha, Naomi T Fitter, Yunkyung Kim, Terrence Fong, and Maja J Matarić. 2018. Effects of Robot Sound on Auditory Localization in Human-Robot Collaboration. In Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction. ACM, 434–442. https://doi.org/10.1145/3171221.3171285
[60]
Elizabeth Cha and Maja Matarić. 2016. Using nonverbal signals to request help during human-robot collaboration. In 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 5070–5076. https://doi.org/10.1109/IROS.2016.7759744
[61]
Sonia Mary Chacko, Armando Granado, and Vikram Kapila. 2020. An augmented reality framework for robotic tool-path teaching. Procedia CIRP 93(2020), 1218–1223. https://doi.org/10.1016/j.procir.2020.03.143
[62]
Sonia Mary Chacko, Armando Granado, Ashwin RajKumar, and Vikram Kapila. 2020. An Augmented Reality Spatial Referencing System for Mobile Robots. In 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 4446–4452. https://doi.org/10.1109/iros45743.2020.9340742
[63]
Sonia Mary Chacko and Vikram Kapila. 2019. An augmented reality interface for human-robot interaction in unconstrained environments. In 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 3222–3228. https://doi.org/10.1109/iros40897.2019.8967973
[64]
Ravi Teja Chadalavada, Henrik Andreasson, Maike Schindler, Rainer Palm, and Achim J Lilienthal. 2020. Bi-directional navigation intent communication using spatial augmented reality and eye-tracking glasses for improved safety in human–robot interaction. Robotics and Computer-Integrated Manufacturing 61 (2020), 101830. https://doi.org/10.1016/j.rcim.2019.101830
[65]
Seungho Chae, Hyocheol Ro, Yoonsik Yang, and Tack-Don Han. 2018. A Pervasive Assistive Robot System Including Projection-Camera Technology for Older Adults. In Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction. 83–84. https://doi.org/10.1145/3173386.3177007
[66]
Tathagata Chakraborti, Sarath Sreedharan, Anagha Kulkarni, and Subbarao Kambhampati. 2018. Projection-aware task planning and execution for human-in-the-loop operation of robots in a mixed-reality workspace. In 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 4476–4482. https://doi.org/10.1109/IROS.2018.8593830
[67]
Wesley P Chan, Geoffrey Hanks, Maram Sakr, Tiger Zuo, HF Machiel Van der Loos, and Elizabeth Croft. 2020. An augmented reality human-robot physical collaboration interface design for shared, large-scale, labour-intensive manufacturing tasks. In 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 11308–11313. https://doi.org/10.1109/IROS45743.2020.9341119
[68]
Wesley P Chan, Adnan Karim, Camilo P Quintero, HF Machiel Van der Loos, and Elizabeth Croft. 2018. Virtual barriers in augmented reality for safe human-robot collaboration in manufacturing. In Robotic Co-Workers 4.0 2018: Human Safety and Comfort in Human-Robot Interactive Social Environments.
[69]
Wesley P Chan, Maram Sakr, Camilo Perez Quintero, Elizabeth Croft, and HF Machiel Van der Loos. 2020. Towards a Multimodal System combining Augmented Reality and Electromyography for Robot Trajectory Programming and Execution. In 2020 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN). IEEE, 419–424. https://doi.org/10.1109/RO-MAN47096.2020.9223526
[70]
Tom Chandler, Maxime Cordeil, Tobias Czauderna, Tim Dwyer, Jaroslaw Glowacki, Cagatay Goncu, Matthias Klapperstueck, Karsten Klein, Kim Marriott, Falk Schreiber, 2015. Immersive analytics. In 2015 Big Data Visual Analytics (BDVA). IEEE, 1–8. https://doi.org/10.1109/TVCG.2019.2929033
[71]
Chih-Wei Chang, Jih-Hsien Lee, Chin-Yeh Wang, and Gwo-Dong Chen. 2010. Improving the authentic learning experience by integrating robots into the mixed-reality environment. Computers & Education 55, 4 (2010), 1572–1578. https://doi.org/10.1016/j.compedu.2010.06.023
[72]
Siam Charoenseang and Tarinee Tonggoed. 2011. Human–robot collaboration with augmented reality. In International Conference on Human-Computer Interaction. Springer, 93–97. https://doi.org/10.1007/978-3-642-22095-1_19
[73]
Hua Chen, Oliver Wulf, and Bernardo Wagner. 2006. Object detection for a mobile robot using mixed reality. In International Conference on Virtual Systems and Multimedia. Springer, 466–475. https://doi.org/10.1007/11890881_51
[74]
Ian Yen-Hung Chen, Bruce MacDonald, Burkhard Wünsche, Geoffrey Biggs, and Tetsuo Kotoku. 2010. Analysing mixed reality simulation for industrial applications: A case study in the development of a robotic screw remover system. In International Conference on Simulation, Modeling, and Programming for Autonomous Robots. Springer, 350–361. https://doi.org/10.1007/978-3-642-17319-6_33
[75]
Linfeng Chen, Akiyuki Ebi, Kazuki Takashima, Kazuyuki Fujita, and Yoshifumi Kitamura. 2019. PinpointFly: An egocentric position-pointing drone interface using mobile AR. In SIGGRAPH Asia 2019 Emerging Technologies. 34–35. https://doi.org/10.1145/3355049.3360534
[76]
Linfeng Chen, Kazuki Takashima, Kazuyuki Fujita, and Yoshifumi Kitamura. 2021. PinpointFly: An Egocentric Position-control Drone Interface using Mobile AR. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. 1–13. https://doi.org/10.1145/3411764.3445110
[77]
Long Chen, Fengfeng Zhang, Wei Zhan, Minfeng Gan, and Lining Sun. 2020. Optimization of virtual and real registration technology based on augmented reality in a surgical navigation system. Biomedical engineering online 19, 1 (2020), 1–28. https://doi.org/10.1186/s12938-019-0745-z
[78]
Mingxuan Chen, Ping Zhang, Zebo Wu, and Xiaodan Chen. 2020. A multichannel human-swarm robot interaction system in augmented reality. Virtual Reality & Intelligent Hardware 2, 6 (2020), 518–533. https://doi.org/10.1016/j.vrih.2020.05.006
[79]
Xiaogang Chen, Xiaoshan Huang, Yijun Wang, and Xiaorong Gao. 2020. Combination of augmented reality based brain-computer interface and computer vision for high-level control of a robotic arm. IEEE Transactions on Neural Systems and Rehabilitation Engineering 28, 12(2020), 3140–3147. https://doi.org/10.1109/tnsre.2020.3038209
[80]
Zhe Chen, Zhuohang Cao, Peili Ma, and Lijun Xu. 2020. Industrial Robot Training Platform Based on Virtual Reality and Mixed Reality Technology. In International Conference on Man-Machine-Environment System Engineering. Springer, 891–898. https://doi.org/10.1007/978-981-15-6978-4_102
[81]
Vijay Chidambaram, Yueh-Hsuan Chiang, and Bilge Mutlu. 2012. Designing persuasive robots: how robots might persuade people using vocal and nonverbal cues. In Proceedings of the seventh annual ACM/IEEE international conference on Human-Robot Interaction. 293–300. https://doi.org/10.1145/2157689.2157798
[82]
Seung Wook Choi, Hee Chan Kim, Heung Sik Kang, Seongjun Kim, and Jaesoon Choi. 2013. A haptic augmented reality surgeon console for a laparoscopic surgery robot system. In 2013 13th International Conference on Control, Automation and Systems (ICCAS 2013). IEEE, 355–357. https://doi.org/10.1109/iccas.2013.6703923
[83]
Jonathan Wun Shiung Chong, SKc Ong, Andrew YC Nee, and KB Youcef-Youmi. 2009. Robot programming using augmented reality: An interactive method for planning collision-free paths. Robotics and Computer-Integrated Manufacturing 25, 3(2009), 689–701. https://doi.org/10.1016/J.RCIM.2008.05.002
[84]
Wusheng Chou, Tianmiao Wang, and Yuru Zhang. 2004. Augmented reality based preoperative planning for robot assisted tele-neurosurgery. In 2004 IEEE International Conference on Systems, Man and Cybernetics (IEEE Cat. No. 04CH37583), Vol. 3. IEEE, 2901–2906. https://doi.org/10.1109/icsmc.2004.1400773
[85]
Nicklas H Christensen, Oliver G Hjermitslev, Frederik Falk, Marco B Madsen, Frederik H Østergaard, Martin Kibsgaard, Martin Kraus, Johan Poulsen, and Jane Petersson. 2017. Depth cues in augmented reality for training of robot-assisted minimally invasive surgery. In Proceedings of the 21st International Academic Mindtrek Conference. 120–126. https://doi.org/10.1145/3131085.3131123
[86]
Francesco Clemente, Strahinja Dosen, Luca Lonini, Marko Markovic, Dario Farina, and Christian Cipriani. 2016. Humans can integrate augmented reality feedback in their sensorimotor control of a robotic hand. IEEE Transactions on Human-Machine Systems 47, 4 (2016), 583–589. https://doi.org/10.1109/thms.2016.2611998
[87]
Marcelo Coelho and Jamie Zigelbaum. 2011. Shape-changing interfaces. Personal and Ubiquitous Computing 15, 2 (2011), 161–173. https://doi.org/10.1007/s00779-010-0311-y
[88]
Michael D Coovert, Tiffany Lee, Ivan Shindev, and Yu Sun. 2014. Spatial augmented reality as a method for a mobile robot to communicate intended movement. Computers in Human Behavior 34 (2014), 241–248. https://doi.org/10.1016/j.chb.2014.02.001
[89]
Austin Corotan and Jianna Jian Zhang Irgen-Gioro. 2019. An Indoor Navigation Robot Using Augmented Reality. In 2019 5th International Conference on Control, Automation and Robotics (ICCAR). IEEE, 111–116. https://doi.org/10.1109/iccar.2019.8813348
[90]
Hugo Costa, Peter Cebola, Tiago Cunha, and Armando Sousa. 2015. A mixed reality game using 3Pi robots—“PiTanks”. In 2015 10th Iberian Conference on Information Systems and Technologies (CISTI). IEEE, 1–6. https://doi.org/10.1109/CISTI.2015.7170600
[91]
Nuno Costa and Artur Arsenio. 2015. Augmented reality behind the wheel-human interactive assistance by mobile robots. In 2015 6th International Conference on Automation, Robotics and Applications (ICARA). IEEE, 63–69. https://doi.org/10.1109/ICARA.2015.7081126
[92]
Ève Coste-Manière, Louaï Adhami, Fabien Mourgues, and Alain Carpentier. 2003. Planning, simulation, and augmented reality for robotic cardiac procedures: the STARS system of the ChIR team. In Seminars in thoracic and cardiovascular surgery, Vol. 15. Elsevier, 141–156. https://doi.org/10.1016/S1043-0679(03)70022-7
[93]
Matthew Cousins, Chenguang Yang, Junshen Chen, Wei He, and Zhaojie Ju. 2017. Development of a mixed reality based interface for human robot interaciotn. In 2017 International Conference on Machine Learning and Cybernetics (ICMLC), Vol. 1. IEEE, 27–34. https://doi.org/10.1109/icmlc.2017.8107738
[94]
Oscar Danielsson, Anna Syberfeldt, Rodney Brewster, and Lihui Wang. 2017. Assessing instructions in augmented reality for human-robot collaborative assembly by using demonstrators. Procedia CIRP 63(2017), 89–94. https://doi.org/10.1016/J.PROCIR.2017.02.038
[95]
Kurtis Danyluk, Barrett Ens, Bernhard Jenny, and Wesley Willett. 2021. A Design Space Exploration of Worlds in Miniature. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems(CHI ’21). Association for Computing Machinery, 1–15. https://doi.org/10.1145/3411764.3445098
[96]
Rajkumar Darbar, Joan Sol Roo, Thibault Lainé, and Martin Hachet. 2019. DroneSAR: extending physical spaces in spatial augmented reality using projection on a drone. In Proceedings of the 18th International Conference on Mobile and Ubiquitous Multimedia. 1–7. https://doi.org/10.1145/3365610.3365631
[97]
Devleena Das, Siddhartha Banerjee, and Sonia Chernova. 2021. Explainable ai for robot failures: Generating explanations that improve user assistance in fault recovery. In Proceedings of the 2021 ACM/IEEE International Conference on Human-Robot Interaction. 351–360. https://doi.org/10.1145/3434073.3444657
[98]
Alessandro De Franco, Edoardo Lamon, Pietro Balatti, Elena De Momi, and Arash Ajoudani. 2019. An Intuitive augmented reality interface for task scheduling, monitoring, and work performance improvement in human-robot collaboration. In 2019 IEEE International Work Conference on Bioinspired Intelligence (IWOBI). IEEE, 75–80. https://doi.org/10.1109/iwobi47054.2019.9114472
[99]
Artem Dementyev, Hsin-Liu Kao, Inrak Choi, Deborah Ajilo, Maggie Xu, Joseph A Paradiso, Chris Schmandt, and Sean Follmer. 2016. Rovables: Miniature on-body robots as mobile wearables. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology. 111–120. https://doi.org/10.1145/2984511.2984531
[100]
Morteza Dianatfar, Jyrki Latokartano, and Minna Lanz. 2021. Review on existing VR/AR solutions in human–robot collaboration. Procedia CIRP 97(2021), 407–411. https://doi.org/10.1016/j.procir.2020.05.259
[101]
Adhitha Dias, Hasitha Wellaboda, Yasod Rasanka, Menusha Munasinghe, Ranga Rodrigo, and Peshala Jayasekara. 2020. Deep Learning of Augmented Reality based Human Interactions for Automating a Robot Team. In 2020 6th International Conference on Control, Automation and Robotics (ICCAR). IEEE, 175–182. https://doi.org/10.1109/iccar49639.2020.9108004
[102]
Tiago Dias, Pedro Miraldo, Nuno Gonçalves, and Pedro U Lima. 2015. Augmented reality on robot navigation using non-central catadioptric cameras. In 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 4999–5004. https://doi.org/10.1109/iros.2015.7354080
[103]
Maximilian Diehl, Alexander Plopski, Hirokazu Kato, and Karinne Ramirez-Amaro. 2020. Augmented Reality interface to verify Robot Learning. In 2020 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN). IEEE, 378–383. https://doi.org/10.1109/ro-man47096.2020.9223502
[104]
André Dietrich, Michael Schulze, Sebastian Zug, and Jörg Kaiser. 2010. Visualization of robot’s awareness and perception. In Proceedings of the First International Workshop on Digital Engineering. 38–44. https://doi.org/10.1145/1837154.1837160
[105]
Huy Dinh, Quilong Yuan, Iastrebov Vietcheslav, and Gerald Seet. 2017. Augmented reality interface for taping robot. In 2017 18th International Conference on Advanced Robotics (ICAR). IEEE, 275–280. https://doi.org/10.1109/ICAR.2017.8023530
[106]
Anca D Dragan, Kenton CT Lee, and Siddhartha S Srinivasa. 2013. Legibility and predictability of robot motion. In 2013 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI). IEEE, 301–308. https://doi.org/10.1109/HRI.2013.6483603
[107]
Mauro Dragone, Thomas Holz, and Gregory MP O’Hare. 2006. Mixing robotic realities. In Proceedings of the 11th international conference on Intelligent user interfaces. 261–263. https://doi.org/10.1145/1111449.1111504
[108]
Mauro Dragone, Thomas Holz, and Gregory MP O’Hare. 2007. Using mixed reality agents as social interfaces for robots. In RO-MAN 2007-The 16th IEEE International Symposium on Robot and Human Interactive Communication. IEEE, 1161–1166. https://doi.org/10.1109/roman.2007.4415255
[109]
Mauro Dragone, Thomas Holz, Gregory MP O’Hare, and Michael J O’Grady. 2009. Mixed Reality Agent (MiRA) Chameleons. In Agent-Based Ubiquitous Computing. Springer, 13–33. https://doi.org/10.2991/978-94-91216-31-2_2
[110]
Philip Edgcumbe, Rohit Singla, Philip Pratt, Caitlin Schneider, Christopher Nguan, and Robert Rohling. 2016. Augmented reality imaging for robot-assisted partial nephrectomy surgery. In International Conference on Medical Imaging and Augmented Reality. Springer, 139–150. https://doi.org/10.1007/978-3-319-43775-0_13
[111]
Lotfi El Hafi, Hitoshi Nakamura, Akira Taniguchi, Yoshinobu Hagiwara, and Tadahiro Taniguchi. 2021. Teaching system for multimodal object categorization by human-robot interaction in mixed reality. In 2021 IEEE/SICE International Symposium on System Integration (SII). IEEE, 320–324. https://doi.org/10.1109/IEEECONF49454.2021.9382607
[112]
Ahmed Elsharkawy, Khawar Naheem, Dongwoo Koo, and Mun Sang Kim. 2021. A UWB-Driven Self-Actuated Projector Platform for Interactive Augmented Reality Applications. Applied Sciences 11, 6 (2021), 2871. https://doi.org/10.3390/app11062871
[113]
Barrett Ens, Benjamin Bach, Maxime Cordeil, Ulrich Engelke, Marcos Serrano, Wesley Willett, Arnaud Prouzeau, Christoph Anthes, Wolfgang Büschel, Cody Dunne, et al. 2021. Grand challenges in immersive analytics. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (CHI ’21). Association for Computing Machinery, 1–17. https://doi.org/10.1145/3411764.3446866
[114]
Okan Erat, Werner Alexander Isop, Denis Kalkofen, and Dieter Schmalstieg. 2018. Drone-augmented human vision: Exocentric control for drones exploring hidden areas. IEEE Transactions on Visualization and Computer Graphics 24, 4 (2018), 1437–1446. https://doi.org/10.1109/TVCG.2018.2794058
[115]
David Estevez, Juan G Victores, Santiago Morante, and Carlos Balaguer. 2015. Robot devastation: Using DIY low-cost platforms for multiplayer interaction in an augmented reality game. In 2015 7th International Conference on Intelligent Technologies for Interactive Entertainment (INTETAIN). IEEE, 32–36. https://doi.org/10.4108/icst.intetain.2015.259753
[116]
Aluna Everitt and Jason Alexander. 2017. PolySurface: a design approach for rapid prototyping of shape-changing displays using semi-solid surfaces. In Proceedings of the 2017 Conference on Designing Interactive Systems. 1283–1294. https://doi.org/10.1145/3064663.3064677
[117]
Aluna Everitt and Jason Alexander. 2019. 3D Printed Deformable Surfaces for Shape-Changing Displays. Frontiers in Robotics and AI 6 (2019), 80. https://doi.org/10.3389/frobt.2019.00080
[118]
A Evlampev and M Ostanin. 2019. Obstacle avoidance for robotic manipulator using Mixed reality glasses. In 2019 3rd School on Dynamics of Complex Networks and their Application in Intellectual Robotics (DCNAIR). IEEE, 46–48. https://doi.org/10.1109/dcnair.2019.8875555
[119]
Volkmar Falk, Fabien Mourgues, Louaï Adhami, Stefan Jacobs, Holger Thiele, Stefan Nitzsche, Friedrich W Mohr, and Ève Coste-Manière. 2005. Cardio navigation: planning, simulation, and augmented reality in robotic assisted endoscopic bypass grafting. The Annals of Thoracic Surgery 79, 6 (2005), 2040–2047. https://doi.org/10.1016/J.ATHORACSUR.2004.11.060
[120]
HC Fang, SK Ong, and AYC Nee. 2012. Interactive robot trajectory planning and simulation using augmented reality. Robotics and Computer-Integrated Manufacturing 28, 2 (2012), 227–237. https://doi.org/10.1016/J.RCIM.2011.09.003
[121]
HC Fang, SK Ong, and AYC Nee. 2012. Robot path and end-effector orientation planning using augmented reality. Procedia CIRP 3 (2012), 191–196. https://doi.org/10.1016/J.PROCIR.2012.07.034
[122]
HC Fang, SK Ong, and AYC Nee. 2013. Orientation planning of robot end-effector using augmented reality. The International Journal of Advanced Manufacturing Technology 67, 9–12 (2013), 2033–2049. https://doi.org/10.1007/S00170-012-4629-7
[123]
HC Fang, SK Ong, and AYC Nee. 2014. A novel augmented reality-based interface for robot path planning. International Journal on Interactive Design and Manufacturing (IJIDeM) 8, 1 (2014), 33–42. https://doi.org/10.1007/S12008-013-0191-2
[124]
Hongchao Fang, Soh Khim Ong, and Andrew Yeh-Ching Nee. 2009. Robot programming using augmented reality. In 2009 International Conference on CyberWorlds. IEEE, 13–20. https://doi.org/10.1109/CW.2009.14
[125]
Federica Ferraguti, Marco Minelli, Saverio Farsoni, Stefano Bazzani, Marcello Bonfè, Alexandre Vandanjon, Stefano Puliatti, Giampaolo Bianchi, and Cristian Secchi. 2020. Augmented reality and robotic-assistance for percutaneous nephrolithotomy. IEEE Robotics and Automation Letters 5, 3 (2020), 4556–4563. https://doi.org/10.1109/lra.2020.3002216
[126]
Michael Filipenko, Alexander Poeppel, Alwin Hoffmann, Wolfgang Reif, Andreas Monden, and Markus Sause. 2020. Virtual commissioning with mixed reality for next-generation robot-based mechanical component testing. In ISR 2020; 52th International Symposium on Robotics. VDE, 1–6.
[127]
Sean Follmer, Daniel Leithinger, Alex Olwal, Akimitsu Hogge, and Hiroshi Ishii. 2013. inFORM: dynamic physical affordances and constraints through shape and object actuation. In Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology (UIST ’13). 417–426. https://doi.org/10.1145/2501988.2502032
[128]
Jason Fong, Renz Ocampo, Douglas P Gross, and Mahdi Tavakoli. 2019. A robot with an augmented-reality display for functional capacity evaluation and rehabilitation of injured workers. In 2019 IEEE 16th International Conference on Rehabilitation Robotics (ICORR). IEEE, 181–186. https://doi.org/10.1109/icorr.2019.8779417
[129]
Jutta Fortmann, Tim Claudius Stratmann, Susanne Boll, Benjamin Poppinga, and Wilko Heuten. 2013. Make me move at work! An ambient light display to increase physical activity. In 2013 7th International Conference on Pervasive Computing Technologies for Healthcare and Workshops. IEEE, 274–277. https://doi.org/10.4108/icst.pervasivehealth.2013.252089
[130]
Jared A Frank and Vikram Kapila. 2016. Towards teleoperation-based interactive learning of robot kinematics using a mobile augmented reality interface on a tablet. In 2016 Indian Control Conference (ICC). IEEE, 385–392. https://doi.org/10.1109/indiancc.2016.7441163
[131]
Jared Alan Frank, Sai Prasanth Krishnamoorthy, and Vikram Kapila. 2017. Toward mobile mixed-reality interaction with multi-robot systems. IEEE Robotics and Automation Letters 2, 4 (2017), 1901–1908. https://doi.org/10.1109/LRA.2017.2714128
[132]
Jared A Frank, Matthew Moorhead, and Vikram Kapila. 2016. Realizing mixed-reality environments with tablets for intuitive human-robot collaboration for object manipulation tasks. In 2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN). IEEE, 302–307. https://doi.org/10.1109/ROMAN.2016.7745146
[133]
Jared A Frank, Matthew Moorhead, and Vikram Kapila. 2017. Mobile mixed-reality interfaces that enhance human–robot interaction in shared spaces. Frontiers in Robotics and AI 4 (2017), 20. https://doi.org/10.3389/frobt.2017.00020
[134]
Ayaka Fujii, Kanae Kochigami, Shingo Kitagawa, Kei Okada, and Masayuki Inaba. 2020. Development and Evaluation of Mixed Reality Co-eating System: Sharing the Behavior of Eating Food with a Robot Could Improve Our Dining Experience. In 2020 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN). IEEE, 357–362. https://doi.org/10.1109/ro-man47096.2020.9223518
[135]
Richard Fung, Sunao Hashimoto, Masahiko Inami, and Takeo Igarashi. 2011. An augmented reality system for teaching sequential tasks to a household robot. In 2011 RO-MAN. IEEE, 282–287. https://doi.org/10.1109/roman.2011.6005235
[136]
Anna Fuste, Ben Reynolds, James Hobin, and Valentin Heun. 2020. Kinetic AR: A Framework for Robotic Motion Systems in Spatial Computing. In Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems. 1–8. https://doi.org/10.1145/3334480.3382814
[137]
Samir Yitzhak Gadre, Eric Rosen, Gary Chien, Elizabeth Phillips, Stefanie Tellex, and George Konidaris. 2019. End-user robot programming using mixed reality. In 2019 International conference on robotics and automation (ICRA). IEEE, 2707–2713. https://doi.org/10.1109/icra.2019.8793988
[138]
Ramsundar Kalpagam Ganesan, Yash K Rathore, Heather M Ross, and Heni Ben Amor. 2018. Better teaming through visual cues: how projecting imagery in a workspace can improve human-robot collaboration. IEEE Robotics & Automation Magazine 25, 2 (2018), 59–71. https://doi.org/10.1109/mra.2018.2815655
[139]
Peng Gao, Brian Reily, Savannah Paul, and Hao Zhang. 2020. Visual reference of ambiguous objects for augmented reality-powered human-robot communication in a shared workspace. In International Conference on Human-Computer Interaction. Springer, 550–561. https://doi.org/10.1007/978-3-030-49695-1_37
[140]
Yuxiang Gao and Chien-Ming Huang. 2019. PATI: a projection-based augmented table-top interface for robot programming. In Proceedings of the 24th international conference on intelligent user interfaces. 345–355. https://doi.org/10.1145/3301275.3302326
[141]
Yuan Gao, Elena Sibirtseva, Ginevra Castellano, and Danica Kragic. 2019. Fast adaptation with meta-reinforcement learning for trust modelling in human-robot interaction. In 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 305–312. https://doi.org/10.1109/IROS40897.2019.8967924
[142]
Abraham Prieto García, Gervasio Varela Fernández, Blanca María Priego Torres, and Fernando López-Peña. 2011. Educational autonomous robotics setup using mixed reality. In 2011 7th International Conference on Next Generation Web Services Practices. IEEE, 452–457. https://doi.org/10.1109/nwesp.2011.6088222
[143]
Andre Gaschler, Maximilian Springer, Markus Rickert, and Alois Knoll. 2014. Intuitive robot tasks with augmented reality and virtual obstacles. In 2014 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 6026–6031. https://doi.org/10.1109/icra.2014.6907747
[144]
Hakan Gençtürk and Uğur Yayan. 2019. Development of Augmented Reality Based Mobile Robot Maintenance Software. In 2019 Innovations in Intelligent Systems and Applications Conference (ASYU). IEEE, 1–5. https://doi.org/10.1109/asyu48272.2019.8946359
[145]
Fabrizio Ghiringhelli, Jérôme Guzzi, Gianni A Di Caro, Vincenzo Caglioti, Luca M Gambardella, and Alessandro Giusti. 2014. Interactive augmented reality for understanding and analyzing multi-robot systems. In 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE, 1195–1201. https://doi.org/10.1109/iros.2014.6942709
[146]
Mario Gianni, Federico Ferri, and Fiora Pirri. 2013. ARE: Augmented reality environment for mobile robots. In Conference Towards Autonomous Robotic Systems. Springer, 470–483. https://doi.org/10.1007/978-3-662-43645-5_48
[147]
Fabio Giannone, Emanuele Felli, Zineb Cherkaoui, Pietro Mascagni, and Patrick Pessaux. 2021. Augmented Reality and Image-Guided Robotic Liver Surgery. Cancers 13, 24 (2021), 6268. https://doi.org/10.3390/cancers13246268
[148]
Antonio Gomes, Calvin Rubens, Sean Braley, and Roel Vertegaal. 2016. Bitdrones: Towards using 3d nanocopter displays as interactive self-levitating programmable matter. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. 770–780. https://doi.org/10.1145/2858036.2858519
[149]
Liang Gong, Changyang Gong, Zhao Ma, Lujie Zhao, Zhenyu Wang, Xudong Li, Xiaolong Jing, Haozhe Yang, and Chengliang Liu. 2017. Real-time human-in-the-loop remote control for a life-size traffic police robot with multiple augmented reality aided display terminals. In 2017 2nd International Conference on Advanced Robotics and Mechatronics (ICARM). IEEE, 420–425. https://doi.org/10.1109/icarm.2017.8273199
[150]
LL Gong, SK Ong, and AYC Nee. 2019. Projection-based augmented reality interface for robot grasping tasks. In Proceedings of the 2019 4th International Conference on Robotics, Control and Automation. 100–104. https://doi.org/10.1145/3351180.3351204
[151]
Michael A Goodrich and Alan C Schultz. 2008. Human-robot interaction: a survey. Now Publishers Inc. https://doi.org/10.1561/1100000005
[152]
Gregory R Gossweiler, Cameron L Brown, Gihan B Hewage, Eitan Sapiro-Gheiler, William J Trautman, Garrett W Welshofer, and Stephen L Craig. 2015. Mechanochemically active soft robots. ACS Applied Materials & Interfaces 7, 40 (2015), 22431–22435. https://doi.org/10.1021/acsami.5b06440
[153]
Michael Gradmann, Eric M Orendt, Edgar Schmidt, Stephan Schweizer, and Dominik Henrich. 2018. Augmented reality robot operation interface with Google Tango. In ISR 2018; 50th International Symposium on Robotics. VDE, 1–8.
[154]
Keith Evan Green. 2016. Architectural robotics: ecosystems of bits, bytes, and biology. MIT Press.
[155]
Scott A Green, Mark Billinghurst, XiaoQi Chen, and J Geoffrey Chase. 2008. Human-robot collaboration: A literature review and augmented reality approach in design. International Journal of Advanced Robotic Systems 5, 1 (2008), 1. https://doi.org/10.5772/5664
[156]
Scott A Green, XiaoQi Chen, Mark Billinghurst, and J Geoffrey Chase. 2008. Collaborating with a mobile robot: An augmented reality multimodal interface. IFAC Proceedings Volumes 41, 2 (2008), 15595–15600. https://doi.org/10.3182/20080706-5-KR-1001.02637
[157]
Santiago Grijalva and Wilbert G Aguilar. 2019. Landmark-Based Virtual Path Estimation for Assisted UAV FPV Tele-Operation with Augmented Reality. In International Conference on Intelligent Robotics and Applications. Springer, 688–700. https://doi.org/10.1007/978-3-030-27529-7_58
[158]
Thomas Groechel, Zhonghao Shi, Roxanna Pakkar, and Maja J Matarić. 2019. Using socially expressive mixed reality arms for enhancing low-expressivity robots. In 2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN). IEEE, 1–8. https://doi.org/10.1109/ro-man46459.2019.8956458
[159]
Jens Emil Grønbæk, Majken Kirkegaard Rasmussen, Kim Halskov, and Marianne Graves Petersen. 2020. KirigamiTable: Designing for proxemic transitions with a shape-changing tabletop. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. 1–15. https://doi.org/10.1145/3313831.3376834
[160]
Uwe Gruenefeld, Lars Prädel, Jannike Illing, Tim Stratmann, Sandra Drolshagen, and Max Pfingsthorn. 2020. Mind the ARm: realtime visualization of robot motion intent in head-mounted augmented reality. In Proceedings of the Conference on Mensch und Computer. 259–266. https://doi.org/10.1145/3404983.3405509
[161]
Jan Guhl, Johannes Hügle, and Jörg Krüger. 2018. Enabling human-robot-interaction via virtual and augmented reality in distributed control systems. Procedia CIRP 76 (2018), 167–170. https://doi.org/10.1016/J.PROCIR.2018.01.029
[162]
Jan Guhl, Son Tung, and Jörg Krüger. 2017. Concept and architecture for programming industrial robots using augmented reality with mobile devices like Microsoft HoloLens. In 2017 22nd IEEE International Conference on Emerging Technologies and Factory Automation (ETFA). IEEE, 1–4. https://doi.org/10.1109/etfa.2017.8247749
[163]
Cheng Guo, James Everett Young, and Ehud Sharlin. 2009. Touch and toys: new techniques for interaction with a remote group of robots. In Proceedings of the SIGCHI conference on human factors in computing systems. 491–500. https://doi.org/10.1145/1518701.1518780
[164]
Akihiro Hamada, Atsuro Sawada, Jin Kono, Masanao Koeda, Katsuhiko Onishi, Takashi Kobayashi, Toshinari Yamasaki, Takahiro Inoue, Hiroshi Noborio, and Osamu Ogawa. 2020. The current status and challenges in augmented-reality navigation system for robot-assisted laparoscopic partial nephrectomy. In International Conference on Human-Computer Interaction. Springer, 620–629. https://doi.org/10.1007/978-3-030-49062-1_42
[165]
Jared Hamilton, Thao Phung, Nhan Tran, and Tom Williams. 2021. What’s The Point? Tradeoffs Between Effectiveness and Social Perception When Using Mixed Reality to Enhance Gesturally Limited Robots. In Proceedings of the 2021 ACM/IEEE International Conference on Human-Robot Interaction. 177–186. https://doi.org/10.1145/3434073.3444676
[166]
Jeonghye Han, Miheon Jo, Eunja Hyun, and Hyo-Jeong So. 2015. Examining young children’s perception toward augmented reality-infused dramatic play. Educational Technology Research and Development 63, 3 (2015), 455–474. https://doi.org/10.1007/S11423-015-9374-9
[167]
John Hardy, Christian Weichel, Faisal Taher, John Vidler, and Jason Alexander. 2015. Shapeclip: towards rapid prototyping with shape-changing displays for designers. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. 19–28. https://doi.org/10.1145/2702123.2702599
[168]
Jeremy Hartmann, Yen-Ting Yeh, and Daniel Vogel. 2020. AAR: Augmenting a wearable augmented reality display with an actuated head-mounted projector. In Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology. 445–458. https://doi.org/10.1145/3379337.3415849
[169]
Sunao Hashimoto, Akihiko Ishida, Masahiko Inami, and Takeo Igarashi. 2011. TouchMe: An augmented reality based remote robot manipulation. In The 21st International Conference on Artificial Reality and Telexistence, Proceedings of ICAT2011, Vol. 2.
[170]
Hooman Hedayati, Ryo Suzuki, Daniel Leithinger, and Daniel Szafir. 2020. Pufferbot: Actuated expandable structures for aerial robots. In 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 1338–1343. https://doi.org/10.1109/iros45743.2020.9341088
[171]
Hooman Hedayati, Michael Walker, and Daniel Szafir. 2018. Improving collocated robot teleoperation with augmented reality. In Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction. 78–86. https://doi.org/10.1145/3171221.3171251
[172]
Mary Hegarty, Matt S Canham, and Sara I Fabrikant. 2010. Thinking about the weather: How display salience and knowledge affect performance in a graphic inference task. Journal of Experimental Psychology: Learning, Memory, and Cognition 36, 1 (2010), 37. https://doi.org/10.1037/a0017683
[173]
Viviane Herdel, Anastasia Kuzminykh, Andrea Hildebrandt, and Jessica R Cauchard. 2021. Drone in Love: Emotional Perception of Facial Expressions on Flying Robots. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. 1–20. https://doi.org/10.1145/3411764.3445495
[174]
Juan David Hernández, Shlok Sobti, Anthony Sciola, Mark Moll, and Lydia E Kavraki. 2020. Increasing robot autonomy via motion planning and an augmented reality interface. IEEE Robotics and Automation Letters 5, 2 (2020), 1017–1023. https://doi.org/10.1109/lra.2020.2967280
[175]
Takayuki Hirai, Satoshi Nakamaru, Yoshihiro Kawahara, and Yasuaki Kakehi. 2018. xslate: A stiffness-controlled surface for shape-changing interfaces. In Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems. 1–4. https://doi.org/10.1145/3170427.3186496
[176]
Takefumi Hiraki, Shogo Fukushima, Yoshihiro Kawahara, and Takeshi Naemura. 2018. Phygital field: An integrated field with physical robots and digital images using projection-based localization and control method. SICE Journal of Control, Measurement, and System Integration 11, 4 (2018), 302–311. https://doi.org/10.9746/jcmsi.11.302
[177]
Takefumi Hiraki, Shogo Fukushima, Yoshihiro Kawahara, and Takeshi Naemura. 2019. NavigaTorch: Projection-based Robot Control Interface using High-speed Handheld Projector. In SIGGRAPH Asia 2019 Emerging Technologies. 31–33. https://doi.org/10.1145/3355049.3360538
[178]
Takefumi Hiraki, Shogo Fukushima, and Takeshi Naemura. 2016. Phygital field: an integrated field with a swarm of physical robots and digital images. In SIGGRAPH ASIA 2016 Emerging Technologies. 1–2. https://doi.org/10.1145/2988240.2988242
[179]
Takefumi Hiraki, Issei Takahashi, Shotaro Goto, Shogo Fukushima, and Takeshi Naemura. 2015. Phygital field: integrated field with visible images and robot swarm controlled by invisible images. In ACM SIGGRAPH 2015 Posters. 1–1. https://doi.org/10.1145/2787626.2792604
[180]
Yutaka Hiroi, Shuhei Hisano, and Akinori Ito. 2010. Evaluation of head size of an interactive robot using an augmented reality. In 2010 World Automation Congress. IEEE, 1–6.
[181]
Tzu-Hsuan Ho and Kai-Tai Song. 2020. Supervised control for robot-assisted surgery using augmented reality. In 2020 20th International Conference on Control, Automation and Systems (ICCAS). IEEE, 329–334. https://doi.org/10.23919/ICCAS50221.2020.9268278
[182]
Khoa Cong Hoang, Wesley P Chan, Steven Lay, Akansel Cosgun, and Elizabeth Croft. 2021. Virtual Barriers in Augmented Reality for Safe and Effective Human-Robot Cooperation in Manufacturing. arXiv preprint arXiv:2104.05211 (2021).
[183]
Ayanna M Howard, Luke Roberts, Sergio Garcia, and Rakale Quarells. 2012. Using mixed reality to map human exercise demonstrations to a robot exercise coach. In 2012 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). IEEE, 291–292. https://doi.org/10.1109/ismar.2012.6402579
[184]
Baichuan Huang, Deniz Bayazit, Daniel Ullman, Nakul Gopalan, and Stefanie Tellex. 2019. Flight, camera, action! using natural language and mixed reality to control a drone. In 2019 International Conference on Robotics and Automation (ICRA). IEEE, 6949–6956. https://doi.org/10.1109/ICRA.2019.8794200
[185]
Bidan Huang, Nicholas Gerard Timmons, and Qiang Li. 2020. Augmented reality with multi-view merging for robot teleoperation. In Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction. 260–262. https://doi.org/10.1145/3371382.3378336
[186]
Chien-Ming Huang and Bilge Mutlu. 2013. Modeling and Evaluating Narrative Gestures for Humanlike Robots. In Robotics: Science and Systems. 57–64. https://doi.org/10.15607/RSS.2013.IX.026
[187]
Tianqi Huang, Ruiyang Li, Yangxi Li, Xinran Zhang, and Hongen Liao. 2021. Augmented reality-based autostereoscopic surgical visualization system for telesurgery. International Journal of Computer Assisted Radiology and Surgery 16, 11 (2021), 1985–1997. https://doi.org/10.1007/s11548-021-02463-5
[188]
Dinh Quang Huy, I Vietcheslav, and Gerald Seet Gim Lee. 2017. See-through and spatial augmented reality: a novel framework for human-robot interaction. In 2017 3rd International Conference on Control, Automation and Robotics (ICCAR). IEEE, 719–726. https://doi.org/10.1109/ICCAR.2017.7942791
[189]
Jane Hwang, Sangyup Lee, Sang Chul Ahn, and Hyoung-gon Kim. 2008. Augmented robot agent: Enhancing co-presence of the remote participant. In 2008 7th IEEE/ACM International Symposium on Mixed and Augmented Reality. IEEE, 161–162. https://doi.org/10.1109/ismar.2008.4637346
[190]
Hisham Iqbal, Fabio Tatti, and Ferdinando Rodriguez y Baena. 2021. Augmented reality in robotic assisted orthopaedic surgery: A pilot study. Journal of Biomedical Informatics 120 (2021), 103841. https://doi.org/10.1016/j.jbi.2021.103841
[191]
Kentaro Ishii, Shengdong Zhao, Masahiko Inami, Takeo Igarashi, and Michita Imai. 2009. Designing laser gesture interface for robot control. In IFIP Conference on Human-Computer Interaction. Springer, 479–492. https://doi.org/10.1007/978-3-642-03658-3_52
[192]
Yvonne Jansen, Pierre Dragicevic, Petra Isenberg, Jason Alexander, Abhijit Karnik, Johan Kildal, Sriram Subramanian, and Kasper Hornbæk. 2015. Opportunities and challenges for data physicalization. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. 3227–3236. https://doi.org/10.1145/2702123.2702180
[193]
Yunwoo Jeong, Han-Jong Kim, and Tek-Jin Nam. 2018. Mechanism perfboard: An augmented reality environment for linkage mechanism design and fabrication. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. 1–11. https://doi.org/10.1145/3173574.3173985
[194]
Zhenrui Ji, Quan Liu, Wenjun Xu, Bitao Yao, Jiayi Liu, and Zude Zhou. 2021. A Closed-Loop Brain-Computer Interface with Augmented Reality Feedback for Industrial Human-Robot Collaboration. (2021). https://doi.org/10.21203/RS.3.RS-283263/V1
[195]
Chun Jia and Zhenzhong Liu. 2020. Collision Detection Based on Augmented Reality for Construction Robot. In 2020 5th International Conference on Advanced Robotics and Mechatronics (ICARM). IEEE, 194–197. https://doi.org/10.1109/icarm49381.2020.9195301
[196]
Jingang Jiang, Yafeng Guo, Zhiyuan Huang, Yongde Zhang, Dianhao Wu, and Yi Liu. 2021. Adjacent surface trajectory planning of robot-assisted tooth preparation based on augmented reality. Engineering Science and Technology, an International Journal (2021). https://doi.org/10.1016/J.JESTCH.2021.05.005
[197]
Brennan Jones, Yaying Zhang, Priscilla NY Wong, and Sean Rintel. 2020. VROOM: Virtual Robot Overlay for Online Meetings. In Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems. 1–10. https://doi.org/10.1145/3334480.3382820
[198]
Brennan Jones, Yaying Zhang, Priscilla NY Wong, and Sean Rintel. 2021. Belonging There: VROOM-ing into the Uncanny Valley of XR Telepresence. Proceedings of the ACM on Human-Computer Interaction 5, CSCW1 (2021), 1–31. https://doi.org/10.1145/3449133
[199]
Colin Jones, Michael Novitzky, and Christopher Korpela. 2021. AR/VR Tutorial System for Human-Robot Teaming. In 2021 IEEE 11th Annual Computing and Communication Workshop and Conference (CCWC). IEEE, 0878–0882. https://doi.org/10.1109/ccwc51732.2021.9375845
[200]
Jana Jost, Thomas Kirks, Preity Gupta, Dennis Lünsch, and Jonas Stenzel. 2018. Safe human-robot-interaction in highly flexible warehouses using augmented reality and heterogenous fleet management system. In 2018 IEEE International Conference on Intelligence and Safety for Robotics (ISR). IEEE, 256–260. https://doi.org/10.1109/IISR.2018.8535808
[201]
Kevin Sebastian Kain, Susanne Stadler, Manuel Giuliani, Nicole Mirnig, Gerald Stollnberger, and Manfred Tscheligi. 2017. Tablet-based augmented reality in the factory: Influence of knowledge in computer programming on robot teaching tasks. In Proceedings of the Companion of the 2017 ACM/IEEE International Conference on Human-Robot Interaction. 151–152. https://doi.org/10.1145/3029798.3038347
[202]
Alisa Kalegina, Grace Schroeder, Aidan Allchin, Keara Berlin, and Maya Cakmak. 2018. Characterizing the design space of rendered robot faces. In Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction. ACM, 96–104. https://doi.org/10.1145/3171221.3171286
[203]
Megha Kalia, Apeksha Avinash, Nassir Navab, and Septimiu Salcudean. 2021. Preclinical evaluation of a markerless, real-time, augmented reality guidance system for robot-assisted radical prostatectomy. International Journal of Computer Assisted Radiology and Surgery (2021), 1–8. https://doi.org/10.1007/s11548-021-02419-9
[204]
Megha Kalia, Prateek Mathur, Keith Tsang, Peter Black, Nassir Navab, and Septimiu Salcudean. 2020. Evaluation of a marker-less, intra-operative, augmented reality guidance system for robot-assisted laparoscopic radical prostatectomy. International Journal of Computer Assisted Radiology and Surgery 15 (2020), 1225–1233. https://doi.org/10.1007/s11548-020-02181-4
[205]
Kenji Kansaku, Naoki Hata, and Kouji Takano. 2010. My thoughts through a robot’s eyes: An augmented reality-brain–machine interface. Neuroscience Research 66, 2 (2010), 219–222. https://doi.org/10.1016/j.neures.2009.10.006
[206]
Michal Kapinus, Vítězslav Beran, Zdeněk Materna, and Daniel Bambušek. 2019. Spatially Situated End-User Robot Programming in Augmented Reality. In 2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN). IEEE, 1–8. https://doi.org/10.1109/RO-MAN46459.2019.8956336
[207]
Michal Kapinus, Zdeněk Materna, Daniel Bambušek, and Vitězslav Beran. 2020. End-User Robot Programming Case Study: Augmented Reality vs. Teach Pendant. In Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction. 281–283. https://doi.org/10.1145/3371382.3378266
[208]
Mohamed Kari, Tobias Grosse-Puppendahl, Luis Falconeri Coelho, Andreas Rene Fender, David Bethge, Reinhard Schütte, and Christian Holz. 2021. TransforMR: Pose-Aware Object Substitution for Composing Alternate Mixed Realities. In 2021 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). IEEE, 69–79. https://doi.org/10.1109/ismar52148.2021.00021
[209]
Shunichi Kasahara, Ryuma Niiyama, Valentin Heun, and Hiroshi Ishii. 2013. exTouch: spatially-aware embodied manipulation of actuated objects mediated by augmented reality. In Proceedings of the 7th International Conference on Tangible, Embedded and Embodied Interaction. 223–228. https://doi.org/10.1145/2460625.2460661
[210]
Misaki Kasetani, Tomonobu Noguchi, Hirotake Yamazoe, and Joo-Ho Lee. 2015. Projection mapping by mobile projector robot. In 2015 12th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI). IEEE, 13–17. https://doi.org/10.1109/URAI.2015.7358918
[211]
Linh Kästner and Jens Lambrecht. 2019. Augmented-reality-based visualization of navigation data of mobile robots on the Microsoft HoloLens: possibilities and limitations. In 2019 IEEE International Conference on Cybernetics and Intelligent Systems (CIS) and IEEE Conference on Robotics, Automation and Mechatronics (RAM). IEEE, 344–349. https://doi.org/10.1109/cis-ram47153.2019.9095836
[212]
Jun Kato, Daisuke Sakamoto, Masahiko Inami, and Takeo Igarashi. 2009. Multi-touch interface for controlling multiple mobile robots. In CHI’09 Extended Abstracts on Human Factors in Computing Systems. 3443–3448. https://doi.org/10.1145/1520340.1520500
[213]
Yuta Kato, Yuya Aikawa, Masayoshi Kanoh, Felix Jimenez, Mitsuhiro Hayase, Takahiro Tanaka, and Hitoshi Kanamori. 2019. A Robot System Using Mixed Reality to Encourage Driving Review. In International Conference on Human-Computer Interaction. Springer, 112–117. https://doi.org/10.1007/978-3-030-23528-4_16
[214]
Rubaiat Habib Kazi, Tovi Grossman, Nobuyuki Umetani, and George Fitzmaurice. 2016. Motion amplifiers: sketching dynamic illustrations using the principles of 2D animation. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI ’16). Association for Computing Machinery, 4599–4609. https://doi.org/10.1145/2858036.2858386
[215]
Maram Khatib, Khaled Al Khudir, and Alessandro De Luca. 2021. Human-robot contactless collaboration with mixed reality interface. Robotics and Computer-Integrated Manufacturing 67 (2021), 102030. https://doi.org/10.1016/j.rcim.2020.102030
[216]
Hyoungnyoun Kim, Jun-Sik Kim, Kwanghyun Ryu, Seyoung Cheon, Yonghwan Oh, and Ji-Hyung Park. 2014. Task-oriented teleoperation through natural 3D user interaction. In 2014 11th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI). IEEE, 335–338. https://doi.org/10.1109/urai.2014.7057536
[217]
Lawrence H Kim, Daniel S Drew, Veronika Domova, and Sean Follmer. 2020. User-defined swarm robot control. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI ’20). Association for Computing Machinery, 1–13. https://doi.org/10.1145/3313831.3376814
[218]
Lawrence H Kim and Sean Follmer. 2017. Ubiswarm: Ubiquitous robotic interfaces and investigation of abstract motion as a display. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 1, 3 (2017), 1–20. https://doi.org/10.1145/3130931
[219]
Won S Kim. 1996. Virtual reality calibration and preview/predictive displays for telerobotics. Presence: Teleoperators & Virtual Environments 5, 2 (1996), 173–190.
[220]
Kazuhiko Kobayashi, Koichi Nishiwaki, Shinji Uchiyama, Hiroyuki Yamamoto, Satoshi Kagami, and Takeo Kanade. 2007. Overlay what humanoid robot perceives and thinks to the real-world by mixed reality system. In 2007 6th IEEE and ACM International Symposium on Mixed and Augmented Reality. IEEE, 275–276. https://doi.org/10.1109/ismar.2007.4538864
[221]
Minoru Kojima, Maki Sugimoto, Akihiro Nakamura, Masahiro Tomita, Hideaki Nii, and Masahiko Inami. 2006. Augmented coliseum: An augmented game environment with small vehicles. In First IEEE International Workshop on Horizontal Interactive Human-Computer Systems (TABLETOP’06). IEEE, 6 pp. https://doi.org/10.1109/TABLETOP.2006.3
[222]
Abhishek Kolagunda, Scott Sorensen, Sherif Mehralivand, Philip Saponaro, Wayne Treible, Baris Turkbey, Peter Pinto, Peter Choyke, and Chandra Kambhamettu. 2018. A mixed reality guidance system for robot assisted laparoscopic radical prostatectomy. In OR 2.0 Context-Aware Operating Theaters, Computer Assisted Robotic Endoscopy, Clinical Image-Based Procedures, and Skin Image Analysis. Springer, 164–174. https://doi.org/10.1007/978-3-030-01201-4_18
[223]
Andreas Korthauer, Clemens Guenther, Andreas Hinrichs, Wen Ren, and Yiwen Yang. 2020. Watch Your Vehicle Driving at the City: Interior HMI with Augmented Reality for Automated Driving. In 22nd International Conference on Human-Computer Interaction with Mobile Devices and Services. 1–5. https://doi.org/10.1145/3406324.3425895
[224]
Tomáš Kot, Petr Novák, and Ján Babjak. 2017. Application of augmented reality in mobile robot teleoperation. In International Workshop on Modelling and Simulation for Autonomous Systems. Springer, 223–236. https://doi.org/10.1007/978-3-319-76072-8_16
[225]
Niki Kousi, Christos Stoubos, Christos Gkournelos, George Michalos, and Sotiris Makris. 2019. Enabling human robot interaction in flexible robotic assembly lines: An augmented reality based software suite. Procedia CIRP 81 (2019), 1429–1434. https://doi.org/10.1016/J.PROCIR.2019.04.328
[226]
Dennis Krupke, Frank Steinicke, Paul Lubos, Yannick Jonetzko, Michael Görner, and Jianwei Zhang. 2018. Comparison of multimodal heading and pointing gestures for co-located mixed reality human-robot interaction. In 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 1–9. https://doi.org/10.1109/iros.2018.8594043
[227]
Aleksander Krzywinski, Haipeng Mi, Weiqin Chen, and Masanori Sugimoto. 2009. RoboTable: a tabletop framework for tangible interaction with robots in a mixed reality. In Proceedings of the International Conference on Advances in Computer Entertainment Technology. 107–114. https://doi.org/10.1145/1690388.1690407
[228]
Eranda Lakshantha and Simon Egerton. 2014. Human Robot Interaction and Control: Translating Diagrams into an Intuitive Augmented Reality Approach. In 2014 International Conference on Intelligent Environments. IEEE, 111–116. https://doi.org/10.1109/ie.2014.24
[229]
Fabrizio Lamberti, Davide Calandra, Federica Bazzano, Filippo G Prattico, and Davide M Destefanis. 2018. Robotquest: A robotic game based on projected mixed reality and proximity interaction. In 2018 IEEE Games, Entertainment, Media Conference (GEM). IEEE, 1–9. https://doi.org/10.1109/GEM.2018.8516501
[230]
Fabrizio Lamberti, Alberto Cannavò, and Paolo Pirone. 2019. Designing interactive robotic games based on mixed reality technology. In 2019 IEEE International Conference on Consumer Electronics (ICCE). IEEE, 1–4. https://doi.org/10.1109/icce.2019.8661911
[231]
Jens Lambrecht, Linh Kästner, Jan Guhl, and Jörg Krüger. 2021. Towards commissioning, resilience and added value of Augmented Reality in robotics: Overcoming technical obstacles to industrial applicability. Robotics and Computer-Integrated Manufacturing 71 (2021), 102178. https://doi.org/10.1016/J.RCIM.2021.102178
[232]
Jens Lambrecht and Jörg Krüger. 2012. Spatial programming for industrial robots based on gestures and augmented reality. In 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE, 466–472. https://doi.org/10.1109/IROS.2012.6385900
[233]
Matheus Laranjeira, Aurélien Arnaubec, Lorenzo Brignone, Claire Dune, and Jan Opderbecke. 2020. 3D Perception and Augmented Reality Developments in Underwater Robotics for Ocean Sciences. Current Robotics Reports (2020), 1–8. https://doi.org/10.1007/s43154-020-00014-5
[234]
Tomas Lazna. 2018. The visualization of threats using the augmented reality and a remotely controlled robot. IFAC-PapersOnLine 51, 6 (2018), 444–449. https://doi.org/10.1016/J.IFACOL.2018.07.113
[235]
Mathieu Le Goc, Lawrence H Kim, Ali Parsaei, Jean-Daniel Fekete, Pierre Dragicevic, and Sean Follmer. 2016. Zooids: Building blocks for swarm user interfaces. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology. 97–109. https://doi.org/10.1145/2984511.2984547
[236]
David Ledo, Steven Houben, Jo Vermeulen, Nicolai Marquardt, Lora Oehlberg, and Saul Greenberg. 2018. Evaluation strategies for HCI toolkit research. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. 1–17. https://doi.org/10.1145/3173574.3173610
[237]
Ho-Dong Lee, Dongwon Kim, Min-Chul Park, and Gwi-Tae Park. 2008. Augmented reality based vision system for network based mobile robot. In Asia-Pacific Conference on Computer Human Interaction. Springer, 123–130. https://doi.org/10.1007/978-3-540-70585-7_14
[238]
Ho-Dong Lee, Hyun-Gu Lee, Joo-Hyung Kim, Min-Chul Park, and Gwi-Tae Park. 2007. Human machine interface with augmented reality for the network based mobile robot. In International Conference on Knowledge-Based and Intelligent Information and Engineering Systems. Springer, 57–64. https://doi.org/10.1007/978-3-540-74829-8_8
[239]
Joo-Haeng Lee, Junho Kim, and Hyun Kim. 2011. A note on hybrid control of robotic spatial augmented reality. In 2011 8th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI). IEEE, 621–626. https://doi.org/10.1109/URAI.2011.6145895
[240]
Jae Young Lee, Jong-Wook Lee, Teressa Talluri, Amarnathvarma Angani, and Jeong Bea Lee. 2020. Realization of Robot Fish with 3D Hologram Fish using Augmented Reality. In 2020 IEEE 2nd International Conference on Architecture, Construction, Environment and Hydraulics (ICACEH). IEEE, 102–104. https://doi.org/10.1109/icaceh51803.2020.9366226
[241]
Kevin Lee, Christopher Reardon, and Jonathan Fink. 2018. Augmented Reality in Human-Robot Cooperative Search. In 2018 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR). IEEE, 1–1. https://doi.org/10.1109/ssrr.2018.8468659
[242]
Myungho Lee, Nahal Norouzi, Gerd Bruder, Pamela J Wisniewski, and Gregory F Welch. 2018. The physical-virtual table: exploring the effects of a virtual human’s physical influence on social interaction. In Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology. 1–11. https://doi.org/10.1145/3281505.3281533
[243]
Daniel Leithinger, Sean Follmer, Alex Olwal, Samuel Luescher, Akimitsu Hogge, Jinha Lee, and Hiroshi Ishii. 2013. Sublimate: state-changing virtual and physical rendering to augment interaction with shape displays. In Proceedings of the SIGCHI conference on human factors in computing systems. 1441–1450. https://doi.org/10.1145/2470654.2466191
[244]
Daniel Leithinger and Hiroshi Ishii. 2010. Relief: a scalable actuated shape display. In Proceedings of the fourth international conference on Tangible, embedded, and embodied interaction. 221–222. https://doi.org/10.1145/1709886.1709928
[245]
Daniel Leithinger, David Lakatos, Anthony DeVincenzi, Matthew Blackshaw, and Hiroshi Ishii. 2011. Direct and gestural interaction with relief: a 2.5D shape display. In Proceedings of the 24th annual ACM symposium on User interface software and technology. 541–548. https://doi.org/10.1145/2047196.2047268
[246]
Jakob Leitner, Michael Haller, Kyungdahm Yun, Woontack Woo, Maki Sugimoto, Masahiko Inami, Adrian David Cheok, and Henry Been-Lirn Duh. 2010. Physical interfaces for tabletop games. Computers in Entertainment (CIE) 7, 4 (2010), 1–21. https://doi.org/10.1145/1658866.1658880
[247]
Germán Leiva, Cuong Nguyen, Rubaiat Habib Kazi, and Paul Asente. 2020. Pronto: Rapid augmented reality video prototyping using sketches and enaction. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI ’20). Association for Computing Machinery, 1–13. https://doi.org/10.1145/3313831.3376160
[248]
Alexander Lenhardt and Helge Ritter. 2010. An augmented-reality based brain-computer interface for robot control. In International Conference on Neural Information Processing. Springer, 58–65. https://doi.org/10.1007/978-3-642-17534-3_8
[249]
Francisco J Lera, Víctor Rodríguez, Carlos Rodríguez, and Vicente Matellán. 2014. Augmented reality in robotic assistance for the elderly. In International technology robotics applications. Springer, 3–11. https://doi.org/10.1007/978-3-319-02332-8_1
[250]
Mirna Lerotic, Adrian J Chung, George Mylonas, and Guang-Zhong Yang. 2007. Pq-space based non-photorealistic rendering for augmented reality. In International Conference on Medical Image Computing and Computer-Assisted Intervention. Springer, 102–109. https://doi.org/10.1007/978-3-540-75759-7_13
[251]
Florian Leutert, Christian Herrmann, and Klaus Schilling. 2013. A spatial augmented reality system for intuitive display of robotic data. In 2013 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI). IEEE, 179–180. https://doi.org/10.1109/hri.2013.6483560
[252]
Florian Leutert and Klaus Schilling. 2012. Support of power plant telemaintenance with robots by augmented reality methods. In 2012 2nd International Conference on Applied Robotics for the Power Industry (CARPI). IEEE, 45–49. https://doi.org/10.1109/carpi.2012.6473362
[253]
Florian Leutert and Klaus Schilling. 2015. Augmented reality for telemaintenance and -inspection in force-sensitive industrial robot applications. IFAC-PapersOnLine 48, 10 (2015), 153–158. https://doi.org/10.1016/J.IFACOL.2015.08.124
[254]
Chunxu Li, Ashraf Fahmy, and Johann Sienz. 2019. An augmented reality based human-robot interaction interface using Kalman filter sensor fusion. Sensors 19, 20 (2019), 4586. https://doi.org/10.3390/s19204586
[255]
Congyuan Liang, Chao Liu, Xiaofeng Liu, Long Cheng, and Chenguang Yang. 2019. Robot teleoperation system based on mixed reality. In 2019 IEEE 4th International Conference on Advanced Robotics and Mechatronics (ICARM). IEEE, 384–389. https://doi.org/10.1109/icarm.2019.8834302
[256]
Li Lin, Yunyong Shi, Andy Tan, Melia Bogari, Ming Zhu, Yu Xin, Haisong Xu, Yan Zhang, Le Xie, and Gang Chai. 2016. Mandibular angle split osteotomy based on a novel augmented reality navigation using specialized robot-assisted arms—A feasibility study. Journal of Cranio-Maxillofacial Surgery 44, 2 (2016), 215–223. https://doi.org/10.1016/j.jcms.2015.10.024
[257]
Natan Linder and Pattie Maes. 2010. LuminAR: portable robotic augmented reality interface design and prototype. In Adjunct proceedings of the 23rd annual ACM symposium on User interface software and technology. 395–396. https://doi.org/10.1145/1866218.1866237
[258]
David Lindlbauer, Jens Emil Grønbæk, Morten Birk, Kim Halskov, Marc Alexa, and Jörg Müller. 2016. Combining shape-changing interfaces and spatial augmented reality enables extended object appearance. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. 791–802. https://doi.org/10.1145/2858036.2858457
[259]
David Lindlbauer, Jörg Mueller, and Marc Alexa. 2017. Changing the appearance of real-world objects by modifying their surroundings. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI ’17). Association for Computing Machinery, 3954–3965. https://doi.org/10.1145/3025453.3025795
[260]
David Lindlbauer and Andy D. Wilson. 2018. Remixed Reality: Manipulating Space and Time in Augmented Reality. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI ’18). Association for Computing Machinery, New York, NY, USA, 1–13. https://doi.org/10.1145/3173574.3173703
[261]
Ragavendra Lingamaneni, Thomas Kubitza, and Jürgen Scheible. 2017. DroneCAST: towards a programming toolkit for airborne multimedia display applications. In Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services. 1–8. https://doi.org/10.1145/3098279.3122128
[262]
Hangxin Liu, Yaofang Zhang, Wenwen Si, Xu Xie, Yixin Zhu, and Song-Chun Zhu. 2018. Interactive robot knowledge patching using augmented reality. In 2018 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 1947–1954. https://doi.org/10.1109/ICRA.2018.8462837
[263]
Kexi Liu, Daisuke Sakamoto, Masahiko Inami, and Takeo Igarashi. 2011. Roboshop: multi-layered sketching interface for robot housework assignment and management. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. 647–656. https://doi.org/10.1145/1978942.1979035
[264]
Wen P Liu, Jeremy D Richmon, Jonathan M Sorger, Mahdi Azizian, and Russell H Taylor. 2015. Augmented reality and cone beam CT guidance for transoral robotic surgery. Journal of robotic surgery 9, 3 (2015), 223–233. https://doi.org/10.1007/s11701-015-0520-5
[265]
Yuzhou Liu, Georg Novotny, Nikita Smirnov, Walter Morales-Alvarez, and Cristina Olaverri-Monreal. 2020. Mobile Delivery Robots: Mixed Reality-Based Simulation Relying on ROS and Unity 3D. In 2020 IEEE Intelligent Vehicles Symposium (IV). IEEE, 15–20. https://doi.org/10.1109/IV47402.2020.9304701
[266]
Salvatore Livatino, Filippo Banno, and Giovanni Muscato. 2011. 3-D integration of robot vision and laser data with semiautomatic calibration in augmented reality stereoscopic visual interface. IEEE Transactions on Industrial Informatics 8, 1 (2011), 69–77. https://doi.org/10.1109/tii.2011.2174062
[267]
Salvatore Livatino, Dario C Guastella, Giovanni Muscato, Vincenzo Rinaldi, Luciano Cantelli, Carmelo D Melita, Alessandro Caniglia, Riccardo Mazza, and Gianluca Padula. 2021. Intuitive robot teleoperation through multi-sensor informed mixed reality visual aids. IEEE Access 9 (2021), 25795–25808. https://doi.org/10.1109/access.2021.3057808
[268]
Salvatore Livatino, Giovanni Muscato, Filippo Banno, Davide De Tommaso, and Marco Macaluso. 2010. Video and laser based augmented reality stereoscopic viewing for mobile robot teleoperation. IFAC Proceedings Volumes 43, 23 (2010), 161–168. https://doi.org/10.3182/20101005-4-RO-2018.00049
[269]
Salvatore Livatino, Giovanni Muscato, Davide De Tommaso, and Marco Macaluso. 2010. Augmented reality stereoscopic visualization for intuitive robot teleguide. In 2010 IEEE International Symposium on Industrial Electronics. IEEE, 2828–2833. https://doi.org/10.1109/ISIE.2010.5636955
[270]
Matthew B Luebbers, Connor Brooks, Minjae John Kim, Daniel Szafir, and Bradley Hayes. 2019. Augmented reality interface for constrained learning from demonstration. In Proceedings of the 2nd International Workshop on Virtual, Augmented, and Mixed Reality for HRI (VAM-HRI).
[271]
Dario Luipers and Anja Richert. 2021. Concept of an Intuitive Human-Robot-Collaboration via Motion Tracking and Augmented Reality. In 2021 IEEE International Conference on Artificial Intelligence and Computer Applications (ICAICA). IEEE, 423–427. https://doi.org/10.1109/icaica52286.2021.9498091
[272]
Maria Luce Lupetti. 2016. Designing playful HRI: Acceptability of robots in everyday life through play. In 2016 11th ACM/IEEE International Conference on Human-Robot Interaction (HRI). IEEE, 631–632. https://doi.org/10.1109/hri.2016.7451891
[273]
Maria Luce Lupetti, Giovanni Piumatti, Claudio Germak, and Fabrizio Lamberti. 2018. Design and Evaluation of a Mixed-Reality Playground for Child-Robot Games. Multimodal Technologies and Interaction 2, 4 (2018), 69. https://doi.org/10.3390/mti2040069
[274]
Andreas Luxenburger, Jonas Mohr, Torsten Spieldenner, Dieter Merkel, Fabio Espinosa, Tim Schwartz, Florian Reinicke, Julian Ahlers, and Markus Stoyke. 2019. Augmented reality for human-robot cooperation in aircraft assembly. In 2019 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR). IEEE, 263–2633. https://doi.org/10.1109/AIVR46125.2019.00061
[275]
Stéphane Magnenat, Morderchai Ben-Ari, Severin Klinger, and Robert W Sumner. 2015. Enhancing robot programming with visual feedback and augmented reality. In Proceedings of the 2015 ACM conference on innovation and technology in computer science education. 153–158. https://doi.org/10.1145/2729094.2742585
[276]
Karthik Mahadevan, Elaheh Sanoubari, Sowmya Somanath, James E Young, and Ehud Sharlin. 2019. AV-Pedestrian interaction design using a pedestrian mixed traffic simulator. In Proceedings of the 2019 on designing interactive systems conference. 475–486. https://doi.org/10.1145/3322276.3322328
[277]
Karthik Mahadevan, Maurício Sousa, Anthony Tang, and Tovi Grossman. 2021. “Grip-that-there”: An Investigation of Explicit and Implicit Task Allocation Techniques for Human-Robot Collaboration. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. 1–14. https://doi.org/10.1145/3411764.3445355
[278]
Kartik Mahajan, Thomas Groechel, Roxanna Pakkar, Julia Cordero, Haemin Lee, and Maja J Matarić. 2020. Adapting Usability Metrics for a Socially Assistive, Kinesthetic, Mixed Reality Robot Tutoring Environment. In International Conference on Social Robotics. Springer, 381–391. https://doi.org/10.1007/978-3-030-62056-1_32
[279]
Madjid Maidi, Malik Mallem, Laredj Benchikh, and Samir Otmane. 2013. An evaluation of camera pose methods for an augmented reality system: Application to teaching industrial robots. In Transactions on Computational Science XVII. Springer, 3–30. https://doi.org/10.1007/978-3-642-35840-1_1
[280]
Jim Mainprice, Emrah Akin Sisbot, Thierry Siméon, and Rachid Alami. 2010. Planning Safe and Legible Hand-over Motions for Human-Robot Interaction. In IARP/IEEE-RAS/EURON workshop on technical challenges for dependable robots in human environments. HAL. https://hal.laas.fr/hal-01976223
[281]
Zhanat Makhataeva and Huseyin Atakan Varol. 2020. Augmented reality for robotics: a review. Robotics 9, 2 (2020), 21. https://doi.org/10.3390/robotics9020021
[282]
Zhanat Makhataeva, Altay Zhakatayev, and Huseyin Atakan Varol. 2019. Safety Aura Visualization for Variable Impedance Actuated Robots. In 2019 IEEE/SICE International Symposium on System Integration (SII). IEEE, 805–810. https://doi.org/10.1109/SII.2019.8700332
[283]
Sotiris Makris, Panagiotis Karagiannis, Spyridon Koukas, and Aleksandros-Stereos Matthaiakis. 2016. Augmented reality system for operator support in human–robot collaborative assembly. CIRP Annals 65, 1 (2016), 61–64. https://doi.org/10.1016/J.CIRP.2016.04.038
[284]
Ehsan Malayjerdi, Mahdi Yaghoobi, and Mohammad Kardan. 2017. Mobile robot navigation based on fuzzy cognitive map optimized with grey wolf optimization algorithm used in augmented reality. In 2017 5th RSI International Conference on Robotics and Mechatronics (ICRoM). IEEE, 211–218. https://doi.org/10.1109/icrom.2017.8466169
[285]
Ivo Malý, David Sedláček, and Paulo Leitão. 2016. Augmented reality experiments with industrial robot in industry 4.0 environment. In 2016 IEEE 14th international conference on industrial informatics (INDIN). IEEE, 176–181. https://doi.org/10.1109/INDIN.2016.7819154
[286]
Raúl Marín and Pedro J Sanz. 2002. Augmented reality to teleoperate a robot through the Web. IFAC Proceedings Volumes 35, 1 (2002), 161–165. https://doi.org/10.3182/20020721-6-ES-1901.00933
[287]
Andrés Martín-Barrio, Juan Jesús Roldán-Gómez, Iván Rodríguez, Jaime Del Cerro, and Antonio Barrientos. 2020. Design of a Hyper-Redundant Robot and Teleoperation Using Mixed Reality for Inspection Tasks. Sensors 20, 8 (2020), 2181. https://doi.org/10.3390/s20082181
[288]
Zdeněk Materna, Michal Kapinus, Vítězslav Beran, Pavel Smrž, Manuel Giuliani, Nicole Mirnig, Susanne Stadler, Gerald Stollnberger, and Manfred Tscheligi. 2017. Using persona, scenario, and use case to develop a human-robot augmented reality collaborative workspace. In Proceedings of the Companion of the 2017 ACM/IEEE International Conference on Human-Robot Interaction. 201–202. https://doi.org/10.1145/3029798.3038366
[289]
Zdeněk Materna, Michal Kapinus, Vítězslav Beran, Pavel Smrž, and Pavel Zemčík. 2018. Interactive spatial augmented reality in collaborative robot programming: User experience evaluation. In 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN). IEEE, 80–87. https://doi.org/10.1109/roman.2018.8525662
[290]
Florin Octavian Matu, Mikkel Thøgersen, Bo Galsgaard, Martin Møller Jensen, and Martin Kraus. 2014. Stereoscopic augmented reality system for supervised training on minimal invasive surgery robots. In Proceedings of the 2014 Virtual Reality International Conference. 1–4. https://doi.org/10.1145/2617841.2620722
[291]
William A McNeely. 1993. Robotic graphics: a new approach to force feedback for virtual reality. In Proceedings of IEEE Virtual Reality Annual International Symposium. IEEE, 336–341. https://doi.org/10.1109/VRAIS.1993.380761
[292]
George Michalos, Panagiotis Karagiannis, Sotiris Makris, Önder Tokçalar, and George Chryssolouris. 2016. Augmented reality (AR) applications for supporting human-robot interactive cooperation. Procedia CIRP 41(2016), 370–375. https://doi.org/10.1016/J.PROCIR.2015.12.005
[293]
Paul Milgram and Fumio Kishino. 1994. A taxonomy of mixed reality visual displays. IEICE Transactions on Information and Systems 77, 12 (1994), 1321–1329.
[294]
Paul Milgram, Shumin Zhai, David Drascic, and Julius Grodski. 1993. Applications of augmented reality for human-robot communication. In Proceedings of 1993 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS’93), Vol. 3. IEEE, 1467–1472. https://doi.org/10.1109/IROS.1993.583833
[295]
Omid Mohareri and Ahmad B Rad. 2011. Autonomous humanoid robot navigation using augmented reality technique. In 2011 IEEE International Conference on Mechatronics. IEEE, 463–468. https://doi.org/10.1109/icmech.2011.5971330
[296]
Nicolas Mollet, Ryad Chellali, and Luca Brayda. 2009. Virtual and augmented reality tools for teleoperation: improving distant immersion and perception. In Transactions on Edutainment II. Springer, 135–159. https://doi.org/10.1007/978-3-642-03270-7_10
[297]
William Montalvo, Pablo Bonilla-Vasconez, Santiago Altamirano, Carlos A Garcia, and Marcelo V Garcia. 2020. Industrial Control Robot Based on Augmented Reality and IoT Protocol. In International Conference on Augmented Reality, Virtual Reality and Computer Graphics. Springer, 345–363. https://doi.org/10.1007/978-3-030-58468-9_25
[298]
Rafael Morales, Asier Marzo, Sriram Subramanian, and Diego Martínez. 2019. LeviProps: Animating levitated optimized fabric structures using holographic acoustic tweezers. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology. 651–661. https://doi.org/10.1145/3332165.3347882
[299]
Stephen A Morin, Robert F Shepherd, Sen Wai Kwok, Adam A Stokes, Alex Nemiroski, and George M Whitesides. 2012. Camouflage and display for soft machines. Science 337, 6096 (2012), 828–832. https://doi.org/10.1126/science.1222149
[300]
Kohei Morita, Takefumi Hiraki, Haruka Matsukura, Daisuke Iwai, and Kosuke Sato. 2020. Extension of Projection Area using Head Orientation in Projected Virtual Hand Interface for Wheelchair Users. In 2020 59th Annual Conference of the Society of Instrument and Control Engineers of Japan (SICE). IEEE, 421–426. https://doi.org/10.23919/SICE48898.2020.9240271
[301]
D Mourtzis, G Synodinos, J Angelopoulos, and N Panopoulos. 2020. An augmented reality application for robotic cell customization. Procedia CIRP 90 (2020), 654–659. https://doi.org/10.1016/j.procir.2020.02.135
[302]
Dimitris Mourtzis, Vasilios Zogopoulos, and E Vlachou. 2017. Augmented reality application to support remote maintenance as a service in the robotics industry. Procedia CIRP 63 (2017), 46–51. https://doi.org/10.1016/j.procir.2017.03.154
[303]
Fabian Mueller, Christian Deuerlein, and Michael Koch. 2019. Intuitive welding robot programming via motion capture and augmented reality. IFAC-PapersOnLine 52, 10 (2019), 294–299. https://doi.org/10.1016/j.ifacol.2019.10.045
[304]
Stefanie Mueller, Pedro Lopes, and Patrick Baudisch. 2012. Interactive construction: interactive fabrication of functional mechanical devices. In Proceedings of the 25th annual ACM symposium on User interface software and technology. 599–606. https://doi.org/10.1145/2380116.2380191
[305]
Faizan Muhammad, Amel Hassan, Andre Cleaver, and Jivko Sinapov. 2019. Creating a shared reality with robots. In 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI). IEEE, 614–615. https://doi.org/10.1109/HRI.2019.8673191
[306]
Alex Murphy and Alan G Millard. 2020. Prototyping Sensors and Actuators for Robot Swarms in Mixed Reality. In Annual Conference Towards Autonomous Robotic Systems. Springer, 377–386. https://doi.org/10.1007/978-3-030-63486-5_39
[307]
Bilge Mutlu, Jodi Forlizzi, and Jessica Hodgins. 2006. Modeling and evaluation of human-like gaze behavior. In 2006 6th IEEE-RAS International Conference on Humanoid Robots. IEEE, 518–523. https://doi.org/10.1109/ICHR.2006.321322
[308]
Ken Nakagaki, Luke Vink, Jared Counts, Daniel Windham, Daniel Leithinger, Sean Follmer, and Hiroshi Ishii. 2016. Materiable: Rendering dynamic material properties in response to direct physical touch with shape changing interfaces. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. 2764–2772. https://doi.org/10.1145/2858036.2858104
[309]
Nassir Navab, Christoph Hennersperger, Benjamin Frisch, and Bernhard Fürst. 2016. Personalized, relevance-based multimodal robotic imaging and augmented reality for computer assisted interventions. Medical Image Analysis 33 (2016), 64–71. https://doi.org/10.1016/j.media.2016.06.021
[310]
Aditya Nawab, Keshav Chintamani, Darin Ellis, Gregory Auner, and Abhilash Pandya. 2007. Joystick mapped augmented reality cues for end-effector controlled tele-operated robots. In 2007 IEEE Virtual Reality Conference. IEEE, 263–266. https://doi.org/10.1109/vr.2007.352496
[311]
Michael Nebeling, Janet Nebeling, Ao Yu, and Rob Rumble. 2018. Protoar: Rapid physical-digital prototyping of mobile augmented reality applications. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems(CHI ’18). Association for Computing Machinery, 1–12. https://doi.org/10.1145/3173574.3173927
[312]
Chuen Leong Ng, Teck Chew Ng, Thi Anh Ngoc Nguyen, Guilin Yang, and Wenjie Chen. 2010. Intuitive robot tool path teaching using laser and camera in augmented reality environment. In 2010 11th International Conference on Control Automation Robotics & Vision. IEEE, 114–119. https://doi.org/10.1109/icarcv.2010.5707399
[313]
D Ni, AWW Yew, SK Ong, and AYC Nee. 2017. Haptic and visual augmented reality interface for programming welding robots. Advances in Manufacturing 5, 3 (2017), 191–198. https://doi.org/10.1007/s40436-017-0184-7
[314]
Sivapong Nilwong and Genci Capi. 2020. Outdoor Robot Navigation System using Game-Based DQN and Augmented Reality. In 2020 17th International Conference on Ubiquitous Robots (UR). IEEE, 74–80. https://doi.org/10.1109/ur49135.2020.9144838
[315]
Koichi Nishiwaki, Kazuhiko Kobayashi, Shinji Uchiyama, Hiroyuki Yamamoto, and Satoshi Kagami. 2008. Mixed reality environment for autonomous robot development. In 2008 IEEE International Conference on Robotics and Automation. IEEE, 2211–2212. https://doi.org/10.1109/ROBOT.2008.4543538
[316]
Diana Nowacka, Karim Ladha, Nils Y Hammerla, Daniel Jackson, Cassim Ladha, Enrico Rukzio, and Patrick Olivier. 2013. Touchbugs: Actuated tangibles on multi-touch tables. In Proceedings of the SIGCHI conference on human factors in computing systems. 759–762. https://doi.org/10.1145/2470654.2470761
[317]
Hiroki Nozaki. 2014. Flying display: a movable display pairing projector and screen in the air. In CHI ’14 Extended Abstracts on Human Factors in Computing Systems. 909–914. https://doi.org/10.1145/2559206.2579410
[318]
R Nunez, JR Bandera, JM Perez-Lorenzo, and Francisco Sandoval. 2006. A human-robot interaction system for navigation supervision based on augmented reality. In MELECON 2006-2006 IEEE Mediterranean Electrotechnical Conference. IEEE, 441–444. https://doi.org/10.1109/melcon.2006.1653133
[319]
Cristina Nuzzi, Stefano Ghidini, Roberto Pagani, Simone Pasinetti, Gabriele Coffetti, and Giovanna Sansoni. 2020. Hands-Free: a robot augmented reality teleoperation system. In 2020 17th International Conference on Ubiquitous Robots (UR). IEEE, 617–624. https://doi.org/10.1109/ur49135.2020.9144841
[320]
Yoichi Ochiai and Keisuke Toyoshima. 2011. Homunculus: the vehicle as augmented clothes. In Proceedings of the 2nd Augmented Human International Conference. 1–4. https://doi.org/10.1145/1959826.1959829
[321]
Yusuke Okuno, Takayuki Kanda, Michita Imai, Hiroshi Ishiguro, and Norihiro Hagita. 2009. Providing route directions: design of robot’s utterance, gesture, and timing. In 2009 4th ACM/IEEE International Conference on Human-Robot Interaction (HRI). IEEE, 53–60. https://doi.org/10.1145/1514095.1514108
[322]
Shayegan Omidshafiei, Ali-Akbar Agha-Mohammadi, Yu Fan Chen, Nazim Kemal Ure, Shih-Yuan Liu, Brett T Lopez, Rajeev Surati, Jonathan P How, and John Vian. 2016. Measurable augmented reality for prototyping cyberphysical systems: A robotics platform to aid the hardware prototyping and performance testing of algorithms. IEEE Control Systems Magazine 36, 6 (2016), 65–87. https://doi.org/10.1109/mcs.2016.2602090
[323]
SK Ong, AWW Yew, NK Thanigaivel, and AYC Nee. 2020. Augmented reality-assisted robot programming system for industrial applications. Robotics and Computer-Integrated Manufacturing 61 (2020), 101820. https://doi.org/10.1016/J.RCIM.2019.101820
[324]
Soh-Khim Ong, JWS Chong, and Andrew YC Nee. 2006. Methodologies for immersive robot programming in an augmented reality environment. In Proceedings of the 4th international conference on computer graphics and interactive techniques in Australasia and Southeast Asia. 237–244. https://doi.org/10.1145/1174429.1174470
[325]
Mikhail Ostanin and Alexandr Klimchik. 2018. Interactive robot programing using mixed reality. IFAC-PapersOnLine 51, 22 (2018), 50–55. https://doi.org/10.1016/j.ifacol.2018.11.517
[326]
Mikhail Ostanin, Stanislav Mikhel, Alexey Evlampiev, Valeria Skvortsova, and Alexandr Klimchik. 2020. Human-robot interaction for robotic manipulator programming in Mixed Reality. In 2020 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2805–2811. https://doi.org/10.1109/ICRA40945.2020.9196965
[327]
M Ostanin, R Yagfarov, and A Klimchik. 2019. Interactive Robots Control Using Mixed Reality. IFAC-PapersOnLine 52, 13 (2019), 695–700. https://doi.org/10.1016/j.ifacol.2019.11.307
[328]
Ayberk Özgür, Séverin Lemaignan, Wafa Johal, Maria Beltran, Manon Briod, Léa Pereyre, Francesco Mondada, and Pierre Dillenbourg. 2017. Cellulo: Versatile handheld robots for education. In 2017 12th ACM/IEEE International Conference on Human-Robot Interaction (HRI). IEEE, 119–127. https://doi.org/10.1145/2909824.3020247
[329]
Yun Suen Pai, Hwa Jen Yap, Siti Zawiah Md Dawal, S Ramesh, and Sin Ye Phoon. 2016. Virtual planning, control, and machining for a modular-based automated factory operation in an augmented reality environment. Scientific reports 6, 1 (2016), 1–19. https://doi.org/10.1038/srep27380
[330]
Yong Pan, Chengjun Chen, Dongnian Li, Zhengxu Zhao, and Jun Hong. 2021. Augmented reality-based robot teleoperation system using RGB-D imaging and attitude teaching device. Robotics and Computer-Integrated Manufacturing 71 (2021), 102167. https://doi.org/10.1016/J.RCIM.2021.102167
[331]
Gian Pangaro, Dan Maynes-Aminzade, and Hiroshi Ishii. 2002. The actuated workbench: computer-controlled actuation in tabletop tangible interfaces. In Proceedings of the 15th annual ACM symposium on User interface software and technology. 181–190. https://doi.org/10.1145/571985.572011
[332]
Christos Papachristos and Kostas Alexis. 2016. Augmented reality-enhanced structural inspection using aerial robots. In 2016 IEEE international symposium on intelligent control (ISIC). IEEE, 1–6. https://doi.org/10.1109/ISIC.2016.7579983
[333]
Peter Papcun, Jan Cabadaj, Erik Kajati, David Romero, Lenka Landryova, Jan Vascak, and Iveta Zolotova. 2019. Augmented Reality for Humans-Robots Interaction in Dynamic Slotting “Chaotic Storage” Smart Warehouses. In IFIP International Conference on Advances in Production Management Systems. Springer, 633–641. https://doi.org/10.1007/978-3-030-30000-5_77
[334]
Hyeshin Park, Yo-An Lim, Aslam Pervez, Beom-Chan Lee, Sang-Goog Lee, and Jeha Ryu. 2007. Teleoperation of a multi-purpose robot over the internet using augmented reality. In 2007 International Conference on Control, Automation and Systems. IEEE, 2456–2461. https://doi.org/10.1109/iccas.2007.4406776
[335]
Jung Pil Park, Min Woo Park, and Soon Ki Jung. 2014. Qr-code based online robot augmented reality system for education. In Proceedings of the 29th Annual ACM Symposium on Applied Computing. 180–185. https://doi.org/10.1145/2554850.2555038
[336]
Kyeong-Beom Park, Sung Ho Choi, Jae Yeol Lee, Yalda Ghasemi, Mustafa Mohammed, and Heejin Jeong. 2021. Hands-Free Human–Robot Interaction Using Multimodal Gestures and Deep Learning in Wearable Mixed Reality. IEEE Access 9 (2021), 55448–55464. https://doi.org/10.1109/access.2021.3071364
[337]
Yoon Jung Park, Hyocheol Ro, and Tack-Don Han. 2019. Deep-ChildAR bot: educational activities and safety care augmented reality system with deep-learning for preschool. In ACM SIGGRAPH 2019 Posters. 1–2. https://doi.org/10.1145/3306214.3338589
[338]
Yoon Jung Park, Yoonsik Yang, Hyocheol Ro, JungHyun Byun, Seougho Chae, and Tack Don Han. 2018. Meet AR-bot: Meeting Anywhere, Anytime with Movable Spatial AR Robot. In Proceedings of the 26th ACM international conference on Multimedia. 1242–1243. https://doi.org/10.1145/3240508.3241390
[339]
James Patten and Hiroshi Ishii. 2007. Mechanical constraints as computational constraints in tabletop tangible interfaces. In Proceedings of the SIGCHI conference on Human factors in computing systems. 809–818. https://doi.org/10.1145/1240624.1240746
[340]
Esben Warming Pedersen and Kasper Hornbæk. 2011. Tangible bots: interaction with active tangibles in tabletop interfaces. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. 2975–2984. https://doi.org/10.1145/1978942.1979384
[341]
Huaishu Peng, Jimmy Briggs, Cheng-Yao Wang, Kevin Guo, Joseph Kider, Stefanie Mueller, Patrick Baudisch, and François Guimbretière. 2018. RoMA: Interactive fabrication with augmented reality and a robotic 3D printer. In Proceedings of the 2018 CHI conference on human factors in computing systems. 1–12. https://doi.org/10.1145/3173574.3174153
[342]
Lorenzo Peppoloni, Filippo Brizzi, Emanuele Ruffaldi, and Carlo Alberto Avizzano. 2015. Augmented reality-aided tele-presence system for robot manipulation in industrial manufacturing. In Proceedings of the 21st ACM Symposium on Virtual Reality Software and Technology. 237–240. https://doi.org/10.1145/2821592.2821620
[343]
Nate Phillips, Brady Kruse, Farzana Alam Khan, J Edward Swan II, and Cindy L Bethel. 2020. A Robotic Augmented Reality Virtual Window for Law Enforcement Operations. In International Conference on Human-Computer Interaction. Springer, 591–610. https://doi.org/10.1007/978-3-030-49695-1_40
[344]
Luis Piardi, Vivian Cremer Kalempa, Marcelo Limeira, André Schneider de Oliveira, and Paulo Leitão. 2019. ARENA—augmented reality to enhanced experimentation in smart warehouses. Sensors 19, 19 (2019), 4308. https://doi.org/10.3390/s19194308
[345]
Carlo Pinciroli, Mohamed S Talamali, Andreagiovanni Reina, James AR Marshall, and Vito Trianni. 2018. Simulating kilobots within argos: models and experimental validation. In International Conference on Swarm Intelligence. Springer, 176–187. https://doi.org/10.1007/978-3-030-00533-7_14
[346]
Giovanni Piumatti, Andrea Sanna, Marco Gaspardone, and Fabrizio Lamberti. 2017. Spatial augmented reality meets robots: Human-machine interaction in cloud-based projected gaming environments. In 2017 IEEE International Conference on Consumer Electronics (ICCE). IEEE, 176–179. https://doi.org/10.1109/ICCE.2017.7889276
[347]
Francesco Porpiglia, Enrico Checcucci, Daniele Amparore, Matteo Manfredi, Federica Massa, Pietro Piazzolla, Diego Manfrin, Alberto Piana, Daniele Tota, Enrico Bollito, et al. 2019. Three-dimensional elastic augmented-reality robot-assisted radical prostatectomy using hyperaccuracy three-dimensional reconstruction technology: a step further in the identification of capsular involvement. European urology 76, 4 (2019), 505–514. https://doi.org/10.1016/j.eururo.2019.03.037
[348]
Francesco Porpiglia, Enrico Checcucci, Daniele Amparore, Federico Piramide, Gabriele Volpi, Stefano Granato, Paolo Verri, Matteo Manfredi, Andrea Bellin, Pietro Piazzolla, et al. 2020. Three-dimensional augmented reality robot-assisted partial nephrectomy in case of complex tumours (PADUA ≥ 10): a new intraoperative tool overcoming the ultrasound guidance. European urology 78, 2 (2020), 229–238. https://doi.org/10.1016/j.eururo.2019.11.024
[349]
Ivan Poupyrev, Tatsushi Nashida, and Makoto Okabe. 2007. Actuation and tangible user interfaces: the Vaucanson duck, robots, and shape displays. In Proceedings of the 1st international conference on Tangible and embedded interaction. 205–212. https://doi.org/10.1145/1226969.1227012
[350]
F Gabriele Pratticò, Alberto Cannavò, Junchao Chen, and Fabrizio Lamberti. 2019. User Perception of Robot’s Role in Floor Projection-based Mixed-Reality Robotic Games. In 2019 IEEE 23rd International Symposium on Consumer Technologies (ISCT). IEEE, 76–81. https://doi.org/10.1109/isce.2019.8901037
[351]
Filippo Gabriele Pratticò, Francisco Navarro Merino, and Fabrizio Lamberti. 2020. Is Learning by Teaching an Effective Approach in Mixed-Reality Robotic Training Systems?. In International Conference on Intelligent Technologies for Interactive Entertainment. Springer, 177–190. https://doi.org/10.1007/978-3-030-76426-5_12
[352]
David Puljiz, Franziska Krebs, Fabian Bosing, and Bjorn Hein. 2020. What the HoloLens Maps Is Your Workspace: Fast Mapping and Set-up of Robot Cells via Head Mounted Displays and Augmented Reality. In 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 11445–11451. https://doi.org/10.1109/iros45743.2020.9340879
[353]
Isabel PS Qamar, Rainer Groh, David Holman, and Anne Roudaut. 2018. HCI meets material science: A literature review of morphing materials for the design of shape-changing interfaces. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. 1–23. https://doi.org/10.1145/3173574.3173948
[354]
Long Qian, Anton Deguet, Zerui Wang, Yun-Hui Liu, and Peter Kazanzides. 2019. Augmented reality assisted instrument insertion and tool manipulation for the first assistant in robotic surgery. In 2019 International Conference on Robotics and Automation (ICRA). IEEE, 5173–5179. https://doi.org/10.1109/ICRA.2019.8794263
[355]
Long Qian, Chengzhi Song, Yiwei Jiang, Qi Luo, Xin Ma, Philip Waiyan Chiu, Zheng Li, and Peter Kazanzides. 2020. FlexiVision: Teleporting the Surgeon’s Eyes via Robotic Flexible Endoscope and Head-Mounted Display. In 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 3281–3287. https://doi.org/10.1109/IROS45743.2020.9340716
[356]
Long Qian, Jie Ying Wu, Simon P DiMaio, Nassir Navab, and Peter Kazanzides. 2019. A review of augmented reality in robotic-assisted surgery. IEEE Transactions on Medical Robotics and Bionics 2, 1 (2019), 1–16. https://doi.org/10.1109/tmrb.2019.2957061
[357]
Shuwen Qiu, Hangxin Liu, Zeyu Zhang, Yixin Zhu, and Song-Chun Zhu. 2020. Human-Robot Interaction in a Shared Augmented Reality Workspace. In 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 11413–11418. https://doi.org/10.1109/iros45743.2020.9340781
[358]
Camilo Perez Quintero, Sarah Li, Matthew KXJ Pan, Wesley P Chan, HF Machiel Van der Loos, and Elizabeth Croft. 2018. Robot programming through augmented trajectories in augmented reality. In 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 1838–1844. https://doi.org/10.1109/IROS.2018.8593700
[359]
Majken K Rasmussen, Esben W Pedersen, Marianne G Petersen, and Kasper Hornbæk. 2012. Shape-changing interfaces: a review of the design space and open research questions. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. 735–744. https://doi.org/10.1145/2207676.2207781
[360]
Christopher Reardon, Jason Gregory, Carlos Nieto-Granda, and John G Rogers. 2020. Enabling Situational Awareness via Augmented Reality of Autonomous Robot-Based Environmental Change Detection. In International Conference on Human-Computer Interaction. Springer, 611–628. https://doi.org/10.1007/978-3-030-49695-1_41
[361]
Christopher Reardon, Kevin Lee, and Jonathan Fink. 2018. Come see this! augmented reality to enable human-robot cooperative search. In 2018 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR). IEEE, 1–7. https://doi.org/10.1109/SSRR.2018.8468622
[362]
Christopher Reardon, Kevin Lee, John G Rogers, and Jonathan Fink. 2019. Augmented reality for human-robot teaming in field environments. In International Conference on Human-Computer Interaction. Springer, 79–92. https://doi.org/10.1007/978-3-030-21565-1_6
[363]
Christopher Reardon, Kevin Lee, John G Rogers, and Jonathan Fink. 2019. Communicating via augmented reality for human-robot teaming in field environments. In 2019 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR). IEEE, 94–101. https://doi.org/10.1109/SSRR.2019.8848971
[364]
Andreagiovanni Reina, Mattia Salvaro, Gianpiero Francesca, Lorenzo Garattoni, Carlo Pinciroli, Marco Dorigo, and Mauro Birattari. 2015. Augmented reality for robots: virtual sensing technology applied to a swarm of e-pucks. In 2015 NASA/ESA Conference on Adaptive Hardware and Systems (AHS). IEEE, 1–6. https://doi.org/10.1109/ahs.2015.7231154
[365]
Ying Ren and Jiro Tanaka. 2019. Augmented Reality Based Actuated Monitor Manipulation from Dual Point of View. In International Conference on Human-Computer Interaction. Springer, 93–107. https://doi.org/10.1007/978-3-030-21565-1_7
[366]
Patrick Renner, Florian Lier, Felix Friese, Thies Pfeiffer, and Sven Wachsmuth. 2018. Facilitating HRI by mixed reality techniques. In Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction. 215–216. https://doi.org/10.1145/3173386.3177032
[367]
Patrick Renner, Florian Lier, Felix Friese, Thies Pfeiffer, and Sven Wachsmuth. 2018. WYSIWICD: What you see is what I can do. In Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction. 382–382. https://doi.org/10.1145/3173386.3177533
[368]
Hyocheol Ro, Jung-Hyun Byun, Inhwan Kim, Yoon Jung Park, Kyuri Kim, and Tack-Don Han. 2019. Projection-based augmented reality robot prototype with human-awareness. In 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI). IEEE, 598–599. https://doi.org/10.1109/HRI.2019.8673173
[369]
David Robert and Cynthia Breazeal. 2012. Blended reality characters. In Proceedings of the seventh annual ACM/IEEE international conference on Human-Robot Interaction. 359–366. https://doi.org/10.1145/2157689.2157810
[370]
David Robert, Ryan Wistorrt, Jesse Gray, and Cynthia Breazeal. 2010. Exploring mixed reality robot gaming. In Proceedings of the fifth international conference on tangible, embedded, and embodied interaction. 125–128. https://doi.org/10.1145/1935701.1935726
[371]
Nancy Rodriguez, Luis Jose Pulido, and Jean-Pierre Jessel. 2004. Enhancing a telerobotics Java tool with augmented reality. In International Symposium and School on Advanced Distributed Systems. Springer, 9–18. https://doi.org/10.1007/978-3-540-25958-9_2
[372]
Eric Rosen, David Whitney, Elizabeth Phillips, Gary Chien, James Tompkin, George Konidaris, and Stefanie Tellex. 2020. Communicating robot arm motion intent through mixed reality head-mounted displays. In Robotics research. Springer, 301–316. https://doi.org/10.1007/978-3-030-28619-4_26
[373]
Alexandros Rotsidis, Andreas Theodorou, Joanna J Bryson, and Robert H Wortham. 2019. Improving robot transparency: An investigation with mobile augmented reality. In 2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN). IEEE, 1–8. https://doi.org/10.1109/ro-man46459.2019.8956390
[374]
Anne Roudaut, Abhijit Karnik, Markus Löchtefeld, and Sriram Subramanian. 2013. Morphees: toward high “shape resolution” in self-actuated flexible mobile devices. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. 593–602. https://doi.org/10.1145/2470654.2470738
[375]
Dávid Rozenberszki and Gábor Sörös. 2021. Towards Universal User Interfaces for Mobile Robots. In Augmented Humans Conference 2021. 274–276. https://doi.org/10.1145/3458709.3458996
[376]
Emanuele Ruffaldi, Filippo Brizzi, Franco Tecchia, and Sandro Bacinelli. 2016. Third point of view augmented reality for robot intentions visualization. In International Conference on Augmented Reality, Virtual Reality and Computer Graphics. Springer, 471–478. https://doi.org/10.1007/978-3-319-40621-3_35
[377]
JJ Ruiz, A Viguria, JR Martinez-de Dios, and A Ollero. 2015. Immersive displays for building spatial knowledge in multi-UAV operations. In 2015 International Conference on Unmanned Aircraft Systems (ICUAS). IEEE, 1043–1048. https://doi.org/10.1109/icuas.2015.7152395
[378]
Golnoosh Samei, Keith Tsang, Claudia Kesch, Julio Lobo, Soheil Hor, Omid Mohareri, Silvia Chang, S Larry Goldenberg, Peter C Black, and Septimiu Salcudean. 2020. A partial augmented reality system with live ultrasound and registered preoperative MRI for guiding robot-assisted radical prostatectomy. Medical image analysis 60 (2020), 101588. https://doi.org/10.1016/j.media.2019.101588
[379]
Yasumitsu Sarai and Yusuke Maeda. 2017. Robot programming for manipulators through volume sweeping and augmented reality. In 2017 13th IEEE Conference on Automation Science and Engineering (CASE). IEEE, 302–307. https://doi.org/10.1109/COASE.2017.8256120
[380]
Markus Sauer, Frauke Driewer, Manuel Göllnitz, and Klaus Schilling. 2007. Potential and challenges of stereo augmented reality for mobile robot teleoperation. IFAC Proceedings Volumes 40, 16 (2007), 183–188. https://doi.org/10.3182/20070904-3-KR-2922.00032
[381]
Markus Sauer, Martin Hess, and Klaus Schilling. 2009. Towards a predictive mixed reality user interface for mobile robot teleoperation. IFAC Proceedings Volumes 42, 22 (2009), 91–96. https://doi.org/10.3182/20091006-3-US-4006.00016
[382]
Jürgen Scheible, Achim Hoth, Julian Saal, and Haifeng Su. 2013. Displaydrone: a flying robot based interactive display. In Proceedings of the 2nd ACM International Symposium on Pervasive Displays. 49–54. https://doi.org/10.1145/2491568.2491580
[383]
Riccardo Schiavina, Lorenzo Bianchi, Francesco Chessa, Umberto Barbaresi, Laura Cercenelli, Simone Lodi, Caterina Gaudiano, Barbara Bortolani, Andrea Angiolini, Federico Mineo Bianchi, et al. 2021. Augmented reality to guide selective clamping and tumor dissection during robot-assisted partial nephrectomy: a preliminary experience. Clinical genitourinary cancer 19, 3 (2021), e149–e155. https://doi.org/10.1016/j.clgc.2020.09.005
[384]
Riccardo Schiavina, Lorenzo Bianchi, Simone Lodi, Laura Cercenelli, Francesco Chessa, Barbara Bortolani, Caterina Gaudiano, Carlo Casablanca, Matteo Droghetti, Angelo Porreca, et al. 2021. Real-time augmented reality three-dimensional guided robotic radical prostatectomy: preliminary experience and evaluation of the impact on surgical planning. European urology focus 7, 6 (2021), 1260–1267. https://doi.org/10.1016/j.euf.2020.08.004
[385]
Jan Schmitt, Andreas Hillenbrand, Philipp Kranz, and Tobias Kaupp. 2021. Assisted Human-Robot-Interaction for Industrial Assembly: Application of Spatial Augmented Reality (SAR) for Collaborative Assembly Tasks. In Companion of the 2021 ACM/IEEE International Conference on Human-Robot Interaction. 52–56. https://doi.org/10.1145/3434074.3447127
[386]
Julian Seifert, Sebastian Boring, Christian Winkler, Florian Schaub, Fabian Schwab, Steffen Herrdum, Fabian Maier, Daniel Mayer, and Enrico Rukzio. 2014. Hover Pad: interacting with autonomous and self-actuated displays in space. In Proceedings of the 27th annual ACM symposium on User interface software and technology. 139–147. https://doi.org/10.1145/2642918.2647385
[387]
Ronny Seiger, Mandy Korzetz, Maria Gohlke, and Uwe Aßmann. 2017. Mixed reality cyber-physical systems control and workflow composition. In Proceedings of the 16th International Conference on Mobile and Ubiquitous Multimedia. 495–500. https://doi.org/10.1145/3152832.3157808
[388]
Martin Selecký, Jan Faigl, and Milan Rollo. 2019. Analysis of using mixed reality simulations for incremental development of multi-uav systems. Journal of Intelligent & Robotic Systems 95, 1 (2019), 211–227. https://doi.org/10.1007/S10846-018-0875-8
[389]
Atsushi Sengiku, Masanao Koeda, Atsuro Sawada, Jin Kono, Naoki Terada, Toshinari Yamasaki, Kiminori Mizushino, Takahiro Kunii, Katsuhiko Onishi, Hiroshi Noborio, et al. 2017. Augmented reality navigation system for robot-assisted laparoscopic partial nephrectomy. In International Conference of Design, User Experience, and Usability. Springer, 575–584. https://doi.org/10.1007/978-3-319-58637-3_45
[390]
Rossitza Setchi, Maryam Banitalebi Dehkordi, and Juwairiya Siraj Khan. 2020. Explainable Robotics in Human-Robot Interactions. Procedia Computer Science 176 (2020), 3057–3066. https://doi.org/10.1016/j.procs.2020.09.198
[391]
Nikitas M Sgouros and Sophia Kousidou. 2001. Generation and implementation of mixed-reality, narrative performances involving robotic actors. In International Conference on Virtual Storytelling. Springer, 69–78. https://doi.org/10.1007/3-540-45420-9_9
[392]
Dylan Shah, Bilige Yang, Sam Kriegman, Michael Levin, Josh Bongard, and Rebecca Kramer-Bottiglio. 2021. Shape changing robots: bioinspiration, simulation, and physical realization. Advanced Materials 33, 19 (2021), 2002882. https://doi.org/10.1002/adma.202002882
[393]
Shyang Shao, Satoshi Muramatsu, Katsuhiko Inagaki, Daisuke Chugo, Syo Yokota, and Hiroshi Hashimoto. 2019. Development of robot design evaluating system using Augmented Reality for affinity robots. In 2019 IEEE 17th International Conference on Industrial Informatics (INDIN), Vol. 1. IEEE, 815–820. https://doi.org/10.1109/indin41052.2019.8972057
[394]
Jun Shen, Nabil Zemiti, Christophe Taoum, Guillaume Aiche, Jean-Louis Dillenseger, Philippe Rouanet, and Philippe Poignet. 2020. Transrectal ultrasound image-based real-time augmented reality guidance in robot-assisted laparoscopic rectal surgery: a proof-of-concept study. International journal of computer assisted radiology and surgery 15, 3 (2020), 531–543. https://doi.org/10.1007/s11548-019-02100-2
[395]
Noriyoshi Shimizu, Maki Sugimoto, Dairoku Sekiguchi, Shoichi Hasegawa, and Masahiko Inami. 2008. Mixed reality robotic user interface: virtual kinematics to enhance robot motion. In Proceedings of the 2008 International Conference on Advances in Computer Entertainment Technology. 166–169. https://doi.org/10.1145/1501750.1501789
[396]
Elena Sibirtseva, Ali Ghadirzadeh, Iolanda Leite, Mårten Björkman, and Danica Kragic. 2019. Exploring temporal dependencies in multimodal referring expressions with mixed reality. In International Conference on Human-Computer Interaction. Springer, 108–123. https://doi.org/10.1007/978-3-030-21565-1_8
[397]
Dietmar Siegele, Dieter Steiner, Andrea Giusti, Michael Riedl, and Dominik T Matt. 2021. Optimizing Collaborative Robotic Workspaces in Industry by Applying Mixed Reality. In International Conference on Augmented Reality, Virtual Reality and Computer Graphics. Springer, 544–559. https://doi.org/10.1007/978-3-030-87595-4_40
[398]
Torsten Sebastian Sievers, Bianca Schmitt, Patrick Rückert, Maren Petersen, and Kirsten Tracht. 2020. Concept of a Mixed-Reality Learning Environment for Collaborative Robotics. Procedia Manufacturing 45 (2020), 19–24. https://doi.org/10.1016/j.promfg.2020.04.034
[399]
David Sirkin, Brian Mok, Stephen Yang, and Wendy Ju. 2015. Mechanical ottoman: how robotic furniture offers and withdraws support. In Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction. 11–18. https://doi.org/10.1145/2696454.2696461
[400]
Enrico Sita, Matthew Studley, Farid Dailami, Anthony Pipe, and Trygve Thomessen. 2017. Towards multimodal interactions: robot jogging in mixed reality. In Proceedings of the 23rd ACM Symposium on Virtual Reality Software and Technology. 1–2. https://doi.org/10.1145/3139131.3141200
[401]
Alexa F Siu, Shenli Yuan, Hieu Pham, Eric Gonzalez, Lawrence H Kim, Mathieu Le Goc, and Sean Follmer. 2018. Investigating tangible collaboration for design towards augmented physical telepresence. In Design thinking research. Springer, 131–145. https://doi.org/10.1007/978-3-319-60967-6_7
[402]
J Ernesto Solanes, Adolfo Muñoz, Luis Gracia, Ana Martí, Vicent Girbés-Juan, and Josep Tornero. 2020. Teleoperation of industrial robot manipulators based on augmented reality. The International Journal of Advanced Manufacturing Technology 111, 3 (2020), 1077–1097. https://doi.org/10.1007/s00170-020-05997-1
[403]
Sichao Song and Seiji Yamada. 2018. Bioluminescence-inspired human-robot interaction: designing expressive lights that affect human’s willingness to interact with a robot. In Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction. ACM, 224–232. https://doi.org/10.1145/3171221.3171249
[404]
Adam Sosa, Richard Stanton, Stepheny Perez, Christian Keyes-Garcia, Sara Gonzalez, and Zachary O Toups. 2015. Imperfect robot control in a mixed reality game to teach hybrid human-robot team coordination. In Proceedings of the 2015 Annual Symposium on Computer-Human Interaction in Play. 697–702. https://doi.org/10.1145/2793107.2810288
[405]
Maximilian Speicher, Brian D Hall, and Michael Nebeling. 2019. What is mixed reality?. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. 1–15. https://doi.org/10.1145/3290605.3300767
[406]
Aaron St. Clair and Maja Mataric. 2015. How robot verbal feedback can improve team performance in human-robot task collaborations. In Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction. 213–220. https://doi.org/10.1145/2696454.2696491
[407]
Susanne Stadler, Kevin Kain, Manuel Giuliani, Nicole Mirnig, Gerald Stollnberger, and Manfred Tscheligi. 2016. Augmented reality for industrial robot programmers: Workload analysis for task-based, augmented reality-supported robot control. In 2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN). IEEE, 179–184. https://doi.org/10.1109/ROMAN.2016.7745108
[408]
Gordon Stein and Ákos Lédeczi. 2019. Mixed reality robotics for stem education. In 2019 IEEE Blocks and Beyond Workshop (B&B). IEEE, 49–53. https://doi.org/10.1109/bb48857.2019.8941229
[409]
Camille Linick Stewart, Abigail Fong, Govinda Payyavula, Simon DiMaio, Kelly Lafaro, Kirsten Tallmon, Sherry Wren, Jonathan Sorger, and Yuman Fong. 2021. Study on augmented reality for robotic surgery bedside assistants. Journal of Robotic Surgery (2021), 1–8. https://doi.org/10.1007/s11701-021-01335-z
[410]
Dominykas Strazdas, Jan Hintz, and Ayoub Al-Hamadi. 2021. Robo-HUD: Interaction Concept for Contactless Operation of Industrial Cobotic Systems. Applied Sciences 11, 12 (2021), 5366. https://doi.org/10.3390/APP11125366
[411]
Li-Ming Su, Balazs P Vagvolgyi, Rahul Agarwal, Carol E Reiley, Russell H Taylor, and Gregory D Hager. 2009. Augmented reality during robot-assisted laparoscopic partial nephrectomy: toward real-time 3D-CT to stereoscopic video registration. Urology 73, 4 (2009), 896–900. https://doi.org/10.1016/j.urology.2008.11.040
[412]
Mu-Chun Su, Gwo-Dong Chen, Yi-Shan Tsai, Ren-Hao Yao, Chung-Kuang Chou, Yohannes Budiono Jinawi, De-Yuan Huang, Yi-Zeng Hsieh, and Shih-Chieh Lin. 2009. Design of an Interactive Table for Mixed-Reality Learning Environments. In International Conference on Technologies for E-Learning and Digital Entertainment. Springer, 489–494. https://doi.org/10.1007/978-3-642-03364-3_59
[413]
Yun-Peng Su, Xiao-Qi Chen, Tony Zhou, Christopher Pretty, and J Geoffrey Chase. 2021. Mixed Reality-Enhanced Intuitive Teleoperation with Hybrid Virtual Fixtures for Intelligent Robotic Welding. Applied Sciences 11, 23 (2021), 11280. https://doi.org/10.3390/app112311280
[414]
EK Subin, Ashik Hameed, and AP Sudheer. 2017. Android based augmented reality as a social interface for low cost social robots. In Proceedings of the Advances in Robotics. 1–4. https://doi.org/10.1145/3132446.3134907
[415]
Masanori Sugimoto. 2011. A mobile mixed-reality environment for children’s storytelling using a handheld projector and a robot. IEEE Transactions on Learning Technologies 4, 3 (2011), 249–260. https://doi.org/10.1109/tlt.2011.13
[416]
Masanori Sugimoto, Tomoki Fujita, Haipeng Mi, and Aleksander Krzywinski. 2011. RoboTable2: a novel programming environment using physical robots on a tabletop platform. In Proceedings of the 8th International Conference on Advances in Computer Entertainment Technology. 1–8. https://doi.org/10.1145/2071423.2071436
[417]
Maki Sugimoto, Georges Kagotani, Hideaki Nii, Naoji Shiroma, Fumitoshi Matsuno, and Masahiko Inami. 2005. Time Follower’s Vision: a teleoperation interface with past images. IEEE Computer Graphics and Applications 25, 1 (2005), 54–63. https://doi.org/10.1109/mcg.2005.23
[418]
Ippei Suzuki, Shuntarou Yoshimitsu, Keisuke Kawahara, Nobutaka Ito, Atushi Shinoda, Akira Ishii, Takatoshi Yoshida, and Yoichi Ochiai. 2016. Gushed diffusers: Fast-moving, floating, and lightweight midair display. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology. 69–70. https://doi.org/10.1145/2984751.2985706
[419]
Naoki Suzuki and Asaki Hattori. 2012. Development of new augmented reality function using intraperitoneal multi-view camera. In Workshop on Augmented Environments for Computer-Assisted Interventions. Springer, 67–76. https://doi.org/10.1007/978-3-642-38085-3_8
[420]
Naoki Suzuki, Asaki Hattori, Kazuo Tanoue, Satoshi Ieiri, Kozo Konishi, Morimasa Tomikawa, Hajime Kenmotsu, and Makoto Hashizume. 2010. Scorpion shaped endoscopic surgical robot for NOTES and SPS with augmented reality functions. In International Workshop on Medical Imaging and Virtual Reality. Springer, 541–550. https://doi.org/10.1007/978-3-642-15699-1_57
[421]
Ryo Suzuki, Hooman Hedayati, Clement Zheng, James L Bohn, Daniel Szafir, Ellen Yi-Luen Do, Mark D Gross, and Daniel Leithinger. 2020. Roomshift: Room-scale dynamic haptics for vr with furniture-moving swarm robots. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. 1–11. https://doi.org/10.1145/3313831.3376523
[422]
Ryo Suzuki, Jun Kato, Mark D Gross, and Tom Yeh. 2018. Reactile: Programming swarm user interfaces through direct physical manipulation. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. 1–13. https://doi.org/10.1145/3173574.3173773
[423]
Ryo Suzuki, Rubaiat Habib Kazi, Li-Yi Wei, Stephen DiVerdi, Wilmot Li, and Daniel Leithinger. 2020. RealitySketch: Embedding Responsive Graphics and Visualizations in AR through Dynamic Sketching. In Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology. 166–181. https://doi.org/10.1145/3379337.3415892
[424]
Ryo Suzuki, Eyal Ofek, Mike Sinclair, Daniel Leithinger, and Mar Gonzalez-Franco. 2021. HapticBots: Distributed Encountered-type Haptics for VR with Multiple Shape-changing Mobile Robots. In The 34th Annual ACM Symposium on User Interface Software and Technology. 1269–1281. https://doi.org/10.1145/3472749.3474821
[425]
Ryo Suzuki, Clement Zheng, Yasuaki Kakehi, Tom Yeh, Ellen Yi-Luen Do, Mark D Gross, and Daniel Leithinger. 2019. Shapebots: Shape-changing swarm robots. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology. 493–505. https://doi.org/10.1145/3332165.3347911
[426]
Daniel Szafir, Bilge Mutlu, and Terrence Fong. 2014. Communication of intent in assistive free flyers. In 2014 9th ACM/IEEE International Conference on Human-Robot Interaction (HRI). IEEE, 358–365. https://doi.org/10.1145/2559636.2559672
[427]
Daniel Szafir, Bilge Mutlu, and Terrence Fong. 2015. Communicating directionality in flying robots. In 2015 10th ACM/IEEE International Conference on Human-Robot Interaction (HRI). IEEE, 19–26. https://doi.org/10.1145/2696454.2696475
[428]
Daniel Szafir and Danielle Albers Szafir. 2021. Connecting Human-Robot Interaction and Data Visualization. In Proceedings of the 2021 ACM/IEEE International Conference on Human-Robot Interaction. 281–292. https://doi.org/10.1145/3434073.3444683
[429]
Faisal Taher, John Hardy, Abhijit Karnik, Christian Weichel, Yvonne Jansen, Kasper Hornbæk, and Jason Alexander. 2015. Exploring interactions with physically dynamic bar charts. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. 3237–3246. https://doi.org/10.1145/2702123.2702604
[430]
Kazuki Takashima, Naohiro Aida, Hitomi Yokoyama, and Yoshifumi Kitamura. 2013. TransformTable: a self-actuated shape-changing digital table. In Proceedings of the 2013 ACM international conference on Interactive tabletops and surfaces. 179–188. https://doi.org/10.1145/2512349.2512818
[431]
Kazuki Takashima, Takafumi Oyama, Yusuke Asari, Ehud Sharlin, Saul Greenberg, and Yoshifumi Kitamura. 2016. Study and design of a shape-shifting wall display. In Proceedings of the 2016 ACM Conference on Designing Interactive Systems. 796–806. https://doi.org/10.1145/2901790.2901892
[432]
Leonardo Tanzi, Pietro Piazzolla, Francesco Porpiglia, and Enrico Vezzetti. 2021. Real-time deep learning semantic segmentation during intra-operative surgery for 3D augmented reality assistance. International Journal of Computer Assisted Radiology and Surgery 16, 9 (2021), 1435–1445. https://doi.org/10.1007/s11548-021-02432-y
[433]
Pedro Tavares, Carlos M Costa, Luís Rocha, Pedro Malaca, Pedro Costa, António P Moreira, Armando Sousa, and Germano Veiga. 2019. Collaborative welding system using BIM for robotic reprogramming and spatial augmented reality. Automation in Construction 106 (2019), 102825. https://doi.org/10.1016/J.AUTCON.2019.04.020
[434]
Frank Thomas and Ollie Johnston. 1995. The illusion of life: Disney animation. Hyperion, New York.
[435]
Rundong Tian and Eric Paulos. 2021. Adroid: Augmenting Hands-on Making with a Collaborative Robot. In Proceedings of the 34th Annual ACM Symposium on User Interface Software and Technology. https://doi.org/10.1145/3472749.3474749
[436]
Hiroaki Tobita, Shigeaki Maruyama, and Takuya Kuji. 2011. Floating avatar: blimp-based telepresence system for communication and entertainment. In ACM SIGGRAPH 2011 Emerging Technologies. 1–1. https://doi.org/10.1145/2048259.2048263
[437]
Junya Tominaga, Kensaku Kawauchi, and Jun Rekimoto. 2014. Around me: a system with an escort robot providing a sports player’s self-images. In Proceedings of the 5th Augmented Human International Conference. 1–8. https://doi.org/10.1145/2582051.2582094
[438]
Bethan Hannah Topliss, Sanna M Pampel, Gary Burnett, Lee Skrypchuk, and Chrisminder Hare. 2018. Establishing the role of a virtual lead vehicle as a novel augmented reality navigational aid. In Proceedings of the 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 137–145. https://doi.org/10.1145/3239060.3239069
[439]
Nhan Tran. 2020. Exploring mixed reality robot communication under different types of mental workload. Colorado School of Mines. https://doi.org/10.31219/osf.io/f3a8c
[440]
Nhan Tran, Trevor Grant, Thao Phung, Leanne Hirshfield, Christopher Wickens, and Tom Williams. 2021. Get This!? Mixed Reality Improves Robot Communication Regardless of Mental Workload. In Companion of the 2021 ACM/IEEE International Conference on Human-Robot Interaction. 412–416. https://doi.org/10.1145/3434074.3447203
[441]
Nhan Tran, Trevor Grant, Thao Phung, Leanne Hirshfield, Christopher Wickens, and Tom Williams. 2021. Robot-Generated Mixed Reality Gestures Improve Human-Robot Interaction. In International Conference on Social Robotics. Springer, 768–773. https://doi.org/10.1007/978-3-030-90525-5_69
[442]
Jörg Traub, Marco Feuerstein, Martin Bauer, Eva U Schirmbeck, Hesam Najafi, Robert Bauernschmitt, and Gudrun Klinker. 2004. Augmented reality for port placement and navigation in robotically assisted minimally invasive cardiovascular surgery. In International Congress Series, Vol. 1268. Elsevier, 735–740. https://doi.org/10.1016/J.ICS.2004.03.049
[443]
Jaryd Urbani, Mohammed Al-Sada, Tatsuo Nakajima, and Thomas Höglund. 2018. Exploring Augmented Reality Interaction for Everyday Multipurpose Wearable Robots. In 2018 IEEE 24th International Conference on Embedded and Real-Time Computing Systems and Applications (RTCSA). IEEE, 209–216. https://doi.org/10.1109/RTCSA.2018.00033
[444]
AJN Van Breemen. 2004. Bringing robots to life: Applying principles of animation to robots. In Proceedings of the Shaping Human-Robot Interaction workshop held at CHI, Vol. 2004. Citeseer, 143–144.
[445]
Ana M Villanueva, Ziyi Liu, Zhengzhe Zhu, Xin Du, Joey Huang, Kylie A Peppler, and Karthik Ramani. 2021. RobotAR: An Augmented Reality Compatible Teleconsulting Robotics Toolkit for Augmented Makerspace Experiences. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. 1–13. https://doi.org/10.1145/3411764.3445726
[446]
Daniel Vogel and Ravin Balakrishnan. 2004. Interactive public ambient displays: transitioning from implicit to explicit, public to personal, interaction with multiple users. In Proceedings of the 17th annual ACM symposium on User interface software and technology. 137–146. https://doi.org/10.1145/1029632.1029656
[447]
Francesco Volonté, François Pugin, Pascal Bucher, Maki Sugimoto, Osman Ratib, and Philippe Morel. 2011. Augmented reality and image overlay navigation with OsiriX in laparoscopic and robotic surgery: not only a matter of fashion. Journal of Hepato-biliary-pancreatic Sciences 18, 4 (2011), 506–509. https://doi.org/10.1007/s00534-011-0385-6
[448]
Sebastian von Mammen, Heiko Hamann, and Michael Heider. 2016. Robot gardens: an augmented reality prototype for plant-robot biohybrid systems. In Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology. 139–142. https://doi.org/10.1145/2993369.2993400
[449]
Emanuel Vonach, Clemens Gatterer, and Hannes Kaufmann. 2017. VRRobot: Robot actuated props in an infinite virtual environment. In 2017 IEEE Virtual Reality (VR). IEEE, 74–83. https://doi.org/10.1109/VR.2017.7892233
[450]
Michael Walker, Hooman Hedayati, Jennifer Lee, and Daniel Szafir. 2018. Communicating robot motion intent with augmented reality. In Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction. 316–324. https://doi.org/10.1145/3171221.3171253
[451]
Michael E Walker, Hooman Hedayati, and Daniel Szafir. 2019. Robot teleoperation with augmented reality virtual surrogates. In 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI). IEEE, 202–210. https://doi.org/10.1109/HRI.2019.8673306
[452]
DA Wang, Fernando Bello, and Ara Darzi. 2004. Augmented reality provision in robotically assisted minimally invasive surgery. In International Congress Series, Vol. 1268. Elsevier, 527–532. https://doi.org/10.1016/J.ICS.2004.03.057
[453]
Guoping Wang, Xuechen Chen, Sheng Liu, Chingping Wong, and Sheng Chu. 2016. Mechanical chameleon through dynamic real-time plasmonic tuning. ACS Nano 10, 2 (2016), 1788–1794. https://doi.org/10.1021/acsnano.5b07472
[454]
Qiang Wang, Xiumin Fan, Mingyu Luo, Xuyue Yin, and Wenmin Zhu. 2020. Construction of Human-Robot Cooperation Assembly Simulation System Based on Augmented Reality. In International Conference on Human-Computer Interaction. Springer, 629–642. https://doi.org/10.1007/978-3-030-49695-1_42
[455]
Tianyi Wang, Xun Qian, Fengming He, Xiyun Hu, Ke Huo, Yuanzhi Cao, and Karthik Ramani. 2020. CAPturAR: An Augmented Reality Tool for Authoring Human-Involved Context-Aware Applications. In Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology. 328–341. https://doi.org/10.1145/3379337.3415815
[456]
Xi Vincent Wang, Lihui Wang, Mingtian Lei, and Yongqing Zhao. 2020. Closed-loop augmented reality towards accurate human-robot collaboration. CIRP Annals 69, 1 (2020), 425–428. https://doi.org/10.1016/j.cirp.2020.03.014
[457]
Jonas Wassermann, Axel Vick, and Jörg Krüger. 2018. Intuitive robot programming through environment perception, augmented reality simulation and automated program verification. Procedia CIRP 76 (2018), 161–166. https://doi.org/10.1016/J.PROCIR.2018.01.036
[458]
Atsushi Watanabe, Tetsushi Ikeda, Yoichi Morales, Kazuhiko Shinozawa, Takahiro Miyashita, and Norihiro Hagita. 2015. Communicating robotic navigational intentions. In 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 5763–5769. https://doi.org/10.1109/IROS.2015.7354195
[459]
Rong Wen, Chin-Boon Chng, and Chee-Kong Chui. 2017. Augmented reality guidance with multimodality imaging data and depth-perceived interaction for robot-assisted surgery. Robotics 6, 2 (2017), 13. https://doi.org/10.3390/robotics6020013
[460]
Rong Wen, Chin-Boon Chng, Chee-Kong Chui, Kah-Bin Lim, Sim-Heng Ong, and Stephen Kin-Yong Chang. 2012. Robot-assisted RF ablation with interactive planning and mixed reality guidance. In 2012 IEEE/SICE International Symposium on System Integration (SII). IEEE, 31–36. https://doi.org/10.1109/SII.2012.6426963
[461]
Rong Wen, Wei-Liang Tay, Binh P Nguyen, Chin-Boon Chng, and Chee-Kong Chui. 2014. Hand gesture guided robot-assisted surgery based on a direct augmented reality interface. Computer Methods and Programs in Biomedicine 116, 2 (2014), 68–80. https://doi.org/10.1016/j.cmpb.2013.12.018
[462]
Wesley Willett, Bon Adriel Aseniero, Sheelagh Carpendale, Pierre Dragicevic, Yvonne Jansen, Lora Oehlberg, and Petra Isenberg. 2021. Perception! Immersion! Empowerment!: Superpowers as Inspiration for Visualization. IEEE Transactions on Visualization and Computer Graphics (2021). https://doi.org/10.1109/TVCG.2021.3114844
[463]
Wesley Willett, Yvonne Jansen, and Pierre Dragicevic. 2016. Embedded data representations. IEEE Transactions on Visualization and Computer Graphics 23, 1 (2016), 461–470. https://doi.org/10.1109/TVCG.2016.2598608
[464]
Tom Williams, Matthew Bussing, Sebastian Cabrol, Elizabeth Boyle, and Nhan Tran. 2019. Mixed reality deictic gesture for multi-modal robot communication. In 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI). IEEE, 191–201. https://doi.org/10.1109/hri.2019.8673275
[465]
Tom Williams, Leanne Hirshfield, Nhan Tran, Trevor Grant, and Nicholas Woodward. 2020. Using augmented reality to better study human-robot interaction. In International Conference on Human-Computer Interaction. Springer, 643–654. https://doi.org/10.1007/978-3-030-49695-1_43
[466]
Tom Williams, Daniel Szafir, Tathagata Chakraborti, and Heni Ben Amor. 2018. Virtual, augmented, and mixed reality for human-robot interaction. In Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction. 403–404. https://doi.org/10.1145/3173386.3173561
[467]
Ryan Wistort and Cynthia Breazeal. 2011. TofuDraw: A mixed-reality choreography tool for authoring robot character performance. In Proceedings of the 10th International Conference on Interaction Design and Children. 213–216. https://doi.org/10.1145/1999030.1999064
[468]
Mulun Wu, Shi-Lu Dai, and Chenguang Yang. 2020. Mixed reality enhanced user interactive path planning for omnidirectional mobile robot. Applied Sciences 10, 3 (2020), 1135. https://doi.org/10.3390/app10031135
[469]
Mulun Wu, Yanbin Xu, Chenguang Yang, and Ying Feng. 2018. Omnidirectional mobile robot control based on mixed reality and semg signals. In 2018 Chinese Automation Congress (CAC). IEEE, 1867–1872. https://doi.org/10.1109/cac.2018.8623114
[470]
Tian Xia, Simon Léonard, Anton Deguet, Louis Whitcomb, and Peter Kazanzides. 2012. Augmented reality environment with virtual fixtures for robotic telemanipulation in space. In 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE, 5059–5064. https://doi.org/10.1109/iros.2012.6386169
[471]
Siyuan Xiang, Ruoyu Wang, and Chen Feng. 2021. Mobile projective augmented reality for collaborative robots in construction. Automation in Construction 127 (2021), 103704. https://doi.org/10.1016/J.AUTCON.2021.103704
[472]
Xiao Xiao, Paula Aguilera, Jonathan Williams, and Hiroshi Ishii. 2013. MirrorFugue iii: conjuring the recorded pianist. In CHI Extended Abstracts. Citeseer, 2891–2892. https://doi.org/10.1145/2468356.2479564
[473]
Xiao Xiao, Pablo Puentes, Edith Ackermann, and Hiroshi Ishii. 2016. Andantino: Teaching children piano with projected animated characters. In Proceedings of the 15th International Conference on Interaction Design and Children. 37–45. https://doi.org/10.1145/2930674.2930689
[474]
Chung Xue, Yuansong Qiao, and Niall Murray. 2020. Enabling Human-Robot-Interaction for remote robotic operation via Augmented Reality. In 2020 IEEE 21st International Symposium on "A World of Wireless, Mobile and Multimedia Networks" (WoWMoM). IEEE, 194–196. https://doi.org/10.1109/wowmom49955.2020.00046
[475]
Wataru Yamada, Kazuhiro Yamada, Hiroyuki Manabe, and Daizo Ikeda. 2017. iSphere: self-luminous spherical drone display. In Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology. 635–643. https://doi.org/10.1145/3126594.3126631
[476]
Junichi Yamaoka and Yasuaki Kakehi. 2016. MiragePrinter: interactive fabrication on a 3D printer with a mid-air display. In ACM SIGGRAPH 2016 Studio. 1–2. https://doi.org/10.1145/2929484.2929489
[477]
AWW Yew, SK Ong, and AYC Nee. 2017. Immersive augmented reality environment for the teleoperation of maintenance robots. Procedia CIRP 61 (2017), 305–310. https://doi.org/10.1016/J.PROCIR.2016.11.183
[478]
Yan Yixian, Kazuki Takashima, Anthony Tang, Takayuki Tanno, Kazuyuki Fujita, and Yoshifumi Kitamura. 2020. ZoomWalls: Dynamic walls that simulate haptic infrastructure for room-scale VR worlds. In Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology. 223–235. https://doi.org/10.1145/3379337.3415859
[479]
James Young and Ehud Sharlin. 2006. A Mixed Reality Approach to Human-Robot Interaction. (2006). https://doi.org/10.11575/PRISM/30998
[480]
James Young, Ehud Sharlin, and Takeo Igarashi. 2011. What is mixed reality, anyway? Considering the boundaries of mixed reality in the context of robots. In Mixed Reality and Human-Robot Interaction. Springer, 1–11. https://doi.org/10.1007/978-94-007-0582-1_1
[481]
James E Young, Min Xin, and Ehud Sharlin. 2007. Robot expressionism through cartooning. In 2007 2nd ACM/IEEE International Conference on Human-Robot Interaction (HRI). IEEE, 309–316. https://doi.org/10.1145/1228716.1228758
[482]
Liangzhe Yuan, Christopher Reardon, Garrett Warnell, and Giuseppe Loianno. 2019. Human gaze-driven spatial tasking of an autonomous MAV. IEEE Robotics and Automation Letters 4, 2 (2019), 1343–1350. https://doi.org/10.1109/LRA.2019.2895419
[483]
Ludek Zalud. 2007. Augmented reality user interface for reconnaissance robotic missions. In RO-MAN 2007-The 16th IEEE International Symposium on Robot and Human Interactive Communication. IEEE, 974–979. https://doi.org/10.1109/roman.2007.4415224
[484]
Ludek Zalud, Petra Kocmanova, Frantisek Burian, and Tomas Jilek. 2014. Color and thermal image fusion for augmented reality in rescue robotics. In The 8th International Conference on Robotic, Vision, Signal Processing & Power Applications. Springer, 47–55. https://doi.org/10.1007/978-981-4585-42-2_6
[485]
Bowei Zeng, Fanle Meng, Hui Ding, and Guangzhi Wang. 2017. A surgical robot with augmented reality visualization for stereoelectroencephalography electrode implantation. International Journal of Computer Assisted Radiology and Surgery 12, 8 (2017), 1355–1368. https://doi.org/10.1007/s11548-017-1634-1
[486]
Dongpu Zhang, Lin Tian, Kewu Huang, and Jiwu Wang. 2020. Vision Tracking Algorithm for Augmented Reality System of Teleoperation Mobile Robots. In 2020 3rd International Conference on Unmanned Systems (ICUS). IEEE, 1047–1052. https://doi.org/10.1109/icus50048.2020.9274917
[487]
Fengxin Zhang, Chow Yin Lai, Milan Simic, and Songlin Ding. 2020. Augmented reality in robot programming. Procedia Computer Science 176 (2020), 1221–1230. https://doi.org/10.1016/j.procs.2020.09.119
[488]
Renjie Zhang, Xinyu Liu, Jiazhou Shuai, and Lianyu Zheng. 2020. Collaborative robot and mixed reality assisted microgravity assembly for large space mechanism. Procedia Manufacturing 51 (2020), 38–45. https://doi.org/10.1016/j.promfg.2020.10.007
[489]
Zhou Zhao, Panfeng Huang, Zhenyu Lu, and Zhengxiong Liu. 2017. Augmented reality for enhancing tele-robotic system with force feedback. Robotics and Autonomous Systems 96 (2017), 93–101. https://doi.org/10.1016/j.robot.2017.05.017
[490]
Allan Zhou, Dylan Hadfield-Menell, Anusha Nagabandi, and Anca D Dragan. 2017. Expressive robot motion timing. In Proceedings of the 2017 ACM/IEEE International Conference on Human-Robot Interaction. ACM, 22–31. https://doi.org/10.1145/2909824.3020221
[491]
Chaozheng Zhou, Ming Zhu, Yunyong Shi, Li Lin, Gang Chai, Yan Zhang, and Le Xie. 2017. Robot-assisted surgery for mandibular angle split osteotomy using augmented reality: preliminary results on clinical animal experiment. Aesthetic Plastic Surgery 41, 5 (2017), 1228–1236. https://doi.org/10.1007/s00266-017-0900-5
[492]
Danny Zhu and Manuela Veloso. 2016. Virtually adapted reality and algorithm visualization for autonomous robots. In Robot World Cup. Springer, 452–464. https://doi.org/10.1007/978-3-319-68792-6_38
[493]
Kamil Židek, Ján Pitel’, Michal Balog, Alexander Hošovskỳ, Vratislav Hladkỳ, Peter Lazorík, Angelina Iakovets, and Jakub Demčák. 2021. CNN Training Using 3D Virtual Models for Assisted Assembly with Mixed Reality and Collaborative Robots. Applied Sciences 11, 9 (2021), 4269. https://doi.org/10.3390/APP11094269
[494]
Stefanie Zollmann, Christof Hoppe, Tobias Langlotz, and Gerhard Reitmayr. 2014. FlyAR: Augmented reality supported micro aerial vehicle navigation. IEEE Transactions on Visualization and Computer Graphics 20, 4 (2014), 560–568. https://doi.org/10.1109/TVCG.2014.24
[495]
Mark Zolotas, Joshua Elsdon, and Yiannis Demiris. 2018. Head-mounted augmented reality for explainable robotic wheelchair assistance. In 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 1823–1829. https://doi.org/10.1109/iros.2018.8594002
[496]
Wenchao Zou, Mayur Andulkar, and Ulrich Berger. 2018. Development of Robot Programming System through the use of Augmented Reality for Assembly Tasks. In ISR 2018; 50th International Symposium on Robotics. VDE, 1–7. https://doi.org/10.1201/9781439863992-10

      Published In

      CHI '22: Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems
      April 2022
      10459 pages
      ISBN:9781450391573
      DOI:10.1145/3491102
      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

      Sponsors

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 29 April 2022

      Author Tags

      1. AR-HRI
      2. VAM-HRI
      3. actuated tangible UI
      4. augmented reality
      5. human-robot interaction
      6. mixed reality
      7. robotics
      8. shape-changing UI

      Qualifiers

      • Research-article
      • Research
      • Refereed limited

      Funding Sources

      • NSERC

      Conference

      CHI '22
      Sponsor:
      CHI '22: CHI Conference on Human Factors in Computing Systems
      April 29 - May 5, 2022
      New Orleans, LA, USA

      Acceptance Rates

      Overall Acceptance Rate 6,199 of 26,314 submissions, 24%

      Article Metrics

      • Downloads (Last 12 months)2,030
      • Downloads (Last 6 weeks)291
      Reflects downloads up to 10 Dec 2024

      Citations

      Cited By

      View all
      • (2024) Framework design using the robotic augmented reality for the cyberphysical system. FME Transactions, 10.5937/fme2403506N, 52:3 (506-516). Online publication date: 2024
      • (2024) Gamification in Real-World Applications: Interactive Maps and Augmented Reality. Level Up! Exploring Gamification's Impact on Research and Innovation, 10.5772/intechopen.1004870. Online publication date: 14-May-2024
      • (2024) Problems of Applying Virtual and Augmented Reality in Everyday Activities [in Ukrainian]. Scientific Bulletin of UNFU, 10.36930/40340512, 34:5 (90-96). Online publication date: 23-May-2024
      • (2024) Supporting Human–Robot Interaction in Manufacturing with Augmented Reality and Effective Human–Computer Interaction: A Review and Framework. Machines, 10.3390/machines12100706, 12:10 (706). Online publication date: 4-Oct-2024
      • (2024) Diverse Humanoid Robot Pose Estimation from Images Using Only Sparse Datasets. Applied Sciences, 10.3390/app14199042, 14:19 (9042). Online publication date: 7-Oct-2024
      • (2024) Advanced systems and technologies for the enhancement of user experience in cultural spaces: an overview. Heritage Science, 10.1186/s40494-024-01186-5, 12:1. Online publication date: 28-Feb-2024
      • (2024) Harnessing Hypertext Paradigms to Augment VR Spaces. Proceedings of the 7th Workshop on Human Factors in Hypertext, 10.1145/3679058.3688633 (1-10). Online publication date: 10-Sep-2024
      • (2024) Keep Track! Supporting Spatial Tasks with Augmented Reality Overviews. Proceedings of the 2024 ACM Symposium on Spatial User Interaction, 10.1145/3677386.3682093 (1-11). Online publication date: 7-Oct-2024
      • (2024) LIMIT: Learning Interfaces to Maximize Information Transfer. ACM Transactions on Human-Robot Interaction, 10.1145/3675758, 13:4 (1-26). Online publication date: 23-Oct-2024
      • (2024) SenseBot: Leveraging Embodied Asymmetric Interaction and Social Robotic to Enhance Intergenerational Communication. Adjunct Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 10.1145/3672539.3686734 (1-3). Online publication date: 13-Oct-2024
      • Show More Cited By
