Planning Socially Expressive Mobile Robot Trajectories
Figure captions:

Figure 1. Illustrations of the effect of the six motion sequence values on the velocity profiles, shown in six subfigures (A–F). The slope and maximum values of the profiles are determined by the kinematics variable. These profiles use the smooth variant.
Figure 2. Velocity profiles resulting from combining the increment (left) or saccade (right) variants with motion sequence A and medium kinematics.
Figure 3. Examples of linear trajectories to approach a person using different velocity, acceleration, and timing features, resulting in confident or hesitant perception.
Figure 4. Illustration of the construction of the velocity profiles by combining the motion corpus variables. Top: all motion sequences represented with medium kinematics and the smooth variant. Bottom: profiles resulting from applying different kinematics or variants to motion sequence B. In total, 4 × 3 × 3 = 36 profiles can be obtained by combining the 4 motion sequences with 3 kinematics and 3 variants.
Figure 5. Representation of a corpus velocity profile using motion sequence B (no pauses, no hesitations) and the smooth variant as a sequence U of N = 2 motion phases u_0 and u_1. The values of v_kin and a_kin depend on the selected kinematics type (low, medium, or high) and dictate the slope and maximum of the velocity profile.
Figure 6. Illustration of the transformation of a corpus velocity profile to travel shorter or longer distances. Top: transformation for profiles without pauses or hesitations (sequence B). Bottom: transformation for profiles with pauses and without hesitations (sequence A).
Figure 7. Illustration of the pause constraint. Left: valid trajectories. Right: invalid trajectories due to insufficient length of the constant velocity phase.
Figure 8. Left: RobAIR mobile robot. Right: RobAIR base.
Figure 9. High-level architecture of our system. ROS nodes are represented with rounded boxes; hardware devices are represented with dashed boxes.
Figure 10. Top: past command velocities issued at 10 Hz (blue) and encoder-based odometry estimated at 40 Hz (red) in m·s⁻¹, plotted with respect to time (s). Bottom: visualization of the planned trajectory's velocity, discretized into time intervals of length dt = 100 ms. The robot stops within 10 cm of its goal position (green).
Figure 11. Full point-to-point motion to a goal point using low kinematics. Past command velocities shown in blue and unfiltered odometry in red, both in m·s⁻¹.
Figure 12. Full point-to-point motion to a goal point using high kinematics. Past command velocities shown in blue and unfiltered odometry in red, both in m·s⁻¹.
Figure 13. Full point-to-point motion to a goal point. Past command velocities shown in blue and unfiltered odometry in red, both in m·s⁻¹. Distance to the goal (in m) shown in green.
Figure 14. Point-to-point motion using the hesitation sequence (without pauses) and medium kinematics. Past command velocities shown in blue and unfiltered odometry in red, both in m·s⁻¹.
Figure 15. Point-to-point motion using the increment variant and medium kinematics. Past command velocities shown in blue and unfiltered odometry in red, both in m·s⁻¹.
Figure 16. Point-to-point motion using the saccade variant and medium kinematics. Past command velocities shown in blue and unfiltered odometry in red, both in m·s⁻¹.
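The transformation illustrated in Figure 6 can be made concrete for the simplest profile (motion sequence B, smooth variant). Assuming a symmetric trapezoid that accelerates at a_kin up to v_kin, cruises, and decelerates at a_kin back to rest (an assumption consistent with Figure 5, not the paper's exact equations), the cruise duration t_c needed to cover a distance d is:

```latex
d \;=\; \underbrace{\frac{v_{\mathrm{kin}}^{2}}{2\,a_{\mathrm{kin}}}}_{\text{ramp up}}
\;+\; v_{\mathrm{kin}}\, t_c
\;+\; \underbrace{\frac{v_{\mathrm{kin}}^{2}}{2\,a_{\mathrm{kin}}}}_{\text{ramp down}}
\qquad\Longrightarrow\qquad
t_c \;=\; \frac{d}{v_{\mathrm{kin}}} \;-\; \frac{v_{\mathrm{kin}}}{a_{\mathrm{kin}}}.
```

The pause constraint of Figure 7 then reads as a lower bound on t_c: a requested profile whose constant-velocity phase would fall below a minimum duration is rejected as invalid.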
Abstract
1. Introduction
- We build a statistical model of how humans perceive different combinations of trajectory features capturing a robot's movement dynamics, contributing new knowledge about human social perception of robot motion.
- We formalize the trajectory features found to cause different social perceptions, and we design a novel optimization-based trajectory planning algorithm that accurately reproduces these social motion features while performing a navigation task.
2. State of the Art
2.1. Design and Evaluation of Social Navigation Algorithms
2.2. Algorithmic Approaches for Social Navigation
3. Model of Human Social Perception of Mobile Robot Motion
3.1. Robot Motion Corpus Background
3.1.1. Motion Variables
3.1.2. Appearance Variables
3.2. Perceptual Scales
3.3. Logistic Regression Modeling
3.3.1. Method
3.3.2. Results
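A minimal sketch of the modeling step, assuming each perceptual scale is reduced to a binary judgment regressed on the corpus variables: the paper's analysis is richer (mixed-effects logistic regression fitted in R, with estimated marginal means and multiple-comparison correction), so the fixed-effects Python version below, with synthetic data and hypothetical column names, is for illustration only.

```python
# Fixed-effects sketch of the perception model (illustration only; the paper
# fits mixed-effects logistic regressions). Column names and the effect sizes
# used to generate the synthetic data are assumptions, loosely mimicking
# results such as "Var. saccade" and "Sequence C/D" shifting judgments
# toward the "hesitant" pole.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "kinematics": rng.choice(["low", "medium", "high"], size=n),
    "sequence":   rng.choice(list("ABCDEF"), size=n),
    "variant":    rng.choice(["smooth", "increment", "saccade"], size=n),
})
# Synthetic log-odds of a "hesitant" judgment for each stimulus.
logit_p = (
    -0.5
    + 1.0 * (df["variant"] == "saccade")
    + 0.8 * df["sequence"].isin(["C", "D"])
    - 0.6 * (df["kinematics"] == "high")
)
df["hesitant"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))).astype(int)

model = smf.logit(
    "hesitant ~ C(kinematics) + C(sequence) + C(variant)", data=df
).fit(disp=False)
print(model.params)  # coefficients on the log-odds scale, one per factor level
```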
4. Algorithm for Trajectory Planning with Configurable Movement Styles
4.1. Velocity Profile Representation
4.2. Generalization of Corpus Profiles to Variable Distances
- Acceleration and maximum velocity (kinematics type);
- Successions of accelerations and decelerations (motion sequence and variant);
- Duration of maximum velocity phase (motion sequence); a code sketch of these parameter families follows the list.
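These three families can be pictured as the inputs of a small profile generator. The sketch below is an illustrative reconstruction, not the paper's implementation: the numeric (v_kin, a_kin) pairs and the cruise and pause durations are assumptions; only the structure (motion phases parameterized by kinematics type, sequence, and variant, as in Figure 5) follows the source.

```python
# Illustrative parameterization of corpus profiles as sequences of motion
# phases (cf. Figure 5). Numeric values are placeholders, not calibrated data.
from dataclasses import dataclass

KINEMATICS = {           # assumed (v_kin [m/s], a_kin [m/s^2]) per kinematics type
    "low":    (0.3, 0.3),
    "medium": (0.6, 0.6),
    "high":   (1.0, 1.2),
}

@dataclass
class MotionPhase:
    acceleration: float  # constant acceleration a over the phase [m/s^2]
    duration: float      # phase length [s]

def trapezoid(v_kin: float, a_kin: float, cruise_time: float) -> list:
    """Accelerate to v_kin, hold it for cruise_time, decelerate back to rest."""
    ramp = v_kin / a_kin
    return [MotionPhase(+a_kin, ramp),
            MotionPhase(0.0, cruise_time),
            MotionPhase(-a_kin, ramp)]

def profile(kinematics: str, pause: float = 0.0) -> list:
    """Sequence B (no pause) is one trapezoid; a pause, as in sequence A,
    splits the motion into two trapezoids separated by a stop."""
    v_kin, a_kin = KINEMATICS[kinematics]
    if pause <= 0.0:
        return trapezoid(v_kin, a_kin, cruise_time=1.0)
    half = trapezoid(v_kin, a_kin, cruise_time=0.5)
    return half + [MotionPhase(0.0, pause)] + half  # stop, wait, resume

print(profile("medium", pause=1.0))
```

Variants (smooth, increment, saccade) would further reshape the acceleration phases, e.g., splitting a ramp into stepped sub-phases; they are omitted here to keep the sketch short.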
4.3. Problem Formulation
4.4. Prosody Constraint Formalization
4.4.1. Integration of Motion Sequences
4.4.2. Integration of Variants
4.4.3. Integration of Kinematics Types
4.5. Trajectory Planning and Open-Loop Control
4.5.1. Trajectory Planning
Algorithm 1: Prosody-aware trajectory planning
Output: the motion phases of the optimal trajectory. Notation: the kth robot state; u_k, the kth motion phase; T, the trajectory tree.
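A minimal sketch in the spirit of Algorithm 1, not the authors' pseudocode: because the corpus fixes the shape of the velocity profile, point-to-point planning for the simplest case (sequence B, smooth variant) reduces to choosing the cruise duration that makes the trajectory cover the goal distance, then validating the prosody constraints. Function names and the minimum cruise duration are assumptions.

```python
# Hedged reconstruction: plan a rest-to-rest trapezoidal trajectory whose
# cruise phase is stretched to cover the goal distance, then check the pause
# constraint (cf. Figure 7). Not the authors' pseudocode.
MIN_CRUISE = 0.5  # assumed minimal constant-velocity duration [s]

def plan(goal_dist: float, v_kin: float, a_kin: float):
    """Return (acceleration, duration) motion phases covering goal_dist metres,
    or None if the constant-velocity phase would be too short to be valid."""
    ramp_time = v_kin / a_kin
    ramp_dist = v_kin ** 2 / (2.0 * a_kin)
    cruise_time = (goal_dist - 2.0 * ramp_dist) / v_kin  # t_c = d/v_kin - v_kin/a_kin
    if cruise_time < MIN_CRUISE:
        return None  # invalid trajectory (right-hand side of Figure 7)
    return [(+a_kin, ramp_time), (0.0, cruise_time), (-a_kin, ramp_time)]

print(plan(goal_dist=3.0, v_kin=0.6, a_kin=0.6))
# -> [(0.6, 1.0), (0.0, 4.0), (-0.6, 1.0)]
```

Sequences with pauses or hesitations add further phases and further validity checks; the full algorithm searches a trajectory tree T over such phase sequences.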
4.5.2. Open-Loop Control
Algorithm 2: Open-loop control
Input: the goal point and the set of prosody constraints. Output: the acceleration command sent to the motors. Notation: the initial state of the robot; the motion phase executed at time t; U, the sequence of motion phases describing the trajectory.
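A minimal sketch of the execution step, assuming the (acceleration, duration) phases produced by the planner sketched above: the controller replays the phases open loop at a fixed rate (10 Hz, matching Figure 10), integrating each phase's acceleration into the velocity command sent to the base. No odometry feedback corrects the command, which is why Figures 10 to 16 compare commanded velocity against measured odometry. The send_velocity callback is a stand-in for the ROS motor interface.

```python
# Hedged sketch of open-loop execution (Algorithm 2 outputs acceleration
# commands; here each acceleration is integrated into the velocity command
# actually sent to the motors). ROS publishing details omitted.
import time

DT = 0.1  # control period [s]; 10 Hz command rate as in Figure 10

def execute(phases, send_velocity):
    """Replay (acceleration, duration) motion phases without feedback."""
    velocity = 0.0
    for acceleration, duration in phases:
        steps = round(duration / DT)          # phase discretized into DT intervals
        for _ in range(steps):
            velocity = max(velocity + acceleration * DT, 0.0)
            send_velocity(velocity)           # e.g., publish to the motor driver
            time.sleep(DT)
    send_velocity(0.0)                        # come to rest at the goal

# Example (prints the command sequence instead of driving motors):
# execute(plan(goal_dist=3.0, v_kin=0.6, a_kin=0.6), send_velocity=print)
```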
5. Implementation and Validation
5.1. Implementation
5.2. Validation
5.2.1. Point-to-Point Trajectory Execution
5.2.2. Kinematics
5.2.3. Pause and Hesitation Sequences
5.2.4. Increment and Saccade Variants
6. Discussion
6.1. Generalization of the Human Perception Model
6.2. Limitations of the Trajectory Planning Algorithm
6.3. Ethical Considerations
7. Conclusions and Future Work
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Table: parameter values (acceleration a and velocity ranges) defining the low, medium, and high kinematics types.
Table: adjective pairs defining the perceptual scales (Section 3.2).

Adjective 1 | Adjective 2
---|---
Aggressive | Gentle |
Authoritative | Polite |
Seems Confident | Doubtful, Hesitant |
Inspires confidence | Doesn’t inspire confidence |
Nice | Disagreeable |
Sturdy | Frail |
Strong | Weak |
Smooth | Abrupt |
Rigid | Supple |
Tender | Insensitive |
Table: model effects for the first five perceptual scales; the first row gives each scale's negative (−) pole and the second its positive (+) pole. Stars mark statistical significance (*, **, *** for increasing levels).

− | Aggressive | Authoritative | Confident | Inspires Conf. | Nice
---|---|---|---|---|---
+ | Gentle | Polite | Hesitant | Does Not | Disagreeable
Kin. high | −28 *** | −24 *** | −17 *** | 7 *** | 15 *** |
Kin. low | 24 *** | 22 *** | 15 *** | −8 *** | −15 *** |
Kin. medium | 4 *** | 3 * | 2 | 1 | 1 |
Sequence A | 2 | 4 | −3 | −1 | 2 |
Sequence B | 2 | 7 ** | 5 * | −1 | −4 |
Sequence C | 0 | 2 | 19 *** | 9 *** | 1 |
Sequence D | 1 | 2 | 27 *** | 13 *** | 1 |
Sequence E | −6 * | −10 *** | −28 *** | −9 *** | 3 |
Sequence F | 1 | −5 | −21 *** | −11 *** | −3 |
Var. increment | 8 *** | 6 *** | −1 | −4 ** | −3 * |
Var. saccade | −14 *** | −8 *** | 22 *** | 20 *** | 10 *** |
Var. smooth | 6 *** | 3 | −21 *** | −16 *** | −6 *** |
Eyes none | 1 | 5 ** | 4 | 3 * | 3 * |
Eyes round | 7 *** | 5 ** | −1 | −7 *** | −11 *** |
Eyes squint | −8 *** | −10 *** | −3 | 4 * | 8 *** |
Stable | 3 * | −2 | −11 *** | −6 *** | −2 |
Unstable | −3 * | 2 | 11 *** | 6 *** | 2 |
Table: model effects for the remaining five perceptual scales (same conventions as the previous table).

− | Sturdy | Strong | Smooth | Rigid | Tender
---|---|---|---|---|---
+ | Frail | Weak | Abrupt | Supple | Insensitive
Kin. high | −15 *** | −20 *** | 13 *** | −9 *** | 13 *** |
Kin. low | 12 *** | 17 *** | −14 *** | 12 *** | −13 *** |
Kin. medium | 3 * | 3 * | 1 | −2 * | 0 |
Sequence A | −3 | −3 | −4 | 2 | −1 |
Sequence B | 9 *** | 7 *** | 0 | 3 | −6 * |
Sequence C | 10 *** | 9 *** | 4 | 1 | 2 |
Sequence D | 20 *** | 18 *** | 7 ** | −5 * | 2 |
Sequence E | −22 *** | −19 *** | −2 | −2 | 6 * |
Sequence F | −14 *** | −12 *** | −5 * | 1 | −2 |
Var. increment | −4 * | −2 | −4 *** | 3 * | −4 |
Var. saccade | 27 *** | 20 *** | 15 *** | −9 *** | 6 *** |
Var. smooth | −24 *** | −18 *** | −10 *** | 6 *** | −3 |
Eyes none | 3 | 4 * | 1 | −1 | 6 *** |
Eyes round | 1 | 1 | −7 *** | 4 * | −14 *** |
Eyes squint | −3 | −6 ** | 6 *** | −3 | 8 *** |
Stable | −16 *** | −11 *** | −4 *** | 0 | 0 |
Unstable | 16 *** | 11 *** | 4 *** | 0 | 0 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
MDPI and ACS Style
Scales, P.; Aycard, O.; Aubergé, V. Planning Socially Expressive Mobile Robot Trajectories. Sensors 2024, 24, 3533. https://doi.org/10.3390/s24113533

AMA Style
Scales P, Aycard O, Aubergé V. Planning Socially Expressive Mobile Robot Trajectories. Sensors. 2024; 24(11):3533. https://doi.org/10.3390/s24113533

Chicago/Turabian Style
Scales, Philip, Olivier Aycard, and Véronique Aubergé. 2024. "Planning Socially Expressive Mobile Robot Trajectories" Sensors 24, no. 11: 3533. https://doi.org/10.3390/s24113533

APA Style
Scales, P., Aycard, O., & Aubergé, V. (2024). Planning Socially Expressive Mobile Robot Trajectories. Sensors, 24(11), 3533. https://doi.org/10.3390/s24113533