
My Eyes Speak: Improving Perceived Sociability of Autonomous Vehicles in Shared Spaces Through Emotional Robotic Eyes

Published: 13 September 2023

Abstract

The ability of autonomous vehicles (AVs) to interact socially with pedestrians has a significant impact on their integration into urban traffic. This is particularly important in vehicle-pedestrian shared spaces, where social requirements are higher than on vehicular roads. Pedestrian experience in shared spaces currently suffers from negative attitudes towards AVs and, consequently, low acceptability of AVs in these spaces. HRI research shows that the acceptability of robots in public spaces can be positively affected by their perceived sociability (i.e., possessing social skills), which in turn can be enhanced by their ability to express emotions. Inspired by this approach, we follow a systematic process to design emotional expressions for AVs using the headlight ("eye") area and investigate their impact on the perceived sociability of AVs in shared spaces through expert focus groups (N=12) and an online video-based user study (N=106). Our findings confirm that the perceived sociability of AVs can be enhanced by emotional expressions conveyed through emotional eyes. We further discuss the implications of our findings for improving pedestrian experience and attitudes in shared spaces and highlight opportunities to use AVs' emotional expressions as a new external communication strategy in future research.



      Published In

      Proceedings of the ACM on Human-Computer Interaction, Volume 7, Issue MHCI
      September 2023, 1017 pages
      EISSN: 2573-0142
      DOI: 10.1145/3624512

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 13 September 2023
      Published in PACMHCI Volume 7, Issue MHCI


      Author Tags

      1. autonomous vehicle
      2. emotional expression
      3. external human-machine interfaces
      4. mobile services
      5. pedestrians
      6. perceived sociability
      7. shared space

      Qualifiers

      • Research-article
