DOI: 10.1145/3125739.3125749
Research article, Open access

Personal Influences on Dynamic Trust Formation in Human-Agent Interaction

Published: 27 October 2017

Abstract

The development of automated technologies in daily life has transformed the role of human operators from that of a controller to that of a teammate who shares control with automated agents. This 'teammate' relationship between humans and automation, however, raises an important but challenging research question about how human-agent trust forms. Because human-agent trust formation is a dynamic and complex process shaped by human factors, this study conducted a two-phase online experiment to examine personal influences on users' trust propensity and on their trust formation in human-agent interactions. Our findings revealed distinct personal influences on dispositional trust and on the formation of human-agent trust at different stages. In particular, users with a higher propensity to trust other humans also developed higher initial trust toward automated agents. As the first study of its kind, this work not only fills a gap in our knowledge of personal influences on human-agent trust, but also offers opportunities to improve the design of future automated agent systems.



      Published In

      HAI '17: Proceedings of the 5th International Conference on Human Agent Interaction
      October 2017
      550 pages
ISBN: 9781450351133
DOI: 10.1145/3125739

Publisher

      Association for Computing Machinery, New York, NY, United States


      Author Tags

      1. decision-aids automation
      2. human-agent collaboration
      3. psychological factors
      4. user interaction design


      Acceptance Rates

      Overall Acceptance Rate 121 of 404 submissions, 30%


      Cited By

      • (2023) Team at Your Service: Investigating Functional Specificity for Trust Calibration in Automated Driving with Conversational Agents. International Journal of Human–Computer Interaction, 39(16), 3254-3267. DOI: 10.1080/10447318.2023.2219952. Online publication date: 29 Jun 2023.
      • (2023) My Actions Speak Louder Than Your Words: When User Behavior Predicts Their Beliefs About Agents' Attributes. Artificial Intelligence in HCI, 232-248. DOI: 10.1007/978-3-031-35894-4_17. Online publication date: 23 Jul 2023.
      • (2022) Beyond the Halo: Investigation of Trust and Functional Specificity in Automated Driving with Conversational Agents. Proceedings of the 14th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 195-203. DOI: 10.1145/3543174.3546834. Online publication date: 17 Sep 2022.
      • (2022) Experimental Investigation of Trust in Anthropomorphic Agents as Task Partners. Proceedings of the 10th International Conference on Human-Agent Interaction, 302-305. DOI: 10.1145/3527188.3563921. Online publication date: 5 Dec 2022.
      • (2022) Is trust in artificial intelligence systems related to user personality? Review of empirical evidence and future research directions. Electronic Markets, 32(4), 2021-2051. DOI: 10.1007/s12525-022-00594-4. Online publication date: 23 Nov 2022.
      • (2021) Evaluating Effective Connectivity of Trust in Human–Automation Interaction: A Dynamic Causal Modeling (DCM) Study. Human Factors: The Journal of the Human Factors and Ergonomics Society, 64(6), 1051-1069. DOI: 10.1177/0018720820987443. Online publication date: 3 Mar 2021.
      • (2021) How to Evaluate Trust in AI-Assisted Decision Making? A Survey of Empirical Methodologies. Proceedings of the ACM on Human-Computer Interaction, 5(CSCW2), 1-39. DOI: 10.1145/3476068. Online publication date: 18 Oct 2021.
      • (2020) An Examination of Dispositional Trust in Human and Autonomous System Interactions. Human-Computer Interaction. Human Values and Quality of Life, 420-435. DOI: 10.1007/978-3-030-49065-2_30. Online publication date: 10 Jul 2020.
      • (2019) 'If You Agree with Me, Do I Trust You?': An Examination of Human-Agent Trust from a Psychological Perspective. Intelligent Systems and Applications, 994-1013. DOI: 10.1007/978-3-030-29513-4_73. Online publication date: 24 Aug 2019.
      • (2019) The Measurement of the Propensity to Trust Automation. Virtual, Augmented and Mixed Reality. Applications and Case Studies, 476-489. DOI: 10.1007/978-3-030-21565-1_32. Online publication date: 8 Jun 2019.
