DOI: 10.1145/2899361.2899364

Action reversibility in human-machine systems

Published: 30 September 2015

Abstract

This paper addresses the reversibility of human actions in the context of human-machine interaction, with a special focus on the interaction between a pilot and the flight management system controlling an aircraft. A multi-level reversibility scale is defined for human actions, and a reversibility property is defined for each level. An algorithm is proposed that checks these reversibility properties on the machine logic described in ADEPT. Specifically, the paper describes a method for formally identifying actions that are not reversible within one step, actions that are eventually totally unrecoverable, and actions that are totally unrecoverable.
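To make the three property levels concrete, here is a minimal sketch in Python that checks them on a toy labelled transition system (states connected by action-labelled edges). It is an illustration only: it assumes the machine logic can be abstracted to such a graph, the state and action names are invented, and the three predicates reflect one plausible reading of the reversibility levels rather than the paper's actual ADEPT-based algorithm.

# Reversibility checks over a finite labelled transition system.
# transitions: dict mapping a state to a list of (action, target_state) pairs.
from collections import deque

def reachable(transitions, start):
    """Set of states reachable from `start`, including `start` itself."""
    seen = {start}
    queue = deque([start])
    while queue:
        state = queue.popleft()
        for _action, target in transitions.get(state, []):
            if target not in seen:
                seen.add(target)
                queue.append(target)
    return seen

def reversible_in_one_step(transitions, source, target):
    """True if a single transition leads straight back from `target` to `source`."""
    return any(t == source for _a, t in transitions.get(target, []))

def totally_unrecoverable(transitions, source, target):
    """True if no sequence of transitions from `target` ever returns to `source`."""
    return source not in reachable(transitions, target)

def eventually_totally_unrecoverable(transitions, source, target):
    """True if, after the action, the machine can reach some state from which
    `source` can no longer be recovered (illustrative reading of that level)."""
    return any(source not in reachable(transitions, s)
               for s in reachable(transitions, target))

if __name__ == "__main__":
    # Hypothetical flight-phase logic: a descent can be aborted by a go-around,
    # but once the aircraft has landed the earlier phases cannot be re-entered.
    transitions = {
        "CRUISE": [("engage_descent", "DESCENT")],
        "DESCENT": [("go_around", "CRUISE"), ("touch_down", "LANDED")],
        "LANDED": [],
    }
    print(reversible_in_one_step(transitions, "CRUISE", "DESCENT"))            # True
    print(totally_unrecoverable(transitions, "DESCENT", "LANDED"))             # True
    print(eventually_totally_unrecoverable(transitions, "CRUISE", "DESCENT"))  # True

On such a graph the one-step check is a local edge test, while the two unrecoverability checks reduce to reachability queries, the kind of property an exhaustive traversal of an executable specification can verify.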

Cited By

  • Reversible decision support system: Minimising cognitive dissonance in multi-criteria based complex system using fuzzy analytic hierarchy process. 2016 8th Computer Science and Electronic Engineering Conference (CEEC), pp. 210-215, September 2016. DOI: 10.1109/CEEC.2016.7835915

Published In

ATACCS '15: Proceedings of the 5th International Conference on Application and Theory of Automation in Command and Control Systems
September 2015
176 pages
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

In-Cooperation

  • Eurocontrol

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 30 September 2015

Author Tags

  1. Reversibility
  2. human-machine interaction
  3. undo

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

ATACCS '15

Acceptance Rates

Overall Acceptance Rate: 14 of 42 submissions, 33%
