
Automated Steering of Model-Based Test Oracles to Admit Real Program Behaviors

Published: 01 June 2017

Abstract

The test oracle, a judge of the correctness of the system under test (SUT), is a major component of the testing process. Specifying test oracles is challenging for some domains, such as real-time embedded systems, where small changes in timing or sensory input may cause large behavioral differences. Models of such systems, often built for analysis and simulation, are appealing for reuse as test oracles. These models, however, typically represent an idealized system, abstracting away issues such as non-deterministic timing behavior and sensor noise. Thus, even with the same inputs, the model's behavior may fail to match an acceptable behavior of the SUT, leading to many false positives reported by the test oracle. We propose an automated steering framework that adjusts the behavior of the model to better match the behavior of the SUT and thereby reduce the rate of false positives. This model steering is limited by a set of constraints (defining the differences in behavior that are acceptable) and is based on a search process that attempts to minimize a dissimilarity metric. The framework allows non-deterministic, but bounded, behavioral differences, while preventing future mismatches by guiding the oracle, within limits, to match the execution of the SUT. Results show that steering significantly increases SUT-oracle conformance with minimal masking of real faults and thus has significant potential for reducing false positives and, consequently, testing and debugging costs, while improving the quality of the testing process.
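The steering step the abstract describes can be sketched as a constrained search: among the oracle model's candidate next states, pick the one that satisfies the steering constraints and minimizes a dissimilarity metric against the observed SUT state. The sketch below is illustrative only; the flat-dict state representation, the `dissimilarity` metric, and the predicate-style constraints are assumptions for this example, not the paper's actual implementation.

```python
# Minimal sketch of one steering step (illustrative names and state shape).
# States are dicts of numeric variables; constraints are boolean predicates
# bounding how far the oracle may be steered.

def dissimilarity(model_state, sut_state):
    """One possible metric: sum of per-variable absolute differences."""
    return sum(abs(model_state[v] - sut_state[v]) for v in sut_state)

def steer(candidates, sut_state, constraints):
    """Pick the constraint-satisfying candidate closest to the SUT state.

    Returns None when no candidate is admissible, i.e., the oracle
    cannot be steered within its limits and a mismatch is reported.
    """
    legal = [s for s in candidates if all(c(s) for c in constraints)]
    if not legal:
        return None
    return min(legal, key=lambda s: dissimilarity(s, sut_state))

# Example: two candidate oracle states and a tolerance constraint on "temp".
candidates = [{"temp": 98, "mode": 1}, {"temp": 103, "mode": 1}]
sut = {"temp": 99, "mode": 1}
constraints = [lambda s: abs(s["temp"] - 100) <= 5]
print(steer(candidates, sut, constraints))  # → {'temp': 98, 'mode': 1}
```

In this toy run both candidates satisfy the constraint, so steering selects the one nearest the observed SUT state; tightening the constraint so that no candidate qualifies makes `steer` return `None`, which corresponds to the oracle reporting a genuine mismatch rather than silently absorbing it.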


Cited By

  • (2023) "Mapping the structure and evolution of software testing research over the past three decades," Journal of Systems and Software, vol. 195. DOI: 10.1016/j.jss.2022.111518. Online publication date: 8 Feb 2023.


    Published In

    IEEE Transactions on Software Engineering, Volume 43, Issue 6
    June 2017
    105 pages

    Publisher

    IEEE Press


    Qualifiers

    • Research-article

