DOI: 10.1145/3128473.3128483
Research Article

Regression Tests Provenance Data in the Continuous Software Engineering Context

Published: 18 September 2017

Abstract

Regression tests are executed after every change to the software. In a development environment that adopts Continuous Software Engineering practices such as Continuous Integration, Continuous Delivery, and Continuous Deployment, software is changed, built, and tested many times, and each regression test execution may involve different situations and problems that are typically handled in isolation. Data provenance concerns the origins of data and the processes it has gone through until it becomes information. Ontologies are formal models that contain axioms and relationships between the classes and individuals of a specific context and can be used to infer implicit knowledge. Since Continuous Software Engineering activities are based on feedback cycles, in this paper we propose an architecture based on the use of an ontology and a provenance model to capture and provide regression test data, supporting the continuous improvement of software testing processes. Moreover, using ontology and provenance to track execution performance and issues in this scenario may reduce the chances of those issues recurring, since practitioners can address and solve them before future executions.
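The core idea — recording each regression test execution as PROV-style provenance (activities that use a build and generate results) and then querying that history — can be illustrated with a minimal, stdlib-only sketch. All names here (TestRun, ProvenanceStore, the example test identifiers) are hypothetical illustrations, not artifacts from the paper's actual architecture:

```python
"""Hedged sketch: capture regression test runs as PROV-style records
(run = prov:Activity, build/report = prov:Entity, CI server = prov:Agent)
and query the history for recurring failures. Names are illustrative."""
from dataclasses import dataclass, field
from datetime import datetime


@dataclass
class TestRun:
    """One execution of the regression suite (a prov:Activity)."""
    run_id: str
    build_id: str                 # prov:used - the build under test
    agent: str                    # prov:wasAssociatedWith - CI server
    started: datetime             # prov:startedAtTime
    failures: list = field(default_factory=list)  # failing tests observed


class ProvenanceStore:
    """Accumulates runs and answers simple history-based queries."""

    def __init__(self):
        self.runs = []

    def record(self, run: TestRun):
        self.runs.append(run)

    def recurring_failures(self, min_runs: int = 2):
        """Tests that failed in at least `min_runs` distinct runs -
        candidates for prioritization or root-cause analysis."""
        counts = {}
        for run in self.runs:
            for test in set(run.failures):
                counts[test] = counts.get(test, 0) + 1
        return {t for t, c in counts.items() if c >= min_runs}


store = ProvenanceStore()
store.record(TestRun("run-1", "build-10", "ci-server",
                     datetime(2017, 9, 1), ["test_login", "test_cart"]))
store.record(TestRun("run-2", "build-11", "ci-server",
                     datetime(2017, 9, 2), ["test_login"]))
print(sorted(store.recurring_failures()))  # → ['test_login']
```

In the paper's setting these records would instead be expressed against the PROV-O ontology so that a reasoner can infer implicit relationships; the dictionary-counting query above stands in for what would be an ontological query over the provenance graph.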



    Published In

    SAST '17: Proceedings of the 2nd Brazilian Symposium on Systematic and Automated Software Testing
    September 2017
    100 pages
    ISBN:9781450353021
    DOI:10.1145/3128473

    In-Cooperation

    • SBC: Sociedade Brasileira de Computação

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. Continuous Integration
    2. Continuous Testing
    3. Data Provenance
    4. Ontological Inferences
    5. Process Improvement

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Conference

    SAST '17

    Acceptance Rates

    SAST '17 Paper Acceptance Rate 11 of 16 submissions, 69%;
    Overall Acceptance Rate 45 of 92 submissions, 49%


    Cited By

    • (2022) "Automated and non-intrusive provenance capture with UML2PROV", Computing, 104(4):767-788, DOI 10.1007/s00607-021-01012-x. Online publication date: 1 Apr 2022.
    • (2021) "Assessing test artifact quality - A tertiary study", Information and Software Technology, 139:C, DOI 10.1016/j.infsof.2021.106620. Online publication date: 23 Aug 2021.
    • (2020) "Domain Ontology Construction and Evaluation for the Entire Process of Software Testing", IEEE Access, 8:205374-205385, DOI 10.1109/ACCESS.2020.3037188. Online publication date: 2020.
    • (2020) "A systematic literature review on semantic web enabled software testing", Journal of Systems and Software, 162:C, DOI 10.1016/j.jss.2019.110485. Online publication date: 1 Apr 2020.
