DOI: 10.1145/2851553.2858662
research-article

Asking "What"?, Automating the "How"?: The Vision of Declarative Performance Engineering

Published: 12 March 2016

Abstract

Over the past decades, various methods, techniques, and tools for modeling and evaluating performance properties of software systems have been proposed, covering the entire software life cycle. However, applying performance engineering approaches to solve a given user concern is still rather challenging and requires expert knowledge and experience. There are no recipes on how to select, configure, and execute suitable methods, tools, and techniques for addressing the user's concerns. In this paper, we describe our vision of Declarative Performance Engineering (DPE), which aims to decouple the description of the user concerns to be solved (performance questions and goals) from the task of selecting and applying a specific solution approach. The strict separation of "what" versus "how" enables the development of different techniques and algorithms to automatically select and apply a suitable approach for a given scenario. The goal is to hide this complexity by allowing users to express their concerns and goals without requiring any knowledge about performance engineering techniques. Towards realizing the DPE vision, we discuss the resulting requirements and propose a reference architecture for implementing and integrating the respective methods, algorithms, and tooling.
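To make the "what" versus "how" separation concrete, here is a minimal sketch in Python. It is purely illustrative: the names PerformanceConcern, select_approach, and "checkout-service" are hypothetical and are not taken from the paper or from any DPE tooling. The sketch only shows how a user-level concern could be stated declaratively while an engine decides, behind the scenes, whether to answer it via a model-based or a measurement-based technique.

# Illustrative sketch only; all names below are hypothetical and not from the paper.
from dataclasses import dataclass

@dataclass
class PerformanceConcern:
    """The 'what': a user-level performance question, free of any solution detail."""
    metric: str            # e.g. "response_time"
    target: str            # e.g. "checkout-service"
    statistic: str         # e.g. "p95"
    deadline_seconds: int  # how long the user is willing to wait for an answer

def select_approach(concern: PerformanceConcern) -> str:
    """The 'how': automatically pick a suitable evaluation technique.
    A real DPE engine would consult available models, monitoring data, and
    cost/accuracy trade-offs; this stub only sketches the decision point."""
    if concern.deadline_seconds < 60:
        # Tight deadline: answer from an existing analytical or simulation model.
        return "model-based analysis (architecture-level performance model)"
    # Otherwise a measurement-based experiment may give higher accuracy.
    return "measurement-based analysis (automated load-test experiment)"

if __name__ == "__main__":
    concern = PerformanceConcern(metric="response_time", target="checkout-service",
                                 statistic="p95", deadline_seconds=30)
    print("User asks:", concern)
    print("Engine chooses:", select_approach(concern))

The point of the sketch is that the user only fills in the concern; which solution technique runs, and how it is configured and executed, is left entirely to the engine.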




Information

Published In

ICPE '16: Proceedings of the 7th ACM/SPEC International Conference on Performance Engineering
March 2016
346 pages
ISBN:9781450340809
DOI:10.1145/2851553
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.


Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 12 March 2016


Author Tags

  1. declarative performance engineering
  2. devops
  3. measurement-based analysis
  4. model-based analysis

Qualifiers

  • Research-article


Conference

ICPE'16

Acceptance Rates

ICPE '16 Paper Acceptance Rate: 23 of 74 submissions, 31%
Overall Acceptance Rate: 252 of 851 submissions, 30%


Bibliometrics

Article Metrics

  • Downloads (Last 12 months): 3
  • Downloads (Last 6 weeks): 1
Reflects downloads up to 13 Dec 2024


Citations

Cited By

  • (2022) Design and Development of a Technology-Agnostic NFR Testing Framework. Proceedings of the 2022 5th International Conference on Software Engineering and Information Management, DOI: 10.1145/3520084.3520092, pp. 45-50. Online publication date: 21-Jan-2022.
  • (2021) Performance Testing Using a Smart Reinforcement Learning-Driven Test Agent. 2021 IEEE Congress on Evolutionary Computation (CEC), DOI: 10.1109/CEC45853.2021.9504763, pp. 2385-2394. Online publication date: 28-Jun-2021.
  • (2021) An autonomous performance testing framework using self-adaptive fuzzy reinforcement learning. Software Quality Journal, DOI: 10.1007/s11219-020-09532-z, 30(1), pp. 127-159. Online publication date: 10-Mar-2021.
  • (2019) Concern-driven Reporting of Software Performance Analysis Results. Companion of the 2019 ACM/SPEC International Conference on Performance Engineering, DOI: 10.1145/3302541.3313103, pp. 1-4. Online publication date: 27-Mar-2019.
  • (2019) Behavior-driven Load Testing Using Contextual Knowledge - Approach and Experiences. Proceedings of the 2019 ACM/SPEC International Conference on Performance Engineering, DOI: 10.1145/3297663.3309674, pp. 265-272. Online publication date: 4-Apr-2019.
  • (2018) A Declarative Approach for Performance Tests Execution in Continuous Software Development Environments. Proceedings of the 2018 ACM/SPEC International Conference on Performance Engineering, DOI: 10.1145/3184407.3184417, pp. 261-272. Online publication date: 30-Mar-2018.
  • (2018) A systematic approach for performance assessment using process mining. Empirical Software Engineering, DOI: 10.1007/s10664-018-9606-9, 23(6), pp. 3394-3441. Online publication date: 1-Dec-2018.
  • (2018) A Domain-Specific Language and Toolchain for Performance Evaluation Based on Measurements. Measurement, Modelling and Evaluation of Computing Systems, DOI: 10.1007/978-3-319-74947-1_21, pp. 295-301. Online publication date: 25-Jan-2018.
  • (2017) Automated and Adaptable Decision Support for Software Performance Engineering. Proceedings of the 11th EAI International Conference on Performance Evaluation Methodologies and Tools, DOI: 10.1145/3150928.3150952, pp. 66-73. Online publication date: 5-Dec-2017.
  • (2017) Mapping of Service Level Objectives to Performance Queries. Proceedings of the 8th ACM/SPEC International Conference on Performance Engineering Companion, DOI: 10.1145/3053600.3053646, pp. 197-202. Online publication date: 18-Apr-2017.
