
Does Code Decay? Assessing the Evidence from Change Management Data

Published: 01 January 2001

Abstract

A central feature of the evolution of large software systems is that change, which is necessary to add new functionality, accommodate new hardware, and repair faults, becomes increasingly difficult over time. In this paper, we approach this phenomenon, which we term code decay, scientifically and statistically. We define code decay and propose a number of measurements (code decay indices), on the software and on the organizations that produce it, that serve as symptoms, risk factors, and predictors of decay. Using an unusually rich data set (the fifteen-plus-year change history of the millions of lines of software for a telephone switching system), we find mixed, but on the whole persuasive, statistical evidence of code decay, which is corroborated by developers of the code. Suggestive indications that perfective maintenance can retard code decay are also discussed.
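To make the kind of index the abstract refers to concrete, the sketch below computes one plausible code decay index, the yearly fraction of modification requests (MRs) that touch more than one file, from a flat change log. It is a minimal illustration, not the authors' tooling: the CSV file name (changes.csv) and its columns (mr_id, year, file) are assumptions introduced here.

```python
# Minimal sketch of a "span of change" code decay index: for each year,
# the fraction of modification requests (MRs) that touch more than one file.
# The record format below is hypothetical, not the paper's actual data.
import csv
from collections import defaultdict

def span_index(path):
    # files_per_mr[(year, mr_id)] = set of files touched by that MR
    files_per_mr = defaultdict(set)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):  # expected columns: mr_id, year, file
            files_per_mr[(int(row["year"]), row["mr_id"])].add(row["file"])

    # Aggregate per year: how many MRs, and how many touched more than one file.
    totals = defaultdict(int)
    multi = defaultdict(int)
    for (year, _), files in files_per_mr.items():
        totals[year] += 1
        if len(files) > 1:
            multi[year] += 1

    return {year: multi[year] / totals[year] for year in sorted(totals)}

if __name__ == "__main__":
    for year, frac in span_index("changes.csv").items():
        print(f"{year}: {frac:.1%} of modification requests touched more than one file")
```

An upward trend in this fraction is the kind of symptom the paper treats as evidence of decay: changes that ought to be local increasingly span multiple files.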




Reviews

Andrew Brooks

Change management data from a fifteen-plus year change history on millions of lines of code is analyzed to determine whether code decay (the increasing difficulty over time of making changes) is real. Figure 3 presents a convincing visualization showing that the probability of a modification request touching more than one file increases from a low of less than 2 percent in 1989 to more than 5 percent in 1996. Whilst the authors recognize the alternative interpretation that the inherent difficulty of the desired changes is increasing, the modification request data is not examined independently from this perspective. Figure 4 presents a convincing visualization of the 'breaking down' of functional separation between two clusters of modules, meaning that some modification requests repeatedly touched the same group of modules. A discussion of the nature of these particular modification requests might have provided insight into whether the inherent difficulty of the desired changes was increasing. As further evidence of code decay, the authors provide models for fault prediction and state that the best models predict the number of faults in a module from the number of changes made to it in the past. Forming an appreciation of the strength of this claim is, however, not possible without referring to an earlier publication. This paper represents a major contribution to the understanding of code decay and is highly recommended to specialists in software maintenance with access to change management data who are looking for ideas on metrics and visualizations of change.
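The fault-prediction claim summarized above can also be sketched. The following is a hypothetical illustration rather than the authors' actual model: it fits a Poisson regression of a module's fault count on (the log of) its number of past changes, matching only the general shape of the reported result that past change counts are the best fault predictors. The module data, the log1p transform, and the use of statsmodels are all assumptions made for this sketch.

```python
# Hypothetical sketch: predict per-module fault counts from the number of
# past changes using a Poisson GLM. The numbers below are invented; the
# paper's actual models and data are far richer.
import numpy as np
import statsmodels.api as sm

# Per-module history: changes made in the past, faults observed later.
past_changes = np.array([3, 10, 25, 4, 40, 7, 18, 2, 55, 12])
faults       = np.array([0,  1,  3, 0,  6, 1,  2, 0,  9,  1])

# Poisson regression of fault counts on log(1 + past changes).
X = sm.add_constant(np.log1p(past_changes))
result = sm.GLM(faults, X, family=sm.families.Poisson()).fit()
print(result.params)  # intercept and slope; a positive slope means more changes, more faults

# Expected fault count for a module with 30 past changes.
x_new = sm.add_constant(np.log1p(np.array([30.0])), has_constant="add")
print("predicted faults:", result.predict(x_new))
```

A Poisson family is a natural choice here because fault counts are nonnegative integers; the paper itself does not prescribe this particular functional form.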




Information

Published In

IEEE Transactions on Software Engineering, Volume 27, Issue 1
January 2001
96 pages
ISSN: 0098-5589

Publisher

IEEE Press

Publication History

Published: 01 January 2001

Author Tags

  1. Software maintenance
  2. effort modeling
  3. fault potential
  4. metrics
  5. span of changes
  6. statistical analysis

Qualifiers

  • Research-article



Bibliometrics & Citations


Article Metrics

  • Downloads (last 12 months): 0
  • Downloads (last 6 weeks): 0
Reflects downloads up to 12 Dec 2024


Cited By

  • (2024) Atrophy in Aging Systems. Information Systems Research, 35(1), 66-86. DOI: 10.1287/isre.2023.1218. Online publication date: 1-Mar-2024.
  • (2024) Improving the Learning of Code Review Successive Tasks with Cross-Task Knowledge Distillation. Proceedings of the ACM on Software Engineering, 1(FSE), 1086-1106. DOI: 10.1145/3643775. Online publication date: 12-Jul-2024.
  • (2024) Software Architecture Recovery from Multiple Dependency Models. Proceedings of the 39th ACM/SIGAPP Symposium on Applied Computing, 1185-1192. DOI: 10.1145/3605098.3635917. Online publication date: 8-Apr-2024.
  • (2024) 3Erefactor: Effective, Efficient and Executable Refactoring Recommendation for Software Architectural Consistency. IEEE Transactions on Software Engineering, 50(10), 2633-2655. DOI: 10.1109/TSE.2024.3449564. Online publication date: 1-Oct-2024.
  • (2024) A longitudinal study on the temporal validity of software samples. Information and Software Technology, 168(C). DOI: 10.1016/j.infsof.2024.107404. Online publication date: 1-Apr-2024.
  • (2024) A study of behavioral decay in design patterns. Journal of Software: Evolution and Process, 36(7). DOI: 10.1002/smr.2638. Online publication date: 14-Jul-2024.
  • (2023) A Survey of Tool Support for Working with Design Decisions in Code. ACM Computing Surveys, 56(2), 1-37. DOI: 10.1145/3607868. Online publication date: 10-Jul-2023.
  • (2023) Is it Enough to Recommend Tasks to Newcomers? Understanding Mentoring on Good First Issues. Proceedings of the 45th International Conference on Software Engineering, 653-664. DOI: 10.1109/ICSE48619.2023.00064. Online publication date: 14-May-2023.
  • (2023) Detecting Scattered and Tangled Quality Concerns in Source Code to Aid Maintenance and Evolution Tasks. Proceedings of the 45th International Conference on Software Engineering: Companion Proceedings, 184-188. DOI: 10.1109/ICSE-Companion58688.2023.00051. Online publication date: 14-May-2023.
  • (2023) Towards semantically enhanced detection of emerging quality-related concerns in source code. Software Quality Journal, 31(3), 865-915. DOI: 10.1007/s11219-023-09614-8. Online publication date: 17-Feb-2023.
