Free access

An empirical validation of software cost estimation models

Published: 01 May 1987

Abstract

Practitioners have expressed concern over their inability to accurately estimate costs associated with software development. This concern has become even more pressing as costs associated with development continue to increase. As a result, considerable research attention is now directed at gaining a better understanding of the software-development process as well as constructing and evaluating software cost estimating tools. This paper evaluates four of the most popular algorithmic models used to estimate software costs (SLIM, COCOMO, Function Points, and ESTIMACS). Data on 15 large completed business data-processing projects were collected and used to test the accuracy of the models' ex post effort estimation. One important result was that Albrecht's Function Points effort estimation model was validated by the independent data provided in this study [3]. The models not developed in business data-processing environments showed significant need for calibration. As models of the software-development process, all of the models tested failed to sufficiently reflect the underlying factors affecting productivity. Further research will be required to develop understanding in this area.

References

[1]
Abdel-Hamid, T. and Madnick, S. Impact of schedule estimation on software project behavior. IEEE Softw. 3, 4 (July 1986), 70-75.
[2]
Albrecht, A.J. Measuring application development productivity. In Proceedings of the IBM Applications Development Symposium, GUIDE/SHARE (Monterey, Calif., Oct. 14-17). IBM, 1979, pp. 83-92.
[3]
Albrecht, A.J. and Gaffney, J., Jr. Software function, source lines of code, and development effort prediction: A software science validation. IEEE Trans. Softw. Eng. SE-9, 6 (Nov. 1983), 639-648.
[4]
Behrens, C.A. Measuring the productivity of computer systems development activities with Function Points. IEEE Trans. Softw. Eng. SE-9, 6 (Nov. 1983), 648-652.
[5]
Boehm, B.W. Software Engineering Economics. Prentice-Hall, Englewood Cliffs, N.J., 1981.
[6]
Brooks, F.P. The Mythical Man-Month. Addison-Wesley, Reading, Mass., 1975.
[7]
Callisen, H., and Colborne, S. A proposed method for estimating software cost from requirements. J. Parametrics 4, 4 (Dec. 1984), 33-40.
[8]
Conte, S., Dunsmore, H., and Shen, V. Software Engineering Metrics and Models. Benjamin/Cummings, Menlo Park, Calif., 1986.
[9]
Ferens, D.V. Software support cost models: Quo vadis? J. Parametrics 4, 4 (Dec. 1984), 64-99.
[10]
Gaffney, J.E., Goldberg, R., and Misek-Falkoff, L. SCORE82 Summary. Perform. Eval. Rev. 12, 4 (Winter 1984-1985), 4-12.
[11]
Golden, J.R., Mueller, J.R., and Anselm, B. Software cost estimating: Craft or witchcraft. Database 12, 3 (Spring 1981), 12-14.
[12]
Jones, C. Programming Productivity. McGraw-Hill, New York, 1986.
[13]
Kitchenham, B. and Taylor, N.R. Software cost models. ICL Tech. J. 4, 1 (May 1984), 73-102.
[14]
Masters, T.F., II. An overview of software cost estimating at the NSA. J. Parametrics 5, 1 (Mar. 1985), 72-84.
[15]
Pinsky, S.S. The effect of complexity on software trade off equations. J. Parametrics 4, 4 (Dec. 1984), 23-32.
[16]
Putnam, L.H. A general empirical solution to the macro software sizing and estimating problem. IEEE Trans. Softw. Eng. SE-4, 4 (July 1978), 345-361.
[17]
Putnam, L. and Fitzsimmons, A. Estimating software costs. Datamation 25, 10-12 (Sept.-Nov. 1979).
[18]
Quantitative Software Management. Reference Notes for the DOD SLIM Software Cost Estimating Course. Quantitative Software Management, McLean, Va., 1983.
[19]
Quantitative Software Management. SLIM User Manual (IBM PC Version), draft copy ed. Quantitative Software Management, McLean, Va., 1984.
[20]
Rubin, H.A. Macroestimation of software development parameters: The ESTIMACS system. In SOFTFAIR Conference on Software Development Tools, Techniques and Alternatives (Arlington, Va., July 25-28). IEEE Press, New York, 1983, pp. 109-118.
[21]
Rubin, H.A. The art and science of software estimation: Fifth generation estimators. In Proceedings of the 7th Annual ISPA Conference (Orlando, Fla., May 7-9). International Society of Parametric Analysts, McLean, Va., 1985, pp. 56-72.
[22]
Rubin, H.A. Using ESTIMACS E. Management and Computer Services, Valley Forge, Pa., Mar. 1984.
[23]
Software Productivity Research. SPQR/20 User Guide. Software Productivity Research, Cambridge, Mass., 1986.
[24]
Thebaut, S.M. Model evaluation in software metrics research. In Computer Science and Statistics: Proceedings of the 15th Symposium on the Interface (Houston, Tex., Mar.), 1983, pp. 277-285.
[25]
Walston, C.E. and Felix, C.P. A method of programming measurement and estimation. IBM Syst. J. 16, 1 (Jan. 1977), 54-73.
[26]
Weisberg, S. Applied Linear Regression. Wiley, New York, 1980.
[27]
Wolverton, R.W. Cost of developing large scale software. IEEE Trans. Comput. C-23, 6 (June 1974), 615-636.

Reviews

William W. Agresti

Four software cost-estimation models (SLIM, COCOMO, Function Points, and ESTIMACS) were evaluated. The models' estimates of cost, in man-months (MM) of effort, were compared to actual effort data for 15 completed business data processing projects. These results were found: (1) Models not originally developed in a business data processing environment strongly require calibration for such an environment (which, as the author acknowledges, is to be expected). (2) The Function Points effort-estimation model is validated by the data in this study.

Papers that describe and compare software cost-estimation models can be very helpful to readers who want to know if such models might be effective in their organizations (see, for example, [1] for a related report). However, the paper being reviewed is disappointing with respect to its primary thrust: how well do the four estimation models perform against actual cost data for the 15 projects?

Consider the performance of the first model discussed, SLIM. Here the values of key SLIM model parameters were determined (in accordance with the SLIM method) by the answers to 22 questions put to the user. On the first project, SLIM estimated a 3,858-MM effort, while the actual effort was 287 MM, giving an error of 1,244 percent. The author proceeded to apply the model, using the same initial parameter values, to the remaining 14 projects, with the result that the effort on each of the 15 projects was overestimated, by an average of 772 percent. Three versions (basic, intermediate, and detailed) of the second model, COCOMO, were applied to the same 15 projects. The result was that effort was again overestimated in all 45 cases, by an average of 601 percent. (A model that estimates 0 MM in all cases, and therefore has only 100 percent error, would have fared well in this evaluation.) The third model, Function Points, produced negative man-month estimates for two projects.
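The error figures quoted above follow the usual relative-error definition, with the sign indicating over- or underestimation. A minimal sketch, using the SLIM numbers from the review:

```python
def pct_error(estimate, actual):
    """Relative estimation error in percent; positive means overestimate."""
    return (estimate - actual) / actual * 100.0

# SLIM on the first project: a 3,858-MM estimate against 287 MM actual.
print(round(pct_error(3858, 287)))  # → 1244
```

Note that under this definition an overestimate is unbounded above, while a model that always predicts 0 MM is pinned at 100 percent error, which is the anomaly the reviewer points out.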
My disappointment lies in the unrealistic use of the models. Admittedly, the models did overestimate the actual effort. But the 15 projects are highly coherent, and intentionally so. They involve business data processing applications, mostly in COBOL, developed by a single company within the span of a few years. As the author notes, these circumstances encourage a high level of consistency in staff quality and in methodology used. So, if the models overestimate one project and there is no learning effect or adjustment of parameters by the user, it is understandable if the overestimation continues, as it did, 60 out of 60 times for the first two models. It does not seem useful to repeat essentially the same test 60 times and say the average overestimate is 600–700 percent. The 15 projects are not described as being ordered chronologically, but the results would have had some measure of realism if the analysis had considered them to be in order. Then, after the first project was overestimated by 1,244 percent, or certainly after the first few huge overestimates, the model parameters could have been adjusted before the next project was estimated. The author did not take advantage of having data for 15 projects. He should have applied the models to each project in turn, with opportunities for parameter adjustment along the way, in order to reflect the way cost-estimation models are used by organizations. These models are tools that require a contribution from the user in an ongoing process of refinement and sequential decision making.
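The sequential use the reviewer describes can be sketched as a simple recalibration loop. This is an illustration of the idea only, not a procedure from the paper or any of the models; the single multiplicative correction factor is a hypothetical choice:

```python
def estimate_in_sequence(raw_estimates, actuals):
    """Correct each raw model estimate by the mean actual/raw ratio
    observed on the projects already completed."""
    ratios = []      # actual/raw for finished projects
    corrected = []
    for raw, actual in zip(raw_estimates, actuals):
        factor = sum(ratios) / len(ratios) if ratios else 1.0
        corrected.append(raw * factor)
        ratios.append(actual / raw)   # learn from the completed project
    return corrected

# Toy data: a model that consistently overestimates roughly 8-fold.
raw = [2400, 2000, 1600, 2800]
actual = [300, 250, 200, 350]
print([round(c) for c in estimate_in_sequence(raw, actual)])  # → [2400, 250, 200, 350]
```

After the first large miss, the correction factor absorbs the systematic bias and later estimates track actuals closely, which is exactly the learning effect the reviewer finds missing from the study's design.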



Published In

cover image Communications of the ACM
Communications of the ACM  Volume 30, Issue 5
May 1987
93 pages
ISSN:0001-0782
EISSN:1557-7317
DOI:10.1145/22899
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States



Qualifiers

  • Article

Bibliometrics & Citations
Article Metrics

  • Downloads (Last 12 months)420
  • Downloads (Last 6 weeks)47
Reflects downloads up to 11 Dec 2024

Cited By

  • (2024) Organizational Ohm's Law in Practice: Measuring Engineering Productivity. IEEE Transactions on Engineering Management 71, 11494-11504. DOI: 10.1109/TEM.2024.3421892
  • (2024) Software Cost Estimation: A Comparative Analysis. 2024 International Conference on Computer, Electrical & Communication Engineering (ICCECE), 1-8. DOI: 10.1109/ICCECE58645.2024.10497286
  • (2024) Enhancing Software Effort Estimation in the Analogy-Based Approach Through the Combination of Regression Methods. IEEE Access 12, 152122-152137. DOI: 10.1109/ACCESS.2024.3480829
  • (2024) SENSE: software effort estimation using novel stacking ensemble learning. Innovations in Systems and Software Engineering. DOI: 10.1007/s11334-024-00581-2
  • (2024) Optimizing Effort and Cost Estimation: Model Implementation Using Artificial Neural Networks and Taguchi's Orthogonal Vector Plans. Recent Advances in Artificial Intelligence in Cost Estimation in Project Management, 291-417. DOI: 10.1007/978-3-031-76572-8_9
  • (2023) Heterogeneous Ensemble Model to Optimize Software Effort Estimation Accuracy. IEEE Access 11, 27759-27792. DOI: 10.1109/ACCESS.2023.3256533
  • (2023) Towards a method to quantitatively measure toolchain interoperability in the engineering lifecycle. Computer Standards & Interfaces 86:C. DOI: 10.1016/j.csi.2023.103744
  • (2023) Evaluating ensemble imputation in software effort estimation. Empirical Software Engineering 28:2. DOI: 10.1007/s10664-022-10260-0
  • (2022) Software Effort Estimation for Successful Software Application Development. Research Anthology on Agile Software, Software Development, and Testing, 123-164. DOI: 10.4018/978-1-6684-3702-5.ch008
  • (2022) Evolution of Software Development Effort and Cost Estimation Techniques: Five Decades Study Using Automated Text Mining Approach. Mathematical Problems in Engineering 2022, 1-17. DOI: 10.1155/2022/5782587
