
Does the prioritization technique affect stakeholders' selection of essential software product features?

Published: 19 September 2012

Abstract

Context: Selecting the essential, non-negotiable product features is a key skill for stakeholders in software projects. Such selection relies on human judgment, possibly supported by structured prioritization techniques and tools.

Goal: Our goal was to investigate whether certain attributes of prioritization techniques affect stakeholders' threshold for judging product features as essential. The four investigated techniques represent the four combinations of granularity (low, high) and cognitive support (low, high).

Method: To control for robustness and masking effects when investigating in the field, we conducted both an artificial experiment and a field experiment using the same prioritization techniques. In the artificial experiment, 94 subjects in four treatment groups indicated which features (from a list of 16) were essential when buying a new cell phone. In the field experiment, 44 domain experts indicated the software product features that were essential for the fulfillment of the project's vision. The effects of granularity and cognitive support on the number of essential ratings were analyzed and compared between the experiments.

Results: With lower granularity, significantly more features were rated as essential. The effect was large in the artificial experiment and extreme in the field experiment. Added cognitive support had a medium effect, but it worked in opposite directions in the two experiments and was not statistically significant in the field experiment.

Implications: Software projects should avoid taking stakeholders' judgments of essentiality at face value. Practices and tools should be designed to counteract biases and to support the conscious, knowledge-based elements of prioritizing.
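To make the design concrete, the following is a minimal sketch in Python of the kind of analysis the abstract describes: a 2x2 between-subjects design crossing granularity with cognitive support, where the outcome is each subject's count of "essential" ratings and the quantity of interest is a standardized effect size. This is not the authors' analysis code, and all group sizes and means below are invented purely for illustration.

```python
# Minimal sketch (hypothetical data, not the study's): each subject in a
# 2x2 between-subjects design -- granularity (low/high) x cognitive
# support (low/high) -- rates 16 candidate features, and the outcome is
# the number of features rated essential. Group means and sizes are
# made up for illustration only.
import numpy as np

rng = np.random.default_rng(42)
N_FEATURES = 16
N_PER_GROUP = 24  # hypothetical group size

# Invented group means for the count of "essential" ratings.
group_means = {
    ("low granularity", "low support"):   11.0,
    ("low granularity", "high support"):  10.0,
    ("high granularity", "low support"):   7.0,
    ("high granularity", "high support"):  6.5,
}

# Simulate per-subject counts, clipped to the valid range 0..16.
samples = {
    group: np.clip(rng.normal(mean, 2.5, N_PER_GROUP), 0, N_FEATURES).round()
    for group, mean in group_means.items()
}

def cohens_d(a: np.ndarray, b: np.ndarray) -> float:
    """Standardized mean difference using a pooled standard deviation."""
    pooled_sd = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2.0)
    return float((a.mean() - b.mean()) / pooled_sd)

# Main effect of granularity: pool each level across the support levels.
low_gran = np.concatenate([samples[("low granularity", "low support")],
                           samples[("low granularity", "high support")]])
high_gran = np.concatenate([samples[("high granularity", "low support")],
                            samples[("high granularity", "high support")]])

print(f"mean essential ratings, low granularity:  {low_gran.mean():.1f}")
print(f"mean essential ratings, high granularity: {high_gran.mean():.1f}")
print(f"granularity main effect, Cohen's d: {cohens_d(low_gran, high_gran):.2f}")
```

With these invented means, the granularity effect would come out large by the conventional thresholds for standardized mean differences (roughly 0.2 small, 0.5 medium, 0.8 large), matching the direction the abstract reports: lower granularity yields more features rated essential.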





    Published In

ESEM '12: Proceedings of the ACM-IEEE International Symposium on Empirical Software Engineering and Measurement
September 2012
338 pages
ISBN: 9781450310567
DOI: 10.1145/2372251


    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

Published: 19 September 2012
DOI: 10.1145/2372251.2372300


    Author Tags

    1. essential features
    2. field experiment
    3. prioritization techniques
    4. requirements
    5. robustness
    6. stakeholders

    Qualifiers

    • Research-article

Conference

ESEM '12

    Acceptance Rates

Overall Acceptance Rate: 130 of 594 submissions, 22%


    Cited By

• (2021) Earned Business Value Management. Benefit/Cost-Driven Software Development, pages 61-74. DOI: 10.1007/978-3-030-74218-8_5. Online publication date: 25-Jun-2021.
• (2018) Software Requirements Prioritisation: A Systematic Literature Review on Significance, Stakeholders, Techniques and Challenges. IEEE Access, 6:71497-71523. DOI: 10.1109/ACCESS.2018.2881755. Online publication date: 2018.
• (2018) A systematic literature review of software requirements prioritization research. Information and Software Technology, 56(6):568-585. DOI: 10.1016/j.infsof.2014.02.001. Online publication date: 30-Dec-2018.
• (2017) Earned Business Value: See That You Deliver Value to Your Customer. IEEE Software, 34(4):58-70. DOI: 10.1109/MS.2017.105. Online publication date: 2017.
