
Testing Homogeneous Spreadsheet Grids with the "What You See Is What You Test" Methodology

Published: 01 June 2002

Abstract

Although there has been recent research into ways to design environments that enable end users to create their own programs, little attention has been given to helping these end users systematically test their programs. To help address this need in spreadsheet systems (the most widely used type of end-user programming language), we previously introduced a visual approach to systematically testing individual cells in spreadsheet systems. However, the previous approach did not scale well in the presence of largely homogeneous grids, which introduce problems somewhat analogous to the array-testing problems of imperative programs. In this paper, we present two approaches to spreadsheet testing that explicitly support such grids. We present the algorithms, time complexities, and performance data comparing the two approaches. This is part of our continuing work to bring to end users at least some of the benefits of formalized notions of testing, without requiring knowledge of testing beyond a naive level.
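To make the grid notion concrete, the following minimal Python sketch (all names, such as Region and validate, are invented for illustration and are not taken from the paper's Forms/3-based implementation) models a homogeneous region: a block of cells governed by one shared formula, so a testedness decision can be recorded once per formula rather than once per cell.

    # Illustrative sketch only (names invented; not the paper's implementation):
    # a "homogeneous region" is a block of cells that share one formula, so
    # testedness can be tracked per shared formula rather than per cell.

    from dataclasses import dataclass, field
    from typing import List


    @dataclass
    class Region:
        """A rectangular block of cells governed by a single shared formula."""
        name: str
        formula: str                      # e.g. "total = hw + exam" (illustrative only)
        cells: List[str] = field(default_factory=list)
        validated: bool = False           # one testedness flag covers the whole region


    def validate(region: Region) -> None:
        """Record that the shared formula has been exercised and judged correct."""
        region.validated = True


    if __name__ == "__main__":
        totals = Region("Totals", "total = hw + exam",
                        cells=[f"C{row}" for row in range(2, 100)])
        validate(totals)                  # one decision covers 98 structurally identical cells
        print(totals.validated, len(totals.cells))

Running the script prints True 98: a single validation decision covers every structurally identical cell in the region, which is the redundancy the paper's grid-aware approaches exploit.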



Reviews

Tsun-Him Tse

The authors of this paper previously proposed a systematic method for end users to use in testing programs in spreadsheet applications. The method, a “what you see is what you test” (WYSIWYT) interaction, was not scalable; each individual cell had to be tested independently. In this paper, the authors build on the belief that replicated formulas need not be repeatedly tested. They propose a new method that applies two approaches. A straightforward approach allows users to rubberband a number of cells to form a region, validate one of the cells, and deem the whole region to be correct. A region representative approach further allows the tasks involved in the validation of a complex formula to be shared among different cells. Algorithms for collecting static information in a region, tracking execution traces, validating the cells in the region using the WYSIWYT method, and adjusting test adequacy have been proposed for both approaches. The approaches prove to be effective and efficient, helping end users test their spreadsheets in a clean and tidy manner while remaining user-friendly and practical.

End users should note, however, that the proposed method does not reveal every type of error. For example, suppose cell A3 is defined as “A1 + A2,” and the formula should also apply in cell B3. This requirement is usually subject to two interpretations, namely “B3 = A1 + A2” or “B3 = B1 + B2”. The proposed method cannot reveal errors caused by such ambiguities if the user only validates A3 in the region.

Online Computing Reviews Service
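The reviewer's A3/B3 ambiguity can be illustrated with a short, hypothetical Python sketch (the cell values and the two interpretation tables below are invented for illustration; this is not the paper's algorithm): if the user validates only A3, both the absolute and the relative reading of the replicated formula look equally correct, even though they disagree about B3.

    # Hypothetical sketch of the reviewer's caveat (values and structure invented):
    # validating A3 alone cannot distinguish an absolute replication of the formula
    # ("=A1+A2" everywhere) from a relative one ("=B1+B2" in column B).

    sheet = {"A1": 2, "A2": 3, "B1": 10, "B2": 20}

    # Two plausible readings of "the A3 formula also applies to B3".
    absolute = {"A3": lambda s: s["A1"] + s["A2"], "B3": lambda s: s["A1"] + s["A2"]}
    relative = {"A3": lambda s: s["A1"] + s["A2"], "B3": lambda s: s["B1"] + s["B2"]}

    for interpretation in (absolute, relative):
        a3 = interpretation["A3"](sheet)
        assert a3 == 5                    # A3 is judged correct under both readings,
        b3 = interpretation["B3"](sheet)  # yet B3 differs: 5 versus 30.
        print(a3, b3)

Region-level validation based on A3 alone would accept either interpretation, which is exactly the class of error the reviewer notes the method cannot reveal.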


Information

Published In

IEEE Transactions on Software Engineering, Volume 28, Issue 6
June 2002
96 pages

Publisher

IEEE Press


Author Tags

  1. software testing
  2. spreadsheets
  3. visual programming

Qualifiers

  • Research-article


Cited By

  • (2024) Design Goals for End-User Development of Robot-Assisted Physical Training Activities: A Participatory Design Study. Proceedings of the ACM on Human-Computer Interaction, vol. 8, no. EICS, pp. 1-31, doi: 10.1145/3664632. Online publication date: 17-Jun-2024.
  • (2019) Fragment-based spreadsheet debugging. Automated Software Engineering, vol. 26, no. 1, pp. 203-239, doi: 10.1007/s10515-018-0250-9. Online publication date: 1-Mar-2019.
  • (2018) Spreadsheet guardian. Journal of Software: Evolution and Process, vol. 30, no. 9, doi: 10.1002/smr.1934. Online publication date: 17-Sep-2018.
  • (2017) Signifying software engineering to computational thinking learners with AgentSheets and PoliFacets. Journal of Visual Languages and Computing, vol. 40, pp. 91-112, doi: 10.1016/j.jvlc.2017.01.005. Online publication date: 1-Jun-2017.
  • (2017) Automatic verification and validation wizard in web-centred end-user software engineering. Journal of Systems and Software, vol. 125, pp. 47-67, doi: 10.1016/j.jss.2016.11.025. Online publication date: 1-Mar-2017.
  • (2016) SS-BDD. Proceedings of the 1st Brazilian Symposium on Systematic and Automated Software Testing, pp. 1-10, doi: 10.1145/2993288.2993296. Online publication date: 19-Sep-2016.
  • (2016) Reflecting on the Physics of Notations applied to a visualisation case study. Proceedings of the 6th Mexican Conference on Human-Computer Interaction, pp. 24-31, doi: 10.1145/2967175.2967383. Online publication date: 21-Sep-2016.
  • (2014) BumbleBee: a refactoring environment for spreadsheet formulas. Proceedings of the 22nd ACM SIGSOFT International Symposium on Foundations of Software Engineering, pp. 747-750, doi: 10.1145/2635868.2661673. Online publication date: 11-Nov-2014.
  • (2013) Improving spreadsheet test practices. Proceedings of the 2013 Conference of the Center for Advanced Studies on Collaborative Research, pp. 56-69, doi: 10.5555/2555523.2555531. Online publication date: 18-Nov-2013.
  • (2013) Data clone detection and visualization in spreadsheets. Proceedings of the 2013 International Conference on Software Engineering, pp. 292-301, doi: 10.5555/2486788.2486827. Online publication date: 18-May-2013.
