Ciolkowski, 1999 - Google Patents
Evaluating the effectiveness of different inspection techniques on informal requirements documents
- Document ID: 1503198524022817522
- Author: Ciolkowski M
- Publication year: 1999
- Publication venue: University of Kaiserslautern, Tech. Rep.
Snippet
Software inspections promise to detect and remove defects before they propagate to subsequent development phases. They are particularly important for requirements documents since these documents represent the starting point for many software …
Classifications
- G06F11/3688—Test management for test execution, e.g. scheduling of test suites
- G06F17/5009—Computer-aided design using simulation
- G06F11/3604—Software analysis for verifying properties of programs
- G06F11/2257—Detection or location of defective computer hardware by testing during standby operation or during idle time, e.g. start-up testing using expert systems
- G06Q10/10—Office automation, e.g. computer aided management of electronic mail or groupware; Time management, e.g. calendars, reminders, meetings or time accounting
- G06N5/022—Knowledge engineering, knowledge acquisition
- G06Q10/06—Resources, workflows, human or project management, e.g. organising, planning, scheduling or allocating time, human or machine resources; Enterprise planning; Organisational models
- G06F19/36—Computer-assisted acquisition of medical data, e.g. computerised clinical trials or questionnaires
- G09B7/02—Electrically-operated teaching apparatus or devices working with questions and answers of the type wherein the student is expected to construct an answer to the question which is presented or wherein the machine gives an answer to the question presented by a student
Similar Documents
Publication | Title
---|---
Sjøberg et al. | Construct validity in software engineering
Shull | Developing techniques for using software documents: a series of empirical studies
Basili et al. | Building knowledge through families of experiments
Basili et al. | The empirical investigation of perspective-based reading
Porter et al. | Comparing detection methods for software requirements inspections: A replicated experiment
Mays et al. | Experiences with defect prevention
Walia et al. | Using error abstraction and classification to improve requirement quality: conclusions from a family of four empirical studies
Cook et al. | How to perform credible verification, validation, and accreditation for modeling and simulation
Sebrechts et al. | Using algebra word problems to assess quantitative ability: Attributes, strategies, and errors
Häser et al. | Is business domain language support beneficial for creating test case specifications: A controlled experiment
Sabaliauskaite et al. | Challenges in aligning requirements engineering and verification in a large-scale industrial context
Melo et al. | Testing education: A survey on a global scale
Shull et al. | Replicated studies: building a body of knowledge about software reading techniques
Kaleeswaran et al. | A user study for evaluation of formal verification results and their explanation at Bosch
He et al. | PBR vs. checklist: a replication in the n-fold inspection context
Williams et al. | Examination of the software architecture change characterization scheme using three empirical studies
Ciolkowski | Evaluating the effectiveness of different inspection techniques on informal requirements documents
Sabaliauskaite et al. | Further investigations of reading techniques for object-oriented design inspection
Véras et al. | A benchmarking process to assess software requirements documentation for space applications
Easterbrook et al. | Case studies for software engineers
Huang et al. | A New Code Review Method based on Human Errors
Kamsties et al. | An empirical investigation of the defect detection capabilities of requirements specification languages
Williams et al. | Characterizing software architecture changes: An initial study
Abrahao et al. | On the effectiveness of dynamic modeling in UML: Results from an external replication
McCall et al. | Software reliability, measurement, and testing software reliability and test integration