US20050043919A1 - Process and system for quality assurance for software - Google Patents
Process and system for quality assurance for software
- Publication number
- US20050043919A1 (application number US10/938,957; US93895704A)
- Authority
- US
- United States
- Prior art keywords
- test
- acceptance
- quality assurance
- function
- high level
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Preventing errors by testing or debugging software
- G06F11/3668—Software testing
- G06F11/3672—Test management
- G06F11/3684—Test management for test design, e.g. generating new test cases
Definitions
- the present invention relates to quality assurance processes used in software development, and more particularly, to acceptance testing processes used in software development.
- Acceptance testing is a quality assurance process that is used in the software industry for a newly developed software application. It is a process of verifying that the newly developed software application performs in accordance with design specifications for such newly developed software.
- the goals of acceptance testing are to achieve zero defects in the newly developed software application and on-time delivery of such newly-developed software application.
- Acceptance testing may be performed for an internally developed software application as well as for an externally developed software application.
- Acceptance testing may be performed for a software application intended for use internally within a business entity as well as for a software application developed for a client. Acceptance testing may be performed by a software developer, a user of the software or a third party.
- acceptance testing is often performed in an ad hoc manner. Steps in an acceptance testing process may be repeated or inefficiently performed. Thus, acceptance testing may require a greater amount of time than is ideal and may not effectively eliminate defects in the newly developed software.
- a process and system for quality assurance for a newly developed software application includes developing a high level quality assurance resource estimate and a high level quality assurance time estimate; producing a business analysis outline; and creating an acceptance test plan using an acceptance test plan template with the business analysis outline.
- the process further includes creating a plurality of test cases to be carried out during a test execution phase (i.e., during an acceptance test) of the quality assurance process using the acceptance test plan; refining the high level quality assurance resource estimate and the high level quality assurance time estimate based on at least the acceptance test plan; executing each of the test cases in an acceptance test to produce a set of test results for each of the test cases; and evaluating each of the sets of test results against the refined high level quality assurance resource estimate and the refined high level quality assurance time estimate.
- One or more defects tracked during the execution of each of the test cases are reported and corrected and a sign off of the acceptance test is negotiated with a client.
- An application audit package is created and stored for future reference.
- FIG. 1 is a flow chart illustrating one embodiment of a process of conducting an acceptance test
- FIG. 2 is a flow chart illustrating one embodiment of a process of conducting a staff allocation
- FIG. 3 is a flow chart illustrating one embodiment of a process of developing an acceptance test plan
- FIG. 4 is a flow chart illustrating one embodiment of a process of creating a plurality of test scripts for an acceptance test
- FIG. 5 is a flow chart illustrating one embodiment of a process for processing defects in an acceptance test
- FIG. 6 is an example screen shot of one embodiment of a function log document
- FIG. 7 is an example screen shot of one embodiment of a test script document
- FIG. 8 is a block diagram of one embodiment of a system for acceptance testing.
- FIG. 9 is a block diagram of one embodiment of a system for data analysis for acceptance testing.
- the present invention is described in relation to a process and system for acceptance testing of software. Nonetheless, the characteristics and parameters pertaining to the process and system of the invention may be applicable to acceptance testing of other types of products and services.
- the process and system, and the sub-processes and subsystems, of the invention may be used for updates to existing software applications as well as for newly developed software applications.
- Acceptance testing is a third tier of software functionality verification. Acceptance testing occurs after a first tier of unit testing and a second tier of system testing with respect to a particular software product or application have been completed. Acceptance testing involves testing the entire software application's functionality as it is to be used in a production environment by a user.
- FIG. 1 is a flowchart illustrating one embodiment of a process of conducting an acceptance test.
- a high level quality assurance resource estimate and a high level quality assurance time estimate are developed.
- a business analysis outline is produced.
- an acceptance test plan is created using an acceptance test plan template with the business analysis outline produced at step 102 .
- the high level quality assurance resource estimate and the high level quality assurance time estimate are refined.
- a plurality of test cases to be carried out during a test execution phase of an acceptance test are developed.
- each of the test cases are executed in the acceptance test to produce a set of test results for each of the test cases.
- each of the sets of results of the test cases is evaluated against predicted results outlined in an acceptance test function log or test scripts, described below with reference to step 312 of FIG. 3 .
- one or more defects tracked during the execution of each of the test cases are processed.
- a sign-off of the acceptance test is negotiated with a client.
- an application audit package is created and stored for future reference.
- the various processes illustrated in FIG. 1 may be performed by a system, such as the system illustrated in FIGS. 8 and 9. Additionally, the sequence of steps shown in FIG. 1 may be modified in accordance with the present invention. The steps illustrated in FIG. 1 will now be described in greater detail.
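- For illustration, the sequence of FIG. 1 can be written as an ordered workflow. The following Python sketch is purely illustrative; the step descriptions paraphrase FIG. 1 and the function name is an assumption, not part of the patent.

```python
# Illustrative sketch only: the step descriptions paraphrase FIG. 1;
# a real implementation would call into the sub-processes of FIGS. 2-5.
ACCEPTANCE_TEST_STEPS = [
    (101, "develop high level QA resource and time estimates"),
    (102, "produce a business analysis outline"),
    (103, "create the acceptance test plan from a template and the outline"),
    (104, "refine the QA resource and time estimates"),
    (105, "develop test cases (function logs) from the plan"),
    (106, "execute each test case in the acceptance test"),
    (107, "evaluate results against the expected results"),
    (108, "process defects tracked during execution"),
    (109, "negotiate sign-off with the client"),
    (110, "create and store the application audit package"),
]

def run_acceptance_test_process() -> None:
    """Walk the FIG. 1 steps in their fixed order."""
    for number, description in ACCEPTANCE_TEST_STEPS:
        print(f"step {number}: {description}")

if __name__ == "__main__":
    run_acceptance_test_process()
```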
- a high level quality assurance resource estimate and a high level quality assurance time estimate are developed. Each of these estimates may be based on a past quality assurance experience, a number of test cases and/or a level of staffing.
- the high level quality assurance resource estimate and the high level quality assurance time estimate may each be developed in a staff allocation process, as described below.
- the high level quality assurance resource estimate and the high level quality assurance time estimate are each developed very early in a project life cycle of a software development project.
- the acceptance testing process is also started early in the life cycle of the software development project. Early initiation of the acceptance testing process allows earlier detection and correction of any defects, and a lowering of a cost of correction of any defects.
- a business analysis outline is produced.
- the business analysis outline is an input document to the acceptance testing process.
- the business analysis outline may be prepared by a lead quality assurance analyst.
- the business analysis outline may include, for example, an interpretation of a plurality of user requirements for a software application to be developed in the software development project.
- the business analysis outline may also include a plurality of queries for making a determination of whether the user requirements have been met.
- the step 102 of producing the business analysis outline also provides an opportunity to reevaluate and confirm that the prior high level quality assurance resource estimate and the high level quality assurance time estimate are correct.
- an acceptance test plan for testing the software application is created.
- the acceptance test plan may be created by the lead quality assurance analyst using an acceptance test plan template along with the business analysis outline developed at step 102.
- the acceptance test plan is a document describing a plurality of steps to be conducted during execution of the acceptance test, a plurality of factors that need to be considered in executing the acceptance test, and any other elements associated with the execution of the acceptance test.
- the acceptance test plan creation process will be described in more detail below with reference to FIG. 3 .
- the prior high level quality assurance resource estimate and the high level quality assurance time estimate are refined.
- the high level quality assurance resource estimate and the high level quality assurance time estimate may be refined based on the acceptance test plan created at step 103 .
- test cases or function logs are developed using the acceptance test plan created in step 103 .
- the test cases or function logs are comprised of a plurality of detailed descriptions of tests that are to be carried out during the execution of the acceptance test plan. The development of the test cases will be described in more detail below with reference to FIG. 4 .
- each of the test cases developed in step 105 are executed in the acceptance test.
- a set of test results from each of the executed test cases is evaluated.
- Each of the sets of test results is evaluated against a set of predefined expected results.
- Each of the test cases is then labeled by a tester to indicate a “pass” or a “fail”.
- Each of the test cases that fails may indicate a defect in the software application.
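- A minimal sketch of the evaluation in steps 106 and 107 follows, assuming each test case carries a predefined expected result and that the comparison is by simple equality; both assumptions, and the example values, are for illustration only.

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    number: int
    description: str
    expected_result: str

def evaluate(case: TestCase, actual_result: str) -> str:
    """Label a test case "pass" or "fail" by comparing the actual result
    with the predefined expected result; a "fail" may indicate a defect."""
    return "pass" if actual_result == case.expected_result else "fail"

# Hypothetical usage
case = TestCase(1, "request a quote for a single-operator vehicle",
                "quote displayed without an odometer prompt")
print(evaluate(case, "quote displayed without an odometer prompt"))  # pass
print(evaluate(case, "user prompted for odometer information"))      # fail
```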
- each of the defects identified during the execution step 106 and the evaluation step 107 of the acceptance test is reported.
- as a defect is encountered, the defect is identified, logged and submitted to the team of software developers for correction.
- the team of software developers may correct each of the identified defects or explain why the identified defect is not a defect. Once corrected, each of the identified defects is retested and, if appropriate, closed.
- a sign-off of the acceptance test is negotiated with a client.
- the lead quality assurance analyst may initiate a discussion with the client to review test results.
- any outstanding issues (open defects, system or training issues) may be negotiated.
- the effect of defects on time frames may be discussed with the client.
- in some cases, open defects have too great an impact on system functionality and, thus, implementation may need to be postponed until the defect is fixed.
- in other cases, open defects are minor enough that implementation is undisturbed and the application is moved into production with the known defects.
- an application audit package is created and stored for future reference.
- the application audit package may include completed function logs, a project defect log, a sample of a few defect tickets and resolutions, a representative sample of products produced and a business area sign-off form.
- the application audit package may also include, where applicable, an acceptance test plan, a final business analysis, and a service request.
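- The audit package of step 110 can be modeled as a simple record. This sketch is illustrative only; the field names are assumptions chosen to mirror the items listed above.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ApplicationAuditPackage:
    """Step 110: materials retained for future reference (field names assumed)."""
    completed_function_logs: List[str] = field(default_factory=list)
    project_defect_log: Optional[str] = None
    sample_defect_tickets_and_resolutions: List[str] = field(default_factory=list)
    representative_product_samples: List[str] = field(default_factory=list)
    business_area_sign_off_form: Optional[str] = None
    # Included where applicable:
    acceptance_test_plan: Optional[str] = None
    final_business_analysis: Optional[str] = None
    service_request: Optional[str] = None
```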
- the acceptance test process of the present invention provides a repeatable process that effectively directs a verification of a software development process and efficiently eliminates any defects found.
- the acceptance testing process communicates a plurality of roles and responsibilities for each person involved in the acceptance testing process and a methodology required to successfully validate functionality of a newly developed software application.
- the acceptance testing process facilitates communication between a client, a team of developers, a tester and an end user, thereby removing any deviations between the software application as specified by the client and the software application delivered to the client by the team of developers.
- the acceptance testing process of the invention also eliminates redundant steps involved in prior acceptance testing processes and is, therefore, more efficient.
- the acceptance testing process is easy to execute and impart to others, including less-experienced, non-technical employees who may need to be trained in conducting the acceptance testing process.
- the acceptance testing process also facilitates meeting one or more milestones in a software development project.
- FIG. 2 is a flowchart illustrating one embodiment of a process for conducting a staff allocation.
- the staff allocation process involves three (3) stages of making quality assurance resource estimates and quality assurance time estimates.
- a high level quality assurance resource estimate and a high level quality assurance time estimate are developed.
- the high level quality assurance resource estimate and the high level quality assurance time estimate are recorded in a staff allocation document.
- the high level quality assurance resource estimate and high level quality assurance time estimate are both confirmed.
- the high level quality assurance resource estimate and the high level quality assurance time estimate are both refined.
- the high level quality assurance resource estimate and the high level quality assurance time estimate are developed from the business analysis outline.
- the high level quality assurance resource estimate and the high level quality assurance time estimate developed in step 201 are the same as the high level quality assurance resource estimate and the high level quality assurance time estimate developed in step 101 of FIG. 1 .
- three (3) processes are used in making the estimates. The first process is a “similar project approach.”
- a high level quality assurance time estimate and a high level quality assurance resource estimate are derived from a quality assurance resource estimate and a quality assurance time estimate from a similarly complex and similarly sized prior development project.
- the second process is a “building block approach.” In the building block approach, a number of test cases are estimated and that number of test cases is multiplied by a time estimated to execute the test cases.
- the third process is a “resource leveling approach.” In the resource leveling approach, resources are added to those that were used in a prior development project to reduce an implementation time that was required for implementing the prior development project.
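- The building block approach is essentially an arithmetic estimate. The sketch below assumes a per-test-case execution time and an hours-per-tester-day figure; both numbers are illustrative, not values from the patent.

```python
def building_block_estimate(estimated_test_cases: int,
                            hours_per_test_case: float,
                            hours_per_tester_day: float = 6.0) -> dict:
    """Building block approach: the estimated number of test cases is
    multiplied by the time estimated to execute each case."""
    total_hours = estimated_test_cases * hours_per_test_case
    return {"total_hours": total_hours,
            "tester_days": total_hours / hours_per_tester_day}

# Example: 120 estimated test cases at 1.5 hours each -> 180 hours, 30 tester-days
print(building_block_estimate(120, 1.5))
```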
- the high level quality assurance resource estimate and the high level quality assurance time estimate are recorded in the staff allocation document.
- the high level quality assurance resource estimate and the high level quality assurance time estimate in the staff allocation document developed in step 202 are confirmed.
- the high level quality assurance resource estimate and the high level quality assurance time estimate are confirmed during the review of the business analysis outline process described above with reference to step 102 in FIG. 1 .
- the high level quality assurance resource estimate and the high level quality assurance time estimate are refined.
- the high level quality assurance resource estimate and the high level quality assurance time estimate are refined in step 204 based on an acceptance test plan, such as the acceptance test plan created in step 103 of FIG. 1 above.
- the processes described with reference to FIG. 2 may be performed by a system, such as the system described below with reference to FIGS. 8 and 9 .
- a plurality of inputs to the staff allocation process may include past quality assurance project data, the business analysis outline and the acceptance test plan.
- the output of the staff allocation process includes the high level quality assurance time estimate and the high level quality assurance resource estimate, and a staff allocation document which may be provided to members of interested or associated groups including, for example, an information technology group. Each successive stage of the quality assurance staff allocation process requires more accurate time and resource predictions.
- FIG. 3 is a flowchart illustrating one embodiment of a process of developing an acceptance test plan.
- an overview is created for the acceptance test plan process.
- a plurality of objectives for completing the acceptance test plan are developed.
- a plurality of risks associated with the acceptance testing are identified.
- a plurality of remedies for the identified risks are proposed.
- a plurality of assumptions for a successful acceptance testing process are identified.
- a required system environment for conducting the acceptance testing is identified.
- a plurality of limitations imposed upon the acceptance test are identified.
- a plurality of sign-offs required for approving the acceptance test plan are identified.
- a glossary of terms used in the acceptance test plan is drafted.
- a strategy and an approach for the acceptance test are developed.
- a plurality of conditions for execution of the acceptance test plan are identified.
- a test specification is created.
- the results of the acceptance test plan process are recorded in an acceptance test plan document. The process illustrated in FIG. 3 may be performed by a system, such as the system illustrated in FIGS. 8 and 9 .
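- For illustration, the sections produced by steps 301 through 313 can be summarized as a simple outline. The keys below paraphrase the steps above; they are not defined terms from the patent.

```python
# Paraphrased outline of the acceptance test plan sections (steps 301-313)
ACCEPTANCE_TEST_PLAN_SECTIONS = {
    "overview": "brief description of the software application (step 301)",
    "objectives": "goals and strategies for completing the acceptance test (step 302)",
    "risks": "risks to schedule and quality (step 303)",
    "contingencies": "a remedy or contingency for every identified risk (step 304)",
    "assumptions": "basic testing-related assumptions (step 305)",
    "system_environment": "hardware and software interfaces required (step 306)",
    "limitations": "constraints such as a fixed implementation date (step 307)",
    "sign_offs": "client and information technology approvers (step 308)",
    "glossary": "terms unfamiliar to reviewers (step 309)",
    "strategy_and_approach": "test environment, scope and approach (step 310)",
    "execution_conditions": "entrance criteria, resources, tools and organization (step 311)",
    "test_specification": "test structure, functions/cases, scripts, schedules, completion criteria (step 312)",
}
# Step 313 records all of the above in the acceptance test plan document.
```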
- an overview of the software application being developed is created.
- the overview comprises a brief description of the software application being developed.
- if the software application being developed is for a web-site, the overview may include a description of the web-site and a plurality of objectives of the software application.
- a plurality of objectives for the acceptance test plan are developed.
- the objectives may comprise a plurality of high-level goals and a plurality of strategies required to successfully complete the acceptance test.
- the objectives may comprise ensuring that a functionality for the software application being developed as outlined in the acceptance test plan has been successfully tested, verifying that the objectives of the software application being tested have been successfully met and verifying that the software application being tested can coexist with all required functionality. Additionally, the objectives may include verifying through regression testing that other installed or related software applications continue to properly function, detailing a plurality of activities required to prepare and conduct acceptance testing, defining a methodology that will be used to test and verify system functionality and using acceptance testing as an effective communication tool in identifying and resolving a plurality of system issues and problems.
- the objectives may further include determining and coordinating a plurality of responsibilities for persons involved in the acceptance testing, defining a plurality of environments and test cycle approaches required to effectively complete the acceptance testing and defining all system functions to be tested, as well as those functions that will not be tested.
- a plurality of risks to the acceptance testing process are identified.
- the risks may include risks to a schedule and/or risks to a quality level.
- the risks identified may include a tight schedule where any additional functionality or requirement changes could result in a delay in implementation of the software application and a need for dedicated technical support in order to eliminate any testing down time.
- a plurality of remedies or contingencies are proposed for the identified risks.
- Each identified risk must have a corresponding remedy or contingency. For example, if a risk to an acceptance test schedule is that the schedule is very tight and any additional functionality or requirement changes could result in a delay in implementation of the software application, a corresponding contingency may be that any requirement changes after a certain date must follow one or more procedures, such as a predetermined change control procedure.
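- Because every identified risk must have a corresponding remedy or contingency, a small risk register can make that pairing explicit. The entries below paraphrase the example risks above; the structure and helper function are assumptions for illustration.

```python
# Hypothetical risk register: each risk maps to its remedy or contingency
risk_register = {
    "tight schedule; late requirement changes could delay implementation":
        "changes after the cut-off date must follow the change control procedure",
    "testing down time if no dedicated technical support is available":
        "dedicated technical support is assigned for the acceptance test window",
}

def unmitigated_risks(register: dict) -> list:
    """Return any risk recorded without a corresponding remedy or contingency."""
    return [risk for risk, remedy in register.items() if not remedy]

assert unmitigated_risks(risk_register) == []  # every risk has a contingency
```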
- the assumptions may include a plurality of basic testing related issues that are required for a successful acceptance testing process.
- the assumptions may include an assumption that, at the end of a design phase, a detailed schedule of the software deliverables will be published, that all functionality required to fully support an application will be ready for acceptance testing on or before a certain date, and that all documentation relating to the software application will be made available to a quality assurance group on or before a certain date.
- a system environment for acceptance testing is described.
- the description of the system environment may include a description of a hardware and a software interface needed for acceptance testing.
- the description may also include a type of hardware and software platform, such as, for example, a specific mainframe system and related software interfaces for a Marketing Sales System such as, for example, Choicepoint™, Correspondence™, EPS™ Electronic Publishing, Code 1™, POLK™, Ratabase 1.5™, or other similar systems.
- one or more limitations or constraints to the acceptance test plan are identified.
- the limitations may include, for example, that the scheduled implementation date for a new software application is firm and cannot be changed.
- a plurality of sign-offs required for approving the acceptance test plan are identified.
- the sign-offs required may include a list of persons from a client area who will be responsible for approving the acceptance test plan and reviewing any test material and a list of persons from the information technology group who will approve the acceptance test plan from a technical point of view.
- a glossary of terms used in the acceptance test plan is drafted.
- the glossary of terms may include a definition of each technical, business, and quality assurance term that may be unfamiliar to those reviewing the acceptance test plan.
- the glossary of terms may include, for example, a name of the software application being tested, a vendor name, and other terms and acronyms that may be unfamiliar to a person reviewing the acceptance test plan.
- a strategy is developed for the acceptance testing.
- the strategy or approach for the acceptance testing includes identifying a test environment for the execution of the acceptance test plan and developing a scope and an approach for the execution of the acceptance test plan.
- Identifying the test environment comprises providing a high level description of resources, both hardware and software, that will be required in order to conduct the acceptance testing.
- the description of the test environment may include a requirement for a facility to be available at a certain time, including evening and weekend availability, an identification of all interfaces that must be available to verify the test cases, and an identification of all functionality of the software being tested that must be ready for acceptance testing on a certain date for the start of acceptance testing.
- the scope and approach for the acceptance testing comprises an identification of one or more functions or tests that are planned for the acceptance testing and any testing limitations or identifiable ranges of the tests.
- the scope and approach may include, for example, a number of stages of a test with an identification of a type of test.
- for example, if a goal of the software application under development is to reduce processing time, the test may include two stages, with a first stage comprising a timing test and a second stage comprising a system functionality test.
- one or more conditions required for execution of the acceptance test plan are identified.
- the conditions may include an entrance criteria condition, a resources required condition, an acceptance test tools condition, and an organization for testing condition.
- the entrance criteria are one or more events required by the quality assurance group to be successfully completed prior to the beginning of the acceptance testing.
- the entrance criteria may include that a volunteer tester must have an ability to access the acceptance testing site from his home, that an identification of all problems detected during system testing must be forwarded to the quality assurance group prior to the start of acceptance testing, a verification that the acceptance testing environment is ready to support acceptance testing, that a test team has been identified and trained, that all system documentation has been turned over to the quality assurance group, that an acceptance testing schedule has been agreed upon and published and that all major interfaces of the software application to be tested are available for acceptance testing.
- the resources required comprise an identification of the human, hardware and software resources required to complete the acceptance testing.
- the identification of the human resources required may include a listing of persons required to conduct the acceptance testing such as a number of quality assurance analysts required, and a number of volunteer testers working from home or at the acceptance testing facility.
- the identification of the hardware/software resources required may include an identification of a type of platform such as a mainframe system, or Internet access.
- the acceptance test tools include any software package that automates any part of the acceptance testing process.
- the acceptance test tools may include Microsoft Word® or other word processing program, a copy tool for the software application being tested or a session open tool.
- the identification of the organization for acceptance testing comprises identification of persons required to conduct the acceptance testing and their corresponding responsibilities during the acceptance testing.
- the responsibilities may include a responsibility for “Review of Acceptance Test Plan”, a responsibility for “Keying Test Cases”, and a responsibility for “Reviewing Test Results”.
- the human resources required to fulfill such responsibilities may be listed under each responsibility, such as, for example, in the “Review of Acceptance Test Plan” category, the human resources required may include a Project Manager, an Applications Team Leader, a Business Analyst, a Quality Assurance and a Business Partner.
- a test specification is created for the acceptance test plan.
- the test specification may include a test structure/organization section, a test functions or cases section, a test scripts or function logs section, and a test schedules and test completion criteria section.
- the test structure/organization section may be comprised of one or two paragraphs relating to one or more types of processing cycles and environment changes that may be needed during the acceptance testing.
- the test functions or cases section makes up the primary section of the acceptance test plan.
- the test functions/cases section identifies all of the functions or cases to be verified during the acceptance testing.
- the test scripts/function logs section should be included in the test specification so that the function logs and test scripts are generally available upon request.
- a test schedule identifies a schedule established in the scope and approach section of the acceptance test plan.
- test completion criteria include a paragraph that describes a framework for handling any severe system problems that remain open at the end of acceptance testing.
- the test completion criteria may state that all major functions and sub-functions of the software application being tested will be considered to have passed acceptance testing when no severe system problem remains open.
- the test completion criteria may then describe how to identify a severe system problem.
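- The completion criterion described above reduces to a check over the defect log. In the sketch below, the severity and status labels are assumptions for illustration.

```python
from typing import Dict, List

def acceptance_test_complete(defects: List[Dict[str, str]]) -> bool:
    """All major functions and sub-functions are considered to have passed
    acceptance testing when no severe system problem remains open."""
    return not any(d["severity"] == "severe" and d["status"] == "open"
                   for d in defects)

print(acceptance_test_complete([{"severity": "minor", "status": "open"}]))   # True
print(acceptance_test_complete([{"severity": "severe", "status": "open"}]))  # False
```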
- the acceptance test plan processes are recorded in an acceptance test plan document.
- all of the processes described from step 301 to step 312 will be recorded in a document to be approved by the persons identified at step 308 .
- FIG. 4 is a flowchart illustrating one embodiment of a process of creating a plurality of test scripts for an acceptance test plan.
- an acceptance test plan is reviewed. Specifically, the test functions/cases section of the test specification created at step 312 of FIG. 3 is reviewed.
- a function log process is performed.
- the functions/cases section of the acceptance test plan is referenced in order to create one or more function logs.
- a function log may contain many individual test cases or scenarios.
- a function log may include a plurality of headings including, a function heading, a function number heading, an identifier heading, an expected results heading and a pass/fail heading. An example of a function log is described with reference to FIG. 6 below.
- An input to the function log process is the test function/cases section of the acceptance test plan.
- An output of the function log process is a physical document called a function log.
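- One row of a function log can be modeled with the headings listed above. The field names below mirror those headings; everything else in this sketch is an assumption.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class FunctionLogEntry:
    """One test case or scenario in a function log document."""
    function: str                    # description of the function under test
    function_number: int
    identifier: str
    expected_results: str
    pass_fail: Optional[str] = None  # recorded as "Pass" or "Fail" during execution
```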
- a test script is created using the function log.
- a test script may contain more specific testing instructions than a function log.
- the test script may include a plurality of headings including a function/script heading, a function/script number heading, a script activity per cycle heading, an expected results heading and a pass/fail per cycle heading.
- Either the function logs or the test scripts may be used as guides for executing the acceptance test plan, depending upon the skill level of the tester. If the skill level of the tester is high or if the tester has a lot of experience in conducting acceptance testing, the function logs may be used by the tester. Because the test scripts contain more detailed instructions, a tester having a lower skill level may execute the acceptance test if the test scripts are used.
- the input to the test script process is the function log document and the output of the test script process is the test script document.
- the test case descriptions from the acceptance test plan, the function logs or the test scripts may be used as instructions for carrying out test cases, where the test scripts comprise more detailed instructions than the function logs, and the function logs comprise more detailed instructions than the test case descriptions.
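- The choice of guide can be expressed as a small rule of thumb. The experience threshold below is purely illustrative and not taken from the patent.

```python
def choose_test_guide(years_of_acceptance_testing_experience: float) -> str:
    """Test scripts carry more detail than function logs, so a less
    experienced tester works from the scripts; the threshold is illustrative."""
    return ("function log"
            if years_of_acceptance_testing_experience >= 2
            else "test script")

print(choose_test_guide(5))    # experienced tester -> function log
print(choose_test_guide(0.5))  # less experienced tester -> detailed test script
```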
- FIG. 5 is a flowchart illustrating one embodiment of a process of processing defects in an acceptance test plan.
- a potential defect is identified during the acceptance testing.
- a defect ticket record is created for the identified potential defect.
- the defect is recorded on a project defect tracking log.
- the defect ticket record is transmitted to a developer area 510 . Steps 511 - 513 occur in developer area 510 .
- a developer reviews the defect ticket record.
- the developer corrects the defect or explains why the identified problem is not a defect.
- the developer then updates the defect ticket record to reflect a date of correction of the defect and an identity of a corrector of the defect if the defect was corrected. If the identified defect was not corrected, the developer updates the defect ticket record with an explanation for why the defect was not corrected.
- the developer returns the defect ticket record to a quality assurance area.
- a quality assurance analyst in the quality assurance area checks to see if the defect needs to be retested. If the defect was corrected in the developer area 510 , then the corrected defect will require retesting. If the defect was retested, then, at step 506 , the quality assurance analyst checks to see if the corrected defect passed the retesting.
- if the corrected defect did not pass the retesting, the process returns to step 503 where the updated defect information is recorded on the project defect tracking log and transmitted at step 504 to the developer area 510. If the corrected defect passes the retesting, the project defect tracking log is updated at step 507 and the defect ticket record is also updated to reflect that the corrected defect passed the retesting and that the defect ticket record is closed. If the defect does not need to be retested, the project defect tracking log is updated at step 507. At step 508, the defect ticket records are placed in a project folder.
- the defect ticket record of step 502 may include an identification of the project, a defect number, a priority level identification, an identification of a person reporting the defect, a test case number, a function area identification, a description of the defect, a date of opening of the defect ticket, an identification of the software application subject to the defect, an identification of a corrector and a date of correction of the defect, an identification of a retester and a date of retesting, and a date of closing of the defect ticket record if the corrected defect passes the retesting.
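- The defect ticket record of step 502 and the retest loop of FIG. 5 might be sketched as follows; the field and parameter names are assumptions chosen to mirror the fields listed above.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DefectTicket:
    """Sketch of the defect ticket record of step 502 (field names assumed)."""
    project: str
    defect_number: int
    priority: str
    reported_by: str
    test_case_number: int
    function_area: str
    description: str
    date_opened: str
    application: str
    corrected_by: Optional[str] = None
    date_corrected: Optional[str] = None
    retested_by: Optional[str] = None
    date_retested: Optional[str] = None
    date_closed: Optional[str] = None

def close_if_passed(ticket: DefectTicket, passed_retest: bool, close_date: str) -> bool:
    """Close the ticket only when the corrected defect passes retesting;
    otherwise it returns to the developer area (steps 503-504)."""
    if ticket.date_corrected and passed_retest:
        ticket.date_closed = close_date
        return True
    return False
```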
- FIG. 6 is an example of a screen shot of one embodiment of a function log document 600 illustrating an acceptance test plan for an automotive insurance software application developed for an automotive insurance provider entity. It should be understood, however, that the function log document of the invention can be modified as required for use by any type of business entity for acceptance testing of any type of software application.
- Function log document 600 includes a plurality of fields including a Function field 601 , a Function Number field 602 , an Identification Number field 603 , an Expected Results field 604 and a Pass/Fail field 605 .
- a plurality of test cases 606 - 610 are performed.
- the test case 606 is for a user who learned of the insurance provider through a radio commercial, who has one car with one operator where the user has a date of birth in 1930 and the car has an annual mileage of 3,000 miles per year, as listed in Function field 601 of test case 606 .
- a function number assigned this test case 606 is 1, as listed under the Function Number field 602 .
- the test case 606 is assigned an identification number of 840430090 as listed in the Identification Number field 603 .
- the expected results of the entered information for the test case 606 are that the user should not be asked for the odometer purchase date or date of purchase, as listed under the Expected Results field 604.
- An indication is listed in the Pass/Fail field 605 of whether the test case 606 passed or failed the testing. As shown in FIG. 6 , the test case 606 passed the testing.
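- Written out as data, the FIG. 6 example row might look like the following; the dictionary keys mirror fields 601-605 and the values restate the example above.

```python
# The FIG. 6 example row, restated as data (keys mirror fields 601-605)
fig6_test_case_606 = {
    "function": ("user learned of the provider through a radio commercial; "
                 "one car with one operator; date of birth 1930; "
                 "annual mileage of 3,000 miles per year"),
    "function_number": 1,
    "identification_number": "840430090",
    "expected_results": ("user should not be asked for the odometer purchase "
                         "date or date of purchase"),
    "pass_fail": "Pass",
}
```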
- FIG. 7 is an example of a screen shot of one embodiment of a test script document 700 .
- the test script document 700 is a test script used for acceptance testing of the automotive insurance software application for insurance requirements in the State of California.
- the test script document 700 includes a plurality of fields including a Test Case Number field 701 , a Log/Script Activity field 702 , a User Number field 703 , an Expected Results field 704 and a Comments field 705 .
- the Test Case Number field 701 lists the number of the test case.
- the log/script activity field 702 lists a user ID and a password to be entered by a tester and detailed instructions on how to perform the test case.
- a test case 706 is input for a situation where a user lives in California compared to where the user lives in a state other than California.
- the expected results, as listed in the Expected Results field 704 are that a quote for a cost of an automotive insurance product should be the same for the user regardless of whether the user's information was entered into system “Q” or “M”.
- the Comments field 705 indicates that the software application subject to testing passed the testing.
- the Comments field 705 may also be used by a tester to enter any comments or notes that indicate some activity needs to be performed or investigated.
- FIG. 8 is a block diagram of a system 800 used for acceptance testing.
- System 800 includes a staff allocation module 801, an acceptance test plan module 802, a test script module 803, a defects reporting module 804, an external processes module 805 and an acceptance testing module 810.
- the acceptance testing module 810 receives inputs from the staff allocation module 801 , the acceptance test plan module 802 , the test script module 803 , the defects reporting module 804 and the external processes module 805 .
- the execution of the acceptance test plan occurs in the acceptance testing module 810 .
- the high level quality assurance resource estimate and the high level quality assurance time estimate for the acceptance testing, and the staff allocation document, are developed in the staff allocation module 801.
- the staff allocation module 801 includes means (not shown) for developing the high level quality assurance resource estimate and the high level quality assurance time estimate and for recording such high level quality assurance resource estimate and high level quality assurance time estimate in the staff allocation document.
- the acceptance test plan module 802 receives an output from the staff allocation module 801 and develops an acceptance test plan.
- the acceptance test plan module 802 includes means (not shown) for developing the acceptance test plan as described above with reference to FIG. 3 .
- the outputs of the acceptance test plan module 802 are coupled to the acceptance testing module 810 and the test script module 803 .
- the test script module 803 receives the acceptance test plan from acceptance test plan module 802 and generates both the function log document and the test scripts.
- the test script module 803 includes means (not shown) for creating the function logs document and the test scripts as described above with reference to FIG. 4 .
- the function logs document and the test scripts are transmitted to the acceptance testing module 810 for execution of the acceptance test plan.
- the defects reporting module 804 receives inputs from both the external processes module 805 and the acceptance testing module 810 .
- the defects reporting module 804 processes any defects identified during the execution of the acceptance test plan and transmits the identified defects to the developer area.
- the defects reporting module 804 includes means (not shown) for processing defects as described above with reference to FIG. 5 .
- the external processes module 805 receives inputs from both the acceptance testing module 810 and the defects reporting module 804 .
- the external processes module 805 has an output connected to the staff allocation module 801 .
- the external processes module 805 includes the development area, an information technology area, or an area for any other processes external to the acceptance testing.
- the external processes module 805 receives or generates a business analysis outline which it transmits to the staff allocation module 801 and the acceptance testing module 810 .
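- The module connections of FIG. 8 can be summarized as a directed graph. The adjacency list below only restates the inputs and outputs described above; the representation itself is an assumption for illustration.

```python
# Directed edges: each module of FIG. 8 -> the modules that receive its output
ACCEPTANCE_TEST_SYSTEM_800 = {
    "staff_allocation_801":     ["acceptance_test_plan_802", "acceptance_testing_810"],
    "acceptance_test_plan_802": ["test_script_803", "acceptance_testing_810"],
    "test_script_803":          ["acceptance_testing_810"],
    "defects_reporting_804":    ["acceptance_testing_810", "external_processes_805"],
    "external_processes_805":   ["staff_allocation_801", "acceptance_testing_810",
                                 "defects_reporting_804"],
    "acceptance_testing_810":   ["defects_reporting_804", "external_processes_805"],
}
```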
- FIG. 9 is a block diagram illustrating the components of one embodiment of a system 900 used for acceptance testing.
- system 900 may be comprised of a processor module 902 , a display 904 , a user input 906 , a data input module 908 , a data storage module 910 , and an output module 912 .
- the processor module 902 receives inputs from the data input module 908 and the user input module 906 , and provides outputs via the display 904 and the output module 912 .
- the processor module 902 may also receive inputs and provide outputs through the data storage module 910 .
- the processor module 902 may be a standard processor suitable for performing any necessary calculations required for the acceptance testing, including multiple task processing as necessary. As illustrated, the processor module 902 may receive inputs from the data input module 908 and the user input module 906 , as well as data from the data storage module 910 .
- the data input module 908 may be any conventional data input device, such as a magnetic or an optical disk drive, a CD-ROM, a scanner, a modem, an Internet connection, a hard-wired connection, or any other device for inputting data to the processor module 902.
- the user input module 906 may be any conventional user input device, such as a keyboard, a touch-screen, a roller-ball, a mouse, a pointer, or any other device for a user to enter and direct manipulation of data in the processor module 902 .
- the data storage module 910 may be comprised of any conventional storage device, such as a computer memory, a magnetic or an optical disc or a CD-ROM, a tape-to-tape reel, or any other device for storing data.
- the data storage module 910 may contain information related to the business analysis outline, the acceptance test plan, the high level quality assurance time estimate and the high level quality assurance resource estimate, the past quality assurance project, defects and other information.
- the processor module 902 may be capable of accessing data in the data storage module 910 .
- the data storage module 910 may be searchable by a field or in a variety of other conventional manners.
- the processor module 902 may provide information through the display 904 and the output module 912 , as well as provide data to the data storage module 910 .
- the display 904 may be any conventional display device, such as a television, a monitor, or any other display device.
- the output module 912 may be any conventional output device, such as a printer, a facsimile machine, a magnetic disc drive, a compact disc drive or an optical disc drive, a modem, an Internet connection, a hard-wired connection, or any other device for outputting data from the processor module 902.
Abstract
A process and system for quality assurance. The process includes developing a high level quality assurance resource estimate and a high level quality assurance time estimate; producing a business analysis outline; and creating an acceptance test plan using an acceptance test plan template with the business analysis outline. The process further includes creating a plurality of test cases to be carried out during a test execution phase of the quality assurance process using the acceptance test plan; refining the high level quality assurance resource estimate and the high level quality assurance time estimate based on the acceptance test plan; executing each of the test cases in an acceptance test to produce a set of test results for each of the test cases; and evaluating the test results against the refined high level quality assurance resource estimate and the refined high level quality assurance time estimate. One or more defects tracked during the execution of the test cases are reported and a sign off of the acceptance test is negotiated with a client. An application audit package is created and stored for future reference.
Description
- The present invention relates to quality assurance processes used in software development, and more particularly, to acceptance testing processes used in software development.
- Acceptance testing is a quality assurance process that is used in the software industry for a newly developed software application. It is a process of verifying that the newly developed software application performs in accordance with design specifications for such newly developed software. The goals of acceptance testing are to achieve zero defects in the newly developed software application and on-time delivery of such newly-developed software application. Acceptance testing may be performed for an internally developed software application as well as for an externally developed software application. Acceptance testing may be performed for a software application intended for use internally within a business entity as well as for a software application developed for a client. Acceptance testing may be performed by a software developer, a user of the software or a third party.
- However, acceptance testing is often performed in an ad hoc manner. Steps in an acceptance testing process may be repeated or inefficiently performed. Thus, acceptance testing may require a greater amount of time than is ideal and may not effectively eliminate defects in the newly developed software.
- A process and system for quality assurance for a newly developed software application is described. The process includes developing a high level quality assurance resource estimate and a high level quality assurance time estimate; producing a business analysis outline; and creating an acceptance test plan using an acceptance test plan template with the business analysis outline. The process further includes creating a plurality of test cases to be carried out during a test execution phase (i.e., during an acceptance test) of the quality assurance process using the acceptance test plan; refining the high level quality assurance resource estimate and the high level quality assurance time estimate based on at least the acceptance test plan; executing each of the test cases in an acceptance test to produce a set of test results for each of the test cases; and evaluating each of the sets of test results against the refined high level quality assurance resource estimate and the refined high level quality assurance time estimate. One or more defects tracked during the execution of each of the test cases are reported and corrected and a sign off of the acceptance test is negotiated with a client. An application audit package is created and stored for future reference.
-
FIG. 1 is a flow chart illustrating one embodiment of a process of conducting an acceptance test; -
FIG. 2 is a flow chart illustrating one embodiment of a process of conducting a staff allocation; -
FIG. 3 is a flow chart illustrating one embodiment of a process of developing an acceptance test plan; -
FIG. 4 is a flow chart illustrating one embodiment of a process of creating a plurality of test scripts for an acceptance test; -
FIG. 5 is a flow chart illustrating one embodiment of a process for processing defects in an acceptance test; -
FIG. 6 is an example-screen shot of one embodiment of a function log document; -
FIG. 7 is an example screen shot of one embodiment of a test script document; -
FIG. 8 is a block diagram of one embodiment of a system for acceptance testing; and -
FIG. 9 is a block diagram of one embodiment of a system for data analysis for acceptance testing. - The present invention is described in relation to a process and system for acceptance testing of software. Nonetheless, the characteristics and parameters pertaining to the process and system of the invention may be applicable to acceptance testing of other types of products and services. The process and system, and the sub-processes and subsystems, of the invention may be used for updates to existing software applications as well as for newly developed software applications.
- Acceptance testing is a third tier of software functionality verification. Acceptance testing occurs after a first tier of unit testing and a second tier of system testing with respect to a particular software product or application have been completed. Acceptance testing involves testing the entire software application's functionality as it is to be used in a production environment by a user.
-
FIG. 1 is a flowchart illustrating one embodiment of a process of conducting an acceptance test. Atstep 101, a high level quality assurance resource estimate and a high level quality assurance time estimate are developed. Atstep 102, a business analysis outline is produced. Atstep 103, an acceptance test plan is created using an acceptance test plan template with the business analysis outline produced atstep 102. Atstep 104, the high level quality assurance resource estimate and the high level quality assurance time estimate are refined. Atstep 105, a plurality of test cases to be carried out during a test execution phase of an acceptance test are developed. Atstep 106, each of the test cases are executed in the acceptance test to produce a set of test results for each of the test cases. Atstep 107, each of the sets of results of the test cases is evaluated against predicted results outlined in an acceptance test function log or test scripts, described below with reference tostep 312 ofFIG. 3 . Atstep 108, one or more defects tracked during the execution of each of the test cases are processed. Atstep 109, a sign-off of the acceptance test is negotiated with a client. Atstep 110, an application audit package is created and stored for future reference. - As will be described in more detail below, the various processes illustrated in
FIG. 1 may be performed by a system, such as the system illustrated inFIGS. 8 and 9 . Additionally, the sequence of steps shown inFIG. 1 may be modified in accordance with the present invention. The steps illustrated inFIG. 1 will now be described in greater detail. - At
step 101, a high level quality assurance resource estimate and a high level quality assurance time estimate is developed. Each of these estimates may be based on a past quality assurance experience, a number of test cases and/or a level of staffing. The high level quality assurance resource estimate and the high level quality assurance time estimate may each be developed in a staff allocation process, as described below. - The high level quality assurance resource estimate and the high level quality assurance time estimate are each developed very early in a project life cycle of a software development project. Thus, the acceptance testing process is also started early in the life cycle of the software development project. Early initiation of the acceptance testing process allows earlier detection and correction of any defects, and a lowering of a cost of correction of any defects.
- At
step 102, a business analysis outline is produced. The business analysis outline is an input document to the acceptance testing process. The business analysis outline may be prepared by a lead quality assurance analyst. The business analysis outline may include, for example, an interpretation of a plurality of user requirements for a software application to be developed in the software development project. The business analysis outline, may also include a plurality of queries for making a determination of whether the user requirements have been met. Thestep 102 of producing the business analysis outline also provides an opportunity to reevaluate and confirm that the prior high level quality assurance resource estimate and the high level quality assurance time estimate are correct. - At
step 103, an acceptance test plan for testing the software application is created. The acceptance test-plan may be created by the lead quality assurance analyst using an acceptance test plan template along with the business analysis outline developed atstep 102. The acceptance test plan is a document describing a plurality of steps to be conducted during execution of the acceptance test, a plurality of factors that need to be considered in executing the acceptance test, and any other elements associated with the execution of the acceptance test. The acceptance test plan creation process will be described in more detail below with reference toFIG. 3 . - At
step 104, the prior high level quality assurance resource estimate and the high level quality assurance time estimate are refined. The high level quality assurance resource estimate and the high level quality assurance time estimate may be refined based on the acceptance test plan created atstep 103. - At
step 105, a plurality of test cases or “function logs” are developed using the acceptance test plan created instep 103. The test cases or function logs are comprised of a plurality of detailed descriptions of tests that are to be carried out during the execution of the acceptance test plan. The development of the test cases will be described in more detail below with reference toFIG. 4 . - At
step 106, each of the test cases developed instep 105 are executed in the acceptance test. Atstep 107, a set of test results from each of the executed test cases is evaluated. Each of the sets of test results is evaluated against a set of predefined expected results. Each of the test cases is then labeled by a tester to indicate a “pass” or a “fail”. Each of the test cases that fails may indicate a defect in the software application. - At
step 108, each of the defects reported during theexecution step 106 and theevaluation step 107 of the acceptance test are reported. Atstep 108, as a defect is encountered, the defect is identified, logged and submitted to the team of software developers for correction. The team of software developers may correct each of the identified defects or explain why the identified corrected defect is not a defect. Once corrected, each of the identified corrected defects is retested, and if appropriate, closed. - At
step 109, a sign-off of the acceptance test is negotiated with a client. The lead quality assurance analyst may initiate a discussion with the client to review test results. At this time, any outstanding issues (open defects, system or training issues) may be negotiated. For example, the effect of defects on time frames may be discussed with the client. In some cases, open defects have too great an impact on system functionality and, thus, implementation may need to be moved forward until the defect is fixed. In other cases, open defects are minor enough that implementation is undisturbed and the application is moved into production with the known defects. - At
step 110, an application audit package is created and stored for future reference. The application audit package may include completed function logs, a project defect log, a sample of a few defect tickets and resolutions, a representative sample of products produced and a business area sign-off form. The application audit package may also include, where applicable, an acceptance test plan, a final business analysis, and a service request. - The acceptance test process of the present invention provides a repeatable process that effectively directs a verification of a software development process and efficiently eliminates any defects found. The acceptance testing process communicates a plurality of roles and responsibilities for each person involved in the acceptance testing process and a methodology required to successfully validate functionality of a newly developed software application. The acceptance testing process facilitates communication between a client, a team of developers, a tester and an end user, thereby removing any deviations between the software application as specified by the client and the software application delivered to the client by the team of developers.
- The acceptance testing process of the invention also eliminates redundant steps involved in prior acceptance testing processes and is, therefore, more efficient. The acceptance testing process is easy to execute and impart to others, including less-experienced, non-technical employees who may need to be trained in conducting the acceptance testing process. The acceptance testing process also facilitates meeting one or more milestones in a software development project.
-
FIG. 2 is a flowchart illustrating one embodiment of a process for conducting a staff allocation. The staff allocation process involves three (3) stages of making quality assurance resource estimates and quality assurance time estimates. Atstep 201, a high level quality assurance resource estimate and a high level quality assurance time estimate are developed. Atstep 202, the high level quality assurance-resource estimate and the high level quality assurance time estimate are recorded in a staff allocation document. Atstep 203, the high level quality assurance resource estimate and high level quality assurance time estimate are both confirmed. Atstep 204, the high level quality assurance resource estimate and the high level quality assurance time estimate are-both refined. - At
step 201, the high level quality assurance resource estimate and the high level quality assurance time estimate are developed from the business analysis outline. The high level quality assurance resource estimate and the high level quality assurance time estimate developed in step 201 are the same as the high level quality assurance resource estimate and the high level quality assurance time estimate developed in step 101 of FIG. 1. - Three (3) processes are used in making the estimates. The first process is a "similar project approach." In the similar project approach, a high level quality assurance time estimate and a high level quality assurance resource estimate are derived from a quality assurance resource estimate and a quality assurance time estimate from a similarly complex and similarly sized prior development project. The second process is a "building block approach." In the building block approach, a number of test cases are estimated and that number of test cases is multiplied by a time estimated to execute the test cases. The third process is a "resource leveling approach." In the resource leveling approach, resources are added to those that were used in a prior development project to reduce an implementation time that was required for implementing the prior development project.
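As an illustration only (the patent describes these estimating approaches in prose and does not prescribe any formula or code), the building block and resource leveling calculations might be sketched as follows; the function names and the numeric inputs are hypothetical:

```python
def building_block_estimate(num_test_cases, hours_per_test_case):
    """Building block approach: the estimated number of test cases multiplied by
    the time estimated to execute each test case."""
    return num_test_cases * hours_per_test_case

def similar_project_estimate(prior_hours, prior_resources):
    """Similar project approach: reuse the time and resource estimates of a similarly
    complex and similarly sized prior development project as the starting point."""
    return prior_hours, prior_resources

def resource_leveling_estimate(prior_hours, prior_resources, added_resources):
    """Resource leveling approach: add resources to those used on a prior project to
    reduce implementation time (linear scaling is an illustrative simplification)."""
    total_resources = prior_resources + added_resources
    return prior_hours * prior_resources / total_resources, total_resources

# Hypothetical figures, for illustration only.
print(building_block_estimate(num_test_cases=120, hours_per_test_case=1.5))              # 180.0 hours
print(resource_leveling_estimate(prior_hours=400, prior_resources=2, added_resources=1))  # shorter schedule, 3 resources
```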
- At
step 202, the high level quality assurance resource estimate and the high level quality assurance time estimate are recorded in the staff allocation document. At step 203, the high level quality assurance resource estimate and the high level quality assurance time estimate in the staff allocation document developed in step 202 are confirmed. The high level quality assurance resource estimate and the high level quality assurance time estimate are confirmed during the review of the business analysis outline process described above with reference to step 102 in FIG. 1. - At
step 204, the high level quality assurance resource estimate and the high level quality assurance time estimate are refined. The high level quality assurance resource estimate and the high level quality assurance time estimate are refined in step 204 based on an acceptance test plan, such as the acceptance test plan created in step 103 of FIG. 1 above. The processes described with reference to FIG. 2 may be performed by a system, such as the system described below with reference to FIGS. 8 and 9. - A plurality of inputs to the staff allocation process may include past quality assurance project data, the business analysis outline and the acceptance test plan. The output of the staff allocation process includes the high level quality assurance time estimate and the high level quality assurance resource estimate, and a staff allocation document which may be provided to members of interested or associated groups including, for example, an information technology group. Each successive stage of the quality assurance staff allocation process requires more accurate time and resource predictions.
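Purely for illustration, the staff allocation document described above could be represented as a simple record whose fields mirror the listed inputs and outputs; the field names below are hypothetical and are not part of the disclosure:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class StaffAllocationDocument:
    """Illustrative staff allocation record; field names are hypothetical."""
    past_project_data: List[str]       # input: data from past quality assurance projects
    business_analysis_outline: str     # input: the business analysis outline
    acceptance_test_plan: str          # input: the acceptance test plan, used for refinement
    time_estimate_hours: float         # output: high level quality assurance time estimate
    resource_estimate: int             # output: high level quality assurance resource estimate
    distributed_to: List[str] = field(default_factory=lambda: ["information technology group"])
```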
-
FIG. 3 is a flowchart illustrating one embodiment of a process of developing an acceptance test plan. At step 301, an overview is created for the acceptance test plan process. At step 302, a plurality of objectives for completing the acceptance test plan are developed. At step 303, a plurality of risks associated with the acceptance testing are identified. At step 304, a plurality of remedies for the identified risks are proposed. At step 305, a plurality of assumptions for a successful acceptance testing process are identified. At step 306, a required system environment for conducting the acceptance testing is identified. At step 307, a plurality of limitations imposed upon the acceptance test are identified. At step 308, a plurality of sign-offs required for approving the acceptance test plan are identified. At step 309, a glossary of terms used in the acceptance test plan is drafted. At step 310, a strategy and an approach for the acceptance test are developed. At step 311, a plurality of conditions for execution of the acceptance test plan are identified. At step 312, a test specification is created. At step 313, the results of the acceptance test plan process are recorded in an acceptance test plan document. The process illustrated in FIG. 3 may be performed by a system, such as the system illustrated in FIGS. 8 and 9. - Describing the process illustrated in
FIG. 3 in more detail, at step 301, an overview of the software application being developed is created. The overview comprises a brief description of the software application being developed. For example, if the software application being developed is for a web-site, the overview may include a description of the web-site and a plurality of objectives of the software application. - At
step 302, a plurality of objectives for the acceptance test plan are developed. The objectives may comprise a plurality of high-level goals and a plurality of strategies required to successfully complete the acceptance test. The objectives may comprise ensuring that a functionality for the software application being developed as outlined in the acceptance test plan has been successfully tested, verifying that the objectives of the software application being tested have been successfully met and verifying that the software application being tested can coexist with all required functionality. Additionally, the objectives may include verifying through regression testing that other installed or related software applications continue to properly function, detailing a plurality of activities required to prepare and conduct acceptance testing, defining a methodology that will be used to test and verify system functionality and using acceptance testing as an effective communication tool in identifying and resolving a plurality of system issues and problems. The objectives may further include determining and coordinating a plurality of responsibilities for persons involved in the acceptance testing, defining a plurality of environments and test cycle approaches required to effectively complete the acceptance testing and defining all system functions to be tested, as well as those functions that will not be tested. - At
step 303, a plurality of risks to the acceptance testing process are identified. The risks may include risks to a schedule and/or risks to a quality level. The risks identified may include a tight schedule, where any additional functionality or requirement changes could result in a delay in implementation of the software application, and a need for dedicated technical support in order to eliminate any testing down time. - At
step 304, a plurality of remedies or contingencies are proposed for the identified risks. Each identified risk must have a corresponding remedy or contingency. For example, if a risk to an acceptance test schedule is that the schedule is very tight and any additional functionality or requirement changes could result in a delay in implementation of the software application, a corresponding contingency may be that any requirement changes after a certain date must follow one or more procedures, such as a predetermined change control procedure. - At
step 305, a plurality of assumptions are identified for the acceptance testing. The assumptions may include a plurality of basic testing related issues that are required for a successful acceptance testing process. For example, the assumptions may include that, at the end of the design phase, a detailed schedule of the software deliverables will be published, that all functionality required to fully support the application will be ready for acceptance testing on or before a certain date, and that all documentation relating to the software application will be made available to the quality assurance group on or before a certain date. - At
step 306, a system environment for acceptance testing is described. The description of the system environment may include a description of a hardware and a software interface needed for acceptance testing. The description may also include a type of hardware and software platform, such as, for example, a specific mainframe system and related software interfaces for a Marketing Sales System such as, for example, Choicepoint™, Correspondence™, EPS™ Electronic Publishing, Code 1™, POLK™, Ratabase 1.5™, or other similar systems. - At
step 307, one or more limitations or constraints to the acceptance test plan are identified. The limitations may include, for example, that the scheduled implementation date for a new software application is firm and cannot be changed. - At
step 308, a plurality of sign-offs required for approving the acceptance test plan are identified. The sign-offs required may include a list of persons from a client area who will be responsible for approving the acceptance test plan and reviewing any test material and a list of persons from the information technology group who will approve the acceptance test plan from a technical point of view. - At
step 309, a glossary of terms used in the acceptance test plan is drafted. The glossary of terms may include a definition of each technical, business, and quality assurance term that may be unfamiliar to those reviewing the acceptance test plan. The glossary of terms may include, for example, a name of the software application being tested, a vendor name, and other terms and acronyms that may be unfamiliar to a person reviewing the acceptance test plan. - At
step 310, a strategy is developed for the acceptance testing. The strategy or approach for the acceptance testing includes identifying a test environment for the execution of the acceptance test plan and developing a scope and an approach for the execution of the acceptance test plan. - Identifying the test environment comprises providing a high level description of the resources, both hardware and software, that will be required in order to conduct the acceptance testing. For example, the description of the test environment may include a requirement for a facility to be available at a certain time, including evening and weekend availability, an identification of all interfaces that must be available to verify the test cases, and an identification of all functionality of the software being tested that must be ready for acceptance testing on a certain date for the start of acceptance testing.
- The scope and approach for the acceptance testing comprises an identification of one or more functions or tests that are planned for the acceptance testing and any testing limitations or identifiable ranges of the tests. The scope and approach may include, for example, a number of stages of a test with an identification of a type of test. For example, the test may include two stages, with the first stage comprising a timing test, if a goal of the software application under development is to reduce processing time, and a second stage comprising a system functionality test. The scope and approach of the acceptance testing may then be further described by listing an approach, such as, for example, "a timing test goal", "a timing test measurement", "a timing test specific form" and "one or more timing test participants and locations" under the timing test stage, and "one or more screen flows/links", "a pre-fill validation", and "one or more default field values" under the system functionality test stage. - At
step 311, one or more conditions required for execution of the acceptance test plan are identified. The conditions may include an entrance criteria condition, a resources required condition, an acceptance test tools condition, and an organization for testing condition.
- The entrance criteria are one or more events required by the quality assurance group to be successfully completed prior to the beginning of the acceptance testing. For example, the entrance criteria may include that a volunteer tester must have an ability to access the acceptance testing site from his home, that an identification of all problems detected during system testing must be forwarded to the quality assurance group prior to the start of acceptance testing, a verification that the acceptance testing environment is ready to support acceptance testing, that a test team has been identified and trained, that all system documentation has been turned over to the quality assurance group, that an acceptance testing schedule has been agreed upon and published, and that all major interfaces of the software application to be tested are available for acceptance testing.
- The resources required comprise an identification of the human, hardware and software resources required to complete the acceptance testing. For example, the identification of the human resources required may include a listing of persons required to conduct the acceptance testing, such as a number of quality assurance analysts required and a number of volunteer testers working from home or at the acceptance testing facility. The identification of the hardware/software resources required may include an identification of a type of platform, such as a mainframe system or Internet access.
- The acceptance test tools include any software package that automates any part of the acceptance testing process. For example, the acceptance test tools may include Microsoft Word® or another word processing program, a copy tool for the software application being tested or a session open tool.
- The identification of the organization for acceptance testing comprises an identification of the persons required to conduct the acceptance testing and their corresponding responsibilities during the acceptance testing. For example, the responsibilities may include a responsibility for "Review of Acceptance Test Plan", a responsibility for "Keying Test Cases", and a responsibility for "Reviewing Test Results". The human resources required to fulfill such responsibilities may then be listed under each responsibility; for example, in the "Review of Acceptance Test Plan" category, the human resources required may include a Project Manager, an Applications Team Leader, a Business Analyst, a Quality Assurance analyst and a Business Partner. - At
step 312, a test specification is created for the acceptance test plan. The test specification may include a test structure/organization section, a test functions or cases section, a test scripts or function logs section, and a test schedules and test completion criteria section. The test structure/organization section may be comprised of one or two paragraphs relating to one or more types of processing cycles and environment changes that may be needed during the acceptance testing. - The test functions or cases section makes up the primary section of the acceptance test plan. The test functions/cases section identifies all of the functions or cases to be verified during the acceptance testing. The test scripts/function logs section should be included in the test specification so that the function logs and test scripts are generally available upon request. A test schedule identifies a schedule established in the scope and approach section of the acceptance test plan.
- The test completion criteria include a paragraph that describes a framework for handling any severe system problems that remain open at the end of acceptance testing. For example, the test completion criteria may state that all major functions and sub-functions of the software application being tested will be considered to have passed acceptance testing when no severe system problem remains open. The test completion criteria may then describe how to identify a severe system problem.
- At
step 313, the acceptance test plan processes are recorded in an acceptance test plan document. Thus, all of the processes described from step 301 to step 312 will be recorded in a document to be approved by the persons identified at step 308. -
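As a hedged illustration of the test completion criteria discussed above (the acceptance test plan states them in prose, not code), a completion check might be sketched as follows; the severity label and data layout are assumptions made only for this example:

```python
def acceptance_testing_complete(open_defects):
    """Illustrative completion check: the major functions and sub-functions are considered
    to have passed acceptance testing when no severe system problem remains open."""
    return all(defect["severity"] != "severe" for defect in open_defects)

# Hypothetical data, for illustration only.
remaining = [{"id": 1, "severity": "minor"}, {"id": 2, "severity": "moderate"}]
print(acceptance_testing_complete(remaining))  # True: no severe problem remains open
```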
FIG. 4 is a flowchart illustrating one embodiment of a process of creating a plurality of test scripts for an acceptance test plan. At step 401, an acceptance test plan is reviewed. Specifically, the test functions/cases section of the test specification created at step 312 of FIG. 3 is reviewed. - At
step 402, a function log process is performed. In the function log process, the functions/cases section of the acceptance test plan is referenced in order to create one or more function logs. A function log may contain many individual test cases or scenarios. A function log may include a plurality of headings, including a function heading, a function number heading, an identifier heading, an expected results heading and a pass/fail heading. An example of a function log is described with reference to FIG. 6 below. An input to the function log process is the test function/cases section of the acceptance test plan. An output of the function log process is a physical document called a function log. - At
step 403, a test script is created using the function log. A test script may contain more specific testing instructions than a function log. The test script may include a plurality of headings, including a function/script heading, a function/script number heading, a script activity per cycle heading, an expected results heading and a pass/fail per cycle heading. Either the function logs or the test scripts may be used as guides for executing the acceptance test plan, depending upon the skill level of the tester. If the skill level of the tester is high or if the tester has extensive experience in conducting acceptance testing, the function logs may be used by the tester. However, if the test scripts are used, a tester having a lower skill level may execute the acceptance test. The input to the test script process is the function log document and the output of the test script process is the test script document. The test case descriptions from the acceptance test plan, the function logs or the test scripts may be used as instructions for carrying out test cases, where the test scripts comprise more detailed instructions than the function logs, and the function logs comprise more detailed instructions than the test case descriptions. -
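For illustration only, the function log and test script documents described above might be modeled as simple records, with the choice between them driven by tester experience as the text explains; all class, field and function names below are hypothetical:

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class FunctionLogEntry:
    """One row of a function log; headings follow the description above."""
    function: str                  # the test case or scenario being verified
    function_number: int
    identifier: str
    expected_results: str
    passed: Optional[bool] = None  # pass/fail, left unset until the case is executed

@dataclass
class TestScriptEntry:
    """One row of a test script; carries more detailed instructions than a function log row."""
    function_script: str
    function_script_number: int
    script_activity_per_cycle: str                    # step-by-step keying instructions for the tester
    expected_results: str
    pass_fail_per_cycle: Optional[Dict[str, bool]] = None

def choose_test_guide(tester_is_experienced: bool) -> str:
    """Function logs suffice for an experienced tester; test scripts allow a less
    experienced tester to execute the acceptance test."""
    return "function logs" if tester_is_experienced else "test scripts"
```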
FIG. 5 is a flowchart illustrating one embodiment of a process of processing defects in an acceptance test plan. At step 501, a potential defect is identified during the acceptance testing. At step 502, a defect ticket record is created for the identified potential defect. At step 503, the defect is recorded on a project defect tracking log. At step 504, the defect ticket record is transmitted to a developer area 510. Steps 511-513 occur in developer area 510. At step 511, a developer reviews the defect ticket record. At step 512, the developer corrects the defect or explains why the identified problem is not a defect. The developer then updates the defect ticket record to reflect a date of correction of the defect and an identity of a corrector of the defect if the defect was corrected. If the identified defect was not corrected, the developer updates the defect ticket record with an explanation for why the defect was not corrected. At step 513, the developer returns the defect ticket record to a quality assurance area. At step 505, a quality assurance analyst in the quality assurance area checks to see if the defect needs to be retested. If the defect was corrected in the developer area 510, then the corrected defect will require retesting. If the defect was retested, then, at step 506, the quality assurance analyst checks to see if the corrected defect passed the retesting. If the corrected defect did not pass the retesting, the process returns to step 503 where the updated defect information is recorded on the project defect tracking log and transmitted at step 504 to the developer area 510. If the corrected defect passes the retesting, the project defect tracking log is updated at step 507 and the defect ticket record is also updated to reflect that the corrected defect passed the retesting and that the defect ticket record is closed. If the defect does not need to be retested, the project defect tracking log is updated at step 507. At step 508, the defect ticket records are placed in a project folder. - The defect ticket record of
step 502 may include an identification of the project, a defect number, a priority level identification, an identification of a person reporting the defect, a test case number, a function area identification, a description of the defect, a date of opening of the defect ticket, an identification of the software application subject to the defect, an identification of a corrector and a date of correction of the defect, an identification of a retester and a date of retesting, and a date of closing of the defect ticket record if the corrected defect passes the retesting. -
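As an illustrative sketch (not part of the claimed system), the defect ticket record and the retest-and-close step of FIG. 5 might be represented as follows; the field and function names are hypothetical:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DefectTicket:
    """Illustrative defect ticket record; field names are hypothetical."""
    project: str
    defect_number: int
    priority: str
    reported_by: str
    test_case_number: int
    function_area: str
    description: str
    opened_on: str
    application: str
    corrected_by: Optional[str] = None
    corrected_on: Optional[str] = None
    retested_by: Optional[str] = None
    retested_on: Optional[str] = None
    closed_on: Optional[str] = None

def record_retest(ticket: DefectTicket, passed: bool, retester: str, date: str) -> bool:
    """Mirrors steps 505-507: a corrected defect is retested and, if it passes, the
    ticket is closed; otherwise it is returned to the developer area for rework."""
    ticket.retested_by, ticket.retested_on = retester, date
    if passed:
        ticket.closed_on = date
    return passed
```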
FIG. 6 is an example of a screen shot of one embodiment of a function log document 600 illustrating an acceptance test plan for an automotive insurance software application developed for an automotive insurance provider entity. It should be understood, however, that the function log document of the invention can be modified as required for use by any type of business entity for acceptance testing of any type of software application. -
Function log document 600 includes a plurality of fields, including a Function field 601, a Function Number field 602, an Identification Number field 603, an Expected Results field 604 and a Pass/Fail field 605. During an acceptance test, a plurality of test cases 606-610 are performed. With reference to a test case 606, the test case 606 is for a user who learned of the insurance provider through a radio commercial and who has one car with one operator, where the user has a date of birth in 1930 and the car has an annual mileage of 3,000 miles per year, as listed in the Function field 601 of test case 606. The function number assigned to this test case 606 is 1, as listed under the Function Number field 602. The test case 606 is assigned an identification number of 840430090, as listed in the Identification Number field 603. The expected results of the entered information for the test case 606 are that the user should not be asked for the odometer purchase date or date of purchase, as listed under the Expected Results field 604. An indication of whether the test case 606 passed or failed the testing is listed in the Pass/Fail field 605. As shown in FIG. 6, the test case 606 passed the testing. -
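For illustration, the test case 606 row of FIG. 6 could be captured in a record of the kind sketched earlier; the dictionary keys below are hypothetical:

```python
# Hypothetical representation of test case 606 from FIG. 6, for illustration only.
test_case_606 = {
    "function": ("Radio commercial; one car, one operator; "
                 "operator date of birth 1930; annual mileage 3,000"),
    "function_number": 1,
    "identification_number": "840430090",
    "expected_results": "User should not be asked for the odometer purchase date or date of purchase",
    "pass_fail": "Pass",
}
```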
FIG. 7 is an example of a screen shot of one embodiment of a test script document 700. As shown in FIG. 7, the test script document 700 is a test script used for acceptance testing of the automotive insurance software application for insurance requirements in the State of California. The test script document 700 includes a plurality of fields, including a Test Case Number field 701, a Log/Script Activity field 702, a User Number field 703, an Expected Results field 704 and a Comments field 705. - The Test
Case Number field 701 lists the number of the test case. The Log/Script Activity field 702 lists a user ID and a password to be entered by a tester and detailed instructions on how to perform the test case. As an illustrative example, a test case 706 is input for a situation in which a user lives in California, compared with a situation in which the user lives in a state other than California. The expected results, as listed in the Expected Results field 704, are that a quote for the cost of an automotive insurance product should be the same for the user regardless of whether the user's information was entered into system "Q" or "M". As shown in the Comments field 705, the software application subject to testing passed the testing. The Comments field 705 may also be used by a tester to enter any comments or notes indicating that some activity needs to be performed or investigated. -
FIG. 8 is a block diagram of a system 800 used for acceptance testing. System 800 includes a staff allocation module 801, an acceptance test plan module 802, a test script module 803, a defects reporting module 804, an external processes module 805 and an acceptance testing module 810. The acceptance testing module 810 receives inputs from the staff allocation module 801, the acceptance test plan module 802, the test script module 803, the defects reporting module 804 and the external processes module 805. The execution of the acceptance test plan occurs in the acceptance testing module 810. - The high level quality assurance resource estimate and the high level quality assurance time estimate for the acceptance testing and the staff allocation document are developed in the
staff allocation module 801. The staff allocation module 801 includes means (not shown) for developing the high level quality assurance resource estimate and the high level quality assurance time estimate and for recording such high level quality assurance resource estimate and high level quality assurance time estimate in the staff allocation document. - The acceptance
test plan module 802 receives an output from the staff allocation module 801 and develops an acceptance test plan. The acceptance test plan module 802 includes means (not shown) for developing the acceptance test plan as described above with reference to FIG. 3. - The outputs of the acceptance
test plan module 802 are coupled to the acceptance testing module 810 and the test script module 803. The test script module 803 receives the acceptance test plan from the acceptance test plan module 802 and generates both the function log document and the test scripts. The test script module 803 includes means (not shown) for creating the function log document and the test scripts as described above with reference to FIG. 4. The function log document and the test scripts are transmitted to the acceptance testing module 810 for execution of the acceptance test plan. - The
defects reporting module 804 receives inputs from both the external processes module 805 and the acceptance testing module 810. The defects reporting module 804 processes any defects identified during the execution of the acceptance test plan and transmits the identified defects to the developer area. The defects reporting module 804 includes means (not shown) for processing defects as described above with reference to FIG. 5. - The
external processes module 805 receives inputs from both the acceptance testing module 810 and the defects reporting module 804. The external processes module 805 has an output connected to the staff allocation module 801. The external processes module 805 includes the development area, an information technology area, or an area for any other processes external to the acceptance testing. The external processes module 805 receives or generates a business analysis outline, which it transmits to the staff allocation module 801 and the acceptance testing module 810. -
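As a rough, hypothetical sketch of the data flow among the FIG. 8 modules (the patent defines these modules by function, not by code), the wiring might look like the following; every function name and return value here is an assumption made only for the example:

```python
def staff_allocation_module(business_analysis_outline, past_project_data):
    """Module 801: develop and record the high level time and resource estimates."""
    return {"time_estimate_hours": 180.0, "resource_estimate": 2}  # hypothetical values

def acceptance_test_plan_module(staff_allocation):
    """Module 802: develop the acceptance test plan from the staff allocation output."""
    return {"test_functions": ["screen flows/links", "pre-fill validation"],
            "staffing": staff_allocation}

def test_script_module(test_plan):
    """Module 803: generate the function log document and the test scripts from the plan."""
    function_logs = [{"function": f, "passed": None} for f in test_plan["test_functions"]]
    test_scripts = [{"script_activity": "Verify " + f} for f in test_plan["test_functions"]]
    return function_logs, test_scripts

def acceptance_testing_module(test_plan, function_logs, test_scripts):
    """Module 810: execute the plan; defects found are handed to the defects reporting module 804."""
    results = [{"function": log["function"], "passed": True} for log in function_logs]
    defects = []  # populated as problems are encountered during execution
    return results, defects

def defects_reporting_module(defects):
    """Module 804: process identified defects and transmit them to the developer area."""
    for defect in defects:
        print("Defect sent to developer area:", defect)

# The external processes module 805 supplies the business analysis outline.
allocation = staff_allocation_module("business analysis outline", past_project_data=[])
plan = acceptance_test_plan_module(allocation)
logs, scripts = test_script_module(plan)
results, defects = acceptance_testing_module(plan, logs, scripts)
defects_reporting_module(defects)
```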
FIG. 9 is a block diagram illustrating the components of one embodiment of a system 900 used for acceptance testing. As shown in FIG. 9, system 900 may be comprised of a processor module 902, a display 904, a user input 906, a data input module 908, a data storage module 910, and an output module 912. Generally, the processor module 902 receives inputs from the data input module 908 and the user input module 906, and provides outputs via the display 904 and the output module 912. The processor module 902 may also receive inputs and provide outputs through the data storage module 910. - According to an embodiment of the invention, the
processor module 902 may be a standard processor suitable for performing any necessary calculations required for the acceptance testing, including multiple task processing as necessary. As illustrated, the processor module 902 may receive inputs from the data input module 908 and the user input module 906, as well as data from the data storage module 910. The data input module 908 may be any conventional data input device, such as a magnetic or an optical disk drive, a CD-ROM, a scanner, a modem, an Internet connection, a hard-wired connection, or any other device for inputting data to the processor module 902. The user input module 906 may be any conventional user input device, such as a keyboard, a touch-screen, a roller-ball, a mouse, a pointer, or any other device for a user to enter and direct manipulation of data in the processor module 902. - The
data storage module 910 may be comprised of any conventional storage device, such as a computer memory, a magnetic or an optical disc or a CD-ROM, a tape-to-tape reel, or any other device for storing data. In the context of conducting the acceptance testing, the data storage module 910 may contain information related to the business analysis outline, the acceptance test plan, the high level quality assurance time estimate and the high level quality assurance resource estimate, past quality assurance project data, defects and other information. The processor module 902 may be capable of accessing data in the data storage module 910. Thus, according to an embodiment of the invention, the data storage module 910 may be searchable by a field or in a variety of other conventional manners. - As illustrated, the
processor module 902 may provide information through the display 904 and the output module 912, as well as provide data to the data storage module 910. The display 904 may be any conventional display device, such as a television, a monitor, or any other display device. The output module 912 may be any conventional output device, such as a printer, a facsimile machine, a magnetic disc drive, a compact disc drive or an optical disc drive, a modem, an Internet connection, a hard-wired connection, or any other device for outputting data from the processor module 902. - While the foregoing description includes many details and specificities, it should be understood that these have been included for purposes of explanation only, and are not to be interpreted as limitations of the present invention. Many modifications to the embodiments described above can be made without departing from the spirit and scope of the invention, as is intended to be encompassed by the following claims and their legal equivalents.
Claims (12)
1-50. (Canceled)
51. A process of creating a plurality of test scripts for use in a quality assurance system comprising:
(a) reviewing an acceptance test plan;
(b) performing a function log process to produce a function log document including a plurality of individual test cases; and
(c) performing a test script process to produce a test script document including a plurality of specific testing instructions for each of the individual test cases of the function log document.
52. The process of claim 51 wherein the function log document comprises an identification of a test case, a function number associated with the identified test case, a function identifier, an expected result and an indication of whether the function passed the test case.
53. The process of claim 51 wherein the test script document comprises an identification of a function/script, a function/script number, a script activity per cycle, an expected result and an indication of whether the function passed the test script.
54. The process of claim 51 wherein an input to the function log process is the acceptance test plan and an output of the function log process is the function log document.
55. The process of claim 54 wherein an input to the test script process is the function log document and an output of the test script process is the test script document.
56. A system for creating a plurality of test scripts in a quality assurance system comprising:
(a) means for reviewing an acceptance test plan;
(b) means for performing a function log process to produce a function log document including a plurality of individual test cases; and
(c) means for performing a test script process to produce a test script document including a plurality of specific testing instructions for each of the individual test cases of the function log.
57. The system of claim 56 wherein the function log document comprises an identification of a test case, a function number associated with the identified test case, a function identifier, an expected result and an indication of whether the function passed the test case.
58. The system of claim 56 wherein the test script document comprises an identification of a function/script, a function/script number, a script activity per cycle, an expected result and an indication of whether the function passed the test script.
59. The system of claim 56 wherein an input to the function log process is the acceptance test plan and an output of the function log process is the function log document.
60. The system of claim 56 wherein an input to the test script process is the function log document and an output of the test script process is the test script document.
61-68. (Canceled)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/938,957 US20050043919A1 (en) | 2000-11-09 | 2004-09-13 | Process and system for quality assurance for software |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/708,715 US6601017B1 (en) | 2000-11-09 | 2000-11-09 | Process and system for quality assurance for software |
US10/435,263 US6799145B2 (en) | 2000-11-09 | 2003-05-12 | Process and system for quality assurance for software |
US10/938,957 US20050043919A1 (en) | 2000-11-09 | 2004-09-13 | Process and system for quality assurance for software |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/435,263 Division US6799145B2 (en) | 2000-11-09 | 2003-05-12 | Process and system for quality assurance for software |
Publications (1)
Publication Number | Publication Date |
---|---|
US20050043919A1 true US20050043919A1 (en) | 2005-02-24 |
Family
ID=27613795
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/708,715 Expired - Lifetime US6601017B1 (en) | 2000-11-09 | 2000-11-09 | Process and system for quality assurance for software |
US10/435,263 Expired - Lifetime US6799145B2 (en) | 2000-11-09 | 2003-05-12 | Process and system for quality assurance for software |
US10/938,957 Abandoned US20050043919A1 (en) | 2000-11-09 | 2004-09-13 | Process and system for quality assurance for software |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/708,715 Expired - Lifetime US6601017B1 (en) | 2000-11-09 | 2000-11-09 | Process and system for quality assurance for software |
US10/435,263 Expired - Lifetime US6799145B2 (en) | 2000-11-09 | 2003-05-12 | Process and system for quality assurance for software |
Country Status (1)
Country | Link |
---|---|
US (3) | US6601017B1 (en) |
Families Citing this family (69)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8234156B2 (en) * | 2001-06-28 | 2012-07-31 | Jpmorgan Chase Bank, N.A. | System and method for characterizing and selecting technology transition options |
JP2003050641A (en) * | 2001-08-07 | 2003-02-21 | Nec Corp | Program management system, its program management method, and information management program |
US7055065B2 (en) * | 2001-09-05 | 2006-05-30 | International Business Machines Corporation | Method, system, and computer program product for automated test generation for non-deterministic software using state transition rules |
US20030192009A1 (en) * | 2002-04-04 | 2003-10-09 | Sun Microsystems, Inc. | Method and system for representing text using markup language |
CA2381596A1 (en) * | 2002-04-12 | 2003-10-12 | Ibm Canada Limited-Ibm Canada Limitee | Generating and managing test plans for testing computer software |
US7424702B1 (en) | 2002-08-19 | 2008-09-09 | Sprint Communications Company L.P. | Data integration techniques for use in enterprise architecture modeling |
GB0221257D0 (en) * | 2002-09-13 | 2002-10-23 | Ibm | Automated testing |
US20040083158A1 (en) * | 2002-10-09 | 2004-04-29 | Mark Addison | Systems and methods for distributing pricing data for complex derivative securities |
US7484087B2 (en) * | 2003-02-24 | 2009-01-27 | Jp Morgan Chase Bank | Systems, methods, and software for preventing redundant processing of transmissions sent to a remote host computer |
US7596778B2 (en) * | 2003-07-03 | 2009-09-29 | Parasoft Corporation | Method and system for automatic error prevention for computer software |
US20050071807A1 (en) * | 2003-09-29 | 2005-03-31 | Aura Yanavi | Methods and systems for predicting software defects in an upcoming software release |
US20050114829A1 (en) * | 2003-10-30 | 2005-05-26 | Microsoft Corporation | Facilitating the process of designing and developing a project |
EP2385069A3 (en) * | 2003-11-12 | 2012-05-30 | Biogen Idec MA Inc. | Neonatal Fc rReceptor (FcRn)- binding polypeptide variants, dimeric Fc binding proteins and methods related thereto |
US7702767B2 (en) * | 2004-03-09 | 2010-04-20 | Jp Morgan Chase Bank | User connectivity process management system |
US7849438B1 (en) * | 2004-05-27 | 2010-12-07 | Sprint Communications Company L.P. | Enterprise software development process for outsourced developers |
US7665127B1 (en) | 2004-06-30 | 2010-02-16 | Jp Morgan Chase Bank | System and method for providing access to protected services |
US7328202B2 (en) * | 2004-08-18 | 2008-02-05 | Xishi Huang | System and method for software estimation |
US20060085492A1 (en) * | 2004-10-14 | 2006-04-20 | Singh Arun K | System and method for modifying process navigation |
EP1899902B1 (en) * | 2005-05-30 | 2011-12-28 | Semiconductor Energy Laboratory Co., Ltd. | Semiconductor device and driving method thereof |
US8484065B1 (en) | 2005-07-14 | 2013-07-09 | Sprint Communications Company L.P. | Small enhancement process workflow manager |
US8572516B1 (en) | 2005-08-24 | 2013-10-29 | Jpmorgan Chase Bank, N.A. | System and method for controlling a screen saver |
US8733440B2 (en) | 2009-07-02 | 2014-05-27 | Halliburton Energy Services, Inc. | Well cement compositions comprising biowaste ash and methods of use |
IL172208A0 (en) * | 2005-11-28 | 2009-02-11 | Bniya & Internet Ltd | Method for monitoring and controlling the construction of an engineering project |
US8181016B1 (en) | 2005-12-01 | 2012-05-15 | Jpmorgan Chase Bank, N.A. | Applications access re-certification system |
US20070174023A1 (en) * | 2006-01-26 | 2007-07-26 | International Business Machines Corporation | Methods and apparatus for considering a project environment during defect analysis |
US7913249B1 (en) | 2006-03-07 | 2011-03-22 | Jpmorgan Chase Bank, N.A. | Software installation checker |
US7895565B1 (en) | 2006-03-15 | 2011-02-22 | Jp Morgan Chase Bank, N.A. | Integrated system and method for validating the functionality and performance of software applications |
JP4907237B2 (en) * | 2006-06-22 | 2012-03-28 | 大日本スクリーン製造株式会社 | Test man-hour estimation device and program |
GB0621408D0 (en) * | 2006-10-27 | 2006-12-06 | Ibm | A method, apparatus and software for determining a relative measure of build quality for a built system |
US7640105B2 (en) | 2007-03-13 | 2009-12-29 | Certus View Technologies, LLC | Marking system and method with location and/or time tracking |
US9659268B2 (en) * | 2008-02-12 | 2017-05-23 | CertusVies Technologies, LLC | Ticket approval system for and method of performing quality control in field service applications |
US20090210852A1 (en) * | 2008-02-19 | 2009-08-20 | International Business Machines Corporation | Automated merging in a software development environment |
US8108178B2 (en) * | 2008-05-16 | 2012-01-31 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | Directed design of experiments for validating probability of detection capability of a testing system |
US9208458B2 (en) | 2008-10-02 | 2015-12-08 | Certusview Technologies, Llc | Methods and apparatus for analyzing locate and marking operations with respect to facilities maps |
US20090327024A1 (en) | 2008-06-27 | 2009-12-31 | Certusview Technologies, Llc | Methods and apparatus for quality assessment of a field service operation |
US8612271B2 (en) | 2008-10-02 | 2013-12-17 | Certusview Technologies, Llc | Methods and apparatus for analyzing locate and marking operations with respect to environmental landmarks |
US9473626B2 (en) | 2008-06-27 | 2016-10-18 | Certusview Technologies, Llc | Apparatus and methods for evaluating a quality of a locate operation for underground utility |
US9208464B2 (en) | 2008-10-02 | 2015-12-08 | Certusview Technologies, Llc | Methods and apparatus for analyzing locate and marking operations with respect to historical information |
US8620726B2 (en) * | 2008-10-02 | 2013-12-31 | Certusview Technologies, Llc | Methods and apparatus for analyzing locate and marking operations by comparing locate information and marking information |
US8583264B2 (en) | 2008-10-02 | 2013-11-12 | Certusview Technologies, Llc | Marking device docking stations and methods of using same |
CA2691780C (en) * | 2009-02-11 | 2015-09-22 | Certusview Technologies, Llc | Management system, and associated methods and apparatus, for providing automatic assesment of a locate operation |
US8495583B2 (en) | 2009-09-11 | 2013-07-23 | International Business Machines Corporation | System and method to determine defect risks in software solutions |
US8667458B2 (en) * | 2009-09-11 | 2014-03-04 | International Business Machines Corporation | System and method to produce business case metrics based on code inspection service results |
US8689188B2 (en) * | 2009-09-11 | 2014-04-01 | International Business Machines Corporation | System and method for analyzing alternatives in test plans |
US8527955B2 (en) | 2009-09-11 | 2013-09-03 | International Business Machines Corporation | System and method to classify automated code inspection services defect output for defect analysis |
US10235269B2 (en) * | 2009-09-11 | 2019-03-19 | International Business Machines Corporation | System and method to produce business case metrics based on defect analysis starter (DAS) results |
US8539438B2 (en) | 2009-09-11 | 2013-09-17 | International Business Machines Corporation | System and method for efficient creation and reconciliation of macro and micro level test plans |
US8352237B2 (en) | 2009-09-11 | 2013-01-08 | International Business Machines Corporation | System and method for system integration test (SIT) planning |
US8893086B2 (en) | 2009-09-11 | 2014-11-18 | International Business Machines Corporation | System and method for resource modeling and simulation in test planning |
US8578341B2 (en) | 2009-09-11 | 2013-11-05 | International Business Machines Corporation | System and method to map defect reduction data to organizational maturity profiles for defect projection modeling |
US20110214105A1 (en) * | 2010-02-26 | 2011-09-01 | Macik Pavel | Process for accepting a new build |
US20130013363A1 (en) * | 2011-07-06 | 2013-01-10 | Bank Of America Corporation | Management of Project Development |
US8862941B2 (en) | 2011-09-16 | 2014-10-14 | Tripwire, Inc. | Methods and apparatus for remediation execution |
US8819491B2 (en) | 2011-09-16 | 2014-08-26 | Tripwire, Inc. | Methods and apparatus for remediation workflow |
US10002041B1 (en) | 2013-02-01 | 2018-06-19 | Jpmorgan Chase Bank, N.A. | System and method for maintaining the health of a machine |
US9720655B1 (en) | 2013-02-01 | 2017-08-01 | Jpmorgan Chase Bank, N.A. | User interface event orchestration |
CN104956326A (en) * | 2013-02-01 | 2015-09-30 | 惠普发展公司,有限责任合伙企业 | Test script creation based on abstract test user controls |
US9088459B1 (en) | 2013-02-22 | 2015-07-21 | Jpmorgan Chase Bank, N.A. | Breadth-first resource allocation system and methods |
US9268674B1 (en) * | 2013-05-08 | 2016-02-23 | Amdocs Software Systems Limited | System, method, and computer program for monitoring testing progress of a software testing project utilizing a data warehouse architecture |
US9619410B1 (en) | 2013-10-03 | 2017-04-11 | Jpmorgan Chase Bank, N.A. | Systems and methods for packet switching |
US9542259B1 (en) | 2013-12-23 | 2017-01-10 | Jpmorgan Chase Bank, N.A. | Automated incident resolution system and method |
US9868054B1 (en) | 2014-02-10 | 2018-01-16 | Jpmorgan Chase Bank, N.A. | Dynamic game deployment |
WO2015121982A1 (en) * | 2014-02-14 | 2015-08-20 | 富士通株式会社 | Document management program, device, and method |
CN106227662B (en) * | 2016-07-22 | 2019-03-12 | 中国科学院声学研究所 | A kind of smart television application audit test macro and method |
US10771314B2 (en) * | 2017-09-15 | 2020-09-08 | Accenture Global Solutions Limited | Learning based incident or defect resolution, and test generation |
US10831644B2 (en) * | 2018-10-01 | 2020-11-10 | Villani Analytics LLC | Automation of enterprise software inventory and testing |
US11237802B1 (en) | 2020-07-20 | 2022-02-01 | Bank Of America Corporation | Architecture diagram analysis tool for software development |
CN112783762B (en) * | 2020-12-31 | 2024-04-09 | 中电金信软件有限公司 | Software quality assessment method, device and server |
CN113919675B (en) * | 2021-09-28 | 2023-01-03 | 内蒙古万和工程项目管理有限责任公司 | Method and system for checking and accepting actual measurement of project supervision quality |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06251023A (en) * | 1993-02-26 | 1994-09-09 | Toshiba Corp | Process progress management device provided with inquiry and guide |
JP3315844B2 (en) * | 1994-12-09 | 2002-08-19 | 株式会社東芝 | Scheduling device and scheduling method |
US6067639A (en) * | 1995-11-09 | 2000-05-23 | Microsoft Corporation | Method for integrating automated software testing with software development |
US5815654A (en) * | 1996-05-20 | 1998-09-29 | Chrysler Corporation | Method for determining software reliability |
US5949999A (en) * | 1996-11-25 | 1999-09-07 | Siemens Corporate Research, Inc. | Software testing and requirements tracking |
US5913023A (en) * | 1997-06-30 | 1999-06-15 | Siemens Corporate Research, Inc. | Method for automated generation of tests for software |
US6243835B1 (en) * | 1998-01-30 | 2001-06-05 | Fujitsu Limited | Test specification generation system and storage medium storing a test specification generation program |
US6332211B1 (en) * | 1998-12-28 | 2001-12-18 | International Business Machines Corporation | System and method for developing test cases using a test object library |
US6256773B1 (en) * | 1999-08-31 | 2001-07-03 | Accenture Llp | System, method and article of manufacture for configuration management in a development architecture framework |
US6662357B1 (en) * | 1999-08-31 | 2003-12-09 | Accenture Llp | Managing information in an integrated development architecture framework |
US7437304B2 (en) * | 1999-11-22 | 2008-10-14 | International Business Machines Corporation | System and method for project preparing a procurement and accounts payable system |
-
2000
- 2000-11-09 US US09/708,715 patent/US6601017B1/en not_active Expired - Lifetime
-
2003
- 2003-05-12 US US10/435,263 patent/US6799145B2/en not_active Expired - Lifetime
-
2004
- 2004-09-13 US US10/938,957 patent/US20050043919A1/en not_active Abandoned
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5892947A (en) * | 1996-07-01 | 1999-04-06 | Sun Microsystems, Inc. | Test support tool system and method |
US5959999A (en) * | 1996-09-20 | 1999-09-28 | Linkabit Wireless, Inc. | Providing control-function data in communication-data channel of a full-mesh satellite communication network by dynamic time-slot assignment in TDMA-frame communication channel |
US5798950A (en) * | 1996-10-31 | 1998-08-25 | International Business Machines Corporation | Method and apparatus for estimating durations of activities in forming a current system, based on past durations of activities in forming past systems |
US6161113A (en) * | 1997-01-21 | 2000-12-12 | Texas Instruments Incorporated | Computer-aided project notebook |
US6002869A (en) * | 1997-02-26 | 1999-12-14 | Novell, Inc. | System and method for automatically testing software programs |
US6775824B1 (en) * | 2000-01-12 | 2004-08-10 | Empirix Inc. | Method and system for software object testing |
US20020162059A1 (en) * | 2000-10-27 | 2002-10-31 | Mcneely Tracy J. | Methods and systems for testing communications network components |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070174702A1 (en) * | 2005-11-18 | 2007-07-26 | International Business Machines Corporation | Test effort estimator |
US8230385B2 (en) * | 2005-11-18 | 2012-07-24 | International Business Machines Corporation | Test effort estimator |
US20070162894A1 (en) * | 2006-01-11 | 2007-07-12 | Archivas, Inc. | Method of and system for dynamic automated test case generation and execution |
US8473913B2 (en) * | 2006-01-11 | 2013-06-25 | Hitachi Data Systems Corporation | Method of and system for dynamic automated test case generation and execution |
US20080209276A1 (en) * | 2007-02-27 | 2008-08-28 | Cisco Technology, Inc. | Targeted Regression Testing |
US7779303B2 (en) * | 2007-02-27 | 2010-08-17 | Cisco Technology, Inc. | Targeted regression testing |
US9519571B2 (en) | 2007-07-13 | 2016-12-13 | International Business Machines Corporation | Method for analyzing transaction traces to enable process testing |
US20090106597A1 (en) * | 2007-10-19 | 2009-04-23 | International Business Machines Corporation | Automatically Populating Symptom Databases for Software Applications |
US8327191B2 (en) * | 2007-10-19 | 2012-12-04 | International Business Machines Corporation | Automatically populating symptom databases for software applications |
US20090171881A1 (en) * | 2007-12-28 | 2009-07-02 | International Business Machines Corporation | Method and Apparatus for Modifying a Process Based on Closed-Loop Feedback |
US7730005B2 (en) * | 2007-12-28 | 2010-06-01 | International Business Machines Corporation | Issue tracking system using a criteria rating matrix and workflow notification |
Also Published As
Publication number | Publication date |
---|---|
US6601017B1 (en) | 2003-07-29 |
US6799145B2 (en) | 2004-09-28 |
US20030204346A1 (en) | 2003-10-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6799145B2 (en) | Process and system for quality assurance for software | |
Perry | Effective Methods for Software Testing, CafeScribe: Includes Complete Guidelines, Checklists, and Templates | |
Dustin | Effective software testing: 50 specific ways to improve your testing | |
Dustin et al. | Implementing automated software testing: How to save time and lower costs while raising quality | |
US8954934B1 (en) | Method and system for removing unessential test steps | |
Lazic et al. | Cost effective software test metrics | |
US20010032105A1 (en) | Method and system for introducing a new project initiative into a business | |
US20070209010A1 (en) | Computer implemented systems and methods for testing the usability of a software application | |
Felderer et al. | A multiple case study on risk-based testing in industry | |
Chopra | Software testing: a self-teaching introduction | |
Söylemez et al. | Challenges of software process and product quality improvement: catalyzing defect root-cause investigation by process enactment data analysis | |
Amland et al. | Risk based testing and metrics | |
Chopra | Software quality assurance: a self-teaching introduction | |
Abdeen et al. | An approach for performance requirements verification and test environments generation | |
Bhanushali | Ensuring Software Quality Through Effective Quality Assurance Testing: Best Practices and Case Studies | |
Afzal | Metrics in software test planning and test design processes | |
Vukašinović | Software quality assurance | |
Oliveira et al. | Work Product Review Process Applied to Test Cases Review for Software Testing | |
O’Regan | Software Testing | |
Clapp | Software Quality Control, Error, Analysis | |
Fhang et al. | Why a good process fail? Experience in building a sustainable and effective process for software development | |
O'Regan et al. | Fundamentals of Software Testing | |
Parmeza et al. | Cost and Efforts in Product Lines for Developing Safety Critical Products–An Empirical Study | |
Smith et al. | Beyond the software factory: a comparison of" classic" and PC software developers | |
Guðmundsdóttir | A case study on implementation of hybrid software development process |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |