EP1582985B1 - Test case inheritance controlled via attributes - Google Patents
- Publication number
- EP1582985B1 (application EP05102449A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- test
- attributes
- test methods
- base class
- execution
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Not-in-force
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Prevention of errors by analysis, debugging or testing of software
- G06F11/3668—Testing of software
- G06F11/3696—Methods or tools to render software testable
Definitions
- Test automation systems are used to automatically test software-driven systems.
- Conventional architectures use a test harness, which is typically used to execute test cases, including test cases specified by attributes.
- Automated tests that use attributes normally consist of a single class with a few methods that are called by a test harness to carry out that test.
- Tests in accordance with the conventional architecture do not inherit test methods from other classes, which prevents the creation of common base test classes.
- A conventional test harness examines each of the classes and/or methods, examines the attributes present, and then, based on the examination of the attributes, performs an action.
- This method of using attributes requires that the code necessary to perform an action be present in the test harness. Accordingly, it is not possible to create new kinds of test attributes without modifying the test harness. Additionally, the same code often needs to be written (or modified) for each different test harness in which the code is used.
- the variety of functionality in conventional test harnesses can result in differences between the test harnesses such that a test may be run differently in each test harness (or tests that only run properly in one test harness), which is undesirable.
- In accordance with the present invention, attributes can be reused without requiring modifications to run tests written using different test harnesses.
- US-A-6 031 990 concerns a test management system that manages the numerous tests and processes that are used to ensure quality of a software application.
- a hierarchy of tests is created and stored, where the tests include test classes and test cases.
- Each test case is a procedure that has a run command for verifying at least one function of the software application.
- Each test class is a collection of at least one descendent test case or a test class.
- Tests can include special logic called "rules" which are used to test the subsystem. These rules may or may not be inherited by the descendants.
- US-B1-6 430 705 concerns an object oriented programming method for performing concurrent testing of multiple devices which have differing designs and differing test requirements.
- the system uses a base class to enumerate various attributes of a group of devices in various operations relating to the devices.
- the operations of the abstract base class include functions which are not defined by the abstract base class. In other words, they have names and possibly arguments but they contain no code which defines the functions of the operations.
- a class is derived from the base class.
- the derived class may be used to define the enumerated operations of one of the device revisions/variations and may also be used to define or assign values to the attributes for that revision.
- the derived classes corresponding to different revisions would therefore have identically named functions, but the functions themselves may be different.
- US-A-5 751 941 concerns an object oriented framework for testing software.
- the software testing system includes a setup and control system and one or more test systems connected to the setup and control system.
- Each test system sets up test cases from the test data and the test configurations stored in the setup and control system.
- Each test case includes a test case factory and at least one client test case and/or a server test case.
- the client test case inherits attributes from a client test object and the server test case inherits attributes from a server test object.
- the present invention is directed towards a test case inheritance behavior that can be controlled via attributes.
- a base test class from which test objects are derived is useful for reducing test case code and management.
- base test classes and their derived objects can be used to implement steps that are common between an entire set of test classes (e.g., launching the piece of software to be tested and getting it to a certain stage).
- the principle of inheritance simplifies management of the test software when, for example, the base class is modified.
- When the base class is modified, all of the test classes which derive from that base class are automatically modified. Accordingly, only one item needs to be modified (instead of every test) when a change is necessary to modify, for example, the way the software launches.
- a computer-readable medium having computer-executable components comprises two components.
- a test case scenario object comprises test methods that are arranged to test an electronic system, wherein the test methods are arranged in a hierarchy that comprises a base class and subclasses, wherein each of the subclasses derives from the base class, and wherein the principle of inheritance is applied to each test method in accordance with the arrangement of the methods within the hierarchy.
- a test harness is arranged to provide system test services for the test methods.
- a method for executing test components comprises providing test methods that are arranged to test an electronic system.
- the provided test methods are arranged in a hierarchy that comprises a base class and subclasses, wherein each of the subclasses derives from the base class.
- the principle of inheritance is applied to each test method in accordance with the arrangement of the methods within the hierarchy.
- a test harness is used to provide system test services for the test methods.
- a system for automated testing comprises two components.
- a test case scenario object comprises test methods that are arranged to test an electronic system, wherein the test methods are arranged in a hierarchy that comprises a base class and subclasses, wherein each of the subclasses derives from the base class, and wherein the principle of inheritance is applied to each test method in accordance with the arrangement of the methods within the hierarchy.
- a test harness is arranged to provide system test services for the test methods.
- a system for automated testing comprises means for providing test methods that are arranged to test an electronic system; means for arranging the provided test methods in a hierarchy that comprises a base class and subclasses, wherein each of the subclasses derives from the base class; means for applying the principle of inheritance to each test method in accordance with the arrangement of the methods within the hierarchy; and test harness means for providing system test services for the test methods.
- With reference to FIGURE 1, one exemplary system for implementing the invention includes a computing device, such as computing device 100. In a very basic configuration, computing device 100 typically includes at least one processing unit 102 and system memory 104.
- Depending on the exact configuration and type of computing device, system memory 104 may be volatile (such as RAM), non-volatile (such as ROM, flash memory, etc.) or some combination of the two.
- System memory 104 typically includes an operating system 105, one or more applications 106, and may include program data 107.
- application 106 may include a wordprocessor application 120 that further includes ML editor 122. This basic configuration is illustrated in FIGURE 1 by those components within dashed line 108.
- Computing device 100 may have additional features or functionality.
- computing device 100 may also include additional data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape.
- additional storage is illustrated in FIGURE 1 by removable storage 109 and non-removable storage 110.
- Computer storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data.
- System memory 104, removable storage 109 and non-removable storage 110 are all examples of computer storage media.
- Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computing device 100. Any such computer storage media may be part of device 100.
- Computing device 100 may also have input device(s) 112 such as keyboard, mouse, pen, voice input device, touch input device, etc.
- Output device(s) 114 such as a display, speakers, printer, etc. may also be included. These devices are well known in the art and need not be discussed at length here.
- Computing device 100 may also contain communication connections 116 that allow the device to communicate with other computing devices 118, such as over a network.
- Communication connection 116 is one example of communication media.
- Communication media may typically be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media.
- The term "modulated data signal" means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
- communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media.
- the term computer readable media as used herein includes both storage media and communication media.
- the present invention is directed towards a test case inheritance behavior that can be controlled via attributes.
- a base test class from which test objects are derived is useful for reducing test case code and management.
- base test classes and their derived objects can be used to implement steps that are common between an entire set of test classes (e.g., launching the piece of software to be tested and getting it to a certain stage).
- the principle of inheritance simplifies management of the test software when, for example, the base class is modified.
- When the base class is modified, all of the test classes which derive from that base class are automatically modified. Accordingly, only one item needs to be modified (instead of every test) when a change is necessary to modify, for example, the way the software launches.
- selective inheritance can be used to allow tests to be executed properly using inherited methods.
- inheritance is very useful, inheritance need not be mandated for every method derived from the base test class.
- a mechanism is provided for the attributes to select whether inheritance should apply.
- Using test case inheritance behavior that can be controlled via attributes allows tests to be executed properly using inherited methods.
- Attribute defined inheritance provides a way to order methods based on class hierarchy (where the order is determined by the attribute present). For example, setup steps are run on a base class first, then subclasses, and so on in a recursive fashion (instead of subclasses, then the base class, or even in a random fashion).
- the execution engine determines which methods should be inherited and the order in which to execute the methods.
- An example of pseudocode for illustrating test case inheritance is given below:
    Class TestEnvironmentBase
        Setup Method
        Teardown Method
    Class ApplicationSpecificTest : Inherits from TestEnvironmentBase
        Setup Method
        Test Methods
        Teardown Method
- Test Methods that will be run for test class ApplicationSpecificTest are listed below in the order in which they should run:
    TestEnvironmentBase.Setup Method
        ApplicationSpecificTest.Setup Method
        ApplicationSpecificTest.Test Methods
        ApplicationSpecificTest.Teardown Method
    TestEnvironmentBase.Teardown Method
- the indentations signify class hierarchy and illustrate the principle of inheritance for the methods of the given classes and subclasses.
- the ApplicationSpecificTest.Setup method inherits from the TestEnvironmentBase.Setup method (unless the test author chooses otherwise, in which case the test author can specify that the method should not be inherited).
- test author can write methods that derive from the base class and selectively apply the principles of inheritance to subclasses as desired.
- the extraction engine orders the methods according to class hierarchy, which determines an execution order.
- the ordering of the methods can be accomplished by using a comparison function that is defined within the attributes, with the result that the attributes themselves can be used to determine the class hierarchy. Modifying the base class automatically modifies its subclasses unless the attribute states that inheritance is toggled "off.”
- attributes can be inherited that modify the state of the test method. For example, an attribute can be inherited that states that a certain exception is expected (in the case of negative testing). This expectation is inherited and modifies the test method result state from "failure” to "pass,” which overrides the default failure case if a test method throws an exception.
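- For illustration only, the following Python sketch shows one way that attribute-controlled inheritance and base-class-first ordering could be realized; the decorator name test_step, its order values, and the inherit flag are assumptions made for this sketch and are not the patent's API.
    def test_step(order, inherit=True):
        # Attribute-style marker: 'order' gives the high-level ordering
        # (0 = setup, 1 = step, 2 = teardown); 'inherit' lets the attribute
        # state whether inheritance applies to the marked method.
        def mark(func):
            func.test_order = order
            func.test_inherit = inherit
            return func
        return mark

    class TestEnvironmentBase:
        @test_step(order=0)
        def base_setup(self): print("launch the software under test")
        @test_step(order=2)
        def base_teardown(self): print("close the software under test")

    class ApplicationSpecificTest(TestEnvironmentBase):
        @test_step(order=0)
        def setup(self): print("application-specific setup")
        @test_step(order=1)
        def test_feature(self): print("run the test step")
        @test_step(order=2)
        def teardown(self): print("application-specific teardown")

    def extract(test_class):
        # Order the marked methods from the attributes alone: setup-like methods
        # run base class first, teardown methods run subclass first.
        ordered = []
        hierarchy = [c for c in reversed(test_class.__mro__) if c is not object]
        for depth, cls in enumerate(hierarchy):
            for name, func in vars(cls).items():
                if not hasattr(func, "test_order"):
                    continue
                if cls is not test_class and not func.test_inherit:
                    continue  # inheritance toggled "off" by the attribute
                setup_like = func.test_order < 2
                ordered.append(((func.test_order, depth if setup_like else -depth), name))
        return [name for _, name in sorted(ordered)]

    print(extract(ApplicationSpecificTest))
    # ['base_setup', 'setup', 'test_feature', 'teardown', 'base_teardown']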
- FIGURE 2 is a block diagram illustrating an exemplary environment for practicing the present invention.
- the exemplary environment shown in FIGURE 2 is a test automation system 200 that comprises test harness 210, test runtime 220, and test case scenario 230.
- test runtime 220 is a collection of objects that abstracts knowledge of test cases from the test harness.
- Test runtime 220 typically comprises test services provider object 221, extraction engine 222, attributes 223, and test method executor 224.
- Test runtime 220 can be used by different harnesses to provide consistent support for a particular type of test case format.
- a test harness can use different test runtimes to support different types of test case formats. The test harness typically determines which test runtime to use for a particular test case format.
- Test case extraction is accomplished through a dedicated extraction object (obtained from a test services provider object 221, described below), and invocation is performed by an object (test method executor 224, also described below), which returns a generic result to the test harness.
- the test method executor evaluates attributes using a generic interface to control the execution of the method.
- A test harness is not required to, for example, evaluate attributes, determine the order in which test methods should be executed, build argument lists for method invocations, and the like.
- the test harness typically does not have direct access to this information required to perform those tasks, which helps ensure more consistent test execution between different test harnesses.
- All test harness-dependent functionality (logging, "remoting," and the like) should be objects implemented by the test harness, described by an interface, and stored in the test service provider object for use during test execution. Accordingly, a test harness can be created which is easily capable of switching between different test runtimes without requiring changes to the test harness code.
- Test services provider object 221 is used by the test harness to retrieve an extraction engine and is also used by the test case scenario to retrieve objects for test harness-implemented functionality (including test services such as logging, synchronization, and the like).
- the test services provider object typically provides methods that are used to facilitate access to test services.
- An AddService method is called to store references to objects that implement a test service.
- a test service object should implement a "generic" type, which facilitates a set of standard services that different harnesses can provide.
- the object passed in should implement functionality needed by attributes or test methods. This method should be called by the test harness for each service the test harness provides, which is generally done before the test methods are executed.
- a GetService method is typically called to retrieve a type of test service object.
- the type passed in should represent an interface implemented by a test service object.
- An object that implements that interface will be returned if found. If the object is not found, a value is returned that indicates such a result (e.g., a null is returned). If a test or attribute requires a test service that is not present, the test method being executed should fail.
- An AddDefaultServices protected method is typically called by the constructor. It is typically used to add whatever test services the test runtime provides by default, such as an extraction engine.
- test services provider object data and method calls should be static such that the data set by the test harness can be retrieved by other calls to the test service provider at a later point in time.
- the object is typically a class that implements the functionality described above.
- the constructor of the object is typically used to call the AddDefaultServices method.
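- As a rough sketch (the method and type names are assumptions, not the patent's interface), such a services provider could be modeled in Python as follows; the harness registers its services before the test methods execute, and the runtime, attributes, or test methods look them up later.
    class TestServicesProvider:
        # class-level ("static") storage so that data set by the test harness
        # is retrievable by later calls to the provider
        _services = {}

        def __init__(self):
            self._add_default_services()

        @classmethod
        def add_service(cls, interface, implementation):
            # called by the test harness once for each service it provides
            cls._services[interface] = implementation

        @classmethod
        def get_service(cls, interface):
            # returns the registered object, or None when no such service exists
            return cls._services.get(interface)

        def _add_default_services(self):
            # the test runtime adds whatever it provides by default,
            # e.g. an extraction engine (placeholder object in this sketch)
            self.add_service("ExtractionEngine", object())

    provider = TestServicesProvider()
    provider.add_service("Logging", print)               # harness-implemented logging
    provider.get_service("Logging")("test run started")  # used by attributes or tests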
- Extraction engine 222 is used to retrieve an ordered list of test methods from the test case for the test harness for a particular test case scenario. Typically there is only one extraction engine present in a test runtime.
- a GetTestMethodWrappers method is used to retrieve the ordered list of test methods for a test case scenario.
- a test method wrapper is a particular implementation of a test method executor.
- a parameter is passed into the method that represents a container that holds test cases. The method returns the ordered list of methods. If an error occurs while extracting the test case, an exception can be "thrown.” If no test cases or test methods are found, an empty list is normally returned. An empty list is usually treated as a failure by the test harness. If additional data needs to be passed to the extraction engine, it can be provided through the TestServicesProvider object by a TestHarnessDataProvider test service.
- A large part of the functionality present in an extraction engine object is the same between different runtimes; the only detail that may change significantly from one runtime to another is which attributes are used. A base object should therefore be created to facilitate easy creation of new extraction engines and runtimes. This class typically implements the following functions:
- TypeIsATestCase returns "true" if the type passed in is a test case, "false" if not. This function examines the attributes present on a type, looking for a test case attribute, to determine whether the type is a test case.
- MethodIsATestMethod returns true if the method passed in is a test method, false if not. This function examines the attributes present on a method, looking for an execution attribute, to determine whether the method is a test method.
- GetMethodWrappersFromType gathers all relevant public non-static methods on a type that have execution attributes and returns the methods as an ordered list of MethodWrappers (test method wrappers, discussed below). In an embodiment using the .Net environment, the list is ordered using the sort functionality built into .Net arrays, so that the MethodWrapper calls a compare routine to order the list. If an instance of the type cannot be created, this method fails and returns an empty list.
- the extraction engine can use a test service to retrieve information that can be used to modify a test extraction.
- the information can be stored in a file such as an XML file that is in accordance with a schema defined for storing test extraction information. If the data provided by the test service does not refer to an XML file in accordance with the schema, the modification data can be ignored.
- the extraction engine typically loads all extraction modifier XML files specified.
- the contents of the XML files are, for example, placed into two "buckets": test inclusions and test exclusions. If both buckets are empty, the extraction engine should include all tests. This case is equivalent to having no extraction modifier XML files, or not being able to retrieve the Extraction Engine Data Source test service. If only the exclusion bucket is empty, the extraction engine should include all tests. If only the inclusion bucket is empty, the extraction engine should include all tests, and exclude tests listed in the exclusion bucket. If both buckets have data, the extraction engine should include tests in the inclusion bucket that are not listed in the exclusion bucket (such that the exclusion list has controlling authority over the inclusion list).
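- A compact Python sketch of the bucket rules exactly as stated above (the function and its parameters are illustrative only):
    def select_tests(all_tests, inclusions, exclusions):
        inclusions, exclusions = set(inclusions), set(exclusions)
        if not inclusions and not exclusions:
            return set(all_tests)               # no modifier data: include everything
        if not exclusions:
            return set(all_tests)               # only the exclusion bucket is empty
        if not inclusions:
            return set(all_tests) - exclusions  # include all, minus the excluded tests
        return inclusions - exclusions          # the exclusion list controls the inclusion list

    print(select_tests({"t1", "t2", "t3"}, inclusions={"t1", "t2"}, exclusions={"t2"}))
    # {'t1'}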
- Test method executors 224 are used to execute a test method without requiring the caller to have beforehand knowledge about the method or its attributes.
- An Invoke method is called to execute a test method.
- An object holding the result of the operation is returned (pass, fail, skip, and the like).
- the Invoke method is responsible for processing the attributes associated with a method and creating a parameter list (if required) for the method being invoked. Execution is typically modified by the attributes associated with the test method.
- An Abort method can be called to abort a currently executing test method.
- the abort typically causes a currently running Invoke method to return. After an abort is performed, no further tests can normally be run.
- a CompareTo method is called to compare two Test Method Wrappers. If the result returned is less than zero, it indicates that this method should be executed before the other Test Method Wrapper (to which the method is compared). If the result returned equals zero it indicates that the order in which both methods are executed in does not matter. If the result returned is greater than zero, it indicates that this method should be executed after the other Test Method Wrapper.
- a GetMethodAttributes method is called to retrieve a sorted list of attributes associated with the test method, which are derived from a common base method attribute class. This sorted list of attributes is used by the Test Method Wrapper in several locations. For example, the Invoke method (as discussed above) uses the sorted list to evaluate the attributes in the correct order. Also, GetMethodAttributes can be used to compare one method wrapper to another. A call such as "get AttributesDescription" uses the ordered list of attributes to create a string description of the attributes associated with the method.
- the test method wrapper has several properties that can be retrieved for use by the test harness, such as by calling "get Description” and "get Name.”
- the properties are generated from the name of the method and the class from which the method was defined. Other properties can be added as needed. Note that these properties do not require the harness to know anything about the test method being queried and that additional properties can be added without requiring modifications to existing test harnesses.
- the MethodResult object is used to convey the result of a test method wrapper to the test harness. Because the test harness does not necessarily have beforehand knowledge of the method being invoked, the results are expressed in abstract form.
- the object typically needs to express three possible outcomes from trying to execute a method: pass, skip, or fail.
- a "pass” would typically indicate that the method completed execution without any errors (for example, the method logged no failures, the test method did not throw an exception, and none of the attributes had an error).
- An "error” would indicate that the method failed (for example, the test method indicated a failure, or an attribute indicated a failure).
- a "skip” would indicate that the method was skipped rather than executed (for example, an attribute specifies the test method should only run on a server, but the test is running on a client machine; in which case the method would be skipped).
- the MethodResult object can also contain optional messages such as a result message and/or an error message.
- the result message can be a human readable description of the method's result. Upon a successful execution of a method, this could be left blank, or it could contain the number of passes recorded. For a completed method in which an error occurred, a textual description of the error may be included, while the error message can contain the details of the error.
- When an exception is "thrown" by an attribute, a TestConditionException class can be used to convey a modified method state.
- three derived classes that map directly to a method state include TestSkipException, TestSucceededException, and TestErrorException.
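- As a hedged illustration (the field names are assumptions), the exception hierarchy and the result object could be modeled as:
    class TestConditionException(Exception):
        state = "fail"                 # default modified state

    class TestSkipException(TestConditionException):
        state = "skip"                 # e.g. wrong target machine for this method

    class TestSucceededException(TestConditionException):
        state = "pass"                 # e.g. an expected exception was observed

    class TestErrorException(TestConditionException):
        state = "fail"

    class MethodResult:
        def __init__(self, state, result_message="", error_message=""):
            self.state = state                    # "pass", "skip", or "fail"
            self.result_message = result_message  # human readable description
            self.error_message = error_message    # details of any error

    result = MethodResult(TestSkipException.state,
                          result_message="method only runs on a server")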
- Attributes 223 are typically used to modify and control the execution of a test.
- a test is executed according to a test case scenario, which can be defined through the use of attributes.
- At least three basic types of attributes can be used: class level attributes, method level attributes, and parameter level attributes.
- Test class attributes are optional and can be used to modify an instantiated object state, such that test extraction can be skipped or caused to be performed multiple times for a denoted type. Pre- and post-extraction methods are typically used to modify the instantiated object state. Test class attributes allow such variations in test case scenarios to be implemented.
- Method level attributes are capable of modifying method parameters and method execution.
- method level attributes include execution attributes and supplemental attributes. Both attribute types have pre- and post-invocation methods.
- the order in which method level attributes are evaluated is determined by an order property, which is defined when the attribute is written; however, attributes typically have no beforehand knowledge of what other attributes may be present.
- Execution modification at each stage can be handled by a priority based state system - the attribute returning a state with the highest priority is typically used to determine how execution is modified.
- Execution attributes are used to mark a method as a test method.
- a method without an execution attribute is usually not included in a test.
- the core responsibility of an execution attribute is to establish a high-level order to the test and to evaluate method results.
- a method should not have more than one execution attribute. Extraction and execution behavior when more than one execution attribute is present is normally undefined. Examples of execution attributes include "Setup”, “Step", and "Teardown" attributes.
- Supplemental attributes perform supplemental actions to modify the execution of a test method.
- the core responsibility of a supplemental attribute is to perform secondary tasks that are necessary for the execution of a test. They typically are not used to denote high-level order.
- a method may have any number of supplemental attributes. Examples of supplemental attributes include "WaitFor" and "Target" attributes.
- Parameter level attributes are optionally used to modify the parameter input to a method and to modify a state of an object (e.g., the context state) after a method has executed.
- a parameter level attribute is normally not used to alter the execution of a test in the way that method level attributes are. However, if an exception is thrown, the test method fails immediately in response to the exception. In an embodiment, there is only one parameter level attribute per parameter; behavior with more than one parameter level attribute is left undefined.
- Before a method is invoked, parameter level attributes are evaluated after the method level attributes; after the method has been invoked, parameter level attributes are evaluated before the method level attributes.
- An example of a parameter level attribute includes the "ContextMapping" attribute.
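- A minimal sketch, under assumed names, of a parameter level attribute in the spirit of the ContextMapping attribute: it fills one method parameter from a shared context when the method is invoked.
    context = {"document": "readme.txt"}        # context state shared with the harness

    def context_mapping(param, key):
        # parameter level attribute: bind parameter 'param' to context[key]
        def wrap(func):
            mappings = dict(getattr(func, "context_mappings", {}))
            mappings[param] = key
            func.context_mappings = mappings
            return func
        return wrap

    @context_mapping("path", "document")
    def open_document(path):
        print("opening", path)

    def invoke(func):
        # parameter attributes run after the method level pre-invocation attributes:
        # they build the argument list for the method being invoked
        kwargs = {p: context[k] for p, k in getattr(func, "context_mappings", {}).items()}
        func(**kwargs)

    invoke(open_document)                       # prints: opening readme.txt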
- the MethodState object is used by method attributes to control the execution of a test method. Because a plurality of attributes can be assigned to a test method (and because each attribute can potentially alter the execution of the test method), each attribute can communicate with the MethodState object to ensure consistent execution of the test method.
- the MethodState object can comprise information related to the execution state, a message, an error code, and a state override priority.
- the execution state comprises information regarding how the method has terminated (e.g., skip, pass, fail), whether the state is permitted to be changed, and whether the method should be executed.
- the message can be used to optionally present text that indicates why the test method is in a particular state.
- the error code can be used to indicate the details of an error that the test method wrapper might encounter while executing a test method.
- the state override priority field can be used to improve the consistency of test method execution by allowing the execution state, message, and the error code to be altered only if the new state has a priority greater than the existing state.
- the test method wrapper (224) executes a test method until a terminating state is reached.
- the MethodResult object is constructed from the final MethodState object.
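- The state override priority rule could be sketched as follows (all names are assumptions for illustration):
    class MethodState:
        def __init__(self):
            self.execution_state = "pass"       # skip, pass, or fail
            self.message = ""
            self.error_code = 0
            self.priority = 0                   # state override priority

        def try_override(self, state, priority, message="", error_code=0):
            # alter the recorded state only if the new state outranks the existing one
            if priority <= self.priority:
                return False
            self.execution_state, self.message = state, message
            self.error_code, self.priority = error_code, priority
            return True

    state = MethodState()
    state.try_override("skip", priority=10, message="method only runs on a server")
    state.try_override("pass", priority=1)      # ignored: lower priority
    print(state.execution_state)                # skip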
- Execution attributes are responsible for parsing the result obtained from a method invocation. To determine if a method passed or failed, logs can be monitored for pass and fail entries. If any failures are logged, the method likely failed. If no passes or failures were logged, the method also likely failed. If an exception was thrown from the test method or any attributes, the method again likely failed. Otherwise the method can be considered to have (successfully) passed.
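- The pass/fail rule described above amounts to the following check (a sketch only; real log entries would carry more detail than a plain "pass" or "fail" marker):
    def parse_result(log_entries, exception_thrown):
        if exception_thrown:
            return "fail"             # exception from the test method or an attribute
        if "fail" in log_entries:
            return "fail"             # any logged failure fails the method
        if "pass" not in log_entries:
            return "fail"             # nothing logged also counts as a failure
        return "pass"

    print(parse_result(["pass", "pass"], exception_thrown=False))   # pass
    print(parse_result([], exception_thrown=False))                 # fail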
- test case scenario 230 is a collection of objects that coordinate the execution of test methods for a particular test case.
- the test methods can be written without beforehand knowledge of the test harness because of the interface provided by and through the test method executor (224).
- Test case scenario 230 typically comprises test methods 231 and other methods and data 232. Test methods access test harness objects by using the runtime object (which comprises the test method executor), rather than by querying specific test harnesses.
- test harness 210 is a collection of objects that coordinate the execution of test cases and provides various test services.
- Test harness 210 typically comprises a UI (User Interface) 211, an Execution Engine 212, a Context object 213, and a Logging object 214.
- the test harness for purposes of added functionality may comprise other objects such as an automation system interface.
- the execution engine (212) is responsible for loading and executing test case scenarios using the test runtime (220).
- FIGURE 3 illustrates a process 300 flow of an execution engine, in accordance with aspects of the invention. After a start block, the process moves to block 310, at which point a test runtime is loaded. In an embodiment wherein the test runtime is written in .Net, the test runtime assembly and the test case scenario assembly are loaded into an AppDomain.
- the test harness can display information such as the version of the .Net runtime loaded, or the version of the test runtime being used.
- test cases are loaded/compiled into memory.
- the test cases can be precompiled and loaded into memory or loaded into memory and then compiled.
- the test harness can display information about the test case scenario as well as whether the test case scenario loaded and/or compiled successfully.
- the extraction engine is obtained.
- the extraction engine is obtained by first retrieving the test services provider object (221).
- the type of the base extraction engine is determined from the test runtime (220).
- the static GetService function on the test services provider object is called (passing the type of the base extraction engine to the test services provider object) to receive a reference to an extraction engine.
- test harness functionality is added to the test services provider.
- the AddService method on the test services provider is used to add to the test services provider the test services that are implemented on the test harness.
- the test services include objects that implement various interfaces such as the reboot mechanism, logging, context, and the like. If it is desirable to pass data to the extraction engine (such as, for example, an XML file to specify that certain methods should be included or skipped), a test service object that implements a test harness data provider interface can be used.
- the extraction engine is used to get test case steps.
- the extraction engine obtained at block 330 is used to call a GetTestMethodWrappers method of the extraction engine, passing to the method the AppDomain holding the test case.
- An array of TestMethodWrappers is typically returned.
- the array of method wrappers typically contains a list of test actions that should be executed in the order in which they are present in the array. (The extraction engine is typically responsible for ordering the array).
- the list of methods retrieved at block 350 is executed in the order in which the methods are listed. Each method is executed by calling the Invoke method.
- the Invoke method typically returns details about the result of the test action. The details may include a success/fail/skip result, as well as additional details.
- the details of the result may be, for example, logged, or used to update the UI.
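- Pulling the blocks of process 300 together, a hypothetical harness-side loop might read as follows; the provider, extraction engine, wrapper, and result objects follow the earlier sketches and are assumptions rather than the patent's API.
    def run_test_case(test_container, harness_services, provider, log):
        # block 330: retrieve the extraction engine from the test services provider
        extraction_engine = provider.get_service("ExtractionEngine")

        # add harness-implemented services (logging, context, reboot, ...) to the provider
        for interface, implementation in harness_services.items():
            provider.add_service(interface, implementation)

        # block 350: ask the extraction engine for the ordered test method wrappers
        wrappers = extraction_engine.get_test_method_wrappers(test_container)
        if not wrappers:
            log("no test cases or test methods found - treated as a failure")
            return False

        # execute the wrappers in order; each invocation returns an abstract result
        succeeded = True
        for wrapper in wrappers:
            result = wrapper.invoke()
            log(f"{wrapper.name}: {result.state} {result.result_message}")
            succeeded = succeeded and result.state != "fail"
        return succeeded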
- FIGURE 4 illustrates a process 400 flow of an execution engine, in accordance with aspects of the invention. For each test action, process 400 is repeated.
- the execution engine calls the InvokeInstanceMethod to initiate the execution of a particular test action.
- the InvokeInstanceMethod calls the InstanceMethod to invoke the particular test action.
- the InstanceMethod in turn calls the Method Wrapper (i.e., an example test method executor) to invoke the particular test action.
- the method wrapper evaluates and executes the attributes (pre-invocation) of the particular test action.
- the method wrapper next invokes the test method in accordance with the evaluated/executed attributes.
- the attributes are again evaluated/executed (post-invocation).
- the method wrapper construes the result of the post-invocation attribute evaluation and returns a value that signals the result.
- the ActionResult is passed to the InstanceMethod, and to the InvokeInstanceMethod in turn.
- the InvokeInstanceMethod evaluates the return value and passes the result to the execution engine.
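- The per-action sequence of process 400 can be summarized by a sketch of the wrapper's Invoke; the attribute hooks and the MethodState object are assumptions consistent with the sketches above.
    def invoke_wrapped(test_method, attributes, method_state):
        # evaluate attributes, invoke the test action, evaluate the attributes again,
        # then construe the final MethodState into the value returned to the engine
        try:
            for attribute in attributes:           # pre-invocation evaluation
                attribute.pre_invoke(method_state)
            if method_state.execution_state != "skip":
                test_method()                      # invoke the test action itself
            for attribute in attributes:           # post-invocation evaluation
                attribute.post_invoke(method_state)
        except Exception as error:
            # an exception that escapes the method or an attribute defaults to failure
            method_state.try_override("fail", priority=100, message=str(error))
        return method_state.execution_state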
Description
- It is the object of the present invention to provide an improved method for automated testing, as well as a corresponding system and computer-readable medium.
- This object is solved by the subject matter of the independent claims.
- Preferred embodiments are defined by the dependent claims.
- FIGURE 1 illustrates an exemplary computing device that may be used in one exemplary embodiment of the present invention.
- FIGURE 2 is a block diagram illustrating an exemplary environment for practicing the present invention.
- FIGURE 3 illustrates a process 300 flow of an execution engine, in accordance with aspects of the invention.
- FIGURE 4 is a further illustration of a process 400 flow of an execution engine, in accordance with aspects of the invention.
- Throughout the specification and claims, the following terms take the meanings explicitly associated herein, unless the context clearly dictates otherwise. The terminology and interface specifications used herein are not intended to represent a particular language in which a particular object or method should be written. Rather, the terminology and interface specifications are used to describe the functionality and contents of an interface or object, such as function names, inputs, outputs, return values, and what operations are to be performed using the interface (or what operations are to be performed by the object).
-
FIGURE 2 is a block diagram illustrating an exemplary environment for practicing the present invention. The exemplary environment shown inFIGURE 2 is atest automation system 200 that comprisestest harness 210,test runtime 220, andtest case scenario 230. - In one embodiment,
test runtime 220 is a collection of objects that abstracts knowledge of test cases from the test harness.Test runtime 220 typically comprises testservices provider object 221,extraction engine 222, attributes 223, andtest method executor 224.Test runtime 220 can be used by different harnesses to provide consistent support for a particular type of test case format. By extension, a test harness can use different test runtimes to support different types of test case formats. The test harness typically determines which test runtime to use for a particular test case format. - Test case extraction is accomplished through a dedicated extraction object (obtained from a test
services provider object 221, described below), and invocation is performed by an object (test method executor 224, also described below), which returns a generic result to the test harness. The test method executor evaluates attributes using a generic interface to control the execution of the method. - A test harness is not required to, for example, evaluate attributes, determine the order that in which test methods should executed, build argument lists for method invocations, and the like. The test harness typically does not have direct access to this information required to perform those tasks, which helps ensure more consistent test execution between different test harnesses. All test harness dependent functionality (logging, "remoting," and the like) should be objects implemented by the test harness, described by an interface, and stored in the test service provider object for use during test execution. Accordingly, a test harness can be created which is easily capable of switching between different test runtimes without requiring changes to the test harness code.
- Test
services provider object 221 is used by the test harness to retrieve an extraction engine and is also used by the test case scenario to retrieve objects for test harness-implemented functionality (including test services such as logging, synchronization, and the like). - The test services provider object typically provides methods that are used to facilitate access to test services. An AddService method is called to store references to objects that implement a test service. A test service object should implement a "generic" type, which facilitates a set of standard services that different harnesses can provide. The object passed in should implement functionality needed by attributes or test methods. This method should be called by the test harness for each service the test harness provides, which is generally done before the test methods are executed.
- A GetService method is typically called to retrieve a type of test service object. The type passed in should represent an interface implemented by a test service object. An object that implements that interface is returned if found. If the object is not found, a value is returned that indicates such a result (e.g., a null is returned). If a test or attribute requires a test service that is not present, the test method being executed should fail.
- An AddDefaultServices protected method is typically called by the constructor. It is typically used to add whatever test services the test runtime provides by default, such as an extraction engine.
- The test services provider object data and method calls should be static so that data set by the test harness can be retrieved by later calls to the test services provider. The object is typically a class that implements the functionality described above. The constructor of the object is typically used to call the AddDefaultServices method.
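A minimal C# sketch of the test services provider described above; the exact signatures, and the use of a static dictionary keyed by interface type, are assumptions chosen to match the described behavior.

```csharp
using System;
using System.Collections.Generic;

public static class TestServicesProvider
{
    // Static storage so that services added by the test harness remain
    // retrievable by later calls from attributes and test methods.
    private static readonly Dictionary<Type, object> services =
        new Dictionary<Type, object>();

    static TestServicesProvider()
    {
        AddDefaultServices();
    }

    // Called by the test harness once for each service it implements,
    // generally before any test methods are executed.
    public static void AddService(Type serviceType, object implementation)
    {
        services[serviceType] = implementation;
    }

    // Returns the registered object implementing the requested interface,
    // or null when no such service has been added.
    public static object GetService(Type serviceType)
    {
        return services.TryGetValue(serviceType, out var implementation)
            ? implementation
            : null;
    }

    // Adds the services the runtime provides by default, such as an
    // extraction engine, e.g.:
    // AddService(typeof(ExtractionEngine), new ExtractionEngine());
    private static void AddDefaultServices()
    {
    }
}
```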
-
Extraction engine 222 is used to retrieve, on behalf of the test harness, an ordered list of test methods from the test case for a particular test case scenario. Typically there is only one extraction engine present in a test runtime. - A GetTestMethodWrappers method is used to retrieve the ordered list of test methods for a test case scenario. (A test method wrapper is a particular implementation of a test method executor.) A parameter is passed into the method that represents a container that holds test cases. The method returns the ordered list of methods. If an error occurs while extracting the test case, an exception can be "thrown." If no test cases or test methods are found, an empty list is normally returned. An empty list is usually treated as a failure by the test harness. If additional data needs to be passed to the extraction engine, it can be provided through the TestServicesProvider object by a TestHarnessDataProvider test service.
- A large part of the functionality present in an extraction engine object is the same between different runtimes; the only detail that may change significantly from one runtime to another is which attributes are used. An object should be created to facilitate easy creation of new extraction engines and runtimes. This class typically implements the following functions (a sketch follows the list below):
- TypeIsATestCase returns "true" if the type passed in is a test case, "false" if not. This function examines the attributes present on a type, looking for a test case attribute to determine whether the type is a test case.
- MethodIsATestMethod returns "true" if the method passed in is a test method, "false" if not. This function examines the attributes present on a method, looking for an execution attribute to determine whether the method is a test method.
- GetMethodWrappersFromType gathers all relevant public non-static methods on a type that have execution attributes and returns the methods as an ordered list of MethodWrappers (discussed below). In an embodiment using the .Net environment, the list is ordered using the sort functionality built into .Net arrays, with the MethodWrapper supplying a compare routine to order the list. If an instance of the type cannot be created, this method fails and returns an empty list.
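A sketch of the three helper functions, assuming the hypothetical TestCaseAttribute and ExecutionAttribute types shown in the other examples in this description and a MethodWrapper type that implements IComparable.

```csharp
using System;
using System.Linq;
using System.Reflection;

public static class ExtractionHelpers
{
    // A type is a test case if it carries a test case attribute.
    public static bool TypeIsATestCase(Type type) =>
        type.GetCustomAttributes(typeof(TestCaseAttribute), true).Length > 0;

    // A method is a test method if it carries an execution attribute.
    public static bool MethodIsATestMethod(MethodInfo method) =>
        method.GetCustomAttributes(typeof(ExecutionAttribute), true).Length > 0;

    // Gathers public non-static methods that have execution attributes and
    // returns them as an ordered array of method wrappers.
    public static MethodWrapper[] GetMethodWrappersFromType(Type type)
    {
        object instance;
        try { instance = Activator.CreateInstance(type); }
        catch { return new MethodWrapper[0]; } // cannot instantiate: empty list

        MethodWrapper[] wrappers = type
            .GetMethods(BindingFlags.Public | BindingFlags.Instance)
            .Where(MethodIsATestMethod)
            .Select(m => new MethodWrapper(instance, m))
            .ToArray();

        // The MethodWrapper compare routine supplies the execution order.
        Array.Sort(wrappers);
        return wrappers;
    }
}
```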
- The extraction engine can use a test service to retrieve information that can be used to modify a test extraction. The information can be stored in a file such as an XML file that is in accordance with a schema defined for storing test extraction information. If the data provided by the test service does not refer to an XML file in accordance with the schema, the modification data can be ignored.
- The extraction engine typically loads all extraction modifier XML files specified. The contents of the XML files are, for example, placed into two "buckets": test inclusions and test exclusions. If both buckets are empty, the extraction engine should include all tests; this case is equivalent to having no extraction modifier XML files, or to not being able to retrieve the Extraction Engine Data Source test service. If only the exclusion bucket is empty, the extraction engine should include only the tests listed in the inclusion bucket. If only the inclusion bucket is empty, the extraction engine should include all tests except those listed in the exclusion bucket. If both buckets have data, the extraction engine should include the tests in the inclusion bucket that are not listed in the exclusion bucket (such that the exclusion list has controlling authority over the inclusion list).
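The inclusion/exclusion rule can be reduced to a small predicate; the sketch below assumes the bucket contents have already been read from the extraction modifier XML files into simple name sets.

```csharp
using System.Collections.Generic;

public static class ExtractionFilter
{
    public static bool ShouldInclude(string testName,
                                     ISet<string> inclusions,
                                     ISet<string> exclusions)
    {
        // Both buckets empty: include everything (equivalent to having no
        // extraction modifier XML files at all).
        if (inclusions.Count == 0 && exclusions.Count == 0)
            return true;

        // The exclusion list has controlling authority over the inclusion list.
        if (exclusions.Contains(testName))
            return false;

        // With a non-empty inclusion list, only listed tests are included.
        if (inclusions.Count > 0)
            return inclusions.Contains(testName);

        // Only the exclusion list has data: include everything not excluded.
        return true;
    }
}
```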
-
Test method executors 224 are used to execute a test method without requiring the caller to have beforehand knowledge about the method or its attributes. An Invoke method is called to execute a test method. An object holding the result of the operation is returned (pass, fail, skip, and the like). The Invoke method is responsible for processing the attributes associated with a method and creating a parameter list (if required) for the method being invoked. Execution is typically modified by the attributes associated with the test method. - An Abort method can be called to abort a currently executing test method. The abort typically causes a currently running Invoke method to return. After an abort is performed, no further tests can normally be run.
- A CompareTo method is called to compare two Test Method Wrappers. If the result returned is less than zero, it indicates that this method should be executed before the other Test Method Wrapper (to which the method is compared). If the result returned equals zero, it indicates that the order in which the two methods are executed does not matter. If the result returned is greater than zero, it indicates that this method should be executed after the other Test Method Wrapper.
- A GetMethodAttributes method is called to retrieve a sorted list of attributes associated with the test method, all of which derive from a common base method attribute class. This sorted list of attributes is used by the Test Method Wrapper in several places. For example, the Invoke method (as discussed above) uses the sorted list to evaluate the attributes in the correct order. GetMethodAttributes can also be used to compare one method wrapper to another. A call such as "get AttributesDescription" uses the ordered list of attributes to create a string description of the attributes associated with the method.
- The test method wrapper has several properties that can be retrieved for use by the test harness, such as by calling "get Description" and "get Name." The properties are generated from the name of the method and the class from which the method was defined. Other properties can be added as needed. Note that these properties do not require the harness to know anything about the test method being queried and that additional properties can be added without requiring modifications to existing test harnesses.
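A condensed C# sketch of a test method wrapper; it relies on the hypothetical MethodAttribute, MethodState, and MethodResult types sketched later in this description, and it omits Abort handling, parameter-list building, and the remaining property getters for brevity.

```csharp
using System;
using System.Linq;
using System.Reflection;

public class MethodWrapper : IComparable<MethodWrapper>
{
    private readonly object instance;
    private readonly MethodInfo method;

    public MethodWrapper(object instance, MethodInfo method)
    {
        this.instance = instance;
        this.method = method;
    }

    // Generated from the defining class and the method name.
    public string Name => method.DeclaringType.Name + "." + method.Name;

    // Sorted list of attributes derived from the common base attribute class.
    public MethodAttribute[] GetMethodAttributes() =>
        method.GetCustomAttributes(typeof(MethodAttribute), true)
              .Cast<MethodAttribute>()
              .OrderBy(a => a.Order)
              .ToArray();

    // Execution order is decided by comparing attributes, not method names.
    public int CompareTo(MethodWrapper other) =>
        GetMethodAttributes().First().CompareTo(other.GetMethodAttributes().First());

    // Processes attributes before and after invocation and returns an
    // abstract pass/fail/skip result to the caller.
    public MethodResult Invoke()
    {
        var state = new MethodState();
        MethodAttribute[] attributes = GetMethodAttributes();

        foreach (var attribute in attributes)
            attribute.PreInvoke(state);

        if (state.ShouldExecute)
        {
            try { method.Invoke(instance, null); }
            catch (Exception e) { state.RecordException(e); }
        }

        foreach (var attribute in attributes)
            attribute.PostInvoke(state);

        return state.ToResult();
    }
}
```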
- The MethodResult object is used to convey the result of a test method wrapper to the test harness. Because the test harness does not necessarily have beforehand knowledge of the method being invoked, the results are expressed in abstract form.
- The object typically needs to express three possible outcomes from trying to execute a method: pass, skip, or fail. A "pass" would typically indicate that the method completed execution without any errors (for example, the method logged no failures, the test method did not throw an exception, and none of the attributes had an error). A "fail" would indicate that the method failed (for example, the test method indicated a failure, or an attribute indicated a failure). A "skip" would indicate that the method was skipped rather than executed (for example, an attribute specifies that the test method should only run on a server, but the test is running on a client machine; in that case the method would be skipped).
- The MethodResult object can also contain optional messages such as a result message and/or an error message. The result message can be a human-readable description of the method's result. Upon successful execution of a method, it could be left blank, or it could contain the number of passes recorded. For a completed method in which an error occurred, the result message may include a textual description of the error while the error message can contain the details of the error.
- When an exception is "thrown" by an attribute a TestConditionException class can be used to convey a modified method state. For example, three derived classes that map directly to a method state include TestSkipException, TestSucceededException, and TestErrorException.
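The sketch below gives one possible shape for the MethodResult object and the exception classes named above; the TestOutcome enum name and the constructor signature are assumptions.

```csharp
using System;

public enum TestOutcome { Pass, Fail, Skip }

public class MethodResult
{
    public MethodResult(TestOutcome outcome,
                        string resultMessage = "",
                        string errorMessage = "")
    {
        Outcome = outcome;
        ResultMessage = resultMessage;
        ErrorMessage = errorMessage;
    }

    public TestOutcome Outcome { get; }

    // Optional human-readable description of the method's result.
    public string ResultMessage { get; }

    // Optional details of an error encountered during execution.
    public string ErrorMessage { get; }
}

// Thrown by an attribute to convey a modified method state; the three
// derived classes map directly to the three method states.
public class TestConditionException : Exception
{
    public TestConditionException(string message) : base(message) { }
}

public class TestSkipException : TestConditionException
{
    public TestSkipException(string message) : base(message) { }
}

public class TestSucceededException : TestConditionException
{
    public TestSucceededException(string message) : base(message) { }
}

public class TestErrorException : TestConditionException
{
    public TestErrorException(string message) : base(message) { }
}
```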
-
Attributes 223 are typically used to modify and control the execution of a test. A test is executed according to a test case scenario, which can be defined through the use of attributes. At least three basic types of attributes can be used: class level attributes, method level attributes, and parameter level attributes. - Test class attributes are optional and can be used to modify an instantiated object state, such that test extraction can be skipped or caused to be performed multiple times for a denoted type. Pre- and post-extraction methods are typically used to modify the instantiated object state. Test class attributes allow such variations in test case scenarios to be implemented.
- Method level attributes are capable of modifying method parameters and method execution. In an embodiment, method level attributes include execution attributes and supplemental attributes. Both types of attributes have pre- and post-invocation methods. The order in which method level attributes are evaluated is determined by an order property, which is defined when the attribute is written; however, attributes typically have no beforehand knowledge of what other attributes may be present. Execution modification at each stage can be handled by a priority-based state system: the attribute returning the state with the highest priority is typically used to determine how execution is modified.
- Execution attributes are used to mark a method as a test method. A method without an execution attribute is usually not included in a test. The core responsibility of an execution attribute is to establish a high-level order to the test and to evaluate method results. A method should not have more than one execution attribute. Extraction and execution behavior when more than one execution attribute is present is normally undefined. Examples of execution attributes include "Setup", "Step", and "Teardown" attributes.
- Supplemental attributes perform supplemental actions to modify the execution of a test method. The core responsibility of a supplemental attribute is to perform secondary tasks that are necessary for the execution of a test. They typically are not used to denote high-level order. A method may have any number of supplemental attributes. Examples of supplemental attributes include "WaitFor" and "Target" attributes.
- Parameter level attributes are optionally used to modify the parameter input to a method and to modify a state of an object (e.g., the context state) after a method has executed. A parameter level attribute is normally not used to alter the execution of a test in the way method level attributes are. However, if an exception is thrown, the test method fails immediately in response to the exception. In an embodiment, there is only one parameter level attribute per parameter; behavior with more than one parameter level attribute is left undefined. Before a method is invoked, parameter level attributes are evaluated after method level attributes; after the method is invoked, parameter level attributes are evaluated before method level attributes. An example of a parameter level attribute is the "ContextMapping" attribute.
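A sketch of the attribute hierarchy described above. The base class name MethodAttribute, the Order property, and the pre/post-invocation hooks are assumptions chosen to match the described behavior; TestOutcome comes from the result sketch above and MethodState is sketched a little further below.

```csharp
using System;

// Common base class for method level attributes.
[AttributeUsage(AttributeTargets.Method, AllowMultiple = true)]
public abstract class MethodAttribute : Attribute
{
    // Evaluation order relative to other attributes on the same method.
    public abstract int Order { get; }

    public int CompareTo(MethodAttribute other) => Order.CompareTo(other.Order);

    // Hooks evaluated before and after the test method is invoked; they
    // modify execution through the shared MethodState object.
    public virtual void PreInvoke(MethodState state) { }
    public virtual void PostInvoke(MethodState state) { }
}

// Execution attributes mark a method as a test method and establish the
// high-level order of the test; at most one per method.
public abstract class ExecutionAttribute : MethodAttribute { }

public class SetupAttribute : ExecutionAttribute
{
    public override int Order => 0;
}

public class TeardownAttribute : ExecutionAttribute
{
    public override int Order => 100;
}

// Supplemental attributes perform secondary tasks; a method may carry any
// number of them.
public abstract class SupplementalAttribute : MethodAttribute { }

// Hypothetical Target attribute: skips the method when the test is not
// running on the intended machine.
public class TargetAttribute : SupplementalAttribute
{
    public TargetAttribute(string machineRole) { MachineRole = machineRole; }
    public string MachineRole { get; }
    public override int Order => 10;

    public override void PreInvoke(MethodState state)
    {
        if (!string.Equals(MachineRole, Environment.MachineName,
                           StringComparison.OrdinalIgnoreCase))
            state.TrySetState(TestOutcome.Skip, priority: 1,
                              message: "Wrong target machine");
    }
}

// Parameter level attributes modify the input to a single parameter.
[AttributeUsage(AttributeTargets.Parameter)]
public class ContextMappingAttribute : Attribute
{
    public ContextMappingAttribute(string contextKey) { ContextKey = contextKey; }
    public string ContextKey { get; }
}
```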
- The MethodState object is used by method attributes to control the execution of a test method. Because a plurality of attributes can be assigned to a test method (and because each attribute can potentially alter the execution of the test method), each attribute can communicate with the MethodState object to ensure consistent execution of the test method.
- The MethodState object can comprise information related to the execution state, a message, an error code, and a state override priority. The execution state comprises information regarding how the method has terminated (e.g., skip, pass, fail), whether the state is permitted to be changed, and whether the method should be executed. The message can be used to optionally present text that indicates why the test method is in a particular state. The error code can be used to indicate the details of an error that the test method wrapper might encounter while executing a test method. The state override priority field can be used to improve the consistency of test method execution by allowing the execution state, message, and the error code to be altered only if the new state has a priority greater than the existing state.
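A sketch of the MethodState object and its priority-based override rule; the member names are assumptions, and TestOutcome and MethodResult refer to the earlier result sketch.

```csharp
using System;

public class MethodState
{
    private int currentPriority = -1;

    // How the method has terminated so far; pass is assumed until a
    // higher-priority request changes it.
    public TestOutcome State { get; private set; } = TestOutcome.Pass;
    public string Message { get; private set; } = "";
    public int ErrorCode { get; private set; }

    // Whether the test method should be executed at all.
    public bool ShouldExecute { get; set; } = true;

    // The state, message, and error code may only be altered by a request
    // whose priority is greater than that of the existing state.
    public bool TrySetState(TestOutcome state, int priority,
                            string message = "", int errorCode = 0)
    {
        if (priority <= currentPriority)
            return false;

        currentPriority = priority;
        State = state;
        Message = message;
        ErrorCode = errorCode;
        return true;
    }

    // By default a thrown exception fails the method; an attribute that
    // expects the exception can later override this with a higher priority.
    public void RecordException(Exception e) =>
        TrySetState(TestOutcome.Fail, priority: 10, message: e.Message);

    // The final MethodResult is constructed from the terminating state.
    public MethodResult ToResult() =>
        new MethodResult(State, Message,
                         State == TestOutcome.Fail ? Message : "");
}
```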
- The test method wrapper (224) executes a test method until a terminating state is reached. When the terminating state is reached, the MethodResult object is constructed from the final MethodState object.
- Execution attributes are responsible for parsing the result obtained from a method invocation. To determine if a method passed or failed, logs can be monitored for pass and fail entries. If any failures are logged, the method likely failed. If no passes or failures were logged, the method also likely failed. If an exception was thrown from the test method or any attributes, the method again likely failed. Otherwise the method can be considered to have (successfully) passed.
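Those rules fit naturally into the post-invocation hook of an execution attribute. In the sketch below, the ITestLog service and its PassCount/FailCount members are hypothetical, and the attribute, state, and provider types come from the earlier sketches.

```csharp
// Hypothetical log service that counts pass and fail entries.
public interface ITestLog
{
    int PassCount { get; }
    int FailCount { get; }
}

// Hypothetical execution attribute that parses the invocation result from
// the log: any logged failure, or nothing logged at all, counts as a failure.
public class VerifyLogAttribute : ExecutionAttribute
{
    public override int Order => 50;

    public override void PostInvoke(MethodState state)
    {
        var log = (ITestLog)TestServicesProvider.GetService(typeof(ITestLog));

        if (log == null || log.FailCount > 0 || log.PassCount == 0)
            state.TrySetState(TestOutcome.Fail, priority: 5,
                              message: "Failures logged or no passes recorded");
    }
}
```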
- In one embodiment,
test case scenario 230 is a collection of objects that coordinate the execution of test methods for a particular test case. The test methods can be written without beforehand knowledge of the test harness because of the interface provided by and through the test method executor (224). -
Test case scenario 230 typically comprises test methods 231 and other methods and data 232. Test methods access test harness objects by using the runtime object (which comprises the test method executor), rather than by querying specific test harnesses. - In one embodiment,
test harness 210 is a collection of objects that coordinates the execution of test cases and provides various test services. Test harness 210 typically comprises a UI (User Interface) 211, an Execution Engine 212, a Context object 213, and a Logging object 214. The test harness may comprise other objects, such as an automation system interface, for purposes of added functionality. - The execution engine (212) is responsible for loading and executing test case scenarios using the test runtime (220).
FIGURE 3 illustrates a process flow 300 of an execution engine, in accordance with aspects of the invention. After a start block, the process moves to block 310, at which point a test runtime is loaded. In an embodiment wherein the test runtime is written in .Net, the test runtime assembly and the test case scenario assembly are loaded into an AppDomain. The test harness can display information such as the version of the .Net runtime loaded, or the version of the test runtime being used. - At
block 320, one or more test cases are loaded/compiled into memory. The test cases can be precompiled and loaded into memory or loaded into memory and then compiled. The test harness can display information about the test case scenario as well as whether the test case scenario loaded and/or compiled successfully. - Continuing at
block 330, the extraction engine is obtained. The extraction engine is obtained by first retrieving the test services provider object (221). Next, the type of the base extraction engine is determined from the test runtime (220). The static GetService function on the test services provider object is called (passing the type of the base extraction engine to the test services provider object) to receive a reference to an extraction engine. - At
block 340, test harness functionality is added to the test services provider. The AddService method on the test services provider is used to add to the test services provider the test services that are implemented by the test harness. The test services include objects that implement various interfaces, such as the reboot mechanism, logging, context, and the like. If it is desirable to pass data to the extraction engine (such as, for example, an XML file specifying that certain methods should be included or skipped), a test service object that implements a test harness data provider interface can be used. - At
block 350, the extraction engine is used to get test case steps. The extraction engine obtained at block 330 is used to call a GetTestMethodWrappers method of the extraction engine, passing to the method the AppDomain holding the test case. An array of TestMethodWrappers is typically returned. - The array of method wrappers typically contains a list of test actions that should be executed in the order in which they are present in the array. (The extraction engine is typically responsible for ordering the array.)
- At
block 360, the list of methods retrieved at block 350 is executed in the order in which the methods are listed. Each method is executed by calling the Invoke method. The Invoke method typically returns details about the result of the test action. The details may include a success/fail/skip result, as well as additional information. The details of the result may be, for example, logged, or used to update the UI.
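Blocks 310 through 360 reduce to a short driver loop. This sketch folds the assembly loading and UI updates into comments, assumes an ExtractionEngine class exposing the GetTestMethodWrappers method described above, and simplifies the container parameter to a single test case type rather than an AppDomain.

```csharp
using System;

public static class HarnessExecutionEngine
{
    public static void Run(Type testCaseType)
    {
        // Blocks 310-320: the test runtime and test case scenario assemblies
        // would be loaded (into an AppDomain in a .Net embodiment).

        // Block 330: obtain the extraction engine via the test services provider.
        var extraction = (ExtractionEngine)TestServicesProvider
            .GetService(typeof(ExtractionEngine));

        // Block 340: add harness-implemented services (logging, context, ...),
        // e.g. TestServicesProvider.AddService(typeof(ITestLog), new ConsoleLog());

        // Block 350: retrieve the ordered array of test method wrappers.
        MethodWrapper[] steps = extraction.GetTestMethodWrappers(testCaseType);

        // Block 360: execute the methods in list order and report each result.
        foreach (MethodWrapper step in steps)
        {
            MethodResult result = step.Invoke();
            Console.WriteLine($"{step.Name}: {result.Outcome} {result.ResultMessage}");
        }
    }
}
```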
- FIGURE 4 illustrates a process flow 400 of an execution engine, in accordance with aspects of the invention. Process 400 is repeated for each test action. The execution engine calls the InvokeInstanceMethod to initiate the execution of a particular test action. The InvokeInstanceMethod calls the InstanceMethod to invoke the particular test action. The InstanceMethod in turn calls the Method Wrapper (i.e., an example test method executor) to invoke the particular test action. - The method wrapper evaluates and executes the attributes (pre-invocation) of the particular test action. The method wrapper next invokes the test method in accordance with the evaluated/executed attributes. After the test method has been executed, the attributes are again evaluated/executed (post-invocation). The method wrapper construes the result of the post-invocation attribute evaluation and returns a value that signals the result. The ActionResult is passed to the InstanceMethod, and to the InvokeInstanceMethod in turn. The InvokeInstanceMethod evaluates the return value and passes the result to the execution engine.
-
- The above specification, examples and data provide a complete description of the manufacture and use of the composition of the invention. Since many embodiments of the invention can be made without departing from the scope of the invention, the invention resides in the claims hereinafter appended.
Claims (13)
- A method for automated testing, comprising:
providing a test case scenario object (230) that coordinates the execution of test methods (231) that are arranged to test an electronic system;
providing a runtime object (220) comprising a plurality of attributes (223) and a test extraction engine (222);
extracting, by the extraction engine (222), an ordered list of the test methods based on the attributes, wherein an execution engine (212) orders the test methods into a hierarchy that comprises a base class and subclasses, and wherein each of the subclasses derives from the base class;
selectively applying inheritance to the test methods according to the attributes to determine which of the subclasses inherits from the base class; and
using a test harness (210) to provide system test services for the test methods, wherein using the test harness comprises using the execution engine (212) to execute (360) the list of test methods according to the order of the test methods in the list.
- The method of Claim 1, further comprising using a comparison function that is defined within the attributes to order the test methods according to the attributes.
- The method of Claim 1, wherein the base class test methods comprise a setup method and a teardown method.
- The method of Claim 3, wherein the subclass test methods comprise a setup method and a teardown method.
- A test automation system (200), comprising:
means for providing a test case scenario object (230) that coordinates the execution of test methods (231) that are arranged to test an electronic system;
means for providing a runtime object (220) comprising a plurality of attributes (223) and a test extraction engine (222), wherein the extraction engine extracts an ordered list of the test methods based on the attributes, wherein an execution engine orders the test methods in a hierarchy that comprises a base class and subclasses, and wherein each of the subclasses derives from the base class;
means for selectively applying inheritance to the test methods to determine which of the subclasses inherits from the base class; and
test harness means for providing system test services for the test methods, the test harness means comprising the execution engine (212) adapted for executing (360) the list of test methods according to the order of the test methods in the list.
- The system of Claim 5, further comprising means for using a comparison function that is defined within the attributes to order the test methods according to the attributes.
- The system of Claim 5, wherein the base class test methods comprise a setup method and a teardown method.
- The system of Claim 7, wherein the subclass test methods comprise a setup method and a teardown method.
- The system of Claim 5, further comprising means for modifying the state of a subclass test method in response to an attribute inherited by the subclass test method.
- A computer-readable medium having computer-executable components, comprising:
a test case scenario object (230) configured to coordinate the execution of test methods (231) that are arranged to test an electronic system;
a test runtime object (220) comprising a plurality of attributes (223) and a test extraction engine (222) configured to extract an ordered list of the test methods based on the attributes,
wherein the test extraction engine (222) is further configured to order in an ordered list the test methods into a hierarchy that comprises a base class and subclasses, wherein each of the subclasses derives from the base class, and
wherein inheritance is selectively applied to the test methods according to the attributes to determine which of the subclasses inherits from the base class; and
a test harness (210) that is arranged to provide system test services for the test methods, wherein the test harness comprises an execution engine (212) configured to execute (360) the list of test methods according to the order of the test methods in the list.
- The computer-readable medium of Claim 10, wherein the test extraction engine (222) is further configured to use a comparison function that is defined within the attributes to order the test methods according to the attributes.
- The computer-readable medium of Claim 10, wherein the base class test methods comprise a setup method and a teardown method.
- The computer-readable medium of Claim 12, wherein the subclass test methods comprise a setup method and a teardown method.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/815,019 US7552422B2 (en) | 2004-03-31 | 2004-03-31 | Test case inheritance controlled via attributes |
US815019 | 2004-03-31 |
Publications (3)
Publication Number | Publication Date |
---|---|
EP1582985A2 EP1582985A2 (en) | 2005-10-05 |
EP1582985A3 EP1582985A3 (en) | 2009-11-18 |
EP1582985B1 true EP1582985B1 (en) | 2011-06-08 |
Family
ID=34887733
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP05102449A Not-in-force EP1582985B1 (en) | 2004-03-31 | 2005-03-29 | Test case inheritance controlled via attributes |
Country Status (6)
Country | Link |
---|---|
US (1) | US7552422B2 (en) |
EP (1) | EP1582985B1 (en) |
JP (1) | JP2005293578A (en) |
KR (1) | KR101036679B1 (en) |
CN (1) | CN100468356C (en) |
AT (1) | ATE512406T1 (en) |
Families Citing this family (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060106819A1 (en) * | 2004-10-28 | 2006-05-18 | Komateswar Dhanadevan | Method and apparatus for managing a computer data storage system |
US7895575B2 (en) | 2005-08-19 | 2011-02-22 | Electronics And Telecommunications Research Institute | Apparatus and method for generating test driver |
KR100777103B1 (en) * | 2005-08-19 | 2007-11-19 | 한국전자통신연구원 | Apparatus and method for generation of test driver |
CN100346308C (en) * | 2005-11-24 | 2007-10-31 | 华为技术有限公司 | Automatic test method based on database operation |
WO2007097014A1 (en) * | 2006-02-27 | 2007-08-30 | Knowledge Brain Inc | Operation confirming method for information system, its operation confirming program, recording medium, and operation confirming system |
US8296731B2 (en) * | 2007-03-05 | 2012-10-23 | Microsoft Corporation | Dynamic method selection based on declarative requirements of interaction scope |
KR101014679B1 (en) * | 2007-09-14 | 2011-02-16 | 주식회사 신한은행 | System for Testing Program Source Code using Scenarios |
US8463760B2 (en) * | 2008-09-04 | 2013-06-11 | At&T Intellectual Property I, L. P. | Software development test case management |
JP2011100420A (en) * | 2009-11-09 | 2011-05-19 | Toshiba Corp | Test program creation device |
CN101984412B (en) * | 2010-10-13 | 2013-01-30 | 北京航空航天大学 | Method for scheduling parallel test tasks based on grouping and tabu search |
US8677320B2 (en) | 2011-04-06 | 2014-03-18 | Mosaic, Inc. | Software testing supporting high reuse of test data |
CN103064785B (en) * | 2012-12-04 | 2016-03-30 | 北京奇虎科技有限公司 | A kind of detection method of terminal capabilities and device |
CN103019900B (en) * | 2012-12-04 | 2016-10-26 | 北京奇虎科技有限公司 | The testing result display packing of terminal capabilities and device |
US9785542B2 (en) * | 2013-04-16 | 2017-10-10 | Advantest Corporation | Implementing edit and update functionality within a development environment used to compile test plans for automated semiconductor device testing |
RU2598988C2 (en) * | 2013-08-07 | 2016-10-10 | Фиизер Инк. | Methods and systems for searching for application software |
US20160239409A1 (en) * | 2013-10-17 | 2016-08-18 | Hewlett Packard Enterprise Development Lp | Testing a web service using inherited test attributes |
WO2015057234A1 (en) * | 2013-10-17 | 2015-04-23 | Hewlett-Packard Development Company, L.P. | Testing a web service using inherited test attributes |
US10546075B2 (en) | 2015-05-15 | 2020-01-28 | Futurewei Technologies, Inc. | System and method for a synthetic trace model |
CN106227666B (en) * | 2016-07-25 | 2019-05-17 | 微梦创科网络科技(中国)有限公司 | A kind of automated testing method and system based on big data |
CN110737597A (en) * | 2019-10-15 | 2020-01-31 | 北京弘远博学科技有限公司 | UI layer automatic testing method based on education training platform |
CN111625445B (en) * | 2020-04-23 | 2024-07-05 | 平安国际智慧城市科技股份有限公司 | Java-based test framework construction method, java-based test framework construction device and storage medium |
US11537508B1 (en) | 2021-02-23 | 2022-12-27 | State Farm Mutual Automobile Insurance Company | Software testing in parallel threads with a record-locking database |
US11714745B1 (en) | 2021-02-23 | 2023-08-01 | State Farm Mutual Automobile Insurance Company | Software testing in parallel with different database instances |
US11816023B1 (en) * | 2021-02-23 | 2023-11-14 | State Farm Mutual Automobile Insurance Company | Test conflict guard for parallel software testing |
US11720482B1 (en) | 2021-02-23 | 2023-08-08 | State Farm Mutual Automobile Insurance Company | Retrying failed test cases in software testing using parallel threads |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5414836A (en) * | 1993-09-29 | 1995-05-09 | International Business Machines Corporation | Software testing system that employs a graphical interface to generate test cases configured as hybrid tree structures |
US5751941A (en) * | 1996-04-04 | 1998-05-12 | Hewlett-Packard Company | Object oriented framework for testing software |
US6031990A (en) * | 1997-04-15 | 2000-02-29 | Compuware Corporation | Computer software testing management |
US6430705B1 (en) * | 1998-08-21 | 2002-08-06 | Advanced Micro Devices, Inc. | Method for utilizing virtual hardware descriptions to allow for multi-processor debugging in environments using varying processor revision levels |
-
2004
- 2004-03-31 US US10/815,019 patent/US7552422B2/en not_active Expired - Fee Related
-
2005
- 2005-03-28 JP JP2005091718A patent/JP2005293578A/en active Pending
- 2005-03-29 AT AT05102449T patent/ATE512406T1/en not_active IP Right Cessation
- 2005-03-29 EP EP05102449A patent/EP1582985B1/en not_active Not-in-force
- 2005-03-31 KR KR1020050027005A patent/KR101036679B1/en not_active IP Right Cessation
- 2005-03-31 CN CNB2005100562599A patent/CN100468356C/en not_active Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
EP1582985A3 (en) | 2009-11-18 |
KR101036679B1 (en) | 2011-05-23 |
ATE512406T1 (en) | 2011-06-15 |
CN100468356C (en) | 2009-03-11 |
JP2005293578A (en) | 2005-10-20 |
KR20060045072A (en) | 2006-05-16 |
EP1582985A2 (en) | 2005-10-05 |
CN1677365A (en) | 2005-10-05 |
US20050251719A1 (en) | 2005-11-10 |
US7552422B2 (en) | 2009-06-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP1582985B1 (en) | Test case inheritance controlled via attributes | |
JP4950454B2 (en) | Stack hierarchy for test automation | |
CN107273286B (en) | Scene automatic test platform and method for task application | |
US20190324772A1 (en) | Method and device for processing smart contracts | |
US5617533A (en) | System and method for determining whether a software package conforms to packaging rules and requirements | |
US7484223B2 (en) | System and method for building a run-time image from components of a software program | |
US8392873B2 (en) | Methods and apparatus for implementing model-based software solution development and integrated change management | |
US6298353B1 (en) | Checking serialization compatibility between versions of java classes | |
US7340725B1 (en) | Smart test attributes and test case scenario in object oriented programming environment | |
CN109189374B (en) | Object structure code generation method and system based on object reference chain | |
KR20070049166A (en) | System and method for extraction and creation of application meta-information within a software application repository | |
US20090193444A1 (en) | Techniques for creating and managing extensions | |
WO2007044170A1 (en) | Extensible mechanism for object composition | |
CN109729075B (en) | Cloud platform component security policy implementation method | |
CN112181858B (en) | Automatic detection method for Java software project dependent conflict semantic consistency | |
US10275236B2 (en) | Generating related templated files | |
CN111625225A (en) | Program specified data output method and device | |
US20130019225A1 (en) | Incremental Inferences for Developing Data Models | |
US10459698B2 (en) | Framework for generating adapters in an integrated development environment | |
US20060129880A1 (en) | Method and system for injecting faults into a software application | |
US7577541B1 (en) | Test services provider | |
CN103197947A (en) | Script processing method and device | |
US8539468B2 (en) | System and methods for replacing software application classes using transparent object adapters | |
US7082376B1 (en) | State full test method executor | |
CN115469864A (en) | Application development device and method based on atomization packaging command |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
AK | Designated contracting states |
Kind code of ref document: A2 Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU MC NL PL PT RO SE SI SK TR |
|
AX | Request for extension of the european patent |
Extension state: AL BA HR LV MK YU |
|
PUAL | Search report despatched |
Free format text: ORIGINAL CODE: 0009013 |
|
AK | Designated contracting states |
Kind code of ref document: A3 Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU MC NL PL PT RO SE SI SK TR |
|
AX | Request for extension of the european patent |
Extension state: AL BA HR LV MK YU |
|
17P | Request for examination filed |
Effective date: 20100420 |
|
17Q | First examination report despatched |
Effective date: 20100526 |
|
AKX | Designation fees paid |
Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU MC NL PL PT RO SE SI SK TR |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU MC NL PL PT RO SE SI SK TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602005028370 Country of ref document: DE Effective date: 20110721 |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: VDEP Effective date: 20110608 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20110608 Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20110608 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20110909 Ref country code: AT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20110608 Ref country code: CY Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20110608 Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20110919 Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20110608 Ref country code: SI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20110608 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20110608 Ref country code: NL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20110608 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20111008 Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20110608 Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20110608 Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20111010 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: RO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20110608 Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20110608 Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20110608 |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: FR Payment date: 20120319 Year of fee payment: 8 |
|
26N | No opposition filed |
Effective date: 20120309 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20110608 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20110608 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: GB Payment date: 20120328 Year of fee payment: 8 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602005028370 Country of ref document: DE Effective date: 20120309 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE Payment date: 20120411 Year of fee payment: 8 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MC Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20120331 |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: MM4A |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LI Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20120331 Ref country code: CH Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20120331 Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20120329 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BG Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20110908 |
|
GBPC | Gb: european patent ceased through non-payment of renewal fee |
Effective date: 20130329 |
|
REG | Reference to a national code |
Ref country code: FR Ref legal event code: ST Effective date: 20131129 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R119 Ref document number: 602005028370 Country of ref document: DE Effective date: 20131001 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: DE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20131001 Ref country code: GB Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20130329 Ref country code: FR Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20130402 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: TR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20110608 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LU Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20120329 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HU Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20050329 |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: 732E Free format text: REGISTERED BETWEEN 20150312 AND 20150318 |