CN115269387A - Automatic interface testing method and device - Google Patents
- Publication number
- CN115269387A (application CN202210833161.3A)
- Authority
- CN
- China
- Prior art keywords
- interface
- case
- test
- information
- cases
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS; G06—COMPUTING; CALCULATING OR COUNTING; G06F—ELECTRIC DIGITAL DATA PROCESSING; G06F11/00—Error detection; Error correction; Monitoring; G06F11/36—Preventing errors by testing or debugging software; G06F11/3668—Software testing; G06F11/3672—Test management; G06F11/3684—Test management for test design, e.g. generating new test cases
- G—PHYSICS; G06—COMPUTING; CALCULATING OR COUNTING; G06F—ELECTRIC DIGITAL DATA PROCESSING; G06F11/00—Error detection; Error correction; Monitoring; G06F11/36—Preventing errors by testing or debugging software; G06F11/3668—Software testing; G06F11/3672—Test management; G06F11/3688—Test management for test execution, e.g. scheduling of test suites
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Computer Hardware Design (AREA)
- Quality & Reliability (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Debugging And Monitoring (AREA)
Abstract
An automatic interface testing method comprises the following steps: creating a test project and an interface, and associating the test project with the interface; creating an interface case and associating the interface case with the interface; creating environment variables, storing the environment variables in a database as key-value pairs, adding a parameter extractor to a test case, and storing extracted interface return values in the database as environment variables; creating a test scenario and associating the test scenario with interface cases; and executing the interface cases to generate a test report. An automatic interface testing device comprises a project management module, an interface case management module, a scenario management module, an environment variable management module and a test report management module. The invention extracts the common parts of interface cases, automatically triggers script execution, quickly evaluates code quality, achieves broader test scenario coverage and enables test reports to be shared.
Description
Technical Field
The invention relates to the technical field of computer software testing, in particular to an automatic interface testing method and device.
Background
Throughout the software life cycle, a piece of software must be tested extensively before going online in order to improve its quality.
Postman and JMeter, which are commonly used in interface testing, are interface test client tools that generate test cases automatically from various models and frameworks and execute test programs automatically to obtain test results. Interfaces are generally tested by manually writing interface test scripts; however, the number of interfaces is usually large, and to achieve full coverage of interface testing, the amount of interface test code that test engineers need to write is often extremely large.
Meanwhile, because test case scripts are stored locally, test cases and test results cannot be shared effectively and team collaboration is impossible; when cases are combined to test a scenario, existing cases cannot be reused and the script development workload increases, so the labor cost of the whole process is high and the testing efficiency is low.
Therefore, it is desirable to provide an automatic interface testing method and apparatus that solve the above problems in the prior art.
Disclosure of Invention
The technical problem to be solved by the invention is to provide an automatic interface testing method and device that address the defects described in the background art, solving the problems of high labor cost and low testing efficiency caused by difficult team collaboration, a large amount of repeated work and the inability to share test reports in time during automatic interface testing.
The technical problem of the invention is solved by the following technical scheme:
An automatic interface testing method comprises the following steps:
S1: creating a test project and an interface, and associating the test project with the interface;
S2: creating one or more interface cases and associating each interface case with its interface;
an interface case includes, but is not limited to, a case id, a case name, interface message parameters and expected result information, and the system stores the interface case information in a database to form a test case library;
S3: creating one or more environment variables, each comprising an environment variable name and an environment variable value, storing the environment variables in a database as key-value pairs, adding a parameter extractor to a test case, extracting an interface return value and storing it in the database as an environment variable, so that other cases can reference it in their interface message parameters in the form of an environment variable;
S4: creating one or more test scenarios, the system storing the scenario id and name information in a database, and associating the test scenarios with interface cases;
S5: executing the interface cases and generating a test report through a preset case test report template.
Preferably, in step S1, creating the test project includes storing information including, but not limited to, a project name, a project id and an interface host in a database, and the test environment is switched by modifying the project host information.
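As a minimal sketch (not part of the patent disclosure), host-based environment switching could be realized as below; the table name, column names and example hosts are assumptions for illustration.

```python
import sqlite3

# Sketch of the project record described above: project id, name and interface host.
# Table and column names are illustrative assumptions, not the patented schema.
conn = sqlite3.connect("autotest.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS project (project_id INTEGER PRIMARY KEY, name TEXT, host TEXT)"
)
conn.execute(
    "INSERT OR REPLACE INTO project (project_id, name, host) VALUES (?, ?, ?)",
    (1, "order-service", "https://test.example.com"),
)
conn.commit()

def switch_environment(project_id: int, new_host: str) -> None:
    """Switching the test environment only requires updating the project's host."""
    conn.execute("UPDATE project SET host = ? WHERE project_id = ?", (new_host, project_id))
    conn.commit()

# Point every case of project 1 at a staging environment without touching the cases themselves.
switch_environment(1, "https://staging.example.com")
```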
Preferably, in step S1, the created interface includes, but is not limited to, the public information of the interface, and after the interface is created the associated project id and the public information of the interface are stored in the database.
Preferably, in step S3, when executing a test case the system traverses the message parameters of the case to check whether they contain an environment variable name; if so, it queries the corresponding environment variable value from the database by the environment variable name and replaces the environment variable name with that value, thereby implementing the business-logic association between cases.
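A minimal sketch of this substitution step, assuming environment variables are referenced in message parameters with a "${name}" placeholder (the placeholder syntax is an assumption; the text does not specify one):

```python
import re

# Environment variables stored as key-value pairs (in practice loaded from the database).
env_vars = {"token": "abc123", "order_id": "20220715"}

PLACEHOLDER = re.compile(r"\$\{(\w+)\}")  # assumed ${name} reference syntax

def resolve_params(message_params, env):
    """Traverse the case's message parameters and replace environment-variable
    names with their stored values, linking cases by business logic."""
    def substitute(value):
        if isinstance(value, str):
            return PLACEHOLDER.sub(lambda m: str(env.get(m.group(1), m.group(0))), value)
        if isinstance(value, dict):
            return {k: substitute(v) for k, v in value.items()}
        if isinstance(value, list):
            return [substitute(v) for v in value]
        return value
    return substitute(message_params)

params = {"headers": {"Authorization": "Bearer ${token}"}, "body": {"orderId": "${order_id}"}}
print(resolve_params(params, env_vars))
# {'headers': {'Authorization': 'Bearer abc123'}, 'body': {'orderId': '20220715'}}
```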
Preferably, in step S4, the interface cases in the case library can be associated with multiple scenarios, so that test cases are reused.
Preferably, associating the test scenario with interface cases includes:
querying target cases in the case library through a preset system method within the test scenario, adding an execution sequence number to each test case, and storing the scenario id, case id and execution sequence number in a database, thereby associating the test scenario with the interface cases; by associating a plurality of interface test cases and ordering them by case sequence number within the test scenario, scenario business orchestration is achieved and a scenario business flow is formed.
Preferably, executing an interface case includes:
the system queries the case name, interface message parameters and expected result information by the interface case id, then queries the id, url and headers information of the interface corresponding to the case, and queries the project host information corresponding to the interface by the interface id;
the information queried for the interface case is combined to form the complete interface request information, and an interface request is sent to the corresponding server;
the system obtains the interface return information and compares it with the expected result in the interface case; if they match, the case is judged to have executed successfully, otherwise the execution is judged to have failed;
the system stores the complete interface request information and the execution result information of the interface case in a database, retrieves the data from the database, and generates a test report through the system's preset case test report template.
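A sketch of this execution flow using the requests library; the record field names, the default HTTP method and the simple equality comparison with the expected result are assumptions rather than the patented implementation.

```python
import requests

def execute_case(case: dict, interface: dict, project: dict) -> dict:
    """Assemble the full request from the case, interface and project records,
    send it, and compare the response with the expected result."""
    url = project["host"] + interface["url"]            # project host + interface path
    response = requests.request(
        method=interface.get("method", "POST"),
        url=url,
        headers=interface.get("headers", {}),
        json=case.get("message_params", {}),
        timeout=10,
    )
    actual = response.json()
    passed = actual == case["expected_result"]           # simple equality comparison (assumed)
    return {
        "case_id": case["case_id"],
        "request": {"url": url, "body": case.get("message_params")},
        "response": actual,
        "passed": passed,                                 # stored to the database in practice
    }
```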
Preferably, in step S5, an execution operation can also be performed on an interface: all interface cases associated with the interface are queried by the interface id and sorted by test case id; the system then assembles the case messages through a preset method in the resulting order, executes all cases in sequence, stores the case execution information in a database, calculates the pass rate from the case execution results and generates a test report;
in step S5, an execution operation can also be performed on a created project: all interfaces associated with the project are queried by the project id, the interface test cases are queried by the interface id, and all cases under the project are sorted by case id; the system then assembles the case messages in that order through the aforementioned preset method and executes all cases in sequence; the system stores all case execution information in a database, calculates the pass rate from the case execution results and generates a test report;
in step S5, an execution operation can also be performed on a created scenario: all interface test case information and case execution numbers associated with the scenario are queried by the scenario id, all cases are sorted by their execution numbers, the case messages are assembled in that order and all cases are executed in sequence; the system stores all case execution information in the database, thereby implementing the interface business scenario test, calculates the pass rate from the case execution results and generates a test report.
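Whether execution is triggered at the interface, project or scenario level, the pass-rate statistic for the report reduces to the same computation; a short sketch follows (the shape of the result records is an assumption).

```python
def pass_rate(results) -> float:
    """Pass rate = passed cases / executed cases, used for the test report
    at interface, project or scenario level alike."""
    if not results:
        return 0.0
    passed = sum(1 for r in results if r["passed"])
    return passed / len(results)

results = [{"case_id": 11, "passed": True}, {"case_id": 12, "passed": False}]
print(f"pass rate: {pass_rate(results):.0%}")  # pass rate: 50%
```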
According to another aspect of the embodiments of the present invention, an automatic interface testing apparatus is provided, which comprises a project management module, an interface case management module, a scenario management module, an environment variable management module and a test report management module, and implements modular management of interface cases.
Preferably, in the automatic interface testing apparatus, the project management module is used for managing the created test projects, storing the host information of interface requests, and storing the created test projects in a project library;
the interface management module is used for storing and managing the created interface information and storing the created interfaces in an interface library;
the interface case management module is used for storing case message parameters and expected result information and storing interface cases in a case library;
the scenario management module is used for case orchestration services and stores the created scenarios in a scenario pool;
the environment variable management module is used for storing the message-parameter data passed between cases;
the test report management module is used for storing and displaying case execution records at different levels and for statistically displaying the case execution success rate.
According to the invention, the common parts of interface cases are extracted and script execution is triggered automatically, so that no manual script writing is needed, testers' requirements are met more flexibly, and the efficiency of writing automated test cases is improved;
by flexibly orchestrating test scenarios, code quality is evaluated quickly, broader test scenario coverage is achieved, and test cases are executed at different levels, enriching the dimensions of the test report;
users can view case-level, interface-level, project-level and test-scenario-level test reports in real time, by category, in the test report management of the automated test platform, so that test reports are shared and verification is powerful, flexible and convenient.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a flowchart illustrating an automatic interface testing method according to an embodiment.
Detailed Description
The technical solution of the present invention is further described in detail with reference to the accompanying drawings. It is understood that the embodiments described are only a few embodiments, not all embodiments, and that all other embodiments obtained by those skilled in the art without the use of inventive faculty are within the scope of the invention.
It will be understood that, although the terms first, second, third, etc. may be used herein to describe various elements, components and/or sections, these elements, components and/or sections should not be limited by these terms.
In one embodiment, as shown in Fig. 1, an automatic interface testing method includes the following steps:
S1: creating a test project and an interface on the automated test platform, and associating the test project with the interface;
S2: creating one or more interface cases on the automated test platform, and associating each interface case with its interface;
an interface case includes, but is not limited to, a case id, a case name, interface message parameters and expected result information, and the system stores the interface case information in a database to form a test case library;
S3: creating one or more environment variables, each comprising an environment variable name and an environment variable value, storing the environment variables in a database as key-value pairs, adding a parameter extractor to a test case, extracting an interface return value and storing it in the database as an environment variable, so that other cases can reference it in their interface message parameters in the form of an environment variable;
S4: creating one or more test scenarios, the system storing the scenario id and name information in a database, and associating the test scenarios with the interface cases;
S5: executing the interface cases and generating a test report through a preset case test report template.
Preferably, in step S1, creating the test project includes storing information including, but not limited to, a project name, a project id and an interface host in a database; at a later stage the cases can switch test environments simply by modifying the project host information.
After a test report is generated, users can view case-level, interface-level, project-level and test-scenario-level test reports in real time, by category, in the test report management of the automated test platform, so that test reports are shared.
In one embodiment, in step S1, the created interface includes, but is not limited to, the public information of the interface; after the interface is created, the associated project id and the public information of the interface are stored in the database, which reduces the repeated data entry when test cases for the interface are created later.
In one embodiment, in step S3, when executing a test case the system traverses the message parameters of the case to check whether they contain an environment variable name; if so, it queries the corresponding environment variable value from the database by the environment variable name and replaces the environment variable name with that value, thereby implementing the business-logic association between cases.
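The complementary half of step S3 is the parameter extractor, which pulls a value out of an interface's return data and saves it as an environment variable for later cases. A possible sketch follows; the dotted-path extraction expression and the example response are assumptions.

```python
def extract_parameter(response_json: dict, json_path: str):
    """Walk a dotted path such as 'data.token' through the interface return value."""
    value = response_json
    for key in json_path.split("."):
        value = value[key]
    return value

env_vars = {}

def save_as_env_var(name: str, response_json: dict, json_path: str) -> None:
    """Store the extracted return value as an environment variable (key-value pair),
    so that subsequent cases can reference it in their message parameters."""
    env_vars[name] = extract_parameter(response_json, json_path)

login_response = {"code": 0, "data": {"token": "abc123"}}
save_as_env_var("token", login_response, "data.token")
print(env_vars)  # {'token': 'abc123'}
```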
In one embodiment, in step S4, the interface cases in the case library can be associated with multiple scenarios, so that test cases are reused.
By providing a query method, users can query all created interface test cases in the case library, so interface test cases are shared on the platform and the team collaboration problem is solved.
In one embodiment, associating the test scenario with interface cases includes:
querying target cases in the case library through a preset system method within the test scenario, adding an execution sequence number to each test case, and storing the scenario id, case id and execution sequence number in a database, thereby associating the test scenario with the interface cases; by associating a plurality of interface test cases and ordering them by case sequence number within the test scenario, scenario business orchestration is achieved and a scenario business flow is formed.
In one embodiment, executing an interface case includes:
the system queries the case name, interface message parameters and expected result information by the interface case id, then queries the id, url and headers information of the interface corresponding to the case, and queries the project host information corresponding to the interface by the interface id;
the information queried for the interface case is combined to form the complete interface request information, and an interface request is sent to the corresponding server;
the system obtains the interface return information and compares it with the expected result in the interface case; if they match, the case is judged to have executed successfully, otherwise the execution is judged to have failed;
the system stores the complete interface request information and the execution result information of the interface case in a database, retrieves the data from the database, and generates a test report through the system's preset case test report template.
In one embodiment, an execution operation can also be performed on an interface: all interface cases associated with the interface are queried by the interface id and sorted by test case id; the system then assembles the case messages through a preset method in the resulting order, executes all cases in sequence, stores the case execution information in a database, calculates the pass rate from the case execution results and generates a test report;
an execution operation can also be performed on a created project: all interfaces associated with the project are queried by the project id, the interface test cases are queried by the interface id, and all cases under the project are sorted by case id; the system then assembles the case messages in that order through the aforementioned preset method and executes all cases in sequence; the system stores all case execution information in a database, calculates the pass rate from the case execution results and generates a test report;
an execution operation can also be performed on a created scenario: all interface test case information and case execution numbers associated with the scenario are queried by the scenario id, all cases are sorted by their execution numbers, the case messages are assembled in that order and all cases are executed in sequence; the system stores all case execution information in the database, thereby implementing the interface business scenario test, calculates the pass rate from the case execution results and generates a test report.
An html-format test report is generated through the preset case test report template for users to view on the platform.
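One possible rendering of such an HTML report, sketched here with Python's built-in string.Template; the template markup, field names and output file are assumptions, not the platform's actual preset template.

```python
from string import Template

REPORT_TEMPLATE = Template("""<html><body>
<h1>Test Report</h1>
<p>Executed: $total &nbsp; Passed: $passed &nbsp; Pass rate: $rate</p>
<table border="1">
<tr><th>Case id</th><th>Result</th></tr>
$rows
</table>
</body></html>""")

def render_report(results) -> str:
    # Build one table row per executed case and fill in the summary statistics.
    rows = "\n".join(
        f"<tr><td>{r['case_id']}</td><td>{'pass' if r['passed'] else 'fail'}</td></tr>"
        for r in results
    )
    passed = sum(1 for r in results if r["passed"])
    return REPORT_TEMPLATE.substitute(
        total=len(results),
        passed=passed,
        rate=f"{passed / len(results):.0%}" if results else "n/a",
        rows=rows,
    )

html = render_report([{"case_id": 11, "passed": True}, {"case_id": 12, "passed": False}])
with open("report.html", "w", encoding="utf-8") as f:
    f.write(html)
```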
According to another aspect of the embodiments of the invention, an automatic interface testing device is provided, which comprises a project management module, an interface case management module, a scenario management module, an environment variable management module and a test report management module, and implements modular management of interface cases;
the project management module is used for managing the created test projects, storing the host information of interface requests, and storing the created test projects in a project library;
the interface management module is used for storing and managing the created interface information and storing the created interfaces in an interface library;
the interface case management module is used for storing case message parameters and expected result information and storing interface cases in a case library;
the scenario management module is used for case orchestration services and stores the created scenarios in a scenario pool;
the environment variable management module is used for storing the message-parameter data passed between cases;
the test report management module is used for storing and displaying case execution records at different levels and for statistically displaying the case execution success rate.
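Purely as an illustration of this modular split (the class and attribute names mirror the text but are otherwise assumptions), the device could be organized as follows:

```python
from dataclasses import dataclass, field

@dataclass
class ProjectManagement:               # project library: projects and their request hosts
    projects: dict = field(default_factory=dict)

@dataclass
class InterfaceCaseManagement:         # case library: message parameters and expected results
    cases: dict = field(default_factory=dict)

@dataclass
class SceneManagement:                 # scenario pool: orchestrated case sequences
    scenes: dict = field(default_factory=dict)

@dataclass
class EnvironmentVariableManagement:   # key-value data passed between cases
    variables: dict = field(default_factory=dict)

@dataclass
class TestReportManagement:            # execution records and success-rate statistics
    records: list = field(default_factory=list)

@dataclass
class InterfaceTestDevice:
    """The cooperating modules of the testing apparatus."""
    projects: ProjectManagement = field(default_factory=ProjectManagement)
    cases: InterfaceCaseManagement = field(default_factory=InterfaceCaseManagement)
    scenes: SceneManagement = field(default_factory=SceneManagement)
    env: EnvironmentVariableManagement = field(default_factory=EnvironmentVariableManagement)
    reports: TestReportManagement = field(default_factory=TestReportManagement)
```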
According to the invention, the common parts of interface cases are extracted and script execution is triggered automatically, so that no manual script writing is needed, testers' requirements are met more flexibly, and the efficiency of writing automated test cases is improved;
by flexibly orchestrating test scenarios, code quality is evaluated quickly, broader test scenario coverage is achieved, and test cases are executed at different levels, enriching the dimensions of the test report;
users can view case-level, interface-level, project-level and test-scenario-level test reports in real time, by category, in the test report management of the automated test platform, so that test reports are shared and verification is powerful, flexible and convenient.
Those skilled in the art will appreciate that the functionality described in the present invention may be implemented by a combination of hardware and software in one or more of the examples described above. When implemented in software, the corresponding functionality may be stored on, or transmitted over, a computer-readable medium as one or more instructions or code. Computer-readable media include both computer storage media and communication media, including any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a general-purpose or special-purpose computer.
The preferred embodiments of the present specification disclosed above are intended only to aid in the description of the specification. Alternative embodiments are not exhaustive and do not limit the invention to the precise embodiments described. Obviously, many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the specification and its practical application, to thereby enable others skilled in the art to best understand the specification and its practical application. The specification is limited only by the claims and their full scope and equivalents.
Claims (10)
1. An automatic interface testing method, characterized by comprising the following steps:
S1: creating a test project and an interface, and associating the test project with the interface;
S2: creating one or more interface cases and associating each interface case with its interface;
wherein an interface case includes, but is not limited to, a case id, a case name, interface message parameters and expected result information, and the system stores the interface case information in a database to form a test case library;
S3: creating one or more environment variables, each comprising an environment variable name and an environment variable value, storing the environment variables in a database as key-value pairs, adding a parameter extractor to a test case, extracting an interface return value and storing it in the database as an environment variable, so that other cases can reference it in their interface message parameters in the form of an environment variable;
S4: creating one or more test scenarios, the system storing the scenario id and name information in a database, and associating the test scenarios with interface cases;
S5: executing the interface cases and generating a test report through a preset case test report template.
2. The automatic interface testing method according to claim 1, wherein in step S1, creating the test project comprises storing information including, but not limited to, a project name, a project id and an interface host in a database, and the test environment is switched by modifying the project host information.
3. The automatic interface testing method according to claim 1 or 2, wherein in step S1, the created interface includes, but is not limited to, the public information of the interface, and after the interface is created the associated project id and the public information of the interface are stored in the database.
4. The automatic interface testing method according to claim 1, wherein in step S3, when executing a test case the system traverses the message parameters of the case to check whether they contain an environment variable name, and if so, queries the corresponding environment variable value from the database by the environment variable name and replaces the environment variable name with that value, thereby implementing the business-logic association between cases.
5. The automatic interface testing method according to claim 1, wherein in step S4, the interface cases in the case library can be associated with multiple scenarios, so that test cases are reused.
6. The automatic interface testing method according to claim 1, wherein associating the test scenario with interface cases comprises:
querying target cases in the case library through a preset system method within the test scenario, adding an execution sequence number to each test case, and storing the scenario id, case id and execution sequence number in a database, thereby associating the test scenario with the interface cases; by associating a plurality of interface test cases and ordering them by case sequence number within the test scenario, scenario business orchestration is achieved and a scenario business flow is formed.
7. The automatic interface testing method according to claim 1, wherein executing the interface case comprises:
the system queries the case name, interface message parameters and expected result information by the interface case id, then queries the id, url and headers information of the interface corresponding to the case, and queries the project host information corresponding to the interface by the interface id;
the information queried for the interface case is combined to form the complete interface request information, and an interface request is sent to the corresponding server;
the system obtains the interface return information and compares it with the expected result in the interface case; if they match, the case is judged to have executed successfully, otherwise the execution is judged to have failed;
the system stores the complete interface request information and the execution result information of the interface case in a database, retrieves the data from the database, and generates a test report through the system's preset case test report template.
8. The automatic interface testing method according to claim 7, wherein in step S5, an execution operation can also be performed on an interface: all interface cases associated with the interface are queried by the interface id and sorted by test case id; the system then assembles the case messages through a preset method in the resulting order, executes all cases in sequence, stores the case execution information in a database, calculates the pass rate from the case execution results and generates a test report;
in step S5, an execution operation can also be performed on a created project: all interfaces associated with the project are queried by the project id, the interface test cases are queried by the interface id, and all cases under the project are sorted by case id; the system then assembles the case messages in that order through the aforementioned preset method and executes all cases in sequence; the system stores all case execution information in a database, calculates the pass rate from the case execution results and generates a test report;
in step S5, an execution operation can also be performed on a created scenario: all interface test case information and case execution numbers associated with the scenario are queried by the scenario id, all cases are sorted by their execution numbers, the case messages are assembled in that order and all cases are executed in sequence; the system stores all case execution information in the database, thereby implementing the interface business scenario test, calculates the pass rate from the case execution results and generates a test report.
9. An automatic interface testing device, characterized by comprising a project management module, an interface case management module, a scenario management module, an environment variable management module and a test report management module, which implement modular management of interface cases.
10. The automatic interface testing device according to claim 9, wherein the project management module is configured to manage the created test projects, store the host information of interface requests, and store the created test projects in a project library;
the interface management module is configured to store and manage the created interface information and store the created interfaces in an interface library;
the interface case management module is configured to store case message parameters and expected result information and store interface cases in a case library;
the scenario management module is configured for case orchestration services and stores the created scenarios in a scenario pool;
the environment variable management module is configured to store the message-parameter data passed between cases;
the test report management module is configured to store and display case execution records at different levels and to statistically display the case execution success rate.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202210347024 | 2022-04-01 | | |
| CN2022103470249 | 2022-04-01 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN115269387A | 2022-11-01 |
Family
ID=83766230
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202210833161.3A (CN115269387A, Pending) | Automatic interface testing method and device | 2022-04-01 | 2022-07-15 |
Country Status (1)
| Country | Link |
|---|---|
| CN | CN115269387A |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN117909246A | 2024-03-06 | 2024-04-19 | 广东保伦电子股份有限公司 | Automatic testing method and platform for front-end webpage interface |
- 2022-07-15: application CN202210833161.3A filed in CN (published as CN115269387A), status Pending
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107908541B (en) | Interface testing method and device, computer equipment and storage medium | |
US5903897A (en) | Software documentation release control system | |
EP2778929B1 (en) | Test script generation system | |
CN112506807B (en) | Automatic test system for interface serving multiple systems | |
US20110113287A1 (en) | System for Automated Generation of Computer Test Procedures | |
CN112597014B (en) | Automatic test method and device based on data driving, medium and electronic equipment | |
CN110716870B (en) | Automatic service testing method and device | |
US20080120602A1 (en) | Test Automation for Business Applications | |
CN112631846B (en) | Fault drilling method and device, computer equipment and storage medium | |
US7328134B1 (en) | Enterprise integration test tool | |
US20090006493A1 (en) | Method For Enabling Traceability And Recovery From Errors During Migration Of Software Applications | |
CN102571403A (en) | Realization method and device for general data quality control adapter | |
US11086765B2 (en) | Test reuse exchange and automation system and method | |
CN111597104B (en) | Multi-protocol adaptive interface regression testing method, system, equipment and medium | |
CN112650688A (en) | Automated regression testing method, associated device and computer program product | |
CN108647147B (en) | Automatic testing robot implemented by using atlas analysis and use method thereof | |
CN112115058A (en) | Test method and device, test case generation method and device and test system | |
US9823999B2 (en) | Program lifecycle testing | |
CN112433948A (en) | Simulation test system and method based on network data analysis | |
CN112799782B (en) | Model generation system, method, electronic device and storage medium | |
CN107506294A (en) | Visualize automated testing method, device, storage medium and computer equipment | |
CN112882927A (en) | Interface automatic testing method, device, equipment and medium | |
EP2913757A1 (en) | Method, system, and computer software product for test automation | |
CN115269387A (en) | Automatic interface testing method and device | |
CN114297961A (en) | Chip test case processing method and related device |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |