
CN117290255A - Batch interface performance test method based on Python and Locust frameworks - Google Patents

Batch interface performance test method based on Python and Locust frameworks

Info

Publication number
CN117290255A
Authority
CN
China
Prior art keywords
test
testing
user
steps
creating
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311576655.9A
Other languages
Chinese (zh)
Inventor
欧庆伟
庞志斌
刘斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin Hualai Technology Co Ltd
Original Assignee
Tianjin Hualai Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin Hualai Technology Co Ltd
Priority to CN202311576655.9A
Publication of CN117290255A
Legal status: Pending (current)

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/362Software debugging
    • G06F11/3644Software debugging by instrumenting at runtime
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3664Environments for testing or debugging software
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3684Test management for test design, e.g. generating new test cases
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3696Methods or tools to render software testable

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention provides a batch interface performance test method based on the Python and Locust frameworks, comprising the following steps: creating a simulated running environment for testing by initializing data and simulating a real workflow; configuring test parameters and setting a test strategy according to those parameters; creating a corresponding test function for each interface under test, tagging the test functions with a decorator, and setting the number of runs; setting up the simulated users' test operations and configuring user behavior; creating a start command that specifies the test functions to run and the storage location of the test report; and performing post-test processing after the test run finishes. The beneficial effect of the invention is that all interfaces of a project can be performance-tested in one pass with a unified test report, which significantly improves test efficiency, reduces time cost, and satisfies complex performance-test requirements.

Description

Batch interface performance test method based on Python and Locust frameworks
Technical Field
The invention belongs to the technical field of computers, and particularly relates to a batch interface performance test method based on Python and Locust frameworks.
Background
Locust is an open-source performance testing tool, mainly used to simulate large numbers of users placing load on an application, system, or network; because Locust is open source, users can use it freely and customize test schemes and test report output.
Performance testing plays an important role in software development. It is mainly used to ensure that a system runs stably and efficiently under different load conditions so as to provide a good user experience; it also aids resource planning and supplies reliable data to support future business strategy.
Locust is widely adopted in the performance-testing field. However, Locust is designed primarily around simulated users, and it cannot batch-test specific interfaces. For example, suppose a project has 100 interfaces to test, each interface requires 50 concurrent users per second, and a single interface is tested for ten minutes: the existing Locust workflow can only run the interfaces one after another, or test all 100 interfaces at the same time. Clearly, the existing approach demands substantial labor and time and cannot satisfy complex performance-test requirements.
Disclosure of Invention
In view of the foregoing, the present invention provides a batch interface performance testing method based on the Python and Locust frameworks, so as to solve at least one of the technical problems described above.
In order to achieve the above purpose, the technical scheme of the invention is realized as follows:
a batch interface performance test method based on Python and Locust frameworks comprises the following steps:
creating a simulated running environment for test by initializing data and simulating a real flow;
configuring relevant parameters for testing and setting a testing strategy according to the relevant parameters for testing;
creating a corresponding test function for an interface to be tested, adding a label for the test function by using a decorator, and setting operation times;
setting related operation of a simulation user for testing and configuring user behaviors;
creating a starting command, and designating a test function to be run and a storage address of a test report;
and performing post-test treatment after the test operation is finished.
Further, the process of using the decorator to tag the test functions and set the number of runs includes:
adding tags to all test functions with a decorator, where each test function may carry several tags, and selecting a tag designates which test functions run;
and setting a run-weight parameter on the test functions of all interfaces under test with the decorator, the number of runs of each test function being set according to that parameter.
Further, the process of creating the simulated running environment for testing by initializing data and simulating a real workflow includes:
initializing global variables to provide shared basic data for subsequent operations;
performing user registration and obtaining the corresponding Token, simulating a real user's interaction and providing identity authentication for the simulated user;
initializing device information, providing device attributes for device operations and simulating real devices;
simulating a user binding the test devices and obtaining the corresponding device Token at each binding;
and setting request header information to provide the header data needed by subsequent interface requests.
Further, the process of configuring the test parameters and setting the test strategy according to them includes:
extracting the test duration, user count, and spawn rate, parameterizing them, and configuring them through a configuration file;
and dividing the whole test into two stages with a different strategy for each: a warm-up during the start-up stage, and the interface checks during the main test stage.
Further, the process of creating a corresponding test function for each interface under test includes:
traversing the interfaces under test and creating a test function for each, where the test function simulates a device alarm operation;
and the device alarm operation comprises generating a request identifier, building the request, sending the request, and verifying the response.
Further, the process of setting up the simulated users' test operations and configuring user behavior includes:
printing user start/stop logs to record user state changes during the test;
setting the user wait time to simulate the interval between user task executions;
specifying the test task class the user executes;
and specifying the target address of the user's test requests.
Further, the process of creating a start command that specifies the test functions to run and the storage location of the test report includes:
naming the report file after the test-function selection scheme used in the run;
executing the Locust test program through a subprocess call to run the interface tests;
and compressing the report folder into a ZIP file and moving it to the designated storage location.
Further, the process of performing post-test processing after the test run finishes includes:
logging the failed, timed-out, and abnormal requests observed during the test;
batch-deleting the newly created data to clean up the test environment;
and printing an end-of-test notice on the console.
Compared with the prior art, the batch interface performance testing method based on the Python and Locust frameworks has the following beneficial effects:
all interfaces of a project can be performance-tested in one pass and a unified test report generated, which significantly improves test efficiency, reduces time cost, and satisfies complex performance-test requirements.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the invention. In the drawings:
Fig. 1 is a flow chart of the batch interface performance testing method based on the Python and Locust frameworks according to an embodiment of the present invention.
Detailed Description
It should be noted that, without conflict, the embodiments of the present invention and features of the embodiments may be combined with each other.
The invention will be described in detail below with reference to the drawings in connection with embodiments.
A batch interface performance test method based on the Python and Locust frameworks comprises the following steps:
S1, creating a simulated running environment for testing by initializing data and simulating a real workflow;
S2, configuring test parameters and setting a test strategy according to those parameters;
S3, creating a corresponding test function for each interface under test, tagging the test functions with a decorator, and setting the number of runs;
S4, setting up the simulated users' test operations and configuring user behavior;
S5, creating a start command that specifies the test functions to run and the storage location of the test report;
S6, performing post-test processing after the test run finishes.
In step S1, the process of creating the simulated running environment for testing by initializing data and simulating a real workflow includes:
S11, initializing global variables to provide shared basic data for subsequent operations, the basic data including device information, user information, and the like;
S12, performing user registration and obtaining the corresponding Token, simulating a real user's interaction and providing identity authentication for the simulated user;
S13, initializing device information, providing device attributes such as MAC address and model for device operations, and simulating real devices;
S14, simulating a user binding the test devices and obtaining the corresponding device Token at each binding;
S15, setting request header information to provide the header data needed by subsequent interface requests.
Initializing data and simulating real workflows such as registration and device binding creates a realistic running environment for the subsequent load-test tasks, provides the necessary input parameters and simulated conditions, and ensures that the test is realistic.
This completes the basic preparation before performance testing and gives every subsequent test task a uniform operating foundation; decoupling the environment preparation from the test tasks makes the test work easier to run and extend independently. A minimal sketch of this preparation phase follows.
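In the Python sketch below, the endpoint paths, payload fields, and helper names are illustrative assumptions and are not taken from the patent; only the overall flow (register a user, initialize devices, bind them, share headers) follows steps S11-S15.

    import uuid
    import requests

    BASE_URL = "https://api.example.com"            # assumed test target
    HEADERS = {"Content-Type": "application/json"}  # S15: shared request headers

    def register_user(name, password):
        # S12: register a simulated user and obtain its auth Token.
        resp = requests.post(f"{BASE_URL}/register",
                             json={"name": name, "password": password})
        resp.raise_for_status()
        return resp.json()["token"]

    def init_device():
        # S13: fabricate device attributes such as MAC address and model.
        return {"mac": uuid.uuid4().hex[:12], "model": "TEST-MODEL-01"}

    def bind_device(user_token, device):
        # S14: bind the device to the user and obtain the device Token.
        resp = requests.post(f"{BASE_URL}/bind", json=device,
                             headers={"Authorization": user_token, **HEADERS})
        resp.raise_for_status()
        return resp.json()["device_token"]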
The process in step S2 of configuring the test parameters and setting the test strategy according to them includes:
S21, extracting the test duration, user count, and spawn rate, parameterizing them, and configuring them through a configuration file;
S22, dividing the whole test into two stages with a different strategy for each: a warm-up during the start-up stage, and the interface checks during the main test stage.
The test strategy is as follows:
Start-up (warm-up) stage:
the time parameter is step_time, set to 10 seconds, i.e. a short period;
users is set to 1, i.e. a single concurrent user;
spawn_rate is set to 1, i.e. one user is spawned/stopped per second.
Main test stage:
the time parameter is time_limit, configured by the user, i.e. the total test duration;
the users parameter is number_of_user, the number of concurrent users in this stage, configured by the user;
spawn_rate is set to 10, i.e. 10 users are spawned/stopped per second.
Parameterizing the test strategy through configuration makes it easier to adjust, and staging the test process, with a simple warm-up in the first stage and the main test in the second, each with its own strategy, simulates an actual test scenario.
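One possible way to read the two-stage parameters from a configuration file is sketched below; the key names mirror those in the text (step_time, users, spawn_rate, time_limit, number_of_user), while the file layout and section names are assumptions.

    import configparser

    config = configparser.ConfigParser()
    config.read("locust_test.ini")  # assumed configuration file name

    # Warm-up stage: short duration, a single user.
    warmup = {
        "run_time": config.getint("warmup", "step_time", fallback=10),
        "users": config.getint("warmup", "users", fallback=1),
        "spawn_rate": config.getint("warmup", "spawn_rate", fallback=1),
    }

    # Main stage: duration and concurrency supplied by the user's configuration.
    main = {
        "run_time": config.getint("main", "time_limit", fallback=600),
        "users": config.getint("main", "number_of_user", fallback=50),
        "spawn_rate": config.getint("main", "spawn_rate", fallback=10),
    }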
The process in step S3 of creating a corresponding test function for each interface under test includes:
S31, traversing the interfaces under test and creating a test function for each, where the test function simulates a device alarm operation;
S32, the device alarm operation comprises generating a request identifier, building the request, sending the request, and verifying the response.
The process in step S3 of tagging the test functions with the decorator and setting the number of runs includes:
S33, adding tags to all test functions with a decorator, where each test function may carry several tags, and selecting a tag designates which test functions run;
Specifically, the same tag may be added to multiple performance test functions, e.g. @tag('all'), meaning those functions all belong to the same tag group; at run time, functions are selected by tag, so specifying the tag 'all' executes only the functions tagged 'all';
In addition, multiple tags may be added to particular performance test functions, e.g. @tag('top', 'all'), meaning those functions belong to both the 'top' and 'all' tag groups; at run time, specifying the tag 'top' runs only the most important interfaces;
With the @tag decorator, performance test functions can be organized and filtered by tag and specific functions run selectively and flexibly, which eases performance-test management and scheduling.
S34, setting a run-weight parameter on the test functions of all interfaces under test with the decorator, the number of runs of each test function being set according to that parameter;
For example, a parameter poll_time is set and assigned 200, and @task(poll_time) is applied to every interface performance test function, indicating that each performance test function runs 200 times.
The process in step S4 of setting up the simulated users' test operations and configuring user behavior includes:
S41, printing user start/stop logs to record user state changes during the test;
S42, setting the user wait time to simulate the interval between user task executions;
Setting wait_time = constant(1), where constant(1) means the wait time is a constant, i.e. 1 second after each task execution, which makes it possible to run the interfaces in batches;
For example, to test 100 interfaces with 50 concurrent users per second on each: when interface 1 is tested, 50 users start sending requests to it and the process is repeated 200 times; after that test completes, the run waits for all user requests to finish before starting on interface 2.
S43, specifying the test task class the user executes;
S44, specifying the target address of the user's test requests.
The process in step S5 of creating a start command that specifies the test functions to run and the storage location of the test report includes:
S51, naming the report file after the test-function selection scheme used in the run (for example, 'all' when all interfaces are tested, 'top' when only the top interfaces are tested, and the interface's own name when a single interface is tested);
S52, executing the Locust test program through a subprocess call to run the interface tests;
S53, compressing the report folder into a ZIP file and moving it to the designated storage location.
The process in step S6 of performing post-test processing after the test run includes:
S61, logging the failed, timed-out, and abnormal requests observed during the test;
S62, batch-deleting the newly created data to clean up the test environment;
S63, printing an end-of-test notice on the console.
Those of ordinary skill in the art will appreciate that the elements and method steps of each example described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the elements and steps of each example have been described generally in terms of functionality in the foregoing description to clearly illustrate this interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the several embodiments provided in this application, it should be understood that the disclosed methods and systems may be implemented in other ways. For example, the above-described division of units is merely a logical function division, and there may be another division manner when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted or not performed. The units may or may not be physically separate, and components shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the embodiment of the present invention.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present invention, and not for limiting the same; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the invention, and are intended to be included within the scope of the appended claims and description.
The foregoing description of the preferred embodiments of the invention is not intended to be limiting, but rather is intended to cover all modifications, equivalents, alternatives, and improvements that fall within the spirit and scope of the invention.

Claims (8)

1. A batch interface performance test method based on the Python and Locust frameworks, characterized by comprising the following steps:
creating a simulated running environment for testing by initializing data and simulating a real workflow;
configuring test parameters and setting a test strategy according to those parameters;
creating a corresponding test function for each interface under test, tagging the test functions with a decorator, and setting the number of runs;
setting up the simulated users' test operations and configuring user behavior;
creating a start command that specifies the test functions to run and the storage location of the test report;
and performing post-test processing after the test run finishes.
2. The batch interface performance testing method based on the Python and Locust frameworks as claimed in claim 1, wherein:
the process of using the decorator to tag the test functions and set the number of runs includes:
adding tags to all test functions with a decorator, where each test function may carry several tags, and selecting a tag designates which test functions run;
and setting a run-weight parameter on the test functions of all interfaces under test with the decorator, the number of runs of each test function being set according to that parameter.
3. The batch interface performance testing method based on the Python and Locust frameworks as claimed in claim 1, wherein:
the process of creating the simulated running environment for testing by initializing data and simulating a real workflow includes:
initializing global variables to provide shared basic data for subsequent operations;
performing user registration and obtaining the corresponding Token, simulating a real user's interaction and providing identity authentication for the simulated user;
initializing device information, providing device attributes for device operations and simulating real devices;
simulating a user binding the test devices and obtaining the corresponding device Token at each binding;
and setting request header information to provide the header data needed by subsequent interface requests.
4. The batch interface performance testing method based on the Python and Locust frameworks as claimed in claim 1, wherein:
the process of configuring the test parameters and setting the test strategy according to them includes:
extracting the test duration, user count, and spawn rate, parameterizing them, and configuring them through a configuration file;
and dividing the whole test into two stages with a different strategy for each: a warm-up during the start-up stage, and the interface checks during the main test stage.
5. The batch interface performance testing method based on the Python and Locust frameworks as claimed in claim 1, wherein:
the process of creating a corresponding test function for each interface under test includes:
traversing the interfaces under test and creating a test function for each, where the test function simulates a device alarm operation;
and the device alarm operation comprises generating a request identifier, building the request, sending the request, and verifying the response.
6. The batch interface performance testing method based on the Python and Locust frameworks as claimed in claim 1, wherein:
the process of setting up the simulated users' test operations and configuring user behavior includes:
printing user start/stop logs to record user state changes during the test;
setting the user wait time to simulate the interval between user task executions;
specifying the test task class the user executes;
and specifying the target address of the user's test requests.
7. The batch interface performance testing method based on the Python and Locust frameworks as claimed in claim 1, wherein:
the process of creating a start command that specifies the test functions to run and the storage location of the test report includes:
naming the report file after the test-function selection scheme used in the run;
executing the Locust test program through a subprocess call to run the interface tests;
and compressing the report folder into a ZIP file and moving it to the designated storage location.
8. The batch interface performance testing method based on the Python and Locust frameworks as claimed in claim 1, wherein:
the process of performing post-test processing after the test run finishes includes:
logging the failed, timed-out, and abnormal requests observed during the test;
batch-deleting the newly created data to clean up the test environment;
and printing an end-of-test notice on the console.
CN202311576655.9A 2023-11-24 2023-11-24 Batch interface performance test method based on Python and Locust frameworks Pending CN117290255A

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311576655.9A 2023-11-24 2023-11-24 Batch interface performance test method based on Python and Locust frameworks

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311576655.9A 2023-11-24 2023-11-24 Batch interface performance test method based on Python and Locust frameworks

Publications (1)

Publication Number Publication Date
CN117290255A 2023-12-26

Family

ID=89252050

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311576655.9A Pending CN117290255A 2023-11-24 2023-11-24 Batch interface performance test method based on Python and Locust frameworks

Country Status (1)

Country Link
CN (1) CN117290255A (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111444111A (en) * 2020-04-15 2020-07-24 深圳市万睿智能科技有限公司 Performance testing method and device based on python, computer equipment and storage medium
CN114138670A (en) * 2021-12-10 2022-03-04 四川启睿克科技有限公司 Method based on interface automation test and function, performance and safety test fusion
CN116302910A (en) * 2021-12-21 2023-06-23 北京奇虎科技有限公司 Use case retry method, device, equipment and storage medium
CN114374632A (en) * 2022-01-10 2022-04-19 北京中电兴发科技有限公司 Internet of things data platform multi-protocol test efficiency improvement method
CN116820908A (en) * 2023-06-28 2023-09-29 深圳复临科技有限公司 Locust-based performance test method, device, equipment and medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
徐西宁: Practical Analysis of Software Test Automation, Based on the Python 3 Programming Language, 31 July 2021, 机械工业出版社 (China Machine Press), pages 201-204 *
梁静 et al.: Introduction and Analysis of Standard Test Functions for Evolutionary Computation, 31 July 2022, 国防工业出版社 (National Defense Industry Press), page 119 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117609101A (en) * 2024-01-23 2024-02-27 云筑信息科技(成都)有限公司 Method for testing multiple engines of big data of user
CN117609101B (en) * 2024-01-23 2024-05-28 云筑信息科技(成都)有限公司 Method for testing multiple engines of big data of user

Similar Documents

Publication Publication Date Title
CN102880532B (en) Cloud technology-based test system and method
US8434058B1 (en) Integrated system and method for validating the functionality and performance of software applications
CN101411123B (en) Method, system and computer program for the centralized system management on endpoints of a distributed data processing system
CN109933521A (en) Automated testing method, device, computer equipment and storage medium based on BDD
CA3131079A1 (en) Test case generation method and device, computer equipment and storage medium
CN112631846A (en) Fault drilling method and device, computer equipment and storage medium
CN105303112A (en) Component calling bug detection method and apparatus
CN114692169B (en) Page vulnerability processing method applying big data and AI analysis and page service system
CN112433944A (en) Service testing method, device, computer equipment and storage medium
CN112650688A (en) Automated regression testing method, associated device and computer program product
CN100520732C (en) Performance test script generation method
CN112905437A (en) Method and device for testing case and storage medium
CN112579455A (en) Interface automatic testing method and device, electronic equipment and storage medium
CN112433948A (en) Simulation test system and method based on network data analysis
CN117290255A 2023-12-26 Batch interface performance test method based on Python and Locust frameworks
CN110990289B (en) Method and device for automatically submitting bug, electronic equipment and storage medium
CN107357721B (en) Method and device for testing system
CN112231206A (en) Script editing method for application program test, computer readable storage medium and test platform
CN113220597B (en) Test method, test device, electronic equipment and storage medium
CN112416805A (en) Test management cloud platform and method
CN117493188A (en) Interface testing method and device, electronic equipment and storage medium
CN111444109A (en) Mobile terminal UI automatic testing method and system
CN115759518A (en) Usability treatment system based on chaos engineering
CN115269387A (en) Automatic interface testing method and device
CN113986263A (en) Code automation test method, device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20231226)