CN105045710A - Method for automatically generating test data in cloud computing environment - Google Patents
- Publication number
- CN105045710A (application number CN201510373216.7A)
- Authority
- CN
- China
- Prior art keywords
- test
- data
- cloud
- plan
- platform
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- Debugging And Monitoring (AREA)
Abstract
The invention relates to a method for automatically generating test data in a cloud computing environment, belonging to the technical field of performance testing. The method comprises four steps: defining a test plan, uploading the data to be tested to a cloud testing platform, executing the test, and generating and analyzing a test report. The method requires little manual intervention and effectively reduces repetitive work, saving time, cost and labor; it also offers fast data processing and test execution and economical use of memory.
Description
Technical Field
The invention belongs to the technical field of performance testing, and particularly relates to an automatic test data generation method in a cloud computing environment.
Background
Software testing is an important means of ensuring and improving software quality and an indispensable link in the software life cycle. Within the testing process, the generation of test data is the core problem and the key difficulty; generating appropriate test data is the basis for conducting software testing efficiently. The quantum shuffled frog leaping algorithm (hereinafter the quantum frog leaping algorithm) provides rich theory and methods for test data generation and can effectively improve testing efficiency.
No test can do entirely without manual testing: at the initial stage, case-based manual testing is necessary. Manual testing nevertheless has severe limitations. It cannot cover every path; unit tests are hard to repeat, and once regression testing is reached the workload becomes enormous; many errors related to timing, deadlock, resource conflicts and multithreading are difficult to capture manually; and manual testing cannot simulate scenarios such as massive data volumes or large numbers of concurrent users that load and performance testing require. When the test data volume of the whole process is huge, the data change frequently, and the test must be completed in a short time (one day), it is almost impossible to rely on testers manually inputting, modifying and deleting data in a test management tool: efficiency is low, inconsistent test data are easily produced, unreliable data are handed to other testers, and the test results become invalid.
JMeter is a Java-based testing tool developed by the Apache Software Foundation. Compared with other HTTP testing tools, its main strength is extensibility, and it is now widely used in performance testing. It can record and generate test scripts through a local proxy server, offers powerful test functions, and produces aggregate reports: when a built task is selected among many tasks and run, an HTML report is generated after the run finishes so that the test output can be inspected. The report shows the result of every test operation, including test status, timing, execution order and the time consumed by all tests, letting users and test developers grasp the overall test situation intuitively. Jenkins, in turn, is an open-source project providing an easy-to-use continuous integration system that frees developers from tedious integration work so they can concentrate on business logic. Jenkins monitors integration errors, provides detailed log files and notifications, and visualizes the trend and stability of project builds in charts. It supports task-based continuous building once a few trigger conditions are defined and has its own plug-in development specification, and existing performance-oriented continuous integration practice already has an open-source Performance plug-in. On this basis a JMeter-plus-Jenkins test platform is built, which provides a test driver function library for the Java programming language.
Using the API provided by Jenkins, the platform instructs Jenkins to build the engineering project: project information is filled in, the path of the test result to be loaded by the JMeter plug-in is set in the job configuration, the automated test script is compiled, and the JMeter tool is invoked to test the project and generate results. Software developers and testers can then conveniently compare test results at any time and locate defects in the program code from those comparisons, improving its robustness.
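The Jenkins step described above can be sketched as follows; Jenkins' remote-access API does accept a POST to /job/&lt;name&gt;/build, but the host, job name and credentials in this sketch are hypothetical:

```python
import base64
import urllib.request

def jenkins_build_request(base_url, job, user, api_token):
    """Assemble an authenticated POST request asking Jenkins to build `job`.

    The caller would pass the result to urllib.request.urlopen(); Jenkins
    answers 201 Created and queues the build.
    """
    url = f"{base_url}/job/{job}/build"
    req = urllib.request.Request(url, data=b"", method="POST")
    token = base64.b64encode(f"{user}:{api_token}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    return req

# Hypothetical job on a hypothetical host:
req = jenkins_build_request("http://jenkins.example:8080",
                            "cloud-perf-tests", "tester", "secret-token")
```

In the platform described here, firing this request after each code change is what turns the JMeter script into a continuously integrated performance test.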
JMeter can simulate a large number of clients sending requests to a server, stress the server, and analyze its overall performance. At run time JMeter spawns multiple threads to simulate multiple users issuing requests; each request records its response time, start time, result and other information, and the generated result file is an XML file with the suffix .jtl. Such files are therefore large and contain many continuous-integration performance test records. JMeter also has a friendly GUI, is open source and free, suits performance testing of medium and large Web systems, and has a gentle learning curve; like other Java applications, however, it needs a large amount of memory during execution.
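How such a .jtl result file might be read can be sketched as follows; the attributes used (t for elapsed milliseconds, ts for the start timestamp, s for the success flag, lb for the label) follow JMeter's XML result format, while the sample records themselves are invented:

```python
import xml.etree.ElementTree as ET

SAMPLE_JTL = """<?xml version="1.0"?>
<testResults version="1.2">
  <httpSample t="120" ts="1435651200000" s="true" lb="login"/>
  <httpSample t="80"  ts="1435651200500" s="true" lb="login"/>
  <httpSample t="310" ts="1435651201000" s="false" lb="query"/>
</testResults>"""

def summarize(jtl_text):
    """Compute basic statistics from the httpSample records of a .jtl file."""
    root = ET.fromstring(jtl_text)
    samples = root.findall("httpSample")
    elapsed = [int(s.get("t")) for s in samples]
    errors = sum(1 for s in samples if s.get("s") != "true")
    return {"count": len(samples),
            "avg_ms": sum(elapsed) / len(elapsed),
            "error_rate": errors / len(samples)}

stats = summarize(SAMPLE_JTL)
```

For the invented sample above this yields an average response time of 170 ms over three requests, one of which failed.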
For maximum flexibility, assertions are created with regular expressions under the JMeter framework, and scripts carrying assertions verify whether the program code returns the expected result, realizing functional testing of the application. Only a few simple command statements are needed for a JMeter file to drive a jar file. The JMeter file can also modify code and update data in real time as the test progresses, application performance improves and files are updated, so as to better fit the test requirements. However, when the amount of data queued for testing is large, completing one round of testing consumes a great deal of time, which fails to meet the need of developers and testers to check product performance promptly and greatly inconveniences the testing work; likewise, if a script or table to be tested contains too much data, loading it into the JMeter tool takes too long and runs very slowly. A method for quickly processing large data-file scripts is therefore urgently needed, together with a server that cleans memory garbage in time, reduces disk usage and improves the efficiency of analyzing and reading test result data. Moreover, since many users share memory and must request resources (i.e., memory) from the terminal, a test fails if the allocated memory is insufficient, and the very limited memory of a single system cannot meet developers' continuous demands. Cloud-based software testing can effectively exploit the dynamically scalable, massive resources of a cloud platform, saving test time and reducing test cost, and is thus a better testing solution.
However, existing cloud software test platforms and services charge for use, some integrated solutions are expensive, and these platforms and solutions are proprietary to commercial companies or testing-as-a-service (TaaS) providers whose underlying designs are not open, so external researchers find it difficult to study the related problems in depth.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a method for automatically generating test data in a cloud computing environment based on the quantum frog leaping algorithm and cloud computing technology.
The technical problem of the invention is solved by the following technical scheme:
A method for automatically generating test data in a cloud computing environment comprises four steps: defining a test plan, uploading the data to be tested to a cloud test platform, executing the test, and generating and analyzing a test report;
defining the test plan: when a test plan is created, it is displayed as a tree structure in the JMeter GUI, and its content is saved as an XML file, which is a formal description of the tree-shaped test plan; when the test execution module runs the plan, it determines from the XML description which objects should be created in memory to reflect the plan the user built, and the different objects generate their respective behaviors to access the system under test;
uploading the data to be tested to the cloud test platform: on the platform, the quantum frog leaping algorithm is applied to all data to be tested to obtain optimal test data for grouped testing, statistics and correlation analysis are performed for each module of the software test, and big data technology is then applied to process the data; all application programs, test tools, test cases and test environments involved in the test are first deployed on the cloud test platform, and the test is carried out by means of cloud computing technology;
executing the test: the data uploaded to the cloud test platform are used to simulate multi-user operation through JMeter, where each thread calls the element objects in the test plan and performs the operations those objects define;
generating and analyzing the test report: average response time and system throughput are collected in real time during test execution, and the results are displayed to testers as an aggregate report for analysis and reference.
The cloud test platform is composed of four layers, namely a cloud resource layer, a cloud resource management layer, a service management layer and a user management layer.
The invention builds a cloud software-testing platform framework on top of the JMeter automated test tool and Jenkins continuous integration software to solve the problem of memory shortage during testing. A big-data buffer pool loads the records of each continuous-integration performance test request into a buffered queue in memory, so that testing runs on the cloud platform; the quantum frog leaping algorithm produces optimal test data for grouped testing; statistics and correlation analysis are performed on the software-test data, and cloud computing technology then processes the data. The result is faster data processing and testing and reduced memory consumption.
The invention has the following beneficial effects:
1. Every link in the continuous integration is completed automatically without much manual intervention, which reduces repetitive work and saves time, cost and labor.
2. By combining the JMeter automated test tool with Jenkins continuous integration software, the invention stores and continuously integrates automated test results, displays them continuously, and makes it convenient to compare results at any time, locate defects in the program code from those comparisons, and improve program robustness.
3. Assertions are built with regular expressions under the JMeter framework, and scripts with assertions verify whether the program code returns the expected result, simplifying test command statements, realizing functional testing of the application and increasing the flexibility of the test code.
4. The cloud test service platform improves developers' testing efficiency, does not occupy their computing resources during testing, and runs as automatically as possible.
5. The cloud test service platform improves test safety: even if a test fails, the whole system does not crash.
6. The cloud test service platform lets a test flexibly change its test environment, i.e., the resource configuration of the test.
Drawings
FIG. 1 is a schematic diagram of the overall architecture and functional components of the present invention.
Fig. 2 is a flow chart of the method for generating optimal data using the quantum frog leaping algorithm according to the present invention.
Detailed Description
Embodiment 1: Overall structure of the invention
The invention relates to a method for automatically generating test data in a cloud computing environment comprising four steps: defining a test plan, uploading the data to be tested to a cloud test platform, executing the test, and generating and analyzing a test report.
Defining the test plan: when a test plan is created, it is displayed as a tree structure in the JMeter GUI; its content is stored in XML form, and the saved XML script is a formal description of the tree-shaped test plan. When the test execution module runs the plan, it determines from the XML file which objects should be created in memory to reflect the plan the user built, and the different objects generate their respective behaviors to access the system under test.
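The step above, mapping the tree-shaped XML plan to in-memory objects, can be sketched as follows. The convention of pairing each element with a following hashTree holding its children matches JMeter's saved .jmx layout, while the sample plan itself is invented:

```python
import xml.etree.ElementTree as ET

JMX = """<jmeterTestPlan version="1.2">
  <hashTree>
    <TestPlan testname="cloud perf plan"/>
    <hashTree>
      <ThreadGroup testname="users"/>
      <hashTree>
        <HTTPSamplerProxy testname="GET /index"/>
        <hashTree/>
      </hashTree>
    </hashTree>
  </hashTree>
</jmeterTestPlan>"""

class Node:
    """One test-plan element reconstructed in memory."""
    def __init__(self, kind, name):
        self.kind, self.name, self.children = kind, name, []

def build(elements):
    """Pair each element with the hashTree that follows it, recursively."""
    nodes, i = [], 0
    while i < len(elements):
        el = elements[i]
        node = Node(el.tag, el.get("testname", ""))
        if i + 1 < len(elements) and elements[i + 1].tag == "hashTree":
            node.children = build(list(elements[i + 1]))
            i += 2
        else:
            i += 1
        nodes.append(node)
    return nodes

root = ET.fromstring(JMX)
plan = build(list(root.find("hashTree")))
```

A test executor would then walk this object tree, letting each node kind (thread group, sampler, and so on) generate its own behavior against the system under test.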
Uploading the data to be tested to the cloud test platform: on the cloud platform, the quantum frog leaping algorithm is applied to all data to be tested to obtain optimal test data for grouped testing; statistics and correlation analysis are performed for each table of the software test, and big data technology then processes the data, accelerating the data processing and testing and saving memory. The cloud test platform is composed of four layers: a cloud resource layer, a cloud resource management layer, a service management layer and a user management layer. Together these four layers form the cloud test platform; everything involved in testing, such as application programs, test tools, test cases and test environments, must be deployed on it, and testing efficiency is improved by means of cloud computing technology. The platform's dynamically scalable, massive resources can thus be used effectively, saving test time and reducing test cost.
Executing the test: when the test runs, the data uploaded to the cloud test platform are used to simulate multi-user operation through JMeter, where each thread calls the element objects in the test plan and performs the operations those objects define.
Generating and analyzing the test report: performance index values such as average response time and system throughput are collected in real time during test execution, and the results are displayed to testers as an aggregate report for analysis and reference.
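A minimal sketch of the aggregation just described, assuming each sample is collected as a (start-millisecond, elapsed-millisecond) pair; throughput is taken simply as sample count divided by the wall-clock span of the run:

```python
def aggregate(samples):
    """Compute aggregate-report metrics from (start_ms, elapsed_ms) pairs."""
    elapsed = [e for _, e in samples]
    start = min(ts for ts, _ in samples)
    end = max(ts + e for ts, e in samples)          # last response completes
    duration_s = (end - start) / 1000.0
    return {"avg_response_ms": sum(elapsed) / len(elapsed),
            "throughput_rps": len(samples) / duration_s}

# Three invented samples spanning two seconds of test time:
stats = aggregate([(0, 100), (500, 100), (1000, 1000)])
```

For the invented samples this reports an average response time of 400 ms and a throughput of 1.5 requests per second.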
Embodiment 2: Automatic generation of software structure test data based on the quantum frog leaping algorithm
Automating the generation of software test data effectively reduces testers' workload, improves software testing efficiency and saves development cost. The test data generation method adopted by the invention is the quantum frog leaping algorithm: input data are selected at random from the program's input space (input domain) and used to execute the program under test; based on the execution results, new input data produced by the quantum frog leaping algorithm keep driving further trial runs until an optimal solution is found.
1. Construction of fitness function
The fitness function is the optimization target when the quantum frog leaping algorithm is applied to a problem, and its construction directly affects the algorithm's efficiency on that problem. The invention constructs the fitness function by superposing branch functions. A branch function is a real-valued function mapping a branch predicate to a real value; driven by test data, it quantitatively describes how closely the actual execution path of the unit under test covers the selected path.
If there are m branch points and n parameters on the path to be covered, the m branch functions are f1 = f1(x1, x2, …, xn), f2 = f2(x1, x2, …, xn), …, fm = fm(x1, x2, …, xn), and the branch function of the path is
F=MAX-(F(f1)+F(f2)+…+F(fm))
where F(x) = 0 for x ≤ 0 and F(x) = x for x > 0, and MAX is a large integer.
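This fitness construction can be sketched in Python as follows; the example path with its two branch predicates is invented, and MAX is simply a large constant as in the formula above:

```python
MAX = 10**6  # large constant, so higher fitness means closer to covering the path

def F(x):
    """Penalty term: zero when the branch predicate is satisfied (x <= 0), else x."""
    return 0 if x <= 0 else x

def fitness(branch_fns, data):
    """MAX minus the summed penalties; maximal when every branch on the path is taken."""
    return MAX - sum(F(f(*data)) for f in branch_fns)

# Hypothetical path with two branch predicates, x > 10 and x + y < 20, each
# rewritten as a branch function that is <= 0 exactly when the predicate holds
# (the strict inequality x > 10 is approximated by 10 - x <= 0):
f1 = lambda x, y: 10 - x
f2 = lambda x, y: (x + y) - 20
```

The input (15, 2) satisfies both predicates and scores the maximum MAX, while (5, 30) violates both and is penalized.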
2. Test data generation algorithm
The automatic generation of software structure test data based on the quantum frog leaping algorithm treats the test data as elements of a frog population vector x. Test data are first generated at random, and the quantum frog leaping algorithm then searches for the optimal test data so that the fitness function reaches its maximum. Referring to Fig. 2, the steps are as follows:
(1) Analyze the program under test: determine the fitness function from the test coverage strategy and the program under test, and instrument the program;
(2) Choose the number of frogs m, the fitness threshold, the maximum allowed iteration count and the number of memeplexes, and initialize the quantum positions and positions of the frogs;
(3) Set the iteration counter t = 0, Fg = 0 and Fp = (0, 0, …, 0);
(4) While Fg is below the threshold and t < Maxiteration, run the instrumented program with each frog in the frog population P, and calculate each frog's fitness from the run results;
(5) Update the quantum positions, velocities and positions of the frog population;
(6) When the final iteration count is reached, output the optimal generated data.
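The loop above can be sketched as a simplified, classical shuffled frog leaping search; the patent's quantum variant additionally encodes frogs as quantum positions, which is omitted here, and the population size, memeplex count and termination threshold below are illustrative defaults:

```python
import random

def sfla_generate(fitness, dim, bounds, n_frogs=20, n_memeplexes=4,
                  iters=100, target=None, seed=1):
    """Search for a `dim`-dimensional test datum maximizing `fitness`.

    Simplified shuffled frog leaping: rank the population, deal it into
    memeplexes, move each memeplex's worst frog toward its best (or replace
    it randomly), and reshuffle every iteration.
    """
    rng = random.Random(seed)
    lo, hi = bounds
    frogs = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_frogs)]
    for _ in range(iters):
        frogs.sort(key=fitness, reverse=True)            # global shuffle/ranking
        if target is not None and fitness(frogs[0]) >= target:
            break                                        # threshold reached
        memeplexes = [frogs[i::n_memeplexes] for i in range(n_memeplexes)]
        for mem in memeplexes:
            best, worst = mem[0], mem[-1]
            # move the worst frog a random fraction toward the memeplex best
            step = [rng.random() * (b - w) for b, w in zip(best, worst)]
            cand = [min(hi, max(lo, w + s)) for w, s in zip(worst, step)]
            if fitness(cand) > fitness(worst):
                worst[:] = cand                          # accept the leap
            else:                                        # censor: random restart
                worst[:] = [rng.uniform(lo, hi) for _ in range(dim)]
    frogs.sort(key=fitness, reverse=True)
    return frogs[0]
```

Calling `sfla_generate` with a toy fitness such as `lambda v: -((v[0] - 3) ** 2 + (v[1] + 1) ** 2)` searches for a point near (3, -1); in the patent's setting, the fitness would be the branch-function construction of the previous section evaluated by running the instrumented program.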
The following is a MATLAB program that can be used to solve Embodiment 2:
Claims (2)
1. A method for automatically generating test data in a cloud computing environment, comprising four steps: defining a test plan, uploading the data to be tested to a cloud test platform, executing the test, and generating and analyzing a test report;
wherein defining the test plan means that, when a test plan is created, it is displayed as a tree structure in the JMeter GUI and its content is saved as an XML file, which is a formal description of the tree-shaped test plan; when the test execution module runs the plan, it determines from the XML description which objects should be created in memory to reflect the plan the user built, and the different objects generate their respective behaviors to access the system under test;
uploading the data to be tested to the cloud test platform means that, on the platform, the quantum frog leaping algorithm is applied to all data to be tested to obtain optimal test data for grouped testing, statistics and correlation analysis are performed for each module of the software test, and big data technology is then applied to process the data; all application programs, test tools, test cases and test environments involved in the test are first deployed on the cloud test platform, and the test is carried out by means of cloud computing technology;
executing the test means that the data uploaded to the cloud test platform are used to simulate multi-user operation through JMeter, wherein each thread calls the element objects in the test plan and performs the operations those objects define;
generating and analyzing the test report means that average response time and system throughput are collected in real time during test execution and the results are displayed to testers as an aggregate report for analysis and reference.
2. The method as claimed in claim 1, wherein the cloud test platform comprises four layers, namely a cloud resource layer, a cloud resource management layer, a service management layer, and a user management layer.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510373216.7A CN105045710B (en) | 2015-06-30 | 2015-06-30 | A kind of automatic test data creation method under cloud computing environment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105045710A | 2015-11-11 |
CN105045710B CN105045710B (en) | 2017-11-10 |
Family
ID=54452273
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510373216.7A Expired - Fee Related CN105045710B (en) | 2015-06-30 | 2015-06-30 | A kind of automatic test data creation method under cloud computing environment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105045710B (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103136101A (en) * | 2012-12-31 | 2013-06-05 | 博彦科技(上海)有限公司 | Software automated testing unified operation platform |
CN104378252A (en) * | 2014-08-26 | 2015-02-25 | 国家电网公司 | Cloud testing service platform |
Non-Patent Citations (3)
Title |
---|
孟祥超 (Meng Xiangchao), "Research on Software Testing Service in a Cloud Computing Environment" (云计算环境下的软件测试服务研究), China Master's Theses Full-text Database, Information Science and Technology Series * |
布里泰恩 (Brittain), "Tomcat: The Definitive Guide, 2nd Edition" (Tomcat权威指南(第二版)), China Electric Power Press, 30 September 2009 * |
胥枫 (Xu Feng), "Research on Software Automated Testing Technology" (软件自动化测试技术的研究), China Master's Theses Full-text Database, Information Science and Technology Series * |
Cited By (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105022635A (en) * | 2015-07-23 | 2015-11-04 | 北京中油瑞飞信息技术有限责任公司 | Algorithm file generating method and apparatus based on cloud platform and cloud platform |
CN105320599A (en) * | 2015-11-26 | 2016-02-10 | 上海斐讯数据通信技术有限公司 | System and method for web automatic tests |
CN106991039A (en) * | 2016-01-20 | 2017-07-28 | 滴滴(中国)科技有限公司 | Method of testing and device for platform adaptive automotive engine system |
CN107153601A (en) * | 2016-03-02 | 2017-09-12 | 阿里巴巴集团控股有限公司 | Unit performance method of testing and equipment |
CN105955749A (en) * | 2016-05-10 | 2016-09-21 | 北京启明星辰信息安全技术有限公司 | Continuous software project integration method and device |
WO2017211042A1 (en) * | 2016-06-07 | 2017-12-14 | 中兴通讯股份有限公司 | Task automation testing method and system for big data |
CN107491386A (en) * | 2016-06-13 | 2017-12-19 | 富士通株式会社 | The method and apparatus for recording test script |
CN106210013A (en) * | 2016-07-04 | 2016-12-07 | 上海华岭集成电路技术股份有限公司 | A kind of integrated circuit testing information integration based on high in the clouds analyzes system and method |
CN106210013B (en) * | 2016-07-04 | 2019-12-20 | 上海华岭集成电路技术股份有限公司 | Integrated circuit test information integration analysis system and method based on cloud |
CN106850321A (en) * | 2017-04-05 | 2017-06-13 | 无锡华云数据技术服务有限公司 | A kind of simulated testing system of cluster server |
CN107092559A (en) * | 2017-04-18 | 2017-08-25 | 携程旅游信息技术(上海)有限公司 | Test platform middleware, test system and method based on Jmeter |
CN107608901A (en) * | 2017-10-20 | 2018-01-19 | 北京京东金融科技控股有限公司 | Method of testing and device based on Jmteter, storage medium, electronic equipment |
CN107608901B (en) * | 2017-10-20 | 2019-12-31 | 京东数字科技控股有限公司 | Jmeter-based testing method and device, storage medium and electronic equipment |
CN108334443A (en) * | 2017-12-22 | 2018-07-27 | 海尔优家智能科技(北京)有限公司 | Generate method, apparatus, equipment and the computer readable storage medium of test case |
CN108572919A (en) * | 2018-05-30 | 2018-09-25 | 平安普惠企业管理有限公司 | Automated testing method, device, computer equipment and storage medium |
WO2020000726A1 (en) * | 2018-06-29 | 2020-01-02 | 平安科技(深圳)有限公司 | Performance test report generating method, electronic device, and readable storage medium |
CN109460367A (en) * | 2018-11-16 | 2019-03-12 | 四川长虹电器股份有限公司 | Method based on the sustainable integrated automation performance test of Jmeter |
CN111290934A (en) * | 2018-12-06 | 2020-06-16 | 中车株洲电力机车研究所有限公司 | Jenkins-based vehicle-mounted network product cloud testing method and system |
CN110196812A (en) * | 2019-06-06 | 2019-09-03 | 四川长虹电器股份有限公司 | Based on the Web application iteration tests method recorded and reset |
CN110196812B (en) * | 2019-06-06 | 2022-02-01 | 四川长虹电器股份有限公司 | Web application iteration test method based on recording and playback |
CN110750458A (en) * | 2019-10-22 | 2020-02-04 | 恩亿科(北京)数据科技有限公司 | Big data platform testing method and device, readable storage medium and electronic equipment |
CN112256595A (en) * | 2020-12-22 | 2021-01-22 | 成都新希望金融信息有限公司 | Heterogeneous system testing method and device and electronic equipment |
CN112256595B (en) * | 2020-12-22 | 2021-03-12 | 成都新希望金融信息有限公司 | Heterogeneous system testing method and device and electronic equipment |
CN112765014A (en) * | 2021-01-04 | 2021-05-07 | 光大兴陇信托有限责任公司 | Automatic test system for multi-user simultaneous operation and its working method |
CN112765014B (en) * | 2021-01-04 | 2024-02-20 | 光大兴陇信托有限责任公司 | Automatic test system for multi-user simultaneous operation and working method |
CN116909932A (en) * | 2023-09-12 | 2023-10-20 | 吉孚汽车技术(苏州)有限公司 | Continuous integration automated software testing system and method based on VT system |
CN116909932B (en) * | 2023-09-12 | 2023-12-05 | 吉孚汽车技术(苏州)有限公司 | Continuous integration automated software testing system and method based on VT system |
Also Published As
Publication number | Publication date |
---|---|
CN105045710B (en) | 2017-11-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105045710B (en) | Method for automatically generating test data in a cloud computing environment | |
US8561024B2 (en) | Developing software components and capability testing procedures for testing coded software component | |
US10482001B2 (en) | Automated dynamic test case generation | |
EP2572294B1 (en) | System and method for sql performance assurance services | |
US20130263090A1 (en) | System and method for automated testing | |
Kim et al. | Performance testing of mobile applications at the unit test level | |
US20090019427A1 (en) | Method and Apparatus for Providing Requirement Driven Static Analysis of Test Coverage for Web-Based, Distributed Processes | |
Waller | Performance benchmarking of application monitoring frameworks | |
US10592703B1 (en) | Method and system for processing verification tests for testing a design under test | |
US9396095B2 (en) | Software verification | |
Remenska et al. | Using model checking to analyze the system behavior of the LHC production grid | |
Sottile et al. | Semi-automatic extraction of software skeletons for benchmarking large-scale parallel applications | |
CN114818565A (en) | Python-based simulation environment management platform, method, device and medium |
CN115757167A (en) | Integration test deployment method, apparatus, device and medium for intelligent driving software |
Akpinar et al. | Web application testing with model based testing method: case study | |
Augusto et al. | RETORCH: an approach for resource-aware orchestration of end-to-end test cases | |
Marin et al. | Towards testing future web applications | |
Oleshchenko et al. | Web Application State Management Performance Optimization Methods | |
Meyer | Dependable software | |
Yadav et al. | Robotic automation of software testing from a machine learning viewpoint | |
CN113220586A (en) | Automated interface stress test execution method, device and system |
CN118626402B (en) | AI framework testing method, apparatus, device and storage medium |
US20230068602A1 (en) | Automated Performance Measurement Over Software Lifecycle | |
Gibson | Deep learning on a low power gpu | |
Cao et al. | Software Testing Strategy for Mobile Phone |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | | Granted publication date: 20171110; Termination date: 20210630 |