CN102622234B - Development system and method for automatic test case
Abstract
The invention relates to command line shell-based automated testing and discloses a development system and method for automated test cases, addressing the poor extensibility and reusability, high development difficulty, and long development cycles of testing schemes in the prior art. The system comprises a case design unit and a case run control unit. The case design unit provides a graphical user interface for case design, receives user input, converts the entered information into presentation layer data, and stores that data before transmitting it to the case run control unit. The case run control unit parses the received presentation layer data into script language fragments, fills variable parameters into those fragments, runs them to send the corresponding shell commands to the device under test, and collects the execution results the device returns. The development system and method are suited to the automated testing of communication equipment.
Description
Technical Field
The invention relates to command line shell-based automated testing technology, and in particular to a development system and method for automated test cases.
Background
Early automated testing of command-line-shell-based devices was generally implemented with record-and-playback systems: a segment of input data is recorded, played back during the test, and the test results are matched for complete agreement with the expected results. Such systems are simple and easy to use, and automated testing requires only an understanding of the test service flow, but their extensibility and reusability are poor and they are hard to modify: the playback data depends on the recorded data, so neither the test parameters nor the test steps can conveniently be changed.
To strengthen the extensibility and adaptability of automated testing, the industry later introduced test script systems. Such a system decomposes the functions under test into individual test items from a functional standpoint, programs and judges each test item in a scripting language, builds a library of test functions from those items, and then has developers call and combine the items through script programming according to the test case at hand so as to judge the test result. Although this solves the extensibility and reusability of test cases, it suffers from a high entry barrier, great development difficulty, long development times, and low efficiency.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a development system for automated test cases that overcomes the poor extensibility and reusability of the record-and-playback systems of the prior art, as well as the high development difficulty and long cycles of test script systems.
The scheme adopted by the invention to solve this technical problem is a development system for automated test cases, comprising:
a case design unit, which provides a graphical user interface for case design, receives user input, converts the user input information into presentation layer data, and stores that data before sending it to the case run control unit; the user input information comprises test service data entered by the user in natural language or through man-machine interaction, the variable parameters and data generation rules the user specifies within that data, and the expected-result comparison rules the user specifies; the presentation layer data is structured case data; and
a case run control unit, which parses the received presentation layer data into script language fragments, fills variable parameters into the script language fragments, runs them to send the corresponding shell commands to the device under test, and collects the execution results returned by the device under test.
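The division of labour between the two units can be sketched as follows. This is a minimal illustration only: the patent fixes neither a host-side implementation language nor a concrete data schema, so Python and every name below are assumptions.

```python
from typing import Callable

# Assumed presentation-layer shape: each verification point is a dict
# whose "commands" entry holds the ordered shell command lines of its
# script fragment, with placeholders such as %tmpAdomStr% still unfilled.

def parse_point(point: dict) -> list[str]:
    """Parsing step of the case run control unit: one presentation-layer
    verification point -> its script-language fragment (command lines)."""
    return list(point["commands"])

def send_fragment(fragment: list[str],
                  send_command: Callable[[str], str]) -> list[str]:
    """Execution step: run the fragment line by line against the device
    under test and collect its replies; send_command wraps whatever
    interactive session (Telnet, SSH, serial) reaches the device."""
    return [send_command(line) for line in fragment]
```

Parameter filling and result judgment sit between these two steps; they are sketched further below.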
Furthermore, the case design unit is also responsible for displaying and saving the final test results and logs, and provides a real-time breakpoint-debugging interface.
Furthermore, the case run control unit is also responsible for comparing the execution result returned by the device under test against the expected result preset by the user, and for returning the judgment to the case design unit for display or breakpoint debugging.
Further, the case design unit comprises an information entry subunit and a recording and output subunit:
the information entry subunit receives user input information and converts it into presentation layer data;
the recording and output subunit stores and displays the related data.
Further, the case run control unit comprises a parsing subunit, an execution subunit, and a comparison subunit:
the parsing subunit parses the received presentation layer data into script language fragments;
the execution subunit runs the script language fragments and sends the corresponding shell commands to the device under test;
the comparison subunit compares the execution result returned by the device under test against the expected result preset by the user.
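As a rough illustration, the comparison subunit can look its rule up in a small table of preset relations. Only "equal to" is named in the worked example later in this description; "contains" and "matches" here are assumed stand-ins for the other preset rules the description alludes to.

```python
import re

# Preset comparison rules; only "equal_to" is taken from the patent's
# own example, the other two are assumed illustrations.
COMPARISON_RULES = {
    "equal_to": lambda actual, expected: actual.strip() == expected.strip(),
    "contains": lambda actual, expected: expected in actual,
    "matches":  lambda actual, expected: re.search(expected, actual) is not None,
}

def judge(actual: str, expected: str, rule: str) -> bool:
    """Compare the device reply against the user-preset expected result
    under the user-chosen comparison rule."""
    return COMPARISON_RULES[rule](actual, expected)
```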
Further, the device under test is a network built from individual devices under test, or a network built from a test bed together with individual devices under test.
Further, the user input information comprises: test service data entered by the user in natural language or through man-machine interaction, the variable parameters and data generation rules the user specifies within that data, and the expected-result comparison rules the user specifies.
Another objective of the present invention is to provide a development method for automated test cases, comprising the following steps:
a. the user enters user input information into the case design unit; the user input information comprises test service data entered in natural language or through man-machine interaction, the variable parameters and data generation rules the user specifies within that data, and the expected-result comparison rules the user specifies;
b. the case design unit converts the user input information into presentation layer data, i.e. structured case data, and transmits it to the case run control unit;
c. the case run control unit parses the presentation layer data into script language fragments;
d. the case run control unit fills variable parameters into the script language fragments;
e. the case run control unit runs the script language fragments and sends the corresponding shell commands to the device under test;
f. the device under test receives and executes the shell commands, then returns the execution results to the case run control unit;
g. the case run control unit compares the execution results returned by the device under test against the expected results preset by the user, and returns the judgment to the case design unit;
h. the case design unit displays the judgment or enters breakpoint debugging accordingly.
Further, step a specifically comprises:
a1. the user enters test service data into the case design unit in natural language or through man-machine interaction;
a2. the user specifies the variable parameters and data generation rules within the entered test service data;
a3. the user presets the expected results and the comparison rules.
Further, in step d the case run control unit fills the variable parameters into the script language fragments according to the data generation rules.
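Step d can be pictured as a two-phase substitution, sketched below under the assumption of the %name% placeholder syntax that appears in the worked example of the detailed description; "random_string" mirrors the rule used there, while "counter" is purely an invented second rule for illustration. Generating every parameter once, before any fragment is filled, lets later verification points reuse the same values.

```python
import itertools
import random
import string

_counter = itertools.count(1)

# Data generation rules; "random_string" mirrors the patent's example,
# "counter" is an invented illustration of a second rule type.
GENERATION_RULES = {
    "random_string": lambda: "".join(random.choices(string.ascii_lowercase, k=8)),
    "counter": lambda: str(next(_counter)),
}

def generate_parameters(param_rules: dict[str, str]) -> dict[str, str]:
    """Run each rule exactly once so every verification point in the
    case sees the same generated values."""
    return {name: GENERATION_RULES[rule]() for name, rule in param_rules.items()}

def fill_parameters(fragment: str, params: dict[str, str]) -> str:
    """Replace each %name% placeholder with its generated value."""
    for name, value in params.items():
        fragment = fragment.replace(f"%{name}%", value)
    return fragment
```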
The beneficial effects of the invention are as follows. The system keeps a man-machine interaction style similar to that of a record-and-playback system, so case development remains simple, easy to use, and efficient. The recording process is subdivided in granularity: instead of recording an entire run, a uniform formatting template subdivides it down to individual verification points; variable-parameter setting and parameter-value generation for input data are supported; and the case service data is converted into intermediate presentation layer data so that the case is structured, which guarantees the extensibility, reusability, and flexibility of the test cases. And because result matching is expressed in a form close to natural language, the user only needs to know how to configure the matching strategy and needs no grounding in any scripting language.
Drawings
FIG. 1 is a block diagram of an embodiment of a development system of the present invention;
FIG. 2 is a flow chart of an embodiment of a development method of the present invention.
Detailed Description
The invention provides a development system for automated test cases that solves the poor extensibility and reusability of record-and-playback systems in the prior art, as well as the high development difficulty and long cycles of test script systems. The system comprises a case design unit and a case run control unit.
The case design unit provides a graphical user interface for case design, receives user input, converts the user input information into presentation layer data, and stores that data before sending it to the case run control unit; it is also responsible for displaying and saving the final test results and logs, and provides a real-time breakpoint-debugging interface.
The case run control unit parses the received presentation layer data into script language fragments, fills variable parameters into the fragments, runs them to send the corresponding shell commands to the device under test, and collects the execution results returned by the device; it is also responsible for comparing those results against the expected results preset by the user, and for returning the judgment to the case design unit for display or breakpoint debugging.
In the embodiment of the invention, the device under test receives and executes the shell commands sent by the case run control unit, then returns the execution results to it. The device under test may be a network built from individual devices under test, or a network built from a test bed together with individual devices under test; this part is not the focus of the invention and is not described further here.
In particular, the structure shown in FIG. 1 may be adopted:
The case design unit comprises an information entry subunit and a recording and output subunit:
the information entry subunit receives user input information and converts it into presentation layer data;
the recording and output subunit stores and displays the related data.
The case run control unit comprises a parsing subunit, an execution subunit, and a comparison subunit:
the parsing subunit parses the received presentation layer data into script language fragments;
the execution subunit runs the script language fragments and sends the corresponding shell commands to the device under test;
the comparison subunit compares the execution result returned by the device under test against the expected result preset by the user.
The device under test is built as a network from a plurality of individual devices under test.
Based on this development system, the development method of the invention can be implemented with the steps shown in FIG. 2, specifically:
1. the user enters user input information into the case design unit: test service data in natural language or through man-machine interaction, the variable parameters and data generation rules within the entered test service data, and the preset expected results and comparison rules;
2. the case design unit converts the user input information into presentation layer data and transmits it to the case run control unit;
3. the case run control unit parses the presentation layer data into script language fragments;
4. the case run control unit fills variable parameters into the script language fragments;
5. the case run control unit runs the script language fragments and sends the corresponding shell commands to the device under test;
6. the device under test receives and executes the shell commands, then returns the execution results to the case run control unit;
7. the case run control unit compares the execution results returned by the device under test against the expected results preset by the user, and returns the judgment to the case design unit;
8. the case design unit displays the judgment or enters breakpoint debugging accordingly.
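Chaining the sketches given earlier, steps 3 to 8 reduce to a loop of fill, send, and judge. This remains a hedged sketch: it assumes the generate_parameters, fill_parameters, and judge helpers sketched above are in scope, the use_case schema is invented, and the breakpoint-debugging branch of step 8 is reduced to a failure return.

```python
def execute_use_case(use_case: dict, send_command):
    """Steps 3 to 8 in miniature: generate parameters once, then run each
    verification point and judge its reply wherever an expectation exists."""
    params = generate_parameters(use_case["param_rules"])
    for point in use_case["points"]:
        reply = ""
        for template in point["commands"]:
            reply = send_command(fill_parameters(template, params))
        expected = point.get("expected")
        if expected is not None and not judge(
                reply,
                fill_parameters(expected["value"], params),
                expected["rule"]):
            return ("FAIL", point["name"])  # would hand off to display/debug
    return ("PASS", None)
```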
To make the method of the invention easier to follow, a practical test case is described as an example.
Case description: configure a random character string as the device name on the device under test, and test whether the configuration succeeds.
The test steps are:
Run the following command lines on the device under test to configure the device name:
#config terminal
#devicename [random string]
#end
The expected result is:
Run the device-name query command line on the device under test:
#show devicename
The device under test should return:
#The devicename is [random string]
If the string returned by the device is exactly equal to the configured string, the test result is correct; otherwise it is wrong.
To realize the above case, the implementation process of the invention is as follows:
S10. The user adds a verification point (verification point 1) through the graphical user interface (UI) and enters the configuration commands:
#config terminal
#devicename [string to be replaced]
#end
S20. The user adds a variable parameter tmpAdomStr, selects the random-character-string type as its data generation rule, and replaces the above string to be replaced with the syntax %tmpAdomStr%;
S30. Since the commands sent by this verification point mainly serve to issue configuration, no matching relation is filled in.
The above process is then repeated to add a separate verification point for the test command line:
S10. The user adds a verification point (verification point 2) through the graphical user interface (UI) and enters the query command:
#show devicename
S20. Verification point 2 needs no new parameters; it simply reuses the existing parameter of verification point 1, so no parameter generation relation is filled in;
S30. The user adds a comparison rule through the graphical user interface (UI), selects the relation "equal to", and sets the expression value to "The devicename is %tmpAdomStr%". It should be added that this example uses only the "equal to" relation; the system actually provides the user with many preset comparison rules, which can be flexibly combined.
When the case design is finished, the system converts the entered information into presentation layer data, stores it, and uploads it to the case run control unit. The conversion into structured presentation layer data serves storage and transmission, rendering the case back into the user interface, and later modification.
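One plausible encoding of the structured presentation layer data for this use case is shown below. The patent prescribes structured case data but no concrete schema, so the dictionary layout is entirely an assumption (it matches the sketches given earlier):

```python
# Assumed structured presentation-layer data for the two verification
# points designed above; the schema itself is hypothetical.
USE_CASE = {
    "param_rules": {"tmpAdomStr": "random_string"},
    "points": [
        {
            "name": "verification point 1",
            "commands": ["config terminal",
                         "devicename %tmpAdomStr%",
                         "end"],
            "expected": None,   # pure configuration issuing, no matching
        },
        {
            "name": "verification point 2",
            "commands": ["show devicename"],
            "expected": {"rule": "equal_to",
                         "value": "The devicename is %tmpAdomStr%"},
        },
    ],
}
```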
S40. The parsing subunit obtains the presentation layer data and parses it into script language fragments;
S50. The parsing subunit fills variable parameters into the script language fragments according to the data generation rules;
The parsed code can now be understood and executed by the execution subunit.
S60. The execution subunit runs the script language fragments and returns the device execution result data;
S70. The comparison subunit obtains the execution results from the execution subunit and matches them against the expected results of the presentation layer and their comparison rules; if the judgment is correct, the flow moves to S90 to output the test result;
S80. Interrupt. If the execution result of verification point 2 does not match correctly and the user has allowed interruption on error, the test environment is kept as it is and a breakpoint-debugging mode is started: a new shell command-line connection window to the device under test is established in another thread, and the user can connect directly to the device side for real-time debugging, much as over SSH or Telnet. When real-time debugging ends, the system re-executes the content of the failed verification point and then continues with the verification points not yet executed. It should be noted that, besides debugging on an error interrupt, the user may also place an interrupt command at a chosen position within a verification point to force the interrupt-and-debug process, which assists case development.
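The S80 interrupt behaviour can be sketched as a retry loop around the verification points. Everything here (the names and the thread-per-debug-session choice) is an assumption drawn from the description above, not a prescribed implementation:

```python
import threading

def run_with_breakpoints(points, run_point, open_debug_shell,
                         allow_interrupt=True):
    """run_point(point) -> bool runs one verification point; on failure
    the test environment is kept as-is, a debug shell to the device is
    opened in another thread, and once the user finishes the failed
    point is re-executed before the remaining points continue."""
    i = 0
    while i < len(points):
        if run_point(points[i]):
            i += 1                     # passed: move on to the next point
            continue
        if not allow_interrupt:
            return False               # plain failure, no debug session
        t = threading.Thread(target=open_debug_shell)
        t.start()
        t.join()                       # block until real-time debugging ends
        # the loop now re-executes the current (failed) verification point
    return True
```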
S90. Record and output the test result.
The technical solutions claimed by the invention include, but are not limited to, the above embodiments; those skilled in the art may substitute equivalents for the solutions of the invention described in these embodiments without departing from its spirit.
Claims (10)
1. A development system for automated test cases, comprising:
a case design unit, configured to provide a graphical user interface for case design, receive user input, convert the user input information into presentation layer data, and store the presentation layer data before sending it to the case run control unit, wherein the user input information comprises test service data entered by the user in natural language or through man-machine interaction, variable parameters and data generation rules specified by the user within the test service data, and expected-result comparison rules specified by the user, and wherein the presentation layer data is structured case data; and
a case run control unit, configured to parse the received presentation layer data into script language fragments, fill variable parameters into the script language fragments, run the script language fragments to send corresponding shell commands to a device under test, and collect execution results returned by the device under test.
2. The development system for automated test cases according to claim 1, wherein the case design unit is further responsible for displaying and saving the final test results and logs, and provides a real-time breakpoint-debugging interface.
3. The development system for automated test cases according to claim 2, wherein the case run control unit is further responsible for comparing the execution result returned by the device under test against the expected result preset by the user, and for returning the judgment to the case design unit for display or breakpoint debugging.
4. The development system for automated test cases according to claim 3, wherein the case design unit comprises an information entry subunit and a recording and output subunit, wherein:
the information entry subunit is configured to receive user input information and convert it into presentation layer data; and
the recording and output subunit is configured to store and display the related data.
5. The development system for automated test cases according to claim 3, wherein the case run control unit comprises a parsing subunit, an execution subunit, and a comparison subunit, wherein:
the parsing subunit is configured to parse the received presentation layer data into script language fragments;
the execution subunit is configured to run the script language fragments and send corresponding shell commands to the device under test; and
the comparison subunit is configured to compare the execution result returned by the device under test against the expected result preset by the user.
6. The development system for automated test cases according to any one of claims 1 to 5, wherein the device under test is a network built from individual devices under test, or a network built from a test bed together with individual devices under test.
7. The development system for automated test cases according to any one of claims 1 to 5, wherein the user input information comprises: test service data entered by the user in natural language or through man-machine interaction, variable parameters and data generation rules specified by the user within the test service data, and expected-result comparison rules specified by the user.
8. A development method for automated test cases, comprising the following steps:
a. a user enters user input information into a case design unit, the user input information comprising test service data entered in natural language or through man-machine interaction, variable parameters and data generation rules specified by the user within the test service data, and expected-result comparison rules specified by the user;
b. the case design unit converts the user input information into presentation layer data, the presentation layer data being structured case data, and transmits it to a case run control unit;
c. the case run control unit parses the presentation layer data into script language fragments;
d. the case run control unit fills variable parameters into the script language fragments;
e. the case run control unit runs the script language fragments and sends corresponding shell commands to a device under test;
f. the device under test receives and executes the shell commands, then returns execution results to the case run control unit;
g. the case run control unit compares the execution results returned by the device under test against the expected results preset by the user, and returns the judgment to the case design unit;
h. the case design unit displays the judgment or enters breakpoint debugging accordingly.
9. The development method for automated test cases according to claim 8, wherein step a specifically comprises:
a1. the user enters test service data into the case design unit in natural language or through man-machine interaction;
a2. the user specifies the variable parameters and data generation rules within the entered test service data;
a3. the user presets the expected results and the comparison rules.
10. The development method for automated test cases according to claim 8 or 9, wherein in step d the case run control unit fills the variable parameters into the script language fragments according to the data generation rules.