
CN110908888B - Server testing method and device - Google Patents

Server testing method and device

Info

Publication number
CN110908888B
CN110908888B
Authority
CN
China
Prior art keywords
test
server
return value
tested
query statement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811080414.4A
Other languages
Chinese (zh)
Other versions
CN110908888A (en)
Inventor
叶思
欧阳灿
熊伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201811080414.4A priority Critical patent/CN110908888B/en
Publication of CN110908888A publication Critical patent/CN110908888A/en
Application granted granted Critical
Publication of CN110908888B publication Critical patent/CN110908888B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3684Test management for test design, e.g. generating new test cases
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/07Responding to the occurrence of a fault, e.g. fault tolerance
    • G06F11/0703Error or fault processing not based on redundancy, i.e. by taking additional measures to deal with the error or fault not making use of redundancy in operation, in hardware, or in data representation
    • G06F11/0766Error or fault reporting or storing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/07Responding to the occurrence of a fault, e.g. fault tolerance
    • G06F11/0703Error or fault processing not based on redundancy, i.e. by taking additional measures to deal with the error or fault not making use of redundancy in operation, in hardware, or in data representation
    • G06F11/079Root cause analysis, i.e. error or fault diagnosis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3692Test management for test results analysis
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention provides a server testing method and device, wherein the method includes: sending test data to at least one test device corresponding to a server to be tested, the test data including: a test case set, a check specification corresponding to the return value of each test case in the test case set, and destination location information; sending a test instruction to the at least one test device so that each test device tests the interfaces of the server to be tested according to the test case set, obtains return values, and generates an error log according to the return values and the corresponding check specifications; and, when the destination location information is consistent with the device's own location information, acquiring the corresponding error logs for analysis. Test data can thus be deployed to each test device automatically, and the error logs collected and analyzed automatically, improving server testing efficiency and reducing testing cost.

Description

Server testing method and device
Technical Field
The present invention relates to the field of data processing technologies, and in particular, to a method and an apparatus for testing a server.
Background
At present, servers are tested by manually enumerating the different input parameters of each server interface, generating test cases from those parameters to exercise the interface, manually collecting the test logs, and inspecting the logs to judge whether the interface is faulty. In addition, the SQL statements in the server program are checked manually for errors. Because this approach relies mainly on manual work, it incurs high labor cost, long test time, and low test efficiency.
Disclosure of Invention
The present invention aims to solve at least one of the technical problems in the related art to some extent.
Therefore, a first object of the present invention is to provide a server testing method to solve the problems of high testing cost and low testing efficiency.
A second object of the present invention is to provide a server testing device.
A third object of the present invention is to provide another server testing device.
A fourth object of the present invention is to propose a non-transitory computer readable storage medium.
A fifth object of the invention is to propose a computer program product.
To achieve the above object, an embodiment of a first aspect of the present invention provides a server testing method, including:
sending test data to at least one test device corresponding to a server to be tested; the test data includes: a test case set, a check specification corresponding to the return value of each test case in the test case set, and destination location information;
sending a test instruction to the at least one test device so that the at least one test device tests each interface of the server to be tested according to the test case set, obtains return values, and generates an error log according to the return values and the corresponding check specifications;
and when the destination location information is consistent with the device's own location information, acquiring the corresponding error log so as to analyze the error log.
Further, the check specification corresponding to the return value defines a preset condition that each field in the return value must satisfy;
correspondingly, the error log is generated as follows:
judging, for the return value of each test case, whether the return value contains fields that do not satisfy their corresponding preset conditions;
if fields that do not satisfy the corresponding preset conditions exist in the return value, determining the return value to be an error return value;
and generating the error log according to the error return values.
Further, the test case set includes: a subset of test cases for each interface of the server to be tested;
the test cases in each subset are generated by combining values of the parameters of the corresponding interface.
Further, the method further comprises the steps of:
acquiring a query statement set corresponding to the server to be tested; the query statement set includes: each query statement in the execution program of the server to be tested;
for each query statement in the query statement set, acquiring the check specification corresponding to the query statement;
and comparing the query statement with the corresponding check specification to obtain the query statement which does not meet the check specification so as to analyze the query statement which does not meet the check specification.
Further, before the sending the test data to the at least one test device corresponding to the server to be tested, the method further includes:
acquiring current environment information of the server to be tested; the environment information indicates any one of the following environments: a test environment, a sandbox environment, and an online environment;
correspondingly, the sending the test data to at least one test device corresponding to the server to be tested includes:
and sending test data corresponding to the current environment information to the at least one test device corresponding to the server to be tested.
Further, the data format of the return value is json format; and the check specification corresponding to the return value is a schema check specification.
According to the server testing method, test data is sent to at least one test device corresponding to the server to be tested, the test data including: a test case set, a check specification corresponding to the return value of each test case in the test case set, and destination location information; a test instruction is sent to the at least one test device so that each test device tests the interfaces of the server to be tested according to the test case set, obtains return values, and generates an error log according to the return values and the corresponding check specifications; and when the destination location information is consistent with the device's own location information, the corresponding error logs are acquired for analysis. Test data can thus be deployed to each test device automatically, and the error logs collected and analyzed automatically, which improves server testing efficiency and reduces testing cost.
To achieve the above object, an embodiment of a second aspect of the present invention provides a server testing device, including:
the deployment module is used for sending test data to at least one test device corresponding to the server to be tested; the test data includes: a test case set, a check specification corresponding to the return value of each test case in the test case set, and destination location information;
the test module is used for sending a test instruction to the at least one test device so that each test device tests the interfaces of the server to be tested according to the test case set, obtains return values, and generates an error log according to the return values and the corresponding check specifications;
and the log regression module is used for acquiring the corresponding error log when the destination location information is consistent with its own location information, so as to analyze the error log.
Further, the check specification corresponding to the return value defines a preset condition that each field in the return value must satisfy;
correspondingly, the error log is generated as follows:
judging, for the return value of each test case, whether the return value contains fields that do not satisfy their corresponding preset conditions;
if fields that do not satisfy the corresponding preset conditions exist in the return value, determining the return value to be an error return value;
and generating the error log according to the error return values.
Further, the test case set includes: a subset of test cases for each interface of the server to be tested;
the test cases in each subset are generated by combining values of the parameters of the corresponding interface.
Further, the device further includes: a first acquisition module and a comparison module;
the first acquisition module is used for acquiring a query statement set corresponding to the server to be tested; the query statement set includes: each query statement in the execution program of the server to be tested;
the first acquisition module is further configured to acquire, for each query statement in the query statement set, the check specification corresponding to the query statement;
and the comparison module is used for comparing each query statement with its corresponding check specification to obtain the query statements that do not meet the check specification, so as to analyze the query statements that do not meet the check specification.
Further, the device further includes: a second acquisition module;
the second acquisition module is used for acquiring current environment information of the server to be tested; the environment information indicates any one of the following environments: a test environment, a sandbox environment, and an online environment;
correspondingly, the deployment module is specifically configured to send test data corresponding to the current environment information to the at least one test device corresponding to the server to be tested.
Further, the data format of the return value is json format; and the check specification corresponding to the return value is a schema check specification.
According to the server testing device, test data is sent to at least one test device corresponding to the server to be tested, the test data including: a test case set, a check specification corresponding to the return value of each test case in the test case set, and destination location information; a test instruction is sent to the at least one test device so that each test device tests the interfaces of the server to be tested according to the test case set, obtains return values, and generates an error log according to the return values and the corresponding check specifications; and when the destination location information is consistent with the device's own location information, the corresponding error logs are acquired for analysis. Test data can thus be deployed to each test device automatically, and the error logs collected and analyzed automatically, which improves server testing efficiency and reduces testing cost.
To achieve the above object, an embodiment of a third aspect of the present invention provides another server testing device, including: a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the server testing method described above when executing the program.
To achieve the above object, an embodiment of a fourth aspect of the present invention provides a non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the server testing method described above.
To achieve the above object, an embodiment of a fifth aspect of the present invention proposes a computer program product; when the instructions in the computer program product are executed by a processor, the server testing method described above is implemented.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The foregoing and/or additional aspects and advantages of the invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a schematic flowchart of a server testing method according to an embodiment of the present invention;
Fig. 2 is a schematic flowchart of another server testing method according to an embodiment of the present invention;
Fig. 3 is a schematic structural diagram of a server testing device according to an embodiment of the present invention;
Fig. 4 is a schematic structural diagram of another server testing device according to an embodiment of the present invention;
Fig. 5 is a schematic structural diagram of another server testing device according to an embodiment of the present invention;
Fig. 6 is a schematic structural diagram of another server testing device according to an embodiment of the present invention.
Detailed Description
Embodiments of the present invention are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are illustrative and intended to explain the present invention and should not be construed as limiting the invention.
The server testing method and device according to the embodiment of the invention are described below with reference to the accompanying drawings.
Fig. 1 is a flow chart of a server testing method according to an embodiment of the present invention. As shown in fig. 1, the server testing method includes the following steps:
s101, sending test data to at least one test device corresponding to a server to be tested; the test data include: the test case set, a check specification corresponding to the return value of each test case in the test case set, and destination location information.
The server testing method provided by the invention is executed by a server testing device, which may be a hardware device such as a terminal device, a server, or an automated test platform, or software installed on such a hardware device. In this embodiment, the server testing device is taken to be an automated test platform for illustration.
In this embodiment, the at least one test device refers to the devices that execute the test cases to call the interfaces of the server and obtain the interface return values. A test device may be a mobile terminal or a server. It should be noted that the automated test platform may be a server cluster, and the test devices may be integrated into the platform, in which case the platform deploys the test cases on the test devices and controls them to run the tests.
In this embodiment, the test case set may include: subsets of test cases for all or some of the interfaces of the server to be tested; the test cases in each subset are generated by combining values of the parameters of the corresponding interface. The automated test platform may deploy the subsets covering all interfaces of the server to be tested on every test device; alternatively, the platform may partition the interfaces of the server to be tested and deploy the test case subsets of different interfaces to different test devices.
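As an illustration of this combination step, the following is a minimal sketch in Python; the interface name and the candidate parameter values are hypothetical placeholders, not values prescribed by the patent.

```python
# Minimal sketch: build one test case per combination of parameter values
# for a single interface. Interface name and values are hypothetical.
from itertools import product

def generate_cases(interface: str, param_values: dict) -> list:
    """Return a test-case subset covering every combination of values."""
    names = list(param_values)
    cases = []
    for combo in product(*(param_values[n] for n in names)):
        cases.append({"interface": interface, "params": dict(zip(names, combo))})
    return cases

# Example: a hypothetical /user/query interface with two parameters.
cases = generate_cases("/user/query", {"uid": [0, 1, -1], "page": [1, 100]})
print(len(cases))  # 3 * 2 = 6 combined test cases
```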
In this embodiment, the data format of a test case's return value may be json, and the check specification corresponding to the return value may be a schema check specification. A schema check specification is itself written in json and can constrain everything from the number of fields in an interface's return value down to the type of each individual field, so it validates return values effectively. The check specification corresponding to the return value may also follow another convention, set according to actual needs.
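For example, the following sketch validates a json return value against a schema check specification using the jsonschema package; the schema, field names, and return value shown are illustrative assumptions.

```python
# Minimal sketch: check a json return value against a schema check
# specification (pip install jsonschema). Schema and value are assumed.
from jsonschema import validate, ValidationError

schema = {
    "type": "object",
    "properties": {
        "errno": {"type": "integer"},  # type constraint on one field
        "data": {"type": "array"},
    },
    "required": ["errno", "data"],     # constraint on the set of fields
}

return_value = {"errno": 0, "data": []}

try:
    validate(instance=return_value, schema=schema)  # raises on mismatch
    print("return value conforms to the check specification")
except ValidationError as err:
    print("error return value:", err.message)
```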
In this embodiment, the destination location information is the location information of the destination device on which the error logs are to be analyzed, for example an identification or an IP address of the destination device. The destination device may be the device used by a particular tester; by setting the destination location information, the error logs produced on each test device can be gathered to the respective testers so that they can analyze them, sparing the testers from collecting the logs manually, which further improves server testing efficiency and further reduces testing cost.
In addition, the test data sent to each test device by the automated test platform may omit the check specifications, with the destination location information pointing to the automated test platform itself. Each test device then sends its test results, including the return values, directly to the platform, and the platform generates the test logs from the results and the check specifications and distributes the logs to the testers for analysis.
S102, sending a test instruction to the at least one test device so that the at least one test device tests each interface of the server to be tested according to the test case set, obtains return values, and generates an error log according to the return values and the corresponding check specifications.
In this embodiment, the check specification corresponding to a return value defines the preset conditions that each field in the return value must satisfy. Correspondingly, the error log is generated by judging, for the return value of each test case, whether any field fails its corresponding preset condition; if such fields exist, the return value is determined to be an error return value, and the error log is generated from the error return values.
The error log may include: the error return value, and the corresponding interface and test case. The error log may further include: the error fields within the error return value, and so on. Conversely, if every field in a return value satisfies its corresponding preset condition, the return value is determined to be a correct return value.
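A minimal sketch of this error-log generation step might look as follows; the field names, preset conditions, and log layout are illustrative assumptions rather than the patent's prescribed format.

```python
# Minimal sketch: check each field of every return value against its preset
# condition and record failing return values with their interface and case.
def build_error_log(results, conditions):
    """results: dicts with 'interface', 'case', 'return_value'.
    conditions: field name -> predicate the field's value must satisfy."""
    error_log = []
    for r in results:
        bad_fields = [f for f, ok in conditions.items()
                      if not ok(r["return_value"].get(f))]
        if bad_fields:  # at least one field violates its preset condition
            error_log.append({
                "interface": r["interface"],
                "case": r["case"],
                "return_value": r["return_value"],  # the error return value
                "error_fields": bad_fields,
            })
    return error_log

conditions = {"errno": lambda v: v == 0, "data": lambda v: isinstance(v, list)}
results = [{"interface": "/user/query", "case": {"uid": -1},
            "return_value": {"errno": 1, "data": None}}]
print(build_error_log(results, conditions))
```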
S103, when the destination location information is consistent with the device's own location information, acquiring the corresponding error log so as to analyze the error log.
In this embodiment, the own location information is the location information of the server testing device. Taking the server testing device to be an automated test platform as an example, the own location information is the location information of the platform. When the destination location information is consistent with the platform's location information, the platform is the destination device for error log analysis, so the platform acquires the corresponding error logs and analyzes them. When the destination location information is inconsistent with the platform's location information, the device to perform error log analysis is not the automated test platform but some other device, and that device acquires the corresponding error logs and analyzes them.
In this embodiment, by setting the destination location information, the error logs generated by each test device can be returned to one or more testers so that the testers can analyze them, avoiding manual collection of the error logs.
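The log regression step could be sketched as follows, assuming the destination location information is an IP address and the error logs are plain files; both assumptions are ours, not the patent's.

```python
# Minimal sketch: collect error logs only when the destination location
# information matches this device's own address. Paths are assumed.
import socket

def collect_error_logs(destination_ip: str, log_paths: list) -> list:
    own_ip = socket.gethostbyname(socket.gethostname())  # own location info
    if destination_ip != own_ip:
        return []  # another device is the analysis destination
    logs = []
    for path in log_paths:
        with open(path, encoding="utf-8") as f:
            logs.append(f.read())
    return logs
```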
Further, on the basis of the foregoing embodiment, before step S101, the method may further include: acquiring current environment information of the server to be tested, the environment information indicating any one of the following environments: a test environment, a sandbox environment, and an online environment. Correspondingly, step S101 may specifically send test data corresponding to the current environment information to the at least one test device corresponding to the server to be tested.
The testing purpose of the server to be tested differs across environments such as the test environment, the sandbox environment, and the online environment, and the corresponding test cases differ accordingly; therefore, different test data need to be deployed to the at least one test device corresponding to the server to be tested according to the environment information.
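A minimal sketch of selecting environment-specific test data follows; the three environment keys come from the text, while the file names are assumed placeholders.

```python
# Minimal sketch: pick the test data matching the server's current
# environment. The data file names are hypothetical.
TEST_DATA_BY_ENV = {
    "test":    {"cases": "cases_test.json",    "schemas": "schemas_test.json"},
    "sandbox": {"cases": "cases_sandbox.json", "schemas": "schemas_sandbox.json"},
    "online":  {"cases": "cases_online.json",  "schemas": "schemas_online.json"},
}

def select_test_data(environment: str) -> dict:
    try:
        return TEST_DATA_BY_ENV[environment]
    except KeyError:
        raise ValueError(f"unknown environment: {environment}")

print(select_test_data("sandbox"))
```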
According to the server testing method, test data is sent to at least one test device corresponding to the server to be tested, the test data including: a test case set, a check specification corresponding to the return value of each test case in the test case set, and destination location information; a test instruction is sent to the at least one test device so that each test device tests the interfaces of the server to be tested according to the test case set, obtains return values, and generates an error log according to the return values and the corresponding check specifications; and when the destination location information is consistent with the device's own location information, the corresponding error logs are acquired for analysis. Test data can thus be deployed to each test device automatically, and the error logs collected and analyzed automatically, which improves server testing efficiency and reduces testing cost.
Fig. 2 is a flowchart of another server testing method according to an embodiment of the present invention. As shown in fig. 2, on the basis of the embodiment shown in fig. 1, the method may further include the following steps:
s104, acquiring a query statement set corresponding to the server to be tested; the query statement set includes: each query statement in the execution program of the server to be tested.
The query statement set may be compiled by testers from the execution program; alternatively, key fields of query statements may be obtained in advance, the execution program searched for statements containing those key fields, and the matching statements determined to be the query statements. A query statement may be, for example, a Structured Query Language (SQL) statement.
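The key-field search could be sketched as follows in Python; the regular expression, the key fields chosen, and the sample source line are illustrative assumptions.

```python
# Minimal sketch: collect query statements from program source by searching
# for key fields (here the SQL keywords SELECT/INSERT/UPDATE/DELETE inside
# double-quoted string literals). The source text is a hypothetical example.
import re

SQL_PATTERN = re.compile(
    r'"((?:SELECT|INSERT|UPDATE|DELETE)\b[^"]*)"', re.IGNORECASE)

def extract_query_statements(source: str) -> list:
    return SQL_PATTERN.findall(source)

source = 'rows = db.execute("SELECT * FROM users WHERE uid = %s", uid)'
print(extract_query_statements(source))
# ['SELECT * FROM users WHERE uid = %s']
```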
S105, for each query statement in the query statement set, acquiring the check specification corresponding to the query statement.
The check specification corresponding to a query statement defines the conditions that each field of the query statement must satisfy.
S106, comparing the query statement with the corresponding check specification to obtain the query statement which does not meet the check specification, so as to analyze the query statement which does not meet the check specification.
In this embodiment, by acquiring the query statement set of the execution program of the server to be tested and judging whether each query statement meets its corresponding check specification, query statements that violate the specification can be detected and handled, which resolves latent hazards such as slow queries and improves the execution efficiency of the execution program of the server to be tested.
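As a sketch of this comparison, the rules below flag two common slow-or-unsafe query patterns; both rules and the sample statements are illustrative assumptions, not the patent's check specification.

```python
# Minimal sketch: compare each query statement against a check specification
# expressed as rules; statements violating any rule are returned for analysis.
import re

RULES = [
    ("avoid SELECT *", lambda q: not re.search(r"SELECT\s+\*", q, re.I)),
    ("UPDATE/DELETE needs WHERE",
     lambda q: not re.match(r"\s*(UPDATE|DELETE)\b", q, re.I)
               or re.search(r"\bWHERE\b", q, re.I) is not None),
]

def check_queries(queries):
    failures = []
    for q in queries:
        broken = [name for name, ok in RULES if not ok(q)]
        if broken:  # statement does not meet the check specification
            failures.append((q, broken))
    return failures

print(check_queries(["SELECT * FROM users", "DELETE FROM users"]))
```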
Fig. 3 is a schematic structural diagram of a server testing device according to an embodiment of the present invention. As shown in fig. 3, includes: a deployment module 31, a test module 32, and a log regression module 33.
The deployment module 31 is configured to send test data to at least one test device corresponding to a server to be tested; the test data includes: a test case set, a check specification corresponding to the return value of each test case in the test case set, and destination location information;
the test module 32 is configured to send a test instruction to the at least one test device so that each test device tests the interfaces of the server to be tested according to the test case set, obtains return values, and generates an error log according to the return values and the corresponding check specifications;
and the log regression module 33 is configured to acquire the corresponding error log when the destination location information is consistent with its own location information, so as to analyze the error log.
The server testing device provided by the invention may be a hardware device such as a terminal device, a server, or an automated test platform, or software installed on such a hardware device. In this embodiment, the server testing device is taken to be an automated test platform for illustration.
In this embodiment, the at least one test device refers to the devices that execute the test cases to call the interfaces of the server and obtain the interface return values. A test device may be a mobile terminal or a server. It should be noted that the automated test platform may be a server cluster, and the test devices may be integrated into the platform, in which case the platform deploys the test cases on the test devices and controls them to run the tests.
In this embodiment, the test case set may include: subsets of test cases for all or some of the interfaces of the server to be tested; the test cases in each subset are generated by combining values of the parameters of the corresponding interface. The automated test platform may deploy the subsets covering all interfaces of the server to be tested on every test device; alternatively, the platform may partition the interfaces of the server to be tested and deploy the test case subsets of different interfaces to different test devices.
In this embodiment, the data format of a test case's return value may be json, and the check specification corresponding to the return value may be a schema check specification. A schema check specification is itself written in json and can constrain everything from the number of fields in an interface's return value down to the type of each individual field, so it validates return values effectively. The check specification corresponding to the return value may also follow another convention, set according to actual needs.
In this embodiment, the destination location information is the location information of the destination device on which the error logs are to be analyzed, for example an identification or an IP address of the destination device. The destination device may be the device used by a particular tester; by setting the destination location information, the error logs produced on each test device can be gathered to the respective testers so that they can analyze them, sparing the testers from collecting the logs manually, which further improves server testing efficiency and further reduces testing cost.
In this embodiment, the own location information is the location information of the server testing device. Taking the server testing device to be an automated test platform as an example, the own location information is the location information of the platform. When the destination location information is consistent with the platform's location information, the platform is the destination device for error log analysis, so the platform acquires the corresponding error logs and analyzes them. When the destination location information is inconsistent with the platform's location information, the device to perform error log analysis is not the automated test platform but some other device, and that device acquires the corresponding error logs and analyzes them.
In addition, the test data sent to each test device by the automated test platform may omit the check specifications, with the destination location information pointing to the automated test platform itself. Each test device then sends its test results, including the return values, directly to the platform, and the platform generates the test logs from the results and the check specifications and distributes the logs to the testers for analysis.
Further, on the basis of the above embodiment, the check specification corresponding to a return value defines the preset conditions that each field in the return value must satisfy. Correspondingly, the error log is generated by judging, for the return value of each test case, whether any field fails its corresponding preset condition; if such fields exist, the return value is determined to be an error return value, and the error log is generated from the error return values.
The error log may include: the error return value, and the corresponding interface and test case. The error log may further include: the error fields within the error return value, and so on. Conversely, if every field in a return value satisfies its corresponding preset condition, the return value is determined to be a correct return value.
Further, referring to Fig. 4 in combination, on the basis of the embodiment shown in Fig. 3, the apparatus may further include: a second acquisition module 34, configured to acquire current environment information of the server to be tested; the environment information indicates any one of the following environments: a test environment, a sandbox environment, and an online environment;
correspondingly, the deployment module 31 is specifically configured to send test data corresponding to the current environment information to the at least one test device corresponding to the server to be tested.
The testing purpose of the server to be tested differs across environments such as the test environment, the sandbox environment, and the online environment, and the corresponding test cases differ accordingly; therefore, different test data need to be deployed to the at least one test device corresponding to the server to be tested according to the environment information.
According to the server testing device, test data is sent to at least one test device corresponding to the server to be tested, the test data including: a test case set, a check specification corresponding to the return value of each test case in the test case set, and destination location information; a test instruction is sent to the at least one test device so that each test device tests the interfaces of the server to be tested according to the test case set, obtains return values, and generates an error log according to the return values and the corresponding check specifications; and when the destination location information is consistent with the device's own location information, the corresponding error logs are acquired for analysis. Test data can thus be deployed to each test device automatically, and the error logs collected and analyzed automatically, which improves server testing efficiency and reduces testing cost.
Further, referring to fig. 5 in combination, on the basis of the embodiment shown in fig. 3, the apparatus may further include: a first acquisition module 35 and a comparison module 36.
The first acquisition module 35 is configured to acquire a query statement set corresponding to the server to be tested; the query statement set includes: each query statement in the execution program of the server to be tested;
the first acquisition module 35 is further configured to acquire, for each query statement in the query statement set, the check specification corresponding to the query statement;
and the comparison module 36 is configured to compare each query statement with its corresponding check specification and obtain the query statements that do not meet the check specification, so as to analyze the query statements that do not meet the check specification.
The query statement set may be compiled by testers from the execution program; alternatively, key fields of query statements may be obtained in advance, the execution program searched for statements containing those key fields, and the matching statements determined to be the query statements. A query statement may be, for example, a Structured Query Language (SQL) statement.
In this embodiment, by acquiring the query statement set of the execution program of the server to be tested and judging whether each query statement meets its corresponding check specification, query statements that violate the specification can be detected and handled, which resolves latent hazards such as slow queries and improves the execution efficiency of the execution program of the server to be tested.
Fig. 6 is a schematic structural diagram of another server testing device according to an embodiment of the present invention. The server test device includes:
memory 1001, processor 1002, and a computer program stored on memory 1001 and executable on processor 1002.
The processor 1002 implements the server test method provided in the above embodiment when executing the program.
Further, the server test device further includes:
a communication interface 1003 for communication between the memory 1001 and the processor 1002.
Memory 1001 for storing computer programs that may be run on processor 1002.
The memory 1001 may include high-speed RAM and may also include non-volatile memory, such as at least one magnetic disk memory.
And a processor 1002, configured to implement the server testing method described in the foregoing embodiment when executing the program.
If the memory 1001, the processor 1002, and the communication interface 1003 are implemented independently, they may be connected to one another through a bus and communicate with one another. The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. Buses may be classified as address buses, data buses, control buses, and so on. For ease of illustration, only one thick line is shown in Fig. 6, but this does not mean there is only one bus or only one type of bus.
Alternatively, in a specific implementation, if the memory 1001, the processor 1002, and the communication interface 1003 are integrated on a chip, the memory 1001, the processor 1002, and the communication interface 1003 may complete communication with each other through internal interfaces.
The processor 1002 may be a central processing unit (Central Processing Unit, abbreviated as CPU) or an application specific integrated circuit (Application Specific Integrated Circuit, abbreviated as ASIC) or one or more integrated circuits configured to implement embodiments of the present invention.
The present invention also provides a non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a server testing method as described above.
The invention also provides a computer program product; when the instructions in the computer program product are executed by a processor, the server testing method described above is implemented.
In the description of the present specification, a description referring to terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic representations of the above terms are not necessarily directed to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, the different embodiments or examples described in this specification and the features of the different embodiments or examples may be combined and combined by those skilled in the art without contradiction.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include at least one such feature. In the description of the present invention, the meaning of "plurality" means at least two, for example, two, three, etc., unless specifically defined otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and additional implementations are included within the scope of the preferred embodiment of the present invention in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order from that shown or discussed, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the embodiments of the present invention.
Logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, a processor-containing system, or another system that can fetch the instructions from the instruction execution system, apparatus, or device and execute them. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). In addition, the computer-readable medium may even be paper or another suitable medium on which the program is printed, as the program may be electronically captured via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It is to be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above-described embodiments, the steps or methods may be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system. If, as in another embodiment, they are implemented in hardware, any one of, or a combination of, the following techniques well known in the art may be used: discrete logic circuits having logic gates for implementing logic functions on data signals, application-specific integrated circuits having suitable combinational logic gates, programmable gate arrays (PGA), field-programmable gate arrays (FPGA), and the like.
Those of ordinary skill in the art will appreciate that all or a portion of the steps carried out in the method of the above-described embodiments may be implemented by a program to instruct related hardware, where the program may be stored in a computer readable storage medium, and where the program, when executed, includes one or a combination of the steps of the method embodiments.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing module, or each unit may exist alone physically, or two or more units may be integrated in one module. The integrated modules may be implemented in hardware or in software functional modules. The integrated modules may also be stored in a computer readable storage medium if implemented in the form of software functional modules and sold or used as a stand-alone product.
The above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, or the like. While embodiments of the present invention have been shown and described above, it will be understood that the above embodiments are illustrative and not to be construed as limiting the invention, and that variations, modifications, alternatives and variations may be made to the above embodiments by one of ordinary skill in the art within the scope of the invention.

Claims (12)

1. A server testing method, comprising:
sending test data to at least one test device corresponding to a server to be tested; the test data includes: a test case set, a check specification corresponding to the return value of each test case in the test case set, and destination location information;
sending a test instruction to the at least one test device so that the at least one test device tests each interface of the server to be tested according to the test case set, obtains return values, and generates an error log according to the return values and the corresponding check specifications;
when the destination location information is consistent with its own location information, acquiring the corresponding error log so as to analyze the error log;
acquiring a query statement set corresponding to the server to be tested; the query statement set includes: each query statement in the execution program of the server to be tested;
for each query statement in the query statement set, acquiring the check specification corresponding to the query statement;
and comparing the query statement with the corresponding check specification to obtain the query statement which does not meet the check specification so as to analyze the query statement which does not meet the check specification.
2. The method of claim 1, wherein the check specification corresponding to the return value defines a preset condition to be satisfied by each field in the return value;
correspondingly, the error log is generated as follows:
judging, for the return value of each test case, whether the return value contains fields that do not satisfy their corresponding preset conditions;
if fields which do not meet the corresponding preset conditions exist in the return value, determining the return value as an error return value;
and generating an error log according to the error return value.
3. The method of claim 1, wherein the set of test cases comprises: a subset of test cases for each interface of the server to be tested;
the test cases in each subset are generated by combining values of the parameters of the corresponding interface.
4. The method of claim 1, wherein before sending test data to at least one test device corresponding to a server to be tested, further comprising:
acquiring current environment information of the server to be tested; the environment information indicates any one of the following environments: a test environment, a sandbox environment, and an online environment;
correspondingly, the sending the test data to at least one test device corresponding to the server to be tested includes:
and sending test data corresponding to the current environment information to the at least one test device corresponding to the server to be tested.
5. The method of claim 1, wherein the data format of the return value is json format; and the check specification corresponding to the return value is a schema check specification.
6. A server testing apparatus, comprising:
the deployment module is used for sending test data to at least one test device corresponding to the server to be tested; the test data includes: a test case set, a check specification corresponding to the return value of each test case in the test case set, and destination location information;
the test module is used for sending a test instruction to the at least one test device so that each test device tests the interfaces of the server to be tested according to the test case set, obtains return values, and generates an error log according to the return values and the corresponding check specifications;
the log regression module is used for acquiring the corresponding error log when the destination location information is consistent with its own location information, so as to analyze the error log;
the first acquisition module is used for acquiring a query statement set corresponding to the server to be tested; the query statement set includes: each query statement in the execution program of the server to be tested;
the first acquisition module is further configured to acquire, for each query statement in the query statement set, the check specification corresponding to the query statement;
and the comparison module is used for comparing the query statement with the corresponding check specification to acquire the query statement which does not meet the check specification so as to analyze the query statement which does not meet the check specification.
7. The apparatus of claim 6, wherein the check specification corresponding to the return value defines a preset condition to be satisfied by each field in the return value;
correspondingly, the error log is generated as follows:
judging, for the return value of each test case, whether the return value contains fields that do not satisfy their corresponding preset conditions;
if fields which do not meet the corresponding preset conditions exist in the return value, determining the return value as an error return value;
and generating an error log according to the error return value.
8. The apparatus of claim 6, wherein the set of test cases comprises: a subset of test cases for each interface of the server to be tested;
the test cases in each subset are generated by combining values of the parameters of the corresponding interface.
9. The apparatus as recited in claim 6, further comprising: a second acquisition module;
the second acquisition module is used for acquiring current environment information of the server to be tested; the environment information indicates any one of the following environments: a test environment, a sandbox environment, and an online environment;
correspondingly, the deployment module is specifically configured to send test data corresponding to the current environment information to the at least one test device corresponding to the server to be tested.
10. The apparatus of claim 6, wherein the data format of the return value is json format; and the check specification corresponding to the return value is a schema check specification.
11. A server testing apparatus, comprising:
memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the server testing method according to any of claims 1-5 when executing the program.
12. A non-transitory computer readable storage medium having stored thereon a computer program, which when executed by a processor implements the server testing method according to any of claims 1-5.
CN201811080414.4A 2018-09-17 2018-09-17 Server testing method and device Active CN110908888B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811080414.4A CN110908888B (en) 2018-09-17 2018-09-17 Server testing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811080414.4A CN110908888B (en) 2018-09-17 2018-09-17 Server testing method and device

Publications (2)

Publication Number Publication Date
CN110908888A CN110908888A (en) 2020-03-24
CN110908888B (en) 2023-06-30

Family

ID=69812621

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811080414.4A Active CN110908888B (en) 2018-09-17 2018-09-17 Server testing method and device

Country Status (1)

Country Link
CN (1) CN110908888B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111475409B (en) * 2020-03-30 2023-06-30 深圳追一科技有限公司 System test method, device, electronic equipment and storage medium
CN111694734A (en) * 2020-05-26 2020-09-22 五八有限公司 Software interface checking method and device and computer equipment
CN113297063B (en) * 2020-06-05 2024-11-05 阿里巴巴集团控股有限公司 Use case generation method, device, server and storage medium
CN111679989A (en) * 2020-06-16 2020-09-18 贝壳技术有限公司 Interface robustness testing method and device, electronic equipment and storage medium
CN113934618A (en) * 2020-07-14 2022-01-14 深圳兆日科技股份有限公司 Interface test case generation method, device, generator and readable storage medium
CN111885051B (en) * 2020-07-22 2022-10-25 微医云(杭州)控股有限公司 Data verification method and device and electronic equipment
CN113904954B (en) * 2021-08-27 2023-09-01 深圳市有方科技股份有限公司 System for testing wireless communication module
CN113806154A (en) * 2021-09-16 2021-12-17 海光信息技术股份有限公司 Test method, test system, test equipment and readable storage medium
CN114281613B (en) * 2021-11-19 2024-01-09 苏州浪潮智能科技有限公司 Server testing method and device, computer equipment and storage medium
CN114816876B (en) * 2022-04-25 2024-06-21 宝德计算机系统股份有限公司 Automatic test system for server Redfish interface specifications

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106776307A (en) * 2016-12-05 2017-05-31 广州唯品会信息科技有限公司 Method for testing software and system
CN106815138A (en) * 2015-12-01 2017-06-09 北京奇虎科技有限公司 A kind of method and apparatus for generating interface testing use-case
CN107741911A (en) * 2017-11-01 2018-02-27 广州爱九游信息技术有限公司 Interface test method, device, client and computer-readable recording medium

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6044398A (en) * 1997-11-21 2000-03-28 International Business Machines Corporation Virtual dynamic browsing system and method for automated web server and testing
CN104111885B (en) * 2013-04-22 2017-09-15 腾讯科技(深圳)有限公司 The method of calibration and device of interface testing result
US20150370691A1 (en) * 2014-06-18 2015-12-24 EINFOCHIPS Inc System testing of software programs executing on modular frameworks
CN105373469B (en) * 2014-08-25 2018-09-04 广东金赋科技股份有限公司 A kind of software automated testing system and method based on interface
CN104360920B (en) * 2014-12-02 2018-06-26 微梦创科网络科技(中国)有限公司 A kind of automatic interface testing method and device
CN104468275A (en) * 2014-12-18 2015-03-25 辽宁生产力促进中心 Industrial cluster creative platform testing device and method
CN105550113B (en) * 2015-12-18 2019-01-22 网易(杭州)网络有限公司 Web test method and test machine
CN107193681B (en) * 2016-03-15 2020-07-31 阿里巴巴集团控股有限公司 Data verification method and device
CN106095673B (en) * 2016-06-07 2018-12-14 深圳市泰久信息系统股份有限公司 Automated testing method and system based on WEB interface
CN106874192B (en) * 2017-01-03 2020-02-04 中国科学院自动化研究所 Digital publication-oriented standard conformance testing method and system
CN107294808B (en) * 2017-07-05 2020-11-24 网易(杭州)网络有限公司 Interface test method, device and system
CN107643981A (en) * 2017-08-29 2018-01-30 顺丰科技有限公司 A kind of automatic test platform and operation method of polynary operation flow
CN107729243B (en) * 2017-10-12 2020-06-16 上海携程金融信息服务有限公司 Application programming interface automatic test method, system, equipment and storage medium

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106815138A (en) * 2015-12-01 2017-06-09 北京奇虎科技有限公司 A kind of method and apparatus for generating interface testing use-case
CN106776307A (en) * 2016-12-05 2017-05-31 广州唯品会信息科技有限公司 Method for testing software and system
CN107741911A (en) * 2017-11-01 2018-02-27 广州爱九游信息技术有限公司 Interface test method, device, client and computer-readable recording medium

Also Published As

Publication number Publication date
CN110908888A (en) 2020-03-24

Similar Documents

Publication Publication Date Title
CN110908888B (en) Server testing method and device
CN108563214B (en) Vehicle diagnosis method, device and equipment
CN110673576B (en) Automatic test method and device, vehicle and storage medium
CN111045944A (en) Regression testing method, device and system and computer readable storage medium
CN113778557B (en) Vehicle diagnosis software configuration method, device, server and storage medium
CN113608518B (en) Data generation method, device, terminal equipment and medium
EP4246152A1 (en) Vehicle detection method, apparatus and device
CN109669436B (en) Test case generation method and device based on functional requirements of electric automobile
CN111309602A (en) Software testing method, device and system
CN115686608A (en) Software version management method and device for vehicle, server and storage medium
CN111693294A (en) Vehicle detection method and device, terminal equipment and storage medium
CN109960656B (en) Program detection method and device and electronic equipment
CN117724982A (en) Simulation evaluation method and device, electronic equipment and storage medium
CN113133041B (en) Method and device for testing vehicle-to-vehicle communication function in dynamic interval train control vehicle
CN116257437A (en) ADAS system defect verification method and device based on real vehicle data reinjection
CN110442370B (en) Test case query method and device
CN115685959A (en) Diagnostic write configuration test method, device, equipment and storage medium
CN111198774A (en) Unmanned vehicle simulation abnormity tracking method, device, equipment and computer readable medium
CN113821431B (en) Test result acquisition method and device, electronic equipment and storage medium
CN111143961B (en) Unmanned vehicle simulation monitoring and positioning method, device and storage medium
CN114968827B (en) Vehicle bus signal information verification method and system
CN113836012B (en) Algorithm testing method and device, electronic equipment and storage medium
CN115327953B (en) Simulation test method and device for intelligent driving algorithm, electronic equipment and medium
CN117250940A (en) Automatic test method and device for vehicle, server and storage medium
CN115657633A (en) Electronic control unit electric detection method and device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant