CN109976997B - Test method and device - Google Patents
- Publication number
- CN109976997B CN109976997B CN201711458320.1A CN201711458320A CN109976997B CN 109976997 B CN109976997 B CN 109976997B CN 201711458320 A CN201711458320 A CN 201711458320A CN 109976997 B CN109976997 B CN 109976997B
- Authority
- CN
- China
- Prior art keywords
- test
- generated
- parameter
- request message
- model
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Preventing errors by testing or debugging software
- G06F11/3668—Software testing
- G06F11/3672—Test management
- G06F11/3684—Test management for test design, e.g. generating new test cases
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Preventing errors by testing or debugging software
- G06F11/3668—Software testing
- G06F11/3672—Test management
- G06F11/3688—Test management for test execution, e.g. scheduling of test suites
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Computer Hardware Design (AREA)
- Quality & Reliability (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Debugging And Monitoring (AREA)
Abstract
The embodiment of the application discloses a testing method and a testing device. One embodiment of the method comprises: receiving a test request, wherein the test request comprises identification information of a program segment to be tested and a parameter name of a test parameter; processing a request message sent by a client in a target time period through a program segment to be tested indicated by the identification information; acquiring parameter values of test parameters generated when the request message is processed according to the parameter names; generating a test result according to the generated parameter value. The embodiment provides a test mechanism for generating test results based on parameter values generated during processing of the request message, and enriches the test methods.
Description
Technical Field
The embodiment of the application relates to the technical field of computers, in particular to a testing method and a testing device.
Background
With the ever wider development and application of the Internet, more and more systems provide website services, and the testing of each functional module of a website system receives more and more attention. In existing test systems, the quality of each functional module of a website is often judged manually, by people relying on experience.
Disclosure of Invention
The embodiment of the application provides a test method and a test device.
In a first aspect, an embodiment of the present application provides a testing method, where the method includes: receiving a test request, wherein the test request comprises identification information of a program segment to be tested and a parameter name of a test parameter; processing a request message sent by a client in a target time period through a program segment to be tested indicated by the identification information; acquiring parameter values of test parameters generated when the request message is processed according to the parameter names; generating a test result according to the generated parameter value.
In some embodiments, the length of the target time period is determined based on whether the number of generated parameter values reaches a preset number threshold.
In some embodiments, generating test results from the generated parameter values comprises: sampling the generated parameter values; and generating a test result according to the sampled parameter values.
In some embodiments, the target time period comprises at least two sub-time periods; and sampling the generated parameter values, including: counting the quantity of parameter values generated when the request messages sent by the client in each sub-time period are processed; and extracting parameter values from the parameter values generated when the request messages sent by the client in each sub-period are processed according to the counted number and the preset sampling proportion.
In some embodiments, the test request includes identification information of at least two program segments to be tested for implementing the same function; and processing a request message sent by the client in the target time period through the program segment to be tested indicated by the identification information, wherein the request message comprises: and distributing the request message sent by the client in the target time period to each program segment to be tested indicated by the identification information for processing.
In some embodiments, generating test results from the generated parameter values comprises: transmitting the generated parameter value to the target device; acquiring marking information which is returned by the target equipment and is associated with the generated parameter value; and generating a test result according to the acquired marking information.
In some embodiments, the program segment to be tested includes a program for implementing a first model obtained by pre-training the initial model; and after generating the test result according to the acquired labeling information, the method further comprises the following steps: and training the initial model by using a machine learning method based on the acquired labeling information to obtain a second model.
In some embodiments, training the initial model based on the obtained labeling information by using a machine learning method to obtain a second model comprises: screening the marking information matched with the predefined negative marking information type in the obtained marking information; determining the screened marking information, the request message associated with the screened marking information and the parameter value associated with the screened marking information as negative sample data; and training the initial model by using a machine learning method to obtain a second model based on the negative sample data determined in the time interval at intervals of preset time.
In some embodiments, the method further comprises: generating indicator data for the second model based on at least one of the following test sets: a preset test set and a test set composed of historical negative sample data; and responding to the generated index data matched with a preset index data range used for indicating that the test is passed, and processing a request message sent by the client through the second model.
In a second aspect, an embodiment of the present application provides a testing apparatus, including: the device comprises a receiving unit, a processing unit and a processing unit, wherein the receiving unit is used for receiving a test request, and the test request comprises identification information of a program segment to be tested and a parameter name of a test parameter; the first processing unit is used for processing a request message sent by a client in a target time period through the program segment to be tested indicated by the identification information; the acquisition unit is used for acquiring the parameter value of the test parameter generated when the request message is processed according to the parameter name; and the generating unit is used for generating a test result according to the generated parameter value.
In some embodiments, the length of the target time period is determined based on whether the number of generated parameter values reaches a preset number threshold.
In some embodiments, the generating unit comprises: a sampling subunit for sampling the generated parameter values; and the generating subunit is used for generating a test result according to the sampled parameter value.
In some embodiments, the target time period comprises at least two sub-time periods; and a sampling subunit further configured to: counting the quantity of parameter values generated when the request messages sent by the client in each sub-time period are processed; and extracting parameter values from the parameter values generated when the request messages sent by the client in each sub-period are processed according to the counted number and the preset sampling proportion.
In some embodiments, the test request includes identification information of at least two program segments to be tested for implementing the same function; and a first processing unit further configured to: and distributing the request message sent by the client in the target time period to each program segment to be tested indicated by the identification information for processing.
In some embodiments, the generating unit is further configured to: transmitting the generated parameter value to the target device; acquiring marking information which is returned by the target equipment and is associated with the generated parameter value; and generating a test result according to the acquired marking information.
In some embodiments, the program segment to be tested includes a program for implementing a first model obtained by training an initial model in advance; and the apparatus further comprises: and the training unit is used for training the initial model by utilizing a machine learning method based on the acquired labeling information to obtain a second model.
In some embodiments, a training unit, comprises: the screening subunit is used for screening the marking information matched with the predefined negative marking information type in the acquired marking information; the determining subunit is used for determining the screened annotation information, the request message associated with the screened annotation information and the parameter value associated with the screened annotation information as negative sample data; and the training subunit is used for training the initial model by using a machine learning method to obtain a second model based on the negative sample data determined in the time interval at intervals of preset time.
In some embodiments, the apparatus further comprises: a test unit for generating index data of the second model based on at least one of the following test sets: a preset test set and a test set consisting of historical negative sample data; and the second processing unit is used for responding to the generated index data matched with a preset index data range used for indicating that the test is passed, and processing the request message sent by the client through a second model.
In a third aspect, an embodiment of the present application provides an apparatus, including: one or more processors; storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to carry out the method as described above in relation to the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium, on which a computer program is stored, wherein the computer program is configured to, when executed by a processor, implement the method as described above in the first aspect.
According to the testing method and the testing device provided by the embodiment of the application, the testing request is received, the request message sent by the client in the target time period is processed through the program segment to be tested indicated by the identification information, the parameter value of the testing parameter generated when the request message is processed is obtained according to the parameter name, and the testing result is generated according to the generated parameter value, so that a testing mechanism for generating the testing result based on the parameter value generated when the request message is processed is provided, and the testing method is enriched.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the detailed description of non-limiting embodiments made with reference to the following drawings:
FIG. 1 is an exemplary system architecture diagram to which the present application may be applied;
FIG. 2 is a flow diagram of one embodiment of a testing method according to the present application;
FIG. 3 is a schematic diagram of an application scenario of a testing method according to the present application;
FIG. 4 is a flow chart of yet another embodiment of a testing method according to the present application;
FIG. 5 is a schematic block diagram of one embodiment of a test apparatus according to the present application;
FIG. 6 is a schematic block diagram of a computer system suitable for use in implementing a server according to embodiments of the present application.
Detailed Description
The present application will be described in further detail below with reference to the accompanying drawings and embodiments. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not to be construed as limiting it. It should also be noted that, for convenience of description, only the portions related to the invention are shown in the drawings.
It should be noted that, in the present application, the embodiments and features of the embodiments may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Fig. 1 shows an exemplary system architecture 100 to which embodiments of the testing method or testing apparatus of the present application may be applied.
As shown in fig. 1, the system architecture 100 may include terminal devices 101, 102, 103, a network 104, and servers 105, 106. Network 104 is the medium used to provide communication links between terminal devices 101, 102, 103 and servers 105, 106. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The user 110 may use the terminal devices 101, 102, 103 to interact with the servers 105, 106 via the network 104 to receive or transmit data or the like. Various applications may be installed on the terminal devices 101, 102, 103, such as shopping applications, map applications, payment applications, social applications, web browser applications, search engine applications, cell phone assistant applications, etc.
The terminal devices 101, 102, 103 may be various electronic devices having a display screen and supporting a data communication function, including but not limited to a smart phone, a tablet computer, an e-book reader, an MP3 player (Moving Picture Experts Group Audio Layer III), an MP4 player (Moving Picture Experts Group Audio Layer IV), a laptop computer, a desktop computer, and the like. The user may send a request message to the server via the terminal devices 101, 102, 103.
The servers 105, 106 may be servers providing various services, for example, background servers that support the applications installed on the terminal devices 101, 102, 103. The servers 105, 106 may receive a test request; process a request message sent by a client in a target time period through the program segment to be tested indicated by the identification information; acquire, according to the parameter names, the parameter values of the test parameters generated when the request message is processed; and generate a test result according to the generated parameter values.
It should be noted that the test method provided in the embodiment of the present application may be executed by the servers 105 and 106, and accordingly, the test apparatus may be disposed in the servers 105 and 106.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continued reference to FIG. 2, a flow 200 of one embodiment of a testing method according to the present application is shown. The test method comprises the following steps:
Step 201, receiving a test request.
In this embodiment, the electronic device on which the testing method operates (for example, the server shown in fig. 1) may receive the test request through a wired or wireless connection. The test request includes identification information of the program segment to be tested and the parameter names of the test parameters. The program segment to be tested may be a program segment in a website or an application that implements a specific function, for example, the program of a certain module or a program implementing a certain model, where a model may be a formal representation obtained by abstracting an actual problem, an objective thing, or a law. The identification information of the program segment to be tested may be the name of the file in which the program segment is located, the name of a method in the program, the storage location of the program segment, and the like.
As an example, a tester may input the identification information of the program segment to be tested and the parameter names of the test parameters into the electronic device according to the test requirements to generate a test request, or input the identification information of the program segment to be tested and the parameter names of the test parameters into other electronic devices to generate a test request, and then the other electronic devices send the generated test request to the electronic device.
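As an illustrative sketch only (the disclosure does not prescribe a concrete format), a test request could be represented as a simple mapping; the field names `segment_id`, `param_names`, and `rule` below are hypothetical.

```python
# Hypothetical representation of a test request; all field names are illustrative.
test_request = {
    # identification information of the program segment to be tested,
    # e.g. the file name and method name where the segment is located
    "segment_id": "ranking_service.rank_v2",
    # parameter names of the test parameters whose values should be collected
    "param_names": ["click_rate", "session_count"],
    # optional generation rule describing when the test result counts as a pass
    "rule": {"click_rate": {"min": 0.02}},
}
```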
Step 202, processing a request message sent by the client in the target time period through the program segment to be tested indicated by the identification information.
In this embodiment, the electronic device may process, through the program segment to be tested indicated by the identification information included in the test request received in step 201, the request messages sent by the client in the target time period. A request message may include a request method for a resource, an identifier of the resource, the protocol used, and the like; specifically, a protocol such as the HyperText Transfer Protocol (HTTP) may be used. Since the program segment to be tested may still have defects, only a preset proportion of the request messages sent by clients in the target time period may be routed to it for processing, so that the use of the website or application by most users is not affected. The preset proportion can be set according to actual needs; for example, when the preset proportion is 10% and the background server of the website receives 1000 request messages in total, 100 of them may be processed by the program segment to be tested.
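A minimal sketch of such proportional routing, assuming hypothetical handler callables `segment_under_test` and `stable_segment`:

```python
import random

PRESET_PROPORTION = 0.10  # e.g. route 10% of traffic in the target time period

def dispatch(request_message, segment_under_test, stable_segment):
    """Route only a preset proportion of request messages to the segment under test."""
    if random.random() < PRESET_PROPORTION:
        return segment_under_test(request_message)
    # all other requests keep going through the existing, stable code path
    return stable_segment(request_message)
```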
In some optional implementations of the present embodiment, the length of the target time period is determined according to whether the number of generated parameter values reaches a preset number threshold. A certain number of parameter values may be required to generate a more accurate test result, and thus, determining the length of the target time period according to the number of generated parameter values may further improve the test efficiency.
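One hypothetical way to realize this, assuming a `collect_parameter_values()` helper that returns the values gathered since the last call, is to keep extending the collection window until the threshold is reached:

```python
import time

PRESET_COUNT_THRESHOLD = 1000  # illustrative threshold

def collect_until_enough(collect_parameter_values, poll_interval=60):
    """Extend the target time period until enough parameter values have accumulated."""
    values = []
    while len(values) < PRESET_COUNT_THRESHOLD:
        time.sleep(poll_interval)                  # wait for more client traffic
        values.extend(collect_parameter_values())  # newly generated parameter values
    return values
```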
In some optional implementation manners of this embodiment, the test request includes identification information of at least two program segments to be tested that implement the same function; and processing a request message sent by the client in the target time period through the program segment to be tested indicated by the identification information includes: distributing the request messages sent by the client in the target time period among the program segments to be tested indicated by the identification information for processing. In this implementation, at least two program segments to be tested that implement the same function form a comparative experiment, and their relative merits can subsequently be judged by comparing the parameter values generated while each of them runs.
Step 203, acquiring, according to the parameter names, the parameter values of the test parameters generated when the request message is processed.
In this embodiment, the electronic device may acquire, according to the parameter names included in the test request received in step 201, the parameter values of the test parameters generated when the request messages are processed in step 202. The parameter values may be collected through instrumentation (embedded tracking points) or the like.
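The disclosure does not fix a particular instrumentation mechanism; the sketch below merely illustrates one hypothetical convention in which the handler returns a dict of intermediate values and a decorator records the ones named in the test request.

```python
import functools

collected_values = []  # parameter values gathered while request messages are processed

def instrument(param_names):
    """Record selected values produced by a request handler (illustrative only)."""
    def wrapper(handler):
        @functools.wraps(handler)
        def inner(request_message):
            result = handler(request_message)  # assumed to be a dict of parameter values
            collected_values.append({name: result.get(name) for name in param_names})
            return result
        return inner
    return wrapper
```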
Step 204, generating a test result according to the generated parameter values.
In this embodiment, the electronic device may generate a test result according to the parameter values generated in step 203. The manner of generating the test result may be set according to actual needs; for example, when the parameter value is a session count, a message count, a click-through rate, or a satisfaction rate, and the value falls within a preset range, the generated test result may be a pass. The test request may further include a generation rule for producing a test result from the generated parameter values, and the test result may then be generated according to this rule and the generated values. The generation rule may specify under what conditions the test result is a pass or a fail.
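A sketch of one possible generation rule, under the assumption that every collected value must fall inside a preset range for the test to pass:

```python
def generate_test_result(values, preset_range):
    """Return 'pass' if every collected parameter value lies within the preset range."""
    low, high = preset_range
    return "pass" if all(low <= v <= high for v in values) else "fail"

# usage: click-through rates collected while processing request messages
print(generate_test_result([0.031, 0.028, 0.035], preset_range=(0.02, 0.05)))  # pass
```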
In some optional implementations of the embodiment, generating the test result according to the generated parameter value includes: sampling the generated parameter values; and generating a test result according to the sampled parameter values. The specific sampling rule can be set according to actual needs, and the sampling rule can be used for describing a data set to be sampled, a start-stop time period of sampling, a final expected data volume of sampling results and other business rules.
In some optional implementations of this embodiment, the target time period includes at least two sub-time periods; and sampling the generated parameter values includes: counting the number of parameter values generated when the request messages sent by the client in each sub-time period are processed; and extracting parameter values, according to the counted numbers and a preset sampling proportion, from the parameter values generated when the request messages sent by the client in each sub-time period are processed. Because the number of generated parameter values may be large, making an assumption about how the data are distributed over time, dividing the period into several sub-periods, and then randomly drawing a proportional number of samples from each sub-period according to its data density can improve sampling efficiency.
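A sketch of the proportional sampling described above, assuming the generated parameter values have already been grouped by sub-time period:

```python
import random

def sample_by_sub_period(values_per_period, sampling_ratio):
    """Randomly draw the preset proportion of values from each sub-time period."""
    sampled = []
    for values in values_per_period.values():
        if not values:
            continue
        count = max(1, int(len(values) * sampling_ratio))  # counted number x ratio
        sampled.extend(random.sample(values, count))
    return sampled

# usage: three one-hour sub-periods with different traffic densities
buckets = {"09:00": [0.8] * 50, "10:00": [0.7] * 200, "11:00": [0.9] * 100}
print(len(sample_by_sub_period(buckets, sampling_ratio=0.1)))  # 35 sampled values
```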
With continued reference to fig. 3, fig. 3 is a schematic diagram of an application scenario of the test method according to the present embodiment. In the application scenario of fig. 3, the electronic device 301 receives identification information of a program segment to be tested and a parameter name 302 of a test parameter, which are input by a tester, and then generates and sends a test request 305 including the identification information and the parameter name to the server 303, the server 303 processes a request message 304 sent by a client in a target time period through the program segment to be tested, which is indicated by the identification information, and obtains a parameter value of the test parameter generated when the request message is processed according to the parameter name, and finally generates and sends a test result 306 to the electronic device 301 according to the generated parameter value.
The method provided by the embodiment of the application processes the request message sent by the client in the target time period through the program segment to be tested indicated by the identification information after receiving the test request, obtains the parameter value of the test parameter generated when processing the request message according to the parameter name, and finally generates the test result according to the generated parameter value, thereby providing a test mechanism for generating the test result based on the parameter value generated when processing the request message, and enriching the test method.
With further reference to fig. 4, a flow 400 of yet another embodiment of a testing method is shown. The process 400 of the test method includes the following steps:
Step 401, receiving a test request.
In this embodiment, the electronic device on which the testing method operates (for example, the server shown in fig. 1) may receive the test request through a wired or wireless connection. The test request includes identification information of the program segment to be tested and the parameter names of the test parameters.
Step 402, processing a request message sent by the client in the target time period through the program segment to be tested indicated by the identification information.
In this embodiment, the electronic device may process, through the program segment to be tested indicated by the identification information included in the test request received in step 401, the request messages sent by the client in the target time period. The program segment to be tested may include a program implementing a first model obtained by training an initial model in advance. The initial model may be a model used for classification, such as Logistic Regression, a random forest, an iterative decision tree, or a Support Vector Machine (SVM); a model used for clustering, such as K-means; a Generative Adversarial Network (GAN); a neural network model; or the like; or it may be a composite model formed by combining several of the above models.
Step 403, acquiring, according to the parameter names, the parameter values of the test parameters generated when the request message is processed.
In this embodiment, the electronic device may acquire, according to the parameter names included in the test request received in step 401, the parameter values of the test parameters generated when the request messages are processed in step 402.
Step 404, sending the generated parameter values to a target device.
In this embodiment, the electronic device may send the parameter values generated in step 403 to the target device. The target device may be a client or a device used for data labeling. As an example, when the target device is a client and the program segment to be tested includes a program implementing a chat robot, several options may be sent to the client for the user to click on when the user's intention cannot be determined, and the labeling information is generated from the user's click. Alternatively, a labeling task may be created and distributed to a device used for data labeling, where the labeling is performed by a machine or by a human.
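A sketch of packaging the generated parameter values into a labeling task; the task fields and the `send_to_target_device` callable are hypothetical.

```python
def build_labeling_task(request_message, parameter_values):
    """Bundle a request message and its generated parameter values for annotation."""
    return {
        "request": request_message,
        "parameter_values": parameter_values,
        # candidate labels shown to the annotator (or to the end user as options)
        "label_options": ["positive", "negative"],
    }

def dispatch_for_labeling(tasks, send_to_target_device):
    """Send each task to the target device and collect the returned labeling information."""
    return [send_to_target_device(task) for task in tasks]
```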
Step 405, acquiring the labeling information returned by the target device and associated with the generated parameter values.
In this embodiment, the electronic device may obtain the label information returned by the target device and associated with the parameter value sent in step 404. The labeling information can be simply divided into positive labeling information and negative labeling information, and the type of the labeling information can also be set according to business needs.
Step 406, generating a test result according to the acquired labeling information.
In this embodiment, the electronic device may generate a test result according to the labeling information acquired in step 405. For example, if the labeling information is divided into positive labeling information and negative labeling information, a test result can be generated according to the ratio of positive to negative labeling information, and a test result of pass can be generated when that ratio exceeds a preset threshold. Negative labeling information can also include the cause of the negative result.
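For instance, with labels split into positive and negative, the ratio check mentioned above might look like the following sketch (the threshold is illustrative):

```python
def result_from_labels(labels, pass_threshold=4.0):
    """Pass if the ratio of positive to negative labeling information exceeds the threshold."""
    positives = sum(1 for label in labels if label == "positive")
    negatives = sum(1 for label in labels if label == "negative")
    ratio = positives / max(negatives, 1)  # guard against division by zero
    return "pass" if ratio > pass_threshold else "fail"

print(result_from_labels(["positive"] * 90 + ["negative"] * 10))  # pass (ratio 9.0)
```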
Step 407, training the initial model by using a machine learning method based on the acquired labeling information to obtain a second model.
In this embodiment, the electronic device may train the initial model to obtain the second model by using a machine learning method based on the labeling information acquired in step 405. The input and output of the trained model can be determined according to the specific service, and the parameter names of the input and output parameters may also be included in the test request or input by the user in other ways. For example, relevant data included in the request messages sent by the client may be used as input and the labeling information as output, and the initial model is trained with a machine learning method to obtain the second model.
In some optional implementation manners of this embodiment, training the initial model by using a machine learning method based on the acquired labeling information to obtain the second model includes: screening, from the acquired labeling information, the labeling information that matches a predefined negative labeling information type; determining the screened labeling information, the request messages associated with it, and the parameter values associated with it as negative sample data; and, at predetermined time intervals, training the initial model with a machine learning method based on the negative sample data determined within that interval to obtain the second model. Retraining the model at predetermined intervals continuously optimizes it toward the best operating effect. The predetermined interval may be determined based on the amount of collected samples and may be shortened if the sample amount is sufficiently large.
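A sketch of the periodic retraining step under stated assumptions: scikit-learn's LogisticRegression stands in for whatever initial model is used, and `extract_features` is a hypothetical helper that turns a request message and its parameter value into a feature vector.

```python
from sklearn.linear_model import LogisticRegression  # example learner only

NEGATIVE_TYPES = {"wrong_answer", "irrelevant"}  # predefined negative label types (illustrative)

def build_negative_samples(labeled_records, extract_features):
    """Keep only the records whose label matches a predefined negative labeling type."""
    features, targets = [], []
    for record in labeled_records:
        if record["label"] in NEGATIVE_TYPES:
            features.append(extract_features(record["request"], record["value"]))
            targets.append(0)  # negative class
    return features, targets

def retrain(positive_samples, negative_samples):
    """Retrain on accumulated positive samples plus the newly screened negative samples."""
    x_pos, y_pos = positive_samples      # lists of feature vectors / labels
    x_neg, y_neg = negative_samples
    second_model = LogisticRegression()
    second_model.fit(x_pos + x_neg, y_pos + y_neg)
    return second_model
```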
Step 408, generating index data of the second model based on a test set.
In this embodiment, the electronic device may generate index data for the second model obtained by training in step 407 based on a test set. The test set includes at least one of: a preset test set and a test set composed of historical negative sample data. The preset test set may be a standard evaluation set (baseline) established by business personnel, and the historical negative sample data may be the negative sample data determined over a past period of time. The evaluation of the second model may be performed offline. A weight may be set for each type of labeling information, and the final index data may be obtained by weighting accordingly. As an example, for a chat robot, the index data may be divided into types such as overall effect, response effect, and dialogue strategy.
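A sketch of combining per-type scores into the final index data via the weights mentioned above; both the type names and the weights are illustrative only.

```python
# Hypothetical weights for each type of index / labeling information.
INDEX_WEIGHTS = {"overall_effect": 0.5, "response_effect": 0.3, "dialogue_strategy": 0.2}

def compute_index_data(scores):
    """Weighted combination of per-type scores measured on the test set(s)."""
    return sum(INDEX_WEIGHTS[name] * score for name, score in scores.items())

print(compute_index_data(
    {"overall_effect": 0.92, "response_effect": 0.88, "dialogue_strategy": 0.90}
))  # 0.904
```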
Step 409, in response to the generated index data matching a preset index data range for indicating that the test is passed, processing the request messages sent by the client through the second model.
In this embodiment, in response to the index data generated in step 408 matching a preset index data range for indicating that the test is passed, the electronic device may process the request messages sent by the client through the second model. The evaluation may use sample weighting to compute a macro-average. This step produces an effect value that can be used to assess the model. The effect value may be compared with a historical effect value (e.g., the effect value of the first model); if the fluctuation exceeds a predetermined range, the second model may be considered to bring a negative effect. Judging the model effect need not rely on a single index; professional data analysts may determine the index data range of an index influenced by several factors. If the second model meets the standard, the request messages sent by the client, that is, real online user requests, may be processed through the second model; if it does not, the request messages sent by the client may continue to be processed through the first model.
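A sketch of the resulting gating logic, assuming that index data inside the preset range and an effect value that does not degrade beyond a tolerance relative to the first model mean the second model may serve live requests:

```python
def choose_serving_model(index_data, index_range, effect_value, historical_effect,
                         tolerance, first_model, second_model):
    """Serve the second model only if it passes the offline evaluation."""
    low, high = index_range
    in_range = low <= index_data <= high
    degraded = (historical_effect - effect_value) > tolerance  # worse than the first model
    return second_model if in_range and not degraded else first_model
```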
In this embodiment, the operations of step 401, step 402, and step 403 are substantially the same as the operations of step 201, step 202, and step 203, and are not repeated herein.
As can be seen from fig. 4, compared with the embodiment corresponding to fig. 2, in the process 400 of the testing method in this embodiment the model implemented by the program segment to be tested is retrained with the acquired labeling information, the retrained model is evaluated, and client requests are processed through the model once it passes the evaluation. The scheme described in this embodiment therefore makes full use of the test data, provides an active optimization scheme, and enables continuous iterative optimization.
With further reference to fig. 5, as an implementation of the method shown in the above figures, the present application provides an embodiment of a testing apparatus, which corresponds to the embodiment of the method shown in fig. 2, and which can be applied to various electronic devices.
As shown in fig. 5, the test apparatus 500 of the present embodiment includes: a receiving unit 501, a first processing unit 502, an obtaining unit 503, and a generating unit 504. The receiving unit 501 is configured to receive a test request, where the test request includes identification information of a program segment to be tested and a parameter name of a test parameter; a first processing unit 502, configured to process, through the program segment to be tested indicated by the identification information, a request message sent by the client in the target time period; an obtaining unit 503, configured to obtain a parameter value of a test parameter generated when the request message is processed according to the parameter name; a generating unit 504, configured to generate a test result according to the generated parameter value.
In this embodiment, the specific processing of the receiving unit 501, the first processing unit 502, the obtaining unit 503, and the generating unit 504 of the testing apparatus 500 may refer to step 201, step 202, step 203, and step 204 in the corresponding embodiment of fig. 2.
In some optional implementations of the present embodiment, the length of the target time period is determined according to whether the number of generated parameter values reaches a preset number threshold.
In some optional implementations of this embodiment, the generating unit 504 includes: a sampling subunit (not shown in the figure) for sampling the generated parameter values; a generating subunit (not shown in the figure) for generating a test result according to the sampled parameter values.
In some optional implementations of this embodiment, the target time period includes at least two sub-time periods; and a sampling subunit (not shown in the figures) further configured to: counting the quantity of parameter values generated when the request messages sent by the client in each sub-time period are processed; and extracting parameter values from the parameter values generated when the request messages sent by the client in each sub-period are processed according to the counted number and the preset sampling proportion.
In some optional implementation manners of this embodiment, the test request includes identification information of at least two program segments to be tested for implementing the same function; and a first processing unit 502, further configured to: and distributing the request message sent by the client in the target time period to each program segment to be tested indicated by the identification information for processing.
In some optional implementations of this embodiment, the generating unit 504 is further configured to: transmitting the generated parameter value to the target device; acquiring marking information which is returned by the target equipment and is associated with the generated parameter value; and generating a test result according to the acquired marking information.
In some optional implementation manners of this embodiment, the program segment to be tested includes a program for implementing a first model obtained by training an initial model in advance; and the apparatus further comprises: and a training unit (not shown in the figure) for training the initial model by using a machine learning method based on the obtained labeling information to obtain a second model.
In some optional implementations of this embodiment, the training unit (not shown in the figure) includes: a filtering subunit (not shown in the figure) for filtering the label information matching with the predefined negative label information type in the obtained label information; a determining subunit (not shown in the figure) for determining the screened annotation information, the request message associated with the screened annotation information, and the parameter value associated with the screened annotation information as negative sample data; and a training subunit (not shown in the figure) for training the initial model by using a machine learning method to obtain a second model at predetermined time intervals based on the negative sample data determined in the time intervals.
In some optional implementations of this embodiment, the apparatus further comprises: a test unit (not shown in the figure) for generating index data of the second model based on at least one of the following test sets: a preset test set and a test set composed of historical negative sample data; and a second processing unit (not shown in the figure) for processing the request message sent by the client through the second model in response to the generated index data matching with a preset index data range for indicating that the test is passed.
According to the device provided by the embodiment of the application, the test request is received, wherein the test request comprises the identification information of the program segment to be tested and the parameter name of the test parameter; processing a request message sent by a client in a target time period through a program segment to be tested indicated by the identification information; acquiring parameter values of test parameters generated when the request message is processed according to the parameter names; and generating a test result according to the generated parameter value, thereby providing a test mechanism for generating the test result based on the parameter value generated when the request message is processed, and enriching the test method.
Referring now to FIG. 6, shown is a block diagram of a computer system 600 suitable for use in implementing the electronic device of an embodiment of the present application. The electronic device shown in fig. 6 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 6, the computer system 600 includes a Central Processing Unit (CPU) 601 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 602 or a program loaded from a storage section 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data necessary for the operation of the system 600 are also stored. The CPU 601, ROM 602, and RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
The following components are connected to the I/O interface 605: an input portion 606 including a keyboard, a mouse, and the like; an output portion 607 including a display such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage section 608 including a hard disk and the like; and a communication section 609 including a network interface card such as a LAN card, a modem, or the like. The communication section 609 performs communication processing via a network such as the internet. The driver 610 is also connected to the I/O interface 605 as needed. A removable medium 611 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 610 as necessary, so that the computer program read out therefrom is mounted in the storage section 608 as necessary.
In particular, the processes described above with reference to the flow diagrams may be implemented as computer software programs, according to embodiments of the present disclosure. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 609 and/or installed from the removable medium 611. The computer program performs the above-described functions defined in the method of the present application when executed by a Central Processing Unit (CPU) 601. It should be noted that the computer readable medium described herein can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present application may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, or C++, and conventional procedural programming languages, such as the C language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or hardware. The described units may also be provided in a processor, and may be described as: a processor includes a receiving unit, a first processing unit, an obtaining unit, and a generating unit. Where the names of the units do not in some cases constitute a limitation of the unit itself, for example, a receiving unit may also be described as a "unit for receiving test requests".
As another aspect, the present application also provides a computer-readable medium, which may be contained in the apparatus described in the above embodiments; or may be present separately and not assembled into the device. The computer readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to: receiving a test request, wherein the test request comprises identification information of a program segment to be tested and a parameter name of a test parameter; processing a request message sent by a client in a target time period through a program segment to be tested indicated by the identification information; acquiring parameter values of test parameters generated when the request message is processed according to the parameter names; generating a test result according to the generated parameter value.
The foregoing description is only exemplary of the preferred embodiments of the application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention herein disclosed is not limited to the particular combination of features described above, but also encompasses other arrangements in which any combination of the features described above or their equivalents does not depart from the spirit of the invention disclosed above. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.
Claims (10)
1. A method of testing, comprising:
receiving a test request, wherein the test request comprises identification information of a program segment to be tested and a parameter name of a test parameter;
processing a request message sent by a client in a target time period through the program segment to be tested indicated by the identification information;
acquiring parameter values of test parameters generated when the request message is processed according to the parameter names;
generating a test result according to the generated parameter value;
the generating of the test result according to the generated parameter value comprises:
transmitting the generated parameter value to the target device; acquiring marking information which is returned by the target equipment and is associated with the generated parameter value; generating a test result according to the acquired marking information;
the program segment to be tested comprises a program for realizing a first model obtained by pre-training an initial model; and
the method further comprises the following steps: and training the initial model by using a machine learning method based on the acquired labeling information to obtain a second model.
2. The method of claim 1, wherein the length of the target time period is determined according to whether the number of generated parameter values reaches a preset number threshold.
3. The method of claim 1, wherein said generating test results from the generated parameter values comprises:
sampling the generated parameter values;
and generating a test result according to the sampled parameter values.
4. The method of claim 3, wherein the target time period comprises at least two sub-time periods; and
the sampling of the generated parameter values comprises:
counting the quantity of parameter values generated when the request messages sent by the client in each sub-time period are processed;
and extracting parameter values from the parameter values generated when the request messages sent by the client in each sub-period are processed according to the counted number and the preset sampling proportion.
5. The method of claim 1, wherein the test request includes identification information of at least two program segments under test for implementing the same function; and
the processing of the request message sent by the client in the target time period by the program segment to be tested indicated by the identification information includes:
and distributing the request message sent by the client in the target time period to the program segment to be tested indicated by each identification information for processing.
6. The method of claim 1, wherein training the initial model using a machine learning method based on the obtained labeling information results in a second model comprising:
screening the marking information matched with the predefined negative marking information type in the obtained marking information;
determining the screened marking information, the request message associated with the screened marking information and the parameter value associated with the screened marking information as negative sample data;
and training the initial model by using a machine learning method to obtain a second model based on the negative sample data determined in the time interval at intervals of preset time.
7. The method of claim 6, wherein the method further comprises:
generating indicator data for the second model based on at least one of the following test sets: a preset test set and a test set composed of historical negative sample data;
and responding to the generated index data matched with a preset index data range used for indicating that the test is passed, and processing a request message sent by the client through the second model.
8. A test apparatus, comprising:
the device comprises a receiving unit, a processing unit and a processing unit, wherein the receiving unit is used for receiving a test request, and the test request comprises identification information of a program segment to be tested and a parameter name of a test parameter;
the first processing unit is used for processing a request message sent by a client in a target time period through the program segment to be tested indicated by the identification information;
the acquisition unit is used for acquiring parameter values of test parameters generated when the request message is processed according to the parameter names;
a generating unit for generating a test result according to the generated parameter value;
the generating unit is further configured to: transmitting the generated parameter value to the target device; acquiring marking information which is returned by the target equipment and is associated with the generated parameter value; generating a test result according to the acquired labeling information;
the program section to be tested comprises a program for realizing a first model obtained by pre-training an initial model; and the apparatus further comprises: and the training unit is used for training the initial model by utilizing a machine learning method based on the acquired labeling information to obtain a second model.
9. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method recited in any of claims 1-7.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711458320.1A CN109976997B (en) | 2017-12-28 | 2017-12-28 | Test method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109976997A CN109976997A (en) | 2019-07-05 |
CN109976997B true CN109976997B (en) | 2022-12-27 |
Family
ID=67074586
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711458320.1A Active CN109976997B (en) | 2017-12-28 | 2017-12-28 | Test method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109976997B (en) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111045918B (en) * | 2019-07-12 | 2023-09-22 | 华控清交信息科技(北京)有限公司 | Method, device, client, storage medium and system for debugging program |
CN111061623B (en) * | 2019-07-12 | 2023-08-22 | 华控清交信息科技(北京)有限公司 | Method, device, middle server, storage medium and system for debugging program |
CN111045919B (en) * | 2019-07-12 | 2023-08-22 | 华控清交信息科技(北京)有限公司 | Method, device, background server, storage medium and system for debugging program |
CN111625473B (en) * | 2020-07-01 | 2023-08-22 | 北京字节跳动网络技术有限公司 | Interface test case generation method and device, storage medium and electronic equipment |
CN111901310A (en) * | 2020-07-06 | 2020-11-06 | 北京达佳互联信息技术有限公司 | Website security testing method and device, electronic equipment and storage medium |
CN113760708B (en) * | 2020-09-25 | 2024-10-18 | 北京沃东天骏信息技术有限公司 | Automatic test method and device |
CN112241160B (en) * | 2020-10-20 | 2022-02-11 | 广州小鹏汽车科技有限公司 | Vehicle testing method and device, vehicle detection system and test board card |
CN112346425B (en) * | 2020-11-20 | 2024-01-16 | 宜宾市极米光电有限公司 | Factory automation testing method, system, projection equipment and storage medium |
CN113472458B (en) * | 2021-06-30 | 2023-09-26 | 珠海泰芯半导体有限公司 | Radio frequency performance test method, device, storage medium and system |
EP4408278A1 (en) * | 2021-09-27 | 2024-08-07 | Medtrum Technologies Inc. | Analyte detection system |
CN117149551B (en) * | 2023-10-30 | 2024-02-09 | 鹰驾科技(深圳)有限公司 | Test method of vehicle-mounted wireless communication chip |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5390325A (en) * | 1992-12-23 | 1995-02-14 | Taligent, Inc. | Automated testing system |
CN103019902A (en) * | 2012-12-13 | 2013-04-03 | 中国航空无线电电子研究所 | Automatic testing device and automatic testing method of ARINC 429 bus signal performance parameters |
CN104424093A (en) * | 2013-08-26 | 2015-03-18 | 阿里巴巴集团控股有限公司 | Compatibility testing method and system |
CN104022913B (en) * | 2013-12-18 | 2015-09-09 | 深圳市腾讯计算机系统有限公司 | For method of testing and the device of data cluster |
CN105099988B (en) * | 2014-04-24 | 2018-11-27 | 阿里巴巴集团控股有限公司 | Method, access method and device and system for supporting gray scale to issue |
CN104035869A (en) * | 2014-06-19 | 2014-09-10 | 科大讯飞股份有限公司 | Application evaluation method, terminal, and server |
WO2017071369A1 (en) * | 2015-10-31 | 2017-05-04 | 华为技术有限公司 | Method and device for predicting user unsubscription |
CN107194427A (en) * | 2017-05-26 | 2017-09-22 | 温州大学 | A kind of milling cutter malfunction monitoring and recognition methods and system |
- 2017-12-28 CN CN201711458320.1A patent/CN109976997B/en active Active
Also Published As
Publication number | Publication date |
---|---|
CN109976997A (en) | 2019-07-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109976997B (en) | Test method and device | |
CN109460513B (en) | Method and apparatus for generating click rate prediction model | |
CN111125574B (en) | Method and device for generating information | |
CN108960316B (en) | Method and apparatus for generating a model | |
CN109447156B (en) | Method and apparatus for generating a model | |
CN111695041B (en) | Method and device for recommending information | |
CN110929799A (en) | Method, electronic device, and computer-readable medium for detecting abnormal user | |
CN110555451A (en) | information identification method and device | |
CN110659657A (en) | Method and device for training model | |
CN110798567A (en) | Short message classification display method and device, storage medium and electronic equipment | |
CN111126649B (en) | Method and device for generating information | |
US10678821B2 (en) | Evaluating theses using tree structures | |
CN111915086A (en) | Abnormal user prediction method and equipment | |
CN110866040A (en) | User portrait generation method, device and system | |
CN112836128A (en) | Information recommendation method, device, equipment and storage medium | |
CN111723180A (en) | Interviewing method and device | |
US20210349920A1 (en) | Method and apparatus for outputting information | |
CN111125502B (en) | Method and device for generating information | |
CN113111165A (en) | Deep learning model-based alarm receiving warning condition category determination method and device | |
CN112860999B (en) | Information recommendation method, device, equipment and storage medium | |
CN112231373B (en) | Knowledge point data processing method, apparatus, device and computer readable medium | |
CN112307324B (en) | Information processing method, device, equipment and medium | |
CN113780610B (en) | Customer service portrait construction method and device | |
CN111767290B (en) | Method and apparatus for updating user portraits | |
CN111949860B (en) | Method and apparatus for generating a relevance determination model |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
TG01 | Patent term adjustment | ||