Disclosure of Invention
The present application provides a software testing method and apparatus, a server device and a storage medium, which can help improve the efficiency of software testing based on traffic recording and playback.
At least one aspect of the embodiments of the present application provides a software testing method, the method including: monitoring at least one network port to obtain data packets of network traffic of an object to be tested; when a request class data packet carrying a tracking identifier is obtained, adding the request class data packet to a request data queue; when a response class data packet carrying a tracking identifier is obtained, searching the request data queue for a request class data packet corresponding to the tracking identifier, and, when the request class data packet exists, adding the request class data packet corresponding to the tracking identifier and the response class data packet to a response data queue in pairs; when a playback class data packet carrying a tracking identifier is obtained, searching the response data queue for a request class data packet and a response class data packet corresponding to the tracking identifier, and, when they exist, generating a test data set consisting of the request class data packet, the response class data packet and the playback class data packet corresponding to the tracking identifier, the playback class data packet being a data packet sent by a test server in response to a recording request data packet obtained by recording the request class data packet; and testing the object to be tested by using test data including the test data set.
In some possible implementations, the method further includes: when any request class data packet and a corresponding response class data packet are added to the response data queue in pairs, shifting the request class data packet out of the request data queue; and when any test data set is generated, shifting the request class data packet and the response class data packet in the test data set out of the response data queue.
In some possible implementations, the testing the object to be tested with the test data including the test data set includes: when the configuration parameter of consistency verification is on, comparing the playback class data packet in the test data set with the response class data packet in the test data set to generate a test result of the consistency verification.
In some possible implementations, the testing the object to be tested with the test data including the test data set includes: when the configuration parameter of bug fix verification is on, determining whether the test data set belongs to the test data of a bug fix project according to the tracking identifier corresponding to the test data set; and when the test data set belongs to the test data of the bug fix project, processing the test data set according to the bug fix verification rule corresponding to the test data set to generate a test result of the bug fix verification.
In some possible implementations, the testing the object to be tested with the test data including the test data set includes: when the configuration parameters of the version characteristic verification are on, loading a pre-configured version characteristic verification file, and processing the test data set by utilizing the version characteristic verification rule recorded in the version characteristic verification file to generate a test result of the version characteristic verification.
In some possible implementations, after generating the test data set, the method further includes: determining an identification code of each data packet in the test data set based on an identification code comparison table, wherein each data packet includes parameter values of at least one parameter item, the identification code comparison table includes identification characters of a plurality of parameter items, and the identification code of each data packet includes the identification characters of at least one parameter item and the parameter values of at least one parameter item; when the identification code of the data packet does not exist in the traffic database, storing the data packet and its identification code correspondingly in the traffic database; and when the identification code of the data packet exists in the traffic database, storing the data packet by overwriting the data packet corresponding to the identification code in the traffic database.
In some possible implementations, the testing the object to be tested with the test data including the test data set includes: each time a monitoring period ends, generating at least one test case based on the data packets stored in the traffic database during the monitoring period; and testing the object to be tested by using the at least one test case.
At least one aspect of the embodiments of the present application provides a software testing apparatus, the apparatus including: a monitoring module, used for monitoring at least one network port to obtain data packets of network traffic of an object to be tested; a first processing module, used for adding a request class data packet carrying a tracking identifier to a request data queue when the request class data packet is obtained; a second processing module, used for searching the request data queue for a request class data packet corresponding to the tracking identifier when a response class data packet carrying the tracking identifier is obtained, and adding the request class data packet corresponding to the tracking identifier and the response class data packet to a response data queue in pairs when the request class data packet exists; a third processing module, used for searching the response data queue for a request class data packet and a response class data packet corresponding to the tracking identifier when a playback class data packet carrying the tracking identifier is obtained, and generating a test data set consisting of the request class data packet, the response class data packet and the playback class data packet corresponding to the tracking identifier when the request class data packet and the response class data packet exist, the playback class data packet being a data packet returned by the test server in response to the received recording request data packet obtained by recording the request class data packet; and a test module, used for testing the object to be tested by using test data including the test data set.
At least one aspect of the embodiments of the present application provides a server device, including: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to execute the executable instructions to implement any one of the software testing methods described above.
At least one aspect of the embodiments of the present application provides a computer-readable storage medium storing executable instructions which, when executed by a processor, cause the processor to implement any one of the software testing methods described above.
In the embodiments of the present application, based on the tracking identifiers configured for various data packets, when a data packet carrying a tracking identifier is obtained by monitoring, the data packet is cached in the corresponding data queue, and data packets carrying the same tracking identifier are associated with each other, so that they are output in the form of a test data set consisting of a request class data packet, a response class data packet and a playback class data packet. The subsequent analysis and verification links therefore no longer depend on testers filtering, classifying and matching the data packets, which not only reduces the workload of testers in processing data packets, but also enables an automated verification and testing process, helping to improve the efficiency of software testing based on traffic recording and playback and to achieve a faster testing process with high service coverage.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
Fig. 1 is a schematic diagram of an application scenario of a software testing method provided in an embodiment of the present application. Referring to fig. 1, in an exemplary application scenario, the service server 100 communicates with the terminal device 200 through the network connection 300, receiving the request class data packet D1 sent by the terminal device 200 through its network port and, after processing, returning the response class data packet D2 to the terminal device 200, thereby providing network services for the terminal device 200. In software testing based on traffic recording and playback, by monitoring the network port of the service server 100, the real network traffic generated when the service server 100 provides network services for the terminal device 200, such as the request class data packet D1 and the response class data packet D2, can be obtained. The service server 100 may then record the request class data packet D1 to obtain a recording request data packet D1' and send the recording request data packet D1' to the test server 400 in the test environment, so as to perform the required tests based on the playback class data packet D2' returned by the test server 400. For example, by comparing the playback class data packet D2' with the response class data packet D2, it can be determined how the processing results of the online service and the offline service differ for requests with the same content, thereby implementing test procedures such as bug fix verification. It can be seen that the above test procedure is performed only between the test server 400 and the service server 100, and the data used in the test does not alter the data sent and received by the terminal device 200, so that regardless of the test result, the test does not affect the network service provided by the service server 100 to the terminal device 200.
It should be understood that, in addition to being performed on the service server 100, test procedures such as data monitoring and playback testing may also be performed on other server devices. For example, the test server 400 may receive, through a network connection, the data packets of network traffic captured by multiple service servers 100 at their network ports, so that the required test procedures can be completed on the test server 400; similarly, any other server device capable of network communication with both the service server 100 and the test server 400 may also perform these test procedures.
In the related art, software testing based on traffic recording and playback relies on a corresponding traffic recording and playback tool (e.g., goreplay). However, such tools can only capture and output data packets in chronological order; they cannot directly indicate the associations between the data packets or the services to which the data packets relate. As a result, after collecting the data packets, testers have to filter, classify and match them almost entirely by hand before subsequent analysis and verification can be performed, which cannot meet the requirement for fast testing with high service coverage.
Fig. 2 is a flowchart illustrating steps of a software testing method according to an embodiment of the present application. Referring to fig. 2, the method may be performed, for example, by any of the server devices described above, and includes the following steps.
In step 201, at least one network port is monitored to obtain data packets of network traffic of an object to be tested.
In step 202, when a request class data packet carrying a tracking identifier is obtained, the request class data packet is added to a request data queue.
In step 203, when obtaining the response class data packet carrying the tracking identifier, searching whether the request class data packet corresponding to the tracking identifier exists in the request data queue, and adding the request class data packet corresponding to the tracking identifier and the response class data packet to the response data queue in pairs when the request class data packet exists.
In step 204, when obtaining the playback data packet carrying the tracking identifier, searching whether the request data packet and the response data packet corresponding to the tracking identifier exist in the response data queue, and generating a test data set consisting of the request data packet, the response data packet and the playback data packet corresponding to the tracking identifier when the request data packet and the response data packet exist. The playback data packet is a data packet sent by the test server in response to a recording request data packet obtained by recording the request data packet.
In step 205, the object to be tested is tested with test data comprising a test data set.
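As an illustrative sketch of steps 202 to 204 (not the literal implementation of the claimed method), the dispatching of monitored packets into the two queues and the assembly of test data sets might look as follows in Python; the packet representation and field names (`trace_id`, `kind`) are assumptions made for this example:

```python
from collections import OrderedDict

REQUEST, RESPONSE, PLAYBACK = "request", "response", "playback"

class TestDataAssembler:
    """Pairs request, response and playback packets that share a tracking id."""

    def __init__(self):
        self.request_queue = OrderedDict()   # trace_id -> request packet
        self.response_queue = OrderedDict()  # trace_id -> (request, response) pair
        self.test_data_sets = []             # completed (request, response, playback) sets

    def on_packet(self, packet):
        trace_id, kind = packet["trace_id"], packet["kind"]
        if kind == REQUEST:                                 # step 202
            self.request_queue[trace_id] = packet
        elif kind == RESPONSE:                              # step 203
            # pop() also shifts the paired request out of the request queue
            request = self.request_queue.pop(trace_id, None)
            if request is not None:
                self.response_queue[trace_id] = (request, packet)
        elif kind == PLAYBACK:                              # step 204
            # pop() shifts the matched pair out of the response queue
            pair = self.response_queue.pop(trace_id, None)
            if pair is not None:
                self.test_data_sets.append((*pair, packet))
```

Note that each `pop()` implements the "dequeue after pairing is successful" maintenance described below: a packet leaves its queue as soon as it has been matched, so no packet can be paired twice.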
It can be seen that, in the embodiments of the present application, based on the tracking identifiers configured for various data packets, when a data packet carrying a tracking identifier is obtained by monitoring, the data packet is cached in the corresponding data queue, and data packets carrying the same tracking identifier are associated with each other, so that they are output in the form of a test data set consisting of a request class data packet, a response class data packet and a playback class data packet. The subsequent analysis and verification links therefore do not depend on testers filtering, classifying and matching the data packets, which not only helps reduce the workload of testers in processing data packets, but also enables an automated verification and testing process, helping to improve the efficiency of software testing based on traffic recording and playback and to achieve a faster testing process with high service coverage.
It should be noted that the object to be tested may be, for example, a business, an application, a service or a system. In step 201, a monitoring rule may be configured to screen out the data packets of the required network traffic at a selected network port, and the service server where the network port is located sends the data packets of the network traffic to the server device executing the software testing method through a network connection, thereby obtaining the data packets of the network traffic of the object to be tested. Of course, the software testing method may also be executed by any service server itself.
It should be further noted that the tracking identifier refers to data content in the data packets of the network traffic that indicates the response, recording and playback relationships between packets. It may be, for example, an identifier added by the service server 100 to the request class data packet D1 and the response class data packet D2 to indicate the response relationship between them; by retaining this item in the data packets during recording and playback, the identifier can indicate the correspondence among the request class data packet D1, the response class data packet D2, the recording request data packet D1' and the playback class data packet D2'. Of course, the implementation of the tracking identifier is not limited thereto.
It should be further noted that the request data queue and the response data queue may be, for example, data buffers of any type established in the memory of the server device, or data files established in a specified directory of the storage of the server device. To avoid data overflow and data expiration, a maximum queue length and a timeout may be configured to maintain the request data queue and the response data queue, for example by deleting tail packets when data overflows and by periodically deleting expired packets.
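The queue maintenance just described can be sketched as a small bounded buffer; this is an illustrative example only, and the `max_length` and `timeout_seconds` defaults are assumptions rather than values taken from the embodiments:

```python
import time
from collections import OrderedDict

class BoundedQueue:
    """Tracking-id-keyed buffer with a maximum queue length and an entry timeout."""

    def __init__(self, max_length=1000, timeout_seconds=60.0):
        self.max_length = max_length
        self.timeout_seconds = timeout_seconds
        self._entries = OrderedDict()  # trace_id -> (arrival timestamp, packet)

    def put(self, trace_id, packet, now=None):
        now = time.monotonic() if now is None else now
        if trace_id in self._entries:
            self._entries.pop(trace_id)
        elif len(self._entries) >= self.max_length:
            # data overflow: delete the tail packet before inserting the new one
            self._entries.popitem(last=True)
        self._entries[trace_id] = (now, packet)

    def sweep(self, now=None):
        """Periodically delete expired packets."""
        now = time.monotonic() if now is None else now
        expired = [tid for tid, (ts, _) in self._entries.items()
                   if now - ts > self.timeout_seconds]
        for tid in expired:
            del self._entries[tid]

    def __len__(self):
        return len(self._entries)
```

A maintenance task would call `sweep()` on a timer, while `put()` enforces the length bound on every insertion.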
In some possible implementations, the request data queue and the response data queue are maintained in a "dequeue after pairing is successful" manner, for example: when any request class data packet and a corresponding response class data packet are added to the response data queue in pairs, the request class data packet is shifted out of the request data queue; and when any test data set is generated, the request class data packet and the response class data packet in the test data set are shifted out of the response data queue. In this way, each request class data packet can only be paired with one response class data packet, and each pair of request class and response class data packets can only form a test data set with one playback class data packet, so that the correspondence between data packets is clearer, which facilitates subsequent analysis and testing.
In some possible implementations, the testing procedure in step 205 includes: when the configuration parameter of consistency verification is on, comparing the playback class data packet in the test data set with the response class data packet in the test data set to generate a test result of the consistency verification.
In one example, during the process of monitoring at least one network port, each time a new data packet carrying a tracking identifier is obtained, one of the above steps 202 to 204 is performed on it. If the data packet is a request class data packet, it is added to the request data queue in step 202. If it is a response class data packet, it is added in a pair to the response data queue in step 203, and a series of traffic recording and playback operations are automatically performed on the request class data packet paired with it (for example, recording it to obtain a recording request data packet and sending the recording request data packet to the test server, then waiting for the corresponding playback class data packet to be returned). If it is a playback class data packet, a test data set is correspondingly generated in step 204. Thus, as the above process continues, multiple test data sets are automatically generated in each monitoring period. For these test data sets, some test operations may also be performed automatically through step 205: when the configuration parameter of the consistency verification item in the automatic test configuration file is on, for each test data set, the response class data packet is compared with the playback class data packet, for example by means of jsondiff, and whether the two are identical is recorded, thereby automatically generating the test result of the consistency verification.
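A minimal sketch of such a consistency check, standing in for a jsondiff-style comparison (the function and the diff format are illustrative assumptions, not the tool's actual API), could look like this:

```python
import json

def consistency_check(response_body, playback_body):
    """Compare the JSON bodies of a response class packet and a playback
    class packet; return (is_consistent, diffs), where diffs lists the
    paths whose values differ between the two payloads."""

    def walk(a, b, path, diffs):
        if isinstance(a, dict) and isinstance(b, dict):
            for key in sorted(set(a) | set(b)):
                walk(a.get(key), b.get(key), path + [key], diffs)
        elif a != b:
            diffs.append(("/".join(path), a, b))

    diffs = []
    walk(json.loads(response_body), json.loads(playback_body), [], diffs)
    return (len(diffs) == 0, diffs)
```

Recording the boolean result per test data set yields exactly the "identical or not" consistency verification result described above, while the diff list supports the later manual analysis.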
Because the above process is completed automatically by the server device, testers can directly obtain the test data sets generated in each monitoring period together with their consistency verification results, and can retrieve the corresponding records for analysis when necessary to obtain further test results. The above software testing method therefore spares testers the trouble of processing the data packets and performing consistency verification separately, greatly improving test efficiency.
In some possible implementations, the testing process in step 205 further includes: when the configuration parameter of bug fix verification is on, determining whether the test data set belongs to the test data of a bug fix project according to the tracking identifier corresponding to the test data set; and when the test data set belongs to the test data of the bug fix project, processing the test data set according to the bug fix verification rule corresponding to the test data set to generate a test result of the bug fix verification.
For example, when the configuration parameter of the bug fix verification item in the automatic test configuration file is on, an interface list for bug fix verification and the verification rule corresponding to each interface are read from the specified directory, and for each test data set: the test data set is matched against the interface list using the tracking identifier, so as to determine whether it belongs to the test data of a bug fix project; if not, consistency verification is performed in the manner described above to obtain the corresponding consistency verification result; if so, the test data set is processed according to the verification rule corresponding to the matched interface (for example, the test passes if a specified parameter item of the playback class data packet has a specified parameter value, and fails otherwise), so as to obtain a test result indicating whether the verification passes.
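The matching and rule application above can be sketched as follows. This is an illustrative assumption about the data shapes: here the tracking identifier is assumed to begin with the interface name (e.g. `"orderApi:0001"`), and each rule checks one parameter item of the playback class packet against an expected value; the embodiments do not prescribe these formats:

```python
def bug_fix_verify(test_data_set, bug_fix_interfaces):
    """Match a test data set against the bug fix interface list and apply
    the matched rule.

    Returns True/False for a pass/fail verdict, or None when the set does
    not belong to a bug fix project (the caller then falls back to the
    consistency verification described above)."""
    # Hypothetical convention: interface name is the prefix of the trace id.
    interface = test_data_set["trace_id"].split(":", 1)[0]
    rule = bug_fix_interfaces.get(interface)
    if rule is None:
        return None
    playback = test_data_set["playback"]
    # Pass iff the specified parameter item has the specified parameter value.
    return playback.get(rule["param"]) == rule["expected"]
```

The `None` return keeps the two verification paths (bug fix rule vs. plain consistency check) cleanly separated in the caller.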
In some possible implementations, the testing process in step 205 further includes: when the configuration parameters of the version characteristic verification are on, loading a pre-configured version characteristic verification file, and processing the test data set by utilizing the version characteristic verification rule recorded in the version characteristic verification file to generate a test result of the version characteristic verification.
For example, when the configuration parameter of the version characteristic verification item in the automatic test configuration file is on, the version characteristic verification file pre-written by the tester in the specified directory is loaded, and the test data set is processed using the version characteristic verification rules recorded in that file (for example, the test passes if a specified parameter item of the playback class data packet has a specified parameter value, and fails otherwise), so as to obtain a test result indicating whether the version characteristic verification passes.
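A sketch of loading and applying such a verification file is shown below; the JSON rule file format (a list of `name`/`param`/`expected` entries) is an assumption made for this example, since the embodiments do not fix a file format:

```python
import json

def load_version_rules(rule_file_path):
    """Load the pre-configured version characteristic verification file
    (assumed here to be JSON: [{"name": ..., "param": ..., "expected": ...}, ...])."""
    with open(rule_file_path) as f:
        return json.load(f)

def version_feature_verify(test_data_set, rules):
    """Apply each verification rule to the playback class packet and
    return a pass/fail verdict per rule name."""
    playback = test_data_set["playback"]
    return {rule["name"]: playback.get(rule["param"]) == rule["expected"]
            for rule in rules}
```

Each monitoring period can then report, per test data set, which version characteristics passed and which failed.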
Fig. 3 is a schematic step flow diagram of a process of automatically generating test cases in a software testing method according to an embodiment of the present application. Referring to fig. 3, on the basis of any of the above software testing methods, the following steps may be further included.
In step 301, the identification code of each data packet in the test data set is determined based on an identification code comparison table. Each data packet includes parameter values of at least one parameter item, the identification code comparison table includes identification characters of a plurality of parameter items, and the identification code of each data packet includes the identification characters of at least one parameter item and the parameter values of at least one parameter item.
In one example, the tester maintains an identification code comparison table on the server device, which records the identification character corresponding to each known parameter item and may additionally record information such as the corresponding business logic description; for example, for a data packet with the following content:
{'data':{'channel':'123','type':'webrtc','isfullscreen':1,'relation':'720X1280'}},
the identification code comparison table records the identification character "ch" corresponding to the parameter item "channel", the identification character "t1" corresponding to "type=webrtc", and the identification character "fs1" corresponding to "isfullscreen=1", but records nothing for the parameter item "relation":"720X1280"; the identification code of the data packet is therefore determined to be "ch123-t1-fs1=relation720X1280". Before the equals sign "=", the identification character of each recorded parameter item, concatenated with its parameter value where applicable, is joined to the others with hyphens "-"; parameter items lacking a record, together with their parameter values, are placed directly after the equals sign "=" in character form. It can be seen that, according to this identification code mapping rule, data packets with the same parameter items and parameter values will have the same identification code, so that data packets can be compared more simply by means of their identification codes.
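The mapping rule just described can be sketched directly from the example; the lookup structure below (mapping either a bare parameter item or an "item=value" pair to an identification character) is an assumption about how the comparison table might be represented:

```python
def make_identification_code(packet_data, lookup):
    """Build an identification code per the rule described above.

    `lookup` maps either a bare parameter item (e.g. "channel") or an
    "item=value" pair (e.g. "type=webrtc") to an identification character.
    Recorded items are joined with hyphens before "="; unrecorded items
    and their values are appended after "=" in character form."""
    recorded, unrecorded = [], []
    for item, value in packet_data.items():
        pair_key = f"{item}={value}"
        if pair_key in lookup:
            # the identification character stands for item and value together
            recorded.append(lookup[pair_key])
        elif item in lookup:
            # identification character concatenated with the parameter value
            recorded.append(f"{lookup[item]}{value}")
        else:
            unrecorded.append(f"{item}{value}")
    return "-".join(recorded) + "=" + "".join(unrecorded)
```

Running it on the packet from the example reproduces the code "ch123-t1-fs1=relation720X1280", and any packet with identical parameter items and values yields the same code, which is what makes the traffic database lookup in step 302 a simple key comparison.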
In step 302, for each data packet, when the identification code of the data packet does not exist in the traffic database, the data packet and its identification code are stored correspondingly in the traffic database; when the identification code of the data packet already exists in the traffic database, the data packet is stored by overwriting the data packet corresponding to that identification code in the traffic database.
In one example, the traffic database may be configured in the server device, with reference to the configuration of the queues described above, to store the test data sets identified by their identification codes. A data packet with a new identification code can be stored directly in the traffic database; for a data packet whose identification code already exists, the stored entry can be updated to the new data packet, ensuring that the traffic database always holds the most up-to-date data.
In step 303, each time a monitoring period ends, at least one test case is generated based on the data packets stored in the traffic database during that monitoring period.
In step 304, the object to be tested is tested using at least one test case.
Because each test data set directly provides information on the diversity of service case samples and on case weights, test cases can be generated based on the test data sets stored in the traffic database. In one example, each time a monitoring period ends, the test data sets stored in the traffic database during that period may be collected, and test cases may be assembled automatically in combination with a pre-configured case schedule, so as to automatically generate a new test case set from the data packets of the most recently monitored network traffic and to update the previously obtained test results. In one example, when a new requirement is added, new parameter items and corresponding parameter values may appear in the data packets; for these, the tester may add the corresponding identification characters to the identification code comparison table and pre-fill a sample material table for the new processing flow, so that new test cases and test case sets are generated at the end of each monitoring period.
Fig. 4 is a block diagram of a software testing apparatus according to an embodiment of the present application. Referring to fig. 4, the software testing apparatus is applied to any one of the server devices described above and includes: a monitoring module 41, configured to monitor at least one network port to obtain data packets of network traffic of an object to be tested; a first processing module 42, configured to, when a request class data packet carrying a tracking identifier is obtained, add the request class data packet to a request data queue; a second processing module 43, configured to, when a response class data packet carrying a tracking identifier is obtained, search the request data queue for a request class data packet corresponding to the tracking identifier and, when it exists, add the request class data packet corresponding to the tracking identifier and the response class data packet to the response data queue in pairs; a third processing module 44, configured to, when a playback class data packet carrying a tracking identifier is obtained, search the response data queue for a request class data packet and a response class data packet corresponding to the tracking identifier and, when they exist, generate a test data set consisting of the request class data packet, the response class data packet and the playback class data packet corresponding to the tracking identifier, the playback class data packet being a data packet returned by the test server in response to the received recording request data packet obtained by recording the request class data packet; and a test module 45, configured to test the object to be tested by using test data including the test data set.
In some possible implementations, the second processing module 43 is further configured to, when any request class data packet and a corresponding response class data packet are added to the response data queue in pairs, shift the request class data packet out of the request data queue; the third processing module 44 is further configured to, when any test data set is generated, shift the request class data packet and the response class data packet in the test data set out of the response data queue.
In some possible implementations, the test module 45 is further configured to: when the configuration parameter of consistency verification is on, compare the playback class data packet in the test data set with the response class data packet in the test data set to generate a test result of the consistency verification.
In some possible implementations, the test module 45 is further configured to: when the configuration parameter of bug fix verification is on, determine whether the test data set belongs to the test data of a bug fix project according to the tracking identifier corresponding to the test data set; and when the test data set belongs to the test data of the bug fix project, process the test data set according to the bug fix verification rule corresponding to the test data set to generate a test result of the bug fix verification.
In some possible implementations, the test module 45 is further to: when the configuration parameters of the version characteristic verification are on, loading a pre-configured version characteristic verification file, and processing the test data set by utilizing the version characteristic verification rule recorded in the version characteristic verification file to generate a test result of the version characteristic verification.
In some possible implementations, the test module 45 is further configured to: after the test data set is generated, determine an identification code of each data packet in the test data set based on an identification code comparison table, wherein each data packet includes parameter values of at least one parameter item, the identification code comparison table includes identification characters of a plurality of parameter items, and the identification code of each data packet includes the identification characters of at least one parameter item and the parameter values of at least one parameter item; for each data packet, when the identification code of the data packet does not exist in the traffic database, store the data packet and its identification code correspondingly in the traffic database, and when the identification code of the data packet exists in the traffic database, store the data packet by overwriting the data packet corresponding to the identification code in the traffic database.
In some possible implementations, the test module 45 is further configured to: each time a monitoring period ends, generate at least one test case based on the data packets stored in the traffic database during the monitoring period; and test the object to be tested by using the at least one test case.
The implementation process of the software testing apparatus provided in the embodiments of the present application is consistent with that of the software testing method provided in the embodiments of the present application, and it achieves the same effects; details are not repeated here.
Fig. 5 is a block diagram of a server device according to an embodiment of the present application. Referring to fig. 5, the server device includes a processor 51 and a memory 52 for storing executable instructions of the processor 51, wherein the processor 51 is configured to execute the executable instructions to implement any one of the software testing methods described above. Like any of the server devices described above, the server device in the embodiments of the present application can help reduce the workload of testers in processing data packets, enables an automated verification and testing process, helps improve the efficiency of software testing based on traffic recording and playback, and achieves a faster testing process with high service coverage.
Embodiments of the present application also provide a computer-readable storage medium, which is a non-volatile storage medium and stores executable instructions which, when executed by a processor, cause the processor to implement any one of the software testing methods described above. Implemented, for example, as the memory 52 described above, the computer-readable storage medium of the embodiments of the present application can help reduce the workload of testers in processing data packets, enables an automated verification and testing process, helps improve the efficiency of software testing based on traffic recording and playback, and achieves a faster testing process with high service coverage.
The foregoing description covers merely preferred embodiments of the present application and is not intended to limit the present application; any modifications, equivalent substitutions, improvements and the like made within the spirit and principles of the present application shall fall within the protection scope of the present application.