CN116719750A - Software testing method and device, server equipment and storage medium - Google Patents


Info

Publication number
CN116719750A
CN116719750A (application CN202311007690.9A)
Authority
CN
China
Prior art keywords
data packet
test
request
response
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202311007690.9A
Other languages
Chinese (zh)
Other versions
CN116719750B (en)
Inventor
李飞 (Li Fei)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Anhui Haima Cloud Technology Co ltd
Original Assignee
Haima Cloud Tianjin Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Haima Cloud Tianjin Information Technology Co Ltd filed Critical Haima Cloud Tianjin Information Technology Co Ltd
Priority to CN202311007690.9A
Publication of CN116719750A
Application granted
Publication of CN116719750B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/36: Preventing errors by testing or debugging software
    • G06F 11/3668: Software testing
    • G06F 11/362: Software debugging
    • G06F 11/3636: Software debugging by tracing the execution of the program

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Data Exchanges In Wide-Area Networks (AREA)

Abstract

The application provides a software testing method and apparatus, a server device, and a storage medium, belonging to the field of computer technology. The method comprises the following steps: monitoring at least one network port to obtain data packets of the network traffic of an object to be tested; when a request class data packet carrying a tracking identifier is obtained, adding the request class data packet to a request data queue; when a response class data packet carrying a tracking identifier is obtained, adding the request class data packet and the response class data packet corresponding to the tracking identifier to a response data queue in pairs; when a playback class data packet carrying a tracking identifier is obtained, generating a test data set consisting of the request class data packet, the response class data packet and the playback class data packet corresponding to the tracking identifier; and testing the object to be tested with test data comprising the test data set. The application helps to improve the testing efficiency of software testing based on traffic recording and playback.

Description

Software testing method and device, server equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a software testing method and apparatus, a server device, and a storage medium.
Background
Software testing refers to the process of operating a program under specified conditions to find program errors, measure software quality, and evaluate whether the software meets its design requirements; it is an essential part of many informatization projects. With the continuing informatization of industries, the business scale of each functional platform keeps growing, business scenarios become more refined, business logic becomes more complex, and the demand for efficient software testing methods keeps increasing.
Automated testing refers to techniques that use a computer system to perform part or all of the testing process in order to increase testing efficiency. For example, a traffic recording and playback tool can monitor network traffic packets at a network port; by recording actual network traffic and "playing it back" in a testing environment, it helps testers perform efficient software testing without affecting online traffic.
However, many current traffic recording and playback tools (such as GoReplay) can only capture data packets and output them in chronological order. After collecting the data packets, testers therefore have to filter, classify and match them almost entirely by hand before subsequent analysis and verification, which cannot meet the demand for fast testing with high service coverage.
Disclosure of Invention
The application provides a software testing method and device, server equipment and storage medium, which can help to improve the testing efficiency of software testing based on flow recording and playback.
At least one aspect of the embodiments of the present application provides a software testing method, the method comprising: monitoring at least one network port to obtain data packets of the network traffic of an object to be tested; when a request class data packet carrying a tracking identifier is obtained, adding the request class data packet to a request data queue; when a response class data packet carrying a tracking identifier is obtained, searching the request data queue for the request class data packet corresponding to the tracking identifier, and, when it exists, adding the request class data packet and the response class data packet corresponding to the tracking identifier to a response data queue in pairs; when a playback class data packet carrying a tracking identifier is obtained, searching the response data queue for the request class data packet and the response class data packet corresponding to the tracking identifier, and, when they exist, generating a test data set consisting of the request class data packet, the response class data packet and the playback class data packet corresponding to the tracking identifier, wherein the playback class data packet is a data packet sent by a test server in response to a recording request data packet obtained by recording the request class data packet; and testing the object to be tested with test data comprising the test data set.
In some possible implementations, the method further includes: when any request class data packet and its corresponding response class data packet are added to the response data queue in pairs, shifting the request class data packet out of the request data queue; and when any test data set is generated, shifting the request class data packet and the response class data packet in that test data set out of the response data queue.
In some possible implementations, testing the object to be tested with the test data comprising the test data set includes: when the configuration parameter of consistency verification is enabled, comparing the playback class data packet in the test data set with the response class data packet in the test data set to generate a test result of the consistency verification.
In some possible implementations, testing the object to be tested with the test data comprising the test data set includes: when the configuration parameter of bug fix verification is enabled, determining, according to the tracking identifier corresponding to the test data set, whether the test data set belongs to the test data of a bug fix project; and when it does, processing the test data set according to the bug fix verification rule corresponding to the test data set to generate a test result of the bug fix verification.
In some possible implementations, testing the object to be tested with the test data comprising the test data set includes: when the configuration parameter of version characteristic verification is enabled, loading a pre-configured version characteristic verification file, and processing the test data set using the version characteristic verification rules recorded in that file to generate a test result of the version characteristic verification.
In some possible implementations, after generating the test data set, the method further includes: determining the identification code of each data packet in the test data set based on an identification code comparison table, wherein each data packet comprises parameter values of at least one parameter item, the identification code comparison table comprises identification characters of a plurality of parameter items, and the identification code of each data packet comprises the identification characters of at least one parameter item and the parameter values of at least one parameter item; when the identification code of a data packet does not exist in the traffic database, storing the data packet and its identification code in the traffic database correspondingly; and when the identification code of a data packet already exists in the traffic database, overwriting the data packet corresponding to that identification code in the traffic database with the data packet.
In some possible implementations, testing the object to be tested with the test data comprising the test data set includes: each time a monitoring period ends, generating at least one test case based on the data packets stored in the traffic database during that monitoring period; and testing the object to be tested using the at least one test case.
At least one aspect of an embodiment of the present application provides a software testing apparatus, the apparatus including: a monitoring module, configured to monitor at least one network port to obtain data packets of the network traffic of an object to be tested; a first processing module, configured to, when a request class data packet carrying a tracking identifier is obtained, add the request class data packet to a request data queue; a second processing module, configured to, when a response class data packet carrying a tracking identifier is obtained, search the request data queue for the request class data packet corresponding to the tracking identifier and, when it exists, add the request class data packet and the response class data packet corresponding to the tracking identifier to a response data queue in pairs; a third processing module, configured to, when a playback class data packet carrying a tracking identifier is obtained, search the response data queue for the request class data packet and the response class data packet corresponding to the tracking identifier and, when they exist, generate a test data set consisting of the request class data packet, the response class data packet and the playback class data packet corresponding to the tracking identifier, wherein the playback class data packet is a data packet returned by the test server in response to a received recording request data packet obtained by recording the request class data packet; and a test module, configured to test the object to be tested with test data comprising the test data set.
At least one aspect of an embodiment of the present application provides a server apparatus including: a processor; a memory for storing executable instructions of the processor; the processor is configured to execute the executable instructions to implement any one of the software testing methods described above.
At least one aspect of an embodiment of the application provides a computer-readable storage medium storing executable instructions which, when executed by a processor, cause the processor to implement any one of the software testing methods described above.
In the embodiments of the present application, tracking identifiers are configured for the various data packets. When a data packet carrying a tracking identifier is captured by listening, it is cached in the corresponding data queue, and data packets carrying the same tracking identifier are associated and output in the form of a test data set consisting of a request class data packet, a response class data packet and a playback class data packet. The subsequent analysis and verification therefore no longer depend on testers filtering, classifying and matching the data packets. This not only reduces the testers' workload in processing the data packets but also enables an automated verification and testing process, helping to improve the testing efficiency of software testing based on traffic recording and playback and to achieve a faster testing process with high service coverage.
Drawings
Fig. 1 is a schematic diagram of an application scenario of a software testing method according to an embodiment of the present application;
FIG. 2 is a flowchart illustrating steps of a software testing method according to an embodiment of the present application;
FIG. 3 is a schematic step flow diagram of a process of automatically generating test cases in a software testing method according to an embodiment of the present application;
FIG. 4 is a block diagram of a software testing apparatus according to an embodiment of the present application;
fig. 5 is a block diagram of a server device according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail with reference to the accompanying drawings.
Fig. 1 is a schematic diagram of an application scenario of a software testing method provided by an embodiment of the present application. Referring to fig. 1, in an exemplary application scenario, the service server 100 communicates with the terminal device 200 over the network connection 300: it receives the request class data packet D1 sent by the terminal device 200 through a network interface and, after processing, returns the response class data packet D2 to the terminal device 200, thereby providing network services to the terminal device 200. In software testing based on traffic recording and playback, listening at the network port of the service server 100 captures the real network traffic, such as the request class data packet D1 and the response class data packet D2, generated while the service server 100 serves the terminal device 200. The service server 100 can then record the request class data packet D1 to obtain a recording request data packet D1' and send D1' to the test server 400 in the testing environment, so that the required tests can be performed based on the playback class data packet D2' returned by the test server 400. For example, by comparing the playback class data packet D2' with the response class data packet D2, one can see how the processing results of the online service and the offline service differ for requests with identical content, thereby implementing testing procedures such as bug fix verification. Note that the above testing procedure takes place only between the test server 400 and the service server 100, and the data used in the test does not change the data sent and received by the terminal device 200; therefore, regardless of the test result, the network service that the service server 100 provides to the terminal device 200 is unaffected.
It should be understood that testing procedures such as traffic capture and playback testing need not run on the service server 100 itself; they may be performed on other server devices. For example, the test server 400 may receive, over a network connection, the data packets of network traffic captured by multiple service servers 100 at their network ports, so that the required testing procedures are completed on the test server 400. Similarly, any other server device capable of network communication with both the service server 100 and the test server 400 may perform these testing procedures.
In the related art, software testing based on traffic recording and playback relies on a corresponding traffic recording and playback tool (e.g., GoReplay). However, such tools can only capture the various data packets and output them in chronological order; they cannot directly indicate the associations between data packets or the services those packets relate to. After collecting the data packets, testers therefore have to filter, classify and match them almost entirely by hand before subsequent analysis and verification, which cannot meet the demand for fast testing with high service coverage.
Fig. 2 is a flowchart illustrating steps of a software testing method according to an embodiment of the present application. Referring to fig. 2, the method may be performed, for example, by any of the server devices described above, and includes the following steps.
In step 201, at least one network port is monitored to obtain data packets of the network traffic of an object to be tested.
In step 202, when a request class data packet carrying a tracking identifier is obtained, the request class data packet is added to a request data queue.
In step 203, when a response class data packet carrying a tracking identifier is obtained, the request data queue is searched for the request class data packet corresponding to the tracking identifier; when it exists, the request class data packet and the response class data packet corresponding to the tracking identifier are added to the response data queue in pairs.
In step 204, when a playback class data packet carrying a tracking identifier is obtained, the response data queue is searched for the request class data packet and the response class data packet corresponding to the tracking identifier; when they exist, a test data set consisting of the request class data packet, the response class data packet and the playback class data packet corresponding to the tracking identifier is generated. The playback class data packet is a data packet sent by the test server in response to a recording request data packet obtained by recording the request class data packet.
In step 205, the object to be tested is tested with test data comprising a test data set.
It can be seen that, in the embodiments of the present application, tracking identifiers are configured for the various data packets. When a data packet carrying a tracking identifier is captured by listening, it is cached in the corresponding data queue, and data packets carrying the same tracking identifier are associated and output in the form of a test data set consisting of a request class data packet, a response class data packet and a playback class data packet. The subsequent analysis and verification therefore no longer depend on testers filtering, classifying and matching the data packets; this reduces the testers' workload in processing the data packets, enables an automated verification and testing process, helps improve the testing efficiency of software testing based on traffic recording and playback, and achieves a faster testing process with high service coverage.
It should be noted that the object to be tested may be, for example, a service, an application, or a system. In step 201, a listening rule may be configured to screen out the data packets of the required network traffic at a selected network port; the service server where that network port is located then sends the data packets of the network traffic over a network connection to the server device executing the software testing method, thereby obtaining the data packets of the network traffic of the object to be tested. Of course, the software testing method may also be executed by any service server itself.
It should also be noted that the tracking identifier is data content, carried in the packets of the network traffic, that indicates the response, recording and playback relationships between packets. For example, it may be an identifier that the service server 100 adds to the request class data packet D1 and the response class data packet D2 to indicate the response relationship between them; by retaining this item in the packets during recording and playback, the identifier can indicate the correspondence among the request class data packet D1, the response class data packet D2, the recording request data packet D1' and the playback class data packet D2'. Of course, the implementation of the tracking identifier is not limited to this.
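As a minimal sketch of one such implementation, a service could stamp a tracking identifier into each request class packet and copy it into the matching response class packet. The packet layout and the field name "X-Trace-Id" are assumptions for illustration; the embodiment only requires some data content that indicates the response, recording and playback relationships.

```python
import uuid

# "X-Trace-Id" is a hypothetical field name chosen for this sketch.
TRACE_KEY = "X-Trace-Id"

def tag_request(request: dict) -> dict:
    """Stamp a fresh tracking identifier into a request class data packet."""
    request.setdefault("headers", {})[TRACE_KEY] = uuid.uuid4().hex
    return request

def tag_response(request: dict, response: dict) -> dict:
    """Copy the request's tracking identifier into its response class packet,
    so the two can later be paired in the response data queue."""
    response.setdefault("headers", {})[TRACE_KEY] = request["headers"][TRACE_KEY]
    return response
```

Because the identifier is carried inside the packet, it survives recording and playback unchanged, which is what lets the recording request packet and the playback packet be matched back to the original pair.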
It should also be noted that the request data queue and the response data queue may be, for example, any type of data buffer established in the memory of the server device, or data files created under a specified directory on the server device's storage. To avoid data overflow and stale data, a maximum queue length and a timeout may be configured, and the request data queue and the response data queue maintained by, for example, deleting tail packets on overflow and periodically deleting expired packets.
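The maintenance strategy above can be sketched as a queue keyed by tracking identifier with a maximum length and a timeout. Which end counts as the "tail" on overflow is not specified in the text; this sketch assumes the oldest entry is dropped.

```python
import time
from collections import OrderedDict

class PacketQueue:
    """Data queue keyed by tracking identifier, maintained with a maximum
    queue length and a timeout as described above."""

    def __init__(self, max_len=1000, timeout_s=60.0):
        self.max_len = max_len
        self.timeout_s = timeout_s
        self._items = OrderedDict()  # trace_id -> (arrival time, packet)

    def add(self, trace_id, packet):
        if len(self._items) >= self.max_len:
            self._items.popitem(last=False)  # overflow: drop the oldest packet
        self._items[trace_id] = (time.monotonic(), packet)

    def pop(self, trace_id):
        """Remove and return the packet for trace_id, or None if absent."""
        entry = self._items.pop(trace_id, None)
        return None if entry is None else entry[1]

    def evict_expired(self):
        """Periodic maintenance: delete packets older than the timeout."""
        now = time.monotonic()
        stale = [k for k, (t, _) in self._items.items() if now - t > self.timeout_s]
        for k in stale:
            del self._items[k]
        return len(stale)
```

The same structure serves for both the request data queue and the response data queue; only the stored value differs (a single packet versus a request/response pair).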
In some possible implementations, the request data queue and the response data queue are maintained in a "dequeue after pairing succeeds" manner. For example: when any request class data packet and its corresponding response class data packet are added to the response data queue in pairs, the request class data packet is shifted out of the request data queue; and when any test data set is generated, the request class data packet and the response class data packet in the test data set are shifted out of the response data queue. In this way, each request class data packet can be paired with only one response class data packet, and each pair of request class and response class data packets can form a test data set with only one playback class data packet, so that the correspondence between data packets is clearer and subsequent analysis and testing are easier.
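The "dequeue after pairing succeeds" flow across steps 202 to 204 can be sketched as follows. Here the two queues are simplified to plain dicts keyed by tracking identifier; the function and parameter names are illustrative.

```python
def handle_packet(kind, trace_id, packet, req_q, resp_q, test_sets):
    """Route one monitored packet according to steps 202-204, maintaining the
    queues in the 'dequeue after pairing succeeds' manner."""
    if kind == "request":
        req_q[trace_id] = packet
    elif kind == "response":
        request = req_q.pop(trace_id, None)       # shift out of the request queue
        if request is not None:
            resp_q[trace_id] = (request, packet)  # cache the pair
    elif kind == "playback":
        pair = resp_q.pop(trace_id, None)         # shift the pair out as well
        if pair is not None:
            request, response = pair
            test_sets.append({"trace_id": trace_id, "request": request,
                              "response": response, "playback": packet})
```

After a request, its response, and the corresponding playback packet have all arrived, both queues are empty for that tracking identifier and exactly one test data set has been emitted.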
In some possible implementations, the testing procedure in step 205 includes: when the configuration parameter of consistency verification is enabled, comparing the playback class data packet in the test data set with the response class data packet in the test data set to generate a test result of the consistency verification.
In one example, during the listening of at least one network port, each time a new data packet carrying a tracking identifier is obtained, one of steps 202 to 205 is performed on it. If the packet is a request class data packet, it is added to the request data queue in step 202. If it is a response class data packet, its pairing is added to the response data queue in step 203, and a series of traffic recording and playback operations is automatically performed on the request class data packet paired with it (for example, recording it as a recording request data packet, sending that packet to the test server, and waiting for the corresponding playback class data packet to be returned). If it is a playback class data packet, a test data set is generated in step 204. As this process continues, multiple test data sets are generated automatically in each monitoring period. For these test data sets, some testing operations can also be performed automatically through step 205: when the configuration parameter of the consistency verification item in the automatic test configuration file is enabled, a consistency verification result is automatically generated for each test data set, for example by comparing the differences between the response class data packet and the playback class data packet by means of jsondiff and recording whether the two are identical.
Because the above process is completed automatically by the server device, testers can directly obtain the test data sets generated in each monitoring period together with their consistency verification results, and can call up the corresponding records for analysis when necessary to obtain further test results. The software testing method therefore saves testers the trouble of processing data packets and performing consistency verification separately, and greatly improves testing efficiency.
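The consistency verification above can be sketched with a small recursive diff, a stand-in for the jsondiff comparison mentioned in the text (the result layout is an assumption for illustration):

```python
def json_diff(expected, actual, path=""):
    """Collect the paths at which two JSON-like values differ."""
    diffs = []
    if isinstance(expected, dict) and isinstance(actual, dict):
        for key in sorted(set(expected) | set(actual)):
            sub = f"{path}/{key}"
            if key not in expected:
                diffs.append(f"{sub}: only in playback packet")
            elif key not in actual:
                diffs.append(f"{sub}: only in response packet")
            else:
                diffs.extend(json_diff(expected[key], actual[key], sub))
    elif expected != actual:
        diffs.append(f"{path}: {expected!r} != {actual!r}")
    return diffs

def consistency_check(test_set):
    """Compare the response class and playback class packets of one test data
    set and record whether the two results are identical."""
    diffs = json_diff(test_set["response"], test_set["playback"])
    return {"trace_id": test_set["trace_id"], "passed": not diffs, "diffs": diffs}
```

A non-empty `diffs` list is exactly the record a tester would call up for further analysis when a test data set fails consistency verification.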
In some possible implementations, the testing procedure in step 205 further includes: when the configuration parameter of bug fix verification is enabled, determining, according to the tracking identifier corresponding to the test data set, whether the test data set belongs to the test data of a bug fix project; and when it does, processing the test data set according to the bug fix verification rule corresponding to the test data set to generate a test result of the bug fix verification.
For example, when the configuration parameter of the bug fix verification item in the automatic test configuration file is enabled, the interface list for bug fix verification and the verification rule corresponding to each interface are read from the specified directory, and for each test data set: the test data set is matched against the interface list using the tracking identifier to determine whether it belongs to a bug fix project; if not, consistency verification is performed as described above to obtain a consistency verification result; if so, the test data set is processed according to the verification rule of the matched interface (for example, the set passes if a specified parameter item of the playback class data packet has a specified parameter value, and fails otherwise) to obtain a test result indicating whether the verification passes.
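A minimal sketch of this rule matching follows. The rule layout (interface name mapped to a parameter item and expected value) and matching the interface against the tracking identifier by substring are assumptions for illustration, not the patent's prescribed format.

```python
def bugfix_verify(test_set, interface_rules):
    """Match a test data set against the bug fix interface list and, on a hit,
    apply the interface's (parameter item, expected value) rule to the
    playback class data packet."""
    for interface, (param, expected) in interface_rules.items():
        if interface in test_set["trace_id"]:
            passed = test_set["playback"].get(param) == expected
            return {"interface": interface, "passed": passed}
    return None  # not bug fix project data; fall back to consistency verification
```

Returning `None` for a miss mirrors the branch in the text where a non-matching test data set falls back to ordinary consistency verification.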
In some possible implementations, the testing procedure in step 205 further includes: when the configuration parameter of version characteristic verification is enabled, loading a pre-configured version characteristic verification file, and processing the test data set using the version characteristic verification rules recorded in that file to generate a test result of the version characteristic verification.
For example, when the configuration parameter of the version characteristic verification item in the automatic test configuration file is enabled, the version characteristic verification file pre-written by the tester in the specified directory is loaded, and the test data set is processed using the version characteristic verification rules recorded in it (for example, the set passes if the specified parameter items of the playback class data packet have the specified parameter values, and fails otherwise) to obtain a test result indicating whether the version characteristic verification passes.
Fig. 3 is a schematic step flow diagram of a process of automatically generating test cases in a software testing method according to an embodiment of the present application. Referring to fig. 3, on the basis of any of the above software testing methods, the following steps may be further included.
In step 301, the identification code of each data packet in the test data set is determined based on an identification code comparison table. Each data packet comprises parameter values of at least one parameter item; the identification code comparison table records the identification characters of a plurality of parameter items; and the identification code of each data packet comprises the identification characters of at least one parameter item together with the parameter values of at least one parameter item.
In one example, the tester maintains an identification code comparison table on the server device, which records the identification character corresponding to each known parameter item and can additionally record information such as the corresponding business logic description. For example, for a data packet with the following content:
{'data': {'channel': '123', 'type': 'webrtc', 'isfullscreen': 1, 'relation': '720X1280'}},
the identification code comparison table records the identification character "ch" corresponding to the parameter item "channel", the identification character "t1" corresponding to the parameter item "type=webrtc", and the identification character "fs1" corresponding to the parameter item "isfullscreen=1", but records nothing for the parameter item "relation": "720X1280"; the identification code of this data packet is therefore determined to be "ch123-t1-fs1=relation720X1280". Before the equal sign "=", the identification character of each parameter item is joined to any parameter value it carries, and the items are connected with hyphens "-"; a parameter item that lacks a record is placed, together with its parameter value, directly after the equal sign "=" in character form. It can be seen that under this mapping rule, data packets with the same parameter items and parameter values always receive the same identification code, so data packets can be compared more simply by their identification codes.
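The mapping rule of this example can be sketched directly; the comparison table is modelled as a dict whose keys are either a bare parameter item ("channel") or a value-specific "item=value" entry ("type=webrtc"), which is one reading of the example rather than a prescribed data structure.

```python
def identification_code(data, table):
    """Build a data packet's identification code from the identification code
    comparison table, following the hyphen-and-equals layout of the example."""
    known, unrecorded = [], []
    for item, value in data.items():
        exact = f"{item}={value}"
        if exact in table:                     # value-specific entry, e.g. type=webrtc -> t1
            known.append(table[exact])
        elif item in table:                    # item entry joined to its value, e.g. channel -> ch123
            known.append(f"{table[item]}{value}")
        else:                                  # no record: placed after '=' in character form
            unrecorded.append(f"{item}{value}")
    return "-".join(known) + "=" + "".join(unrecorded)
```

Applied to the packet above with the table {"channel": "ch", "type=webrtc": "t1", "isfullscreen=1": "fs1"}, this reproduces the identification code of the example, and any packet with the same parameter items and values maps to the same code.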
In step 302, for each data packet: when its identification code does not exist in the traffic database, the data packet and its identification code are stored in the traffic database correspondingly; when the identification code already exists, the data packet overwrites the data packet corresponding to that identification code in the traffic database.
In one example, the traffic database may be configured on the server device, by analogy with the queues described above, to store the test data sets identified by identification codes. A data packet with a new identification code is stored directly in the traffic database; for an identification code that already exists, the stored data item is updated to the new data packet, ensuring that the traffic database holds the most recent data.
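With the traffic database modelled as a dict (a simplification of whatever store the server device actually uses), step 302 reduces to an insert-or-overwrite keyed by identification code:

```python
def store_packet(traffic_db, code, packet):
    """Insert-or-overwrite a packet by identification code (step 302), so the
    traffic database keeps exactly one, most recent, packet per code.
    Returns True when an existing packet was overwritten."""
    existed = code in traffic_db
    traffic_db[code] = packet
    return existed
```

Because identification codes collapse packets with identical parameter items and values onto one key, repeated traffic deduplicates itself while still refreshing the stored copy.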
In step 303, each time a monitoring period ends, at least one test case is generated based on the data packets stored in the traffic database during that monitoring period.
In step 304, the object to be tested is tested using at least one test case.
Because each test data set directly provides sample-diversity information and case-weight information for the service, test cases can be generated based on the test data sets stored in the traffic database. In one example, each time a monitoring period ends, the test data sets stored in the traffic database during that period are collected, and test cases are assembled automatically in combination with a pre-configured use case table, so that a new set of test cases is generated from the most recently monitored network traffic and the previously obtained test results are updated. In one example, when a new requirement is added, new parameter items and corresponding parameter values may appear in the data packets; for these, a tester can add the corresponding identification characters to the identification code comparison table and pre-fill the material table of the new process flow, so that new test cases and test case sets are generated at the end of each monitoring period.
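The end-of-period assembly can be sketched as follows. The use case table layout (case name mapped to the identification-code prefixes whose packets the case needs) is an assumption for illustration; the patent leaves the table's format open.

```python
def generate_test_cases(traffic_db, case_table):
    """Assemble test cases at the end of a monitoring period from the packets
    recorded in the traffic database, keyed by identification code."""
    cases = []
    for name, prefixes in case_table.items():
        packets = [packet for code, packet in traffic_db.items()
                   if any(code.startswith(prefix) for prefix in prefixes)]
        if packets:  # only emit cases for which traffic was actually recorded
            cases.append({"name": name, "packets": packets})
    return cases
```

Running this at the end of every monitoring period yields a test case set built from the latest monitored traffic, which is what allows previously obtained test results to be refreshed automatically.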
Fig. 4 is a block diagram of a software testing apparatus according to an embodiment of the present application. Referring to fig. 4, the software testing apparatus is applied to any of the server devices described above, and includes: a monitoring module 41, configured to monitor at least one network port to obtain data packets of the network traffic of an object to be tested; a first processing module 42, configured to, when a request class data packet carrying a tracking identifier is obtained, add the request class data packet to a request data queue; a second processing module 43, configured to, when a response class data packet carrying a tracking identifier is obtained, search the request data queue for a request class data packet corresponding to the tracking identifier, and, when it exists, add the request class data packet and the response class data packet corresponding to the tracking identifier to a response data queue in pairs; a third processing module 44, configured to, when a playback class data packet carrying a tracking identifier is obtained, search the response data queue for the request class data packet and the response class data packet corresponding to the tracking identifier, and, when they exist, generate a test data set including the request class data packet, the response class data packet, and the playback class data packet corresponding to the tracking identifier, the playback class data packet being a data packet returned by the test server in response to receiving a recorded copy of the request class data packet; and a test module 45, configured to test the object to be tested using test data including the test data set.
In some possible implementations, the second processing module 43 is further configured to, when any request class data packet and its corresponding response class data packet are added to the response data queue in pairs, shift the request class data packet out of the request data queue; and the third processing module 44 is further configured to, when any test data set is generated, shift the request class data packet and the response class data packet of that test data set out of the response data queue.
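Taken together, the queue handling of modules 42, 43, and 44 can be modeled roughly as below. This is an illustrative in-memory sketch; `PacketMatcher` and the packet shapes are assumptions for illustration, not the embodiment's actual data structures:

```python
from collections import OrderedDict


class PacketMatcher:
    """Pairs request, response, and playback packets by tracking identifier."""

    def __init__(self):
        self.request_queue = OrderedDict()   # trace_id -> request packet
        self.response_queue = OrderedDict()  # trace_id -> (request, response)

    def on_request(self, trace_id, packet):
        self.request_queue[trace_id] = packet

    def on_response(self, trace_id, packet):
        # Pairing shifts the request out of the request queue.
        request = self.request_queue.pop(trace_id, None)
        if request is not None:
            self.response_queue[trace_id] = (request, packet)

    def on_playback(self, trace_id, packet):
        # Generating a test data set shifts the pair out of the response queue.
        pair = self.response_queue.pop(trace_id, None)
        if pair is None:
            return None  # no matching request/response pair recorded
        request, response = pair
        return {"trace_id": trace_id, "request": request,
                "response": response, "playback": packet}
```

Shifting entries out as soon as they are consumed keeps both queues bounded to in-flight identifiers only.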
In some possible implementations, the test module 45 is further configured to: when the configuration parameter of consistency verification is on, compare the playback class data packet in the test data set with the response class data packet in the test data set to generate a test result of the consistency verification.
In some possible implementations, the test module 45 is further configured to: when the configuration parameter of bug-fix verification is on, determine, according to the tracking identifier corresponding to the test data set, whether the test data set belongs to the test data of a bug-fix project; and, when it does, process the test data set according to the bug-fix verification rule corresponding to the test data set to generate a test result of the bug-fix verification.
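A hypothetical sketch of the bug-fix verification path, assuming the tracking identifier embeds the bug-fix project tag as a prefix and each project's verification rule is a callable (both are assumptions for illustration):

```python
def bugfix_check(test_set: dict, bugfix_rules: dict):
    """Run the project's verification rule when the tracking identifier
    marks this test data set as belonging to a bug-fix project."""
    project = test_set["trace_id"].split("-")[0]  # project tag assumed as prefix
    rule = bugfix_rules.get(project)
    if rule is None:
        return None  # not test data of a bug-fix project
    return {"project": project, "passed": rule(test_set)}
```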
In some possible implementations, the test module 45 is further configured to: when the configuration parameter of version feature verification is on, load a pre-configured version feature verification file, and process the test data set using the version feature verification rules recorded in the file to generate a test result of the version feature verification.
In some possible implementations, the test module 45 is further configured to: after the test data set is generated, determine an identification code of each data packet in the test data set based on an identification code comparison table, wherein each data packet includes parameter values of at least one parameter item, the identification code comparison table includes identification characters of a plurality of parameter items, and the identification code of each data packet includes the identification character and the parameter value of at least one parameter item; and, for each data packet, when the identification code of the data packet does not exist in the traffic database, store the data packet and its identification code correspondingly in the traffic database, and when the identification code already exists, overwrite the data packet stored under that identification code in the traffic database.
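Determining an identification code from the comparison table could be sketched as below, assuming the table maps parameter item names to identification characters and the code is a simple joined string (the concrete encoding is not specified in the embodiment):

```python
def build_identification_code(packet: dict, lookup_table: dict) -> str:
    """Join the identification character and parameter value of each
    parameter item that the comparison table knows about."""
    parts = [f"{lookup_table[p]}={packet[p]}"
             for p in sorted(lookup_table) if p in packet]
    return "|".join(parts)
```

Sorting the parameter items makes the code deterministic, so the same packet shape always maps to the same database key.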
In some possible implementations, the test module 45 is further configured to: each time a listening period ends, generate at least one test case based on the data packets stored in the traffic database during that listening period; and test the object to be tested using the at least one test case.
The implementation process of the software testing apparatus provided by the embodiments of the present application is consistent with that of the software testing method described above and achieves the same effects; details are not repeated here.
Fig. 5 is a block diagram of a server device according to an embodiment of the present application. Referring to fig. 5, the server device includes a processor 51 and a memory 52 for storing executable instructions of the processor 51, wherein the processor 51 is configured to execute the executable instructions to implement any of the software testing methods described above. Like the server devices described above, the server device of this embodiment helps reduce the workload of testers in processing data packets, enables an automated verification and testing process, improves the efficiency of software testing based on traffic recording and playback, and achieves a faster testing process with high service coverage.
Embodiments of the present application also provide a computer-readable storage medium, which is a non-volatile storage medium storing executable instructions of a processor, the executable instructions being configured to, when executed by the processor, cause the processor to implement any of the software testing methods described above. Taking the memory 52 described above as an example, the computer-readable storage medium of this embodiment likewise helps reduce the workload of testers in processing data packets, enables an automated verification and testing process, improves the efficiency of software testing based on traffic recording and playback, and achieves a faster testing process with high service coverage.
The foregoing describes preferred embodiments of the present application and is not intended to limit the application; the scope of protection of the application is defined by the appended claims.

Claims (10)

1. A method of software testing, the method comprising:
monitoring at least one network port to obtain a data packet of network traffic of a subject to be tested;
when a request type data packet carrying a tracking identifier is obtained, adding the request type data packet into a request data queue;
when a response type data packet carrying a tracking identifier is obtained, searching whether the request type data packet corresponding to the tracking identifier exists in the request data queue, and adding the request type data packet corresponding to the tracking identifier and the response type data packet into the response data queue in pairs when the request type data packet exists;
when obtaining a playback data packet carrying a tracking identifier, searching whether the request data packet and the response data packet corresponding to the tracking identifier exist in a response data queue, and generating a test data group consisting of the request data packet, the response data packet and the playback data packet corresponding to the tracking identifier when the request data packet and the response data packet exist; the playback data packet is a data packet sent by a test server in response to a recording request data packet obtained by recording the request data packet;
and testing the object to be tested by using the test data comprising the test data set.
2. The method according to claim 1, wherein the method further comprises:
when any request type data packet and a corresponding response type data packet are added to a response data queue in pairs, the request type data packet is shifted out of the request data queue;
and when any test data group is generated, shifting out the request class data packet and the response class data packet in the test data group from the response data queue.
3. The method of claim 1, wherein the testing the subject with test data comprising the test data set comprises:
and when the configuration parameters of the consistency verification are on, comparing the playback class data packet in the test data set with the response class data packet in the test data set to generate a test result of the consistency verification.
4. The method of claim 1, wherein the testing the subject with test data comprising the test data set comprises:
when the configuration parameters of the bug repair verification are on, determining whether the test data set belongs to the test data of the bug repair project according to the tracking identification corresponding to the test data set;
and when the test data set belongs to the test data of the bug fix project, processing the test data set according to bug fix verification rules corresponding to the test data set to generate a test result of bug fix verification.
5. The method of claim 1, wherein the testing the subject with test data comprising the test data set comprises:
when the configuration parameters of the version characteristic verification are on, loading a pre-configured version characteristic verification file, and processing the test data set by utilizing the version characteristic verification rule recorded in the version characteristic verification file to generate a test result of the version characteristic verification.
6. The method of any one of claims 1 to 5, wherein after generating the test data set, the method further comprises:
determining an identification code of each data packet in the test data set based on an identification code comparison table, wherein each data packet comprises parameter values of at least one parameter item, the identification code comparison table comprises identification characters of a plurality of parameter items, and the identification code of each data packet comprises the identification characters of at least one parameter item and the parameter values of at least one parameter item;
for each data packet, when the identification code of the data packet does not exist in the flow database, storing the data packet and the identification code thereof correspondingly in the flow database; and when the identification code of the data packet exists in the flow database, storing the data packet in a coverage mode as the data packet corresponding to the identification code in the flow database.
7. The method of claim 6, wherein the testing the subject with test data comprising the test data set comprises:
generating at least one test case based on the data packet stored in the flow database in the monitoring period when one monitoring period is finished;
and testing the object to be tested by using the at least one test case.
8. A software testing apparatus, the apparatus comprising:
the monitoring module is used for monitoring at least one network port to acquire a data packet of the network flow of the object to be tested;
the first processing module is used for adding the request type data packet to a request data queue when the request type data packet carrying the tracking identifier is obtained;
the second processing module is used for searching whether the request type data packet corresponding to the tracking identifier exists in the request data queue when the response type data packet carrying the tracking identifier is obtained, and adding the request type data packet corresponding to the tracking identifier and the response type data packet into the response data queue in pairs when the request type data packet exists;
the third processing module is used for searching whether the request type data packet and the response type data packet corresponding to the tracking identification exist in the response data queue when the playback type data packet carrying the tracking identification is obtained, and generating a test data group consisting of the request type data packet, the response type data packet and the playback type data packet corresponding to the tracking identification when the request type data packet and the response type data packet exist; the playback data packet is a data packet returned by the test server in response to the received recording data packet of the request data packet;
and the test module is used for testing the object to be tested by using the test data comprising the test data set.
9. A server device, characterized in that the server device comprises:
a processor;
a memory for storing executable instructions of the processor;
wherein the processor is configured to execute the executable instructions to implement the method of any one of claims 1 to 7.
10. A computer readable storage medium storing executable instructions of a processor, the executable instructions being configured to cause the processor to implement the method of any one of claims 1 to 7 when executed by the processor.
CN202311007690.9A 2023-08-11 2023-08-11 Software testing method and device, server equipment and storage medium Active CN116719750B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311007690.9A CN116719750B (en) 2023-08-11 2023-08-11 Software testing method and device, server equipment and storage medium

Publications (2)

Publication Number Publication Date
CN116719750A true CN116719750A (en) 2023-09-08
CN116719750B CN116719750B (en) 2023-12-22

Family

ID=87868353

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311007690.9A Active CN116719750B (en) 2023-08-11 2023-08-11 Software testing method and device, server equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116719750B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111045952A (en) * 2019-12-16 2020-04-21 广州品唯软件有限公司 Software testing method, flow playback device, terminal equipment and readable storage medium
CN112559361A (en) * 2020-12-22 2021-03-26 京东数字科技控股股份有限公司 Flow playback method, device, equipment and computer readable medium
CN113076251A (en) * 2021-04-16 2021-07-06 北京京东拓先科技有限公司 Test method and device
CN113553260A (en) * 2021-07-22 2021-10-26 工银科技有限公司 Test method, test apparatus, device, medium, and program product
CN114138631A (en) * 2021-11-11 2022-03-04 易视腾科技股份有限公司 Test method and test device
CN114328268A (en) * 2022-01-14 2022-04-12 中国平安人寿保险股份有限公司 Software testing method, device, equipment and medium based on flow playback
CN114661580A (en) * 2022-03-01 2022-06-24 上海复深蓝软件股份有限公司 Flow recording playback method and device, computer equipment and storage medium
CN115048257A (en) * 2022-05-19 2022-09-13 上海数禾信息科技有限公司 System service function verification method and device, computer equipment and storage medium
CN115396501A (en) * 2022-08-31 2022-11-25 北京奇艺世纪科技有限公司 Information processing method and device, electronic equipment and readable storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117130941A (en) * 2023-10-24 2023-11-28 易方信息科技股份有限公司 Interface automation method, system, equipment and medium based on browser plug-in
CN117130941B (en) * 2023-10-24 2024-03-26 易方信息科技股份有限公司 Interface automation method, system, equipment and medium based on browser plug-in

Also Published As

Publication number Publication date
CN116719750B (en) 2023-12-22

Similar Documents

Publication Publication Date Title
CN107908541B (en) Interface testing method and device, computer equipment and storage medium
CN111522922A (en) Log information query method and device, storage medium and computer equipment
CN116719750B (en) Software testing method and device, server equipment and storage medium
CN109885496B (en) Test log management method and system
US20050167486A1 (en) System, method, and program for generating transaction profile for measuring and analyzing computer system performance
CN107800565A (en) Method for inspecting, device, system, computer equipment and storage medium
CN111400127B (en) Service log monitoring method and device, storage medium and computer equipment
CN109710439B (en) Fault processing method and device
CN111061696B (en) Method and device for analyzing transaction message log
CN110083581B (en) Log tracing method and device, storage medium and computer equipment
CN110633195B (en) Performance data display method and device, electronic equipment and storage medium
EP3864516A1 (en) Veto-based model for measuring product health
US20120303625A1 (en) Managing heterogeneous data
CN112153378B (en) Method and system for testing video auditing capability
CN114328566A (en) Relationship graph updating method, device, medium, equipment and generating method
CN115203004A (en) Code coverage rate testing method and device, storage medium and electronic equipment
CN106648722B (en) Method and device for processing Flume receiving terminal data based on big data
CN111831574A (en) Regression test planning method, device, computer system and medium
CN113360353B (en) Test server and cloud platform
CN112948262A (en) System test method, device, computer equipment and storage medium
CN112965912B (en) Interface test case generation method and device and electronic equipment
CN110611715A (en) System and method for collecting cloud monitoring information by service link
CN116662204A (en) Method, device, system and storage medium for generating code-free test cases
CN115022201B (en) Data processing function test method, device, equipment and storage medium
CN114610689B (en) Recording and analyzing method for request log in distributed environment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20240205

Address after: 230031 Room 672, 6/F, Building A3A4, Zhong'an Chuanggu Science Park, No. 900, Wangjiang West Road, High-tech Zone, Hefei, Anhui

Patentee after: Anhui Haima Cloud Technology Co.,Ltd.

Country or region after: China

Address before: 301700 room 2d25, Building 29, No.89 Heyuan Road, Jingjin science and Technology Valley Industrial Park, Wuqing District, Tianjin

Patentee before: HAIMAYUN (TIANJIN) INFORMATION TECHNOLOGY CO.,LTD.

Country or region before: China
