CN114143213B - Benchmarking test system, benchmarking test method, benchmarking test device, benchmarking test computer equipment and benchmarking test storage medium - Google Patents


Info

Publication number
CN114143213B
CN114143213B (application CN202111445082.7A)
Authority
CN
China
Prior art keywords
test
equipment
feedback information
tested
benchmarking
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111445082.7A
Other languages
Chinese (zh)
Other versions
CN114143213A (en)
Inventor
满朕 (Man Zhen)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Industrial and Commercial Bank of China Ltd ICBC
Original Assignee
Industrial and Commercial Bank of China Ltd ICBC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Industrial and Commercial Bank of China Ltd ICBC filed Critical Industrial and Commercial Bank of China Ltd ICBC
Priority to CN202111445082.7A priority Critical patent/CN114143213B/en
Publication of CN114143213A publication Critical patent/CN114143213A/en
Application granted granted Critical
Publication of CN114143213B publication Critical patent/CN114143213B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L41/00Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks
    • H04L41/14Network analysis or design
    • H04L41/145Network analysis or design involving simulating, designing, planning or modelling of a network

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Debugging And Monitoring (AREA)
  • Test And Diagnosis Of Digital Computers (AREA)

Abstract

The application relates to a benchmarking test system, method, apparatus, computer device and storage medium, in the technical field of testing, and can be used in the field of financial technology. The system comprises: a test device, a simulation network device, a first device, and at least one second device. The simulation network device is used for simulating a target network environment by building a hotspot, and the first device and the at least one second device access the target network environment. The test device is used for sending a first test instruction to the first device and a second test instruction to the second device. The first device and the second device respectively respond to the first test instruction and the second test instruction by executing test operations and sending first test feedback information and second test feedback information to the test device. The test device is further used for obtaining a test result for the system to be tested according to the first test feedback information and the second test feedback information. The method can improve benchmarking test efficiency.

Description

Benchmarking test system, benchmarking test method, benchmarking test device, benchmarking test computer equipment and benchmarking test storage medium
Technical Field
The present disclosure relates to the field of testing technologies, and in particular, to a benchmarking system, method, apparatus, computer device, and storage medium.
Background
In the process of system development and version upgrade, the robustness and performance of the system in different network environments need to be tested.
In the related art, in order to test the performance of a system in different network environments, it is often required to actually go to different areas and use different operator networks for testing.
Because the stability of the network environment is difficult to maintain in field tests, and the same network environment is difficult to reproduce for repeated testing, reproducing the network environment when a problem occurs is difficult, so the efficiency of benchmarking the system is low.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a benchmarking system, method, apparatus, computer device, and storage medium that can improve benchmarking efficiency.
In a first aspect, the present application provides a benchmarking test system, the system including: test equipment, simulation network equipment, first equipment and at least one second equipment, wherein the first equipment is equipment on which a system to be tested is installed, and the second equipment is equipment on which a benchmarking system is installed; wherein,
the simulation network equipment is used for simulating a target network environment by building a hot spot, and the first equipment and the at least one second equipment are accessed to the target network environment;
The test equipment is used for sending a first test instruction to the first equipment and sending a second test instruction to the second equipment;
the first device is used for responding to the first test instruction to execute test operation and sending first test feedback information to the test device, and the second device is used for responding to the second test instruction to execute test operation and sending second test feedback information to the test device;
the test equipment is also used for receiving the first test feedback information sent by the first equipment and the second test feedback information sent by the second equipment, and obtaining a test result aiming at the system to be tested according to the first test feedback information and the second test feedback information.
In one embodiment, the test device is further configured to create a problem list according to the test result and send the problem list to the target device when the test result indicates that the system to be tested fails the test, where the problem list is used to prompt a user to adjust the system to be tested.
In one embodiment, the test device is further configured to repeatedly perform test processing on the system to be tested in response to the processing operation for the problem list until the test result of the system to be tested characterizes that the system to be tested passes the test, and generate a test report according to the problem list, the processing operation for the problem list, the test result of the system to be tested, and the target network environment.
In one embodiment, the test device is further configured to obtain first test item data corresponding to a to-be-tested item from the first test feedback information, obtain second test item data corresponding to the to-be-tested item from the second test feedback information, and obtain a test result for the to-be-tested system according to a comparison result of the first test item data and the second test item data.
In one embodiment, the test feedback information includes at least one of loading time, response time, execution time, and traffic consumption.
In a second aspect, the present application further provides a benchmarking method, which includes:
a first test instruction is sent to a first device and a second test instruction is sent to a second device, wherein the first device and the second device are devices connected into the same simulation network environment, the first device is a device for installing a system to be tested, and the second device is a device for installing a benchmarking system;
receiving first test feedback information sent by the first equipment and receiving second test feedback information sent by the second equipment;
and obtaining a test result aiming at the system to be tested according to the first test feedback information and the second test feedback information.
In one embodiment, the obtaining the test result for the system to be tested according to the first test feedback information and the second test feedback information includes:
acquiring first test item data corresponding to an item to be tested from the first test feedback information, and acquiring second test item data corresponding to the item to be tested from the second test feedback information;
and obtaining a test result aiming at the system to be tested according to the comparison result of the first test item data and the second test item data.
In one embodiment, the method further comprises:
and under the condition that the test result represents that the system to be tested fails the test, creating a problem list according to the test result, and sending the problem list to target equipment, wherein the problem list is used for prompting a user to adjust the system to be tested.
In one embodiment, the method further comprises:
and responding to the processing operation aiming at the problem list, repeatedly executing test processing on the system to be tested until the test result of the system to be tested represents that the system to be tested passes the test, and generating a test report according to the problem list, the processing operation aiming at the problem list, the test result of the system to be tested and the target network environment.
In one embodiment, the test feedback information includes at least one of loading time, response time, execution time, and traffic consumption.
In a third aspect, the present application further provides a benchmarking test apparatus, the apparatus including:
a sending module, configured to send a first test instruction to a first device and a second test instruction to a second device, wherein the first device and the second device are devices connected to the same simulated network environment, the first device is a device on which a system to be tested is installed, and the second device is a device on which a benchmarking system is installed;
the receiving module is used for receiving the first test feedback information sent by the first equipment and receiving the second test feedback information sent by the second equipment;
and the test module is used for obtaining a test result aiming at the system to be tested according to the first test feedback information and the second test feedback information.
In one embodiment, the test module is further configured to:
acquiring first test item data corresponding to an item to be tested from the first test feedback information, and acquiring second test item data corresponding to the item to be tested from the second test feedback information;
and obtaining a test result for the system to be tested according to the comparison result of the first test item data and the second test item data.
In one embodiment, the apparatus further comprises:
the first processing module is used for creating a problem list according to the test result and sending the problem list to target equipment under the condition that the test result represents that the system to be tested fails to pass the test, and the problem list is used for prompting a user to adjust the system to be tested.
In one embodiment, the apparatus further comprises:
and the second processing module is used for responding to the processing operation aiming at the problem list, repeatedly executing the test processing on the system to be tested until the test result of the system to be tested characterizes the system to be tested to pass the test, and generating a test report according to the problem list, the processing operation aiming at the problem list, the test result of the system to be tested and the target network environment.
In one embodiment, the test feedback information includes at least one of loading time, response time, execution time, and traffic consumption.
In a fourth aspect, the present application also provides a computer device. The computer device comprises a memory and a processor, wherein the memory stores a computer program, and the processor realizes the benchmarking test method when executing the computer program.
In a fifth aspect, the present application also provides a computer-readable storage medium. The computer readable storage medium has stored thereon a computer program which when executed by a processor implements the above benchmarking method.
In a sixth aspect, the present application also provides a computer program product. The computer program product comprises a computer program which, when executed by a processor, implements the benchmarking method above.
The benchmarking test system, method, apparatus, computer device and storage medium above can send a first test instruction to first equipment and a second test instruction to second equipment, where the first equipment and the second equipment are devices connected to the same simulated network environment, the first equipment is equipment on which the system to be tested is installed, and the second equipment is equipment on which the benchmarking system is installed. After receiving the first test feedback information sent by the first equipment and the second test feedback information sent by the second equipment, a test result for the system to be tested can be obtained according to the first test feedback information and the second test feedback information. By simulating network environments, the benchmarking test can be performed in different network environments and a given network environment can be quickly reproduced, so the efficiency of the benchmarking test can be improved.
Drawings
FIG. 1 is a block diagram of a benchmarking system in one embodiment;
FIG. 2 is a schematic diagram of a benchmarking method in one embodiment;
FIG. 3 is a flow chart of a benchmarking method in one embodiment;
FIG. 4 is a flow chart of a benchmarking method in one embodiment;
FIG. 5 is a block diagram of a benchmarking apparatus in one embodiment;
fig. 6 is an internal structural diagram of a computer device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
An embodiment of the present application provides a benchmarking test system; referring to fig. 1, the benchmarking test system includes: the test device 102, the simulation network device 104, the first device 106 and at least one second device 108, wherein the first device 106 is a device on which a system to be tested is installed, and the second device 108 is a device on which a benchmarking system is installed;
the simulation network device 104 is configured to simulate a target network environment by building a hotspot, and the first device 106 and the at least one second device 108 access the target network environment;
The test device 102 is configured to send a first test instruction to the first device 106 and a second test instruction to the second device 108;
the first device 106 is configured to perform a test operation in response to the first test instruction and send first test feedback information to the test device 102, and the second device 108 is configured to perform a test operation in response to the second test instruction and send second test feedback information to the test device 102;
the test device 102 is further configured to receive the first test feedback information sent by the first device 106 and receive the second test feedback information sent by the second device 108, and obtain a test result for the system to be tested according to the first test feedback information and the second test feedback information.
In this embodiment, the target network environment may be simulated by the simulation network device 104, where the target network environment may be a simulated network environment for testing the system to be tested. For example, a hotspot can be built with a mobile phone hotspot or a PC (Personal Computer) with a third-party tool installed, so as to simulate different network environments and perform benchmarking tests on the system to be tested in those environments. For instance, the operator access point may be set on the mobile phone, including IPv4, IPv6, IPv4/IPv6 dual stack, 2G, 3G, 4G, 5G, etc., to simulate different network environments, and tools installed on the PC, such as Fiddler, Charles, or Network Emulator Toolkit, can set network bandwidth, delay, fluctuation, etc. to simulate different network environments.
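Choosing a network environment this way amounts to selecting a profile of bandwidth, delay, fluctuation and IP stack and applying it through the hotspot or proxy tool. A minimal sketch of such profiles in Python follows; the profile names and figures are illustrative assumptions, not operator data or part of the patent:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class NetworkProfile:
    """Parameters a proxy tool (e.g. Fiddler or Charles) would apply to simulate a network."""
    name: str
    bandwidth_kbps: int   # downlink bandwidth cap
    latency_ms: int       # added round-trip delay
    jitter_ms: int        # random fluctuation on the delay
    ip_stack: str         # "ipv4", "ipv6", or "dual"

# Illustrative values only -- real 2G/3G/4G figures vary by operator and region.
PROFILES = {
    "3g":   NetworkProfile("3g",   bandwidth_kbps=1_000,  latency_ms=150, jitter_ms=50,  ip_stack="ipv4"),
    "4g":   NetworkProfile("4g",   bandwidth_kbps=20_000, latency_ms=50,  jitter_ms=20,  ip_stack="dual"),
    "weak": NetworkProfile("weak", bandwidth_kbps=128,    latency_ms=400, jitter_ms=200, ip_stack="ipv4"),
}

def pick_profile(name):
    """Look up the profile the simulation network device should apply."""
    try:
        return PROFILES[name]
    except KeyError:
        raise ValueError(f"unknown network profile: {name!r}")
```

A real deployment would feed these values into the throttling settings of the chosen tool rather than keep them in a Python dict.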
After the target network environment is simulated by the simulation network device 104, the first device 106 and the second device 108 may access the target network environment. The first device 106 has the system to be tested installed, and the second device 108 has the benchmarking system installed; the benchmarking system may be a competing system of the system to be tested or a historical version of the system to be tested.
The first test instruction may be sent by the test device 102 to the first device 106 and the second test instruction may be sent to the second device 108. The first test instruction may include an execution instruction for controlling the first device 106 to perform various functions of the system to be tested in a traversing manner, that is, the first test instruction may be used to instruct the first device 106 to perform various functions in a traversing manner. Similarly, the second test instruction may include an execution instruction for controlling the second device 108 to perform various functions of the benchmarking system in a traversal manner, i.e., the second test instruction may be used to instruct the second device 108 to perform various functions in a traversal manner.
After the first device 106 receives the first test instruction, it may execute a test operation in response to the first test instruction and send corresponding first test feedback information to the test device 102 according to the running situation during the test. Similarly, after the second device 108 receives the second test instruction, it may execute a test operation in response and send corresponding second test feedback information to the test device 102. In one possible implementation, the feedback information may include at least one of loading time, response time, execution time, and traffic consumption.
After the test device 102 receives the first test feedback information sent by the first device 106 and the second test feedback information sent by the second device 108, the first test feedback information and the second test feedback information may be compared to obtain the test result of the system to be tested. For example, in the case that the page loading time in the first test feedback information is longer than the page loading time in the second test feedback information, the obtained test result may include: the system to be tested fails the benchmarking test; problem: page loading of the system to be tested is slower than that of the benchmarking system; network environment: the target network environment. Or, in the case that the feedback data of each function in the first test feedback information is better than the feedback data of the corresponding function in the second test feedback information, the obtained test result may include: the system to be tested passes the benchmarking test; network environment: the target network environment.
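The comparison described above can be sketched as a small function; the dict-based feedback format and the "lower is better" convention for each metric are assumptions for illustration, not the patent's data format:

```python
def compare_feedback(first, second):
    """Compare feedback from the system under test (first) against the benchmark (second).

    Both dicts map a metric name to a measured value; for every metric here,
    lower is better (times in ms, traffic in KB). Hypothetical format.
    """
    problems = []
    for metric, value in first.items():
        benchmark = second.get(metric)
        if benchmark is not None and value > benchmark:
            problems.append(f"{metric}: {value} vs benchmark {benchmark}")
    return {
        "passed": not problems,           # passes only if no metric is worse
        "problems": problems,
        "network_environment": "target",  # placeholder for the simulated environment name
    }
```

In practice the test result would also carry the raw feedback of both systems, as the next paragraph notes.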
It should be noted that the above test result is only an example in the embodiment of the present application and is not to be understood as a limitation. In fact, the test result may also include the first test feedback information, the second test feedback information, and their comparison; the embodiment of the present application does not specifically limit the test result.
In one embodiment, the test device 102 is further configured to obtain first test item data corresponding to the to-be-tested item from the first test feedback information, obtain second test item data corresponding to the to-be-tested item from the second test feedback information, and obtain a test result for the to-be-tested system according to a comparison result of the first test item data and the second test item data.
In this embodiment of the present application, an item to be tested may be preset, where the item to be tested may be a data item on which the system to be tested and the benchmarking system are functionally compared, and may include at least one of loading time, response time, execution time, traffic consumption, etc. After obtaining the first test feedback information and the second test feedback information, the test device 102 may obtain the first test item data and the second test item data corresponding to the item to be tested from the first test feedback information and the second test feedback information, respectively. For example, in the case that the item to be tested is the loading time, the page loading time can be obtained from the first test feedback information as the first loading time, and the page loading time can be obtained from the second test feedback information as the second loading time.
After the first loading time and the second loading time are obtained, they can be compared, and the corresponding comparison result is obtained according to the comparison rule for the item to be tested. For example, the comparison rule for loading time may be: if the loading time of the system to be tested is smaller than that of the benchmarking system, the loading-time test succeeds; otherwise, it fails.
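Per-item comparison rules of this kind can be kept in a table, one rule per item to be tested. The rule set below is a sketch under the stated assumptions (metric names and the pass conditions are hypothetical, following the loading-time rule in the text):

```python
# Each rule maps (system_under_test_value, benchmark_value) to True when the
# system under test passes that item; "lower is better" for time and traffic,
# matching the rule described for loading time.
RULES = {
    "load_time_ms":     lambda sut, bench: sut < bench,
    "response_time_ms": lambda sut, bench: sut < bench,
    "traffic_kb":       lambda sut, bench: sut <= bench,
}

def evaluate_items(first_feedback, second_feedback):
    """Return {item: "success" or "failure"} for every item present in both feedbacks."""
    results = {}
    for item, rule in RULES.items():
        if item in first_feedback and item in second_feedback:
            ok = rule(first_feedback[item], second_feedback[item])
            results[item] = "success" if ok else "failure"
    return results
```

Items missing from either feedback are simply skipped, so the rule table can grow without requiring every device to report every metric.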
According to the benchmarking test system, the simulation network equipment is used for simulating a target network environment by building a hot spot, the first equipment and at least one second equipment are connected into the target network environment, the test equipment is used for sending a first test instruction to the first equipment and a second test instruction to the second equipment, the first equipment is used for responding to the first test instruction to execute test operation and sending first test feedback information to the test equipment, and the second equipment is used for responding to the second test instruction to execute test operation and sending second test feedback information to the test equipment. The test equipment is also used for receiving the first test feedback information sent by the first equipment and the second test feedback information sent by the second equipment, and obtaining a test result aiming at the system to be tested according to the first test feedback information and the second test feedback information. According to the benchmarking test system provided by the embodiment of the disclosure, the benchmarking test can be performed in different network environments through simulating the network environments, and the reproduction of the network environments can be rapidly realized through simulating the network environments, so that the test efficiency of the benchmarking test can be improved.
In one embodiment, the test device 102 is further configured to create a problem list according to the test result and send the problem list to the target device, where the test result indicates that the system to be tested fails the test, where the problem list is used to prompt the user to adjust the system to be tested.
In this embodiment of the present application, in the case that the test result indicates that the system to be tested fails the test, the test device 102 may create a problem list according to the test result, and send the problem list to the target device. The problem list may include a reason that the test result fails the test, and the user may view the problem list from the target device, and locate a portion to be adjusted of the system to be tested according to the problem list, so as to adjust the system to be tested.
In one embodiment, the test device 102 is further configured to repeatedly perform a test process on the system to be tested in response to the processing operation for the problem list until the test result of the system to be tested indicates that the system to be tested passes the test, and generate a test report according to the problem list, the processing operation for the problem list, the test result of the system to be tested, and the target network environment.
In the embodiment of the present application, the processing operation for the problem list may include the adjustment the user makes to the system to be tested in response to the problem list. The test device 102 may track the processing operation for the problem list; after the system to be tested has been adjusted, it may simulate the target network environment again and repeat the benchmarking test, problem-list generation and other operations in the target network environment until the test result indicates that the system to be tested passes the test, at which point testing stops. The problem lists from the test process, the processing operations for each problem list, the test results of each test round and the target network environment can then be summarized into a test report. In response to a user's query or export operation on the test report, the problems found during the benchmarking test and their processing operations can be queried and exported, providing data support for subsequent evaluation.
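The repeat-until-pass workflow can be sketched as a loop that records each round for the final report. The callable interface (`execute_test`, `apply_fix`) and the report shape are assumptions modelled on the described workflow, not a real API:

```python
def run_until_pass(execute_test, apply_fix, max_rounds=10):
    """Repeat the benchmarking test until it passes, collecting report data.

    execute_test() -> (passed: bool, problem_list: list of str)
    apply_fix(problem_list) models the user's processing operation on the list.
    max_rounds bounds the loop so an unfixable problem cannot spin forever.
    """
    report = {"rounds": [], "passed": False}
    for _ in range(max_rounds):
        passed, problems = execute_test()
        report["rounds"].append({"passed": passed, "problems": problems})
        if passed:
            report["passed"] = True
            break
        apply_fix(problems)  # user adjusts the system to be tested
    return report
```

The per-round entries in `report["rounds"]` correspond to the problem lists and test results that the text says are summarized into the test report.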
In order for those skilled in the art to better understand the embodiments of the present application, the embodiments of the present disclosure are described below by way of specific examples.
The benchmarking test system mainly comprises a simulation network device, a test device, a first device and at least one second device, wherein the first device is a device on which the system to be tested is installed, and the second device is a device on which the benchmarking system is installed. The test device can comprise a test execution module, a data recording module, a data analysis module, a problem tracking module and a test report module, and the first device and the second device can each comprise a data monitoring module.
Referring to fig. 2, different network environments may be simulated by the simulation network device. For example, a hotspot can be built using a mobile phone hotspot or a PC with a third-party tool installed; the mobile phone can set the operator access point (IPv4, IPv6, IPv4/IPv6 dual stack; 2G, 3G, 4G, 5G), and tools installed on the PC, such as Fiddler, Charles, or Network Emulator Toolkit, can set network bandwidth, delay, fluctuation, etc. to simulate different network environments. After the simulation network device simulates the target network environment, the first device and the second device may be placed in the target network environment and connected to the server by setting up a proxy.
The test execution module in the test device, in response to manual operation by a user or to an automation tool/script, controls the system to be tested installed on the first device and the benchmarking system to synchronously traverse the same kind of functions. For example, automated test tools such as selenium, monkey, appium and airtest can be used to control the system to be tested and the benchmarking system to automatically traverse similar business-scenario functions in the target network environment (the traversal can be executed repeatedly to obtain average data and improve accuracy).
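The parenthetical note about repeated execution amounts to averaging each metric over several traversals to smooth out fluctuation in the simulated network. A minimal sketch, assuming each run returns a dict of metric samples:

```python
from statistics import mean

def average_runs(run_once, repeats=5):
    """Run the same traversal `repeats` times and average each metric.

    run_once() -> dict mapping metric name to a numeric sample; the interface
    is an assumption standing in for one automated traversal of the functions.
    """
    samples = [run_once() for _ in range(repeats)]
    metrics = samples[0].keys()
    return {m: mean(s[m] for s in samples) for m in metrics}
```

Averaging is what lets the data analysis module speak of "average time" later, rather than comparing single noisy measurements.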
During the test, the first device and the second device may use a third-party tool or a program script, such as Fiddler or Wireshark, to continuously monitor the test data of the system to be tested and the benchmarking system in the target network environment, for example loading time (milliseconds), response time (milliseconds), execution time (milliseconds) and traffic consumption (KB), and transmit test feedback information such as the success/failure/abnormal status of each executed function and the execution log to the data recording module of the test device, thereby realizing benchmarking of similar business scenarios of the system to be tested and the benchmarking system in the same network environment at the same time.
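The timing side of such a data monitoring module can be sketched with a context manager that measures one test step and records it into the feedback dict; the metric key and feedback shape are assumptions for illustration:

```python
import time
from contextlib import contextmanager

@contextmanager
def monitor(feedback, key):
    """Measure the wall-clock duration of one test step in milliseconds
    and record it in `feedback` under `key`, mimicking the data-monitoring
    module's collection of execution-time data."""
    start = time.perf_counter()
    try:
        yield
    finally:
        feedback[key] = (time.perf_counter() - start) * 1000.0
```

A real monitor would additionally capture traffic (e.g. from a proxy like Fiddler) and the success/failure status of the step before uploading the record.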
The data recording module of the test equipment can record the test feedback information uploaded by the data monitoring module into a database and a file system, so that the data analysis module can analyze the test feedback information further.
The data analysis module of the test device analyzes the single and repeated test feedback information recorded by the data recording module. By repeatedly reproducing the target network environment and re-running the test execution module, it confirms the performance bottleneck or optimal performance data of the system to be tested, compares them with the expected benchmark (such as average time and optimal data) or with the benchmarking system's data in the same network environment, and records the differences. With the execution log recorded by the data recording module, the user-experience differences of each link and operation step between the system to be tested and the benchmarking system in similar business scenarios can be conveniently compared, so that links that can be improved are found and a more accurate benchmarking test is realized.
The problem tracking module of the test device can upload test-execution failure data and benchmarking test data (logs, screenshots, traffic) found by the data analysis module to the problem system, so that the problem system can automatically create a problem list, track it and forward it to the target device. The user can then conveniently locate the problem according to the problem list displayed on the target device and, after the problem is solved, repeat the test steps of the test execution module until the system to be tested performs better than expected or better than the benchmarking system.
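Turning a failed test result into problem-list entries can be sketched as follows; the entry fields (`summary`, `network_environment`, `status`) are assumptions about what a problem system might accept, not a documented schema:

```python
def create_problem_list(test_result):
    """Build problem-list entries from a test result, in the shape the
    problem-tracking module might upload. A passing result yields no entries."""
    if test_result.get("passed"):
        return []
    return [
        {
            "summary": problem,
            "network_environment": test_result.get("network_environment", "unknown"),
            "status": "open",  # the problem system would track this to "resolved"
        }
        for problem in test_result.get("problems", [])
    ]
```

Recording the network environment on each entry is what later allows the environment to be reproduced when the problem is investigated.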
The test report module of the test equipment may issue a test report based on the test conditions of the foregoing process. For example, it may count, through a TestNG report, whether each function is normal under weak-network conditions, summarize information such as the problems solved during testing and the test results, and support querying and exporting the history of problems found and solved during the benchmarking test, which is convenient for later evaluation and use.
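As a plain-Python stand-in for the reporting step (the embodiment mentions a TestNG report; this sketch does not use TestNG), the following tallies pass/fail outcomes into a summary:

```python
# Minimal reporting sketch: reduce a list of per-function outcomes
# (True = function normal under the weak-network condition) into a
# summary a report template could consume.
def summarise(results):
    report = {"passed": 0, "failed": 0, "total": len(results)}
    for ok in results:
        report["passed" if ok else "failed"] += 1
    return report
```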
The benchmarking test system provided by the embodiment of the present application has a lower environment construction cost and stronger operability, can provide both a weak-network environment with fixed conditions and a fluctuating weak-network environment with a controllable range, and is suitable for benchmarking tests of various systems to be tested, such as mobile phones, mobile devices, and PCs.
In one embodiment, as shown in fig. 3, a benchmarking test method is provided. The method is described here as applied to a terminal by way of illustration; it is understood that the method may also be applied to a server, or to a system including the terminal and the server and implemented through interaction between them. In this embodiment, the method includes the following steps:
step 302, a first test instruction is sent to a first device and a second test instruction is sent to a second device, wherein the first device and the second device are devices connected into the same simulation network environment, the first device is a device for installing a system to be tested, and the second device is a device for installing a benchmarking system;
step 304, receiving first test feedback information sent by a first device and receiving second test feedback information sent by a second device;
step 306, obtaining a test result for the system to be tested according to the first test feedback information and the second test feedback information.
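The three steps above (302, 304, 306) can be sketched as a single driver function; `send_instruction` and `compare` are hypothetical injected callables standing in for the real device communication and comparison logic:

```python
# Driver sketch for the benchmarking test method. The injected
# callables keep the sketch self-contained; real devices and the
# real comparison rule would sit behind them.
def run_benchmark_test(first_device, second_device, send_instruction, compare):
    fb1 = send_instruction(first_device, "first test instruction")   # step 302
    fb2 = send_instruction(second_device, "second test instruction")
    # Step 304: the feedback is returned by the devices.
    # Step 306: compare the two pieces of feedback into a test result.
    return compare(fb1, fb2)
```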
In this embodiment of the present application, a target network environment may be simulated by building a hotspot through a mobile phone hotspot and/or a third-party tool installed on a PC. For the specific simulation process of a network environment, reference may be made to the related description of the foregoing embodiments, which is not repeated here.
The first device on which the system to be tested is installed and the second device on which the benchmarking system is installed can be simultaneously connected to the target network environment. After the first device and the second device are connected to the target network environment, a first test instruction can be sent to the first device and a second test instruction to the second device. The first test instruction may include function execution instructions for controlling the first device to traverse and execute the various functions of the system to be tested, and the second test instruction may include function execution instructions for controlling the second device to traverse and execute the various functions of the benchmarking system.
The first device can respond to the first test instruction to execute the test operation, and feed back corresponding first test feedback information according to the running condition in the test process, and the second device can respond to the second test instruction to execute the test operation, and feed back corresponding second test feedback information according to the running condition in the test process. In one possible implementation, the feedback information may include at least one of loading time, response time, execution time, and traffic consumption.
After receiving the first test feedback information sent by the first device and the second test feedback information sent by the second device, the first test feedback information and the second test feedback information can be compared, so that a test result for the system to be tested is obtained. The specific process of comparison and of obtaining the test result may refer to the related description of the foregoing embodiments and is not repeated here.
According to the above benchmarking test method, a first test instruction can be sent to the first device and a second test instruction to the second device, where the first device and the second device are devices connected to the same simulated network environment, the first device is the device on which the system to be tested is installed, and the second device is the device on which the benchmarking system is installed. After the first test feedback information sent by the first device and the second test feedback information sent by the second device are received, a test result for the system to be tested can be obtained according to them. Because the network environment is simulated, the benchmarking test can be performed under different network environments, and a given network environment can be rapidly reproduced, so that the test efficiency of the benchmarking test is improved.
In one embodiment, referring to fig. 4, in step 306, obtaining the test result for the system to be tested according to the first test feedback information and the second test feedback information may include:
step 402, acquiring first test item data corresponding to a to-be-tested item from first test feedback information, and acquiring second test item data corresponding to the to-be-tested item from second test feedback information;
step 404, obtaining a test result for the system to be tested according to the comparison result of the first test item data and the second test item data.
In this embodiment of the present application, an item to be tested may be preset. The item to be tested may be a data item on which the system to be tested and the benchmarking system are functionally compared, and may include at least one of loading time, response time, execution time, and traffic consumption.
After the first test feedback information and the second test feedback information are obtained, the first test item data and the second test item data corresponding to the item to be tested can be obtained from them respectively, and a corresponding comparison result can be obtained according to the comparison rule corresponding to the item to be tested. The specific process may refer to the related description of the foregoing embodiments and is not detailed here.
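Steps 402 and 404 can be sketched as follows; the rule applied here (the system to be tested passes an item when its value does not exceed that of the benchmarking system) is an assumption for illustration, since the embodiment leaves the per-item comparison rule open:

```python
# Sketch of steps 402/404: pull the data for one item to be tested
# out of each piece of feedback, then apply that item's comparison
# rule. "Lower or equal passes" is an illustrative rule only.
def compare_item(first_feedback, second_feedback, item):
    first = first_feedback[item]     # step 402: first test item data
    second = second_feedback[item]   # step 402: second test item data
    return {"item": item, "first": first, "second": second,
            "pass": first <= second}  # step 404: comparison result
```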
In one embodiment, the benchmarking test method may further include:
under the condition that the test result represents that the system to be tested fails the test, a problem list is created according to the test result, and the problem list is sent to the target equipment and is used for prompting a user to adjust the system to be tested.
In one embodiment, the benchmarking test method may further include:
and repeatedly executing test processing on the system to be tested until the test result of the system to be tested represents that the system to be tested passes the test, and generating a test report according to the problem list, the processing operation on the problem list, the test result of the system to be tested and the target network environment.
In the embodiment of the present application, the process of creating the problem list and the subsequent tracking process may refer to the related description of the foregoing embodiment, which is not repeated herein.
The benchmarking test method provided by the embodiment of the present application has the advantages of low cost, easy simulation and deployment of different network environments, accurate traffic-consumption monitoring, repeatable verification, and lower labor cost, and can automatically execute the benchmarking test task while monitoring the traffic consumption.
It should be understood that, although the steps in the flowcharts of the above embodiments are shown sequentially as indicated by the arrows, these steps are not necessarily performed in the order indicated. Unless explicitly stated herein, the order of execution of these steps is not strictly limited, and they may be performed in other orders. Moreover, at least some of the steps in the above flowcharts may include multiple sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different moments, and their order of execution is not necessarily sequential; they may be performed in turn or alternately with at least part of the other steps or stages.
Based on the same inventive concept, the embodiment of the application also provides a benchmarking test device for realizing the benchmarking test method. The implementation of the solution provided by the device is similar to that described in the above method, so the specific limitation of one or more embodiments of the benchmarking device provided below may refer to the limitation of the benchmarking method hereinabove, and will not be repeated herein.
In one embodiment, as shown in fig. 5, there is provided a benchmarking test apparatus comprising: a transmitting module 502, a receiving module 504, and a testing module 506, wherein:
a sending module 502, configured to send a first test instruction to a first device and send a second test instruction to a second device, where the first device and the second device are devices that access the same analog network environment, the first device is a device for installing a system to be tested, and the second device is a device for installing a benchmarking system;
a receiving module 504, configured to receive first test feedback information sent by the first device and receive second test feedback information sent by the second device;
and the test module 506 is configured to obtain a test result for the system to be tested according to the first test feedback information and the second test feedback information.
The above benchmarking test device can send a first test instruction to the first device and a second test instruction to the second device, where the first device and the second device are devices connected to the same simulated network environment, the first device is the device on which the system to be tested is installed, and the second device is the device on which the benchmarking system is installed. After the first test feedback information sent by the first device and the second test feedback information sent by the second device are received, a test result for the system to be tested can be obtained according to them. According to the benchmarking test device provided by the embodiment of the present disclosure, the benchmarking test can be performed under different network environments through simulation of the network environment, and a given network environment can be rapidly reproduced, so that the test efficiency of the benchmarking test is improved.
In one embodiment, the test module is further configured to:
acquiring first test item data corresponding to an item to be tested from the first test feedback information, and acquiring second test item data corresponding to the item to be tested from the second test feedback information;
and obtaining a test result aiming at the system to be tested according to the comparison result of the first test item data and the second test item data.
In one embodiment, the apparatus further comprises:
the first processing module is used for creating a problem list according to the test result and sending the problem list to target equipment under the condition that the test result represents that the system to be tested fails to pass the test, and the problem list is used for prompting a user to adjust the system to be tested.
In one embodiment, the apparatus further comprises:
and the second processing module is used for responding to the processing operation aiming at the problem list, repeatedly executing the test processing on the system to be tested until the test result of the system to be tested characterizes the system to be tested to pass the test, and generating a test report according to the problem list, the processing operation aiming at the problem list, the test result of the system to be tested and the target network environment.
In one embodiment, the test feedback information includes at least one of loading time, response time, execution time, and traffic consumption.
The modules in the above benchmarking test device may be implemented in whole or in part by software, hardware, or a combination thereof. The above modules may be embedded in hardware form in, or independent of, a processor in the computer device, or may be stored in software form in a memory of the computer device, so that the processor can call and execute the operations corresponding to each module.
In one embodiment, a computer device is provided, which may be a terminal, and the internal structure of which may be as shown in fig. 6. The computer device includes a processor, a memory, a communication interface, a display screen, and an input device connected by a system bus. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for running the operating system and the computer program in the non-volatile storage medium. The communication interface of the computer device is used for wired or wireless communication with an external terminal; the wireless mode may be implemented through Wi-Fi, a mobile cellular network, NFC (near field communication), or other technologies. The computer program, when executed by the processor, implements a benchmarking test method. The display screen of the computer device may be a liquid crystal display or an electronic ink display, and the input device of the computer device may be a touch layer covering the display screen, keys, a trackball, or a touchpad arranged on the housing of the computer device, or an external keyboard, touchpad, or mouse.
It will be appreciated by those skilled in the art that the structure shown in fig. 6 is merely a block diagram of some of the structures associated with the present application and is not limiting of the computer device to which the present application may be applied, and that a particular computer device may include more or fewer components than shown, or may combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided comprising a memory and a processor, the memory having stored therein a computer program, the processor when executing the computer program performing the steps of:
a first test instruction is sent to a first device and a second test instruction is sent to a second device, wherein the first device and the second device are devices connected into the same simulation network environment, the first device is a device for installing a system to be tested, and the second device is a device for installing a benchmarking system;
receiving first test feedback information sent by the first equipment and receiving second test feedback information sent by the second equipment;
and obtaining a test result aiming at the system to be tested according to the first test feedback information and the second test feedback information.
In one embodiment, the processor when executing the computer program further performs the steps of:
acquiring first test item data corresponding to an item to be tested from the first test feedback information, and acquiring second test item data corresponding to the item to be tested from the second test feedback information; and obtaining a test result aiming at the system to be tested according to the comparison result of the first test item data and the second test item data.
In one embodiment, the processor when executing the computer program further performs the steps of:
and under the condition that the test result represents that the system to be tested fails the test, creating a problem list according to the test result, and sending the problem list to target equipment, wherein the problem list is used for prompting a user to adjust the system to be tested.
In one embodiment, the processor when executing the computer program further performs the steps of:
and responding to the processing operation aiming at the problem list, repeatedly executing test processing on the system to be tested until the test result of the system to be tested represents that the system to be tested passes the test, and generating a test report according to the problem list, the processing operation aiming at the problem list, the test result of the system to be tested and the target network environment.
In one embodiment, a computer readable storage medium is provided having a computer program stored thereon, which when executed by a processor, performs the steps of:
a first test instruction is sent to a first device and a second test instruction is sent to a second device, wherein the first device and the second device are devices connected into the same simulation network environment, the first device is a device for installing a system to be tested, and the second device is a device for installing a benchmarking system;
receiving first test feedback information sent by the first equipment and receiving second test feedback information sent by the second equipment;
and obtaining a test result aiming at the system to be tested according to the first test feedback information and the second test feedback information.
In one embodiment, the computer program when executed by the processor further performs the steps of:
acquiring first test item data corresponding to an item to be tested from the first test feedback information, and acquiring second test item data corresponding to the item to be tested from the second test feedback information; and obtaining a test result aiming at the system to be tested according to the comparison result of the first test item data and the second test item data.
In one embodiment, the computer program when executed by the processor further performs the steps of:
and under the condition that the test result represents that the system to be tested fails the test, creating a problem list according to the test result, and sending the problem list to target equipment, wherein the problem list is used for prompting a user to adjust the system to be tested.
In one embodiment, the computer program when executed by the processor further performs the steps of:
and responding to the processing operation aiming at the problem list, repeatedly executing test processing on the system to be tested until the test result of the system to be tested represents that the system to be tested passes the test, and generating a test report according to the problem list, the processing operation aiming at the problem list, the test result of the system to be tested and the target network environment.
In one embodiment, a computer program product is provided comprising a computer program which, when executed by a processor, performs the steps of:
a first test instruction is sent to a first device and a second test instruction is sent to a second device, wherein the first device and the second device are devices connected into the same simulation network environment, the first device is a device for installing a system to be tested, and the second device is a device for installing a benchmarking system;
Receiving first test feedback information sent by the first equipment and receiving second test feedback information sent by the second equipment;
and obtaining a test result aiming at the system to be tested according to the first test feedback information and the second test feedback information.
In one embodiment, the computer program when executed by the processor further performs the steps of:
acquiring first test item data corresponding to an item to be tested from the first test feedback information, and acquiring second test item data corresponding to the item to be tested from the second test feedback information;
and obtaining a test result aiming at the system to be tested according to the comparison result of the first test item data and the second test item data.
In one embodiment, the computer program when executed by the processor further performs the steps of:
and under the condition that the test result represents that the system to be tested fails the test, creating a problem list according to the test result, and sending the problem list to target equipment, wherein the problem list is used for prompting a user to adjust the system to be tested.
In one embodiment, the computer program when executed by the processor further performs the steps of:
And responding to the processing operation aiming at the problem list, repeatedly executing test processing on the system to be tested until the test result of the system to be tested represents that the system to be tested passes the test, and generating a test report according to the problem list, the processing operation aiming at the problem list, the test result of the system to be tested and the target network environment.
The benchmarking test method and device provided by the present application can be used in the field of financial technology, and can also be used in any field other than financial technology, such as the field of artificial intelligence and the field of big data.
It should be noted that, user information (including but not limited to user equipment information, user personal information, etc.) and data (including but not limited to data for analysis, stored data, presented data, etc.) referred to in the present application are information and data authorized by the user or sufficiently authorized by each party.
Those skilled in the art will appreciate that all or part of the processes of the methods of the above embodiments may be implemented by a computer program, which may be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the embodiments of the above methods. Any reference to memory, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. Non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, resistive random access memory (ReRAM), magnetoresistive random access memory (MRAM), ferroelectric random access memory (FRAM), phase change memory (PCM), graphene memory, and the like. Volatile memory may include random access memory (RAM), external cache memory, and the like. By way of illustration and not limitation, RAM is available in many forms, such as static random access memory (SRAM) or dynamic random access memory (DRAM). The databases referred to in the embodiments provided herein may include at least one of relational and non-relational databases. Non-relational databases may include, but are not limited to, blockchain-based distributed databases and the like. The processors referred to in the embodiments provided herein may be, but are not limited to, general-purpose processors, central processing units, graphics processors, digital signal processors, programmable logic units, or data processing logic units based on quantum computing.
The technical features of the above embodiments may be combined arbitrarily. For brevity of description, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction in a combination of these technical features, it should be considered to be within the scope of this specification.
The above embodiments represent only a few implementations of the present application, and their description is relatively specific and detailed, but they are not to be construed as limiting the scope of the present application. It should be noted that various modifications and improvements can be made by those of ordinary skill in the art without departing from the concept of the present application, and these all fall within the protection scope of the present application. Accordingly, the protection scope of the present application shall be subject to the appended claims.

Claims (10)

1. A benchmarking system, said system comprising: the system comprises test equipment, simulation network equipment, first equipment and at least one second equipment, wherein the first equipment is equipment for installing a system to be tested, and the second equipment is equipment for installing a benchmarking system; wherein,
the simulation network equipment is used for simulating a target network environment by building hot spots, the first equipment and the at least one second equipment are accessed into the target network environment, wherein the simulation network equipment can simulate different network environments by building hot spots through mobile phone hot spots and/or installing a third-party tool on a personal computer, and the calibration test of the system to be tested under different network environments is realized;
The test equipment is used for sending a first test instruction to the first equipment and sending a second test instruction to the second equipment;
the first device is used for responding to the first test instruction to execute test operation and sending first test feedback information to the test device, and the second device is used for responding to the second test instruction to execute test operation and sending second test feedback information to the test device;
the test equipment is also used for receiving the first test feedback information sent by the first equipment and the second test feedback information sent by the second equipment, and obtaining a test result aiming at the system to be tested according to the first test feedback information and the second test feedback information.
2. The system of claim 1, wherein the test device is further configured to create a problem list based on the test result and send the problem list to a target device, where the test result indicates that the system to be tested fails the test, the problem list being configured to prompt a user to adjust the system to be tested.
3. The system of claim 2, wherein the test equipment is further configured to repeatedly perform a test process on the system under test in response to a processing operation for the problem list until a test result of the system under test characterizes that the system under test passes a test, and generate a test report according to the problem list, the processing operation for the problem list, and the test result of the system under test, the target network environment.
4. The system according to any one of claims 1 to 3, wherein the test device is further configured to obtain first test item data corresponding to a to-be-tested item from the first test feedback information, obtain second test item data corresponding to the to-be-tested item from the second test feedback information, and obtain a test result for the to-be-tested system according to a comparison result of the first test item data and the second test item data.
5. A system according to any one of claims 1 to 3, wherein the test feedback information comprises at least one of load time, response time, execution time, traffic consumption.
6. A method of benchmarking, said method comprising:
a first test instruction is sent to a first device and a second test instruction is sent to a second device, wherein the first device and the second device are devices connected into the same simulation network environment, the first device is a device for installing a system to be tested, the second device is a device for installing a benchmarking system, and the simulation network environment is obtained by performing network simulation through a mobile phone hotspot and/or a third party tool building hotspot installed on a personal computer;
Receiving first test feedback information sent by the first equipment and receiving second test feedback information sent by the second equipment;
and obtaining a test result aiming at the system to be tested according to the first test feedback information and the second test feedback information.
7. The method of claim 6, wherein the obtaining the test result for the system under test according to the first test feedback information and the second test feedback information comprises:
acquiring first test item data corresponding to an item to be tested from the first test feedback information, and acquiring second test item data corresponding to the item to be tested from the second test feedback information;
and obtaining a test result aiming at the system to be tested according to the comparison result of the first test item data and the second test item data.
8. A benchmarking test apparatus, said apparatus comprising:
the system comprises a sending module, a testing module and a testing module, wherein the sending module is used for sending a first testing instruction to a first device and sending a second testing instruction to a second device, wherein the first device and the second device are devices which are connected into the same simulation network environment, the first device is a device for installing a system to be tested, the second device is a device for installing a benchmarking system, and the simulation network environment is obtained by performing network simulation through a mobile phone hot spot and/or a third party tool building hot spot installed on a personal computer;
The receiving module is used for receiving the first test feedback information sent by the first equipment and receiving the second test feedback information sent by the second equipment;
and the test module is used for obtaining a test result aiming at the system to be tested according to the first test feedback information and the second test feedback information.
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method of claim 6 or 7 when executing the computer program.
10. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of claim 6 or 7.
CN202111445082.7A 2021-11-30 2021-11-30 Benchmarking test system, benchmarking test method, benchmarking test device, benchmarking test computer equipment and benchmarking test storage medium Active CN114143213B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111445082.7A CN114143213B (en) 2021-11-30 2021-11-30 Benchmarking test system, benchmarking test method, benchmarking test device, benchmarking test computer equipment and benchmarking test storage medium

Publications (2)

Publication Number Publication Date
CN114143213A CN114143213A (en) 2022-03-04
CN114143213B true CN114143213B (en) 2024-03-26

Family

ID=80385992

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111445082.7A Active CN114143213B (en) 2021-11-30 2021-11-30 Benchmarking test system, benchmarking test method, benchmarking test device, benchmarking test computer equipment and benchmarking test storage medium

Country Status (1)

Country Link
CN (1) CN114143213B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109960646A (en) * 2017-12-25 2019-07-02 福建天晴数码有限公司 A kind of test method and terminal of application
CN110138617A (en) * 2019-05-22 2019-08-16 北京字节跳动网络技术有限公司 Data transmission quality test method, system, electronic equipment and storage medium
CN110224897A (en) * 2019-06-26 2019-09-10 深圳市腾讯信息技术有限公司 Vulnerable network test method, device, mobile device and the storage medium of application program
CN113676368A (en) * 2021-07-12 2021-11-19 交控科技股份有限公司 Method and device applied to ATS network performance test
CN113709126A (en) * 2021-08-18 2021-11-26 深圳开源互联网安全技术有限公司 Network protocol security fuzzy test method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN114143213A (en) 2022-03-04

Similar Documents

Publication Publication Date Title
JP5937724B2 (en) Techniques for network replication
US20180048555A1 (en) Device profile-driven automation for cell-based test systems
CN105378680A (en) System and method for coordinating field user testing results for mobile application across various mobile devices
US11151025B1 (en) Generating software test plans based at least in part on monitored traffic of a production application
CN109684150B (en) Performance test system, test method and simulation platform of storage particle controller
CN106339273A (en) Application program restoration method, terminal and server
CN112783793A (en) Automatic interface test system and method
CN109815119A (en) A kind of test method and device of APP link channel
CN114356631A (en) Fault positioning method and device, computer equipment and storage medium
CN114143213B (en) Benchmarking test system, benchmarking test method, benchmarking test device, benchmarking test computer equipment and benchmarking test storage medium
CN115129574A (en) Code testing method and device
CN111147586B (en) Equipment end control method and device and conference system
CN116185774A (en) Log monitoring installation method, device, computer equipment and storage medium
CN115658794A (en) Data query method and device, computer equipment and storage medium
CN104268231A (en) File access method, device and intelligent file system
US8789039B2 (en) Method and system of installing a program on a first computer, and duplicating the installation on a second computer
CN110806981A (en) Application program testing method, device, equipment and storage medium
CN114286378B (en) Ad hoc network wireless communication device network performance test system and method
CN107967366A (en) File management method, USB flash disk and computer-readable recording medium
CN109726053B (en) Switching method and device of database control center and computer equipment
CN111241437B (en) Data processing method, device, server side and storage medium
CN117950997A (en) Analysis method and device of test coverage rate, computer equipment and storage medium
CN116909640A (en) Sensor configuration system, method, apparatus, computer device, and storage medium
CN116737562A (en) APP response delay test method and device and computer equipment
CN117076291A (en) Service testing method, device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant