CN111752828A - Performance test method and device for Web application - Google Patents


Info

Publication number
CN111752828A
Authority
CN
China
Prior art keywords
test
web application
sub
performance
target web
Prior art date
Legal status
Pending
Application number
CN202010499402.6A
Other languages
Chinese (zh)
Inventor
梁俊杰
Current Assignee
Wuhan Wind Gather Intelligence Technology Co ltd
Original Assignee
Wuhan Wind Gather Intelligence Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Wuhan Wind Gather Intelligence Technology Co ltd filed Critical Wuhan Wind Gather Intelligence Technology Co ltd
Priority to CN202010499402.6A
Publication of CN111752828A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/02Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]

Abstract

The application provides a performance test method and device for Web applications, which are used to improve test efficiency to a certain extent when testing the performance of a Web application. The performance test method provided by the application comprises the following steps: after determining a target Web application to be tested, a first node device acquires a test case and a test script matched with the target Web application; the first node device distributes the test case and the test script to a plurality of second node devices; the plurality of second node devices respectively initiate performance tests on the target Web application according to the test case and the test script, and feed back the sub performance test results they obtain to the first node device; the first node device summarizes the plurality of sub performance test results and generates a performance test result of the target Web application on the basis of the summarized result.

Description

Performance test method and device for Web application
Technical Field
The present application relates to the field of testing, and in particular, to a method and an apparatus for testing performance of a Web application.
Background
During development or during update and maintenance, a Web application needs to be tested to evaluate its performance and correct any abnormal parts; only after passing the tests can the Web application be released to the market and to users.
The traditional application testing approach is manual: testers manually initiate a large number of processing requests to the Web application from the terminal side and test the various performance indicators required by the target test, such as response speed, processing time, and maximum processing capacity. Because the manual testing approach relies on manual operation, it offers high testing precision but low testing efficiency; if the Web application needs to pass its tests and reach the market within a short time, the manual approach may delay its release.
Against the background of the low testing efficiency of the manual approach, automated testing has emerged: a test script is configured to automatically simulate the testers' operation behaviors, which can greatly improve the testing efficiency for Web applications.
Disclosure of Invention
The application provides a performance test method and device for Web applications, which are used to improve test efficiency to a certain extent when testing the performance of a Web application.
In a first aspect, the present application provides a method for testing performance of a Web application, where the method includes:
after determining a target Web application to be tested, the first node equipment acquires a test case and a test script matched with the target Web application;
the first node equipment distributes the test cases and the test scripts to a plurality of second node equipment;
the plurality of second node devices respectively initiate performance tests on the target Web application according to the test cases and the test scripts, and feed back the sub performance test results they obtain to the first node device;
the first node device collects the multiple sub performance test results and generates a performance test result of the target Web application on the basis of the collected result.
With reference to the first aspect of the present application, in a first possible implementation manner of the first aspect of the present application, after determining a target Web application to be tested, the obtaining, by a first node device, a test case and a test script that are matched with the target Web application includes:
after determining a target Web application to be tested, the first node equipment acquires a test case corresponding to the performance test task;
the first node device obtains a test script obtained by recording user operation behaviors when a test case is used for carrying out a historical performance test task based on a browser based on a Chrome Dettools Protocol corresponding to the browser.
With reference to the first possible implementation manner of the first aspect of the present application, in a second possible implementation manner of the first aspect of the present application, the test script includes a plurality of sub-test scripts, which are obtained by the first node device splitting, based on the Chrome DevTools Protocol, the historical performance test task into a plurality of sub historical performance test tasks according to different function points, and recording the corresponding user operation behaviors when each of the sub historical performance test tasks is performed.
With reference to the second possible implementation manner of the first aspect of the present application, in a third possible implementation manner of the first aspect of the present application, the distributing, by the first node device, the test case and the test script to the plurality of second node devices includes:
the first node device takes a sub-test script and the sub-test case corresponding to that sub-test script in the test case as a basic unit, and sends the plurality of sub-test scripts and the plurality of sub-test cases to the corresponding second node devices respectively; or,
the first node device sends the plurality of sub-test scripts to the corresponding second node devices respectively, and sends the test case to the plurality of second node devices.
With reference to the third possible implementation manner of the first aspect of the present application, in a fourth possible implementation manner of the first aspect of the present application, the target Web application is a Web application whose Application Programming Interface (API) performance is to be tested, the test case and the test script correspond to an API performance test, and the initiating, by the plurality of second node devices, of performance tests on the target Web application according to the test case and the test script respectively includes:
and the second node devices respectively launch the API performance test to the target Web application according to the test case and the test script so as to obtain the API performance test result of the target Web application.
In a second aspect, the present application further provides a device for testing performance of a Web application, where the device includes:
the first processing unit is used for acquiring a test case and a test script matched with a target Web application after the target Web application to be tested is determined;
the first transceiving unit is used for distributing the test cases and the test scripts to a plurality of second node devices;
the plurality of second processing units are used for respectively initiating performance tests to the target Web application according to the test cases and the test scripts and feeding back the sub-performance test results obtained by the second processing units to the first node equipment through the second transceiving units;
and the first processing unit is also used for summarizing the plurality of sub performance test results and generating the performance test result of the target Web application on the basis of the summarized result.
With reference to the second aspect of the present application, in a first possible implementation manner of the second aspect of the present application, the first processing unit is specifically configured to:
after determining a target Web application to be tested, acquiring a test case corresponding to the performance test task;
based on the Chrome DevTools Protocol corresponding to the browser, obtaining a test script recorded from user operation behaviors when a historical performance test task was performed using the test case.
In combination with the first possible implementation manner of the second aspect of the present application, in a second possible implementation manner of the second aspect of the present application, the test script includes a plurality of sub-test scripts, which are obtained by the first node device splitting, based on the Chrome DevTools Protocol, the historical performance test task into a plurality of sub historical performance test tasks according to different function points, and recording the corresponding user operation behaviors when each of the sub historical performance test tasks is performed.
In combination with the second possible implementation manner of the second aspect of the present application, in a third possible implementation manner of the second aspect of the present application, the first transceiver unit is specifically configured to send the multiple sub-test scripts and the multiple sub-test cases to the corresponding second node devices respectively, with the sub-test scripts and the sub-test cases corresponding to the sub-test scripts as basic units; or the first transceiving unit is specifically configured to send the multiple sub-test scripts to the corresponding second node devices respectively, and send the test case to the multiple second node devices.
With reference to the third possible implementation manner of the second aspect of the present application, in a fourth possible implementation manner of the second aspect of the present application, the target Web application is a Web application whose API performance is to be tested, the test case and the test script correspond to an API performance test, and the plurality of second processing units are specifically configured to initiate the API performance test on the target Web application according to the test case and the test script, respectively, so as to obtain an API performance test result of the target Web application.
In a third aspect, the present application further provides a testing device, where the testing device includes a processor and a memory, where the memory stores a computer program, and the processor executes, when calling the computer program in the memory, the steps executed by the first node device or the second node device in the performance testing method for the Web application according to the first aspect of the present application.
In a fourth aspect, the present application further provides a computer-readable storage medium, where a plurality of instructions are stored, and the instructions are suitable for being loaded by a processor to perform steps performed by the first node device or the second node device in the performance testing method for the Web application according to the first aspect of the present application.
According to the technical scheme, the method has the following advantages:
after determining a target Web application to be tested, a first node device acquires a test case and a test script matched with the target Web application and then distributes them to a plurality of second node devices, thereby forming a distributed architecture. Under the scheduling of the first node device, the plurality of second node devices can each initiate a performance test on the target Web application according to the test case and the test script and carry out high-concurrency testing; each second node device can make full use of server resources to simulate high concurrency through cooperative and asynchronous processing, and the high-concurrency capability is further scaled out horizontally beyond that of a single node. Therefore, with the test script configured, the test efficiency of the Web application test is improved to a certain extent and the requirement for high test efficiency is met; and on the basis of the sub performance test results obtained by the second node devices, the first node device can summarize them and generate the performance test result of the target Web application.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
FIG. 1 is a schematic view of a scenario of a performance testing method for Web applications according to the present application;
FIG. 2 is a schematic flow chart of a performance testing method for Web applications according to the present application;
FIG. 3 is a schematic flow chart of the present application for obtaining test cases and test scripts;
FIG. 4 is a schematic structural diagram of a performance testing apparatus for Web applications according to the present application;
fig. 5 is a schematic structural diagram of the testing apparatus of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and in the claims of the present application and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It will be appreciated that the data so used may be interchanged under appropriate circumstances such that the embodiments described herein may be practiced otherwise than as specifically illustrated or described herein. Moreover, the terms "comprises," "comprising," and any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or modules is not necessarily limited to those steps or modules explicitly listed, but may include other steps or modules not expressly listed or inherent to such process, method, article, or apparatus. The naming or numbering of the steps appearing in the present application does not mean that the steps in the method flow have to be executed in the chronological/logical order indicated by the naming or numbering, and the named or numbered process steps may be executed in a modified order depending on the technical purpose to be achieved, as long as the same or similar technical effects are achieved.
The division of the modules presented in this application is a logical division, and in practical applications, there may be another division, for example, multiple modules may be combined or integrated into another system, or some features may be omitted, or not executed, and in addition, the shown or discussed coupling or direct coupling or communication connection between each other may be through some interfaces, and the indirect coupling or communication connection between the modules may be in an electrical or other similar form, which is not limited in this application. The modules or sub-modules described as separate components may or may not be physically separated, may or may not be physical modules, or may be distributed in a plurality of circuit modules, and some or all of the modules may be selected according to actual needs to achieve the purpose of the present disclosure.
In the application, the performance testing method and device of the Web application or the computer readable storage medium can be applied to related testing equipment and used for improving the testing efficiency of the target Web application.
First, referring to fig. 1, which shows a scene schematic diagram of the performance testing method for a Web application in the present application: a worker in charge of testing a Web application may trigger a performance testing task for the Web application on a device installed with a testing tool, where the device may be a first node device, a second node device, or any other device. The worker may select the target Web application to be tested on the User Interface (UI) of the testing tool; the first node device then distributes the testing task, that is, the test case and the test script, to a plurality of second node devices, so that a performance test with high testing efficiency is carried out on the target Web application.
The first node device, the second node device, and the like may specifically be hardware devices such as a physical host or a physical server that can perform the data computation involved in the present application; depending on the application scenario, they may also have components for human-computer interaction, such as a display screen and a keyboard, for example a Personal Digital Assistant (PDA), a smart phone, a tablet computer, a notebook computer, a smart band, and the like.
Continuing to refer to fig. 2, fig. 2 shows a schematic flow chart of the performance testing method of the Web application in the present application, and the performance testing method of the Web application in the present application may specifically include the following steps S201 to S204:
step S201, after determining a target Web application to be tested, a first node device acquires a test case and a test script matched with the target Web application;
it can be understood that in the present application, test cases and test scripts related to performance testing may be configured in advance for one or more Web applications according to different test contents, such as different applications, different versions, different test time periods, and the like.
A test case refers to a particular set of input data, operations, or various environmental settings and desired results that are provided to a Web application to implement a test.
The test script is a script written for automatic testing and is used for simulating the user operation behavior of a worker in the testing process.
Illustratively, the test case set and the test script set may be configured, and when the current Web application test task is triggered, the corresponding test case and the corresponding test script are searched and matched from the test case set and the test script set, respectively.
For example, the test cases in the test case set and the test scripts in the test script set may each be labeled with the application Identification (ID) and the test content ID of the corresponding Web application; the target application ID and the target test content ID corresponding to the current test task are then extracted from the task information of the current test task, and the matched target test case and target test script are extracted from the test case set and the test script set.
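As a minimal illustrative sketch of this matching step (the field names, set structures, and function name below are assumptions, not drawn from the patent), the lookup by application ID and test content ID could look as follows in Python:

```python
# Hypothetical sketch of step S201's matching: look up the test case and test
# script whose application ID and test content ID equal those in the task info.
def match_assets(task_info, test_case_set, test_script_set):
    app_id = task_info["target_app_id"]              # assumed field names
    content_id = task_info["target_test_content_id"]

    def pick(asset_set):
        return next(
            (a for a in asset_set
             if a["app_id"] == app_id and a["content_id"] == content_id),
            None,
        )

    test_case, test_script = pick(test_case_set), pick(test_script_set)
    if test_case is None or test_script is None:
        raise LookupError("no matching test case/script for this task")
    return test_case, test_script
```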
Step S202, the first node equipment distributes the test cases and the test scripts to a plurality of second node equipment;
after the test case and the test script corresponding to the current test task are obtained, the first node device can perform task distribution and scheduling of the current test task, and distribute the test case and the test script to a plurality of second node devices.
For example, the second node devices may be preconfigured node devices, or may be node devices determined by the current first node device. For instance, the first node device may identify node devices with idle computing resources in the device network and designate them as the second node devices for the test task; further, the number of second node devices and the task processing amount assigned to each may be determined according to the size of their idle computing resources, so that, on top of task distribution, the utilization of idle computing resources is improved and flexible deployment of second node devices and adjustment of task processing amounts are facilitated.
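A hedged sketch of such capacity-based worker selection is given below; the node record shape, the idle-CPU threshold, and the proportional split are illustrative assumptions rather than details taken from the patent:

```python
# Hypothetical sketch: pick second node devices with spare capacity and split
# the total request load in proportion to each device's idle computing resources.
def plan_distribution(nodes, total_requests):
    """nodes: list of dicts such as {"host": "node-a", "idle_cpu": 0.8}."""
    workers = [n for n in nodes if n["idle_cpu"] > 0.2]   # assumed threshold
    if not workers:
        raise RuntimeError("no second node device has spare capacity")
    capacity = sum(n["idle_cpu"] for n in workers)
    plan, assigned = [], 0
    for i, n in enumerate(workers):
        if i == len(workers) - 1:
            share = total_requests - assigned     # last worker takes the remainder
        else:
            share = int(total_requests * n["idle_cpu"] / capacity)
        assigned += share
        plan.append({"host": n["host"], "requests": share})
    return plan

# Example: plan_distribution([{"host": "a", "idle_cpu": 0.8},
#                             {"host": "b", "idle_cpu": 0.4}], 3000)
```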
Step S203, the plurality of second node devices respectively initiate performance tests to the target Web application according to the test cases and the test scripts, and feed back the sub performance test results obtained by the second node devices to the first node devices;
after receiving the test case and the test script, the second node device can establish communication connection with the target Web application specified by the task, and initiate performance test based on the communication connection with the target Web application.
In the performance test process, the second node device may send different types of processing requests, such as data query requests, data push requests, data update requests, and the like, to the Web application according to the task requirements, so that the target Web application responds according to the received processing requests and feeds back processing results; during this process, the second node device monitors the processing by the target Web application to obtain a sub performance test result.
After the second node device obtains, through monitoring, the sub performance test result of the performance test it initiated on the target Web application, it can feed that sub performance test result back to the first node device.
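The worker-side behaviour described in step S203 could be sketched roughly as below, using only the Python standard library; the request paths, concurrency level, and result shape are assumptions for illustration, not the patent's own implementation:

```python
# Hypothetical sketch of a second node device replaying HTTP requests against
# the target Web application concurrently, timing each call and counting errors.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

def run_sub_test(base_url, paths, concurrency=20, rounds=100):
    def one_call(path):
        start = time.monotonic()
        try:
            with urllib.request.urlopen(base_url + path, timeout=10) as resp:
                resp.read()
            return time.monotonic() - start, True
        except Exception:               # any failure counts toward the error total
            return time.monotonic() - start, False

    latencies, errors = [], 0
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        futures = [pool.submit(one_call, paths[i % len(paths)])
                   for i in range(rounds)]
        for fut in futures:
            elapsed, ok = fut.result()
            latencies.append(elapsed)
            errors += 0 if ok else 1

    # This dictionary plays the role of the "sub performance test result".
    return {"samples": latencies, "errors": errors, "total": rounds}
```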
Step S204, the first node device collects a plurality of sub performance test results and generates a performance test result of the target Web application on the basis of the collected result.
In the application, the first node device may generate the performance test result of the target Web application in a corresponding form according to a preset output requirement on the basis of the summary result of the plurality of sub performance test results.
For example, taking a test report as the output requirement: the sub performance test results received by the first node device simply record the raw test results, so the first node device can easily extract and summarize their content; after obtaining the summarized result, it writes the performance test result of the test task into a report template according to the format requirements of the test report, producing a performance test report that presents the results from multiple perspectives using tables and timing diagrams.
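A minimal sketch of this summarizing step (assuming the sub-result shape from the worker sketch above, and a purely illustrative plain-text report template) might be:

```python
# Hypothetical sketch of step S204: merge sub performance test results and
# fill a simple report template; the template wording is an assumption.
import statistics

REPORT_TEMPLATE = (
    "Performance test report for {app}\n"
    "  requests sent : {total}\n"
    "  failures      : {errors}\n"
    "  average time  : {avg:.3f} s\n"
)

def build_report(app_name, sub_results):
    samples = [s for r in sub_results for s in r["samples"]]
    return REPORT_TEMPLATE.format(
        app=app_name,
        total=sum(r["total"] for r in sub_results),
        errors=sum(r["errors"] for r in sub_results),
        avg=statistics.mean(samples) if samples else float("nan"),
    )
```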
As can be seen from the embodiment shown in fig. 2, after determining a target Web application to be tested, the first node device acquires a test case and a test script matched with the target Web application and then distributes them to a plurality of second node devices, thereby forming a distributed architecture. Under the scheduling of the first node device, the plurality of second node devices can each initiate a performance test on the target Web application according to the test case and the test script and carry out high-concurrency testing; each second node device can make full use of server resources to simulate high concurrency through cooperative and asynchronous processing, and the high-concurrency capability is further scaled out horizontally beyond that of a single node. Therefore, with the test script configured, the test efficiency of the Web application test is improved to a certain extent and the requirement for high test efficiency is met; on the basis of the sub performance test results obtained by the second node devices, the first node device can summarize them and generate the performance test result of the target Web application.
The following provides a detailed description of possible implementation manners of each step in the performance testing method for Web applications.
In an exemplary implementation manner, as shown in a flowchart of fig. 3 for obtaining a test case and a test script, the first node device obtaining a test case and a test script that match a target Web application may be implemented by the following steps:
step S301, after determining a target Web application to be tested, a first node device acquires a test case corresponding to the performance test task;
illustratively, the test case may be carried in the task information of the performance test task and written by a worker in real time, or the corresponding test case may be searched and matched from a pre-configured test case set.
For example, the test case in the test case set may be given an application Identification (ID) of the corresponding Web application, and then the target application ID and the target test content ID corresponding to the current test task are extracted from the task information of the current test task, and then the matched target test case is extracted from the test case set.
Step S302, the first node device obtains, based on the Chrome DevTools Protocol corresponding to the browser, a test script recorded from user operation behaviors when a historical performance test task was performed using the test case.
In the present application, a performance test can be initiated to the Web application through a browser; correspondingly, in a specific implementation, the test script can be recorded based on the Chrome DevTools Protocol in the browser.
For example, the Chrome DevTools Protocol allows a worker to debug, profile, and analyze a Web application running in the Chrome browser; data communication for the protocol is based on WebSocket, which is used to establish a fast data channel between the DevTools and the browser kernel.
WebSocket, introduced with HTML5, is a protocol for full-duplex communication over a single TCP connection. WebSocket makes data exchange between the client and the server simpler and allows the server to actively push data to the client. With the WebSocket API, the browser and the server only need to complete one handshake to establish a persistent connection between them, forming a fast channel over which data can be transmitted directly in both directions.
In this environment, the test script recorded from the user operation behaviors during a historical performance test task based on the test case can be obtained. That is, before the performance test task is triggered, the user operation behaviors exhibited while performing the historical performance test task with the test case can be recorded; the resulting test script can then be used for subsequent automated testing and is available as soon as the performance test task is triggered, reproducing the worker's operation behaviors from the historical performance test task so that the test is performed again.
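For orientation only, the sketch below shows one way a recorder might talk to a browser over the Chrome DevTools Protocol: it assumes Chrome was started with --remote-debugging-port=9222, that the page's WebSocket debugger URL was read from http://localhost:9222/json beforehand, and that the third-party `websockets` package is available; it simply logs incoming protocol events rather than implementing the patent's recording logic:

```python
# Hypothetical CDP sketch: enable the Page and Network domains and collect the
# events the browser pushes over the WebSocket connection for a short period.
import asyncio
import json
import websockets

async def record_events(ws_url, duration_s=30):
    events = []
    async with websockets.connect(ws_url) as ws:
        await ws.send(json.dumps({"id": 1, "method": "Page.enable"}))
        await ws.send(json.dumps({"id": 2, "method": "Network.enable"}))
        loop = asyncio.get_running_loop()
        deadline = loop.time() + duration_s
        while loop.time() < deadline:
            try:
                msg = json.loads(await asyncio.wait_for(ws.recv(), timeout=1))
            except asyncio.TimeoutError:
                continue
            if "method" in msg:   # events carry "method"; command replies carry "id"
                events.append(msg)
    return events

# asyncio.run(record_events("ws://localhost:9222/devtools/page/<page-id>"))
```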
In another exemplary implementation manner, in the process of distributing the test task to the second node devices, in order to facilitate distribution of the test task and further improve test efficiency and test precision, the first node device may also split the test task according to a preset splitting policy.
Illustratively, the test script comprises a plurality of sub-test scripts, which are obtained by the first node device splitting, based on the Chrome DevTools Protocol, the historical performance test task into a plurality of sub historical performance test tasks according to different function points, and recording the corresponding user operation behaviors when each sub historical performance test task is performed.
A function point may also be understood as one type of performance test task. For example, a UI interface may present a number of personal information items of a user, who can view and update one or more of them on that interface; the user operation behavior of updating personal information items through the UI interface can be regarded as one function point, which is also the performance test task corresponding to the "update personal information" type.
Correspondingly, when the test script is recorded, corresponding user operation behaviors can be respectively recorded according to different function points to obtain sub test scripts; or when the whole test script is obtained by recording, the corresponding sub-test scripts can be extracted from the whole test script according to different function points, and the sub-test scripts can be specifically adjusted according to actual needs.
In this way, after the test script is split into different sub-test scripts by function point, each second node device can carry out a targeted, focused performance test on a specific function point, which facilitates test task distribution and improves test efficiency; and because the performance test is refined in this way, a corresponding test result is obtained for each function point, which can further improve test precision.
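If the recorded script is represented as a list of events that each carry a function-point tag (an assumption made only for this sketch), the split described above reduces to a simple grouping:

```python
# Hypothetical sketch: group recorded events into sub-test scripts by the
# function point they belong to.
from collections import defaultdict

def split_by_function_point(recorded_events):
    """recorded_events: list of dicts, each assumed to carry a 'function_point' key."""
    sub_scripts = defaultdict(list)
    for event in recorded_events:
        sub_scripts[event.get("function_point", "default")].append(event)
    return dict(sub_scripts)
```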
In another exemplary implementation manner, the first node device distributes the test case and the test script to a plurality of second node devices, which may be specifically divided into two cases.
First case
The first node device takes a sub-test script and the sub-test case corresponding to it in the test case as a basic unit, and sends the plurality of sub-test scripts and the plurality of sub-test cases to the corresponding second node devices respectively.
It can be understood that when the first node device splits the test task by function point, the test script is split into a plurality of sub-test scripts and the test case is split into a plurality of sub-test cases; each sub-test task corresponds to one group consisting of a sub-test script and its corresponding sub-test case, and the sub-test scripts and sub-test cases are sent to the corresponding second node devices with each sub-test task as a unit.
Some second node devices may also take on several sub-test tasks, that is, they may receive several groups of sub-test scripts and sub-test cases.
Second case
The first node device sends the plurality of sub-test scripts to the corresponding second node devices respectively, and sends the test case to the plurality of second node devices.
It can be understood that the first node device may also split only the test script into a plurality of sub-test scripts by function point, while the complete test case is synchronized among the second node devices, which share one and the same test case; when executing a received sub-test script, a second node device only needs to look up and extract the relevant part of the complete test case.
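The two distribution modes can be contrasted with the following sketch, where `send` stands for a hypothetical transport call (for example, an HTTP POST to a worker) that the patent does not specify:

```python
# Hypothetical sketch of the two distribution cases described above.
def distribute_paired(send, workers, sub_scripts, sub_cases):
    """First case: each worker receives a sub-test script plus its matching sub-test case."""
    for worker, (fp, script) in zip(workers, sub_scripts.items()):
        send(worker, {"sub_script": script, "sub_case": sub_cases[fp]})

def distribute_shared_case(send, workers, sub_scripts, full_case):
    """Second case: sub-test scripts are spread out while the complete test case is shared."""
    for worker, (fp, script) in zip(workers, sub_scripts.items()):
        send(worker, {"sub_script": script, "test_case": full_case})
```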
In another exemplary implementation manner, the performance test referred to herein may specifically be an API performance test; correspondingly, the target Web application is a Web application whose API performance is to be tested, the test case and the test script correspond to the API performance test, and the plurality of second node devices may respectively initiate the API performance test on the target Web application according to the test case and the test script, so as to obtain an API performance test result of the target Web application.
The API performance test result may specifically include test results such as the number of successful API calls, the number of failed calls, the maximum time, the minimum time, the average time, the T50 time, the T90 time, a real-time analysis preview, and the like.
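Assuming each API call is recorded as a (latency, succeeded) pair, which is an illustrative input shape rather than anything defined in the patent, the metrics listed above can be derived as follows (the percentile call needs at least two samples):

```python
# Hypothetical sketch: compute success/failure counts, max/min/average time,
# and the T50/T90 times from per-call measurements.
import statistics

def api_metrics(calls):
    """calls: list of (latency_seconds, succeeded: bool) tuples, len >= 2."""
    latencies = sorted(lat for lat, _ in calls)
    successes = sum(1 for _, ok in calls if ok)
    return {
        "success": successes,
        "failure": len(calls) - successes,
        "max_s": latencies[-1],
        "min_s": latencies[0],
        "avg_s": statistics.mean(latencies),
        "t50_s": statistics.median(latencies),
        "t90_s": statistics.quantiles(latencies, n=10)[8],  # 90th percentile
    }
```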
In order to better implement the performance test method of the Web application provided by the application, the application also provides a performance test device of the Web application.
Referring to fig. 4, fig. 4 is a schematic structural diagram of a performance testing apparatus for Web applications according to the present application, in which the performance testing apparatus 400 for Web applications may specifically include the following structure:
the first processing unit 401 is configured to, after determining a target Web application to be tested, obtain a test case and a test script that are matched with the target Web application;
a first transceiving unit 402, configured to distribute the test cases and the test scripts to a plurality of second node devices;
the plurality of second processing units 403 are configured to initiate a performance test to the target Web application according to the test case and the test script, and feed back the sub-performance test results obtained by each of the second processing units to the first node device through the second transceiving unit 404;
the first processing unit 401 is further configured to aggregate the multiple sub performance test results, and generate a performance test result of the target Web application based on the aggregated result.
In an exemplary implementation manner, the first processing unit 401 is specifically configured to:
after determining a target Web application to be tested, acquiring a test case corresponding to the performance test task;
based on the Chrome DevTools Protocol corresponding to the browser, obtaining a test script recorded from user operation behaviors when a historical performance test task was performed using the test case.
In another exemplary implementation manner, the test script includes a plurality of sub-test scripts, which are obtained by the first node device splitting, based on the Chrome DevTools Protocol, the historical performance test task into a plurality of sub historical performance test tasks according to different function points, and recording the corresponding user operation behaviors when each of the sub historical performance test tasks is performed.
In another exemplary implementation manner, the first transceiving unit 402 is specifically configured to send the multiple sub-test scripts and the multiple sub-test cases to the corresponding second node devices respectively, taking a sub-test script and the sub-test case corresponding to it in the test case as a basic unit; or, the first transceiving unit 402 is specifically configured to send the multiple sub-test scripts to the corresponding second node devices respectively, and send the test case to the multiple second node devices.
In another exemplary implementation manner, the target Web application is a Web application whose API performance is to be tested, the test case and the test script correspond to an API performance test, and the plurality of second processing units are specifically configured to initiate the API performance test on the target Web application according to the test case and the test script, respectively, so as to obtain an API performance test result of the target Web application.
Referring to fig. 5, fig. 5 shows a schematic structural diagram of the testing apparatus of the present application, specifically, the testing apparatus of the present application includes a processor 501, a memory 502, and an input/output device 503, where when the processor 501 is used to execute a computer program stored in the memory 502, each step executed by a first node device or a second node device in the performance testing method for Web application in the embodiment corresponding to fig. 2 or fig. 3 is implemented; or, when the processor 501 is configured to execute the computer program stored in the memory 502, the functions of the units in the embodiment corresponding to fig. 4 are implemented, for example, a hardware structure corresponding to the first processing unit 401 and the second processing unit 403 in fig. 4 is the processor 501, a hardware structure corresponding to the first transceiver unit 402 or the second transceiver unit 404 is the input/output device 503, and the memory 502 is configured to store the computer program required by the processor 501 to execute the method executed by the first node device or the second node device in the performance testing method of the Web application in the embodiment corresponding to fig. 2 or fig. 3.
Illustratively, a computer program may be partitioned into one or more modules/units, which are stored in memory 502 and executed by processor 501 to accomplish the present application. One or more modules/units may be a series of computer program instruction segments capable of performing certain functions, the instruction segments being used to describe the execution of a computer program in a computer device.
The test equipment may include, but is not limited to, a processor 501, a memory 502, and input-output devices 503. It will be appreciated by those skilled in the art that the illustration is merely an example of a test device and does not constitute a limitation of a test device and may include more or less components than those illustrated, or some components may be combined, or different components, for example, the test device may also include a network access device, a bus, etc. through which the processor 501, the memory 502, the input output device 503, and the network access device, etc. are connected.
The processor 501 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. The general-purpose processor may be a microprocessor or any conventional processor; the processor is the control center of the testing device and connects the various parts of the overall testing device using various interfaces and lines.
The memory 502 may be used to store computer programs and/or modules, and the processor 501 may implement the various functions of the computer device by running or executing the computer programs and/or modules stored in the memory 502 and invoking data stored in the memory 502. The memory 502 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function, and the like; the data storage area may store data created according to the use of the testing device (e.g., test cases, test scripts, sub performance test results, performance test results), and the like. In addition, the memory may include high-speed random access memory, and may also include non-volatile memory, such as a hard disk, a memory, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a Flash Card, at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage device.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the above-described specific working processes of the performance testing apparatus and the testing device for Web applications and the corresponding units thereof may refer to the description of the performance testing method for Web applications in the embodiment corresponding to fig. 2 or fig. 3, and are not described herein again in detail.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions or by associated hardware controlled by the instructions, which may be stored in a computer readable storage medium and loaded and executed by a processor.
For this reason, an embodiment of the present application provides a computer-readable storage medium, where multiple instructions are stored, where the instructions can be loaded by a processor to execute steps executed by a first node device or a second node device in a performance testing method for a Web application in an embodiment corresponding to fig. 2 or fig. 3 in the present application, and specific operations may refer to descriptions of the performance testing method for the Web application in the embodiment corresponding to fig. 2 or fig. 3, which are not described herein again.
Wherein the computer-readable storage medium may include: read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
Since the instructions stored in the computer-readable storage medium may execute the steps executed by the first node device or the second node device in the performance testing method for Web applications in the embodiment corresponding to fig. 2 or fig. 3, the beneficial effects that can be realized by the performance testing method for Web applications in the embodiment corresponding to fig. 2 or fig. 3 can be achieved, for details, see the foregoing description, and are not repeated herein.
The method, the device, the testing equipment and the computer-readable storage medium for testing the performance of the Web application provided by the application are introduced in detail, a specific example is applied in the description to explain the principle and the implementation of the application, and the description of the above embodiment is only used for helping to understand the method and the core idea of the application; meanwhile, for those skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (10)

1. A performance test method for Web application is characterized in that the method comprises the following steps:
after determining a target Web application to be tested, a first node device acquires a test case and a test script matched with the target Web application;
the first node equipment distributes the test cases and the test scripts to a plurality of second node equipment;
the second node devices respectively initiate performance tests to the target Web application according to the test cases and the test scripts, and feed back the sub performance test results obtained by the second node devices to the first node devices;
the first node device collects the sub performance test results and generates a performance test result of the target Web application on the basis of the collected result.
2. The method of claim 1, wherein the obtaining, by the first node device, the test case and the test script matched with the target Web application after determining the target Web application to be tested comprises:
the first node equipment acquires a test case corresponding to the performance test task after determining a target Web application to be tested;
the first node device obtains, based on the Chrome DevTools Protocol corresponding to a browser, the test script recorded from user operation behaviors when a historical performance test task was carried out using the test case.
3. The method according to claim 2, wherein the test script comprises a plurality of sub-test scripts, and the plurality of sub-test scripts are obtained by the first node device splitting, based on the Chrome DevTools Protocol, the historical performance test task into a plurality of sub historical performance test tasks according to different function points, and recording the corresponding user operation behaviors when each of the sub historical performance test tasks is performed.
4. The method of claim 3, wherein the first node device distributing the test cases and the test scripts to a plurality of second node devices comprises:
the first node device takes a sub-test script and the sub-test case corresponding to that sub-test script in the test case as a basic unit, and sends the plurality of sub-test scripts and the plurality of sub-test cases to the corresponding second node devices respectively; or,
the first node device sends the plurality of sub-test scripts to the corresponding second node devices respectively, and sends the test case to the plurality of second node devices.
5. The method of claim 4, wherein the target Web application is a Web application whose application programming interface (API) performance is to be tested, the test case and the test script correspond to an API performance test, and the initiating, by the plurality of second node devices, of performance tests on the target Web application according to the test case and the test script respectively comprises:
and the second node devices respectively initiate the API performance test to the target Web application according to the test case and the test script to obtain an API performance test result of the target Web application.
6. A performance testing apparatus for a Web application, the apparatus comprising:
the first processing unit is used for acquiring a test case and a test script matched with a target Web application after the target Web application to be tested is determined;
the first transceiving unit is used for distributing the test cases and the test scripts to a plurality of second node devices;
the plurality of second processing units are used for respectively initiating performance tests to the target Web application according to the test cases and the test scripts and feeding back the sub-performance test results obtained by the second processing units to the first node equipment through the second transceiving units;
the first processing unit is further configured to summarize the plurality of sub performance test results, and generate a performance test result of the target Web application on the basis of the summarized result.
7. The apparatus according to claim 6, wherein the first processing unit is specifically configured to:
after determining a target Web application to be tested, acquiring a test case corresponding to the performance test task;
and obtaining, based on the Chrome DevTools Protocol corresponding to a browser, the test script recorded from user operation behaviors when a historical performance test task was carried out using the test case.
8. The apparatus according to claim 7, wherein the test script includes a plurality of sub-test scripts, and the plurality of sub-test scripts are obtained by the first node device splitting, based on the Chrome DevTools Protocol, the historical performance test task into a plurality of sub historical performance test tasks according to different function points, and recording the corresponding user operation behaviors when each of the sub historical performance test tasks is performed.
9. The apparatus according to claim 8, wherein the first transceiver unit is specifically configured to send the plurality of sub-test scripts and the plurality of sub-test cases to the corresponding second node devices respectively, taking a sub-test script and the sub-test case corresponding to it in the test case as a basic unit; or, the first transceiver unit is specifically configured to send the multiple sub-test scripts to the corresponding second node devices respectively, and send the test case to the multiple second node devices.
10. The apparatus according to claim 9, wherein the target Web application is a Web application whose application programming interface (API) performance is to be tested, the test case and the test script correspond to an API performance test, and the plurality of second processing units are specifically configured to initiate the API performance test on the target Web application according to the test case and the test script, respectively, so as to obtain an API performance test result of the target Web application.
CN202010499402.6A 2020-06-04 2020-06-04 Performance test method and device for Web application Pending CN111752828A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010499402.6A CN111752828A (en) 2020-06-04 2020-06-04 Performance test method and device for Web application

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010499402.6A CN111752828A (en) 2020-06-04 2020-06-04 Performance test method and device for Web application

Publications (1)

Publication Number Publication Date
CN111752828A true CN111752828A (en) 2020-10-09

Family

ID=72674582

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010499402.6A Pending CN111752828A (en) 2020-06-04 2020-06-04 Performance test method and device for Web application

Country Status (1)

Country Link
CN (1) CN111752828A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104461855A (en) * 2013-09-22 2015-03-25 腾讯科技(北京)有限公司 Automatic Web testing method, system and device
CN106411633A (en) * 2016-08-23 2017-02-15 国家电网公司 Web application compatibility testing method based on openstack and system thereof
CN109254910A (en) * 2018-08-08 2019-01-22 北京城市网邻信息技术有限公司 A kind of test method of application program, device, electronic equipment and storage medium
CN109359031A (en) * 2018-09-04 2019-02-19 中国平安人寿保险股份有限公司 More appliance applications test methods, device, server and storage medium
CN109871326A (en) * 2019-02-13 2019-06-11 广州云测信息技术有限公司 A kind of method and apparatus that script is recorded
CN110262964A (en) * 2019-05-21 2019-09-20 深圳壹账通智能科技有限公司 Test method, device, equipment and computer readable storage medium
CN111177003A (en) * 2019-12-30 2020-05-19 北京同邦卓益科技有限公司 Test method, device, system, electronic equipment and storage medium

Legal Events

Code - Title / Description
PB01 - Publication
SE01 - Entry into force of request for substantive examination
RJ01 - Rejection of invention patent application after publication (application publication date: 20201009)