CN107341098B - Software performance testing method, platform, equipment and storage medium


Info

Publication number
CN107341098B
CN107341098B (application CN201710569691.0A)
Authority
CN
China
Prior art keywords
test
script
software
testing
data
Prior art date
Legal status
Active
Application number
CN201710569691.0A
Other languages
Chinese (zh)
Other versions
CN107341098A (en)
Inventor
王俊杰
徐谊
Current Assignee
Ctrip Travel Information Technology Shanghai Co Ltd
Original Assignee
Ctrip Travel Information Technology Shanghai Co Ltd
Priority date
Filing date
Publication date
Application filed by Ctrip Travel Information Technology Shanghai Co Ltd filed Critical Ctrip Travel Information Technology Shanghai Co Ltd
Priority to CN201710569691.0A priority Critical patent/CN107341098B/en
Publication of CN107341098A publication Critical patent/CN107341098A/en
Application granted granted Critical
Publication of CN107341098B publication Critical patent/CN107341098B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/30 Monitoring
    • G06F 11/34 Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F 11/3409 Recording or statistical evaluation of computer or user activity for performance assessment
    • G06F 11/3414 Workload generation, e.g. scripts, playback
    • G06F 11/3419 Performance assessment by assessing time
    • G06F 11/3466 Performance evaluation by tracing or monitoring
    • G06F 11/3476 Data logging

Abstract

The invention provides a software performance testing method, platform, equipment and storage medium. The method comprises: creating a test script template comprising a plurality of test nodes; acquiring a test script name and test script data input by a user on a web interface, the test script data comprising a test parameter corresponding to at least one test node; filling the test parameters into the corresponding nodes of the test script template to form a test script to be executed; and executing the software testing task according to the formed test script and acquiring test result data. Through unified management of the test scripts, even a damaged or lost script can be quickly and conveniently regenerated, which reduces the user's workload of writing test scripts and improves test efficiency. Real-time acquisition and unified management of the test results make the results easy to trace and make the analysis of abnormal tests across multiple runs more convenient, thereby improving software performance.

Description

Software performance testing method, platform, equipment and storage medium
Technical Field
The invention relates to the technical field of internet testing, and in particular to a software performance testing method, platform, equipment and storage medium that realize unified management of test scripts and test results in the software testing process.
Background
Performance testing plays an important role in software quality assurance, and the test content it covers is rich and diverse. It can be summarized in three aspects: the performance of the application on the client, on the network, and at the server. In general, when these three aspects are combined effectively and reasonably, a comprehensive analysis of platform performance and a prediction of bottlenecks can be achieved.
However, prior-art performance testing platforms still have defects. For example, test scripts are often managed locally by individual users; they are hard to maintain, difficult to share, and easy to lose, and the scripts must be rewritten for each test. This greatly increases the user's workload in executing tests and reduces test efficiency, which is very disadvantageous given today's ever-increasing testing requirements.
In addition, because existing test platforms do not manage test results in a unified way, the results are likewise hard to maintain, difficult to share, and easy to lose, and are difficult to trace once problems occur.
Therefore, the existing software performance test platform cannot meet the increasing test requirements of users, and a new performance test platform for uniformly managing test scripts and test results is urgently needed to be developed.
Disclosure of Invention
Aiming at the problems in the prior art, the invention aims to provide a software performance testing method, a platform, equipment and a storage medium, which realize the unified management of a test script and a test result in the software testing process, reduce the workload of compiling the test script by a user and facilitate the tracing of the test result.
The embodiment of the invention provides a software performance testing method, which comprises the following steps:
creating a test script template, wherein the test script template comprises a plurality of test nodes;
acquiring a test script name and test script data input by a user on a web interface, wherein the test script data comprises a test parameter corresponding to at least one test node;
filling the test parameters into corresponding nodes in the test script template to form a test script to be executed;
and executing the software testing task according to the formed testing script to be executed and acquiring testing result data.
Optionally, the forming a test script to be executed includes the following steps:
filling the test parameters into corresponding test nodes in the test script template;
combining the filled test nodes to form a test script to be executed;
and naming the test script to be executed by using the test script name input by the user.
Optionally, the executing the software testing task includes sequentially executing the testing programs of the testing nodes in the testing script to be executed according to a preset sequence.
Optionally, the software to be tested is stored in at least one test server, and the executing of the software testing task and the obtaining of the testing result data includes the following steps:
sending a test request to a test server according to the test script to be executed;
the test server executes the tested software according to the test request;
and acquiring the returned response data from the test server.
Optionally, the obtaining of the test result data further includes the following steps:
calculating the response time from the sending of each test request to the receipt of the corresponding response data;
and counting test result data, wherein the test result data comprises maximum response time, minimum response time, average response time and platform throughput in a preset time period.
Optionally, the statistical test result data includes the following steps:
counting the maximum response time, the minimum response time, the average response time and the platform throughput in the current time period at intervals of a preset time period to obtain a segmented statistical result;
counting the maximum response time, the minimum response time, the average response time and the platform throughput in a time interval from the current time to the test starting time at intervals of a preset time period to obtain a whole-period counting result;
and generating a test report according to the segmented statistical result and the full-segment statistical result.
Optionally, the method further comprises the following steps:
acquiring the CPU utilization rate and the memory utilization rate of a test server in real time;
and judging whether the test server can be used for executing the performance test or not according to the comparison between the CPU utilization rate of the test server and a preset CPU utilization rate threshold value and the comparison between the memory utilization rate of the test server and a preset memory utilization rate threshold value.
Optionally, the test nodes in the test script template include an interface test node, a page test node, and an assertion test node;
the test parameters corresponding to the interface test nodes comprise application server interface addresses;
the test parameters corresponding to the page test nodes comprise page domain name addresses;
the test parameters corresponding to the assertion test nodes comprise preset check values.
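As an illustration, the three node types and their required parameters could be modeled with a structure like the following minimal Python sketch; the node and field names are hypothetical, not taken from the patent's implementation.

```python
# Hypothetical model of the three test node types and their parameters.
# Names are illustrative; the patent does not specify a concrete schema.
TEMPLATE_NODES = {
    "interface": {"params": ["interface_address", "contract"]},  # application server interface address + contract
    "page":      {"params": ["page_domain"]},                    # page domain name address
    "assertion": {"params": ["expected_value"]},                 # preset check value
}

def validate_node(node_type: str, params: dict) -> bool:
    """Return True only if the user supplied every parameter the node type requires."""
    spec = TEMPLATE_NODES.get(node_type)
    return spec is not None and all(name in params for name in spec["params"])
```

A web form built on such a structure can render one input position per parameter, matching the fill-in form described later in the detailed description.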
Optionally, when the test node in the test script to be executed includes an interface test node, the test parameter corresponding to the interface test node includes an application server interface address and a contract;
the steps of executing the software testing task and acquiring the testing result data comprise:
searching a corresponding test server according to the application server interface address, and establishing connection with the test server;
sending a test request to the test server, wherein the test request comprises an application server interface address and a contract;
the test server executes the tested software according to the test request;
and acquiring the returned response data from the test server.
Optionally, when the test node in the test script to be executed includes an interface test node and an assertion test node, the test parameter corresponding to the assertion test node includes a preset check value;
the step of executing the software testing task and acquiring the testing result data further comprises the following steps:
extracting response data to be verified corresponding to a preset verification value from the returned response data;
and judging whether the response data to be verified is consistent with the preset verification value or not and recording a judgment result.
Optionally, when the test node in the test script to be executed includes a page test node, the test parameter corresponding to the page test node includes a page domain name address;
the steps of executing the software testing task and acquiring the testing result data comprise:
acquiring a test server address pointed by the page domain name according to the page domain name address, and establishing connection with the test server;
sending a page acquisition request to the test server;
the test server executes the tested software according to the page acquisition request;
receiving page data from the test server.
Optionally, the obtaining the test result further includes the following steps:
and calculating the response time from sending the page acquisition request to successfully receiving the page data, wherein successful receipt means that the proportion of received page data to total page data is greater than a preset threshold value.
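The success criterion and response-time calculation above can be sketched as follows; this is a minimal illustration, and the 0.95 threshold is an assumed example value, not one specified by the patent.

```python
def page_received_successfully(received_bytes: int, total_bytes: int,
                               threshold: float = 0.95) -> bool:
    """Successful receipt: the proportion of received page data to total
    page data exceeds a preset threshold (0.95 is an assumed example)."""
    return total_bytes > 0 and received_bytes / total_bytes > threshold

def page_response_time(send_ts: float, recv_ts: float) -> float:
    """Response time from sending the page acquisition request to successful receipt."""
    return recv_ts - send_ts
```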
Optionally, when the test nodes in the test script to be executed include a page test node and an assertion test node, the test parameters corresponding to the assertion test node include a preset check value;
the step of executing the software testing task and acquiring the testing result data further comprises the following steps:
extracting a field to be checked corresponding to a preset check value from the received page data;
and judging whether the field to be checked is consistent with a preset check value or not and recording a judgment result.
Optionally, the method further comprises the following steps:
and searching test script data corresponding to the name of the test script when the test script cannot be executed, and regenerating the corresponding test script according to the searched test script data.
Optionally, after the test script data is filled into the invoked test script template, a test script in xml format is formed;
when the software testing task is executed, the xml-format test script is first converted into a jmx-format test script, and the software testing task is then executed according to the jmx-format test script.
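The jmx format is Apache JMeter's test-plan format, and JMeter's documented non-GUI mode matches the non-graphical execution mentioned below. The following is a sketch of how the execution step might invoke it; the function names and file paths are illustrative, while the -n, -t and -l flags are JMeter's actual documented command-line options.

```python
import subprocess

def build_jmeter_command(jmx_path: str, log_path: str) -> list:
    """Assemble JMeter's documented non-GUI invocation:
    -n non-GUI mode, -t test plan file, -l results log file."""
    return ["jmeter", "-n", "-t", jmx_path, "-l", log_path]

def run_software_test(jmx_path: str, log_path: str) -> None:
    """Execute the converted jmx-format test script without a GUI."""
    subprocess.run(build_jmeter_command(jmx_path, log_path), check=True)
```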
Optionally, the software testing task is performed in a non-graphical user interface mode.
Optionally, the method further comprises the following steps:
acquiring a test script to be executed uploaded by a user;
and executing the software testing task according to the to-be-executed testing script uploaded by the user and acquiring testing result data.
The embodiment of the invention also provides a software performance testing platform, which is used for realizing the software performance testing method and comprises:
the database is used for creating and storing a test script template, and the test script template comprises a plurality of test nodes;
the web server is used for acquiring a test script name and test script data input by a user on a web interface, wherein the test script data comprises a test parameter corresponding to at least one test node, and the test parameter is filled into the corresponding node in the test script template to form a test script to be executed;
the executive machine is used for sending a test request to the test server according to the formed test script to be executed and acquiring test result data;
and the test server is used for storing the tested software and executing a software test task according to the test request.
An embodiment of the present invention further provides a software performance testing apparatus, including:
a processor;
a memory having stored therein executable instructions of the processor;
wherein the processor is configured to perform the steps of the software performance testing method via execution of the executable instructions.
The embodiment of the invention also provides a computer readable storage medium for storing a program, and the program realizes the steps of the software performance testing method when being executed.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
The software performance testing method, platform, equipment and storage medium of the invention have the following advantages:
According to the invention, through unified management of test scripts in the software testing process, the scripts become easy to manage, maintain and share. A user only needs to input a small amount of test script data, and the corresponding test script is generated automatically from the test script template; even if a test script is damaged or lost, it can be quickly and conveniently regenerated, reducing the user's workload of writing test scripts and improving test efficiency. Through real-time acquisition and unified management of test results in the software testing process, the results become easy to manage, maintain, share and trace, the analysis of abnormal tests across multiple runs becomes more convenient, and software performance is improved.
Drawings
Other features, objects and advantages of the present invention will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, with reference to the accompanying drawings.
FIG. 1 is a flow chart of a software performance testing method according to an embodiment of the invention;
FIG. 2 is a flow diagram of forming a test script to be executed in accordance with one embodiment of the present invention;
FIG. 3 is a flowchart of executing a software test task and obtaining test result data, in accordance with an embodiment of the present invention;
FIG. 4 is a flow chart of interface testing according to one embodiment of the present invention;
FIG. 5 is a flow diagram of page testing in accordance with one embodiment of the present invention;
FIG. 6 is a block diagram of a software performance testing platform according to an embodiment of the present invention;
FIG. 7 is a flowchart illustrating the operation of the software performance testing platform according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of a web operations page of a software performance testing platform according to an embodiment of the invention;
FIG. 9 is a page schematic diagram of a report query of the software performance testing platform of an embodiment of the present invention;
FIG. 10 is a page schematic of a view log of a software performance testing platform of an embodiment of the present invention;
FIG. 11 is a schematic diagram of a page of a newly added script of the software performance testing platform according to an embodiment of the present invention;
FIG. 12 is a schematic structural diagram of a software performance testing device according to an embodiment of the present invention;
FIG. 13 is a schematic diagram of a computer-readable storage medium according to an embodiment of the present invention.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
As shown in fig. 1, in order to achieve the above object, an embodiment of the present invention provides a method for testing software performance. The method comprises the following steps:
S100: creating a test script template, wherein the test script template comprises a plurality of test nodes, each test node corresponding to a preset instruction segment in the test script; the instruction comprises some test parameters, part of which may be left empty or filled with preset values;
S200: acquiring a test script name and test script data input by a user on a web interface, wherein the test script data comprises a test parameter corresponding to at least one test node;
S300: filling the test parameters into the corresponding nodes of the test script template to form a test script to be executed; specifically, each test parameter is filled into its corresponding position in the corresponding node, forming a test script that meets the user's requirements;
S400: executing the software testing task according to the formed test script to be executed and acquiring test result data.
The software performance testing method can be applied to a dedicated software performance testing platform. The invention thus changes the traditional mode of software performance testing: performance scripts are entered uniformly, and performance test scripts are managed and maintained centrally. When a test script is formed, the user only needs to enter the script name and some test parameters on a page; the software performance testing platform parses the data, identifies the nodes corresponding to the test parameters, automatically generates the matching test script, and executes the test accordingly. This avoids the user rewriting the script for every test and greatly reduces the user's workload.
For input, the user can use a unified web interface on which a form is presented, with each test node corresponding to a position for filling in test parameters. The user selects the required test nodes and fills in the test parameters at the corresponding positions; unneeded test nodes can be left blank, or the option of not adding the node can be selected.
As shown in fig. 2, optionally, the forming a test script to be executed includes the following steps:
S301: filling the test parameters into the corresponding test nodes in the test script template, either filling an originally blank position or replacing an original preset value;
S302: combining the filled test nodes to form the test script to be executed;
S303: naming the test script to be executed with the test script name input by the user.
The sequence between the steps is merely an example, and in practical applications, the sequence can be adjusted as needed. For example, first creating a new test script, inputting a name of the test script for naming, and then inputting test parameters, or first selecting a test node to be used, and then filling the test parameters in each test node, etc., all fall within the scope of the present invention.
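The fill, combine and name steps can be sketched as follows, assuming a simple placeholder-based template; the patent does not specify the template format, so the rendering below is illustrative.

```python
def fill_node(node_template: str, params: dict) -> str:
    """S301: fill user-supplied test parameters into a node's placeholders."""
    return node_template.format(**params)

def form_test_script(script_name: str, node_templates: list, node_params: list) -> dict:
    """S302/S303: combine the filled nodes and name the resulting script."""
    filled = [fill_node(tpl, p) for tpl, p in zip(node_templates, node_params)]
    return {"name": script_name, "body": "\n".join(filled)}
```

Because the script is fully determined by its name and the stored parameters, the same call regenerates the script if the original file is damaged or lost, which is the recovery property described below.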
Furthermore, the test script name and the test script data of the embodiment are stored in a database, so that the test script, its name and its data can be conveniently retrieved when needed. In this embodiment, the test script name, the test script data and the test script are in one-to-one correspondence, and test script names are not repeatable. Therefore, when a test script cannot be executed because of damage, loss and the like, the test script data corresponding to the script name can be looked up, and the corresponding test script regenerated from it. After a test script is generated, the script and its data are automatically written into the database and associated with the script name, and the generated script is backed up twice: the first backup is stored on the test server, and the second at the end that executes the test, i.e., the software performance test executor.
Compared with prior-art test tools that work on a single machine, there is no need to worry about test scripts being damaged or lost; storage and backup of the scripts are realized, and, if the user permits, scripts can be shared among users.
Optionally, the executing the software testing task includes sequentially executing each testing node in the testing script to be executed according to a preset sequence. Here, the preset sequence may be various, for example, the preset sequence is executed from front to back according to the sequence of the test nodes, or a default setting is firstly to select the page test node and then to select the assertion test node, and the like, where the preset sequence may be a default of the software performance test platform, or may be set by the user, and both of them belong to the scope of the present invention.
As shown in fig. 3, optionally, the software to be tested is stored in at least one test server, and the executing of the software test task and the obtaining of the test result data include the following steps:
S401: sending a test request to a test server according to the test script to be executed; the test server may be an application server storing the software under test, which, upon receiving the test request, executes the software under test according to the content of the request;
S402: acquiring the returned response data from the test server; preferably, after the test request is sent, the test server is monitored in real time and response data is received in real time.
Further, in order to better analyze the test result data, the obtaining of the test result data further includes the following steps:
S403: calculating the response time from the sending of each test request to the receipt of the corresponding response data;
S404: counting test result data, wherein the test result data comprises the maximum response time, minimum response time, average response time and TPS (transactions per second, i.e., platform throughput) within a preset time period; the test result data may further comprise the number or rate of test successes and the number or rate of test failures over a period of time.
In a preferred embodiment, the statistical test result data comprises the following steps:
counting the maximum response time, the minimum response time, the average response time and the TPS in the current time period at intervals of a preset time period to obtain a segmented statistical result;
counting the maximum response time, the minimum response time, the average response time and the platform throughput in a time interval from the current time to the test starting time at intervals of a preset time period to obtain a whole-period counting result;
and generating a test report according to the segmented statistical result and the full-segment statistical result.
For example, during the performance test, the statistics of the performance test result data (maximum response time, minimum response time, average response time, TPS, and the like) within each 10-second interval are acquired every 10 seconds, and the statistics of the test result data over the interval from the test start time to the current time are acquired every 10 seconds as well. When a test report is generated, both the full-period statistical result and the segmented statistical results can be produced and analyzed separately, providing a two-layer guarantee of the accuracy and reliability of the data.
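The two-level statistics described above can be sketched as follows; this is an illustrative implementation under the assumption that samples arrive grouped per window, and the patent does not prescribe one.

```python
def window_stats(response_times, window_seconds):
    """Statistics for one interval: max/min/average response time (here in
    milliseconds) and TPS (requests completed per second in the interval)."""
    if not response_times:
        return None
    return {
        "max": max(response_times),
        "min": min(response_times),
        "avg": sum(response_times) / len(response_times),
        "tps": len(response_times) / window_seconds,
    }

def report(windows, window_seconds=10):
    """For every window, emit both the segmented result (current window only)
    and the full-period result (test start up to the current time)."""
    segmented, full, seen = [], [], []
    for window in windows:               # one list of response times per window
        seen.extend(window)
        segmented.append(window_stats(window, window_seconds))
        full.append(window_stats(seen, window_seconds * len(segmented)))
    return segmented, full
```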
After the test report is generated, a user can check the corresponding test report through the web page, can call the test report for many times at the same time, and displays the test reports in the same web page, thereby facilitating data comparison. For example, when the tested software code is changed for multiple times, and a new performance test is performed every time the software code is changed, test reports of multiple performance tests can be displayed simultaneously, and differences among test results formed by multiple software code changes can be visually checked.
Further, the software performance testing method of the invention may further include the steps of:
acquiring the CPU utilization rate and the memory utilization rate of a test server in real time;
and judging whether the test server can be used for executing the performance test or not according to the comparison between the CPU utilization rate of the test server and a preset CPU utilization rate threshold value and the comparison between the memory utilization rate of the test server and a preset memory utilization rate threshold value.
For example, the real-time CPU and memory occupancy values of the test server are obtained (e.g., a current CPU usage rate of 30%, a memory usage of 3.2 GB out of a total of 8 GB, i.e., a memory usage rate of 40%), the memory usage rate being calculated as memory usage divided by total memory. From these values it is judged whether the test server is healthy, where "healthy" means that no one else is already performing a performance test on, or otherwise occupying, the application server.
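A minimal sketch of this health check follows; the threshold values are assumed examples, since the patent leaves them as presets.

```python
def server_available(cpu_usage: float, mem_used_gb: float, mem_total_gb: float,
                     cpu_threshold: float = 0.8, mem_threshold: float = 0.8) -> bool:
    """Judge whether the test server can run a performance test by comparing
    the CPU usage rate and the memory usage rate (memory used / total memory)
    against preset thresholds (0.8 is an assumed example value)."""
    return cpu_usage < cpu_threshold and (mem_used_gb / mem_total_gb) < mem_threshold
```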
Optionally, the test nodes in the test script template include an interface test node, a page test node, and an assertion test node; the combination manner among the interface test node, the page test node and the assertion test node may be various, and may further include other types of test nodes, without being limited thereto. The test parameters corresponding to the interface test nodes comprise application server interface addresses; the test parameters corresponding to the page test nodes comprise page domain name addresses; the test parameters corresponding to the assertion test nodes comprise preset check values. That is, the performance test types of an embodiment of the present invention may include an interface test and a page test, and an assertion test may be added to the interface test and the page test.
The application to be tested is deployed with the software code under test. For example, the interface test can test software that serves interface data calls, and the page test can test software that performs page reading and display.
Furthermore, the test script data may also include other data, such as start-stop time of the test, duration of the test, number of times of the test, and the like, which may facilitate the user to better customize his test script.
In a preferred embodiment, when the test node in the test script to be executed includes an interface test node, the test parameter corresponding to the interface test node includes an application server interface address and a contract.
As shown in fig. 4, the interface test includes the following steps:
s411: searching a corresponding test server according to the application server interface address, and establishing connection with the test server;
s412: sending a test request to the test server, wherein the test request comprises an application server interface address and a contract; after receiving the test request, the test server calls a corresponding interface according to the interface address and the contract of the application server, executes the tested software and returns response data;
s413: and acquiring the returned response data from the test server.
For example, the test parameters corresponding to one interface test node are:
application server interface address: http://10.23.3.4:5050/getUserNamebyid
Contract: { Id:10}
That is, the user wants to obtain the UserName of a user from the interface at the target address. The contract is the request format agreed upon during development for obtaining the UserName; for example, { Id:10 } is set during development, and the field corresponding to the UserName is selected accordingly.
When a corresponding test script is generated and a software test task is executed accordingly, the software performance test platform simulates 100 users (the number is only an example) sending requests to the test server; the test server receives the requests, calls the corresponding interface, and returns a UserName as response data. The format of the response data may be { username: abc }. The simulated users send their requests independently; the requests may be sent sequentially for a preset number of times, or repeatedly in a loop.
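As a non-authoritative sketch of how one such request might be formed, assuming an HTTP server and a JSON-encoded contract (the patent does not specify the transport details; Python's standard urllib is used purely for illustration):

```python
import json
from urllib import request

def build_test_request(interface_address, contract):
    """S411/S412: locate the test server via the interface address and
    prepare a test request carrying the contract as a JSON body."""
    body = json.dumps(contract).encode("utf-8")
    return request.Request(interface_address, data=body,
                           headers={"Content-Type": "application/json"})

def run_interface_test(interface_address, contract):
    """S413: send the request and return the parsed response data
    (not invoked here, since the address below is illustrative)."""
    with request.urlopen(build_test_request(interface_address, contract)) as resp:
        return json.loads(resp.read().decode("utf-8"))

# The example parameters from the patent text:
req = build_test_request("http://10.23.3.4:5050/getUserNamebyid", {"Id": 10})
print(req.full_url)   # http://10.23.3.4:5050/getUserNamebyid
print(req.data)       # b'{"Id": 10}'
```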
The response time between each request and the receipt of the returned UserName may be calculated separately and the minimum, maximum, average response time and TPS (number of requests processed per second) over a period of time counted.
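A minimal sketch of the statistics described above, assuming response times are collected in milliseconds (the sample values are hypothetical):

```python
def summarize(response_times_ms, window_seconds):
    """Compute the minimum, maximum, and average response time and the
    TPS (requests processed per second) over an observation window."""
    n = len(response_times_ms)
    return {
        "min": min(response_times_ms),
        "max": max(response_times_ms),
        "avg": sum(response_times_ms) / n,
        "tps": n / window_seconds,
    }

# Four hypothetical requests completed within a 2-second window
stats = summarize([12.0, 30.0, 18.0, 20.0], window_seconds=2)
print(stats)  # {'min': 12.0, 'max': 30.0, 'avg': 20.0, 'tps': 2.0}
```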
In addition to the UserName, various other data may be returned, such as LastName, FirstName, and the like; one or more of these may be selected for accuracy verification to determine whether the response of the application server is correct.
Therefore, further, when the test nodes in the test script to be executed include the interface test nodes and the assertion test nodes, the test parameters corresponding to the assertion test nodes include preset check values; the interface test may further comprise the steps of:
s414: extracting response data to be verified corresponding to a preset verification value from the returned response data; for example, if the preset check value is set as a theoretical value of the LastName, the LastName value in the response data is extracted during extraction;
s415: and judging whether the response data to be verified is consistent with the preset verification value or not and recording a judgment result. If they are consistent, the response is returned accurately; otherwise, the response is erroneous.
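Steps S414 and S415 can be sketched as follows; the field name and check values are hypothetical:

```python
def assertion_check(response, check_field, preset_value):
    """S414: extract the response data under check from the returned
    response; S415: compare it with the preset check value and record
    the judgment result."""
    actual = response.get(check_field)
    return {"field": check_field, "expected": preset_value,
            "actual": actual, "passed": actual == preset_value}

# Hypothetical response data in the format mentioned above
resp = {"username": "abc", "LastName": "Smith"}
print(assertion_check(resp, "LastName", "Smith")["passed"])  # True
print(assertion_check(resp, "LastName", "Jones")["passed"])  # False
```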
Optionally, when the test node in the test script to be executed includes a page test node, the test parameter corresponding to the page test node includes a page domain name address.
As shown in fig. 5, the page test includes the following steps:
s421: acquiring a test server address pointed by the page domain name according to the page domain name address, and establishing connection with the test server;
s422: sending a page acquisition request to the test server; after receiving the page acquisition request, the test server executes the tested software according to the request;
s423: receiving page data from the test server.
Further, the obtaining of the test result may further include the steps of:
s424: and calculating the response time from the sending of the page acquisition request to the successful receiving of the page data.
For example, the test parameters corresponding to a page test node are as follows:
page domain name address: www.baidu.com
When a corresponding test script is generated and a software test task is executed accordingly, the software performance test platform may simulate 100 users (the number is merely an example) sending page acquisition requests to the test server, taking the time from a user accessing the address to the successful reception of the page data as the response time, and may likewise calculate the minimum, maximum, and average response time and TPS over a period of time. Here, page data reception is considered successful when the ratio of received page data to total page data is greater than a preset threshold; for example, reception is deemed successful when more than 90% of the total page data has been received. The 90% threshold is only an example and can be adjusted in practical applications according to actual test requirements; all such adjustments fall within the protection scope of the present invention.
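The reception-success criterion above can be sketched as follows; the byte counts are hypothetical:

```python
def page_received_successfully(received_bytes, total_bytes, threshold=0.9):
    """Reception is deemed successful when the received share of the
    page data exceeds the preset threshold (90% here, per the example)."""
    return received_bytes / total_bytes > threshold

print(page_received_successfully(950, 1000))  # True  (95% > 90%)
print(page_received_successfully(850, 1000))  # False (85% <= 90%)
```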
Similarly, after the page data is acquired, the accuracy of the page data acquisition can be verified, that is, assertion test nodes are further added, and test parameters corresponding to the assertion test nodes comprise preset check values; the page test may also include the following steps:
s425: extracting a field to be checked corresponding to a preset check value from the received page data;
s426: and judging whether the field to be checked is consistent with a preset check value or not and recording a judgment result. For example, header fields, trailer fields, etc. of the page data may be checked.
Optionally, after the test script data is filled into the called test script template, a test script in xml format is formed; when the software testing task is executed, the xml-format test script is first converted into a jmx-format test script, and the software testing task is then executed according to the jmx-format test script. The script generation approach of this embodiment can be applied to any xml-structured file and is not specific to a particular industry or product; xml, the Extensible Markup Language, is one of the most widely used markup languages.
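Since jmx — the test-plan format of the Apache JMeter tool — is itself XML, the conversion is a transformation between two XML schemas. A minimal sketch of the template-filling step using the standard library (the element names are illustrative, not the patent's actual schema):

```python
import xml.etree.ElementTree as ET

# Hypothetical xml test script template with empty test nodes
TEMPLATE = """<testScript>
  <interfaceTest><address/></interfaceTest>
  <assertionTest><checkValue/></assertionTest>
</testScript>"""

def fill_template(template, address, check_value):
    """Fill the test parameters into the corresponding nodes of the
    xml test script template, forming an xml-format test script."""
    root = ET.fromstring(template)
    root.find("interfaceTest/address").text = address
    root.find("assertionTest/checkValue").text = check_value
    return ET.tostring(root, encoding="unicode")

script = fill_template(TEMPLATE, "http://10.23.3.4:5050/getUserNamebyid", "abc")
print("getUserNamebyid" in script)  # True
```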
Optionally, the software testing task is executed in a non-graphical-user-interface mode. That is, the execution of the software testing tasks is not displayed to the user one by one; the tasks run in the background, which helps reduce the testing load on the software performance test platform.
Furthermore, the embodiment of the invention also provides a channel for uploading the existing script by the user, namely, the user can directly upload the existing test script on hand to the software performance test platform, so that the software test task is directly executed according to the test script of the user.
As shown in fig. 6, an embodiment of the present invention further provides a software performance testing platform, which is used in the software performance testing method, and includes:
a database 100, configured to create and store a test script template, where the test script template includes a plurality of test nodes; the database 100 may further store a test script name and test script data input by a user, a formed test script to be executed, and a test script uploaded by the user, wherein the test script and the test script name are stored in a one-to-one correspondence manner;
the web server 200 is configured to obtain a test script name and test script data input by a user on a web interface, where the test script data includes a test parameter corresponding to at least one test node, and fill the test parameter into a corresponding node in the test script template to form a test script to be executed;
the execution machine 300 is used for sending a test request to the test server according to the formed test script to be executed and acquiring test result data;
and the test server 400 is used for storing the tested software and executing a software test task according to the test request.
The execution machine 300 is responsible for sending test requests to the test server 400 and monitoring the test state of the test server 400. The user can connect to the web server through the user terminal 500 to create scripts and view test reports. The embodiment of the invention preferably adopts an independent execution machine 300, which ensures a clean performance test environment, free from inaccurate data, failures, and other problems caused by external factors.
Fig. 7 is a flowchart illustrating a software performance testing platform according to an embodiment of the present invention. Wherein, the A section represents the action executed by the web server, the B section represents the action executed by the execution machine, and the C section represents the action executed by the test server.
Firstly, a user inputs a test script name and test script data to a web server from a web page, or directly uploads a test script;
when a user needs to execute the test script, selecting the corresponding test script and sending a test instruction to the execution machine;
the executive machine monitors the generated test script in real time, when receiving the starting command, the executive machine triggers and executes the software test task according to the name of the test script required by the user, meanwhile, the executive machine receives the time required to be tested and starts time calculation, and the executive machine finishes the work when reaching the test time required by the user.
During the performance test, segmented result data of the performance test within the latest 10-second window (maximum, minimum, and average response time, TPS, and the like) are obtained every 10 seconds, and all test result data within the performance test time elapsed so far (maximum, minimum, and average response time, TPS, and the like) are also obtained every 10 seconds; the number of requests, the number or rate of successes, and the number or rate of failures may further be counted. After the whole performance test is finished, all the full-range results and segmented results are analyzed; this double layer ensures the accuracy and reliability of the data, which are then stored in the database.
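The segmented and full-range statistics can be sketched as follows; the sample response times and timestamps are hypothetical:

```python
def segmented_and_cumulative(samples_ms, timestamps_s, now_s, window_s=10):
    """Every window_s seconds, report statistics for the latest window
    (segmented result) and for the whole test so far (cumulative result)."""
    def stats(xs):
        return {"min": min(xs), "max": max(xs), "avg": sum(xs) / len(xs)}
    window = [x for x, t in zip(samples_ms, timestamps_s)
              if now_s - window_s <= t <= now_s]
    return {"segment": stats(window), "cumulative": stats(samples_ms)}

# Hypothetical response times (ms) with timestamps (s since test start)
times = [10.0, 20.0, 30.0, 40.0]
stamps = [1, 5, 12, 18]
r = segmented_and_cumulative(times, stamps, now_s=20)
print(r["segment"]["avg"])     # 35.0 (samples at t=12 and t=18)
print(r["cumulative"]["avg"])  # 25.0 (all four samples)
```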
And uniformly generating a test report according to the performance test results of multiple times, wherein a user can randomly select multiple reports on a web interface and display the reports on the same page for data search and comparison.
The execution machine can also acquire the state information of the application server and of the execution machine itself in real time; for example, the CPU and memory usage states of the application server are acquired, and it is judged whether the current server is in a healthy state and suitable for deploying a performance test task.
The test result data of the embodiment of the invention is obtained in real time in the performance test process, but not obtained at one time after the test is finished, so that the real-time performance and the accuracy of the result data obtaining can be ensured.
Fig. 8 is a schematic diagram of a web page of a software performance testing platform according to an embodiment of the present invention. The user can add new scripts on the page, upload scripts, query/modify, select script execution, modify and view test environment, view test results, perform performance comparisons for multiple tests, and manage scripts. The software performance testing platform may obtain and graphically illustrate application server state information.
Wherein the tabular region displays the results of the aggregated report, for example, as shown in table 1 below:
Table 1: Test report example (rendered as an image in the original document)
Wherein N/A indicates that the test item is currently not applicable; after the test is performed and test data are obtained, the specific test results can be filled into the form. The drawing area may display a graph plotted from the test result data, with time as the horizontal coordinate and the statistical variables as the vertical coordinate.
Further, the table area can simultaneously display the results of the aggregated reports of multiple tests, and the drawing area can correspondingly and simultaneously draw the curve graphs of the results of the multiple tests, so that the user can conveniently check and compare the results. For example, the table may be as shown in table 2 below:
Table 2: Test report example (rendered as an image in the original document)
Fig. 9 is a schematic diagram of a page of report query of the software performance testing platform according to an embodiment of the present invention. The results of multiple tests can be simultaneously placed on one page to be checked, so that the comparison of the test results of multiple tests by a user is facilitated, different influences caused by code adjustment can be conveniently checked, and the software codes can be adjusted accordingly.
Fig. 10 is a schematic page diagram of a view log of a software performance testing platform according to an embodiment of the present invention. During the test, the logs can be stored in real time, and a user can call and view the corresponding logs according to actual needs.
FIG. 11 is a schematic page diagram for adding a script in the software performance testing platform according to an embodiment of the present invention. As shown in the figure, the user may fill in a production line, owner, script name, service name, execution engine, request type, encoding format, Header, Body, parameterization, duration, concurrency number, assertion, remarks, and so forth. Each field can be selected and filled in as required; after filling is completed, the platform automatically identifies the test script data filled in by the user and generates a test script for execution.
The page diagrams shown in fig. 8 to 11 are only examples, and may be adjusted according to needs in practical applications, but not limited thereto.
The embodiment of the invention also provides software performance test equipment, which comprises a processor; a memory having stored therein executable instructions of the processor; wherein the processor is configured to perform the steps of the software performance testing method via execution of the executable instructions.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a platform, method or program product. Thus, various aspects of the invention may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, which may all generally be referred to herein as a "circuit," "module," or "platform."
An electronic device 600 according to this embodiment of the invention is described below with reference to fig. 12. The electronic device 600 shown in fig. 12 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 12, the electronic device 600 is embodied in the form of a general purpose computing device. The components of the electronic device 600 may include, but are not limited to: at least one processing unit 610, at least one memory unit 620, a bus 630 connecting the different platform components (including the memory unit 620 and the processing unit 610), a display unit 640, etc.
Wherein the storage unit stores program code executable by the processing unit 610 to cause the processing unit 610 to perform the steps according to various exemplary embodiments of the present invention described in the software performance testing method section of this specification. For example, the processing unit 610 may perform the steps shown in fig. 1.
The storage unit 620 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM)6201 and/or a cache memory unit 6202, and may further include a read-only memory unit (ROM) 6203.
The memory unit 620 may also include a program/utility 6204 having a set (at least one) of program modules 6205, such program modules 6205 including, but not limited to: an operating platform, one or more application programs, other program modules, and program data, each of which, and in some combination, may comprise an implementation of a network environment.
Bus 630 may be one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 600 may also communicate with one or more external devices 700 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 600, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 600 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 650. Also, the electronic device 600 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via the network adapter 660. The network adapter 660 may communicate with other modules of the electronic device 600 via the bus 630. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 600, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID platforms, tape drives, and data backup storage platforms, to name a few.
The embodiment of the invention also provides a computer readable storage medium for storing a program, where the program realizes the steps of the software performance testing method when executed. In some possible embodiments, aspects of the present invention may also be implemented in the form of a program product comprising program code for causing a terminal device to perform the steps according to various exemplary embodiments of the present invention described in the software performance testing method section of this specification, when the program product is run on the terminal device.
Referring to fig. 13, a program product 800 for implementing the above method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited in this regard and, in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution platform, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor platform, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The computer readable storage medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable storage medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution platform, apparatus, or device. Program code embodied on a readable storage medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
The software performance testing method, the software performance testing platform and the software performance testing storage medium have the following advantages:
According to the invention, the test scripts are managed in a unified way throughout the software testing process, making them easy to manage, maintain, and share. A user only needs to input a small amount of test script data, and the corresponding test script can be generated automatically from the test script template; even if a test script is damaged or lost, it can be quickly and conveniently regenerated. This reduces the user's workload of writing test scripts and improves test efficiency. Likewise, the real-time acquisition and unified management of test results during the software testing process make the test results easy to manage, maintain, share, and trace, make it more convenient to analyze abnormal results across multiple tests, and thereby help improve software performance.
The foregoing is a more detailed description of the invention in connection with specific preferred embodiments and it is not intended that the invention be limited to these specific details. For those skilled in the art to which the invention pertains, several simple deductions or substitutions can be made without departing from the spirit of the invention, and all shall be considered as belonging to the protection scope of the invention.

Claims (17)

1. A software performance testing method is characterized by comprising the following steps:
creating a test script template, wherein the test script template comprises a plurality of test nodes;
acquiring a test script name and test script data input by a user on a web interface, wherein the test script data comprises a test parameter corresponding to at least one test node;
filling the test parameters into corresponding nodes in the test script template to form a test script to be executed;
executing a software test task according to the formed test script to be executed and acquiring test result data;
the method for forming the test script to be executed comprises the following steps:
filling the test parameters into corresponding test nodes in the test script template;
combining the filled test nodes to form a test script to be executed;
naming the test script to be executed by the name of the test script input by the user;
the software to be tested is stored in at least one testing server, the software testing task is executed, and the testing result data is obtained, the method comprises the following steps:
sending a test request to a test server according to the test script to be executed;
the test server executes the tested software according to the test request;
acquiring returned response data from the test server;
the method further comprises the steps of:
acquiring the CPU utilization rate and the memory utilization rate of a test server in real time;
and judging whether the test server can be used for executing the performance test or not according to the comparison between the CPU utilization rate of the test server and a preset CPU utilization rate threshold value and the comparison between the memory utilization rate of the test server and a preset memory utilization rate threshold value.
2. The software performance testing method of claim 1, wherein the executing of the software testing task includes the step of sequentially executing the testing program of each testing node in the testing script to be executed according to a preset sequence.
3. The software performance testing method of claim 1, wherein the obtaining test result data further comprises the steps of:
calculating a response time from each sending of a test request to receipt of corresponding response data;
and counting test result data, wherein the test result data comprises maximum response time, minimum response time, average response time and platform throughput in a preset time period.
4. The software performance testing method of claim 3, wherein the statistical test result data comprises the steps of:
counting the maximum response time, the minimum response time, the average response time and the platform throughput in the current time period at intervals of a preset time period to obtain a segmented statistical result;
counting the maximum response time, the minimum response time, the average response time and the platform throughput in a time interval from the current time to the test starting time at intervals of a preset time period to obtain a whole-period counting result;
and generating a test report according to the segmented statistical result and the full-segment statistical result.
5. The software performance testing method of claim 1, wherein the test nodes in the test script template include an interface test node, a page test node, and an assertion test node;
the test parameters corresponding to the interface test nodes comprise application server interface addresses;
the test parameters corresponding to the page test nodes comprise page domain name addresses;
the test parameters corresponding to the assertion test nodes comprise preset check values.
6. The software performance testing method of claim 1, wherein when the test node in the test script to be executed comprises an interface test node, the test parameters corresponding to the interface test node comprise an application server interface address and a contract;
the steps of executing the software testing task and acquiring the testing result data comprise:
searching a corresponding test server according to the application server interface address, and establishing connection with the test server;
sending a test request to the test server, wherein the test request comprises an application server interface address and a contract;
the test server executes the tested software according to the test request;
and acquiring the returned response data from the test server.
7. The software performance testing method of claim 6, wherein when the test nodes in the test script to be executed include the interface test nodes and the assertion test nodes, the test parameters corresponding to the assertion test nodes include preset check values;
the step of executing the software testing task and acquiring the testing result data further comprises the following steps:
extracting response data to be verified corresponding to a preset verification value from the returned response data;
and judging whether the response data to be verified is consistent with the preset verification value or not and recording a judgment result.
8. The software performance testing method of claim 1, wherein when the test node in the test script to be executed comprises a page test node, the test parameter corresponding to the page test node comprises a page domain name address;
the steps of executing the software testing task and acquiring the testing result data comprise:
acquiring a test server address pointed by the page domain name according to the page domain name address, and establishing connection with the test server;
sending a page acquisition request to the test server;
the test server executes the tested software according to the page acquisition request;
receiving page data from the test server.
9. The software performance testing method of claim 8, wherein the obtaining of the test results further comprises the steps of:
and calculating response time from sending the page acquisition request to receiving page data successfully, wherein the page data successfully received is that the proportion of the received page data to the total page data is greater than a preset threshold value.
10. The software performance testing method of claim 8, wherein when the test nodes in the test script to be executed include a page test node and an assertion test node, the test parameters corresponding to the assertion test node include a preset check value;
the step of executing the software testing task and acquiring the testing result data further comprises the following steps:
extracting a field to be checked corresponding to a preset check value from the received page data;
and judging whether the field to be checked is consistent with a preset check value or not and recording a judgment result.
11. The software performance testing method of claim 1, further comprising the steps of:
and searching test script data corresponding to the name of the test script when the test script cannot be executed, and regenerating the corresponding test script according to the searched test script data.
12. The software performance testing method of claim 1, wherein the test script data is filled into the called test script template to form a test script in an xml format;
when the software testing task is executed, the testing script in the xml format is firstly converted into the testing script in the jmx format, and then the software testing task is executed according to the testing script in the jmx format.
13. The software performance testing method of claim 1, wherein the software testing task is performed in a non-graphical user interface mode.
14. The software performance testing method of claim 1, further comprising the steps of:
acquiring a test script to be executed uploaded by a user;
and executing the software testing task according to the to-be-executed testing script uploaded by the user and acquiring testing result data.
15. A software performance testing platform for implementing the software performance testing method of any one of claims 1 to 14, comprising:
the database is used for creating and storing a test script template, and the test script template comprises a plurality of test nodes;
the web server is used for acquiring a test script name and test script data input by a user on a web interface, wherein the test script data comprises a test parameter corresponding to at least one test node, and the test parameter is filled into the corresponding node in the test script template to form a test script to be executed;
the executive machine is used for sending a test request to the test server according to the formed test script to be executed and acquiring test result data;
and the test server is used for storing the tested software and executing a software test task according to the test request.
16. A software performance testing apparatus, comprising:
a processor;
a memory having stored therein instructions executable by the processor;
wherein the processor is configured to perform the steps of the software performance testing method of any one of claims 1 to 14 via execution of the executable instructions.
17. A computer readable storage medium storing a program, wherein the program when executed implements the steps of the software performance testing method of any of claims 1 to 14.
CN201710569691.0A 2017-07-13 2017-07-13 Software performance testing method, platform, equipment and storage medium Active CN107341098B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710569691.0A CN107341098B (en) 2017-07-13 2017-07-13 Software performance testing method, platform, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710569691.0A CN107341098B (en) 2017-07-13 2017-07-13 Software performance testing method, platform, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN107341098A CN107341098A (en) 2017-11-10
CN107341098B true CN107341098B (en) 2020-06-19

Family

ID=60218761

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710569691.0A Active CN107341098B (en) 2017-07-13 2017-07-13 Software performance testing method, platform, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN107341098B (en)

Families Citing this family (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109960645B (en) * 2017-12-22 2022-10-18 迈普通信技术股份有限公司 Script test method and device and script test system
CN108415826B (en) * 2018-01-12 2021-10-29 深圳壹账通智能科技有限公司 Application testing method, terminal device and computer readable storage medium
CN108519943A (en) * 2018-03-06 2018-09-11 平安科技(深圳)有限公司 Testing and control and test execution device, method and computer storage media
CN108255738A (en) * 2018-04-09 2018-07-06 平安普惠企业管理有限公司 Automated testing method, device, computer equipment and storage medium
CN108874499A (en) * 2018-04-12 2018-11-23 彭根 Software evaluation method and device
CN108829574B (en) * 2018-04-13 2023-04-18 深圳壹账通智能科技有限公司 Test data laying method, test server and computer readable storage medium
CN108829575B (en) * 2018-04-17 2021-08-24 平安科技(深圳)有限公司 Test case recommendation method, electronic device and readable storage medium
CN108415847A (en) * 2018-05-08 2018-08-17 平安普惠企业管理有限公司 Performance test methods, device, computer equipment and storage medium
CN109726105B (en) * 2018-05-15 2023-11-28 深圳市易行网数字科技有限公司 Test data construction method, device, equipment and storage medium
CN108845950A (en) * 2018-08-03 2018-11-20 平安科技(深圳)有限公司 Test device, the method for test and storage medium
CN109299006A (en) * 2018-09-11 2019-02-01 平安科技(深圳)有限公司 The method and apparatus for determining system stability
CN109783336A (en) * 2018-12-18 2019-05-21 中国平安人寿保险股份有限公司 Data analyze scenario generation method, device, medium and electronic equipment
CN109871314A (en) * 2019-01-02 2019-06-11 石化盈科信息技术有限责任公司 The automatic generation method of test script
CN111402658A (en) * 2019-01-03 2020-07-10 福建天泉教育科技有限公司 Method and terminal for automatically checking answering system
CN110109825A (en) * 2019-04-12 2019-08-09 平安普惠企业管理有限公司 Method for early warning and relevant apparatus on line
CN110309038B (en) * 2019-04-17 2023-02-07 中国平安人寿保险股份有限公司 Performance test method and device, electronic equipment and computer readable storage medium
CN110334003A (en) * 2019-05-22 2019-10-15 梁俊杰 A kind of flow designing method and relevant device
CN110347559A (en) * 2019-07-18 2019-10-18 浪潮商用机器有限公司 A kind of test method of server energy consumption, device, equipment and readable storage medium storing program for executing
CN110659870A (en) * 2019-08-14 2020-01-07 平安普惠企业管理有限公司 Business audit test method, device, equipment and storage medium
CN110427331B (en) * 2019-09-03 2021-06-22 四川长虹电器股份有限公司 Method for automatically generating performance test script based on interface test tool
CN110633150A (en) * 2019-09-12 2019-12-31 广东浪潮大数据研究有限公司 Container scheduling performance testing method and device
CN110674042B (en) * 2019-09-23 2022-08-02 苏州浪潮智能科技有限公司 Concurrency performance testing method and device
CN112650666B (en) * 2019-10-12 2024-04-09 北京达佳互联信息技术有限公司 Software testing system, method, device, control equipment and storage medium
CN111143214A (en) * 2019-12-26 2020-05-12 上海米哈游天命科技有限公司 Game performance monitoring method, device, server and storage medium
CN111258902B (en) * 2020-01-17 2022-07-19 深圳平安医疗健康科技服务有限公司 Performance test method and performance test system based on SockJS server
CN111581097A (en) * 2020-05-09 2020-08-25 深圳市卡数科技有限公司 Performance test method, system, equipment and storage medium of parameterized data
CN112015605A (en) * 2020-07-28 2020-12-01 深圳市金泰克半导体有限公司 Memory test method and device, computer equipment and storage medium
CN112328482A (en) * 2020-11-05 2021-02-05 中国平安人寿保险股份有限公司 Test method and device based on script template, computer equipment and storage medium
CN112433942A (en) * 2020-11-24 2021-03-02 北京云测信息技术有限公司 Software automation testing method, device and system based on artificial intelligence model

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104717103A (en) * 2013-12-13 2015-06-17 中国移动通信集团公司 Method and device for testing network device
CN105843716A (en) * 2016-04-01 2016-08-10 浪潮电子信息产业股份有限公司 IO function test method based on MLTT (The Medusa Lab Test Tool Suite)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060190579A1 (en) * 2005-02-23 2006-08-24 Alcatel Assisted command script template creation
CN103268226B (en) * 2013-05-17 2016-07-06 瑞斯康达科技发展股份有限公司 A kind of test script file generates method and device

Also Published As

Publication number Publication date
CN107341098A (en) 2017-11-10

Similar Documents

Publication Publication Date Title
CN107341098B (en) Software performance testing method, platform, equipment and storage medium
EP2572294B1 (en) System and method for sql performance assurance services
CN110826071A (en) Software vulnerability risk prediction method, device, equipment and storage medium
CN107957940B (en) Test log processing method, system and terminal
US20160274997A1 (en) End user monitoring to automate issue tracking
US10528456B2 (en) Determining idle testing periods
CN111522728A (en) Method for generating automatic test case, electronic device and readable storage medium
CN109815119B (en) APP link channel testing method and device
CN110109824B (en) Big data autoregression test method and device, computer equipment and storage medium
CN107807841B (en) Server simulation method, device, equipment and readable storage medium
WO2021097824A1 (en) Code quality and defect analysis method, server and storage medium
CN106713011B (en) Method and system for obtaining test data
CN112286806A (en) Automatic testing method and device, storage medium and electronic equipment
CN110647471A (en) Interface test case generation method, electronic device and storage medium
CN111654495B (en) Method, apparatus, device and storage medium for determining traffic generation source
US11169910B2 (en) Probabilistic software testing via dynamic graphs
CN112882863A (en) Method, device and system for recovering data and electronic equipment
CN115248782B (en) Automatic testing method and device and computer equipment
CN115017047A (en) Test method, system, equipment and medium based on B/S architecture
CN113656391A (en) Data detection method and device, storage medium and electronic equipment
CN114003497A (en) Method, device and equipment for testing service system and storage medium
CN112631929A (en) Test case generation method and device, storage medium and electronic equipment
CN112965910A (en) Automatic regression testing method and device, electronic equipment and storage medium
CN113127284A (en) Server pressure testing method and system, electronic equipment and storage medium
CN112988589A (en) Interface testing method, device and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant