CN111209218A - Automatic performance testing method based on Jmeter - Google Patents

Automatic performance testing method based on Jmeter

Info

Publication number
CN111209218A
Authority
CN
China
Prior art keywords
test
jmeter
performance test
performance
script
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010248639.7A
Other languages
Chinese (zh)
Inventor
水晓艺
何鹏林
李旭
王莉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Telecom Wanwei Information Technology Co Ltd
Original Assignee
China Telecom Wanwei Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Telecom Wanwei Information Technology Co Ltd filed Critical China Telecom Wanwei Information Technology Co Ltd
Priority to CN202010248639.7A priority Critical patent/CN111209218A/en
Publication of CN111209218A publication Critical patent/CN111209218A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3684Test management for test design, e.g. generating new test cases

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention relates to the technical field of software testing, and in particular to an automatic performance testing method based on Jmeter, comprising the following steps: S1, defining keywords; S2, defining a test task; S3, defining a performance test scenario; S4, selecting performance test keywords; S5, setting a remote monitoring server; S6, setting a result checking policy; S7, generating the Jmeter script; S8, running the Jmeter script; S9, ending the Jmeter script run; S10, generating a performance test report; S11, sending the test report by mail and finishing the test. The invention can generate Jmeter scripts quickly, accurately, and automatically, improving the readability and comprehensibility of the script configuration; it supports both multi-node concurrent execution and single-node execution of performance test scripts, so that every test server is fully utilized under high concurrency and test servers are not wasted; performance indicators no longer need to be tallied manually, which greatly improves testing efficiency and makes the test results more accurate; and the testing procedure is simple, easy to operate and understand, which lowers the difficulty of performance testing.

Description

Automatic performance testing method based on Jmeter
Technical Field
The invention relates to the technical field of software testing, and in particular to an automatic performance testing method based on Jmeter.
Background
As a popular performance testing tool, Jmeter is increasingly applied to performance testing of servers, software systems, software interfaces, and the like. Usually, performance test scripts are written, run, and their results tallied manually by performance testers. Because the experience and technical ability of performance testers vary widely, and the Jmeter tool and its scripts involve many professional terms and skills that are not easy to master, the threshold for performance testing is high. Meanwhile, because test requirements are diverse and changeable, the test scripts change with them. As a result, manually written Jmeter scripts and manually tallied test results often contain many errors and deviations during actual performance testing, leading to low efficiency, inaccurate test data, and even failure to complete the intended performance test.
To address these problems, some organizations and individuals have proposed semi-automatic performance testing methods and systems based on the Jmeter tool. These methods and systems can improve performance testing efficiency to a certain extent, but they cannot automatically generate a complete Jmeter test script, cannot complete a performance test with a single click, and still require manual operation in some steps; their operation flows are complex, not easy to use, and demand a certain level of technical skill, so the threshold for performance testing remains high and the need to lower its difficulty is not met. In addition, semi-automatic performance testing may produce a large number of errors or inconsistencies between the performance test data and the collected statistics, so the test results are unsatisfactory and the gain in Jmeter performance testing efficiency is limited.
Disclosure of Invention
The invention aims to provide an automatic performance testing method based on Jmeter that improves performance testing efficiency, has a simple operation flow, monitors performance indicators and resource usage in real time during execution, and presents test report data intuitively, making performance testing easier and lowering its difficulty.
To solve the above problems, the present invention provides an automatic performance testing method based on Jmeter, which comprises the following steps:
S1, defining keywords: encapsulating Jmeter keywords, and setting a keyword name and parameter names for each encapsulated keyword;
S2, defining a test task: the test task includes the test item, mail sending, scheduled execution, execution time, recipients, and sender; whether to execute on a schedule and whether to send the test report by mail after the test finishes can be decided according to the actual situation; a scheduled task runs at the set time without manual operation by the performance testers, and when mail sending is configured the performance test report is sent to the relevant personnel automatically after the run finishes, likewise without manual operation;
S3, defining a performance test scenario: the scenario includes the scope of the performance test and a test strategy, wherein the test strategy includes the number of concurrent users, the number of executions, the run duration, and the concurrent-user growth mode (an illustrative configuration sketch for steps S2-S3 follows this step list);
S4, selecting performance test keywords: selecting from the Jmeter keywords encapsulated in step S1, according to the test task defined in step S2, and combining the required keywords; the encapsulated keywords form the minimum set needed for performance testing, and further keywords can be encapsulated and added according to actual requirements; the keyword names are suggested names and can be chosen freely according to actual requirements; the keyword parameters are the minimum set required by Jmeter, or an optimized set that satisfies that minimum, and can be extended according to actual requirements; the preset values of the keyword parameters are suggested values, and other values, or no preset value, can be specified according to actual requirements;
S5, setting a remote monitoring server: the configuration includes the server IP, server type, and server application, and during the test the monitored server's CPU utilization, memory idle rate, inbound and outbound bandwidth, and CPU load are monitored;
S6, setting a result checking policy: the policy covers TPS, response time, success rate, the number of concurrent users, server CPU utilization, and server memory idle rate;
S7, generating the Jmeter script: the configuration information from steps S2, S3, S4, S5, and S6 is saved, and a Jmeter script and parameter file are generated automatically; the script can be run on a schedule or started manually; under high concurrency, idle servers are acquired automatically so that Jmeter is executed remotely and concurrently on multiple nodes, and the remote monitoring server is started at the same time to monitor and aggregate performance test data in real time;
S8, running the Jmeter script: the Jmeter script generated in step S7 is run; when the run starts, the remote monitoring server set in step S5 is started to monitor and aggregate performance test data in real time, and idle servers are allocated automatically according to the test strategy defined in step S3 so that the test executes concurrently on multiple nodes or on a single node; the real-time performance test data include TPS, maximum response time, minimum response time, 90th-percentile response time, the number of concurrent users, success rate, and the monitored server's CPU utilization, disk read/write rate, inbound and outbound bandwidth, CPU load, and memory usage; the real-time performance test data can be displayed as an overall trend graph, and for each transaction name the real-time number of concurrent users, maximum, minimum, and 90th-percentile response times, number of requests, success rate, TPS, 4xx/5xx counts, and the like are displayed as well; with multi-node concurrent execution, the number of executors is controlled automatically so that the high concurrent load is shared among the nodes;
S9, ending the Jmeter script run: when the Jmeter script run of step S8 ends, the remote monitoring server set in step S5 is stopped;
S10, generating a performance test report: after step S9, a performance test report is generated, which includes the basic information of the test task, the test strategy information, the performance test indicators, the indicators of the remotely monitored server, and whether the test finally passed; the performance test report presents the performance test indicators and the final test result;
S11, sending the test report by mail: the performance test report generated in step S10 is sent to the recipients defined in step S2, after which the test ends.
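By way of illustration only, the test task of step S2 and the test scenario of step S3 can be captured as plain configuration structures; the field names and values below are assumptions made for this sketch, not terms fixed by the invention.

```python
# Illustrative configuration for steps S2-S3; all field names and values are assumptions.
test_task = {
    "test_item": "login interface",          # item under test
    "send_mail": True,                       # mail the report when the run finishes
    "scheduled": True,                       # execute at a fixed time, no manual start
    "execute_at": "2020-04-01 02:00",        # execution time
    "recipients": ["tester@example.com"],    # report recipients
    "sender": "perf-robot@example.com",      # report sender
}

test_scenario = {
    "scope": ["login", "order query"],       # scope of the performance test
    "strategy": {
        "concurrent_users": 200,             # number of concurrent users
        "iterations": 0,                     # number of executions (0 = run for a fixed duration)
        "duration_seconds": 600,             # run duration
        "ramp_up_mode": "step",              # concurrent-user growth mode
    },
}
```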
Further, in step S1 the Jmeter keywords are encapsulated in the Java or Python language, and the encapsulated Jmeter keywords include a test plan, an execution plan, an HTTP cookie manager, HTTP request defaults, an HTTP request, a transaction controller, a simple controller, a conditional branch controller, a loop controller, a regular expression, CSV file generation, CSV file upload, and a backend listener.
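A minimal Python sketch of such an encapsulation, assuming a simple name-plus-parameters structure (the Keyword class and the sample keyword below are illustrative, not part of the invention):

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class Keyword:
    """An encapsulated Jmeter keyword: a keyword name plus parameter names and suggested preset values."""
    name: str                                                        # keyword name (a suggested, freely renamable name)
    params: Dict[str, Optional[str]] = field(default_factory=dict)  # parameter name -> preset value (None = no preset)

# Example: an "http request" keyword with a minimal parameter set and suggested presets.
HTTP_REQUEST = Keyword(
    name="http request",
    params={"domain": None, "port": "80", "path": "/", "method": "GET"},
)
```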
Further, the result checking policy in step S6 may contain multiple rules, which are used to check whether the performance test reaches the expected result and to determine whether the performance test passes.
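For concreteness, such a policy could be held as a list of threshold rules, each comparing one measured indicator against an expected value; the rule format below is an assumption made for this sketch.

```python
# Each rule: (indicator name, comparison, expected threshold). The test passes only if all rules pass.
RULES = [
    ("tps",             ">=", 100.0),   # transactions per second
    ("avg_response_ms", "<=", 500.0),   # response time
    ("success_rate",    ">=", 0.999),   # success rate
    ("server_cpu_pct",  "<=", 80.0),    # server CPU utilization
    ("mem_idle_pct",    ">=", 20.0),    # server memory idle rate
]

OPS = {">=": lambda a, b: a >= b, "<=": lambda a, b: a <= b}

def check_results(measured: dict, rules=RULES):
    """Return (passed, failures) for a dict of measured indicators."""
    failures = [(name, op, limit, measured[name])
                for name, op, limit in rules if not OPS[op](measured[name], limit)]
    return not failures, failures
```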
Further, in step S3, a trend graph of the growth of the number of concurrent users is generated after the test strategy is set.
Further, in step S4 a keyword may be selected repeatedly; keywords support parent-child relationships, that is, one keyword can contain several child keywords that together form a keyword group; HTTP request keywords can be imported in batch from an external Excel, CSV, or JSON file, and the imported content includes the parameters and parameter data of the HTTP request keywords.
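A sketch of the batch import for the CSV case, assuming one HTTP request per row with hypothetical column names (name, path, method); the Excel and JSON cases would follow the same pattern.

```python
import csv

def import_http_requests(csv_path: str):
    """Read HTTP request keywords and their parameter data from an external CSV file."""
    requests = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            requests.append({
                "name": row["name"],                 # keyword name
                "path": row["path"],                 # request path
                "method": row.get("method", "GET"),  # request method
                # remaining columns are treated as request parameters and parameter data
                "params": {k: v for k, v in row.items() if k not in ("name", "path", "method")},
            })
    return requests
```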
Further, the performance test data in step S8 are displayed visually as tables, pie charts, progress charts, and line graphs.
Further, the performance test report generated in step S10 is a PDF file.
The beneficial effects of the invention are as follows: when generating the script automatically, the invention checks every parameter, which avoids script errors caused by data and parameter configuration mistakes, so Jmeter scripts can be generated quickly, accurately, and automatically, and the readability and comprehensibility of the script configuration are improved. Running the script automatically starts the monitoring service and monitors data in real time, so that performance test data can be viewed in real time during the test; at the same time, both multi-node concurrent execution and single-node execution of the performance test script are supported, so that every test server is fully utilized under high concurrency and test servers are not wasted. The automatically generated performance report intuitively reflects the performance indicators of the test and the CPU, memory, and other resource usage of the monitored servers, and the indicators no longer need to be tallied manually, which greatly improves performance testing efficiency and makes the test results more accurate. The testing procedure is simple, easy to operate and understand, and lowers the difficulty of performance testing.
The invention can generate scripts automatically, execute performance tests on a schedule or manually, support multi-node high-concurrency performance tests, monitor performance test data automatically, generate test reports automatically, and send mails automatically, which improves performance testing efficiency; the operation flow of the system is simple, performance indicators and resource usage can be monitored in real time during execution, and test report data are displayed intuitively, making performance testing easier and lowering its difficulty.
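For illustration, the single-node and multi-node execution mentioned above maps naturally onto Jmeter's standard non-GUI command line (-n, -t, -l, and -R for remote hosts); the wrapper below is only a sketch, and the logic that selects idle servers is not shown.

```python
import subprocess

def run_jmeter(jmx_path: str, jtl_path: str, remote_hosts=None):
    """Run a generated .jmx in Jmeter non-GUI mode, optionally distributed to remote load generators."""
    cmd = ["jmeter", "-n", "-t", jmx_path, "-l", jtl_path]
    if remote_hosts:
        # Multi-node concurrent execution: distribute the plan to the listed jmeter-server nodes.
        cmd += ["-R", ",".join(remote_hosts)]
    return subprocess.run(cmd, check=True)

# Single-node execution:
#   run_jmeter("plan.jmx", "results.jtl")
# Multi-node concurrent execution on two idle load generators (placeholder addresses):
#   run_jmeter("plan.jmx", "results.jtl", ["10.0.0.11", "10.0.0.12"])
```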
Drawings
FIG. 1 is a schematic view of the structure of the present invention.
Detailed Description
As shown in FIG. 1, the automatic performance testing method based on Jmeter of the present invention includes the following steps:
S1, defining keywords: encapsulating Jmeter keywords, and setting a keyword name and parameter names for each encapsulated keyword;
S2, defining a test task: the test task includes the test item, mail sending, scheduled execution, execution time, recipients, and sender; whether to execute on a schedule and whether to send the test report by mail after the test finishes can be decided according to the actual situation; a scheduled task runs at the set time without manual operation by the performance testers, and when mail sending is configured the performance test report is sent to the relevant personnel automatically after the run finishes, likewise without manual operation;
S3, defining a performance test scenario: the scenario includes the scope of the performance test and a test strategy, wherein the test strategy includes the number of concurrent users, the number of executions, the run duration, and the concurrent-user growth mode;
S4, selecting performance test keywords: selecting from the Jmeter keywords encapsulated in step S1, according to the test task defined in step S2, and combining the required keywords; the encapsulated keywords form the minimum set needed for performance testing, and further keywords can be encapsulated and added according to actual requirements; the keyword names are suggested names and can be chosen freely according to actual requirements; the keyword parameters are the minimum set required by Jmeter, or an optimized set that satisfies that minimum, and can be extended according to actual requirements; the preset values of the keyword parameters are suggested values, and other values, or no preset value, can be specified according to actual requirements;
S5, setting a remote monitoring server: the configuration includes the server IP, server type, and server application, and during the test the monitored server's CPU utilization, memory idle rate, inbound and outbound bandwidth, and CPU load are monitored;
S6, setting a result checking policy: the policy covers TPS, response time, success rate, the number of concurrent users, server CPU utilization, and server memory idle rate;
S7, generating the Jmeter script: the configuration information from steps S2, S3, S4, S5, and S6 is saved, and a Jmeter script and parameter file are generated automatically; the script can be run on a schedule or started manually; under high concurrency, idle servers are acquired automatically so that Jmeter is executed remotely and concurrently on multiple nodes, and the remote monitoring server is started at the same time to monitor and aggregate performance test data in real time;
S8, running the Jmeter script: the Jmeter script generated in step S7 is run; when the run starts, the remote monitoring server set in step S5 is started to monitor and aggregate performance test data in real time, and idle servers are allocated automatically according to the test strategy defined in step S3 so that the test executes concurrently on multiple nodes or on a single node; the real-time performance test data include TPS, maximum response time, minimum response time, 90th-percentile response time, the number of concurrent users, success rate, and the monitored server's CPU utilization, disk read/write rate, inbound and outbound bandwidth, CPU load, and memory usage; the real-time performance test data can be displayed as an overall trend graph, and for each transaction name the real-time number of concurrent users, maximum, minimum, and 90th-percentile response times, number of requests, success rate, TPS, 4xx/5xx counts, and the like are displayed as well; with multi-node concurrent execution, the number of executors is controlled automatically so that the high concurrent load is shared among the nodes;
S9, ending the Jmeter script run: when the Jmeter script run of step S8 ends, the remote monitoring server set in step S5 is stopped;
S10, generating a performance test report: after step S9, a performance test report is generated, which includes the basic information of the test task, the test strategy information, the performance test indicators, the indicators of the remotely monitored server, and whether the test finally passed; the performance test report presents the performance test indicators and the final test result;
S11, sending the test report by mail: the performance test report generated in step S10 is sent to the recipients defined in step S2, after which the test ends.
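Step S11 can be realised with an ordinary SMTP client; the sketch below uses Python's standard smtplib and email modules, with the SMTP host and file names as placeholder assumptions.

```python
import smtplib
from email.message import EmailMessage

def send_report(pdf_path: str, recipients, sender: str, smtp_host: str):
    """Mail the generated performance test report (a PDF file) to the recipients defined in step S2."""
    msg = EmailMessage()
    msg["Subject"] = "Performance test report"
    msg["From"] = sender
    msg["To"] = ", ".join(recipients)
    msg.set_content("The performance test has finished; the report is attached.")
    with open(pdf_path, "rb") as f:
        msg.add_attachment(f.read(), maintype="application", subtype="pdf",
                           filename="performance_report.pdf")
    with smtplib.SMTP(smtp_host) as server:
        server.send_message(msg)

# Example with placeholder values:
#   send_report("report.pdf", ["tester@example.com"], "perf-robot@example.com", "mail.example.com")
```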
Further, in step S1 the Jmeter keywords are encapsulated in the Java or Python language, and the encapsulated Jmeter keywords include a test plan, an execution plan, an HTTP cookie manager, HTTP request defaults, an HTTP request, a transaction controller, a simple controller, a conditional branch controller, a loop controller, a regular expression, CSV file generation, CSV file upload, and a backend listener.
Further, the result checking policy in step S6 may contain multiple rules, which are used to check whether the performance test reaches the expected result and to determine whether the performance test passes.
Further, in step S3, a trend graph of the growth of the number of concurrent users is generated after the test strategy is set.
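The strategy values behind this growth trend (concurrent users, ramp-up, duration) are what step S7 writes into the generated script. A minimal, assumed template-based sketch of that generation is shown below: num_threads, ramp_time, and duration are standard Jmeter thread-group properties, while the samplers, controllers, and listeners selected in step S4 are omitted, so this is not a complete script generator.

```python
from string import Template

# Skeleton of a Jmeter thread group; a full generator would wrap this in a complete test plan
# and append the keywords (samplers, controllers, listeners) chosen in step S4.
THREAD_GROUP = Template("""\
<ThreadGroup guiclass="ThreadGroupGui" testclass="ThreadGroup" testname="$name" enabled="true">
  <stringProp name="ThreadGroup.num_threads">$users</stringProp>
  <stringProp name="ThreadGroup.ramp_time">$ramp</stringProp>
  <boolProp name="ThreadGroup.scheduler">true</boolProp>
  <stringProp name="ThreadGroup.duration">$duration</stringProp>
</ThreadGroup>
""")

def generate_thread_group(strategy: dict) -> str:
    # Parameter checking before generation, to avoid script errors caused by bad configuration.
    users = int(strategy["concurrent_users"])
    duration = int(strategy["duration_seconds"])
    if users <= 0 or duration <= 0:
        raise ValueError("concurrent_users and duration_seconds must be positive")
    return THREAD_GROUP.substitute(name="Performance Scenario", users=users,
                                   ramp=int(strategy.get("ramp_up_seconds", 0)), duration=duration)

if __name__ == "__main__":
    print(generate_thread_group({"concurrent_users": 200, "duration_seconds": 600, "ramp_up_seconds": 60}))
```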
Further, in step S4 a keyword may be selected repeatedly; keywords support parent-child relationships, that is, one keyword can contain several child keywords that together form a keyword group; HTTP request keywords can be imported in batch from an external Excel, CSV, or JSON file, and the imported content includes the parameters and parameter data of the HTTP request keywords.
Further, the performance test data in step S8 are displayed visually as tables, pie charts, progress charts, and line graphs.
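Among the displayed data are the resource indicators collected from the monitored server in step S5. A minimal sampling sketch is given below using the psutil library; psutil is an assumption for illustration, since the invention does not prescribe a particular monitoring agent.

```python
import time
import psutil

def sample_server_metrics() -> dict:
    """Take one sample of CPU utilization, memory idle rate, network traffic counters, and CPU load."""
    mem = psutil.virtual_memory()
    net = psutil.net_io_counters()
    return {
        "timestamp": time.time(),
        "cpu_percent": psutil.cpu_percent(interval=1),          # CPU utilization
        "mem_idle_percent": 100.0 * mem.available / mem.total,  # memory idle rate
        "bytes_sent": net.bytes_sent,                           # outbound traffic counter
        "bytes_recv": net.bytes_recv,                           # inbound traffic counter
        "load_avg_1min": psutil.getloadavg()[0],                # CPU load (1-minute average)
    }

if __name__ == "__main__":
    print(sample_server_metrics())
```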
Further, the performance test report generated in step S10 is a PDF file.
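The indicators that populate such a report can be aggregated from Jmeter's .jtl result file. The sketch below assumes the default CSV result format, whose standard columns include timeStamp, elapsed, and success; the report layout itself is not shown.

```python
import csv

def summarise_jtl(jtl_path: str) -> dict:
    """Aggregate a Jmeter CSV result file into the headline indicators of the report."""
    elapsed, stamps, ok = [], [], 0
    with open(jtl_path, newline="") as f:
        for row in csv.DictReader(f):
            elapsed.append(int(row["elapsed"]))
            stamps.append(int(row["timeStamp"]))
            ok += row["success"] == "true"
    if not elapsed:
        return {}
    elapsed.sort()
    duration_s = max((max(stamps) - min(stamps)) / 1000.0, 1e-9)
    return {
        "samples": len(elapsed),
        "tps": len(elapsed) / duration_s,
        "min_ms": elapsed[0],
        "max_ms": elapsed[-1],
        "p90_ms": elapsed[int(0.9 * (len(elapsed) - 1))],
        "success_rate": ok / len(elapsed),
    }
```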
The invention can generate scripts automatically, execute performance tests on a schedule or manually, support multi-node high-concurrency performance tests, monitor performance test data automatically, generate test reports automatically, and send mails automatically, which improves performance testing efficiency; the operation flow of the system is simple, performance indicators and resource usage can be monitored in real time during execution, and test report data are displayed intuitively, making performance testing easier and lowering its difficulty.
When generating the script automatically, the invention checks every parameter, which avoids script errors caused by data and parameter configuration mistakes, so Jmeter scripts can be generated quickly, accurately, and automatically, and the readability and comprehensibility of the script configuration are improved. Running the script automatically starts the monitoring service and monitors data in real time, so that performance test data can be viewed in real time during the test; at the same time, both multi-node concurrent execution and single-node execution of the performance test script are supported, so that every test server is fully utilized under high concurrency and test servers are not wasted. The automatically generated performance report intuitively reflects the performance indicators of the test and the CPU, memory, and other resource usage of the monitored servers, and the indicators no longer need to be tallied manually, which greatly improves performance testing efficiency and makes the test results more accurate. The testing procedure is simple, easy to operate and understand, and lowers the difficulty of performance testing.

Claims (7)

1. An automatic performance testing method based on Jmeter, characterized in that the method comprises the following steps:
S1, defining keywords: encapsulating Jmeter keywords, and setting a keyword name and parameter names for each encapsulated keyword;
S2, defining a test task: the test task includes the test item, mail sending, scheduled execution, execution time, recipients, and sender;
S3, defining a performance test scenario: the scenario includes the scope of the performance test and a test strategy, wherein the test strategy includes the number of concurrent users, the number of executions, the run duration, and the concurrent-user growth mode;
S4, selecting performance test keywords: selecting from the Jmeter keywords encapsulated in step S1, according to the test task defined in step S2, and combining the required keywords;
S5, setting a remote monitoring server: the configuration includes the server IP, server type, and server application, and during the test the monitored server's CPU utilization, memory idle rate, inbound and outbound bandwidth, and CPU load are monitored;
S6, setting a result checking policy: the policy covers TPS, response time, success rate, the number of concurrent users, server CPU utilization, and server memory idle rate;
S7, generating the Jmeter script: saving the configuration information from steps S2, S3, S4, S5, and S6, and automatically generating a Jmeter script and parameter file;
S8, running the Jmeter script: running the Jmeter script generated in step S7; when the run starts, starting the remote monitoring server set in step S5 to monitor and aggregate performance test data in real time, and allocating idle servers automatically according to the test strategy defined in step S3 so that the test executes concurrently on multiple nodes or on a single node, wherein the real-time performance test data include TPS, maximum response time, minimum response time, 90th-percentile response time, the number of concurrent users, success rate, and the monitored server's CPU utilization, disk read/write rate, inbound and outbound bandwidth, CPU load, and memory usage;
S9, ending the Jmeter script run: when the Jmeter script run of step S8 ends, stopping the remote monitoring server set in step S5;
S10, generating a performance test report: after step S9, generating a performance test report, which includes the basic information of the test task, the test strategy information, the performance test indicators, the indicators of the remotely monitored server, and whether the test finally passed;
S11, sending the test report by mail and finishing the test: the performance test report generated in step S10 is sent to the recipients defined in step S2, after which the test ends.
2. The method of claim 1, characterized in that: in step S1, the Jmeter keywords are encapsulated in the Java or Python language, and the encapsulated Jmeter keywords include a test plan, an execution plan, an HTTP cookie manager, HTTP request defaults, an HTTP request, a transaction controller, a simple controller, a conditional branch controller, a loop controller, a regular expression, CSV file generation, CSV file upload, and a backend listener.
3. The method of claim 1, characterized in that: the result checking policy in step S6 may contain multiple rules, which are used to check whether the performance test reaches the expected result and to determine whether the performance test passes.
4. The method of claim 1, characterized in that: in step S3, a trend graph of the growth of the number of concurrent users is generated after the test strategy is set.
5. The method of claim 2, characterized in that: in step S4 a keyword may be selected repeatedly; keywords support parent-child relationships, that is, one keyword can contain several child keywords that together form a keyword group; HTTP request keywords can be imported in batch from an external Excel, CSV, or JSON file, and the imported content includes the parameters and parameter data of the HTTP request keywords.
6. The method of claim 1, characterized in that: the performance test data in step S8 are displayed visually as tables, pie charts, progress charts, and line graphs.
7. The method of claim 1, characterized in that: the performance test report generated in step S10 is a PDF file.
CN202010248639.7A 2020-04-01 2020-04-01 Automatic performance testing method based on Jmeter Pending CN111209218A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010248639.7A CN111209218A (en) 2020-04-01 2020-04-01 Automatic performance testing method based on Jmeter

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010248639.7A CN111209218A (en) 2020-04-01 2020-04-01 Automatic performance testing method based on Jmeter

Publications (1)

Publication Number Publication Date
CN111209218A true CN111209218A (en) 2020-05-29

Family

ID=70784574

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010248639.7A Pending CN111209218A (en) 2020-04-01 2020-04-01 Automatic performance testing method based on Jmeter

Country Status (1)

Country Link
CN (1) CN111209218A (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111949545A (en) * 2020-08-19 2020-11-17 彩讯科技股份有限公司 Automatic testing method, system, server and storage medium
CN112148599A (en) * 2020-09-16 2020-12-29 上海中通吉网络技术有限公司 Performance pressure measurement method, device and equipment
CN112559360A (en) * 2020-12-22 2021-03-26 盛银消费金融有限公司 Code method level-based pressure test method
CN112765005A (en) * 2021-01-21 2021-05-07 中信银行股份有限公司 Performance test execution method and system
CN112948480A (en) * 2021-04-21 2021-06-11 平安好医投资管理有限公司 Data extraction method and device, electronic equipment and storage medium
CN113051158A (en) * 2021-03-17 2021-06-29 中国人民解放军国防科技大学 System-level and link-level multi-index synchronous automatic testing method and system
CN113220562A (en) * 2020-02-06 2021-08-06 北京沃东天骏信息技术有限公司 Terminal testing method and device, computer storage medium and electronic equipment
CN114268569A (en) * 2020-09-16 2022-04-01 中盈优创资讯科技有限公司 Configurable network operation, maintenance, acceptance and test method and device
CN114816956A (en) * 2022-04-28 2022-07-29 马上消费金融股份有限公司 Interface performance test method and device
CN114896142A (en) * 2022-04-25 2022-08-12 矩阵时光数字科技有限公司 Performance index optimization method based on JMeter

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130159974A1 (en) * 2011-12-15 2013-06-20 The Boeing Company Automated Framework For Dynamically Creating Test Scripts for Software Testing
CN108415832A (en) * 2018-02-07 2018-08-17 平安科技(深圳)有限公司 Automatic interface testing method, device, equipment and storage medium
CN110502435A (en) * 2019-07-26 2019-11-26 广东睿江云计算股份有限公司 Automated performance testing method and its system based on Jmeter

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130159974A1 (en) * 2011-12-15 2013-06-20 The Boeing Company Automated Framework For Dynamically Creating Test Scripts for Software Testing
CN108415832A (en) * 2018-02-07 2018-08-17 平安科技(深圳)有限公司 Automatic interface testing method, device, equipment and storage medium
CN110502435A (en) * 2019-07-26 2019-11-26 广东睿江云计算股份有限公司 Automated performance testing method and its system based on Jmeter

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
小叮当爱学习: "JMeter Performance Testing: A Complete Introduction", https://www.jianshu.com/p/7c0a76c78697 *
豆姐姐: "Jmeter Test Fragments, Include Controller, and Module Controller Applications [6]", https://www.cnblogs.com/tudou-22/p/11452192.html *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113220562A (en) * 2020-02-06 2021-08-06 北京沃东天骏信息技术有限公司 Terminal testing method and device, computer storage medium and electronic equipment
CN111949545A (en) * 2020-08-19 2020-11-17 彩讯科技股份有限公司 Automatic testing method, system, server and storage medium
CN112148599A (en) * 2020-09-16 2020-12-29 上海中通吉网络技术有限公司 Performance pressure measurement method, device and equipment
CN114268569B (en) * 2020-09-16 2023-10-31 中盈优创资讯科技有限公司 Configurable network operation and maintenance acceptance test method and device
CN114268569A (en) * 2020-09-16 2022-04-01 中盈优创资讯科技有限公司 Configurable network operation, maintenance, acceptance and test method and device
CN112559360A (en) * 2020-12-22 2021-03-26 盛银消费金融有限公司 Code method level-based pressure test method
CN112765005A (en) * 2021-01-21 2021-05-07 中信银行股份有限公司 Performance test execution method and system
CN113051158A (en) * 2021-03-17 2021-06-29 中国人民解放军国防科技大学 System-level and link-level multi-index synchronous automatic testing method and system
CN112948480A (en) * 2021-04-21 2021-06-11 平安好医投资管理有限公司 Data extraction method and device, electronic equipment and storage medium
CN112948480B (en) * 2021-04-21 2023-11-14 平安好医投资管理有限公司 Data extraction method, device, electronic equipment and storage medium
CN114896142A (en) * 2022-04-25 2022-08-12 矩阵时光数字科技有限公司 Performance index optimization method based on JMeter
CN114896142B (en) * 2022-04-25 2024-05-07 矩阵时光数字科技有限公司 JMeter-based performance index optimization method
CN114816956A (en) * 2022-04-28 2022-07-29 马上消费金融股份有限公司 Interface performance test method and device

Similar Documents

Publication Publication Date Title
CN111209218A (en) Automatic performance testing method based on Jmeter
EP3278214B1 (en) Systems and methods for live testing performance conditions of a multi-tenant system
US9015667B2 (en) Fuzz testing of asynchronous program code
US7941332B2 (en) Apparatus, system, and method for modeling, projecting, and optimizing an enterprise application system
CN109165168A (en) A kind of method for testing pressure, device, equipment and medium
CN110287052A (en) A kind of root of abnormal task determines method and device because of task
CN106844198A (en) Distributed dispatching automation test platform and method
CN112286806B (en) Automatic test method and device, storage medium and electronic equipment
WO2019095580A1 (en) Test method and apparatus, computer device, and readable storage medium
CA2948700A1 (en) Systems and methods for websphere mq performance metrics analysis
CN113238930B (en) Method and device for testing software system, terminal equipment and storage medium
CN106933709A (en) A kind of method of testing and device
US11709758B2 (en) Enhanced application performance framework
CN115080370B (en) Software concurrency capability assessment method and device, storage medium and electronic equipment
Liu Research of performance test technology for big data applications
CN113986746A (en) Performance test method and device and computer readable storage medium
da Silva et al. A science-gateway workload archive to study pilot jobs, user activity, bag of tasks, task sub-steps, and workflow executions
CN104106245B (en) Method and device for managing network connection for use by plurality of application program processes
CN115396352A (en) CMS server-side protocol testing method, system, equipment and storage medium
Lei et al. Performance and scalability testing strategy based on kubemark
US20200019382A1 (en) Constraint programming using block-based workflows
CN106502887A (en) A kind of stability test method, test controller and system
CN111427793A (en) Automatic Jmeter script generation method
CN115056234B (en) RPA controller scheduling method and system based on event-driven and infinite state machine
CN115187131A (en) Task execution state display method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (Application publication date: 20200529)