CN116383025A - Performance test method, device, equipment and medium based on Jmeter

Performance test method, device, equipment and medium based on Jmeter

Info

Publication number
CN116383025A
CN116383025A (application CN202310443355.7A)
Authority
CN
China
Prior art keywords
performance
jmeter
test
running
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310443355.7A
Other languages
Chinese (zh)
Inventor
伍健 (Wu Jian)
张勇军 (Zhang Yongjun)
谢春伟 (Xie Chunwei)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Fulin Technology Co Ltd
Original Assignee
Shenzhen Fulin Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Fulin Technology Co Ltd filed Critical Shenzhen Fulin Technology Co Ltd
Priority to CN202310443355.7A
Publication of CN116383025A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/30 Monitoring
    • G06F11/34 Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F11/3409 Recording or statistical evaluation of computer activity for performance assessment
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/36 Preventing errors by testing or debugging software
    • G06F11/3664 Environments for testing or debugging software
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/36 Preventing errors by testing or debugging software
    • G06F11/3668 Software testing
    • G06F11/3672 Test management
    • G06F11/3688 Test management for test execution, e.g. scheduling of test suites
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/25 Integrating or interfacing systems involving database management systems
    • G06F16/258 Data format conversion from or to a database
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/445 Program loading or initiating
    • G06F9/44505 Configuring for program initiating, e.g. using registry, configuration files
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention relates to the technical field of software testing, and discloses a Jmeter-based performance testing method, device, equipment and medium. The method comprises the following steps: configuring an instance environment on a performance analysis tool platform, writing jmx scripts with the Jmeter tool, and configuring running information on a job program of the Jenkins tool; executing the jmx script on the instance environment according to the running information, and debugging the jmx script; and sending the successfully debugged jmx script to a pressure test server, running the job program, executing the performance test through the job program, and outputting a performance report. With this method, performance tests can be executed fully automatically: manual intervention across the whole performance test life cycle is minimal, daily maintenance cost is low, day-to-day working efficiency is greatly improved, and the cost of performance testing is reduced.

Description

Performance test method, device, equipment and medium based on Jmeter
Technical Field
The present invention relates to the field of software testing technologies, and in particular to a Jmeter-based performance testing method, device, equipment and medium.
Background
Jmeter, also known as "Apache Jmeter", is an open-source, 100% pure Java application with a graphical interface, designed for testing client/server software such as web applications. Jmeter aims to analyze and measure the performance of various applications and services. Some of its most important characteristics in use are: (1) open-source application: Jmeter is a free, open-source application program, and users or developers can use its source code to develop other applications; (2) support for multiple protocols: Jmeter supports protocols such as HTTP, WebService, JDBC, LDAP, JMS and FTP; (3) support for multiple test types: for example, performance testing, functional testing, regression testing, etc.
In the prior art, the writing and running of performance test scripts and the compilation of result statistics are all done manually by performance testers, so performance tests cannot be executed fully automatically. Testers also differ greatly in experience and skill, and Jmeter involves a large number of specialized terms and techniques, which makes performance testing difficult to complete.
Disclosure of Invention
In view of the above, the present invention aims to overcome the defects in the prior art and provide a Jmeter-based performance testing method, device, equipment and medium.
The invention provides the following technical scheme:
in a first aspect, in an embodiment of the present disclosure, there is provided a Jmeter-based performance testing method, including:
configuring an instance environment on a performance analysis tool platform, writing jmx scripts by utilizing a Jmeter tool, and configuring running information on a job program of the Jenkins tool;
executing the jmx script on the instance environment according to the running information, and debugging the jmx script;
and sending the successfully debugged jmx script to a pressure test server, running the job program, executing the performance test through the job program, and outputting a performance report.
Further, the running information comprises the name, execution path, execution time and corresponding instance environment identifier of the jmx script.
Further, the configuring the instance environment on the performance analysis tool platform includes:
configuring an instance type of the instance environment on the performance analysis tool platform, wherein the instance type is used to specify the running memory of the Jmeter tool;
and configuring a hard disk type of the instance environment on the performance analysis tool platform, wherein the hard disk type is used for the specific pressure scenario in which the performance test runs.
Further, the running of the job program and the execution of the performance test by the job program include:
running the job program, and starting, at fixed times, a plurality of back-end API performance timing tasks in the specific pressure scenario;
and instantiating a plurality of servers, carrying out the performance test on the test environment corresponding to each back-end API performance timing task through each server, and outputting a plurality of performance reports.
Further, after outputting the performance report, the method further includes:
performing selenium crawler processing on the performance report, and extracting key information of the performance report, wherein the key information comprises total statistics chart information, error chart information and key error chart information;
performing logic judgment on the total statistical chart information, the error chart information and the key error chart information to obtain a measurement result of the performance test;
and storing the measurement result of the performance test into a corresponding database.
Further, after extracting the key information of the performance report, the method further includes:
converting the format of the key information into json data format, and storing the converted key information into a corresponding database by using an API (application program interface);
and processing each piece of key information to generate a test file, and storing the test file into a corresponding execution path.
Further, after processing the key information obtained by executing the performance test on each server to generate the test file, the method further includes:
and sending the test file to a designated directory through the performance analysis tool platform, and generating a corresponding download link.
In a second aspect, in an embodiment of the present disclosure, there is provided a Jmeter-based performance testing apparatus, the apparatus including:
the configuration module is used for configuring an instance environment on the performance analysis tool platform, writing jmx scripts with the Jmeter tool and configuring running information on the job program of the Jenkins tool;
the execution module is used for executing the jmx script on the instance environment according to the running information and debugging the jmx script;
and the test module is used for sending the successfully debugged jmx script to the pressure test server, running the job program, executing the performance test through the job program and outputting a performance report.
In a third aspect, in an embodiment of the present disclosure, there is provided a computer device, where the computer device includes a memory and a processor, where the memory stores a computer program, and where the processor executes the computer program to implement the steps of the Jmeter-based performance test method described in the first aspect.
In a fourth aspect, in an embodiment of the present disclosure, there is provided a computer readable storage medium storing a computer program, where the computer program implements the steps of the Jmeter-based performance test method described in the first aspect when the computer program is executed by a processor.
Embodiments of the present application have the following advantages:
the performance testing method based on the JMter provided by the embodiment of the application comprises the following steps: configuring an instance environment on a performance analysis tool platform, writing jmx scripts by utilizing a Jmeter tool, and configuring running information on a job program of the Jenkins tool; executing the jmx script on the instance environment according to the running information, and debugging the jmx script; and sending the jmx script with normal debugging to a pressure test server, running the job program, executing performance test through the job program, and outputting a performance report. By the method, the performance test can be fully automatically executed, the manual intervention time of the whole performance test life cycle is very little, the daily maintenance cost is low, the daily working efficiency is greatly improved, and the cost of the performance test is saved.
In order to make the above objects, features and advantages of the present invention more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present invention and therefore should not be regarded as limiting the scope; other related drawings may be derived from these drawings by a person skilled in the art without inventive effort. Like elements are numbered alike in the various figures.
FIG. 1 shows a flowchart of a Jmeter-based performance testing method according to an embodiment of the present application;
FIG. 2 shows a flowchart of another Jmeter-based performance testing method according to an embodiment of the present application;
FIG. 3 shows a schematic structural diagram of a Jmeter-based performance testing apparatus according to an embodiment of the present application;
fig. 4 shows a schematic hardware architecture of a computer device according to an embodiment of the present application.
Detailed Description
Embodiments of the present invention are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are illustrative only and are not to be construed as limiting the invention.
It will be understood that when an element is referred to as being "fixed to" another element, it can be directly on the other element or intervening elements may also be present. When an element is referred to as being "connected" to another element, it can be directly connected to the other element or intervening elements may also be present. In contrast, when an element is referred to as being "directly on" another element, there are no intervening elements present. The terms "vertical," "horizontal," "left," "right," and the like are used herein for illustrative purposes only.
In the present invention, unless explicitly specified and limited otherwise, the terms "mounted," "connected," "secured," and the like are to be construed broadly, and may be, for example, fixedly connected, detachably connected, or integrally formed; can be mechanically or electrically connected; can be directly connected or indirectly connected through an intermediate medium, and can be communicated with the inside of two elements or the interaction relationship of the two elements. The specific meaning of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific circumstances.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the present invention, the meaning of "a plurality" is two or more, unless explicitly defined otherwise.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used in the description herein is for the purpose of describing particular embodiments only and is not intended to limit the invention. The term "and/or" as used herein includes any and all combinations of one or more of the associated listed items.
Example 1
As shown in FIG. 1, which is a flowchart of a Jmeter-based performance testing method in an embodiment of the present application, the Jmeter-based performance testing method provided in this embodiment is executed by a server and specifically includes the following steps:
step S110, an instance environment is configured on the performance analysis tool platform, jmx scripts are written by using the Jmeter tool, and running information is configured on the job program of the Jenkins tool.
It should be noted that the performance analysis tool is a Linux-based performance analysis tool. The Linux performance counter subsystem is a newer kernel-based subsystem that provides a performance analysis framework covering, for example, hardware capabilities (CPU, PMU) and software capabilities (software counters, tracepoints). Through the performance analysis tool, an application can use the PMU, tracepoints and counters in the kernel to gather performance statistics. It can analyze the performance problems of a specified application (per thread), analyze performance problems of the kernel itself, or analyze the application and the kernel at the same time, so as to gain a comprehensive understanding of the performance bottlenecks in the application.
It will be appreciated that in this embodiment the performance analysis tool is a self-developed onesperf tool; the specific performance analysis tool may be chosen according to the actual situation, which is not limited in this embodiment.
Before starting a task, an instance environment is configured in advance on the performance analysis tool platform, on which the performance test can then be executed. The instance environment needs to contain information values such as the INSTANCE TYPE (INSTANCE_TYPE) and the hard DISK TYPEs (SYSTEM_DISK_CATEGORY and DATA_DISK_CATEGORY). The instance type is used to specify the running memory of the Jmeter tool, and the hard disk type is used for the specific pressure scenario in which the performance test runs.
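As a minimal sketch (not the platform's actual API, which is not disclosed here), such an instance environment could be expressed as a small configuration object; every field name other than INSTANCE_TYPE, SYSTEM_DISK_CATEGORY and DATA_DISK_CATEGORY is an illustrative assumption:

```python
# Hedged sketch of an instance-environment definition for a performance
# analysis tool platform. Values are placeholders, not prescribed ones.
instance_env = {
    "ENV_ID": "perf-env-001",             # hypothetical identifier later referenced by the job
    "INSTANCE_TYPE": "ecs.g6.xlarge",     # instance type sized for the Jmeter running memory
    "SYSTEM_DISK_CATEGORY": "cloud_ssd",  # system disk type
    "DATA_DISK_CATEGORY": "cloud_essd",   # data disk type for the specific pressure scenario
}
```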
It should be noted that the instance environment only needs to be configured once before the task is started; if maintenance or modification is needed later, reconfiguration is basically not required.
Meanwhile, a plurality of jmx scripts are written with the Jmeter tool, and running information is configured on the job program of the Jenkins tool, where the running information comprises the name, execution path, execution time and corresponding instance environment identifier of each jmx script.
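As a hedged illustration of how this running information might reach the Jenkins job, the sketch below posts it to Jenkins' standard buildWithParameters endpoint; the server URL, job name, credentials and parameter names are all assumptions:

```python
import requests

# Sketch: trigger a parameterized Jenkins job with the running information.
# buildWithParameters is Jenkins' standard endpoint for parameterized builds;
# everything else here (URL, job name, token, parameter names) is assumed.
JENKINS_URL = "https://jenkins.example.com"
params = {
    "SCRIPT_NAME": "order_api_load.jmx",      # name of the jmx script
    "EXECUTION_PATH": "/opt/perf/scripts",    # execution path on the pressure test server
    "EXECUTION_TIME": "2023-04-13T02:00:00",  # scheduled execution time
    "INSTANCE_ENV_ID": "perf-env-001",        # corresponding instance environment identifier
}
resp = requests.post(
    f"{JENKINS_URL}/job/perf-test-job/buildWithParameters",
    params=params,
    auth=("perf_user", "api-token"),          # placeholder credentials
    timeout=30,
)
resp.raise_for_status()                       # fail loudly if the job was not queued
```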
Configuring the environment and scripts required by the performance test in advance simplifies configuration and improves the efficiency of running the performance test automatically.
And step S120, executing the jmx script on the instance environment according to the running information, and debugging the jmx script.
Specifically, the corresponding jmx script is executed on the instance environment according to running information such as the script's name, execution path and execution time, and the jmx script is debugged. The debugging process is divided into: (1) local debugging, in which a thread group is specified in the GUI graphical interface on a personal computer and the scenario corresponding to the jmx script is run with 3-5 concurrent threads; and (2) server debugging, in which the scenario corresponding to the jmx script is run with 3-5 concurrent threads from the command line on a server.
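For the server-side pass, a minimal sketch of the command-line run (wrapped in Python for illustration) might look as follows; the install path, script path and the threads property are assumptions, while -n, -t, -l and -J are standard Jmeter command-line options:

```python
import subprocess

# Sketch: debug a jmx script on the server in Jmeter's non-GUI mode with a
# small concurrency. Paths and the 'threads' property name are assumed; the
# script would need to read it, e.g. via ${__P(threads,5)} in the thread group.
cmd = [
    "/opt/apache-jmeter/bin/jmeter",
    "-n",                                          # non-GUI mode
    "-t", "/opt/perf/scripts/order_api_load.jmx",  # test plan to run
    "-l", "/opt/perf/results/debug.jtl",           # sample log for inspection
    "-Jthreads=5",                                 # 3-5 concurrent threads for debugging
]
result = subprocess.run(cmd, capture_output=True, text=True)
print(result.stdout)   # inspect the live summary; errors surface here and in the .jtl
```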
Problems with server debugging may include: (1) the Jmeter plug-in package on the server is inconsistent with the local one, causing debugging and running of the jmx script on the server to fail; the solution is to upload the correct Jmeter plug-in package to the server; (2) parameters are mistyped when the command line is executed; the solution is to manually check the execution log and verify the command-line parameters; (3) when the server runs, the hardware memory is smaller than the JVM memory configured for Jmeter, so startup fails; the solution is to manually check the error log and adjust the memory configuration.
Debugging the jmx script prevents it from producing errors and further ensures that only a normal jmx script is sent to the pressure test server.
And step S130, sending the successfully debugged jmx script to the pressure test server, running the job program, executing the performance test through the job program, and outputting a performance report.
When debugging of the jmx script shows no problems, the successfully debugged jmx script is sent to the pressure test server, i.e. the server responsible for running the job program: by running the job program, the pressure test server starts, at fixed times, a plurality of back-end API performance timing tasks in specific pressure scenarios. Further, a plurality of pressure test servers are brought up as instantiated servers, the performance test is carried out on the test environment corresponding to each back-end API performance timing task through each instantiated server, and a plurality of performance reports are output.
It should be noted that, depending on actual requirements, the back-end API performance timing tasks may involve many specific pressure scenarios and long running times; to balance running efficiency against measurement quality, a delay from the current moment to the start of the run needs to be added during the performance test, and enough running memory must be allocated for the Jmeter tool.
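A hedged sketch of such delayed, timed starts, using Python's standard sched module; the script list, delays and heap size are illustrative, and HEAP is the environment variable Jmeter's startup script consults for its JVM memory:

```python
import os
import sched
import subprocess
import time

# Sketch: start several back-end API timing tasks after a configured delay,
# allocating a larger Jmeter JVM heap. Delays, paths and heap size are assumed.
scheduler = sched.scheduler(time.time, time.sleep)

def run_scenario(jmx_path: str) -> None:
    env = {**os.environ, "HEAP": "-Xms2g -Xmx2g"}  # enough running memory for Jmeter
    subprocess.Popen(
        ["/opt/apache-jmeter/bin/jmeter", "-n",
         "-t", jmx_path,
         "-l", jmx_path.replace(".jmx", ".jtl")],
        env=env,
    )

# Delay from the current moment to the start of each run, per scenario.
for delay_s, jmx in [(60, "/opt/perf/scripts/login_api.jmx"),
                     (120, "/opt/perf/scripts/order_api.jmx")]:
    scheduler.enter(delay_s, 1, run_scenario, argument=(jmx,))

scheduler.run()  # blocks until all scheduled scenarios have been launched
```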
By the above method, performance tests can be executed fully automatically; manual intervention across the whole performance test life cycle is minimal, day-to-day working efficiency is greatly improved, and the cost of performance testing is reduced.
In an alternative embodiment, as shown in FIG. 2, the method further includes, after step S130:
and step S140, performing selenium crawler processing on the performance report, and extracting key information of the performance report, wherein the key information comprises total statistics chart information, error chart information and key error chart information.
And step S150, carrying out logic judgment on the total statistics chart information, the error chart information and the key error chart information to obtain a measurement result of the performance test.
Step S160, storing the measurement result of the performance test in a corresponding database.
Specifically, after the automatic performance test finishes, the performance report generated by the Jmeter tool undergoes secondary development. Using a selenium crawler scheme in Python, key information is extracted from the performance report, mainly the following three kinds of chart information: total statistics (Statistics) chart information, error (Errors) chart information, and key error (Top 5 Errors by sampler) chart information.
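A minimal sketch of such a crawler, assuming a headless Chrome webdriver and that the dashboard's three tables can be located by element ids; the report path and the ids used below are assumptions about the report layout, not guaranteed by Jmeter:

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

# Sketch: scrape the three key tables from a Jmeter HTML report.
# File path and table ids are assumptions about the generated dashboard.
options = webdriver.ChromeOptions()
options.add_argument("--headless")
driver = webdriver.Chrome(options=options)
try:
    driver.get("file:///opt/perf/reports/order_api/index.html")
    key_info = {}
    for name, table_id in [("statistics", "statisticsTable"),
                           ("errors", "errorsTable"),
                           ("top5_errors", "top5ErrorsBySamplerTable")]:
        table = driver.find_element(By.ID, table_id)
        key_info[name] = [
            [cell.text for cell in row.find_elements(By.XPATH, "./th|./td")]
            for row in table.find_elements(By.TAG_NAME, "tr")
        ]
finally:
    driver.quit()
```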
Further, logical judgment by cross-comparing differences among the total statistics chart information, the error chart information and the key error chart information yields the measurement result of the performance test, and the measurement result of the performance test is stored into the corresponding database. The measurement result can include key performance indicators such as detailed performance index information for the various scenarios, the overall pass rate of the performance test, the error rate of the API indicators, and the response times at the various percentiles.
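The logical judgment itself can be as simple as threshold checks over the extracted indicators; the sketch below is one assumed form of it, with field names and thresholds chosen for illustration only:

```python
# Sketch of the pass/fail judgment over extracted indicators. The thresholds
# and field names are illustrative assumptions, not values fixed by the method.
def judge(stats: dict) -> dict:
    result = {
        "error_rate_ok": stats["error_rate"] <= 0.01,  # e.g. at most 1% errors
        "p90_ok": stats["p90_ms"] <= 800,              # e.g. 90th percentile under 800 ms
        "p99_ok": stats["p99_ms"] <= 2000,             # e.g. 99th percentile under 2 s
    }
    result["passed"] = all(result.values())            # overall measurement result
    return result

print(judge({"error_rate": 0.003, "p90_ms": 512, "p99_ms": 1420}))
```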
At the same time, the key information obtained through this secondary development is converted into the json data format, and the converted key information is stored into the corresponding MongoDB database by using an API (application program interface). Each piece of key information is processed to generate a test file, which is stored under the corresponding execution path, making it convenient for later reviewers to download it and examine the system's performance changes and optimization direction. In addition, a function of the performance analysis tool platform can be used to send the test file to a designated directory and generate a corresponding download link, so that a viewer can download and view it in the run details of the job program.
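A hedged sketch of this persistence step with pymongo; the connection URI, database and collection names, output path and sample payload are all assumptions:

```python
import json
from pymongo import MongoClient

# Sketch: write the converted key information as a json test file under the
# execution path, then store it in MongoDB through the driver API.
doc = {
    "report": "order_api",   # assumed report identifier
    "pass_rate": 0.98,
    "error_rate": 0.003,
    "p90_ms": 512,
}

# Test file saved to the corresponding execution path for later download.
with open("/opt/perf/scripts/order_api_keyinfo.json", "w", encoding="utf-8") as f:
    json.dump(doc, f, ensure_ascii=False, indent=2)

# Converted key information stored into the corresponding database.
client = MongoClient("mongodb://localhost:27017/")
client["perf_results"]["key_info"].insert_one(doc)
```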
In this way, day-to-day working efficiency is greatly improved, the cost of performance testing is reduced, and viewing and summarizing of the historical data in the database is supported, laying a foundation for making use of the various data later.
According to the Jmeter-based performance testing method provided in the embodiments of the present application, an instance environment is configured on a performance analysis tool platform, jmx scripts are written with the Jmeter tool, and running information is configured on the job program of the Jenkins tool; the jmx script is executed on the instance environment according to the running information and debugged; and the successfully debugged jmx script is sent to the pressure test server, the job program is run, the performance test is executed through the job program, and a performance report is output. With this method, performance tests can be executed fully automatically: manual intervention across the whole performance test life cycle is minimal, daily maintenance cost is low, day-to-day working efficiency is greatly improved, and the cost of performance testing is reduced.
Example 2
As shown in FIG. 3, which is a schematic structural diagram of a Jmeter-based performance testing apparatus 300 according to an embodiment of the present application, the apparatus includes:
a configuration module 310, configured to configure an instance environment on a performance analysis tool platform, write jmx scripts using the Jmeter tool, and configure running information on the job program of the Jenkins tool;
an execution module 320, configured to execute the jmx script on the instance environment according to the running information, and debug the jmx script;
and a test module 330, configured to send the successfully debugged jmx script to the pressure test server, run the job program, execute the performance test through the job program, and output a performance report.
Optionally, the Jmeter-based performance testing apparatus 300 further includes:
a first configuration sub-module, configured to configure an instance type of an instance environment on the performance analysis tool platform, where the instance type is used to specify a running memory of the Jmeter tool;
and the second configuration submodule is used for configuring the hard disk type of the instance environment on the performance analysis tool platform, wherein the hard disk type is used for the specific pressure scenario in which the performance test runs.
Optionally, the Jmeter-based performance testing apparatus 300 further includes:
the starting module is used for running the job program and starting, at fixed times, a plurality of back-end API performance timing tasks in the specific pressure scenario;
the instance module is used for instantiating a plurality of servers, carrying out the performance test on the test environment corresponding to each back-end API performance timing task through each server, and outputting a plurality of performance reports.
Optionally, the Jmeter-based performance testing apparatus 300 further includes:
the extraction module is used for carrying out selenium crawler processing on the performance report and extracting key information of the performance report, wherein the key information comprises total statistics chart information, error chart information and key error chart information;
the judging module is used for carrying out logic judgment on the total statistics chart information, the error chart information and the key error chart information to obtain a measurement result of the performance test;
and the storage module is used for storing the measurement result of the performance test into a corresponding database.
Optionally, the Jmeter-based performance testing apparatus 300 further includes:
the conversion module is used for converting the format of the key information into json data format and storing the converted key information into a corresponding database by using an API (application program interface);
and the processing module is used for processing each piece of key information, generating a test file and storing the test file into a corresponding execution path.
Optionally, the Jmeter-based performance testing apparatus 300 further includes:
and the sending module is used for sending the test file to a designated directory through the performance analysis tool platform and generating a corresponding download link.
The Jmeter-based performance testing device provided in the embodiments of the present application includes: the configuration module, used for configuring an instance environment on the performance analysis tool platform, writing jmx scripts with the Jmeter tool and configuring running information on the job program of the Jenkins tool; the execution module, used for executing the jmx script on the instance environment according to the running information and debugging the jmx script; and the test module, used for sending the successfully debugged jmx script to the pressure test server, running the job program, executing the performance test through the job program and outputting a performance report. With this device, performance tests can be executed fully automatically: manual intervention across the whole performance test life cycle is minimal, daily maintenance cost is low, day-to-day working efficiency is greatly improved, and the cost of performance testing is reduced.
Example 3
FIG. 4 shows a schematic hardware architecture of a computer device 400 provided in the present application. The computer device includes a memory 410, a processor 420 and a network interface 430; the network interface 430 may be a wireless or wired network interface and is generally used to establish a communication link between the computer device 400 and other computer devices. The memory 410 stores a computer program, and the processor 420, when executing the computer program, implements the steps of the Jmeter-based performance testing method described in embodiment 1:
Configuring an instance environment on a performance analysis tool platform, writing jmx scripts by utilizing a Jmeter tool, and configuring running information on a job program of the Jenkins tool;
executing the jmx script on the instance environment according to the running information, and debugging the jmx script;
and sending the successfully debugged jmx script to a pressure test server, running the job program, executing the performance test through the job program, and outputting a performance report.
Optionally, the running information includes a name, an execution path, an execution time and a corresponding instance environment identifier of the jmx script.
Optionally, the configuring the instance environment on the performance analysis tool platform includes:
configuring an instance type of an instance environment on the performance analysis tool platform, wherein the instance type is used for specifying a running memory of the Jmeter tool;
and configuring a hard disk type of an instance environment on the performance analysis tool platform, wherein the hard disk type is used for the specific pressure scenario in which the performance test runs.
Optionally, the running of the job program and the execution of the performance test by the job program include:
running the job program, and starting, at fixed times, a plurality of back-end API performance timing tasks in the specific pressure scenario;
and instantiating a plurality of servers, carrying out the performance test on the test environment corresponding to each back-end API performance timing task through each server, and outputting a plurality of performance reports.
Optionally, after outputting the performance report, the method further includes:
performing selenium crawler processing on the performance report, and extracting key information of the performance report, wherein the key information comprises total statistics chart information, error chart information and key error chart information;
performing logic judgment on the total statistical chart information, the error chart information and the key error chart information to obtain a measurement result of the performance test;
and storing the measurement result of the performance test into a corresponding database.
Optionally, after extracting the key information of the performance report, the method further includes:
converting the format of the key information into json data format, and storing the converted key information into a corresponding database by using an API (application program interface);
and processing each piece of key information to generate a test file, and storing the test file into a corresponding execution path.
Optionally, after processing the key information obtained by executing the performance test on each server to generate a test file, the method further includes:
and sending the test file to a designated directory through the performance analysis tool platform, and generating a corresponding download link.
It can be understood that the computer program in the computer device provided in embodiment 3 of the present invention can, when run on a processor, execute each step of the Jmeter-based performance testing method provided in embodiment 1 and achieve the same technical effects; to avoid repetition, the details are not repeated here.
Example 4
A computer readable storage medium is also provided in this embodiment; the computer readable storage medium may be a nonvolatile storage medium or a volatile storage medium, for example a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk. The computer-readable storage medium stores a computer program which, when run on a processor, performs the Jmeter-based performance testing method described in embodiment 1.
It can be understood that the computer program stored in the computer readable storage medium provided in embodiment 4 of the present invention is similar to the computer program in embodiment 3 and, when run on a processor, executes each step of the Jmeter-based performance testing method provided in embodiment 1 with the same technical effects; to avoid repetition, the details are not repeated here.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other manners as well. The apparatus embodiments described above are merely illustrative. For example, the flowcharts and block diagrams in the figures illustrate the architecture, functionality and operation of possible implementations of apparatuses, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in alternative implementations, the functions noted in a block may occur out of the order noted in the figures. For example, two blocks shown in succession may in fact be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special-purpose hardware-based systems which perform the specified functions or acts, or by combinations of special-purpose hardware and computer instructions.
In addition, functional modules or units in various embodiments of the invention may be integrated together to form a single part, or the modules may exist alone, or two or more modules may be integrated to form a single part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a smart phone, a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
The foregoing is merely illustrative of the present invention, and the present invention is not limited thereto; any variations or substitutions readily conceivable by a person skilled in the art shall fall within the scope of the present invention.

Claims (10)

1. A Jmeter-based performance testing method, the method comprising:
configuring an instance environment on a performance analysis tool platform, writing jmx scripts by utilizing a Jmeter tool, and configuring running information on a job program of the Jenkins tool;
executing the jmx script on the instance environment according to the running information, and debugging the jmx script;
and sending the successfully debugged jmx script to a pressure test server, running the job program, executing the performance test through the job program, and outputting a performance report.
2. The Jmeter-based performance testing method of claim 1, wherein the running information includes a name of the jmx script, an execution path, an execution time, and a corresponding instance environment identifier.
3. The Jmeter-based performance testing method of claim 2, wherein configuring the instance environment on the performance analysis tool platform comprises:
configuring an instance type of an instance environment on the performance analysis tool platform, wherein the instance type is used for specifying a running memory of the Jmeter tool;
and configuring a hard disk type of an instance environment on the performance analysis tool platform, wherein the hard disk type is used for the specific pressure scenario in which the performance test runs.
4. The Jmeter-based performance testing method according to claim 3, wherein the running of the job program and the execution of the performance test by the job program comprise:
running the job program, and starting, at fixed times, a plurality of back-end API performance timing tasks in the specific pressure scenario;
and instantiating a plurality of servers, carrying out the performance test on the test environment corresponding to each back-end API performance timing task through each server, and outputting a plurality of performance reports.
5. The Jmeter-based performance testing method according to claim 4, wherein after outputting the performance report, the method further comprises:
performing selenium crawler processing on the performance report, and extracting key information of the performance report, wherein the key information comprises total statistics chart information, error chart information and key error chart information;
performing logic judgment on the total statistical chart information, the error chart information and the key error chart information to obtain a measurement result of the performance test;
and storing the measurement result of the performance test into a corresponding database.
6. The Jmeter-based performance testing method according to claim 5, wherein after extracting the key information of the performance report, the method further comprises:
converting the format of the key information into json data format, and storing the converted key information into a corresponding database by using an API (application program interface);
and processing each piece of key information to generate a test file, and storing the test file into a corresponding execution path.
7. The Jmeter-based performance testing method according to claim 6, wherein after processing the key information obtained by performing the performance test on each of the servers to generate the test file, the method further comprises:
and sending the test file to a designated directory through the performance analysis tool platform, and generating a corresponding download link.
8. A Jmeter-based performance testing apparatus, the apparatus comprising:
the configuration module is used for configuring an instance environment on the performance analysis tool platform, writing jmx scripts with the Jmeter tool and configuring running information on the job program of the Jenkins tool;
the execution module is used for executing the jmx script on the instance environment according to the running information and debugging the jmx script;
and the test module is used for sending the successfully debugged jmx script to the pressure test server, running the job program, executing the performance test through the job program and outputting a performance report.
9. A computer device comprising a memory and a processor, the memory storing a computer program, the processor implementing the steps of the Jmeter-based performance testing method of any one of claims 1-7 when the computer program is executed.
10. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program which, when executed by a processor, implements the steps of the Jmeter based performance testing method of any of claims 1-7.
CN202310443355.7A 2023-04-13 2023-04-13 Performance test method, device, equipment and medium based on Jmeter Pending CN116383025A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310443355.7A CN116383025A (en) 2023-04-13 2023-04-13 Performance test method, device, equipment and medium based on Jmeter

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310443355.7A CN116383025A (en) 2023-04-13 2023-04-13 Performance test method, device, equipment and medium based on Jmeter

Publications (1)

Publication Number Publication Date
CN116383025A 2023-07-04

Family

ID=86980659

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310443355.7A Pending CN116383025A (en) 2023-04-13 2023-04-13 Performance test method, device, equipment and medium based on Jmeter

Country Status (1)

Country Link
CN (1) CN116383025A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117194155A (en) * 2023-09-05 2023-12-08 北京安锐卓越信息技术股份有限公司 Automatic performance pressure measurement method, device and medium


Similar Documents

Publication Publication Date Title
CN109302522B (en) Test method, test device, computer system, and computer medium
US11755919B2 (en) Analytics for an automated application testing platform
US8924933B2 (en) Method and system for automated testing of computer applications
US9697104B2 (en) End-to end tracing and logging
CN110750458A (en) Big data platform testing method and device, readable storage medium and electronic equipment
US20040153837A1 (en) Automated testing
US9582391B2 (en) Logistics of stress testing
CN111124919A (en) User interface testing method, device, equipment and storage medium
CN103049371A (en) Testing method and testing device of Android application programs
CN112241360B (en) Test case generation method, device, equipment and storage medium
CN106886493B (en) Method and device for establishing automatic test system
US10942837B2 (en) Analyzing time-series data in an automated application testing system
US20180357143A1 (en) Testing computing devices
US10459830B2 (en) Executable code abnormality detection
US10437717B2 (en) Defect reporting in application testing
WO2014088398A1 (en) Automated test environment deployment with metric recommender for performance testing on iaas cloud
US20180300229A1 (en) Root cause analysis of non-deterministic tests
US10528456B2 (en) Determining idle testing periods
CN112650676A (en) Software testing method, device, equipment and storage medium
CN111258913A (en) Automatic algorithm testing method and device, computer system and readable storage medium
CN112540924A (en) Interface automation test method, device, equipment and storage medium
CN106855844B (en) Performance test method and system
CN116383025A (en) Performance test method, device, equipment and medium based on Jmeter
CN110990289B (en) Method and device for automatically submitting bug, electronic equipment and storage medium
CN112749083A (en) Test script generation method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination