Performance index optimization method based on JMeter
Technical Field
The invention relates to the technical field of software testing, in particular to a performance index optimization method based on JMeter.
Background
Software performance testing receives increasing emphasis because software quality is the foundation of an industry's survival, and software performance must keep improving to serve a rapidly growing user base.
JMeter runs in a threaded mode: a test plan and the HTTP or WebSocket requests to be tested are configured, the two can be combined to simulate a test scenario when needed, and the performance results are finally analyzed through a monitoring module. If JMeter does not support the protocol under test, its source code can be recompiled and a customized protocol added to complete the performance test. JMeter can therefore analyze and measure the performance of web applications and various services; it is highly extensible, open source and multi-protocol, which ensures wide usability.
Currently, the deviation statistics in JMeter's summary report cover only interface response time, whereas a performance test also needs to analyze statistics such as TPS, CPU, I/O and error rate. Request-related indexes such as TPS, error rate and response time can be obtained from JMeter directly, but server-related indexes such as CPU, memory and Swap cannot. As a result, a JMeter performance test can only measure the request-related indexes directly, and the reported results are usually only averages, which introduces considerable deviation and can mislead the tester. A JMeter-based performance index optimization method with low deviation, high accuracy, safety and high efficiency is therefore urgently needed.
Disclosure of Invention
1. The technical problem to be solved is as follows:
The conventional JMeter performance test has three problems:
(1) JMeter can directly obtain request-related indexes such as TPS, error rate and response time, but cannot directly obtain server-related indexes such as CPU, memory and Swap, so the data JMeter measures directly is incomplete and the test result is one-sided;
(2) pressure test results deviate considerably because JMeter component configurations differ. A scenario in which several requests together realize one function requires JMeter to set a transaction controller so that the requests are treated as one transaction; if any one request of the transaction fails, JMeter counts its TPS as 0, even though the server has in fact processed several of the transaction's requests;
(3) the statistics in JMeter's aggregate report and summary report (error rate, response time, throughput, bandwidth, Swap, disk I/O and the like) are averages. Scenarios such as regular-expression extraction, parameter verification, variable assignment and controller use also influence the performance results, and the averaged statistics cannot reflect the degree of deviation in either case, so there is no objective basis for judging the authenticity and reliability of the test data, and the results are likely to mislead.
2. The technical scheme is as follows:
A performance index optimization method based on JMeter comprises: starting JMeter on a press (load generator) to send requests; receiving the data returned by the server; calculating a mean square error in JMeter and adjusting the JMeter configuration according to the result; then calculating the standard deviation of each group of data to obtain a standard deviation chart for each group; and judging the deviation degree of all data from the standard deviation charts.
As an improvement of the invention, the method comprises the following steps:
s1, a press equipped with JMeter sends simulated requests to the server to obtain pressure measurement data, the pressure measurement data comprising request-related indexes and server-related indexes;
s2, the request-related indexes are returned directly from the server to the press; JMeter calculates the mean square error of a group of TPS data among the request-related indexes, judges whether the mean-square-error value meets the requirement, and makes the corresponding adjustment to finally obtain the global JMeter configuration;
s3, under the global JMeter configuration, JMeter calculates the standard deviation of all data in each request-related index to obtain a first standard deviation chart;
s4, under the global JMeter configuration, the server returns the server-related indexes through the ServerAgent service, the monitoring plug-in Perfmon on the press exports the server-related indexes as a csv file, the csv file is imported into JMeter, and JMeter calculates the standard deviation of all data in each server-related index to obtain a second standard deviation chart;
and s5, the data deviation degree of the request-related indexes is judged from the first standard deviation chart and that of the server-related indexes from the second standard deviation chart, so that the reliability of the measured data is obtained from the deviation degree.
As an improvement of the present invention, in step S2, the specific process by which JMeter calculates the mean square error of a group of TPS data among the request-related indexes is as follows:
s2-1, JMeter first receives the request-related indexes returned directly from the server to the press, obtains a group of TPS data directly through the SampleResult class, calculates the test value of that group of TPS data via the source-code getRate() method, and stores the actual test value of each request in the SampleResult class;
s2-2, the number of requests underlying each actual TPS test value under the current request is obtained with the SamplingStatCalculator, and the true value E(xn) of the group of data is calculated with formula (1) in calculatorExpectation():
E(xn)=COUNT(request)/T (1)
wherein COUNT(request) refers to the accumulated number of occurrences of each request in the actual test values, and T refers to the total response time, which runs from the start time of the first request in the test values to the end time of the last request;
s2-3, the JMeter aggregate-report class StatGraphVisualizer is called, and the mean square error is calculated with formula (2) in getExpectation(), using the E(xn) obtained in S2-2 and the TPS test values:
sqrt(((X1-E(x1))^2 + (X2-E(x2))^2 + ... + (Xn-E(xn))^2)/(n-1)) (2)
where Xn refers to the actual test value of the nth TPS sample calculated by JMeter, and E(xn) refers to the value obtained in S2-2.
As an improvement of the present invention, in step S2, the specific process of judging whether the mean-square-error value meets the requirement and making the corresponding adjustment is:
s2-4, the required mean-square-error range is first preset to [0, 2] and the value obtained in S2-3 is examined. If the value lies between 0 and 2 inclusive, the JMeter parameter and component configuration corresponding to it is selected and set as the global JMeter configuration; if the value is greater than 2, the JMeter parameters and component configuration are adjusted, the requests are sent again, and the mean square error of the group of TPS data is recalculated until the value meets the requirement, whereupon the corresponding JMeter parameter and component configuration is set as the global JMeter configuration.
As an improvement of the present invention, in step S3, the specific process by which JMeter calculates the standard deviation of all data in each request-related index to obtain the first standard deviation chart is as follows:
s3-1, the test value of each request-related index is obtained directly through the SampleResult class; when calculatorStandard() is called to calculate the mean of all samples of the request-related index, the test values are passed to the StatCalculator() method provided by JMeter, and the standard deviation is then calculated from the sample mean and the test values with formula (3) in calculatorStandard():
sqrt(((X1-E(n))^2 + (X2-E(n))^2 + ... + (Xn-E(n))^2)/(n-1)) (3)
wherein Xn is the actual test value of the nth sample of the request-related index calculated by JMeter, and E(n) is the mean of all samples;
s3-2, multiple requests are sent for each request-related index to obtain multiple groups of data, and the standard deviation calculation is repeated for each group;
s3-3, a standardGraph class is added to JMeter; after the run finishes, the calculation results of all requests with the same name are passed to the createServerGraph() method of the standardGraph class, and several groups of first standard deviation charts of the request-related indexes are exported.
As an improvement of the invention, each index of the request related indexes refers to TPS, error rate and Response Time (RT).
As an improvement of the present invention, in step S4, the specific process by which JMeter calculates the standard deviation of all data in the server-related indexes to obtain the second standard deviation chart is as follows:
s4-1, the server-related indexes are collected on the press: each server-related index is obtained with Perfmon and exported as a csv file;
s4-2, a file-upload entry is added to the JMeter GUI, and all csv data obtained in S4-1 are imported;
s4-3, DealWithServerData() in the SampleResult class is called to read the csv data imported in S4-2 and organize it into a Map: all test values are screened out and stored in the set;
s4-4, the set of server-related index test values in the Map from S4-3 is obtained as a parameter, the StatCalculator method of JMeter is called, the mean of all samples of each server-related index is calculated from the test values, and the standard deviation is then calculated from the sample mean and the test values with formula (3) in calculatorStandard():
sqrt(((X1-E(n))^2 + (X2-E(n))^2 + ... + (Xn-E(n))^2)/(n-1)) (3)
wherein Xn refers to the actual test value of the nth sample calculated by JMeter, and E(n) refers to the mean of all samples;
s4-5, a plurality of requests are sent for each server-related index to obtain a plurality of groups of data, and the standard deviation calculation of S4-4 is repeated for each group;
and s4-6, after the run finishes, the groups of calculation results are passed to the standardGraph class, an html report is output by the createServerGraph() method, and several groups of second standard deviation charts of the server-related indexes are exported.
As an improvement of the invention, the recorded content of the csv file comprises a timestamp, a monitoring index, a success status and a monitoring value.
As an improvement of the invention, each index of the related indexes of the server refers to CPU, memory, Swap, disk I/O and network I/O.
As an improvement of the present invention, in step S5, the standard-deviation range for the measured data is preset to [0, 1]; when the standard deviation of a group of measured data is within this preset range, the group is judged to be highly reliable, and when it is not, the group is judged unreliable.
3. Advantageous effects:
Compared with the existing processing of performance test index data in JMeter, the method has the following advantages:
(1) the monitoring plug-in Perfmon is added to monitor server resources in real time and export them as a csv file, which is imported into JMeter; JMeter thus obtains the server-related indexes, the test data are more complete, and the test results are more comprehensive;
(2) a mean-square-error calculation item is provided for the JMeter performance-test index data; the item reflects whether JMeter's own configuration influences the test values and gives the tester a basis for adapting that configuration, thereby reducing the influence of the JMeter configuration on the deviation of the results and avoiding invalid performance data;
(3) to judge whether the test results of the request-related and server-related indexes are relatively authentic and reliable, the standard deviation is calculated for several groups of results of each index so as to analyze the deviation degree of the measured data. If the standard deviation of a certain group is obviously larger than that of the other groups, that group can be inferred to be unreliable; the selection of misleading data groups is thereby avoided, every index is optimized, an objective basis is provided for judging the authenticity and reliability of every index, and the test scheme is optimized.
Drawings
FIG. 1 is a schematic view of the environment deployment and flow of the present invention;
FIG. 2 is a timing diagram of the mean square error of the computed TPS;
FIG. 3 is a timing diagram illustrating standard deviation of metrics associated with a computing request;
FIG. 4 is a timing diagram illustrating standard deviation of metrics associated with a computing server;
fig. 5 is a standard deviation graph of the CPU.
Detailed Description
The invention is further described below with reference to the following figures and examples:
the invention provides a performance index optimization method based on JMeter, which can reduce the influence of JMeter self-configuration on a test value, provide a basis for adapting the JMeter self-configuration for a tester, and also provide an objective judgment basis for judging whether a test result of pressure test data is real and reliable, so that the authenticity and the reliability of the pressure test result are presented in a more intuitive way.
The optimization method comprises the steps of starting a JMeter from a press to send a request, then returning data from a server, then calculating a mean square error through a JMeter system, adjusting the configuration of the JMeter according to a calculation result, then calculating a standard deviation of each data to obtain a standard deviation chart of each data, and judging the deviation degree of all data according to the standard deviation chart. The method specifically comprises the following steps:
s1, as shown in fig. 1, a press equipped with JMeter first sends simulated requests to the server to obtain pressure measurement data, the pressure measurement data comprising request-related indexes and server-related indexes: the request-related indexes are transactions per second (TPS), error rate and response time (RT); the server-related indexes comprise CPU, memory, Swap, disk I/O and network I/O.
S2, to determine whether JMeter's parameters, component configuration and calls influence the result, the degree of deviation is first evaluated with the mean square error, which reflects the difference between the calculated test values and the true value. The test values in this step are produced by JMeter, and their size is affected by the environment, the configuration and server stability; the calculated mean square error is used to judge whether the JMeter configuration influences the result. The acceptable mean-square-error range is therefore set to [0, 2], an interval chosen to tolerate the influence of the environment and of server stability on the mean square error, so that test values unaffected by the JMeter configuration are obtained. In this step, after the request-related indexes are returned directly from the server to the press, JMeter calculates the mean square error of a group of TPS data among the returned request-related indexes:
s2-1, as shown in fig. 2, the SamplingStatCalculator method is first called to calculate the true value E(xn). JMeter receives the request-related indexes returned directly from the server to the press; because TPS is a request-related index, its parameters can be obtained directly through the SampleResult class. JMeter calculates the test value of the group of TPS data via the source-code getRate() method, and the actual test value of each request is stored in the SampleResult class;
s2-2, in JMeter all requests with the same name share one calculation object, the SamplingStatCalculator method, so the calculation results can be accumulated. The number of requests underlying each actual TPS test value under the current request is obtained with the SamplingStatCalculator, and the true value E(xn) of the group of data is calculated with formula (1) in calculatorExpectation(); the value of E(xn) is then used in the mean-square-error calculation:
E(xn)=COUNT(request)/T (1)
wherein COUNT(request) refers to the accumulated number of occurrences of each request in the actual test values, and T refers to the total response time, which runs from the start time of the first request in the actual test values to the end time of the last request;
s2-3, the JMeter aggregate-report class StatGraphVisualizer is called, and the mean square error is calculated with formula (2) in getExpectation(), using the E(xn) obtained in S2-2 and the TPS test values:
sqrt(((X1-E(x1))^2 + (X2-E(x2))^2 + ... + (Xn-E(xn))^2)/(n-1)) (2)
wherein Xn refers to the actual test value of the nth TPS sample calculated by JMeter, and E(xn) refers to the value obtained in S2-2;
in this step, each request may call the add method of the StatGraphVisualizer class to update the statistics.
S2-4, after the statistical result is obtained, the required mean-square-error range is preset to [0, 2] and the value obtained in S2-3 is examined. If the value lies between 0 and 2 inclusive, the JMeter parameter and component configuration corresponding to it is selected and set as the global JMeter configuration; if the value is greater than 2, the parameters and component configuration are adjusted, the requests are sent again, and the mean square error of the group of TPS data is recalculated until it meets the requirement, whereupon the corresponding configuration is set as the global JMeter configuration. In other embodiments of the invention the mean square error may instead be calculated for other request-related indexes, such as error rate or RT. The JMeter parameters and component configuration can be adjusted by adding or removing components and repeating the procedure until the expected mean-square-error value is reached. Take writing a script with JMeter as an example. Script A is set up as follows: request 1 and request 2 each add an assertion component to check the response. Script A is run once, each request executing once according to the configured strategy, and the mean square error of the TPS is obtained after the run. Suppose the resulting value is 5, outside the range [0, 2]; the script configuration is then modified once.
Script A is modified as follows: request 1 no longer adds the assertion component, while request 2 keeps it and checks the response, so assertion use is reduced. The modified script is denoted script B. The run-and-calculate procedure is repeated for script B. If the mean square error is within the preset interval, the influence of the script's parameters and of component configuration and calls on the test result is considered acceptable. If not, the assertion configuration of request 2 is further reduced and the calculation steps repeated, and so on, until the mean square error falls within [0, 2].
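The true-value and mean-square-error calculation of S2-1 to S2-4 can be sketched as follows. This is an illustrative Python sketch, not JMeter's Java source; the function names and the sample data are hypothetical, and the [0, 2] acceptance check follows S2-4:

```python
import math

def true_value_tps(request_count, total_response_time):
    """Formula (1): E(xn) = COUNT(request) / T."""
    return request_count / total_response_time

def mean_square_error(test_values, true_values):
    """Formula (2): sqrt(sum((Xi - E(xi))^2) / (n - 1))."""
    n = len(test_values)
    ss = sum((x - e) ** 2 for x, e in zip(test_values, true_values))
    return math.sqrt(ss / (n - 1))

# Hypothetical sample: five TPS test values reported by JMeter, and the
# corresponding true values derived from (request count, total time) pairs.
tests = [98.0, 101.0, 99.5, 102.0, 100.5]
truths = [true_value_tps(c, t) for c, t in
          [(980, 9.8), (1010, 10.1), (995, 9.95), (1020, 10.2), (1005, 10.05)]]
mse = mean_square_error(tests, truths)

# Per S2-4: keep the current JMeter configuration only if 0 <= mse <= 2;
# otherwise adjust parameters/components and re-run.
configuration_ok = 0 <= mse <= 2
```

With this data every true value is 100 TPS, so the mean square error stays within [0, 2] and the configuration would be accepted as the global configuration.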
S3, the standard deviation measures the difference between the calculated test values and their mean. Each group of data uses the same calculation; the standard deviation of each group is obtained with formula (3) below, and the results can be displayed graphically to show intuitively which group of data is more reliable. Unlike the mean-square-error calculation, no separate true value is needed here: all index items are configured in JMeter, and running the JMeter script returns the test values. The size of the test values is affected by the environment, the configuration and server stability, and hence so is the standard deviation. Multiple groups of standard deviations must therefore be calculated for the same index; comparing one group's value with the others, the closer it is to zero, the more reliable that group of data. The mean square error of step S2 was calculated precisely to reduce the influence of the JMeter configuration on the test values, so under the pre-set global configuration that influence is negligible. Accordingly, under the global JMeter configuration, JMeter calculates the standard deviation of all data in each request-related index to obtain the first standard deviation chart:
s3-1, as shown in fig. 3, the SamplingStatCalculator class is called and the standard deviation is calculated with calculatorStandard(). The test value of each request-related index is first obtained directly through the SampleResult class; when calculatorStandard() is called to calculate the mean of all samples of the request-related index, the test values are passed to the StatCalculator() method provided by JMeter, and the standard deviation is then calculated from the sample mean and the test values with formula (3) in calculatorStandard():
sqrt(((X1-E(n))^2 + (X2-E(n))^2 + ... + (Xn-E(n))^2)/(n-1)) (3)
wherein Xn is the actual test value of the nth sample of the request-related index calculated by JMeter, and E(n) is the mean of all samples;
s3-2, multiple requests are sent for each request-related index to obtain multiple groups of data, and the standard deviation calculation is repeated for each group;
s3-3, a standardGraph class is added to JMeter; after the run finishes, the calculation results of all requests are passed to the createServerGraph() method of the standardGraph class, and several groups of first standard deviation charts of the request-related indexes are exported to display the standard deviations and graphs of those indexes. The request-related indexes are TPS, error rate and response time (RT).
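The per-group calculation of S3-1 and S3-2 amounts to the sample standard deviation of formula (3). A minimal sketch (illustrative Python; the group names and response-time samples are hypothetical):

```python
import math

def sample_std(values):
    """Formula (3): sqrt(sum((Xi - E(n))^2) / (n - 1)), E(n) = sample mean."""
    n = len(values)
    mean = sum(values) / n
    return math.sqrt(sum((x - mean) ** 2 for x in values) / (n - 1))

# Hypothetical: three groups of response-time samples (ms) from repeated runs
# of the same request, as in S3-2.
groups = {
    "run-1": [120.0, 118.0, 122.0, 121.0],
    "run-2": [119.0, 120.0, 121.0, 120.0],
    "run-3": [90.0, 150.0, 110.0, 140.0],  # noticeably more scattered
}
stds = {name: sample_std(vals) for name, vals in groups.items()}
# The group whose standard deviation is clearly larger than the others
# ("run-3" here) is the one inferred to be unreliable in S3/S5.
```

Comparing the three values makes the chart's purpose concrete: the scattered group stands out immediately, which is exactly what the first standard deviation chart visualizes.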
S4, as shown in fig. 1, obtaining the server-related index data requires monitoring server performance with the JMeter performance-monitoring plug-in Perfmon installed on the press. Perfmon monitoring of the server-related indexes is added in JMeter, and the server side starts the ServerAgent service to monitor server resources in real time; see fig. 4 for the timing of calculating the standard deviation of the server-related indexes. The index data are first returned through the agent service: under the global JMeter configuration, the server returns the server-related indexes through the ServerAgent service, the monitoring plug-in Perfmon on the press exports the server-related indexes as a csv file, the csv file is imported into JMeter, and JMeter calculates the standard deviation of all data in each server-related index to obtain the second standard deviation chart:
s4-1, the server-related indexes are collected on the press: each server-related index is obtained with Perfmon and exported as a csv file whose records comprise a timestamp, the monitored index, a success status and the monitored value;
s4-2, a file-upload entry is then added to the JMeter GUI, and all csv data obtained in S4-1 are imported;
s4-3, DealWithServerData() in the SampleResult class is called to read the csv data imported in S4-2 and organize it into a Map: all test values are screened out and stored in the set;
s4-4, the set of server-related index test values in the Map from S4-3 is obtained as a parameter, the StatCalculator method of JMeter is called, the mean of all samples of each server-related index is calculated from the test values, and the standard deviation is then calculated from the sample mean and the test values with formula (3) in calculatorStandard():
sqrt(((X1-E(n))^2 + (X2-E(n))^2 + ... + (Xn-E(n))^2)/(n-1)) (3)
wherein Xn refers to the actual test value of the nth sample calculated by JMeter, and E (n) refers to the average value of all samples;
s4-5, sending a plurality of requests to the relevant indexes of each server to obtain a plurality of groups of data, and repeatedly carrying out standard deviation calculation of S4-4 on the plurality of groups of data;
and s4-6, after the run finishes, the groups of calculation results are passed to the standardGraph class, an html report is output by the createServerGraph() method, and several groups of second standard deviation charts of the server-related indexes are exported.
Each index of the relevant indexes of the server refers to CPU, memory, Swap, disk I/O and network I/O.
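The csv handling of S4-2 to S4-4 (reading the exported rows, grouping the monitored values per index in a map, then applying formula (3)) can be sketched as follows. The row layout follows the record content named above (timestamp, monitored index, success status, monitored value), but the concrete rows and the helper names are hypothetical, not JMeter's or Perfmon's actual code:

```python
import csv
import io
import math
from collections import defaultdict

def sample_std(values):
    """Formula (3): sqrt(sum((Xi - E(n))^2) / (n - 1))."""
    n = len(values)
    mean = sum(values) / n
    return math.sqrt(sum((x - mean) ** 2 for x in values) / (n - 1))

# Hypothetical Perfmon-style export: timestamp, index, status, value.
raw = """1700000000,CPU,true,35.0
1700000001,CPU,true,37.0
1700000002,CPU,true,36.0
1700000000,Memory,true,61.5
1700000001,Memory,true,62.5
1700000002,Memory,false,0.0
"""

def deal_with_server_data(text):
    """Rough analogue of the DealWithServerData() step: screen out the
    test values of successful samples and store them per index in a map."""
    metrics = defaultdict(list)
    for ts, metric, status, value in csv.reader(io.StringIO(text)):
        if status == "true":  # keep successful samples only
            metrics[metric].append(float(value))
    return metrics

metrics = deal_with_server_data(raw)
server_stds = {m: sample_std(v) for m, v in metrics.items()}
```

Each entry of `server_stds` is one group's value for the second standard deviation chart; the failed Memory sample is screened out before the statistics are computed.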
S5, the data deviation degree of the request-related indexes is judged from the first standard deviation chart, and that of the server-related indexes from the second standard deviation chart, so that the reliability of the measured data is obtained from the deviation degree:
In step S5, the standard-deviation range for the measured pressure data is preset to [0, 1]. When the standard deviation of a group of measured data is within this preset range, the group is judged to be highly reliable; when it is not, the group is judged unreliable. With this objective criterion, the authenticity and reliability of the measured data are judged objectively and every index is substantially optimized.
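The S5 decision rule reduces to a threshold check on each group's standard deviation. A minimal sketch (illustrative Python; the [0, 1] bound comes from the text, while the group names and values are hypothetical):

```python
def reliable(std, low=0.0, high=1.0):
    """S5 rule: a group is judged highly reliable iff its standard
    deviation lies in the preset range [0, 1]."""
    return low <= std <= high

# Hypothetical standard deviations for three groups of measured data.
group_stds = {"group-1": 0.4, "group-2": 0.9, "group-3": 3.2}
verdicts = {g: reliable(s) for g, s in group_stds.items()}
# group-3 falls outside [0, 1] and is judged unreliable.
```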
The above description is only of the preferred embodiments of the present invention, and it should be noted that: it will be apparent to those skilled in the art that various modifications and adaptations can be made without departing from the principles of the invention and these are intended to be within the scope of the invention.