CN112115064A - Method for automatically configuring performance scene based on performance test requirement - Google Patents

Method for automatically configuring performance scene based on performance test requirement

Info

Publication number
CN112115064A
CN112115064A (application CN202011053695.1A)
Authority
CN
China
Prior art keywords
performance
template
scene
parameters
file
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011053695.1A
Other languages
Chinese (zh)
Inventor
张荣芸
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sichuan Changhong Electric Co Ltd
Original Assignee
Sichuan Changhong Electric Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sichuan Changhong Electric Co Ltd
Priority to CN202011053695.1A
Publication of CN112115064A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00: Error detection; Error correction; Monitoring
    • G06F11/36: Preventing errors by testing or debugging software
    • G06F11/3668: Software testing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention discloses a method for automatically configuring a performance scene based on performance test requirements, comprising the following steps: designing a configuration file template; constructing a performance scene template; determining the position of each parameter in the performance scene template file; constructing a configuration file; and generating the configured performance scene file. The method extracts the relevant information from the performance test requirement document, fills it into the configuration file, and uses a script to fill the configured values into the corresponding template, so that a configured performance scene can be generated and the efficiency of performance test scene configuration is greatly improved.

Description

Method for automatically configuring performance scene based on performance test requirement
Technical Field
The invention relates to the technical field of software testing, and in particular to a method for automatically configuring a performance scene based on performance test requirements.
Background
The most widely used commercial tool for performance testing is LoadRunner, developed by HP, while the open-source JMeter, supported by the Apache Software Foundation, is the most widely used free tool.
LoadRunner is a load-testing tool for predicting system behavior and performance. It identifies and isolates problems by simulating thousands of concurrent users and monitoring performance in real time. The Controller component is the control center of LoadRunner; it covers scene design and scene execution, and offers two kinds of test scenes: manually designed and goal-oriented. Manual scene design is generally used because the scene model can be designed flexibly as needed, so the scene better approximates real user behavior. The main steps of manual scene design are: load a performance test script into the Controller component; configure the load generators through Load Generators; set the run-time configuration of each script through Run-time Settings; and set the execution plan and schedule through the Scenario Schedule.
JMeter works on a principle similar to LoadRunner's: multiple users are simulated with multiple threads, and the complex load profiles of real users are reproduced by controlling how the threads start and run. The main steps of JMeter scene design are: add a Thread Group; configure the thread-concurrency parameters (Thread Properties); and, if there are multiple requests, create a thread group for each request and configure its parameters.
In both of these tools, scene configuration is performed manually in a window interface, which is tedious and involves many repeated operations:
first, a performance test scene may contain several different requests, and because the requests occur at different frequencies in a real scene, each one needs its own concurrency, interval time, running duration, and so on in order to simulate user behavior realistically;
second, different test stages or different test purposes may impose different performance requirements on the same test object, so the test scene must change accordingly and be configured all over again.
Disclosure of Invention
In order to solve the problems in the prior art, the invention aims to provide a method for automatically configuring a performance scene based on performance test requirements, a method that can greatly improve the efficiency of performance test scene configuration.
To achieve this purpose, the invention adopts the following technical scheme. A method for automatically configuring a performance scene based on performance test requirements comprises the following steps:
s1, designing a configuration file template: storing parameters to be configured in a listed performance test scene as a configuration file template according to a certain format;
s2, constructing a performance scene template: configuring and storing the listed parameters in an interface of a performance testing tool as a performance scene template file;
s3, determining the positions of the parameters in the performance scene template file: modifying the parameters in an interface of a performance testing tool in sequence and storing the parameters as a comparison file, opening the comparison file and a performance scene template file as texts, comparing the difference between the two files by using the comparison tool, determining the position of the modified parameters in the performance scene template file, recording text characteristics before and after the position of the parameters, adding the text characteristics as the attributes of each parameter in the configuration file template into the configuration file template, and updating the configuration file template;
s4, constructing a configuration file: designing a performance scene according to the performance test requirement, quantizing the performance scene into corresponding parameters, and filling the parameters into an updated configuration file template to construct a configuration file;
s5, generating a configured performance scene file: and circularly analyzing each parameter in the configuration file and the front and back text characteristics corresponding to each parameter in sequence, finding the position of the corresponding parameter by using a character processing method, replacing the parameter of the corresponding position in the performance scene template, and generating a configured performance scene.
As a further improvement of the present invention, the parameters to be configured in the performance test scene include: interval time at run time, number of concurrent users, and running duration.
As a further improvement of the present invention, in step S1 the parameters are organized into the configuration file template in a standard format, namely json or xml.
As a further improvement of the invention, the character-processing method is regular-expression matching.
As a further improvement of the invention, the performance testing tool is LoadRunner, JMeter, or any performance testing tool that saves the performance test scene in a text format.
The invention has the following beneficial effects:
the scene configuration file of LoadRunner and the jmx file of JMeter can both be opened as text files, and every configuration made in the interface can equally be achieved by modifying the text content. Therefore, by extracting the relevant information from the performance test requirement document, filling it into the configuration file, and writing a script that fills the configured values into the corresponding template, a configured performance scene can be generated, which greatly improves the efficiency of performance test scene configuration.
Drawings
FIG. 1 is a block flow diagram of an embodiment of the present invention.
Detailed Description
Embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
Examples
As shown in fig. 1, this embodiment aims to improve the efficiency of performance test scene configuration and provides a method for automatically configuring a performance scene based on performance test requirements. The embodiment is described in detail taking automatic configuration of the LoadRunner Controller as an example, with python3 as the programming language and json as the configuration file format. The method comprises the following steps:
s1, designing a configuration file template:
s11, list the parameters to be configured in the performance test scene (such as the path of each performance test script, the interval time of each script at run time, the concurrency of each script, and so on);
s12, organize the parameters into a json-format configuration file template according to their hierarchical relationship;
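As an illustration of steps s11 and s12, a minimal json configuration file template might look as follows (every key name here is a hypothetical choice; the patent does not prescribe a schema):

```python
import json

# Hypothetical template for step S1: one entry per performance test script,
# holding the parameters that step S4 will later fill in.
config_template = {
    "scripts": [
        {
            "path": "",           # performance test script path
            "interval": None,     # interval time at run time, in seconds
            "concurrency": None,  # number of concurrent virtual users
            "duration": None,     # running duration, in seconds
        }
    ]
}
print(json.dumps(config_template, indent=2))
```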
s2, constructing a Controller template:
s21, configure all the parameters listed in s11 in the Controller interface of LoadRunner;
s22, save the lrs file configured in s21 as the Controller template file;
s3, determining the position of each parameter in the Controller template file:
s31, modify the parameters of s11 one by one and save the result as another lrs file;
s32, open the lrs file of s31 and the Controller template file of S2 as text, compare the two with a comparison tool (such as Beyond Compare), and determine the position of each parameter modified in s31 within the Controller template file;
s33, record the text features before and after each parameter position confirmed in s32, and store them as regular expressions;
s34, add the regular expressions of s33 to the json configuration file template of s12 as attributes of each parameter, updating it into a more complete configuration file template;
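After s34 the template carries, for each parameter, both a value slot and the regular expression that locates it. A sketch of what the updated template might hold (the `Vusers=` and `ThinkTime=` text features are invented stand-ins; real lrs files use their own syntax):

```python
import re

# Hypothetical updated template after step s34: each parameter now has a
# "pattern" attribute, a regex built from the text features recorded in s33.
config_template = {
    "scripts": [
        {
            "concurrency": {"value": None, "pattern": r"(Vusers=)\d+"},
            "interval": {"value": None, "pattern": r"(ThinkTime=)\d+\.?\d*"},
        }
    ]
}

# The pattern can locate its parameter in template text (invented sample line):
match = re.search(config_template["scripts"][0]["concurrency"]["pattern"],
                  "Vusers=10")
print(match.group(0))  # Vusers=10
```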
s4, constructing a configuration file:
s41, design a performance scene according to the performance test requirements (for example, at least 50 users concurrently accessing the database, at least 100 database requests per second, and so on), and quantify it into corresponding parameters (for example, a concurrency of 50 for the database script, an interval time of 0.4 s for the database script at run time, and so on);
s42, fill the parameters obtained in s41 into the configuration file template completed in s34 to build the configuration file;
s5, generating a configured performance scene file:
s51, read in the Controller template constructed in S2;
s52, parse the parameter values in the configuration file of S4 in turn, together with the regular expressions describing each parameter's position;
s53, use the regular expression parsed in s52, through the sub function of python's re library, to locate the corresponding parameter in the Controller template read in s51, and replace the value in the original template with the parameter value read in s52;
s54, repeat s52 and s53 until every parameter has been replaced; the configured performance scene has then been generated.
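Steps s51 to s54 can be sketched as follows. The template content and the text features (`Vusers=`, `ThinkTime=`) are invented for illustration; only the use of `re.sub` to substitute each configured value at the position matched by its regular expression reflects the described method.

```python
import re

# Hypothetical Controller template text (step s51 reads this from the lrs
# file; real lrs syntax differs).
controller_template = "Vusers=10\nThinkTime=1.0\n"

# Configuration from step S4: each parameter carries its value and the regex
# from step S3 whose first group is the text feature preceding the value.
config = {
    "concurrency": {"value": 50, "pattern": r"(Vusers=)[\d.]+"},
    "interval": {"value": 0.4, "pattern": r"(ThinkTime=)[\d.]+"},
}

# Steps s52/s53: loop over the parameters and let re.sub rewrite each value,
# keeping the leading text feature (group 1) intact.
scene = controller_template
for param in config.values():
    scene = re.sub(param["pattern"],
                   lambda m, v=param["value"]: m.group(1) + str(v),
                   scene)
print(scene)
```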
The above embodiments only express specific implementations of the present invention; their description is relatively specific and detailed, but it should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the inventive concept, and these fall within the scope of the present invention.

Claims (5)

1. A method for automatically configuring a performance scene based on performance test requirements, characterized by comprising the following steps:
S1, designing a configuration file template: list the parameters to be configured in the performance test scene and save them in a fixed format as a configuration file template;
S2, constructing a performance scene template: configure the listed parameters in the interface of a performance testing tool and save the result as a performance scene template file;
S3, determining the position of each parameter in the performance scene template file: modify the parameters one by one in the interface of the performance testing tool and save the result as a comparison file; open the comparison file and the performance scene template file as text, compare the two with a comparison tool, and determine the position of each modified parameter in the performance scene template file; record the text features before and after each parameter position, add them to the configuration file template as attributes of the parameters, and update the configuration file template;
S4, constructing a configuration file: design a performance scene according to the performance test requirements, quantify it into corresponding parameters, and fill those parameters into the updated configuration file template to build a configuration file;
S5, generating the configured performance scene file: parse each parameter in the configuration file, together with the text features before and after it, in turn; locate the corresponding parameter with a character-processing method, replace the parameter at that position in the performance scene template, and generate the configured performance scene.
2. The method of claim 1, wherein the parameters to be configured in the performance test scene comprise: interval time at run time, number of concurrent users, and running duration.
3. The method for automatically configuring a performance scene based on performance test requirements of claim 1, wherein in step S1 the parameters are organized into the configuration file template in a standard format, namely json or xml.
4. The method of claim 1, wherein the character-processing method is regular-expression matching.
5. The method for automatically configuring a performance scene based on performance test requirements as claimed in any one of claims 1 to 4, wherein the performance testing tool is LoadRunner, JMeter, or any performance testing tool that saves the performance test scene in a text format.
CN202011053695.1A 2020-09-29 2020-09-29 Method for automatically configuring performance scene based on performance test requirement Pending CN112115064A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011053695.1A CN112115064A (en) 2020-09-29 2020-09-29 Method for automatically configuring performance scene based on performance test requirement

Publications (1)

Publication Number Publication Date
CN112115064A (en) 2020-12-22

Family

ID=73797360

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011053695.1A Pending CN112115064A (en) 2020-09-29 2020-09-29 Method for automatically configuring performance scene based on performance test requirement

Country Status (1)

Country Link
CN (1) CN112115064A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113254332A (en) * 2021-05-14 2021-08-13 山东英信计算机技术有限公司 Multi-scenario testing method, system, terminal and storage medium for storage system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107832207A (en) * 2017-10-16 2018-03-23 深圳市牛鼎丰科技有限公司 Interface performance test method, apparatus, storage medium and computer equipment
CN109871314A (en) * 2019-01-02 2019-06-11 石化盈科信息技术有限责任公司 The automatic generation method of test script
CN110427331A (en) * 2019-09-03 2019-11-08 四川长虹电器股份有限公司 The method for automatically generating performance test script based on interface testing tool
CN110597721A (en) * 2019-09-11 2019-12-20 四川长虹电器股份有限公司 Automatic interface pressure testing method based on pressure testing script
US10534701B1 (en) * 2019-06-17 2020-01-14 Capital One Services, Llc API driven continuous testing systems for testing disparate software

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Zhang Feng et al.: "Research on the Application of the LoadRunner Tool in Performance Testing", Computer Knowledge and Technology *

Similar Documents

Publication Publication Date Title
CN107704395B (en) Cloud platform automation test implementation method and system based on Openstack
US7373636B2 (en) Automated software testing system and method
CN105912474B (en) A kind of game on-line debugging method, system and editing service end
US11126938B2 (en) Targeted data element detection for crowd sourced projects with machine learning
US20100218168A1 (en) System and Method for Generating a Test Environment Script File
CN112597014B (en) Automatic test method and device based on data driving, medium and electronic equipment
CN112328489B (en) Test case generation method and device, terminal equipment and storage medium
CN108647147B (en) Automatic testing robot implemented by using atlas analysis and use method thereof
CN110399299A (en) The execution method of automated test frame and test case
CN110597721A (en) Automatic interface pressure testing method based on pressure testing script
CN111722873A (en) Code reconstruction method, device, equipment and medium
CN108874649A (en) Generation method, device and its computer equipment of automatic test script
CN112115064A (en) Method for automatically configuring performance scene based on performance test requirement
CN114297961A (en) Chip test case processing method and related device
CN113127312A (en) Method and device for testing database performance, electronic equipment and storage medium
CN114416547A (en) Test case based test method
CN117235527A (en) End-to-end containerized big data model construction method, device, equipment and medium
KR100369252B1 (en) Software test system and method
CN108205608B (en) Simulation device with configurable model
US20160246465A1 (en) Duplicating a task sequence from a graphical user interface interaction for a development application in view of trace data
CN113672509A (en) Automatic testing method, device, testing platform and storage medium
US20080022258A1 (en) Custom database system and method of building and operating the same
US8775873B2 (en) Data processing apparatus that performs test validation and computer-readable storage medium
CN116755983A (en) Method for automatically configuring performance scene based on performance test requirement
CN118503270B (en) NL2SQL data set construction method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20201222)