CN113138917A - Performance test platform - Google Patents
- Publication number
- CN113138917A, CN202110400763.5A, CN202110400763A
- Authority
- CN
- China
- Prior art keywords
- test
- server
- management module
- platform
- demand
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Preventing errors by testing or debugging software
- G06F11/3664—Environments for testing or debugging software
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Preventing errors by testing or debugging software
- G06F11/3668—Software testing
- G06F11/3672—Test management
- G06F11/3684—Test management for test design, e.g. generating new test cases
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Preventing errors by testing or debugging software
- G06F11/3668—Software testing
- G06F11/3672—Test management
- G06F11/3692—Test management for test results analysis
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Computer Hardware Design (AREA)
- Quality & Reliability (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Debugging And Monitoring (AREA)
Abstract
The invention relates to a performance test platform comprising: a demand analysis module for parsing key knowledge points from a test application form to obtain demand points; a machine resource management module for acquiring server information and, from it, automatically collecting the server's resource information and key configuration parameters; a result analysis module for analyzing test result data and generating a report; and a report management module for storing reports. The demand analysis module can parse key knowledge points from the test application form to obtain demand points, giving the test requirements an analysis and refinement step, so that testers have more energy to concentrate on platform development quality, test result analysis and performance optimization.
Description
Technical Field
The invention relates to the technical field of computers, in particular to a performance test platform.
Background
As service systems and functional modules multiply, performance testing tasks increase and the testing workload grows. Testers spend a great deal of effort on large amounts of repetitive work, which squeezes out the time they have available to attend to system quality.
Chinese patent publication No. CN109960619A discloses a performance test platform and method. The performance test platform includes a performance test console and a background service platform, and the console at least includes: a test requirement management module for managing test requirements and acquiring standardized project test information; a machine resource management module for interfacing with a resource pool management system and applying to it for machine resources; and a test report management module for analyzing and processing the received test data in real time, displaying the corresponding data and storing the final result in a database. That platform, however, has no analysis module for the submitted test requirements: it cannot extract key knowledge points from them, merely standardizes them, and has no analysis and refinement process for them.
Disclosure of Invention
The invention aims to provide a performance test platform that solves the following technical problems: the performance test platform in the prior art has no analysis module for the submitted test requirements, cannot extract key knowledge points from them, merely standardizes them, and has no analysis and refinement process for them; in addition, the existing performance test platform lacks an execution scenario management module and cannot automatically generate a performance test scenario (i.e., a test case) from the analyzed requirements and the server resource information.
In order to solve the technical problems, the invention adopts the following technical scheme: a performance testing platform, comprising:
the demand analysis module is used for analyzing key knowledge points from the test application form to obtain demand points;
the machine resource management module is used for acquiring server information and automatically acquiring resource information and key configuration parameters of the server according to the server information;
the result analysis module is used for analyzing the data of the test result and generating a report;
and the report management module is used for storing reports.
Preferably, the system further comprises an execution scenario management module, which is used for managing the execution scenario and forming a test scenario according to the setting information, the demand point and the machine resource management module data.
Preferably, the platform further comprises an environment management module for managing the pressure test environment.
Preferably, the system further comprises a monitoring management module, configured to acquire the execution situation of the scene.
Preferably, the content on the test application form includes information of a tester, a project name, a version number, test time, expected completion time, and a test department, and also includes a test type, a test target, and a product/system design index.
Preferably, the server information includes the IP address, account and password of the server; the resource information of the server comprises CPU, memory (MEM), disk and operating system information.
Preferably, the setting information includes the set execution time, start strategy, stop strategy, number of concurrent users, and preconditions;
the number of concurrent users defaults to the number of logical cores of the application server in a load test, and to twice the number of logical cores in a capacity test;
the default start strategy in a load test is to start 20% of the users every 30 s;
the stop strategy is to stop 50% of the users every 30 s;
in a load test, execution runs for 15 minutes after start-up, and in a capacity scenario for 10 minutes.
Preferably, the pressure test environment is a JMeter environment, created by uploading a JMeter version and its dependency files.
Preferably, the execution conditions comprise TPS and response time; the monitoring management module is also used for acquiring the utilization rate of server resources.
Preferably, the result analysis module analyzes the TPS and response time data in conjunction with the pressure measurement server and application server resources.
With this technical scheme, the beneficial technical effects of the invention are as follows: the demand analysis module can parse key knowledge points from the test application form to obtain demand points, giving the test requirements an analysis and refinement step, so that testers have more energy to concentrate on platform development quality, test result analysis and performance optimization; the execution scenario management module can automatically generate a performance test scenario (i.e., a test case) from the test requirements and the server resource information; and the result analysis module analyzes the TPS and response time data in combination with the pressure test server and application server resources.
Drawings
FIG. 1 is a scene list;
FIG. 2 is a diagram of the environment management module creating a jmeter environment interface;
FIG. 3 is a monitor management module interface;
FIG. 4 is the report management module.
Detailed Description
The invention will be further explained with reference to the drawings.
The performance testing platform provides a demand analysis module, which parses key knowledge points from a test application form to obtain demand points. The test application form contains tester information, the project name, version number, test time, expected completion time, test department, test type, test target, and product/system design indexes. After parsing the key knowledge points, the demand analysis module assembles them into a simple test scenario for use by the subsequent execution scenario management module. The demand analysis module is detailed below:
uploading a performance test application form, wherein the application form comprises the following knowledge points:
function point | Content providing method and apparatus |
Type of test | Load testing |
Test function point | Order inquiry and commodity inquiry |
Service index | TPS:1000ART<1s |
Server resource index | CPU<75% |
According to the test application form, the platform's analysis automatically generates the corresponding scenario.
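As a rough illustration of this parsing step (the patent does not give an implementation; the form format, field names, and function names below are all assumptions), the demand analysis might reduce the application form to structured demand points like this:

```python
# Hypothetical sketch of the demand-analysis step: parse the key knowledge
# points of an uploaded test application form into structured demand points.
# The "key: value" form format and all names are assumptions.

def parse_application_form(form_text: str) -> dict:
    """Parse 'key: value' lines of a test application form into demand points."""
    demand = {}
    for line in form_text.splitlines():
        if ":" not in line:
            continue  # skip lines that carry no knowledge point
        key, value = line.split(":", 1)
        demand[key.strip().lower().replace(" ", "_")] = value.strip()
    return demand

form = """\
test type: load testing
test function point: order inquiry, commodity inquiry
service index: TPS 1000, ART < 1s
server resource index: CPU < 75%"""

points = parse_application_form(form)
print(points["test_type"])      # load testing
print(points["service_index"])  # TPS 1000, ART < 1s
```

The resulting dictionary of demand points would then feed the execution scenario management module described below in the text.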
the platform provides a machine resource management module, provides functions of uploading IP (Internet protocol), account numbers and passwords of the application server, can upload the IP, the account numbers and the passwords singly, and also supports batch uploading. The platform automatically acquires the resource information and the key configuration parameters of the application server according to the uploaded IP and other information, and can also acquire the service condition of the server resource in real time. The application service resource information comprises a CPU, a MEM, a disk and an operating system.
The platform provides an execution scenario management module, which may also be called a test case management module. It accepts the necessary input parameters: execution time, start strategy, stop strategy, number of concurrent users and preconditions. The platform generates a test scenario from the set parameters, the demand analysis result and the server resource information acquired by the machine resource management module, and can also generate a test plan from a predefined test plan template.
The platform provides an environment management module that currently supports managing a JMeter environment: a runnable JMeter environment is created by uploading a JMeter version and a list of dependency files. The platform automatically configures JMeter on the pressure test server.
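The patent only says that JMeter is configured on the pressure test server; how scenarios are launched is not described. As an illustration, a platform like this would typically run the uploaded script in JMeter's standard non-GUI mode. The paths and file names below are hypothetical; only the `-n`/`-t`/`-l` flags are standard JMeter options.

```python
# Build a standard JMeter non-GUI invocation for an uploaded test script.
# jmeter_home / script / result_file values are made up for illustration.
from pathlib import Path

def jmeter_command(jmeter_home: str, script: str, result_file: str) -> list:
    return [
        str(Path(jmeter_home) / "bin" / "jmeter"),
        "-n",               # non-GUI mode, as used on a pressure test server
        "-t", script,       # test plan (.jmx) recorded and uploaded by the user
        "-l", result_file,  # raw sample log (.jtl) for later result analysis
    ]

cmd = jmeter_command("/opt/jmeter-5.4", "order_query.jmx", "run1.jtl")
print(" ".join(cmd))
```

The `.jtl` result file produced this way is the kind of raw data the result analysis module would consume.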
The platform provides a monitoring management module that can acquire the execution conditions of a scenario, such as TPS (transactions per second) and response time, and can also obtain the application server's resource utilization through the monitoring module.
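A minimal sketch of the two metrics the monitoring module reports, computed from raw per-request samples over a time window (the function and field names are assumptions; the patent does not define the calculation):

```python
# Given elapsed times (ms) of requests completed in a sampling window,
# derive the TPS and average response time shown by the monitoring module.

def window_metrics(elapsed_ms: list, window_s: float) -> dict:
    n = len(elapsed_ms)
    return {
        "tps": n / window_s,                              # transactions per second
        "avg_rt_ms": sum(elapsed_ms) / n if n else 0.0,   # average response time
    }

m = window_metrics([100.0, 200.0, 300.0], window_s=2.0)
print(m["tps"])        # 1.5
print(m["avg_rt_ms"])  # 200.0
```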
The platform provides a result analysis module: the platform automatically analyzes the test result data and displays abnormal parts marked in red; once the analysis result has been entered manually, a report can be generated.
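A hedged sketch of the automatic check behind the red marking: compare the measured TPS, average response time and CPU use against the indexes taken from the application form and collect anything out of range. The threshold values follow the example table above; the function and field names are assumptions.

```python
# Flag results that violate the service and resource indexes
# (TPS >= 1000, ART < 1 s, CPU < 75% in the example form).

def analyse(result: dict, index: dict) -> list:
    abnormal = []
    if result["tps"] < index["min_tps"]:
        abnormal.append(f"TPS {result['tps']} below target {index['min_tps']}")
    if result["art_s"] >= index["max_art_s"]:
        abnormal.append(f"ART {result['art_s']} s not below {index['max_art_s']} s")
    if result["cpu_pct"] >= index["max_cpu_pct"]:
        abnormal.append(f"CPU {result['cpu_pct']}% not below {index['max_cpu_pct']}%")
    return abnormal

index = {"min_tps": 1000, "max_art_s": 1.0, "max_cpu_pct": 75}
flags = analyse({"tps": 950, "art_s": 0.8, "cpu_pct": 80}, index)
print(len(flags))  # 2  (TPS too low, CPU too high; ART is within range)
```

In the platform described by the patent, such flagged entries would be the parts displayed in red, with the tester adding a written analysis before the report is generated.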
The platform also provides a report management module for storing performance test reports. The result analysis module analyzes the TPS and response time data in combination with the pressure test server and application server resources.
The technical solution of the invention is described clearly and completely below from the point of view of carrying out a performance test on the automated performance testing platform.
The project group submits performance test requirements; the test team provides a test application form template and guides the project team through completing it.
After receiving the test application form, the performance test engineer uploads it to the demand analysis module, and the platform parses the test type, test function points, service indexes and resource indexes from its content.
The test engineer enters the machine information into the machine management module according to the application server information, and the platform automatically generates a list of each machine's CPU, memory, disk and operating system information.
The execution time, start strategy, stop strategy, number of concurrent users and preconditions are entered for the test scenario. In a load test the default number of concurrent users is the number of logical cores of the application server; in a capacity test it is twice the number of logical cores. The default start strategy in a load test is to start 20% of the users every 30 seconds; the default stop strategy is to stop 50% of the users every 30 seconds. In a load test, execution runs for 15 minutes after start-up, and in a capacity scenario for 10 minutes. The platform automatically forms a test scenario from this setting information, the demand analysis result and the machine resource management module data, and saves it in the scenario list shown in fig. 1.
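The default scenario parameters described above can be sketched as a small function (the numeric defaults come from the text; the function and field names are assumptions):

```python
# Derive the default test-scenario parameters from the test type and the
# application server's logical core count, per the rules in the text.

def default_scenario(test_type: str, logical_cores: int) -> dict:
    if test_type == "load":
        users = logical_cores       # load test: one user per logical core
        duration_min = 15           # run 15 minutes after start-up
    elif test_type == "capacity":
        users = 2 * logical_cores   # capacity test: twice the core count
        duration_min = 10           # run 10 minutes after start-up
    else:
        raise ValueError(f"unknown test type: {test_type}")
    return {
        "concurrent_users": users,
        "start_strategy": "start 20% of users every 30 s",
        "stop_strategy": "stop 50% of users every 30 s",
        "duration_min": duration_min,
    }

print(default_scenario("load", 8)["concurrent_users"])      # 8
print(default_scenario("capacity", 8)["concurrent_users"])  # 16
```

Tying the defaults to the core count collected by the machine resource management module is what lets the platform form a scenario with no manual tuning.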
In the environment management module, as shown in FIG. 2, the JMeter version and dependency files are uploaded on the create-environment interface and a pressure test environment is generated. After recording a JMeter script, the user uploads it, configures the dependency environment, and executes the scenario. As shown in fig. 3, while a scenario is executing, information such as TPS and response time can be monitored in real time in the monitoring management module.
After the test scenario has been executed, the result analysis module automatically generates a simple analysis result and judges whether the indexes reach the standard. If there is any abnormality, the analysis result is supplemented manually; a test report is then generated from the analysis result and stored in the report management module.
Claims (10)
1. A performance testing platform, comprising:
the demand analysis module is used for analyzing key knowledge points from the test application form to obtain demand points;
the machine resource management module is used for acquiring server information and automatically acquiring resource information and key configuration parameters of the server according to the server information;
the result analysis module is used for analyzing the data of the test result and generating a report;
and the report management module is used for storing reports.
2. The performance test platform of claim 1, further comprising an execution scenario management module configured to manage the execution scenario and form a test scenario according to the setting information, the demand point, and the machine resource management module data.
3. The performance testing platform of claim 1, further comprising an environment management module to manage a pressure testing environment.
4. The performance testing platform of claim 1, further comprising a monitoring management module configured to obtain execution conditions of a scenario.
5. The performance testing platform of claim 1, wherein the content on the test application form includes information of testers, project names, version numbers, test time, expected completion time, and test departments, and also includes test types, test targets, and product/system design indexes.
6. The performance testing platform of claim 1, wherein the server information includes an IP, an account number, and a password of the server; the resource information of the server comprises a CPU, a MEM, a disk and an operating system.
7. The performance test platform of claim 2, wherein the setting information comprises set execution time, start policy, stop policy, number of concurrent users, and preconditions;
the number of the concurrent users is defaulted to be the number of logic cores of an application server in a load test, and the number of the concurrent users is defaulted to be twice of the number of the logic cores in a capacity test;
the default start-up strategy in the load test is 20% of the number of users started every 30 s;
the stop strategy is to stop 50% of users every 30 s;
in the load test, the execution time is 15 minutes after starting, and in the capacity scene, the execution time is 10 minutes after starting.
8. The performance testing platform of claim 3, wherein the pressure testing environment is a jmeter environment created from uploaded jmeter versions and dependent files.
9. The performance testing platform of claim 4, wherein the execution conditions include TPS and response time; the monitoring management module is also used for acquiring the utilization rate of server resources.
10. The performance testing platform of claim 1, wherein the results analysis module analyzes TPS and response time data in conjunction with pressure testing server and application server resources.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110400763.5A CN113138917A (en) | 2021-04-14 | 2021-04-14 | Performance test platform |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113138917A true CN113138917A (en) | 2021-07-20 |
Family
ID=76812595
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110400763.5A Withdrawn CN113138917A (en) | 2021-04-14 | 2021-04-14 | Performance test platform |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113138917A (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109960619A (en) * | 2019-02-12 | 2019-07-02 | 众安在线财产保险股份有限公司 | A kind of Testing Platform and method |
CN112148616A (en) * | 2020-09-30 | 2020-12-29 | 中国民航信息网络股份有限公司 | Performance test management platform |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115248782A (en) * | 2022-09-22 | 2022-10-28 | 中邮消费金融有限公司 | Automatic testing method and device and computer equipment |
CN115248782B (en) * | 2022-09-22 | 2022-12-23 | 中邮消费金融有限公司 | Automatic testing method and device and computer equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| WW01 | Invention patent application withdrawn after publication | Application publication date: 20210720 |