CN115328758A - Performance test method and system for large data volume of industrial software - Google Patents

Performance test method and system for large data volume of industrial software

Info

Publication number
CN115328758A
Authority
CN
China
Prior art keywords
test
data
script
management center
performance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210768773.9A
Other languages
Chinese (zh)
Inventor
吴彬彬
徐文豪
袁强
王勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Supcon Technology Co Ltd
Original Assignee
Zhejiang Supcon Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Supcon Technology Co Ltd filed Critical Zhejiang Supcon Technology Co Ltd
Priority to CN202210768773.9A priority Critical patent/CN115328758A/en
Publication of CN115328758A publication Critical patent/CN115328758A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3684 Test management for test design, e.g. generating new test cases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/30 Monitoring
    • G06F 11/34 Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F 11/3409 Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment for performance assessment
    • G06F 11/3414 Workload generation, e.g. scripts, playback
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3688 Test management for test execution, e.g. scheduling of test suites
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3692 Test management for test results analysis

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention discloses a performance test method and system for large data volumes of industrial software. The performance test method comprises: deploying an API request grabbing tool; simulating users accessing the system under test and collecting API request information to generate a performance test script; deploying a script management center tool, which is used for importing test cases, parsing the imported test cases, storing the parsed content in a case maintenance table, and connecting to the database through a data test script; and deploying a test management center tool comprising a message management center module used for sending message tasks to the script management center and starting large-data-volume performance test tasks. The performance test script calls the data test script, the data test script inserts test data into the database based on the status flag of the test case maintenance table, and the generated test data is used for the large-data-volume performance test. Test data of large volume is thus created quickly for the different business scenarios of industrial software.

Description

Performance test method and system for large data volume of industrial software
Technical Field
The invention relates to the technical field of industrial software performance testing, and in particular to a performance test method and system for large data volumes of industrial software.
Background
The traditional performance test covers a single service scenario, a performance test result can be obtained only after an entire performance test has finished, feedback is delayed, and time and resource utilization are low. Invention patent CN201010613464.1 proposes a performance testing system and method in which the system monitors the performance data of the server under test in real time, increases the number of concurrent users when the CPU utilization of the server under test has not reached a set threshold, and stops the performance test when the CPU utilization of the server under test exceeds the set threshold. This performance test method achieves a certain degree of unattended operation, but has certain defects:
1. The system still targets a single performance test scenario. When the service system is large and many performance scenarios need to be tested, testers must still intervene to switch scenarios; switching service scenarios requires creating a large amount of SQL data, and creating this data manually takes a great deal of time.
2. The system only monitors the resources of the server under test in real time, so the judgment condition is single; other key performance results are not collected or analyzed, which may render subsequent performance tests invalid and waste the time and resources spent on them.
3. When the CPU utilization of the server under test exceeds the set threshold, the current test is stopped but the user cannot be notified in time, so a time gap still exists between troubleshooting the problem and the subsequent test.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a performance testing method and system for large data volumes of industrial software, in which test data of large volume is quickly created for the different business scenarios of the industrial software, thereby improving on the efficiency of manually creating SQL statements. At the same time, the test results and the hardware resources of the server under test are monitored in real time, and a warning notice is given to the testers promptly when the preset performance indexes or server resource limits are not met during the test.
In a first aspect, the invention provides a performance test method for large data volumes of industrial software, comprising the following steps: deploying an API request grabbing tool, wherein the API request grabbing tool is used for collecting API request information;
simulating a plurality of users accessing corresponding service functions through a browser based on a preset performance test scenario and performance test requirements, and collecting the API request information generated by the accesses with the API request grabbing tool so as to generate a performance test script; deploying a script management center tool, wherein the script management center tool comprises at least a test case maintenance module for importing a plurality of test cases, a function module for importing the performance test script, and data test scripts corresponding to the test cases of different test scenarios, the script management center parses the imported test cases through the test case maintenance module and stores the parsed content in a case maintenance table, and the data test script comprises at least the script required to connect to the database; deploying a test management center tool, wherein the test management center comprises at least a message management center module used for sending message tasks to the script management center and starting the large-data-volume performance test; the performance test script calls the data test script, the data test script inserts simulation data of a preset data volume into the database based on the status flag in the test case maintenance table corresponding to the test case, and the generated simulation data is used for the large-data-volume performance test; the message management center module receives the execution result of the data test script and starts the large-data-volume performance test task; and starting test result collection and analyzing the test results.
Further, the performance test script calls the data test script and the data test script inserts simulation data of a preset data volume into the database based on the status flag of the test case maintenance table, which specifically includes: after receiving the large-data-volume performance test request from the message management center module, the script management center tool starts executing the test case, calls the data test script through the performance test script, and judges, based on the status flag of the test case maintenance table, whether data of the preset volume needs to be inserted; if the status flag indicates that data for the performance test already exists, no data needs to be inserted; if the status flag indicates that the data does not exist, data of the preset volume is inserted.
Further, inserting the data of the preset data volume specifically includes: initializing the database connection and calling the database insert statement using the database information and table name configured in the data test script; and obtaining a cursor object capable of executing SQL statements, writing the data to be inserted in the form of random variables, inserting the data in large batches through a loop, sending an insertion completion message to the message management center after the preset data volume has been inserted, and returning the result set displayed as tuples.
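As an illustration of this insertion flow, a minimal Python sketch is given below, assuming the Oracle database is accessed through the cx_Oracle driver used in the embodiments; the connection parameters, table name and column names are placeholders rather than values from the patent.

    import random
    import cx_Oracle  # Oracle driver assumed from the embodiments

    def insert_preset_volume(user, password, dsn, table, total_rows, batch_size=10000):
        """Insert total_rows rows of randomized simulation data in batches."""
        conn = cx_Oracle.connect(user, password, dsn)    # initialize the database connection
        cursor = conn.cursor()                           # cursor object that can execute SQL
        sql = f"INSERT INTO {table} (ID, VAL) VALUES (:1, :2)"
        inserted = 0
        while inserted < total_rows:                     # insert in large batches through a loop
            n = min(batch_size, total_rows - inserted)
            rows = [(inserted + i, random.random()) for i in range(n)]  # data written as random variables
            cursor.executemany(sql, rows)
            inserted += n
        conn.commit()
        cursor.execute(f"SELECT COUNT(*) FROM {table}")
        print(cursor.fetchall())                         # result set is returned and displayed as tuples
        cursor.close()
        conn.close()
        # an insertion completion message would then be sent to the message management center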
Further, the method also comprises adding an identification of the interface sending the API request and an identification of the test step to the performance test sample and saving the performance test sample.
Further, starting test result collection and analyzing the test results specifically includes storing the pressure test result data of each user server and the resource usage data of the server deploying the industrial software under test in a time-series database in chronological order, wherein each piece of data contains at least a timestamp; the data is sent at regular intervals, and the pressure test result data and the server resource usage data are displayed dynamically in a visualized form on a Web page.
Further, the method also comprises the following steps: if the test result data contains errors or the server resource usage data exceeds a preset threshold, notifying the receivers set in the configured test parameters through message middleware; the pressure test result data and the server resource usage data can be accessed and viewed through the Web page of the test management center tool.
Further, the performance test method for large data volumes of industrial software also comprises collecting, storing and filtering logs based on a preset log level and displaying the logs based on graphical programming, so that problems can be located quickly.
Further, the test management center tool also comprises a system configuration module, a resource file module, a task scheduling module, a monitoring module, a data analysis module and a data display module; the system configuration module is used for configuring the server under test, the test execution machine operating parameters and the test environment; the resource file module is used for configuring the test scripts; the task scheduling module is used for configuring task names, selecting the test scripts to be executed and setting their execution time and frequency, and after a task is built and run, the performance test starts and the test scripts are executed; the monitoring module is used for monitoring the operating conditions of the server under test and the test execution machine pool and collecting their operating data; the data analysis module is used for analyzing and summarizing the performance test result indexes and server resource usage by means of Python programming; the data display module is used for dynamically displaying, in real time through visual graphs, the performance test result indexes and server resource usage selected by the user; the message management center module is also used for configuring the notification recipients, notification methods and notification frequency, and for notifying the responsible persons in time through message middleware; the log management module is used for collecting the log files produced during execution by the test execution machine pool.
Further, the data test script is a Python script, and the performance test script is a JMeter script.
In a second aspect, the invention further provides a performance testing system for running the above performance testing method, comprising at least a script management center, a real-time monitoring center and a message management center, wherein the script management center is used for running the corresponding scripts and creating test data adapted to different service scenarios; the real-time monitoring center is used for acquiring, in real time, the performance test results and server resource information after the scripts have run and sending them to the message management center; and the message management center gives feedback on the test results based on the preset thresholds, the performance test results and the server resource information.
The invention has the beneficial effects that:
1. For the pain points of performance testing in the industrial software field, namely complex service scenarios, large data volumes and high timeliness requirements, different scripts are called to quickly generate large amounts of database data;
2. The performance test results and server resources are monitored in real time during the performance test, and messages are actively sent to users in time so that they can make adjustments quickly; the method is therefore a fast, efficient, adaptable and targeted performance testing method for the industrial software field;
3. By configuring timed tasks, the test results can be sent to the analysis device in real time, analyzed by the analysis device, and the analysis results sent to the users through the configured mail server, truly realizing an unattended performance testing process.
Drawings
FIG. 1 is a schematic flow chart of a method for testing the performance of industrial software with large data volume according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of generating performance test samples according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating an example of a workflow of a script management center tool according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a test case maintenance table according to an embodiment of the present invention;
FIG. 5 is a schematic flowchart of the processing performed when a test result exceeds a preset threshold during test execution according to an embodiment of the present invention.
Detailed Description
API: an Application Programming Interface is a convention by which the different components of a software system are linked.
Test case: a specific set of input data, operating steps or environment settings, together with the expected results, provided to the system under test in order to carry out a test;
Test script: a script written for automated testing; a test script must correspond to a test case;
Fiddler: an Internet debugging proxy tool that can capture the HTTP communication between a computer (or even a mobile phone) and the Internet and inspect and analyze it.
JMeter: software for testing client/server applications (e.g., web applications). It can be used to test the performance of both static and dynamic resources.
In order to facilitate a better understanding of the invention for those skilled in the art, the invention will be described in further detail with reference to the accompanying drawings and specific examples, which are given by way of illustration only and do not limit the scope of the invention.
The invention discloses a performance testing method for large data volumes of industrial software. Fig. 1 is a schematic flow chart of the method, which specifically comprises: deploying an API request grabbing tool, wherein the API request grabbing tool is used for collecting API request information; simulating a plurality of users accessing corresponding service functions through a browser based on a preset performance test scenario and performance test requirements, and collecting the API request information generated by the accesses with the API request grabbing tool so as to generate a performance test script; deploying a script management center tool, wherein the script management center tool comprises at least a test case maintenance module for importing a plurality of test cases, a function module for importing the performance test script, and data test scripts corresponding to the test cases of different test scenarios, the script management center parses the imported test cases through the test case maintenance module and stores the parsed content in a case maintenance table, and the data test script comprises at least the script required to connect to the database; deploying a test management center tool, wherein the test management center comprises at least a message management center module used for sending message tasks to the script management center and starting the large-data-volume performance test; importing the performance test script through the script management center tool, the message management center receiving a message and sending a scheduling task to the script management center; the performance test script calling the data test script, the data test script inserting simulation data of a preset data volume into the database based on the status flag in the test case maintenance table corresponding to the test case, the generated simulation data being used for the large-data-volume performance test; the message management center module receiving the execution result of the data test script and starting the large-data-volume performance test task; and starting test result collection and analyzing the test results.
The following describes the steps of a large-data-volume performance test of industrial software applied to the pharmaceutical industry.
S1, deploying an API request grabbing tool, wherein the API request grabbing tool is used for collecting API request information.
In an embodiment of the present invention, Fiddler is employed as the API request grabbing tool. Fiddler is downloaded and installed, and is opened during the user test so as to act as the API request grabbing tool.
S2, simulating a plurality of users accessing corresponding service functions through a browser based on a preset performance test scenario and performance test requirements, and collecting the API request information generated by the accesses with the API request grabbing tool so as to generate a performance test script;
According to the application scenario of the industrial software under test and the corresponding performance test requirements, users are simulated accessing the system under test, i.e. the industrial software under test, through a browser. The simulated users send API requests to the system under test through the browser.
For example, when multiple users simultaneously request a certain service function, Fiddler is installed and started on the system under test, and the users are simulated accessing the system under test through a browser; the flow is shown in fig. 2. Fiddler intercepts the API requests sent by the simulated users, collects all the API request information generated while the Web product runs, and saves it as a script in JMeter format.
In some embodiments, the method further comprises adding an identification of the interface sending the API request and an identification of the test step to the performance test sample; in addition, a unique ID is generated from the timestamp at which the request is sent, a value is produced by intercepting the request path, and the request identifier is added in the form of the ID plus the value.
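As an illustration of such a request identifier, a minimal sketch follows; taking the ID as a millisecond timestamp and the value as the last segment of the request path are assumptions made purely for illustration.

    import time
    from urllib.parse import urlparse

    def request_identifier(url: str) -> str:
        """Build an ID-plus-value identifier from the send time and the request path."""
        request_id = str(int(time.time() * 1000))                   # unique ID from the send timestamp
        path_value = urlparse(url).path.rstrip("/").split("/")[-1]  # value intercepted from the request path
        return f"{request_id}_{path_value}"

    # request_identifier("http://host/api/process/type") -> e.g. "1719750000000_type"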
S3, deploying a script management center tool, wherein the script management center tool comprises at least a test case maintenance module for importing a plurality of test cases, a function module for importing performance test scripts, and data test scripts corresponding to the test cases of different test scenarios; the script management center parses the imported test cases through the test case maintenance module and stores the parsed content in the case maintenance table, and the data test scripts connect to the database.
The script management center is implemented in Python: a Python script file is written and then imported into the script management center, and the script management center parses and manages the imported test cases. The operation of the script management center tool is shown in fig. 3. The script management center imports the performance test script; the message management center module receives the message and sends the task to the script management center; the script management center starts the task, executes the performance test script and checks the status flag of the test case; if the status flag of the test case is False, the data test script is called to produce test data of large volume; after execution is finished, the data test script sends a message to the message management center module to start the large-data-volume performance test.
In some embodiments, the script management center has an import button for importing the performance test script, i.e. for importing the test case. The test cases are maintained and stored in an Oracle data table in the script management center; the table creation statement is as follows:
CREATE TABLE "TEST_CASES" (
    "ID" NUMBER(20,0) NOT NULL ENABLE,
    "CASE_NAME" VARCHAR2(256) NOT NULL ENABLE,
    "PYCASENAME" VARCHAR2(256) NOT NULL ENABLE,
    "OPERATE" VARCHAR2(256) NOT NULL ENABLE,
    "STATUS" NUMBER(1,0) DEFAULT 0,
    "CREATOR" VARCHAR2(64),
    "MODIFY_TIME" TIMESTAMP(6) DEFAULT NULL,
    PRIMARY KEY ("ID")
);
ID: unique identifier for example table representing our script
CASE _ NAME: representing case names, test case custom names, e.g. test case 1, test case 2
PYCASENAME: python script file corresponding to representation case
STATUS: the state, effective case is marked by True, and the ineffective case is marked by False
OPERATE: operations are stored in the database with 0,1,2, 0 for edit, 1 for execute, 2 for delete
CREATOR: indicating by whom the use case was created
MODIFY _ TIME: modification time, representing modification time of use case
When a certain test case is executed, the corresponding Python script file is run, and data can be generated by entering a start value and an end value in the interface, so that the test cases can cover different service scenarios simply by writing one Python script for each service scenario. Writing a Python script file first requires creating a case maintenance directory, for example: D:\datatest\case\test_process_type.py. A schematic diagram of the case maintenance table of this embodiment is shown in FIG. 4.
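As a minimal sketch of how the script management center might run the per-case Python file named in PYCASENAME with the start and end values entered in the interface (the use of subprocess and the command-line argument convention are assumptions):

    import subprocess
    import sys
    from pathlib import Path

    CASE_DIR = Path(r"D:\datatest\case")   # case maintenance directory from the embodiment

    def run_case(pycasename: str, start: int, end: int) -> int:
        """Run the Python data script belonging to a test case and return its exit code."""
        script = CASE_DIR / pycasename                    # e.g. test_process_type.py
        result = subprocess.run([sys.executable, str(script), str(start), str(end)])
        return result.returncode                          # 0 signals success to the caller

    # run_case("test_process_type.py", 1, 1000000)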
S4, deploying a test management center tool, wherein the test management center comprises at least a message management center module used for sending message tasks to the script management center and starting the large-data-volume performance test.
In some embodiments, the test management center tool includes the following modules. System configuration module: mainly used for configuring the server under test, the test execution machine operating parameters and the test environment. Resource file module: mainly used for configuring the test scripts. Task scheduling module: mainly used for configuring task names, selecting the test scripts to be executed, and setting the execution time and frequency; after a task is built and run, the performance test starts and the test scripts are executed. Monitoring module: mainly used for monitoring the operating conditions of the server under test and the test execution machine pool and collecting their operating data. Data analysis module: used for analyzing and summarizing the performance test result indexes and server resource usage by means of Python programming. Data display module: used for dynamically displaying, in real time through visual graphs, the performance test result indexes and server resource usage selected by the user. Message management center module: mainly used for configuring the notification recipients, notification methods and notification frequency, and for notifying the responsible persons and the script management center in time through message middleware to start tasks. Log management module: mainly used for collecting the log files produced during execution by the test execution machine pool.
The test management center tool is accessible through a Web page.
A plurality of test cases are imported in the case maintenance module of the script management center, each test case corresponding to one performance test script. A user can directly access the Web interface to import the performance test scripts as resource files; in the test parameters under the system settings of the system under test, the user sets the number of virtual users for each scenario, the user think time, the test result storage path, the usage thresholds for the CPU, memory, IO and network card of the test server, the test error rate threshold, the log level, and the display options for the test result indexes and server resource usage; a timed task is configured in task scheduling and a trigger is built to execute it; after the build run is configured, the test report template and the mail recipient information are configured, where the test report template supports both a default template and user-defined settings; unattended operation is thus realized.
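The parameters just listed might be held in a configuration structure along the following lines; this is only an illustrative sketch, and the key names and values are assumptions rather than the tool's actual configuration format.

    test_config = {
        "virtual_users_per_scenario": 100,       # number of simulated users per scenario
        "think_time_s": 3,                       # user think time between requests
        "result_path": "/data/perf/results",     # test result storage path
        "thresholds": {                          # server resource and error-rate thresholds
            "cpu_percent": 80,
            "memory_percent": 85,
            "io_percent": 90,
            "network_percent": 90,
            "error_rate_percent": 1,
        },
        "log_level": "INFO",
        "schedule_cron": "0 2 * * *",            # timed task trigger, e.g. every night at 02:00
        "report_template": "default",            # default template or user-defined settings
        "mail_receivers": ["tester@example.com"],
    }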
Configuring the timed task allows the interface to be triggered to send the task, that is, the test results can be sent to the test result collecting device at regular intervals for test result collection.
S5, the performance test script calls the data test script; the data test script inserts simulation data of a preset data volume into the database based on the status flag in the test case maintenance table corresponding to the test case, and the generated simulation data is used for the large-data-volume performance test.
Different test cases can be selected according to the service scenario. When a certain test case is executed, i.e. after the large-data-volume performance test is started, the corresponding Python script file is run and data is inserted into the database; the data is written as random variables and inserted in a loop, which efficiently produces test data in batches. During insertion, whether data needs to be inserted is decided according to the STATUS field of the test case maintenance table: if the data already exists, no insertion is needed; if it does not exist, preparation of the insert data begins.
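A minimal sketch of this decision, assuming the TEST_CASES table defined above and a cx_Oracle cursor (the column names and bind style follow that assumed table):

    def needs_data(cursor, case_id: int) -> bool:
        """Return True when the case's STATUS flag says the test data has not been created yet."""
        cursor.execute("SELECT STATUS FROM TEST_CASES WHERE ID = :1", [case_id])
        row = cursor.fetchone()
        # a STATUS of False (0) means the large-volume test data does not exist yet
        return row is not None and not row[0]

    # if needs_data(cursor, case_id): run the data test script; otherwise skip the insertion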
Taking the connection to an Oracle database in the data test script as an example, one million rows are created in the Oracle database: a table object class A is defined, the database connection is initialized with cx_Oracle to obtain a cursor object that can execute SQL statements, and the result set returned after execution is displayed as tuples by default.
In some embodiments, for example when creating a time field, the time value inserted in each loop iteration is produced by writing a string in a time format; in a for loop, dt = dt + datetime.timedelta(days=-1) automatically subtracts one day per iteration. When modifying the time field, data fixed within a certain period may be required, and a time can be converted from the corresponding timestamp, for example mode_time = time.strftime('%Y-%m-%d %H:%M:%S', time.localtime(1585497600 - random.randrange(1, 1000000))). The data is then batch-inserted with the executemany method using a statement of the form insert into table_name (fields) values (placeholders).
Another service scenario uses different fields, such as an ID field, where the ID is set to str(i) in the loop, and an input_key field, which is set to input_key = 'LIMS: ammonia nitrogen content' + shift_date.
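Putting these field rules together, the following is a minimal sketch of how one batch of rows might be generated before executemany is called; the field names, the base timestamp and the shift_date convention come from the examples above, while the table layout and everything else are illustrative assumptions.

    import datetime
    import random
    import time

    def build_rows(start: int, end: int):
        """Generate (id, input_key, mode_time) tuples for one batch insert."""
        rows = []
        dt = datetime.datetime.now()
        for i in range(start, end):
            dt = dt + datetime.timedelta(days=-1)         # time field: step back one day per row
            shift_date = dt.strftime('%Y-%m-%d')
            input_key = 'LIMS: ammonia nitrogen content' + shift_date
            # a time fixed within a certain period, derived from a base timestamp minus a random offset
            mode_time = time.strftime('%Y-%m-%d %H:%M:%S',
                                      time.localtime(1585497600 - random.randrange(1, 1000000)))
            rows.append((str(i), input_key, mode_time))   # the ID field is str(i) in the loop
        return rows

    # cursor.executemany("insert into LIMS_DATA (id, input_key, mode_time) values (:1, :2, :3)",
    #                    build_rows(1, 1000001))          # table and column names are placeholders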
S6, the message management center module receives the execution result of the data test script and starts the large-data-volume performance test task.
S7, starting test result collection and analyzing the test results.
When the performance test task starts, the result collecting device and the server monitoring device are started; the server monitoring device monitors the hardware resources of the server under test, and the result collecting device collects the performance test results. In one embodiment of the invention, the timed task triggers the interface to send the task, and the test results are sent to the test result collecting device at regular intervals. The pressure test results of each server and the server resource usage are collected and stored in a time-series database, with each piece of data carrying a timestamp; the test results are collected, integrated and then sent to the analysis device. The analysis device collects and analyzes the data so as to dynamically display in real time, through visual graphs and based on the collected test result data and server resource usage data, the performance test result indexes and server resource usage selected by the user. When the error rate of the test results or the server resource utilization exceeds the set thresholds, the flow of which is shown in fig. 5, the responsible persons are notified in time through the message middleware and can access and view the historical test index data and the server resource monitoring data through the Web page for troubleshooting analysis.
Users thus learn the real-time performance test results immediately and can intervene manually in time, which avoids the time and resources wasted between a performance test error and the end of the test and ensures the efficiency of the performance test.
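The threshold check behind this notification could look like the sketch below; the point structure, the notify() hook standing in for the message middleware, and the threshold values are assumptions for illustration, and a real deployment would write the points to the time-series database.

    import time

    THRESHOLDS = {"error_rate_percent": 1.0, "cpu_percent": 80.0}   # illustrative limits

    def collect_point(results, error_rate, cpu_percent, notify):
        """Store one timestamped measurement and alert when a threshold is exceeded."""
        point = {"timestamp": time.time(),            # every piece of data carries a timestamp
                 "error_rate_percent": error_rate,
                 "cpu_percent": cpu_percent}
        results.append(point)                         # stands in for the time-series database write
        breached = {k: v for k, v in point.items()
                    if k in THRESHOLDS and v > THRESHOLDS[k]}
        if breached:
            notify(f"performance test threshold exceeded: {breached}")   # via message middleware

    # collect_point(results=[], error_rate=2.5, cpu_percent=75.0, notify=print)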
S8, collecting, storing and filtering the logs based on a preset log level, and displaying the logs based on Python graphical programming, so as to locate problems quickly.
In some embodiments, the log management module of the script management center is configured to collect the log information generated by the test execution machine pool, and the log level is selectable. Log analysis parses the log files of the selected dates and searches them according to keywords entered by the user, so that problems can be located quickly.
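A minimal sketch of level-based collection and keyword search over the collected logs follows; the file names and the log line format are illustrative assumptions.

    import logging

    # collect logs at a preset level into a file
    logging.basicConfig(filename="execution_machine_pool.log", level=logging.INFO,
                        format="%(asctime)s %(levelname)s %(message)s")

    def search_log(path: str, keyword: str, min_level: str = "WARNING"):
        """Return log lines at or above min_level that contain the user's keyword."""
        order = ["DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"]
        allowed = order[order.index(min_level):]
        with open(path, encoding="utf-8") as fh:
            return [line for line in fh
                    if keyword in line and any(lvl in line for lvl in allowed)]

    # search_log("execution_machine_pool.log", "timeout")   # quickly locate the relevant entries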
S9, configuring the mail service center and sending the test report results to different users.
In some embodiments, the method further comprises configuring the mail service center to send the test report results to different users.
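A minimal sketch of such a mail step using Python's standard smtplib follows; the SMTP host, credentials and addresses are placeholders, not values from the patent.

    import smtplib
    from email.mime.text import MIMEText

    def send_report(report_html: str, receivers):
        """E-mail the test report to the configured recipients."""
        msg = MIMEText(report_html, "html", "utf-8")
        msg["Subject"] = "Large-data-volume performance test report"
        msg["From"] = "perf-test@example.com"
        msg["To"] = ", ".join(receivers)
        with smtplib.SMTP("smtp.example.com", 25) as server:    # mail service center, placeholder host
            server.login("perf-test@example.com", "password")   # placeholder credentials
            server.sendmail(msg["From"], receivers, msg.as_string())

    # send_report("<h1>All scenarios passed</h1>", ["tester@example.com"])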
The invention also provides a large-data-volume performance testing system for industrial software, comprising at least a script management center, a real-time monitoring center and a message management center, wherein the script management center is used for running the corresponding scripts and creating test data adapted to different service scenarios; the real-time monitoring center is used for acquiring, in real time, the performance test results and server resource information after the scripts have run and sending them to the message management center; and the message management center gives feedback on the test results based on the preset thresholds, the performance test results and the server resource information.
It should be noted that: in other embodiments, the steps of the corresponding methods are not necessarily performed in the order shown and described in this specification. In some other embodiments, the method may include more or fewer steps than those described herein. Moreover, a single step described in this specification may be broken down into multiple steps for description in other embodiments; multiple steps described in this specification may be combined into a single step in other embodiments.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, the system or system embodiments are substantially similar to the method embodiments and therefore are described in a relatively simple manner, and reference may be made to some of the descriptions of the method embodiments for related points. The system and the system embodiments described above are merely illustrative, and some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement without inventive effort.

Claims (10)

1. A performance test method for large data volume of industrial software is characterized by comprising the following steps:
deploying an API request grabbing tool, wherein the API request grabbing tool is used for collecting API request information;
simulating a plurality of users to access corresponding service functions through a browser based on a preset performance test scene and performance test requirements, and collecting API request information generated by access based on an API request grabbing tool so as to generate a performance test script;
deploying a script management center tool, wherein the script management center tool at least comprises a test case maintenance module for importing a plurality of test cases, a function module for importing a performance test script and a data test script corresponding to the test cases of different test scenes, the script management center analyzes the imported test cases through the test case modules and stores the analyzed content into a case maintenance table, and the data test script at least comprises a script required by connecting a database;
deploying a test management center tool, wherein the test management center at least comprises a message management center module, and the message management center module is used for sending a message task to a script management center and starting a performance test of large data volume;
the performance test script calls a data test script, the data test script inserts simulation data of a preset data volume into the database based on a state flag of a test case maintenance table corresponding to a test case, and the generated simulation data is used for performance test of a large data volume;
the message management center module receives the execution result of the data test script and starts a performance test task with large data volume;
and starting test result collection and analyzing the test results.
2. The method for performance testing of large data volume of industrial software according to claim 1, wherein the performance testing script calls a data testing script, and the data testing script inserts simulation data of a preset data volume into the database based on the status flag of the test case maintenance module, specifically comprising:
the script management center tool starts to execute the test case after receiving the performance test request with large data volume of the message management center module, calls the data test script through the performance test script, judges whether the data with preset data volume needs to be inserted or not based on the state mark of the test case maintenance table,
no further data insertion is required if the status flag indicates that data for performance testing already exists,
and if the status flag indicates that the data does not have the data for performance test, inserting the data with the preset data quantity.
3. The performance testing method for the large data volume of the industrial software according to claim 2, wherein the inserting of the data with the preset data volume specifically comprises:
initializing database connection and calling a database insertion statement through set database information and table names in the data test script;
and obtaining a cursor object capable of executing the SQL statement, writing the insertion data into a random variable form, inserting the insertion data in a large batch through circulation, sending an insertion completion message to a message management center after the execution is completed based on the preset data volume, and returning a result set to display in a tuple form.
4. The performance testing method for the large data volume of the industrial software as claimed in claim 1, further comprising adding the identification of the interface sending the API request and the identification of the test step to the performance test sample and saving the performance test sample.
5. The performance testing method for the large data volume of the industrial software according to claim 1, wherein starting test result collection and analyzing the test results specifically comprises:
storing the pressure test result data of each user server and the resource usage data of the server deploying the industrial software under test in a time-series database in chronological order, wherein each piece of data comprises at least a timestamp; sending the data at regular intervals, and dynamically displaying the pressure test result data and the server resource usage data in a visualized form on a Web page.
6. The method for testing the performance of the industrial software with the large data volume is characterized by further comprising the following steps: and if the test result data is wrong or the resource use condition data of the server exceeds a preset threshold value, notifying a receiver set in the configuration test parameters through message middleware, and accessing and checking the pressure test result data and the resource use condition data of the server through a Web page of a test management center tool.
7. The industrial software big data volume performance testing method according to claim 6, further comprising collecting, storing and filtering logs based on a preset log level, and displaying the logs based on graphical programming, thereby quickly locating problems.
8. The performance test method for the large data volume of the industrial software according to claim 6, wherein the test management center tool further comprises a system configuration module, a resource file module, a task scheduling module, a monitoring module, a data analysis module and a data display module;
the system configuration module is used for configuring the tested server, the test execution machine operating parameters and the test environment;
the resource file module is used for configuring a test script;
the task scheduling module is used for configuring task names, selecting executed test scripts and executing time and frequency, and after the tasks are constructed and executed, starting performance tests and executing the test scripts.
The monitoring module is used for monitoring the running conditions of the server to be tested and the test execution machine pool and collecting the running data of the server to be tested and the test execution machine pool;
the data analysis module is used for analyzing and summarizing the performance test result index and the service condition of the server resource in a Python programming mode;
the data display module is used for dynamically displaying various performance test result indexes and server resource use conditions set by a user in real time through a visual graph;
the message management center module is also used for configuring related notification personnel, notification modes and notification frequencies and notifying related responsible persons in time through message middleware;
the log management module is used for collecting log files in the execution process of the test execution machine pool.
9. The performance testing method for the large data volume of the industrial software according to any one of claims 1 to 8, wherein the data test script is a Python script and the performance test script is a JMeter script.
10. A performance testing system for running the performance testing method for the large data volume of the industrial software according to any one of claims 1 to 9, characterized by comprising at least a script management center, a real-time monitoring center and a message management center,
the script management center is used for adapting to different service scenes to run corresponding scripts and creating test data;
the real-time monitoring center is used for acquiring the performance test results and the server resource information in real time after the scripts have run and sending them to the message management center;
and the message management center feeds back the test result based on the preset threshold, the performance test result and the server resource information.
CN202210768773.9A 2022-06-30 2022-06-30 Performance test method and system for large data volume of industrial software Pending CN115328758A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210768773.9A CN115328758A (en) 2022-06-30 2022-06-30 Performance test method and system for large data volume of industrial software

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210768773.9A CN115328758A (en) 2022-06-30 2022-06-30 Performance test method and system for large data volume of industrial software

Publications (1)

Publication Number Publication Date
CN115328758A true CN115328758A (en) 2022-11-11

Family

ID=83918627

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210768773.9A Pending CN115328758A (en) 2022-06-30 2022-06-30 Performance test method and system for large data volume of industrial software

Country Status (1)

Country Link
CN (1) CN115328758A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117176611A (en) * 2023-10-30 2023-12-05 翼方健数(北京)信息科技有限公司 Performance test method, system and medium of distributed communication library
CN117176611B (en) * 2023-10-30 2024-01-30 翼方健数(北京)信息科技有限公司 Performance test method, system and medium of distributed communication library
CN117785643A (en) * 2024-02-23 2024-03-29 广州飞进信息科技有限公司 Performance test platform for software development
CN117785643B (en) * 2024-02-23 2024-05-14 广州飞进信息科技有限公司 Performance test platform for software development

Similar Documents

Publication Publication Date Title
US6446120B1 (en) Configurable stresser for a web server
CN115328758A (en) Performance test method and system for large data volume of industrial software
US9111019B2 (en) Modeling and testing interactions between components of a software system
US9697104B2 (en) End-to end tracing and logging
CN112286806B (en) Automatic test method and device, storage medium and electronic equipment
CN102693183A (en) Method and system for realizing automatic software testing
CN109885496B (en) Test log management method and system
CN110750458A (en) Big data platform testing method and device, readable storage medium and electronic equipment
US7698543B2 (en) User interface for specifying desired configurations
CN112882927A (en) Interface automatic testing method, device, equipment and medium
WO2015187001A2 (en) System and method for managing resources failure using fast cause and effect analysis in a cloud computing system
CN114297961A (en) Chip test case processing method and related device
CN114185791A (en) Method, device and equipment for testing data mapping file and storage medium
CN111930611B (en) Statistical method and device for test data
CN116578585B (en) Data query method, device, electronic equipment and storage medium
CN117389825A (en) Method, system and device for monitoring Flink job log in real time
CN112559525A (en) Data checking system, method, device and server
CN114610689B (en) Recording and analyzing method for request log in distributed environment
CN108959041B (en) Method for transmitting information, server and computer readable storage medium
CN116467188A (en) Universal local reproduction system and method under multi-environment scene
CN110413496B (en) Method for realizing componentized collection of electronic license operation data
CN113448985A (en) API (application program interface) interface generation method, calling method and device and electronic equipment
CN112306862A (en) Front-end automatic test system and method, storage medium and computing equipment
Vainio Implementation of Centralized Logging and Log Analysis in Cloud Transition
CN111159004A (en) Hadoop cluster simulation test method and device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination