CN117539755A - High-simulation performance test method and system based on user behavior

Info

Publication number: CN117539755A
Application number: CN202311489741.6A
Authority: CN (China)
Prior art keywords: test, script, scene, container, concurrent
Other languages: Chinese (zh)
Inventors: 魏义鹏, 李清, 李岚, 魏来, 祝君平, 丁涛
Current Assignee: Postal Savings Bank of China Ltd
Original Assignee: Postal Savings Bank of China Ltd
Priority date / filing date: 2023-11-09
Publication date: 2024-02-09
Legal status: Pending
Application filed by Postal Savings Bank of China Ltd; priority to CN202311489741.6A.

Classifications

    • G06F 11/3664 (Preventing errors by testing or debugging software: Environments for testing or debugging software)
    • G06F 11/3684 (Software testing, test management: Test management for test design, e.g. generating new test cases)
    • G06F 11/3688 (Software testing, test management: Test management for test execution, e.g. scheduling of test suites)
    • G06F 11/3696 (Software testing: Methods or tools to render software testable)
    • G06F 9/45558 (Emulation; virtualisation: Hypervisor-specific management and integration aspects)
    • G06F 2009/45562 (Hypervisor-specific aspects: Creating, deleting, cloning virtual machine instances)
    • G06F 2009/4557 (Hypervisor-specific aspects: Distribution of virtual machine instances; Migration and load balancing)

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Software Systems (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The application discloses a high-simulation performance test method and system based on user behavior. The method comprises the following steps: building a basic environment based on container technology, and deploying presses (load-generating machines) and container resources; writing and debugging performance test scripts that simulate user behavior, based on a UI automation test system; establishing a test scene according to the test requirements, and configuring concurrent-scene parameters; allocating container nodes that meet the test requirements to the test scene, according to the configured concurrent-scene parameters and the current usage of the presses and container resources; and distributing the performance test script to each allocated container node for execution, realizing a concurrent performance test. The method breaks through the traditional performance-test approach: based on container technology and a UI automation test system, it comprehensively considers the authenticity of user operations, the accuracy of test indexes, the comprehensiveness of the test range, browser variability and other aspects, and can effectively reduce hardware-resource and labor costs while improving the fidelity of the performance test.

Description

High-simulation performance test method and system based on user behavior
Technical Field
The application relates to the technical field of performance testing, in particular to a high-simulation performance testing method and system based on user behaviors.
Background
Current performance testing follows the traditional approach: user operations are simulated by replaying protocol messages, so the test process lacks the browser's resource parsing and rendering steps (building the DOM tree, parsing the CSS to compute all the styles of the DOM tree, combining CSS and DOM into a render tree, layout and painting, repaint and reflow, and so on). As a result, user behavior cannot be truly simulated, the measured indexes differ greatly from the user's actual experience, the test range is limited, and differences between client browsers are not considered. On the other hand, performance testing with real users would require large amounts of hardware and human resources, which is too costly and hard to operate.
Disclosure of Invention
The application provides a high-simulation performance testing method and system based on user behavior which, based on container technology and a UI automation test system, comprehensively consider the authenticity of user operations, the accuracy of test indexes, the comprehensiveness of the test range, browser variability and other aspects, and effectively reduce hardware-resource and labor costs while improving the fidelity of the performance test.
According to a first aspect of the present application, a high-simulation performance testing method based on user behavior is provided, including:
S1, building a basic environment based on container technology, and deploying presses and container resources;
S2, writing and debugging performance test scripts that simulate user behavior, based on a UI automation test system;
S3, establishing a test scene according to the test requirements, and configuring concurrent-scene parameters;
S4, allocating container nodes that meet the test requirements to the test scene, according to the configured concurrent-scene parameters and the current usage of the presses and container resources;
S5, distributing the performance test script to each allocated container node for execution, realizing a concurrent performance test.
According to a second aspect of the present application, a high-simulation performance test system based on user behavior is provided, comprising:
a basic environment module for building a basic environment based on container technology and deploying presses and container resources;
a script management module for writing and debugging performance test scripts that simulate user behavior, based on a UI automation test system;
a scene management module for establishing a test scene according to the test requirements and configuring concurrent-scene parameters;
a distribution control module for allocating container nodes that meet the test requirements to the test scene, according to the configured concurrent-scene parameters and the current usage of the presses and container resources;
and a script execution module for distributing the performance test scripts to each allocated container node for execution, realizing a concurrent performance test.
Adopting the technical solutions of the embodiments of the application can achieve the following beneficial effects:
The method and the system break through the traditional performance-test approach. A basic environment is first built on container technology and presses and container resources are deployed; on top of a UI automation test system, a simulation performance test system is then constructed in which performance test scripts that simulate user behavior are written and debugged. A test scene is then established according to the test requirements and concurrent-scene parameters are configured, realizing control of the test scene; container nodes that meet the test requirements are then allocated to the test scene according to the configured concurrent-scene parameters and the current usage of the presses and container resources, realizing distribution control as a whole; and the performance test script is then distributed to each allocated container node for execution, realizing a concurrent performance test. The scheme of the embodiments thus upgrades the traditional UI automation test framework with performance-test script running and debugging, concurrency control of pressure-test scenes, pressure distribution across container nodes and other functions, so the system supports performance testing while remaining usable for automation testing. In addition, the scheme simulates real user operations through UI automation, adding front-end user-behavior simulation and abandoning the traditional message-simulation pressure-testing mode; it not only fits the way users actually operate, but also improves on the index-collection approach of traditional performance-test tools, coming closer to the user's real experience and improving the authenticity and accuracy of the performance-test results. For example, the response-time index becomes the time of the whole process from the user issuing a request in the client browser to receiving the response. The scheme also expands the scope of traditional performance testing from a focus on back-end service performance to performance testing of the complete transaction link, so that services are optimized by analyzing the overall performance, achieving an overall performance improvement; and by supporting multiple browser types and versions while simulating user behavior, it realizes differentiated simulation of client browsers and can also be used for browser-compatibility testing. Therefore, adopting the method and the system of the embodiments of the application can effectively reduce hardware-resource and labor costs while improving the fidelity of the performance test.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute an undue limitation to the application. In the drawings:
FIG. 1 is a schematic flow chart of a high-simulation performance testing method based on user behavior according to an embodiment of the present application;
FIG. 2 is a schematic flow chart of a high-simulation performance test method based on user behavior according to an embodiment of the present application;
FIG. 3 is a schematic diagram showing a comparison of a conventional message simulation and a simulation performance test process according to an embodiment of the present application;
FIG. 4 is a schematic diagram comparing the test results of conventional message simulation with those of the simulation performance test of an embodiment of the present application;
FIG. 5 is a schematic diagram of the functional structure of a high-simulation performance test system based on user behavior according to an embodiment of the present application.
Detailed Description
To help those skilled in the art better understand the solution of the present application, the technical solutions in the embodiments of the present application are described below in detail with reference to the accompanying drawings. The described embodiments are obviously only some, not all, of the embodiments of the application. All other embodiments obtained by one of ordinary skill in the art based on the embodiments herein without inventive effort shall fall within the scope of the present application.
Before describing the technical solutions of the present application, the terms used in the present application will be described:
the performance test in the application means that the multi-user concurrency condition is simulated by a certain means, various performance indexes of the tested system are obtained, and whether the processing capability, the response capability, the stability and the like of the tested system under high concurrency can meet the expectations is verified.
Response time in this application refers to the time the system takes to respond to a request; it generally consists of presentation time, data transmission time and system processing time, while the response time in current performance testing generally counts only the data transmission time and the system processing time.
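Written as a formula, with notation that is ours rather than the application's, this decomposition is $T_{\mathrm{response}} = T_{\mathrm{presentation}} + T_{\mathrm{transmission}} + T_{\mathrm{processing}}$; a conventional message-level test measures only the last two terms, while the method of this application covers all three.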
UI (User Interface) automated testing refers to using a tool or script to operate the front-end interface of the software under test under preset conditions and with known test data, obtaining the data displayed on the front-end page of the system or application for verification, and evaluating the outcome to reach a test conclusion.
The Selenium Grid framework is a distributed front-end automation test framework that can drive different browsers on different machines at the same time; it comprises a hub node and node nodes. After starting, each node registers with the hub node; the hub node records and tracks the state of every node and, after receiving a test script to be executed, sends the relevant test instructions to the node nodes for execution (a minimal connection sketch is given after this list of terms).
A container can be understood as a lightweight set of application runtime environments containing the files and components that make up an application; it is essentially a set of processes that are resource-constrained and isolated from one another, so a container can also be understood as an instance.
An image is a special file system that provides the programs, libraries, data and configuration files needed for a container to run; the content of an image is not modified after the image is built.
An image repository is a type of repository (or collection of repositories) whose main functions are image storage, image management and image distribution. Each repository may contain multiple images, distinguished by tags. An image repository makes it convenient to share images among multiple running environments, so the same running environment can be quickly reproduced through containers to run the application, avoiding abnormal or inconsistent behavior of the application caused by differing runtime environments.
Container orchestration refers to completing the definition, configuration, creation, deletion and similar management of a set of containers and associated resources through certain tools or configuration; it can be understood as the tools that define how containers are organized and managed.
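To make the hub/node relationship above concrete, the following is a minimal sketch of a script attaching to a Selenium Grid hub with Python and Selenium 4; the hub address and the target URL are illustrative assumptions, not part of this application:

```python
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

# Ask the hub for a Chrome session; the hub forwards it to a free node.
options = Options()
driver = webdriver.Remote(
    command_executor="http://grid-hub:4444/wd/hub",  # assumed hub address
    options=options,
)
driver.get("https://app.example.com")  # assumed system under test
print(driver.title)
driver.quit()
```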
Example 1
According to the first aspect of the present application, and referring to the flow diagrams shown in FIG. 1 and FIG. 2, an embodiment of the present application proposes a high-simulation performance testing method based on user behavior, including steps S1 to S5:
S1, building a basic environment based on container technology, and deploying presses and container resources.
Step S1 realizes the container-based construction of the basic environment. Containers, the image registry, container orchestration and related technologies provide the underlying support for the user-behavior simulation performance test; for example, the image registry can provide image resources for multiple browser versions, and container orchestration manages the container resources stably and in an orderly way. Before the performance test is carried out, testers deploy press and container resources according to the available press resources, the test scale and other information, and configure a certain number of container nodes, providing stable environment-resource support for running under high-concurrency scenarios. A deployment sketch in this spirit follows.
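As an illustration only, the following sketch deploys a Selenium Grid hub and a small pool of browser-node containers with the Docker SDK for Python; the image tags, network name, port and pool size are assumptions, not values from this application:

```python
import docker

client = docker.from_env()
client.networks.create("grid-net", driver="bridge")  # assumed network name

# Hub container that coordinates the test nodes.
client.containers.run(
    "selenium/hub:4.15",            # assumed image tag
    name="grid-hub", detach=True, network="grid-net",
    ports={"4444/tcp": 4444},
)

# A pool of Chrome node containers that register with the hub.
for i in range(4):                  # assumed pool size
    client.containers.run(
        "selenium/node-chrome:4.15",
        name=f"grid-node-{i}", detach=True, network="grid-net",
        environment={
            "SE_EVENT_BUS_HOST": "grid-hub",
            "SE_EVENT_BUS_PUBLISH_PORT": "4442",
            "SE_EVENT_BUS_SUBSCRIBE_PORT": "4443",
        },
    )
```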
S2, writing and debugging performance test scripts that simulate user behavior, based on the UI automation test system.
Step S2 specifically includes:
S21, writing a performance test script according to the business process and configuring script parameters, storing the configured script-parameter data in a parameter file library, and adding transactions, correlated values and response assertions for the key requests of the script.
The script parameters to be configured include the parameter type, the parameter value range, the value-taking mode, and the like. The configuration data of the script parameters is stored in a parameter file library to be called when the script is executed.
The transactions added for the key requests of the script include a login mark, a start mark, an end mark, and the like, and the added response assertions cover data such as the expected test results; their purpose is to facilitate summary analysis of the test data.
S22, setting the run-control parameters of each test script.
This step serves the debugging of a single script. The run-control parameters to be set for each test script include the iteration count, the iteration interval, the log level, and the like. When a single script is debugged, one can check whether the script-parameter configuration rules of the previous step are reasonable, and whether run-control parameters such as its iteration count and interval duration are consistent with the business facts.
S23, debugging the script in playback mode to verify the correctness of the test script.
The page and the log of the execution process and result of the single-script test are observed in playback mode; if the script passes debugging, the following steps are executed, otherwise execution returns to step S21.
It can be seen that step S2 mainly realizes user-behavior simulation by writing UI automation test scripts. The scripts can be written with Selenium script-development techniques. After a UI automation test script has been written according to the business process, turning some values in the script into variables and supplying data to each script from the parameter file library achieves differentiated user operations and balanced pressure on the back-end server under concurrent-scene testing. Script run control is used to debug a single test script, checking whether the script runs normally, whether the parameter-configuration rules are reasonable, and whether the iteration count, interval duration and the like meet the business test requirements. A parameterized script sketch in this spirit follows.
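As a sketch of such a parameterized script, under assumptions of our own (the page URL, element locators, parameter-file name and iteration settings are illustrative, not taken from the application):

```python
import csv
import time

from selenium import webdriver
from selenium.webdriver.common.by import By

def login_transaction(driver, username, password):
    """One timed 'login' transaction with a response assertion."""
    driver.get("https://app.example.com/login")        # assumed page
    start = time.monotonic()                            # transaction start mark
    driver.find_element(By.ID, "username").send_keys(username)
    driver.find_element(By.ID, "password").send_keys(password)
    driver.find_element(By.ID, "submit").click()
    banner = driver.find_element(By.ID, "welcome").text
    elapsed = time.monotonic() - start                  # transaction end mark
    assert "Welcome" in banner, f"login failed for {username}"  # response assertion
    return elapsed

# Each virtual user draws its own row from the parameter file library (S21),
# so concurrent users operate on different data.
with open("login_params.csv", newline="") as f:         # assumed parameter file
    rows = list(csv.DictReader(f))

driver = webdriver.Chrome()
for row in rows[:3]:                                    # assumed iteration count (S22)
    print("login transaction:", login_transaction(driver, row["user"], row["pwd"]), "s")
    time.sleep(1)                                       # assumed iteration interval
driver.quit()
```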
FIG. 3 compares the conventional message-simulation process with the simulation performance test process of an embodiment of the present application. Referring to FIG. 3, the execution of a UI automation performance test script written according to an embodiment of the present application includes, besides sending requests to the server and receiving responses (processing nodes C, D and E in FIG. 3), the user's click interactions on the front-end page and the browser's resource parsing and rendering (processing nodes A, B and F in FIG. 3), so that a simulation performance test based on real user operations can be realized.
FIG. 4 compares the results of conventional message simulation with those of the simulation performance test of an embodiment of the present application. Referring to the transaction response-time data shown in FIG. 4, in the first row, for example, the response time of the interface message is 0.939 seconds, user experience 1 is 3.1 seconds, user experience 2 is 3 seconds, and the response time measured by the simulation performance test is 3.269 seconds. The conventional message-simulation result thus differs considerably from the user's actual experience, while the simulation performance test result of the embodiment is very close to it.
S3, establishing a test scene according to the test requirements, and configuring concurrent-scene parameters.
Step S3 specifically includes:
S31, establishing a test scene according to the test requirements, adding one or more test scripts to the test scene, and configuring a transaction ratio for each test script.
For example, in a single-transaction test scene the transaction ratio is 100%; a mixed test scene contains multiple transaction test scripts, and a reasonable transaction ratio is configured for each test script in view of the target performance indexes of all the transactions.
S32, configuring the concurrent-scene parameters of each test script to realize control of the test scene.
The concurrent-scene parameters to be configured fall into three categories: the first category controls the overall scene run, for example the number of concurrent users, the start-stop mode, the browser type and the run duration; the second category supports troubleshooting during the scene run, for example the log level, screenshots on exception and retries on error; the third category standardizes the management of the scene and its run-result data, for example the scene name and the result storage path; a configuration sketch in this spirit is given below.
In step S3, configuring the concurrent-scene parameters realizes control of the test scene according to the test requirements. Based on the test-index requirements of the transactions and the current state of the presses and container resources, the script execution system can scale a script from a single user to multiple users, thereby simulating a large number of user operations, flexibly proportioning the numbers of concurrent users, setting the scene run mode as needed, and carrying out load tests.
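A sketch of the three parameter categories gathered into one scene-configuration object; the field names and defaults are illustrative assumptions, not the application's own schema:

```python
from dataclasses import dataclass, field

@dataclass
class SceneConfig:
    # Category 1: overall run control
    concurrent_users: int = 50
    start_stop_mode: str = "step"      # e.g. ramp users up in steps
    browser: str = "chrome"
    duration_s: int = 600
    # Category 2: troubleshooting support
    log_level: str = "INFO"
    screenshot_on_error: bool = True
    error_retries: int = 1
    # Category 3: scene and result management
    scene_name: str = "login-mix"
    result_path: str = "/data/results/login-mix"
    # Script name -> transaction ratio; ratios should sum to 1.0 (S31).
    transaction_mix: dict = field(
        default_factory=lambda: {"login": 0.6, "query": 0.4}
    )
```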
S4, allocating container nodes that meet the test requirements to the test scene, according to the configured concurrent-scene parameters and the current usage of the presses and container resources.
Step S4 specifically includes:
S41, obtaining the number of concurrent users from the configured concurrent-scene parameters, obtaining the current usage of the presses and container resources from the built basic environment, and automatically allocating idle container nodes to the test scene according to the obtained number of concurrent users and the current resource usage;
S42, judging whether the allocated press and container resources meet the test requirement, i.e. whether the total number of allocated container nodes is greater than or equal to the total number of required concurrent users. If so, the following steps are executed; if not, execution returns to step S32 to adjust the number of concurrent users in the concurrent-scene parameter configuration, until the number of automatically allocated container nodes meets the test requirement.
Step S4 automatically allocates press and container resources to the test scene according to the scene's number of concurrent users and the current usage of the presses and container resources. The system obtains the deployment and usage of the container nodes from the basic-environment configuration according to the requirements of the current test scene, distributes the concurrent pressure evenly across idle container nodes, and updates the usage of the press and container resources, realizing distribution control as a whole. A sketch of this allocation check follows.
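A minimal sketch of the S41/S42 allocation check, under an assumed data shape (a mapping from press name to its idle node IDs):

```python
def allocate_nodes(concurrent_users, free_nodes_per_press):
    """Assign idle nodes evenly across presses until the target is covered."""
    pools = {p: list(nodes) for p, nodes in free_nodes_per_press.items()}
    allocation, remaining = {}, concurrent_users
    # Round-robin over presses so the pressure is balanced across machines.
    while remaining > 0 and any(pools.values()):
        for press, nodes in pools.items():
            if remaining == 0:
                break
            if nodes:
                allocation.setdefault(press, []).append(nodes.pop())
                remaining -= 1
    if remaining > 0:  # S42: not enough nodes, so concurrency must be lowered
        raise RuntimeError(f"short by {remaining} nodes; adjust concurrent users")
    return allocation

print(allocate_nodes(5, {"press-1": ["n1", "n2", "n3"],
                         "press-2": ["n4", "n5", "n6"]}))
# {'press-1': ['n3', 'n2', 'n1'], 'press-2': ['n6', 'n5']}
```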
S5, distributing the performance test script to each allocated container node for execution, realizing a concurrent performance test.
Step S5 is mainly responsible for executing the concurrent scene, and specifically includes: using the Selenium Grid automation test framework, configuring one Selenium Hub node and a plurality of Selenium Node nodes on each press, where each Node registers with the Hub after starting and the Hub and the Nodes all run in different container nodes; and distributing the test script through the Hub to each Node for execution, thereby realizing concurrent performance tests under different browser types and versions. A sketch of such a concurrent run follows.
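A sketch of fanning one script out to the Grid as concurrent virtual users with mixed browser types; the hub address, user count and per-user task are assumptions layered on the Selenium Grid API:

```python
from concurrent.futures import ThreadPoolExecutor

from selenium import webdriver
from selenium.webdriver.chrome.options import Options as ChromeOptions
from selenium.webdriver.firefox.options import Options as FirefoxOptions

HUB = "http://grid-hub:4444/wd/hub"         # assumed hub address
BROWSERS = [ChromeOptions, FirefoxOptions]  # vary browser type per user

def virtual_user(user_id):
    options = BROWSERS[user_id % len(BROWSERS)]()
    driver = webdriver.Remote(command_executor=HUB, options=options)
    try:
        driver.get("https://app.example.com")  # assumed system under test
        return user_id, driver.title
    finally:
        driver.quit()

# Ten concurrent virtual users; the hub routes each session to a free node.
with ThreadPoolExecutor(max_workers=10) as pool:
    for uid, title in pool.map(virtual_user, range(10)):
        print(f"user {uid}: {title}")
```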
In summary, in the high-simulation performance test method based on user behavior of the embodiments of the application, a basic environment is built on container technology and presses and container resources are deployed, and performance test scripts that simulate user behavior are written and debugged on a UI automation test system; a test scene is then established according to the test requirements and concurrent-scene parameters are configured, realizing control of the test scene; container nodes that meet the test requirements are then allocated to the test scene according to the configured concurrent-scene parameters and the current usage of the presses and container resources, realizing distribution control as a whole; and the performance test script is finally distributed to each allocated container node for execution, realizing a concurrent performance test.
The scheme of the embodiments of the application thus breaks through the traditional performance-test approach: it upgrades the traditional UI automation test framework with performance-test script running and debugging, concurrency control of pressure-test scenes, container-node pressure distribution and other functions, and establishes a user-simulation performance test system based on container technology and a UI automation test system that supports performance testing while remaining usable for automation testing. In addition, the scheme simulates real user operations through UI automation, adding front-end user-behavior simulation and abandoning the traditional message-simulation pressure-testing mode; it not only fits the way users actually operate, but also improves on the index-collection approach of traditional performance-test tools, coming closer to the user's real experience and improving the authenticity and accuracy of the performance-test results. For example, the response-time index becomes the time of the whole process from the user issuing a request in the client browser to receiving the response. The scheme also expands the scope of traditional performance testing from a focus on back-end service performance to performance testing of the complete transaction link, so that services are optimized by analyzing the overall performance, achieving an overall performance improvement; and by supporting multiple browser types and versions while simulating user behavior, it realizes differentiated simulation of client browsers and can also be used for browser-compatibility testing. Therefore, adopting the scheme of the embodiments of the application can effectively reduce hardware-resource and labor costs while improving the fidelity of the performance test.
According to some embodiments of the present application, referring to FIG. 2, the method of the embodiments of the present application further includes a test-result processing step, which specifically includes:
S61, during script execution, collecting test-index information to a front-end display component for real-time display, collecting screenshots of exceptions to a file server, and collecting other scene-run information to a database. The collected test-index information includes TPS, response time, throughput, transaction success rate, and the like. The other collected scene-run information includes the script run time, the number of concurrent users, browser classification statistics, and the like.
S62, after script execution ends, automatically summarizing and analyzing the test results to help testers find performance problems in time.
S63, judging whether the test results meet expectations; if so, proceeding to step S64, otherwise returning to step S3 and reconfiguring the concurrent-scene parameters;
S64, after the scene test ends, collecting and organizing the test results.
In the embodiments of the application, the execution results of each script are collected and summarized through the database, the file server, the front-end display component, and the like, providing result displays of the test indexes, resource usage, exception screenshots, scene-run information, etc., and realizing real-time analysis and monitoring of the test indexes. The result-analysis function can be customized to fit actual testing, enriching the test-result content; for example, adding page screenshots taken under exceptional conditions makes the results more intuitive and problems easier to find. A sketch of such a summary computation follows.
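A sketch of the S61/S62 reduction of per-transaction samples into summary indexes; the sample record layout is an assumption for illustration:

```python
from statistics import mean

def summarize(samples, window_s):
    """samples: list of (elapsed_seconds, ok_flag) per completed transaction."""
    times = [t for t, _ in samples]
    return {
        "tps": len(samples) / window_s,               # throughput
        "avg_response_s": round(mean(times), 3),      # response time
        "success_rate": sum(ok for _, ok in samples) / len(samples),
    }

print(summarize([(3.1, True), (3.3, True), (2.9, False)], window_s=60))
# {'tps': 0.05, 'avg_response_s': 3.1, 'success_rate': 0.6666666666666666}
```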
Example 2
According to the second aspect of the present application, as shown in FIG. 5, an embodiment of the present application further provides a high-simulation performance test system based on user behavior, including:
a basic environment module 51 for building a basic environment based on container technology and deploying presses and container resources;
a script management module 52 for writing and debugging performance test scripts that simulate user behavior, based on a UI automation test system;
a scene management module 53 for establishing a test scene according to the test requirements and configuring concurrent-scene parameters;
a distribution control module 54 for allocating container nodes that meet the test requirements to the test scene, according to the configured concurrent-scene parameters and the current usage of the presses and container resources;
and a script execution module 55 for distributing the performance test scripts to each allocated container node for execution, realizing a concurrent performance test.
According to some embodiments of the present application, the script management module 52 is specifically configured to: write a performance test script according to the business process and configure script parameters, store the configured script-parameter data in a parameter file library, and add transactions, correlated values and response assertions for the key requests of the script; set the run-control parameters of each test script; and debug the script in playback mode to verify the correctness of the test script.
According to some embodiments of the present application, the scene management module 53 is specifically configured to: establish a test scene according to the test requirements, add one or more test scripts to the test scene, and configure a transaction ratio for each test script; and configure the concurrent-scene parameters of each test script to realize control of the test scene.
According to some embodiments of the present application, the distribution control module 54 is specifically configured to: obtain the number of concurrent users from the configured concurrent-scene parameters, obtain the current usage of the presses and container resources from the built basic environment, and automatically allocate idle container nodes to the test scene according to the obtained number of concurrent users and the current resource usage; and judge whether the total number of allocated container nodes is greater than or equal to the total number of required concurrent users, where if so the test requirement is met, and if not the number of concurrent users in the concurrent-scene parameter configuration is adjusted until the number of automatically allocated container nodes meets the test requirement.
According to some embodiments of the present application, the script execution module 55 is specifically configured to: using the Selenium Grid automation test framework, configure one Selenium Hub node and a plurality of Selenium Node nodes on each press, where each Node registers with the Hub after starting and the Hub and the Nodes all run in different container nodes; and distribute the test script through the Hub to each Node for execution.
Still referring to FIG. 5, according to some embodiments of the present application, the system of the embodiments of the present application further includes a result processing module 56. The result processing module 56 is specifically configured to: during script execution, collect test-index information to a front-end display component for real-time display, collect screenshots of exceptions to a file server, and collect other scene-run information to a database; after script execution ends, automatically summarize and analyze the test results; if the test results meet expectations, end the scene test and collect and organize the test results; otherwise, return to the scene management module 53 to reconfigure the concurrent-scene parameters.
It can be understood that the high-simulation performance test system based on user behavior shown in FIG. 5 can implement the steps of the method of the foregoing Example 1, and the relevant explanations of the method of Example 1 apply to the modules of the system shown in FIG. 5, so they are not repeated here. It should be noted that in some cases the names of the units or modules described in the embodiments of the present application do not limit those units or modules. Moreover, the functional modules of the system shown in FIG. 5 are defined mainly by the steps of the method flow and do not necessarily correspond to physical entity modules.
Finally, it should be noted that:
the terms "comprises" and "comprising," along with any variations thereof, in the description and claims of the present application are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements that are expressly listed or inherent to such process, method, article, or apparatus.
The embodiment numbers are merely for the purpose of description and do not represent the advantages or disadvantages of the embodiments. In the foregoing embodiments of the present application, the descriptions of the embodiments are emphasized, and for a portion of this disclosure that is not described in detail in this embodiment, reference is made to the related descriptions of other embodiments. Embodiments of the present application may be implemented in hardware, software, firmware, or a combination thereof.
In the several embodiments provided in the present application, it should be understood that the disclosed technology content may be implemented in other manners. The system embodiments described above are merely exemplary, and for example, the division of the units may be a logic function division, and there may be another division manner when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with respect to each other may be an indirect coupling or communication connection via some interfaces, units or modules, which may be in electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of methods, systems and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units. Moreover, the integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be embodied in essence or a part contributing to the prior art or all or part of the technical solution in the form of a software product stored in a storage medium, including several instructions to cause a computer device (which may be a personal computer, a server or a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application.

Claims (10)

1. A high-simulation performance test method based on user behavior, characterized by comprising the following steps:
S1, building a basic environment based on container technology, and deploying presses and container resources;
S2, writing and debugging performance test scripts that simulate user behavior, based on a UI automation test system;
S3, establishing a test scene according to the test requirements, and configuring concurrent-scene parameters;
S4, allocating container nodes that meet the test requirements to the test scene, according to the configured concurrent-scene parameters and the current usage of the presses and container resources;
S5, distributing the performance test script to each allocated container node for execution, realizing a concurrent performance test.
2. The method according to claim 1, wherein step S2 specifically comprises:
S21, writing a performance test script according to the business process and configuring script parameters, storing the configured script-parameter data in a parameter file library, and adding transactions, correlated values and response assertions for the key requests of the script;
S22, setting the run-control parameters of each test script;
S23, debugging the script in playback mode to verify the correctness of the test script.
3. The method according to claim 1, wherein step S3 specifically comprises:
S31, establishing a test scene according to the test requirements, adding one or more test scripts to the test scene, and configuring a transaction ratio for each test script;
S32, configuring the concurrent-scene parameters of each test script to realize control of the test scene.
4. The method according to claim 1, wherein step S4 specifically comprises:
S41, obtaining the number of concurrent users from the configured concurrent-scene parameters, obtaining the current usage of the presses and container resources from the built basic environment, and automatically allocating idle container nodes to the test scene according to the obtained number of concurrent users and the current resource usage;
S42, judging whether the total number of allocated container nodes is greater than or equal to the total number of required concurrent users, where if so the test requirement is met, and if not the number of concurrent users in the concurrent-scene parameter configuration is adjusted until the number of automatically allocated container nodes meets the test requirement.
5. The method according to claim 1, wherein step S5 specifically comprises:
using the Selenium Grid automation test framework, configuring one Selenium Hub node and a plurality of Selenium Node nodes on each press, where each Node registers with the Hub after starting and the Hub and the Nodes all run in different container nodes;
distributing the test script through the Hub to each Node for execution.
6. The method according to any one of claims 1 to 5, further comprising:
S61, during script execution, collecting test-index information to a front-end display component for real-time display, collecting screenshots of exceptions to a file server, and collecting other scene-run information to a database;
S62, after script execution ends, automatically summarizing and analyzing the test results;
S63, judging whether the test results meet expectations; if so, proceeding to step S64, otherwise returning to step S3 and reconfiguring the concurrent-scene parameters;
S64, after the scene test ends, collecting and organizing the test results.
7. A high-simulation performance test system based on user behavior, characterized by comprising:
a basic environment module for building a basic environment based on container technology and deploying presses and container resources;
a script management module for writing and debugging performance test scripts that simulate user behavior, based on a UI automation test system;
a scene management module for establishing a test scene according to the test requirements and configuring concurrent-scene parameters;
a distribution control module for allocating container nodes that meet the test requirements to the test scene, according to the configured concurrent-scene parameters and the current usage of the presses and container resources;
and a script execution module for distributing the performance test scripts to each allocated container node for execution, realizing a concurrent performance test.
8. The system of claim 7, wherein the script management module is specifically configured to: write a performance test script according to the business process and configure script parameters, store the configured script-parameter data in a parameter file library, and add transactions, correlated values and response assertions for the key requests of the script; set the run-control parameters of each test script; and debug the script in playback mode to verify the correctness of the test script.
9. The system of claim 7, wherein the distribution control module is specifically configured to: obtain the number of concurrent users from the configured concurrent-scene parameters, obtain the current usage of the presses and container resources from the built basic environment, and automatically allocate idle container nodes to the test scene according to the obtained number of concurrent users and the current resource usage; and judge whether the total number of allocated container nodes is greater than or equal to the total number of required concurrent users, where if so the test requirement is met, and if not the number of concurrent users in the concurrent-scene parameter configuration is adjusted until the number of automatically allocated container nodes meets the test requirement.
10. The system of any one of claims 7-9, further comprising a result processing module, wherein the result processing module is specifically configured to: during script execution, collect test-index information to a front-end display component for real-time display, collect screenshots of exceptions to a file server, and collect other scene-run information to a database; after script execution ends, automatically summarize and analyze the test results; if the test results meet expectations, end the scene test and collect and organize the test results; otherwise, return to the scene management module to reconfigure the concurrent-scene parameters.
Priority and Publication Data

Application number: CN202311489741.6A
Priority date / filing date: 2023-11-09
Publication number: CN117539755A
Publication date: 2024-02-09
Family ID: 89793081
Country: China (CN)
Status: Pending


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination