CN117056218A - Test management method, platform, medium and equipment - Google Patents

Test management method, platform, medium and equipment

Info

Publication number
CN117056218A
Authority
CN
China
Prior art keywords
test
performance
banking system
script
performance test
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311023489.XA
Other languages
Chinese (zh)
Inventor
赵江
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Bank Co Ltd
Original Assignee
Ping An Bank Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Bank Co Ltd filed Critical Ping An Bank Co Ltd
Priority to CN202311023489.XA
Publication of CN117056218A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/36 Preventing errors by testing or debugging software
    • G06F11/3668 Software testing
    • G06F11/3672 Test management
    • G06F11/3684 Test management for test design, e.g. generating new test cases
    • G06F11/3688 Test management for test execution, e.g. scheduling of test suites
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES
    • G06Q40/00 Finance; Insurance; Tax strategies; Processing of corporate or income taxes
    • G06Q40/02 Banking, e.g. interest calculation or account maintenance

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Accounting & Taxation (AREA)
  • General Engineering & Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Computer Hardware Design (AREA)
  • Finance (AREA)
  • Development Economics (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Strategic Management (AREA)
  • Technology Law (AREA)
  • General Business, Economics & Management (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The application discloses a test management method, platform, medium and device, comprising the following steps: performing test preparation covering demand analysis, performance targets, application software, deployment platform, load model and project plan for a banking system; after test preparation is completed, building a test environment for the banking system; writing performance test scripts and importing them into a test management platform; and selecting a performance test script matched to the target banking system from the test management platform and performing a performance test on the target banking system using test data generated by a performance test tool. The application can incubate an integrated, pluggable platform system for performance testing, improving tester efficiency, saving cost for enterprises and creating more value.

Description

Test management method, platform, medium and equipment
Technical Field
The present application relates to the field of performance testing technologies, and in particular, to a test management method, a test management platform, a test management medium, and a test management device.
Background
Testers must know how to write performance test scripts. This skill is not easy for novices to learn, especially producing applicable scripts quickly for different test scenarios. To create a script that fits a scenario, many scattered and incomplete knowledge points must be searched for on the network, which cannot fully guarantee the correctness and applicability of the script.
Disclosure of Invention
Based on this, it is necessary to provide a test management method, platform, medium and device to solve the problem that the correctness and applicability of the script cannot be completely ensured.
A test management method, the method comprising:
performing test preparation of demand analysis, performance targets, application software, deployment platforms, load models and project plans on a banking system;
after test preparation is completed, a test environment of a banking system is built;
writing a performance test script and importing the performance test script into a test management platform;
and selecting a performance test script matched with the target banking system from the test management platform, and performing performance test on the target banking system by using test data generated by a performance test tool.
In one embodiment, the test preparation for demand analysis, performance objective, application software, deployment platform, load model and project plan of the banking system includes:
determining the test requirement of a bank user;
assessing the importance and frequency of the bank-card core business processes, and determining the number of concurrent users, the transaction throughput requirement, the response time requirement, the system resource-occupancy requirement and the scalability requirement of the banking system;
determining the architecture of a banking system, the development language adopted, the communication protocol and the verification mode;
determining a physical deployment mode, a hardware architecture, an operating system, a database, middleware and a software deployment mode of a banking system;
determining the service type, system load and data source of a banking system;
and (5) making a test project plan of the banking system.
In one embodiment, the building a test environment of a banking system includes:
selecting a test server according to the number of concurrent users, the transaction throughput rate and the response time requirement of the banking system;
installing and configuring a database server, creating a database and a table structure, and inserting test data;
connecting a banking system and a performance testing tool to the same local area network or the internet to simulate network communication in a real environment;
a load balancer is configured in a test environment.
In one embodiment, the writing of the performance test script and importing it into the test management platform includes:
determining a communication protocol adopted by a banking system, and selecting a performance test tool compatible with the communication protocol;
according to the business flow and performance targets of the banking system, using the performance test tool to write performance test scripts that parameterize user input, correlate data, and add transactions and checkpoints;
after the performance test script is written, checking for errors or logic problems in the script and repairing them;
writing a verification script to verify the performance index of the bank system;
after the performance index passes the verification, importing the written performance test script according to the importing requirement of the used test management platform.
In one embodiment, the selecting a performance test script adapted to a target banking system from a test management platform, and performing a performance test on the target banking system using test data generated by a performance test tool includes:
checking whether the processing capacity and resources of the load-generating machine are sufficient;
setting the current pressure level, user behavior and concurrent user number to be simulated by a target banking system, and selecting an adapted performance test script;
and randomly selecting key functions of the bank system, generating test data required by the key functions by using a performance test tool, performing stability test on the key functions by using the performance test script and the test data, monitoring the running conditions of the bank system, the network and the database, and checking whether memory leakage exists.
In one embodiment, the method further comprises:
and analyzing the performance index, the response time and the error rate, evaluating the performance of the bank system according to the obtained analysis result, and formulating performance optimization measures related to codes, hardware and configuration resources.
In one embodiment, the method further comprises:
setting up a regular tracking plan, collecting response time, throughput and error rate of a system as performance tracking indexes, and carrying out trend analysis on the performance tracking indexes to determine the change trend of the performance;
the change trend of the performance is fed back to a development team or a designated responsible person.
A test management platform, the test management platform comprising:
a preparation module, used for performing test preparation covering demand analysis, performance targets, application software, deployment platform, load model and project plan for a banking system;
the environment construction module is used for constructing a test environment of the bank system after the test preparation is completed;
the writing module is used for writing performance test scripts and importing the performance test scripts into the test management platform;
and the test module is used for selecting a performance test script matched with the target banking system from the test management platform and performing performance test on the target banking system by using test data generated by the performance test tool.
A computer readable storage medium storing a computer program which, when executed by a processor, causes the processor to perform the steps of the test management method described above.
A test management apparatus comprising a memory and a processor, the memory storing a computer program which, when executed by the processor, causes the processor to perform the steps of the test management method described above.
The application provides a test management method, platform, medium and device, which perform test preparation covering demand analysis, performance targets, application software, deployment platform, load model and project plan for a banking system; after test preparation is completed, build a test environment for the banking system; write performance test scripts and import them into a test management platform; and select a performance test script matched to the target banking system from the test management platform and perform a performance test on the target banking system using test data generated by a performance test tool. The application can incubate an integrated, pluggable platform system for performance testing, improving tester efficiency, saving cost for enterprises and creating more value.
Drawings
In order to more clearly illustrate the embodiments of the application or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Wherein:
FIG. 1 is a flow chart of a test management method;
FIG. 2 is a schematic diagram of a test management platform;
fig. 3 is a block diagram of the structure of the test management apparatus.
Detailed Description
The following description of the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The terms first, second and the like in the description and in the claims and in the above-described figures are used for distinguishing between different objects and not necessarily for describing a sequential or chronological order. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed steps or elements but may include other steps or elements not listed or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
As shown in fig. 1, fig. 1 is a flow chart of a test management method in an embodiment, where the test management method in the embodiment includes the following steps:
s101, performing test preparation of demand analysis, performance targets, application software, deployment platforms, load models and project plans on a banking system.
In one particular embodiment, the test preparation includes:
(1) Determining the test requirements of the bank users.
This step is to communicate with the bank users to learn their expectations and requirements for the system's functions, performance, security and availability, as well as their degree of participation in the test and their acceptance criteria. For example, a bank user may require that the system support multiple service types, payment modes and account types; that it run stably under high concurrency and high pressure; that it prevent risks such as data loss, tampering and leakage; that it respond to user requests in a timely manner; and that it adapt to business changes.
(2) Assessing the importance and frequency of the bank-card core business processes, and determining the number of concurrent users, the transaction throughput requirement, the response time requirement, the system resource-occupancy requirement and the scalability requirement of the banking system.
This step prioritizes each functional module or business process of the system according to the business characteristics and scenarios of the bank users, so that a reasonable test strategy can be formulated and appropriate test resources allocated. Meanwhile, according to the business volume and expected results, the performance indexes of the system are quantified and normalized to enable effective performance testing and evaluation. For example, the bank-card core business processes may include account opening, deposit and withdrawal, transfer, consumption and repayment. Account opening and deposit/withdrawal may be the most important and most frequent processes, so their functional correctness and performance stability deserve particular attention, while consumption and repayment may be less important and less frequent, so their test priority and coverage can be reduced accordingly. At the same time, according to the actual or predicted situation of the bank users, the number of concurrent users the system must support (e.g. 1000), the transaction throughput (e.g. 100 transactions per second), the response time (e.g. no more than 3 seconds), the system resource occupancy (e.g. CPU utilization no more than 80%) and the scalability (e.g. support for newly added service types or payment modes) are determined.
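The quantified targets above can be expressed as a checkable structure. The sketch below is illustrative only and not part of the patent; the threshold numbers are the example values given in the text, and the function and field names are hypothetical:

```python
# Quantified performance targets from the requirement analysis, expressed
# as thresholds a measured test run can be checked against.
PERFORMANCE_TARGETS = {
    "concurrent_users": 1000,        # users the system must support
    "throughput_tps": 100,           # transactions per second
    "max_response_time_s": 3.0,      # seconds
    "max_cpu_utilization": 0.80,     # 80% CPU ceiling
}

def check_measurements(measured: dict) -> list[str]:
    """Return a list of target violations for a measured test run."""
    violations = []
    if measured["concurrent_users"] < PERFORMANCE_TARGETS["concurrent_users"]:
        violations.append("concurrent users below target")
    if measured["throughput_tps"] < PERFORMANCE_TARGETS["throughput_tps"]:
        violations.append("throughput below target")
    if measured["response_time_s"] > PERFORMANCE_TARGETS["max_response_time_s"]:
        violations.append("response time above target")
    if measured["cpu_utilization"] > PERFORMANCE_TARGETS["max_cpu_utilization"]:
        violations.append("CPU utilization above target")
    return violations
```

An empty result means the run met every quantified target.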
(3) Determining the architecture of the banking system, the development language adopted, the communication protocol and the verification mode.
This step refers to knowing the overall structure and components of the system, and the relationships and dependencies between the components, based on the design documents of the banking system or the introduction of the developer. At the same time, the development language (e.g., java), communication protocol (e.g., HTTP) and verification mode (e.g., digital signature) adopted by the system are known, so that the proper testing tool and method can be selected. For example, banking systems may employ a hierarchical architecture including front-end, back-end, database, etc., where the front-end is responsible for interacting with users, the back-end is responsible for processing business logic, and the database is responsible for storing data. Meanwhile, the system may be developed in Java language, communicate using HTTP protocol, and verify using digital signature.
(4) Determining the physical deployment mode, hardware architecture, operating system, database, middleware and software deployment mode of the banking system.
This step refers to knowing the physical deployment mode (such as stand-alone deployment or distributed deployment), hardware architecture (such as CPU, memory, disk, etc.), operating system (such as Windows or Linux), database (such as Oracle or MySQL), middleware (such as WebLogic or Tomcat), and software deployment mode (such as single instance or multiple instances) of the system according to the running environment and configuration requirements of the banking system. The information plays an important role in the construction and maintenance of a test environment and the analysis and evaluation of test results. For example, a banking system may employ a distributed deployment approach, where multiple servers are used to form a cluster, each configured with a 4-core CPU, 8G memory, 500G disk, etc. Meanwhile, the system may run on a Linux operating system, store data using an Oracle database, use WebLogic as middleware, and deploy software using a multi-instance mode.
(5) Determining the service types, system load and data sources of the banking system.
This step determines the service types the system must support (such as deposit, transfer and consumption), the system load (such as the number of concurrent users and the transaction throughput) and the data sources (such as test data or production data) according to the business requirements and scenarios of the bank users. This information is important for the design and execution of test cases and for the preparation and management of test data. For example, the banking system may need to support deposit, transfer and consumption services, where the deposit service must simulate 1000 concurrent users executing 100 transactions per second, the transfer service 500 concurrent users executing 50 transactions per second, and the consumption service 100 concurrent users executing 10 transactions per second. Meanwhile, the system may need functional testing with test data and performance testing with production data.
(6) Formulating a test project plan for the banking system.
The step is to make a detailed test project plan according to the analysis result, wherein the detailed test project plan comprises contents such as test targets, test ranges, test strategies, test methods, test standards, test resources, test progress, test risks and the like. The test project plan is an important document for guiding the whole test process, and needs to be communicated and confirmed with bank users and developers, and is adjusted and updated according to actual conditions.
S102, after test preparation is completed, a test environment of the bank system is built.
In a specific embodiment, the method for building the test environment of the banking system is as follows:
(1) Selecting a test server according to the number of concurrent users, the transaction throughput and the response time requirements of the banking system.
The method is to select proper hardware equipment and network equipment as test servers, such as CPU, memory, disk, network bandwidth and the like of the servers according to the concurrent user number, the transaction throughput rate and the response time requirement of the banking system. The purpose of selecting a test server is to simulate the actual user load and business pressure, and to evaluate the performance and bottlenecks of the banking system.
(2) Installing and configuring a database server, creating the database and table structure, and inserting test data.
Database software that is the same as or similar to that of the banking system, such as Oracle or MySQL, is installed and configured on a test server; the database and table structure are created and test data is inserted. The purpose of installing and configuring the database server is to provide the data support the banking system requires and to simulate a real data environment.
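The create-schema-and-seed step can be sketched as follows. The patent names Oracle and MySQL; Python's built-in sqlite3 is used here purely as a self-contained stand-in, and the table and column names are hypothetical:

```python
import sqlite3

def build_test_database(path=":memory:"):
    """Create a minimal banking schema and seed deterministic test data."""
    conn = sqlite3.connect(path)
    conn.execute("""
        CREATE TABLE accounts (
            account_no TEXT PRIMARY KEY,
            balance    REAL NOT NULL
        )""")
    conn.execute("""
        CREATE TABLE transactions (
            txn_id     INTEGER PRIMARY KEY AUTOINCREMENT,
            account_no TEXT REFERENCES accounts(account_no),
            amount     REAL NOT NULL
        )""")
    # Seed 100 accounts with a fixed opening balance as test data.
    conn.executemany(
        "INSERT INTO accounts VALUES (?, ?)",
        [(f"62250000{i:04d}", 1000.0) for i in range(100)],
    )
    conn.commit()
    return conn
```

The same pattern (connect, create schema, bulk-insert seed rows) carries over to a real Oracle or MySQL test server with the corresponding driver.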
(3) The banking system and the performance testing tool are connected to the same local area network or the internet to simulate network communication in a real environment.
The bank system to be tested and the performance testing tools for sending requests and collecting results, such as LoadRunner, JMeter, are connected to the same local area network or the internet to simulate network communication in a real environment. The bank system and the performance testing tool are connected to the same local area network or the Internet to ensure the stability and reliability of network communication and avoid the influence of network factors on the performance testing result.
(4) The load balancer is configured in a test environment.
Load balancers, such as Nginx or LVS, are installed and configured in the testing environment to distribute requests sent by performance testing tools to different banking servers and so achieve load balancing. The aim of configuring the load balancer in the test environment is to improve the availability and scalability of the banking system and to simulate load distribution in a real environment.
For example, if a performance test environment is to be built for a core business process of a bank card, the following steps may be referred to:
selecting a test server: according to the performance target of the core business flow of the bank card, for example, the number of concurrent users is 1000, the throughput rate of things is 200 pens/second, the response time is within 2 seconds, and the like, a proper test server is selected, for example, a CPU is 8-core 16-thread, a memory is 32G, a disk is 1T, the network bandwidth is 100M, and the like.
Installing and configuring a database server: oracle 19c database software is installed and configured on the test server, and a bank card database and associated table structure is created, and a certain amount of test data, such as account information, transaction records, etc., is inserted.
Connecting the banking system and the performance testing tool to the same local area network or internet: the tested bank card core service system and the LoadRunner performance testing tool for sending requests and collecting results are connected to the same local area network or the Internet, and communication protocols and verification modes such as Socket, TONG and the like are set.
Configuring a load balancer in a test environment: and installing and configuring an Nginx load balancer in a test environment, and distributing requests sent by LoadRunner to different bank card core service system servers to realize load balancing.
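The request-distribution idea behind the Nginx configuration above can be illustrated with a minimal round-robin dispatcher. This is an assumption-laden sketch, not a real load balancer; the class and server names are hypothetical:

```python
import itertools

class RoundRobinBalancer:
    """Hand each incoming request to the next upstream server in turn,
    mimicking Nginx's default round-robin upstream policy."""

    def __init__(self, servers):
        self._cycle = itertools.cycle(servers)

    def next_server(self):
        # Each call advances the cycle, spreading load evenly.
        return next(self._cycle)
```

Real balancers add health checks, weights and session affinity on top of this basic rotation.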
S103, writing a performance test script and importing the performance test script into a test management platform.
In one embodiment, the writing and importing methods are as follows:
(1) Determining the communication protocol adopted by the banking system, and selecting a performance test tool compatible with that protocol.
This step refers to learning, from the design documentation of the banking system or from the developers, the communication protocol the system uses (e.g., HTTP, TCP/IP, RTE), so that a suitable performance testing tool (e.g., LoadRunner or JMeter) and protocol type (e.g., Web, Winsock, RTE) can be selected. For example, if the banking system communicates using the RTE protocol, LoadRunner may be selected as the performance testing tool and scripts may be recorded or written using the RTE protocol type.
(2) According to the business process and performance goal of the banking system, performance test scripts related to parameterized user input, associated data, and added transactions and checkpoints are written using performance test tools.
The step is to simulate each functional module or business flow of the user operation system by using the recording or writing function of the performance test tool according to the business requirement and scene of the bank system, and generate a corresponding performance test script. Meanwhile, in order to make the script more flexible and real, parameterization processing needs to be performed on user input and associated data so as to use different data in different iterations. In addition, in order to monitor and evaluate performance metrics of the system, transactions and checkpoints need to be added to the script in order to calculate response time, error rate, etc. For example, if performance test is to be performed on deposit business of banking system, the RTE recording function of LoadRunner may be used to record operations such as user logging in counter, inputting account number, inputting amount, confirming deposit, etc., and generate corresponding script. Then, the data such as account numbers, amounts and the like can be parameterized and replaced by using different data files. Meanwhile, transaction names (such as Deposits) may be added before and after the Deposit operation in order to count the response time of the Deposit operation. Checkpoints (e.g., text check) may also be added after the deposit operation to verify that the deposit operation was successful.
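The parameterization, transaction and checkpoint pattern described above can be transcribed into plain Python as a hedged illustration. `send_deposit` is a hypothetical stand-in for the banking system under test, and the comments map each line to the LoadRunner concept it imitates:

```python
import time

# Parameter "files": different data is used on different iterations,
# as with LoadRunner parameterization.
ACCOUNTS = ["6225000001", "6225000002"]
AMOUNTS = [100.0, 250.0]

def send_deposit(account, amount):
    # Hypothetical stand-in for a request to the banking system.
    return f"deposit ok: {account} +{amount}"

def run_iteration(i):
    account = ACCOUNTS[i % len(ACCOUNTS)]   # parameterized user input
    amount = AMOUNTS[i % len(AMOUNTS)]
    start = time.perf_counter()             # like lr_start_transaction("Deposits")
    response = send_deposit(account, amount)
    elapsed = time.perf_counter() - start   # like lr_end_transaction("Deposits")
    assert "deposit ok" in response         # text checkpoint on the response
    return elapsed                          # per-transaction response time
```

The returned elapsed times are the raw material for the response-time statistics computed later in the analysis step.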
(3) After the performance test script is written, checking for errors or logic problems in it and repairing them.
After the performance test script is written, the script needs to be debugged and verified, whether the problems of grammar errors, logic errors, parameter errors and the like exist or not is checked, and the problems are repaired in time. Meanwhile, the script needs to be optimized and reconstructed, redundant codes are deleted, notes are added, and readability is improved. For example, in LoadRunner, the Verify function may be used to check whether the script has syntax errors, and Run single or multiple virtual users using the Run function to Verify whether the script has logic errors or parameter errors. If a problem is found, an editor or function generator may be used to modify or add code.
(4) Writing a verification script to verify the performance indexes of the banking system.
This step is to write verification scripts according to the performance targets of the banking system, so as to check and evaluate the performance indexes of the system. The validation script may be implemented using the analytical functions of the performance testing tool, or using other tools or languages (e.g., excel, python, etc.). The verification script needs to calculate and compare according to the performance index (such as the number of concurrent users, the throughput rate of things, the response time, the error rate, etc.) of the system to determine whether the system reaches the expected performance level. For example, the Analysis function of LoadRunner may be used to import performance test result files, generate corresponding charts and reports, and then analyze and evaluate the data in the charts and reports according to the performance goals of the banking system.
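As a hedged illustration of such a verification script outside LoadRunner's Analysis function, summary statistics can be computed from raw samples and compared against targets. The 3-second response threshold is the example value used earlier in the text; the 1% error-rate ceiling is an assumed default:

```python
import statistics

def evaluate(samples, max_avg_response_s=3.0, max_error_rate=0.01):
    """samples: list of (response_time_seconds, succeeded) pairs.
    Returns the computed indexes and a pass/fail verdict."""
    times = [t for t, _ok in samples]
    errors = sum(1 for _, ok in samples if not ok)
    avg = statistics.mean(times)
    error_rate = errors / len(samples)
    return {
        "avg_response_s": avg,
        "error_rate": error_rate,
        "passed": avg <= max_avg_response_s and error_rate <= max_error_rate,
    }
```

A production verification script would also examine percentiles and throughput, but the compare-against-target structure is the same.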
(5) After the performance indexes pass verification, importing the written performance test script according to the import requirements of the test management platform in use.
This step means that after the performance test script is written and verified, it must be imported into a test management platform (such as Quality Center or TestLink) to facilitate the management and execution of test cases. Different test management platforms may have different import requirements, which must be accommodated case by case. For example, if Quality Center is used as the test management platform, the Export function of LoadRunner can be used to export the script as a zip file; a corresponding test set is then created in Quality Center and the zip file is imported into that test set.
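The export-to-zip hand-off can be sketched as follows. File names here are illustrative, and the actual archive layout a given platform expects is not specified by the text:

```python
import io
import zipfile

def package_scripts(scripts: dict) -> bytes:
    """scripts maps filename -> script text; returns the zip archive bytes
    that would be uploaded to the test management platform."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, body in scripts.items():
            zf.writestr(name, body)
    return buf.getvalue()
```

The resulting bytes can be written to disk or posted directly to the platform's import endpoint.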
S104, selecting a performance test script matched with the target banking system from the test management platform, and performing performance test on the target banking system by using test data generated by a performance test tool.
In one embodiment, the test is performed in the following manner:
(1) Checking whether the processing capacity and resources of the load-generating machine are sufficient.
This step ensures that the stress testing tool can simulate a large number of concurrent users accessing the banking system, and that the test is neither inaccurate nor interrupted because of insufficient machine performance. For example, to simulate 1,000,000 users logging into a banking website simultaneously, the load-generating machine needs sufficient memory, CPU, network bandwidth and other resources to support such a load.
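A back-of-the-envelope version of this capacity check, assuming a hypothetical fixed memory footprint per virtual user:

```python
MB = 1024 * 1024

def can_host_virtual_users(free_memory_bytes, users, per_user_bytes=2 * MB):
    """True if the load-generating machine can hold `users` virtual users
    in memory; 2 MB per virtual user is an assumed figure, not a measured
    one, and real tools also bound users by CPU and bandwidth."""
    return users * per_user_bytes <= free_memory_bytes
```

Analogous checks against CPU headroom and network bandwidth complete the pre-flight assessment.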
(2) Setting the pressure level, user behavior and number of concurrent users to be simulated for the target banking system, and selecting the adapted performance test script.
This step defines the goals and scope of the stress test and selects appropriate test methods and tools. The pressure level refers to the adverse market conditions and economic variables to be simulated, such as a 20% stock-market drop, a 30% fall in prices or a 5% rise in interest rates. User behavior refers to the operations users perform in the banking system, such as querying a balance, transferring money or applying for a loan. The number of concurrent users is the number of users accessing the banking system simultaneously. A performance test script is program code that simulates user behavior and sends requests to the banking system. For example, to simulate 100,000 users querying their balance at the same time, a performance test script is selected that can generate 100,000 different account numbers and passwords and send balance-query requests to the banking system.
(3) Randomly selecting key functions of the banking system, generating the test data required by the key functions using a performance test tool, performing a stability test on the key functions using the performance test script and the test data, monitoring the running conditions of the banking system, the network, and the database, and checking whether a memory leak exists.
This step evaluates the stability and reliability of the banking system at the set pressure level and uncovers possible performance bottlenecks or defects. Key functions are those critical to the normal operation of the banking system, such as deposit, withdrawal, and payment. Test data is the input data required to execute the key functions, such as account numbers, amounts, and payees. A stability test applies a continuous high load to the key functions over a period of time while observing indicators such as response time and error rate. Monitoring refers to collecting and analyzing, during the stress test, the resource usage (e.g., CPU, memory, disk) of the banking system and its associated components (e.g., network, database), and outputting logs or reports. A memory leak occurs when the banking system fails to release allocated memory correctly at runtime, so that memory consumption grows continuously and may degrade system performance or stability. For example, to run a stability test on the deposit function, one would generate a large volume of account and amount data with the performance test tool, use the performance test script to simulate users sending deposit requests to the banking system, and meanwhile monitor the system's response time, error rate, CPU usage, and memory usage, checking for memory leaks.
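A minimal sketch of the monitoring side of such a stability test is given below. The 20% growth threshold and the sampling scheme are illustrative assumptions, not values prescribed by the method, and a production leak detector would use more than a two-point comparison.

```python
def error_rate(outcomes):
    """Fraction of failed requests among the outcomes observed so far."""
    if not outcomes:
        return 0.0
    return sum(1 for ok in outcomes if not ok) / len(outcomes)

def memory_leak_suspected(memory_samples_mb, growth_threshold=0.20):
    """Heuristic leak check: flag if resident memory grew by more than
    growth_threshold (as a fraction) between the first and last samples of a
    sustained-load run. A fuller check would also confirm that the growth is
    steady rather than a transient spike."""
    if len(memory_samples_mb) < 2 or memory_samples_mb[0] <= 0:
        return False
    growth = (memory_samples_mb[-1] - memory_samples_mb[0]) / memory_samples_mb[0]
    return growth > growth_threshold
```

During a run, memory samples would come from the monitoring tooling (e.g., periodic resident-set-size readings of the server process) while outcomes accumulate from the virtual users.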
Further, the method further comprises: analyzing the performance indicators, response time, and error rate; evaluating the performance of the banking system according to the analysis results; and formulating performance tuning measures related to code, hardware, and resource configuration.
A performance indicator is a quantitative measure of the banking system's performance in the stress test, such as transactions processed per second (TPS), average response time (ART), maximum response time (MRT), and error rate (ER). These metrics reflect the throughput, efficiency, and stability of the banking system at different pressure levels. For example, if a banking system under stress shows a TPS of 1000, an ART of 0.5 seconds, an MRT of 2 seconds, and an ER of 0.1%, this means the system processes 1000 requests per second, each request takes 0.5 seconds on average to receive a response, the slowest request takes 2 seconds, and 0.1% of requests fail.
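These four indicators can be computed directly from the (latency, success) records a test run produces. The function below is an illustrative sketch under that assumed record format, not the platform's actual implementation:

```python
def summarize_run(records, duration_s):
    """Compute TPS, ART, MRT, and ER from a list of (latency_s, ok) records
    collected over a run lasting duration_s seconds."""
    if not records or duration_s <= 0:
        raise ValueError("need at least one record and a positive duration")
    latencies = sorted(lat for lat, _ in records)
    n = len(records)
    errors = sum(1 for _, ok in records if not ok)
    return {
        "tps": n / duration_s,      # requests processed per second
        "art": sum(latencies) / n,  # average response time
        "mrt": latencies[-1],       # maximum (slowest) response time
        "er": errors / n,           # error rate
    }
```

For instance, nine successful 0.5-second requests plus one failed 2-second request over a 10-second window yields a TPS of 1.0, an ART of 0.65 s, an MRT of 2.0 s, and an ER of 10%.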
Response time refers to the time elapsed from the user sending a request to the banking system to receiving a response, typically in milliseconds or seconds. Response time is an important factor affecting user experience and satisfaction, and in general, the shorter the response time, the more satisfied the user. The response time is affected by a number of factors, such as network latency, server load, database query efficiency, etc. For example, if a user sends a request to the banking system to query the balance, the time it takes from clicking the query button until seeing the balance displayed on the screen is the response time.
Error rate refers to the ratio of requests for which errors occur in a stress test to the total number of requests, typically expressed in percent or thousandths. Error rate is an important indicator reflecting the reliability and robustness of a banking system, and in general, the lower the error rate, the more reliable the system. Error rates are affected by a variety of factors, such as code logic, exception handling, input verification, security, etc. For example, if a user sends a request for a transfer to a banking system, but the request fails due to an incorrect payee account being entered or the amount exceeding a limit, the request is an incorrect request.
The performance indicators, response time, and error rate are analyzed to find performance problems or potential risks of the banking system in the stress test, and corresponding solutions or improvement measures are proposed for each problem or risk. Various analysis methods can be used, such as comparing how metrics change across pressure levels, comparing metric differences across functional modules or scenarios, or comparing metrics across versions or configurations. The analysis results may be presented as charts, reports, or recommendations. For example, if analysis shows that a functional module's response time rises significantly, or its error rate increases markedly, at high pressure levels, this may indicate that the module has a performance bottleneck or defect.
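One of the comparison-based analyses mentioned here, checking a module's metrics under load against a baseline, can be sketched as follows. The 2x response-time factor and the 1-percentage-point error-rate delta are assumed thresholds chosen for illustration:

```python
def flag_bottlenecks(baseline, loaded, rt_factor=2.0, er_delta=0.01):
    """Compare a module's metrics under high load against its baseline and
    return a list of suspected performance problems. Both arguments are dicts
    with 'art' (average response time, seconds) and 'er' (error rate) keys."""
    issues = []
    if loaded["art"] > baseline["art"] * rt_factor:
        issues.append("response time degraded significantly under load")
    if loaded["er"] - baseline["er"] > er_delta:
        issues.append("error rate increased significantly under load")
    return issues
```

In practice the thresholds would be tuned per module against the performance targets fixed during test preparation.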
Based on the analysis results obtained, the performance of the banking system is evaluated in order to determine whether the banking system is able to meet the expected performance requirements or objectives and to give a corresponding evaluation or rating. There are various evaluation methods, for example, setting a performance reference or threshold, using a performance evaluation model or formula, using a performance evaluation standard or specification, and the like. The evaluation result may be presented in the form of a score, a rank, a description, or the like. For example, if it is found by evaluation that a banking system can maintain good throughput, efficiency, stability, etc. at all pressure levels, a high score or superior rating may be given.
Performance tuning measures related to code, hardware, and resource configuration are formulated to raise the performance level of the banking system, eliminate or reduce performance problems or risks, and improve user experience and satisfaction. Tuning measures can be varied, such as optimizing code logic, adding exception handling, using caches or indexes, upgrading servers or network devices, or adjusting parameters or options. Tuning measures may be tailored or combined for different performance indicators or problems. For example, if tuning produces a significant reduction in a functional module's response time or error rate, the module's performance can be considered improved.
Further, the method further comprises: establishing a periodic tracking plan; collecting the system's response time, throughput, and error rate as performance tracking metrics; performing trend analysis on the performance tracking metrics to determine the trend of performance change; and feeding back the trend of performance change to the development team or a designated responsible person.
Wherein a periodic tracking plan is established for continuously monitoring and assessing the performance status of the banking system and for timely discovering and handling performance problems or risks. Periodic tracking plans may be formulated based on characteristics and needs of the banking system, such as performing performance tests or checks daily, weekly, monthly, or quarterly. The periodic tracking plan may include different stress levels, user behavior, functional modules or scenarios, and the like. For example, if performance testing is to be performed on a banking system daily, it may be selected to be performed during the low peak hours of the system and simulate common user behavior and functional modules.
The system's response time, throughput, and error rate are collected as performance tracking metrics so that quantified data reflects the performance of the banking system at different points in time or under different conditions. Response time, throughput, and error rate are common performance tracking metrics, representing respectively the time elapsed from a user sending a request to receiving a response, the number of requests processed per second, and the ratio of erroneous requests to total requests. The metrics may be collected with dedicated performance testing tools or monitoring software, or with custom programs or scripts, and in different dimensions or granularities, e.g., by time, pressure level, user behavior, or functional module or scenario. For example, to collect a banking system's daily response time, throughput, and error rate, performance testing tools or monitoring software could record these metrics every hour or half hour and categorize them by user behavior or functional module.
Trend analysis is performed on the performance tracking metrics to determine the trend of performance change, so as to discover the pattern of the banking system's performance variation over a period of time or under different conditions, and to identify the causes or factors that may drive the change. Trend analysis is a statistical method: charts, formulas, or models can be used to show and compute how the tracking metrics change over time or with conditions, such as rising, falling, fluctuating, or stable. Trend analysis can help identify the frequency, severity, and duration of performance problems or risks, and provides a basis and reference for tuning. For example, if trend analysis shows that a banking system's response time trends upward at high pressure levels but remains steady at low pressure levels, this may indicate a performance bottleneck or defect.
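The rising/falling/stable classification described here can be illustrated with an ordinary least-squares slope over equally spaced samples. The slope threshold `eps` is an assumption chosen for the sketch; a real deployment would set it relative to the metric's scale and noise:

```python
def classify_trend(samples, eps=1e-3):
    """Fit a least-squares line to equally spaced metric samples and classify
    the trend of a performance tracking metric as rising, falling, or stable."""
    n = len(samples)
    if n < 2:
        return "stable"
    mean_x = (n - 1) / 2.0            # mean of sample indices 0..n-1
    mean_y = sum(samples) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(samples))
    den = sum((x - mean_x) ** 2 for x in range(n))
    slope = num / den
    if slope > eps:
        return "rising"
    if slope < -eps:
        return "falling"
    return "stable"
```

Applied to, say, daily response-time averages, a "rising" result at high pressure levels alongside "stable" at low levels is the bottleneck signal described in the paragraph above.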
The trend of performance change is fed back to the development team or a designated responsible person so that the relevant personnel understand the current and historical performance of the banking system and can take corresponding measures or improvements based on the feedback. Feedback can take many forms, such as documents (mail, reports, notices) or live communication (meetings, discussions, demonstrations). The feedback content may include performance tracking metrics, trend analysis results, descriptions of problems or risks, and suggestions or solutions. The aim of feedback is to raise the performance level of the banking system, eliminate or reduce performance problems or risks, and improve user experience and satisfaction. For example, if feedback reveals a performance problem or risk in a functional module, the development team or designated responsible person may be asked to optimize or repair the module and verify the effect in the next performance test or check.
The test management method performs, for the banking system, test preparation covering demand analysis, performance targets, application software, deployment platform, load model, and project plan; after test preparation is completed, builds a test environment for the banking system; writes performance test scripts and imports them into the test management platform; and selects from the test management platform a performance test script adapted to the target banking system, performing a performance test on the target banking system with test data generated by a performance test tool. The application can incubate an integrated, pluggable platform system for performance testing, improving working efficiency, saving costs for enterprises, and creating more value.
In one embodiment, as shown in fig. 2, a test management platform is provided, which includes:
the preparation module 201 is used for carrying out test preparation of demand analysis, performance targets, application software, deployment platforms, load models and project plans on a banking system;
an environment construction module 202, configured to construct a test environment of the banking system after completing the test preparation;
the writing module 203 is used for writing performance test scripts and importing the performance test scripts into the test management platform;
and the test module 204 is used for selecting a performance test script adapted to the target banking system from the test management platform and performing performance test on the target banking system by using test data generated by the performance test tool.
FIG. 3 illustrates an internal block diagram of a test management device in one embodiment. As shown in fig. 3, the test management device includes a processor, a memory, and a network interface connected by a system bus. The memory includes a nonvolatile storage medium and an internal memory. The non-volatile storage medium of the test management apparatus stores an operating system, and may also store a computer program which, when executed by the processor, causes the processor to implement a test management method. The internal memory may also store a computer program that, when executed by the processor, causes the processor to perform the test management method. It will be appreciated by those skilled in the art that the structure shown in FIG. 3 is a block diagram of only some of the structures associated with the present inventive arrangements and is not limiting of the test management apparatus to which the present inventive arrangements are applied, and that a particular test management apparatus may include more or fewer components than shown, or may combine certain components, or have a different arrangement of components.
A computer readable storage medium storing a computer program which when executed by a processor performs the steps of: performing test preparation of demand analysis, performance targets, application software, deployment platforms, load models and project plans on a banking system; after test preparation is completed, a test environment of a banking system is built; writing a performance test script and importing the performance test script into a test management platform; and selecting a performance test script matched with the target banking system from the test management platform, and performing performance test on the target banking system by using test data generated by a performance test tool.
A test management apparatus comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps of when executing the computer program: performing test preparation of demand analysis, performance targets, application software, deployment platforms, load models and project plans on a banking system; after test preparation is completed, a test environment of a banking system is built; writing a performance test script and importing the performance test script into a test management platform; and selecting a performance test script matched with the target banking system from the test management platform, and performing performance test on the target banking system by using test data generated by a performance test tool.
It should be noted that the above test management method, platform, device and computer readable storage medium belong to a general inventive concept, and the content in the embodiments of the test management method, platform, device and computer readable storage medium may be mutually applicable.
Those skilled in the art will appreciate that implementing all or part of the above-described methods in accordance with the embodiments may be accomplished by way of a computer program stored in a non-transitory computer-readable storage medium, which when executed, may comprise the steps of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in embodiments provided herein may include non-volatile and/or volatile memory. The non-volatile memory can include Read-Only Memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), Rambus Direct RAM (RDRAM), Direct Rambus Dynamic RAM (DRDRAM), and Rambus Dynamic RAM (RDRAM), among others.
The technical features of the above embodiments may be arbitrarily combined, and all possible combinations of the technical features in the above embodiments are not described for brevity of description, however, as long as there is no contradiction between the combinations of the technical features, they should be considered as the scope of the description.
The foregoing examples illustrate only a few embodiments of the application, which are described in detail and are not to be construed as limiting the scope of the application. It should be noted that it will be apparent to those skilled in the art that several variations and modifications can be made without departing from the spirit of the application, which are all within the scope of the application. Accordingly, the scope of protection of the present application is to be determined by the appended claims.

Claims (10)

1. A method of test management, the method comprising:
performing test preparation of demand analysis, performance targets, application software, deployment platforms, load models and project plans on a banking system;
after test preparation is completed, a test environment of a banking system is built;
writing a performance test script and importing the performance test script into a test management platform;
and selecting a performance test script matched with the target banking system from the test management platform, and performing performance test on the target banking system by using test data generated by a performance test tool.
2. The method of claim 1, wherein the testing preparation of the banking system for demand analysis, performance goals, application software, deployment platforms, load models, and project plans comprises:
determining the test requirement of a bank user;
selecting core bank-card business processes according to their importance and frequency, and determining the number of concurrent users of the banking system, the transaction throughput requirement, the response time requirement, the system resource occupation requirement, and the scalability requirement;
determining the architecture of a banking system, the development language adopted, the communication protocol and the verification mode;
determining a physical deployment mode, a hardware architecture, an operating system, a database, middleware and a software deployment mode of a banking system;
determining the service type, system load and data source of a banking system;
and formulating a test project plan of the banking system.
3. The method according to claim 1, wherein the building a test environment of a banking system comprises:
selecting a test server according to the number of concurrent users, the transaction throughput rate and the response time requirement of the banking system;
installing and configuring a database server, creating a database and a table structure, and inserting test data;
connecting a banking system and a performance testing tool to the same local area network or the internet to simulate network communication in a real environment;
a load balancer is configured in a test environment.
4. The method of claim 1, wherein writing the performance test script and importing the test management platform comprises:
determining a communication protocol adopted by a banking system, and selecting a performance test tool compatible with the communication protocol;
according to the business flow and performance targets of the banking system, using the performance testing tool to write performance testing scripts related to parameterized user input, associated data, added transactions and checkpoints;
after the performance test script is compiled, checking errors or logic problems in the performance test script and repairing the errors or the logic problems;
writing a verification script to verify the performance index of the bank system;
after the performance index passes the verification, importing the written performance test script according to the importing requirement of the used test management platform.
5. The method of claim 1, wherein selecting a performance test script from a test management platform that is adapted to a target banking system and performing a performance test on the target banking system using test data generated by a performance test tool comprises:
checking whether the processing capacity and resources of the stress test machine are sufficient;
setting the current pressure level, user behavior and concurrent user number to be simulated by a target banking system, and selecting an adapted performance test script;
and randomly selecting key functions of the banking system, generating test data required by the key functions using a performance test tool, performing a stability test on the key functions using the performance test script and the test data, monitoring the running conditions of the banking system, the network and the database, and checking whether a memory leak exists.
6. The method according to claim 1, characterized in that the method further comprises:
and analyzing the performance index, the response time and the error rate, evaluating the performance of the bank system according to the obtained analysis result, and formulating performance optimization measures related to codes, hardware and configuration resources.
7. The method according to claim 1, characterized in that the method further comprises:
setting up a regular tracking plan, collecting response time, throughput and error rate of a system as performance tracking indexes, and carrying out trend analysis on the performance tracking indexes to determine the change trend of the performance;
the change trend of the performance is fed back to a development team or a designated responsible person.
8. A test management platform, the test management platform comprising:
a preparation module, configured to perform, on a banking system, test preparation covering demand analysis, performance targets, application software, deployment platform, load model, and project plan;
the environment construction module is used for constructing a test environment of the bank system after the test preparation is completed;
the writing module is used for writing performance test scripts and importing the performance test scripts into the test management platform;
and the test module is used for selecting a performance test script matched with the target banking system from the test management platform and performing performance test on the target banking system by using test data generated by the performance test tool.
9. A computer readable storage medium, characterized in that a computer program is stored, which, when being executed by a processor, causes the processor to perform the steps of the method according to any of claims 1 to 7.
10. A test management device comprising a memory and a processor, the memory storing a computer program which, when executed by the processor, causes the processor to perform the steps of the method of any one of claims 1 to 7.
CN202311023489.XA 2023-08-14 2023-08-14 Test management method, platform, medium and equipment Pending CN117056218A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311023489.XA CN117056218A (en) 2023-08-14 2023-08-14 Test management method, platform, medium and equipment

Publications (1)

Publication Number Publication Date
CN117056218A true CN117056218A (en) 2023-11-14

Family

ID=88662081

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311023489.XA Pending CN117056218A (en) 2023-08-14 2023-08-14 Test management method, platform, medium and equipment

Country Status (1)

Country Link
CN (1) CN117056218A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117331846A (en) * 2023-11-30 2024-01-02 河北雄安尚世嘉科技有限公司 Internet-based software development, operation, test and management system
CN117331846B (en) * 2023-11-30 2024-03-08 河北雄安尚世嘉科技有限公司 Internet-based software development, operation, test and management system


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination