CN116244202A - Automatic performance test method and device - Google Patents

Automatic performance test method and device Download PDF

Info

Publication number
CN116244202A
Authority
CN
China
Prior art keywords
test
message
transaction
tested
analysis
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310244454.2A
Other languages
Chinese (zh)
Inventor
吉帅 (Ji Shuai)
朱怡雯 (Zhu Yiwen)
朱仲毅 (Zhu Zhongyi)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Industrial and Commercial Bank of China Ltd ICBC
Original Assignee
Industrial and Commercial Bank of China Ltd ICBC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Industrial and Commercial Bank of China Ltd ICBC filed Critical Industrial and Commercial Bank of China Ltd ICBC
Priority to CN202310244454.2A priority Critical patent/CN116244202A/en
Publication of CN116244202A publication Critical patent/CN116244202A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 - Error detection; Error correction; Monitoring
    • G06F 11/36 - Preventing errors by testing or debugging software
    • G06F 11/3668 - Software testing
    • G06F 11/3672 - Test management
    • G06F 11/3684 - Test management for test design, e.g. generating new test cases
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 - Error detection; Error correction; Monitoring
    • G06F 11/36 - Preventing errors by testing or debugging software
    • G06F 11/3668 - Software testing
    • G06F 11/3672 - Test management
    • G06F 11/3688 - Test management for test execution, e.g. scheduling of test suites
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 - Error detection; Error correction; Monitoring
    • G06F 11/36 - Preventing errors by testing or debugging software
    • G06F 11/3668 - Software testing
    • G06F 11/3672 - Test management
    • G06F 11/3692 - Test management for test results analysis
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00 - Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention provides an automated performance test method and device, relating to the technical fields of artificial intelligence and software testing. The method comprises the following steps: retrieving a history log of the test environment according to the test interface information and obtaining the history log information of the test interface; analyzing the history log information of the test interface with an intelligent message analysis model and taking a message of a successful transaction as a message template; parameterizing the message template according to parameterization rules to form a test script; performing a transaction performance test using the test script and the test interface information; analyzing the returned message with the intelligent message analysis model to judge whether the transaction succeeded and, in response to a transaction failure, further analyzing the type of the failure; and notifying a tester of the analysis result. The invention can judge and classify failure text through the intelligent message analysis model, which reduces manual steps, lowers the complexity of system performance testing work, and improves the efficiency of system performance testing.

Description

Automatic performance test method and device
Technical Field
The invention relates to the technical fields of artificial intelligence and software testing, can be applied in the financial field, and in particular relates to an automated performance testing method and device.
Background
Currently, in system testing, the programs to be tested are complex and change frequently, so testers still need a large number of manual steps to cope with the changing environment. Past automated testing has mostly relied on fixed, rule-based judgments, so it performs well in scenarios that merely reproduce earlier test results. However, fixed rules lack flexibility, and old rule-based automated testing methods cannot keep up with programs to be tested that change from day to day; this is the root cause of why automated testing still cannot replace manual testing on a large scale. An automated performance testing method that copes well with frequently changing programs to be tested is therefore needed.
Disclosure of Invention
In view of the above, the present invention provides an automated performance testing method and apparatus to solve at least one of the above-mentioned problems.
In order to achieve the above purpose, the present invention adopts the following scheme:
according to a first aspect of the present invention, there is provided an automated performance testing method, the method comprising: retrieving a history log of a test environment according to the test interface information, and obtaining history log information of the test interface; analyzing the history log information of the test interface by using an intelligent message analysis model, and taking a message of a successful transaction as a message template; performing parameterization transformation on the message template according to parameterization rules to form a test script; performing a transaction performance test on the system to be tested by using the test script and the test interface information; and analyzing the message returned by the system to be tested by using the intelligent message analysis model, judging whether the transaction is successful, and, in response to a transaction failure, further analyzing the type of the transaction failure.
According to a second aspect of the present invention, there is provided an automated performance testing apparatus, the apparatus comprising: a history log obtaining unit, configured to retrieve a history log of the test environment according to the test interface information and obtain the history log information of the test interface; a message template generating unit, configured to analyze the history log information of the test interface by using an intelligent message analysis model and take a message of a successful transaction as a message template; a test script generating unit, configured to perform parameterization transformation on the message template according to parameterization rules to form a test script; a load test unit, configured to perform a transaction performance test on the system to be tested by using the test script and the test interface information; and a message parsing unit, configured to parse the message returned by the system to be tested by using the intelligent message analysis model, judge whether the transaction is successful, and, in response to a transaction failure, further analyze the type of the transaction failure.
According to a third aspect of the present invention there is provided an electronic device comprising a memory, a processor and a computer program stored on said memory and executable on said processor, the processor implementing the steps of the above method when executing said computer program.
According to a fourth aspect of the present invention there is provided a computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of the above method.
According to the above technical solution, the automated performance testing method and device of the present invention can judge and classify failure text through the intelligent message analysis model, thereby reducing manual steps, lowering the complexity of system performance testing work, and improving the efficiency of system performance testing.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art. In the drawings:
FIG. 1 is a schematic flow chart of an automated performance testing method according to an embodiment of the present disclosure;
FIG. 2 is a flow chart of an automated performance testing method according to another embodiment of the present application;
FIG. 3 is a schematic structural diagram of an automated performance testing apparatus according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of an automated performance testing apparatus according to another embodiment of the present disclosure;
FIG. 5 is a schematic block diagram of a system configuration of an electronic device according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the embodiments of the present invention will be described in further detail with reference to the accompanying drawings. The exemplary embodiments of the present invention and their descriptions herein are for the purpose of explaining the present invention, but are not to be construed as limiting the invention.
System performance testing is generally carried out against interfaces, and the message returned by an interface usually carries some error text. Such error text is easy for a human to understand but not for a computer, so the present application uses natural language processing technology to learn from the error text, enabling the computer to judge and classify error messages and thereby replace a certain amount of manual work.
Fig. 1 is a schematic flow chart of an automatic performance testing method according to an embodiment of the present application, where the method includes the following steps:
step S101: and retrieving a history log of the test environment according to the test interface information, and acquiring the history log information of the test interface.
First, in this embodiment, the relevant history logs are searched according to the test interface information of the system to be tested, and the history log information belonging to that test interface is obtained; this history log information contains the messages returned by the test interface during previous testing.
Step S102: and analyzing the historical log information of the test interface by using a message intelligent analysis model, and taking a message of successful transaction as a message template.
To reduce manual participation, this embodiment analyzes the history log information of the test interface with an intelligent message analysis model trained using natural language processing technology. The history log information obtained in step S101 is fed to the model as input, and the model outputs the messages of successful transactions; this embodiment then takes a message of a successful transaction as the message template for the subsequent performance test.
If the analysis by the intelligent message analysis model in this step does not find a message meeting the requirements, the performance test can be interrupted and a tester notified for manual processing.
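The patent does not prescribe a concrete interface for the model, so the following Python sketch only illustrates how step S102 could be wired up: the `LogEntry` layout and the `predict_success` classifier callable are assumptions made for this example, and `None` is returned when no qualifying message is found so that the caller can interrupt the test and notify a tester.

```python
# Illustrative sketch of step S102 (not the patent's implementation): pick a
# successful-transaction message from the historical log as the template.
# The LogEntry layout and the predict_success classifier are assumptions.
from dataclasses import dataclass
from typing import Callable, Iterable, Optional


@dataclass
class LogEntry:
    interface_id: str   # which test interface produced this entry
    message: str        # the return message text recorded in the history log


def select_message_template(
    entries: Iterable[LogEntry],
    interface_id: str,
    predict_success: Callable[[str], bool],
) -> Optional[str]:
    """Return the first message classified as a successful transaction,
    or None so the caller can interrupt the test and notify a tester."""
    for entry in entries:
        if entry.interface_id == interface_id and predict_success(entry.message):
            return entry.message
    return None
```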
Step S103: and carrying out parameterization transformation on the message template according to parameterization rules to form a test script.
Preferably, in this embodiment, the parameterization transformation may be performed on the message template with regular expressions; that is, the key fields in the message template are replaced with specific character strings (placeholders) using regular expressions to form the test script.
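As an illustration of this regular-expression parameterization, the following Python sketch replaces the values of assumed key fields in a JSON-style message with `${...}` placeholder strings; the field names, the message layout, and the placeholder convention are assumptions made only for this example.

```python
# Illustrative sketch of the regular-expression parameterization in step S103.
# The JSON-style message layout, the field names and the ${...} placeholder
# convention are assumptions made only for this example.
import re


def parameterize_template(template: str, key_fields: list[str]) -> str:
    """Replace the value of each key field with a placeholder string."""
    script = template
    for field in key_fields:
        # e.g.  "account_no": "6222..."  ->  "account_no": "${account_no}"
        pattern = rf'("{field}"\s*:\s*)"[^"]*"'
        script = re.sub(pattern, rf'\1"${{{field}}}"', script)
    return script


template = '{"account_no": "6222021234567890", "amount": "100.00", "channel": "web"}'
print(parameterize_template(template, ["account_no", "amount"]))
# {"account_no": "${account_no}", "amount": "${amount}", "channel": "web"}
```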
Step S104: and carrying out transaction performance test on the system to be tested by utilizing the test script and the test interface information.
The transaction performance test of the system to be tested can be carried out with a load-testing tool. Preferably, in this embodiment, the specific character strings in the test script are replaced with preset test data so that the message of each transaction is different during the performance test, and the test script is then used to run the transaction performance test against the test interface of the system to be tested.
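A minimal Python sketch of this step is shown below. It assumes the `${...}` placeholders from the previous sketch; the `requests` library, the thread pool, and the endpoint URL are stand-ins for whatever dedicated load-testing tool would actually drive the test interface.

```python
# Illustrative sketch of step S104: substitute preset test data into the test
# script and drive the interface concurrently. The requests library, thread
# pool and endpoint URL are stand-ins for a dedicated load-testing tool.
import string
from concurrent.futures import ThreadPoolExecutor

import requests


def render_message(script: str, row: dict) -> str:
    """Fill the ${...} placeholders with one row of preset test data."""
    return string.Template(script).safe_substitute(row)


def run_load_test(url: str, script: str, test_data: list[dict], workers: int = 10) -> list[str]:
    """Send one request per data row so every transaction uses a different
    message; the returned messages are collected for later analysis."""
    def send(row: dict) -> str:
        resp = requests.post(url, data=render_message(script, row), timeout=30)
        return resp.text

    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(send, test_data))
```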
Step S105: and analyzing the message returned by the system to be tested by using the message intelligent analysis model, judging whether the transaction is successful or not, responding to the transaction failure, and further analyzing the type of the transaction failure.
In this embodiment, the intelligent message analysis model may determine from the returned message whether the transaction succeeded, and further analyze the type of failure for a failed message. Accordingly, the intelligent message analysis model of this embodiment may include a sentiment analysis sub-model and an anomaly analysis sub-model: the sentiment analysis sub-model outputs whether the transaction succeeded according to the input data, and the anomaly analysis sub-model outputs the transaction failure type according to the input data.
The intelligent message analysis model can be obtained by training as follows: collect transaction return messages from the log system (a large number of transaction return messages should be collected for higher model accuracy), and use the collected transaction return messages as training data. Specifically, the data are divided into a training set and a validation set according to a fixed proportion, sentiment analysis training and anomaly analysis training are carried out with natural language processing technology, and the corresponding sentiment analysis sub-model and anomaly analysis sub-model are obtained after validation passes.
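The patent leaves the model architecture open; the sketch below merely illustrates the split/train/validate recipe with an assumed scikit-learn TF-IDF plus logistic-regression pipeline, applied once with success/failure labels for the sentiment analysis sub-model and once with failure-type labels for the anomaly analysis sub-model.

```python
# Illustrative sketch of the training recipe. The patent does not prescribe a
# model; a scikit-learn TF-IDF + logistic-regression pipeline is assumed here,
# trained once per sub-model with the appropriate labels.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline


def train_text_classifier(messages: list[str], labels: list[str]):
    """Split the collected return messages into training and validation sets,
    fit a text classifier and report its validation accuracy."""
    x_train, x_val, y_train, y_val = train_test_split(
        messages, labels, test_size=0.2, random_state=42
    )
    model = make_pipeline(
        TfidfVectorizer(analyzer="char", ngram_range=(1, 3)),  # character n-grams suit Chinese error text
        LogisticRegression(max_iter=1000),
    )
    model.fit(x_train, y_train)
    print("validation accuracy:", model.score(x_val, y_val))
    return model


# sentiment sub-model: labels are "success" / "failure"
# anomaly sub-model:   labels are failure types, e.g. "system_error" / "data_error"
```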
According to the above technical solution, the automated performance testing method provided by the invention can judge and classify failure text through the intelligent message analysis model, thereby reducing manual steps, lowering the complexity of system performance testing work, and improving the efficiency of system performance testing.
Fig. 2 is a schematic flow chart of an automatic performance testing method according to another embodiment of the present application, where the method includes the following steps:
step S201: and retrieving a history log of the test environment according to the test interface information, and acquiring the history log information of the test interface.
Step S202: and analyzing the historical log information of the test interface by using a message intelligent analysis model, and taking a message of successful transaction as a message template.
Step S203: and replacing key fields in the message template with specific character strings by using the regular expression to form the test script.
Step S204: and replacing the specific character string in the test script with preset test data, and testing the transaction performance of the test interface of the system to be tested by using the test script.
Step S205: and analyzing the message returned by the system to be tested by using the message intelligent analysis model, judging whether the transaction is successful or not, responding to the transaction failure, and further analyzing the type of the transaction failure.
Step S206: And updating the preset test data according to the analysis result of the return message of the system to be tested, retaining only the test data of successful transactions.
In this embodiment, after the analysis result of the intelligent message analysis model is obtained, the preset test data can be updated automatically according to that result, so that automatic and intelligent screening of the test data is achieved.
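A minimal sketch of this screening step, assuming the one-request-per-data-row pairing from the earlier load-test sketch and the same hypothetical `predict_success` classifier:

```python
# Illustrative sketch of step S206, assuming the one-request-per-row pairing
# from the earlier load-test sketch and the same hypothetical classifier.
from typing import Callable


def update_test_data(test_data: list[dict], returned: list[str],
                     predict_success: Callable[[str], bool]) -> list[dict]:
    """Keep only the data rows whose returned message was classified as successful."""
    return [row for row, msg in zip(test_data, returned) if predict_success(msg)]
```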
Step S207: And collecting statistics on and summarizing the analysis results of the messages returned by the system to be tested to form a statistical report. The statistical report includes whether each transaction succeeded, the proportion of each transaction state, and the system performance indicators over the period from the first transaction to the last transaction; the system performance indicators may be, for example, CPU or memory usage. The statistical report may take the form of a chart, or a chart accompanied by a textual description. The textual description may be produced by substituting the analysis-result data into the relevant parameters of a preset language template, or it may be a summary paragraph output by a language model trained through deep learning.
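The following sketch shows one possible shape for such a report, assuming a simple list of per-transaction outcomes and externally collected CPU/memory samples; the record layout is an assumption made for illustration only.

```python
# Illustrative sketch of the statistical report in step S207. The per-transaction
# outcome list and the CPU/memory samples are assumed inputs; in practice they
# would come from the message analysis results and the environment monitor.
from collections import Counter


def build_report(states: list[str], cpu_samples: list[float], mem_samples: list[float]) -> dict:
    """states holds one outcome per transaction, e.g. 'success', 'system_error', 'data_error'."""
    counts = Counter(states)
    total = len(states) or 1
    return {
        "transaction_states": dict(counts),
        "state_proportions": {state: n / total for state, n in counts.items()},
        # indicators covering the window from the first to the last transaction
        "avg_cpu_percent": sum(cpu_samples) / max(len(cpu_samples), 1),
        "peak_memory_mb": max(mem_samples, default=0.0),
    }
```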
Preferably, the types of transaction failure may include systematic errors and data errors. After the statistics are summarized into the statistical report, the present application may further perform the following operations: judge whether the systematic errors in the analysis results of the messages returned by the system to be tested exceed a first preset threshold and, if so, fine-tune the parameters of the system to be tested; and judge whether the data errors in the analysis results exceed a second preset threshold and, if so, adjust the test data in the test script. This embodiment can thus adjust the system to be tested and the test data automatically, making the performance test result more accurate.
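A minimal sketch of this threshold logic follows; the two thresholds and the adjustment callbacks (`tune_system`, `adjust_test_data`) are illustrative assumptions, not names used by the patent.

```python
# Illustrative sketch of the two threshold checks; the thresholds and the
# adjustment callbacks (tune_system, adjust_test_data) are assumptions.
from typing import Callable


def apply_thresholds(report: dict, first_threshold: float, second_threshold: float,
                     tune_system: Callable[[], None],
                     adjust_test_data: Callable[[], None]) -> None:
    proportions = report["state_proportions"]
    if proportions.get("system_error", 0.0) > first_threshold:
        tune_system()          # fine-tune parameters of the system to be tested
    if proportions.get("data_error", 0.0) > second_threshold:
        adjust_test_data()     # adjust the test data in the test script
```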
Step S208: And notifying a tester of the statistical report.
In this embodiment, the statistical report may be sent to the tester in the form of an e-mail, a document, a chat dialog, or the like. In addition, the analysis result of the history log information obtained with the intelligent message analysis model in step S202, and the test data update in step S206, may also be sent to the tester in the same forms.
According to the above technical solution, the automated performance testing method provided by the invention can judge and classify failure text through the intelligent message analysis model, thereby reducing manual steps, lowering the complexity of system performance testing work, and improving the efficiency of system performance testing.
Fig. 3 is a schematic structural diagram of an automated performance testing apparatus according to an embodiment of the present application, where the apparatus includes: a history log obtaining unit 310, a message template generating unit 320, a test script generating unit 330, a load test unit 340 and a message parsing unit 350, which are connected in sequence.
The history log obtaining unit 310 is configured to retrieve a history log of a test environment according to test interface information, and obtain the history log information of the test interface.
The message template generating unit 320 is configured to analyze the history log information of the test interface by using a message intelligent analysis model, and take a message that the transaction is successful as a message template.
The test script generating unit 330 is configured to perform parameterization transformation on the message template according to a parameterization rule to form a test script.
The load test unit 340 is configured to perform the transaction performance test on the system to be tested by using the test script and the test interface information.
The message parsing unit 350 is configured to parse the message returned by the system to be tested by using the message intelligent parsing model, determine whether the transaction is successful, and further analyze the type of transaction failure in response to the transaction failure.
Preferably, the intelligent message analysis model may include a sentiment analysis sub-model and an anomaly analysis sub-model, and is obtained by training in the following manner: collect transaction return messages from the log system, and use them as training data for sentiment analysis training and anomaly analysis training with natural language processing technology, so that the sentiment analysis sub-model can output whether the transaction succeeded according to the input data, and the anomaly analysis sub-model can output the transaction failure type according to the input data.
Preferably, the test script generating unit 330 performing the parameterization transformation on the message template according to parameterization rules to form the test script includes: replacing the key fields in the message template with specific character strings using regular expressions to form the test script.
Preferably, the load test unit 340 performing the transaction performance test on the system to be tested by using the test script and the test interface information includes: replacing the specific character strings in the test script with preset test data, and using the test script to test the transaction performance of the test interface of the system to be tested.
Preferably, as shown in fig. 4, the above apparatus further includes a data updating unit 380, connected to the message parsing unit 350 and the load test unit 340 respectively, and configured to update the preset test data according to the parsing results of the messages returned by the system to be tested, retaining only the test data of successful transactions.
Preferably, as shown in fig. 4, the types of transaction failure include systematic errors and data errors, and the apparatus further includes: a first adjusting unit 390, connected to the message parsing unit 350 and configured to fine-tune the parameters of the system to be tested when the systematic errors in the parsing results of the messages returned by the system to be tested exceed a first preset threshold; and a second adjusting unit 400, connected to the message parsing unit 350 and the load test unit 340 respectively, and configured to adjust the test data in the test script when the data errors in the parsing results exceed a second preset threshold.
Preferably, as shown in fig. 4, the automated performance test apparatus of the present embodiment further includes:
the report generating unit 360 is configured to collect statistics on and summarize the analysis results of the messages returned by the system to be tested to form a statistical report, where the statistical report includes whether each transaction succeeded, the proportion of each transaction state, and the system performance indicators over the period from the first transaction to the last transaction;
and a notification unit 370 for notifying the tester of the statistical report.
According to the above technical solution, the automated performance testing device provided by the invention can judge and classify failure text through the intelligent message analysis model, thereby reducing manual steps, lowering the complexity of system performance testing work, and improving the efficiency of system performance testing. In addition, the device can automatically update the preset test data according to the analysis results, so that automatic and intelligent screening of the test data is achieved and the efficiency of performance testing work is greatly improved.
The embodiment of the invention also provides electronic equipment, which comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the processor realizes the method when executing the program.
Embodiments of the present invention also provide a computer program product comprising a computer program/instruction which, when executed by a processor, performs the steps of the above method.
As shown in fig. 5, the electronic device 600 may further include: a communication module 110, an input unit 120, an audio processor 130, a display 160, a power supply 170. It is noted that the electronic device 600 need not include all of the components shown in fig. 5; in addition, the electronic device 600 may further include components not shown in fig. 5, to which reference is made to the prior art.
As shown in fig. 5, the central processor 100, sometimes also referred to as a controller or operational control, may include a microprocessor or other processor device and/or logic device, which central processor 100 receives inputs and controls the operation of the various components of the electronic device 600.
The memory 140 may be, for example, one or more of a buffer, a flash memory, a hard drive, a removable medium, a volatile memory, a non-volatile memory, or other suitable device. It may store the information about failures described above as well as the programs for processing that information, and the central processor 100 can execute the programs stored in the memory 140 to realize information storage, processing, and the like.
The input unit 120 provides an input to the central processor 100. The input unit 120 is, for example, a key or a touch input device. The power supply 170 is used to provide power to the electronic device 600. The display 160 is used for displaying display objects such as images and characters. The display may be, for example, but not limited to, an LCD display.
The memory 140 may be a solid-state memory, such as a Read Only Memory (ROM), a Random Access Memory (RAM), a SIM card, or the like. It may also be a memory that holds information even when powered down and that can be selectively erased and supplied with further data, an example of which is sometimes referred to as an EPROM or the like. The memory 140 may also be some other type of device. The memory 140 includes a buffer memory 141 (sometimes referred to as a buffer). The memory 140 may include an application/function storage 142, and the application/function storage 142 is used to store application programs and function programs, or procedures for executing the operations of the electronic device 600 by the central processor 100.
The memory 140 may also include a data store 143, the data store 143 for storing data, such as contacts, digital data, pictures, sounds, and/or any other data used by the electronic device. The driver storage 144 of the memory 140 may include various drivers of the electronic device for communication functions and/or for performing other functions of the electronic device (e.g., messaging applications, address book applications, etc.).
The communication module 110 is a transmitter/receiver 110 that transmits and receives signals via an antenna 111. A communication module (transmitter/receiver) 110 is coupled to the central processor 100 to provide an input signal and receive an output signal, which may be the same as in the case of a conventional mobile communication terminal.
Based on different communication technologies, a plurality of communication modules 110, such as a cellular network module, a bluetooth module, and/or a wireless local area network module, etc., may be provided in the same electronic device. The communication module (transmitter/receiver) 110 is also coupled to a speaker 131 and a microphone 132 via an audio processor 130 to provide audio output via the speaker 131 and to receive audio input from the microphone 132 to implement usual telecommunication functions. The audio processor 130 may include any suitable buffers, decoders, amplifiers and so forth. In addition, the audio processor 130 is also coupled to the central processor 100 so that sound can be recorded locally through the microphone 132 and so that sound stored locally can be played through the speaker 131.
It will be appreciated by those skilled in the art that embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The principles and embodiments of the present invention have been described in detail with reference to specific examples, which are provided only to help understand the method and core ideas of the present invention. Meanwhile, those skilled in the art may make changes to the specific embodiments and the scope of application in accordance with the ideas of the present invention; in view of the above, the contents of this description should not be construed as limiting the present invention.

Claims (10)

1. An automated performance testing method, the method comprising:
according to the test interface information, retrieving a history log of a test environment, and obtaining history log information of the test interface;
analyzing the historical log information of the test interface by using a message intelligent analysis model, and taking a message of a successful transaction as a message template;
carrying out parameterization transformation on the message template according to parameterization rules to form a test script;
carrying out transaction performance test on the system to be tested by utilizing the test script and the test interface information;
and analyzing the message returned by the system to be tested by using the message intelligent analysis model, judging whether the transaction is successful or not, responding to the transaction failure, and further analyzing the type of the transaction failure.
2. The automated performance testing method of claim 1, wherein the message intelligent analysis model comprises a sentiment analysis sub-model and an anomaly analysis sub-model, and the message intelligent analysis model is obtained by training in the following manner: collecting transaction return messages from a log system, and using the transaction return messages as training data for sentiment analysis training and anomaly analysis training with natural language processing technology, so that the sentiment analysis sub-model can output whether the transaction is successful according to input data, and the anomaly analysis sub-model can output the transaction failure type according to the input data.
3. The automated performance testing method of claim 1, wherein performing the parameterization transformation on the message template according to the parameterization rules to form the test script comprises: replacing key fields in the message template with specific character strings by using a regular expression to form the test script.
4. The automated performance testing method of claim 3, wherein performing the transaction performance test on the system to be tested by using the test script and the test interface information comprises: replacing the specific character strings in the test script with preset test data, and testing the transaction performance of the test interface of the system to be tested by using the test script.
5. The automated performance testing method of claim 4, further comprising: updating the preset test data according to the analysis result of the message returned by the system to be tested, and retaining only the test data of successful transactions.
6. The automated performance testing method of claim 1, wherein the types of transaction failure include systematic errors and data errors, the method further comprising: if the systematic errors in the analysis result of the message returned by the system to be tested exceed a first preset threshold, fine-tuning the parameters of the system to be tested; and if the data errors in the analysis result of the message returned by the system to be tested exceed a second preset threshold, adjusting the test data in the test script.
7. The automated performance testing method of claim 1, wherein the method further comprises:
collecting statistics on and summarizing the analysis results of the messages returned by the system to be tested to form a statistical report, wherein the statistical report comprises whether each transaction is successful, a proportion of each transaction state, and system performance indicators over the period from the first transaction to the last transaction;
and notifying a tester of the statistical report.
8. An automated performance testing apparatus, the apparatus comprising:
the history log obtaining unit is used for retrieving the history log of the test environment according to the test interface information and obtaining the history log information of the test interface;
the message template generating unit is used for analyzing the history log information of the test interface by utilizing a message intelligent analysis model, and taking a message of a successful transaction as a message template;
the test script generation unit is used for carrying out parameterization transformation on the message template according to parameterization rules to form a test script;
the load test unit is used for testing the transaction performance of the system to be tested by utilizing the test script and the test interface information;
and the message analysis unit is used for analyzing the message returned by the system to be tested by utilizing the message intelligent analysis model, judging whether the transaction is successful or not, and further analyzing the type of the transaction failure in response to the transaction failure.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the steps of the method of any of claims 1 to 7 when the computer program is executed by the processor.
10. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method according to any one of claims 1 to 7.
CN202310244454.2A 2023-03-09 2023-03-09 Automatic performance test method and device Pending CN116244202A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310244454.2A CN116244202A (en) 2023-03-09 2023-03-09 Automatic performance test method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310244454.2A CN116244202A (en) 2023-03-09 2023-03-09 Automatic performance test method and device

Publications (1)

Publication Number Publication Date
CN116244202A true CN116244202A (en) 2023-06-09

Family

ID=86633058

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310244454.2A Pending CN116244202A (en) 2023-03-09 2023-03-09 Automatic performance test method and device

Country Status (1)

Country Link
CN (1) CN116244202A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117194134A (en) * 2023-11-08 2023-12-08 武汉凌久微电子有限公司 Display output automatic test method and system based on domestic operating system
CN117194134B (en) * 2023-11-08 2024-01-30 武汉凌久微电子有限公司 Display output automatic test method and system based on domestic operating system

Similar Documents

Publication Publication Date Title
CN112783793B (en) Automatic interface test system and method
CN107370637B (en) Vehicle-mounted ECU communication function automatic test system and method
CN115422065A (en) Fault positioning method and device based on code coverage rate
CN116244202A (en) Automatic performance test method and device
CN112035325A (en) Automatic monitoring method and device for text robot
CN112181784B (en) Code fault analysis method and system based on byte code injection
CN113095782A (en) Automatic approval decision-making method and device
CN112882934B (en) Test analysis method and system based on defect growth
CN113128986A (en) Error reporting processing method and device for long-link transaction
CN114840421A (en) Log data processing method and device
CN112860527A (en) Fault monitoring method and device of application server
CN112631850A (en) Fault scene simulation method and device
CN113392009A (en) Exception handling method and device for automatic test
CN112035666A (en) Method and device for optimizing cross validation of text robot
CN113515577A (en) Data preprocessing method and device
CN112527631A (en) bug positioning method, system, electronic equipment and storage medium
CN114500215B (en) Centralized management method, device and equipment of storage equipment and readable storage medium
CN114201145A (en) Risk demand classification detection method and system based on configurated semantic recognition
US20220114208A1 (en) Sound recognition model training method and system and non-transitory computer-readable medium
CN117370141A (en) Test report generation method and device, electronic equipment and storage medium
CN113051292A (en) Data checking method and device
CN113778901A (en) Automatic evaluation method and device for test cases
CN115472180A (en) Automatic pressure measurement method and device, storage medium and electronic device
CN115907407A (en) Intelligent community resource data processing method and device
CN117009156A (en) Probability-based hard disk fault analysis method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination