CN116680185A - Test automation execution optimization method, device, equipment and storage medium - Google Patents

Test automation execution optimization method, device, equipment and storage medium

Info

Publication number
CN116680185A
CN116680185A (Application CN202310676914.9A)
Authority
CN
China
Prior art keywords
data
test
execution
test data
state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310676914.9A
Other languages
Chinese (zh)
Inventor
Wang Shanshan (王闪闪)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Bank Co Ltd
Original Assignee
Ping An Bank Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Bank Co Ltd filed Critical Ping An Bank Co Ltd
Priority to CN202310676914.9A priority Critical patent/CN116680185A/en
Publication of CN116680185A publication Critical patent/CN116680185A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00: Error detection; Error correction; Monitoring
    • G06F11/36: Preventing errors by testing or debugging software
    • G06F11/3668: Software testing
    • G06F11/3672: Test management
    • G06F11/3688: Test management for test execution, e.g. scheduling of test suites
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The application discloses a test automation execution optimization method, device, equipment and storage medium. Test data are recorded during the automated execution of test cases; at each node of the automated execution, the data state of the test data is checked by assertion, and any defects generated are recorded as well. The automated execution order of the test cases is then adjusted dynamically and their execution plan optimized, improving test collaboration efficiency and, in turn, the reliability of the system under test. This solves the technical problem of low test efficiency that arises because no existing scheme dynamically adjusts and optimizes the test strategy, and checking and repair are performed only after execution fails.

Description

Test automation execution optimization method, device, equipment and storage medium
Technical Field
The present application relates to the technical field of financial science and technology, and in particular, to a method, an apparatus, a device, and a storage medium for testing automation execution optimization.
Background
Automated execution replaces a manual process with an automated one within a given flow. It differs from ordinary single-interface testing: a single-interface test has low requirements for test data and needs neither data hand-off between steps nor the creation and destruction of staged data, whereas flow-level automated execution places high demands on test data.
During the testing of a banking system, continuous automated test execution usually has to be performed over multiple test cases. For loan-flow data, for example, a loan application test case, a loan approval test case, a loan release test case, and so on must all be executed.
As systems grow in scale, testing becomes an important way of ensuring system quality; a complete testing process covers data recording, data maintenance, data verification, and other aspects.
At present the industry has no method for adjusting the strategy in a timely manner: most regression test cases are simply executed in full with no prioritization, the test data needed during execution are re-created from scratch rather than built from historical data, the execution process as a whole is never dynamically adjusted or optimized, and checking and repair happen only after execution fails.
Disclosure of Invention
The application provides a test automation execution optimization method, device, equipment and storage medium, solving the technical problem of low test efficiency caused by the absence of any scheme for dynamically adjusting and optimizing the test strategy, where checking and repair are carried out only after execution fails.
In view of this, a first aspect of the present application provides a test automation execution optimization method, the method comprising:
S1, reading test data from a data pool;
S2, automatically executing each test case with the test data, and recording the state transitions of the test data during execution;
S3, if the data state of the test data at any node of the automated execution is judged incorrect by assertion, recording the generated defect;
S4, adjusting the automated execution order of the test cases according to the recorded state transitions of the test data during execution and the generated defects.
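Purely as an illustration, steps S1 to S4 can be sketched in Python. Every name here (TestCase, run_cycle, the shape of the data pool) is hypothetical and not part of the claimed method:

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    name: str
    check_failures: int = 0  # data state check (assertion) failures
    defects: int = 0         # defects generated
    executions: int = 0      # recorded execution count

def run_cycle(cases, data_pool, execute):
    """One S1-S4 cycle: read data, execute, assert, then reorder by priority."""
    defect_log = []
    for case in cases:                       # S2: automated execution of each case
        data = data_pool[case.name]          # S1: read test data from the data pool
        case.executions += 1
        if not execute(case, data):          # S3: an assertion failed at some node
            case.check_failures += 1
            case.defects += 1
            defect_log.append(case.name)
    # S4: more failures/defects/executions => higher priority, executed first
    cases.sort(key=lambda c: (c.check_failures, c.defects, c.executions),
               reverse=True)
    return defect_log
```

The sort key is only one possible way of combining the counts; the claims do not fix a particular formula.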
Optionally, the step S3 further includes:
if the data state of the test data at each node of the automated execution is judged correct by assertion, the execution data of each test case is recorded into the data pool as historical data.
Optionally, the attributes of the test data include reusable shared data and isolated data bound to a defined application scenario.
Optionally, the method further comprises:
duplicate test data is created in the data pool by inheritance from historical data.
Optionally, the method further comprises:
if the attribute of the duplicate test data is isolated data, creating new test data to overwrite the duplicate test data;
and if the attribute of the duplicate test data is shared data, retaining the duplicate test data.
Optionally, the method further comprises:
and periodically cleaning the data pool, deleting identical test data.
Optionally, the step S4 specifically includes:
counting the number of data state check failures of each test case from the recorded state transitions of the test data during execution;
counting the number of defects generated by each test case from the recorded defects;
counting the recorded number of executions of each test case;
and adjusting the priority of each test case based on its number of data state check failures, number of generated defects, and number of executions, ordering the automated execution sequence from high priority to low.
A second aspect of the present application provides a test automation execution optimization apparatus, the apparatus comprising:
a reading unit for reading test data from the data pool;
an execution unit for automatically executing each test case with the test data and recording the state transitions of the test data during execution;
a judging unit for recording the generated defect if the data state of the test data at any node of the automated execution is judged incorrect by assertion;
and an optimizing unit for adjusting the automated execution order of the test cases according to the recorded state transitions of the test data during execution and the generated defects.
A third aspect of the application provides a test automation execution optimization apparatus, the apparatus comprising a processor and a memory:
the memory is used for storing program codes and transmitting the program codes to the processor;
the processor is configured to execute the steps of the test automation execution optimization method described in the first aspect above according to the instructions in the program code.
A fourth aspect of the present application provides a computer readable storage medium storing program code for executing the test automation execution optimization method according to the first aspect.
From the above technical solutions, the embodiment of the present application has the following advantages:
the application provides a test automation execution optimization method, a device, equipment and a storage medium, which are used for recording test data in the test case automation execution process, judging the data state of the test data of each node in the automation execution process through assertion, recording generated defects together, dynamically adjusting the automation execution familiarity of the test case, optimizing the execution plan of the test case, improving the test cooperation efficiency, further guaranteeing the reliability of a test system, and solving the technical problems that the test efficiency is low because the test strategy optimization dynamic adjustment scheme does not exist at present, and checking and repairing are only carried out after the execution failure.
Drawings
FIG. 1 is a flow chart of a method for optimizing test automation in an embodiment of the present application;
FIG. 2 is a schematic diagram of a test automation execution optimization device according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a test automation execution optimization device according to an embodiment of the present application.
Detailed Description
In order to make the present application better understood by those skilled in the art, the following description will clearly and completely describe the technical solutions in the embodiments of the present application with reference to the accompanying drawings, and it is apparent that the described embodiments are only some embodiments of the present application, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The application provides a test automation execution optimization method, device, equipment and storage medium, solving the technical problem of low test efficiency caused by the absence of any scheme for dynamically adjusting and optimizing the test strategy, where checking and repair are carried out only after execution fails.
For ease of understanding, referring to fig. 1, fig. 1 is a flowchart of a method for performing optimization of test automation in an embodiment of the present application, as shown in fig. 1, specifically:
S1, reading test data from a data pool;
It should be noted that the test data may be read from the data pool. The attributes of the test data include reusable shared data and isolated data bound to a defined application scenario, and the attribute of each piece of test data may be marked according to the actual situation.
Isolated data can only be used in its defined application scenario and cannot be used again after use.
S2, automatically executing each test case with the test data, and recording the state transitions of the test data during execution;
It should be noted that, based on the read test data, the test cases are executed automatically in a preset order, and the state transitions of the test data during execution are recorded in the data record repository.
A state transition of the test data is a change in the state of the data. In actual testing, some data states are irreversible. For example, loan-flow data drives a test of the loan flow: first a loan is applied for, then it goes to loan approval, then to loan release.
It will be appreciated that the test data is in a first state in the loan application phase, a second state in the loan approval phase, and a third state in the loan release phase.
It is also possible for the test data to enter a fourth state during the loan approval phase. All state transitions of the test data during execution therefore need to be recorded in the data record repository.
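The state transitions described above can be pictured with a minimal sketch (all names hypothetical) in which every transition of a piece of loan-flow data is appended to a record repository:

```python
class DataRecordRepository:
    """Hypothetical repository storing (data_id, old_state, new_state) tuples."""
    def __init__(self):
        self.transitions = []

    def record(self, data_id, old_state, new_state):
        self.transitions.append((data_id, old_state, new_state))

def advance(repo, data_id, current, new_state):
    """Move test data to a new state and log the (irreversible) transition."""
    repo.record(data_id, current, new_state)
    return new_state

# Loan-flow data moves through the first, second, and third states in order.
repo = DataRecordRepository()
state = "applied"                                      # loan application phase
state = advance(repo, "loan-001", state, "approved")   # loan approval phase
state = advance(repo, "loan-001", state, "released")   # loan release phase
```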
S3, if the data state of the test data at any node of the automated execution is judged incorrect by assertion, recording the generated defects;
It should be noted that an assertion checks whether the data state of the test data at each node satisfies certain specific conditions or business logic; if so, the assertion passes, otherwise it fails.
If an assertion fails, a defect is deemed to have been generated and must be recorded in the data record repository.
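A hedged sketch of this assertion step, with hypothetical names: the data state at a node is compared against the states the business logic allows, and a failed assertion appends a defect record:

```python
def assert_state(data_id, actual_state, allowed_states, defect_repo):
    """Assert a node's data state; on failure, record the defect and return False."""
    if actual_state in allowed_states:
        return True                             # assertion passes
    defect_repo.append({"data_id": data_id,     # assertion fails: defect generated
                        "actual": actual_state,
                        "allowed": sorted(allowed_states)})
    return False

defect_repo = []
ok = assert_state("loan-001", "approved", {"approved"}, defect_repo)
bad = assert_state("loan-002", "applied", {"approved"}, defect_repo)
```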
S4, adjusting the automated execution order of the test cases according to the recorded state transitions of the test data during execution and the generated defects.
The method specifically comprises the following steps:
counting the number of data state check failures of each test case from the recorded state transitions of the test data during execution;
counting the number of defects generated by each test case from the recorded defects;
counting the recorded number of executions of each test case;
and adjusting the priority of each test case based on its number of data state check failures, number of generated defects, and number of executions, ordering the automated execution sequence from high priority to low.
It should be noted that test cases with more data state check failures, more generated defects, and more executions can have their priority raised.
It can be understood that more data state check failures and more generated defects indicate that the scenario or node covered by the test case is prone to problems, so its test execution should be prioritized;
a larger execution count indicates that the test case probably covers an underlying scenario or one that is called frequently, so it may also be executed preferentially.
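The prioritization rule just described can be sketched as follows; the lexicographic comparison of (check failures, defects, executions) is only one possible interpretation, since the description does not fix how the three counts are combined:

```python
def prioritize(stats):
    """Order test cases from highest to lowest priority.

    stats maps a case name to the tuple
    (check_failures, defects, executions); Python's tuple comparison
    makes failures dominate, then defects, then execution count.
    """
    return sorted(stats, key=lambda name: stats[name], reverse=True)

stats = {
    "loan_application": (0, 0, 12),  # frequently executed base scenario
    "loan_approval":    (3, 2, 5),   # error-prone node
    "loan_release":     (1, 1, 5),
}
order = prioritize(stats)
```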
Further, step S3 further includes:
if the data state of the test data at each node of the automated execution is judged correct by assertion, the execution data of each test case is recorded into the data pool as historical data.
A passing assertion means the test data is sound, and it can be recorded into the data pool as historical data for reuse.
Further, the method further comprises the following steps:
creating duplicate test data in the data pool by inheritance from historical data;
if the attribute of the duplicate test data is isolated data, creating new test data to overwrite the duplicate test data;
and if the attribute of the duplicate test data is shared data, retaining the duplicate test data.
It should be noted that creating new test data from historical data can be decided by the attributes of the test data. First, all the historical data can be integrated to create identical duplicate test data; however, the duplicate test data may include isolated data that cannot be reused, in which case new test data must be created to overwrite the duplicates whose attribute is isolated data.
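A minimal sketch of this inheritance rule (record shape and names are hypothetical): history is copied wholesale, and only items marked as isolated data are overwritten with fresh payloads:

```python
def inherit_test_data(history, make_fresh):
    """Build duplicate test data from history; overwrite isolated items.

    history: list of dicts with 'id', 'attribute' ('shared' or 'isolated'),
    and 'payload'; make_fresh produces a new payload for an isolated item.
    """
    duplicates = []
    for item in history:
        copy = dict(item)
        if copy["attribute"] == "isolated":
            copy["payload"] = make_fresh(copy["id"])  # cannot be reused: overwrite
        duplicates.append(copy)                       # shared data kept as-is
    return duplicates

history = [
    {"id": "d1", "attribute": "shared",   "payload": "common-customer"},
    {"id": "d2", "attribute": "isolated", "payload": "used-loan-contract"},
]
pool = inherit_test_data(history, lambda i: f"fresh-{i}")
```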
Further, the method further comprises the following steps:
The data pool is cleaned periodically and identical test data are deleted.
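Periodic cleaning that deletes identical test data might look like the following sketch (the flat-dict record shape is an assumption):

```python
def clean_pool(pool):
    """Periodic cleaning: drop test data records that are exact duplicates."""
    seen, cleaned = set(), []
    for record in pool:
        key = tuple(sorted(record.items()))  # hashable identity of the record
        if key not in seen:
            seen.add(key)
            cleaned.append(record)
    return cleaned

pool = [{"id": "d1", "payload": "x"},
        {"id": "d1", "payload": "x"},   # identical duplicate, will be removed
        {"id": "d2", "payload": "y"}]
```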
Referring to fig. 2, fig. 2 is a schematic structural diagram of a test automation execution optimization device according to an embodiment of the present application, and as shown in fig. 2, the structure is specifically as follows:
a reading unit 201, configured to read test data from the data pool, where the attributes of the test data include reusable shared data and isolated data bound to a defined application scenario;
It should be noted that the test data may be read from the data pool, and the attribute of each piece of test data may be marked according to the actual situation.
Isolated data can only be used in its defined application scenario and cannot be used again after use.
an execution unit 202, configured to automatically execute each test case with the test data and record the state transitions of the test data during execution;
It should be noted that, based on the read test data, the test cases are executed automatically in a preset order, and the state transitions of the test data during execution are recorded in the data record repository.
A state transition of the test data is a change in the state of the data. In actual testing, some data states are irreversible. For example, loan-flow data drives a test of the loan flow: first a loan is applied for, then it goes to loan approval, then to loan release.
It will be appreciated that the test data is in a first state in the loan application phase, a second state in the loan approval phase, and a third state in the loan release phase.
It is also possible for the test data to enter a fourth state during the loan approval phase. All state transitions of the test data during execution therefore need to be recorded in the data record repository.
a judging unit 203, configured to record the generated defect if the data state of the test data at any node of the automated execution is judged incorrect by assertion;
the judging unit 203 is further configured to:
record the execution data of each test case into the data pool as historical data if the data state of the test data at each node of the automated execution is judged correct by assertion;
It should be noted that an assertion checks whether the data state of the test data at each node satisfies certain specific conditions or business logic; if so, the assertion passes, otherwise it fails.
If an assertion fails, a defect is deemed to have been generated and must be recorded in the data record repository.
If the assertion passes, the test data is sound and can be recorded into the data pool as historical data for reuse.
an optimizing unit 204, configured to adjust the automated execution order of the test cases according to the recorded state transitions of the test data during execution and the generated defects;
the optimizing unit 204 is specifically configured to:
count the number of data state check failures of each test case from the recorded state transitions of the test data during execution;
count the number of defects generated by each test case from the recorded defects;
count the recorded number of executions of each test case;
and adjust the priority of each test case based on its number of data state check failures, number of generated defects, and number of executions, ordering the automated execution sequence from high priority to low.
It should be noted that test cases with more data state check failures, more generated defects, and more executions can have their priority raised.
It can be understood that more data state check failures and more generated defects indicate that the scenario or node covered by the test case is prone to problems, so its test execution should be prioritized;
a larger execution count indicates that the test case probably covers an underlying scenario or one that is called frequently, so it may also be executed preferentially.
a data construction unit, configured to create duplicate test data in the data pool by inheritance from historical data;
if the attribute of the duplicate test data is isolated data, new test data is created to overwrite the duplicate test data;
and if the attribute of the duplicate test data is shared data, the duplicate test data is retained.
It should be noted that creating new test data from historical data can be decided by the attributes of the test data. First, all the historical data can be integrated to create identical duplicate test data; however, the duplicate test data may include isolated data that cannot be reused, in which case new test data must be created to overwrite the duplicates whose attribute is isolated data.
a data cleaning unit, configured to periodically clean the data pool and delete identical test data.
The embodiment of the present application further provides another test automation execution optimization device, as shown in fig. 3. For convenience of explanation, only the parts relevant to the embodiment are shown; for specific technical details not disclosed, please refer to the method part of the embodiment. The terminal may be any terminal device, including a mobile phone, a tablet computer, a personal digital assistant (PDA), a point-of-sale (POS) terminal, a vehicle-mounted computer, and the like; the mobile phone is taken as an example of the terminal:
fig. 3 is a block diagram showing a part of a structure of a mobile phone related to a terminal provided by an embodiment of the present application. Referring to fig. 3, the mobile phone includes: radio Frequency (RF) circuit 1010, memory 1020, input unit 1030, display unit 1040, sensor 1050, audio circuit 1060, wireless fidelity (wireless fidelity, wiFi) module 1070, processor 1080, and power source 1090. Those skilled in the art will appreciate that the handset configuration shown in fig. 3 is not limiting of the handset and may include more or fewer components than shown, or may combine certain components, or may be arranged in a different arrangement of components.
The following describes the components of the mobile phone in detail with reference to fig. 3:
the RF circuit 1010 may be used for receiving and transmitting signals during a message or a call, and particularly, after receiving downlink information of a base station, the signal is processed by the processor 1080; in addition, the data of the design uplink is sent to the base station. Generally, RF circuitry 1010 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier (English full name: lowNoiseAmplifier, english abbreviation: LNA), a duplexer, and the like. In addition, the RF circuitry 1010 may also communicate with networks and other devices via wireless communications. The wireless communication may use any communication standard or protocol, including but not limited to global system for mobile communications (english: global System ofMobile communication, english: GSM), general packet radio service (english: generalPacket Radio Service, GPRS), code division multiple access (english: code Division Multiple Access, english: CDMA), wideband code division multiple access (english: wideband Code DivisionMultipleAccess, english: WCDMA), long term evolution (english: long TermEvolution, english: LTE), email, short message service (english: shortMessaging Service, SMS), and the like.
The memory 1020 may be used to store software programs and modules; the processor 1080 performs the various functional applications and data processing of the mobile phone by running the software programs and modules stored in the memory 1020. The memory 1020 may mainly include a program storage area and a data storage area: the program storage area may store an operating system, the application programs required for at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data or a phonebook), and the like. In addition, the memory 1020 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The input unit 1030 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the mobile phone. Specifically, the input unit 1030 may include a touch panel 1031 and other input devices 1032. The touch panel 1031, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations performed on or near the touch panel 1031 with a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connection device according to a preset program. Optionally, the touch panel 1031 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch-point coordinates, and sends them to the processor 1080, and it can also receive and execute commands sent by the processor 1080. The touch panel 1031 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 1031, the input unit 1030 may include other input devices 1032, which may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, a joystick, and the like.
The display unit 1040 may be used to display information input by or provided to the user, as well as the various menus of the mobile phone. The display unit 1040 may include a display panel 1041, which may optionally be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like. Further, the touch panel 1031 may overlay the display panel 1041; when the touch panel 1031 detects a touch operation on or near it, the operation is passed to the processor 1080 to determine the type of touch event, and the processor 1080 then provides a corresponding visual output on the display panel 1041 according to that type. Although in fig. 3 the touch panel 1031 and the display panel 1041 are two independent components implementing the input and output functions of the mobile phone, in some embodiments they may be integrated to implement both functions.
The mobile phone may also include at least one sensor 1050, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor: the ambient light sensor may adjust the brightness of the display panel 1041 according to the brightness of ambient light, and the proximity sensor may turn off the display panel 1041 and/or the backlight when the mobile phone is moved to the ear. As one kind of motion sensor, an accelerometer can detect the magnitude of acceleration in all directions (generally three axes) and, when stationary, the magnitude and direction of gravity; it can be used for applications that recognize the posture of the phone (such as landscape/portrait switching, related games, and magnetometer calibration) and for vibration-recognition functions (such as a pedometer or tap detection). Other sensors that may also be configured on the phone, such as gyroscopes, barometers, hygrometers, thermometers, and infrared sensors, are not described in detail here.
The audio circuit 1060, a speaker 1061, and a microphone 1062 may provide an audio interface between the user and the mobile phone. On one hand, the audio circuit 1060 may transmit the electrical signal converted from received audio data to the speaker 1061, which converts it into a sound signal for output; on the other hand, the microphone 1062 converts collected sound signals into electrical signals, which the audio circuit 1060 receives and converts into audio data; after the audio data is processed by the processor 1080, it may be sent, for example, to another mobile phone via the RF circuit 1010, or output to the memory 1020 for further processing.
WiFi is a short-range wireless transmission technology; through the WiFi module 1070, the mobile phone can help the user send and receive email, browse web pages, access streaming media, and so on, providing wireless broadband Internet access. Although fig. 3 shows the WiFi module 1070, it is understood that the module is not an essential part of the mobile phone and may be omitted as required without changing the essence of the application.
Processor 1080 is the control center of the handset, connects the various parts of the entire handset using various interfaces and lines, and performs various functions and processes of the handset by running or executing software programs and/or modules stored in memory 1020, and invoking data stored in memory 1020, thereby performing overall monitoring of the handset. Optionally, processor 1080 may include one or more processing units; preferably, processor 1080 may integrate an application processor primarily handling operating systems, user interfaces, applications, etc., with a modem processor primarily handling wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 1080.
The mobile phone further includes a power source 1090 (e.g., a battery) for powering the various components; preferably, the power source may be logically connected to the processor 1080 through a power management system, so that functions such as charging, discharging, and power-consumption management are handled by the power management system.
Although not shown, the mobile phone may further include a camera, a bluetooth module, etc., which will not be described herein.
In an embodiment of the present application, the processor 1080 included in the terminal further has the following functions:
s1, reading test data from a data pool;
s2, automatically executing each test case with the test data, and recording the state transitions of the test data during execution;
s3, if the data state of the test data at any node during the automated execution is determined by assertion to be incorrect, recording the generated defects;
s4, adjusting the automated execution order of the test cases according to the recorded state transitions of the test data during execution and the generated defects.
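As an illustrative sketch only, the four steps above might be expressed in Python as follows; the names `execute_cases` and `expected_states`, the shape of the data pool, and the defect-count reordering rule are all hypothetical assumptions, not details fixed by the patent:

```python
def execute_cases(data_pool, cases, expected_states):
    """Sketch of s1-s4: read test data, run each case, assert the data
    state at each node, record transitions and defects, reorder cases."""
    transitions, defects = {}, {}
    for name, run in cases:
        data = data_pool["test_data"]                    # s1: read from the data pool
        state = run(data)                                # s2: automated execution
        transitions.setdefault(name, []).append(state)   # record state transition
        if state != expected_states[name]:               # s3: assertion on data state
            defects[name] = defects.get(name, 0) + 1     # record the generated defect
    # s4: adjust execution order -- here, cases with more defects run earlier
    reordered = sorted(cases, key=lambda c: defects.get(c[0], 0), reverse=True)
    return transitions, defects, reordered
```

Because `sorted` is stable, cases with equal defect counts keep their original relative order across passes.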
The embodiments of the present application also provide a computer readable storage medium storing program code for executing any one of the test automation execution optimization methods described in the foregoing embodiments.
In the embodiments of the present application, a test automation execution optimization method, device, equipment, and storage medium are provided. Test data in the automated execution of test cases are recorded, the data state of the test data at each node during the automated execution is checked by assertion, and the generated defects are recorded together, so that the automated execution order of the test cases is dynamically adjusted and the execution plan of the test cases is optimized. This improves the collaborative efficiency of testing and further ensures the reliability of the test system, thereby solving the technical problem that test efficiency is low because no dynamic adjustment scheme for test strategy optimization currently exists and checking and repair are performed only after execution fails.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
The terms "first," "second," "third," "fourth," and the like in the description of the application and in the above figures, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the application described herein may be implemented, for example, in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be understood that in the present application, "at least one (item)" means one or more, and "a plurality" means two or more. "And/or" describes the association relationship of associated objects and indicates that three relationships may exist; for example, "A and/or B" may represent: only A exists, only B exists, and both A and B exist, where A and B may be singular or plural. The character "/" generally indicates that the associated objects are in an "or" relationship. "At least one of" and similar expressions mean any combination of the listed items, including any combination of single items or plural items. For example, at least one (item) of a, b, or c may represent: a; b; c; "a and b"; "a and c"; "b and c"; or "a and b and c", where a, b, and c may be singular or plural.
In the several embodiments provided in the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or any other medium capable of storing program code.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (10)

1. A method for optimizing test automation execution, comprising:
s1, reading test data from a data pool;
s2, automatically executing each test case with the test data, and recording the state transitions of the test data during execution;
s3, if the data state of the test data at any node during the automated execution is determined by assertion to be incorrect, recording the generated defects;
s4, adjusting the automated execution order of the test cases according to the recorded state transitions of the test data during execution and the generated defects.
2. The method for optimizing test automation execution according to claim 1, wherein the step S3 further comprises:
if the data state of the test data at each node during the automated execution is determined by assertion to be correct, recording the execution data of each test case into the data pool as historical data.
3. The method of claim 2, wherein the attributes of the test data include shared data, which is reusable, and isolation data, which is limited to a specific application scenario.
4. The test automation execution optimization method of claim 3, further comprising:
creating retest data in the data pool through inheritance based on the historical data.
5. The method of claim 4, further comprising:
if the attribute of the retest data is isolation data, creating new test data to cover the retest data;
and if the attribute of the retest data is shared data, retaining the retest data.
6. The method of claim 5, further comprising:
and periodically cleaning the data pool, deleting duplicate test data.
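The data-pool maintenance of claims 4-6 could be sketched as follows; the list-of-dicts pool representation, the `id` and `attribute` field names, and the duplicate key are hypothetical choices, since the claims do not specify a data layout:

```python
def add_retest_data(pool, base_record, new_values):
    """Claims 4-5 sketch: create retest data by inheriting a historical
    record; new data covers isolation data, while shared data is kept."""
    record = {**base_record, **new_values}        # inherit, then override fields
    for i, existing in enumerate(pool):
        if existing["id"] == record["id"]:
            if existing["attribute"] == "isolated":
                pool[i] = record                  # cover the isolation data
            return pool                           # shared data is retained as-is
    pool.append(record)                           # no clash: just add the record
    return pool

def clean_pool(pool):
    """Claim 6 sketch: scheduled cleaning -- delete identical test data,
    keeping the first occurrence of each record."""
    seen, cleaned = set(), []
    for record in pool:
        key = tuple(sorted(record.items()))       # whole record as duplicate key
        if key not in seen:
            seen.add(key)
            cleaned.append(record)
    return cleaned
```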
7. The method for optimizing the automated test execution according to claim 1, wherein the step S4 specifically comprises:
counting the number of data state verification failures of each test case according to the recorded state transitions of the test data during execution;
counting the times of generating defects of each test case according to the generated defects;
counting the recorded execution times of each test case;
and adjusting the priority of each test case based on the data state verification failure times, the defect generation times and the execution times of each test case, and sequencing the automatic execution sequence from high priority to low priority.
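The statistics and reordering of claim 7 could be sketched as a weighted score over the three counts; the weights and the decision to count executions positively are assumptions, since the claim does not fix how the counts are combined:

```python
def prioritize(stats, weights=(3.0, 2.0, 1.0)):
    """Claim 7 sketch: score each test case from its counts of data-state
    verification failures, generated defects, and executions (hypothetical
    weights), then sort the automated execution order from high to low."""
    w_fail, w_defect, w_exec = weights
    scores = {
        name: w_fail * s["failures"] + w_defect * s["defects"] + w_exec * s["runs"]
        for name, s in stats.items()
    }
    return sorted(scores, key=scores.get, reverse=True)
```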
8. A test automation execution optimization device, comprising:
a reading unit for reading test data from the data pool;
the execution unit is used for automatically executing each test case with the test data and recording the state transitions of the test data during execution;
the judging unit is used for recording the generated defects if the data state of the test data of each node in the automatic execution process is judged to be incorrect through assertion;
and the optimizing unit is used for adjusting the automated execution order of the test cases according to the recorded state transitions of the test data during execution and the generated defects.
9. A test automation execution optimization device, the device comprising a processor and a memory:
the memory is used for storing program codes and transmitting the program codes to the processor;
the processor is configured to execute the test automation execution optimization method of any one of claims 1-7 according to instructions in the program code.
10. A computer readable storage medium, characterized in that the computer readable storage medium is for storing a program code for performing the test automation execution optimization method of any one of claims 1-7.
CN202310676914.9A 2023-06-08 2023-06-08 Test automation execution optimization method, device, equipment and storage medium Pending CN116680185A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310676914.9A CN116680185A (en) 2023-06-08 2023-06-08 Test automation execution optimization method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116680185A true CN116680185A (en) 2023-09-01

Family

ID=87778777

Country Status (1)

Country Link
CN (1) CN116680185A (en)

Similar Documents

Publication Publication Date Title
CN104516812A (en) Method and device for testing software
CN111770009A (en) Data transmission method and related equipment
US11147038B2 (en) Notification message processing method and terminal
CN105005529A (en) Application testing method and apparatus
CN116303085A (en) Test reason analysis method, device, equipment and storage medium
CN104166899A (en) Voice interaction method and terminals
CN116468382A (en) RPA robot flow management method, device, equipment and storage medium
CN116680185A (en) Test automation execution optimization method, device, equipment and storage medium
CN112667868B (en) Data detection method and device
CN109582240B (en) Data moving method and related equipment thereof
CN117011023A (en) Full link regression data management method, device, equipment and storage medium
CN112199245B (en) Mobile terminal screen detection method, system, storage medium and mobile terminal
CN117041013A (en) Fault node processing method, device, system, equipment and storage medium
CN116881143A (en) Data object copying abnormality investigation method, device, equipment and storage medium
CN111651313B (en) Conversion method and system of identification card, storage medium and terminal equipment
CN116862473A (en) Bank production application calling relation analysis method, device, equipment and storage medium
CN116303086A (en) End-to-end testing method, configuration method, device, equipment and storage medium
CN116804922A (en) Service quality inspection management method, device, equipment and storage medium
CN117743015A (en) SQL fault positioning method, device, system and equipment
CN116467192A (en) Automatic test case generation method, device, equipment and storage medium
CN116303646A (en) Cross-database data comparison method, device, equipment and storage medium
CN116627797A (en) Demand standard processing method, device, equipment and storage medium
CN116303060A (en) ESB call request processing method, device, system, equipment and storage medium
CN116382964A (en) Class conflict detection method, device, equipment and storage medium based on probe
CN116257503A (en) Data migration processing method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination