CN116303085A - Test reason analysis method, device, equipment and storage medium - Google Patents


Info

Publication number
CN116303085A
Authority
CN
China
Prior art keywords: target, log, test, public, test failure
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310394462.5A
Other languages
Chinese (zh)
Inventor
王闪闪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Bank Co Ltd
Original Assignee
Ping An Bank Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Bank Co Ltd filed Critical Ping An Bank Co Ltd
Priority to CN202310394462.5A
Publication of CN116303085A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 - Error detection; error correction; monitoring
    • G06F 11/36 - Preventing errors by testing or debugging software
    • G06F 11/3668 - Software testing
    • G06F 11/3672 - Test management
    • G06F 11/3692 - Test management for test results analysis

Abstract

The application discloses a test cause analysis method, apparatus, device, and storage medium. Target logs are clustered based on spans and a target identifier and divided into an application level and a public level, and the test failure cause of the target logs is then matched against preset templates, enabling fast and accurate problem localization. This removes the need for manual analysis by testers, improves test efficiency, and saves testing manpower, solving the problem that manual cause analysis is time-consuming and labor-intensive when many tests are executed automatically, as well as the technical problem that, for test processes with long call links, the intermediate call process is extremely complex and checking logs layer by layer to analyze causes is tedious.

Description

Test reason analysis method, device, equipment and storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a method, an apparatus, a device, and a storage medium for analyzing a test cause.
Background
Software testing is a highly specialized and complex undertaking. During testing, many problems are inevitably encountered: whether the software's functions meet customer requirements, whether they are usable, and whether performance and reliability are adequate can only be discovered through system testing, which makes testing a very important stage. Different problems are bound to be found during testing, and testers need to determine their specific causes.
At present, problems found during testing are usually handled by manually locating them or by asking developers for help. This consumes manpower and depends on the tester's level of experience; manual cause analysis is especially time-consuming and labor-intensive when many tests are executed automatically. In addition, for test processes with long call links, the intermediate call process is extremely complex, and checking logs layer by layer to analyze causes is also tedious.
Disclosure of Invention
The application provides a test cause analysis method, apparatus, device, and storage medium, which solve the problem that manual cause analysis is time-consuming and labor-intensive when many tests are executed automatically, as well as the technical problem that, for test processes with long call links, the intermediate call process is extremely complex and checking logs layer by layer to analyze causes is tedious.
In view of this, a first aspect of the present application provides a test failure cause analysis method, the method including:
s1, clustering target logs containing target identifiers according to span and the target identifiers in all the logs;
s2, dividing the target log according to an application level and a public level to obtain a target application log and a target public log;
s3, matching the target application log with a preset template in a target template library;
and S4, if the matching is successful, determining a test failure reason according to the preset template.
Optionally, the step S3 further includes:
s5, if the matching fails, extracting keywords from the target application log and the target public log respectively;
and S6, respectively carrying out problem positioning on the target application log and the target public log according to the keywords, and determining the reason of test failure.
Optionally, the step S6 specifically includes:
performing problem positioning on the target application log and the target public log according to the keywords, and determining that the test failure cause belongs to a network environment problem, a public component problem or an application problem;
if the test failure cause belongs to a public component problem, determining that the test failure cause belongs to a database problem or a middleware problem according to the source of the target public log;
if the test failure reason belongs to an application problem, further analyzing and positioning the interface information of the target application log to determine a problem interface.
Optionally, the step S6 further includes:
generating a new preset template according to the keyword, the target application log, the target public log and the test failure reason;
and adding the preset template to the target template library.
Optionally, the step S6 further includes:
and executing a corresponding preset solution according to the test failure reason.
A second aspect of the present application provides a test failure cause analysis apparatus, the apparatus comprising:
the clustering unit is used for clustering target logs containing target identifiers according to the span and the target identifiers in all the logs;
the dividing unit is used for dividing the target log according to the application level and the public level to obtain a target application log and a target public log;
the matching unit is used for matching the target application log and the target public log with a preset template in a target template library;
and the first analysis unit is used for determining a test failure reason according to the preset template if the matching is successful.
Optionally, the method further comprises:
the processing unit is used for extracting keywords from the target application log and the target public log respectively if the matching fails;
and the second analysis unit is used for respectively carrying out problem positioning on the target application log and the target public log according to the keywords and determining the reason of test failure.
Optionally, the second analysis unit is specifically configured to:
performing problem positioning on the target application log and the target public log according to the keywords, and determining that the test failure cause belongs to a network environment problem, a public component problem or an application problem;
if the test failure cause belongs to a public component problem, determining that the test failure cause belongs to a database problem or a middleware problem according to the source of the target public log;
if the test failure reason belongs to an application problem, further analyzing and positioning the interface information of the target application log to determine a problem interface.
A third aspect of the present application provides a test reason analysis apparatus, the apparatus comprising a processor and a memory:
the memory is used for storing program codes and transmitting the program codes to the processor;
the processor is configured to perform the steps of the method of test reason analysis as described in the first aspect above according to instructions in the program code.
A fourth aspect of the present application provides a computer readable storage medium for storing program code for performing the method of the first aspect described above.
From the above technical solutions, the embodiments of the present application have the following advantages:
in this application, a test cause analysis method, apparatus, device, and storage medium are provided. Target logs are clustered based on spans and a target identifier and divided into an application level and a public level, and the test failure cause of the target logs is then matched against preset templates, enabling fast and accurate problem localization. This removes the need for manual analysis by testers, improves test efficiency, and saves testing manpower, solving the problem that manual cause analysis is time-consuming and labor-intensive when many tests are executed automatically, as well as the technical problem that, for test processes with long call links, the intermediate call process is extremely complex and checking logs layer by layer to analyze causes is tedious.
Drawings
FIG. 1 is a flow chart of a first method of analyzing a test cause according to an embodiment of the present application;
FIG. 2 is a flow chart of a second method of analyzing a test cause according to an embodiment of the present application;
FIG. 3 is a schematic structural diagram of a test reason analysis device according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a test reason analysis device in an embodiment of the present application.
Detailed Description
To enable those skilled in the art to better understand the solution of the present application, the technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the scope of protection of the present application.
The application provides a test cause analysis method, apparatus, device, and storage medium, which solve the problem that manual cause analysis is time-consuming and labor-intensive when many tests are executed automatically, as well as the technical problem that, for test processes with long call links, the intermediate call process is extremely complex and checking logs layer by layer to analyze causes is tedious.
For ease of understanding, referring to fig. 1, fig. 1 is a flowchart of a first method of analyzing a test cause in an embodiment of the present application, and as shown in fig. 1, specifically:
s1, clustering target logs containing target identifiers according to span and the target identifiers in all the logs;
it should be noted that each test case corresponds to a target identifier, whose role is to mark the scenario call.
If the test execution result is a failure, a test failure cause analysis engine is automatically started according to the configuration to perform the analysis.
When execution of the test case is triggered, a target identifier bound to the test case is generated;
during execution of the test case, the target identifier is passed downward along the call chain, so that the series of logs generated during the calls record their call relations through spans and are marked as target logs through the target identifier. This makes it possible to cluster the target logs containing the target identifier according to the spans and the target identifier.
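The clustering in step S1 could be sketched as follows. This is a minimal illustration, assuming each log entry carries a target identifier plus span fields (`span_id`, `parent_span_id`); all field names and the function name are illustrative assumptions, not terminology from the patent:

```python
from collections import defaultdict

def cluster_target_logs(all_logs, target_id):
    """Collect the logs that carry the given target identifier and
    order them by their span call relations (parent before child)."""
    # Keep only logs marked with this test case's target identifier.
    target_logs = [log for log in all_logs if log.get("target_id") == target_id]

    # Group child spans under their parent span to recover the call chain.
    children = defaultdict(list)
    roots = []
    for log in target_logs:
        parent = log.get("parent_span_id")
        if parent is None:
            roots.append(log)
        else:
            children[parent].append(log)

    # A depth-first walk over the span tree yields the logs in call order.
    ordered = []
    def walk(log):
        ordered.append(log)
        for child in children[log["span_id"]]:
            walk(child)
    for root in roots:
        walk(root)
    return ordered
```

Reproducing the call order this way is what later steps rely on when drilling down to a specific interface.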
S2, dividing the target log according to the application level and the public level to obtain a target application log and a target public log;
it should be noted that the application level corresponds to an application program itself, while the public level refers to shared components, that is, components used not by one application alone but by several applications simultaneously.
For problem localization, the operations applied at the application level and the public level are not identical.
For a public component problem, only the public component needs to be repaired subsequently; operations such as modifying the application and repackaging and redeploying it are not required.
An application-level problem, by contrast, requires drilling down through the logs to a specific interface to see exactly what is wrong.
S3, matching the target application log and the target public log with a preset template in a target template library;
it should be noted that preset templates corresponding to the logs are stored in the target template library and can be matched against. A preset template may contain not only keyword matching but also logical-error flow matching, where the flow includes operations such as data verification.
And S4, if the matching is successful, determining a test failure reason according to a preset template.
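A minimal sketch of the template matching in steps S3 and S4, assuming a preset template is a set of keywords paired with a failure cause (the patent also allows logical-error flow matching, such as data-verification flows, which this sketch omits; all names are illustrative assumptions):

```python
def match_template(target_log, template_library):
    """Return the failure cause of the first preset template whose
    keywords all occur in the log text, or None if no template matches."""
    text = "\n".join(entry["message"] for entry in target_log)
    for template in template_library:
        if all(keyword in text for keyword in template["keywords"]):
            return template["failure_cause"]
    return None
```

If this returns None, the method falls back to keyword extraction (step S5) rather than giving up.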
Further, referring to fig. 2, fig. 2 is a flowchart of a second method of analyzing a test cause in the embodiment of the present application, and as shown in fig. 2, specifically:
s1, clustering target logs containing target identifiers according to span and the target identifiers in all the logs;
it should be noted that each test case corresponds to a target identifier, whose role is to mark the scenario call.
If the test execution result is a failure, a test failure cause analysis engine is automatically started according to the configuration to perform the analysis.
When execution of the test case is triggered, a target identifier bound to the test case is generated;
during execution of the test case, the target identifier is passed downward along the call chain, so that the series of logs generated during the calls record their call relations through spans and are marked as target logs through the target identifier. This makes it possible to cluster the target logs containing the target identifier according to the spans and the target identifier.
S2, dividing the target log according to the application level and the public level to obtain a target application log and a target public log;
it should be noted that the application level corresponds to an application program itself, while the public level refers to shared components, that is, components used not by one application alone but by several applications simultaneously.
For problem localization, the operations applied at the application level and the public level are not identical.
For a public component problem, only the public component needs to be repaired subsequently; operations such as modifying the application and repackaging and redeploying it are not required.
An application-level problem, by contrast, requires drilling down through the logs to a specific interface to see exactly what is wrong.
S3, matching the target application log and the target public log with a preset template in a target template library;
it should be noted that preset templates corresponding to the logs are stored in the target template library and can be matched against. A preset template may contain not only keyword matching but also logical-error flow matching, where the flow includes operations such as data verification.
S5, if the matching fails, extracting keywords from the target application log and the target public log respectively;
if no corresponding preset template exists, keywords can be extracted from the target application log and the target public log for further analysis.
S61, performing problem positioning on the target application log and the target public log according to the keywords, and determining that the reason of the test failure belongs to a network environment problem, a public component problem or an application problem;
it should be noted that the specific cause of the test failure is determined from the keywords as a network environment problem, a public component problem, or an application problem.
S62, if the reason of the test failure belongs to the public component problem, determining that the reason belongs to the database problem or the middleware problem according to the source of the target public log;
and S63, if the reason of the test failure belongs to the application problem, further analyzing and positioning the interface information of the target application log, and determining a problem interface.
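The fallback classification of steps S61 to S63 might look as follows. The keyword sets, the public-log source values, and the function name are illustrative assumptions only:

```python
# Illustrative keyword and source sets; a real deployment would curate these.
NETWORK_KEYWORDS = {"timeout", "connection refused", "dns"}
DB_SOURCES = {"mysql", "oracle", "redis"}

def locate_problem(app_keywords, public_keywords, public_log_source, interface_info):
    """Classify the failure cause from extracted keywords, then refine it."""
    # Network-related keywords point to the environment rather than code.
    if NETWORK_KEYWORDS & public_keywords:
        return "network environment problem"
    if public_keywords:
        # Public component problem: refine by the source of the public log
        # into a database problem or a middleware problem.
        if public_log_source in DB_SOURCES:
            return "public component problem: database"
        return "public component problem: middleware"
    # Otherwise treat it as an application problem and name the interface.
    return f"application problem at interface {interface_info}"
```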
Further, step S63 further includes:
generating a new preset template according to the keywords, the target application log, the target public log, and the test failure cause;
and adding the new preset template to the target template library.
Further, step S63 further includes:
and executing a corresponding preset solution according to the test failure reason.
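The self-learning step, in which a newly analyzed failure is turned into a preset template so that the same failure matches directly next time, could be sketched as follows (the template fields are illustrative assumptions):

```python
def add_new_template(template_library, keywords, app_log, public_log, failure_cause):
    """Build a preset template from an analyzed failure and store it in
    the library, so the same failure can be matched directly next time."""
    template = {
        "keywords": sorted(keywords),
        "sample_app_log": app_log,
        "sample_public_log": public_log,
        "failure_cause": failure_cause,
    }
    template_library.append(template)
    return template
```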
Referring to fig. 3, fig. 3 is a schematic structural diagram of a test reason analysis device according to an embodiment of the present application, and as shown in fig. 3, the structure specifically includes:
a clustering unit 301, configured to cluster target logs including target identifiers according to span and target identifiers in all logs;
the dividing unit 302 is configured to divide the target log according to an application level and a public level, so as to obtain a target application log and a target public log;
a matching unit 303, configured to match the target application log and the target public log with a preset template in the target template library;
the first analysis unit 304 is configured to determine a cause of the test failure according to a preset template if the matching is successful.
Further, the method further comprises the following steps:
a processing unit 305, configured to extract keywords from the target application log and the target public log respectively if the matching fails;
and the second analysis unit 306 is configured to perform problem location on the target application log and the target public log according to the keywords, and determine a cause of the test failure.
Further, the second analysis unit 306 is specifically configured to:
performing problem positioning on the target application log and the target public log according to the keywords, and determining that the test failure cause belongs to a network environment problem, a public component problem or an application problem;
if the test failure cause belongs to the public component problem, determining that the test failure cause belongs to the database problem or the middleware problem according to the source of the target public log;
if the reason of the test failure belongs to the application problem, further analyzing and positioning the interface information of the target application log, and determining a problem interface.
Further, the method further comprises the following steps:
a template generating unit 307, configured to generate a new preset template according to the keyword, the target application log, the target public log, and the test failure reason;
an adding unit 308, configured to add the preset template to the target template library.
Further, the method further comprises the following steps:
the execution unit 309 is configured to execute a corresponding preset solution according to the test failure reason.
The embodiment of the present application further provides another test cause analysis device. As shown in fig. 4, for convenience of explanation, only the portion relevant to the embodiment of the present application is shown; for specific technical details not disclosed, please refer to the method portion of the embodiments of the present application. The terminal can be any terminal device, including a mobile phone, a tablet computer, a personal digital assistant (PDA), a point-of-sale (POS) terminal, a vehicle-mounted computer, and the like; the following takes a mobile phone as an example of the terminal:
fig. 4 is a block diagram showing a part of the structure of a mobile phone related to the terminal provided in an embodiment of the present application. Referring to fig. 4, the mobile phone includes: radio frequency (RF) circuit 1010, memory 1020, input unit 1030, display unit 1040, sensor 1050, audio circuit 1060, wireless fidelity (WiFi) module 1070, processor 1080, and power source 1090. Those skilled in the art will appreciate that the mobile phone structure shown in fig. 4 does not limit the mobile phone, which may include more or fewer components than shown, combine certain components, or use a different arrangement of components.
The following describes the components of the mobile phone in detail with reference to fig. 4:
the RF circuit 1010 may be used for receiving and transmitting signals during a message or a call, and particularly, after receiving downlink information of a base station, the signal is processed by the processor 1080; in addition, the data of the design uplink is sent to the base station. Generally, RF circuitry 1010 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier (English full name: lowNoiseAmplifier, english abbreviation: LNA), a duplexer, and the like. In addition, the RF circuitry 1010 may also communicate with networks and other devices via wireless communications. The wireless communication may use any communication standard or protocol, including but not limited to global system for mobile communications (GSM), general packet radio service (GeneralPacketRadioService, GPRS), code Division Multiple Access (CDMA), wideband Code Division Multiple Access (WCDMA), long Term Evolution (LTE), e-mail, short message service (ShortMessagingService, SMS), etc.
The memory 1020 may be used to store software programs and modules; the processor 1080 performs the various functional applications and data processing of the mobile phone by running the software programs and modules stored in the memory 1020. The memory 1020 may mainly include a program storage area and a data storage area: the program storage area may store an operating system, an application program required for at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data (such as audio data or a phonebook) created according to the use of the mobile phone. In addition, the memory 1020 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other volatile solid-state storage devices.
The input unit 1030 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile phone. In particular, the input unit 1030 may include a touch panel 1031 and other input devices 1032. The touch panel 1031, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations performed on or near the touch panel 1031 using a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connected device according to a preset program. Optionally, the touch panel 1031 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and sends them to the processor 1080, and it can also receive and execute commands sent by the processor 1080. Furthermore, the touch panel 1031 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 1031, the input unit 1030 may include other input devices 1032. In particular, the other input devices 1032 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, a joystick, and the like.
The display unit 1040 may be used to display information input by a user or information provided to the user and various menus of the mobile phone. The display unit 1040 may include a display panel 1041, and the display panel 1041 may be optionally configured in the form of a Liquid Crystal Display (LCD), an organic light-emitting diode (OLED), or the like. Further, the touch panel 1031 may overlay the display panel 1041, and when the touch panel 1031 detects a touch operation thereon or thereabout, the touch panel is transferred to the processor 1080 to determine a type of touch event, and then the processor 1080 provides a corresponding visual output on the display panel 1041 according to the type of touch event. Although in fig. 4, the touch panel 1031 and the display panel 1041 are two independent components for implementing the input and output functions of the mobile phone, in some embodiments, the touch panel 1031 and the display panel 1041 may be integrated to implement the input and output functions of the mobile phone.
The handset may also include at least one sensor 1050, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor, wherein the ambient light sensor may adjust the brightness of the display panel 1041 according to the brightness of ambient light, and the proximity sensor may turn off the display panel 1041 and/or the backlight when the mobile phone moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the acceleration in all directions (generally three axes), and can detect the gravity and direction when stationary, and can be used for applications of recognizing the gesture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer gesture calibration), vibration recognition related functions (such as pedometer and knocking), and the like; other sensors such as gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc. that may also be configured with the handset are not described in detail herein.
The audio circuit 1060, a speaker 1061, and a microphone 1062 may provide an audio interface between the user and the mobile phone. The audio circuit 1060 may transmit the electrical signal converted from received audio data to the speaker 1061, which converts it into a sound signal for output; on the other hand, the microphone 1062 converts collected sound signals into electrical signals, which the audio circuit 1060 receives and converts into audio data; the audio data is output to the processor 1080 for processing and then sent, for example, to another mobile phone via the RF circuit 1010, or output to the memory 1020 for further processing.
WiFi is a short-distance wireless transmission technology. Through the WiFi module 1070, the mobile phone can help the user send and receive e-mails, browse web pages, access streaming media, and the like, providing wireless broadband Internet access. Although fig. 4 shows the WiFi module 1070, it is understood that it is not an essential component of the mobile phone and may be omitted as needed without changing the essence of the invention.
Processor 1080 is the control center of the handset, connects the various parts of the entire handset using various interfaces and lines, and performs various functions and processes of the handset by running or executing software programs and/or modules stored in memory 1020, and invoking data stored in memory 1020, thereby performing overall monitoring of the handset. Optionally, processor 1080 may include one or more processing units; preferably, processor 1080 may integrate an application processor primarily handling operating systems, user interfaces, applications, etc., with a modem processor primarily handling wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 1080.
The handset further includes a power source 1090 (e.g., a battery) for powering the various components, which may preferably be logically connected to the processor 1080 by a power management system, such as to provide for managing charging, discharging, and power consumption by the power management system.
Although not shown, the mobile phone may further include a camera, a bluetooth module, etc., which will not be described herein.
In the embodiment of the present application, the processor 1080 included in the terminal further has the following functions:
s1, clustering target logs containing target identifiers according to span and the target identifiers in all the logs;
s2, dividing the target log according to the application level and the public level to obtain a target application log and a target public log;
s3, matching the target application log and the target public log with a preset template in a target template library;
and S4, if the matching is successful, determining a test failure reason according to a preset template.
The embodiments of the present application also provide a computer readable storage medium for storing program code for executing any one of the methods for analyzing test causes described in the foregoing embodiments.
In the embodiments of the present application, a test cause analysis method, apparatus, device, and storage medium are provided. Target logs are clustered based on spans and a target identifier and divided into an application level and a public level, and the test failure cause of the target logs is then matched against preset templates, enabling fast and accurate problem localization. This removes the need for manual analysis by testers, improves test efficiency, and saves testing manpower, solving the problem that manual cause analysis is time-consuming and labor-intensive when many tests are executed automatically, as well as the technical problem that, for test processes with long call links, the intermediate call process is extremely complex and checking logs layer by layer to analyze causes is tedious.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
The terms "first," "second," "third," "fourth," and the like in the description of the present application and in the above-described figures, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate, such that the embodiments of the present application described herein may be capable of operating in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be understood that in this application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes the association relationship of associated objects and indicates that three relationships may exist; for example, "A and/or B" may represent: only A is present, only B is present, or both A and B are present, where A and B may be singular or plural. The character "/" generally indicates that the associated objects are in an "or" relationship. "At least one of" the following items or the like means any combination of these items, including any combination of a single item or plural items. For example, at least one of a, b, or c may represent: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", where a, b, and c may be singular or plural.
In the several embodiments provided in this application, it should be understood that the disclosed systems, apparatuses, and methods may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative; the division of the units is merely a logical functional division, and there may be other divisions in actual implementation. For example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections via some interfaces, apparatuses, or units, and may be in electrical, mechanical, or other forms.
The units described as separate parts may or may not be physically separate, and the parts shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
If the integrated units are implemented in the form of software functional units and sold or used as standalone products, they may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media that can store program code.
The above embodiments are merely intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications and replacements do not depart from the spirit and scope of the corresponding technical solutions.

Claims (10)

1. A test failure cause analysis method, comprising:
S1, clustering, from among all the logs, the target logs containing a target identifier according to the span and the target identifier;
S2, dividing the target logs into an application level and a public level to obtain a target application log and a target public log;
S3, matching the target application log against preset templates in a target template library;
S4, if the matching succeeds, determining the test failure cause according to the matched preset template.
2. The test failure cause analysis method according to claim 1, wherein after step S3 the method further comprises:
S5, if the matching fails, extracting keywords from the target application log and the target public log respectively;
S6, performing problem location on the target application log and the target public log respectively according to the keywords, and determining the test failure cause.
3. The test failure cause analysis method according to claim 2, wherein step S6 specifically comprises:
performing problem location on the target application log and the target public log according to the keywords, and determining whether the test failure cause is a network environment problem, a public component problem, or an application problem;
if the test failure cause is a public component problem, determining, according to the source of the target public log, whether it is a database problem or a middleware problem;
if the test failure cause is an application problem, further analyzing the interface information in the target application log to locate the problem interface.
4. The test failure cause analysis method according to claim 2, wherein after step S6 the method further comprises:
generating a new preset template according to the keywords, the target application log, the target public log, and the test failure cause;
and adding the new preset template to the target template library.
5. The test failure cause analysis method according to claim 2, wherein after step S6 the method further comprises:
executing a corresponding preset solution according to the test failure cause.
6. A test cause analysis apparatus, comprising:
a clustering unit, configured to cluster, from among all the logs, the target logs containing a target identifier according to the span and the target identifier;
a dividing unit, configured to divide the target logs into an application level and a public level to obtain a target application log and a target public log;
a matching unit, configured to match the target application log and the target public log against preset templates in a target template library;
and a first analysis unit, configured to determine the test failure cause according to the matched preset template if the matching succeeds.
7. The test cause analysis apparatus according to claim 6, further comprising:
a processing unit, configured to extract keywords from the target application log and the target public log respectively if the matching fails;
and a second analysis unit, configured to perform problem location on the target application log and the target public log respectively according to the keywords, and determine the test failure cause.
8. The test cause analysis apparatus according to claim 7, wherein the second analysis unit is specifically configured to:
perform problem location on the target application log and the target public log according to the keywords, and determine whether the test failure cause is a network environment problem, a public component problem, or an application problem;
if the test failure cause is a public component problem, determine, according to the source of the target public log, whether it is a database problem or a middleware problem;
if the test failure cause is an application problem, further analyze the interface information in the target application log to locate the problem interface.
9. A test cause analysis device, the device comprising a processor and a memory, wherein:
the memory is configured to store program code and transmit the program code to the processor; and
the processor is configured to perform the test cause analysis method of any one of claims 1-5 according to instructions in the program code.
10. A computer-readable storage medium storing program code for performing the test cause analysis method according to any one of claims 1-5.
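Claims 2 and 3 above describe a keyword-based fallback used when template matching fails: the failure is classified as a network, public-component, or application problem, with further refinement by log source or interface information. A minimal sketch of that classification follows; the keyword tables and the log fields "source" and "interface" are assumptions introduced for illustration and are not specified in the claims.

```python
# Hypothetical keyword tables; the actual keywords are not given in the patent.
NETWORK_KEYWORDS = {"timeout", "unreachable", "refused"}
PUBLIC_KEYWORDS = {"sql", "redis", "mq"}
DATABASE_SOURCES = {"mysql", "oracle"}

def locate_failure(app_logs, public_logs):
    """Keyword-based fallback (claims 2-3): classify the test failure as a
    network, public-component, or application problem, and return a
    refinement (component kind or problem interface) where applicable."""
    words = {w.lower() for log in public_logs + app_logs
             for w in log["message"].split()}
    if words & NETWORK_KEYWORDS:
        return ("network environment problem", None)
    if words & PUBLIC_KEYWORDS:
        # Public component: decide database vs. middleware by the log source.
        source = public_logs[0]["source"] if public_logs else ""
        kind = "database problem" if source in DATABASE_SOURCES else "middleware problem"
        return ("public component problem", kind)
    # Otherwise treat it as an application problem and report the
    # interface recorded in the application log.
    iface = app_logs[0].get("interface") if app_logs else None
    return ("application problem", iface)
```

Claim 4's template-library growth would then amount to saving the keywords, logs, and the cause returned here as a new preset template.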
CN202310394462.5A 2023-04-13 2023-04-13 Test reason analysis method, device, equipment and storage medium Pending CN116303085A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310394462.5A CN116303085A (en) 2023-04-13 2023-04-13 Test reason analysis method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310394462.5A CN116303085A (en) 2023-04-13 2023-04-13 Test reason analysis method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116303085A true CN116303085A (en) 2023-06-23

Family

ID=86832541

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310394462.5A Pending CN116303085A (en) 2023-04-13 2023-04-13 Test reason analysis method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116303085A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116578499A (en) * 2023-07-13 2023-08-11 建信金融科技有限责任公司 Intelligent analysis and test method and system for public component function change influence
CN116578499B (en) * 2023-07-13 2023-09-22 建信金融科技有限责任公司 Intelligent analysis and test method and system for public component function change influence

Similar Documents

Publication Publication Date Title
CN112148579B (en) User interface testing method and device
CN111078556B (en) Application testing method and device
CN111666222A (en) Test method and related device
CN104809055B (en) Application program testing method and device based on cloud platform
CN116303085A (en) Test reason analysis method, device, equipment and storage medium
CN109196480B (en) Method for displaying equipment identification, mobile terminal and terminal equipment
CN116468382A (en) RPA robot flow management method, device, equipment and storage medium
CN115904950A (en) Test case generation method, device, equipment and storage medium
CN112667868B (en) Data detection method and device
CN114490307A (en) Unit testing method, device and storage medium
CN113961380A (en) Cross-application repair method, device, equipment and storage medium
CN116881143A (en) Data object copying abnormality investigation method, device, equipment and storage medium
CN116862473A (en) Bank production application calling relation analysis method, device, equipment and storage medium
CN111651313B (en) Conversion method and system of identification card, storage medium and terminal equipment
CN116303086A (en) End-to-end testing method, configuration method, device, equipment and storage medium
CN116303060A (en) ESB call request processing method, device, system, equipment and storage medium
CN117743015A (en) SQL fault positioning method, device, system and equipment
CN116303646A (en) Cross-database data comparison method, device, equipment and storage medium
CN116361189A (en) Unit test statistical method, device, equipment and storage medium
CN117041013A (en) Fault node processing method, device, system, equipment and storage medium
CN116467192A (en) Automatic test case generation method, device, equipment and storage medium
CN116501413A (en) Automatic generation interface calling method, device, equipment and storage medium
CN116737761A (en) Asynchronous reconciliation processing method, device, equipment and storage medium
CN110569234A (en) Data checking method and device, electronic equipment and computer readable storage medium
CN116361258A (en) Cross-system log information scanning method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination