CN112882957A - Test task validity checking method and device

Test task validity checking method and device

Info

Publication number
CN112882957A
Authority
CN
China
Prior art keywords
test
task
target
result
cases
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110343526.XA
Other languages
Chinese (zh)
Other versions
CN112882957B (en)
Inventor
付静
冷炜
高蕊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Citic Bank Corp Ltd
Original Assignee
China Citic Bank Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Citic Bank Corp Ltd filed Critical China Citic Bank Corp Ltd
Priority to CN202110343526.XA priority Critical patent/CN112882957B/en
Publication of CN112882957A publication Critical patent/CN112882957A/en
Application granted granted Critical
Publication of CN112882957B publication Critical patent/CN112882957B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3692Test management for test results analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention relates to the technical field of software testing, in particular to a method and a device for checking the validity of a test task. The method comprises the following steps: determining a target test task to be checked, and extracting the test case serial number of the target test task recorded in the test result; extracting a test log of the target test task from the tested system according to the serial number, and extracting key elements of the test log; comparing the key elements respectively with the test case summary information and the test task result information of the target test task according to a preset analysis rule; and determining the validity of the test result of the target test task according to the comparison result. The scheme of the present application solves the current difficulty of verifying whether a tester's test results are valid.

Description

Test task validity checking method and device
Technical Field
The invention relates to the technical field of software testing, in particular to a method and a device for checking the validity of a test task.
Background
In the test execution stage of the software test lifecycle, a test executor runs tests according to the test cases and the test plan. Usually, after a case has been executed in the test environment, the tester must manually set the test task state to "passed" in the test case result registration list on the case management platform. The case execution result is therefore fed back entirely by the test executor, so omissions or errors may occur during test execution. Some isolated solutions address this manual recording problem, but a theory and method for analyzing and verifying tests from the perspective of checking the validity of test execution are still lacking. In the test execution stage, how to automatically judge the validity of test execution results is an urgent problem that affects software quality.
Disclosure of Invention
The present application aims to solve at least one of the above technical drawbacks. The technical scheme adopted by the application is as follows:
in a first aspect, an embodiment of the present application discloses a method for checking validity of a test task, where the method includes:
determining a target test task to be checked, and extracting a test case serial number of the target test task recorded in a test result;
extracting a test log of the target test task from the tested system according to the serial number, and extracting key elements of the test log;
comparing the key elements respectively with the test case summary information and the test task result information of the target test task according to a preset analysis rule;
and determining the validity of the test result of the target test task according to the comparison result.
Further, the test log key elements include, but are not limited to:
task codes of the test cases, time stamps of the test cases, the number of the test cases and serial numbers of the test cases;
the test case summary information of the target test task includes but is not limited to: the transaction codes of the test cases and the number of the test cases;
the target test task result information includes but is not limited to: the serial number of the test case, the completion time of the test case and the test result.
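To make the comparison concrete, the three records above can be modeled as plain data structures. The following is a minimal sketch in Python; the class and field names (LogKeyElements, task_code, and so on) are illustrative choices made here, not terminology prescribed by the embodiment:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List


@dataclass
class LogKeyElements:
    """Key elements extracted from the test log of the tested system."""
    task_code: str              # task code of the test cases
    timestamps: List[datetime]  # time stamps of the test cases
    case_count: int             # number of test cases seen in the log
    serial_numbers: List[str]   # serial numbers of the test cases


@dataclass
class CaseSummaryInfo:
    """Test case summary information of the target test task."""
    task_code: str   # transaction (task) code of the test cases
    case_count: int  # number of test cases


@dataclass
class TaskResultInfo:
    """Target test task result information registered by the tester."""
    serial_number: str         # serial number of the test case
    completion_time: datetime  # completion time of the test case
    result: str                # recorded test result, e.g. "passed"
```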
Further, comparing the key elements with the test case summary information of the target test task according to a preset analysis rule comprises: when the task code in the test log is different from the task code in the test case summary information of the target test task, determining that the test result of the target test task is invalid.
Further, according to a preset analysis rule, comparing the key elements with the test task result information includes:
when the task code in the test log is the same as the task code in the test case summary information of the target test task, checking whether the test case time stamp in the test log is consistent with the test case completion time recorded in the target test task result information;
and if not, determining that the test result of the target test task is invalid.
Further, comparing the key elements respectively with the test case summary information and the test task result information of the target test task according to a preset analysis rule includes:
when the task code in the test log is the same as the task code in the test case summary information of the target test task, and the test case time stamp in the test log is consistent with the test case completion time recorded in the test task result information, extracting the number of test cases in the test log;
and when the number of test cases recorded in the test case summary information of the target test task and the number of test cases in the test log are both 1, determining that the test result of the target test task is valid.
Further, when it is determined that the number of test cases recorded in the test case summary information of the target test task is different from the number of test cases in the test log, it is determined that the test result of the target test task is invalid.
Further, when the number of test cases recorded in the test case summary information of the target test task is determined to be the same as the number of test cases in the test log and greater than 1, the serial numbers of all test cases in the test log are acquired.
When the serial numbers of the test cases in the test log are all the same, the test case summary information of the target test task is analyzed; if the test cases in the test log are confirmed to verify different check points of the same task, the test result of the target test task is determined to be valid.
Further, when the serial numbers of the test cases in the test log are different, the test request messages of the test cases with different serial numbers are extracted respectively;
when the valid fields in the test request message of each test case differ from the valid fields of the test request message of every other test case, the test result of the target test task is determined to be valid;
and when the valid fields in the test request message of any test case are the same as the valid fields of the test request message of another test case, the test result of the target test task is determined to be invalid.
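Read together, the rules above form a decision cascade over the records sketched earlier. The Python sketch below is one possible reading of that cascade, not the patented implementation; in particular, the exact-match timestamp check, the request_fields representation and the covers_different_check_points helper are assumptions made here for illustration:

```python
def covers_different_check_points(summary: "CaseSummaryInfo") -> bool:
    # Placeholder: the embodiment analyzes the test case summary information
    # to confirm the cases verify different check points of the same task.
    raise NotImplementedError


def check_validity(log, summary, result, request_fields=None) -> bool:
    """Apply the preset analysis rules; True means the result is valid.

    request_fields (assumed shape): dict mapping a serial number to a dict
    of the valid fields of that test case's test request message.
    """
    # Rule 1: task codes must match.
    if log.task_code != summary.task_code:
        return False
    # Rule 2: the recorded completion time must be consistent with a time
    # stamp in the log (exact match here; a tolerance window may be more
    # realistic in practice).
    if result.completion_time not in log.timestamps:
        return False
    # Rule 3: case counts must agree; a single matching case is valid.
    if log.case_count != summary.case_count:
        return False
    if log.case_count == 1:
        return True
    # Rule 4: several cases sharing one serial number are valid only if
    # they verify different check points of the same task.
    if len(set(log.serial_numbers)) == 1:
        return covers_different_check_points(summary)
    # Rule 5: several cases with different serial numbers are valid only
    # if the valid fields of their request messages are pairwise distinct.
    if request_fields is None:
        return False  # no messages available: cannot confirm validity
    fields = [frozenset(request_fields[sn].items())
              for sn in set(log.serial_numbers)]
    return len(set(fields)) == len(fields)
```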
In a second aspect, an embodiment of the present application provides a test task validity checking apparatus, where the apparatus includes: a determining module, an extracting module, an analyzing module and a judging module, wherein,
the determining module is used for determining a target test task to be checked;
the extraction module is used for extracting the test case serial number of the target test task recorded in the test result, extracting the test log of the target test task from the tested system according to the serial number, and extracting key elements of the test log;
the analysis module is used for comparing the key elements respectively with the test case summary information and the test task result information of the target test task according to a preset analysis rule;
and the judging module is used for determining the validity of the test result of the target test task according to the comparison result.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor and a memory;
the memory is used for storing operation instructions;
the processor is configured to execute the method in any of the embodiments by calling the operation instruction.
In a fourth aspect, the present application provides a computer-readable storage medium, on which a computer program is stored, and when the computer program is executed by a processor, the computer program implements the method of any one of the above embodiments.
The embodiment of the application provides a test task validity checking scheme: the test log of the tested system in the test environment is analyzed so that the test evidence corresponding to a test case is identified in the log, checking the validity (also referred to as the authenticity) of the test execution result and solving the current difficulty of verifying whether a tester's test result is genuine and valid.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings used in the description of the embodiments of the present application will be briefly described below.
Fig. 1 is a schematic flowchart of a method for checking validity of a test task according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram of a device for checking validity of a test task according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary only for the purpose of explaining the present application and are not to be construed as limiting the present invention.
It will be understood by those skilled in the art that, unless otherwise specified, the singular forms "a", "an" and "the" may include the plural forms as well; such terms merely identify objects for clarity and do not limit the objects themselves, and the objects so identified may be the same terminal, device or user, or may be different terminals, devices or users. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
In addition, it is to be understood that "at least one" in the embodiments of the present application means one or more, "a plurality" means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a alone, both A and B, and B alone, where A, B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of the singular or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, a and b, a and c, b and c, or a, b and c, wherein a, b and c can be single or multiple.
In the following embodiments, a "task code" is an identifier of an interface of a task system or task server that carries a certain type of task. For example, a type-A task requested from the server is performed through the server's type-A task interface; one task interface has exactly one task code, but the same interface can carry many specific tasks of that type. That is, all tasks passing through the type-A task interface share the same task code, yet each specific task is independent, so each task also has its own task serial number. It should be noted that the embodiments described below may be used in any field of software testing, but are particularly applicable to testing bank transaction software or systems. When the method is applied to testing bank financial transaction software, the task code corresponds to a transaction code and the task serial number to a transaction serial number. For example, a transfer transaction requested from the server is performed through the server's transfer transaction interface; the interface has only one transaction code but can carry many transfers, that is, all transfer transactions through that interface share the same transaction code, while each transfer is an independent piece of business and therefore has its own transaction business serial number, called a transaction serial number for short.
The relationship between test cases, test scenarios and test tasks deserves further illustration. One test task corresponds to one task code; a test task contains at least one test case; one test case corresponds to one test scenario; and each test case carries key element information such as a summary. After execution is completed, the test executor records the task serial number of the test case (the test case serial number, or transaction serial number, for short) and the execution completion time of the test case. The relationship between test cases, task codes and serial numbers falls into the following situations:
(1) One test case corresponds to one test task, i.e. one test case corresponds to one task code and one serial number;
(2) One test task corresponds to a plurality of test cases; here the test cases share the same task code, but each test case has its own task serial number. In this situation, the test cases verify different scenarios of the same type of task, so the task must be initiated several times, once per scenario. For example, if the requirement under test is "a single transfer at bank A must not exceed 500,000 yuan", three test cases (scenarios) need to be executed: "equal to 500,000", "less than 500,000" and "more than 500,000".
(3) One test task corresponds to a plurality of test cases that all share the same serial number. In this situation, the test cases verify different check points of the same task, so the task only needs to be initiated once to execute all of them. For example, if the requirement is "3,000 yuan can be transferred from account A", a single transaction suffices to verify the two check points "account A" and "3,000 yuan".
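These three situations can be made concrete with hypothetical data (the task codes, serial numbers and case texts below are invented for illustration):

```python
# Situation (1): one task, one case -- one task code, one serial number.
situation_1 = [
    {"task_code": "TRF001", "serial": "SN-0001", "case": "transfer succeeds"},
]

# Situation (2): one task, several cases verifying different scenarios --
# the same task code, but a distinct serial number per case, because the
# transaction has to be initiated once per scenario.
situation_2 = [
    {"task_code": "TRF001", "serial": "SN-0002", "case": "amount == 500,000"},
    {"task_code": "TRF001", "serial": "SN-0003", "case": "amount < 500,000"},
    {"task_code": "TRF001", "serial": "SN-0004", "case": "amount > 500,000"},
]

# Situation (3): one task, several cases verifying different check points --
# the same task code AND the same serial number, because the task is
# initiated only once.
situation_3 = [
    {"task_code": "TRF001", "serial": "SN-0005", "case": "debit account is A"},
    {"task_code": "TRF001", "serial": "SN-0005", "case": "amount is 3,000 yuan"},
]
```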
Fig. 1 shows a schematic flowchart of a test task validity check provided in an embodiment of the present application, and as shown in fig. 1, the method mainly includes:
s101, determining a target test task to be checked, and extracting a test case serial number of the target test task recorded in a test result;
s102, extracting a test log of a target test task from the tested system according to the serial number, and extracting key elements of the test log;
in the embodiment of the present application, the key elements of the test log include, but are not limited to: task codes of the test cases, time stamps of the test cases, the number of the test cases and serial numbers of the test cases.
S103, comparing the key elements respectively with the test case summary information and the test task result information of the target test task according to a preset analysis rule;
in the embodiment of the present application, the test case summary information of the target test task includes, but is not limited to: the transaction codes of the test cases and the number of the test cases; target test task result information includes, but is not limited to: the serial number of the test case, the completion time of the test case and the test result.
And S104, determining the validity of the test result of the target test task according to the comparison result. Invalid results include tests that were not carried out at all, or not carried out fully in accordance with the test cases of the test task, for example test cases that were never executed, test cases whose execution was omitted, or test cases that were executed incorrectly.
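Steps S101-S104 amount to an extract-then-compare pipeline. The sketch below strings them together, reusing the check_validity cascade and data classes sketched earlier, and assuming hypothetical accessors for the case management platform and the log store (fetch_registered_result, fetch_case_summary, fetch_log_lines) as well as a hypothetical log line format:

```python
import re
from datetime import datetime

# Hypothetical log line format, for illustration only.
LOG_LINE = re.compile(
    r"(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}).*"
    r"task=(?P<code>\w+)\s+serial=(?P<sn>[\w-]+)"
)


def parse_key_elements(lines):
    """S102 (second half): extract key elements from raw log lines."""
    codes, stamps, serials = set(), [], []
    for line in lines:
        m = LOG_LINE.search(line)
        if m is None:
            continue
        codes.add(m["code"])
        stamps.append(datetime.strptime(m["ts"], "%Y-%m-%d %H:%M:%S"))
        serials.append(m["sn"])
    # Counting one test case per matching log line is an assumption; how
    # cases are counted in practice depends on the actual log format.
    return LogKeyElements(task_code=codes.pop() if codes else "",
                          timestamps=stamps,
                          case_count=len(serials),
                          serial_numbers=serials)


def check_target_task(task_id, platform, log_store):
    """S101-S104 for one target test task; returns "valid" or "invalid"."""
    # S101: read the registered result and its test case serial number.
    result = platform.fetch_registered_result(task_id)        # assumed API
    # S102: pull the matching test log from the tested system.
    lines = log_store.fetch_log_lines(result.serial_number)   # assumed API
    log = parse_key_elements(lines)
    # S103: compare against the summary and result info per the rules.
    summary = platform.fetch_case_summary(task_id)            # assumed API
    # S104: judge validity from the comparison.
    return "valid" if check_validity(log, summary, result) else "invalid"
```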
On the basis of the above embodiment, comparing the key elements with the test case summary information of the target test task according to a preset analysis rule further includes: when the task code in the test log is different from the task code in the test case summary information of the target test task, determining that the test result of the target test task is invalid.
In an optional embodiment of the present application, the comparing the key elements with the test task result information according to the preset analysis rule includes:
step 1, when a task code in a test log is the same as a task code in test case summary information of a target test task, checking whether a test case timestamp in the test log is consistent with test case completion time recorded in target test task result information;
and 2, if the test results are inconsistent, determining that the test result of the target test task is invalid.
On the basis of the above embodiment, further, according to a preset analysis rule, comparing the key elements with the test case summary information and the test task result information of the target test task respectively includes:
step 1, when the task code in the test log is the same as the task code in the test case summary information of the target test task, and the test case time stamp in the test log is consistent with the test case completion time recorded in the test task result information, extracting the number of test cases in the test log;
and 2, when the number of test cases recorded in the test case summary information of the target test task and the number of test cases in the test log are both 1, determining that the test result of the target test task is valid.
On the basis of the above embodiment, when it is determined that the number of test cases recorded in the test case summary information of the target test task is different from the number of test cases in the test log, it is determined that the test result of the target test task is invalid.
On the basis of the above embodiment, in a preferred embodiment, the method further includes:
step 1, when the number of test cases recorded in the test case summary information of the target test task is determined to be the same as the number of test cases in the test log and greater than 1, acquiring the serial numbers of all test cases in the test log;
step 2, when the serial numbers of the test cases in the test log are all the same, analyzing the test case summary information of the target test task; if the test cases in the test log are confirmed to verify different check points of the same task, determining that the test result of the target test task is valid.
On the basis of the above embodiment, in a preferred embodiment, the method further includes:
step 1, when the serial numbers of the test cases in the test log are different, respectively extracting test request messages of the test cases with different serial numbers;
step 2, when the valid fields in the test request message of each test case differ from the valid fields of the test request message of every other test case, determining that the test result of the target test task is valid;
and step 3, when the valid fields in the test request message of any test case are the same as the valid fields of the test request message of another test case, determining that the test result of the target test task is invalid.
In the embodiment of the application, the test log is extracted according to the key element (the serial number) recorded by the tester for each executed test case; the evidence in the test log that can serve as the basis of test case execution is then checked against the key element information, the test case summary information of the test task, and the test task execution result information, so as to detect whether the test task results reported by the tester are genuine and valid, whether executions were omitted, and so on. In an optional embodiment, a test task validity check report may be generated from the comparison and judgment results, as sketched below.
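As a sketch of the optional report, reusing check_target_task from above and a plain-text layout chosen here purely for illustration:

```python
def build_check_report(task_ids, platform, log_store):
    """Summarize per-task verdicts into a simple validity check report."""
    rows = [(tid, check_target_task(tid, platform, log_store))
            for tid in task_ids]
    invalid = sum(1 for _, verdict in rows if verdict == "invalid")
    lines = [f"{tid}\t{verdict}" for tid, verdict in rows]
    lines.append(f"checked={len(rows)} invalid={invalid}")
    return "\n".join(lines)
```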
Based on the method for checking the validity of a test task shown in fig. 1, another aspect of the present application provides a device for checking the validity of a test task. As shown in fig. 2, the device may include: a determination module 201, an extraction module 202, an analysis module 203 and a judgment module 204, wherein:
the determination module 201 is used for determining a target test task to be checked;
the extraction module 202 is used for extracting the test case serial number of the target test task recorded in the test result, extracting the test log of the target test task from the tested system according to the serial number, and extracting key elements of the test log;
the analysis module 203 is used for comparing the key elements respectively with the test case summary information and the test task result information of the target test task according to a preset analysis rule;
and the judgment module 204 is used for determining the validity of the test result of the target test task according to the comparison result.
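The four modules map naturally onto the helpers sketched earlier. The class below is purely illustrative; the patent does not prescribe any particular software decomposition:

```python
class ValidityCheckDevice:
    """Determination, extraction, analysis and judgment as one object."""

    def __init__(self, platform, log_store):
        self.platform, self.log_store = platform, log_store

    def determine(self, task_id):              # determination module 201
        return self.platform.fetch_registered_result(task_id)

    def extract(self, result):                 # extraction module 202
        lines = self.log_store.fetch_log_lines(result.serial_number)
        return parse_key_elements(lines)

    def analyze(self, task_id, log, result):   # analysis module 203
        summary = self.platform.fetch_case_summary(task_id)
        return check_validity(log, summary, result)

    def judge(self, comparison):               # judgment module 204
        return "valid" if comparison else "invalid"
```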
It is to be understood that the above-mentioned respective constituent devices of the test task validity checking apparatus in the present embodiment have functions of implementing the respective steps of the method in the embodiment shown in fig. 1. The function can be realized by hardware, and can also be realized by executing corresponding software by hardware. The hardware or software includes one or more modules or systems corresponding to the above-described functions. The modules and systems can be software and/or hardware, and the modules and systems can be realized independently or integrated by a plurality of modules and systems. For the functional description of each module and system, reference may be specifically made to the corresponding description of the method in the embodiment shown in fig. 1, and therefore, the beneficial effects that can be achieved by the method may refer to the beneficial effects in the corresponding method provided above, which are not described again here.
It is to be understood that the illustrated structure of the embodiment of the present invention does not constitute a specific limitation to the specific structure of the test task validity checking apparatus. In other embodiments of the present application, the test task validity checking means may comprise more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The embodiment of the application provides an electronic device, which comprises a processor and a memory;
a memory for storing operating instructions;
and the processor is used for executing the test task validity checking method provided by any embodiment of the application by calling the operation instruction.
As an example, fig. 3 shows a schematic structural diagram of an electronic device to which the embodiment of the present application is applied. As shown in fig. 3, the electronic device 300 includes a processor 301 and a memory 303, wherein the processor 301 is coupled to the memory 303, for example via a bus 302. Optionally, the electronic device 300 may further include a transceiver 304; note that in practice the number of transceivers 304 is not limited to one. It is to be understood that the illustrated structure of the embodiment of the present invention does not constitute a specific limitation on the structure of the electronic device 300. In other embodiments of the present application, the electronic device 300 may include more or fewer components than shown, combine some components, split some components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware. Optionally, the electronic device may further include a display screen 305 for displaying images or receiving a user's operation instructions as needed.
The processor 301 is applied to the embodiment of the present application, and is configured to implement the method shown in the foregoing method embodiment. The transceiver 304 may include a receiver and a transmitter, and the transceiver 304 is applied in the embodiment of the present application and is used for implementing the function of the electronic device of the embodiment of the present application to communicate with other devices when executed.
The processor 301 may run the test task validity checking method provided in the embodiment of the present application, so as to reduce operational complexity for the user, make the terminal device more intelligent, and improve the user experience. The processor 301 may include different devices; for example, when a CPU and a GPU are integrated, the CPU and the GPU may cooperate to execute the test task validity checking method, with part of the algorithm executed by the CPU and another part by the GPU, to achieve higher processing efficiency.
Bus 302 may include a path that transfers information between the above components. The bus 302 may be a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The bus 302 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in FIG. 3, but this does not mean only one bus or one type of bus.
Optionally, the memory 303 is used for storing application program codes for executing the scheme of the present application, and is controlled by the processor 301 to execute. The processor 301 is configured to execute the application program code stored in the memory 303 to implement the test task validity checking method provided in any embodiment of the present application.
The memory 303 may further store one or more computer programs corresponding to the test task validity checking method provided in the embodiment of the present application. The one or more computer programs stored in the memory 303 and configured to be executed by the one or more processors 301 include instructions that may be used to perform the various steps in the respective embodiments described above.
Of course, the code of the test task validity checking method provided by the embodiment of the present application may also be stored in the external memory. In this case, the processor 301 may execute the code of the test task validity checking method stored in the external memory through the external memory interface, and the processor 301 may control the execution of the test task validity checking flow.
The display screen 305 includes a display panel. In some embodiments, the electronic device 300 may include 1 or N display screens 305, N being a positive integer greater than 1. The display screen 305 may be used to display information input by or provided to the user as well as various Graphical User Interfaces (GUIs). For example, the display screen 305 may display a photograph, video, web page, or file, etc.
The electronic device provided by the embodiment of the present application is applicable to any embodiment of the above method, and therefore, the beneficial effects that can be achieved by the electronic device can refer to the beneficial effects in the corresponding method provided above, and are not described again here.
The embodiment of the present application provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements the method for checking validity of a test task shown in the above method embodiment.
The computer-readable storage medium provided in the embodiments of the present application is applicable to any embodiment of the foregoing method, and therefore, the beneficial effects that can be achieved by the computer-readable storage medium can refer to the beneficial effects in the corresponding method provided above, and are not described herein again.
The embodiment of the present application further provides a computer program product, which when running on a computer, causes the computer to execute the above related steps to implement the method in the above embodiment. The computer program product provided in the embodiments of the present application is applicable to any of the embodiments of the method described above, and therefore, the beneficial effects that can be achieved by the computer program product can refer to the beneficial effects in the corresponding method provided above, and are not described herein again.
The test task validity checking scheme disclosed in the embodiment of the application determines a target test task to be checked and extracts the test case serial number of the target test task recorded in the test result; extracts a test log of the target test task from the tested system according to the serial number and extracts key elements of the test log; compares the key elements respectively with the test case summary information and the target test task result information according to a preset analysis rule; and determines the validity of the test result of the target test task according to the comparison result. The scheme of the present application solves the current difficulty of verifying the validity of testers' test results.
The above description covers only specific embodiments of the present application, but the scope of the present application is not limited thereto. Any person skilled in the art can easily conceive of changes or substitutions within the technical scope of the present application and can make various modifications and refinements, and such changes, substitutions, improvements and refinements should also be considered to fall within the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (11)

1. A test task validity checking method is characterized by comprising the following steps:
determining a target test task to be checked, and extracting a test case serial number of the target test task recorded in a test result;
extracting a test log of the target test task from the tested system according to the serial number, and extracting key elements of the test log;
comparing the key elements with test case abstract information and target test task result information of a target test task respectively according to a preset analysis rule;
and determining the validity of the test result of the target test task according to the comparison result.
2. The method according to claim 1, wherein the key elements of the test log include but are not limited to:
task codes of the test cases, time stamps of the test cases, the number of the test cases and serial numbers of the test cases;
the test case summary information of the target test task includes but is not limited to: the transaction codes of the test cases and the number of the test cases;
the target test task result information includes but is not limited to: the serial number of the test case, the completion time of the test case and the test result.
3. The method for checking the validity of a test task according to claim 2, wherein comparing the key elements with the test case summary information of the target test task according to a preset analysis rule comprises:
when the task code in the test log is different from the task code in the test case summary information of the target test task, determining that the test result of the target test task is invalid.
4. The method for checking the validity of a test task according to claim 2, wherein comparing the key elements with the test task result information according to a preset analysis rule comprises:
when the task code in the test log is the same as the task code in the test case summary information of the target test task, checking whether the test case time stamp in the test log is consistent with the test case completion time recorded in the target test task result information;
and if not, determining that the test result of the target test task is invalid.
5. The method for checking the validity of a test task according to claim 2, wherein comparing the key elements respectively with the test case summary information and the test task result information of the target test task according to a preset analysis rule comprises:
when the task code in the test log is the same as the task code in the test case summary information of the target test task, and the test case time stamp in the test log is consistent with the test case completion time recorded in the test task result information, extracting the number of test cases in the test log;
and when the number of test cases recorded in the test case summary information of the target test task and the number of test cases in the test log are both 1, determining that the test result of the target test task is valid.
6. The method for checking the validity of a test task according to claim 5, wherein when it is determined that the number of test cases recorded in the test case summary information of the target test task is different from the number of test cases in the test log, it is determined that the test result of the target test task is invalid.
7. The method of claim 5, wherein when it is determined that the number of test cases recorded in the test case summary information of the target test task is the same as the number of test cases in the test log and is greater than 1, the serial numbers of all test cases in the test log are obtained,
and when the serial numbers of the test cases in the test log are all the same, the test case summary information of the target test task is analyzed; if the test cases in the test log are confirmed to verify different check points of the same task, the test result of the target test task is determined to be valid.
8. The method for checking the validity of a test task according to claim 7, wherein when the serial numbers of the test cases in the test log are different, the test request messages of the test cases with different serial numbers are respectively extracted;
when the valid fields in the test request message of each test case differ from the valid fields of the test request message of every other test case, determining that the test result of the target test task is valid;
and when the valid fields in the test request message of any test case are the same as the valid fields of the test request message of another test case, determining that the test result of the target test task is invalid.
9. A test task validity checking apparatus, characterized in that the apparatus comprises: a determining module, an extracting module, an analyzing module and a judging module, wherein,
the determining module is used for determining a target test task to be checked;
the extraction module is used for extracting the test case serial number of the target test task recorded in the test result; the extraction module is used for extracting the test log of the target test task from the tested system according to the serial number and extracting key elements of the test log;
the analysis module is used for comparing the key elements respectively with the test case summary information and the test task result information of the target test task according to a preset analysis rule;
and the judging module is used for determining the validity of the test result of the target test task according to the comparison result.
10. An electronic device comprising a processor and a memory;
the memory is used for storing operation instructions;
the processor is used for executing the method of any one of claims 1-8 by calling the operation instruction.
11. A computer-readable storage medium, characterized in that the storage medium has stored thereon a computer program which, when being executed by a processor, carries out the method of any one of claims 1-8.
CN202110343526.XA 2021-03-30 2021-03-30 Test task validity checking method and device Active CN112882957B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110343526.XA CN112882957B (en) 2021-03-30 2021-03-30 Test task validity checking method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110343526.XA CN112882957B (en) 2021-03-30 2021-03-30 Test task validity checking method and device

Publications (2)

Publication Number Publication Date
CN112882957A (en) 2021-06-01
CN112882957B CN112882957B (en) 2024-05-24

Family

ID=76040273

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110343526.XA Active CN112882957B (en) 2021-03-30 2021-03-30 Test task validity checking method and device

Country Status (1)

Country Link
CN (1) CN112882957B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113961464A (en) * 2021-10-28 2022-01-21 中国银行股份有限公司 Test case demand coverage inspection method and device
CN115629950A (en) * 2022-12-19 2023-01-20 深圳联友科技有限公司 Method for extracting asynchronous request processing time point of performance test

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110727567A (en) * 2019-09-09 2020-01-24 平安证券股份有限公司 Software quality detection method and device, computer equipment and storage medium
CN111639022A (en) * 2020-05-16 2020-09-08 中信银行股份有限公司 Transaction testing method and device, storage medium and electronic device
CN112052170A (en) * 2020-09-03 2020-12-08 中国银行股份有限公司 Automatic detection method and device, storage medium and electronic equipment

Also Published As

Publication number Publication date
CN112882957B (en) 2024-05-24

Similar Documents

Publication Publication Date Title
CN111177005B (en) Service application testing method, device, server and storage medium
CN106506283B (en) Business test method and device of bank and enterprise docking system
US20140344788A1 (en) Logic validation and deployment
CN112882957B (en) Test task validity checking method and device
CN112395177A (en) Interactive processing method, device and equipment of service data and storage medium
CN112732499A (en) Test method and device based on micro-service architecture and computer system
CN112100070A (en) Version defect detection method and device, server and storage medium
CN107220169B (en) Method and equipment for simulating server to return customized data
CN105955838A (en) System halt reason check method and device
CN111339136A (en) Data checking method and device, electronic equipment and storage medium
CN111045935A (en) Automatic version auditing method, device, equipment and storage medium
CN114238295A (en) Data sorting method and device based on grouping
CN116738091A (en) Page monitoring method and device, electronic equipment and storage medium
CN113282496B (en) Automatic interface testing method, device, equipment and storage medium
CN112181485B (en) Script execution method and device, electronic equipment and storage medium
CN115061924A (en) Automatic test case generation method and generation device
CN111949510B (en) Test processing method, device, electronic equipment and readable storage medium
CN112667501A (en) Link testing method and device based on automatic baffle and related equipment
CN110362464B (en) Software analysis method and equipment
CN112580334A (en) File processing method, file processing device, server and storage medium
CN108108369B (en) Method and device for processing calling error of common interface library
CN114968829B (en) Full link pressure test method, electronic device and storage medium
CN112100077B (en) Transaction testing method and device
CN112650679B (en) Test verification method, device and computer system
CN117290223A (en) Multisystem test efficiency analysis method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant