CN107229565B - Test method and device - Google Patents

Test method and device

Info

Publication number
CN107229565B
Authority
CN
China
Prior art keywords
data
sequence
program
test
tested
Prior art date
Legal status
Active
Application number
CN201710399662.4A
Other languages
Chinese (zh)
Other versions
CN107229565A (en)
Inventor
贺玉娇
Current Assignee
Beijing Jingdong Century Trading Co Ltd
Beijing Jingdong Shangke Information Technology Co Ltd
Original Assignee
Beijing Jingdong Century Trading Co Ltd
Beijing Jingdong Shangke Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Jingdong Century Trading Co Ltd and Beijing Jingdong Shangke Information Technology Co Ltd
Priority claimed from CN201710399662.4A
Publication of CN107229565A
Application granted
Publication of CN107229565B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/36: Preventing errors by testing or debugging software
    • G06F 11/3668: Software testing
    • G06F 11/3672: Test management
    • G06F 11/3684: Test management for test design, e.g. generating new test cases
    • G06F 11/3688: Test management for test execution, e.g. scheduling of test suites

Abstract

The application discloses a testing method and a testing device. One embodiment of the method comprises: receiving an instruction for testing a program to be tested, wherein the program to be tested is a program with a branch structure, each branch included in the program to be tested is preset with a data mark, the instruction comprises test data and an expected result, and the expected result comprises a first data mark sequence formed by the data marks of the branches through which the test data are expected to flow; inputting the test data into the program to be tested, and running the program to be tested so as to enable the program to be tested to generate execution process information after the running is finished, wherein the execution process information comprises a second data mark sequence formed by data marks of branches through which the test data actually flows; and matching the second data mark sequence with the first data mark sequence, and generating a test result based on a matching result. This embodiment improves the testing efficiency.

Description

Test method and device
Technical Field
The application relates to the field of computer technology, in particular to Internet technology, and more particularly to a testing method and a testing apparatus.
Background
At present, testing methods for programs having a branch structure (for example, distributed applications for big data analysis and computation) generally include equivalence class partitioning, boundary value analysis, cause-effect graphing, and the like. Usually, the actual result data generated after the program under test finishes running is compared with the expected result data to determine whether the program behaved correctly; however, when the two differ, it is difficult to locate which branch caused the problem.
Disclosure of Invention
It is an object of the present application to provide an improved testing method and apparatus to solve the technical problems mentioned in the background section above.
In a first aspect, an embodiment of the present application provides a testing method, where the method includes: receiving an instruction for testing a program to be tested, wherein the program to be tested is a program with a branch structure, each branch included in the program to be tested is preset with a data mark, the instruction comprises test data and an expected result, and the expected result comprises a first data mark sequence formed by the data marks of the branches through which the test data flow; inputting the test data into the program to be tested, and running the program to be tested so as to enable the program to be tested to generate execution process information after the running is finished, wherein the execution process information comprises a second data mark sequence formed by data marks of branches through which the test data actually flows; and matching the second data mark sequence with the first data mark sequence, and generating a test result based on a matching result.
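The three steps of the first-aspect method can be sketched in a few lines of Python. This is an illustrative harness only, not the patent's implementation: `program_under_test`, `run_test`, and the mark strings are all hypothetical names, and the toy instrumented program stands in for any branch-structured program whose branches append their preset data marks to a trace.

```python
def program_under_test(x, trace):
    """A toy branch-structured program; each branch appends its preset data mark."""
    if x > 0:
        trace.append("mark1(yes)")
        return "positive"
    trace.append("mark1(no)")
    return "non-positive"

def run_test(test_data, first_sequence, expected_result_data=None):
    """Run the program under test and match the actual mark sequence
    (execution process information) against the expected one."""
    second_sequence = []
    actual = program_under_test(test_data, second_sequence)
    if second_sequence != first_sequence:
        return {"passed": False, "second_sequence": second_sequence}
    if expected_result_data is not None and actual != expected_result_data:
        return {"passed": False, "actual": actual}
    return {"passed": True, "second_sequence": second_sequence, "actual": actual}
```

A passing case would supply `run_test(5, ["mark1(yes)"], "positive")`; a case whose data flows through an unexpected branch yields a failing result carrying the actual mark sequence for diagnosis.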
In some embodiments, matching the second data mark sequence with the first data mark sequence and generating a test result based on the matching result includes: determining whether the second data mark sequence and the first data mark sequence include the same number of data marks; if the numbers are the same, further determining whether the data marks at the same positions in the second data mark sequence and the first data mark sequence are the same; and if the second data mark sequence includes a target data mark different from the data mark at the corresponding position in the first data mark sequence, generating a test result that includes the target data mark.
In some embodiments, when the number of data marks included in the second data mark sequence exceeds a preset value, the test result further includes position information of the target data mark in the second data mark sequence.
In some embodiments, matching the second data mark sequence with the first data mark sequence and generating a test result based on the matching result includes: in response to determining that the numbers of data marks included in the second data mark sequence and the first data mark sequence are not the same, further determining whether the number of data marks included in the second data mark sequence is less than the number of data marks included in the first data mark sequence; and if it is less, generating a test result comprising data loss prompt information.
In some embodiments, when the number of data markers included in the second sequence of data markers is less than the number of data markers included in the first sequence of data markers, the generated test result further includes data markers in the first sequence of data markers that are not included in the second sequence of data markers.
In some embodiments, the expected result further comprises expected result data; and generating a test result based on the matching result comprises: in response to determining that the data marks at the same positions in the second data mark sequence and the first data mark sequence are the same, further determining whether the actual result data generated after the program to be tested finishes running is the same as the expected result data; and in response to determining that the actual result data is the same as the expected result data, generating a test result comprising at least one of: test-passed prompt information, the second data mark sequence, and the actual result data.
In a second aspect, the present application provides a testing apparatus, comprising: a receiving unit configured to receive an instruction for testing a program to be tested, wherein the program to be tested is a program having a branch structure, each branch included in the program to be tested is preset with a data mark, the instruction comprises test data and an expected result, and the expected result comprises a first data mark sequence formed by the data marks of the branches through which the test data is expected to flow; a program running unit configured to input the test data into the program to be tested and run the program to be tested, so that the program to be tested generates execution process information after the running is finished, wherein the execution process information comprises a second data mark sequence formed by the data marks of the branches through which the test data actually flows; and a generating unit configured to match the second data mark sequence with the first data mark sequence and generate a test result based on the matching result.
In some embodiments, the generating unit comprises: a first determining subunit configured to determine whether the second data mark sequence and the first data mark sequence include the same number of data marks; a second determining subunit configured to, if the numbers are the same, further determine whether the data marks at the same positions in the second data mark sequence and the first data mark sequence are the same; and a generating subunit configured to generate a test result if the second data mark sequence includes a target data mark different from the data mark at the corresponding position in the first data mark sequence, wherein the test result includes the target data mark.
In some embodiments, when the number of data marks included in the second data mark sequence exceeds a preset value, the test result further includes position information of the target data mark in the second data mark sequence.
In some embodiments, the generating unit comprises: a third determining subunit configured to, in response to determining that the numbers of data marks included in the second data mark sequence and the first data mark sequence are different, further determine whether the number of data marks included in the second data mark sequence is less than the number of data marks included in the first data mark sequence; and a first generating subunit configured to generate a test result comprising data loss prompt information if the number is less.
In some embodiments, when the number of data markers included in the second sequence of data markers is less than the number of data markers included in the first sequence of data markers, the generated test result further includes data markers in the first sequence of data markers that are not included in the second sequence of data markers.
In some embodiments, the expected result further comprises expected result data; and the generating unit is further configured to: in response to determining that the data marks at the same positions in the second data mark sequence and the first data mark sequence are the same, further determine whether the actual result data generated after the program to be tested finishes running is the same as the expected result data; and in response to determining that the actual result data is the same as the expected result data, generate a test result comprising at least one of: test-passed prompt information, the second data mark sequence, and the actual result data.
In a third aspect, an embodiment of the present application provides an electronic device, including: one or more processors; storage means for storing one or more programs; when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the method as described in any implementation manner of the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium, on which a computer program is stored, where the computer program is configured to, when executed by a processor, implement the method described in any implementation manner of the first aspect.
According to the testing method and apparatus provided by the embodiments of the application, after the instruction for testing the program to be tested is received, the test data included in the instruction is input into the program to be tested and the program to be tested is run, so that the program to be tested generates execution process information after the running is finished. A test result is then generated by matching the second data mark sequence included in the execution process information with the first data mark sequence included in the instruction. Generating the second data mark sequence clearly records every branch through which the test data actually flows; by matching it against the first data mark sequence, the branch that has a problem can be quickly located whenever the data marks in the two sequences are not identical, which improves testing efficiency.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which the present application may be applied;
FIG. 2 is a flow diagram of one embodiment of a testing method according to the present application;
FIG. 3 is a diagram of a portion of a branch of a program having a branching structure;
FIG. 4 is a schematic diagram of an application scenario of a testing method according to the present application;
FIG. 5 is a flow chart of yet another embodiment of a testing method according to the present application;
FIG. 6 is a schematic block diagram of one embodiment of a test apparatus according to the present application;
FIG. 7 is a block diagram of a computer system suitable for use in implementing the electronic device of an embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Fig. 1 shows an exemplary system architecture 100 to which embodiments of the testing method or testing apparatus of the present application may be applied.
As shown in fig. 1, the system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
A user may use the terminal devices 101, 102, 103 to interact with the server 105 via the network 104 to receive or transmit information or the like. The terminal devices 101, 102, 103 may have installed thereon various communication client applications, such as a web browser application, a shopping-type application, a game-type application, a software testing tool, etc. The terminal device 101, 102, 103 may locally receive an instruction to test a program to be tested and process the instruction.
The terminal devices 101, 102, 103 may be various electronic devices having a display screen, including but not limited to smart phones, tablet computers, laptop portable computers, desktop computers, and the like.
The server 105 may be a server that provides various services, for example, a server that deploys a program to be tested having a branch structure, and the server may write information (for example, time consumed during the operation) generated after the operation of the program to be tested is completed into a designated storage location or send the information to the terminal devices 101, 102, and 103.
It should be noted that the test method provided in the embodiment of the present application is generally executed by the terminal devices 101, 102, and 103, and accordingly, the test apparatus is generally disposed in the terminal devices 101, 102, and 103.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
It should be noted that if the program to be tested is not a program deployed on a server, the server 105 may not be included in the system architecture 100.
With continued reference to FIG. 2, a flow 200 of one embodiment of a testing method according to the present application is shown. The test method comprises the following steps:
step 201, an instruction for testing a program to be tested is received.
In this embodiment, the electronic device (for example, the terminal devices 101, 102, 103 shown in fig. 1) on which the test method operates may locally receive an instruction to test the program to be tested. Wherein the program to be tested may be a program having a branch structure. As shown in fig. 3, fig. 3 shows a schematic diagram of a partial branch of a program having a branch structure. In fig. 3, the program includes conditions "determination 1", "determination 2", "determination 3", "determination 4", "determination 5", "determination 6", and "determination 7". The conditions "judgment 1", "judgment 2" and "judgment 3" may correspond to two branches, respectively, that is, when the conditions are satisfied ("yes" in fig. 3 may indicate that the conditions are satisfied), one branch is corresponded to, and when the conditions are not satisfied ("no" in fig. 3 may indicate that the conditions are not satisfied), one branch is corresponded to. Branches corresponding to the conditions "judgment 4", "judgment 5", "judgment 6", and "judgment 7" are not shown in fig. 3. Here, each of the branches included in the program to be tested may be provided with a data flag in advance, the data flag may include characters such as numerals, letters, chinese characters, special symbols (e.g., "), etc.), and the data flag may be, for example," flag 1 (yes) "," flag 1(Y) "," flag 1 (no) "," flag 1(F) ", etc. The instructions may include test data and an expected result, and the expected result may include a first sequence of data markers comprising data markers for each branch through which the test data is expected to flow. It should be noted that the program to be tested may be a program deployed locally, or may be a program deployed on a connected server (for example, the server 105 shown in fig. 1), and this embodiment does not limit this aspect at all.
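Presetting a data mark on every branch, as described for Fig. 3, can be illustrated with a small chain of judgments. The function, its arguments, and the mark strings below are invented for illustration; the point is only that each yes-branch and no-branch of a condition records its own mark in execution order.

```python
def judgment_chain(a, b, trace):
    """Toy program shaped like part of Fig. 3: two nested conditions,
    each branch preset with a data mark appended to the trace."""
    if a > 10:                        # condition "judgment 1"
        trace.append("mark1(yes)")
        if b > 10:                    # condition "judgment 2"
            trace.append("mark2(yes)")
        else:
            trace.append("mark2(no)")
    else:
        trace.append("mark1(no)")
    return trace

trace = judgment_chain(12, 3, [])
# trace is now ["mark1(yes)", "mark2(no)"]
```

The resulting trace is exactly the "second data mark sequence" of the embodiment: one mark per branch actually taken, in the order the branches executed.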
In some optional implementations of this embodiment, the expected result may further include expected result data, where the expected result data may be result data that a tester expects the program to be tested to generate after the end of the running. For example, the above-described program to be tested may be used for store matching, and determine whether two stores are the same store, and the resulting data is "the same" or "not the same", and if the test data includes a data set a and a data set B, wherein the data set a includes "trinitron 38", "western meal", "accommodate 40 persons", and the data set B includes "trinitron 38", "western restaurant", and "40 persons", it is expected that the resulting data may be "the same".
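A store-matching program of the kind this example describes could, in the simplest case, compare two field sets and return "same" or "not same" as its result data. The matching rule below (at least two coinciding fields) is entirely hypothetical; the patent does not specify how the comparison is made.

```python
def match_stores(fields_a, fields_b):
    """Hypothetical rule: two stores are the same when at least two of
    their descriptive fields coincide exactly."""
    overlap = len(set(fields_a) & set(fields_b))
    return "same" if overlap >= 2 else "not same"
```

Under this invented rule, `match_stores({"a", "b", "c"}, {"a", "b", "d"})` returns "same"; a real implementation would normalise fields (e.g. "western meal" vs "western restaurant") before comparing.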
In some optional implementation manners of this embodiment, the test data may be derived from one test case, or may be derived from multiple test cases. When the test data is derived from a plurality of test cases, it means that the electronic device can execute the plurality of test cases at one time. It should be noted that a test case generally refers to a description of a test task performed on a specific software product, and embodies a test scheme, a method, a technique, and a policy. The content may include test targets, test environments, input data, test steps, expected results, test scripts, and the like.
Step 202, inputting test data included in the instruction into the program to be tested, and running the program to be tested, so that the program to be tested generates execution process information after the running is finished.
In this embodiment, after receiving the instruction, the electronic device may input the test data into the program to be tested, and run the program to be tested, so that the program to be tested generates execution process information after the running is finished. The execution process information may include a second data tag sequence formed by data tags of branches through which the test data actually flows, and each data tag in the second data tag sequence may be arranged according to the order in which the corresponding branches are executed. Here, the electronic device may be locally preconfigured with a thread having software and hardware resources, and the electronic device may input the test data into the program to be tested through the thread and run the program to be tested, which is not limited in this respect.
In some optional implementation manners of this embodiment, when the test data is derived from a plurality of test cases, the execution process information may include a second data tag sequence corresponding to the test data derived from each test case. For example, the data to be tested includes test data a and test data B, and the execution process information may include a second data flag sequence composed of data flags of branches through which the test data a flows and a second data flag sequence composed of data flags of branches through which the test data B flows.
And step 203, matching the second data mark sequence included by the execution process information with the first data mark sequence included by the instruction, and generating a test result based on the matching result.
In this embodiment, after generating the execution process information, the electronic device may match a second data tag sequence included in the execution process information with a first data tag sequence included in the instruction, and generate a test result based on a matching result. As an example, the electronic device may directly compare the data marks at the same position included in the second data mark sequence and the first data mark sequence, for example, sequentially determine whether each data mark in the second data mark sequence is identical to the data mark at the corresponding position in the first data mark sequence; if different data marks exist, the electronic device may generate a test result, where the test result may include a first determined data mark in the second data mark sequence that is different from the data mark at the corresponding position in the first data mark sequence.
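The positionwise comparison described above amounts to a short loop. A minimal sketch, with the function name chosen here for illustration:

```python
def first_mismatch(second_seq, first_seq):
    """Return the first data mark in second_seq that differs from the mark
    at the same position in first_seq, or None if all positions match."""
    for actual, expected in zip(second_seq, first_seq):
        if actual != expected:
            return actual  # the mark to include in the test result
    return None
```

The returned mark identifies the first wrongly executed branch, which is what makes the failure easy to locate.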
In some optional implementation manners of this embodiment, the electronic device may further compare the numbers of the second data marker sequence and the first data marker sequence to determine whether the numbers of the data markers respectively included in the second data marker sequence and the first data marker sequence are the same, if the numbers are not the same, the electronic device may further determine whether the number of the data markers included in the second data marker sequence is less than the number of the data markers included in the first data marker sequence, and if the number is less than the number, the electronic device may generate a test result including data loss notification information. Optionally, the test result may further include data marks in the first data mark sequence that are not included in the second data mark sequence. In this way, a tester or developer can easily locate the branch that is missed by looking at the test result.
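The length comparison and the optional reporting of missed branches can be sketched as follows; the function and message strings are illustrative, not from the patent.

```python
def check_lengths(second_seq, first_seq):
    """Compare sequence lengths; if the actual sequence is shorter, report
    data loss together with the expected marks that never appeared."""
    if len(second_seq) == len(first_seq):
        return None  # lengths match; fall through to positionwise comparison
    if len(second_seq) < len(first_seq):
        missing = [m for m in first_seq if m not in second_seq]
        return {"message": "data loss", "missing_marks": missing}
    return {"message": "extra marks recorded"}
```

Listing the missing marks is what lets a tester or developer locate the skipped branch directly from the test result.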
In some optional implementations of this embodiment, if the numbers of data markers respectively included in the second data marker sequence and the first data marker sequence are the same, the electronic device may further determine whether the data markers at the same position respectively included in the second data marker sequence and the first data marker sequence are the same; if the data marks at the same position are the same, the electronic device may further determine whether actual result data generated after the operation of the program to be tested is the same as expected result data included in the instruction; in response to determining that the actual result data is the same as the expected result data, the electronic device may generate test data comprising at least one of: and the test passes the prompt message, the second data mark sequence and the actual result data.
In some optional implementation manners of this embodiment, the electronic device may further output the execution process information and the test result, for example, to a display screen of the electronic device, or to a memory or a hard disk of the electronic device, or to a server in remote communication connection with the electronic device, which is not limited in this respect.
With continued reference to fig. 4, fig. 4 is a schematic diagram of an application scenario of the test method according to the present embodiment. In the application scenario of fig. 4, the program to be tested 401 is a distributed application program deployed on the server 402, and each branch included in the program to be tested 401 is preset with a data flag. The client 403 may have a client application installed thereon, which supports a function of submitting test data and a first data marker sequence, and a user may submit the test data 404 and a first data marker sequence 405 corresponding to the test data 404 through the client application to trigger an instruction 406 for testing the program to be tested 401, where the instruction 406 includes the test data 404 and the first data marker sequence 405, and the first data marker sequence 405 sequentially includes data markers "marker 1 (yes)", "marker 2 (yes)", "marker 4 (yes)", "marker 9 (no)". The client 403 may receive the instructions 406 locally. 
Then, the client 403 may input the test data 404 into the program to be tested 401, and run the program to be tested 401, so that the program to be tested 401 generates execution process information after the running is finished, where the execution process information includes a second data tag sequence 407 formed by data tags of branches through which the test data 404 actually flows, and it is assumed that the second data tag sequence 407 sequentially includes data tags "tag 1 (yes)", "tag 2 (no)", "tag 5 (yes)", and "tag 10 (no)"; finally, the client 403 may directly compare the data markers at the same position included in the second data marker sequence 407 and the first data marker sequence 405, and compare that the data markers "marker 2 (yes)" and "marker 2 (no)" at the same position are different, the data markers "marker 4 (yes)" and "marker 5 (yes)" are different, and the data markers "marker 9 (no)" and "marker 10 (no)" are different, the client 403 may generate the test result 408, and the test result 408 may include the data marker "marker 2 (no)" in the second data marker sequence 407, that is, the data marker of the first branch in the program to be tested 401 that is executed incorrectly.
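The comparison in this scenario can be reproduced directly. The mark strings below are condensed forms of the ones in the example (written without spaces for readability); the one-liner picks out the first position where the two sequences differ.

```python
# The expected (first) and actual (second) data mark sequences from the scenario.
first_sequence = ["marker1(yes)", "marker2(yes)", "marker4(yes)", "marker9(no)"]
second_sequence = ["marker1(yes)", "marker2(no)", "marker5(yes)", "marker10(no)"]

# First mark in the actual sequence that differs from the expected one.
target = next((a for a, e in zip(second_sequence, first_sequence) if a != e), None)
# target is "marker2(no)": the first wrongly executed branch
```

Everything after the first divergence ("marker5(yes)" vs "marker4(yes)", and so on) is a consequence of taking the wrong branch at "judgment 2", which is why reporting only the first differing mark suffices to locate the problem.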
The method provided by the above embodiment of the present application effectively utilizes the generation of the second data mark sequence, clearly records each branch through which the test data actually flows, and matches the second data mark sequence with the first data mark sequence, so as to quickly locate which branch has a problem when the data marks included in the second data mark sequence and the data marks included in the first data mark sequence are not identical, thereby improving the test efficiency.
With further reference to fig. 5, a flow 500 of yet another embodiment of a testing method is shown. The process 500 of the test method includes the following steps:
step 501, an instruction for testing a program to be tested is received.
In this embodiment, the electronic device (for example, the terminal devices 101, 102, 103 shown in fig. 1) on which the test method operates may locally receive an instruction to test the program to be tested. The program to be tested may be a program having a branch structure, each branch included in the program to be tested may be provided with a data mark in advance, the data mark may include characters such as numbers, letters, chinese characters, special symbols (e.g., "(", ")" and the like), and the data mark may be, for example, "mark 1 (yes)", "mark 1 (Y)", "mark 1 (no)", "mark 1 (F)", and the like. The instructions may include test data and an expected result, and the expected result may include a first sequence of data markers comprising data markers for each branch through which the test data is expected to flow. It should be noted that the program to be tested may be a program deployed locally, or may be a program deployed on a connected server (for example, the server 105 shown in fig. 1), and this embodiment does not limit this aspect at all.
Step 502, inputting test data included in the instruction into the program to be tested, and running the program to be tested, so that the program to be tested generates execution process information after the running is finished.
In this embodiment, after receiving the instruction, the electronic device may input the test data into the program to be tested, and run the program to be tested, so that the program to be tested generates execution process information after the running is finished. The execution process information may include a second data tag sequence formed by data tags of branches through which the test data actually flows, and each data tag in the second data tag sequence may be arranged according to the order in which the corresponding branches are executed.
Step 503, determining whether the numbers of the data marks respectively included in the second data mark sequence included in the execution process information and the first data mark sequence included in the instruction are the same.
In this embodiment, after generating the execution process information, the electronic device may compare the numbers of the data markers respectively included in the second data marker sequence and the first data marker sequence to determine whether the numbers are the same. If so, the electronic device may perform step 504.
Step 504, determining whether the data marks at the same position respectively included in the second data mark sequence and the first data mark sequence are the same.
In this embodiment, in response to determining that the numbers of the data marks respectively included in the second data mark sequence and the first data mark sequence are the same in step 503, the electronic device may further compare the data marks at the same position respectively included in the second data mark sequence and the first data mark sequence to determine whether the data marks at the same position are the same. If the second data mark sequence includes a target data mark different from the data mark at the corresponding position in the first data mark sequence, the electronic device may execute step 505.
Step 505, generating a test result.
In this embodiment, in response to the second data mark sequence including a target data mark different from the data mark at the corresponding position in the first data mark sequence, the electronic device may generate a test result, where the test result may include the target data mark. Here, if the number of data marks included in the second data mark sequence exceeds a preset value (for example, 30), the test result may further include position information of the target data mark in the second data mark sequence. It should be noted that each data mark in the second data mark sequence may have an index number, and the position information may be the index number of the target data mark. The preset value may be modified according to actual needs; this embodiment does not limit this aspect.
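Steps 503 to 505 can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation: the function name, the dictionary form of the test result, and its field names are assumptions, while the preset value of 30 is taken from the example above.

```python
PRESET = 30  # threshold above which position info is added to the result

def match_sequences(first_seq, second_seq):
    """Match the second data mark sequence against the first."""
    # Step 503: do the two sequences contain the same number of marks?
    if len(second_seq) != len(first_seq):
        return {"status": "count mismatch"}
    # Step 504: compare the data marks at each position.
    for index, (expected, actual) in enumerate(zip(first_seq, second_seq)):
        if actual != expected:
            # Step 505: build a test result containing the target mark.
            result = {"status": "branch mismatch", "target_mark": actual}
            if len(second_seq) > PRESET:
                result["position"] = index  # index number of the target mark
            return result
    return {"status": "marks match"}

print(match_sequences(["m1(Y)", "m2(N)"], ["m1(Y)", "m2(Y)"]))
# {'status': 'branch mismatch', 'target_mark': 'm2(Y)'}
```

Because a mismatching mark identifies the branch that was wrongly taken, the target mark (plus its index for long sequences) lets the tester locate the faulty branch directly.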
As can be seen from fig. 5, compared with the embodiment corresponding to fig. 2, the flow 500 of the testing method in this embodiment highlights the processing steps in the case that the second data mark sequence and the first data mark sequence include the same number of data marks. Therefore, the scheme described in this embodiment can improve the comprehensiveness of the test and further improve the test efficiency.
With further reference to fig. 6, as an implementation of the method shown in the above figures, the present application provides an embodiment of a testing apparatus, which corresponds to the embodiment of the method shown in fig. 2, and which can be applied to various electronic devices.
As shown in fig. 6, the test apparatus 600 of the present embodiment includes: a receiving unit 601, a program running unit 602, and a generating unit 603. The receiving unit 601 is configured to receive an instruction for testing a program to be tested, where the program to be tested is a program with a branch structure, each branch included in the program to be tested is preset with a data tag, the instruction includes test data and an expected result, and the expected result includes a first data tag sequence formed by data tags of branches through which the test data is expected to flow; the program running unit 602 is configured to input the test data into the program to be tested, and run the program to be tested, so that the program to be tested generates execution process information after the running is finished, where the execution process information includes a second data tag sequence formed by data tags of branches through which the test data actually flows; the generating unit 603 is configured to match the second data tag sequence with the first data tag sequence, and generate a test result based on a matching result.
In this embodiment, in the test apparatus 600: the specific processing of the receiving unit 601, the program running unit 602, and the generating unit 603 and the technical effects thereof can refer to the related descriptions of step 201, step 202, and step 203 in the corresponding embodiment of fig. 2, which are not described herein again.
In some optional implementations of this embodiment, the generating unit 603 may include: a first determining subunit (not shown in the figure), configured to determine whether the second data mark sequence and the first data mark sequence include the same number of data marks; a second determining subunit (not shown in the figure), configured to further determine, if the numbers are the same, whether the data marks at the same positions in the second data mark sequence and the first data mark sequence are the same; and a generating subunit (not shown in the figure), configured to generate a test result if the second data mark sequence includes a target data mark different from the data mark at the corresponding position in the first data mark sequence, where the test result includes the target data mark.
In some optional implementations of this embodiment, when the number of data marks included in the second data mark sequence exceeds a preset value, the test result further includes position information of the target data mark in the second data mark sequence.
In some optional implementations of this embodiment, the generating unit 603 may include: a third determining subunit (not shown in the figure), configured to, in response to determining that the second data mark sequence and the first data mark sequence include different numbers of data marks, further determine whether the second data mark sequence includes fewer data marks than the first data mark sequence; and a first generating subunit (not shown in the figure), configured to generate a test result including data loss prompt information if the number is fewer.
In some optional implementations of this embodiment, when the number of data markers included in the second data marker sequence is less than the number of data markers included in the first data marker sequence, the generated test result further includes data markers that are not included in the second data marker sequence in the first data marker sequence.
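The data-loss case described in these implementations might be sketched as follows, assuming data marks can repeat within a sequence (hence the multiset-style comparison); the function name and the result format are illustrative assumptions.

```python
from collections import Counter

def diagnose_count_mismatch(first_seq, second_seq):
    """If the second sequence contains fewer data marks than the first,
    report data loss and list the expected marks that never appeared."""
    if len(second_seq) >= len(first_seq):
        return {"status": "no loss detected"}
    # Consume one occurrence per expected mark; leftovers are missing.
    remaining = Counter(second_seq)
    missing = []
    for mark in first_seq:
        if remaining[mark] > 0:
            remaining[mark] -= 1
        else:
            missing.append(mark)
    return {"status": "data loss", "missing_marks": missing}

print(diagnose_count_mismatch(["m1(Y)", "m2(Y)", "m3(N)"], ["m1(Y)"]))
# {'status': 'data loss', 'missing_marks': ['m2(Y)', 'm3(N)']}
```

Listing the missing marks tells the tester which expected branches were never reached before the run stopped.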
In some optional implementations of this embodiment, the expected result further includes expected result data; and the generating unit 603 may be further configured to: in response to determining that the data marks at the same positions in the second data mark sequence and the first data mark sequence are the same, further determine whether actual result data generated after the program to be tested is run is the same as the expected result data; and in response to determining that the actual result data is the same as the expected result data, generate a test result including at least one of: test pass prompt information, the second data mark sequence, and the actual result data.
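The pass path above, where both the mark sequences and the result data agree, could look like the following sketch; the function name and the keys of the returned result are assumptions made for illustration.

```python
def final_check(first_seq, second_seq, expected_data, actual_data):
    """After the two data mark sequences agree position by position,
    compare actual result data with expected result data; the test
    passes only when both comparisons succeed."""
    if second_seq != first_seq:
        return {"status": "mark mismatch"}
    if actual_data != expected_data:
        return {"status": "result data mismatch"}
    # Test passed: report the prompt, the second sequence, and the data.
    return {
        "status": "pass",
        "prompt": "test passed",
        "second_sequence": second_seq,
        "actual_result": actual_data,
    }

print(final_check(["m1(Y)"], ["m1(Y)"], "ok", "ok")["status"])  # pass
```

The extra result-data check catches the case where every expected branch was taken yet the computation inside a branch still produced a wrong value.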
The apparatus provided by the above embodiment of the present application effectively utilizes the generation of the second data mark sequence, clearly records each branch through which the test data actually flows, and matches the second data mark sequence with the first data mark sequence, so as to quickly locate which branch has a problem when the data marks included in the second data mark sequence and the data marks included in the first data mark sequence are not identical, thereby improving the test efficiency.
Referring now to FIG. 7, shown is a block diagram of a computer system 700 suitable for use in implementing the electronic device of an embodiment of the present application. The electronic device shown in fig. 7 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 7, the computer system 700 includes a Central Processing Unit (CPU)701, which can perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM)702 or a program loaded from a storage section 708 into a Random Access Memory (RAM) 703. In the RAM 703, various programs and data necessary for the operation of the system 700 are also stored. The CPU 701, the ROM 702, and the RAM 703 are connected to each other via a bus 704. An input/output (I/O) interface 705 is also connected to bus 704.
The following components are connected to the I/O interface 705: an input portion 706 including a keyboard, a mouse, and the like; an output section 707 including a display such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage section 708 including a hard disk and the like; and a communication section 709 including a network interface card such as a LAN card, a modem, or the like. The communication section 709 performs communication processing via a network such as the internet. A drive 710 is also connected to the I/O interface 705 as needed. A removable medium 711 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 710 as necessary, so that a computer program read out therefrom is mounted into the storage section 708 as necessary.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program can be downloaded and installed from a network through the communication section 709, and/or installed from the removable medium 711. The computer program executes the above-described functions defined in the system of the present application when executed by the Central Processing Unit (CPU) 701.
It should be noted that the computer readable medium shown in the present application may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or hardware. The described units may also be provided in a processor, which may be described as: a processor including a receiving unit, a program running unit, and a generating unit. The names of the units do not in some cases constitute a definition of the unit itself; for example, the receiving unit may also be described as a "unit that receives an instruction to test a program to be tested".
As another aspect, the present application also provides a computer-readable medium, which may be contained in the electronic device described in the above embodiments; or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by an electronic device, cause the electronic device to include: receiving an instruction for testing a program to be tested, wherein the program to be tested is a program with a branch structure, each branch included in the program to be tested is preset with a data mark, the instruction comprises test data and an expected result, and the expected result comprises a first data mark sequence formed by the data marks of the branches through which the test data flow; inputting the test data into the program to be tested, and running the program to be tested so as to enable the program to be tested to generate execution process information after the running is finished, wherein the execution process information comprises a second data mark sequence formed by data marks of branches through which the test data actually flows; and matching the second data mark sequence with the first data mark sequence, and generating a test result based on a matching result.
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention herein disclosed is not limited to the particular combination of features described above, but also encompasses other arrangements formed by any combination of the above features or their equivalents without departing from the spirit of the invention. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (10)

1. A method of testing, the method comprising:
receiving an instruction for testing a program to be tested, wherein the program to be tested is a program with a branch structure, each branch included in the program to be tested is preset with a data mark, the instruction comprises test data and an expected result, and the expected result comprises a first data mark sequence formed by the data marks of the branches through which the test data are expected to flow;
inputting the test data into the program to be tested, and running the program to be tested so as to enable the program to be tested to generate execution process information after the running is finished, wherein the execution process information comprises a second data mark sequence formed by data marks of branches through which the test data actually flows;
matching the second data tag sequence with the first data tag sequence, and generating a test result based on a matching result, wherein the matching the second data tag sequence with the first data tag sequence comprises: comparing the data tags at the same positions in the second data tag sequence and the first data tag sequence.
2. The method of claim 1, wherein the matching the second data tag sequence with the first data tag sequence, and generating a test result based on a matching result, comprises:
determining whether the number of data markers respectively included in the second sequence of data markers and the first sequence of data markers is the same;
if the number of the data marks is the same, further determining whether the data marks at the same position respectively included in the second data mark sequence and the first data mark sequence are the same;
and if the second data mark sequence comprises a target data mark different from the data mark at the corresponding position in the first data mark sequence, generating a test result, wherein the test result comprises the target data mark.
3. The method of claim 2, wherein the test result further comprises position information of the target data mark in the second sequence of data marks when the number of data marks comprised by the second sequence of data marks exceeds a preset value.
4. The method of claim 2, wherein the matching the second data tag sequence with the first data tag sequence, and generating a test result based on a matching result, comprises:
in response to determining that the second sequence of data markers and the first sequence of data markers, respectively, include a different number of data markers, further determining whether the second sequence of data markers includes a fewer number of data markers than the first sequence of data markers;
and if the number is fewer, generating a test result comprising data loss prompt information.
5. The method of claim 4, wherein when the second sequence of data markers includes a fewer number of data markers than the first sequence of data markers, the generated test result further includes data markers in the first sequence of data markers that are not included in the second sequence of data markers.
6. The method of claim 2, wherein the expected result further comprises expected result data; and
the generating a test result based on the matching result includes:
in response to determining that the data marks at the same position respectively included in the second data mark sequence and the first data mark sequence are the same, further determining whether actual result data generated after the operation of the program to be tested is the same as the expected result data;
in response to determining that the actual result data is the same as the expected result data, generating a test result comprising at least one of: and the test passes prompt information, the second data marking sequence and the actual result data.
7. A test apparatus, the apparatus comprising:
the device comprises a receiving unit and a processing unit, wherein the receiving unit is configured to receive an instruction for testing a program to be tested, the program to be tested is a program with a branch structure, each branch included in the program to be tested is preset with a data mark, the instruction comprises test data and an expected result, and the expected result comprises a first data mark sequence formed by the data marks of the branches through which the test data are expected to flow;
the program running unit is configured to input the test data into the program to be tested, run the program to be tested, and enable the program to be tested to generate execution process information after the running is finished, wherein the execution process information comprises a second data mark sequence formed by data marks of branches through which the test data actually flows;
a generating unit, configured to match the second data tag sequence with the first data tag sequence, and generate a test result based on a matching result, where the matching the second data tag sequence with the first data tag sequence includes: comparing the data tags at the same positions in the second data tag sequence and the first data tag sequence.
8. The apparatus of claim 7, wherein the generating unit comprises:
a first determining subunit configured to determine whether the numbers of data markers respectively included in the second data marker sequence and the first data marker sequence are the same;
a second determining subunit, configured to further determine whether data marks at the same position respectively included in the second data mark sequence and the first data mark sequence are the same, if the numbers are the same;
a generating subunit, configured to generate a test result if the second data marker sequence includes a target data marker different from a data marker at a corresponding position in the first data marker sequence, where the test result includes the target data marker.
9. An electronic device, comprising:
one or more processors;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-6.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the method according to any one of claims 1-6.
CN201710399662.4A 2017-05-31 2017-05-31 Test method and device Active CN107229565B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710399662.4A CN107229565B (en) 2017-05-31 2017-05-31 Test method and device

Publications (2)

Publication Number Publication Date
CN107229565A CN107229565A (en) 2017-10-03
CN107229565B true CN107229565B (en) 2020-05-01

Family

ID=59934327

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710399662.4A Active CN107229565B (en) 2017-05-31 2017-05-31 Test method and device

Country Status (1)

Country Link
CN (1) CN107229565B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110688295A (en) * 2018-07-06 2020-01-14 北京京东尚科信息技术有限公司 Data testing method and device
CN111258882B (en) * 2020-01-03 2023-08-25 恩亿科(北京)数据科技有限公司 Test data acquisition method and device based on digital media system
CN113836021A (en) * 2021-09-24 2021-12-24 昆仑芯(北京)科技有限公司 Test method, test device, electronic apparatus, and medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101625709A (en) * 2008-07-09 2010-01-13 华为技术有限公司 Method and device for collecting functional coverage
CN102323906A (en) * 2011-09-08 2012-01-18 哈尔滨工程大学 MC/DC test data automatic generation method based on genetic algorithm
CN102419728A (en) * 2011-11-01 2012-04-18 北京邮电大学 Method for determining software test process sufficiency based on coverage rate quantitative indicators
CN103365771A (en) * 2012-04-10 2013-10-23 阿里巴巴集团控股有限公司 Method and equipment for obtaining code coverage rate
CN105512021A (en) * 2014-09-25 2016-04-20 阿里巴巴集团控股有限公司 Method and device for Diff analysis used for software testing

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5988444B2 (en) * 2014-02-14 2016-09-07 インターナショナル・ビジネス・マシーンズ・コーポレーションInternational Business Machines Corporation Method for testing an optimized binary module, computer for testing the optimized binary module, and computer program therefor


Also Published As

Publication number Publication date
CN107229565A (en) 2017-10-03

Similar Documents

Publication Publication Date Title
CN106815031B (en) Kernel module loading method and device
US11151024B2 (en) Dynamic automation of DevOps pipeline vulnerability detecting and testing
WO2018223717A1 (en) Webpage front-end testing method, device, system, apparatus and readable storage medium
CN109684188B (en) Test method and device
CN109359194B (en) Method and apparatus for predicting information categories
CN107302597B (en) Message file pushing method and device
CN109460652B (en) Method, apparatus and computer readable medium for annotating image samples
CN107229565B (en) Test method and device
CN112631911A (en) Automatic testing method and device, computer equipment and storage medium
CN111367531A (en) Code processing method and device
CN108959102B (en) Method and device for generating test data and testing application to be tested
US8949991B2 (en) Testing web services that are accessible via service oriented architecture (SOA) interceptors
CN113535577A (en) Application testing method and device based on knowledge graph, electronic equipment and medium
CN107305528B (en) Application testing method and device
CN109145591B (en) Plug-in loading method of application program
US11121912B2 (en) Method and apparatus for processing information
CN111125503B (en) Method and apparatus for generating information
CN107247661B (en) Method and system for supporting automatic verification of installation package of application
CN108287792B (en) Method and apparatus for outputting information
CN111273970B (en) Calling method, device, system, medium and electronic equipment of intelligent contract
CN110209959B (en) Information processing method and device
CN113626301A (en) Method and device for generating test script
CN112579428A (en) Interface testing method and device, electronic equipment and storage medium
CN111831530A (en) Test method and device
CN111414566A (en) Method and device for pushing information

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant