CN108614770B - Automatic test assertion method, device, storage medium and equipment - Google Patents

Automatic test assertion method, device, storage medium and equipment

Info

Publication number
CN108614770B
CN108614770B CN201810309343.4A
Authority
CN
China
Prior art keywords
test
rule
data
assertion
rules
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810309343.4A
Other languages
Chinese (zh)
Other versions
CN108614770A (en)
Inventor
刘鹏
许宜
张家宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Industrial and Commercial Bank of China Ltd ICBC
Original Assignee
Industrial and Commercial Bank of China Ltd ICBC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Industrial and Commercial Bank of China Ltd ICBC filed Critical Industrial and Commercial Bank of China Ltd ICBC
Priority to CN201810309343.4A priority Critical patent/CN108614770B/en
Publication of CN108614770A publication Critical patent/CN108614770A/en
Application granted granted Critical
Publication of CN108614770B publication Critical patent/CN108614770B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites

Abstract

The invention provides an automated test assertion method, device, storage medium and equipment. The method comprises the following steps: acquiring the test rule, rule parameters, test data, data embedding point, rule check point and expected result of each test case in a test scenario, wherein the test rule is written according to a parameterization rule so that it can serve multiple test scenarios or multiple test cases; driven by the test data, judging whether the data embedding point matches the injection time point defined by the test data, and if so injecting the test data into the test object; judging whether the rule check point matches the test-result check time point defined by the test case, and if so splicing the rule parameters unique to the test case with the test rule to obtain the specific rule content of the test case, then executing that rule content in the test object injected with the test data to obtain the actual operation result; and, for each test case, comparing the actual operation result with the expected result to generate a verification result. The invention can improve the reuse rate of test scripts.

Description

Automatic test assertion method, device, storage medium and equipment
Technical Field
The invention relates to the field of computer automated testing, in particular to an automated testing assertion method, an automated testing assertion device, a storage medium and equipment.
Background
Software testing is an important link in the software development life cycle and plays an important role in the development of a software system. Automated testing can save a great deal of regression-testing manpower and time, is more precise than manual testing, and has been adopted by many projects.
Traditional automated testing is achieved by recording/playback of scripts or by simulating page operations. Different test cases require different, specially written test scripts. Assertion code is hard-coded into the test scripts, with redundant rules and a high repetition rate. If an assertion needs to be added or deleted, the test script must be modified, so maintainability is poor. In addition, when an actual result does not match the expected result during a script run, the test is interrupted and exits, so a single run finds only a limited number of problems, reducing test efficiency. Against the background of the already high cost of automated testing, these shortcomings of the traditional approach are increasingly evident.
Disclosure of Invention
The embodiment of the invention provides an automated test assertion method for configuring rules flexibly and reusably and improving the reusability of test scripts. The automated test assertion method comprises the following steps: parsing the test scenarios in a configuration file, and acquiring the test rule, rule parameters, test data, data embedding point, rule check point and expected result of each test case in each test scenario, wherein the test rule is written according to a parameterization rule so that it can serve multiple test scenarios or multiple test cases; driven by the test data, judging whether the data embedding point matches the injection time point defined by the test data, and if so injecting the test data into the test object; judging whether the rule check point matches the test-result check time point defined by the test case, and if so splicing the rule parameters unique to the test case with the test rule to obtain the specific rule content of the test case, then executing that rule content in the test object injected with the test data to obtain the actual operation result; and, for each test case, comparing the actual operation result with the expected result to generate a verification result.
In one embodiment, the method further comprises: and generating a configuration file, wherein the configuration file indicates the test environment of the automatic test operation and the mapping relation of the test scene to be performed.
In one embodiment, the method further comprises: and analyzing the test rule to generate a mapping relation between the rule name and the rule content, wherein the test case can refer to the rule content through the rule name.
In one embodiment, the method further comprises: and if the verification result shows that the verification fails, the running of the test script is not interrupted until all verification results are generated.
In one embodiment, the method further comprises: and generating an XML data file according to the verification result, and generating a test report capable of acquiring a test scene, a test case and a verification rule specified at a rule verification point through a front-end page based on the XML data file.
The embodiment of the invention also provides an automated test assertion device for configuring rules flexibly and reusably and improving the reusability of test scripts. The automated test assertion device comprises: a test case parsing module for parsing the test scenarios in a configuration file and acquiring the test rule, rule parameters, test data, data embedding point, rule check point and expected result of each test case in each test scenario, wherein the test rule is written according to a parameterization rule so that it can serve multiple test scenarios or multiple test cases; a data embedding module for judging, driven by the test data, whether the data embedding point matches the injection time point defined by the test data, and if so injecting the test data into the test object; a rule checking module for judging whether the rule check point matches the test-result check time point defined by the test case, and if so splicing the rule parameters unique to the test case with the test rule to obtain the specific rule content of the test case, then executing that rule content in the test object injected with the test data to obtain the actual operation result; and a bulk assertion module for comparing, for each test case, the actual operation result with the expected result to generate a verification result.
In one embodiment, the apparatus further comprises: a configuration file parsing module for: and generating a configuration file, wherein the configuration file indicates the test environment of the automatic test operation and the mapping relation of the test scene to be performed.
In one embodiment, the apparatus further comprises: a reusable rule parsing module to: and analyzing the test rule to generate a mapping relation between the rule name and the rule content, wherein the test case can refer to the rule content through the rule name.
In one embodiment, the bulk assertion module is further configured to: and if the verification result shows that the verification fails, the running of the test script is not interrupted until all verification results are generated.
In one embodiment, the apparatus further comprises: a test report generation module to: and generating an XML data file according to the verification result, and generating a test report capable of acquiring a test scene, a test case and a verification rule specified at a rule verification point through a front-end page based on the XML data file.
The embodiment of the invention also provides a computer readable storage medium, which is used for flexibly and reusably configuring the rules and improving the reusability of the test script. The computer readable storage medium has stored thereon a computer program which, when executed by a processor, performs the steps of the method described in the embodiments above.
The embodiment of the invention also provides computer equipment for flexibly and reusably configuring the rules and improving the reusability of the test scripts. The computer device comprises a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor executes the program to implement the steps of the method of the above embodiments.
According to the automated test assertion method, device, storage medium and equipment, a parameterized test rule can be applied to multiple application scenarios or multiple test cases, so similar test rules need not be written repeatedly when business flows are similar but test scenarios differ, or when the test scenario is relatively fixed; the test rules are thereby reused and the reuse rate of the test scripts improved. The data embedding point, rule check point and test rule define the injection points of the data and of the assertions; the test case determines which assertion is performed at a given time; and the rule describes how the assertion is implemented.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts. In the drawings:
FIG. 1 is a flow diagram of an automated test assertion method in accordance with an embodiment of the present invention;
FIG. 2 is a flow diagram of an automated test assertion method according to another embodiment of the present invention;
FIG. 3 is a flow diagram of an automated test assertion method according to yet another embodiment of the present invention;
FIG. 4 is a flow diagram of an automated test assertion method according to yet another embodiment of the invention;
FIG. 5 is a flow diagram of an automated test assertion method according to yet another embodiment of the present invention;
FIG. 6 is a flow diagram of an automated test assertion method in accordance with an embodiment of the present invention;
FIG. 7 is a schematic diagram of an automated test assertion device according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of an automated test assertion device according to another embodiment of the present invention;
FIG. 9 is a schematic diagram of an automated test assertion device according to yet another embodiment of the present invention;
FIG. 10 is a schematic diagram of an automated test assertion device according to yet another embodiment of the present invention;
FIG. 11 is a diagram of a test script driver according to an embodiment of the invention;
FIG. 12 is a schematic diagram of an automated test assertion device according to another embodiment of the present invention;
fig. 13 is a schematic structural diagram of a computer device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the embodiments of the present invention are further described in detail below with reference to the accompanying drawings. The exemplary embodiments and descriptions of the present invention are provided to explain the present invention, but not to limit the present invention.
In order to improve the reuse rate of automated test scripts, the embodiment of the invention provides an automated test assertion method. Fig. 1 is a flow chart of an automated test assertion method according to an embodiment of the present invention. As shown in fig. 1, an automated test assertion method according to an embodiment of the present invention may include:
step S110: analyzing the test scenes in the configuration file, and acquiring the test rules, rule parameters, test data, data embedding points, rule check points and expected results of each test case in each test scene, wherein the test rules are compiled according to parameterization rules to be used for a plurality of test scenes or a plurality of test cases;
step S120: judging whether the data embedding point accords with the injection time point defined by the test data or not by taking the test data as a drive, and if so, injecting the test data into the test object;
step S130: judging whether the rule check points meet the test result check time points defined by the test case, if so, splicing the unique rule parameters of the test case with the test rules to obtain specific rule contents of the test case, and executing the specific rule contents in the test object injected with the test data to obtain an actual operation result;
step S140: and comparing the actual operation result with the expected result aiming at each test case to generate a verification result.
In step S110, test scenarios may be predefined in the configuration file. A configuration file can define a number of different test scenarios, and each test scenario can contain a number of different test cases. The rule parameters, rule check points and expected results may be defined by the test case; the test rules may be referenced by rule name; and the data embedding points and test data may be loaded dynamically by the test script. A test case may reference a test rule in some defined way (referencing means that the test script is responsible for organizing the test at its various stages), for example through the rule name recorded in the test case, behind which stands the test-rule object containing the rule content. The test rule adopts a general definition method, such as extracting its parameters. Once a test rule written according to the parameterization rule has been stored, it can be referenced by multiple test scenarios or test cases, in similar or in different scenarios; the difference is that when the scenarios are similar, the test script itself can also be reused. For example, the same test rule can be used repeatedly across test scenarios with similar business processes, and a tester can reuse an already written test rule when the test scenario is fixed.
In step S120, by judging whether the data embedding point matches the injection time point defined by the test data, the script can identify when to perform data injection; at the same time it identifies which data in the data file need to be injected at the current time point, so that the injection can be carried out. A data-driven test here means a functional test driven by test data, in which test data are created for the different functions and each piece of data carries the function to be tested. Step S110 explained that the script must identify the data injection points and the rule check points; steps S120 and S130 describe how each point fetches its data. In steps S120 and S130, because the test data are injected according to the data embedding points and the verification rules of the test cases are resolved at the rule check points, the test script does not need to be modified for cases whose business processes are similar but whose test cases differ.
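As an illustration of the embedding-point check in step S120, the following sketch (all class, field, and point names are invented, not taken from the patent) selects, at the current embedding point, only the test-data rows whose declared injection point matches:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch only: each test-data row carries the embedding point
// at which it must be injected; when the script reaches an embedding point,
// only rows whose declared injection point matches are injected.
public class DataInjector {
    public static class DataRow {
        public final String embeddingPoint; // e.g. "day1-before-trade"
        public final String content;        // the data to inject
        public DataRow(String embeddingPoint, String content) {
            this.embeddingPoint = embeddingPoint;
            this.content = content;
        }
    }

    // Returns the contents of the rows to inject at currentPoint.
    public static List<String> select(List<DataRow> rows, String currentPoint) {
        List<String> toInject = new ArrayList<>();
        for (DataRow row : rows) {
            if (row.embeddingPoint.equals(currentPoint)) {
                toInject.add(row.content);
            }
        }
        return toInject;
    }
}
```

The same data file can thus drive several embedding points: the script calls `select` once per point and injects only what that point declares.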
In step S130, "splicing" means that the parameters of the rule have been extracted as placeholders: each case that reuses the rule differs only in its parameters, and the case's own parameters are combined with the corresponding rule to assemble a complete executable rule. By assembling the rule parameters with the test rule, for example by substituting the rule parameters into the test rule, the specific rule content for each test case is obtained, so that a generic test rule can be applied to a specific test case. Different test cases carry different rule parameters, so assembling the parameters with the rule yields a specific test rule per case. The specific rule content may include the rule content for a specific parameter value (e.g., a specific transaction number). Executing the specific rule content in the test object into which the test data have been injected may mean executing the corresponding rule against the database of the application under test to obtain the actual result.
"Determining whether the rule check point meets the test-result check time point defined by the test case" means, more specifically, "determining whether the execution point of the test script has reached a rule check point defined by the test case". A rule check point can be used to specify test rules, and multiple rule check points can be configured in one test script as needed.
In an embodiment, a test rule (verification rule) may be specified at a rule verification point, and verification for a plurality of test cases may be implemented by combining respective rule parameters of the test cases.
In step S140, all the verification results may be generated by comparing, in turn, the expected result defined for each test rule in each test case with the actual operation result obtained after execution.
In an embodiment, the test data, test cases, and test rules are all data consumed by the test script. The test data and assertion rules used by the test script can therefore be loaded dynamically, decoupling the data, the cases, and the rules.
In an embodiment, the injection points of the data and the assertions may be defined by a test script, what kind of assertions are made at a certain time may be determined by a test case, and the implementation manner of the assertions may be described by a test rule. Under the condition that the assertion needs to be adjusted, a tester does not need to modify the test script, and only needs to add or delete the verification rule in the case storage device, so that the maintainability of the test script is improved.
In the embodiment of the invention, a parameterized test rule serves as a generic rule: for different test cases, the rule parameters unique to each case can be spliced with the generic test rule to obtain the rule content of that specific case, without writing test rules repeatedly for different or similar test cases. A generic test rule can be applied to multiple application scenarios or multiple test cases, similar test rules need not be written repeatedly, the rules become reusable, and the reuse rate of the test script improves. Because the test rules, data embedding points and rule check points are defined separately, when an assertion needs to be adjusted the tester only needs to add or delete a verification rule in the storage device of the test cases rather than modify the test script, which improves the maintainability of the test script. For test cases with different test flows but similar assertion rules, the reusable assertion rules (test rules) improve the efficiency of writing test scripts.
In an embodiment, at a rule check point the test rules may be updated according to the test case. For test cases with similar test flows but different assertion rules, providing embedding points for the test data and for the test rules improves the reusability of the test script.
The test script provides the embedding points for dynamically loaded test data and test rules, and defines the check time points for the test object; the test case defines the checks to be performed on the test object; and the reusable test rule defines how, specifically, the test object is checked. The embodiment of the invention thus combines test case, test script and reusable test rule, improving the reusability of the test rules.
In an embodiment, the test rules, test data, and test cases may be stored in their respective storage media. The tester can add and modify the test data by using the test data storage medium, can add and modify the data of the test case by using the test case storage medium, and can realize mutual sharing of the test data by using the storage medium of the test data.
In an embodiment, the test rule may be stored in the form of a data table in a storage medium, in a virtual table in a database, or in a sheet of an Excel tool, which is very friendly to testers. In some embodiments, a test rule defines a rule name, a rule type, and rule content, as shown in Table 1.
Rule name | Rule type | Rule content
TABLE 1 data sheet of test rules
In this embodiment, the rule type allows the rule to be adapted to different programming languages, which is particularly significant for software projects developed in a mix of languages. The rule content defines the specific method by which the test rule is verified, using a general definition method: for example, a database query statement whose parameters have been extracted, or an encapsulated JAVA method.
In some embodiments, parameterized variables are identified by specifiers in the test rule, and the specifiers are replaced in sequence according to the field order of the test data, so that the test object is checked using the test data together with the test rule. Taking a database query statement as an example: the query condition (e.g., the transaction number) is expressed as a variable in the form '?fm_trade_id?'; combined with the parameter column of the parameterized test rule, the '?...?' specifiers in the query statement are identified; and the specifiers are replaced in sequence in field order (transaction number / test data content). Queries with different parameter dimensions under different scenarios can thus be satisfied, and the actual test result value obtained.
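The sequential replacement of '?...?' specifiers in field order might be sketched as follows (a minimal illustration; the placeholder syntax ?fm_trade_id? is taken from the example in the description, everything else — class name, method name — is assumed):

```java
// Minimal sketch: each ?name? specifier in the rule content is replaced,
// in order of appearance, by the next parameter value in field order,
// producing the concrete executable rule for one test case.
public class RuleAssembler {
    public static String splice(String ruleContent, String[] paramValues) {
        StringBuilder result = new StringBuilder();
        int i = 0, nextParam = 0; // one value per specifier, in field order
        while (i < ruleContent.length()) {
            int open = ruleContent.indexOf('?', i);
            if (open < 0) { result.append(ruleContent.substring(i)); break; }
            int close = ruleContent.indexOf('?', open + 1);
            if (close < 0) { result.append(ruleContent.substring(i)); break; }
            result.append(ruleContent, i, open);        // text before specifier
            result.append(paramValues[nextParam++]);    // substitute the value
            i = close + 1;                              // skip past "?name?"
        }
        return result.toString();
    }
}
```

For example, splicing the value `T20180001` into `... trade_id = '?fm_trade_id?'` yields a query for that specific transaction number, so the same stored rule serves every case that differs only in its parameters.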
In some embodiments, a function to be verified in the test object is encapsulated in a JAVA method, the encapsulated JAVA method is identified in the rule content, and during testing different parameters are passed into the encapsulated method to obtain the actual operation result for each test case. The encapsulated JAVA method behaves like the database query statement: the function to be verified is wrapped in a JAVA method, the method to call is noted in the rule content, and during checking different parameters can be passed in to obtain the return value for the corresponding parameters.
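One way such a named method could be looked up and invoked is via reflection; the sketch below is a hedged illustration (the class, method, and stand-in check logic are invented, and the patent does not prescribe reflection as the lookup mechanism):

```java
import java.lang.reflect.Method;

// Illustrative sketch: the rule content names an encapsulated check method;
// the checker resolves it by reflection and invokes it with the
// case-specific parameters to obtain the actual result.
public class MethodRuleRunner {
    // Example encapsulated check; a real project would wrap a function of
    // the system under test. The logic here is a stand-in for the demo.
    public static int tradeCount(String tradeId) {
        return tradeId.isEmpty() ? 0 : 1;
    }

    public static Object run(String className, String methodName, Object... args) {
        try {
            Class<?> cls = Class.forName(className);
            Class<?>[] types = new Class<?>[args.length];
            for (int i = 0; i < args.length; i++) types[i] = args[i].getClass();
            Method m = cls.getMethod(methodName, types);
            return m.invoke(null, args); // static encapsulated check method
        } catch (ReflectiveOperationException e) {
            throw new RuntimeException("Rule method invocation failed", e);
        }
    }
}
```

Passing different rule parameters to `run` then yields the actual operation result for each test case without touching the script.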
In the embodiment, test rules written according to the parameterization rule can be stored, so that they can be referenced by multiple test scenarios or test cases to achieve the reusable effect.
In an embodiment, the test rules are invoked from a rule base, where the rule base is built up in advance by accumulating the various test rules. Generic test rules can be written by testers and continuously accumulated into the rule base, from which they are retrieved for use. Rule base example: as shown in Table 2, for a transaction test, the inflow of transaction information must be verified in the test flows of different scenarios, but different transactions have different transaction numbers. A parameterized query statement can therefore be defined in the rule base, with the transaction number expressed as the variable ?fm_trade_id?, so that the check rule can be reused by other test cases.
TABLE 2 rule base examples of test rules
The test data may be stored in the storage medium in the form of a data table. In an embodiment, each piece of test data indicates its test scenario, defines its data embedding point, and carries its content in a number of data fields, as shown in Table 3. The test scenario identifies the scenario to which the piece of test data applies; the test scenario must be indicated when the test script is scheduled so that the corresponding test data are injected into the software system under verification. The data embedding point identifies that the test data are not injected until a specific moment in the execution of the test script.
TABLE 3 data sheet of test data
Test data storage example: as shown in Table 4, in the full-flow test of a certain test scenario, test data must be imported into the test apparatus at different test stages (data embedding points) — for example, a complex association test covering the sale transactions of the first and second days. The data table (storage medium) of the test data can indicate both the time point at which each piece of test data is imported and its specific content.
Table 4 data sheet storage example of test data
The test cases may be stored in the storage medium in the form of a data table. In the embodiment, a test case indicates its test scenario and rule name, and defines its rule check point, rule parameters and expected result, as shown in Table 5. The test scenario identifies the main target to be tested by the case, i.e. the core functional scenario the case guards. The reusable test rule is referenced through its rule name. The rule check point exists because, in an automated test flow, several rules often need to be checked at the same time point; in an embodiment, when the execution of the test script reaches a rule check point, the required test rules can be configured there, so the rules to be checked at a given moment of the test flow can be configured flexibly. As the test rules of the embodiments are reusable, passing in different test parameters/rule parameters lets a rule be shared and exploited to the maximum. Finally, the expected result of the test execution is written in the test case storage device, reducing the time testers spend maintaining test cases.
Test scenario | Rule name | Rule check point | Rule parameters | Expected result
TABLE 5 data sheet for test cases
Test case storage example: as shown in Table 6, in a complex full-flow test scenario, the real-time character of the verification must be guaranteed — for example, the result after a transaction purchase is updated once the sale-side test data have been entered. Rule check points are stored and defined in the data sheet of the test case; at the specified test time point, real-time verification is performed with the specified check rule (determined by the scenario under test: the tester writes the case for the tested scenario, and the rule depends on that scenario) together with the case's own rule parameters, and the actual execution result after assembling the test rule with the rule parameters is compared with the expected result, realizing the automated test assertion.
TABLE 6 data sheet store examples of test cases
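The check performed at a rule check point can be sketched end to end — rule template, parameter splicing, execution, comparison with the expected result. In this illustration the in-memory "database", the placeholder syntax usage, and all names are invented stand-ins:

```java
import java.util.Map;

// End-to-end sketch of a rule check point: the case's parameters are
// spliced into the shared rule template, the concrete rule is "executed",
// and the actual result is compared with the case's expected result.
public class CheckPointDemo {
    // Stand-in for executing the spliced rule against the test object.
    // Here the "rule" is just a key lookup; a real rule would be a
    // database query or an encapsulated method call.
    static String execute(String concreteRule, Map<String, String> testObject) {
        return testObject.getOrDefault(concreteRule, "<absent>");
    }

    public static boolean check(String ruleTemplate, String param,
                                String expected, Map<String, String> testObject) {
        String concreteRule = ruleTemplate.replace("?fm_trade_id?", param);
        String actual = execute(concreteRule, testObject);
        return expected.equals(actual); // the per-case verification result
    }
}
```

Two cases that share the template but carry different parameters and expected results would each call `check` with their own values, which is exactly what makes the rule reusable.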
FIG. 2 is a flow chart illustrating an automated test assertion method according to another embodiment of the present invention. As shown in fig. 2, the automated test assertion method shown in fig. 1 may further include:
step S150: and generating a configuration file, wherein the configuration file indicates the test environment of the automatic test operation and the mapping relation of the test scene to be performed.
"Indicate" here has the sense of configuring: the test framework can be deployed in any environment, but the configuration must be modified in step to indicate the environment base on which the current program runs. "Mapping" refers to the mapping between the current script under test and the scenarios the script needs to test. For example, given 100 test cases of which 60 apply to script A and 70 to script B, one only needs to configure the appropriate cases for each script.
In an embodiment, a tester writes the configuration file of the test project. The configuration file indicates the test environment in which the project runs and the mapping to the test scenarios to be performed. Automated test scripts whose test scenarios differ but whose test flows are broadly similar can thus be reused (one test script can run multiple test cases, achieving the reusable effect). For test scenarios with similar flows, the verification rules can be configured dynamically to test different products in a targeted way; since the test script provides the points at which rules are dynamically invoked and verified, a reusable effect is achieved.
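The script-to-scenario mapping carried by the configuration file might look like the following minimal sketch (the line format `scriptA=scenario1,scenario2` is an assumption for illustration; the patent does not specify the configuration syntax):

```java
import java.util.Arrays;
import java.util.List;

// Minimal sketch: each configuration line maps one test script to the
// list of scenarios it should run, so one script serves many scenarios.
public class ScenarioConfig {
    public static List<String> scenariosFor(String configLine) {
        String[] parts = configLine.split("=", 2); // "script=scenario,scenario"
        return Arrays.asList(parts[1].split(","));
    }
}
```

At startup, the framework would read such lines and hand each script only its own scenarios, which is how 60 cases can be routed to one script and 70 to another from the same case pool.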
FIG. 3 is a flow diagram of an automated test assertion method according to yet another embodiment of the invention. As shown in fig. 3, the automated test assertion method shown in fig. 1 may further include:
step S160: and analyzing the test rule to generate a mapping relation between the rule name and the rule content, wherein the test case can refer to the rule content through the rule name.
The rule content may define the specific method by which the test rule is verified. Through the mapping relation between rule names and rule contents, a test case can conveniently refer to rule content by rule name.
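A minimal sketch of the name-to-content mapping might look like the following; the `name = content` rule format and the SQL-like rule texts are illustrative assumptions, not the patent's actual rule syntax:

```python
# Hypothetical sketch: parse reusable rules into a {name: content} dict so
# test cases can refer to a rule by name only. Rule texts are placeholders.

RULE_SOURCE = """
check_balance = SELECT balance FROM account WHERE id = {account_id}
check_status = SELECT status FROM trade WHERE id = {trade_id}
"""

def parse_rules(source):
    """Parse 'name = content' lines into a rule-name -> rule-content dict."""
    rules = {}
    for line in source.strip().splitlines():
        name, _, content = line.partition("=")  # split on the first '='
        rules[name.strip()] = content.strip()
    return rules

rules = parse_rules(RULE_SOURCE)
```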
FIG. 4 is a flow chart illustrating an automated test assertion method according to yet another embodiment of the invention. As shown in fig. 4, the automated test assertion method shown in fig. 1 may further include:
Step S170: if a verification result shows that verification failed, do not interrupt the running of the test script; continue until all verification results are generated.
In this embodiment, when a verification fails, the running of the test project is not interrupted, which removes the limitation of finding only one problem per run and improves test efficiency.
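The non-interrupting behavior can be sketched as a batch assertion that records failures instead of raising them; the check names and values below are illustrative:

```python
# Sketch of "do not interrupt on failure": every verification runs to
# completion and failures are collected, so one run can surface multiple
# problems instead of stopping at the first failed assertion.

def batch_assert(checks):
    """checks: iterable of (name, expected, actual). Returns all results."""
    results = []
    for name, expected, actual in checks:
        status = "PASS" if expected == actual else "FAIL"
        results.append({"name": name, "expect": expected,
                        "result": actual, "status": status})
    return results  # the caller reports; the run is never aborted mid-way

checks = [("buy_sell_flag", "1-buy", "1-buy"),
          ("notional", "100w", "90w"),
          ("currency", "CNY", "CNY")]
results = batch_assert(checks)
```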
FIG. 5 is a flow diagram of an automated test assertion method according to yet another embodiment of the invention. As shown in fig. 5, the automated test assertion method shown in fig. 1 may further include:
Step S180: generate an XML data file according to the verification results and, based on the XML data file, generate through a front-end page a test report from which the test scenario, the test case, and the verification rule specified at each rule check point can be obtained.
In this embodiment, a test report from which the test scenario, the test case, and the verification rule specified at each rule check point can be obtained is generated through the front-end page, so that a tester can read it easily and can quickly locate the problem scenario, test case, and verification rule. The assertion results can be fed back visually to the tester through the report and visualization device, which improves test efficiency and saves time in the software development process.
Example XML data File for test report:
<?xml version="1.0"encoding="GBK"?>
<test name="root">
scene name ═ test scene of buying in the first day and selling in the next day of transaction'
< checkpoint name ═ first day inter-day online processing ">
< rule adaptor ═ kfzx-xxx ═ author ═ kfzx-xxx ═ index ═ 10 ═ name ═ background transaction information correctness verification "status ═ PASS >
<asserts>
<assert status="PASS">
<params>
<param index="0"key="fm_trade_id">Trade_id001</param>
</params>
<groups>
<group index="0"status="PASS">
< assigndata expect ═ 1-buy ═ key ═ buy _ sell _ flag ═ result ═ 1-buy >
<assertdata expect="100w"key="notional"result="100w"/>
</group>
</groups>
</assert>
</asserts>
</rule>
</checkpoint>
</scene>
</test>
Test report example: the test report may be generated from the XML file output by the testing device. For a group of test scenarios, an XML node named <scene name> may be defined; multiple rule check point nodes belonging to the same group of test scenarios may be named <checkpoint name>; multiple check items may be executed at the same rule check point, with each check item node named <rule>, which also records the check item's owning tester <adaptor>, the corresponding verification rule name <name>, and the final case result <status> of executing the check. Finally, the assertion parameters <params> actually used by the check item may be displayed as key-value pairs, and the assertion details <assertdata>, organized into groups <groups>, record one by one the compared keyword <key>, the expected result <expect>, and the actual result <result>.
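The report structure described above can be emitted with a standard XML library. The sketch below follows the element and attribute names of the example XML, while the attribute values are shortened illustrations:

```python
# Illustrative sketch: build the report tree with the standard library.
# Element/attribute names follow the example XML; values are abbreviated.
import xml.etree.ElementTree as ET

test = ET.Element("test", name="root")
scene = ET.SubElement(test, "scene", name="buy on day 1, sell on day 2")
checkpoint = ET.SubElement(scene, "checkpoint", name="first-day inter-day processing")
rule = ET.SubElement(checkpoint, "rule", adaptor="kfzx-xxx", author="kfzx-xxx",
                     index="10", name="transaction info correctness", status="PASS")
asserts = ET.SubElement(rule, "asserts")
one_assert = ET.SubElement(asserts, "assert", status="PASS")
params = ET.SubElement(one_assert, "params")
param = ET.SubElement(params, "param", index="0", key="fm_trade_id")
param.text = "Trade_id001"  # assertion parameter shown as a key-value pair
groups = ET.SubElement(one_assert, "groups")
group = ET.SubElement(groups, "group", index="0", status="PASS")
ET.SubElement(group, "assertdata", expect="1-buy", key="buy_sell_flag", result="1-buy")

xml_text = ET.tostring(test, encoding="unicode")  # serialized report
```

A front-end page could then parse this file to render the scenario, check point, rule, and assertion details hierarchically.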
In an embodiment, as shown in table 7, the test report may include contents of an application, a version, a test scenario, a checkpoint, a test rule, an assertion parameter, an assertion result, and a test case result.
Table 7 test report examples
In the embodiment, the test environment parameters and configuration scheduling may be driven by a test script driving device (including but not limited to the test script itself; besides the script there are some front-drive and back-drive components, such as environment parsing, rule loading, and data cleaning/backup/recovery), together with the test rules, test data, test cases, and the report and visualization device, all executed automatically according to the test script developed by the tester.
FIG. 6 is a flowchart illustrating an automated test assertion method according to an embodiment of the invention. As shown in fig. 6, in specific implementation, the automated test assertion method according to the embodiment of the present invention may include:
step S101: the test worker first compiles a configuration file of the test project (for example, through a configuration file analysis module), and indicates a test environment in which the test project operates and a test scenario mapping relation required to be performed. Thus, the automatic test scripts which are different in test scenes but have approximately similar test flows can be reused.
Table 8 scene mapping example
Step S102: the test case precursor module prepares the test environment, providing the necessary preposed services for the whole test environment, such as backup and recovery of test data and preparation of basic test parameters.
Step S103: parse the rules in the reusable rule storage medium (for example, by a reusable rule parsing module) to form a mapping relation between rule names and rule contents.
Step S104: parse the test scenarios defined in the configuration file (for example, by a test case parsing module), and acquire the rules, parameters, test data, data embedding points, rule check points, and expected results referenced by each test case in each test scenario.
Steps S105 and S106: driven by the test data (for example, in a data embedding module), judge whether a data embedding point matches the injection time point defined by the test data; if it does, inject the test data.
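A data embedding point can be sketched as a named injection point against which each test datum declares its injection time; all names and payloads below are illustrative assumptions:

```python
# Hypothetical sketch of steps S105/S106: each datum names the embedding
# point at which it should be injected; when the script reaches a point,
# only the matching data are injected into the object under test.

TEST_DATA = [
    {"point": "before_trade", "payload": {"fm_trade_id": "Trade_id001"}},
    {"point": "after_settle", "payload": {"status": "settled"}},
]

def inject_at(point_name, target):
    """Inject into `target` every datum whose embedding point matches."""
    injected = 0
    for datum in TEST_DATA:
        if datum["point"] == point_name:     # point matches the injection
            target.update(datum["payload"])  # time defined by the data
            injected += 1
    return injected

system_under_test = {}
inject_at("before_trade", system_under_test)
```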
Steps S107 and S108: judge (for example, using a rule checking module) whether a rule check point matches the time point at which the test case expects the test result to be checked; if so, splice the test parameters with the test rule to obtain the concrete rule content, and execute it in the system under test to obtain an actual operation result.
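The "splicing" of a case's unique parameters into a parameterized test rule can be sketched as simple template filling; the SQL-like template and parameter names are illustrative assumptions:

```python
# Hypothetical sketch of step S107/S108 splicing: one parameterized rule
# plus one case's parameters yields the concrete rule content that is then
# executed against the system under test.

RULE_TEMPLATE = ("SELECT buy_sell_flag, notional FROM trades "
                 "WHERE trade_id = '{trade_id}' AND book = '{book}'")

def splice(template, params):
    """Fill the rule template with one test case's unique parameters."""
    return template.format(**params)

concrete_rule = splice(RULE_TEMPLATE, {"trade_id": "Trade_id001", "book": "FX01"})
```

Because the template is shared and only the parameters vary, the same rule serves many test cases.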
Step S109: compare in sequence (for example, by a batch assertion module) the expected results defined by all the rules in each case with the actual results obtained after step S108; the running of the test script is not interrupted by a verification failure, so all verification results are finally generated.
Step S110: generate an XML data file (for example, by a test report generation module) from the verification results produced by step S109, and through a front-end page generate a set of test reports that are easy for testers to read, allow quick location of the problem scenario, case, and verification rule, and are displayed visually. In addition (for example, through a test case back-drive module), assist with post-processing of the whole test environment after the automated test executes, such as recovery and release of resources and restoration of data.
The steps of the methods of the above embodiments may be combined into new technical solutions where no conflict arises, and such combinations fall within the scope of the present invention; the invention is not limited in this respect.
Based on the same inventive concept as the automated test assertion method shown in fig. 1, the embodiment of the present application further provides an automated test assertion device, as described in the following embodiments. Because the principle by which the automated test assertion device solves the problem is similar to that of the automated test assertion method, the implementation of the device can refer to the implementation of the method, and repeated description is omitted.
Fig. 7 is a schematic structural diagram of an automated test assertion device according to an embodiment of the present invention. As shown in fig. 7, an automated test assertion device may include: the test case analysis module 210, the data embedding module 220, the rule checking module 230, and the batch assertion module 240 are connected in sequence.
A test case parsing module 210 for: analyzing the test scenes in the configuration file, and acquiring the test rules, rule parameters, test data, data embedding points, rule check points and expected results of each test case in each test scene, wherein the test rules are compiled according to parameterization rules to be used for a plurality of test scenes or a plurality of test cases;
a data embedding module 220 to: judging whether the data embedding point accords with the injection time point defined by the test data or not by taking the test data as a drive, and if so, injecting the test data into the test object;
a rule checking module 230 for: judging whether the rule check points meet the test result check time points defined by the test case, if so, splicing the unique rule parameters of the test case with the test rules to obtain specific rule contents of the test case, and executing the specific rule contents in the test object injected with the test data to obtain an actual operation result;
a batch assertion module 240 configured to: compare, for each test case, the actual operation result with the expected result to generate a verification result.
Fig. 8 is a schematic structural diagram of an automated test assertion device according to another embodiment of the present invention. As shown in fig. 8, the automated test assertion device shown in fig. 7 may further include: the configuration file parsing module 250 may be connected to the test case parsing module 210.
A profile parsing module 250 configured to: and generating a configuration file, wherein the configuration file indicates the test environment of the automatic test operation and the mapping relation of the test scene to be performed.
Fig. 9 is a schematic structural diagram of an automated test assertion device according to yet another embodiment of the present invention. As shown in fig. 9, the automated test assertion device shown in fig. 7 may further include: the reusable rule parsing module 260 may be connected to the test case parsing module 210.
A reusable rule parsing module 260 to: and analyzing the test rule to generate a mapping relation between the rule name and the rule content, wherein the test case can refer to the rule content through the rule name.
In an embodiment, the bulk assertion module 240 may further be configured to: and if the verification result shows that the verification fails, the running of the test script is not interrupted until all verification results are generated.
FIG. 10 is a schematic structural diagram of an automated test assertion device according to yet another embodiment of the invention. As shown in fig. 10, the automated test assertion device shown in fig. 7 may further include: the test report generation module 270 is connected to the batch assertion module 240.
A test report generation module 270 configured to: and generating an XML data file according to the verification result, and generating a test report capable of acquiring a test scene, a test case and a verification rule specified at a rule verification point through a front-end page based on the XML data file.
In an embodiment, the automated test assertion device shown in fig. 10 may further include a test case back-drive module, which may be connected to the test report generation module 270. The test case back-drive module is used for assisting with post-processing of the whole test environment after the automated test executes, such as recovery and release of resources and restoration of data.
In an embodiment, the automated test assertion device shown in fig. 7 may further include a test case precursor module, which may be connected to the test case parsing module 210. The test case precursor module is used for preparing the test environment and providing the necessary preposed services for the whole test environment, such as backup and recovery of test data and preparation of basic test parameters.
The modules in the above embodiments may be combined arbitrarily to form a new scheme without conflict, and the present invention is not limited to this.
FIG. 11 is a diagram of a test script driver according to an embodiment of the invention. As shown in fig. 11, in an implementation, the test script driver may include: a configuration file parsing module 250, a test case precursor module 280, a reusable rule parsing module 260, a test case parsing module 210, a data embedding module 220, a rule checking module 230, a batch assertion module 240, a test report generating module 270, and a test case back-driving module 290. The modules may be connected in sequence. The test script driving device 4 is used for realizing the automatic scheduling of the whole test script and adding corresponding test data to the test device in cooperation with different test cases.
Fig. 12 is a schematic structural diagram of an automated test assertion device according to another embodiment of the present invention. As shown in fig. 12, an automated test assertion device may include: a reusable rule storage device 1, a test data storage device 2, a test case storage device 3, a test script driving device 4, and a report and visualization device 5, connected as shown in fig. 12. The reusable rule storage device 1, the test data storage device 2, the test case storage device 3, and the report and visualization device 5 can be used for storing test rule data, test data, test case data, and test report data, respectively. The test data storage device 2 and the test case storage device 3 allow testers to add and modify test case data and share the test data with one another. The driving device 4 realizes automatic scheduling of the whole test script, adds the corresponding test data to the testing device in cooperation with different test cases, verifies the test results, and finally submits them to the report and visualization device to feed the results back visually to the tester for inspection. Through the test driving device 4, assertions during testing can be processed in batches.
The invention also provides a computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to the embodiments described above.
The present invention further provides a computer device, as shown in fig. 13, the computer device 300 includes a memory 310, a processor 320, and a computer program stored on the memory 310 and executable on the processor 320, and when the processor 320 executes the computer program, the steps of the method according to the above embodiments are implemented.
In summary, the automated test assertion method, apparatus, storage medium, and device of the embodiments of the present invention have the following advantages:
(1) Writing test rules parametrically allows them to be applied to multiple application scenarios or multiple test cases without repeatedly writing similar rules, achieving test rule reuse and improving the reuse rate of test scripts;
(2) for test scenarios with similar business processes, the general rule-writing mode and the data-driven mode can greatly improve the reusability of test scripts and test rules;
(3) even when a test scenario is singular and special, testers can still reduce the time spent writing test assertions by referencing the rule base, improving the efficiency of automated testing;
(4) the test data and assertion rules used in a test script are loaded dynamically according to the script's data and rule embedding points, decoupling data, cases, and rules;
(5) the test script defines the injection points for data and assertions, the test case determines what assertion is performed at a given time, and the rule describes how the assertion is implemented; when an assertion needs adjusting, the tester does not modify the test script and only needs to add or delete check rules in the test case storage device, improving the maintainability of test scripts;
(6) for cases with different test flows but similar assertion rules, the reusable assertion rule storage device improves the efficiency of writing test scripts;
(7) for cases with similar test flows but different assertion rules, the test script provides embedding points for data and rules, improving the reusability of test scripts;
(8) through the test driving device, assertions during testing can be processed in batches, and assertion results can be fed back visually to testers through the report and visualization device, improving test efficiency and saving time in the software development process.
In the description herein, reference to the description of the terms "one embodiment," "a particular embodiment," "some embodiments," "for example," "an example," "a particular example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. The sequence of steps involved in the various embodiments is provided to schematically illustrate the practice of the invention, and the sequence of steps is not limited and can be suitably adjusted as desired.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above-mentioned embodiments are intended to illustrate the objects, technical solutions and advantages of the present invention in further detail, and it should be understood that the above-mentioned embodiments are only exemplary embodiments of the present invention, and are not intended to limit the scope of the present invention, and any modifications, equivalent substitutions, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (12)

1. An automated test assertion method, comprising:
analyzing the test scenes in the configuration file, and acquiring the test rules, rule parameters, test data, data embedding points, rule check points and expected results of each test case in each test scene, wherein the test rules are compiled according to parameterization rules to be used for a plurality of test scenes or a plurality of test cases;
judging whether the data embedding point accords with the injection time point defined by the test data or not by taking the test data as a drive, and if so, injecting the test data into the test object; the data embedding point identifies the moment of injecting test data when the test script is executed;
judging whether the rule check points meet the test result check time points defined by the test case, if so, splicing the unique rule parameters of the test case with the test rules to obtain specific rule contents of the test case, and executing the specific rule contents in the test object injected with the test data to obtain an actual operation result; the rule check point marks the time when the rule needing to be checked is configured during the execution of the test script;
and comparing, for each test case, the actual operation result with the expected result to generate a verification result.
2. The automated test assertion method of claim 1 further comprising:
and generating a configuration file, wherein the configuration file indicates the test environment of the automatic test operation and the mapping relation of the test scene to be performed.
3. The automated test assertion method of claim 1 further comprising:
and analyzing the test rule to generate a mapping relation between the rule name and the rule content, wherein the test case can refer to the rule content through the rule name.
4. The automated test assertion method of claim 1 further comprising:
and if the verification result shows that the verification fails, the running of the test script is not interrupted until all verification results are generated.
5. The automated test assertion method of claim 1 further comprising:
and generating an XML data file according to the verification result, and generating a test report capable of acquiring a test scene, a test case and a verification rule specified at a rule verification point through a front-end page based on the XML data file.
6. An automated test assertion device, comprising:
a test case parsing module for: analyzing the test scenes in the configuration file, and acquiring the test rules, rule parameters, test data, data embedding points, rule check points and expected results of each test case in each test scene, wherein the test rules are compiled according to parameterization rules to be used for a plurality of test scenes or a plurality of test cases;
a data embedding module to: judging whether the data embedding point accords with the injection time point defined by the test data or not by taking the test data as a drive, and if so, injecting the test data into the test object; wherein the data embedding point identifies a time at which test data is injected when the test script is executed;
a rule checking module to: judging whether the rule check points meet the test result check time points defined by the test case, if so, splicing the unique rule parameters of the test case with the test rules to obtain specific rule contents of the test case, and executing the specific rule contents in the test object injected with the test data to obtain an actual operation result; the rule check point marks the time when the rule needing to be checked is configured during the execution of the test script;
a batch assertion module to: compare, for each test case, the actual operation result with the expected result to generate a verification result.
7. The automated test assertion device of claim 6, further comprising:
a configuration file parsing module for: and generating a configuration file, wherein the configuration file indicates the test environment of the automatic test operation and the mapping relation of the test scene to be performed.
8. The automated test assertion device of claim 6, further comprising:
a reusable rule parsing module to: and analyzing the test rule to generate a mapping relation between the rule name and the rule content, wherein the test case can refer to the rule content through the rule name.
9. The automated test assertion device of claim 6, wherein the batch assertion module is further to:
and if the verification result shows that the verification fails, the running of the test script is not interrupted until all verification results are generated.
10. The automated test assertion device of claim 6, further comprising:
a test report generation module to: and generating an XML data file according to the verification result, and generating a test report capable of acquiring a test scene, a test case and a verification rule specified at a rule verification point through a front-end page based on the XML data file.
11. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 5.
12. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the steps of the method of any one of claims 1 to 5 are implemented when the processor executes the program.
CN201810309343.4A 2018-04-09 2018-04-09 Automatic test assertion method, device, storage medium and equipment Active CN108614770B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810309343.4A CN108614770B (en) 2018-04-09 2018-04-09 Automatic test assertion method, device, storage medium and equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810309343.4A CN108614770B (en) 2018-04-09 2018-04-09 Automatic test assertion method, device, storage medium and equipment

Publications (2)

Publication Number Publication Date
CN108614770A CN108614770A (en) 2018-10-02
CN108614770B true CN108614770B (en) 2021-08-27

Family

ID=63659590

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810309343.4A Active CN108614770B (en) 2018-04-09 2018-04-09 Automatic test assertion method, device, storage medium and equipment

Country Status (1)

Country Link
CN (1) CN108614770B (en)

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109542764B (en) * 2018-10-17 2023-08-18 平安健康保险股份有限公司 Webpage automatic testing method and device, computer equipment and storage medium
CN109582563B (en) * 2018-10-26 2024-04-05 平安科技(深圳)有限公司 Test method, device, computer equipment and storage medium for test cases
CN111124870A (en) * 2018-10-31 2020-05-08 北京国双科技有限公司 Interface testing method and device
CN109522225B (en) * 2018-11-09 2022-06-07 网宿科技股份有限公司 Automatic test assertion method and device, test platform and storage medium
CN109828904A (en) * 2018-12-14 2019-05-31 深圳壹账通智能科技有限公司 System Authentication method, device, electronic equipment and storage medium
CN109960660A (en) * 2019-04-12 2019-07-02 广东电网有限责任公司信息中心 A kind of electrical network business networking security evaluation method based on Ansible
CN113075537B (en) * 2019-07-01 2022-10-11 成都奥卡思微电科技有限公司 Test method, storage medium and terminal for verifying and asserting null-flood strength in iterative mode
CN110502508A (en) * 2019-08-23 2019-11-26 行吟信息科技(上海)有限公司 A kind of method of calibration and system based on dynamic state of parameters
CN110781088A (en) * 2019-10-30 2020-02-11 口碑(上海)信息技术有限公司 Software system testing method and device
CN110968514A (en) * 2019-12-02 2020-04-07 北京明略软件系统有限公司 Test method, test device, electronic equipment and storage medium
CN111858355B (en) * 2020-07-23 2023-05-26 建信金融科技有限责任公司 Test case processing method and device, computer equipment and readable storage medium
CN111930617B (en) * 2020-07-31 2023-08-25 中国工商银行股份有限公司 Automatic test method and device based on data objectification
CN112783793B (en) * 2021-02-09 2024-02-02 中国工商银行股份有限公司 Automatic interface test system and method
CN112860584A (en) * 2021-03-31 2021-05-28 中国工商银行股份有限公司 Test method and device based on workflow model
CN113110997A (en) * 2021-04-23 2021-07-13 中国工商银行股份有限公司 Test method, device and equipment
CN112988601B (en) * 2021-04-29 2024-03-05 中国工商银行股份有限公司 Test script development method and device
CN113360369A (en) * 2021-04-30 2021-09-07 江苏康众汽配有限公司 Automatic testing method and system based on MQ message
CN113282513B (en) * 2021-06-28 2022-11-29 平安消费金融有限公司 Interface test case generation method and device, computer equipment and storage medium
CN114741320B (en) * 2022-05-17 2022-11-15 杭州优诗科技有限公司 Unit test case management method and device and readable storage medium
CN115687162B (en) * 2023-01-03 2023-03-21 北京集度科技有限公司 Software testing device and method and electronic device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102122265A (en) * 2011-03-03 2011-07-13 中国工商银行股份有限公司 System and method for verifying computer software test results
CN102841841A (en) * 2011-06-20 2012-12-26 阿里巴巴集团控股有限公司 Method and system for processing assertion in test
WO2014052655A3 (en) * 2012-09-28 2014-06-05 Coverity, Inc. Policy evaluation based upon code change history
CN104899149A (en) * 2015-06-29 2015-09-09 上海瀚银信息技术有限公司 Automatic testing management method
CN105512036A (en) * 2015-12-12 2016-04-20 天津南大通用数据技术股份有限公司 Test template for automatically generating test case according to preset rules and test method
CN106919509A (en) * 2017-03-09 2017-07-04 腾讯科技(深圳)有限公司 A kind of client generation method, device and electronic equipment
CN107861879A (en) * 2017-11-28 2018-03-30 成都视达科信息技术有限公司 A kind of method and system for realizing software automated testing

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10112695A1 (en) * 2001-03-16 2002-09-19 Philips Corp Intellectual Pty TDMA communication system e.g. for motor vehicle, includes one node for checking system start
US7694181B2 (en) * 2005-12-12 2010-04-06 Archivas, Inc. Automated software testing framework
CN102222036B (en) * 2010-04-14 2014-11-12 阿里巴巴集团控股有限公司 Automatic testing method and equipment

Non-Patent Citations (1)

Title
8 tips for squashing bugs using ASSERT in C; Jacob Beningo; https://www.edn.com/8-tips-for-squashing-bugs-using-assert-in-c/; 2015-08-20; full text *

Also Published As

Publication number Publication date
CN108614770A (en) 2018-10-02

Similar Documents

Publication Publication Date Title
CN108614770B (en) Automatic test assertion method, device, storage medium and equipment
CN107273286B (en) Scene automatic test platform and method for task application
US10025696B2 (en) System and method for equivalence class analysis-based automated requirements-based test case generation
CN110287052B (en) Root cause task determination method and device for abnormal task
CN107665171B (en) Automatic regression testing method and device
CN110659018B (en) Method and device for realizing flow engine
US20100121668A1 (en) Automated compliance checking for process instance migration
CN114281694A (en) ETL framework-based data warehouse operation scheduling method, system and computer readable medium
CN113254350A (en) Flink operation testing method, device, equipment and storage medium
US8819645B2 (en) Application analysis device
CN114741300A (en) Test case based test method and device
CN114035783A (en) Software code knowledge graph construction method and tool
Krichen et al. Towards a runtime standard-based testing framework for dynamic distributed information systems
CN113126998A (en) Incremental source code acquisition method and device, electronic equipment and storage medium
Al-Khanjari Proposing a systematic approach to verify software requirements
CN111209183A (en) UI function traversal test method and device
CN116795378B (en) Method and device for arranging and executing process based on code dynamic compiling
CN109800155B (en) Method and device for testing QTE interlocking application software based on Probe
CN115658551B (en) Code testing method, storage medium, electronic device and apparatus
Godboley et al. Dy-COPECA: A Dynamic Version of MC/DC Analyzer for C Program.
CN111651365B (en) Automatic interface testing method and device
Deuter et al. Measuring the software size of sliced V-model projects
Beckmann et al. Information extraction from high-level activity diagrams to support development tasks
CN107341150B (en) Automatic publishing method and device
CN112035364A (en) Function test result evaluation method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant