CN116010274A - Data testing method and system - Google Patents

Data testing method and system

Info

Publication number
CN116010274A
Authority
CN
China
Prior art keywords
test
unit
result
data file
test case
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310035027.3A
Other languages
Chinese (zh)
Inventor
刘泽
蒋晓莲
杨程
鞠春生
刘冰
曹旭
由天宇
张鑫淼
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Travelsky Technology Co Ltd
Original Assignee
China Travelsky Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Travelsky Technology Co Ltd filed Critical China Travelsky Technology Co Ltd
Priority to CN202310035027.3A priority Critical patent/CN116010274A/en
Publication of CN116010274A publication Critical patent/CN116010274A/en
Pending legal-status Critical Current

Classifications

    • Y — GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 — TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D — CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00 — Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Debugging And Monitoring (AREA)

Abstract

The application discloses a data testing method and system. Test support data conforming to a preset test scene is acquired and stored in a database; a test case data file is acquired; unit test code is generated; the test support file and the test case data file in the database are loaded to execute the generated unit test code and obtain an execution result; the execution result is asserted against the expected result in the test case data file to obtain an assertion result; and if the assertion result indicates that the execution result is consistent with the expected result in the test case data file, it is determined that the test case has executed successfully. With this scheme, because all test code is generated by a customized code generator, the time spent writing test case code is reduced, development cost and difficulty are lowered, and a developer only needs to write the test case data file required by the preset test scene, which relieves the pressure of writing unit test code and reduces the developer's workload.

Description

Data testing method and system
Technical Field
The present disclosure relates to the field of electronic information technologies, and in particular, to a data testing method and system.
Background
With the development of civil aviation services, the fare publishing business is gradually migrating from the original mainframe to a cloud-native architecture, and as the business moves to the cloud, the cloud-native application needs to be tested using unit testing techniques.
However, when testing a cloud-native application with conventional unit testing techniques, the unit tests have to be designed to cope with complex business scenarios and the work relies on many third-party unit testing frameworks, so the cost and difficulty of testing are high and the workload of developers is increased.
Therefore, how to reduce development cost, development difficulty and developer workload when testing a cloud-native application with unit testing techniques is a problem to be solved by the present application.
Disclosure of Invention
In view of this, the application discloses a data testing method and system, which aim to achieve the purposes of reducing development cost and development difficulty and reducing workload of developers.
In order to achieve the above purpose, the technical solution disclosed by the present application is as follows:
the first aspect of the application discloses a data testing method, which comprises the following steps:
acquiring test support data conforming to a preset test scene, and storing the test support data into a database; the preset test scene indicates that the basic data required to support the test scene exists in the database; the test support data is a basic data file used to assist the execution of the test case data file;
acquiring a test case data file; the test case data file represents data files of input and expected results in the preset test scene;
generating a unit test code;
loading a test supporting file and the test case data file in the database to execute the unit test code to obtain an execution result;
asserting the execution result and an expected result in the test case data file to obtain an asserted result;
and if the assertion result characterizes that the execution result is consistent with the expected result in the test case data file, determining that the test case is successfully executed.
Preferably, the construction process of the preset test scene includes:
determining test boundaries and parameter definition rules; the test boundary characterizes a method entry point for performing the unit test; the unit test is a test for checking and verifying the smallest testable unit in the software; the parameter definition rule is a definition rule for parameter construction when a test case of a preset test scene is constructed;
and constructing a preset test scene through the test boundary and the parameter definition rule.
Preferably, the asserting the execution result and the expected result in the test case data file to obtain an asserted result includes:
comparing the execution result with an expected result in a test case data file;
if the execution result is consistent with the expected result in the test case data file, obtaining an assertion result which characterizes that the execution result is consistent with the expected result in the test case data file;
and if the execution result is inconsistent with the expected result in the test case data file, obtaining an assertion result which characterizes that the execution result is inconsistent with the expected result in the test case data file.
Preferably, the generating unit test code includes:
acquiring a test class, a test method and test parameters configured in a pre-customized generator configuration file;
generating unit test codes according to the test classes, the test methods and the test parameters configured in the generator configuration file.
Preferably, the method further comprises:
judging whether a return value exists in the test method;
if the type of the test method is not the void type, determining that a return value exists in the test method;
if the type of the test method is the void type, determining that no return value exists in the test method, and configuring the test method in a preset configuration mode; the preset configuration mode represents a configuration mode for defining a function with an empty (void) return value.
A second aspect of the present application discloses a data testing system, the system comprising:
the first acquisition unit is used for acquiring test support data conforming to a preset test scene and storing the test support data into a database; the preset test scene indicates that the basic data required to support the test scene exists in the database; the test support data is a basic data file used to assist the execution of the test case data file;
the second acquisition unit is used for acquiring the test case data file; the test case data file represents data files of input and expected results in the preset test scene;
the first generating unit is used for generating a unit test code;
the loading execution unit is used for loading the test supporting file and the test case data file in the database to execute the unit test codes to obtain an execution result;
the assertion unit is used for asserting the execution result and the expected result in the test case data file to obtain an assertion result;
and the determining unit is used for determining that the test case is successfully executed if the assertion result represents that the execution result is consistent with the expected result in the test case data file.
Preferably, in the first obtaining unit, the construction process of the preset test scene includes:
the determining module is used for determining a test boundary and a parameter definition rule; the test boundary characterizes a method entry point for performing the unit test; the unit test is a test for checking and verifying the smallest testable unit in the software; the parameter definition rule is a definition rule for parameter construction when a test case of a preset test scene is constructed;
and the construction module is used for constructing a preset test scene through the test boundary and the parameter definition rule.
Preferably, the asserting unit includes:
the comparison module is used for comparing the execution result with an expected result in the test case data file;
the first acquisition module is used for obtaining an assertion result representing that the execution result is consistent with the expected result in the test case data file if the execution result is consistent with the expected result in the test case data file;
and the second acquisition module is used for obtaining an assertion result which characterizes that the execution result is inconsistent with the expected result in the test case data file if the execution result is inconsistent with the expected result in the test case data file.
Preferably, the first generating unit includes:
the third acquisition module is used for acquiring the test class, the test method and the test parameter configured in the pre-customized generator configuration file if the assertion result represents that the execution result is consistent with the expected result in the test case data file;
and the generating module is used for generating unit test codes according to the test classes, the test methods and the test parameters configured in the generator configuration file and executing the unit test codes.
Preferably, the method further comprises:
the judging unit is used for judging whether a return value exists in the test method;
the first determining unit is used for determining that a return value exists in the test method if the type of the test method is not the void type;
the second determining unit is used for determining that no return value exists in the test method if the type of the test method is the void type, and configuring the test method in a preset configuration mode; the preset configuration mode represents a configuration mode for defining a function with an empty (void) return value.
According to the technical scheme, a data testing method and system are disclosed. Test support data conforming to a preset test scene is acquired and stored in a database, where the preset test scene indicates that the basic data required to support the test scene exists in the database and the test support data is a basic data file used to assist the execution of the test case data file; a test case data file, representing the inputs and expected results under the preset test scene, is acquired; unit test code is generated; the test support file and the test case data file in the database are loaded to execute the unit test code and obtain an execution result; the execution result is asserted against the expected result in the test case data file to obtain an assertion result; and if the assertion result indicates that the execution result is consistent with the expected result in the test case data file, it is determined that the test case has executed successfully. With this scheme, because all test code is generated by the customized code generator, the time spent writing test case code is reduced, development cost and difficulty are lowered, and a developer only needs to write the test case data file required by the preset test scene, which relieves the pressure of writing unit test code and reduces the developer's workload.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings that are required to be used in the embodiments or the description of the prior art will be briefly described below, and it is obvious that the drawings in the following description are only embodiments of the present application, and that other drawings may be obtained according to the provided drawings without inventive effort to a person skilled in the art.
Fig. 1 is a schematic flow chart of a data testing method disclosed in an embodiment of the present application;
FIG. 2 is a schematic diagram of a data testing system according to an embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present disclosure.
In this application, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
As can be seen from the background art, when testing a cloud-native application with conventional unit testing techniques, the unit tests have to be designed to cope with complex business scenarios and the work relies on many third-party unit testing frameworks, so the cost and difficulty of testing are high and the workload of developers is increased. Therefore, how to reduce development cost, development difficulty and developer workload when testing a cloud-native application with unit testing techniques is a problem to be solved by the present application.
In order to solve the above problems, the application discloses a data testing method and system. Because all test code is generated by a customized code generator, the time spent writing test case code is reduced, development cost and difficulty are lowered, and a developer only needs to write the test case data file required by the preset test scene, which relieves the pressure of writing unit test code and reduces the developer's workload. Specific implementations are illustrated by the following embodiments.
The embodiment of the application is based on the domain-driven design adopted in the fare publishing system: the data access layer, the data processing layer and the exception capturing layer of the fare publishing system are analyzed, and the boundary of the data-driven unit test is finally determined to be the data processing service method of each interface. Unit tests are performed for the data access, the processing and the exceptions of the data processing service methods. The data-driven unit test framework mainly uses JUnit5, DbUnit, JsonAssert, ExceptionAssert, H2 and TestCodeGenerator. The goal is to focus the testing effort on the construction of test case data and the preparation of test support data.
The scheme realizes data testing through a lightweight, low-coupling and easy-to-integrate data-driven unit testing framework. The framework is universally adaptable: any program developed in the Java language can integrate and use it quickly. Its main functions are as follows:
(1) The parameterized test function of JUnit 5 is used, so one test method can be executed multiple times with different parameters, covering different conditional branches and saving the time of repeatedly writing test code. This embodies the Fast principle (tests run quickly, so a developer can run them for every small change without depending on complex environments).
(2) DbUnit is used for data initialization when the tests run, achieving data isolation between different test classes and avoiding a great deal of effort spent restoring the test fixtures. This embodies the Isolated principle (different test cases are isolated; one test does not depend on another, and the data of different tests is isolated).
(3) The customized JsonAssert satisfies the result assertions of most test scenes, embodying the Self-validating principle (test results are confirmed automatically and output uniformly without manual confirmation).
(4) The customized ExceptionAssert satisfies the assertion of the return code in all exceptional situations, also embodying the Self-validating principle (test results are confirmed automatically and output uniformly without manual confirmation).
(5) The H2 database decouples the tests from the real database environment, improving test efficiency and embodying the Repeatable principle (the test program can run in different environments, such as the development environment, the Sonar scanning environment, the pipeline environment and the cloud-native environment).
(6) The customized TestCodeGenerator generates the code for most test scenes, greatly reducing the time spent writing test case code and concentrating the effort on writing test scene data. This embodies the Timely principle (a developer can finish writing the unit tests in a short time after code development is finished).
Referring to fig. 1, a flow chart of a data testing method disclosed in an embodiment of the present application is shown, where the data testing method mainly includes the following steps:
s101: acquiring test support data conforming to a preset test scene, and storing the test support data into a database; the method comprises the steps that basic data required by supporting a test scene exist in a preset test scene representation database; the test support data is used for assisting in the basic data file of the test case data file execution.
The test support data is a basic data file for assisting the execution of test case data files described below.
For example, to execute a test case for the scene in which an add operation is performed while data with the same primary key already exists in the database, the data with the same primary key needs to be prepared in advance to verify the scene; this is a preset test scene.
And loading test support data of a preset test scene through the DbUnit component. The code for specifically loading the test support data of the preset test scene is as follows:
@DataSet(locations=
{"/com/travelsky/dff/producer/business/internationalfare/ifimtsubch/IfimTSubchRepresentationService.xml",
"/com/travelsky/dff/producer/business/internationalfare/ifimtsubch/IfimTSubchRepresentationServiceGb1.xml"},
dataSourceSpringName="dataSource", dbType=DBType.H2,
schema="FCIMS;NFS_GBL", columnSensing=true, setUpOperation=DBOperation.INSERT)
public class IfimTSubchRepresentationServiceTest extends AbstractTest{
}
wherein @DataSet: initializes all the data required for the test; this data is owned independently by the test class and does not affect the data of other test classes. Initializing all the data needed for the test means that the data of each test scene is independent, and the data that supports a test in a given scene must be loaded before the test so that it can be used. For example, to test an update, the data to be updated needs to be prepared in advance so that the test can be performed.
locations: multiple data initialization files are supported; the initialization data belonging to the same database schema must be placed in one data file, and multiple files are separated by commas.
schema: initialization of multiple database schemas is supported; the order of the schemas must always match the schemas to which the data of the locations files belongs, and multiple schemas are separated by semicolons.
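For illustration only, a minimal sketch of how a custom annotation with the attributes used above might be declared is given below. The actual @DataSet annotation belongs to the applicant's in-house test framework and its declaration is not part of this description, so the enum types DBType and DBOperation shown here are assumptions:
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.TYPE)
@interface DataSet {
    String[] locations();                                     // XML data files, one per database schema
    String dataSourceSpringName() default "dataSource";       // Spring bean name of the data source
    DBType dbType() default DBType.H2;                        // target database type
    String schema() default "";                               // schemas, separated by semicolons
    boolean columnSensing() default true;                     // let DbUnit detect columns row by row
    DBOperation setUpOperation() default DBOperation.INSERT;  // DbUnit operation run before each test
}

enum DBType { H2, ORACLE, MYSQL }          // hypothetical enum of supported database types
enum DBOperation { INSERT, CLEAN_INSERT }  // hypothetical enum mapping to DbUnit DatabaseOperation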
The test support data format definition rules are as follows:
(1) The test data file is in xml format.
(2) The data in each file is the corresponding data in one schema in the database.
(3) Each row of data is a row of records in the database table.
(4) Each element is a table name, e.g. a table named CARRIER, and each attribute is a field of that table, e.g. CARRIER_CODE, CARRIER_NUMERIC_CODE, CARRIER_NAME, CREATE_TIME, etc. Reference may be made to the following code:
<?xml version="1.0" encoding="UTF-8"?>
<dataset>
<CARRIER CARRIER_CODE="CA" CARRIER_NUMERIC_CODE="999" CARRIER_NAME="AIR CHINA" CREATE_TIME="2021-07-07 18:21:39.0"/>
<CARRIER CARRIER_CODE="MU" CARRIER_NAME="PALAIR MACEDONIAN AIRLINES A.D." CREATE_TIME="2021-07-07 18:21:39.0"/>
<CARRIER CARRIER_CODE="CZ" CARRIER_NAME="TRANSAERO" CREATE_TIME="2021-07-07 18:21:39.0"/>
</dataset>
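As an illustration of how the framework could turn such a flat XML file into database rows before a test, the following is a minimal sketch using the public DbUnit API; the class name DataSetLoader and the way the JDBC connection is obtained are assumptions, not part of this description:
import java.io.InputStream;
import java.sql.Connection;
import org.dbunit.database.DatabaseConnection;
import org.dbunit.database.IDatabaseConnection;
import org.dbunit.dataset.IDataSet;
import org.dbunit.dataset.xml.FlatXmlDataSetBuilder;
import org.dbunit.operation.DatabaseOperation;

public class DataSetLoader {
    // Reads a flat XML data file (one element per table row, attributes as columns)
    // and inserts its rows into the given schema before the test runs.
    public static void load(Connection jdbcConnection, String schema, String location) throws Exception {
        IDatabaseConnection connection = new DatabaseConnection(jdbcConnection, schema);
        try (InputStream in = DataSetLoader.class.getResourceAsStream(location)) {
            IDataSet dataSet = new FlatXmlDataSetBuilder()
                    .setColumnSensing(true)   // detect columns per row, as with columnSensing=true above
                    .build(in);
            DatabaseOperation.INSERT.execute(connection, dataSet);
        }
    }
}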
and storing the data of the test support.
The storage mode is to store the integrated memory database H2. All tables in the real database are fully built and each initializes the test data for use by different test methods. The initialized data is recovered after being used up without deletion and modification. Dml and ddl are defined in, for example, a freight rate distribution system to initialize table structures and data. The specific definition positions are as follows:
dff-producer\src\test\resources\ddl\init.sql: the file that initializes the table structures;
dff-producer\src\test\resources\dml\xxx.sql: the files that initialize the table data.
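A minimal sketch of how such an embedded H2 data source could be assembled in a Spring test context is given below; it assumes Spring's EmbeddedDatabaseBuilder, and the configuration class name and the exact classpath locations of the scripts are illustrative assumptions based on the paths above:
import javax.sql.DataSource;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseBuilder;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseType;

@Configuration
public class TestDataSourceConfig {
    // Builds an in-memory H2 database and replays the DDL/DML init scripts,
    // decoupling the unit tests from the real database environment.
    @Bean(name = "dataSource")
    public DataSource dataSource() {
        return new EmbeddedDatabaseBuilder()
                .setType(EmbeddedDatabaseType.H2)
                .addScript("classpath:ddl/init.sql")  // table structures
                .addScript("classpath:dml/xxx.sql")   // table data (xxx is a placeholder)
                .build();
    }
}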
The construction process of the preset test scene is shown as A1-A2.
A1: determining the test boundary and the parameter definition rules; the test boundary represents the method entry point at which the unit test is performed; the unit test is a test that checks and verifies the smallest testable unit in the software; the parameter definition rule is the rule for constructing parameters when a test case of the preset test scene is constructed.
Determining the test boundary means finding the boundary of the software to which the data-driven test method is applied, that is, the method entry points at which the unit tests need to be performed. For example, the fare publishing system adopts a domain-driven design, all entry points of external requests are packaged uniformly, and every external request interface corresponds to its own processing service method, so the test boundary of the fare publishing system is the service method of each interface.
The parameter definition rules of the parameterized test method are determined so that parameters can be constructed when the test scene use cases are designed. For example, the entry parameter of the method to be tested in the fare publishing system is a Command object whose class name ends with Command, and external requests transmit parameters in JSON format, so the test case data is defined in JSON format, consistent with the structure of the object ending with Command. The test case data file is stored as a CSV file so that it can be conveniently loaded by the JUnit 5 parameterized tests.
Definition rules of the test case data file are as follows (1) - (3):
(1) Each row of the file is the input and expected value corresponding to one test case scene, and @#$% is used as the separator between the expected value and the input data.
(2) The expected value comes before the separator and the input parameter after it; a row must not contain line breaks.
(3) The input parameter is the JSON string of the entry parameter of the method.
Code examples of definition rules for a particular test case data file are as follows:
015-18-01-31065 the batch contains a fare whose effective date is earlier than the current date! The telegram is rejected!@#$%{"sbList":[{"rptFrom":"CA","rptFileNo":"015-18-01-31065"}]}
015-18-01-31066 the fare was filed within 7 days (inclusive)! The filing is rejected!@#$%{"sbList":[{"rptFrom":"CA","rptFileNo":"015-18-01-31066"}]}
A2: and constructing a preset test scene through the test boundary and the parameter definition rule.
S102: acquiring a test case data file; the test case data file characterizes data files of input and expected results under a preset test scene.
In S102, a parameterized unit test method is defined using the test method definition mode of the test framework (JUnit 5), and the test case data file and the test support data in the database are loaded.
The test case data file refers to a data file of input and expected results under a preset test scene.
Specifically, a parameterized unit test method is defined; the code using the JUnit 5 test method definition mode is as follows:
@ParameterizedTest
@CsvFileSource(resources="/com/travelsky/dff/producer/business/internationalfare/ifimtsubch/GetIfimTSubchs.csv", delimiterString="@#$%")
public void testGetIfimTSubchs(String expected, @AggregateWith(CommandArgumentsAggregator.class) QueryIfimTSubchsCommand command){
    try{
        Object result=service.getIfimTSubchs(command);
        Assertions.assertNotNull(result);
        assertJsonEquals(expected, JSONUtil.toJsonStr(result));
    }catch(Exception e){
        assertErrorCodeEquals(expected, e);
    }
}
The parameterized test annotation, the CSV file loading annotation and the parameter aggregator annotation are used together to complete the definition of the parameterized unit test. The specific explanation is as follows:
@ParameterizedTest: the parameterized test annotation, added to any method that requires parameterized testing.
@CsvFileSource: the annotation that introduces the parameterized test case file; the file is parsed and the test method is invoked once per row.
@AggregateWith: the parameter aggregator annotation; the entry parameters of different scenes are generated through the custom parameter aggregator.
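A custom parameter aggregator such as CommandArgumentsAggregator is not shown in the description; the following is a minimal sketch, under the assumption that it takes the JSON column of the CSV row and converts it into the Command type declared by the test method using Hutool's JSONUtil:
import cn.hutool.json.JSONUtil;
import org.junit.jupiter.api.extension.ParameterContext;
import org.junit.jupiter.params.aggregator.ArgumentsAccessor;
import org.junit.jupiter.params.aggregator.ArgumentsAggregationException;
import org.junit.jupiter.params.aggregator.ArgumentsAggregator;

public class CommandArgumentsAggregator implements ArgumentsAggregator {
    // Column 0 of each CSV row is the expected value, column 1 is the JSON entry parameter;
    // the JSON string is converted into the Command type declared by the test method.
    @Override
    public Object aggregateArguments(ArgumentsAccessor accessor, ParameterContext context)
            throws ArgumentsAggregationException {
        String json = accessor.getString(1);
        return JSONUtil.toBean(json, context.getParameter().getType());
    }
}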
S103: the generation unit tests the code.
In S103, the test class, test method, test parameters, return value and so on to be generated, as configured in the pre-customized generator configuration file, are acquired, and the unit test code is generated according to this configuration.
A unit test is itself code, and the correctness of the program is verified by that code; the unit test code can therefore be generated by a customized code generator and then executed. The execution process is independent of the customized code generator: the generator is an auxiliary function that saves a great deal of time otherwise spent writing unit test code.
It is judged whether a return value exists in the test method: if the type of the test method is not the void type, it is determined that a return value exists in the test method; if the type of the test method is the void type, it is determined that no return value exists in the test method, and the test method is configured in a preset configuration mode; the preset configuration mode represents a configuration mode for defining a function with an empty (void) return value.
Here, the void type means that the method has no return type.
The preset configuration mode defines a function fun with an empty return value in the form void fun(XxxCommand).
With the above conventions in place, test code can be generated automatically according to them. Taking the unit tests of the fare publishing system as an example, the definition of test classes, the definition of test methods, the use of assertions and so on all follow unified rules. The generator configuration designed according to the above conventions is as follows:
#Coder
global.author=liuze
#Classes that need to generate unit tests
target.class=com.travelsky.dff.producer.business.basedata.bcauthmgr.BCAuthMgrDomainService
#The method to be tested is in the format methodname(parametertype), and multiple methods need to be separated by semicolons (;)
target.methods=void add(CreateBCAuthMgrCommand);void modify(UpdateBCAuthMgrCommand);void remove(List<RemoveBCAuthMgrCommand>)
#Package path for parameters
target.methodParamsPackage=com.travelsky.dff.domain.basedata.command.bcauthmgr
#Database schemas; multiple schemas need to be separated by semicolons (;)
dataset.schema=NFS_GBL;FCIMS;NFS_CA
wherein global.author: the generated test classes and test methods also receive class and method comments, so the author of the comments is a required item.
target.class: the service class for which test methods are to be generated; it must be the full path including the package name.
target.methods: the methods in the target class; only methods whose parameter is a Command or List<Command> are supported. If a method is of void type, void must be filled in, because the generator assumes a return value by default.
target.methodParamsPackage: the package path where the Command parameters of the methods are located.
dataset.schema: the database schema(s) to which the test data belongs; multiple schemas are separated by semicolons, e.g. NFS_GBL;FCIMS;NFS_CA.
The process of executing the test code is as follows:
1) Fill in the generator configuration file.
2) Execute the main method of dff-mpg\src\main\java\com\travelsky\dff\TestCodeGenerator.
3) Check whether the files to be generated already exist under the following directories.
For example: the method to be tested is the save method in com\travelsky\dff\producer\business, and the data used comes from NFS_GBL and FCIMS.
Test class file: dff-producer\src\test\java\com\travelsky\dff\producer\business\xxx\yyy\XxxServiceTest.
Use case file: dff-producer\src\test\resources\com\travelsky\dff\producer\business\xxx\yyy\save.csv.
Data files: dff-producer\src\test\resources\com\travelsky\dff\producer\business\xxx\yyy\DataSet_FCIMS.xml;
dff-producer\src\test\resources\com\travelsky\dff\producer\business\xxx\yyy\DataSet_NFS_GBL.xml.
If the test method to be generated has no return value, the configuration must be written as void fun(XxxCommand); void must be added explicitly, because the generator assumes a return value by default, in which case the return type may be omitted.
Only methods whose parameter is a Command or List<Command> are supported; for other methods the test code must be written by hand.
If additional tests need to be generated, the original test file, use case file and data files remain unchanged, and only the new test methods are added.
This test mode can satisfy most test scenes; if a generated test cannot satisfy a scene, it can be modified by hand after generation, and such custom code will not be overwritten when the tests are regenerated later.
S104: and loading the test supporting file and the test case data file in the database to execute the unit test codes, thereby obtaining an execution result.
The execution result is asserted against the expected result in the test case data file.
An assertion is a first-order logic statement in a program (a logical judgment whose result is true or false) that expresses the result expected by the developer of the software being verified; when the program executes to the location of the assertion, the corresponding assertion should be true, and if it is not, the program stops executing and gives an error message.
S105: and asserting the execution result and the expected result in the test case data file to obtain an asserted result.
In S105, it is judged whether the execution result is identical to the expected result (expected data) of the test case, that is, whether the output produced by the program from the input data (the execution result) matches the expected result.
Assertions conforming to the actual test scenes are customized. For example, the assertions that fit the actual test scenes of the fare publishing system are the JSON assertion JsonAssert and the exception assertion ExceptionAssert.
JsonAssert (JSON assertion): it judges whether two JSON strings are equal, ignoring the order of attributes at the same level, while the order of elements in an array must be consistent. It is mainly used for comparing expected values with actual results.
ExceptionAssert (exception assertion): it supports business exceptions and checks the error code and error content carried in the exception. It is mainly used for comparing expected values with actual values in exceptional situations.
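The implementations of the customized assertions are not given in the description. As a rough illustration, assertJsonEquals could delegate to the open-source JSONAssert library and assertErrorCodeEquals could inspect a business exception, as sketched below; TestAssertions, BusinessException and getErrorCode are hypothetical names standing in for the system's own types:
import org.junit.jupiter.api.Assertions;
import org.skyscreamer.jsonassert.JSONAssert;
import org.skyscreamer.jsonassert.JSONCompareMode;

public final class TestAssertions {
    private TestAssertions() { }

    // STRICT mode ignores the order of attributes inside an object but requires that
    // array element order and the full set of fields match the expected JSON.
    public static void assertJsonEquals(String expected, String actual) throws Exception {
        JSONAssert.assertEquals(expected, actual, JSONCompareMode.STRICT);
    }

    // Compares the error code carried by a business exception with the expected value.
    public static void assertErrorCodeEquals(String expected, Exception actual) {
        Assertions.assertTrue(actual instanceof BusinessException, "unexpected exception: " + actual);
        Assertions.assertEquals(expected, ((BusinessException) actual).getErrorCode());
    }
}

// Hypothetical business exception carrying an error code, for illustration only.
class BusinessException extends RuntimeException {
    private final String errorCode;
    BusinessException(String errorCode, String message) { super(message); this.errorCode = errorCode; }
    String getErrorCode() { return errorCode; }
}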
Specifically, the process of performing the assertion judgment on the test case data and obtaining the assertion result is shown in B1-B3.
B1: and comparing the execution result with the expected result in the test case data file.
B2: and if the execution result is consistent with the expected result in the test case data file, obtaining an assertion result which indicates that the execution result is consistent with the expected result in the test case data file.
B3: and if the execution result is inconsistent with the expected result in the test case data file, obtaining an assertion result which characterizes the inconsistent execution result and the expected result in the test case data file.
After the assertion judgment has been performed on one piece of test case data and an assertion result has been obtained, it is judged whether unexecuted test case data exists; if so, the unexecuted test case data is read and the assertion judgment is performed on it; if no unexecuted test case data exists and a next case method exists, that case method is executed and the step of acquiring test support data conforming to the preset test scene is performed again.
The test case data file contains multiple rows, each of which is an independent case, so after one row, that is, one case, has been executed, the remaining cases continue to be executed.
S106: if the assertion result represents that the execution result is consistent with the expected result in the test case data file, determining that the test case is successfully executed.
In S106, it is determined that the test case is successfully executed, that is, the assertion is correct.
Applying the data-driven unit tests to the project yields the following results:
First, the branch coverage is improved. One test method is executed multiple times with different parameters, covering different conditional branches, which effectively improves the branch coverage of the unit tests without writing additional test code;
Second, the time spent writing test code is reduced. Most test code for the test scenes can be generated by the generator, greatly reducing the time spent writing test case code and concentrating the effort on writing test scene data;
Third, later refactoring is facilitated. Unit tests provide a guarantee for code refactoring: as long as all unit tests pass after the code is refactored, it largely shows that no new bugs have been introduced, provided the unit test coverage is complete and effective;
Fourth, the design is optimized. Writing unit tests makes the developer observe and think from the caller's perspective, forces the program to be designed so that it is easy to call and test, and reduces coupling in the software;
Fifth, documentation is provided. Unit tests are valuable documentation: they are the best documents showing how a function or class is used, they can be compiled and run, and they stay up to date, always synchronized with the code;
Sixth, regression protection is provided. Automated unit tests prevent code regressions; once written, the tests can be run quickly anytime and anywhere.
In the embodiment of the application, because all the test codes are generated by the customized code generator, the time for writing the test case codes is reduced, the development cost and the development difficulty are reduced, and a developer only needs to write the test case data file required by the preset test scene, so that the pressure of writing the unit test codes by the developer is released, and the workload of the developer is reduced.
Based on the data testing method disclosed in fig. 1 of the foregoing embodiment, the embodiment of the present application correspondingly discloses a data testing system, as shown in fig. 2, where the data testing system includes a first obtaining unit 201, a second obtaining unit 202, a first generating unit 203, a load executing unit 204, an asserting unit 205, and a determining unit 206.
The first obtaining unit 201 is configured to obtain test support data that conforms to a preset test scene and store the test support data into a database; the preset test scene indicates that the basic data required to support the test scene exists in the database; the test support data is a basic data file used to assist the execution of the test case data file.
A second obtaining unit 202, configured to obtain a test case data file; the test case data file characterizes the data files of the input and expected results under the preset test scene.
A first generating unit 203 for generating a unit test code.
And the loading execution unit 204 is used for loading the test supporting file and the test case data file in the database to execute the unit test codes, so as to obtain an execution result.
And the asserting unit 205 is configured to assert the execution result and the expected result in the test case data file to obtain an asserted result.
And the determining unit 206 is configured to determine that the test case is successfully executed if the assertion result indicates that the execution result is consistent with the expected result in the test case data file.
In the first obtaining unit 201, the construction process of the preset test scene involves a determining module and a construction module.
The determining module is used for determining the test boundary and the parameter definition rules; the test boundary represents the method entry point at which the unit test is performed; the unit test is a test that checks and verifies the smallest testable unit in the software; the parameter definition rule is the rule for constructing parameters when a test case of the preset test scene is constructed.
The construction module is used for constructing a preset test scene through the test boundary and the parameter definition rule.
Further, the asserting unit 205 includes a comparison module, a first acquisition module and a second acquisition module.
The comparison module is used for comparing the execution result with the expected result in the test case data file;
the first acquisition module is used for acquiring an assertion result representing that the execution result is consistent with the expected result in the test case data file if the execution result is consistent with the expected result in the test case data file;
and the second acquisition module is used for acquiring an assertion result representing that the execution result is inconsistent with the expected result in the test case data file if the execution result is inconsistent with the expected result in the test case data file.
Further, the first generating unit 203 includes a third acquisition module and a generating module.
And the third acquisition module is used for acquiring the test class, the test method and the test parameter configured in the pre-customized generator configuration file if the assertion result represents that the execution result is consistent with the expected result in the test case data file.
And the generating module is used for generating unit test codes according to the test classes, the test methods and the test parameters configured in the generator configuration file and executing the unit test codes.
Further, the data testing system further comprises a judging unit, a first determining unit and a second determining unit.
And the judging unit is used for judging whether a return value exists in the test method.
The first determining unit is used for determining that a return value exists in the test method if the type of the test method is not the void type.
The second determining unit is used for determining that no return value exists in the test method if the type of the test method is the void type, and configuring the test method in a preset configuration mode; the preset configuration mode represents a configuration mode for defining a function with an empty (void) return value.
In the embodiment of the application, because all the test codes are generated by the customized code generator, the time for writing the test case codes is reduced, the development cost and the development difficulty are reduced, and a developer only needs to write the test case data file required by the preset test scene, so that the pressure of writing the unit test codes by the developer is released, and the workload of the developer is reduced.
The embodiment of the application also provides a storage medium comprising stored instructions, wherein, when the instructions run, the device on which the storage medium resides is controlled to execute the above data testing method.
The embodiment of the present application further provides an electronic device, whose structural schematic diagram is shown in fig. 3, specifically including a memory 301, and one or more instructions 302, where the one or more instructions 302 are stored in the memory 301, and configured to be executed by the one or more processors 303 to perform the above-mentioned data testing method by executing the one or more instructions 302.
The specific implementation and derivative manner of each embodiment are all within the protection scope of the application.
In this specification, each embodiment is described in a progressive manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments. In particular, for a system or system embodiment, since it is substantially similar to a method embodiment, the description is relatively simple, with reference to the description of the method embodiment being made in part. The systems and system embodiments described above are merely illustrative, wherein the elements illustrated as separate elements may or may not be physically separate, and the elements shown as elements may or may not be physical elements, may be located in one place, or may be distributed over a plurality of network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art will understand and implement the present invention without undue burden.
Those of skill would further appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative elements and steps are described above generally in terms of functionality in order to clearly illustrate the interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing is merely a preferred embodiment of the present application and it should be noted that modifications and adaptations to those skilled in the art may be made without departing from the principles of the present application and are intended to be comprehended within the scope of the present application.

Claims (10)

1. A method of testing data, the method comprising:
acquiring test support data conforming to a preset test scene, and storing the test support data into a database; the preset test scene indicates that the basic data required to support the test scene exists in the database; the test support data is a basic data file used to assist the execution of the test case data file;
acquiring a test case data file; the test case data file represents data files of input and expected results in the preset test scene;
generating a unit test code;
loading a test supporting file and the test case data file in the database to execute the unit test code to obtain an execution result;
asserting the execution result and an expected result in the test case data file to obtain an asserted result;
and if the assertion result characterizes that the execution result is consistent with the expected result in the test case data file, determining that the test case is successfully executed.
2. The method according to claim 1, wherein the construction process of the preset test scene comprises:
determining test boundaries and parameter definition rules; the test boundary characterizes a method entry point for performing the unit test; the unit test is a test for checking and verifying the smallest testable unit in the software; the parameter definition rule is a definition rule for parameter construction when a test case of a preset test scene is constructed;
and constructing a preset test scene through the test boundary and the parameter definition rule.
3. The method of claim 1, wherein asserting the execution result with the desired result in the test case data file to obtain an asserted result comprises:
comparing the execution result with an expected result in a test case data file;
if the execution result is consistent with the expected result in the test case data file, obtaining an assertion result which characterizes that the execution result is consistent with the expected result in the test case data file;
and if the execution result is inconsistent with the expected result in the test case data file, obtaining an assertion result which characterizes that the execution result is inconsistent with the expected result in the test case data file.
4. The method of claim 1, wherein the generating unit test code comprises:
acquiring a test class, a test method and test parameters configured in a pre-customized generator configuration file;
generating unit test codes according to the test classes, the test methods and the test parameters configured in the generator configuration file.
5. The method as recited in claim 4, further comprising:
judging whether a return value exists in the test method;
if the type of the test method is not the void type, determining that a return value exists in the test method;
if the type of the test method is the void type, determining that no return value exists in the test method, and configuring the test method in a preset configuration mode; the preset configuration mode represents a configuration mode for defining a function with an empty (void) return value.
6. A data testing system, the system comprising:
the first acquisition unit is used for acquiring test support data conforming to a preset test scene and storing the test support data into a database; the preset test scene indicates that the basic data required to support the test scene exists in the database; the test support data is a basic data file used to assist the execution of the test case data file;
the second acquisition unit is used for acquiring the test case data file; the test case data file represents data files of input and expected results in the preset test scene;
the first generating unit is used for generating a unit test code;
the loading execution unit is used for loading the test supporting file and the test case data file in the database to execute the unit test codes to obtain an execution result;
the assertion unit is used for asserting the execution result and the expected result in the test case data file to obtain an assertion result;
and the determining unit is used for determining that the test case is successfully executed if the assertion result represents that the execution result is consistent with the expected result in the test case data file.
7. The system according to claim 6, wherein, in the first obtaining unit, the construction process of the preset test scene includes:
the determining module is used for determining a test boundary and a parameter definition rule; the test boundary characterizes a method entry point for performing the unit test; the unit test is a test for checking and verifying the smallest testable unit in the software; the parameter definition rule is a definition rule for parameter construction when a test case of a preset test scene is constructed;
and the construction module is used for constructing a preset test scene through the test boundary and the parameter definition rule.
8. The system of claim 6, wherein the asserting unit comprises:
the comparison module is used for comparing the execution result with an expected result in the test case data file;
the first acquisition module is used for obtaining an assertion result representing that the execution result is consistent with the expected result in the test case data file if the execution result is consistent with the expected result in the test case data file;
and the second acquisition module is used for obtaining an assertion result which characterizes that the execution result is inconsistent with the expected result in the test case data file if the execution result is inconsistent with the expected result in the test case data file.
9. The system of claim 8, wherein the first generation unit comprises:
the third acquisition module is used for acquiring the test class, the test method and the test parameter configured in the pre-customized generator configuration file if the assertion result represents that the execution result is consistent with the expected result in the test case data file;
and the generating module is used for generating unit test codes according to the test classes, the test methods and the test parameters configured in the generator configuration file and executing the unit test codes.
10. The system of claim 9, further comprising:
the judging unit is used for judging whether a return value exists in the test method;
the first determining unit is used for determining that a return value exists in the test method if the type of the test method is not the void type;
the second determining unit is used for determining that no return value exists in the test method if the type of the test method is the void type, and configuring the test method in a preset configuration mode; the preset configuration mode represents a configuration mode for defining a function with an empty (void) return value.
CN202310035027.3A 2023-01-10 2023-01-10 Data testing method and system Pending CN116010274A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310035027.3A CN116010274A (en) 2023-01-10 2023-01-10 Data testing method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310035027.3A CN116010274A (en) 2023-01-10 2023-01-10 Data testing method and system

Publications (1)

Publication Number Publication Date
CN116010274A true CN116010274A (en) 2023-04-25

Family

ID=86023068

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310035027.3A Pending CN116010274A (en) 2023-01-10 2023-01-10 Data testing method and system

Country Status (1)

Country Link
CN (1) CN116010274A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117435512A (en) * 2023-12-21 2024-01-23 摩尔元数(福建)科技有限公司 Unit test method for automatically switching different database types based on Junit5
CN117435512B (en) * 2023-12-21 2024-03-08 摩尔元数(福建)科技有限公司 Unit test method for automatically switching different database types based on Junit5

Similar Documents

Publication Publication Date Title
US11106440B2 (en) Source code translation
US6889158B2 (en) Test execution framework for automated software testing
US9792203B2 (en) Isolated testing of distributed development projects
US8392880B2 (en) Rapid application development for database-aware applications
US7779036B2 (en) Integration functionality for a test tool for application programming interfaces
JP2602205B2 (en) Database access control method
US20070277163A1 (en) Method and tool for automatic verification of software protocols
EP2228726B1 (en) A method and system for task modeling of mobile phone applications
CN111897570B (en) Multi-dependency item file extraction method and device based on Maven plug-in
CN101866315B (en) Test method and system of software development tool
US20130055197A1 (en) Modeling and code generation for sql-based data transformations
CN107111545B (en) Method and system for generating computer executable graph
US20210191845A1 (en) Unit testing of components of dataflow graphs
JP2010231782A (en) Method and system for function automation
US20050086022A1 (en) System and method for providing a standardized test framework
CN116010274A (en) Data testing method and system
CN113535141A (en) Database operation code generation method and device
Hovy et al. Towards automatic and flexible unit test generation for legacy hpc code
Anbunathan et al. Data driven architecture based automated test generation for Android mobile
CN112559359A (en) Based on S2ML safety critical system analysis and verification method
Venkatraj et al. Development of test automation framework for rest api testing
CN113703769B (en) CLI command execution method and related device
US20040194091A1 (en) System and method for capturing and managing a process flow
Lim et al. D-TAF: test automation framework compatible with various DBMS
Pradhan User interface test automation and its challenges in an industrial scenario

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination