CN107665171B - Automatic regression testing method and device - Google Patents


Info

Publication number
CN107665171B
CN107665171B (granted from application CN201710942943.XA)
Authority
CN
China
Prior art keywords
configuration file
information
test
data
test case
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710942943.XA
Other languages
Chinese (zh)
Other versions
CN107665171A (en)
Inventor
孔新
赵泊瑄
陈深龙
李晓群
Current Assignee
China Minsheng Banking Corp Ltd
Original Assignee
China Minsheng Banking Corp Ltd
Priority date
Filing date
Publication date
Application filed by China Minsheng Banking Corp Ltd
Priority to CN201710942943.XA
Publication of CN107665171A
Application granted
Publication of CN107665171B

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 - Error detection; Error correction; Monitoring
    • G06F11/36 - Preventing errors by testing or debugging software
    • G06F11/362 - Software debugging
    • G06F11/3644 - Software debugging by instrumenting at runtime
    • G06F11/366 - Software debugging using diagnostics

Abstract

The invention provides an automatic regression testing method and device. The method comprises the following steps: when a test task is received, acquiring a configuration file corresponding to the test task, wherein the configuration file records the relevant information of all test cases that need to be executed to complete the test task; executing each test case according to its relevant information in the configuration file to obtain the execution result of each test case; and comparing the execution result of each test case with the corresponding expected result to obtain a test result. The method realizes automatic regression testing without testers having to write test code: all functions of the test cases are driven by configuration files, the number of configuration files is small, maintenance is convenient, and the efficiency of automatic regression testing is greatly improved.

Description

Automatic regression testing method and device
Technical Field
The invention relates to the technical field of communication, in particular to an automatic regression testing method and device.
Background
Regression testing is a kind of software testing in which a computer program is tested again after part of its code has been modified, to ensure that the modification neither introduces new errors nor breaks the unmodified parts of the program.
Regression testing is divided into manual testing and automatic testing. Manual regression testing requires a tester to execute each test case by hand; it is a highly repetitive activity and easily causes tester fatigue. Automatic regression testing is carried out automatically by a computer after a tester has written the test cases or test scripts.
The existing automatic regression testing method requires corresponding test code to be written for each test case, and different test code for different test data. Since testing a target program usually requires a large number of test cases, testers still have to write a large amount of test code by hand, which consumes considerable time and manpower; the test efficiency is low, the amount of test code is excessive, and the test code is difficult to manage and maintain.
Disclosure of Invention
The invention provides an automatic regression testing method and device to solve the problems of the existing automatic regression testing method: a large amount of test code still has to be written manually by testers, which consumes much time and manpower; the test efficiency is low; and the amount of test code is so large that it is difficult to manage and maintain.
One aspect of the present invention provides an automatic regression testing method, including:
acquiring a configuration file corresponding to a test task, wherein the configuration file records the relevant information of all test cases that need to be executed to complete the test task;
executing each test case according to its relevant information in the configuration file to obtain the execution result of each test case;
and comparing the execution result of each test case with the corresponding expected result to obtain a test result.
Another aspect of the present invention provides an automatic regression testing apparatus, comprising:
an acquisition module, configured to acquire a configuration file corresponding to the test task, wherein the configuration file records the relevant information of all test cases that need to be executed to complete the test task;
an execution module, configured to execute each test case according to its relevant information in the configuration file to obtain the execution result of each test case;
and a comparison module, configured to compare the execution result of each test case with the corresponding expected result to obtain a test result.
According to the automatic regression testing method and device, the relevant information of each test case is recorded in configuration files. When a test task is received, the configuration file corresponding to the test task is acquired, each test case is executed according to its relevant information in the configuration file, and the execution result of each test case is obtained; the execution result of each test case is then compared with the corresponding expected result to obtain the test result. Automatic regression testing is thus realized without testers having to write test code: all functions of the test cases are driven by configuration files, the number of configuration files is small and easy to maintain, and the efficiency of automatic regression testing is greatly improved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
Fig. 1 is a flowchart of an automatic regression testing method according to a first embodiment of the present invention;
Fig. 2 is a flowchart of an automatic regression testing method according to a second embodiment of the present invention;
Fig. 3 is a schematic structural diagram of an automatic regression testing apparatus according to a third embodiment of the present invention;
Fig. 4 is a schematic structural diagram of an automatic regression testing apparatus according to a fourth embodiment of the present invention.
The above drawings illustrate certain embodiments of the invention, which are described in more detail below. The drawings and the description are not intended to limit the scope of the inventive concept in any way, but rather to explain it to those skilled in the art by reference to specific embodiments.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present invention. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the invention, as detailed in the appended claims.
The terms to which the present invention relates will be explained first:
code: refers to a source file written by a programmer in a language supported by a development tool, which is a set of definite rule system that represents information in a discrete form by characters, symbols or signal symbols.
Test Case (Test Case): the test case is a description of a test task performed on a specific software product, and embodies test schemes, methods, techniques and strategies. The test cases may include test targets, test environments, input data, test steps, expected results, test codes, and the like.
A plurality of: two or more.
The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments. Embodiments of the present invention will be described below with reference to the accompanying drawings.
Example one
Fig. 1 is a flowchart of an automatic regression testing method according to a first embodiment of the present invention. The embodiment of the invention provides an automatic regression testing method to address the problems of the existing method: a large amount of test code still has to be written manually by testers, which consumes much time and manpower; the test efficiency is low; and the amount of test code is so large that it is difficult to manage and maintain. As shown in Fig. 1, the method comprises the following specific steps:
step S101, obtaining a configuration file corresponding to the test task, wherein the configuration file records relevant information of all test cases needing to be executed after the test task is completed.
Wherein, the relevant information of the test case at least comprises: case identification, program information to be tested, data source configuration information, input data, expected results, and execution result location information.
In the step, when a test task is received, a configuration file corresponding to the test task is obtained.
In this embodiment, the configuration files may be divided into five types according to the information they record: global configuration files, case configuration files, step configuration files, data configuration files, and so on. There may be one or more configuration files of each type. Each type of configuration file may correspond to multiple test cases and record their relevant information. For example, a case configuration file may correspond to multiple test cases, a step configuration file may correspond to multiple steps of multiple test cases, and a data configuration file may record the input data and expected output results of multiple test cases.
In practical application, to facilitate maintenance, the test cases can be classified by function, type, and so on, and the relevant information of test cases of the same type can be recorded in the same set of configuration files; this reduces the number of configuration files and makes the test cases convenient to maintain.
Before the automatic regression test, a tester formulates a test task and designs and writes the configuration files accordingly; once verified, the configuration files can be applied to the regression test. Because the configuration files record the relevant information of all test cases to be executed for the test task, no program code needs to be written. If the input data of a test case needs to be modified, only the configuration file containing that input data needs to be changed.
Step S102, executing each test case according to the relevant information of each test case in the configuration file to obtain the execution result of each test case.
In this embodiment, the execution result of a test case includes a returned result and a database modification result. The expected result includes an expected output result and an expected modification result. The expected output result refers to the expected return value of each test case; the expected modification result refers to the expected values of the data modified in the data source during the execution of each test case.
The relevant information of all test cases is recorded in the configuration files, and each test case is executed according to the relevant information recorded there to obtain its execution result.
Step S103, comparing the execution result of each test case with the corresponding expected result to obtain a test result.
Wherein, the test result may include: whether each test case passes the test, error information of failed test cases, and the like.
In the testing process, if the execution result of the test case is consistent with the corresponding expected result, the test case passes the test; if the execution result of the test case is inconsistent with the corresponding expected result, the test case fails.
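The pass/fail check described here can be sketched as a field-by-field comparison. The following is an illustrative sketch rather than the patent's actual implementation; the map-based result format and the error-message wording are assumptions:

```java
import java.util.Map;
import java.util.Objects;

public class ResultComparator {
    /**
     * Compares the actual execution result of one test case with its expected
     * result. Returns null when every expected field matches (the case passes),
     * otherwise an error message naming the first mismatched field.
     */
    public static String compare(Map<String, Object> actual, Map<String, Object> expected) {
        for (Map.Entry<String, Object> e : expected.entrySet()) {
            Object got = actual.get(e.getKey());
            if (!Objects.equals(got, e.getValue())) {
                // First inconsistency found: the test case fails
                return "field '" + e.getKey() + "': expected " + e.getValue() + ", got " + got;
            }
        }
        return null; // execution result is consistent with the expected result
    }
}
```

In practice the expected map would be populated from the configuration files and the actual map from the returned result and the database modification result.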
In the embodiment of the invention, the relevant information of each test case is recorded in configuration files. When a test task is received, the configuration file corresponding to the test task is acquired, and each test case is executed according to its relevant information in the configuration file to obtain its execution result; the execution result of each test case is then compared with the corresponding expected result to obtain the test result. Automatic regression testing is thus realized without testers having to write test code: all functions of the test cases are driven by configuration files, the number of configuration files is small and easy to maintain, and the efficiency of automatic regression testing is greatly improved.
Example two
Fig. 2 is a flowchart of an automatic regression testing method according to a second embodiment of the present invention. On the basis of the first embodiment, in this embodiment the configuration files include a first configuration file, a second configuration file, a third configuration file, a fourth configuration file and a fifth configuration file. The first configuration file also records the storage location of the target data. Before executing each test case according to its relevant information in the configuration files to obtain its execution result, the method further comprises: determining whether data is stored at the target data storage location recorded in the first configuration file; and if no data is stored there, acquiring the target data from the complete database and storing it at the target data storage location. As shown in Fig. 2, the method comprises the following specific steps:
step S201, obtaining a first configuration file, a second configuration file, a third configuration file, a fourth configuration file, and a fifth configuration file corresponding to the test task.
In this embodiment, the configuration files are divided into five types, namely, a first configuration file, a second configuration file, a third configuration file, a fourth configuration file and a fifth configuration file, according to different configuration contents.
Wherein, the first configuration file, the second configuration file, the third configuration file and the fourth configuration file can be XML files, and the fifth configuration file can be a JSON format file.
Each configuration file is described in detail below:
(1) The first configuration file records global configuration information, which at least comprises the execution result location information and the data source configuration information corresponding to each test case.
In this embodiment, the first configuration file may be an XML file and may be named, for example, applicationContext.xml.
In the execution process of the test case, global configuration information needs to be configured in advance, and global configuration information such as global variables or tool objects which need to be used in the test process is recorded in the first configuration file.
The first configuration file is the global configuration file and may include: the JSON data parser, the storage location of test case execution results, whether errors encountered during execution are ignored, the data source configuration, tool objects needed during test execution, and so on. Modifying the global configuration information affects the execution of all test cases, so this information is extracted and maintained separately, which makes the configuration files more convenient to maintain.
(2) The second configuration file is used for recording the basic information of each test case, and the basic information at least comprises the following components: a case identification and at least one execution path identification.
In this embodiment, the second configuration file may be an XML file and may be named, for example, cases.xml.
For example, the second configuration file may include content in the form of:
(The example content appears as an image in the original patent and is not reproduced here.)
Wherein, the <cases></cases> tag pair encloses the basic information of one or more test cases.
The content enclosed by a pair of <case></case> tags represents a case node. Each case node corresponds to one test case and records its basic information; that is, each case node defines a test case. Besides step nodes, the definition of a test case may also include the case identification, case description, writer and other information. The case identification is globally unique and can be used to uniquely identify a test case.
The content enclosed by a pair of <step></step> tags represents a step node. Each step node corresponds to an execution path; its name attribute value is the corresponding execution path identifier, through which the step node is linked to the program-to-be-tested information of that execution path in the third configuration file. That is, each step node defines an execution path.
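Since the example content of the second configuration file is rendered only as an image in the source, the following is a rough hypothetical sketch based on the node structure described above. The id, desc and author attribute names and all values are assumptions; only the cases, case and step elements and the name attribute come from the description:

```xml
<cases>
    <!-- Each case node defines one test case; its identifier is globally unique -->
    <case id="case_0001" desc="transfer succeeds with sufficient balance" author="tester01">
        <!-- Each step node names an execution path defined in the third
             configuration file via its name attribute -->
        <step name="checkAccountStep"/>
        <step name="transferStep"/>
    </case>
</cases>
```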
When the test case is executed, each execution path is executed in sequence from top to bottom, and the program to be tested corresponding to each execution path is called. The program to be tested may be a Java method or a web service.
(3) The third configuration file records the program-to-be-tested information corresponding to each execution path identifier. The program-to-be-tested information at least comprises: the corresponding execution path identifier, the calling interface of the program to be tested, and the target data information in the data source to be accessed by the program to be tested, where the target data information includes: the table name of the data table, data routing information, and data query information.
The data source usually refers to the database accessed by the program to be tested; the program to be tested may be a Java method or a web service.
In this embodiment, the third configuration file may be an XML file and may be named, for example, step.xml.
For example, the third configuration file may include content in the form of:
(The example content appears as images in the original patent and is not reproduced here.)
Wherein, the content enclosed by a pair of <bean></bean> tags represents a bean object, and the content enclosed by a pair of <property></property> tags represents a property object.
The bean object defined by the outermost pair of <bean></bean> tags corresponds to the program-to-be-tested information of one execution path in the second configuration file. The id attribute value of the bean object is the execution path identifier of the corresponding execution path, and corresponds to the name attribute value of a step node in (2).
The program-to-be-tested information in the third configuration file also includes the calling interface of the program to be tested that the execution path needs to execute. Taking a Java method as an example, two property objects can be used in the bean object to give the class name and the method name of the Java method, through which the method can be called. If the program to be tested is a web service, the service calling interface can be given directly through a property object, for example as follows:
(The example content appears as an image in the original patent and is not reproduced here.)
The className attribute value is the class name of the Java method or the name of the web service. If it is the class name of a Java method, the methodName attribute value is the name of the Java method to be executed; if it is the name of a web service, the methodName attribute value must be execute.
In this embodiment, the calling interface of the program to be tested makes it possible to determine, when the test case is executed, which program needs to be tested.
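As an illustration of how a test runner might resolve such a className/methodName calling interface for a Java method, the following sketch uses reflection. It is an assumption about the mechanism, not the patent's actual code, and its overload resolution is simplified to matching only the argument count:

```java
import java.lang.reflect.Method;
import java.lang.reflect.Modifier;

public class StepInvoker {
    /**
     * Resolves and invokes a program under test from className/methodName
     * attributes like those described for the third configuration file.
     */
    public static Object invoke(String className, String methodName, Object... args) {
        try {
            Class<?> cls = Class.forName(className);
            for (Method m : cls.getMethods()) {
                if (m.getName().equals(methodName) && m.getParameterCount() == args.length) {
                    // Static methods need no target; otherwise instantiate the class
                    Object target = Modifier.isStatic(m.getModifiers())
                            ? null
                            : cls.getDeclaredConstructor().newInstance();
                    return m.invoke(target, args);
                }
            }
            throw new NoSuchMethodException(className + "#" + methodName);
        } catch (Exception e) {
            throw new RuntimeException("cannot invoke " + className + "#" + methodName, e);
        }
    }
}
```

A production runner would also need proper overload resolution by parameter types and a separate branch for web service calls.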
In addition, since the program to be tested may modify data in the data source during execution, and the target data it accesses in the data source also needs to be verified, the program-to-be-tested information recorded in the third configuration file for each execution path identifier further includes the target data information in the data source to be accessed by the program to be tested.
The target data information refers to information of a target data table to be accessed in the execution process of the program to be tested, and specifically includes a table name of the target data table, data routing information and data query information.
Target data information in the data source to be verified may be recorded in a property object with a name attribute value of "tables" in the third configuration file, and multiple pieces of target data information may be recorded in the property object.
A bean object can be used in the property object with the name attribute value of "tables" to correspond to the information of a target data table in the data source. Each of the bean objects corresponds to the data table structure information of one target data information recorded in the fourth configuration file.
Among the property objects in the bean object for a target data table, the property object with name attribute value "tableName" records the table name of the target data table; the property object with name attribute value "subject" records the data query information of the target data table, such as the where-clause of an SQL statement; and the property object with name attribute value "route" records the data routing information of the target data table.
Each bean object corresponds to the data table structure information of one piece of target data information recorded in the fourth configuration file through the table name and the data routing information of the target data table.
In the embodiment of the invention, through the table name, data routing information and data query information of the target data table recorded in the third configuration file, the data to be verified is acquired from the corresponding target data table after the program to be tested has been executed.
It should be noted that the data routing information in this embodiment may be used to support sub-database and sub-table (sharded) structures; if there is no sub-database or sub-table, the data routing information may be omitted.
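Pulling the pieces of (3) together, an execution-path definition in the third configuration file might look roughly like the following. All class names and concrete values are invented for illustration; only the element and attribute names quoted above (bean, property, id, className, methodName, tables, tableName, subject, route) come from the description:

```xml
<!-- Hypothetical execution-path definition; the id matches a step node's
     name attribute in the second configuration file -->
<bean id="transferStep" class="com.example.test.StepDefinition">
    <property name="className" value="com.example.pay.TransferService"/>
    <property name="methodName" value="transfer"/>
    <!-- Target data to verify in the data source after execution -->
    <property name="tables">
        <bean class="com.example.test.TargetTable">
            <property name="tableName" value="T_ACCOUNT"/>
            <property name="subject" value="ACCT_NO = '622600000001'"/>
            <property name="route" value="01"/>
        </bean>
    </property>
</bean>
```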
(4) The fourth configuration file is used for recording data table structure information corresponding to the target data information in the third configuration file, and the data table structure information at least comprises: table name of the data table, data routing information, and test field of the data table.
In this embodiment, the fourth configuration file may be an XML file and may be named, for example, table.xml.
For example, the fourth configuration file may include content in the form of:
(The example content appears as an image in the original patent and is not reproduced here.)
Each bean object in the list node corresponds to data table structure information of a target data table, and the property object with the name attribute value of 'tableName' records the table name of the target data table; the property object with the name attribute value of "route" records the data routing information of the target data table, and the property object with the name attribute value of "query field" records the test field of the target data table.
Each bean object in the list node corresponds, through the table name of the target data table, to a piece of target data information in the third configuration file; that is, each bean object in the list node of the fourth configuration file corresponds to one bean object inside the property object with name attribute value "tables" in the third configuration file.
In this embodiment, from the target data information in the data source to be accessed by the program to be tested, recorded in the third configuration file, and the corresponding data table structure information recorded in the fourth configuration file, the target data in the data source to be accessed when executing each test case can be determined, that is, which data in the data source needs to be verified.
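By way of illustration only, a data-table-structure entry in the fourth configuration file might look like this. The surrounding list element, the bean class name and all concrete values are assumptions; the tableName, route and queryField property names follow the description above:

```xml
<list>
    <!-- One bean per target data table; tableName plus route link it to the
         target data information in the third configuration file -->
    <bean class="com.example.test.TableStructure">
        <property name="tableName" value="T_ACCOUNT"/>
        <property name="route" value="01"/>
        <property name="queryField" value="ACCT_NO,BALANCE,STATUS"/>
    </bean>
</list>
```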
(5) The fifth configuration file is used for recording test data information corresponding to each test case, and the test data information at least comprises: corresponding case identification, input data, and expected output results.
In this embodiment, the execution result of a test case includes a returned result and a database modification result. The expected result includes an expected output result and an expected modification result. The expected output result refers to the expected return value of each test case; the expected modification result refers to the expected values of the data modified in the data source during the execution of each test case.
In this embodiment, the fifth configuration file may be a JSON format file and may be named, for example, test_data.json. The fifth configuration file configures the test input data of each test case; when the execution path is the same, a new test case can be formed by modifying only the test input data in this file. The fifth configuration file may be parsed by a preconfigured JSON parser.
Each execution path of a test case has corresponding input data; in the invention, the input data of a test case is defined in JSON.
For example, the fifth configuration file may include content in the form of:
(The example content appears as images in the original patent and is not reproduced here.)
Wherein, the test data information may further include: the test data identification, whether the test data is complete, the related description of the test data, the writer, the writing time, and so on.
Test data identification: uniquely identifies a piece of test data information.
Test data description: describes the meaning and role of each value in the test data.
Whether complete: indicates whether the test data is complete. When performing the automatic regression test, all test data must be in the complete state; for example, "N" may indicate that the test data is incomplete and "Y" that it is complete.
Optionally, before the automatic regression test is performed, it may be checked whether the test data of all test cases of the test task is complete; if any test data is incomplete, an error prompt message is output and the test is refused.
"data" [ ] represents a data block for recording a test data information, which corresponds to a test case through case identification.
"steps" [ ] indicates a step data block, and the data block includes the step data block.
The input data of the program to be tested corresponding to each execution path refers to input parameter values, and can be constant values, variable values or output results of the previous program.
During the execution of a test case, the input data corresponds to the context of the test case, and the data in the steps data block is put into that context. Data in the test case context can be referenced in the input data using "#{XXX}".
A steps data block consists of an input statement block and an output statement block. The input statement block records the input data of the program to be tested, which may be the input data of one or more Java methods and/or web services. Constant and/or variable parameter values are set in the input statement block of the first "data" data block.
In addition, the input data of a test case may in some cases need to call an external method, for example calling a method of an object to generate the input data. In the steps data block, an external method can be called with "${XX.YY}", where XX denotes a class name or object name and YY denotes a method name. If XX is a class name, YY must correspond to a static method.
The output statement block records the expected result of the program to be tested. If the test case has no return value, nothing needs to be recorded here.
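Combining the elements above, a test-data entry in the fifth configuration file might look roughly like the following. Only the "data" and "steps" keys and the "#{...}"/"${...}" reference syntax come from the description; every other key name and value is invented for illustration:

```json
{
  "data": [
    {
      "dataId": "data_0001",
      "caseId": "case_0001",
      "desc": "transfer with sufficient balance",
      "finished": "Y",
      "author": "tester01",
      "steps": [
        {
          "input": {
            "acctNo": "622600000001",
            "amount": 100,
            "serialNo": "${SerialUtil.next}"
          },
          "output": {
            "retCode": "0000",
            "balance": "#{expectedBalance}"
          }
        }
      ]
    }
  ]
}
```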
Step S202, a storage location of the target data is further recorded in the first configuration file, and it is determined whether the storage location of the target data recorded in the first configuration file stores data.
In practical application, executing a test case makes some modifications to the data in the data source, and the modified data may prevent the test case from being executed again automatically; the test case can only be executed repeatedly after the data in the database has been restored, otherwise automatic regression testing cannot be performed.
In this embodiment, to ensure that test cases can be executed repeatedly and automatically during the automatic regression test, the first configuration file further includes a target data storage location. Before each test case is executed according to its relevant information in the configuration files, this step determines whether data is stored at the target data storage location recorded in the first configuration file. If no data is stored there, it is determined that the test cases cannot yet be executed, and the subsequent step is performed to acquire the target data from the complete database and store it at the target data storage location; if data is stored there, step S203 is skipped and step S204 is executed directly, executing each test case according to its relevant information in the configuration files to obtain its execution result.
Step S203, if the target data storage position does not store data, acquiring the target data from the complete database, and storing the target data to the target data storage position.
The relevant information of the complete database and the target data can be recorded in the first configuration file, so that the target data can be obtained from the complete database according to the relevant information of the complete database and the target data, and the target data can be stored in the target data storage position.
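The check-and-fetch flow of steps S202/S203 can be sketched as follows. The function and parameter names are assumptions; `export_table` stands in for whatever reads a table's contents out of the complete database.

```python
import os

def prepare_target_data(xml_path, tables, export_table):
    """Return True if data had to be fetched from the complete database.

    xml_path:     target data storage location from the first config file
    tables:       table names of the target data
    export_table: callable(name) -> serialized table contents (e.g. XML)
    """
    os.makedirs(xml_path, exist_ok=True)
    if os.listdir(xml_path):          # S202: location already stores data
        return False                  # skip S203, go straight to execution
    for name in tables:               # S203: fetch each target table and
        path = os.path.join(xml_path, name + ".xml")
        with open(path, "w") as f:    # store it at the target location
            f.write(export_table(name))
    return True
```

On the second run the location already holds data, so the fetch is skipped and execution can begin immediately.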
For example, the first configuration file may include content in the form of:
Figure BDA0001431134240000141
Here, the value of the property object whose name attribute is "dataSource" indicates the complete database object in which the target data to be acquired is located. The value of the property object whose name attribute is "tables" gives the table names of the data tables corresponding to the target data. The value of the property object whose name attribute is "xmlPath" gives the storage location of the target data, that is, the location of the target data table files accessed when the test case is executed.
In this embodiment, before the test case is executed, step S202 first checks whether the storage location specified by the value of the "xmlPath" property contains data. If there is no corresponding data, the data of the data tables specified by the value of the "tables" property is obtained from the complete database pointed to by the value of the "dataSource" property and stored at that location.
Optionally, if the storage location specified by the value of the "xmlPath" property already contains data, the data at that location is restored to the complete database pointed to by the value of the "dataSource" property, ensuring that the case can be executed repeatedly with the same result each time.
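The figure itself is not reproduced in this text; based on the description above, the fragment plausibly takes a form such as the following (the enclosing element name and all values are illustrative assumptions, not taken from the patent):

```xml
<!-- Hypothetical reconstruction of the first configuration file fragment -->
<dataPrepare>
    <property name="dataSource" value="fullTestDb"/>
    <property name="tables" value="t_account,t_order"/>
    <property name="xmlPath" value="./data/target/"/>
</dataPrepare>
```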
Alternatively, the data in the database tables may be represented in the form of an XML file.
Optionally, before each test case is executed according to its relevant information in the configuration files in step S204, the method further includes: verifying all the configuration files; if verification fails, outputting configuration file verification error information so that a tester can correct the configuration files; if verification succeeds, executing the subsequent steps, namely executing each test case according to its relevant information in the configuration files.
Step S204, executing each test case according to the relevant information of each test case in the configuration file to obtain the execution result of each test case.
Specifically, executing each test case according to its relevant information in the configuration files to obtain its execution result includes:
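For instance, a stored table file might take a flat XML form such as the following, with one element per row and one attribute per column (table and column names here are purely illustrative):

```xml
<!-- Hypothetical flat-XML representation of one database table -->
<dataset>
    <t_account id="1" owner="alice" balance="100.00"/>
    <t_account id="2" owner="bob"   balance="250.00"/>
</dataset>
```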
connecting a data source according to data source configuration information recorded in the first configuration file so as to enable each program to be tested corresponding to each test case to access target data in the data source when running;
determining target data in the data source which is required to be accessed by each test case according to the basic information of each test case recorded by the second configuration file, the target data information in the data source which is required to be accessed by the program to be tested and recorded by the third configuration file corresponding to each test case, and the data table structure information which is recorded in the fourth configuration file and corresponds to the target data information in the third configuration file;
and running each program to be tested according to the input data in the test data information corresponding to each test case recorded in the fifth configuration file and the call interface of the program to be tested in the third configuration file corresponding to each test case, and outputting the running result of each program to be tested to the position corresponding to the execution result position information recorded in the first configuration file.
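The three sub-steps above can be condensed into a driver loop like the one below. This is a minimal, self-contained sketch: the dict-based "configuration files" and all function and key names are illustrative assumptions, and the fourth file's schema lookup is omitted for brevity.

```python
def run_all(global_cfg, cases, programs, test_data):
    results = {}
    for case_id, path_ids in cases.items():      # second file: case -> paths
        inputs = test_data[case_id]["input"]     # fifth file: input data
        for path_id in path_ids:
            prog = programs[path_id]             # third file: call interface
            results[case_id] = prog["call"](inputs)
    global_cfg["resultSink"].update(results)     # first file: result location
    return results

# Tiny usage example: one case, one execution path.
sink = {}
out = run_all(
    global_cfg={"resultSink": sink},
    cases={"case-001": ["path-A"]},
    programs={"path-A": {"call": lambda x: x * 2}},
    test_data={"case-001": {"input": 21}},
)
print(out)  # {'case-001': 42}
```

In the patent's arrangement `prog["call"]` would be the call interface of a real program under test and `resultSink` a file location rather than a dict, but the control flow is the same.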
Step S205, comparing the execution result of each test case with the corresponding expected result to obtain a test result.
The test result may include: whether each test case passes, error information for failed test cases, and the like.
In this embodiment, the execution result of a test case includes a returned result and a database modification result, and the expected result includes an expected output result and an expected modification result. The expected output result is the expected value of the return value of each test case; the expected modification result is the expected value of the data modified in the data source during execution of each test case.
In this step, comparing the execution result of a test case with the corresponding expected result includes: comparing the returned result of the test case with the expected output result, and comparing the database modification result with the expected modification result. If both comparisons are consistent, the test case passes; if either execution result is inconsistent with its expected result, the test case fails. Whether each test case passed is recorded.
Optionally, if a test case fails, the error information from its execution may be recorded so that a tester can analyze why it failed.
Step S206, generating a test report according to the test result.
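The two-part comparison, together with the optional recording of error information for failed cases, can be sketched as follows; the function name and the shape of the error record are assumptions.

```python
def judge(returned, db_after, expected_output, expected_modified):
    """Compare both parts of an execution result against the expected result."""
    passed = (returned == expected_output) and (db_after == expected_modified)
    error = None
    if not passed:
        # Record enough detail for a tester to analyze the failure.
        error = {"returned": returned, "expected_output": expected_output,
                 "db_after": db_after, "expected_db": expected_modified}
    return {"passed": passed, "error": error}

r = judge(42, {"t_account": [1]}, 42, {"t_account": [1]})
print(r["passed"])  # True
```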
In this embodiment, after the test result is obtained, a test report may be generated according to the test result, so that a tester may view the test result, or export or print the test result.
Optionally, the test results of the test cases may also be aggregated, for example by counting the number of test cases that passed, the number that failed, the pass rate, and so on; the statistics may likewise be added to the test report.
In the embodiment of the invention, different types of configuration files are set up for different configuration contents, and different types of configuration information are placed in different configuration files. Each test case is executed according to its relevant information in the configuration files to obtain its execution result, and each execution result is compared with the corresponding expected result to obtain the test result. Automatic regression testing is thus achieved without requiring testers to write test code: all functions of the test cases are driven by the configuration files, which are few in number and easy to maintain, greatly improving the efficiency of automatic regression testing.
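The statistics step is straightforward; a possible sketch (names assumed) over per-case results of the shape produced above:

```python
def summarize(test_results):
    """test_results: case id -> {"passed": bool, ...}; returns report stats."""
    total = len(test_results)
    passed = sum(1 for r in test_results.values() if r["passed"])
    return {"total": total, "passed": passed, "failed": total - passed,
            "pass_rate": passed / total if total else 0.0}

stats = summarize({"c1": {"passed": True}, "c2": {"passed": False},
                   "c3": {"passed": True}, "c4": {"passed": True}})
print(stats)  # {'total': 4, 'passed': 3, 'failed': 1, 'pass_rate': 0.75}
```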
Embodiment Three
Fig. 3 is a schematic structural diagram of an automatic regression testing apparatus according to a third embodiment of the present invention. The automatic regression testing apparatus provided by this embodiment can execute the processing flow provided by the embodiments of the automatic regression testing method. As shown in fig. 3, the apparatus 30 includes: an acquisition module 301, an execution module 302, and a comparison module 303.
Specifically, the obtaining module 301 is configured to obtain a configuration file corresponding to the test task, where the configuration file records relevant information of all test cases that need to be executed to complete the test task.
Wherein, the relevant information of the test case at least comprises: case identification, program information to be tested, data source configuration information, input data, expected results, and execution result location information.
The execution module 302 is configured to execute each test case according to the relevant information of each test case in the configuration file, so as to obtain an execution result of each test case.
The comparison module 303 is configured to compare the execution result of each test case with the corresponding expected result to obtain a test result.
The apparatus provided in this embodiment of the present invention may be specifically configured to execute the method embodiment provided above, and its specific functions are not described again here.
In the embodiment of the invention, the relevant information of each test case is recorded in configuration files. When a test task is received, the configuration files corresponding to the test task are obtained, and each test case is executed according to its relevant information in the configuration files to obtain its execution result; each execution result is then compared with the corresponding expected result to obtain the test result. Automatic regression testing is thus achieved without requiring testers to write test code: all functions of the test cases are driven by the configuration files, which are few in number and easy to maintain, greatly improving the efficiency of automatic regression testing.
Embodiment Four
Fig. 4 is a schematic structural diagram of an automatic regression testing apparatus according to a fourth embodiment of the present invention. On the basis of the third embodiment, in this embodiment the relevant information of a test case at least includes: a case identification, program information to be tested, data source configuration information, input data, an expected result, and execution result location information.
The obtaining module 301 is further configured to obtain a first configuration file, a second configuration file, a third configuration file, a fourth configuration file, and a fifth configuration file.
The first configuration file, the second configuration file, the third configuration file and the fifth configuration file are XML files, and the fourth configuration file is a JSON format file.
The first configuration file is used for recording global configuration information, and the global configuration information at least comprises execution result position information and data source configuration information corresponding to each test case.
The second configuration file is used for recording the basic information of each test case, and the basic information at least comprises the following components: a case identification and at least one execution path identification.
The third configuration file is used for recording the information of the program to be tested corresponding to each execution path identifier, and the information of the program to be tested at least includes: a corresponding execution path identifier, a call interface of the program to be tested, and target data information in the data source to be accessed by the program to be tested, wherein the target data information includes: table names of data tables, data routing information, and data query information.
The fourth configuration file is used for recording data table structure information corresponding to the target data information in the third configuration file, and the data table structure information at least comprises: table name of the data table, data routing information, and test field of the data table.
The fifth configuration file is used for recording test data information corresponding to each test case, and the test data information at least comprises: corresponding case identification, input data, and an expected output result, the expected output result being a portion of the expected result.
Optionally, the executing module 302 includes: the device comprises a connection submodule, a determination submodule and an execution submodule.
The connection submodule is used for connecting the data source according to the data source configuration information recorded in the first configuration file, so that each program to be tested corresponding to each test case can access the target data in the data source when running.
The determining submodule is used for determining target data in the data source which is required to be accessed by each test case according to the basic information of each test case recorded by the second configuration file, the target data information in the data source which is required to be accessed by the program to be tested and recorded by the third configuration file corresponding to each test case, and the data table structure information which is recorded in the fourth configuration file and corresponds to the target data information in the third configuration file.
The execution submodule is used for operating each program to be tested according to the input data in the test data information corresponding to each test case recorded in the fifth configuration file and the calling interface of the program to be tested in the third configuration file corresponding to each test case, and outputting the operation result of each program to be tested to the position corresponding to the execution result position information recorded in the first configuration file.
In this embodiment, as shown in fig. 4, the apparatus 30 further includes: a data verification module 304.
The data verification module 304 is configured to: determining whether the target data storage position recorded in the first configuration file stores data or not; and if the target data storage position does not store the data, acquiring the target data from the complete database, and storing the target data to the target data storage position.
The apparatus provided in the embodiment of the present invention may be specifically configured to execute the method embodiment provided in the second embodiment, and specific functions are not described herein again.
In the embodiment of the invention, different types of configuration files are set up for different configuration contents, and different types of configuration information are placed in different configuration files. Each test case is executed according to its relevant information in the configuration files to obtain its execution result, and each execution result is compared with the corresponding expected result to obtain the test result. Automatic regression testing is thus achieved without requiring testers to write test code: all functions of the test cases are driven by the configuration files, which are few in number and easy to maintain, greatly improving the efficiency of automatic regression testing.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium and includes several instructions to enable a computer device (which may be a personal computer, a server, or a network device) or a processor (processor) to execute some steps of the methods according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
It is obvious to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be performed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules to perform all or part of the above described functions. For the specific working process of the device described above, reference may be made to the corresponding process in the foregoing method embodiment, which is not described herein again.
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This invention is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the invention and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.
It will be understood that the invention is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the invention is limited only by the appended claims.

Claims (9)

1. An automatic regression testing method, comprising:
acquiring a configuration file corresponding to a test task, wherein the configuration file records relevant information of all test cases needing to be executed for completing the test task;
executing each test case according to the relevant information of each test case in the configuration file to obtain an execution result of each test case;
comparing the execution result of each test case with the corresponding expected result to obtain a test result;
the acquiring of the configuration file corresponding to the test task includes:
acquiring a first configuration file, a second configuration file, a third configuration file, a fourth configuration file and a fifth configuration file;
the first configuration file, the second configuration file, the third configuration file and the fifth configuration file are XML files, and the fourth configuration file is a JSON format file;
the first configuration file is used for recording global configuration information, and the global configuration information at least comprises execution result position information and data source configuration information corresponding to each test case;
the second configuration file is used for recording basic information of each test case, and the basic information at least comprises: a case identification and at least one execution path identification;
the third configuration file is used for recording information of the program to be tested corresponding to each execution path identifier, and the information of the program to be tested at least comprises: a corresponding execution path identifier, a calling interface of the program to be tested, and target data information in a data source to be accessed by the program to be tested, wherein the target data information comprises: table names of the data tables, data routing information and data query information;
the fourth configuration file is configured to record data table structure information corresponding to the target data information in the third configuration file, where the data table structure information at least includes: the table name of the data table, the data routing information and the test field of the data table;
the fifth configuration file is used for recording test data information corresponding to each test case, and the test data information at least includes: a corresponding case identification, input data, and an expected output result that is a portion of the expected result.
2. The method of claim 1, wherein the information related to the test case comprises at least: case identification, program information to be tested, data source configuration information, input data, expected results, and execution result location information.
3. The method of claim 1, wherein the executing each of the test cases according to the information related to each of the test cases in the configuration file to obtain the execution result of each of the test cases comprises:
connecting a data source according to data source configuration information recorded in the first configuration file so that each program to be tested corresponding to each test case accesses target data in the data source when running;
determining target data in the data source which the test cases need to access according to the basic information of the test cases recorded by the second configuration file, target data information in the data source which the program to be tested needs to access and is recorded by the third configuration file and corresponding to the test cases, and data table structure information which is recorded by the fourth configuration file and corresponds to the target data information in the third configuration file;
and running each program to be tested according to the input data in the test data information corresponding to each test case recorded in the fifth configuration file and the call interface of the program to be tested in the third configuration file corresponding to each test case, and outputting the running result of each program to be tested to the position corresponding to the execution result position information recorded in the first configuration file.
4. The method of claim 1, wherein the first configuration file further records a target data storage location;
before executing each of the test cases according to the relevant information of each of the test cases in the configuration file to obtain an execution result of each of the test cases, the method further includes:
determining whether the target data storage positions recorded in the first configuration file store data or not;
and if the target data storage position does not store data, acquiring the target data from a complete database, and storing the target data to the target data storage position.
5. The method of any one of claims 1-4, wherein after comparing the executed result of each of the test cases with the corresponding expected result to obtain the test result, the method further comprises:
and generating a test report according to the test result.
6. An automatic regression testing apparatus, comprising:
the acquisition module is used for acquiring a configuration file corresponding to the test task, wherein the configuration file records relevant information of all test cases needing to be executed for completing the test task;
the execution module is used for executing each test case according to the relevant information of each test case in the configuration file to obtain the execution result of each test case;
the comparison module is used for comparing the execution result of each test case with the corresponding expected result to obtain a test result;
the acquisition module is further used for acquiring a first configuration file, a second configuration file, a third configuration file, a fourth configuration file and a fifth configuration file;
the first configuration file, the second configuration file, the third configuration file and the fifth configuration file are XML files, and the fourth configuration file is a JSON format file;
the first configuration file is used for recording global configuration information, and the global configuration information at least comprises execution result position information and data source configuration information corresponding to each test case;
the second configuration file is used for recording basic information of each test case, and the basic information at least comprises: a case identification and at least one execution path identification;
the third configuration file is used for recording information of the program to be tested corresponding to each execution path identifier, and the information of the program to be tested at least comprises: a corresponding execution path identifier, a calling interface of the program to be tested, and target data information in a data source to be accessed by the program to be tested, wherein the target data information comprises: table names of the data tables, data routing information and data query information;
the fourth configuration file is configured to record data table structure information corresponding to the target data information in the third configuration file, where the data table structure information at least includes: the table name of the data table, the data routing information and the test field of the data table;
the fifth configuration file is used for recording test data information corresponding to each test case, and the test data information at least includes: a corresponding case identification, input data, and an expected output result that is a portion of the expected result.
7. The apparatus of claim 6, wherein the information related to the test case comprises at least: case identification, program information to be tested, data source configuration information, input data, expected results, and execution result location information.
8. The apparatus of claim 6, wherein the execution module comprises:
the connection submodule is used for connecting a data source according to the data source configuration information recorded in the first configuration file so as to enable each program to be tested corresponding to each test case to access target data in the data source when running;
a determining submodule, configured to determine target data in the data source that needs to be accessed by each test case according to the basic information of each test case recorded by the second configuration file, target data information in the data source that needs to be accessed by the program to be tested and recorded by the third configuration file corresponding to each test case, and data table structure information corresponding to the target data information in the third configuration file and recorded by the fourth configuration file;
and the execution submodule is used for operating each program to be tested according to the input data in the test data information which is recorded in the fifth configuration file and corresponds to each test case and the calling interface of the program to be tested in the third configuration file corresponding to each test case, and outputting the operation result of each program to be tested to the position corresponding to the execution result position information recorded in the first configuration file.
9. The apparatus of claim 6 or 8, wherein the first configuration file further records a target data storage location, the apparatus further comprising: a data verification module;
the data verification module is used for:
determining whether the target data storage positions recorded in the first configuration file store data or not;
and if the target data storage position does not store data, acquiring the target data from a complete database, and storing the target data to the target data storage position.
CN201710942943.XA 2017-10-11 2017-10-11 Automatic regression testing method and device Active CN107665171B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710942943.XA CN107665171B (en) 2017-10-11 2017-10-11 Automatic regression testing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710942943.XA CN107665171B (en) 2017-10-11 2017-10-11 Automatic regression testing method and device

Publications (2)

Publication Number Publication Date
CN107665171A CN107665171A (en) 2018-02-06
CN107665171B true CN107665171B (en) 2020-08-04

Family

ID=61097499

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710942943.XA Active CN107665171B (en) 2017-10-11 2017-10-11 Automatic regression testing method and device

Country Status (1)

Country Link
CN (1) CN107665171B (en)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109710508B (en) * 2018-08-20 2024-03-15 天航长鹰(江苏)科技有限公司 Test method, test device, test apparatus, and computer-readable storage medium
CN109117375A (en) * 2018-08-30 2019-01-01 上海携程金融信息服务有限公司 Database interface test method, system, equipment and storage medium
CN109634837A (en) * 2018-10-23 2019-04-16 平安科技(深圳)有限公司 Automated testing method, device, equipment and storage medium
CN109558525B (en) * 2018-12-12 2020-11-06 北京锐安科技有限公司 Test data set generation method, device, equipment and storage medium
CN109684205B (en) * 2018-12-12 2022-02-08 恒生电子股份有限公司 System testing method, device, electronic equipment and storage medium
CN110109824B (en) * 2019-04-09 2022-05-17 平安科技(深圳)有限公司 Big data autoregression test method and device, computer equipment and storage medium
CN112468355B (en) * 2019-09-09 2024-01-19 北京奇虎科技有限公司 IOT equipment management application testing method and device, electronic equipment and storage medium
CN110569196A (en) * 2019-09-11 2019-12-13 宝付网络科技(上海)有限公司 Regression testing system
CN110781090B (en) * 2019-10-31 2023-09-12 望海康信(北京)科技股份公司 Control method and device for data processing test, computer equipment and storage medium
CN111158942A (en) * 2019-12-17 2020-05-15 珠海格力电器股份有限公司 Method and device for verifying fault processing data
CN111488279A (en) * 2020-04-09 2020-08-04 吉林亿联银行股份有限公司 Regression testing method and device
CN111737148A (en) * 2020-07-24 2020-10-02 深圳市富之富信息技术有限公司 Automatic regression testing method and device, computer equipment and storage medium
US20230244587A1 (en) * 2022-01-31 2023-08-03 Volvo Car Corporation Computer-Implemented Method for Performing a System Assessment
CN115543773A (en) * 2022-08-17 2022-12-30 睿智合创(北京)科技有限公司 Automatic comparison method for test results

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN202339542U (en) * 2011-03-21 2012-07-18 中国工商银行股份有限公司 Software product test system
US9170926B1 (en) * 2011-05-08 2015-10-27 Panaya Ltd. Generating a configuration test based on configuration tests of other organizations
CN102819492B (en) * 2012-08-02 2015-03-04 中山大学 Keyword-driven automatic testing framework based on Android
CN104461855B (en) * 2013-09-22 2019-03-26 腾讯科技(北京)有限公司 Web automated testing method, system and device
MY167175A (en) * 2013-12-03 2018-08-13 Mimos Berhad A system and method for emulating multiple independent wireless client devices in the cloud
US9053236B1 (en) * 2013-12-23 2015-06-09 Emc Corporation Automated directory services test setup utility
CN105373469B (en) * 2014-08-25 2018-09-04 广东金赋科技股份有限公司 Interface-based software automated testing system and method
CN104182347B (en) * 2014-09-05 2017-09-19 上海斐讯数据通信技术有限公司 Automatic test requirement analysis method based on an automatic test platform
CN104317713A (en) * 2014-10-27 2015-01-28 北京锐安科技有限公司 Automatic testing tool and method on basis of templates
CN104598376B (en) * 2014-12-30 2017-09-15 中国科学院计算机网络信息中心 Data-driven layered automated testing system and method
CN107015902B (en) * 2016-01-27 2021-02-09 创新先进技术有限公司 Test method and test equipment

Also Published As

Publication number Publication date
CN107665171A (en) 2018-02-06

Similar Documents

Publication Publication Date Title
CN107665171B (en) Automatic regression testing method and device
US11379348B2 (en) System and method for performing automated API tests
US10055338B2 (en) Completing functional testing
CN107943694B (en) Test data generation method and device
CN108628748B (en) Automatic test management method and automatic test management system
CN106844730B (en) Method and device for displaying file content
CN114116496A (en) Automatic testing method, device, equipment and medium
CN112783867A (en) Database optimization method for meeting real-time big data service requirements and cloud server
CN116955097A (en) Test flow display method and device and test flow display system
CN111061733B (en) Data processing method, device, electronic equipment and computer readable storage medium
CN113220597A (en) Test method, test device, electronic apparatus, and storage medium
CN117194255A (en) Test data maintenance method, device, equipment and storage medium
CN110147313B (en) Log output method and device
CN115220731A (en) Index data acquisition method and device, computer equipment and storage medium
CN110908907A (en) Web page testing method, device, equipment and storage medium
CN116069667A (en) Test case auxiliary positioning method and device based on code analysis
CN114490413A (en) Test data preparation method and device, storage medium and electronic equipment
CN113791980A (en) Test case conversion analysis method, device, equipment and storage medium
CN113094277A (en) Chip test case management method and device
CN111767222A (en) Data model verification method and device, electronic equipment and storage medium
CN115470127B (en) Page compatibility processing method, device, computer equipment and storage medium
CN114116729B (en) Test data processing method and equipment
CN111309623B (en) Coordinate class data classification test method and device
CN113254328B (en) White box testing method, system, mobile terminal and storage medium
CN114327377B (en) Method and device for generating demand tracking matrix, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant