CN112597014A - Automatic testing method, device, medium and electronic equipment based on data driving - Google Patents


Info

Publication number
CN112597014A
CN112597014A (application CN202011490722.1A)
Authority
CN
China
Prior art keywords
data
test
target
variable
case
Prior art date
Legal status
Granted
Application number
CN202011490722.1A
Other languages
Chinese (zh)
Other versions
CN112597014B (en)
Inventor
黄丽改
王永海
董春玲
Current Assignee
Softcom Power Information Technology Group Co Ltd
Original Assignee
Softcom Power Information Technology Group Co Ltd
Priority date
Filing date
Publication date
Application filed by Softcom Power Information Technology Group Co Ltd filed Critical Softcom Power Information Technology Group Co Ltd
Priority to CN202011490722.1A priority Critical patent/CN112597014B/en
Publication of CN112597014A publication Critical patent/CN112597014A/en
Application granted granted Critical
Publication of CN112597014B publication Critical patent/CN112597014B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3684Test management for test design, e.g. generating new test cases
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The embodiments of the present application disclose a data-driven automated testing method, apparatus, medium, and electronic device. The method comprises the following steps: in response to an execution instruction for a target test script, extracting the variables in the target test script; determining the target data case associated with the target test script according to the identification information of the variables and the association relationship between variables and data cases, the target data case being at least one of the candidate data cases; and testing the object under test according to the target test script and the target data case to obtain a test result. With this scheme, data cases and test scripts can be maintained separately and associated through variables, which improves the reusability and shareability of test scripts.

Description

Automatic testing method, device, medium and electronic equipment based on data driving
Technical Field
The embodiments of the present application relate to the technical field of computer applications, and in particular to a data-driven automated testing method, apparatus, medium, and electronic device.
Background
As business requirements evolve and software versions iterate rapidly, time-consuming and labor-intensive manual testing is gradually being replaced by automated testing, which has become mainstream, in order to reduce cost and ensure efficient, high-quality version iteration.
At present, some Web automated testing methods implemented on the Rabbit automated testing platform can complete regression tests of a Web system well. However, because most of these methods are keyword-driven, with the whole test process controlled by keywords, test data and test logic are not separated; their coupling is high, and one test script can only target a limited amount of test data, so that any change to the test data incurs high maintenance costs. As a result, these methods cannot be applied well to functional testing of web systems, or to performance testing of web systems with large volumes of normal and abnormal test data. In short, such automated testing methods suffer from poor test script shareability and a low reuse rate.
Disclosure of Invention
The embodiments of the present application provide a data-driven automated testing method, apparatus, medium, and electronic device, so that test scripts and data cases can be maintained separately and associated through variables, improving the reuse rate of test scripts.
In a first aspect, an embodiment of the present application provides an automated testing method based on data driving, where the method includes:
in response to an execution instruction for a target test script, extracting variables in the target test script;
determining a target data case associated with the target test script according to the identification information of the variables and the association relationship between variables and data cases, wherein the target data case is at least one of the candidate data cases; and
testing the object under test according to the target test script and the target data case to obtain a test result.
In a second aspect, an embodiment of the present application provides an automated testing apparatus based on data driving, where the apparatus includes:
a variable extraction module, configured to extract variables in a target test script in response to an execution instruction for the target test script;
a target data case determining module, configured to determine a target data case associated with the target test script according to the identification information of the variables and the association relationship between variables and data cases, wherein the target data case is at least one of the candidate data cases; and
a test result determining module, configured to test the object under test according to the target test script and the target data case to obtain a test result.
In a third aspect, embodiments of the present application provide a computer-readable storage medium on which a computer program is stored; when executed by a processor, the program implements the data-driven automated testing method according to the embodiments of the present application.
In a fourth aspect, an embodiment of the present application provides an electronic device comprising a memory, a processor, and a computer program stored on the memory and executable by the processor; the processor executes the computer program to implement the data-driven automated testing method according to the embodiments of the present application.
According to the technical scheme provided by the embodiments of the present application, the variables in the target test script are extracted in response to an execution instruction for the target test script; the target data case associated with the target test script, being at least one of the candidate data cases, is determined according to the identification information of the variables and the association relationship between variables and data cases; and the object under test is tested according to the target test script and the target data case to obtain a test result. Under this scheme, data cases and test scripts can be maintained separately and associated through variables, which improves the reusability and shareability of test scripts.
Drawings
FIG. 1 is a flowchart of an automated testing method based on data driving according to an embodiment of the present application;
FIG. 2 is a flowchart of another automated testing method based on data driving according to the second embodiment of the present application;
FIG. 3 is a flowchart of another method for automated testing based on data driving according to the third embodiment of the present application;
fig. 4 is a schematic structural diagram of an automated testing apparatus based on data driving according to a fourth embodiment of the present application;
fig. 5 is a schematic structural diagram of an electronic device according to a sixth embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the application and are not limiting of the application. It should be further noted that, for the convenience of description, only some of the structures related to the present application are shown in the drawings, not all of the structures.
Before discussing exemplary embodiments in more detail, it should be noted that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart may describe the steps as a sequential process, many of the steps can be performed in parallel, concurrently or simultaneously. In addition, the order of the steps may be rearranged. The process may be terminated when its operations are completed, but may have additional steps not included in the figure. The processes may correspond to methods, functions, procedures, subroutines, and the like.
Example one
Fig. 1 is a flowchart of a data-driven automated testing method according to the first embodiment of the present application; the embodiment is applicable to testing a web system. The method can be executed by the data-driven automated testing apparatus provided by the embodiments of the present application, which can be implemented in software and/or hardware and integrated into the electronic device running the system.
As shown in fig. 1, the automated testing method based on data driving includes:
and S110, responding to an execution instruction of the target test script, and extracting variables in the target test script.
A test script refers to a series of instructions written to complete a particular test plan that can be executed by an automated test tool. The test scripts may be pre-configured in the automated test platform by a user. Wherein, the automated testing platform can be developed based on the Selenium automated testing tool.
The execution instruction of the target test script refers to an instruction for instructing the target test script to start running. The execution instruction of the target test script can be generated when the control for controlling the execution of the target test script is clicked by a user, and can also be automatically generated by the test platform at the execution time of the test plan corresponding to the target test script.
In the automated testing process, normal and abnormal test data are input into the object under test by running the test script, yielding an actual result, which is then compared with the expected result to obtain the test result. Extracting the variables in the target test script specifically means extracting the variables' information by executing variable-extraction statements and storing that information in a local database.
S120, determining a target data case associated with the target test script according to the identification information of the variable and the association relationship between the variable and the data case; and the target data use case is at least one of the candidate data use cases.
The variable identification information identifies a variable; the variables in the target test script are distinguished by their identification information, which comprises the variable name and the variable type. The combination of variable name and variable type uniquely identifies a variable in a test script.
A test script is essentially a piece of program code, and a variable in the test script identifies the memory space in which test data are stored; that is, variables are directly related to the test data. A data case contains the test data used when testing the object under test, and can be an Excel file or a form edited online by the user with the x-spreadsheet online spreadsheet editor. Optionally, a data case contains a large amount of normal or abnormal data.
In the process of generating a test script, variables are set in the script, and the association relationship between a variable and a data case is written into the script. For example, FreeMarker syntax can be used to establish the association: specifically, the association between a variable and a data case is established through the expression ${param.name}, where param is the variable name and name is the header of a specific column in the data case. A variable may be associated with a single row of data in that column of the data case, or with all rows of data in that column.
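The FreeMarker-style references described above can be collected with a simple pattern match. The sketch below is illustrative, not the platform's real parser; the script fragment and the "user" data case name are assumptions for the example.

```python
import re

# Match FreeMarker-style references such as "${user.username}":
# group 1 is the variable name, group 2 is the data-case column header.
VAR_PATTERN = re.compile(r"\$\{(\w+)\.(\w+)\}")

def extract_associations(script_text):
    """Map each variable name to the data-case column headers it references."""
    assoc = {}
    for var, column in VAR_PATTERN.findall(script_text):
        assoc.setdefault(var, []).append(column)
    return assoc

# Hypothetical test-script fragment using an assumed "user" data case.
script = 'login("${user.username}", "${user.password}")'
print(extract_associations(script))  # -> {'user': ['username', 'password']}
```

Storing such a map in the local database is one plausible way to realize the variable-extraction step S110.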
Given the association relationship between each variable in the target test script and the data cases, the target data case relevant to the target test script can be determined from the identification information of the variables. The candidate data cases are data cases pre-configured on the automated testing platform by the user; there is at least one of them, and the specific number is not limited here but determined by the actual situation. The target data case is the candidate data case associated with the target test script.
And S130, testing the tested object according to the target test script and the target data case to obtain a test result.
Testing the system under test includes testing an interface, a flow, or a function of the web system, where interface tests include, for example, tests of element correctness or of text-box length.
The target test script contains the automated test logic, and the target data case contains the data used for testing; together they implement the test of an interface, flow, or function of the object under test and yield a test result, which is either pass or fail.
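The cooperation of script logic and data case can be sketched as a plain data-driven loop: the same test function is run once per data row and each actual result is compared with that row's expected result. This is an illustrative sketch, not the patent's implementation; the login check and field names are hypothetical.

```python
# Run one test script (here modeled as a function) against every row of a
# data case and collect a pass/fail result per row.
def run_data_driven(test_fn, data_case):
    """data_case: list of dicts, each holding inputs plus an 'expected' value."""
    results = []
    for row in data_case:
        actual = test_fn(row)                      # drive the object under test
        results.append("pass" if actual == row["expected"] else "fail")
    return results

# Hypothetical object under test: a login check that rejects empty fields.
def login_test(row):
    return "ok" if row["username"] and row["password"] else "error"

case = [
    {"username": "alice", "password": "s3cret", "expected": "ok"},
    {"username": "", "password": "x", "expected": "error"},   # abnormal data
]
print(run_data_driven(login_test, case))  # -> ['pass', 'pass']
```

Because the loop only touches the data case, the same script can be reused with large sets of normal and abnormal data, which is the separation the scheme aims at.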
According to the technical scheme provided by this embodiment, the variables in the target test script are extracted in response to an execution instruction for the target test script; the target data case associated with the target test script, being at least one of the candidate data cases, is determined according to the identification information of the variables and the association relationship between variables and data cases; and the object under test is tested according to the target test script and the target data case to obtain a test result. Under this scheme, test scripts and data cases can be maintained separately and associated through variables, which improves the reuse rate of the scripts.
Example two
Fig. 2 is a flowchart of another data-driven automated testing method according to the second embodiment of the present application. The present embodiment is further optimized on the basis of the above embodiment: specifically, the step of determining the target data case associated with the target test script according to the identification information of the variables and the association relationship between variables and data cases is refined.
As shown in fig. 2, the automated testing method based on data driving includes:
s210, responding to an execution instruction of the target test script, and extracting variables in the target test script.
In an optional embodiment, before extracting the variables in the target test script in response to the execution instruction for the target test script, the method further comprises: acquiring the candidate test scripts configured for the test plan by the user, wherein the candidate test scripts contain variables for associating data cases; and parsing the candidate test scripts to determine the variable identification information and the association relationship between variables and data cases.
The candidate test scripts are pre-configured in the automated testing platform by the user to complete interface, flow, or function tests of the Web system. A candidate test script may be generated automatically by a script-generation tool or written by the user. For users who write and maintain candidate test scripts themselves, optionally, an ace editor is introduced into the automated testing platform so that script cases can be edited as online code, with real-time compilation and execution. This online-code mode is combined with the platform's existing keyword mode, so that the user can switch between the code view and the keyword view as needed while maintaining a test script. This improves the experience for users of different coding abilities: a user with limited coding ability can maintain test scripts in the keyword view, while a user with stronger coding ability, for whom maintaining scripts through keywords is more complex and cumbersome than writing code online, can maintain them in the code view.
Optionally, the functions necessary for functional automation are added to the candidate test script to provide a clear functional interpretation. Optionally, a result-backfill function is added to the candidate test script for backfilling test data results into the data case. Backfilling may either overwrite the original data in the data case or preserve it by adding new columns and filling the test results into the corresponding positions of the added columns. Preferably, a test data result is backfilled into a specified row and column of the data case; if a variable in the test script is associated with multiple data cases, the results can be backfilled into specific rows and columns of specific data cases according to the association relationship between the variable and the data cases.
Optionally, a packet-capture function is added to the candidate test script for capturing and storing the interface information of the web system while the object under test is being tested, for later use in security and performance testing of the web system.
Optionally, a result quantization function is added to the candidate test script to extract a test data result as a variable; the variable can be set, as needed, as a local variable of one test plan or as a global variable shared between different plans, and an effective duration can be set for it. Extracting a test data result as a variable and setting the variable's scope allows the result obtained in one test plan to be used by other plans as required, which improves testing efficiency; setting an effective duration improves resource utilization.
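One minimal way to realize a scoped variable with an effective duration is a small store keyed by name, holding the value, its scope, and an expiry time. This is a hedged sketch of the idea only; the class and field names are assumptions, not the platform's API.

```python
import time

# Store an extracted test result as a named variable with a scope
# ("local" to one test plan or "global" across plans) and a TTL.
class VariableStore:
    def __init__(self):
        self._vars = {}   # name -> (value, scope, expires_at)

    def set(self, name, value, scope="local", ttl_seconds=3600):
        self._vars[name] = (value, scope, time.monotonic() + ttl_seconds)

    def get(self, name, default=None):
        entry = self._vars.get(name)
        if entry is None:
            return default
        value, _scope, expires_at = entry
        if time.monotonic() > expires_at:      # expired: free the resource
            del self._vars[name]
            return default
        return value

store = VariableStore()
store.set("order_id", "A-1001", scope="global", ttl_seconds=60)
print(store.get("order_id"))  # -> A-1001
```

Dropping expired entries on read is what gives the resource-utilization benefit the text mentions.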
Optionally, to meet different test requirements, a variable type is set for each variable in the candidate test script. Specifically, the type of a variable may be set to single value, data case set, or file, so as to associate data cases of different types.
Optionally, to facilitate statistics, querying, and management of test scripts, at least one piece of attribute information is set for each candidate test script when it is maintained: the test script name, the function module the script relates to, the test type the script belongs to, the impact level of the script, and the identifier of the test plan it belongs to. The related function module indicates which function of the web system the script tests; the test types are interface test, flow test, and function test; the impact level indicates the importance of the script within the test plan. For example, four impact levels, from first to fourth, correspond to blocking, serious, main, and general, describing how badly the whole web system would be affected if the test run by this script fails; the higher the level number, the smaller the impact. The specific levels and their impact on the web system are not limited here and are determined by the actual situation.
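The attribute information above can be modeled as a simple record. The field names below are illustrative assumptions chosen for the example, not identifiers from the patent.

```python
from dataclasses import dataclass

# Hypothetical record of a test script's attribute information.
@dataclass
class TestScriptMeta:
    name: str
    function_module: str   # which web-system function the script exercises
    test_type: str         # "interface" | "flow" | "function"
    impact_level: int      # 1=blocking, 2=serious, 3=main, 4=general
    plan_id: str           # identifier of the owning test plan

meta = TestScriptMeta("login_check", "auth", "function", 2, "PLAN-7")
```

Such records make the statistics and query operations (e.g. "all blocking interface tests in plan PLAN-7") straightforward filters.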
Specifically, parsing the candidate test script means parsing the statements in which the association between variables and data cases is written, so as to determine the variable identification information and the association relationship between variables and data cases.
S220, if the variable type is a data use case set, determining data use case set information associated with the variable according to the variable name; the data case set information comprises a test data version identification and a data case set name.
During testing, the object under test is often tested multiple times with different test data. To improve testing efficiency, this embodiment achieves such repeated testing by setting a variable of the data-case-set type in the test script.
If the variable type in the test script is data case set, the data case associated with the variable is a data case set, that is, a data case holding multiple versions of test data. The data case set information associated with the variable, comprising a test data version identifier and a data case set name, is determined from the variable name and the association relationship between the variable and the data case.
And S230, determining a target data case set from the candidate data cases according to the test data version identification and the data case set name, and using the target data case set as a target data case.
The test data version identifier and the data case set name uniquely identify a data case set. Optionally, the candidate data cases are stored in a local database or in the cloud of the automated testing platform, where the target data case set can be located by the test data version identifier and the data case set name and used as the target data case.
S240, if the variable type is a file, determining a file name and a file address associated with the variable according to the variable name.
If the variable type in the test script is file, the data case associated with the variable is a file containing test data; the file may be an Excel or xml file. The file information associated with the variable, namely the file name and the file address, is determined from the variable name and the association relationship between the variable and the data case.
And S250, determining a target file from the candidate data use cases according to the file name and the file address, and using the target file as a target data use case.
A file name and a file address uniquely identify a file. Optionally, the file is stored in a local database or in the cloud of the automated testing platform, where the target file can be obtained by its name and address and used as the target data case.
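Steps S220 through S250 amount to a dispatch on the variable type. The sketch below shows one plausible shape for that lookup; the storage layout, key names, and the "single_value" branch are assumptions for illustration, not the platform's real API.

```python
# Resolve a variable to its target data case by variable type.
def resolve_data_case(var_type, var_info, case_sets, files):
    if var_type == "data_case_set":
        # a data case set is identified by (version identifier, set name)
        key = (var_info["version_id"], var_info["set_name"])
        return case_sets[key]
    if var_type == "file":
        # a file is identified by its name and address (path)
        return files[(var_info["file_name"], var_info["file_address"])]
    if var_type == "single_value":
        return var_info["value"]
    raise ValueError(f"unknown variable type: {var_type}")

case_sets = {("v2", "orders"): [{"id": 1}, {"id": 2}]}
files = {("users.xlsx", "/data/users.xlsx"): "<file handle>"}
print(resolve_data_case("data_case_set",
                        {"version_id": "v2", "set_name": "orders"},
                        case_sets, files))  # -> [{'id': 1}, {'id': 2}]
```

Keeping the keys composite ((version, name) and (name, address)) mirrors the text's claim that each pair uniquely identifies its target.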
And S260, testing the tested object according to the target test script and the target data case to obtain a test result.
In an optional embodiment, after parsing the candidate test script to determine the variable identification information and the association relationship between variables and data cases, the method further comprises: acquiring the candidate data cases configured for the test plan by the user and determining their type information, the candidate data cases being imported through a data interface or generated online; if a candidate data case is of type data case set, acquiring its name and version identifier and associating the data case set with the variable name according to the association relationship; and if a candidate data case is of type file, acquiring its name and address and associating the file with the variable name according to the association relationship.
Because data cases are associated with the variables in the test script, the type of a variable corresponds to the type of a data case, and the data case type information includes: single value, file, and data case set.
Optionally, to facilitate statistics, querying, and management of data cases, at least one piece of attribute information is set for each candidate data case when it is maintained: the data case name, the function module the data case relates to, the test type the data case belongs to, and the impact level of the data case. The related function module indicates which function of the web system the data case is used to test; the test types are interface test, flow test, and function test; the impact level indicates the importance of the data case within the test plan. For example, four impact levels, from first to fourth, correspond to blocking, serious, main, and general, describing how badly the whole web system would be affected if the test run with this data case fails; the higher the level number, the smaller the impact. The specific levels and their impact on the web system are not limited here and are determined by the actual situation.
Optionally, to read the test result more intuitively from the test data results, a fixed test-result column is set in the data case, and the test data results are backfilled into that column.
The candidate data cases are imported through a data interface or generated online. Specifically, when maintaining data cases, the x-spreadsheet online spreadsheet editor is used, and importing from a template Excel file is also supported.
When associating a data case with a variable: if the candidate data case is of type data case set, its name and version identifier are acquired and the data case set is associated with the variable name according to the association relationship; if the candidate data case is of type file, its name and address are acquired and the file is associated with the variable name according to the association relationship.
According to the technical scheme provided by this embodiment, the variables in the target test script are extracted in response to the execution instruction for the target test script. If a variable's type is data case set, the data case set information associated with the variable, comprising a test data version identifier and a data case set name, is determined from the variable name, and the target data case set is located among the candidate data cases accordingly and used as the target data case. If a variable's type is file, the file name and file address associated with the variable are determined from the variable name, and the target file is located among the candidate data cases accordingly and used as the target data case. The object under test is then tested according to the target test script and the target data case to obtain a test result. By associating variables of different types with data cases of different types, the scheme meets different test requirements and improves the reusability of test scripts. Because test scripts and data cases are maintained separately, once the test type of a script is determined, different tests of the object under test can be completed by selecting different data cases of the same type, which improves the coverage of the test script.
Example three
Fig. 3 is a flowchart of another data-driven automated testing method according to the third embodiment of the present application. The present embodiment is further optimized on the basis of the above embodiments: specifically, the step of testing the object under test according to the target test script and the target data case to obtain a test result is refined.
As shown in fig. 3, the automated testing method based on data driving includes:
s310, responding to the execution instruction of the target test script, and extracting the variable in the target test script.
S320, determining a target data case associated with the target test script according to the identification information of the variable and the association relationship between the variable and the data case; and the target data use case is at least one of the candidate data use cases.
S330, testing the tested object according to the target test script and the target data case to obtain a test data result.
S340, extracting the test data result, and backfilling the test data result into the target data case; wherein the target data use case comprises a test expected result.
Specifically, after the data case has been used to complete the test on the tested object, the tested object feeds back a piece of test result data for each piece of test data in the data case; this test data result is the actual result.
In order to obtain the test result more intuitively from the test data results, a fixed column for the test data result is set in the data case, and the test data results are backfilled into that column. Specifically, a result backfilling function in the test script is called to backfill the test data results into the data case. Backfilling may either overwrite the original data in the data case, or preserve the original data by adding a column to the data case and filling the test data results into the corresponding positions of the added column. Preferably, the test data results are backfilled into specified rows and columns of the data case; if a variable in the test script is associated with multiple data cases, the test data results can be backfilled into specific rows and columns of specific data cases according to the association relationship between the variable and the data cases.
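A minimal sketch of the column-preserving variant of backfilling is shown below; the column name `actual_result` and the row structure are illustrative assumptions.

```python
# Each data case row gains a fixed "actual_result" column; existing row data
# is preserved, and only the result column is (over)written.
def backfill_results(data_case, test_results, column="actual_result"):
    for row, actual in zip(data_case, test_results):
        row[column] = actual  # fill the fixed result column for this row
    return data_case

data_case = [
    {"input": "user_a", "expected": "ok"},
    {"input": "user_b", "expected": "denied"},
]
backfilled = backfill_results(data_case, ["ok", "error"])
```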
And S350, matching the test data result with the test expected result to obtain a matching result.
The expected test result corresponds to the test data: when the data case is maintained, an expected result is set for each piece of test data in the data case. The expected result is the response the tested object should return if it behaves correctly, and serves as the basis for judging whether the actual result is correct. The automated test platform matches the result data against the expected data to obtain a matching result, which is either a successful match or a failed match.
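The matching step can be sketched as a row-by-row comparison of the backfilled actual result against the maintained expected result; the column names here are illustrative assumptions.

```python
# One matching result per test row: True = match success, False = match failure.
def match_results(data_case, actual_col="actual_result", expected_col="expected"):
    return [row[actual_col] == row[expected_col] for row in data_case]

rows = [
    {"expected": "ok", "actual_result": "ok"},
    {"expected": "denied", "actual_result": "error"},
]
matches = match_results(rows)  # [True, False]
```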
And S360, determining a test result according to the matching result and the influence level of the target data case.
The influence level of a data case reflects the importance of the test script in the test plan, that is, how severely the whole web system would be affected if the test of the tested object by that test script failed. Different influence levels of the target data case, combined with the matching result, affect the final test result.
In an optional embodiment, determining the test result according to the matching result and the influence level of the target data case includes: if the number of test data results that fail to match the expected test results is not greater than the preset quality threshold corresponding to the influence level of the target data case, determining that the test result is passed; and if the number of test data results that fail to match the expected test results is greater than the preset quality threshold corresponding to the influence level of the target data case, determining that the test result is failed.
The preset quality threshold is an empirical value preset by the user according to the influence level and the actual situation of the target data. The higher the influence level, the smaller the corresponding preset quality threshold. Illustratively, if the influence level is blocking, the preset quality threshold is 1: the test result passes only if the number of test data results failing to match the expected results is not more than 1, and fails otherwise. If the influence level is general, the preset quality threshold is 4: the test result passes if the number of failed matches is not more than 4, and fails otherwise. The lower the influence level, the higher the preset quality threshold.
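The threshold rule above can be expressed in a few lines; the blocking=1 and general=4 values follow the example in the text, while the mapping structure and function names are illustrative assumptions.

```python
# Assumed mapping from influence level to the preset quality threshold.
QUALITY_THRESHOLDS = {"blocking": 1, "general": 4}

def judge(match_results, influence_level):
    """Pass iff the number of failed matches does not exceed the level's threshold."""
    failures = sum(1 for m in match_results if not m)
    return failures <= QUALITY_THRESHOLDS[influence_level]

verdict_blocking = judge([True, False, False], "blocking")  # 2 failures > 1 -> False
verdict_general = judge([True, False, False], "general")    # 2 failures <= 4 -> True
```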
Optionally, after the test result is obtained, a test report is generated from the test result and the test data results. The test report shows the total number of data cases used in the completed test, the numbers of passed and failed tests, the number of defects at each influence level, the success rate, the test result, the test environment information, and the test conclusion. Optionally, the test report is captured as required, a video and an email are generated and sent to the client, and the client interfaces with a bug management platform to submit defects directly. Generation and export of data case reports, script case process reports, and log reports can also be supported.
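A hypothetical aggregation of per-case verdicts into the figures the report displays (totals, pass/fail counts, defects per influence level, success rate) might look as follows; all field names are illustrative.

```python
from collections import Counter

def summarize(case_results):
    """case_results: list of (case_name, passed, influence_level) tuples."""
    passed = sum(1 for _, ok, _ in case_results if ok)
    defects = Counter(level for _, ok, level in case_results if not ok)
    return {
        "total": len(case_results),
        "passed": passed,
        "failed": len(case_results) - passed,
        "defects_by_level": dict(defects),  # defect count per influence level
        "success_rate": passed / len(case_results),
    }

report = summarize([
    ("case_1", True, "general"),
    ("case_2", False, "blocking"),
    ("case_3", True, "general"),
])
```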
In order to improve testing efficiency, not all the test data in all the data case sets are used to judge whether the test passes when the test result is determined. In an optional embodiment, testing the tested object according to the target test script and the target data case to obtain a test result further includes: if the target test script includes multiple variables whose variable type is data case set, determining the target data case set associated with the parent variable as the main data case set; and determining the test result according to the data test result of the main data case set and the influence level of the main data case set.
A parent variable is a variable whose corresponding data case set contains other data case sets; that is, there is a nesting relationship between the variable and other variables of the same type. For example, in a test plan for issuing compensation to the employees of a company, the parent variable is a list of companies, and the child variable is the list of employees of a particular company A within the parent variable. In this case, the target data case set associated with the parent variable is determined as the main data case set, and the test result is determined according to the data test result of the main data case set and its influence level.
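The parent/child selection can be sketched as follows; the idea that each case-set variable records which other sets it nests, and every name and field used here, are illustrative assumptions.

```python
# Hypothetical registry: each case-set variable lists the sets it contains.
CASE_SETS = {
    "company_list": {"children": ["employee_list"], "influence": "blocking"},
    "employee_list": {"children": [], "influence": "general"},
}

def find_main_set(variables):
    """Return the variable whose case set nests another referenced set;
    that set becomes the main data case set used for the final verdict."""
    for name in variables:
        if any(child in variables for child in CASE_SETS[name]["children"]):
            return name
    return variables[0]  # no nesting: fall back to the first variable

main_set = find_main_set(["employee_list", "company_list"])  # "company_list"
```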
According to the technical solution provided by this embodiment of the application, the test result is determined by comprehensively considering the actual results, the expected results, and the influence level of the target data case. By centering on the data, the test result can be read directly from the test data: the test result and the corresponding data test results of the data case are obtained directly, which improves the accuracy of the test result.
EXAMPLE IV
Fig. 4 is a diagram of an automated testing apparatus based on data driving according to the fourth embodiment of the present disclosure, which is applicable to the embodiments above. The apparatus can be implemented in software and/or hardware, and can be integrated into an electronic device such as an intelligent terminal.
As shown in fig. 4, the apparatus may include: a variable extraction module 410, a target data use case determination module 420, and a test result determination module 430.
A variable extraction module 410, configured to, in response to an execution instruction of a target test script, extract a variable in the target test script;
a target data use case determining module 420, configured to determine, according to the identification information of the variable and the association relationship between the variable and the data use case, a target data use case associated with the target test script; the target data use case is at least one of candidate data use cases;
and the test result determining module 430 is configured to test the object to be tested according to the target test script and the target data case to obtain a test result.
According to the technical solution provided by this embodiment of the application, a variable in a target test script is extracted in response to an execution instruction of the target test script; a target data case associated with the target test script is determined according to the identification information of the variable and the association relationship between the variable and the data case, the target data case being at least one of the candidate data cases; and the tested object is tested according to the target test script and the target data case to obtain a test result. In this technical solution, the test script and the data cases are maintained separately and associated through variables, which improves the reuse rate of the script.
Optionally, the identification information of the variable includes a variable type and a variable name, the variable type including: a data use case set and a file. Accordingly, the target data use case determining module 420 includes: a data use case set information determining submodule, configured to determine, if the variable type is a data use case set, data use case set information associated with the variable according to the variable name, the data use case set information including a test data version identifier and a data use case set name; a first target data use case set determining submodule, configured to determine a target data use case set from the candidate data use cases according to the test data version identifier and the data use case set name, as the target data use case; a file information determining submodule, configured to determine, if the variable type is a file, a file name and a file address associated with the variable according to the variable name; and a second target data use case set determining submodule, configured to determine a target file from the candidate data use cases according to the file name and the file address, as the target data use case.
Optionally, the apparatus further includes: a candidate test script acquisition module, configured to acquire, before the variable in the target test script is extracted in response to the execution instruction of the target test script, a candidate test script configured by the user for a test plan, the candidate test script including a variable for associating data use cases; and an association relationship determining module, configured to parse the candidate test script to determine the variable identification information and the association relationship between the variable and the data use case.
Optionally, the apparatus further includes: a candidate data use case type information determining module, configured to, after the candidate test script is parsed to determine the variable identification information and the association relationship between the variable and the data use case, acquire a candidate data use case configured by the user for the test plan and determine the type information of the candidate data use case, the candidate data use case being imported through a data interface or generated online; a first association module, configured to, if the candidate data use case type is a data use case set, acquire the candidate data use case name and the candidate data use case version identifier, and associate the data use case set with the variable name according to the association relationship; and a second association module, configured to, if the candidate data use case type is a file, acquire the candidate data use case name and the candidate data use case address, and associate the candidate data use case with the variable name according to the association relationship.
Optionally, the target data use case information further includes an influence level, and correspondingly, the test result determining module 430 includes: and the test data result determining submodule is used for testing the tested object according to the target test script and the target data case to obtain a test data result. The test data result backfilling submodule is used for extracting the test data result and backfilling the test data result into the target data case; wherein the target data use case comprises a test expected result. And the matching result determining submodule is used for matching the test data result with the test expected result to obtain a matching result. And the test result determining submodule is used for determining a test result according to the matching result and the influence level of the target data case.
Optionally, the test result determining sub-module includes: and the first test result determining unit is used for determining that the test result is passed if the number of the failed test data results and the expected test results is not greater than a preset quality threshold corresponding to the influence level of the target data case. And the second test result determining unit is used for determining that the test result is failed if the number of the failed test data results and the expected test results is greater than the preset quality threshold corresponding to the influence level of the target data case.
Optionally, the test data result determining sub-module further includes: and the master data use case set determining unit is used for determining the target data use case set associated with the parent-level variable as the master data use case set if the target test script comprises a plurality of variables of which the variable types are the variables of the data use case set. And the second test result determining unit determines the test result according to the data test result of the main data case set and the influence level of the main data case set.
The data-driven-based automatic testing device provided by the embodiment of the invention can execute the data-driven-based automatic testing method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of executing the data-driven-based automatic testing method.
EXAMPLE V
The fifth embodiment of the present application further provides a storage medium containing computer-executable instructions which, when executed by a computer processor, perform an automated testing method based on data driving, the method including:
responding to an execution instruction of a target test script, and extracting variables in the target test script;
determining a target data case associated with the target test script according to the identification information of the variable and the association relationship between the variable and the data case; the target data use case is at least one of candidate data use cases;
and testing the tested object according to the target test script and the target data case to obtain a test result.
Storage medium refers to any of various types of memory devices or storage devices. The term "storage medium" is intended to include: installation media such as CD-ROMs, floppy disks, or tape devices; computer system memory or random access memory such as DRAM, DDR RAM, SRAM, EDO RAM, Rambus RAM, etc.; non-volatile memory such as flash memory and magnetic media (e.g., hard disks or optical storage); and registers or other similar types of memory elements. The storage medium may also include other types of memory or combinations thereof. In addition, the storage medium may be located in the computer system in which the program is executed, or may be located in a different, second computer system connected to the first computer system through a network (such as the Internet). The second computer system may provide the program instructions to the first computer for execution. The term "storage medium" may include two or more storage media that reside in different locations (e.g., in different computer systems connected by a network). The storage medium may store program instructions (e.g., embodied as a computer program) that are executable by one or more processors.
Of course, in the storage medium provided in the embodiments of the present application, the computer-executable instructions are not limited to the above-described automated testing operations based on data driving, and may also perform related operations in the automated testing method based on data driving provided in any embodiment of the present application.
EXAMPLE VI
The sixth embodiment of the present invention provides an electronic device into which the automated testing apparatus based on data driving provided in the embodiments of the present invention may be integrated; the electronic device may be configured in a system, or may be a device that performs part or all of the functions in the system. Fig. 5 is a schematic structural diagram of an electronic device according to the sixth embodiment of the present application. As shown in fig. 5, this embodiment provides an electronic device 500, which includes: one or more processors 520; and a storage device 510 configured to store one or more programs which, when executed by the one or more processors 520, cause the one or more processors 520 to implement the automated testing method based on data driving provided in the embodiments of the present application, the method including:
responding to an execution instruction of a target test script, and extracting variables in the target test script;
determining a target data case associated with the target test script according to the identification information of the variable and the association relationship between the variable and the data case; the target data use case is at least one of candidate data use cases;
and testing the tested object according to the target test script and the target data case to obtain a test result.
Of course, those skilled in the art will understand that the processor 520 also implements the solution of the automated testing method based on data driving provided in any embodiment of the present application.
The electronic device 500 shown in fig. 5 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 5, the electronic device 500 includes a processor 520, a storage device 510, an input device 530, and an output device 540; the number of processors 520 in the electronic device may be one or more, one processor 520 being taken as an example in fig. 5; the processor 520, the storage device 510, the input device 530, and the output device 540 in the electronic device may be connected by a bus or other means, a bus 550 being taken as an example in fig. 5.
The storage device 510 is a computer-readable storage medium, and can be used to store software programs, computer-executable programs, and module units, such as program instructions corresponding to the data-driven automated testing method in the embodiment of the present application.
The storage device 510 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the terminal, and the like. Further, the storage 510 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some examples, storage 510 may further include memory located remotely from processor 520, which may be connected via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 530 may be used to receive input numbers, character information, or voice information, and to generate key signal inputs related to user settings and function control of the electronic apparatus. The output device 540 may include a display screen, speakers, etc. of electronic equipment.
The automated testing apparatus based on data driving, the medium, and the electronic device provided in the above embodiments can execute the automated testing method based on data driving provided in any embodiment of the present application, and have the corresponding functional modules and beneficial effects for executing the method. For technical details not described in detail in the above embodiments, reference may be made to the automated testing method based on data driving provided in any embodiment of the present application.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present application and the technical principles employed. It will be understood by those skilled in the art that the present application is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the application. Therefore, although the present application has been described in more detail with reference to the above embodiments, the present application is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present application, and the scope of the present application is determined by the scope of the appended claims.

Claims (10)

1. An automated testing method based on data driving, characterized in that the method comprises:
responding to an execution instruction of a target test script, and extracting variables in the target test script;
determining a target data case associated with the target test script according to the identification information of the variable and the association relationship between the variable and the data case; the target data use case is at least one of candidate data use cases;
and testing the tested object according to the target test script and the target data case to obtain a test result.
2. The method of claim 1, wherein the identification information of the variable comprises a variable type and a variable name, the variable type comprising: a data use case set and a file; correspondingly, determining a target data case associated with the target test script according to the identification information of the variable and the association relationship between the variable and the data case includes:
if the variable type is a data case set, determining data case set information associated with the variable according to the variable name; the data use case set information comprises a test data version identification and a data use case set name;
determining a target data case set from the candidate data cases according to the test data version identification and the data case set name, wherein the target data case set is used as a target data case;
if the variable type is a file, determining a file name and a file address associated with the variable according to the variable name;
and determining a target file from the candidate data use cases according to the file name and the file address, wherein the target file is used as a target data use case.
3. The method of claim 2, wherein prior to extracting variables in a target test script in response to execution instructions of the target test script, the method further comprises:
acquiring a candidate test script configured for a test plan by a user; wherein the candidate test script comprises a variable for associating data use cases;
and analyzing the candidate test script to determine the association relationship between the variable identification information and the variable and the data use case.
4. The method of claim 3, wherein after the parsing the candidate test script to determine the variable identification information and the association between the variable and the data use case, the method further comprises:
acquiring a candidate data use case configured for the test plan by a user and determining the type information of the candidate data use case; the candidate data use case is imported through a data interface or generated on line;
if the candidate data use case type is a data use case set, acquiring a candidate data use case name and a candidate data use case version identifier, and associating the data use case set with the variable name according to the association relation;
if the candidate data use case type is a file, obtaining the candidate data use case name and the candidate data use case address, and associating the candidate data use case with the variable name according to the association relation.
5. The method of claim 1, wherein the target data case information further includes an influence level, and accordingly, the testing the object to be tested according to the target test script and the target data case to obtain a test result includes:
testing the tested object according to the target test script and the target data case to obtain a test data result;
extracting the test data result, and backfilling the test data result into the target data case; wherein the target data use case comprises a test expected result;
matching the test data result with the test expected result to obtain a matching result;
and determining a test result according to the matching result and the influence level of the target data case.
6. The method of claim 5, wherein determining a test result according to the matching result and the impact level of the target data use case comprises:
if the number of the failed matching of the test data result and the test expected result is not more than a preset quality threshold corresponding to the influence level of the target data case, determining that the test result is passed;
and if the number of the failed matching pieces of the test data result and the test expected result is greater than the preset quality threshold corresponding to the influence level of the target data case, determining that the test result is failed.
7. The method of claim 5, wherein the object under test is tested according to the target test script and the target data case to obtain a test result, further comprising:
if the target test script comprises a plurality of variables of which the variable types are the variables of the data case set, determining the target data case set associated with the parent-level variable as a main data case set;
and determining the test result according to the data test result of the main data use case set and the influence level of the main data use case set.
8. An automated test device based on data driving, the device comprising:
the variable extraction module is used for responding to an execution instruction of a target test script and extracting a variable in the target test script;
the target data use case determining module is used for determining a target data use case associated with the target test script according to the identification information of the variable and the association relationship between the variable and the data use case; the target data use case is at least one of candidate data use cases;
and the test result determining module is used for testing the tested object according to the target test script and the target data case to obtain a test result.
9. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements the data-driven-based automated testing method according to any one of claims 1 to 7.
10. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the data-driven-based automated testing method according to any one of claims 1 to 7 when executing the computer program.
CN202011490722.1A 2020-12-16 2020-12-16 Automatic test method and device based on data driving, medium and electronic equipment Active CN112597014B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011490722.1A CN112597014B (en) 2020-12-16 2020-12-16 Automatic test method and device based on data driving, medium and electronic equipment


Publications (2)

Publication Number Publication Date
CN112597014A true CN112597014A (en) 2021-04-02
CN112597014B CN112597014B (en) 2023-11-28

Family

ID=75196623


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113419774A (en) * 2021-05-31 2021-09-21 西南电子技术研究所(中国电子科技集团公司第十研究所) Test method for automatically traversing different test parameters of tested product
CN114968787A (en) * 2022-05-27 2022-08-30 中移互联网有限公司 Node relation-based test method and device and electronic equipment
CN115314428A (en) * 2022-06-24 2022-11-08 合众新能源汽车有限公司 Vehicle CAN network testing method and system, electronic device and storage medium
CN115964306A (en) * 2023-03-16 2023-04-14 杭州新视窗信息技术有限公司 Automatic testing method, device and equipment for target system
CN116594914A (en) * 2023-07-17 2023-08-15 腾讯科技(深圳)有限公司 Method, device, equipment and storage medium for generating test data

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107608880A (en) * 2017-08-24 2018-01-19 郑州云海信息技术有限公司 A kind of automated testing method for being used for virtual platform based on data-driven
CN108459953A (en) * 2017-02-22 2018-08-28 北京京东尚科信息技术有限公司 test method and device
CN108694114A (en) * 2017-04-06 2018-10-23 广东亿迅科技有限公司 Method and its system for detaching test case, test script and test data
CN109299009A (en) * 2018-09-25 2019-02-01 金蝶软件(中国)有限公司 Data test method, apparatus, computer equipment and storage medium
CN109614313A (en) * 2018-10-25 2019-04-12 平安科技(深圳)有限公司 Automated testing method, device and computer readable storage medium
CN110321281A (en) * 2019-05-24 2019-10-11 中国工程物理研究院计算机应用研究所 Web test platform and test method based on mixing automated test frame
CN110851356A (en) * 2019-10-30 2020-02-28 河海大学 Selenium-based Web application automatic test framework and construction method and system thereof


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113419774A (en) * 2021-05-31 2021-09-21 西南电子技术研究所(中国电子科技集团公司第十研究所) Test method for automatically traversing different test parameters of tested product
CN114968787A (en) * 2022-05-27 2022-08-30 中移互联网有限公司 Node relation-based test method and device and electronic equipment
CN114968787B (en) * 2022-05-27 2023-09-19 中移互联网有限公司 Node relation-based test method and device, and electronic equipment
CN115314428A (en) * 2022-06-24 2022-11-08 合众新能源汽车有限公司 Vehicle CAN network testing method and system, electronic device and storage medium
CN115964306A (en) * 2023-03-16 2023-04-14 杭州新视窗信息技术有限公司 Automatic testing method, device and equipment for target system
CN116594914A (en) * 2023-07-17 2023-08-15 腾讯科技(深圳)有限公司 Method, device, equipment and storage medium for generating test data
CN116594914B (en) * 2023-07-17 2023-12-26 腾讯科技(深圳)有限公司 Method, device, equipment and storage medium for generating test data

Also Published As

Publication number Publication date
CN112597014B (en) 2023-11-28

Similar Documents

Publication Publication Date Title
CN112597014B (en) Automatic test method and device based on data driving, medium and electronic equipment
CN110309071B (en) Test code generation method and module, and test method and system
CN107341098B (en) Software performance testing method, platform, equipment and storage medium
CN103150249B (en) A kind of automatic test method and system
US10176079B2 (en) Identification of elements of currently-executing component script
CN110716870B (en) Automatic service testing method and device
CN109582563B (en) Test method, device, computer equipment and storage medium for test cases
CN112783793B (en) Automatic interface test system and method
CN112052172B (en) Rapid test method and device for third-party channel and electronic equipment
CN111522728A (en) Method for generating automatic test case, electronic device and readable storage medium
CN101090295A (en) Test system and method for ASON network
CN108459951B (en) Test method and device
CN112380255A (en) Service processing method, device, equipment and storage medium
CN112367220B (en) Interface testing method and device, storage medium and electronic equipment
CN104657274A (en) Method and device for testing software interface
CN110764998A (en) Data comparison method, device and equipment based on Django framework and storage medium
CN112433948A (en) Simulation test system and method based on network data analysis
CN112650688A (en) Automated regression testing method, associated device and computer program product
CN105279092A (en) Software testing method and apparatus
CN110750453B (en) HTML 5-based intelligent mobile terminal testing method, system, server and storage medium
CN111596899A (en) Database migration method, system, equipment and storage medium based on Java development
CN112860587B (en) UI automatic test method and device
CN113220597B (en) Test method, test device, electronic equipment and storage medium
WO2023051073A1 (en) Database test method, distributed database, and storage medium
CN116414751A (en) Algorithm access method and device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant