CN112597014B - Automatic test method and device based on data driving, medium and electronic equipment - Google Patents


Info

Publication number
CN112597014B
CN112597014B (application CN202011490722.1A)
Authority
CN
China
Prior art keywords: test, data, target, use case, data use
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011490722.1A
Other languages
Chinese (zh)
Other versions
CN112597014A (en)
Inventor
黄丽改
王永海
董春玲
Current Assignee
Softcom Power Information Technology Group Co ltd
Original Assignee
Softcom Power Information Technology Group Co ltd
Priority date
Filing date
Publication date
Application filed by Softcom Power Information Technology Group Co ltd
Priority claimed from application CN202011490722.1A
Publication of CN112597014A
Application granted
Publication of CN112597014B
Legal status: Active


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; error correction; monitoring
    • G06F 11/36: Preventing errors by testing or debugging software
    • G06F 11/3668: Software testing
    • G06F 11/3672: Test management
    • G06F 11/3684: Test management for test design, e.g. generating new test cases
    • G06F 11/3688: Test management for test execution, e.g. scheduling of test suites

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The embodiment of the application discloses a data-driven automated test method, device, medium and electronic equipment. The method comprises the following steps: in response to an execution instruction of a target test script, extracting the variables in the target test script; determining the target data use case associated with the target test script according to the identification information of the variables and the association relation between variables and data use cases, wherein the target data use case is at least one of the candidate data use cases; and testing the tested object according to the target test script and the target data use case to obtain a test result. By maintaining the data use cases and the test script separately and associating them through variables, the scheme improves the reuse rate and shareability of the test script.

Description

Automatic test method and device based on data driving, medium and electronic equipment
Technical Field
The embodiment of the application relates to the technical field of computer application, in particular to an automatic test method, device, medium and electronic equipment based on data driving.
Background
As service demands change continuously and software versions iterate rapidly, time-consuming and labor-intensive manual testing is gradually being replaced by automated testing in order to save cost and guarantee efficient, high-quality version iteration; automated testing has become the mainstream test flow.
At present, some Web automation test methods implemented on the Rabbit automation test platform can complete regression testing of a Web system well. However, most of these automation methods are keyword-driven: the whole test process is controlled by keywords, test data and test logic are not separated, the coupling between them is high, and one test script can target only a limited amount of test data. Once the test data changes, a high maintenance cost is incurred. As a result, these automated test methods cannot be applied well to functional testing of web systems, or to performance testing of web systems with large volumes of normal and abnormal data. Such automated test methods suffer from poor test script shareability and a low reuse rate.
Disclosure of Invention
The embodiment of the application provides a data-driven automated test method, device, medium and electronic equipment, which maintain the test script and the data use cases separately and associate them through variables, thereby improving the reuse rate of the test script.
In a first aspect, an embodiment of the present application provides a data-driven-based automated testing method, where the method includes:
responding to an execution instruction of a target test script, and extracting variables in the target test script;
determining a target data use case associated with the target test script according to the identification information of the variable and the association relation between the variable and the data use case; wherein the target data use case is at least one of candidate data use cases;
and testing the tested object according to the target test script and the target data use case to obtain a test result.
In a second aspect, an embodiment of the present application provides an automated testing apparatus based on data driving, the apparatus comprising:
the variable extraction module is used for responding to an execution instruction of a target test script and extracting variables in the target test script;
the target data use case determining module is used for determining a target data use case associated with the target test script according to the identification information of the variable and the association relation between the variable and the data use case; wherein the target data use case is at least one of candidate data use cases;
And the test result determining module is used for testing the tested object according to the target test script and the target data use case to obtain a test result.
In a third aspect, embodiments of the present application provide a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the data-driven automated test method according to the embodiments of the present application.
In a fourth aspect, an embodiment of the present application provides an electronic device, including a memory, a processor, and a computer program stored on the memory and executable by the processor, where the processor executes the computer program to implement the data-driven automated test method according to the embodiments of the present application.
According to the technical scheme provided by the embodiment of the application, the variables in the target test script are extracted in response to the execution instruction of the target test script; the target data use case associated with the target test script is determined according to the identification information of the variables and the association relation between variables and data use cases, the target data use case being at least one of the candidate data use cases; and the tested object is tested according to the target test script and the target data use case to obtain a test result. By maintaining the data use cases and the test script separately and associating them through variables, the technical scheme improves the reuse rate and shareability of the test script.
Drawings
FIG. 1 is a flow chart of an automated testing method based on data driving according to a first embodiment of the present application;
FIG. 2 is a flow chart of another data-driven-based automated testing method according to a second embodiment of the present application;
FIG. 3 is a flow chart of yet another automated testing method based on data driving provided by a third embodiment of the present application;
FIG. 4 is a schematic structural diagram of an automated testing apparatus based on data driving according to a fourth embodiment of the present application;
FIG. 5 is a schematic structural diagram of an electronic device according to a sixth embodiment of the present application.
Detailed Description
The application is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the application and are not limiting thereof. It should be further noted that, for convenience of description, only some, but not all of the structures related to the present application are shown in the drawings.
Before discussing exemplary embodiments in more detail, it should be mentioned that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart depicts steps as a sequential process, many of the steps may be implemented in parallel, concurrently, or with other steps. Furthermore, the order of the steps may be rearranged. The process may be terminated when its operations are completed, but may have additional steps not included in the figures. The processes may correspond to methods, functions, procedures, subroutines, and the like.
Example 1
Fig. 1 is a flowchart of a data-driven automated test method according to a first embodiment of the present application; the embodiment is applicable to testing a web system. The method can be performed by the data-driven automated test apparatus provided by the embodiment of the application, which can be implemented in software and/or hardware and integrated into an electronic device running the system.
As shown in fig. 1, the data-driven-based automated testing method includes:
s110, responding to an execution instruction of a target test script, and extracting variables in the target test script.
A test script refers to a series of instructions, written to accomplish a particular test plan, that can be executed by an automated test tool. Test scripts may be preconfigured by the user in the automated test platform, which may be developed based on the Selenium automation test tool.
The execution instruction of the target test script refers to an instruction for instructing the target test script to start running. The execution instruction of the target test script can be generated when a control for controlling the execution of the target test script is clicked by a user, or can be automatically generated by the test platform at the execution time of the test plan corresponding to the target test script.
The automated test process consists of running the test script on normal and abnormal test data input into the tested object to obtain an actual result, and then comparing the actual result with the expected result to obtain the test result. Extracting the variables in the target test script specifically means extracting the information of the variables by executing a variable extraction statement and storing that information in a local database.
S120, determining a target data use case associated with the target test script according to the identification information of the variable and the association relation between the variable and the data use case; wherein the target data use case is at least one of candidate data use cases.
The variable identification information is information used to identify a variable; the variables in the target test script are distinguished according to their identification information. The variable identification information includes the variable name and the variable type. Together, the variable name and the variable type uniquely identify a variable within a test script.
A test script is essentially a piece of program code, in which a variable identifies the memory space where test data are stored. That is, the variables are directly related to the test data. A data use case contains the test data used for testing the tested object. The data use case can be an Excel file, or a table edited online by the user in the x-spreadsheet online form editing mode. Optionally, the data use case contains a large amount of normal or abnormal data.
In the process of generating the test script, variables are set in the test script, and the association relation between the variables and the data use cases is written into the test script. Illustratively, FreeMarker syntax may be used to establish the association between a variable and a data use case in the test script, specifically through an expression of the form ${param.name}, where param is the variable name and name is a specific column header name in the data use case. The variable may be associated with one row of data in that column of the data use case, or with all rows of data in the column.
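As a minimal illustration (not the platform's actual implementation), the FreeMarker-style placeholders described above can be located in a script with a regular expression, yielding (variable name, column header) association pairs; the script line used here is a hypothetical example:

```python
import re

# Matches FreeMarker-style placeholders such as ${user.account}, capturing
# the variable name (before the dot) and the column header (after it).
PLACEHOLDER = re.compile(r"\$\{(\w+)\.(\w+)\}")

def extract_associations(script_text):
    """Return the (variable_name, column_name) pairs found in the script."""
    return PLACEHOLDER.findall(script_text)

script = 'login("${user.account}", "${user.password}")'  # hypothetical script line
print(extract_associations(script))  # [('user', 'account'), ('user', 'password')]
```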
With the association relation between each variable in the target test script and the data use cases known, the target data use case associated with the target test script can be determined according to the identification information of the variables. There is at least one candidate data use case; the specific number is not limited here and is determined according to the actual situation. The target data use case is the data use case, among the candidate data use cases, that is associated with the target test script.
S130, testing the tested object according to the target test script and the target data use case to obtain a test result.
Testing the tested object includes testing the interface, flow, or function of the web system. The interface test includes testing element correctness or text box length.
The target test script contains the automated test logic, and the target data use case contains the data used for testing; the two cooperate to test the interface, flow, or function of the tested object and obtain a test result. The test results include pass and fail.
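The pass/fail decision described above can be sketched as a loop over the rows of a data use case, comparing each actual result against the expected one. The names and the test step below are illustrative assumptions, not the patent's API:

```python
def run_data_driven(test_step, data_rows):
    """Run one test step against every row of a data use case.

    test_step: callable taking a row dict and returning the actual result.
    data_rows: list of dicts, each holding inputs and an 'expected' column.
    Returns a list of 'pass'/'fail' results, one per row.
    """
    results = []
    for row in data_rows:
        actual = test_step(row)
        results.append("pass" if actual == row["expected"] else "fail")
    return results

# Hypothetical tested behavior: the system adds two input fields.
rows = [
    {"a": 1, "b": 2, "expected": 3},
    {"a": 2, "b": 2, "expected": 5},  # abnormal data: deliberately wrong
]
print(run_data_driven(lambda r: r["a"] + r["b"], rows))  # ['pass', 'fail']
```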
According to the technical scheme provided by the embodiment of the application, the variables in the target test script are extracted in response to the execution instruction of the target test script; the target data use case associated with the target test script is determined according to the identification information of the variables and the association relation between variables and data use cases, the target data use case being at least one of the candidate data use cases; and the tested object is tested according to the target test script and the target data use case to obtain a test result. By maintaining the test script and the data use cases separately and associating them through variables, the scheme improves the reuse rate of the script.
Example two
Fig. 2 is a flowchart of another data-driven automated test method according to a second embodiment of the present application. This embodiment is further optimized on the basis of the above embodiment: specifically, the step of determining the target data use case associated with the target test script according to the identification information of the variables and the association relation between variables and data use cases is refined.
As shown in fig. 2, the data-driven-based automated testing method includes:
s210, responding to an execution instruction of a target test script, and extracting variables in the target test script.
In an alternative embodiment, before extracting the variable in the target test script in response to an execution instruction of the target test script, the method further comprises: acquiring candidate test scripts configured by a user for a test plan; wherein the candidate test script includes variables associated with the data use case; and analyzing the candidate test script to determine the variable identification information and the association relationship between the variable and the data use case.
The candidate test scripts are preconfigured in the automated test platform for a user to complete interface, flow, or function testing of the Web system. Candidate test scripts may be generated automatically by a script generation tool or written by the user. When a user writes and maintains candidate test scripts autonomously, optionally, an ace editor is introduced into the automated test platform, and script cases are edited in an online code mode to realize real-time compiling and running of the code. The online code mode is combined with the platform's existing keyword mode, so that when maintaining a test script the user can switch between the code view and the keyword view as required. This improves the user experience by meeting the needs of users with different coding abilities: users with limited coding ability can maintain test scripts in the keyword view, while users with stronger coding ability, for whom maintaining test scripts through keywords is more complex and cumbersome than writing code online, can maintain test scripts in the code view.
Optionally, functions necessary for function automation are added to the candidate test scripts, so as to provide clear function interpretation. Optionally, a result backfill function is added to the candidate test script for backfilling test data results into the data use case. Backfilling a test data result into the data use case includes either overwriting the original data in the data use case, or preserving the original data by adding a new column to the data use case and filling the data test result into the corresponding position of the new column. Preferably, the data test results are backfilled to designated rows and columns of the data use case; if a variable in the test script is associated with a plurality of data use cases, the data test result can be backfilled to specific rows and columns in specific data use cases according to the association relation between the variable and the data use cases.
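The result backfill described above, in its non-overwriting form, amounts to adding a result column to the data use case. A minimal sketch, with rows modeled as dicts and the column name assumed rather than taken from the patent:

```python
def backfill_results(data_rows, results, column="test result"):
    """Backfill test results into a data use case without overwriting it:
    add a new column and fill each row's result at the corresponding position."""
    for row, result in zip(data_rows, results):
        row[column] = result
    return data_rows

rows = [{"account": "u1"}, {"account": "u2"}]
backfill_results(rows, ["pass", "fail"])
print(rows[1]["test result"])  # fail
```

In a real data use case stored as an Excel sheet, the same idea would append a header cell and one value per row instead of a dict key.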
Optionally, a packet capturing function is added to the candidate test script for capturing and storing interface information of the web system while the tested object is tested according to the test script, for use in subsequent scenarios of testing the security performance of the web system.
Optionally, a result variabilization function is added to the candidate test script. It extracts a test data result as a variable, sets it as a local variable of one test plan or a global variable shared between different plans as required, and sets an effective duration for it. By extracting the test data result as a variable and setting the variable's scope, a test data result obtained by one test plan can be used by different test plans as required, which improves test efficiency. Setting an effective duration for the variable improves resource utilization.
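The result variabilization idea above, a named value with a scope and an effective duration, can be sketched as a small store; the class and method names are assumptions for illustration, not the platform's API:

```python
import time

class ResultVariableStore:
    """Sketch of result variabilization: a test data result is saved as a
    named variable with a scope ('local' to one test plan or 'global'
    across plans) and an effective duration, after which it expires."""

    def __init__(self):
        self._vars = {}

    def set(self, name, value, scope="local", ttl_seconds=3600):
        # Record the value, its scope, and the monotonic expiry time.
        self._vars[name] = (value, scope, time.monotonic() + ttl_seconds)

    def get(self, name):
        value, scope, expires_at = self._vars[name]
        if time.monotonic() > expires_at:
            raise KeyError(f"variable {name!r} has expired")
        return value, scope

store = ResultVariableStore()
store.set("order_id", "A1001", scope="global", ttl_seconds=60)
print(store.get("order_id"))  # ('A1001', 'global')
```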
Optionally, to meet different test requirements, variable types are set for variables in the candidate test scripts, specifically, the types of variables in the candidate test scripts may be set to be single values, data case sets and files, so as to associate different types of data cases.
Optionally, to facilitate statistics, query and management of test scripts, at least one piece of attribute information is set for each candidate test script when it is maintained: the test script name, the function module the test script is associated with, the test type the test script belongs to, the impact level of the test script, and the identifier of the test plan the script belongs to. The associated function module indicates which function of the web system the test script tests; the test types specifically include interface test, flow test and function test; the impact level indicates the importance of the test script within the test plan. Illustratively, there are four impact levels, level one to level four corresponding to blocking, critical, major and minor respectively, that is, how much the whole web system would be affected if the test of the tested object by this script fails. The higher the level number, the smaller the impact on the web system. The specific impact levels and the corresponding effects on the web system are not limited here and are determined according to the actual situation.
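The attribute information listed above can be pictured as a simple record; the field names below are illustrative assumptions, not the platform's actual schema:

```python
from dataclasses import dataclass

@dataclass
class TestScriptInfo:
    """Hypothetical record of a candidate test script's attribute information."""
    name: str
    function_module: str  # which web-system function the script covers
    test_type: str        # 'interface', 'flow' or 'function'
    impact_level: int     # 1 (blocking) .. 4 (minor)
    test_plan_id: str

script = TestScriptInfo("login_check", "login", "function", 1, "PLAN-01")
print(script.impact_level)  # 1
```

Keeping these fields on every script makes the statistics and query operations the text mentions a matter of filtering records by module, type, or level.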
The association relation between variables and data use cases is written in the candidate test script. Parsing the candidate test script to determine the variable identification information and the association relation between variables and data use cases specifically means parsing the association statements, written in the candidate test script, that bind variables to data use cases.
S220, if the variable type is a data case set, determining data case set information associated with the variable according to the variable name; the data case set information comprises a test data version identifier and a data case set name.
In the process of testing, different test data are often used to test the tested object multiple times. To improve test efficiency, the embodiment of the application enables multiple tests of the tested object with different test data by setting a variable of the data use case set type in the test script.
If the variable type in the test script is data use case set, the data use case associated with the variable is a data use case set, that is, data use cases holding multiple versions of test data. The data use case set information, comprising a test data version identification and a data use case set name, is determined according to the variable name and the association relation between the variable and the data use case.
S230, determining a target data use case set from the candidate data use cases according to the test data version identification and the data use case set name, and taking the target data use case set as a target data use case.
The test data version identification and the data use case set name uniquely identify a data use case set. Optionally, the candidate data use cases are stored in a local database or in the cloud of the automated test platform, where the target data use case set can be determined as the target data use case according to the test data version identification and the data use case set name.
S240, if the variable type is a file, determining a file name and a file address associated with the variable according to the variable name.
If the variable type in the test script is file, the data use case associated with the variable is a file, that is, a file containing test data; the file may be an Excel file or an xml file. The file information associated with the variable, comprising a file name and a file address, is determined according to the variable name and the association relation between the variable and the data use case.
S250, determining a target file from the candidate data use cases according to the file name and the file address, and taking the target file as a target data use case.
The file name and file address uniquely identify a file. Optionally, the file is stored in a local database or in the cloud of the automated test platform, where the target file can be obtained as the target data use case according to the file name and the file address.
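Steps S220 through S250 amount to a type-dispatched lookup: data case sets are keyed by (set name, version identification), files by (name, address). A sketch under assumed dictionary-based storage, with all key names hypothetical:

```python
def resolve_target_data_case(variable, case_sets, files):
    """Resolve the target data use case for one variable.

    variable: dict describing the variable (type plus lookup keys).
    case_sets: data case sets keyed by (set_name, version_id).
    files: files keyed by (file_name, file_address).
    """
    if variable["type"] == "data_case_set":
        key = (variable["set_name"], variable["version_id"])
        return case_sets[key]
    if variable["type"] == "file":
        key = (variable["file_name"], variable["file_address"])
        return files[key]
    raise ValueError(f"unsupported variable type: {variable['type']}")

case_sets = {("login_cases", "v2"): ["row1", "row2"]}
files = {("users.xlsx", "/data/users.xlsx"): ["rowA"]}
var = {"type": "data_case_set", "set_name": "login_cases", "version_id": "v2"}
print(resolve_target_data_case(var, case_sets, files))  # ['row1', 'row2']
```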
And S260, testing the tested object according to the target test script and the target data use case to obtain a test result.
In an alternative embodiment, after said parsing said candidate test script to determine said variable identification information and an association between said variable and a data use case, said method further comprises: acquiring candidate data use cases configured for the test plan by a user and determining type information of the candidate data use cases; the candidate data use cases are imported through a data interface or generated online; if the candidate data use case type is a data use case set, acquiring the candidate data use case name and the candidate data use case version identifier, and associating the data use case set with the variable name according to the association relation; and if the candidate data case type is a file, acquiring the candidate data case name and the candidate data case address, and associating the candidate data case with the variable name according to the association relation.
Since a data use case is associated with a variable in the test script, the variable type corresponds to the type of the data use case. The data use case type information includes the single value, file, and data use case set types.
Optionally, to facilitate statistics, query and management of data use cases, at least one piece of attribute information is set for each candidate data use case when it is maintained: the data use case name, the function module the data use case is associated with, the test type the data use case belongs to, and the impact level of the data use case. The associated function module indicates which function of the web system the data use case tests; the test types specifically include interface test, flow test and function test; the impact level indicates the importance of the data use case within the test plan. The impact levels of a data use case likewise comprise four levels, corresponding to blocking, critical, major and minor respectively, that is, how much the whole web system would be affected if the test of the tested object with this data use case fails. The higher the level number, the smaller the impact on the web system. The specific impact levels and the corresponding effects on the web system are not limited here and are determined according to the actual situation.
Optionally, in order to obtain the test result more intuitively from the data test results, a fixed column for the data test result is set in the data use case, and the test data results are backfilled into that fixed column.
The candidate data use cases are imported through a data interface or generated online. Specifically, when maintaining data use cases, the x-spreadsheet online form editing mode is adopted, and importing a template Excel file is also supported.
When a data use case is associated with a variable, if the candidate data use case type is a data use case set, acquiring the candidate data use case name and the candidate data use case version identifier, and associating the data use case set with the variable name according to the association relation; and if the candidate data case type is a file, acquiring the candidate data case name and the candidate data case address, and associating the candidate data case with the variable name according to the association relation.
According to the technical scheme provided by the embodiment of the application, the variables in the target test script are extracted in response to the execution instruction of the target test script. If the variable type is data use case set, the data use case set information associated with the variable, comprising a test data version identification and a data use case set name, is determined according to the variable name, and the target data use case set is determined from the candidate data use cases and taken as the target data use case. If the variable type is file, the file name and file address associated with the variable are determined according to the variable name, and the target file is determined from the candidate data use cases and taken as the target data use case. The tested object is then tested according to the target test script and the target data use case to obtain a test result. By associating variables of different types with data use cases of different types, the scheme meets different test requirements and improves the reuse rate of the test script. Because the test script and the data use cases are maintained separately, once the test type of a script is determined, different tests of the tested object can be completed by selecting different data use cases of the same type, which improves the coverage of the test script.
Example III
Fig. 3 is a flowchart of yet another data-driven automated test method according to a third embodiment of the present application. This embodiment is further optimized on the basis of the above embodiments: specifically, the step of testing the tested object according to the target test script and the target data use case to obtain the test result is refined.
As shown in fig. 3, the data-driven-based automated testing method includes:
S310, responding to an execution instruction of a target test script, and extracting variables in the target test script.
S320, determining a target data use case associated with the target test script according to the identification information of the variable and the association relation between the variable and the data use case; wherein the target data use case is at least one of candidate data use cases.
S330, testing the tested object according to the target test script and the target data use case to obtain a test data result.
S340, extracting the test data result, and backfilling the test data result into the target data use case; wherein the target data use case includes a test expected result.
The test data result is the test result associated with, and corresponding to, the test data. Specifically, after the tested object is tested with a data use case, the tested object feeds back a test result for each piece of test data in the data use case; this test data result is the actual result.
In order to read the test result more intuitively from the test data results, a fixed column for test data results is set in the data use case, and the test data result is backfilled into that fixed column. Specifically, a result-backfill function in the test script is called to backfill the test data result into the data use case. Backfilling the test data result into the data use case includes either overwriting the original data in the data use case, or keeping the original data, adding a new column in the data use case, and filling the test data result into the corresponding position of the new column. Preferably, the test data result is backfilled into designated rows and columns of the data use case; if a variable in the test script is associated with multiple data use cases, the test data result can be backfilled into specific rows and columns of specific data use cases according to the association relationship between the variable and the data use cases.
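The "add a new column" variant of backfilling can be sketched as follows, with the data use case modeled as a list of row dictionaries. The function name and column name are illustrative assumptions, not part of the patented implementation.

```python
def backfill_results(data_cases, results, column="actual_result"):
    """Backfill each test data result into a new column of the data use case.

    Per the text, the alternative is to overwrite an existing fixed result
    column instead of adding a new one.
    """
    for row, actual in zip(data_cases, results):
        row[column] = actual
    return data_cases

# Each row is one piece of test data in the data use case.
cases = [
    {"input": "3+4", "expected": "7"},
    {"input": "2*5", "expected": "10"},
]
# Actual results fed back by the tested object, one per piece of test data.
backfill_results(cases, ["7", "9"])
```

After backfilling, each row carries both the expected result (set when the data use case was maintained) and the actual result, so matching can be done row by row.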
And S350, matching the test data result with the test expected result to obtain a matching result.
The test expected result is associated with the test data and is set for each piece of test data in the data use case when the data use case is maintained. The expected result is the correct result that the tested object should feed back for the test data, and it is the basis for judging whether the actual result is correct. The automated test platform matches the test data result with the expected result to obtain a matching result, which is either a successful match or a failed match.
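With both columns in place, the matching step is a row-wise comparison. A minimal sketch, assuming the column names used above (which are illustrative, not from the patent):

```python
def match_results(data_cases, actual_col="actual_result", expected_col="expected"):
    """Match each backfilled actual result against the expected result set when
    the data use case was maintained; True = match success, False = failure."""
    return [row[actual_col] == row[expected_col] for row in data_cases]

flags = match_results([
    {"input": "3+4", "expected": "7", "actual_result": "7"},
    {"input": "2*5", "expected": "10", "actual_result": "9"},
])
```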
S360, determining a test result according to the matching result and the influence level of the target data use case.
The influence level of a data use case reflects the importance of the test script in the test plan, i.e., the impact on the whole web system if the test of the tested object by the test script fails. Different influence levels of the target data use case, combined with the matching result, affect the final test result.
In an alternative embodiment, determining the test result according to the matching result and the influence level of the target data use case includes: if the number of failed matches between the test data result and the test expected result is not greater than the preset quality threshold corresponding to the influence level of the target data use case, determining that the test result is a pass; and if the number of failed matches is greater than that preset quality threshold, determining that the test result is a fail.
The preset quality threshold is an empirical value preset by the user according to the influence level of the target data use case and the actual situation. The higher the influence level, the smaller the preset quality threshold. For example, if the influence level is "blocking", the preset quality threshold is 1: the test result is a pass only if the number of failed matches between the test data result and the test expected result is not more than 1, and a fail otherwise. If the influence level is "common", the preset quality threshold is 4: the test result is a pass if the number of failed matches is not more than 4, and a fail otherwise. The lower the influence level, the higher the preset quality threshold.
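The threshold logic above can be sketched directly. The values 1 (blocking) and 4 (common) come from the text; the intermediate level name and its value are assumptions for illustration.

```python
# Preset quality thresholds per influence level: higher level, smaller threshold.
QUALITY_THRESHOLDS = {"blocking": 1, "serious": 2, "common": 4}  # "serious": 2 is assumed

def judge(match_flags, impact_level):
    """Pass if the number of failed matches does not exceed the preset
    quality threshold for the target data use case's influence level."""
    failures = sum(1 for ok in match_flags if not ok)
    return "passed" if failures <= QUALITY_THRESHOLDS[impact_level] else "failed"
```

So a "blocking" data use case tolerates at most one failed match, while a "common" one still passes with up to four.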
Optionally, after the test result is obtained, a test report is generated according to the test result and the test data results. The test report displays the total number of data cases used in the completed test, the number that passed, the number that failed, the number of defects at each influence level, the success rate, the test result, the test environment information, and the test conclusion. Optionally, screenshots and videos of the test can be generated as required and sent to the client by mail, and the platform can interface with a bug management platform to submit bugs directly. Generation and export of data case reports, script case process reports, and log reports are also supported.
To improve test efficiency, the test result can be determined without using all the test data in all the data case sets to judge whether the test passed. In an optional embodiment, testing the tested object according to the target test script and the target data use case to obtain a test result further includes: if the target test script includes a plurality of variables whose variable type is data case set, determining the target data case set associated with the parent variable as the main data case set; and determining the test result from the data test result of the main data case set and the influence level of the main data case set.
A parent variable is a variable whose corresponding data case set contains other data case sets; that is, there is a nesting relationship between this variable and other variables. For example, if the test plan is paying salaries to company employees, the parent variable is the list of companies, and a child variable is the list of employees of a particular company A in that list. In this case, the target data case set associated with the parent variable is determined as the main data case set, and the test result is determined from the data test result of the main data case set and the influence level of the main data case set.
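Identifying the main data case set then reduces to finding the variable that nests others. A minimal sketch, with the `children` field and the function name assumed for illustration:

```python
def main_case_set(variables):
    """Among several case-set variables, the parent variable is the one whose
    case set nests other case sets; its set becomes the main data case set."""
    for var in variables:
        if var.get("children"):
            return var["name"]
    return None  # no nesting: no distinguished main data case set

variables = [
    {"name": "company_A_employees", "type": "case_set"},
    {"name": "companies", "type": "case_set",
     "children": ["company_A_employees"]},  # parent: nests the employee list
]
main = main_case_set(variables)
```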
According to the technical scheme provided by this embodiment of the application, the test result is determined by comprehensively considering the actual result, the expected result, and the influence level of the target data use case. The approach is data-centric: the test result is reflected directly by the test data and obtained directly from the test data results corresponding to the data use case, improving the accuracy of the test result.
Example IV
Fig. 4 is a schematic diagram of a data-driven automated testing apparatus according to the fourth embodiment of the present application, applicable to the data-driven automated testing described above. The apparatus may be implemented in software and/or hardware and may be integrated in an electronic device such as a smart terminal.
As shown in fig. 4, the apparatus may include: a variable extraction module 410, a target data use case determination module 420, and a test result determination module 430.
A variable extraction module 410, configured to extract a variable in a target test script in response to an execution instruction of the target test script;
the target data use case determining module 420 is configured to determine the target data use case associated with the target test script according to the identification information of the variable and the association relationship between the variable and the data use case; wherein the target data use case is at least one of the candidate data use cases;
and the test result determining module 430 is configured to test the tested object according to the target test script and the target data use case, so as to obtain a test result.
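The three modules above can be sketched as methods on one class. All names here (the class, the `$`-prefix convention for variables, the stubbed `execute` callback) are illustrative assumptions, not taken from the patent.

```python
class DataDrivenTester:
    """Illustrative sketch of the three modules of Fig. 4 as methods."""

    def __init__(self, associations, candidate_cases):
        self.associations = associations        # variable name -> case key
        self.candidate_cases = candidate_cases  # case key -> list of data rows

    def extract_variables(self, script):
        # Variable extraction module 410: pull "$"-prefixed tokens.
        return [tok for tok in script.split() if tok.startswith("$")]

    def resolve(self, variable):
        # Target data use case determining module 420: follow the association.
        return self.candidate_cases[self.associations[variable]]

    def run(self, script, execute):
        # Test result determining module 430: run `execute` (the stubbed test
        # of the tested object) once per piece of test data and collect results.
        results = []
        for var in self.extract_variables(script):
            for row in self.resolve(var):
                results.append(execute(row))
        return results

tester = DataDrivenTester({"$logins": "login_v1"},
                          {"login_v1": [{"user": "alice"}, {"user": "bob"}]})
outcomes = tester.run("open_page login $logins", lambda row: row["user"] != "")
```

The point of the split is the one the summary below makes: the script (`"open_page login $logins"`) and the data (`login_v1`) live apart and meet only through the variable.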
According to the technical scheme provided by this embodiment of the application, the variables in the target test script are extracted in response to an execution instruction of the target test script; the target data use case associated with the target test script is determined according to the identification information of the variable and the association relationship between the variable and the data use case, the target data use case being at least one of the candidate data use cases; and the tested object is tested according to the target test script and the target data use case to obtain a test result. Because the test script and the data use cases are maintained separately and associated through variables, the reuse rate of the script is improved.
Optionally, the identification information of the variable includes: a variable type and a variable name, the variable type comprising: a data case set and a file. Accordingly, the target data use case determining module 420 includes: a data case set information determining submodule, used for determining the data case set information associated with the variable according to the variable name if the variable type is a data case set; the data case set information comprises a test data version identifier and a data case set name. A first target data case set determining submodule, used for determining a target data case set from the candidate data use cases according to the test data version identifier and the data case set name, and taking it as the target data use case. A file information determining submodule, used for determining the file name and file address associated with the variable according to the variable name if the variable type is a file. And a second target data case set determining submodule, used for determining a target file from the candidate data use cases according to the file name and the file address, and taking it as the target data use case.
Optionally, the apparatus further includes: the candidate test script acquisition module is used for acquiring candidate test scripts configured by a user for a test plan before responding to an execution instruction of a target test script and extracting variables in the target test script; wherein the candidate test script includes variables associated with the data use case. And the association relation determining module is used for analyzing the candidate test script to determine the variable identification information and the association relation between the variable and the data use case.
Optionally, the apparatus further includes: the candidate data case type information determining module is used for acquiring candidate data cases configured by a user for the test plan and determining candidate data case type information after the analysis of the candidate test script determines the variable identification information and the association relation between the variable and the data cases; the candidate data use cases are imported through a data interface or generated online. And the first association module of the data case set and the variable name is used for acquiring the candidate data case name and the candidate data case version identifier if the candidate data case type is the data case set, and associating the data case set with the variable name according to the association relation. And the second association module of the data case set and the variable name is used for acquiring the candidate data case name and the candidate data case address if the candidate data case type is a file, and associating the candidate data case with the variable name according to the association relation.
Optionally, the target data case information further includes an impact level, and the corresponding test result determining module 430 includes: and the test data result determining submodule is used for testing the tested object according to the target test script and the target data use case to obtain a test data result. The test data result backfill submodule is used for extracting the test data result and backfilling the test data result into the target data use case; wherein the target data use case includes a test expected result. And the matching result determining submodule is used for matching the test data result with the test expected result to obtain a matching result. And the test result determining submodule is used for determining a test result according to the matching result and the influence level of the target data use case.
Optionally, the test result determining sub-module includes: and the first test result determining unit is used for determining that the test result is passed if the number of the matching failure between the test data result and the test expected result is not more than a preset quality threshold corresponding to the influence level of the target data use case. And the second test result determining unit is used for determining that the test result is not passed if the number of the matching failure between the test data result and the test expected result is larger than the preset quality threshold corresponding to the influence level of the target data use case.
Optionally, the test data result determining submodule further includes: a main data case set determining unit, used for determining the target data case set associated with the parent variable as the main data case set if the target test script includes a plurality of variables whose variable type is data case set. And a second test result determining unit, used for determining the test result according to the data test result of the main data case set and the influence level of the main data case set.
The data-driven-based automatic testing device provided by the embodiment of the invention can execute the data-driven-based automatic testing method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of executing the data-driven-based automatic testing method.
Example V
A fifth embodiment of the present application also provides a storage medium containing computer-executable instructions, which when executed by a computer processor, are for performing a data-driven based automated test method, the method comprising:
responding to an execution instruction of a target test script, and extracting variables in the target test script;
determining a target data use case associated with the target test script according to the identification information of the variable and the association relation between the variable and the data use case; wherein the target data use case is at least one of candidate data use cases;
and testing the tested object according to the target test script and the target data use case to obtain a test result.
Storage medium refers to any of various types of memory devices or storage devices. The term "storage medium" is intended to include: installation media such as CD-ROM, floppy disk, or tape devices; computer system memory or random access memory such as DRAM, DDR RAM, SRAM, EDO RAM, Rambus RAM, etc.; non-volatile memory such as flash memory, magnetic media (e.g., a hard disk), or optical storage; registers or other similar types of memory elements, etc. The storage medium may also include other types of memory or combinations thereof. In addition, the storage medium may be located in the computer system in which the program is executed, or may be located in a different, second computer system connected to the computer system through a network (such as the Internet). The second computer system may provide program instructions to the computer for execution. The term "storage medium" may include two or more storage media that may reside in different locations (e.g., in different computer systems connected by a network). The storage medium may store program instructions (e.g., embodied as a computer program) executable by one or more processors.
Of course, the storage medium containing the computer executable instructions provided in the embodiments of the present application is not limited to the data-driven automatic test operation described above, and may also perform the related operations in the data-driven automatic test method provided in any embodiment of the present application.
Example VI
The sixth embodiment of the present application provides an electronic device, in which the data-driven automatic test device provided by the present application may be integrated, where the electronic device may be configured in a system, or may be a device that performs some or all of the functions in the system. Fig. 5 is a schematic structural diagram of an electronic device according to a sixth embodiment of the present application. As shown in fig. 5, the present embodiment provides an electronic device 500, which includes: one or more processors 520; a storage 510 for storing one or more programs that, when executed by the one or more processors 520, cause the one or more processors 520 to implement a data-driven based automated test method provided by an embodiment of the present application, the method comprising:
Responding to an execution instruction of a target test script, and extracting variables in the target test script;
determining a target data use case associated with the target test script according to the identification information of the variable and the association relation between the variable and the data use case; wherein the target data use case is at least one of candidate data use cases;
and testing the tested object according to the target test script and the target data use case to obtain a test result.
Of course, those skilled in the art will appreciate that the processor 520 also implements the technical solution of the data-driven based automated test method provided by any embodiment of the present application.
The electronic device 500 shown in fig. 5 is merely an example, and should not be construed as limiting the functionality and scope of use of embodiments of the present application.
As shown in fig. 5, the electronic device 500 includes a processor 520, a storage device 510, an input device 530, and an output device 540; the number of processors 520 in the electronic device may be one or more, one processor 520 being exemplified in fig. 5; the processor 520, the storage 510, the input 530, and the output 540 in the electronic device may be connected by a bus or other means, as exemplified by connection via bus 550 in fig. 5.
The storage device 510 is used as a computer readable storage medium for storing a software program, a computer executable program, and a module unit, such as program instructions corresponding to the data-driven automatic test method according to the embodiment of the present application.
The storage device 510 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, at least one application program required for functions; the storage data area may store data created according to the use of the terminal, etc. In addition, the storage 510 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some examples, storage 510 may further include memory located remotely from processor 520, which may be connected via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input means 530 may be used to receive input numeric, character information or voice information and to generate key signal inputs related to user settings and function control of the electronic device. Output 540 may include an electronic device such as a display screen, speaker, etc.
The data-driven-based automatic testing device, the medium and the electronic equipment provided by the embodiment can execute the data-driven-based automatic testing method provided by any embodiment of the application, and have the corresponding functional modules and beneficial effects of executing the method. Technical details not described in detail in the above embodiments may be found in the data-driven based automated test method provided in any of the embodiments of the present application.
Note that the above is only a preferred embodiment of the present application and the technical principle applied. It will be understood by those skilled in the art that the present application is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the application. Therefore, while the application has been described in connection with the above embodiments, the application is not limited to the embodiments, but may be embodied in many other equivalent forms without departing from the spirit or scope of the application, which is set forth in the following claims.

Claims (9)

1. A data-driven based automated test method, the method comprising:
Responding to an execution instruction of a target test script, and extracting variables in the target test script;
determining a target data use case associated with the target test script according to the identification information of the variable and the association relation between the variable and the data use case; wherein the target data use case is at least one of candidate data use cases;
testing the tested object according to the target test script and the target data use case to obtain a test result;
the target data use case information further includes an influence level, and correspondingly, the testing of the tested object according to the target test script and the target data use case to obtain a test result includes:
testing the tested object according to the target test script and the target data use case to obtain a test data result;
extracting the test data result, and backfilling the test data result into the target data use case; wherein the target data use case comprises a test expected result;
matching the test data result with the test expected result to obtain a matching result;
and determining a test result according to the matching result and the influence level of the target data use case.
2. The method of claim 1, wherein the identification information of the variable comprises: a variable type and a variable name, the variable type comprising: a data use case set and a file; correspondingly, determining the target data use case associated with the target test script according to the identification information of the variable and the association relation between the variable and the data use case comprises the following steps:
if the variable type is a data case set, determining data case set information associated with the variable according to the variable name; the data case set information comprises a test data version identifier and a data case set name;
determining a target data use case set from the candidate data use cases according to the test data version identification and the data use case set name, and taking the target data use case set as a target data use case;
if the variable type is a file, determining a file name and a file address associated with the variable according to the variable name;
and determining a target file from the candidate data use cases according to the file name and the file address, and taking the target file as a target data use case.
3. The method of claim 2, wherein prior to extracting the variable in the target test script in response to an execution instruction of the target test script, the method further comprises:
Acquiring candidate test scripts configured by a user for a test plan; wherein the candidate test script includes variables associated with the data use case;
and analyzing the candidate test script to determine the variable identification information and the association relationship between the variable and the data use case.
4. A method according to claim 3, wherein after said parsing said candidate test script determines said variable identification information and an association between said variable and a data use case, said method further comprises:
acquiring candidate data use cases configured for the test plan by a user and determining type information of the candidate data use cases; the candidate data use cases are imported through a data interface or generated online;
if the candidate data use case type is a data use case set, acquiring the candidate data use case name and the candidate data use case version identifier, and associating the data use case set with the variable name according to the association relation;
and if the candidate data case type is a file, acquiring the candidate data case name and the candidate data case address, and associating the candidate data case with the variable name according to the association relation.
5. The method of claim 1, wherein determining a test result based on the matching result and the impact level of the target data use case comprises:
if the number of failed matching between the test data result and the expected test result is not greater than the preset quality threshold corresponding to the influence level of the target data use case, determining that the test result is passed;
and if the number of failed matching between the test data result and the expected test result is larger than the preset quality threshold corresponding to the influence level of the target data use case, determining that the test result is not passed.
6. The method of claim 1, wherein the testing the object under test according to the target test script and the target data use case to obtain a test result, further comprises:
if the target test script comprises a plurality of variables with the variable types being the data case sets, determining the target data case set associated with the parent variable as a main data case set;
and determining the data test result of the main data case set and the influence level of the main data case set.
7. An automated testing apparatus based on data driving, the apparatus comprising:
The variable extraction module is used for responding to an execution instruction of a target test script and extracting variables in the target test script;
the target data use case determining module is used for determining a target data use case associated with the target test script according to the identification information of the variable and the association relation between the variable and the data use case; wherein the target data use case is at least one of candidate data use cases;
the test result determining module is used for testing the tested object according to the target test script and the target data use case to obtain a test result;
the test result determining module includes:
the test data result determining submodule is used for testing the tested object according to the target test script and the target data use case to obtain a test data result;
the test data result backfill submodule is used for extracting the test data result and backfilling the test data result into the target data use case; wherein the target data use case comprises a test expected result;
the matching result determining submodule is used for matching the test data result with the test expected result to obtain a matching result;
And the test result determining submodule is used for determining a test result according to the matching result and the influence level of the target data use case.
8. A computer readable storage medium, on which a computer program is stored, characterized in that the program, when executed by a processor, implements the data-driven based automated test method according to any of claims 1-6.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the data-driven based automated test method of any of claims 1-6 when the computer program is executed by the processor.
CN202011490722.1A 2020-12-16 2020-12-16 Automatic test method and device based on data driving, medium and electronic equipment Active CN112597014B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011490722.1A CN112597014B (en) 2020-12-16 2020-12-16 Automatic test method and device based on data driving, medium and electronic equipment


Publications (2)

Publication Number Publication Date
CN112597014A CN112597014A (en) 2021-04-02
CN112597014B true CN112597014B (en) 2023-11-28

Family

ID=75196623

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011490722.1A Active CN112597014B (en) 2020-12-16 2020-12-16 Automatic test method and device based on data driving, medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN112597014B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113419774A (en) * 2021-05-31 2021-09-21 西南电子技术研究所(中国电子科技集团公司第十研究所) Test method for automatically traversing different test parameters of tested product
CN113868135A (en) * 2021-09-28 2021-12-31 杭州孝道科技有限公司 Interface case extraction method based on Java language
CN114153725B (en) * 2021-11-25 2024-06-18 中国航空工业集团公司沈阳飞机设计研究所 Automatic test verification method for complex display control system
CN114968787B (en) * 2022-05-27 2023-09-19 中移互联网有限公司 Method and device for testing based on node relation and electronic equipment
CN115314428A (en) * 2022-06-24 2022-11-08 合众新能源汽车有限公司 Vehicle CAN network testing method and system, electronic device and storage medium
CN115964306A (en) * 2023-03-16 2023-04-14 杭州新视窗信息技术有限公司 Automatic testing method, device and equipment for target system
CN116594914B (en) * 2023-07-17 2023-12-26 腾讯科技(深圳)有限公司 Method, device, equipment and storage medium for generating test data

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107608880A (en) * 2017-08-24 2018-01-19 郑州云海信息技术有限公司 A kind of automated testing method for being used for virtual platform based on data-driven
CN108459953A (en) * 2017-02-22 2018-08-28 北京京东尚科信息技术有限公司 test method and device
CN108694114A (en) * 2017-04-06 2018-10-23 广东亿迅科技有限公司 Method and its system for detaching test case, test script and test data
CN109299009A (en) * 2018-09-25 2019-02-01 金蝶软件(中国)有限公司 Data test method, apparatus, computer equipment and storage medium
CN109614313A (en) * 2018-10-25 2019-04-12 平安科技(深圳)有限公司 Automated testing method, device and computer readable storage medium
CN110321281A (en) * 2019-05-24 2019-10-11 中国工程物理研究院计算机应用研究所 Web test platform and test method based on mixing automated test frame
CN110851356A (en) * 2019-10-30 2020-02-28 河海大学 Selenium-based Web application automatic test framework and construction method and system thereof

Also Published As

Publication number Publication date
CN112597014A (en) 2021-04-02

Similar Documents

Publication Publication Date Title
CN112597014B (en) Automatic test method and device based on data driving, medium and electronic equipment
CN107341098B (en) Software performance testing method, platform, equipment and storage medium
CN110309071B (en) Test code generation method and module, and test method and system
US8701092B1 (en) System and method for testing applications
CN108763076 (en) Automatic software testing method, device, equipment and medium
CN110888818A (en) Test case configuration system and method, automatic test system and method
CN103164328 (en) Method, device and system for regression testing of service functions
CN101025686A (en) Automation test system and test script generating and operating method
CN106961362A (en) Automated testing method and mobile cloud test system
CN111522728A (en) Method for generating automatic test case, electronic device and readable storage medium
CN109560996B (en) Automatic testing system and method for terminal of Internet of things
CN112650688A (en) Automated regression testing method, associated device and computer program product
CN112380255A (en) Service processing method, device, equipment and storage medium
CN112367220B (en) Interface testing method and device, storage medium and electronic equipment
CN112433948A (en) Simulation test system and method based on network data analysis
CN105279092A (en) Software testing method and apparatus
CN111596899A (en) Database migration method, system, equipment and storage medium based on Java development
CN110750453A (en) HTML 5-based intelligent mobile terminal testing method, system, server and storage medium
CN112860587B (en) UI automatic test method and device
CN114297961A (en) Chip test case processing method and related device
CN116545891A (en) Automatic distribution network testing method based on intelligent equipment
CN116016270A (en) Switch test management method and device, electronic equipment and storage medium
CN116185826A (en) Test method, device, equipment and storage medium
CN115878448A (en) Database test method, distributed database and storage medium
CN114398283A (en) Automatic testing method and device for user interface, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant