CN112100086B - Software automation test method, device, equipment and computer readable storage medium

Software automation test method, device, equipment and computer readable storage medium

Info

Publication number
CN112100086B
CN112100086B
Authority
CN
China
Prior art keywords
test
data
test data
case
final
Prior art date
Legal status
Active
Application number
CN202011289568.1A
Other languages
Chinese (zh)
Other versions
CN112100086A (en)
Inventor
陈文建
Current Assignee
Shenzhen Fangduoduo Network Technologies Co ltd
Original Assignee
Shenzhen Fangduoduo Network Technologies Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Fangduoduo Network Technologies Co ltd
Priority to CN202011289568.1A
Publication of CN112100086A
Application granted
Publication of CN112100086B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3684 Test management for test design, e.g. generating new test cases
    • G06F 11/3688 Test management for test execution, e.g. scheduling of test suites

Abstract

Embodiments of the present invention relate to the technical field of testing, and disclose a software automation testing method, apparatus, device and computer readable storage medium, wherein the method comprises the following steps: generating a test script that contains no test data; storing the test data to obtain a test database; screening test data used for the test from the test database based on the test case environment to obtain first alternative data; screening test data used for the test from the first alternative data based on the test item to obtain second alternative data; screening test data matched with the test function name from the second alternative data based on the test function name to obtain the test data; and the test script reading the test data and executing the test case. In this manner, the embodiments of the present invention reduce the maintenance cost of test scripts and test data and improve testing efficiency.

Description

Software automation test method, device, equipment and computer readable storage medium
Technical Field
The embodiment of the invention relates to the technical field of testing, in particular to a software automation testing method, a device, equipment and a computer readable storage medium.
Background
Interface testing is a part of software testing, mainly used to check the interaction points between external systems and the system under test, as well as between internal systems; its key concerns include data exchange, transmission, and control-management processes. In software testing, a test case is a particular set of test input data (i.e., test data), operational or environmental settings, and expected results provided to the system under test in order to perform a test. A test script is written for the purpose of automated testing, and each test script must correspond to a test case.
In interface automation testing, test data is generally hardcoded (hardcode) into the test script, which makes the test script inflexible. Because each test case needs to correspond to one piece of test data, test cases that perform the same page operations and differ only in their test data produce a large amount of duplicated code in the test scripts. Every time a test case is added, a largely repetitive test script has to be written and its corresponding test data defined, so the maintenance cost of test scripts and test data is high and efficiency is low.
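To illustrate the problem, a hardcoded interface test might look like the following sketch (hypothetical Python code, not taken from the patent; the URL and credentials are invented): two cases perform the same operation and differ only in their data, yet each needs its own near-duplicate script.

import unittest

import requests  # assumed HTTP client for the interface test


class LoginTest(unittest.TestCase):
    def test_login_user_a(self):
        # Test data is hardcoded directly in the script body.
        resp = requests.post("https://example.com/api/login",  # hypothetical URL
                             json={"user": "a", "pwd": "111111"})
        self.assertEqual(resp.status_code, 200)

    def test_login_user_b(self):
        # Same page operation; only the test data differs, yet the whole
        # script must be duplicated.
        resp = requests.post("https://example.com/api/login",
                             json={"user": "b", "pwd": "222222"})
        self.assertEqual(resp.status_code, 200)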
Disclosure of Invention
In view of the foregoing problems, embodiments of the present invention provide a software automation testing method, apparatus, device and computer readable storage medium, so as to solve the prior-art problems that test scripts and test data are costly to maintain and inefficient to work with.
According to an aspect of an embodiment of the present invention, there is provided a software automation testing method, including:
generating a test script which does not contain test data;
storing the test data to obtain a test database;
screening test data used for the test from the test database based on the test case environment to obtain first alternative data;
screening out test data for the test from the first alternative data based on the test items to obtain second alternative data;
screening test data matched with the test function name from the second alternative data based on the test function name to obtain test data;
and the test script reads the test data and executes the test case.
In an optional manner, before the test script reads the test data and executes the test case, the method further includes:
judging whether the test case is set not to be executed;
if the test case is set not to be executed, the test case is not executed;
and if the test case is set to be executed, executing the test script to read the test data and executing the test case.
In an optional manner, if the test case is set to be executed, the step of executing the test script to read the test data and execute the test case includes:
if the test case is set to be executed, judging whether the type of the test data is a dependent type or a depended type;
if the type of the test data is dependent, reading the data on which the test data depends from a global variable, then executing the test script to read the test data, and executing the test case;
and if the type of the test data is a depended type, executing the test script to read the test data, executing a test case, and writing data obtained after the test case is executed into the global variable.
In an optional manner, the judging whether the type of the test data is dependent or depended comprises the following steps:
if the special identifier exists in the test data, judging the type of the test data to be dependent;
and if the dependency relationship field in the test data has the associated parameters, judging the type of the test data to be a depended type.
In an optional manner, the method further comprises:
judging whether the test service normally runs and whether a machine is down;
and if the test service is abnormally operated or the machine is down, setting the test case not to be executed.
In an optional manner, the screening out test data for this test from the test database based on the test case environment to obtain first candidate data includes:
determining a test case environment;
searching data of which the environment field is consistent with the environment of the test case in the test database;
determining the searched data as the first alternative data;
the screening out of test data used for the test from the first candidate data based on the test items to obtain second candidate data includes:
determining a test item;
searching data of which the item fields are consistent with the test items in the first alternative data;
determining the searched data as the second alternative data;
the screening out of test data matched with the test function name from the second alternative data based on the test function name to obtain test data includes:
searching data with function fields consistent with the test function name in the second alternative data;
and determining the searched data as the test data.
In an optional manner, the storing the test data to obtain a test database includes:
dividing the test data into first test data and second test data; wherein the first test data includes core data and the second test data includes drive data;
and storing the first test data according to a JSON format, and storing the second test data according to an Excel format to obtain a test database.
According to another aspect of the embodiments of the present invention, there is provided a software automation test apparatus, including:
the generating module is used for generating a test script which does not contain test data;
the storage module is used for storing the test data to obtain a test database;
the first screening module is used for screening the test data for the test from the test database based on the test case environment to obtain first alternative data;
the second screening module is used for screening the test data for the test from the first candidate data based on the test items to obtain second candidate data;
the third screening module is used for screening out test data matched with the test function name from the second alternative data based on the test function name to obtain test data;
and the test module is used for reading the test data by the test script and executing the test case.
According to another aspect of the embodiments of the present invention, there is provided an electronic device including:
the system comprises a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface complete mutual communication through the communication bus;
the memory is used for storing at least one executable instruction which causes the processor to execute the operation of the software automation test method.
According to another aspect of the embodiments of the present invention, there is provided a computer-readable storage medium, wherein at least one executable instruction is stored in the storage medium, and when the executable instruction is executed on an electronic device, the electronic device executes the operation of the software automation testing method as described above.
According to the embodiments of the present invention, the test script and the test data are separated, and the test data required by the test script is screened out through three levels of screening before the test case is executed. No test data needs to be written in when the test script is defined; only a concise test script needs to be defined, which avoids large-scale duplication of test scripts, and afterwards only the test data corresponding to the test script needs to be maintained. A test script can thus be driven by one or more pieces of test data, which improves flexibility, reduces the maintenance cost of test scripts and test data, and improves testing efficiency.
The foregoing description is only an overview of the technical solutions of the embodiments of the present invention. In order that the technical means of the embodiments may be more clearly understood and implemented according to the content of this description, and in order to make the above and other objects, features and advantages of the embodiments more readily apparent, the detailed description of the present invention is provided below.
Drawings
The drawings are only for purposes of illustrating embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
FIG. 1 is a flow chart illustrating a method for automated testing of software provided by an embodiment of the invention;
FIG. 2 is a flow chart illustrating a method for automated testing of software according to another embodiment of the present invention;
FIG. 3 is a flow chart illustrating a method for automated testing of software according to another embodiment of the present invention;
FIG. 4 is a schematic structural diagram of a software automation test device provided by the embodiment of the invention;
fig. 5 shows a schematic structural diagram of an electronic device provided in an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the invention are shown in the drawings, it should be understood that the invention can be embodied in various forms and should not be limited to the embodiments set forth herein.
Fig. 1 shows a flowchart of a software automation testing method provided by an embodiment of the present invention. The method is executed by a device that needs to perform software automation testing, such as a mobile terminal (e.g., a mobile phone or tablet computer) or a computing device (e.g., a computer or server). The embodiment of the present invention is directed at software automation testing, such as API (Application Programming Interface) testing. Interface testing is a part of software testing, mainly used to check the interaction points between external systems and the system under test, as well as between internal systems; its key concerns include data exchange, transmission, and control-management processes. Of course, the present invention can also be used for UI (User Interface) tests, unit tests, and the like.
As shown in fig. 1, the method comprises the steps of:
step 110: generating a test script which does not contain test data;
In this step, the test script is separated from the test data. One way to separate them is to use a variable to replace each position in the test script where test data needs to be input, so that the separated test script contains no test data. The test data is stored separately; when a test case needs to be executed, the test script screens the corresponding test data out of the stored test data and assigns it to the corresponding variables, after which the test case can be executed.
Of course, the test script and the test data may also be separated in other manners, for example by using a specific identifier instead of a variable to mark each position in the test script where test data needs to be input; the present invention is not limited in this respect.
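A minimal sketch of such a separated test script is given below (hypothetical Python code; the structure and field names are assumptions for illustration, not the patent's actual implementation). The script references only variables; the screened test data is assigned to them at run time.

import unittest

import requests  # assumed HTTP client for the interface test


class ATest(unittest.TestCase):
    # 'case' is assigned at run time from the screened test data; the
    # script itself contains no literal test data, only this variable.
    case = None

    def test_x(self):
        case = self.case
        resp = requests.request(method=case["method"],    # e.g. "POST"
                                url=case["url"],           # interface address
                                headers=case["headers"],   # request header
                                json=case["params"])       # request parameters
        self.assertEqual(resp.status_code, case["expected_status"])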
Step 120: storing the test data to obtain a test database;
and separating the test data from the test script, and storing the separated test data. In some embodiments, the test data may be divided into first test data and second test data; the first test data comprises core data, and the second test data comprises drive data. And then storing the first test data according to a JSON format, and storing the second test data according to an Excel format to obtain a test database. The core data includes data such as a test item request header, test item request parameters, a test item request method, and a test item interface address. The above-mentioned mode of dividing the test data according to the type and storing respectively can reduce the data repetition.
The attributes or fields of the test data stored in the test database at least comprise:
1. test case environment: namely the test case environment suitable for the test data; each piece of test data is applicable to a variety of test case environments. Test case environments include, but are not limited to:
Development environment (dev): a server dedicated to development; its configuration can be relatively loose, and all error reporting is turned on for convenient development and debugging.
Test environment (test): typically a clone of the production environment configuration; a program that does not work properly in the test environment must not be released to the production machines.
Grayscale environment (pre): accessible to external users, but with a relatively low server configuration; otherwise the same as production. The grayscale environment is sometimes referred to as the pre environment.
Production environment (prod): the environment that formally provides services to the outside; error reporting is generally turned off and error logging turned on.
2. Test item: namely the test item to which the test data is applicable; each piece of test data is applicable to a plurality of test items;
3. Test function name: each piece of test data may be applicable to multiple test functions, such as the functions test_x and test_y.
4. Test module name: each piece of test data may be applicable to multiple test modules. Test modules typically exist in the form of classes, such as the classes ATest and BTest.
The attributes or fields of each piece of test data are defined and stored together with the data, and the test data is later screened based on these attributes or fields before the test case is executed.
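As an illustration of this storage scheme and of the fields above, here is a minimal sketch (assuming the openpyxl library for the Excel file; all field names and values are illustrative assumptions).

import json

from openpyxl import Workbook  # assumed Excel library

# First test data (core data), stored in JSON format.
core_data = {
    "login_case": {
        "headers": {"Content-Type": "application/json"},  # request header
        "params": {"user": "tester", "pwd": "123456"},    # request parameters
        "method": "POST",                                 # request method
        "url": "https://example.com/api/login",           # hypothetical interface address
    }
}
with open("core_data.json", "w", encoding="utf-8") as f:
    json.dump(core_data, f, ensure_ascii=False, indent=2)

# Second test data (drive data), stored in Excel format: one row per piece
# of test data, carrying the screening fields described above.
wb = Workbook()
ws = wb.active
ws.append(["env", "project", "module", "function", "case_id"])
ws.append(["test", "shop", "ATest", "test_x", "login_case"])
wb.save("drive_data.xlsx")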
Through steps 110 and 120, secondary packaging of DDT (Data-Driven Tests) based on a unit testing framework is realized. Specifically, for the Python language, the unit testing framework is unittest, Python's built-in unit testing module, whose structure is similar to JUnit. Unittest provides a complete test structure, supports automated test execution, organizes test case sets, offers rich assertion methods, and finally generates a test report. DDT allows one test case to be run and reported as multiple test cases by using different pieces of test data.
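As an illustration of such data driving, a minimal unittest + ddt sketch follows (assuming the third-party ddt package is installed; the test data and the stand-in logic are invented for illustration).

import unittest

from ddt import ddt, data  # third-party "ddt" package


TEST_DATA = [  # e.g. the test data screened out for this function
    {"user": "a", "pwd": "111111", "expected": "ok"},
    {"user": "b", "pwd": "wrong", "expected": "fail"},
]


@ddt
class ATest(unittest.TestCase):
    @data(*TEST_DATA)
    def test_x(self, case):
        # The one test function runs once per piece of test data and is
        # displayed as multiple test cases in the report.
        result = "ok" if case["pwd"] == "111111" else "fail"  # stand-in for the real call
        self.assertEqual(result, case["expected"])


if __name__ == "__main__":
    unittest.main()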
Three screenings of the test data are then required: first environment-based screening, then item-based screening, and finally function-based screening. Through these three screenings, the test case environment to be executed can be controlled uniformly, a test item can be specified, or a particular case function can be executed. In addition, when the number of test cases or the amount of test data is large, executing the test cases via the three screenings means that data is read from the test database only once, and for a given test case only the specific test data corresponding to it needs to be screened, which improves test performance. Each screening is further described below.
Step 130: screening test data used for the test from the test database based on the test case environment to obtain first alternative data;
In step 130, environment-based screening is performed. The environment can be specified by a tester: the tester inputs the test case environment expected to be executed, and the test software screens out all the test data under that environment. This can be implemented as follows: determining the test case environment; searching the test database for data whose environment field is consistent with the test case environment; and determining the found data as the first alternative data. The first alternative data is all the test data under the currently determined test case environment.
Step 140: screening out test data for the test from the first alternative data based on the test items to obtain second alternative data;
in step 140, item-based screening is performed. After all the test data in the currently determined test case environment are obtained in step 130, the test data of the specified test item needs to be further screened out from the test data. The project may also be specified by a tester who enters a test project desired to be executed, and the test software screens all test data under the project according to the test project. The method can be specifically realized by the following steps: determining a test item; searching data of which the item fields are consistent with the test items in the first alternative data; and determining the searched data as second alternative data.
Step 150: screening out test data matched with the test function name from the second alternative data based on the test function name to obtain test data;
in step 150, function-based screening is performed. After all the test data under the currently determined test item are obtained in step 140, the test data meeting the test function of the current test script needs to be further screened from the test data. The test script is deterministic, and the test function is deterministic. The method can be specifically realized by the following steps: searching data with the function field consistent with the test function name in the second alternative data; and determining the searched data as test data.
For example, if the test function names in the test script are test_x and test_y, then data whose function field is consistent with test_x is searched out of the second alternative data as the test data of test_x, and data whose function field is consistent with test_y is searched out as the test data of test_y.
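Putting steps 130 to 150 together, the three-level screening can be sketched as follows (a minimal sketch; the record field names "env", "project" and "function" are illustrative assumptions, not the patent's exact schema).

from typing import Dict, List


def screen_test_data(db: List[Dict], env: str,
                     project: str, function: str) -> List[Dict]:
    # Screening 1: environment field -> first alternative data.
    first = [rec for rec in db if rec["env"] == env]
    # Screening 2: item/project field -> second alternative data.
    second = [rec for rec in first if rec["project"] == project]
    # Screening 3: function field -> the test data for this script.
    return [rec for rec in second if rec["function"] == function]


db = [
    {"env": "test", "project": "shop", "function": "test_x", "params": {"a": 1}},
    {"env": "prod", "project": "shop", "function": "test_x", "params": {"a": 2}},
]
print(screen_test_data(db, env="test", project="shop", function="test_x"))
# -> only the record whose env, project and function all match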
In some embodiments, the following module screening steps may also be performed before step 150:
and screening out the test data matched with the name of the test module from the second candidate data based on the name of the test module to obtain third candidate data. Step 150 needs to screen out the test data matching the name of the test function from the third candidate data based on the name of the test function, so as to obtain the test data. The sub-module screening can facilitate the test personnel to maintain the test script and the test data of the corresponding test case.
Step 160: the test script reads the test data and executes the test case.
After the final test data is obtained, it is assigned to the corresponding variables in the test script and the test case is executed, so that the test case is executed in a data-driven-script manner.
According to the embodiments of the present invention, the test script and the test data are separated, and the test data required by the test script is screened out through three levels of screening before the test case is executed. No test data needs to be written in when the test script is defined; only a concise test script needs to be defined, which avoids large-scale duplication of test scripts, and afterwards only the test data corresponding to the test script needs to be maintained. A test script can thus be driven by one or more pieces of test data, which improves flexibility, reduces the maintenance cost of test scripts and test data, and improves testing efficiency.
Fig. 2 shows a flowchart of a software automation testing method according to another embodiment of the present invention. The method is executed by a device that needs to perform software automation testing, such as a mobile terminal (e.g., a mobile phone or tablet computer) or a computing device (e.g., a computer or server). The embodiment of the present invention is directed at software automation testing, such as API (Application Programming Interface) testing. Interface testing is a part of software testing, mainly used to check the interaction points between external systems and the system under test, as well as between internal systems; its key concerns include data exchange, transmission, and control-management processes. Of course, the present invention can also be used for UI (User Interface) tests, unit tests, and the like.
As shown in fig. 2, the method comprises the steps of:
step 210: generating a test script which does not contain test data;
step 220: storing the test data to obtain a test database;
step 230: screening test data used for the test from a test database based on the test case environment to obtain first alternative data;
step 240: screening out test data for the test from the first alternative data based on the test items to obtain second alternative data;
step 250: screening out test data matched with the test function name from the second alternative data based on the test function name to obtain test data;
the specific implementation of steps 210 and 250 can refer to steps 110 and 150 in the above-mentioned embodiment, which is not described herein again.
Step 260: judging whether the test case is set not to be executed; if yes, go to step 270; otherwise, go to step 280;
the test case can be set to be executed by default, and when the test service is hung or the machine is down, the test case can be judged not to be executed smoothly, and then the test case can be set to be not executed.
Thus, the method further comprises the steps of:
judging whether the test service normally runs and whether a machine is down;
and if the test service is abnormally operated or the machine is down, setting the test case not to be executed.
Step 270: the test case is not executed;
and if the test case is set not to be executed, skipping to execute the test case.
The test case is set to be executed, the test script is executed to read the test data, and the test case is executed.
Step 280: the test script reads the test data and executes the test case.
By judging whether a test case is set not to be executed, the embodiment of the present invention directly skips test cases that cannot be executed, which prevents the system from crashing or falling into an infinite loop and improves testing efficiency.
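A minimal sketch of this skip logic, using unittest's built-in skip mechanism (the health-check URL and the service_alive helper are illustrative assumptions):

import unittest

import requests


def service_alive(url: str) -> bool:
    # Judge whether the test service runs normally and the machine is up.
    try:
        return requests.get(url, timeout=3).status_code == 200
    except requests.RequestException:
        return False  # abnormal operation or machine down


SERVICE_OK = service_alive("https://example.com/health")  # hypothetical URL


class ATest(unittest.TestCase):
    @unittest.skipUnless(SERVICE_OK, "test service abnormal or machine down")
    def test_x(self):
        # Reached only when the test case is not set to "do not execute".
        self.assertTrue(True)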
Fig. 3 shows a flowchart of a software automation testing method according to another embodiment of the present invention. The method is executed by a device that needs to perform software automation testing, such as a mobile terminal (e.g., a mobile phone or tablet computer) or a computing device (e.g., a computer or server). The embodiment of the present invention is directed at software automation testing, such as API (Application Programming Interface) testing. Interface testing is a part of software testing, mainly used to check the interaction points between external systems and the system under test, as well as between internal systems; its key concerns include data exchange, transmission, and control-management processes. Of course, the present invention can also be used for UI (User Interface) tests, unit tests, and the like.
As shown in fig. 3, the method comprises the steps of:
step 310: generating a test script which does not contain test data;
step 320: storing the test data to obtain a test database;
step 330: screening test data used for the test from a test database based on the test case environment to obtain first alternative data;
step 340: screening out test data for the test from the first alternative data based on the test items to obtain second alternative data;
step 350: screening out test data matched with the test function name from the second alternative data based on the test function name to obtain test data;
step 360: judging whether the test case is set not to be executed; if yes, go to step 370; otherwise, go to step 380;
step 370: the test case is not executed;
the specific implementation of step 310-370 can refer to step 210-270 in the above embodiment, and will not be described herein again.
Step 380: judging whether the type of the test data is a dependent type or a depended type;
and when the test case can be normally executed, determining the next step according to the type of the test data used by the test case. Test data can be classified from dependencies into dependent and depended. For example, in the shopping software, in order submission and payment, the order must be submitted first to make payment, so the test data submitted in the order is dependent, and the test data paid in the order is dependent. Step 390: if the type of the test data is dependent, reading the data on which the test data depends from the global variable, and then executing step 400;
Dependent test data is executed on the basis of the test data it depends on, for example on data obtained after a previous test case was executed. For instance, paying for an order requires knowing the content of the order and whether a red packet or coupon was used, i.e., payment must be made based on the result of submitting the order. The execution result of the depended test data is written into the global variable; the dependent test data can read from the global variable the data obtained after the test data of the other test cases it depends on was executed, and the test case is then executed with that data as input.
If the type of the test data is depended, step 400 is executed, followed by step 410.
Step 400: the test script reads the test data and executes the test case;
step 410: writing data obtained after the test case is executed into a global variable;
Accordingly, for depended test data, the execution result needs to be written into the global variable so that it can be provided to the test data that depends on it.
Specifically, the determining whether the type of the test data is dependent or depended includes:
if the special identifier exists in the test data, judging the type of the test data to be dependent;
and if the dependency relationship field in the test data has the associated parameters, judging the type of the test data to be a depended type.
For example, if a piece of test data is dependent, an identifier is set in Excel to mark the data as dependent, and when the test case is executed, the variables related to that data directly use the dependency data in the global variable. If a piece of test data is depended, an association parameter is stored in a specific field in Excel; for example, in interface testing, an association parameter can be stored in the "interface association parameter" field in Excel, such as {"relatedData": [{"identifier": "identifier"}, {"fddId": "fddId"}]}. When it is judged that the interface association parameter field contains data, the test data is of the depended type. Judging the type of test data via the identifier and the association parameter stored in a specific field is simple and convenient, which improves efficiency.
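A minimal sketch of this dependency handling follows (the field names "flag", "relatedData" and "needs", and the stand-in execution result, are illustrative assumptions, not the patent's exact layout).

GLOBAL_VARS = {}  # global variable pool shared across test cases


def data_type(case: dict) -> str:
    if case.get("flag") == "DEPEND":  # special identifier present -> dependent
        return "dependent"
    if case.get("relatedData"):       # association parameters present -> depended
        return "depended"
    return "plain"


def run_case(case: dict) -> dict:
    kind = data_type(case)
    if kind == "dependent":
        # Read the data this case depends on from the global variables
        # before executing the test script.
        for key in case.get("needs", []):
            case["params"][key] = GLOBAL_VARS[key]
    result = {"orderId": "1001"}      # stand-in for actually executing the case
    if kind == "depended":
        # Write the execution result into the global variables for later
        # dependent cases (e.g. payment after order submission).
        GLOBAL_VARS.update(result)
    return result


# The order-submission data is depended; the payment data is dependent.
run_case({"relatedData": [{"identifier": "orderId"}], "params": {}})
run_case({"flag": "DEPEND", "needs": ["orderId"], "params": {}})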
By judging the type of the test data and, according to that type, reading data from or writing data into the global variable, the embodiment of the present invention enables test cases with dependency relationships to be executed smoothly, improving test efficiency.
Fig. 4 shows a schematic structural diagram of a software automation testing apparatus provided in an embodiment of the present invention. As shown in fig. 4, the apparatus 300 includes:
a generating module 310, configured to generate a test script that does not include test data;
the storage module 320 is used for storing the test data to obtain a test database;
the first screening module 330 is configured to screen test data used for the test from the test database based on a test case environment to obtain first candidate data;
the second screening module 340 is configured to screen out test data used for the test from the first candidate data based on the test item, so as to obtain second candidate data;
a third screening module 350, configured to screen, from the second candidate data, test data matched with the test function name based on the test function name, so as to obtain test data;
and the test module 360 is used for reading the test data by the test script and executing the test case.
In an optional manner, the apparatus 300 further includes a determining module 370, configured to:
judging whether the test case is set not to be executed;
if the test case is set not to be executed, the test case is not executed;
and if the test case is set to be executed, executing the test script to read the test data and executing the test case.
In an optional manner, the determining module 370 is further configured to:
if the test case is set to be executed, judging whether the type of the test data is a dependent type or a depended type;
if the type of the test data is dependent, reading the data on which the test data depends from a global variable, then executing the test script to read the test data, and executing the test case;
and if the type of the test data is a depended type, executing the test script to read the test data, executing a test case, and writing data obtained after the test case is executed into the global variable.
In an optional manner, the determining module 370 is further configured to:
if the special identifier exists in the test data, judging the type of the test data to be dependent;
and if the dependency relationship field in the test data has the associated parameters, judging the type of the test data to be a depended type.
In an optional manner, the determining module 370 is further configured to:
judging whether the test service normally runs and whether a machine is down;
and if the test service is abnormally operated or the machine is down, setting the test case not to be executed.
In an optional manner, the generating module 310 is further configured to:
replace the positions in the test script where test data needs to be input with variables;
the test module 360 is further configured to:
the test script reads the test data;
assigning the test data to the corresponding variable;
and executing the test case.
In an optional manner, the first screening module 330 is further configured to:
determining a test case environment;
searching data of which the environment field is consistent with the environment of the test case in the test database;
determining the searched data as the first alternative data;
the second screening module 340 is further configured to:
determining a test item;
searching data of which the item fields are consistent with the test items in the first alternative data;
determining the searched data as the second alternative data;
the third screening module 350 is further configured to:
searching data with function fields consistent with the test function name in the second alternative data;
and determining the searched data as the test data.
In an optional manner, the storage module 320 is further configured to:
dividing the test data into first test data and second test data; wherein the first test data includes core data and the second test data includes drive data;
and storing the first test data according to a JSON format, and storing the second test data according to an Excel format to obtain a test database.
According to the embodiments of the present invention, the test script and the test data are separated, and the test data required by the test script is screened out through three levels of screening before the test case is executed. No test data needs to be written in when the test script is defined; only a concise test script needs to be defined, which avoids large-scale duplication of test scripts, and afterwards only the test data corresponding to the test script needs to be maintained. A test script can thus be driven by one or more pieces of test data, which improves flexibility, reduces the maintenance cost of test scripts and test data, and improves testing efficiency.
Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present invention, and the specific embodiment of the present invention does not limit the specific implementation of the electronic device.
As shown in fig. 5, the electronic device may include: a processor (processor)402, a Communications Interface 404, a memory 406, and a Communications bus 408.
Wherein: the processor 402, communication interface 404, and memory 406 communicate with each other via a communication bus 408. A communication interface 404 for communicating with network elements of other devices, such as clients or other servers. The processor 402 is configured to execute the program 410, and may specifically execute the relevant steps in the above-described embodiment of the software automation testing method.
In particular, program 410 may include program code comprising computer-executable instructions.
The processor 402 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement embodiments of the present invention. The electronic device comprises one or more processors, which may be of the same type, such as one or more CPUs, or of different types, such as one or more CPUs and one or more ASICs.
And a memory 406 for storing a program 410. The memory 406 may comprise high-speed RAM memory and may also include non-volatile memory, such as at least one disk memory.
The embodiment of the invention provides a computer-readable storage medium, wherein at least one executable instruction is stored in the storage medium, and when the executable instruction runs on electronic equipment, the electronic equipment is enabled to execute the software automation testing method in any method embodiment.
The embodiment of the invention provides a software automatic testing device which is used for executing the software automatic testing method.
Embodiments of the present invention provide a computer program, where the computer program can be called by a processor to enable an electronic device to execute the software automation testing method in any of the above method embodiments.
Embodiments of the present invention provide a computer program product comprising a computer program stored on a computer-readable storage medium, the computer program comprising program instructions which, when run on a computer, cause the computer to perform the software automation testing method of any of the above-mentioned method embodiments.
The algorithms or displays presented herein are not inherently related to any particular computer, virtual system, or other apparatus. Various general purpose systems may also be used with the teachings herein. The required structure for constructing such a system will be apparent from the description above. In addition, embodiments of the present invention are not directed to any particular programming language. It is appreciated that a variety of programming languages may be used to implement the teachings of the present invention as described herein, and any descriptions of specific languages are provided above to disclose the best mode of the invention.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the embodiments of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the invention and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be interpreted as reflecting an intention that: that the invention as claimed requires more features than are expressly recited in each claim.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The usage of the words first, second and third, etcetera do not indicate any ordering. These words may be interpreted as names. The steps in the above embodiments should not be construed as limiting the order of execution unless specified otherwise.

Claims (8)

1. A software automation test method is characterized by comprising the following steps:
replacing the position of the test data needing to be input in the test script by using the variable, separating the test script from the test data, and generating the test script which does not contain the test data;
storing the test data to obtain a test database;
screening test data used for the test from the test database based on the test case environment to obtain first alternative data;
screening out test data for the test from the first alternative data based on the test items to obtain second alternative data;
screening out test data matched with the name of the test module from the second alternative data based on the name of the test module to obtain third alternative data; wherein the test modules exist in the form of classes;
screening out test data matched with the test function name from the third alternative data based on the test function name to obtain final test data;
the test script reads the final test data, assigns the final test data to a corresponding variable and executes a test case;
the test script reads the final test data, and before executing the test case, the method further comprises:
judging whether the test case is set not to be executed;
if the test case is set not to be executed, the test case is not executed;
if the test case is set to be executed, executing the test script to read the final test data and executing the test case;
if the test case is set to be executed, executing the test script to read the final test data, and executing the test case, wherein the step of executing the test case comprises the following steps:
if the test case is set to be executed, judging whether the type of the final test data is a dependent type or a depended type;
if the type of the final test data is dependent, reading data on which the final test data depends from a global variable, then executing the test script to read the final test data, and executing a test case;
and if the type of the final test data is a depended type, executing the test script to read the final test data, executing a test case, and writing data obtained after the test case is executed into the global variable.
2. The method of claim 1, wherein said determining whether the type of the final test data is dependent or depended comprises:
if the final test data has the special identifier, judging the type of the final test data to be dependent;
and if the dependency relationship field in the final test data has the associated parameters, judging that the type of the final test data is a depended type.
3. The method of claim 1, further comprising:
judging whether the test service normally runs and whether a machine is down;
and if the test service is abnormally operated or the machine is down, setting the test case not to be executed.
4. The method of claim 1,
the method for screening the test data for the test from the test database based on the test case environment to obtain first alternative data comprises the following steps:
determining a test case environment;
searching data of which the environment field is consistent with the environment of the test case in the test database;
determining the searched data as the first alternative data;
the screening out of test data used for the test from the first candidate data based on the test items to obtain second candidate data comprises the following steps:
determining a test item;
searching data of which the item fields are consistent with the test items in the first alternative data;
determining the searched data as the second alternative data;
the screening out of test data matched with the test function name from the third alternative data based on the test function name to obtain final test data comprises the following steps:
searching data with function fields consistent with the test function name in the third alternative data;
and determining the searched data as the final test data.
5. The method of claim 1, wherein said storing said test data to obtain a test database comprises:
dividing the test data into first test data and second test data; wherein the first test data includes core data and the second test data includes drive data;
and storing the first test data according to a JSON format, and storing the second test data according to an Excel format to obtain a test database.
6. An automated software testing apparatus, the apparatus comprising:
the generating module is used for replacing the position of the test data needing to be input in the test script by using the variable, separating the test script from the test data and generating the test script without the test data;
the storage module is used for storing the test data to obtain a test database;
the first screening module is used for screening the test data for the test from the test database based on the test case environment to obtain first alternative data;
the second screening module is used for screening the test data for the test from the first candidate data based on the test items to obtain second candidate data;
the third screening module is used for screening out the test data matched with the name of the test module from the second alternative data based on the name of the test module to obtain third alternative data; wherein the test modules exist in the form of classes;
the fourth screening module is used for screening out the test data matched with the test function name from the third alternative data based on the test function name to obtain final test data;
the test module is used for reading the final test data by the test script, assigning the final test data to a corresponding variable and executing a test case;
the device further comprises a judging module for:
judging whether the test case is set not to be executed;
if the test case is set not to be executed, the test case is not executed;
if the test case is set to be executed, executing the test script to read the final test data and executing the test case;
the judging module is further configured to:
if the test case is set to be executed, judging whether the type of the final test data is a dependent type or a depended type;
if the type of the final test data is dependent, reading data on which the final test data depends from a global variable, then executing the test script to read the final test data, and executing a test case;
and if the type of the final test data is a depended type, executing the test script to read the final test data, executing a test case, and writing data obtained after the test case is executed into the global variable.
7. An electronic device, comprising: the system comprises a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface complete mutual communication through the communication bus;
the memory is used for storing at least one executable instruction which causes the processor to execute the operation of the software automation test method according to any one of the claims 1 to 5.
8. A computer-readable storage medium having stored therein at least one executable instruction that, when executed on an electronic device, causes the electronic device to perform operations of the software automation testing method of any one of claims 1 to 5.
CN202011289568.1A 2020-11-17 2020-11-17 Software automation test method, device, equipment and computer readable storage medium Active CN112100086B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011289568.1A CN112100086B (en) 2020-11-17 2020-11-17 Software automation test method, device, equipment and computer readable storage medium


Publications (2)

Publication Number Publication Date
CN112100086A (en) 2020-12-18
CN112100086B (en) 2021-02-26

Family

ID=73784655

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011289568.1A Active CN112100086B (en) 2020-11-17 2020-11-17 Software automation test method, device, equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN112100086B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007060094A2 (en) * 2005-11-22 2007-05-31 International Business Machines Corporation Software application interfacing testing
CN102495799A (en) * 2011-12-02 2012-06-13 刘伟 Automatic test system and method of movable terminal
CN103377127A (en) * 2012-04-28 2013-10-30 阿里巴巴集团控股有限公司 Development testing system, testing method and device for webpage product
CN105138462A (en) * 2015-09-24 2015-12-09 武汉泰世达科技有限公司 Testing software integration frame and method for processing testing data
EP3508981A1 (en) * 2017-12-27 2019-07-10 Accenture Global Solutions Limited Touchless testing platform
CN111124892A (en) * 2019-12-04 2020-05-08 四川安迪科技实业有限公司 Automatic testing method and system based on command line
CN111427803A (en) * 2020-06-11 2020-07-17 平安国际智慧城市科技股份有限公司 Automated random test method, apparatus, computer device and medium

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070113282A1 (en) * 2005-11-17 2007-05-17 Ross Robert F Systems and methods for detecting and disabling malicious script code
US9166809B2 (en) * 2006-04-03 2015-10-20 Verizon Patent And Licensing Inc. Automated network testing
US7877732B2 (en) * 2006-11-29 2011-01-25 International Business Machines Corporation Efficient stress testing of a service oriented architecture based application
CN101571802B (en) * 2009-06-19 2012-05-09 北京航空航天大学 Visualization automatic generation method of embedded software test data and system thereof
US9563544B2 (en) * 2012-01-10 2017-02-07 Sap Se Framework for automated testing of mobile apps
CN102955739B (en) * 2012-11-21 2016-02-03 浪潮电子信息产业股份有限公司 A kind of method improving performance test script reuse rate
CN108694114A (en) * 2017-04-06 2018-10-23 广东亿迅科技有限公司 Method and its system for detaching test case, test script and test data
US10310967B1 (en) * 2017-11-17 2019-06-04 International Business Machines Corporation Regression testing of new software version and deployment
CN109614313A (en) * 2018-10-25 2019-04-12 平安科技(深圳)有限公司 Automated testing method, device and computer readable storage medium
CN110399309B (en) * 2019-08-02 2023-11-03 中国工商银行股份有限公司 Test data generation method and device

Also Published As

Publication number Publication date
CN112100086A (en) 2020-12-18

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant