CN114528198A - Software automation test method and device, electronic equipment and storage medium - Google Patents
- Publication number: CN114528198A (application CN202011321931.3A)
- Authority
- CN
- China
- Prior art keywords: test, software, data, testing, label
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Preventing errors by testing or debugging software
- G06F11/3668—Software testing
- G06F11/3672—Test management
- G06F11/3688—Test management for test execution, e.g. scheduling of test suites
Abstract
The application discloses a software automation test method and device, electronic equipment and a storage medium. The method comprises the following steps: acquiring a test label of a software test; acquiring a software test script according to the test label, and generating a software automation test case applied to the test label; acquiring, according to the test label, test data from the database corresponding to the test label; and testing the test data according to the software automation test case. In the embodiments of the application, the software test script is loaded into the software automation test case and the test data of the database corresponding to the software under test are called, realizing automated testing of different software functions. The method places low technical demands on testers, is convenient to use and maintain, can be applied directly to the testing of securities-dealer software in specific scenarios, and has a high reuse rate.
Description
Technical Field
The present disclosure relates generally to the field of software testing technologies, and in particular, to a software automation testing method and apparatus, an electronic device, and a storage medium.
Background
In the prior art, securities-dealer software products are tested and brought online through manual testing. This approach easily leads to high labor costs, some test scenarios cannot be realized at all, and testing efficiency is low. With the rapid development of securities-dealer software technology, the number of test projects for such products keeps growing, and manual testing can no longer satisfy the software testing needs of securities firms.
Existing tools also cannot be applied directly to the case flow, their reuse rate is low, and their applicability to software test scenarios is insufficient.
Therefore, a more efficient software automation testing method is needed to meet the requirements of securities-dealer software testing.
Disclosure of Invention
In view of the above defects or shortcomings in the prior art, a software automated testing method, apparatus, electronic device and storage medium are provided, which can meet the requirements of automated testing of existing securities-dealer software, reduce the workload of testers, and improve the quality and efficiency of securities-dealer software product testing.
Based on one aspect of the embodiments of the present invention, an embodiment of the present application provides a software automation testing method, including:
acquiring a test label of a software test, wherein the test label is used for defining a test item of the tested software;
acquiring a software test script according to the test label, and generating a software automation test case applied to the test label, wherein the software test script is a function for testing test data of the tested software;
according to the test label, obtaining test data in a database corresponding to the test label;
and testing the test data according to the software automation test case.
In one embodiment, before the obtaining the software test script, the method further includes:
setting a test label of a software test, wherein the test label is used for defining a test item of the tested software;
and acquiring a software test script according to the test label.
In one embodiment, the obtaining the test label of the software test includes:
acquiring software functions to be tested during software automated testing;
acquiring software testing function project information according to the software function to be tested;
acquiring a software test item catalog according to the software test function item information;
and associating the software testing item catalog with the corresponding software testing script and testing data according to the software testing item catalog.
In one embodiment, the obtaining the software test script and the generating the software automation test case applied to the test item of the tested software comprises:
acquiring one or more software test scripts according to the test labels;
setting test points of a software automation test case according to one or more acquired software test scripts, wherein the test points are used for testing acquired test data of a software function in a database;
setting a test suite of the software automation test case according to the test points, wherein the test suite is used for testing the acquired test data of all the software functions in one database;
and generating a software automation test case according to one or more test points and/or one or more test suites.
In one embodiment, the obtaining, according to the test label, the test data in the database corresponding to the test label includes:
acquiring, according to the test label, the test data of the database to be called;
setting the test data of one software function in one database as first data;
merging a plurality of first data of one database into second data;
and using the first data and/or the second data as the test data of the software automation test.
In one embodiment, the testing the test data according to the software automation test case comprises:
according to the first data, the test point completes the test of the first data;
and according to the second data, the test suite completes the test of the second data.
In one embodiment, the method further comprises:
generating a test report according to the test result of the test data;
and debugging the unfinished software test script or the tested software according to the test report until the test of all test items in the test label is finished.
In one embodiment, the method further comprises:
saving the software test script or the software under test after debugging;
updating a software test script of the automatic test case or test data of the tested software;
and testing according to the updated software test script or test data.
The embodiment of the invention also discloses a software automatic testing device, which comprises:
the test label module is used for generating a test label for software test according to a test item of the tested software;
the test case module is used for acquiring a software test script according to the test label and generating a software automation test case applied to the test label, wherein the software test script is a function for testing the test data of the tested software;
and the test data module is used for acquiring the test data in the database corresponding to the test label according to the test label.
The embodiment of the invention also discloses electronic equipment, which comprises one or more processors and a memory, wherein the memory is used for storing one or more programs; when the one or more programs are executed by the processor, the processor is enabled to realize the software automatic testing method provided by the embodiments of the invention.
The embodiment of the invention also discloses a computer readable storage medium storing a computer program, and the computer program realizes the automatic software testing method provided by the embodiments of the invention when executed.
In the embodiments of the application, software automation testing is realized by acquiring a test label of the software test; acquiring a software test script according to the test label and generating a software automation test case applied to the test label; acquiring the test data from the database corresponding to the test label; and testing the test data according to the software automation test case. The method places low technical requirements on testers, is convenient to use and maintain, provides a built-in test report function, can be applied directly to the testing of securities-dealer software in specific scenarios, and has a high reuse rate.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is a diagram illustrating an exemplary implementation of a software automation test methodology;
FIG. 2 is a flow diagram illustrating a method for automated testing of software, according to one embodiment;
FIG. 3 is a flowchart illustrating a software automated testing method according to another embodiment;
FIG. 4 is a flowchart illustrating a method for automated testing of software in accordance with yet another embodiment;
FIG. 5 is a block diagram of a software automation test apparatus in one embodiment;
FIG. 6 is an internal structure diagram of an electronic device in one embodiment;
FIG. 7 is an internal structure diagram of an electronic device in another embodiment.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the present invention are shown in the drawings.
It should be noted that, in the present application, the embodiments and features of the embodiments may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
The software automatic testing method provided by the application can be applied to the application environment shown in fig. 1. The software automatic testing method is applied to the software automatic testing device. The software automated testing device can be configured in the terminal 102 or the server 104, or partially configured in the terminal 102 and partially configured in the server 104, and the terminal 102 and the server 104 interact to complete the software automated testing method.
Wherein the terminal 102 and the server 104 can communicate through a network.
The terminal 102 may be, but is not limited to, a personal computer, notebook computer, smart phone, tablet computer or portable wearable device; the terminal 102 must be able to acquire software test scripts. The server 104 may be implemented as an independent server or as a server cluster composed of multiple servers.
In one embodiment, as shown in FIG. 2, a software automated testing method is provided. The embodiment is mainly illustrated by applying the method to the terminal 102 in fig. 1.
For software automation testing, the prior art generally uses one of two technical schemes: the first adopts the Requests library framework, where Requests is an HTTP library written in Python and released under the Apache2 open-source license; the second adopts the DatabaseLibrary, likewise written in Python. Neither solves the following problems. First, they cannot be applied directly to the case flow and their reuse rate is low: because of the data differences produced by securities-dealer projects, the methods provided by these base libraries cannot be used directly in most cases and require an extra round of keyword encapsulation or library customization before use. Second, some scenario functions are not effectively supported: Python's base libraries cannot effectively support certain specific scenarios, for example rounding a value up to a given number of decimal places, so the case cannot be realized during result verification without adding keywords or extending a custom library.
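To illustrate the kind of keyword encapsulation the prior art forces on testers, the sketch below shows one way such a gap could be filled (the function name and signature are illustrative, not taken from the patent): a Python helper for rounding a value up to a fixed number of decimal places, which a custom keyword library could expose to the test framework.

```python
from decimal import Decimal, ROUND_UP

def round_up(value, places=2):
    """Round value up (away from zero) to the given number of
    decimal places, e.g. for verifying monetary test results."""
    quantum = Decimal(1).scaleb(-places)  # places=2 -> quantize to 0.01
    return str(Decimal(str(value)).quantize(quantum, rounding=ROUND_UP))
```

For example, `round_up(3.141, 2)` yields `"3.15"`, the round-toward-infinity behavior that, per the passage above, the base libraries do not provide out of the box.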
Accordingly, the present application provides a new method for software automation testing, as illustrated in fig. 2, the method comprising:
Specifically, in a software automated testing task, the functions of the software to be tested must be selected first; once the functions are selected, a test label for the software test is generated. Concretely, the test label determines the specific test items of the software under test.
In one embodiment, the obtaining the test label of the software test includes:
acquiring software functions to be tested during software automated testing;
specifically, in a software automation test project, it is first necessary to know the function of the software test, for example, in the automation test of dealer software, the project to be tested may include: access items, order items, account items, asset items, etc. in each item there are functions that need to be tested, such as functions included in an access item may include: the system comprises a monthly income and income statistic function, a monthly profit rate statistic function, a monthly payment statistic function, a monthly income statistic function and the like, wherein each project or each function of each project can be stored in one database or a plurality of databases, and certain test data can be collected to test the functional condition when the specific functions are tested.
And acquiring software testing function project information according to the software function to be tested.
Specifically, the test item information can be summarized according to the test function.
And acquiring a software test item catalog according to the software test function item information.
Specifically, on the test label, the test item information of the software test is represented in the form of a test catalog.
And associating the software testing item catalog with the corresponding software testing script and testing data according to the software testing item catalog.
Specifically, when the test function of each piece of software under test is tested, the corresponding software test script and test data need to be associated with the item catalog, so that the software test script and the test data can be obtained and the software test carried out.
The method works as follows: a Python script file is written that defines functions for the create, delete, update and query operations of the database service, and this script file is then added to the software automation test case as a Library. The software automation test case can directly invoke user keywords that call the written functions to connect to the specified database, and by passing in the corresponding parameter values it can perform create, delete, update, query and other business operations on the connected database. If a third-party script file needs to be imported, the third-party library code module to be called must be imported first; then the functions for adding, deleting, modifying and querying the database are defined, for example a function for connecting to a specified database, the SQL statements, and a method that reports whether execution succeeded.
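A minimal sketch of such a Python script file (class, method and file names are our own illustration, not the patent's): CRUD helpers that a Robot Framework test case could import as a Library. SQLite is used here only to keep the sketch self-contained; the patent does not name a database engine.

```python
import sqlite3

class DatabaseKeywords:
    """Illustrative keyword library: connect to the database of the
    software under test and run CRUD operations on it."""

    def connect_to_database(self, db_path):
        # Connect to the specified database of the software under test.
        self.conn = sqlite3.connect(db_path)

    def execute_sql(self, sql, params=()):
        # Run an INSERT/UPDATE/DELETE/DDL statement; report success.
        try:
            with self.conn:  # commits on success, rolls back on error
                self.conn.execute(sql, params)
            return True
        except sqlite3.Error:
            return False

    def query_sql(self, sql, params=()):
        # Run a SELECT statement and return the rows as test data.
        return self.conn.execute(sql, params).fetchall()
```

In a Robot Framework case, such a file would be pulled in with `Library    DatabaseKeywords.py`, making each method callable as a user keyword.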
Specifically, as shown in fig. 3, the obtaining of the software test script and the generating of the software automation test case applied to the test item of the software to be tested include the following steps:
The software test scripts required by the test functions of the tested software can be obtained through the software test item catalog of the test labels, and one or more software test scripts are specifically called according to the requirements of the test functions of the tested software.
A test point is intended to perform the test of one software function in one database. In actual execution there are situations where a software function requires only one software test script and only one database call to complete the whole software test; in that case the test point completes the test of one test datum in the database. Similarly, if multiple test data are called from one database, each test datum requires its own test point, and each test point consists of at least one, and possibly several, software test scripts. Since one test datum typically drives the test of one software function, even when a single test datum cannot complete a software function test on its own, each test datum still needs to invoke a test point, and a test point necessarily includes at least one software test script.
Step 303, setting a test suite of the software automation test case according to the test points, wherein the test suite is used for testing the acquired test data of all software functions in one database.
Specifically, in actual implementation there may be cases where, for example, a software function of the software under test needs to retrieve test data containing many entries from one database. If each test datum were tested with its own test point, testing efficiency would be very low; testing all the test data of one database at once with a test suite therefore greatly improves efficiency. The test suite contains all the test points corresponding to the data under test called from that database.
Step 304, generating a software automation test case according to one or more test points and/or one or more test suites.
Specifically, the test data of the database that the test function of the software under test needs to call may fall into the following situations:
1. Only one test datum of one database needs to be called; completing the automated test requires only one test point, and the software automation test case contains exactly one test point.
2. Several databases need to be called, but each database supplies only one test datum; completing the automated test requires only several test points, and the test case contains only test points.
3. One database needs to be called, but several test data must be called from it; completing the automated test requires only one test suite, and the test case contains exactly one test suite.
4. Several databases need to be called, and each database must supply several test data; completing the automated test requires several test suites, and the test case contains only test suites.
5. Several databases need to be called, some supplying one test datum and some supplying several; completing the automated test requires both test suites and test points, and the test case contains several test suites and several test points.
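The five situations above reduce to one rule: a single datum called from a database becomes a test point, several data from the same database become a test suite. A hedged sketch of assembling a case this way (all names are ours, not the patent's):

```python
def build_test_case(db_calls):
    """db_calls maps a database name to the list of test data called
    from it. One datum -> a test point; several data -> a test suite
    covering all of that database's data (situations 1-5 above)."""
    points, suites = [], []
    for db, data in db_calls.items():
        if len(data) == 1:
            points.append((db, data[0]))     # single datum -> test point
        else:
            suites.append((db, list(data)))  # merged data -> test suite
    return {"test_points": points, "test_suites": suites}
```

For example, one datum from an order database and three data from an asset database (situation 5) would yield one test point and one test suite.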
All the test data are stored in databases. In particular, these databases generally store the test data of the corresponding software according to specific function types, for example: the deposit-and-withdrawal database stores the corresponding deposit-and-withdrawal test data; the order database stores the corresponding order test data; the account database stores the corresponding account test data; the asset database stores the corresponding asset test data; and so on.
Specifically, as shown in fig. 4, the obtaining of the test data in the database corresponding to the test label according to the test label includes:
After the test label is set, the test data of the database that the test function of the software under test needs to call can be obtained through the test label list; the specific database to be called, and the specific test data within it, are determined by the requirements of the test function of the software under test.
Specifically, depending on the test items of the software under test, one test datum may be called from one database, several test data from one database, one test datum from each of several databases, or several test data from each of several databases. Therefore, to improve efficiency, and to correspond to a test point in the software automation test case, the test data of one software function in one database are set as first data; the purpose of the first data is to be assigned to one test point for testing.
Specifically, when several test data are acquired from one database for testing, all the test data extracted from that database are merged into second data in order to improve testing efficiency. The second data comprise all the first data called from that database and correspond to a test suite in the software automation test case; for all the called test data of one database, the corresponding tests can be completed simply by sending the second data to the corresponding test suite as a whole. In particular, if only one test datum is called from a database to form first data, that first data need not be merged into second data, because in this case it is more efficient to send the first data directly to the corresponding test point for testing.
Step 404, using the first data and/or the second data as the test data of the software automation test.
Specifically, an actual software automation test may require first data, second data, or both. The test data are therefore acquired according to the specific software automation test items, and the first data and/or second data are generated according to the characteristics of those test data; in particular, the test data of the software automation test include at least one of the first data and the second data.
And 204, testing the test data according to the software automation test case.
Specifically, according to the first data, the test point completes the test of the first data; according to the second data, the test suite completes the test of the second data. Testing with first data, second data, test points and test suites can greatly improve efficiency: the test data of one software function of one database are tested directly by a test point, while several first data of one database are merged into second data and tested directly by a test suite, reducing the number of test runs and test steps. Moreover, this bundled mode of testing reduces test time: if every first datum had to be tested by its own test point, a database with many first data would need many test runs, whereas once the second data are formed, a single pass of the test suite completes the test of all the first data of the whole database.
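A hedged sketch of this dispatch step (function names and data shapes are our own illustration): first data go one by one to a test-point function, while second data, the merged contents of one database, go to a test-suite function in a single pass.

```python
def run_tests(first_data, second_data, test_point, test_suite):
    """first_data: list of (database, datum) pairs, one per test point.
    second_data: list of (database, merged data) pairs, one per suite.
    Each suite call tests all of a database's called data at once."""
    results = [test_point(db, datum) for db, datum in first_data]
    results += [test_suite(db, data) for db, data in second_data]
    return results
```

The saving described above falls out of the shape of the call: a database contributing n merged data costs one `test_suite` call rather than n `test_point` calls.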
In one embodiment, the software automation test method of the present application further includes:
generating a test report according to the test result of the test data;
and debugging the unfinished software test script or the tested software according to the test report until the test of all test items in the test label is finished.
Specifically, the test result of the automated test records which item tests passed and which failed, and, for the failed items, whether analysis shows the problem to lie in the software test script or in the software under test. After the problems in the software automation test report are eliminated, the corrected software test script and/or the program of the software under test must be saved; if problems occur again, debugging is repeated until the test items of all the software under test have been tested.
In one embodiment, the software automation test method of the present application further includes:
saving the software test script or the software under test after debugging;
updating a software test script of the automatic test case or test data of the tested software;
and testing according to the updated software test script or test data.
Specifically, after the failed software test script or the software under test has been debugged, the debugged script or software needs to be updated and saved, and the test is run again with the test data corresponding to the updated script or software. If the updated software test script or software under test passes the test, it is saved; if not, debugging continues. After the tests of all items are completed, the final software test scripts are saved in the software automation test case, and the test data of the software under test that has completed debugging are saved in the database.
It should be particularly noted that the method for software automation test described in the present application is executed under a Robot Framework.
The functional automation test framework Robot Framework is written in Python, has good extensibility, supports keyword-driven testing, can test several types of clients or interfaces at the same time, and supports distributed test execution.
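For illustration only (file name, keyword names and the label value are ours, not the patent's), a Robot Framework case built on such a Python keyword library might be laid out as follows:

```robotframework
*** Settings ***
Library    DatabaseKeywords.py    # the CRUD script file added as a Library

*** Test Cases ***
Monthly Income Statistics
    [Tags]    deposit-withdrawal
    Connect To Database    broker_test.db
    ${rows}=    Query Sql    SELECT amount FROM income WHERE month = '2020-11'
    Should Not Be Empty    ${rows}
```

Running `robot --include deposit-withdrawal tests/` would then select cases by tag, which is one plausible realization of the label-driven test selection described above.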
It should be understood that although the steps in the flowcharts of figs. 2-4 are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise herein, the steps are not bound to a strict order and may be performed in other orders. Moreover, at least some of the steps in figs. 2-4 may include multiple sub-steps or stages that are not necessarily performed at the same moment but may be performed at different times, and the order of their execution is not necessarily sequential; they may be performed in turn or in alternation with other steps or with at least some of the sub-steps or stages of other steps.
In one embodiment, as shown in fig. 5, there is provided a software automation test apparatus including: the test system comprises a test label module, a test case module and a test data module.
The test label module is used for generating a test label for software test according to a test item of the tested software;
the test case module is used for acquiring a software test script according to the test label and generating a software automation test case applied to the test label, wherein the software test script is a function for testing the test data of the tested software;
and the test data module is used for acquiring the test data in the database corresponding to the test label according to the test label.
Specifically, the test data module is the port in the Robot Framework functional automation test that is responsible for connecting to a specific database of the software under test. If the test case module needs to complete a certain software test and must call test data from the database of the corresponding software under test, the test data module is required to call the corresponding test data from the database and deliver them to the test case module, which then completes the corresponding software function test. The test data module generally comprises at least one data port, and each data port can correspond to one database of the software under test. If the module comprises several data ports, some may be idle, i.e. when a given function of the software under test is tested, that port does not need to call test data from its database. Some data ports may also be vacant, meaning they are not connected to any database; when a new database needs to be added, it can be connected to a vacant data port. If the software under test has several databases, the test data module can provide a number of data ports equal to the number of databases, one port per database.
Specifically, after the test case module completes the automated test of the software, a test report is generally generated. The test report gives the test results of the test items set for the software under test, including the completion status of each test item. For an item whose test has passed, the user can consider the corresponding function reliable; if a test is not completed, the user needs to modify the software accordingly, or modify the script file in the test case module, or check the hardware equipment to ensure it is reliable. The modified software or script file is saved and the test executed again; if it passes, the modified software or script file is updated.
In one embodiment, an electronic device is provided, which may be a server, and its internal structure diagram may be as shown in fig. 6. The electronic device includes a processor, a memory, and a network interface connected by a system bus. Wherein the processor of the electronic device is configured to provide computing and control capabilities. The memory of the electronic equipment comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The database of the electronic equipment is used for storing data of application programs and software automation tests. The network interface of the electronic device is used for connecting and communicating with an external terminal through a network. The computer program is executed by a processor to implement a software automation testing method.
In one embodiment, an electronic device is provided, which may be a terminal, and whose internal structure may be as shown in fig. 7. The electronic device comprises a processor, a memory, a communication interface, a display screen, and an input device connected by a system bus. The processor of the electronic device is configured to provide computing and control capabilities. The memory of the electronic device comprises a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The communication interface of the electronic device is used for wired or wireless communication with an external terminal; the wireless communication can be realized through Wi-Fi, an operator network, Near Field Communication (NFC), or other technologies. The computer program, when executed by the processor, implements a software automated testing method. The display screen of the electronic device can be a liquid crystal display screen or an electronic ink display screen, and the input device of the electronic device can be a touch layer covering the display screen, a key, trackball, or touchpad arranged on the housing of the electronic device, or an external keyboard, touchpad, or mouse.
Those skilled in the art will appreciate that the configurations shown in fig. 6 and fig. 7 are only block diagrams of the partial configurations relevant to the present disclosure and do not constitute a limitation on the electronic devices to which the present disclosure may be applied; a particular electronic device may include more or fewer components than those shown in the drawings, may combine certain components, or may have a different arrangement of components.
In one embodiment, the software automated testing apparatus provided in the present application may be implemented in the form of a computer program, and the computer program may run on an electronic device as shown in fig. 6 or fig. 7. The memory of the electronic device may store the program modules constituting the software automated testing apparatus, such as the test label module, the test case module, and the test data module shown in fig. 5. The computer program constituted by these program modules causes the processor to execute the steps of the software automated testing method of the embodiments described in the present specification.
For example, the electronic device shown in fig. 6 may, through the test label module of the software automated testing apparatus shown in fig. 5, generate a test label for a software test according to a test item of the software under test; through the test case module, the electronic device may acquire a software test script according to the test label and generate a software automation test case applied to the test label, where the software test script is a function for testing the test data of the software under test; and through the test data module, the electronic device may acquire, according to the test label, the test data in the database corresponding to the test label.
For example, the electronic device shown in fig. 7 may, through the test label module of the software automated testing apparatus shown in fig. 5, generate a test label for a software test according to a test item of the software under test; through the test case module, the electronic device may acquire a software test script according to the test label and generate a software automation test case applied to the test label, where the software test script is a function for testing the test data of the software under test; and through the test data module, the electronic device may acquire, according to the test label, the test data in the database corresponding to the test label.
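The cooperation of the three modules (test label, test case, test data) can be sketched as plain Python functions. This is an illustrative reading of the workflow only; every function name, data shape, and the `"login"` example below are assumptions, not the patent's actual implementation.

```python
# Illustrative sketch (not from the patent) of the three cooperating modules.
def make_test_label(test_item):
    """Test label module: generate a test label from a test item of the
    software under test."""
    return {"item": test_item, "tag": f"label::{test_item}"}

def make_test_case(label, script):
    """Test case module: combine a label with a software test script (a
    function for testing the test data) into an automation test case."""
    return {"label": label, "run": script}

def get_test_data(label, databases):
    """Test data module: acquire the test data in the database corresponding
    to the test label."""
    return databases.get(label["item"], [])

# Wiring the modules together in the order the description gives:
databases = {"login": [("user_a", "pw1"), ("user_b", "pw2")]}
label = make_test_label("login")
case = make_test_case(label, script=lambda data: all(pw for _, pw in data))
print(case["run"](get_test_data(label, databases)))  # → True
```

The label acts as the single key tying the script to its data, which is the structural point of figs. 5–7: neither the test case module nor the test data module needs to know about the other directly.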
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above may be implemented by a computer program instructing the relevant hardware. The computer program may be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the method embodiments described above. Any reference to memory, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. Non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, or optical storage. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as Static Random Access Memory (SRAM) and Dynamic Random Access Memory (DRAM).
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of these technical features are described, but as long as a combination of technical features contains no contradiction, it should be considered within the scope of the present specification.
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.
Claims (10)
1. A software automation test method is characterized by comprising the following steps:
acquiring a test label of a software test, wherein the test label is used for defining a test item of the tested software;
acquiring a software test script according to the test label, and generating a software automation test case applied to the test label, wherein the software test script is a function for testing test data of the tested software;
according to the test label, obtaining test data in a database corresponding to the test label;
and testing the test data according to the software automation test case.
2. The method of claim 1, wherein obtaining a test label for a software test comprises:
acquiring software functions to be tested during software automated testing;
acquiring software test function item information according to the software function to be tested;
acquiring a software test item directory according to the software test function item information;
and associating, according to the software test item directory, the software test item directory with the corresponding software test script and test data.
3. The method of claim 1, wherein acquiring the software test script and generating the software automation test case applied to the test label comprises:
acquiring one or more software test scripts according to the test labels;
setting test points of a software automation test case according to the one or more acquired software test scripts, wherein a test point is used for testing the acquired test data of one software function in a database;
setting a test set of a software automation test case according to the test points, wherein the test set is used for testing the test data of all the software functions acquired from one database;
and generating a software automation test case according to one or more test points and/or one or more test sets.
4. The method of claim 3, wherein obtaining the test data in the database corresponding to the test tag according to the test tag comprises:
acquiring test data of a database to be called according to the test label;
setting test data of a software function of a database as first data;
merging a plurality of first data of a database into second data according to the first data;
and using the first data and/or the second data as test data of the software automation test.
5. The method of claim 4, wherein the testing the test data against the software automation test case comprises:
according to the first data, the test point completes the test of the first data;
and according to the second data, the test set completes the test of the second data.
6. The method of claim 1, further comprising:
generating a test report according to the test result of the test data;
and debugging the unfinished software test script or the tested software according to the test report until the test of all test items in the test label is finished.
7. The method of claim 6, further comprising:
saving the debugged software test script or the modified tested software;
updating a software test script of the automatic test case or test data of the tested software;
and testing according to the updated software test script or test data.
8. An automated software testing apparatus, the apparatus comprising:
the test label module is used for generating a test label for software test according to a test item of the tested software;
the test case module is used for acquiring a software test script according to the test label and generating a software automation test case applied to the test label, wherein the software test script is a function for testing the test data of the tested software;
and the test data module is used for acquiring the test data in the database corresponding to the test label according to the test label.
9. An electronic device, comprising one or more processors and memory, the memory to store one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any of claims 1-7.
10. A computer-readable storage medium storing a computer program, characterized in that the computer program, when executed, implements the method of any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011321931.3A CN114528198A (en) | 2020-11-23 | 2020-11-23 | Software automation test method and device, electronic equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114528198A true CN114528198A (en) | 2022-05-24 |
Family
ID=81619643
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011321931.3A Pending CN114528198A (en) | 2020-11-23 | 2020-11-23 | Software automation test method and device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114528198A (en) |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102693183A (en) * | 2012-05-30 | 2012-09-26 | 瑞斯康达科技发展股份有限公司 | Method and system for realizing automatic software testing |
- 2020-11-23: application CN202011321931.3A filed; status: Pending
Non-Patent Citations (1)
Title |
---|
朱韶松: "基于Robot Framework的自动化测试系统的设计与实现", 《中国优秀硕士学位论文全文数据库》, no. 2018, 15 April 2018 (2018-04-15), pages 25 - 40 * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109901834B (en) | Document page generation method, device, computer equipment and storage medium | |
CN108959059B (en) | Test method and test platform | |
CN111061475A (en) | Software code generation method and device, computer equipment and storage medium | |
CN110597552B (en) | Configuration method, device, equipment and storage medium of project continuous integrated pipeline | |
CN109144799A (en) | Integrated testing method, apparatus, computer equipment and storage medium | |
CN112433712A (en) | Report display method and device, computer equipment and storage medium | |
CN111949543A (en) | Testing method and device based on distributed platform, electronic equipment and storage medium | |
CN114691506A (en) | Pressure testing method, apparatus, device, medium, and program product | |
CN113505895A (en) | Machine learning engine service system, model training method and configuration method | |
CN110232018A (en) | Interface test method, device, computer equipment | |
CN108228611B (en) | Document information copying method and device | |
CN112561690A (en) | Method, system, equipment and storage medium for testing credit card staging service interface | |
CN116561003A (en) | Test data generation method, device, computer equipment and storage medium | |
CN113656022B (en) | Software development method and device, computer equipment and storage medium | |
CN114756293A (en) | Service processing method, device, computer equipment and storage medium | |
CN114528198A (en) | Software automation test method and device, electronic equipment and storage medium | |
CN114185566A (en) | Containerized deployment method, apparatus, computer device and storage medium | |
CN114116664A (en) | Database table building statement processing method and device, computer equipment and storage medium | |
CN112465612A (en) | Receipt information processing method and device, computer equipment and storage medium | |
CN117687608B (en) | Method, device, equipment and storage medium for generating orchestration stream | |
CN116524986B (en) | System level testing method and system for storage products | |
CN111813842B (en) | Data processing method, device, system, equipment and storage medium | |
US20230057746A1 (en) | User constrained process mining | |
CN117076292A (en) | Webpage testing method and device, computer equipment and storage medium | |
CN115617667A (en) | Demonstration method and device of test process, computer equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination |