CN111949545A - Automatic testing method, system, server and storage medium - Google Patents
- Publication number: CN111949545A (application CN202010839717.0A)
- Authority
- CN
- China
- Prior art keywords
- test
- script
- original
- test script
- target
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Preventing errors by testing or debugging software
- G06F11/3668—Software testing
- G06F11/3672—Test management
- G06F11/3684—Test management for test design, e.g. generating new test cases
- G06F11/3688—Test management for test execution, e.g. scheduling of test suites
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Computer Hardware Design (AREA)
- Quality & Reliability (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Debugging And Monitoring (AREA)
Abstract
The embodiment of the invention discloses an automated testing method, system, server and storage medium, wherein the method comprises the following steps: acquiring an original test script provided by a user; parsing the original test script to determine parameter configuration items of the original test script; acquiring script configuration parameters for the parameter configuration items to generate a target test script; obtaining a test strategy for the target test script; and executing the target test script according to the test strategy. The embodiment of the invention realizes automated software testing, improves the readability and maintainability of test scripts, and improves testing efficiency.
Description
Technical Field
The embodiment of the invention relates to the technical field of software testing, in particular to an automatic testing method, an automatic testing system, a server and a storage medium.
Background
Software testing refers to the process of operating a program under specified conditions to discover program errors, measure software quality, and evaluate whether it can meet design requirements. Software testing is an essential step in the development and maintenance of internet projects.
In the traditional software testing method, testers write different test scripts for different testing requirements, and whenever a requirement changes, the corresponding test script must be changed as well. With the rapid development of internet technology, internet projects keep growing in both scale and number while development cycles must be shortened as much as possible, so the traditional software testing method can no longer keep up with growing project demands.
Disclosure of Invention
In view of this, embodiments of the present invention provide an automated testing method, system, server and storage medium, so as to automate software testing and improve testing efficiency.
In a first aspect, an embodiment of the present invention provides an automated testing method, including:
acquiring an original test script provided by a user;
analyzing the original test script to determine a parameter configuration item of the original test script;
acquiring script configuration parameters of the parameter configuration items to generate a target test script;
obtaining a test strategy of the target test script;
and executing the target test script according to the test strategy.
Further, parsing the original test script to determine parameter configuration items of the original test script includes:
analyzing the test action, the test object and the test data of the original test script;
and taking the adjustable parameters in the test action, the test object and the test data as parameter configuration items of the original test script.
Further, the test strategy comprises an object to be tested and a timing plan.
Further, executing the target test script according to the test policy includes:
generating at least one test task according to the object to be tested and the target test script;
and executing the at least one test task according to a test instruction or the timing plan.
Further, executing the at least one test task according to the test instructions or the timing plan includes:
determining at least one test node meeting a preset condition;
and distributing each test task to a corresponding test node to be executed according to the test instruction or the timing plan.
Further, the method also comprises the following steps:
and displaying the execution progress of the target test script through a visual interface in the execution process of the target test script.
Further, the method also comprises the following steps:
and generating a test result, wherein the test result comprises a test report, a test log and a test video.
In a second aspect, an embodiment of the present invention provides an automated testing system, including:
the script obtaining module is used for obtaining an original test script provided by a user;
the script analysis module is used for analyzing the original test script to determine a parameter configuration item of the original test script;
the parameter configuration module is used for acquiring script configuration parameters of the parameter configuration items so as to generate a target test script;
the test strategy acquisition module is used for acquiring the test strategy of the target test script;
and the test execution module is used for executing the target test script according to the test strategy.
In a third aspect, an embodiment of the present invention provides a server, including:
one or more processors;
a storage device for storing one or more programs,
when executed by the one or more processors, cause the one or more processors to implement the automated testing methods provided by any of the embodiments of the present invention.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the automated testing method provided in any embodiment of the present invention.
The automated testing method provided by the embodiment of the invention acquires an original test script provided by a user; parses the original test script to determine its parameter configuration items; acquires script configuration parameters for the parameter configuration items to generate a target test script; obtains a test strategy for the target test script; and executes the target test script according to the test strategy. This realizes automated software testing, improves the readability and maintainability of test scripts, and improves testing efficiency.
Drawings
Fig. 1 is a schematic flow chart of an automated testing method according to an embodiment of the present invention;
fig. 2 is a schematic flow chart of an automated testing method according to a second embodiment of the present invention;
fig. 3 is a schematic structural diagram of an automated testing system according to a third embodiment of the present invention;
fig. 4 is a schematic structural diagram of a server according to a fourth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Before discussing exemplary embodiments in more detail, it should be noted that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart may describe the steps as a sequential process, many of the steps can be performed in parallel, concurrently or simultaneously. In addition, the order of the steps may be rearranged. A process may be terminated when its operations are completed, but may have additional steps not included in the figure. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc.
Furthermore, the terms "first," "second," and the like may be used herein to describe various orientations, actions, steps, elements, or the like, but the orientations, actions, steps, or elements are not limited by these terms. These terms are only used to distinguish one direction, action, step or element from another direction, action, step or element. The terms "first", "second", etc. are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present invention, "plurality", "batch" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Example one
Fig. 1 is a schematic flow chart of an automated testing method according to an embodiment of the present invention, which is applicable to automated testing of software. The automatic test method provided by the embodiment of the invention can be realized by the automatic test system provided by any embodiment of the invention.
As shown in fig. 1, an automated testing method provided in an embodiment of the present invention includes:
and S110, acquiring an original test script provided by a user.
Specifically, the original test script refers to a series of program instructions provided by a user for a particular test. The original test script can be a test script written by the user or a test script recorded by the user through a tool.
S120, analyzing the original test script to determine the parameter configuration item of the original test script.
Specifically, parsing the original test script means decomposing it into several modules that a tester can easily understand, while also extracting the parameter configuration items it contains. Because projects are developed in large numbers, testers today usually record test scripts with recording tools, so most original test scripts are tool-recorded. Such scripts follow the programming logic of the recording tool itself, and the resulting original test script may differ from tool to tool. Parsing the original test script and decomposing it into modules that are easier for testers to understand therefore makes the script clearer and easier to maintain. For example, if the original test script tests the login function of a website, the script produced by the recording tool contains every step of the user's login; after parsing, it can be decomposed into a user information module (user name, user password, etc.), an operation action module (e.g., the click-to-log-in action), a website response module, and so on, so that the tester understands the whole test script more clearly.
The parameter configuration items of the original test script refer to adjustable parameters in the original test script. For example, the user name, user password, etc. in the above examples.
S130, obtaining script configuration parameters of the parameter configuration items to generate a target test script.
Specifically, the user configures the parameter configuration items of the original test script to realize the setting of the related data in the original test script and form the target test script. The script configuration parameters are the setting data corresponding to the parameter configuration items. For example, the parameter configuration item in the above example is a user name, and the corresponding configuration parameter may be a plurality of different user names.
The script configuration parameters enable the test logic of the test script to be separated from the test data, a user can generate different target test scripts by setting different script configuration parameters, when the user needs to modify the target test scripts, only the script configuration parameters need to be modified, the test scripts do not need to be modified completely, and the method is fast and convenient.
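The separation of test logic from test data can be sketched as follows; the script representation and function name are illustrative assumptions. The same original script yields different target scripts from different parameter sets, without touching the script logic itself.

```python
# Original script with unfilled (configurable) data values.
ORIGINAL_SCRIPT = [
    {"op": "set", "target": "username", "value": None},  # configurable
    {"op": "set", "target": "password", "value": None},  # configurable
    {"op": "click", "target": "login_button"},           # fixed logic
]

def generate_target_script(original, params):
    """Return a copy of the script with configuration items filled in."""
    target = []
    for instr in original:
        instr = dict(instr)                 # copy, leave the original intact
        if instr["target"] in params:
            instr["value"] = params[instr["target"]]
        target.append(instr)
    return target

# Different script configuration parameters, different target scripts.
script_a = generate_target_script(ORIGINAL_SCRIPT, {"username": "alice", "password": "a1"})
script_b = generate_target_script(ORIGINAL_SCRIPT, {"username": "bob", "password": "b2"})
```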
And S140, obtaining the test strategy of the target test script.
Specifically, the test strategy refers to a way of performing a test, and generally includes an object to be tested and a timing plan. The object to be tested represents an object for performing a test, such as a website in the above example; the timing plan then represents the time of the timing test.
And S150, executing the target test script according to the test strategy.
Specifically, the target test script is executed according to the test strategy, that is, the object to be tested is tested at the time corresponding to the timing plan.
The automated testing method provided by the embodiment of the invention acquires an original test script provided by a user; parses the original test script to determine its parameter configuration items; acquires script configuration parameters for the parameter configuration items to generate a target test script; obtains a test strategy for the target test script; and executes the target test script according to that strategy. This realizes automated software testing, improves the readability and maintainability of test scripts, and improves testing efficiency.
Example two
Fig. 2 is a schematic flow chart of an automated testing method according to a second embodiment of the present invention, which further refines the method of the first embodiment. As shown in fig. 2, the automated testing method provided by this embodiment of the present invention includes:
and S210, acquiring an original test script provided by a user.
Specifically, the original test script refers to a series of program instructions provided by a user for a particular test. The original test script can be a test script written by the user or a test script recorded by the user through a tool.
S220, analyzing the test action, the test object and the test data of the original test script.
Specifically, a test action refers to a portion of an original test script that relates to an action instruction. The test object refers to a part of the original test script that relates to the tested object. The test data refers to relevant parameter data in the original test script. For example, the original test script is a recorded script for testing a website login function, after the original script is analyzed, the test action is an instruction part corresponding to the click login action in the recorded script, the test object is an instruction part corresponding to the website address, and the test data includes data instruction parts such as a user name and a user password.
Furthermore, when the original test script is decomposed into different modules, a plain-language description of each instruction in the script is generated to explain its function, so that testers can quickly understand the purpose of each instruction in the script.
And S230, taking the adjustable parameters in the test action, the test object and the test data as parameter configuration items of the original test script.
Specifically, adjustable parameters are parameters that can be configured by adjustment, and the adjustable parameters in the test action, the test object, and the test data all serve as parameter configuration items. As in the example above, a parameter configuration item may be the website address, the user name, the user password, and so forth.
S240, obtaining script configuration parameters of the parameter configuration items to generate a target test script.
Specifically, the user configures the parameter configuration items of the original test script to realize the setting of the related data in the original test script and form the target test script. The script configuration parameters are the setting data corresponding to the parameter configuration items. For example, the parameter configuration item in the above example is a user name, and the corresponding configuration parameter may be a plurality of different user names.
The script configuration parameters enable the test logic of the test script to be separated from the test data, a user can generate different target test scripts by setting different script configuration parameters, when the user needs to modify the target test scripts, only the script configuration parameters need to be modified, the test scripts do not need to be modified completely, and the method is fast and convenient.
S250, obtaining a test strategy of the target test script, wherein the test strategy comprises an object to be tested and a timing plan.
Specifically, the test strategy refers to a way of performing a test, and generally includes an object to be tested and a timing plan. The object to be tested represents an object for performing a test, such as a website address in the above example; the timing plan indicates the timing test time, and if the timing test is required, a user needs to set the timing time; if the timing test is not needed, the user does not need to set the timing time.
Further, the test strategy may also include alarm settings, test nodes, scheduled test equipment, and polling interval durations. The alarm setting refers to an alarm mode when an abnormal condition occurs in the test process, for example, the abnormal condition is sent to the corresponding contact person through a mail. A test node refers to a node server that performs a test. A scheduled test device refers to a device, such as a terminal device, that is scheduled to perform a test. The polling interval duration refers to the time interval in which the test is repeatedly performed.
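Taken together, the strategy fields described above might be represented as a simple data structure such as the following; all field names, types, and defaults are assumptions for illustration, not the patent's concrete format.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TestStrategy:
    objects_under_test: List[str]                            # e.g. website URLs
    timed_start: Optional[str] = None                        # timing plan; None = run on demand
    alarm_contacts: List[str] = field(default_factory=list)  # mailed on abnormal conditions
    test_nodes: List[str] = field(default_factory=list)      # pinned node servers; empty = any
    scheduled_devices: List[str] = field(default_factory=list)
    polling_interval_s: Optional[int] = None                 # repeat-test interval

# A strategy for a timed nightly test with a mail alarm.
strategy = TestStrategy(
    objects_under_test=["https://example.com/login"],
    timed_start="0 2 * * *",        # cron-like expression, assumed format
    alarm_contacts=["qa@example.com"],
)
```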
And S260, generating at least one test task according to the object to be tested and the target test script.
Specifically, one test task is equivalent to executing one target test script on the object to be tested, that is, performing one test on the object to be tested. For example, the target test script is a script for testing the login function of a certain website, which may include multiple user names, and then a test task may be to perform a login operation on the website by using one user name. Further, the object to be tested may also be a plurality of different test objects, for example, a plurality of different websites, and then a test task may be a login operation performed on a certain website by a user name.
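With several target scripts and several objects under test, task generation amounts to a cross product, which can be sketched as follows (the task structure is an illustrative assumption):

```python
from itertools import product

objects_under_test = ["https://site-a.example", "https://site-b.example"]
target_scripts = [{"username": "alice"}, {"username": "bob"}]

# One test task per (object under test, target script) pair.
tasks = [
    {"object": obj, "script": script}
    for obj, script in product(objects_under_test, target_scripts)
]
print(len(tasks))  # 2 objects x 2 scripts = 4 test tasks
```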
S270, executing the at least one test task according to the test instruction or the timing plan.
Specifically, when the test is not a timing test, the test task needs to be executed according to a test instruction sent by a user; if the test is a timed test, the test task is automatically performed at a time set according to the timing plan.
Further, the method for executing the test task comprises the following steps: determining at least one test node meeting a preset condition; and distributing each test task to a corresponding test node to be executed according to the test instruction or the timing plan.
In this embodiment, the test nodes form a distributed test node cluster comprising a plurality of node servers (i.e., test nodes), which may be deployed across a plurality of terminal devices. Which node server executes a given test task is decided centrally by the task scheduling server: when a node server meets the preset condition, the task scheduling server distributes test tasks to that node server for execution.
The preset condition is used to judge whether a node server's current resource usage is too high, that is, whether it is suitable for executing a test task. For example, the preset condition may be that the node server is currently idle, or that it is currently busy but has no more than 5 tasks queued. When a node server meets the preset condition (for instance, it is currently idle), it is considered suitable for executing a test task, and test tasks are distributed to the corresponding node server for execution according to the test instruction or the timing plan.
Furthermore, when the specified test node is set in the test strategy, the test task is only required to be allocated to the specified test node for execution according to the test instruction or the timing plan of the user.
Further, when the execution of a test task fails, information about the failure is sent to the relevant contacts according to the alarm setting in the test strategy, and a retest is automatically performed; the number of retests may be a default value or preset by the user.
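The alarm-and-retry behavior can be sketched as follows; `run_task` and `send_alarm` are hypothetical stand-ins for the real task execution and mail alarm, and the default retry count is an assumption.

```python
DEFAULT_RETRIES = 3  # assumed default; the patent allows a user-set value

def execute_with_retry(run_task, send_alarm, retries=DEFAULT_RETRIES):
    """Run a test task, alarming and retrying on each failure."""
    for attempt in range(1 + retries):      # initial run plus retries
        if run_task():
            return True                     # task succeeded
        send_alarm(f"test task failed (attempt {attempt + 1})")
    return False                            # all attempts exhausted

# Example: a task that fails twice, then succeeds on the third run.
results = iter([False, False, True])
alarms = []
ok = execute_with_retry(lambda: next(results), alarms.append)
```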
S280, displaying the execution progress of the target test script through a visual interface in the execution process of the target test script.
Specifically, the execution progress of the target test script is the execution progress of each test task, the execution progress of the test tasks is displayed through the visual interface, and testers can know the execution condition of the test tasks in real time, so that the test progress is convenient to master.
And S290, generating a test result, wherein the test result comprises a test report, a test log and a test video.
Specifically, after the test is completed, a test result is generated; by checking it, a tester can judge the test quality and whether the test purpose was achieved. The test result mainly comprises a test report, a test log and a test video. The test report records the outcome of each step in the test script, for example the success or failure of each step and its return parameters. The test log records log information from the test process, such as the number of tests, the number of failures, the number of successes, and alarm information. The test video is a recording of the test script's execution. Because the test result records the test information from multiple perspectives, testers can conveniently analyze the test.
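One possible shape for such a test result, assembling the three parts described above; the field names and the video path are illustrative assumptions.

```python
def build_test_result(step_outcomes, log_entries, video_path):
    """Assemble the report, log and video reference into one result."""
    report = [
        {"step": name, "status": "success" if ok else "failure"}
        for name, ok in step_outcomes
    ]
    log = {
        "runs": len(log_entries),
        "failures": sum(1 for e in log_entries if not e["passed"]),
        "successes": sum(1 for e in log_entries if e["passed"]),
    }
    return {"report": report, "log": log, "video": video_path}

result = build_test_result(
    step_outcomes=[("open_site", True), ("login", False)],
    log_entries=[{"passed": True}, {"passed": False}],
    video_path="runs/2020-08-19/login-test.mp4",  # hypothetical path
)
```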
The second embodiment of the invention realizes automated software testing: script parsing improves the readability and maintainability of test scripts; parameter configuration separates script logic from script parameters, further improving maintainability; test tasks are distributed automatically according to test instructions or timing plans; and the distributed test node cluster processes test tasks concurrently, improving testing efficiency.
EXAMPLE III
Fig. 3 is a schematic structural diagram of an automated testing system according to a third embodiment of the present invention, which is applicable to automated testing of software. The automated testing system provided by this embodiment can implement the automated testing method provided by any embodiment of the present invention, with the corresponding functions and beneficial effects; for content not described in detail here, refer to the description of any method embodiment of the present invention.
As shown in fig. 3, an automated testing system provided by the third embodiment of the present invention includes: a script obtaining module 310, a script parsing module 320, a parameter configuration module 330, a test policy obtaining module 340, and a test executing module 350, wherein:
the script obtaining module 310 is configured to obtain an original test script provided by a user;
the script parsing module 320 is configured to parse the original test script to determine a parameter configuration item of the original test script;
the parameter configuration module 330 is configured to obtain script configuration parameters of the parameter configuration items to generate a target test script;
the test strategy obtaining module 340 is configured to obtain a test strategy of the target test script;
the test execution module 350 is configured to execute the target test script according to the test policy.
Further, the script parsing module 320 is specifically configured to:
analyzing the test action, the test object and the test data of the original test script;
and taking the adjustable parameters in the test action, the test object and the test data as parameter configuration items of the original test script.
Further, the test strategy comprises an object to be tested and a timing plan.
Further, the test execution module 350 includes:
the task generating unit is used for generating at least one test task according to the object to be tested and the target test script;
and the task execution unit is used for executing the at least one test task according to a test instruction or the timing plan.
Further, the task execution unit is specifically configured to:
determining at least one test node meeting a preset condition;
and distributing each test task to a corresponding test node to be executed according to the test instruction or the timing plan.
Further, the method also comprises the following steps:
and the visualization module is used for displaying the execution progress of the target test script through a visualization interface in the execution process of the target test script.
Further, the method also comprises the following steps:
and the result generation module is used for generating a test result, and the test result comprises a test report, a test log and a test video.
According to the automated testing system provided by the third embodiment of the invention, automated software testing is realized through the script obtaining module, the script parsing module, the parameter configuration module, the test strategy obtaining module and the test execution module, improving the readability and maintainability of test scripts and improving testing efficiency.
Example four
Fig. 4 is a schematic structural diagram of a server according to a fourth embodiment of the present invention. FIG. 4 illustrates a block diagram of an exemplary server 412 suitable for use in implementing embodiments of the present invention. The server 412 shown in fig. 4 is only an example and should not bring any limitations to the function and scope of use of the embodiments of the present invention.
As shown in FIG. 4, server 412 is in the form of a general purpose server. Components of server 412 may include, but are not limited to: one or more processors 416, a storage device 428, and a bus 418 that couples the various system components including the storage device 428 and the processors 416.
A program/utility 440 having a set (at least one) of program modules 442 may be stored, for instance, in storage 428, such program modules 442 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. The program modules 442 generally perform the functions and/or methodologies of the described embodiments of the invention.
The server 412 may also communicate with one or more external devices 414 (e.g., a keyboard, a pointing device, a display 424, etc.), with one or more terminals that enable a user to interact with the server 412, and/or with any device (e.g., a network card, a modem, etc.) that enables the server 412 to communicate with one or more other computing terminals. Such communication may occur via input/output (I/O) interfaces 422. Further, server 412 may communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) via network adapter 420. As shown in FIG. 4, network adapter 420 communicates with the other modules of server 412 via bus 418. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the server 412, including but not limited to: microcode, device drivers, redundant processors, external disk drive arrays, RAID (Redundant Arrays of Independent Disks) systems, tape drives, data backup storage systems, and the like.
The processor 416 executes various functional applications and data processing by running programs stored in the storage device 428, for example, implementing an automated testing method provided by any embodiment of the present invention, which may include:
acquiring an original test script provided by a user;
analyzing the original test script to determine a parameter configuration item of the original test script;
acquiring script configuration parameters of the parameter configuration items to generate a target test script;
obtaining a test strategy of the target test script;
and executing the target test script according to the test strategy.
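By way of a minimal, non-limiting sketch of the five steps above — all identifiers, the `${...}` placeholder syntax, and the strategy keys are hypothetical illustrations, not part of the claimed method — the pipeline might look like:

```python
import re

# Hypothetical placeholder syntax marking adjustable parameters, e.g. "${base_url}".
PLACEHOLDER = re.compile(r"\$\{(\w+)\}")

def parse_config_items(original_script):
    """Analyze the original test script and list its parameter configuration items."""
    return sorted(set(PLACEHOLDER.findall(original_script)))

def generate_target_script(original_script, params):
    """Substitute user-supplied script configuration parameters to form the target script."""
    return PLACEHOLDER.sub(lambda m: params[m.group(1)], original_script)

def execute(target_script, strategy):
    """Run the target script against each object under test named in the test strategy."""
    results = []
    for obj in strategy["objects_under_test"]:
        # A real runner would schedule this per strategy["timing_plan"]; here we only record it.
        results.append("ran on %s: %s" % (obj, target_script))
    return results

original = "GET ${base_url}/login timeout=${timeout}"
items = parse_config_items(original)
target = generate_target_script(original, {"base_url": "http://test.example", "timeout": "30"})
runs = execute(target, {"objects_under_test": ["build-101"], "timing_plan": "nightly"})
```

Because the configuration items are recovered from the script itself, the same original script can be re-parameterized for different environments without editing its logic.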
EXAMPLE five
An embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements an automated testing method according to any embodiment of the present invention, where the method may include:
acquiring an original test script provided by a user;
analyzing the original test script to determine a parameter configuration item of the original test script;
acquiring script configuration parameters of the parameter configuration items to generate a target test script;
obtaining a test strategy of the target test script;
and executing the target test script according to the test strategy.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, or C++, as well as conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or terminal. In the latter case, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It is to be noted that the foregoing describes only preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, and that various obvious changes, rearrangements, and substitutions may be made without departing from the scope of the invention. Therefore, although the present invention has been described in some detail through the above embodiments, it is not limited to them and may include other equivalent embodiments without departing from its spirit; the scope of the present invention is determined by the appended claims.
Claims (10)
1. An automated testing method, comprising:
acquiring an original test script provided by a user;
analyzing the original test script to determine a parameter configuration item of the original test script;
acquiring script configuration parameters of the parameter configuration items to generate a target test script;
obtaining a test strategy of the target test script;
and executing the target test script according to the test strategy.
2. The method of claim 1, wherein analyzing the original test script to determine the parameter configuration item of the original test script comprises:
analyzing the test action, the test object and the test data of the original test script;
and taking the adjustable parameters in the test action, the test object and the test data as parameter configuration items of the original test script.
3. The method of claim 1, wherein the test strategy comprises an object under test and a timing plan.
4. The method of claim 3, wherein executing the target test script according to the test policy comprises:
generating at least one test task according to the object to be tested and the target test script;
and executing the at least one test task according to a test instruction or the timing plan.
5. The method of claim 4, wherein performing the at least one test task according to test instructions or the timing plan comprises:
determining at least one test node meeting a preset condition;
and distributing each test task to a corresponding test node to be executed according to the test instruction or the timing plan.
6. The method of any one of claims 1-5, further comprising:
and displaying the execution progress of the target test script through a visual interface in the execution process of the target test script.
7. The method of any one of claims 1-5, further comprising:
and generating a test result, wherein the test result comprises a test report, a test log and a test video.
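Claims 3 through 5 above can be illustrated with a minimal sketch — the class, function names, and the "enough free memory" preset condition are hypothetical examples chosen for concreteness, not limitations of the claims:

```python
from dataclasses import dataclass

@dataclass
class TestTask:
    object_under_test: str
    script: str

def generate_tasks(objects, target_script):
    """One test task per object under test, paired with the target test script (claim 4)."""
    return [TestTask(obj, target_script) for obj in objects]

def eligible_nodes(nodes, min_free_mem_mb=512):
    """Keep only test nodes meeting a preset condition -- here, enough free memory (claim 5)."""
    return [n for n in nodes if n["free_mem_mb"] >= min_free_mem_mb]

def dispatch(tasks, nodes):
    """Distribute each test task to a qualifying node, round-robin."""
    return {t.object_under_test: nodes[i % len(nodes)]["name"]
            for i, t in enumerate(tasks)}

tasks = generate_tasks(["app-v1", "app-v2"], "run smoke suite")
nodes = eligible_nodes([{"name": "node-a", "free_mem_mb": 1024},
                        {"name": "node-b", "free_mem_mb": 256}])
plan = dispatch(tasks, nodes)
```

With one qualifying node, both tasks land on it; as more nodes satisfy the condition, the same dispatch loop spreads the tasks across them.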
8. An automated test system, comprising:
the script obtaining module is used for obtaining an original test script provided by a user;
the script analysis module is used for analyzing the original test script to determine a parameter configuration item of the original test script;
the parameter configuration module is used for acquiring script configuration parameters of the parameter configuration items so as to generate a target test script;
the test strategy acquisition module is used for acquiring the test strategy of the target test script;
and the test execution module is used for executing the target test script according to the test strategy.
9. A server, comprising:
one or more processors;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the automated testing method of any one of claims 1-7.
10. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the automated testing method according to any one of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010839717.0A CN111949545A (en) | 2020-08-19 | 2020-08-19 | Automatic testing method, system, server and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111949545A true CN111949545A (en) | 2020-11-17 |
Family
ID=73359800
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010839717.0A Pending CN111949545A (en) | 2020-08-19 | 2020-08-19 | Automatic testing method, system, server and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111949545A (en) |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150324274A1 (en) * | 2014-05-09 | 2015-11-12 | Wipro Limited | System and method for creating universal test script for testing variants of software application |
US20160360298A1 (en) * | 2015-06-02 | 2016-12-08 | International Business Machines Corporation | Generating customized on-demand videos from automated test scripts |
CN111367813A (en) * | 2020-03-17 | 2020-07-03 | 深圳市卡牛科技有限公司 | Automatic testing method and device for decision engine, server and storage medium |
CN111400186A (en) * | 2020-03-19 | 2020-07-10 | 时时同云科技(成都)有限责任公司 | Performance test method and system |
CN111209218A (en) * | 2020-04-01 | 2020-05-29 | 中电万维信息技术有限责任公司 | Automatic performance testing method based on Jmeter |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112597001A (en) * | 2020-12-07 | 2021-04-02 | 长沙市到家悠享网络科技有限公司 | Interface testing method and device, electronic equipment and storage medium |
CN112783773A (en) * | 2021-01-25 | 2021-05-11 | 中国工商银行股份有限公司 | Software automation test method and device |
CN112783773B (en) * | 2021-01-25 | 2024-02-06 | 中国工商银行股份有限公司 | Automatic software testing method and device |
CN113014452A (en) * | 2021-03-01 | 2021-06-22 | 鹏城实验室 | Network flow testing method, device, testing end and storage medium |
CN113687223A (en) * | 2021-08-30 | 2021-11-23 | 广东睿住智能科技有限公司 | Test method, test device, server and storage medium |
CN113806229A (en) * | 2021-09-27 | 2021-12-17 | 工银科技有限公司 | Interface change test script multiplexing method, device, equipment, medium and product |
CN113806229B (en) * | 2021-09-27 | 2024-06-11 | 工银科技有限公司 | Test script multiplexing method, device, equipment, medium and product for interface change |
CN114900569A (en) * | 2022-05-17 | 2022-08-12 | 中国银行股份有限公司 | Method and device for acquiring test script |
CN114900569B (en) * | 2022-05-17 | 2024-05-03 | 中国银行股份有限公司 | Test script acquisition method and device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111949545A (en) | Automatic testing method, system, server and storage medium | |
CN109302522B (en) | Test method, test device, computer system, and computer medium | |
CN108984389B (en) | Application program testing method and terminal equipment | |
CN110768872B (en) | Inspection method, system, device, computer equipment and storage medium | |
CN111124919A (en) | User interface testing method, device, equipment and storage medium | |
CN110750458A (en) | Big data platform testing method and device, readable storage medium and electronic equipment | |
CN111045911A (en) | Performance test method, performance test device, storage medium and electronic equipment | |
CN113127356A (en) | Pressure measurement method and device, electronic equipment and storage medium | |
CN112463588A (en) | Automatic test system and method, storage medium and computing equipment | |
CN113836014A (en) | Interface testing method and device, electronic equipment and storage medium | |
CN111309570A (en) | Pressure testing method, medium, device and computing equipment | |
CN112817869A (en) | Test method, test device, test medium, and electronic apparatus | |
CN110677307B (en) | Service monitoring method, device, equipment and storage medium | |
US20180285082A1 (en) | Comparing scripts | |
CN110597704A (en) | Application program pressure testing method, device, server and medium | |
CN107515803A (en) | A kind of storing performance testing method and device | |
CN111290942A (en) | Pressure testing method, device and computer readable medium | |
CN117539754A (en) | Pressure testing method and device, storage medium and electronic equipment | |
US10382311B2 (en) | Benchmarking servers based on production data | |
CN113868129B (en) | Method for checking accuracy of back-end data and automatic testing tool | |
CN113535273A (en) | System-level recording method and system of industrial networked intelligent equipment and storage medium | |
CN113641575A (en) | Test method, device, equipment and storage medium | |
CN112527584A (en) | Software efficiency improving method and system based on script compiling and data acquisition | |
CN112905445A (en) | Log-based test method and device and computer system | |
CN112214469A (en) | Drive test data processing method, device, server and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
RJ01 | Rejection of invention patent application after publication | Application publication date: 20201117 |