CN110781074A - Automatic testing method and device and storage medium - Google Patents


Info

Publication number: CN110781074A
Authority: CN (China)
Prior art keywords: test, execution, component, case
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number: CN201910882507.7A
Other languages: Chinese (zh)
Inventor: 吴燕君
Current assignee: Ping An Bank Co Ltd (the listed assignee may be inaccurate; Google has not performed a legal analysis)
Original assignee: Ping An Bank Co Ltd
Application filed by Ping An Bank Co Ltd
Priority: CN201910882507.7A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3684 Test management for test design, e.g. generating new test cases
    • G06F 11/3688 Test management for test execution, e.g. scheduling of test suites

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Test And Diagnosis Of Digital Computers (AREA)

Abstract

The invention relates to the technical field of testing and discloses an automated testing method, an electronic device, and a storage medium. The method receives a test request carrying test task information, assembles the components mapped to that task into an initial test case, then acquires the target interface set corresponding to the configured application environment information from the original interface set, and executes it to generate a target test case. With this method and device, the reuse rate of interfaces and components is improved: a case writer can build a flow case that chains several interfaces in series simply by referencing components and dragging data.

Description

Automatic testing method and device and storage medium
Technical Field
The present invention relates to the field of testing technologies, and in particular, to an automated testing method and apparatus, and a computer-readable storage medium.
Background
A test case is a description of a test task performed on a specific software product; it embodies the test scheme, method, technique, and strategy. Its contents include the test target, test environment, input data, test steps, expected results, test scripts, and so on, which together form a document that is the basic basis for implementing functional testing of the product. Reusing test cases effectively improves testing efficiency and reduces the quality risk of test execution. In current functional testing work, however, the reuse rate of interfaces and components is low, so test cases are inefficient to write and maintain.
Disclosure of Invention
In view of the above, the present invention provides an automated testing method, an automated testing device, and a computer-readable storage medium, with the main aim of improving the reuse rate of interfaces and components and the efficiency of writing and maintaining test cases.
In order to achieve the above object, the present invention provides an automated testing method, which comprises:
a receiving step: receiving a test request sent by a tester through a terminal, wherein the test request comprises test task information;
a first generation step: analyzing the test task information, determining a component set corresponding to the test task information and a serial logic configuration relation corresponding to the component set and the test task according to a preset mapping relation between the test task and the components, connecting the components in the component set in series according to the serial logic configuration relation to generate an initial test case corresponding to the test task, and storing the initial test case to a preset storage path;
a second generation step: receiving test environment information configured by a tester aiming at the initial test case, and generating a target test case;
an execution step: controlling the execution machine to execute the target test case, receiving the execution result fed back by the execution machine, and feeding the result back to the terminal in a preset form.
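The four claimed steps can be pictured as a small pipeline. The sketch below is purely illustrative: every name (`TASK_COMPONENT_MAP`, `generate_initial_case`, and so on) is a hypothetical stand-in, not the patent's actual implementation.

```python
# Hypothetical sketch of the four steps: receive a test task, assemble an
# initial case from the preset task-to-component mapping, bind environment
# information to produce the target case, then execute and collect a result.
TASK_COMPONENT_MAP = {
    "open_account": ["login", "create_account", "verify_account"],
}

def generate_initial_case(task):
    # First generation step: look up the component set mapped to the task.
    return {"task": task, "steps": list(TASK_COMPONENT_MAP[task])}

def generate_target_case(initial_case, env):
    # Second generation step: bind the configured test environment info.
    case = dict(initial_case)
    case["env"] = env
    return case

def execute(case):
    # Stand-in for dispatching to an execution machine and collecting results.
    return {"case": case["task"], "env": case["env"]["name"], "status": "passed"}

initial = generate_initial_case("open_account")
target = generate_target_case(initial, {"name": "uat", "ip": "10.0.0.1", "port": 8080})
result = execute(target)
```

The same initial case can be re-bound to different environment dictionaries, which is the reuse the summary claims.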
Preferably, before the receiving step, the method further comprises:
generating a single atomic interface corresponding to each test task, and generating an atomic interface set;
packaging the atomic interface according to different business logics to generate a component set;
predetermining and storing mapping relations between different test tasks and components and logic relations between components corresponding to the test tasks;
and predetermining new addition and modification permission white lists of interfaces and components.
Preferably, when the test task includes a plurality of different test tasks, the first generating step further includes:
dividing the test task into a plurality of sub-test tasks;
respectively storing component subsets corresponding to the sub-test tasks and logic configuration relations among the components in the component subsets;
and connecting the components in the component subset in series according to the logic configuration relation to generate sub-test cases corresponding to the sub-test tasks, and combining the sub-test cases into a complete test case.
Preferably, the second generating step further comprises:
and receiving the execution sequence, the execution environment and the execution machine for executing the test cases, which are set by the tester, and the set execution conditions and execution time.
Preferably, when a plurality of test cases need to be executed simultaneously, the executing step further includes:
distributing the target test cases to a plurality of execution machines to be executed simultaneously based on a preset load balancing rule;
and receiving the execution results fed back by the multiple execution machines, integrating and summarizing the execution results, and generating the execution results of the target test case.
Preferably, the load balancing rule is:
acquiring the total number of test cases waiting to be executed on each execution machine, and distributing the target test case to the execution machine with the fewest pending cases; or, alternatively,
distributing the target test cases to the execution machines for execution in a predetermined polling order over the execution machines.
In addition, the present invention also provides an electronic device comprising a memory, a processor, and a display. An automated test program is stored in the memory, and when the program is executed by the processor, the following steps are realized:
a receiving step: receiving a test request sent by a tester through a terminal, wherein the test request comprises test task information;
a first generation step: analyzing the test task information, determining a component set corresponding to the test task information and a serial logic configuration relation corresponding to the component set and the test task according to a preset mapping relation between the test task and the components, connecting the components in the component set in series according to the serial logic configuration relation to generate an initial test case corresponding to the test task, and storing the initial test case to a preset storage path;
a second generation step: receiving test environment information configured by a tester aiming at the initial test case, and generating a target test case;
an execution step: controlling the execution machine to execute the target test case, receiving the execution result fed back by the execution machine, and feeding the result back to the terminal in a preset form.
Preferably, when the test task includes a plurality of different test tasks, the first generating step further includes:
dividing the test task into a plurality of sub-test tasks;
respectively storing component subsets corresponding to the sub-test tasks and logic configuration relations among the components in the component subsets;
and connecting the components in the component subset in series according to the logic configuration relation to generate sub-test cases corresponding to the sub-test tasks, and combining the sub-test cases into a complete test case.
Preferably, the first generating step further comprises:
when the interface and component corresponding to the test task do not exist, generating prompt information and feeding it back to the terminal so that a tester can write them;
and receiving the newly added interface and component from the terminal, then executing the subsequent test case generation.
In addition, to achieve the above object, the present invention further provides a computer-readable storage medium that includes an automated test program; when the program is executed by a processor, the automated testing method described above is implemented.
With the automated testing method, device, and computer-readable storage medium described above, after a test case generation request is obtained, the test task information carried in the request is parsed to determine the component set corresponding to the task and the serial logic configuration relation between that set and the task. The components are then connected in series according to that relation to generate the initial test case, which guarantees that the interfaces drawn from the component set conform to the configured logic and that the interfaces are combined in the correct serial order. A target test case is then generated from the test environment information configured for the initial case, which improves generation efficiency; storing the generated target test case in a database makes it queryable and reusable.
Drawings
FIG. 1 is a diagram of an electronic device according to a preferred embodiment of the present invention;
FIG. 2 is a flow chart of a preferred embodiment of the automated testing method of the present invention;
FIG. 3 is a schematic diagram of a platform architecture implemented to run the automated test program of FIG. 1.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Fig. 1 is a schematic diagram of an electronic device 10 according to a preferred embodiment of the invention.
In this embodiment, the electronic device 10 may be a rack server, a blade server, or a tower server, and the electronic device 10 may be an independent server or a server cluster formed by a plurality of servers.
The electronic device 10 includes, but is not limited to: a memory 103, a processor 100, a display 101, and a network interface 102. The electronic device 10 connects to a network through the network interface 102 to obtain raw data. The network may be a wireless or wired network such as an intranet, the Internet, a Global System for Mobile Communications (GSM) network, a Wideband Code Division Multiple Access (WCDMA) network, a 4G or 5G network, Bluetooth, or Wi-Fi.
The memory 103 includes at least one type of readable storage medium, such as flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, or an optical disk. In some embodiments, the memory 103 may be an internal storage unit of the electronic device 10, such as its hard disk or memory. In other embodiments, the memory 103 may be an external storage device of the electronic device 10, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash card. Of course, the memory 103 may also include both an internal storage unit and an external storage device of the electronic device 10. In this embodiment, the memory 103 is generally used to store the operating system and the application software installed on the electronic device 10, such as the automated test platform program 104.
The processor 100 may be a Central Processing Unit (CPU), controller, microcontroller, microprocessor, or other data Processing chip in some embodiments. The processor 100 is generally configured to control the operation of the electronic device 10, such as performing data interaction or communication related control and processing. In this embodiment, the processor 100 is configured to run the program code stored in the memory 103 or process data, such as the program code of the automated test platform 104.
The display 101 may be referred to as a display screen or display unit. In some embodiments, the display may be an LED display, a liquid crystal display, a touch-sensitive liquid crystal display, an Organic Light-emitting diode (OLED) touch panel, or the like. The display 101 is used for displaying information processed in the electronic device 10 and for displaying a visual work interface, e.g. displaying the results of targeted test case feedback.
The network interface 102 may optionally include a standard wired interface, a wireless interface (e.g., a WI-FI interface), and the network interface 102 is generally used to establish a communication link between the electronic apparatus 10 and other electronic devices.
FIG. 1 only shows the electronic device 10 with the components 100, 103 and the automated test program 104, but it should be understood that not all of the shown components are required and that more or fewer components may alternatively be implemented.
Optionally, the electronic device 10 may further include a tester interface, which may include an input unit such as a Keyboard (Keyboard), a voice output device such as a speaker, a headset, etc., and optionally may further include a standard wired interface, a wireless interface.
Optionally, the electronic device 10 further comprises a touch sensor. The area provided by the touch sensor and used for a tester to perform touch operation is called a touch area. Further, the touch sensor described herein may be a resistive touch sensor, a capacitive touch sensor, or the like. The touch sensor may include not only a contact type touch sensor but also a proximity type touch sensor. Further, the touch sensor may be a single sensor, or may be a plurality of sensors arranged in an array, for example. The tester may initiate the automated test program 104 by touching the touch area.
The area of the display 101 of the electronic device 10 may be the same as or different from the area of the touch sensor. Optionally, the display 101 is stacked with the touch sensor to form a touch display screen. The electronic device 10 detects a touch operation triggered by a tester based on a touch display screen.
The electronic device 10 may further include Radio Frequency (RF) circuitry, sensors, audio circuitry, and the like, which are not described in detail herein.
FIG. 2 is a flowchart illustrating an embodiment of the automated testing method implemented by the automated testing program 104 of FIG. 1 when executed by the processor 100. In this embodiment, the method includes:
step S1: and receiving a test request sent by a tester through a terminal, wherein the test request comprises test task information.
A typical test comprises the following steps: generating a test case for the test task, referencing test data to execute the case, generating a test result against which the case is examined, and executing the test operation periodically.
The test task refers to a task determined according to an actual test requirement, for example: the account opening service, the deposit service, the balance inquiry service and the like can be used as test tasks.
Step S2: analyzing the test task information, determining a component set corresponding to the test task information and a serial logic configuration relation corresponding to the component set and the test task according to a preset mapping relation between the test task and the components, connecting the components in the component set in series according to the serial logic configuration relation to generate an initial test case corresponding to the test task, and storing the initial test case to a preset storage path.
Generating a test case means producing a case that meets the actual test requirement. A test case is a description of a test task performed on a specific software product; it is a test data set embodying the test scheme, method, technique, strategy, and so on. Its content may include the test objects, test environment, input data, test steps, expected results, and test scripts, which together form a test document.
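The serial connection in step S2 can be sketched as a chain in which each component receives the accumulated context and passes its output forward. The component functions below (`login`, `create_account`, `query_balance`) are illustrative placeholders, not the patent's real components.

```python
# Illustrative serial connection of components: the configured order is a
# list of component callables; each component's output context becomes the
# next component's input context.
def login(ctx):
    return {**ctx, "session": "tok-1"}

def create_account(ctx):
    return {**ctx, "account_id": "A001"}

def query_balance(ctx):
    return {**ctx, "balance": 0}

# Serial logic configuration relation: the order in which components chain.
SERIAL_CONFIG = [login, create_account, query_balance]

def run_case(serial_config, ctx=None):
    ctx = ctx or {}
    for component in serial_config:
        ctx = component(ctx)   # output of one component feeds the next
    return ctx

result = run_case(SERIAL_CONFIG)
```

Because the chain is just data (a list of components), reordering or reusing components across cases needs no code changes, which mirrors the reuse goal stated above.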
Step S3: and receiving test environment information configured by the tester aiming at the initial test case, and generating a target test case.
By configuring the tested-environment information, when the adapter subsequently runs within the test case (or test case set), the IP and port it uses are switched according to the configured information, realizing the switching of the case's (or case set's) test environment. The target test case (or case set) is the case that meets the actual test requirement, that is, the case matched simultaneously to the test task and to the tested-environment information. It can be understood that when the tested-environment information for the same test task differs, the target test case (or case set) also differs, and that setting tested-environment information distinguishes different application environments, increasing the flexibility of interface, component, and case application.
Unlike the prior art, in which a single interface must be split into multiple interfaces, one per set of application environment information, and then screened by environment, this method decouples interfaces, components, cases, and environments from one another. This reduces the number of interfaces, allows interfaces and test cases to be reused, and yields high generality.
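The decoupling of interface definitions from environments can be sketched minimally: the interface stores only its path, and the environment supplies the IP and port at run time. The environment names and addresses below are purely illustrative.

```python
# Minimal sketch of interface/environment decoupling: one interface
# definition is reused unchanged, and the adapter substitutes the IP and
# port of whichever environment the case set selects.
ENVIRONMENTS = {
    "sit": {"ip": "192.168.1.10", "port": 8080},
    "uat": {"ip": "192.168.2.10", "port": 9090},
}

def build_url(interface_path, env_name):
    env = ENVIRONMENTS[env_name]
    return f"http://{env['ip']}:{env['port']}{interface_path}"

# The same interface path serves every environment without duplication.
sit_url = build_url("/account/open", "sit")
uat_url = build_url("/account/open", "uat")
```

In the prior-art scheme each environment would need its own copy of the `/account/open` interface; here switching environments is a one-argument change.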
Step S4: and controlling the execution machine to execute the target test case, receiving an execution result fed back by the execution machine, and feeding back the execution result to the terminal in a preset form.
In another embodiment, prior to the receiving step, the method further comprises:
generating a single atomic interface corresponding to each test task, and generating an atomic interface set;
packaging the atomic interface according to different business logics to generate a component set;
predetermining and storing mapping relations between different test tasks and components and logic relations between components corresponding to the test tasks;
and predetermining new addition and modification permission white lists of interfaces and components.
In another embodiment, when the test task includes a plurality of different test tasks, the step S2 further includes:
dividing the test task into a plurality of sub-test tasks;
respectively storing component subsets corresponding to the sub-test tasks and logic configuration relations among the components in the component subsets;
and connecting the components in each component subset in series according to its logic configuration relation to generate a sub-test case for each sub-test task, then combining the sub-test cases into a complete test case. For example, the account opening service, the deposit service, and the balance inquiry service can each serve as a sub-test task. Each sub-test task corresponds uniquely to a transaction code, so in this step the atomic test task is combined with the pre-developed single atomic interface of the system under test to build the atomic interface set, and the stored logic relations allow the associated interface components to be connected in series automatically, completing the component calls for the business test. A large test task can typically be decomposed into many smaller sub-test tasks, so that the larger task can be completed.
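The decomposition just described can be sketched as: split the composite task into sub-tasks, generate a sub-case from each sub-task's component subset, and concatenate the sub-cases. The sub-task names and component mappings below are hypothetical.

```python
# Hedged sketch of sub-task decomposition: each sub-test task maps to a
# component subset; sub-cases are generated per sub-task and combined into
# one complete case.
SUBTASK_COMPONENTS = {
    "open_account": ["login", "create_account"],
    "deposit": ["login", "deposit_funds"],
    "query_balance": ["login", "get_balance"],
}

def generate_sub_case(subtask):
    # One sub-case per sub-task, built from its stored component subset.
    return {"subtask": subtask, "steps": SUBTASK_COMPONENTS[subtask]}

def generate_complete_case(subtasks):
    # The complete case is the ordered combination of the sub-cases.
    return [generate_sub_case(s) for s in subtasks]

complete = generate_complete_case(["open_account", "deposit", "query_balance"])
```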
In another embodiment, the step S3 further includes:
and receiving the execution sequence, the execution environment and the execution machine for executing the test cases, which are set by the tester, and the set execution conditions and execution time.
In another embodiment, when a plurality of test cases are to be executed simultaneously, the executing step further includes:
distributing the target test cases to a plurality of execution machines to be executed simultaneously based on a preset load balancing rule;
and receiving the execution results fed back by the multiple execution machines, integrating and summarizing the execution results, and generating the execution results of the target test case.
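The two load-balancing rules stated earlier (fewest pending cases, or polling in a fixed order) can be sketched directly. Executor names and queue counts are illustrative.

```python
# Sketch of the two load-balancing rules for distributing target test
# cases to execution machines.
import itertools

def least_loaded(pending):
    # Rule 1: pick the execution machine with the fewest cases to execute.
    return min(pending, key=pending.get)

def round_robin(executors):
    # Rule 2: poll execution machines in a predetermined order.
    return itertools.cycle(executors)

pending = {"exec-1": 5, "exec-2": 2, "exec-3": 7}
chosen = least_loaded(pending)              # exec-2 has the fewest pending

rr = round_robin(["exec-1", "exec-2"])
order = [next(rr) for _ in range(3)]        # cycles through the fixed order
```

Either rule keeps the dispatcher stateless apart from the pending counts or the cycle position, which suits distributing many case sets across machines.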
Fig. 3 is a schematic diagram of a platform architecture implemented to run the automated test program 104 in fig. 1. The platform comprises a management examination layer, a system layer, an interface layer, a component layer, a case set layer and an execution machine layer.
In the present embodiment, the processor 100 implements the automated testing method including steps S1-S4 as shown in fig. 2 when executing the computer program of the automated testing program 104 stored in the memory 103. An implementation of the present invention will now be described in conjunction with fig. 3.
The management auditing layer lets a tester define audit permissions. For example, for the test tasks of step S1, the audit and viewing permissions of the various test tasks (such as the account opening, deposit, and balance inquiry services) can be set through this layer.
The system layer lets a tester (such as a programmer) associate different adapters, such as HTTP, MQ, or ESB, according to the interface communication protocol of the system. When an adapter runs within a case set, it uses the tested-environment information (including the IP and port) set in that case set; for example, in steps S2-S3 the IP and port used by the adapter are replaced with those of the system-under-test environment configured in the case set, which ensures a case can be switched from one test environment to another. The interface layer lets a tester set the value rule and default value of each input field of an interface.
The component layer comprises single-interface components, SQL components, function components, and composite components. A single-interface component may override the default input values of the interface layer's interface template. A SQL component encapsulates business logic for preparing and fetching data. A function component encapsulates the logic for converting the inputs and outputs of other components. Single-interface, SQL, and function components are called atomic components; in step S2, for example, the component set can combine atomic components arbitrarily according to the business logic and package several atomic components into individual business function components to match the test requirements. This makes components easy to reference and adjust when writing a case: the case writer need only reference components and drag data.
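The packaging of atomic components into a business composite can be sketched as follows. The three atomic components and the `make_composite` helper are illustrative stand-ins for the patent's component layer.

```python
# Illustrative composite component: atomic components (single-interface,
# SQL, function) are packaged into one business component that a case
# writer references as a single unit.
def interface_component(ctx):
    # Atomic: stand-in for calling a single interface.
    return {**ctx, "response": {"code": 0}}

def sql_component(ctx):
    # Atomic: stand-in for an encapsulated data fetch.
    return {**ctx, "db_row": {"balance": 100}}

def function_component(ctx):
    # Atomic: converts another component's output into a case-level field.
    return {**ctx, "balance": ctx["db_row"]["balance"]}

def make_composite(*atoms):
    # Package any sequence of atomic components into one business component.
    def composite(ctx):
        for atom in atoms:
            ctx = atom(ctx)
        return ctx
    return composite

check_balance = make_composite(interface_component, sql_component, function_component)
out = check_balance({})
```

A case writer then references `check_balance` as one step instead of wiring three atomic components each time.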
The case layer lets testers set default input values for case data; partial cases can be generated according to the rules of the component layer, supplying the business logic (see steps S3-S4). A case set in the case-set layer is a collection of test cases; through it, testers set the execution order, execution environment, and execution machine for cases, and can also set execution conditions and execution times.
The execution machine layer stores information about the execution machines, including their test case execution results and logs. For example, in step S4 different case sets can be distributed to different execution machines, and the execution results and logs are summarized for subsequent statistical analysis.
It should be noted that the interface template layer is the lowest layer, followed in order by the component layer, the case layer, and the case-set layer. Interface field values in cases follow a hierarchical reference-and-override relation: each layer directly references the default values of the layer beneath it, and any updated values it sets override those defaults. This relation ensures that when an interface changes to meet a business requirement, the cases that reference it need not be modified; only the interface at the interface layer and the field values at the component layer need updating. Without the layered design, once hundreds of cases had been written against some basic common interface, modifying that interface would break all of them, forcing a large number of cases to be modified and re-debugged at enormous cost. The layered design lays a foundation for unified test case management and report statistics, and also for extensions such as integrated performance testing and automated UI testing.
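The hierarchical reference-and-override relation amounts to a layered merge of field dictionaries. The layer contents below are illustrative; only the merge rule reflects the description.

```python
# Minimal sketch of layered field resolution: later layers override only
# the fields they set, inheriting everything else from the base layers.
def resolve_fields(*layers):
    """Merge layers in order (base first); later layers win on conflicts."""
    resolved = {}
    for layer in layers:   # interface template -> component -> case
        resolved.update(layer)
    return resolved

interface_defaults = {"currency": "CNY", "channel": "web", "amount": 0}
component_overrides = {"channel": "mobile"}   # overrides one interface default
case_overrides = {"amount": 500}              # overrides one more field

fields = resolve_fields(interface_defaults, component_overrides, case_overrides)
```

Changing a default in `interface_defaults` now propagates to every case automatically, which is exactly the maintenance saving the layered design claims.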
If a written test case needs to be copied and adjusted, a tester can modify it at the component layer or case layer to reuse the case.
When an instruction to add or modify a target test case, component, or interface is received, it carries the identity information of the instruction sender. The identity information is verified to judge whether the sender has add/modify permission. If so, the added or modified cases, components, and interfaces are sent to a second preset terminal for audit, and the audit result is received from that terminal; if the audit passes, the add or modify operation is applied to the target test case, component, or interface. If verification fails or the sender lacks permission, early-warning information is generated and fed back to the sender, and the instruction is rejected.
In addition, an embodiment of the present invention further provides a computer-readable storage medium that includes the automated test program 104; when executed by a processor, the automated test program 104 implements the following steps:
s1: receiving a test request sent by a tester through a terminal, wherein the test request comprises test task information;
s2: analyzing the test task information, determining a component set corresponding to the test task information and a serial logic configuration relation corresponding to the component set and the test task according to a preset mapping relation between the test task and the components, connecting the components in the component set in series according to the serial logic configuration relation to generate an initial test case corresponding to the test task, and storing the initial test case to a preset storage path;
s3: receiving test environment information configured by a tester aiming at the initial test case, and generating a target test case;
S4: controlling the execution machine to execute the target test case, receiving the execution result fed back by the execution machine, and feeding the result back to the terminal in a preset form.
Preferably, before the receiving step, the method further comprises:
generating a single atomic interface corresponding to each test task, and generating an atomic interface set;
packaging the atomic interface according to different business logics to generate a component set;
predetermining and storing mapping relations between different test tasks and components and logic relations between components corresponding to the test tasks;
and predetermining new addition and modification permission white lists of interfaces and components.
Preferably, when the test task includes a plurality of different test tasks, the first generating step further includes:
dividing the test task into a plurality of sub-test tasks;
respectively storing component subsets corresponding to the sub-test tasks and logic configuration relations among the components in the component subsets;
and connecting the components in the component subset in series according to the logic configuration relation to generate sub-test cases corresponding to the sub-test tasks, and combining the sub-test cases into a complete test case.
Preferably, the second generating step further comprises:
and receiving the execution sequence, the execution environment and the execution machine for executing the test cases, which are set by the tester, and the set execution conditions and execution time.
Preferably, when a plurality of test cases need to be executed simultaneously, the execution step further comprises:
distributing the target test cases to a plurality of execution machines for simultaneous execution based on a preset load balancing rule;
and receiving the execution results fed back by the execution machines, and integrating and summarizing them to generate the execution result of the target test cases.
Preferably, the load balancing rule is:
acquiring the total number of pending test cases on each execution machine, and assigning the target test case to the execution machine with the fewest pending test cases; or
distributing the target test cases to the execution machines in a polling order according to a predetermined sequence of execution machines.
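The two load-balancing rules above, least-loaded assignment and round-robin polling, can be sketched as follows. Machine names and queue contents are illustrative.

```python
import itertools

def assign_least_loaded(pending, case):
    """Rule 1: send the case to the machine with the fewest pending cases."""
    machine = min(pending, key=lambda m: len(pending[m]))
    pending[machine].append(case)
    return machine

def make_round_robin(machines):
    """Rule 2: cycle through machines in a predetermined polling order."""
    cycle = itertools.cycle(machines)
    def assign(pending, case):
        machine = next(cycle)
        pending[machine].append(case)
        return machine
    return assign

# Three execution machines with differently loaded pending queues.
pending = {"m1": ["a", "b"], "m2": ["c"], "m3": []}
chosen = assign_least_loaded(pending, "new_case")   # picks "m3" (empty queue)
```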
The embodiment of the computer-readable storage medium of the present invention is substantially the same as the embodiment of the automated testing method described above, and is not repeated here.
The above serial numbers of the embodiments of the present invention are for description only and do not indicate the relative merits of the embodiments.
It should be noted that, in this document, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, so that a process, apparatus, article, or method that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such a process, apparatus, article, or method. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, apparatus, article, or method that includes that element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software together with a necessary general-purpose hardware platform, or alternatively by hardware alone, although in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention may be embodied in the form of a software product stored in a storage medium (e.g., ROM/RAM, magnetic disk, or optical disk) as described above, including instructions for enabling a terminal device (e.g., a mobile phone, computer, server, or network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention and is not intended to limit its scope; all equivalent structural or process modifications made using the contents of this specification and the accompanying drawings, whether applied directly or indirectly in other related technical fields, fall within the scope of the present invention.

Claims (10)

1. An automated testing method applied to an electronic device, the method comprising:
a receiving step: receiving a test request sent by a tester through a terminal, wherein the test request comprises test task information;
a first generation step: parsing the test task information; determining, according to a preset mapping relation between test tasks and components, the component set corresponding to the test task information and the serial logic configuration relation between that component set and the test task; connecting the components in the component set in series according to the serial logic configuration relation to generate an initial test case corresponding to the test task; and storing the initial test case to a preset storage path;
a second generation step: receiving test environment information configured by a tester for the initial test case, and generating a target test case;
an execution step: controlling an execution machine to execute the target test case, receiving the execution result fed back by the execution machine, and feeding back the execution result to the terminal in a preset form.
2. The automated testing method of claim 1, wherein, before the receiving step, the method further comprises:
generating a single atomic interface for each test task, thereby forming an atomic interface set;
packaging the atomic interfaces according to different business logics to generate a component set;
predetermining and storing the mapping relations between different test tasks and components, and the logic relations between the components corresponding to each test task;
and predetermining whitelists of permissions for adding and modifying interfaces and components.
3. The automated testing method of claim 1, wherein, when the test task comprises a plurality of different test tasks, the first generation step further comprises:
dividing the test task into a plurality of sub-test tasks;
storing, for each sub-test task, the corresponding component subset and the logic configuration relations among the components in that subset;
and connecting the components in each component subset in series according to the logic configuration relation to generate a sub-test case for each sub-test task, and combining the sub-test cases into a complete test case.
4. The automated testing method of claim 1, wherein the second generation step further comprises:
receiving the execution sequence, execution environment, and execution machine for the test cases, together with the execution conditions and execution time, all set by the tester.
5. The automated testing method of claim 1, wherein, when a plurality of test cases need to be executed simultaneously, the execution step further comprises:
distributing the target test cases to a plurality of execution machines for simultaneous execution based on a preset load balancing rule;
and receiving the execution results fed back by the execution machines, and integrating and summarizing them to generate the execution result of the target test cases.
6. The automated testing method of claim 5, wherein the load balancing rule is:
acquiring the total number of pending test cases on each execution machine, and assigning the target test case to the execution machine with the fewest pending test cases; or
distributing the target test cases to the execution machines in a polling order according to a predetermined sequence of execution machines.
7. An electronic device, comprising a memory, a processor, and a display, wherein the memory stores an automated test program which, when executed by the processor, implements the following steps:
a receiving step: receiving a test request sent by a tester through a terminal, wherein the test request comprises test task information;
a first generation step: parsing the test task information; determining, according to a preset mapping relation between test tasks and components, the component set corresponding to the test task information and the serial logic configuration relation between that component set and the test task; connecting the components in the component set in series according to the serial logic configuration relation to generate an initial test case corresponding to the test task; and storing the initial test case to a preset storage path;
a second generation step: receiving test environment information configured by a tester for the initial test case, and generating a target test case;
an execution step: controlling an execution machine to execute the target test case, receiving the execution result fed back by the execution machine, and feeding back the execution result to the terminal in a preset form.
8. The electronic device of claim 7, wherein, when the test task comprises a plurality of different test tasks, the first generation step further comprises:
dividing the test task into a plurality of sub-test tasks;
storing, for each sub-test task, the corresponding component subset and the logic configuration relations among the components in that subset;
and connecting the components in each component subset in series according to the logic configuration relation to generate a sub-test case for each sub-test task, and combining the sub-test cases into a complete test case.
9. The electronic device of claim 7, wherein the first generation step further comprises:
when no interface and component corresponding to the test task exist, generating prompt information and feeding it back to the terminal so that a tester can write a test case;
and receiving the interface and component newly added from the terminal, and performing the subsequent test case generation operations.
10. A computer-readable storage medium comprising an automated test program that, when executed by a processor, implements the steps of the automated testing method of any one of claims 1 to 6.
CN201910882507.7A 2019-09-18 2019-09-18 Automatic testing method and device and storage medium Pending CN110781074A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910882507.7A CN110781074A (en) 2019-09-18 2019-09-18 Automatic testing method and device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910882507.7A CN110781074A (en) 2019-09-18 2019-09-18 Automatic testing method and device and storage medium

Publications (1)

Publication Number Publication Date
CN110781074A true CN110781074A (en) 2020-02-11

Family

ID=69383754

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910882507.7A Pending CN110781074A (en) 2019-09-18 2019-09-18 Automatic testing method and device and storage medium

Country Status (1)

Country Link
CN (1) CN110781074A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111506501A (en) * 2020-04-13 2020-08-07 杭州涂鸦信息技术有限公司 Test instruction set generation method and device and electronic equipment
CN111930363A (en) * 2020-08-07 2020-11-13 北京字节跳动网络技术有限公司 Block interface code generation method and device
CN112241367A (en) * 2020-09-25 2021-01-19 建信金融科技有限责任公司 Data line testing method and device
CN113434397A (en) * 2021-06-23 2021-09-24 平安银行股份有限公司 Task system testing method and device, electronic equipment and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108710569A (en) * 2018-05-09 2018-10-26 深圳壹账通智能科技有限公司 Test cases generation method, data creation method, device, equipment and medium
CN108763085A (en) * 2018-05-31 2018-11-06 平安科技(深圳)有限公司 On-line debugging device, method and the computer readable storage medium of test cases
WO2019165691A1 (en) * 2018-02-28 2019-09-06 平安科技(深圳)有限公司 Method, apparatus and device for automatically generating test case, and readable storage medium

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019165691A1 (en) * 2018-02-28 2019-09-06 平安科技(深圳)有限公司 Method, apparatus and device for automatically generating test case, and readable storage medium
CN108710569A (en) * 2018-05-09 2018-10-26 深圳壹账通智能科技有限公司 Test cases generation method, data creation method, device, equipment and medium
CN108763085A (en) * 2018-05-31 2018-11-06 平安科技(深圳)有限公司 On-line debugging device, method and the computer readable storage medium of test cases

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111506501A (en) * 2020-04-13 2020-08-07 杭州涂鸦信息技术有限公司 Test instruction set generation method and device and electronic equipment
CN111506501B (en) * 2020-04-13 2023-09-26 杭州涂鸦信息技术有限公司 Method and device for generating test instruction set and electronic equipment
CN111930363A (en) * 2020-08-07 2020-11-13 北京字节跳动网络技术有限公司 Block interface code generation method and device
CN111930363B (en) * 2020-08-07 2023-11-24 抖音视界有限公司 Block interface code generation method and device
CN112241367A (en) * 2020-09-25 2021-01-19 建信金融科技有限责任公司 Data line testing method and device
CN112241367B (en) * 2020-09-25 2022-09-13 建信金融科技有限责任公司 Data line testing method and device
CN113434397A (en) * 2021-06-23 2021-09-24 平安银行股份有限公司 Task system testing method and device, electronic equipment and storage medium
CN113434397B (en) * 2021-06-23 2022-04-08 平安银行股份有限公司 Task system testing method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN110781074A (en) Automatic testing method and device and storage medium
CN109582303B (en) General component calling method, device, computer equipment and storage medium
US11099973B2 (en) Automated test case management systems and methods
US10296563B2 (en) Automated testing of perceptible web page elements
US8793660B2 (en) Automated testing of programming code for a web service
US9076072B2 (en) System and method for web page rendering test automation suite
CN109032824A (en) Database method of calibration, device, computer equipment and storage medium
US9053231B2 (en) Systems and methods for analyzing operations in a multi-tenant database system environment
CN109474578A (en) Message method of calibration, device, computer equipment and storage medium
EP3468145B1 (en) Automated vulnerability grouping
CN104133772A (en) Automatic test data generation method
WO2021022713A1 (en) Distributed module update method, device, and storage medium
CN111190603B (en) Private data detection method and device and computer readable storage medium
CN111831191A (en) Workflow configuration method and device, computer equipment and storage medium
CN114531477B (en) Method and device for configuring functional components, computer equipment and storage medium
US20130014082A1 (en) Method of configuring business logic supporting multi-tenancy
US11250029B2 (en) Information processing system and classification method
US8984514B2 (en) Modifying scheduled execution of object modification methods associated with database objects
CN112948099B (en) Task issuing flow method and device, computer equipment and storage medium
CN107943504A (en) The dynamic update method of software module in a kind of embedded system
CN103186384A (en) Business-component-oriented software designing and analyzing system and using method thereof
CN113377346B (en) Integrated environment building method and device, electronic equipment and storage medium
CN108776601A (en) Loading method, device, equipment and the storage medium of the Android H5 pages
US11227005B2 (en) Gesture-based database actions
CN103914505A (en) Information management method and information management device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination