CN111813685B - Automatic test method and device - Google Patents

Automatic test method and device

Info

Publication number
CN111813685B
CN111813685B
Authority
CN
China
Prior art keywords
test
application
case
cases
test cases
Prior art date
Legal status
Active
Application number
CN202010692534.0A
Other languages
Chinese (zh)
Other versions
CN111813685A (en)
Inventor
李秋林
金伟光
陆帅忠
王亮
Current Assignee
Jingdong Technology Holding Co Ltd
Original Assignee
Jingdong Technology Holding Co Ltd
Priority date
Filing date
Publication date
Application filed by Jingdong Technology Holding Co Ltd
Priority to CN202010692534.0A
Publication of CN111813685A
Application granted
Publication of CN111813685B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3684 Test management for test design, e.g. generating new test cases
    • G06F 11/3688 Test management for test execution, e.g. scheduling of test suites

Abstract

Embodiments of the present disclosure disclose an automated testing method and apparatus. One embodiment of the method comprises: receiving a test task for a target application; acquiring a set of test cases associated with the target application; grouping the test cases in the set according to their attributes; and distributing each group of test cases to the test terminal corresponding to that group for execution. By associating atomic test cases with their corresponding applications (case-to-application association), this embodiment enables test cases to be executed promptly and improves the flexibility and utilization rate of the test cases.

Description

Automatic test method and device
Technical Field
Embodiments of the present disclosure relate to the field of computer technology, and in particular, to an automated testing method and apparatus.
Background
When existing methods execute test cases at a fixed point in time, problems in the code cannot be found promptly: for example, the code may be submitted for testing during the day, but the test task only runs in the early morning, so problems are not discovered in time. If the test frequency is increased, problems can be found sooner, but test resources are wasted. In addition, the prior art requires manually configuring which applications trigger which test cases, which is inconvenient to modify and makes it easy to miss test cases.
Disclosure of Invention
The embodiment of the disclosure provides an automatic test method and an automatic test device.
In a first aspect, embodiments of the present disclosure provide an automated testing method, comprising: receiving a test task for a target application; acquiring a set of test cases associated with the target application; grouping the test cases in the set according to attributes; and distributing each group of test cases to the test terminal corresponding to that group for execution.
In some embodiments, the method further comprises: obtaining an execution result of each test case in the test case set; and generating an execution result of the test task based on the execution result of each test case.
In some embodiments, the method further comprises: receiving test configuration information, wherein the test configuration information comprises a test case and the application name of at least one associated application; and storing the test case and the application name of the at least one associated application in an application association table.
In some embodiments, obtaining a set of test cases associated with a target application includes: at least one test case associated with the target application is searched from the application association table.
In some embodiments, the test configuration information further includes operation information; and distributing each group of test cases to the test terminal corresponding to that group for execution comprises: distributing each group of test cases to the test terminal corresponding to that group, so that the test terminal executes the test cases according to the operation information.
In some embodiments, the triggering manner of the test task includes at least one of: fixed point-in-time triggering, fixed time-interval triggering, and association triggering.
In some embodiments, the attributes include at least one of: interface type, operating system type, terminal type.
In a second aspect, embodiments of the present disclosure provide an automated testing apparatus comprising: a receiving unit configured to receive a test task for a target application; an acquisition unit configured to acquire a set of test cases associated with the target application; a grouping unit configured to group the test cases in the set according to attributes; and a sending unit configured to distribute each group of test cases to the test terminal corresponding to that group for execution.
In some embodiments, the apparatus further comprises an output unit configured to: obtaining an execution result of each test case in the test case set; and generating and outputting an execution result of the test task based on the execution result of each test case.
In some embodiments, the apparatus further comprises an association unit configured to: receive test configuration information of a test case, wherein the test configuration information comprises the test case and the application name of at least one associated application; and store the test case and the application name of the at least one associated application in an application association table.
In some embodiments, the acquisition unit is further configured to: at least one test case associated with the target application is searched from the application association table.
In some embodiments, the test configuration information further includes operation information; and the sending unit is further configured to: distribute each group of test cases to different test terminals, so that the test terminals execute the test cases according to the operation information.
In some embodiments, the triggering of the test task includes at least one of: fixed point in time triggers, fixed time interval triggers, association triggers.
In some embodiments, the attributes include at least one of: interface type, operating system type, terminal type.
In a third aspect, embodiments of the present disclosure provide an electronic device for automated testing, comprising: one or more processors; and a storage device having one or more programs stored thereon which, when executed by the one or more processors, cause the one or more processors to implement the method of any implementation of the first aspect.
In a fourth aspect, embodiments of the present disclosure provide a computer readable medium having a computer program stored thereon, wherein the program when executed by a processor implements a method as in any of the first aspects.
With the automated testing method and apparatus provided by the embodiments of the present disclosure, atomic test cases are associated with their corresponding applications (case-to-application association), so that as soon as an application is submitted for testing, the test cases associated with it can be found automatically. The scheme has the following advantages:
1. Problems in the program can be found promptly (the automated test cases are executed in time).
2. The test environment is affected as little as possible (the automated runs disturb the test environment minimally).
3. After the test cases are written, the cases to be executed are selected automatically, with little or no manual intervention.
Drawings
Other features, objects and advantages of the present disclosure will become more apparent upon reading of the detailed description of non-limiting embodiments, made with reference to the following drawings:
FIG. 1 is an exemplary system architecture diagram in which an embodiment of the present disclosure may be applied;
FIG. 2 is a flow chart of one embodiment of an automated test method according to the present disclosure;
FIG. 3 is a schematic illustration of one application scenario of an automated test method according to the present disclosure;
FIG. 4 is a flow chart of yet another embodiment of an automated test method according to the present disclosure;
FIG. 5 is a schematic structural diagram of one embodiment of an automated testing apparatus according to the present disclosure;
fig. 6 is a schematic diagram of a computer system suitable for use in implementing embodiments of the present disclosure.
Detailed Description
The present disclosure is described in further detail below with reference to the drawings and embodiments. It is to be understood that the specific embodiments described herein merely illustrate the relevant disclosure and do not limit it. It should also be noted that, for convenience of description, only the portions related to the present disclosure are shown in the drawings.
It should be noted that, without conflict, the embodiments of the present disclosure and features of the embodiments may be combined with each other. The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
FIG. 1 illustrates an exemplary system architecture 100 in which embodiments of the automated testing method or automated testing apparatus of the present disclosure may be applied.
As shown in fig. 1, a system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 is used as a medium to provide communication links between the terminal devices 101, 102, 103 and the server 105. The network 104 may include various connection types, such as wired, wireless communication links.
The user may interact with the server 105 via the network 104 using the terminal devices 101, 102, 103 to receive or send messages or the like. Various communication client applications, such as a test class application, a web browser application, a shopping class application, a search class application, an instant messaging tool, a mailbox client, social platform software, etc., may be installed on the terminal devices 101, 102, 103.
The terminal devices 101, 102, 103 may be hardware or software. When the terminal devices 101, 102, 103 are hardware, they may be various electronic devices having a display screen and supporting web browsing, including but not limited to smartphones, tablet computers, e-book readers, MP3 players, MP4 players, laptop computers, desktop computers, and the like. When the terminal devices 101, 102, 103 are software, they may be installed in the electronic devices listed above. They may be implemented as multiple pieces of software or software modules (e.g., to provide distributed services), or as a single piece of software or software module. No specific limitation is imposed here.
The server 105 may be a server providing various services, such as a background test server providing support for test cases compiled on the terminal devices 101, 102, 103. The background test server can analyze and otherwise process the received test-case-related data and feed the test results back to the terminal devices.
The server may be hardware or software. When the server is hardware, the server may be implemented as a distributed server cluster formed by a plurality of servers, or may be implemented as a single server. When the server is software, it may be implemented as a plurality of software or software modules (e.g., a plurality of software or software modules for providing distributed services), or as a single software or software module. The present application is not particularly limited herein.
It should be noted that the automated testing method provided by the embodiment of the present application is generally executed by the server 105, and accordingly, the automated testing apparatus is generally disposed in the server 105.
It should be understood that the number of terminal devices, networks and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continued reference to FIG. 2, a flow 200 of one embodiment of an automated test method in accordance with the present application is shown. The automatic test method comprises the following steps:
step 201, a test task for a target application is received.
In this embodiment, an execution body of the automated testing method (for example, the server shown in FIG. 1) may start an HTTP, socket, or other service to receive test tasks for a target application. One test task may include at least one target application.
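For illustration only (not part of the patent text), the following minimal sketch shows how such an execution body might expose an HTTP service that accepts test tasks naming one or more target applications. It uses only the Python standard library; the field name target_applications and the port are assumptions.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


class TestTaskHandler(BaseHTTPRequestHandler):
    """Accepts POSTed test tasks; a task may name one or more target applications."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length) if length else b"{}"
        task = json.loads(body)
        # A test task may include at least one target application.
        print("received test task for:", task.get("target_applications", []))
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"accepted")


if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), TestTaskHandler).serve_forever()
```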
In some optional implementations of this embodiment, the triggering manner of the test task includes at least one of: fixed point-in-time triggering, fixed time-interval triggering, and association triggering. For example, a fixed point in time or a fixed time interval triggers a test of the target application, which in turn triggers the associated test cases. A test may also be triggered through association with other applications: for example, a query application calls an authentication application, which triggers the test cases associated with the authentication application.
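As a rough sketch of the three trigger manners (the function names and callback are assumptions, not the patent's implementation), the fixed point-in-time, fixed-interval and association triggers could be wired up along these lines:

```python
import threading
import time


def trigger_test_task(app_name: str) -> None:
    """Hypothetical hook: enqueue a test task for the given application."""
    print(f"test task triggered for {app_name}")


def schedule_fixed_time(app_name: str, run_at_epoch_s: float) -> threading.Timer:
    """Fixed point-in-time trigger: fire once when the given wall-clock time arrives."""
    timer = threading.Timer(max(0.0, run_at_epoch_s - time.time()),
                            trigger_test_task, args=(app_name,))
    timer.start()
    return timer


def schedule_fixed_interval(app_name: str, interval_s: float,
                            stop: threading.Event) -> threading.Thread:
    """Fixed time-interval trigger: fire every interval_s seconds until stopped."""
    def loop() -> None:
        while not stop.wait(interval_s):
            trigger_test_task(app_name)
    worker = threading.Thread(target=loop, daemon=True)
    worker.start()
    return worker


def association_trigger(calling_app: str, called_app: str) -> None:
    """Association trigger: exercising the caller also triggers the callee's cases,
    e.g. a query application calling the authentication application."""
    trigger_test_task(calling_app)
    trigger_test_task(called_app)
```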
Step 202, a set of test cases associated with a target application is obtained.
In this embodiment, each target application may be associated with at least one test case. FIG. 3 illustrates an interface for creating a test case, in which the associated-applications field lists the applications related to the test case, so that the test case is automatically triggered when any of its associated applications is tested. One application can be associated with multiple test cases; for example, application X can be associated with test cases A, B, C and D, and if application X is submitted for testing, execution of test cases A, B, C and D is triggered. Similarly, one test case can be associated with multiple applications; for example, test case A can be associated with applications X, Y and Z. A test case therefore needs to be stored only once in the library yet can serve multiple applications, realizing reuse of test cases. For example, testing application X, Y or Z can trigger execution of the same test case A. A user can configure a test case through a web page interface to obtain test configuration information. The test configuration information may include basic information and execution operations. As shown in FIG. 3, the basic information may include a case name, at least one associated application, an owning module, a creator, an operating environment, a case description, a case type, and the like. After the tester creates the test case and fills in the configuration, the test configuration information is uploaded to the server. The server stores the test case and the application names of the at least one associated application in an application association table. After receiving a test task, the server can find the test cases associated with the application by looking up the table. In addition to the associated applications, operation information may be configured, such as the "perform operation" section shown in FIG. 3. The operations performed may include opening a link, clicking, inputting, asserting, and the like.
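A minimal sketch of the application association idea described above, assuming an in-memory mapping rather than a real database table; the case and application names are made up. It shows how one case can serve several applications while each application can fan out to several cases:

```python
from collections import defaultdict

# In-memory stand-in for the application association table: each test case
# lists the applications it is associated with.
case_to_apps = {
    "case_A": ["app_X", "app_Y", "app_Z"],
    "case_B": ["app_X"],
    "case_C": ["app_X"],
    "case_D": ["app_X"],
}

# Reverse index so the server can look up every case for an application.
app_to_cases: dict[str, set[str]] = defaultdict(set)
for case, apps in case_to_apps.items():
    for app in apps:
        app_to_cases[app].add(case)


def cases_for(target_app: str) -> set[str]:
    """Return the set of test cases associated with the target application."""
    return set(app_to_cases.get(target_app, set()))


# Submitting app_X for testing triggers cases A, B, C and D,
# while case_A is reused by applications X, Y and Z.
assert cases_for("app_X") == {"case_A", "case_B", "case_C", "case_D"}
assert "case_A" in cases_for("app_Y") and "case_A" in cases_for("app_Z")
```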
Step 203, grouping the test cases in the test case set according to the attribute.
In this embodiment, the attributes may include an interface type, an operating system type, and a terminal type. Interface types may include UI automation and interface (API) automation, and UI automation can be further subdivided into GUI automation and web automation. Operating system types may include Android, iOS, and the like. Terminal types may include mobile terminals, PCs, and the like, and may be further subdivided by brand. Each test case specifies its attributes when it is created, for example: Android system, web automation, HW handset. Certain attribute information may also take a default value; for example, if no operating system is specified, the test case may be used for any one or more operating systems, or for all operating systems.
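The grouping step could look roughly like the following sketch, which assumes the three attributes named above (interface type, operating system type, terminal type) and groups cases sharing the same attribute triple; the concrete case names and values are illustrative only:

```python
from collections import defaultdict
from dataclasses import dataclass


@dataclass(frozen=True)
class TestCase:
    name: str
    interface_type: str   # e.g. "web" or "gui" UI automation, or "api"
    os_type: str          # e.g. "android", "ios"
    terminal_type: str    # e.g. "HW handset", "PC", "iPad"


def group_by_attributes(cases):
    """Group cases so that each group shares the same (interface, OS, terminal) triple."""
    groups = defaultdict(list)
    for case in cases:
        groups[(case.interface_type, case.os_type, case.terminal_type)].append(case)
    return dict(groups)


sample = [
    TestCase("case_A", "web", "android", "HW handset"),
    TestCase("case_B", "gui", "android", "HW handset"),
    TestCase("case_C", "web", "ios", "iPad"),
]
for attrs, members in group_by_attributes(sample).items():
    print(attrs, [c.name for c in members])
```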
Step 204, distributing each group of test cases to the test terminal corresponding to that group for execution.
In this embodiment, a corresponding test terminal may be set in advance for each attribute combination. The test terminal may be software or hardware; if it is hardware, it may be an electronic device such as a mobile phone, a PC, or a tablet computer. The same group of test cases uses the same test terminal, and different groups may use the same or different test terminals to run tests with different attributes. For example, an Android HW handset may run the group whose attributes are Android system, web automation, HW handset; another Android HW handset may run the group whose attributes are Android system, GUI automation, HW handset; and an iPad may run the group whose attributes are iOS system, web automation, iPad.
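A possible dispatch step, under the assumption that a registry maps each attribute triple to a pre-configured test terminal; the terminal identifiers are invented for illustration:

```python
# Registry mapping an attribute triple to a pre-configured test terminal.
terminal_registry = {
    ("web", "android", "HW handset"): "android-hw-01",
    ("gui", "android", "HW handset"): "android-hw-01",
    ("web", "ios", "iPad"): "ipad-03",
}


def dispatch(groups: dict[tuple, list[str]]) -> None:
    """Send each group of test cases to the terminal registered for its attribute triple."""
    for attrs, case_names in groups.items():
        terminal = terminal_registry.get(attrs)
        if terminal is None:
            print(f"no terminal configured for {attrs}; skipping")
            continue
        print(f"dispatching {case_names} to {terminal}")


dispatch({
    ("web", "android", "HW handset"): ["case_A"],
    ("gui", "android", "HW handset"): ["case_B"],
    ("web", "ios", "iPad"): ["case_C"],
})
```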
In some optional implementations of this embodiment, the test configuration information further includes operation information. Distributing each group of test cases to different test terminals for execution includes: distributing each group of test cases together with its operation information to the corresponding test terminals, so that the test terminals execute the test cases according to the operation information. The operation information within one group of test cases is not necessarily the same; it may be set per test case. For example, for a group of test cases A, B and C whose attributes are Android system, web automation, HW handset, the operations of test case A may include only clicking, those of test case B only keyboard input, and those of test case C both clicking and keyboard input.
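The per-case operation information might be replayed by the test terminal roughly as follows; the step kinds mirror the operations listed above (open a link, click, input, assert), but the exact schema and field names are assumptions:

```python
def run_operations(case_name: str, operations: list[dict]) -> bool:
    """Replay a case's ordered operation steps; return False on a failed assertion."""
    for op in operations:
        kind = op["type"]
        if kind == "open_link":
            print(f"[{case_name}] open {op['url']}")
        elif kind == "click":
            print(f"[{case_name}] click {op['target']}")
        elif kind == "input":
            print(f"[{case_name}] type '{op['text']}' into {op['target']}")
        elif kind == "assert":
            if not op["condition"]:
                print(f"[{case_name}] assertion failed: {op.get('message', '')}")
                return False
        else:
            raise ValueError(f"unknown operation type: {kind}")
    return True


# A case that mixes opening a link, a click, keyboard input and an assertion.
run_operations("case_C", [
    {"type": "open_link", "url": "https://example.test/login"},
    {"type": "click", "target": "#login-button"},
    {"type": "input", "target": "#username", "text": "tester"},
    {"type": "assert", "condition": True, "message": "login page visible"},
])
```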
The method provided by this embodiment dynamically decomposes the test task and automatically finds the associated test cases: any test case associated with the application is found automatically, rather than each test case being manually added to the application's test case library. When an application has hundreds or thousands of related test cases across the system, it is difficult for a person to find all of them and add them to the application's library; in practice, a manual approach usually covers only the main test cases and overlooks the secondary ones. The technical scheme of the present disclosure can find all relevant test cases, thereby improving the utilization rate of test cases.
In the traditional approach, test cases must be searched manually for each application and then configured for that application before automated testing can run. In the present disclosure, test cases do not need to be searched manually: each person only needs to associate the test cases they have written with the relevant applications, and all test cases related to an application are executed whenever that application is tested. Each time a new application is developed, it only needs to be associated with the existing test cases; there is no need to copy a batch of existing test cases for the new application. After the new application has been associated with the existing test cases, a corresponding timed task is configured to trigger execution of the associated test cases at fixed times.
In actual use, a request to execute a task is triggered from a front-end terminal device and the related information is transmitted to the back-end server. The back-end server assembles test cases according to the application names contained in the request, generates a test task after processing, and inserts the task into a task table. The server's execution engine has a timed job that queries for test tasks at predetermined intervals, for example at a frequency of 10 s. If a test task exists, the execution engine parses it to obtain test cases, parses each test case into test sub-cases (for example, the sub-cases login and identity verification are combined into one test case so that the order of login and identity verification can be enforced), and then traverses and executes each test sub-case.
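The execution engine's timed polling of the task table and its traversal of sub-cases could be sketched as below; the 10 s frequency follows the example above, while the task structure and function names are hypothetical:

```python
import time
from queue import Empty, Queue

# Stand-in for the task table that the back-end server inserts test tasks into.
task_table: Queue = Queue()


def parse_into_sub_cases(task: dict) -> list[str]:
    """E.g. 'login' and 'identity verification' are kept as one ordered pair of sub-cases."""
    return task["sub_cases"]


def execution_engine(poll_interval_s: float = 10.0, max_polls: int = 3) -> None:
    """Timed job: query the task table at a fixed frequency and run any pending task."""
    for _ in range(max_polls):
        try:
            task = task_table.get_nowait()
        except Empty:
            time.sleep(poll_interval_s)
            continue
        for sub_case in parse_into_sub_cases(task):   # traverse sub-cases in order
            print(f"task {task['id']}: executing sub-case {sub_case}")


task_table.put({"id": 42, "sub_cases": ["login", "identity verification"]})
execution_engine(poll_interval_s=0.1)
```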
With further reference to fig. 4, a flow 400 of yet another embodiment of an automated test method is shown. The process 400 of the automated test method comprises the steps of:
step 401, a test task for a target application is received.
Step 402, a set of test cases associated with a target application is obtained.
Step 403, grouping the test cases in the test case set according to the attribute.
Step 404, distributing each group of test cases to the test terminal corresponding to that group for execution.
Steps 401-404 are substantially identical to steps 201-204 and are therefore not described in detail.
Step 405, obtaining an execution result of each test case in the test case set.
In this embodiment, an execution result is obtained after each test case is executed, and if a test case includes multiple test sub-cases, each test sub-case also has an execution result. The execution result may be success or failure, or other data such as image classification results.
Step 406, generating and outputting the execution result of the test task based on the execution result of each test case.
In this embodiment, the result of each test sub-case is collected after it is executed, and so on until all test sub-cases have been executed. The task execution results are then pushed to the back end according to the task id and the task table is updated. When the front end opens the task execution results, the results are fetched dynamically and displayed on the page.
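Aggregating sub-case results into a task-level result keyed by task id might look like this minimal sketch; the field names of the result row are assumptions:

```python
def aggregate(task_id: int, sub_case_results: dict[str, bool]) -> dict:
    """Fold per-sub-case pass/fail flags into one task-level result row."""
    passed = sum(sub_case_results.values())
    return {
        "task_id": task_id,
        "total": len(sub_case_results),
        "passed": passed,
        "failed": len(sub_case_results) - passed,
        "status": "success" if passed == len(sub_case_results) else "failure",
    }


# The back end would update the task table with this row; the front end then
# fetches it dynamically and renders it on the page.
print(aggregate(42, {"login": True, "identity verification": False}))
```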
Optionally, execution results under different attributes can be compared and analyzed, and differences between them found automatically, helping developers locate problems. For example, the image recognition accuracy under the Android and iOS operating systems can be compared.
With further reference to fig. 5, as an implementation of the method shown in the figures above, the present disclosure provides an embodiment of an automated testing apparatus, which corresponds to the method embodiment shown in fig. 2 and is applicable to various electronic devices.
As shown in fig. 5, the automated testing apparatus 500 of this embodiment includes: a receiving unit 501, an acquiring unit 502, a grouping unit 503, and a sending unit 504. The receiving unit 501 is configured to receive a test task for a target application; the acquiring unit 502 is configured to acquire a set of test cases associated with the target application; the grouping unit 503 is configured to group the test cases in the set according to attributes; and the sending unit 504 is configured to distribute each group of test cases to the test terminal corresponding to that group for execution.
In this embodiment, for the specific processing of the receiving unit 501, the acquiring unit 502, the grouping unit 503, and the sending unit 504 of the automated testing apparatus 500, reference may be made to steps 201, 202, 203 and 204 in the embodiment corresponding to fig. 2.
In some optional implementations of the present embodiment, the apparatus further includes an output unit (not shown in the drawings) configured to: obtaining an execution result of each test case in the test case set; and generating and outputting an execution result of the test task based on the execution result of each test case.
In some optional implementations of the present embodiment, the apparatus further includes an association unit (not shown in the drawings) configured to: receiving test configuration information of a test case, wherein the test configuration information comprises the test case and an application name of at least one associated application; and storing the test cases and the application names of at least one associated application into an application association table.
In some optional implementations of the present embodiment, the obtaining unit 502 is further configured to: at least one test case associated with the target application is searched from the application association table.
In some optional implementations of this embodiment, the test configuration information further includes operation information; and the sending unit 504 is further configured to: distribute each group of test cases to the test terminal corresponding to that group, so that the test terminal executes the test cases according to the operation information.
In some optional implementations of this embodiment, the triggering manner of the test task includes at least one of: fixed point in time triggers, fixed time interval triggers, association triggers.
In some alternative implementations of the present embodiment, the attribute includes at least one of: interface type, operating system type, terminal type.
Referring now to fig. 6, a schematic diagram of an electronic device (e.g., server or terminal device of fig. 1) 600 suitable for use in implementing embodiments of the present disclosure is shown. The terminal devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), car terminals (e.g., car navigation terminals), and the like, and stationary terminals such as digital TVs, desktop computers, and the like. The terminal device/server illustrated in fig. 6 is merely an example, and should not impose any limitation on the functionality and scope of use of embodiments of the present disclosure.
As shown in fig. 6, the electronic device 600 may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 601, which may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 602 or a program loaded from a storage means 608 into a Random Access Memory (RAM) 603. The RAM 603 also stores various programs and data required for the operation of the electronic device 600. The processing device 601, the ROM 602, and the RAM 603 are connected to each other through a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
In general, the following devices may be connected to the I/O interface 605: input devices 606 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, and the like; an output device 607 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 608 including, for example, magnetic tape, hard disk, etc.; and a communication device 609. The communication means 609 may allow the electronic device 600 to communicate with other devices wirelessly or by wire to exchange data. While fig. 6 shows an electronic device 600 having various means, it is to be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead. Each block shown in fig. 6 may represent one device or a plurality of devices as needed.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from a network via communication means 609, or from storage means 608, or from ROM 602. The above-described functions defined in the methods of the embodiments of the present disclosure are performed when the computer program is executed by the processing means 601. It should be noted that, the computer readable medium according to the embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In an embodiment of the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. Whereas in embodiments of the present disclosure, the computer-readable signal medium may comprise a data signal propagated in baseband or as part of a carrier wave, with computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
The computer readable medium may be contained in the electronic device; or may exist alone without being incorporated into the electronic device. The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: receiving a test task for a target application; acquiring a test case set associated with a target application; grouping the test cases in the test case set according to the attribute; distributing each group of test cases to the test end corresponding to the group of test cases for execution.
Computer program code for carrying out operations of embodiments of the present disclosure may be written in one or more programming languages, including an object oriented programming language such as Java, Smalltalk or C++ and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments described in the present disclosure may be implemented by software or by hardware. The described units may also be provided in a processor, for example described as: a processor including a receiving unit, an acquiring unit, a grouping unit, and a sending unit. In some cases, the name of a unit does not limit the unit itself; for example, the receiving unit may also be described as "a unit that receives a test task for a target application".
The foregoing description covers only the preferred embodiments of the present disclosure and the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention referred to in this disclosure is not limited to the specific combination of the features described above, but also encompasses other solutions formed by any combination of the above features or their equivalents without departing from the inventive concept, for example, solutions formed by replacing the above features with technical features having similar functions disclosed in (but not limited to) the present disclosure.

Claims (10)

1. An automated testing method comprising:
receiving a test task for a target application;
acquiring a set of test cases associated with the target application, wherein one test case is associated with a plurality of applications, and each test case has test configuration information comprising basic information and execution operations, wherein the basic information comprises a case name, at least one associated application, an owning module, a creator, an operating environment, a case description and a case type, and the execution operations comprise: opening a link, a clicking operation, an inputting operation and an asserting operation;
grouping the test cases in the set of test cases according to attributes;
distributing each group of test cases to the test terminal corresponding to that group for execution, wherein a corresponding test terminal is preset for each attribute, and the same test terminal is used for the same group of test cases.
2. The method of claim 1, wherein the method further comprises:
obtaining an execution result of each test case in the test case set;
and generating an execution result of the test task based on the execution result of each test case and outputting the execution result.
3. The method of claim 1, wherein the method further comprises:
receiving test configuration information, wherein the test configuration information comprises a test case and an application name of at least one associated application;
and storing the test cases and the application names of at least one associated application into an application association table.
4. The method of claim 3, wherein the obtaining the set of test cases associated with the target application comprises:
and searching at least one test case associated with the target application from the application association table.
5. The method of claim 3, wherein the test configuration information further comprises operational information; and
the distributing each group of test cases to the test end corresponding to the group of test cases for execution includes:
and distributing each group of test cases and operation information to a test terminal corresponding to the group of test cases, so that the test terminal executes the test cases according to the operation information.
6. The method of claim 1, wherein the triggering of the test task comprises at least one of: fixed point in time triggers, fixed time interval triggers, association triggers.
7. The method of any of claims 1-6, wherein the attribute comprises at least one of:
interface type, operating system type, terminal type.
8. An automated testing apparatus comprising:
a receiving unit configured to receive a test task for a target application;
an acquiring unit configured to acquire a set of test cases associated with the target application, wherein one test case is associated with a plurality of applications, and each test case has test configuration information comprising basic information and execution operations, wherein the basic information comprises a case name, at least one associated application, an owning module, a creator, an operating environment, a case description and a case type, and the execution operations comprise: opening a link, a clicking operation, an inputting operation and an asserting operation;
a grouping unit configured to group the test cases in the test case set according to attributes;
the sending unit is configured to distribute each group of test cases to the test terminals corresponding to the group of test cases for execution, wherein corresponding test terminals are preset for each attribute, and the same test terminals are used for the same group of test cases.
9. An electronic device for automated testing, comprising:
one or more processors;
a storage device having one or more programs stored thereon,
when executed by the one or more processors, causes the one or more processors to implement the method of any of claims 1-7.
10. A computer readable medium having stored thereon a computer program, wherein the program when executed by a processor implements the method of any of claims 1-7.
CN202010692534.0A 2020-07-17 2020-07-17 Automatic test method and device Active CN111813685B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010692534.0A CN111813685B (en) 2020-07-17 2020-07-17 Automatic test method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010692534.0A CN111813685B (en) 2020-07-17 2020-07-17 Automatic test method and device

Publications (2)

Publication Number Publication Date
CN111813685A CN111813685A (en) 2020-10-23
CN111813685B (en) 2023-12-05

Family

ID=72866179

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010692534.0A Active CN111813685B (en) 2020-07-17 2020-07-17 Automatic test method and device

Country Status (1)

Country Link
CN (1) CN111813685B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113448844B (en) * 2021-06-21 2022-10-25 青岛海尔科技有限公司 Method and device for regression testing and electronic equipment
CN114116453B (en) * 2021-10-31 2024-01-09 苏州浪潮智能科技有限公司 Method, device, equipment and readable medium for testing case association configuration information

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103823754A (en) * 2014-02-11 2014-05-28 深圳市同洲电子股份有限公司 Method and device for realizing automatic testing
CN108614774A (en) * 2018-04-24 2018-10-02 百度在线网络技术(北京)有限公司 Automated testing method and device
CN109582579A (en) * 2018-11-30 2019-04-05 腾讯音乐娱乐科技(深圳)有限公司 Applied program testing method, device, electronic equipment and storage medium
CN109726093A (en) * 2017-10-27 2019-05-07 伊姆西Ip控股有限责任公司 Method, equipment and computer program product for implementation of test cases
CN109992494A (en) * 2017-12-29 2019-07-09 北京京东尚科信息技术有限公司 A kind of automatic test execution method and apparatus

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160044520A1 (en) * 2014-08-11 2016-02-11 Verizon Patent And Licensing Inc. Mobile automation test platform

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103823754A (en) * 2014-02-11 2014-05-28 深圳市同洲电子股份有限公司 Method and device for realizing automatic testing
CN109726093A (en) * 2017-10-27 2019-05-07 伊姆西Ip控股有限责任公司 Method, equipment and computer program product for implementation of test cases
CN109992494A (en) * 2017-12-29 2019-07-09 北京京东尚科信息技术有限公司 A kind of automatic test execution method and apparatus
CN108614774A (en) * 2018-04-24 2018-10-02 百度在线网络技术(北京)有限公司 Automated testing method and device
CN109582579A (en) * 2018-11-30 2019-04-05 腾讯音乐娱乐科技(深圳)有限公司 Applied program testing method, device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN111813685A (en) 2020-10-23


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Room 221, 2nd floor, Block C, 18 Kechuang 11th Street, Daxing Economic and Technological Development Zone, Beijing, 100176

Applicant after: Jingdong Technology Holding Co.,Ltd.

Address before: Room 221, 2nd floor, Block C, 18 Kechuang 11th Street, Daxing Economic and Technological Development Zone, Beijing, 100176

Applicant before: Jingdong Digital Technology Holding Co.,Ltd.

GR01 Patent grant
GR01 Patent grant