CN115454869A - Interface automation test method, device, equipment and storage medium - Google Patents

Interface automation test method, device, equipment and storage medium

Info

Publication number
CN115454869A
CN115454869A (application number CN202211154917.8A)
Authority
CN
China
Prior art keywords
execution
interface
test
test case
case
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211154917.8A
Other languages
Chinese (zh)
Inventor
李宵宵
甘雨
杨娅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Life Insurance Company of China Ltd
Original Assignee
Ping An Life Insurance Company of China Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Life Insurance Company of China Ltd filed Critical Ping An Life Insurance Company of China Ltd
Priority to CN202211154917.8A priority Critical patent/CN115454869A/en
Publication of CN115454869A publication Critical patent/CN115454869A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3684Test management for test design, e.g. generating new test cases
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3692Test management for test results analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Debugging And Monitoring (AREA)

Abstract

Embodiments of the present application provide an interface automation testing method, apparatus, device, and storage medium, belonging to the technical field of testing. The method includes: obtaining a test case set sent by a server communicatively connected to an execution end, the test case set being issued by the server when it detects that the execution end is in an idle state; determining a target test case from the test case set according to a preset task rule; obtaining the operating system type and a general flow corresponding to the target test case; sequentially executing the execution steps included in the target test case according to the execution order set in the general flow; during each execution step, determining the difference interface corresponding to the currently executed step according to the operating system type; and performing an interface test on the device under test through the difference interface to obtain an interface test result. Embodiments of the present application enable unified management of the interface testing process across different operating systems and improve the convenience of managing and maintaining the testing process.

Description

Interface automation test method, device, equipment and storage medium
Technical Field
The present application relates to the field of testing technologies, and in particular, to an interface automation testing method, apparatus, device, and storage medium.
Background
As software systems grow in scale, both the necessity and the difficulty of testing them to guarantee delivery quality become self-evident. The limitations of traditional manual testing are increasingly apparent, and automated testing techniques can overcome many of its problems. Interface use cases are therefore generally automated through interface automation tests. Interface automation testing mainly comprises two stages: recording case scripts and executing case scripts. In the execution stage, the execution end runs the script data of the test case and outputs a report of the execution result. However, for different operating systems, separate projects are usually set up to adapt to executing interface use cases on each operating system. In that case, everything from startup, through all data records of the execution process (such as screenshots, logs, and fault tolerance), to result reporting must be written separately and independently, which is inconvenient for managing and maintaining the interface testing process.
Disclosure of Invention
The main object of the embodiments of the present application is to provide an interface automation testing method, apparatus, device, and storage medium, aiming to realize unified management of the interface testing process across different operating systems and to improve the convenience of managing and maintaining the testing process.
In order to achieve the above object, a first aspect of the embodiments of the present application provides an interface automation testing method, which is applied to an execution end, and the method includes:
acquiring a test case set sent by a server in communication connection with the execution end; the test case set is issued by the server when the server detects that the execution end is in an idle state;
determining a target test case from the test case set according to a preset task rule;
acquiring the type of an operating system and a general flow corresponding to the target test case;
sequentially executing a plurality of execution steps included in the target test case according to the execution sequence set in the general flow;
in the execution process of each execution step, determining a difference interface corresponding to the currently executed execution step according to the type of the operating system;
and performing interface test on the equipment to be tested through the difference interface to obtain an interface test result.
In some embodiments, the performing, by the difference interface, an interface test on the device to be tested to obtain an interface test result includes:
acquiring an execution result corresponding to the difference interface;
when the execution result shows that the execution of the execution step fails, acquiring a preset positioning element list;
selecting a first positioning element from the positioning element list as an abnormal positioning element;
carrying out control positioning processing on the execution result through the abnormal positioning element to obtain control positioning data;
when the control positioning data is not positioned to an abnormal control, reselecting a second positioning element from the positioning element list to update the abnormal positioning element, and performing control positioning processing on the execution result again through the updated abnormal positioning element;
when all the positioning elements in the positioning element list are traversed or the control positioning data is positioned to an abnormal control, stopping the control positioning processing;
and obtaining an interface test result according to the execution result and the control positioning data.
In some embodiments, when the control location data has located an abnormal control, after performing control location processing, the method further comprises:
matching the control positioning data with a preset interference list, and judging whether interference factors exist or not;
when the interference factors exist, the difference interface is called again to carry out interface test on the equipment to be tested;
and eliminating the session frame popped up by the interference factor through an operation instruction read from a configuration file in the process of recalling the difference interface to carry out interface test on the equipment to be tested.
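As an illustrative sketch (not part of the claimed method), the interference-factor check above can be expressed as follows. The interference list entries, control names, and the format of the dismissal instruction read from the configuration file are assumptions made for illustration.

```python
# Hypothetical interference list: control names whose appearance is treated
# as an interference factor rather than a genuine test failure.
INTERFERENCE_LIST = {"update_dialog", "ad_popup", "rating_prompt"}

def find_interference(control_positioning_data):
    """Match the control positioning data against the interference list;
    return the first matching control, or None when no interference
    factor is present."""
    for control in control_positioning_data:
        if control in INTERFERENCE_LIST:
            return control
    return None

def dismiss_instruction(config, interference):
    """Read the operation instruction for eliminating the popped-up session
    frame from a (pre-parsed) configuration mapping; fall back to a default
    close action when the configuration has no entry."""
    return config.get(interference, {"action": "tap", "target": "close_button"})
```

When `find_interference` returns a control, the difference interface would be called again and the returned instruction replayed to dismiss the session frame before the retry.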
In some embodiments, the performing an interface test on the device to be tested through the difference interface to obtain an interface test result further includes:
and when the execution result shows that the execution of the execution step fails, switching the display page displayed on the equipment to be tested to the network page corresponding to the execution step through a multi-window switching mechanism.
In some embodiments, the performing an interface test on the device to be tested through the difference interface to obtain an interface test result further includes:
when the target test case is a native case, detecting the communication state of the difference interface, and restarting the test platform executing the target test case when the communication state indicates connection jitter;
and when the target test case is an embedded web page, detecting the window switching state of the browser control corresponding to the target test case, and restarting the driver of the browser control when the window switching state indicates an abnormal switch.
In order to achieve the above object, a second aspect of the embodiments of the present application provides an interface automation testing method, which is applied to a server, where the server is communicatively connected to at least one execution end, and the method includes:
judging whether the execution end is in an idle state or not;
when the execution end is in an idle state, determining a test case set to be tested from a preset plan case set;
and sending the test case set to the execution end, so that the execution end executes the method of any one of the first aspect.
In some embodiments, at least one target test case in the set of planning use cases is obtained by:
acquiring a recorded case, and performing step disassembly on the recorded case to obtain a case step set in one-to-one correspondence with the recorded case;
and performing association combination on Json fields corresponding to all the case steps in the case step set to obtain the target test case.
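As an illustrative sketch of the two operations above — step disassembly of a recorded case, then association and combination of the steps' JSON fields into one target test case — consider the following. The field names (`case_id`, `steps`, `action`) are assumptions, not a schema taken from this application.

```python
def disassemble(recorded_case):
    """Perform step disassembly: split a recorded case into its per-step
    records, tagging each step with the id of the case it came from."""
    return [dict(step, case_id=recorded_case["case_id"])
            for step in recorded_case["steps"]]

def combine(case_steps):
    """Perform association combination: merge the fields of all case steps
    back into a single target test case."""
    target = {"case_id": case_steps[0]["case_id"], "steps": []}
    for step in case_steps:
        target["steps"].append({k: v for k, v in step.items() if k != "case_id"})
    return target
```

In practice the combined structure would be serialized as JSON before being stored in the planning case set.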
In order to achieve the above object, a third aspect of the embodiments of the present application provides an interface test management apparatus, including:
the receiving module is used for acquiring a test case set sent by a server in communication connection with the execution end; the test case set is issued by the server when the server detects that the execution end is in an idle state;
the determining module is used for determining a currently executed target test case from the test case set according to an execution task corresponding to the test case set;
the general flow acquisition module is used for acquiring the type of an operating system and a general flow corresponding to the target test case;
the execution module is used for sequentially executing a plurality of execution steps included by the target test case according to the execution sequence set in the general flow;
and the difference processing module is used for, in the execution process of each execution step, determining a difference interface corresponding to the currently executed execution step according to the type of the operating system, and carrying out an interface test on the device to be tested through the difference interface to obtain an interface test result.
In order to achieve the above object, a fourth aspect of the embodiments of the present application proposes an electronic device, which includes a memory and a processor, where the memory stores a computer program, and the processor executes the computer program to implement the method of the first aspect or the method of the second aspect.
To achieve the above object, a fifth aspect of embodiments of the present application provides a storage medium, which is a computer-readable storage medium, and stores a computer program, and the computer program, when executed by a processor, implements the method of the first aspect or the method of the second aspect.
According to the interface automation testing method, apparatus, device, and storage medium of the embodiments of the present application, the server detects the idle state of the execution end to decide whether to issue a test case set, achieving unified management of all test cases to be tested. Meanwhile, by setting for each target test case a general flow independent of the system type of the device under test, and executing the steps in combination with a difference interface strongly tied to that system type, the same execution end can run interface tests on devices of different system types with the same target test case. A test development engineer then only needs to maintain one set of test cases plus the system-specific difference interfaces, making management and maintenance more convenient. Therefore, compared with the related art, the method provided by the embodiments of the present application achieves unified management of the interface testing process across different operating systems and improves the convenience of managing and maintaining the testing process.
Drawings
FIG. 1 is a flowchart of an interface automation test method applied to an execution end according to an embodiment of the present application;
fig. 2 is a schematic view of an application scenario of the interface automation test method applied to the execution end according to the embodiment of the present application;
fig. 3 is a schematic diagram of an execution process of a target test case in the interface automation test method applied to the execution end according to the embodiment of the present application;
FIG. 4 is a flowchart of step S600 in FIG. 1;
fig. 5 is a schematic processing flow diagram illustrating the existence of interference factors in the interface automation test method applied to the execution end according to the embodiment of the present application;
FIG. 6 is a schematic diagram of multiple windows in the interface automation test method applied to the execution end according to the embodiment of the present application;
FIG. 7 is a flowchart of an interface automation testing method applied to a server according to an embodiment of the present application;
FIG. 8 is a schematic structural diagram of an interface automation test device provided in an embodiment of the present application;
fig. 9 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It should be noted that although functional blocks are partitioned in a schematic diagram of an apparatus and a logical order is shown in a flowchart, in some cases, the steps shown or described may be performed in a different order than the partitioning of blocks in the apparatus or the order in the flowchart. The terms first, second and the like in the description and in the claims, and the drawings described above, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the present application only and is not intended to be limiting of the application.
First, several terms referred to in the present application are explained:
the fact Native (RN for short) is a cross-platform mobile application development framework sourced from Facebook in 2015 4, is a derivative product of UI framework React sourced from Facebook in the Native mobile application platform, and currently supports two platforms of iOS and android; the fact Native can obtain completely consistent development experience on the basis of Javascript and fact to construct Native application. The method packages the layout design and the service codes of the page, then downloads, installs and uses the page, and when the page layout needs to be updated or the service logic needs to be changed, the front-end content needs to be modified, upgraded and packaged, and then the latest page layout and the service codes can be used after being released again.
WebView is a WebKit-based engine that can parse DOM elements and display HTML page controls, working on the same principle as a browser rendering a page. It presents a page by loading an HTML file. When the page layout needs updating or the business logic changes, only the HTML code or JS files need to be modified (and if these are fetched from the server side, only a new file deployment is required).
JSON (JavaScript Object Notation) is a lightweight data interchange format. It stores and represents data in a text format that is completely independent of any programming language, based on a subset of ECMAScript (the JS specification set by the European Computer Manufacturers Association). Its compact and clear hierarchy makes JSON an ideal data exchange language: easy for humans to read and write, easy for machines to parse and generate, and effective at improving network transmission efficiency.
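For example, a single case step might round-trip through JSON as follows; the field names here are illustrative, not a schema from this application.

```python
import json

# A case step as a Python mapping, serialized to and parsed back from JSON.
step = {"step": 1, "control": "text_box", "action": "input", "value": "hello"}
encoded = json.dumps(step, sort_keys=True)   # language-independent text form
decoded = json.loads(encoded)                # parsed back into a mapping
```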
Appium is an open-source test automation framework that can be used for native, hybrid, and mobile web application testing. It uses the WebDriver protocol to drive iOS, Android, and Windows applications.
As software systems grow in scale, both the necessity and the difficulty of testing them to guarantee delivery quality become self-evident. The limitations of traditional manual testing are increasingly apparent, and automated testing techniques can overcome many of its problems. Interface use cases are therefore generally automated through interface automation tests. Interface automation testing mainly comprises two stages: recording case scripts and executing case scripts. In the execution stage, the execution end runs the script data of the test case and outputs a report of the execution result. However, for different operating systems, separate projects are usually set up to adapt to executing interface use cases on each operating system. In that case, everything from startup, through all data records of the execution process (such as screenshots, logs, and fault tolerance), to result reporting must be written separately and independently, which is inconvenient for managing and maintaining the interface testing process.
Based on this, embodiments of the present application provide a method, an apparatus, a device, and a storage medium for interface automated testing, which aim to implement unified management of interface testing processes for different operating systems, and improve convenience of management and maintenance of the testing processes.
The embodiments of the present application provide an interface automation testing method, relating to the technical field of testing. The method can be applied to a terminal, to a server side, or to software running in the terminal or server side. In some embodiments, the terminal may be a smartphone, tablet, laptop, desktop computer, or the like; the server side may be configured as an independent physical server, as a server cluster or distributed system composed of multiple physical servers, or as a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDN (content delivery network), and big data and artificial intelligence platforms; the software may be an application implementing the interface automation testing method, but is not limited to the above forms.
The application is operational with numerous general purpose or special purpose computing system environments or configurations. For example: personal computers, server computers, hand-held or portable devices, tablet-type devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like. The application may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The application may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
Fig. 1 is an optional flowchart of an interface automation testing method provided in an embodiment of the present application, and the method in fig. 1 may include, but is not limited to, steps S100 to S600.
Referring to fig. 1, an interface automation testing method provided in accordance with a first aspect of the embodiment of the present application is applied to an execution end, and the method includes:
s100, acquiring a test case set sent by a server in communication connection with an execution end; the test case set is issued by the server when detecting that the execution end is in an idle state.
It should be noted that, by detecting whether the execution end is in an idle state and then issuing the test case set, the execution end can be effectively utilized, for example, the execution end with high execution efficiency can execute more test cases, so that the cases to be tested can be covered one by one in the shortest time, and the execution efficiency is higher. And the test case set is managed and issued by the server, so that a plurality of execution ends are managed by one server, and the test process is more convenient to maintain.
It should be noted that, because a plurality of execution ends are provided, and the plurality of execution ends are not associated with each other, concurrent execution of the plurality of execution ends can be realized.
It should be noted that the idle state covers two situations: either no test case is executing on the execution end at all, or at least one device under test connected to the execution end has no test case executing on it. For example, if the execution end is connected to several devices under test, it may first be determined whether the execution end has a test case being executed; if so, it may then be determined whether each device under test connected to the execution end is being tested, so as to decide whether the execution end is idle. In this way, resources are utilized to the maximum, and the test cases to be tested are covered in the shortest time.
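The two-part idle-state judgment above can be sketched as follows; the execution-end data model (a list of executing cases, each naming its device) is an assumption made for illustration.

```python
def is_idle(executing_cases, devices_under_test):
    """An execution end is idle when it has no test case executing at all,
    or when at least one of its connected devices under test has no test
    case executing on it."""
    if not executing_cases:          # first idle situation: nothing executing
        return True
    busy_devices = {case["device"] for case in executing_cases}
    # second idle situation: some connected device has no case executing
    return any(d not in busy_devices for d in devices_under_test)
```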
And S200, determining a target test case from the test case set according to a preset task rule.
It should be noted that the task rule defines an execution sequence of the test case sets, and each test case set has at least one test case.
It should be noted that, after the current target test case is executed, the next target test case is determined according to the task rule and the following steps are executed.
And step S300, acquiring the type of the operating system and a general flow corresponding to the target test case.
It should be noted that, when the same application is applied to different operating systems, the operation flows of the function implementation are mostly consistent, and the common flow defines a common operation flow of the same function in different operating systems. Such as order of execution of each type of control, reference relationships, and so forth.
And S400, sequentially executing a plurality of execution steps included in the target test case according to the execution sequence set in the general flow.
It should be noted that each target test case can be disassembled to obtain multiple steps based on a general flow, for example, taking the target test case for testing a text box as an example, the test process usually includes selecting, inputting, and verifying the text box. At this time, the test case is disassembled into corresponding steps according to the general operation and executed.
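For the text-box example above, the general flow can be sketched as an OS-independent ordering of the disassembled steps; the step names are assumptions for illustration.

```python
# The general flow fixes the execution order (select -> input -> verify)
# independently of the operating system of the device under test.
GENERAL_FLOW = ["select_text_box", "input_text", "verify_text"]

def order_steps(case_steps):
    """Sort a target test case's disassembled steps into the execution
    order set in the general flow."""
    rank = {name: i for i, name in enumerate(GENERAL_FLOW)}
    return sorted(case_steps, key=lambda s: rank[s["name"]])
```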
Step S500, in the execution process of each execution step, determining a difference interface corresponding to the currently executed execution step according to the type of the operating system.
It should be noted that the difference interface is a system interface corresponding to the step of triggering the same function under different operating systems. For example, for the step of selecting a text box, it needs to call an interface for capturing a click position of a mouse of a page to determine whether the text box is selected, and different operating systems have different interfaces, so that a difference interface is formed.
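The dispatch from a general-flow step to its OS-specific difference interface can be sketched as a lookup keyed by operating system type; the registry and interface names below are illustrative assumptions.

```python
# Hypothetical registry mapping (os_type, step) to the system interface
# that triggers the same function on that operating system.
DIFFERENCE_INTERFACES = {
    ("android", "select_text_box"): "android_tap_position",
    ("ios", "select_text_box"): "ios_tap_position",
}

def resolve_difference_interface(os_type, step_name):
    """Determine the difference interface corresponding to the currently
    executed step according to the operating system type."""
    try:
        return DIFFERENCE_INTERFACES[(os_type, step_name)]
    except KeyError:
        raise LookupError(f"no difference interface for {step_name} on {os_type}")
```

With this shape, adding support for a new operating system only requires registering its interfaces, while the test cases and general flow stay unchanged — the maintenance benefit the application describes.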
And S600, performing interface test on the equipment to be tested through the difference interface to obtain an interface test result.
Therefore, the embodiments of the present application achieve at least the following effects. The server detects the idle state of the execution end to decide whether to issue a test case set, achieving unified management of all test cases to be tested. Meanwhile, by setting for each target test case a general flow independent of the system type of the device under test, and executing the steps in combination with a difference interface strongly tied to that system type, the same execution end can run interface tests on devices of different system types with the same target test case. A test development engineer then only needs to maintain one set of test cases plus the system-specific difference interfaces, making management and maintenance more convenient. Compared with the related art, the method provided by the embodiments of the present application therefore achieves unified management of the interface testing process across different operating systems and improves the convenience of managing and maintaining the testing process.
It should be noted that, because the execution order of the target cases for different systems is managed by one general flow, and multiple device models can be compatibly executed in parallel, multiple device models can be connected to one execution end and concurrently execute the same target test case, giving higher case coverage. The difficulty of deploying the test environment is correspondingly reduced: when deploying, only factors such as the efficiency of the execution end and the number of devices under test available for executing tests need to be considered.
It should be noted that, for the execution end, a concurrency mechanism may be used to perform concurrent processing on multiple devices to be tested. For example, the same test case set is used for carrying out concurrent test on a plurality of devices to be tested so as to improve the reliability of test results.
For example, referring to fig. 2, a server is correspondingly connected with a plurality of execution terminals, and each execution terminal performs a use case test on at least one or more devices to be tested. Specifically, for the execution end 1, the execution end 1 is configured to perform a case test on the device to be tested 1 and the device to be tested 2, and when the server detects that the execution end 1 does not perform the case test on the device to be tested 2, the server issues a case that is not executed in the device to be tested 2, and at this time, the execution end 1 receives the test case set and executes the test case set. It should be noted that the server issuing may be triggered actively by the user, or may be triggered by a rule set in the server. If the server detects that the execution terminal 1 is in an idle state, the server prompts a tester to issue a test case set on the operation interface, and the tester issues the test case set according to actual business requirements. Therefore, by the mode, more test case sets can be executed at the execution end with high execution efficiency as much as possible, and the coverage rate of the test cases is higher.
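The dispatch scenario of fig. 2 — the server issuing pending case sets to whichever execution ends are idle — can be sketched as follows; the data structures are assumptions for illustration.

```python
def dispatch(pending_sets, execution_ends):
    """Assign pending test case sets to idle execution ends, one set per
    idle end, in order; busy ends receive nothing."""
    assignments = {}
    queue = list(pending_sets)
    for end, idle in execution_ends.items():
        if idle and queue:
            assignments[end] = queue.pop(0)
    return assignments
```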
For example, referring to fig. 3, taking an example that the execution end 1 performs the same target test case on the device to be tested 1 and the device to be tested 2, assuming that the target test case includes 3 execution steps, as shown in the execution sequence fig. 3, when the execution step 1 is executed, for the device to be tested 1, the difference interface 1 matched with the device to be tested 1 is called to execute the test content of the step 1. For the device to be tested 2, the difference interface 2 matched with the device to be tested 2 is called to execute the test content of the step 1.
It can be understood that, referring to fig. 4, step S600 of performing an interface test on the device to be tested through the difference interface to obtain an interface test result includes:
and step S601, obtaining an execution result corresponding to the difference interface.
Step S602, when the execution result indicates that the execution of the execution step fails, acquiring a preset positioning element list.
Step S603, selecting the first positioning element from the positioning element list as an abnormal positioning element.
Step S604, performing control positioning processing on the execution result through the abnormal positioning element to obtain control positioning data.
Step S605, when the control positioning data is not positioned to the abnormal control, reselecting the second positioning element from the positioning element list to update the abnormal positioning element, and performing control positioning processing on the execution result again through the updated abnormal positioning element.
The control positioning processing is stopped when either all the positioning elements in the positioning element list have been traversed or the control positioning data has located the abnormal control.
Step S606, obtaining an interface test result according to the execution result and the control positioning data.
It should be noted that, for the interface test, each difference interface corresponds to the operation of at least one control, such as a text box or a button. In the related art, only one positioning mode is usually recorded for a control; when positioning based on that mode fails, analysis cannot continue, so positioning failures occur easily. Here, a positioning element list is set to provide a plurality of positioning modes, enabling flexible switching between them and improving the positioning success rate. The positioning element list records a plurality of positioning elements for finding the control; for each control, the positioning elements include one or more of ID, name, class name, XPath, accessibility ID and Android UIAutomator. Because a plurality of positioning elements are used, the abnormal position can be located quickly. Meanwhile, in some embodiments, after the abnormal position is determined, the cause of the abnormality is classified, so that the fault factor is pre-judged and test developers can locate and repair the problem more quickly.
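Steps S602–S605 above amount to a fallback loop over locator strategies. A minimal sketch of that idea, assuming a hypothetical `find_element` callable standing in for the real driver call (e.g. something like Appium's `driver.find_element(strategy, value)` — names here are illustrative, not the patent's code):

```python
# Hedged sketch of the positioning-element fallback: try each positioning
# element in a preset list until one locates the control, instead of
# relying on a single strategy as in the related art.

LOCATOR_LIST = ["id", "name", "class name", "xpath", "accessibility id"]

def locate_control(find_element, locator_values):
    """locator_values maps strategy -> the value recorded for the control."""
    for strategy in LOCATOR_LIST:
        value = locator_values.get(strategy)
        if value is None:
            continue                      # no value recorded for this strategy
        control = find_element(strategy, value)
        if control is not None:
            return strategy, control      # abnormal control located
    return None, None                     # whole list traversed without a hit
```

When one positioning mode fails, the loop simply falls through to the next, which is the "flexible switching" the text describes.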
It can be understood that, referring to fig. 5, after the control positioning processing, when the control positioning data has located the abnormal control, the method further includes:
and S701, matching the control positioning data with a preset interference list, and judging whether interference factors exist.
The interference factor represents a pop-up dialog triggered by a non-functional flow during the test. Specifically, for example, in some embodiments, after a certain step is executed, a permission confirmation pop-up may appear; since a click-to-confirm operation is not set in the test case, the test cannot jump to the next page and the execution step fails. In practice, the permission confirmation pop-up may be generated by the system, and whether it appears depends on the system configuration of the device to be tested or the permission settings of the application under test; it is not triggered by the current functional flow, so the execution steps of the test case do not include an operation for it. If the permission confirmation pop-up is set in the interference list as an interference factor, it can be determined that the execution step is not a genuine failure. In other embodiments, the interference factors may also include dynamic pop-ups, such as news pop-ups or link pop-ups.
It should be noted that a plurality of interference factors are defined in the interference list. Interference factors may be obtained by summarizing the results of historical test case failures, or set manually based on anticipated system pop-ups. The embodiment of the present application does not restrict the specific content of the interference factors; a person skilled in the art may generate the interference list according to the actual application situation or historical experience.
It should be noted that, in some embodiments, the interference list may be configured through a Json file, which is loaded at run time to implement dynamic configuration of the interference factors. In other embodiments, the configuration may also be hard-coded directly in the program, or provided dynamically through another configuration file format, which is not limited in this embodiment. For the embodiment of the present application, Json file configuration is preferred, as it lowers the skill threshold for testers and makes maintenance of the test project more efficient and convenient.
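A minimal sketch of loading such a Json interference list at run time might look as follows. The file layout and field names (`interference_factors`, `match_text`, `dismiss`) are assumptions for illustration; the patent does not specify the format.

```python
import json

# Illustrative sketch: interference factors kept in a Json document so they
# can be changed without touching the test code (dynamic configuration).

INTERFERENCE_JSON = '''
{
  "interference_factors": [
    {"name": "permission_popup", "match_text": "Allow access?", "dismiss": "click_allow"},
    {"name": "news_popup", "match_text": "Breaking news", "dismiss": "click_close"}
  ]
}
'''

def load_interference_list(raw=INTERFERENCE_JSON):
    # In practice this would read the Json file from disk at load time.
    return json.loads(raw)["interference_factors"]

def find_interference(control_text, factors):
    # Match the control positioning data (here: its text) against the list.
    for factor in factors:
        if factor["match_text"] in control_text:
            return factor
    return None
```

Editing the Json document then suffices to add or remove interference factors, which is the maintenance convenience the text emphasizes.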
It should be noted that after the target test case completes all the steps, the execution log and the screenshots of each executed step are uploaded to the server, and the server sends them by email or in another text form to the test developers associated with the cases; their follow-up handling of the results forms a complete closed loop for problem processing.
Step S702, when the interference factor exists, re-invoking the difference interface to perform the interface test on the device to be tested.
It should be noted that, in some embodiments, when an execution step is shown to have failed, execution of the target test case may be ended directly; in that case, after positioning determines that an interference factor exists, the target case is re-executed so that the difference interface is re-invoked to test the device to be tested. In other embodiments, when it is determined that an interference factor exists after a step fails, the failed execution step is repeated directly to re-invoke the difference interface. The embodiment of the present application does not limit the manner of re-invoking the difference interface; those skilled in the art can set it according to project requirements.
Step S703, in the process of re-invoking the difference interface to perform the interface test on the device to be tested, eliminating the pop-up dialog triggered by the interference factor through the operation instruction read from the configuration file.
In this way, by introducing a fault-tolerance mechanism, the interference caused by other operations is removed, and the reliability of the control positioning result can be greatly improved.
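Steps S701–S703 together form a retry loop: fail, match against the interference list, dismiss, re-run. A minimal sketch under the assumption that `run_step`, `match_interference` and `dismiss` are stand-ins for the real driver operations (all names hypothetical):

```python
# Illustrative fault-tolerance loop: a step failure caused by an
# interference factor (e.g. a permission pop-up) is not a genuine failure;
# the pop-up is dismissed via the configured operation instruction and the
# step is re-executed.

def execute_with_tolerance(run_step, match_interference, dismiss, max_retries=2):
    for _ in range(max_retries + 1):
        if run_step():
            return True                 # step passed
        factor = match_interference()   # S701: compare positioning data with list
        if factor is None:
            return False                # genuine failure, stop retrying
        dismiss(factor)                 # S703: eliminate the pop-up dialog
    return False                        # gave up after max_retries attempts
```

The `max_retries` cap is an assumption added here so a persistent pop-up cannot loop forever.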
It can be understood that performing the interface test on the device to be tested through the difference interface to obtain the interface test result further includes:
when the execution result indicates that the execution step has failed, switching the display page shown on the device to be tested to the network page corresponding to that execution step through a multi-window switching mechanism.
It should be noted that, when a target test case is tested, each execution step generates a web page, so a plurality of windows hold content at the same time; similarly, each device to be tested executes a plurality of target test cases, generating a plurality of windows. Which target test case or execution step the displayed page corresponds to is determined by the current handle. As shown in fig. 6, when the current handle is the handle of tab 1, the displayed page shows the content of tab 1, and when the current handle is the handle of tab 2, it shows the content of tab 2. Thus, when the current handle does not correspond to the actually failed step, locating the problem becomes harder. Through the multi-window switching mechanism, when an execution step fails the current handle is switched to the corresponding network page. At this point, the page content corresponding to the failed execution step can be captured automatically and quickly, without manual involvement, making case positioning more efficient.
It should be noted that, in the multi-window switching mechanism, the handle of the web page corresponding to the failed execution step is taken as the current handle, so that this web page becomes the displayed page. Since such handle operations are conventional in the field, the embodiments of the present application do not describe them in detail.
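The handle switch can be sketched as follows. `FakeDriver` is a stand-in mimicking the window-handle interface exposed by WebDriver-style drivers (real drivers expose `window_handles` and a switch-to-window operation); the mapping from step to handle is an assumption for illustration.

```python
# Hedged sketch of the multi-window switching mechanism: make the window of
# the failed step's page the current handle, so the displayed page (and any
# screenshot of it) shows the failing content.

class FakeDriver:
    def __init__(self, handles):
        self.window_handles = list(handles)   # one handle per open page
        self.current_handle = handles[0]

    def switch_to_window(self, handle):
        self.current_handle = handle

def focus_failed_step(driver, step_to_handle, failed_step):
    handle = step_to_handle[failed_step]
    if driver.current_handle != handle:
        driver.switch_to_window(handle)       # switch current handle
    return driver.current_handle
```

After this call, a screenshot of the displayed page captures exactly the page of the failed step, with no manual window hunting.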
It can be understood that performing the interface test on the device to be tested through the difference interface to obtain an interface test result further includes:
when the target test case is a native case, detecting the communication state of the difference interface, and restarting the test platform executing the target test case when the communication state shows connection jitter;
and when the target test case is an embedded web page case, detecting the window switching state of the browser control corresponding to the target test case, and restarting the driver of the browser control when the window switching state indicates abnormal switching.
It should be noted that the native case represents an interface case for testing a function point whose page is developed with React Native. The embedded web page case represents an interface case for testing a function point whose page is developed with H5. In some embodiments, the embedded web page refers to an interface case for testing a function point whose page is developed with Webview.
It should be noted that terminal applications are usually developed as a hybrid of React Native and H5, and in some embodiments the H5 content is displayed through a browser control (e.g., Webview). In practical application, under hybrid development, H5 page switching is not stable enough, and parallel execution on multiple devices is likewise unstable; specifically, the target test case cannot continue to execute and is shown as failed, for example because switching cannot be performed or the page is not displayed for a long time after switching. Therefore, when a window switching abnormality is detected, the browser control is restarted and reloaded, reducing the failure probability of the target test case.
It should be noted that the test platform may be Appium. The communication state is represented by the connection state of the socket; when the socket connection jitters, information may be lost, which raises the probability that the target test case fails. After the test platform is restarted, the current connection with the device to be tested is disconnected and the socket is re-established, which raises the probability that the connection with the device to be tested is in a stable state.
Therefore, by providing an error analysis mechanism to detect the execution environment, the probability of case failure caused by instability of the execution environment can be reduced.
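The two recovery branches above can be summarized in a small decision sketch. This is an illustration of the error-analysis logic only; the case-type labels and action names are assumptions, and the actual restart operations (relaunching Appium, reloading the Webview driver) are left abstract.

```python
# Illustrative error-analysis sketch mirroring the two branches:
#  - native cases: connection jitter on the socket -> restart the test
#    platform (e.g. Appium) to re-establish a stable connection;
#  - embedded-H5 cases: abnormal window switching -> restart the browser
#    control's driver so the control is reloaded.

def recovery_action(case_type, connection_jittery=False, switch_abnormal=False):
    if case_type == "native" and connection_jittery:
        return "restart_test_platform"      # re-establish the socket
    if case_type == "embedded_h5" and switch_abnormal:
        return "restart_browser_driver"     # reload the Webview control
    return "none"                           # environment looks healthy
```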
In order to achieve the above object, referring to fig. 7, a second aspect of the embodiment of the present application provides an interface automation testing method, which is applied to a server, where the server is communicatively connected to at least one execution end, and the method includes:
Step S801, judging whether the execution end is in an idle state.
Step S802, when the execution end is in an idle state, determining a test case set to be tested from a preset plan case set.
It should be noted that the idle state indicates that no case is being executed at the execution end, or that the execution end is not executing a relevant test case on any device to be tested.
It should be noted that the plan case set is the full set of test cases. The server allocates test case sets according to the idle states of the execution ends, so that each execution end can stay in the execution state, which improves the coverage rate of the plan case set and the reliability of the test results.
Step S803, sending the test case set to the execution end, so that the execution end executes the interface automated testing method applied to the execution end as described in any one of the above.
Therefore, the server detects the idle state of the execution end to judge whether to issue the test case set or not, and the unified management of all the test cases to be tested is realized. Meanwhile, the interface test of the equipment to be tested with different system types can be realized at the same execution end through the same target test case by setting a general flow irrelevant to the system type of the equipment to be tested for each target test case and executing the execution steps by combining with a difference interface strongly relevant to the system type of the equipment to be tested. At the moment, the test development engineer only needs to maintain one set of test cases and the difference interface related to the system, so that the management and the maintenance are more convenient. Therefore, compared with the related art, the method provided by the embodiment of the application can realize the unified management of the interface test process on different operating systems, and the convenience of management and maintenance of the test process is improved.
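The server-side flow of steps S801–S803 can be sketched as a small dispatch loop. The queue structure and the shape of `execution_ends` are assumptions; the actual issuing (network send to the execution end) is abstracted away.

```python
from collections import deque

# Minimal sketch of the server-side dispatch: poll each execution end's
# state and, when one is idle, take a pending test case set from the plan
# case set and assign it to that end.

def dispatch(execution_ends, plan_sets):
    """execution_ends: dict name -> 'idle' or 'busy'; plan_sets: deque of sets."""
    assignments = {}
    for name, state in execution_ends.items():
        if state == "idle" and plan_sets:          # S801: idle-state check
            case_set = plan_sets.popleft()         # S802: pick a set to test
            assignments[name] = case_set           # S803: send to that end
    return assignments
```

Repeating this periodically keeps every execution end busy until the plan case set is exhausted, which is how the coverage rate is driven up.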
It can be understood that at least one target test case in the plan case set is obtained through the following steps: acquiring a recorded case, and disassembling the recorded case into steps to obtain a case step set in one-to-one correspondence with the recorded case; and performing associated combination on the Json fields corresponding to the case steps in the case step set to obtain the target test case.
It should be noted that associated combination means that the Json field corresponding to each case step is referenced in the target test case; therefore, when the positioning information of the Json field corresponding to step 1 of one target test case is changed, the other target test cases referencing that Json field change correspondingly.
It should be noted that the case is obtained by recording, the recorded case is disassembled into steps, and the Json fields corresponding to the case steps are combined to generate the target case. In this way, the skill threshold for testers can be lowered. During maintenance, cases needing maintenance can be located quickly by querying on steps.
It should be noted that, since each step corresponds to one Json field and the target case is assembled from steps, modification or query can be performed on a per-step basis. Similarly, because of the associated combination, when a plurality of target cases share the same step, changing the positioning information of the Json field corresponding to the shared step changes the corresponding target cases accordingly. For example, taking the Json field corresponding to one click control as an example, the Json field is as follows:
[Figure BDA0003856775210000121: image of the Json field for the click control in the original publication]
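The original publication shows this Json field only as an image. A hypothetical reconstruction of what such a click-control field might look like is given below; every field name here is an assumption for illustration, not the patent's actual format.

```json
{
  "step_name": "click_login_button",
  "action": "click",
  "locators": {
    "id": "btn_login",
    "accessibility id": "login",
    "xpath": "//android.widget.Button[@text='Login']"
  }
}
```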
When this Json field is used in steps of a plurality of target test cases, and the positioning information in the Json field, such as the accessibility ID, is changed, those target test cases are updated correspondingly.
The interface automated testing method according to an embodiment of the present application is described below through a specific embodiment, referring to fig. 1 to 7:
The server periodically detects the idle states of the execution end 1, the execution end 2 and the execution end 3. When detecting that no case is being executed on the execution end 2, it issues a test case set to the execution end 2 with reference to steps S802 and S803. After receiving the test case set with reference to step S100, the execution end 2 determines the target test case with reference to step S200, determines the general flow with reference to step S300, and sequentially executes each execution step in the target test case with reference to step S400. For each execution step, when the device to be tested 3 is tested by the target test case, the difference interface is determined according to the system type of the device to be tested 3 with reference to step S500, so that the execution end 2 can call the corresponding function point in the execution step through the difference interface. During the execution of each step, if the socket connection or window switching is unstable on the device to be tested 3, the test platform (e.g., Appium) or the Webview driver is restarted accordingly. During the execution of each step, referring to fig. 4, when an execution step fails, positioning is performed in multiple ways through the preset positioning element list. Meanwhile, referring to fig. 5, after the abnormal position is determined, the control positioning data is matched against the interference list to judge whether an interference factor exists; when one exists, the pop-up dialog triggered by the interference factor is eliminated through an operation instruction loaded from the Json file, and the execution step is executed again. Similarly, when the device to be tested 4 needs to execute the same target test case as the device to be tested 3, the two devices may be processed concurrently.
It should be noted that, when an execution step fails, the web page corresponding to the failed step is also switched to be the current display page through the multi-window switching mechanism. At this point, a screenshot of the current display page can be taken quickly for positioning confirmation. Meanwhile, the classification result of the control positioning data in fig. 4, the screenshot and the corresponding execution log are sent to the corresponding test developers by email.
Therefore, referring to the interface automated testing method of fig. 1 to fig. 7, at the execution end the Android and iOS platforms no longer need to be maintained as two separate projects; the whole process from starting, to execution, to result reporting for both platforms can be completed in one unified project, which facilitates management and maintenance. Meanwhile, the stability of parallel execution across multiple devices to be tested is optimized, including the stability problems caused by driver switching when H5 and Native cases are executed simultaneously, effectively reducing the case failure rate attributable to stability at the execution end. Permission pop-ups and dynamic pop-ups are treated as interference factors, and their key characteristics are configured through JSON files, which also facilitates management and maintenance. When an execution step fails and the corresponding abnormal control is being positioned, a plurality of positioning modes are combined; when one positioning mode fails, another can be switched to, improving the success rate of executing the target test case. Meanwhile, abnormal problems occurring during case execution are analyzed and improved upon: by attributing and classifying the screenshot of each execution step and the exceptions thrown during execution, test developers can be helped to locate and repair problems quickly and accurately. After all test cases finish executing, the execution logs and screenshots are uploaded to the background and sent by email to the test developers associated with the cases; through their follow-up processing, a complete closed loop of problem handling is formed.
Referring to fig. 8, an embodiment of the present application further provides a management device for interface testing, which can implement the above-mentioned interface automated testing method applied to an execution end, and the device includes:
a receiving module 100, configured to acquire a test case set sent by a server in communication connection with the execution end, where the test case set is issued by the server when it detects that the execution end is in an idle state;
the determining module 200 is used for determining a currently executed target test case from the test case set according to an execution task corresponding to the test case set;
a general flow obtaining module 300, configured to obtain an operating system type and a general flow corresponding to the target test case;
an execution module 400, configured to sequentially execute multiple execution steps included in the target test case according to an execution sequence set in the general flow;
the difference processing module 500 is configured to, in the execution process of each execution step, determine a difference interface corresponding to the currently executed execution step according to the type of the operating system, and perform an interface test on the device to be debugged through the difference interface to obtain an interface test result.
The specific implementation of the management apparatus for interface testing is substantially the same as the specific implementation of the interface automation testing method applied to the execution end, and is not described herein again.
The embodiment of the present application further provides an electronic device, which includes a memory and a processor; the memory stores a computer program, and the processor executes the computer program to implement the interface automated testing method applied to the execution end or the interface automated testing method applied to the server. The electronic device may be any intelligent terminal, including a tablet computer, a vehicle-mounted computer, and the like.
Referring to fig. 9, fig. 9 illustrates a hardware structure of an electronic device according to another embodiment, where the electronic device includes:
the processor 901 may be implemented by a general-purpose CPU (central processing unit), a microprocessor, an Application Specific Integrated Circuit (ASIC), or one or more integrated circuits, and is configured to execute a relevant program to implement the technical solution provided in the embodiment of the present application;
the memory 902 may be implemented in the form of a Read Only Memory (ROM), a static storage device, a dynamic storage device, or a Random Access Memory (RAM). The memory 902 may store an operating system and other application programs; when the technical solution provided in the embodiments of the present specification is implemented by software or firmware, the relevant program codes are stored in the memory 902, and the processor 901 calls them to execute the interface automated testing method applied to the execution end or applied to the server according to the embodiments of the present application;
an input/output interface 903 for implementing information input and output;
a communication interface 904, configured to implement communication interaction between the device and another device, where communication may be implemented in a wired manner (e.g., USB, network cable, etc.), or in a wireless manner (e.g., mobile network, WIFI, bluetooth, etc.);
a bus 905 that transfers information between various components of the device (e.g., the processor 901, the memory 902, the input/output interface 903, and the communication interface 904);
wherein the processor 901, the memory 902, the input/output interface 903 and the communication interface 904 enable a communication connection within the device with each other through a bus 905.
The embodiment of the application also provides a storage medium, which is a computer-readable storage medium, and the storage medium stores a computer program, and when the computer program is executed by a processor, the computer program implements the interface automation test method applied to the execution end or the interface automation test method applied to the server.
The memory, which is a non-transitory computer readable storage medium, may be used to store non-transitory software programs as well as non-transitory computer executable programs. Further, the memory may include high speed random access memory, and may also include non-transitory memory, such as at least one disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, the memory optionally includes memory located remotely from the processor, and these remote memories may be connected to the processor through a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
According to the interface automatic testing method, the interface testing management device, the electronic equipment and the storage medium, the server detects the idle state of the execution end to judge whether the test case set is issued or not, and unified management of all test cases to be tested is achieved. Meanwhile, the interface test of the equipment to be tested with different system types can be realized at the same execution end through the same target test case by setting a general flow irrelevant to the system type of the equipment to be tested for each target test case and executing the execution steps by combining with a difference interface strongly relevant to the system type of the equipment to be tested. At the moment, the test development engineer only needs to maintain one set of test cases and the difference interface related to the system, so that the management and the maintenance are more convenient. Therefore, compared with the related art, the method provided by the embodiment of the application can realize unified management of the interface test process on different operating systems, and the convenience of management and maintenance of the test process is improved.
The embodiments described in the embodiments of the present application are for more clearly illustrating the technical solutions of the embodiments of the present application, and do not constitute a limitation to the technical solutions provided in the embodiments of the present application, and it is obvious to those skilled in the art that the technical solutions provided in the embodiments of the present application are also applicable to similar technical problems with the evolution of technology and the emergence of new application scenarios.
It will be appreciated by those skilled in the art that the embodiments shown in the figures are not intended to limit the embodiments of the present application and may include more or fewer steps than those shown, or some of the steps may be combined, or different steps may be included.
The above-described embodiments of the apparatus are merely illustrative, wherein the units illustrated as separate components may or may not be physically separate, i.e. may be located in one place, or may also be distributed over a plurality of network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
One of ordinary skill in the art will appreciate that all or some of the steps of the methods, systems, functional modules/units in the devices disclosed above may be implemented as software, firmware, hardware, and suitable combinations thereof.
The terms "first," "second," "third," "fourth," and the like in the description of the application and the above-described figures, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be understood that in the present application, "at least one" means one or more, "a plurality" means two or more. "and/or" for describing an association relationship of associated objects, indicating that there may be three relationships, e.g., "a and/or B" may indicate: only A, only B and both A and B are present, wherein A and B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of single item(s) or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", wherein a, b, c may be single or plural.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the above-described units is only one type of logical functional division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may also be implemented in the form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be substantially implemented or contributed to by the prior art, or all or part of the technical solution may be embodied in a software product, which is stored in a storage medium and includes multiple instructions for causing a computer device (which may be a personal computer, a server, or a network device) to perform all or part of the steps of the method of the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing programs, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The preferred embodiments of the present application have been described above with reference to the accompanying drawings, but the scope of the claims of the embodiments of the present application is not limited thereto. Any modifications, equivalent substitutions, and improvements made by those skilled in the art without departing from the scope and spirit of the embodiments of the present application shall fall within the scope of the claims of the embodiments of the present application.

Claims (10)

1. An interface automation test method, applied to an execution end, the method comprising:
acquiring a test case set sent by a server in communication connection with the execution end, wherein the test case set is issued by the server when the server detects that the execution end is in an idle state;
determining a target test case from the test case set according to a preset task rule;
acquiring an operating system type and a general flow corresponding to the target test case;
sequentially executing a plurality of execution steps included in the target test case according to an execution order set in the general flow;
during the execution of each execution step, determining a difference interface corresponding to the currently executed step according to the operating system type; and
performing an interface test on a device to be tested through the difference interface to obtain an interface test result.
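The execution-end flow of claim 1 can be illustrated with a minimal Python sketch. All names here (the task rule, the step names, the handler table) are hypothetical stand-ins for the patent's "preset task rule", "general flow", and "difference interface", not an actual implementation:

```python
# Hypothetical handler table: one OS-specific "difference interface" per
# (operating system type, execution step) pair.
DIFFERENCE_INTERFACES = {
    ("android", "tap"): lambda device: f"android-tap on {device}",
    ("ios", "tap"): lambda device: f"ios-tap on {device}",
}

def pick_target_case(case_set, task_rule):
    """Apply a preset task rule (here: a simple sort key) to choose one case."""
    return sorted(case_set, key=task_rule)[0]

def run_case(case, os_type, device):
    """Execute the case's steps in the general-flow order, resolving the
    OS-specific difference interface for each step."""
    results = []
    for step in case["steps"]:
        handler = DIFFERENCE_INTERFACES[(os_type, step)]
        results.append(handler(device))
    return results

case_set = [
    {"name": "login", "priority": 2, "steps": ["tap"]},
    {"name": "pay",   "priority": 1, "steps": ["tap", "tap"]},
]
target = pick_target_case(case_set, task_rule=lambda c: c["priority"])
print(target["name"])                       # lowest priority number first: "pay"
print(run_case(target, "android", "device-01"))
```

The point of the lookup table is that the per-OS branching lives in one place, so the step sequence itself stays platform-independent, as the claim describes.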
2. The method of claim 1, wherein performing an interface test on the device to be tested through the difference interface to obtain an interface test result comprises:
acquiring an execution result corresponding to the difference interface;
when the execution result indicates that the execution step failed, acquiring a preset positioning element list;
selecting a first positioning element from the positioning element list as an abnormal positioning element;
performing control positioning processing on the execution result through the abnormal positioning element to obtain control positioning data;
when the control positioning data fails to locate an abnormal control, reselecting a second positioning element from the positioning element list to update the abnormal positioning element, and performing control positioning processing on the execution result again through the updated abnormal positioning element;
stopping the control positioning processing when all positioning elements in the positioning element list have been traversed or the control positioning data locates an abnormal control; and
obtaining an interface test result according to the execution result and the control positioning data.
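The locator-fallback loop of claim 2 reduces to a simple traversal: try each positioning element from the preset list in order until one locates an abnormal control or the list is exhausted. A hedged sketch, where `locate()` and the locator names are illustrative placeholders:

```python
def locate(execution_result, locator):
    """Pretend control-positioning: return the control this locator matches,
    or None if it matches nothing."""
    return execution_result["controls"].get(locator)

def position_abnormal_control(execution_result, locator_list):
    """Traverse the preset positioning-element list; stop at the first
    locator that yields an abnormal control, or after the whole list."""
    control = None
    for locator in locator_list:
        control = locate(execution_result, locator)
        if control is not None:
            break
    return control

# A failed execution result in which only one locator strategy succeeds.
failed_result = {"controls": {"xpath": None, "accessibility-id": "error-dialog"}}
print(position_abnormal_control(failed_result, ["id", "xpath", "accessibility-id"]))
```

Returning `None` when the list is exhausted corresponds to the claim's "all positioning elements traversed" stop condition; the final test result would then be built from the raw execution result alone.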
3. The method of claim 2, wherein, after the control positioning processing is stopped because the control positioning data has located an abnormal control, the method further comprises:
matching the control positioning data against a preset interference list to determine whether an interference factor exists;
when an interference factor exists, re-invoking the difference interface to perform the interface test on the device to be tested; and
during the re-invoked interface test on the device to be tested, dismissing the dialog box popped up by the interference factor through an operation instruction read from a configuration file.
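Claim 3's interference handling can be sketched in two parts: matching the located control against a preset interference list, and looking up the dismissal instruction in a configuration file. The interference names, config section, and instruction strings below are all assumptions for illustration:

```python
import configparser

# Hypothetical preset interference list: pop-ups known to disturb test runs.
INTERFERENCE_LIST = ["upgrade-dialog", "permission-dialog"]

def find_interference(control_data):
    """Return the first interference factor present in the control data, if any."""
    return next((f for f in INTERFERENCE_LIST if f in control_data), None)

def dismiss_instruction(config_text, factor):
    """Read the operation instruction for dismissing this dialog from a
    configuration file (here supplied as a string)."""
    cfg = configparser.ConfigParser()
    cfg.read_string(config_text)
    return cfg["dismiss"][factor]

config = "[dismiss]\nupgrade-dialog = tap Later\npermission-dialog = tap Allow\n"
factor = find_interference(["root-view", "upgrade-dialog"])
print(factor, "->", dismiss_instruction(config, factor))
```

Keeping the dismissal instructions in a configuration file, as the claim does, means new pop-up variants can be handled without changing test code.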
4. The method of claim 2, wherein performing an interface test on the device to be tested through the difference interface to obtain an interface test result further comprises:
when the execution result indicates that the execution step failed, switching the display page shown on the device to be tested to the web page corresponding to the execution step through a multi-window switching mechanism.
5. The method according to any one of claims 1 to 4, wherein performing an interface test on the device to be tested through the difference interface to obtain an interface test result further comprises:
when the target test case is a native case, detecting a communication state of the difference interface, and restarting the test platform executing the target test case when the communication state indicates connection jitter; and
when the target test case is an embedded web page case, detecting a window switching state of the browser control corresponding to the target test case, and restarting the driver of the browser control when the window switching state indicates an abnormal switch.
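The two recovery branches of claim 5 amount to a dispatch on case kind and detected state. A minimal sketch under assumed names (the case kinds, states, and recovery actions are illustrative labels, not the patent's identifiers):

```python
def recover(case_kind, detected_state):
    """Map (case kind, detected state) to the recovery action of claim 5:
    native cases restart the test platform on connection jitter; embedded
    web-page cases restart the browser-control driver on an abnormal switch."""
    if case_kind == "native" and detected_state == "connection-jitter":
        return "restart-test-platform"
    if case_kind == "embedded-web" and detected_state == "abnormal-switch":
        return "restart-browser-driver"
    return "no-action"

print(recover("native", "connection-jitter"))
print(recover("embedded-web", "abnormal-switch"))
print(recover("native", "ok"))
```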
6. An interface automation test method, applied to a server in communication connection with at least one execution end, the method comprising:
determining whether the execution end is in an idle state;
when the execution end is in an idle state, determining a test case set to be tested from a preset plan case set; and
sending the test case set to the execution end, so that the execution end executes the method of any one of claims 1 to 5.
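The server side of claim 6 is an idle-driven dispatcher: work is sent only to execution ends that report an idle state. A sketch with hypothetical executor names and states:

```python
from collections import deque

def dispatch(executors, plan_sets):
    """Assign one test case set from the preset plan set to each idle
    execution end; busy executors receive nothing. Returns {executor: set}."""
    plan = deque(plan_sets)
    assignments = {}
    for name, state in executors.items():
        if state == "idle" and plan:
            assignments[name] = plan.popleft()
    return assignments

executors = {"exec-1": "busy", "exec-2": "idle", "exec-3": "idle"}
print(dispatch(executors, [["case-a", "case-b"], ["case-c"]]))
```

Dispatching only on idle state, as the claim specifies, keeps case sets from queuing behind a busy execution end while another sits unused.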
7. The method of claim 6, wherein at least one target test case in the plan case set is obtained by:
acquiring a recorded case, and disassembling the recorded case into steps to obtain a case step set corresponding to the recorded case; and
associating and combining the Json fields corresponding to the case steps in the case step set to obtain the target test case.
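Claim 7's two operations, step disassembly and Json-field association, can be sketched as follows. The field layout (`index`, `action`, `case`, `steps`) is an assumed example format, not the patent's actual schema:

```python
import json

def disassemble(recording):
    """Disassemble a recorded case into one step record per recorded action,
    preserving the recording order."""
    return [{"index": i, "action": a} for i, a in enumerate(recording["actions"])]

def combine(name, step_set):
    """Associate and combine the per-step JSON fields into one target test case."""
    return json.dumps({"case": name, "steps": step_set}, sort_keys=True)

recording = {"name": "login-flow", "actions": ["open-app", "input-user", "tap-login"]}
steps = disassemble(recording)
target_case = combine(recording["name"], steps)
print(target_case)
```

Storing the case as structured JSON rather than a raw recording lets individual steps be edited or reordered before the server places the case into the plan set.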
8. An interface automation test apparatus, applied to an execution end, the apparatus comprising:
a receiving module, configured to acquire a test case set sent by a server in communication connection with the execution end, wherein the test case set is issued by the server when the execution end is detected to be in an idle state;
a determining module, configured to determine a currently executed target test case from the test case set according to an execution task corresponding to the test case set;
a general flow acquisition module, configured to acquire an operating system type and a general flow corresponding to the target test case;
an execution module, configured to sequentially execute a plurality of execution steps included in the target test case according to the execution order set in the general flow; and
a difference processing module, configured to, during the execution of each execution step, determine a difference interface corresponding to the currently executed step according to the operating system type, and perform an interface test on the device to be tested through the difference interface to obtain an interface test result.
9. An electronic device, comprising a memory storing a computer program and a processor, wherein the processor, when executing the computer program, implements the interface automation test method of any one of claims 1 to 5 or the interface automation test method of any one of claims 6 to 7.
10. A computer-readable storage medium storing a computer program which, when executed by a processor, implements the interface automation test method of any one of claims 1 to 5 or the interface automation test method of any one of claims 6 to 7.
CN202211154917.8A 2022-09-21 2022-09-21 Interface automation test method, device, equipment and storage medium Pending CN115454869A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211154917.8A CN115454869A (en) 2022-09-21 2022-09-21 Interface automation test method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211154917.8A CN115454869A (en) 2022-09-21 2022-09-21 Interface automation test method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115454869A true CN115454869A (en) 2022-12-09

Family

ID=84306180

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211154917.8A Pending CN115454869A (en) 2022-09-21 2022-09-21 Interface automation test method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115454869A (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115687160A (en) * 2022-12-30 2023-02-03 杭州兑吧网络科技有限公司 Interface test system, method and equipment cluster
CN116561472A (en) * 2023-04-14 2023-08-08 广州力麒智能科技有限公司 Government browser system architecture based on Electron cross-platform technology
CN116561472B (en) * 2023-04-14 2023-12-26 广州力麒智能科技有限公司 Government browser system architecture based on Electron cross-platform technology
CN116192922A (en) * 2023-04-23 2023-05-30 成都华兴汇明科技有限公司 Issuing management method, device and system for test cases
CN116192922B (en) * 2023-04-23 2023-08-11 成都华兴汇明科技有限公司 Issuing management method, device and system for test cases
CN117149638A (en) * 2023-09-01 2023-12-01 镁佳(北京)科技有限公司 UI (user interface) automatic testing method and device, computer equipment and storage medium
CN117076329A (en) * 2023-10-12 2023-11-17 浙江云融创新科技有限公司 Method and system for concurrent execution of use cases in service mutex state
CN117076329B (en) * 2023-10-12 2024-01-30 浙江云融创新科技有限公司 Method and system for concurrent execution of use cases in service mutex state

Similar Documents

Publication Publication Date Title
CN115454869A (en) Interface automation test method, device, equipment and storage medium
CN109302522B (en) Test method, test device, computer system, and computer medium
US11163731B1 (en) Autobuild log anomaly detection methods and systems
US10664388B2 (en) Continuous integration testing for network-based applications
CN105164644B (en) Hook frame
KR20210040884A (en) Edge computing test methods, devices, electronic devices and computer-readable media
US9280445B2 (en) Logging code generation and distribution
US20230236809A1 (en) Ios application program construction method and apparatus, electronic device, and storage medium
US10353807B2 (en) Application development management
US20170228220A1 (en) Self-healing automated script-testing tool
US10310964B2 (en) System and method for determining relevance of application software maintenance
CN112241360A (en) Test case generation method, device, equipment and storage medium
CN115658529A (en) Automatic testing method for user page and related equipment
CN114297056A (en) Automatic testing method and system
CN115617780A (en) Data import method, device, equipment and storage medium
WO2017084388A1 (en) Network polling method and apparatus
CN112052037A (en) Application software development method, device, equipment and medium
US20210406158A1 (en) Systems and methods for automated device testing
CN114064467A (en) Resource analysis method and device, electronic equipment and storage medium
US11467786B2 (en) Using artificial intelligence to respond to printer error
CN111858302B (en) Method and device for testing small program, electronic equipment and storage medium
CN116820526B (en) Operating system upgrading method, device, equipment and storage medium
CN112765040B (en) Page test method, system, computer equipment and storage medium
US10977210B2 (en) Methods for implementing an administration and testing tool
CN116991735A (en) Front-end component change testing system, method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination