CN116185810A - Automated testing with improved extensibility and compatibility


Info

Publication number
CN116185810A
Authority
CN
China
Prior art keywords
test
automated
registry
agent
authorization code
Prior art date
Legal status
Pending
Application number
CN202111432022.1A
Other languages
Chinese (zh)
Inventor
步绍鹏
丁宏
陶冉
胡与兴
郭健
李英杰
沈理
Current Assignee
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Technology Licensing LLC
Priority date
Filing date
Publication date
Application filed by Microsoft Technology Licensing LLC
Priority to CN202111432022.1A
Priority to PCT/US2022/042581 (WO2023096690A1)
Publication of CN116185810A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/36: Preventing errors by testing or debugging software
    • G06F 11/3668: Software testing
    • G06F 11/3672: Test management
    • G06F 11/3688: Test management for test execution, e.g. scheduling of test suites

Abstract

The present disclosure presents methods for performing automated testing. A test request for performing an automated test with a specified set of test devices may be received by a registry. A test task corresponding to the test request may be generated by the registry. The test task may be scheduled by the registry to a test agent associated with the specified set of test devices. The automated test may be performed by the test agent with the specified set of test devices. The present disclosure also proposes an automated test system. The automated test system may include a registry, at least one test agent, and at least one set of test devices.

Description

Automated testing with improved extensibility and compatibility
Background
In the development of software applications, testing plays a critical role as an essential part of ensuring application quality. The software application for which a test is performed may be referred to herein as a target application. In general, when testing a target application, after a test case is determined, a tester may perform the test step by step according to the procedure described in the test case and compare the actual test results with the expected test results to verify whether each function of the target application is correct. To save manpower, time, and hardware resources in this process and to improve test efficiency, automated testing has been introduced. Automated testing may be the process of converting human-driven testing into machine-executed testing. In automated testing, specific software or programs may be utilized to control the execution of the test and the comparison between the actual test results and the expected test results. Through automated testing, some of the repetitive but necessary test tasks in the test flow may be automated, and some test tasks that would otherwise be difficult to perform manually may be carried out.
Disclosure of Invention
This summary is provided to introduce a selection of concepts that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
Embodiments of the present disclosure propose a method for performing automated testing. A test request for performing an automated test with a specified set of test devices may be received by a registry. A test task corresponding to the test request may be generated by the registry. The test task may be scheduled by the registry to a test agent associated with the specified set of test devices. The automated test may be performed by the test agent with the specified set of test devices.
Embodiments of the present disclosure also propose an automated test system. The automated test system may include: a registry configured to receive a test request for performing an automated test for a target application with a specified set of test devices, generate a test task corresponding to the test request, and schedule the test task to a test agent associated with the specified set of test devices; at least one test agent, each test agent configured to receive test tasks and to perform automated testing with the set of test devices specified in the received test tasks; and at least one set of test devices, each set of test devices being associated with one of the at least one test agent and configured to run the target application.
It is noted that one or more of the aspects above include the features specifically pointed out in the following detailed description and the claims. The following description and the annexed drawings set forth in detail certain illustrative features of the one or more aspects. These features are indicative of but a few of the various ways in which the principles of various aspects may be employed and the present disclosure is intended to include all such aspects and their equivalents.
Drawings
The disclosed aspects will be described below in conjunction with the drawings, which are provided to illustrate and not limit the disclosed aspects.
FIG. 1 illustrates an exemplary process for performing automated testing according to an embodiment of the present disclosure.
FIG. 2 illustrates an exemplary process for creating a test agent according to an embodiment of the present disclosure.
FIG. 3 illustrates an exemplary process for acquiring an authorization code for a set of test devices according to an embodiment of the present disclosure.
FIG. 4 illustrates an exemplary video navigation interface according to an embodiment of the present disclosure.
FIG. 5 illustrates an exemplary architecture of an automated test system according to an embodiment of the present disclosure.
FIG. 6 illustrates an exemplary test device set according to an embodiment of the present disclosure.
FIG. 7 is a flowchart of an exemplary method for performing automated testing according to an embodiment of the present disclosure.
FIG. 8 illustrates an exemplary automated test system according to an embodiment of the present disclosure.
FIG. 9 illustrates an exemplary apparatus for performing automated testing according to an embodiment of the present disclosure.
Detailed Description
The present disclosure will now be discussed with reference to several exemplary embodiments. It should be understood that the discussion of these embodiments is merely intended to enable those skilled in the art to better understand, and thereby practice, the embodiments of the present disclosure, and is not intended to limit the scope of the present disclosure in any way.
Currently, a single machine is typically utilized to perform an automated test and present the test results. Taking automated testing of mobile applications as an example, the automated test typically runs on a personal computer connected to the mobile devices, and the personal computer both performs the automated test and presents the test results. The personal computer may go down due to a malfunction. Once it is down, the automated test is interrupted and the existing test results are lost. Thus, it may be difficult to provide a reliable automated test service in this manner. Furthermore, since mobile devices are typically connected to a personal computer in a wired manner, the number of mobile devices that can be connected to a personal computer is limited. This limits the number of automated tests that can be performed simultaneously.
Embodiments of the present disclosure propose an improved automated test service. The automated test service may be decoupled into a registry and test agents. The registry may manage test agents, schedule test tasks, visualize test results, and so forth. The test agents may perform test tasks, send test results to the registry, and so forth. The registry and the test agents may be deployed at different locations. Multiple test agents may be deployed simultaneously. Even if one or some of the test agents fail, automated tests can be performed by the other test agents and existing test results can be retained. The registry may also adopt a multi-node deployment to avoid single-node failures. In this way, the reliability of the automated test service may be significantly improved.
In one aspect, embodiments of the present disclosure propose to create a test agent with a terminal device. The terminal device may be any computing device located in any geographic location, such as a desktop computer, laptop computer, tablet computer, cellular telephone, wearable device, etc. The terminal device may be configured as a test agent by running a test agent creation program on the terminal device and registering with the registry. A desired set of test devices with the target application installed may then be connected to the test agent for automated testing of the target application. Herein, a set of test devices may refer to a set of test devices that is connected as a whole to a test agent. The form of a test device set may include, for example, a single test device, a test device pair made up of two test devices, a test device group made up of more than two test devices, and so forth. Herein, a test device may refer to a computing device, such as a desktop computer, laptop computer, tablet computer, cellular telephone, wearable device, etc., on which a target application runs and with which automated testing of the target application is performed. Since a test agent can be created by a terminal device located at any geographic location, and connecting a desired test device set to the test agent enables automated testing, the number of supported test device sets can be greatly expanded, thereby improving the extensibility of the automated test service. In addition, this approach enables automated testing to be conveniently performed with a set of test devices located at any geographic location. For example, the user may be located at site A, while the desired set of test devices is located at site B. A test agent may be created at site B and the desired set of test devices connected to that test agent. The user at site A may then request automated testing with the test device set at site B. Furthermore, in the event that the number of currently available test device sets is insufficient, test device sets may be supplemented by creating a test agent and connecting appropriate test device sets to it, or by requesting the use of test device sets on other test agents. In this way, the time spent scheduling test tasks and waiting for test execution can be shortened.
In another aspect, embodiments of the present disclosure propose to make the automated test service compatible with various types of automated tests by having test agents support various types of test frameworks and associating various forms of test device sets with the test agents. The types of automated tests that may be supported include, for example, tests for single-operating-system applications, tests for cross-operating-system applications, tests for Web applications, and so forth. Herein, a single-operating-system application may refer to an application that involves only a single operating system, such as an Android application, an iOS application, a Windows application, a macOS application, and so forth. A cross-operating-system application may refer to an application that involves interactions between multiple operating systems, such as an application involving interactions between the Android operating system and the Windows operating system, an application involving interactions between the iOS operating system and the macOS operating system, and so forth. A Web application may refer to a Web-based application, such as a Web site accessed through a browser, a plug-in running in a browser, and so forth. For example, testing of cross-operating-system applications may be achieved by having a test agent support a test framework such as Appium and connecting a pair of test devices with different operating systems installed to the test agent. In this way, the application scenarios of automated testing become broader, thereby significantly improving the compatibility of the automated test service.
In another aspect, embodiments of the present disclosure propose unified management of test resources for sharing among the various test agents. The test resources may include, for example, application packages of target applications, test suites including the test cases of automated tests to be performed, test device sets, and the like. The application packages and test suites may be stored in a data store deployed in the cloud and managed by the registry. The test device sets may also be managed by the registry. In this way, different users or teams, such as users or teams located in different geographic locations, may share the test resources, thereby improving the reusability of the test resources and saving resource costs.
In another aspect, embodiments of the present disclosure propose to manage usage rights of test device sets by means of authorization codes. Herein, an authorization code may refer to an encrypted code that is associated with a particular set of test devices and used to authorize the sender of a test request to use that particular set of test devices for automated testing. Unlike the publicly visible identifier of a test device set, the authorization code of a test device set is access-restricted. The authorization code for a test device set may be provided by the registry to the test agent at the time of creation of the test agent associated with the test device set, and further provided to designated users of the test agent, such as the creator of the test agent, an administrator, etc. In addition, the authorization code for a test device set may be provided by the registry to a general user when a designated user of the test agent associated with the test device set grants the request. When a test agent receives, from the registry, a test task for performing an automated test with a specified set of test devices, the test agent may determine whether the received test task includes the authorization code for the specified set of test devices and perform the automated test if the test task includes the authorization code. In this way, arbitrary use of test device sets can be prevented, thereby improving the security of the automated test service.
In another aspect, embodiments of the present disclosure propose to visualize the test results of an automated test by presenting a video navigation interface. The video navigation interface may include a navigation area for displaying a plurality of test cases included in the automated test. Each test case may be selectable. The video navigation interface may also include a video area for displaying a video clip corresponding to the selected test case. In this way, a user is able to conveniently view the status of the test devices during the execution of the various test cases of the automated test. Furthermore, the video clips of failed test cases can be combined with the corresponding test logs, device logs, and the like, so that bugs in the target application can be located more quickly and accurately.
FIG. 1 illustrates an exemplary process 100 for performing automated testing in accordance with an embodiment of the present disclosure. In process 100, an automated test service may be decoupled into a registry and test agents. The registry may manage test agents, schedule test tasks, visualize test results, and so forth. The test agents may perform test tasks, send test results to the registry, and so forth.
At 102, a test request for performing an automated test with a specified set of test devices may be received by the registry. The test request may include, for example, an application package of the target application, a test suite including a set of test cases for the automated test to be performed, an identifier of the specified set of test devices, a recipient of the test report, and so forth. For example, when initiating a test request, a user may select the set of test devices on which to perform the automated test as the specified set of test devices. In addition, where the user knows the authorization code of the specified set of test devices, the test request may also include the authorization code. The specified test device set may have various forms, such as a single test device, a test device pair consisting of two test devices, a test device group consisting of more than two test devices, and so forth. Exemplary forms of test device sets will be described later in connection with FIG. 6. A test device may be any type of computing device, such as a desktop computer, a laptop computer, a tablet computer, a cellular telephone, a wearable device, and the like. Preferably, after the registry receives the test request, information about the set of test devices specified in the test request, such as the identifier of the specified set of test devices, the supported test types, the supported test frameworks, the current state of the test devices included in the set, the device types, the device models, etc., may be presented to the user via a front end connected to the registry.
At 104, a test task corresponding to the test request may be generated by the registry. For example, the registry may identify, from the test request, the application package, the test suite, the identifier of the specified set of test devices, the authorization code for the specified set of test devices, etc., and generate the test task based on them.
At 106, the test task may be scheduled by the registry to a test agent associated with the specified set of test devices. For example, the identifier of the specified set of test devices may be included in the test task. The registry may identify the test agent associated with the specified set of test devices and schedule the test task to the identified test agent.
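The following is a minimal Python sketch of the registry-side flow at steps 102 through 106, assuming a simple in-memory mapping from device-set identifiers to registered test agents. All names here (TestRequest, TestTask, Registry, enqueue) are hypothetical; the disclosure does not prescribe concrete data structures or APIs.

```python
from dataclasses import dataclass
from typing import Optional
import uuid

@dataclass
class TestRequest:
    app_package: str                          # application package of the target application
    test_suite: str                           # test suite with the test cases to be run
    device_set_id: str                        # identifier of the specified test device set
    report_recipient: str                     # recipient of the test report
    authorization_code: Optional[str] = None  # included if the sender knows it

@dataclass
class TestTask:
    task_id: str
    request: TestRequest

class Registry:
    def __init__(self):
        # device_set_id -> the test agent that the device set is associated with
        self.agents_by_device_set = {}

    def handle_test_request(self, request: TestRequest) -> str:
        # Step 104: generate a test task corresponding to the test request.
        task = TestTask(task_id=str(uuid.uuid4()), request=request)
        # Step 106: identify the test agent associated with the specified
        # device set and schedule the test task to it.
        agent = self.agents_by_device_set[request.device_set_id]
        agent.enqueue(task)
        return task.task_id
```

The agent-side enqueue method is sketched after step 112 below.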
The automated test may be performed by the test agent with the specified set of test devices. Preferably, the test agent may first determine whether the received test task includes the authorization code for the set of test devices specified in the test task, and perform the automated test in response to determining that the test task includes the authorization code. For example, at 108, it may be determined by the test agent whether the test task includes the authorization code for the specified set of test devices. The authorization code may come from the test request.
If it is determined at 108 that the test task does not include the authorization code for the specified set of test devices, the process 100 may proceed to step 110. At 110, the automated test is not performed with the specified set of test devices.
If it is determined at 108 that the test task includes the authorization code for the specified set of test devices, the process 100 may proceed to step 112. At 112, the automated test may be performed by the test agent with the specified set of test devices.
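Continuing the hypothetical names above, here is a sketch of the agent-side check at steps 108 through 112; how the agent stores the authorization codes it received from the registry at creation time is likewise an assumption.

```python
class TestAgent:
    def __init__(self, expected_codes: dict):
        # device_set_id -> authorization code issued by the registry for that
        # device set (obtained when the test agent was created; see process 200)
        self.expected_codes = expected_codes

    def enqueue(self, task: "TestTask") -> None:
        req = task.request
        expected = self.expected_codes.get(req.device_set_id)
        # Step 108: does the test task include the authorization code for the
        # specified test device set?
        if req.authorization_code is not None and req.authorization_code == expected:
            self.run_test(task)   # step 112: perform the automated test
        else:
            self.reject(task)     # step 110: the specified device set is not used

    def run_test(self, task: "TestTask") -> None:
        pass  # drive the supported test framework against the device set

    def reject(self, task: "TestTask") -> None:
        pass  # report back to the registry that the task was not authorized
```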
According to embodiments of the present disclosure, the status of each test device in the set of test devices may be monitored in real time during the execution of the automated test to obtain performance data for the target application under test. For example, the central processing unit (CPU) utilization, memory utilization, network connection status, etc. of a test device may be monitored in real time. The battery usage of the test device can also be monitored in real time, so that the user can learn the power consumption of the target application under test. In addition, the latency of clicks within the application can be monitored in real time, so that the user can learn the responsiveness of the target application. Further, a video corresponding to the automated test may be recorded by capturing, in real time, the screen and/or sound of the test devices during the execution of the automated test. The recorded video may include a plurality of video clips corresponding to the plurality of test cases included in the automated test.
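As one possible realization of this monitoring, the sketch below assumes the test devices are Android devices reachable through adb and polls the standard `dumpsys cpuinfo` and `dumpsys meminfo` shell commands; the sampling interval and the lack of output parsing are simplifications.

```python
import subprocess
import time

def monitor_device(serial: str, interval_s: float = 1.0, samples: int = 60) -> list:
    """Poll CPU and memory information for one test device over adb."""
    readings = []
    for _ in range(samples):
        cpu = subprocess.run(["adb", "-s", serial, "shell", "dumpsys", "cpuinfo"],
                             capture_output=True, text=True).stdout
        mem = subprocess.run(["adb", "-s", serial, "shell", "dumpsys", "meminfo"],
                             capture_output=True, text=True).stdout
        readings.append({"time": time.time(), "cpu": cpu, "mem": mem})
        time.sleep(interval_s)
    return readings
```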
At 114, the test results of the automated test may be obtained by the test agent and sent to the registry. The test results of the automated test may include, for example, pass/fail information for each test case, run time, test logs, device logs, custom logs, screenshots, performance data, video, and the like. The test results can comprehensively reflect the condition of the target application under test. A detailed test report for the automated test may be generated based on the overall test results. The detailed test report may help the developers of the target application quickly and accurately locate bugs existing in the target application.
At 116, the test results may be visualized by the registry. For example, the test results may be presented on a dashboard. The dashboard may be displayed, for example, via a front end connected to the registry. Preferably, a video navigation interface may be presented when visualizing the test results. The video navigation interface may include a navigation area for displaying a plurality of test cases included in the automated test. Each test case may be selectable. The video navigation interface may also include a video area for displaying a video clip corresponding to the selected test case. An exemplary video navigation interface will be described later in connection with FIG. 4.
It should be appreciated that the process for performing automated testing described above in connection with FIG. 1 is merely exemplary. The steps in the process for performing automated testing may be replaced or modified in any manner, and the process may include more or fewer steps, depending on the actual application requirements. For example, the test agent may perform the automated test directly after receiving a test task that specifies a set of test devices, without determining whether the test task includes the authorization code for the specified set of test devices, such as when the test agent knows that the test task comes from one of its designated users. In addition, at 116, in addition to visualizing the test results, the status of the test devices in the set of test devices may be presented. Furthermore, the particular order or hierarchy of steps in process 100 is exemplary only, and the process for performing automated testing may be performed in an order different from that described.
According to embodiments of the present disclosure, the test agent may be created by the terminal device. The terminal device may be any computing device located in any geographic location, such as a desktop computer, laptop computer, tablet computer, cellular telephone, wearable device, etc. For example, a user may create a test agent using a computing device that is accessible to the user. FIG. 2 illustrates an exemplary process 200 for creating a test agent according to an embodiment of the disclosure.
At 202, a test agent creation program may be run on a terminal device. The test agent creation program may be a predetermined computer program that can be used to create a test agent.
At 204, a registration process may be initiated at the terminal device to the registry. For example, a registration request may be sent by the terminal device to the registry. The registry may register the terminal device in response to receiving the registration request.
At 206, the terminal device may be configured as a test agent in response to completion of the registration process.
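A minimal sketch of steps 202 through 206 follows, assuming the registry exposes an HTTP registration endpoint; the URL, payload fields, and response shape are illustrative assumptions rather than part of the disclosure.

```python
import requests

def create_test_agent(registry_url: str, host_name: str) -> dict:
    # Step 204: initiate a registration process to the registry.
    resp = requests.post(f"{registry_url}/agents/register",
                         json={"host": host_name}, timeout=10)
    resp.raise_for_status()
    # Step 206: upon completion of the registration process, this terminal
    # device is configured as a test agent identified by the returned id.
    return resp.json()  # e.g. {"agent_id": "...", "status": "registered"}
```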
After registering the test agent, the authorization code for the set of test devices associated with the test agent may further be obtained.
At 208, an authorization code generation request for generating an authorization code for a set of test devices associated with the test agent may be received by the registry. For example, the test agent may send to the registry an authorization code generation request for generating an authorization code for the set of test devices associated with it. The authorization code generation request may include information about the test device set, such as the identifier of the test device set, the supported test types, the supported test frameworks, the current state of the test devices included in the set, the device types, the device models, and the like.
At 210, the authorization code may be generated by the registry. For example, the registry may generate the authorization code for the set of test devices based on the information about the set of test devices included in the authorization code generation request received from the test agent.
At 212, the generated authorization code may be sent by the registry to the test agent.
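The disclosure describes the authorization code only as an encrypted code derived from the information of the test device set. One plausible realization, shown here purely as an assumption, is an HMAC over that information computed with a registry-held secret.

```python
import hashlib
import hmac
import json

def generate_authorization_code(registry_secret: bytes, device_set_info: dict) -> str:
    # Serialize the device-set information deterministically and derive an
    # access-restricted code from it (step 210).
    payload = json.dumps(device_set_info, sort_keys=True).encode()
    return hmac.new(registry_secret, payload, hashlib.sha256).hexdigest()

code = generate_authorization_code(
    b"registry-held-secret",
    {"device_set_id": "pair-042",
     "test_types": ["cross-operating-system"],
     "device_models": ["Android phone X", "Windows computer Y"]})
```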
Since a test agent can be created by a terminal device located at any geographic location, and connecting a desired test device set to the test agent enables automated testing, the number of supported test device sets can be greatly expanded, thereby improving the extensibility of the automated test service. In addition, the user may be located at site A while the desired set of test devices is located at site B. A test agent may be created at site B and the desired set of test devices connected to that test agent. The user at site A may then request automated testing with the test device set at site B. This approach enables automated testing to be conveniently performed with a set of test devices located at any geographic location. Furthermore, in the event that the number of currently available test device sets is insufficient, test device sets may be supplemented by creating a test agent and connecting appropriate test device sets to it. Additionally or alternatively, test device sets may be supplemented by requesting the use of test device sets on other test agents. In this way, the time spent scheduling test tasks and waiting for test execution can be shortened.
It should be appreciated that the process for creating a test agent described above in connection with FIG. 2 is merely exemplary. The steps in the process for creating the test agent may be replaced or modified in any manner and may include more or fewer steps depending on the actual application requirements. Further, the particular order or hierarchy of steps in process 200 is merely exemplary, and the process for creating test agents may be performed in an order different from that described.
As described above, the authorization code for a specified set of test devices may be provided when initiating, to the registry, a test request for performing an automated test with the specified set of test devices. When a test agent receives, from the registry, the test task corresponding to the test request, the test agent may determine whether the received test task includes the authorization code for the specified set of test devices, and perform the automated test if the test task includes the authorization code. For a designated user of the test agent associated with the specified set of test devices, such as the creator or an administrator of the test agent, the authorization code may be obtained from the test agent. The authorization code at the test agent may be obtained from the registry at the time the test agent is created, for example, through process 200 of FIG. 2. For a general user, for example, a user who is not a designated user of the test agent, the authorization code for the specified set of test devices may be requested, via the registry, from the test agent associated with the specified set of test devices. The registry may provide the authorization code to the general user when a designated user of the test agent associated with the test device set grants the request. In this way, arbitrary use of test device sets can be prevented, thereby improving the security of the automated test service. FIG. 3 illustrates an exemplary process 300 for acquiring an authorization code for a set of test devices in accordance with an embodiment of the present disclosure.
At 302, an authorization code acquisition request for acquiring the authorization code of a specified set of test devices may be received by the registry. The authorization code acquisition request may be received, for example, from a computing device associated with a user who wants to perform an automated test with the specified set of test devices.
At 304, an authorization code acquisition request may be forwarded by the registry to a test agent associated with the specified set of test devices.
Subsequently, it may be determined by the registry whether to provide the authorization code based on the response received from the test agent.
At 306, a response to the authorization code acquisition request may be received from the test agent through the registry. For example, the test agent may, after receiving the authorization code acquisition request forwarded by the registry, determine whether to grant the authorization code to the sender of the authorization code acquisition request, and include the determination in a response to the authorization code acquisition request sent to the registry.
At 308, it may be determined by the registry whether the response received from the test agent indicates that the test agent grants provision of the authorization code. If it is determined at 308 that the response indicates that the test agent grants provision of the authorization code, the process 300 may proceed to step 310. At 310, the authorization code may be provided by the registry. If it is determined at 308 that the response does not indicate that the test agent grants provision of the authorization code, the process 300 may proceed to step 312. At 312, the authorization code is not provided.
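A sketch of the registry side of process 300, steps 302 through 312, assuming a synchronous exchange with the test agent; the method names and the shape of the agent's response are illustrative assumptions.

```python
from typing import Optional

def handle_code_acquisition(registry, requester_id: str, device_set_id: str) -> Optional[str]:
    agent = registry.agents_by_device_set[device_set_id]
    # Step 304: forward the acquisition request to the associated test agent,
    # which asks a designated user whether to grant the code.
    response = agent.ask_designated_user(requester_id, device_set_id)
    # Steps 308-310: provide the code only if the response indicates a grant.
    if response.get("granted"):
        return registry.lookup_authorization_code(device_set_id)
    # Step 312: otherwise the authorization code is not provided.
    return None
```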
It should be appreciated that the process for obtaining the authorization code for a set of test devices described above in connection with FIG. 3 is merely exemplary. The steps in the process may be replaced or modified in any manner, and the process may include more or fewer steps, depending on the actual application requirements. For example, in determining whether to grant the authorization code to the sender of the authorization code acquisition request, the test agent may consider whether the sender has created other test agents; a sender who has created another test agent may be expected to prefer using the test device sets of that test agent. Furthermore, the particular order or hierarchy of steps in process 300 is exemplary only, and the process for obtaining the authorization code for a set of test devices may be performed in an order different from that described.
FIG. 4 illustrates an exemplary video navigation interface 400 according to an embodiment of the present disclosure.
The video navigation interface 400 may include a navigation area 410 for displaying a plurality of test cases included in an automated test. In the navigation area 410, the name and corresponding time of each test case are shown. Optionally, an initialization event and a test-run start event, together with their corresponding times, are also shown in the navigation area 410. Preferably, failed test cases may be marked. Failed test cases may be marked in various ways. For example, failed test cases may be displayed in a different color than successful test cases, failed test cases may be underlined, failed test cases may be highlighted, and so on. For example, the test case "5. NotesCardTest.CreateTextNote" is highlighted in the navigation area 410, which indicates that this test case failed.
Each test case may be selected by clicking on it. The video navigation interface 400 may also include a video area 420 for displaying a video clip corresponding to the selected test case. After a test case is selected in the navigation area 410, the video corresponding to the automated test may jump to the point in time corresponding to the selected test case, so that the video clip corresponding to the selected test case is displayed in the video area 420. In this way, a user is able to conveniently view the status of the test devices during the execution of the various test cases of the automated test. Furthermore, the video clips of failed test cases can be combined with the corresponding test logs, device logs, and the like, so that bugs in the target application can be located more quickly and accurately.
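One way to realize this jump, assuming each test case is stored with the offset at which its clip begins inside the full test recording; the marker structure and names are illustrative.

```python
from dataclasses import dataclass

@dataclass
class CaseMarker:
    name: str        # test case name shown in the navigation area
    start_s: float   # offset of the corresponding clip within the recorded video
    passed: bool     # failed cases are marked (e.g., highlighted)

def seek_position(markers: list, selected_name: str) -> float:
    """Return the playback position the video area should jump to."""
    for marker in markers:
        if marker.name == selected_name:
            return marker.start_s
    raise KeyError(selected_name)
```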
Optionally, the video navigation interface 400 may also include a button 430 for downloading video and a set of buttons 440 for setting playback speed.
It should be appreciated that the video navigation interface 400 shown in FIG. 4 is merely one example of a video navigation interface. The video navigation interface may have any other structure and may include more or fewer elements, depending on the actual application requirements. For example, in the video navigation interface 400, in addition to the buttons for downloading video and setting playback speed, a button for setting the video quality may be displayed.
FIG. 5 illustrates an exemplary architecture 500 of an automated test system in accordance with an embodiment of the present disclosure. Architecture 500 may provide automated test services for target applications, for example, by performing the processes described above in connection with FIGS. 1-4. Architecture 500 may include registry 510 and at least one test agent, such as test agent 520-1 through test agent 520-K (K ≥ 1). In addition, architecture 500 may also include at least one test device set. Each test device set may be associated with one of the at least one test agent. For example, test device set 540-1 through test device set 540-M (M ≥ 1) may be associated with test agent 520-1, and test device set 542-1 through test device set 542-N (N ≥ 1) may be associated with test agent 520-K.
Registry 510 may manage test agents, schedule test tasks, visualize test results, and so forth. The registry 510 may be deployed in the cloud. It should be appreciated that although only one registry 510 is shown in architecture 500, in some embodiments the registry may be extended. For example, an automated test system may include more than one registry. These registries may be managed as a registry cluster with a unified endpoint. Each registry in the registry cluster may be deployed on multiple nodes in a distributed manner, thereby avoiding single-node failures.
Registry 510 may be connected to front end 550. Front end 550 may interface with the user and present a user interface associated with registry 510 to the user. In addition, registry 510 may be connected to data store 560. The data store 560 may be deployed in the cloud. The data store 560 may store test resources such as application packages 562, test suites 564, and the like. The application packages 562 may include the application programs for installing and running the target applications. The test suites 564 may include the sets of test cases for the automated tests to be performed. Registry 510 may manage the test resources stored in data store 560.
The registry 510 may include a rights management unit 512. The rights management unit 512 may manage the rights of test agents. For example, upon receiving a registration request from a terminal device, the rights management unit 512 may determine whether to register the terminal device so as to configure the terminal device as a test agent. The registration request may be triggered by a test agent creation program running on the terminal device. In addition, the rights management unit 512 may manage the rights of users to determine the sets of test devices that a user can use.
The registry 510 may include an agent and device set management unit 514. Agent and device set management unit 514 may be used to manage the test agents registered with registry 510 and the test device sets associated with those test agents. The status of the test agents and/or the test devices in the test device sets may be presented to the user via the front end 550.
Registry 510 may include test task scheduling unit 516. The test task scheduling unit 516 may generate a test task corresponding to a test request and schedule the test task to the corresponding test agent. For example, a test request may specify the set of test devices with which to perform an automated test. The test task scheduling unit 516 may schedule the test task corresponding to the test request to the test agent associated with the set of test devices specified in the test request.
The registry 510 may include a test result visualization unit 518. The test result visualization unit 518 may visualize the test results of automated tests. The test results of an automated test may include, for example, pass/fail information for each test case, run time, test logs, device logs, custom logs, screenshots, performance data, and the like. Preferably, a video navigation interface, such as video navigation interface 400 described above in connection with FIG. 4, may be presented when visualizing test results.
Each test agent 520-k (1 ≤ k ≤ K) may be registered with registry 510. Registry 510 and test agent 520-k may access each other through, for example, remote procedure calls (RPC). Test agent 520-k may be created by any computing device located in any geographic location, for example, through the process 200 described above in connection with FIG. 2. Test agent 520-k may perform test tasks, send test results to registry 510, and so forth.
Test agent 520-k may include a registration unit 522-k to initiate a registration process with registry 510.
Test agent 520-k may include a security unit 524-k for determining whether to perform a test task scheduled by registry 510. For example, the security unit 524-k may analyze a test task received from registry 510, determine whether the received test task includes the authorization code for the specified test device set, and, if it is determined that the received test task includes the authorization code, notify test agent 520-k, e.g., the test execution unit 530-k in test agent 520-k, to perform the automated test.
Test agent 520-k may include a device set management unit 526-k for locally managing one or more test device sets associated therewith.
Test agent 520-k may include a device set control tool 528-k for controlling and debugging the one or more test device sets associated with it. Test agent 520-k is typically associated with one type of test device set. The device set control tool 528-k may, for example, correspond to the type of test device set associated with test agent 520-k. As an example, when the test device sets associated with test agent 520-k are Android devices, device set control tool 528-k may be a software development kit (SDK) for Android devices. As another example, when the test device sets associated with test agent 520-k are iOS devices, device set control tool 528-k may be an SDK for iOS devices.
Test agent 520-k may include a test execution unit 530-k for performing automated testing using a set of test equipment, test suite, etc., specified in a test task.
Test agent 520-k may include a test result processing unit 532-k for retrieving test results of automated tests and transmitting the test results to registry 510.
Test agent 520-k may support various types of test frameworks, such as Appium, Espresso, JUnit, etc. Various forms of test device sets may be associated with test agent 520-k to make the automated test service compatible with various types of automated tests. The types of automated tests that may be supported include, for example, tests for single-operating-system applications, tests for cross-operating-system applications, tests for Web applications, and so forth. Test agent 520-k may be associated with one or more test device sets. For example, test agent 520-1 may be associated with test device set 540-1 through test device set 540-M, and test agent 520-K may be associated with test device set 542-1 through test device set 542-N. Each test device set may be configured to run a target application.
A test device set may have various forms, such as a single test device, a test device pair composed of two test devices, a test device group composed of more than two test devices, and the like. FIG. 6 illustrates exemplary test device sets 600, 620, and 640 according to embodiments of the present disclosure. Each of the test device sets 600, 620, and 640 may correspond to any of test device sets 540-1 through 540-M and test device sets 542-1 through 542-N in FIG. 5.
The test device set 600 may include a single test device 610. The single test device 610 may be used independently to perform an automated test. A target application 612 may run on the test device 610. The target application 612 may be, for example, a single-operating-system application, a Web application, or the like.
Test device set 620 may include two test devices, such as test device 630-1 and test device 630-2. Test device 630-1 and test device 630-2 may be bound to each other to form a test device pair and may be used to cooperatively perform an automated test. A target application 632 may run on test device 630-1 and test device 630-2. Target application 632 may be, for example, a single-operating-system application, a Web application, a cross-operating-system application, and so forth. In the case where target application 632 is a cross-operating-system application, the versions of target application 632 corresponding to the respective operating systems may run on test device 630-1 and test device 630-2, respectively. As an example, target application 632 may be an application that involves interactions between the Android operating system and the Windows operating system. In this case, test device 630-1 may be an Android phone running the Android operating system, and test device 630-2 may be a Windows computer running the Windows operating system. The Android version of target application 632 may run on test device 630-1, and the Windows version of target application 632 may run on test device 630-2. The test agent associated with test device set 620 may be a test agent that supports a test framework such as Appium. In performing the automated test, operations may be performed on test device 630-1 and test device 630-2 individually, and interoperation between test device 630-1 and test device 630-2 is also possible.
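As a concrete illustration of such a test device pair, the sketch below drives the Android phone through an Appium server and the Windows computer through WinAppDriver, Microsoft's Appium-compatible driver for Windows applications. The endpoints, ports, capabilities, and element locators are illustrative assumptions, and newer Appium Python clients pass capabilities through options objects rather than plain dictionaries.

```python
from appium import webdriver

# Android side of the pair: an Appium server on the test agent, assumed to
# listen on port 4723.
android_driver = webdriver.Remote(
    "http://test-agent-host:4723/wd/hub",
    {"platformName": "Android",
     "appPackage": "com.example.target",     # hypothetical Android build of the app
     "appActivity": ".MainActivity"})

# Windows side of the pair: WinAppDriver, assumed to listen on port 4724.
windows_driver = webdriver.Remote(
    "http://test-agent-host:4724",
    {"platformName": "Windows",
     "app": r"C:\Program Files\Target\Target.exe"})  # hypothetical Windows build

# Operate on both devices and verify the interaction between them: send a
# note from the phone and assert that it shows up on the computer.
android_driver.find_element("id", "com.example.target:id/send").click()
assert windows_driver.find_element("name", "Incoming note") is not None
```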
Test device set 640 may include more than two test devices, such as test device 650-1, test device 650-2, ..., test device 650-T (T > 2). Test device 650-1, test device 650-2, ..., test device 650-T may be bound to one another to form a test device group or test device cluster. These test devices may be used to cooperatively perform an automated test. A target application 652 may run on test device 650-1, test device 650-2, ..., test device 650-T. Target application 652 may be, for example, a single-operating-system application, a Web application, a cross-operating-system application, and the like. As an example, target application 652 may be an application involving server distribution operations that require the participation of more than two test devices.
By having the test agents support various types of test frameworks and associating various forms of test device sets with the test agents, the application scenarios of automated testing become broader, thereby significantly improving the compatibility of the automated test service.
Referring back to FIG. 5, according to embodiments of the present disclosure, a test agent may have various implementations with respect to its test device sets.
In one embodiment, the test agent may be implemented independently of the test device sets. For example, test agent 520-1 may be implemented independently of test device set 540-1 through test device set 540-M, test agent 520-K may be implemented independently of test device set 542-1 through test device set 542-N, and so on. FIG. 5 illustrates an embodiment in which the test agents are implemented independently of the test device sets. In such an embodiment, a test device set may be connected to its associated test agent in a wired manner. The number of test device sets connected to a test agent may depend on the number of ports that the test agent can provide.
In another embodiment, the test agent may be implemented in a test device set. For example, the test agent may be implemented as one of the test devices in the set. A test agent needs to access the resources of other applications when performing automated testing, which requires interaction between applications. If a test device can implement such cross-application interaction, the test device itself can serve as a test agent. In such an embodiment, when the test device set includes only a single test device, the test agent and the test device may be the same device. When the test device set includes more than one test device, the test agent may be one of those test devices.
In architecture 500, test resources may be uniformly managed for sharing among the various test agents in accordance with embodiments of the present disclosure. The test resources may include, for example, application packages 562, test suites 564, test device set 540-1 through test device set 540-M, test device set 542-1 through test device set 542-N, and so forth. The application packages 562 and test suites 564 may be stored in data store 560 deployed in the cloud and managed by registry 510. Test device set 540-1 through test device set 540-M and test device set 542-1 through test device set 542-N may also be managed by registry 510. In this way, different users or teams, such as users or teams located in different geographic locations, may share the test resources, thereby improving the reusability of the test resources and saving resource costs.
It should be appreciated that architecture 500 shown in FIG. 5 is but one example of an architecture for an automated test system. An automated test system may have any other structure and may include more or fewer components, depending on the actual application requirements. For example, a new test agent may be added to an existing automated test system through process 200 of FIG. 2 to enhance the extensibility of the automated test system. In addition, registry 510 may itself register with another registry as a test agent, further enhancing the extensibility of the automated test system.
FIG. 7 is a flowchart of an exemplary method 700 for performing automated testing in accordance with an embodiment of the present disclosure.
At 710, a test request to perform an automated test with a specified set of test equipment may be received through a registry.
At 720, a test task corresponding to the test request may be generated by the registry.
At 730, the test tasks may be scheduled by the registry to test agents associated with the specified set of test devices.
At 740, the automated test may be performed by the test agent using the specified set of test equipment.
In one embodiment, the automated test may comprise at least one of: testing for single operating system applications, testing for cross-operating system applications, and testing for Web applications.
In one embodiment, the set of designated test equipment may be in the form of at least one of: a single test device, a pair of test devices consisting of two test devices, and a group of test devices consisting of more than two test devices.
In one embodiment, the specified set of test devices may include more than one test device. Performing the automated test may include cooperatively performing the automated test, by the test agent, with the more than one test device.
In one embodiment, the test agent may be implemented independently of or in the designated set of test equipment.
In one embodiment, the method 700 may further comprise: determining, by the test agent, whether the test task includes an authorization code for the specified set of test equipment, the authorization code from the test request. The automated test may be performed in response to determining that the test task includes the authorization code.
In one embodiment, the method 700 may further include creating the test agent by: running a test agent creation program on the terminal device; initiating, at the terminal device, a registration procedure to the registry; and configuring the terminal device as the test agent in response to completion of the registration process.
The method 700 may further include: receiving, by the registry, an authorization code generation request for generating authorization codes for a set of test devices associated with the test agent; generating the authorization code through the registry; and sending the generated authorization code to the test agent through the registry.
In one embodiment, the method 700 may further comprise: receiving, by the registry, an authorization code acquisition request for acquiring an authorization code of the specified test device set; forwarding the authorization code acquisition request to the test agent through the registry; and determining, by the registry, whether to provide the authorization code based on a response received from the test agent.
In one embodiment, method 700 may further comprise visualizing the test results of the automated test. The visualizing may include presenting a video navigation interface. The video navigation interface may include: a navigation area for displaying a plurality of test cases included in the automated test, each test case being selectable; and a video area for displaying a video clip corresponding to the selected test case.
It should be appreciated that method 700 may also include any steps/processes for performing automated testing in accordance with embodiments of the present disclosure described above.
FIG. 8 illustrates an exemplary automated test system 800 in accordance with an embodiment of the present disclosure.
The automated test system 800 may include a registry 810 configured to receive a test request for performing an automated test for a target application with a specified set of test devices, generate a test task corresponding to the test request, and schedule the test task to a test agent associated with the specified set of test devices; at least one test agent 820, each test agent configured to receive test tasks and to perform automated testing with a set of test equipment specified in the received test tasks; and at least one set of test devices 830, each set of test devices being associated with one of the at least one test agent and configured to run the target application.
In one embodiment, the target application may include at least one of: single operating system applications, cross operating system applications, and Web applications.
In one embodiment, each of the at least one set of test devices may be in the form of at least one of: a single test device, a pair of test devices consisting of two test devices, and a group of test devices consisting of more than two test devices.
In one embodiment, each of the at least one test agent may be implemented independently of, or in, a set of test devices associated with the test agent.
In one embodiment, each of the at least one test agent may be further configured to: determining whether the received test task includes an authorization code corresponding to the set of specified test devices; and performing the automated test in response to determining that the received test task includes the authorization code.
In one embodiment, the registry may be further configured to: receiving a registration request from a terminal device, the registration request being triggered by a test agent creation program running on the terminal device; and registering the terminal device as a test agent in response to receiving the registration request.
The registry may be further configured to: receiving an authorization code generation request for generating authorization codes for a set of test devices associated with the registered test agent; generating the authorization code; and transmitting the generated authorization code to the registered test agent.
In one embodiment, the registry may be further configured to: receiving an authorization code acquisition request for acquiring authorization codes of the specified test equipment set; forwarding the authorization code acquisition request to the test agent; and determining whether to provide the authorization code based on a response received from the test agent.
In one embodiment, the registry may be further configured to visualize the test results of the automated test. The visualizing may include presenting a video navigation interface. The video navigation interface may include: a navigation area for displaying a plurality of test cases included in the automated test, each test case being selectable; and a video area for displaying a video clip corresponding to the selected test case.
It should be appreciated that the automated test system 800 may also include any other components configured to perform automated tests in accordance with embodiments of the present disclosure described above.
FIG. 9 illustrates an exemplary apparatus 900 for performing automated testing in accordance with an embodiment of the present disclosure.
The apparatus 900 may include: at least one processor 910; and a memory 920 storing computer-executable instructions. The computer-executable instructions, when executed, may cause the at least one processor 910 to: receiving, by the registry, a test request for performing an automated test using the specified test equipment set; generating a test task corresponding to the test request through the registry; scheduling, by the registry, the test tasks to test agents associated with the specified test equipment set; and performing, by the test agent, the automated test with the specified set of test equipment.
It should be appreciated that the processor 910 may also perform any other steps/processes for performing the methods of automated testing according to embodiments of the present disclosure described above.
Embodiments of the present disclosure propose a computer program product for performing automated testing, comprising a computer program to be executed by at least one processor for: receiving, by a registry, a test request for performing an automated test with a specified set of test devices; generating, by the registry, a test task corresponding to the test request; scheduling, by the registry, the test task to a test agent associated with the specified set of test devices; and performing, by the test agent, the automated test with the specified set of test devices. Furthermore, the computer program may also be executed to implement any other steps/processes of the methods for performing automated testing according to embodiments of the present disclosure described above.
Embodiments of the present disclosure may be embodied in a non-transitory computer readable medium. The non-transitory computer-readable medium may include instructions that, when executed, cause one or more processors to perform any operations of a method for performing automated testing according to embodiments of the present disclosure as described above.
It should be understood that all operations in the methods described above are merely exemplary, and the present disclosure is not limited to any operations in the methods or the order of such operations, but rather should cover all other equivalent variations under the same or similar concepts. In addition, the articles "a" and "an" as used in this specification and the appended claims should generally be construed to mean "one" or "one or more" unless specified otherwise or clear from context to be directed to a singular form.
It should also be understood that all of the modules in the apparatus described above may be implemented in various ways. These modules may be implemented as hardware, software, or a combination thereof. Furthermore, any of these modules may be functionally further divided into sub-modules or combined together.
Processors have been described in connection with various apparatuses and methods. These processors may be implemented using electronic hardware, computer software, or any combination thereof. Whether such processors are implemented as hardware or software will depend upon the particular application and the overall design constraints imposed on the system. As an example, a processor, any portion of a processor, or any combination of processors presented in this disclosure may be implemented with a microprocessor, microcontroller, digital signal processor (DSP), field-programmable gate array (FPGA), programmable logic device (PLD), state machine, gated logic, discrete hardware circuits, and other suitable processing components configured to perform the various functions described in this disclosure. The functions of a processor, any portion of a processor, or any combination of processors presented in this disclosure may be implemented using software that is executed by a microprocessor, microcontroller, DSP, or other suitable platform.
Software should be construed broadly to mean instructions, instruction sets, code segments, program code, programs, subprograms, software modules, applications, software packages, routines, subroutines, objects, threads of execution, procedures, functions, and the like. The software may reside in a computer-readable medium. A computer-readable medium may include, for example, memory, which may be, for example, a magnetic storage device (e.g., a hard disk, a floppy disk, a magnetic strip), an optical disk, a smart card, a flash memory device, random access memory (RAM), read-only memory (ROM), programmable ROM (PROM), erasable PROM (EPROM), electrically erasable PROM (EEPROM), a register, or a removable disk. Although memory is shown separate from the processor in various aspects presented in this disclosure, the memory may also be located internal to the processor, such as a cache or a register.
The previous description is provided to enable any person skilled in the art to practice the various aspects described herein. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects. Accordingly, the claims are not intended to be limited to the aspects shown herein. All structural and functional equivalents to the elements of the various aspects described in the disclosure that are known or later come to be known to those of ordinary skill in the art are expressly incorporated herein and are intended to be encompassed by the claims.

Claims (20)

1. A method for performing automated testing, comprising:
receiving, by a registry, a test request for performing an automated test with a specified set of test devices;
generating, by the registry, a test task corresponding to the test request;
scheduling, by the registry, the test task to a test agent associated with the specified set of test devices; and
performing, by the test agent, the automated test with the specified set of test devices.
2. The method of claim 1, wherein the automated test comprises at least one of: testing for single operating system applications, testing for cross-operating system applications, and testing for Web applications.
3. The method of claim 1, wherein the specified set of test devices is in the form of at least one of:
a single test device,
a test device pair consisting of two test devices, and
a test device group consisting of more than two test devices.
4. The method of claim 1, wherein the specified set of test devices includes more than one test device, and wherein performing the automated test includes:
cooperatively performing, by the test agent, the automated test with the more than one test device.
5. The method of claim 1, wherein the test agent is implemented independently of, or within, the specified set of test devices.
6. The method of claim 1, further comprising:
determining, by the test agent, whether the test task includes an authorization code for the specified set of test devices, the authorization code being from the test request, and
wherein the automated test is performed in response to determining that the test task includes the authorization code.
7. The method of claim 1, further comprising creating the test agent by:
running a test agent creation program on a terminal device;
initiating, at the terminal device, a registration process with the registry; and
in response to completion of the registration process, configuring the terminal device as the test agent.
8. The method of claim 7, further comprising:
receiving, by the registry, an authorization code generation request for generating an authorization code for a set of test devices associated with the test agent;
generating, by the registry, the authorization code; and
sending, by the registry, the generated authorization code to the test agent.
9. The method of claim 1, further comprising:
receiving, by the registry, an authorization code acquisition request for acquiring the authorization code of the specified set of test devices;
forwarding, by the registry, the authorization code acquisition request to the test agent; and
determining, by the registry, whether to provide the authorization code based on a response received from the test agent.
10. The method of claim 1, further comprising visualizing test results of the automated test, the test results comprising a video navigation interface, and the video navigation interface comprising:
a navigation area for displaying a plurality of test cases included in the automated test, each test case being selectable, and
a video area for displaying a video clip corresponding to the selected test case.
11. An automated testing system, comprising:
a registry configured to receive a test request for performing an automated test for a target application with a specified set of test devices, generate a test task corresponding to the test request, and schedule the test task to a test agent associated with the specified set of test devices;
at least one test agent, each test agent being configured to receive a test task and to perform an automated test with the set of test devices specified in the received test task; and
at least one set of test devices, each set of test devices being associated with one of the at least one test agent and configured to run the target application.
12. The automated testing system of claim 11, wherein the target application comprises at least one of: a single operating system application, a cross operating system application, and a Web application.
13. The automated testing system of claim 11, wherein each of the at least one set of test devices is in the form of at least one of:
a single test device,
a test device pair consisting of two test devices, and
a test device group consisting of more than two test devices.
14. The automated testing system of claim 11, wherein each of the at least one test agent is implemented independently of, or within, the set of test devices associated with the test agent.
15. The automated testing system of claim 11, wherein each of the at least one test agent is further configured to:
determine whether the received test task includes an authorization code corresponding to the specified set of test devices; and
perform the automated test in response to determining that the received test task includes the authorization code.
16. The automated testing system of claim 11, wherein the registry is further configured to:
receive a registration request from a terminal device, the registration request being triggered by a test agent creation program running on the terminal device; and
register, in response to receiving the registration request, the terminal device as a test agent.
17. The automated testing system of claim 16, wherein the registry is further configured to:
receive an authorization code generation request for generating an authorization code for a set of test devices associated with the registered test agent;
generate the authorization code; and
send the generated authorization code to the registered test agent.
18. The automated testing system of claim 11, wherein the registry is further configured to:
receive an authorization code acquisition request for acquiring the authorization code of the specified set of test devices;
forward the authorization code acquisition request to the test agent; and
determine whether to provide the authorization code based on a response received from the test agent.
19. The automated testing system of claim 11, wherein the registry is further configured to visualize test results of the automated test, the test results comprising a video navigation interface, and the video navigation interface comprising:
a navigation area for displaying a plurality of test cases included in the automated test, each test case being selectable, and
a video area for displaying a video clip corresponding to the selected test case.
20. A computer program product for performing automated testing, comprising a computer program for execution by at least one processor for:
receiving, by a registry, a test request for performing an automated test with a specified set of test devices;
generating, by the registry, a test task corresponding to the test request;
scheduling, by the registry, the test task to a test agent associated with the specified set of test devices; and
performing, by the test agent, the automated test with the specified set of test devices.