CN116974878A - Test method, storage medium and electronic device - Google Patents


Info

Publication number
CN116974878A
CN116974878A (application CN202210322186.7A)
Authority
CN
China
Prior art keywords
test
task
test task
pool
equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210322186.7A
Other languages
Chinese (zh)
Inventor
杨光 (Yang Guang)
蒋学鑫 (Jiang Xuexin)
仉亚男 (Zhang Yanan)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ZTE Corp
Original Assignee
ZTE Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ZTE Corp filed Critical ZTE Corp
Priority to CN202210322186.7A priority Critical patent/CN116974878A/en
Priority to PCT/CN2023/081741 priority patent/WO2023185482A1/en
Publication of CN116974878A publication Critical patent/CN116974878A/en
Pending legal-status Critical Current

Classifications

    • G06F11/3688 — Test management for test execution, e.g. scheduling of test suites (GPHYSICS › G06 Computing › G06F Electric digital data processing › G06F11/00 Error detection; error correction; monitoring › G06F11/36 Preventing errors by testing or debugging software › G06F11/3668 Software testing › G06F11/3672 Test management)
    • G06F11/36 — Preventing errors by testing or debugging software
    • G06F9/50 — Allocation of resources, e.g. of the central processing unit [CPU] (GPHYSICS › G06 Computing › G06F Electric digital data processing › G06F9/00 Arrangements for program control › G06F9/06 Using stored programs › G06F9/46 Multiprogramming arrangements)

Abstract

The application provides a testing method, a storage medium, and an electronic device, and relates to the field of automated testing. The testing method is realized through cooperation between a device pool and a test server, where the device pool includes a scheduler and at least one device under test corresponding to a target device type. Applied to the device pool, the testing method includes: acquiring a test task issuing request sent by the test server, where the request includes the target device type required by the test task; confirming whether a target device under test meeting the execution conditions of the test task exists in the device pool, and obtaining a confirmation result; and sending feedback information to the test server according to the confirmation result. The application addresses the problem of low execution efficiency when running multiple test tasks.

Description

Test method, storage medium and electronic device
Technical Field
The present application relates to the field of automated testing, and in particular, to a testing method, a storage medium, and an electronic device.
Background
Automated testing is a technique that replaces tedious, repetitive manual operation and inspection entirely with machines. As product equipment across industries grows more complex and quality requirements rise, testing becomes increasingly important in every stage of development and production; automated testing is therefore applied ever more widely, and improving its efficiency matters more and more.
At present, an automated test architecture mainly comprises two parts: the device under test, and a test execution monitoring machine linked to it (hereinafter, the test executor machine), which can automatically send test commands to the device under test and collect the command execution results. An automated test task generally contains multiple test commands, specifies their execution order and timing, and then judges from the collected results whether the function or system meets expectations, yielding the test result.
In the traditional automated test architecture, the number of test executor machines and the number of devices under test are often mismatched, so either test executor machines or devices under test sit idle, and multiple test tasks therefore execute inefficiently.
Disclosure of Invention
The embodiments of the present application provide a testing method, a storage medium, and an electronic device, which solve at least the problem of low execution efficiency of multiple test tasks in the related art.
In a first aspect, the present application provides a testing method applied to a device pool, where the device pool includes a scheduler and at least one device under test corresponding to a target device type, and the testing method includes:
acquiring a test task issuing request sent by a test server, where the request includes the target device type required by the test task;
confirming whether a target device under test meeting the execution conditions of the test task exists in the device pool, and obtaining a confirmation result;
and sending feedback information to the test server according to the confirmation result.
In a second aspect, the present application provides a testing method applied to a test server, including:
reading a test task from the head end of a task queue;
obtaining the target device type corresponding to the test task;
searching for a device pool corresponding to the target device type and establishing a communication link with it, where the device pool includes at least one device under test corresponding to the target device type;
sending a test task issuing request to the device pool, where the request includes the target device type required by the test task;
and acquiring feedback information sent by the device pool.
In a third aspect, the present application provides a storage medium in which a computer program is stored, wherein the computer program is arranged, when run, to perform the testing method of the first aspect or of the second aspect.
In a fourth aspect, the application provides an electronic device comprising a memory having a computer program stored therein and a processor arranged to run the computer program to perform the test method of the first aspect or to perform the test method of the second aspect.
Compared with the prior art, the technical scheme provided by the application has the following advantages. The test system comprises a device pool and a test server, where the device pool includes a scheduler and at least one device under test corresponding to a target device type. Because a test task produces essentially the same result on any of several devices of the same type, the task only needs to specify the device type; it does not need to know which specific device it runs on. Accordingly, devices of the same type are grouped into one device pool for management, with one pool per device type. The test server obtains the target device type for a test task, finds the pool corresponding to that type, establishes a communication link with it, and sends a test task issuing request to the pool. The server never issues a task to a specific device under test; it only finds the right pool and sends it the request. The scheduler inside the pool then confirms whether a target device under test meeting the task's execution conditions exists, obtains a confirmation result, and sends feedback information to the test server accordingly.
In the application, the test server sends the test task issuing request to the device pool corresponding to the target device type, and the scheduler in that pool searches for a target device under test able to execute the task. After the server issues a task to a pool, it can immediately send further issuing requests while the target device executes the first task, so each device in a pool can run a different test task at the same time. Because test tasks execute in parallel, the utilization of the devices under test rises sharply, the total execution time of multiple test tasks falls, and the problem of low execution efficiency is solved. Moreover, the scheme needs only a single test server, so its cost is low and its use is flexible.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and, together with the description, serve to explain the application; they do not limit the application. In the drawings:
FIG. 1 is a schematic diagram of a test system according to an embodiment of the present application;
FIG. 2 is a schematic diagram of the architecture of a test system in an exemplary embodiment of the application;
FIG. 3 is a flow chart of a testing method applied to a device pool in accordance with an embodiment of the present application;
FIG. 4 is a flow chart of a method of detecting whether a device under test is faulty in an exemplary embodiment of the application;
FIG. 5 is a flow chart of a testing method applied to a test server according to an embodiment of the present application;
FIG. 6 is a schematic diagram of the basic architecture of a test system in an exemplary embodiment of the application;
FIG. 7 is a schematic diagram of a specific architecture of a test system in an exemplary embodiment of the application;
FIG. 8 is a schematic diagram of the health of various devices under test in a test system in an exemplary embodiment of the application;
FIG. 9 is a schematic diagram of the basic architecture of a test system in an exemplary embodiment of the application;
FIG. 10 is a schematic diagram of a specific architecture of a test system in an exemplary embodiment of the application.
Detailed Description
The application will be described in detail hereinafter with reference to the drawings in conjunction with embodiments. It should be noted that, without conflict, the embodiments of the present application and features of the embodiments may be combined with each other.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order.
In an embodiment of the present application, as shown in fig. 1, a test system is provided. The test system includes a device pool 101 and a test server 102, and the device pool 101 includes a scheduler 1011 and at least one device under test corresponding to a target device type. Fig. 1 shows only the scheduler 1011 and two devices under test of the target device type: device under test A (reference numeral 1012) and device under test B (reference numeral 1013). Fig. 1 is only schematic and does not limit the number of devices under test in the pool, which may be 1, 3, or more as needed.
The scheduler is configured to acquire a test task issuing request sent by the test server, where the request includes the target device type required by the test task; to confirm whether a target device under test meeting the execution conditions of the test task exists in the device pool, obtaining a confirmation result; and to send feedback information to the test server according to the confirmation result.
The test server is configured to read a test task from the head end of a task queue; obtain the target device type corresponding to the test task; search for the device pool corresponding to that type and establish a communication link with it; send a test task issuing request to the device pool; and acquire the feedback information sent by the pool.
In an exemplary embodiment, a test executor is disposed in each device under test.
The scheduler is specifically configured to: when a target device under test exists in the pool, send first feedback information to the test server (instructing it to issue the test task), obtain the test task sent by the server, and forward it to the target device, whose test executor then executes it; and, when no target device under test exists in the pool, send second feedback information to the test server indicating that issuing the test task has failed.
In an exemplary embodiment, the test server includes a test task manager, a task queue, and a test task issuer.
And the test task manager is used for sending the test task to the head end of the task queue.
The test task issuer is configured to read a test task from the head end of the task queue; obtain the target device type corresponding to the test task; search for the device pool corresponding to that type and establish a communication link with it; send a test task issuing request to the device pool; and acquire the feedback information sent by the pool.
In an exemplary embodiment, as shown in fig. 2, the test system includes a device pool A, a device pool B, and a test server; the number of device pools in fig. 2 is merely illustrative. Device pool A contains a scheduler and devices under test numbered A_1 and A_2, both of device type A and each provided with a test executor. Device pool B contains a scheduler and devices under test numbered B_1 and B_2, both of device type B and each provided with a test executor. Fig. 2 is only schematic and does not limit the number of devices under test in a pool. The test server includes a test task manager, a task queue, and a test task issuer.
In the embodiments of the present application, a testing method is provided. The following embodiments mainly describe how the test system above realizes testing, presenting the method from two perspectives: the device pool and the test server. The device pool includes a scheduler and at least one device under test corresponding to the target device type.
In the embodiment of the present application, as shown in fig. 3, the flow of the testing method applied to the device pool mainly includes:
Step 301, obtaining a test task issuing request sent by the test server, where the request includes the target device type required by the test task.
Step 302, confirming whether a target device under test meeting the execution conditions of the test task exists in the device pool, and obtaining a confirmation result.
In an exemplary embodiment, determining whether a target device under test exists in the device pool and obtaining the determination result includes: polling the devices under test in the pool in a preset order; during polling, judging whether the currently polled device meets the execution conditions of the test task; if it does, stopping the polling and taking that device as the target device under test; if it does not, returning to continue polling the devices in the preset order.
The preset order may be a fixed order among the devices that was randomly generated when the pool was created, or an order among the devices set manually.
In an exemplary embodiment, the execution conditions of the test task are that the device under test is fault-free and that it is idle, i.e., not currently executing a test task.
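The polling loop above can be sketched in a few lines. This is an illustrative sketch, not the patent's code; the function name, the dict-based device records, and the device numbers A_1/A_2 (borrowed from fig. 2) are assumptions.

```python
def find_target_device(devices, can_execute):
    """Poll the devices in the preset, fixed order; return the first one
    meeting the execution conditions, or None if no device qualifies."""
    for dev in devices:          # preset order, fixed at pool creation
        if can_execute(dev):     # execution conditions: fault-free and idle
            return dev           # stop polling: this is the target device
    return None                  # no device in the pool can take the task

# Execution condition from the text: the device is healthy and not busy.
devices = [{"name": "A_1", "healthy": True, "busy": True},
           {"name": "A_2", "healthy": True, "busy": False}]
cond = lambda d: d["healthy"] and not d["busy"]
target = find_target_device(devices, cond)
```

Here A_1 is still executing a test task, so polling continues and A_2 is selected as the target device under test.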
In an exemplary embodiment, the testing method applied to the device pool further includes: each device under test executes a self-check test task at a preset interval and obtains an execution result; if the self-check test task succeeds, the device is determined to be fault-free; if it fails, the device is determined to be faulty.
In an exemplary embodiment, as shown in fig. 4, a method for detecting whether a device under test has a fault includes:
step 401, it is determined whether the set time is exceeded, and if the set time is exceeded, step 402 is executed, or if the set time is not exceeded, step 401 is executed again.
The set time is a preset duration.
Step 402, a health check is started.
Step 403, determining whether the check passes; if it passes, step 404 is executed, otherwise step 405 is executed.
Step 404, a health status is set.
Step 405, a fault state is set.
After step 404 or step 405 is performed, the process returns to step 401.
The preset time length can be an empirical value or a numerical value obtained by multiple tests. For example, the preset time period may be 24 hours.
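The fig. 4 loop (wait for the set time, run the health check, record health or fault, repeat) can be sketched as below. This is a hedged illustration, not the patent's implementation; the function names and the dict-based state record are assumptions, and a real device would run this loop indefinitely rather than for a fixed number of rounds.

```python
import time

def health_monitor(run_self_test, interval_s, state, rounds=1):
    """Periodically execute the self-check test task on the device under test
    and record its health/fault state, following the Fig. 4 flow."""
    for _ in range(rounds):
        time.sleep(interval_s)        # step 401: wait until the set time elapses
        passed = run_self_test()      # step 402/403: start and evaluate the check
        state["healthy"] = passed     # step 404/405: set health or fault state

state = {"healthy": True}
# Simulate a device whose self-check fails (interval shortened for the demo).
health_monitor(lambda: False, interval_s=0, state=state)
```

After the failed self-check, the device's state is marked faulty, so the scheduler's polling will skip it without the test server ever being involved.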
Through this timed health-check function, environmental faults are identified in advance, reducing the impact that a faulty device under test would otherwise have on test results and progress. Moreover, the device executes the self-check test task without intervention from the test server, which reduces interaction between the server and the device and further improves the efficiency of the overall test flow.
Step 303, according to the confirmation result, feedback information is sent to the test server.
In an exemplary embodiment, sending feedback information to the test server according to the confirmation result includes: when a target device under test exists in the device pool, sending first feedback information to the test server, the first feedback information instructing the server to issue the test task; or, when no target device under test exists in the pool, sending second feedback information to the test server, the second feedback information indicating that issuing the test task has failed.
After acquiring the test task issuing request, the device pool promptly returns feedback according to the confirmation result to direct the test server's next action. This avoids the situation in which the server issues a test task directly to a pool that has no device able to execute it, causing the task to fail and degrading the efficiency of the whole test flow.
In an exemplary embodiment, after the first feedback information is sent to the test server, the testing method applied to the device pool further includes: acquiring the test task sent by the test server; and sending the test task to the target device under test, which then executes it.
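The scheduler's confirm-then-feedback step can be sketched as a single function. This is an assumption-laden illustration: the function name, the dict-based pool record, the feedback message strings, and the injected `pick_device` selector (standing in for the polling step) are all hypothetical.

```python
def handle_issue_request(pool, target_type, pick_device):
    """Scheduler side: confirm whether a target device under test exists
    and build the first or second feedback message for the test server."""
    if target_type != pool["device_type"]:
        # Wrong pool for this device type: issuing fails (second feedback).
        return {"ok": False, "msg": "test task issue failed"}
    dev = pick_device(pool["devices"])   # e.g. poll in the preset order
    if dev is None:
        # No device meets the execution conditions (second feedback).
        return {"ok": False, "msg": "test task issue failed"}
    # A target device exists: tell the server to issue the task (first feedback).
    return {"ok": True, "msg": "issue the test task", "device": dev}

pool = {"device_type": "cortex-M2", "devices": ["thunderx", "upsquare"]}
fb = handle_issue_request(pool, "cortex-M2", lambda devs: devs[0])
```

On first feedback the server would then send the task body, and the scheduler would forward it to the selected device's test executor; on second feedback the server requeues the task instead.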
In the embodiment of the present application, as shown in fig. 5, a testing method flow applied to a test server mainly includes:
step 501, a test task is read from the head end of the task queue.
Step 502, obtaining a target device type corresponding to the test task.
In step 503, a device pool corresponding to the target device type is searched, and a communication link is established with the device pool, where the device pool includes at least one device under test corresponding to the target device type.
Step 504, a test task issuing request is sent to the device pool, wherein the test task issuing request includes a target device type required by the test task.
Step 505, obtaining feedback information sent by the device pool.
In an exemplary embodiment, after obtaining the feedback information sent by the device pool, the testing method applied to the test server further includes: when the feedback is the first feedback information, sending the test task to the device pool, the first feedback information instructing the server to issue the task.
Or, when the feedback is the second feedback information (indicating that issuing the test task failed), inputting the test task at the tail end of the task queue; reading a new test task from the head end of the queue; and, taking the new test task as the current test task, returning to the step of obtaining its corresponding target device type.
Thus, when the second feedback information is received, the failed task is promptly placed at the tail of the queue and a new task is read from the head, so the failed issue does not delay the issuing and testing of subsequent tasks. Multiple test tasks can therefore be processed in parallel, improving their overall test efficiency.
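The server-side requeue behavior can be sketched with a double-ended queue. This is an illustrative sketch under assumptions: the function name and the boolean `try_issue` callback (standing in for "send request, receive first/second feedback") are not from the patent.

```python
from collections import deque

def dispatch_round(queue, try_issue):
    """One scheduling round on the test server: read a task from the head
    of the task queue; on second feedback (issue failed) put it back at
    the tail so the next task can be tried without delay."""
    task = queue.popleft()        # read from the head end of the task queue
    if try_issue(task):           # first feedback: the task was issued
        return task
    queue.append(task)            # second feedback: requeue at the tail end
    return None

queue = deque(["task1", "task2"])
# Simulate a pool that currently has no free device for task1.
issued = dispatch_round(queue, lambda t: t == "task2")
```

After this round task1 sits at the tail of the queue and task2 is at the head, so the failed issue costs the server only one round trip rather than blocking the whole queue.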
In an exemplary embodiment, the test content is testing CPU-related functions of the devices under test. As shown in fig. 6, the test system includes a test server and four devices under test, each linked to the server through a serial port. The test server is an x86 server; the four devices under test are named thunderx, upsquare, tegra, and sharkl. The CPUs of thunderx and upsquare belong to the cortex-M2 device type, and the CPUs of tegra and sharkl belong to the cortex-A15 device type. As shown in fig. 7, the x86 server includes a test task manager, a task queue, and a test task issuer. The device pool cortex-M2 includes a scheduler and the devices under test thunderx and upsquare, each provided with a test executor; the device pool cortex-A15 includes a scheduler and the devices under test tegra and sharkl, each provided with a test executor.
Four test tasks are created: 1. test the computing performance of cortex-M2; 2. test the static power consumption of cortex-M2; 3. test the computing performance of cortex-A15; 4. test the static power consumption of cortex-A15. After the four tasks are created in the test task manager, they enter the task queue in order. The test task issuer reads task 1 from the head of the queue, parses it, identifies that it matches the cortex-M2 device type, and tries to send the task to the scheduler of the cortex-M2 device pool. On receiving the request, the scheduler traverses the devices in the pool in the pre-stored order (1. thunderx, 2. upsquare); at thunderx it finds a device able to execute the test, accepts the task (which is also removed from the queue), and sends it to the test executor of thunderx, which begins the automated test. After task 1 is sent successfully, the issuer reads task 2 from the queue, again identifies the cortex-M2 type, and sends the request to the cortex-M2 scheduler. Traversing, the scheduler finds thunderx already executing a test task and failing the conditions, while upsquare meets them; it therefore accepts the task and issues it to the test executor of upsquare. Similarly, test tasks 3 and 4 are assigned in order to tegra and sharkl. At this point all four devices under test are simultaneously running different test tasks, and after some time each test executor finishes its task and outputs the test result.
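The four-task walkthrough above can be replayed in a few lines. This is only a toy simulation of the described assignment, with an assumed `assign` helper; real devices would become idle again when their tasks finish.

```python
# Pools and preset device order from the embodiment (figs. 6-7).
pools = {
    "cortex-M2": ["thunderx", "upsquare"],
    "cortex-A15": ["tegra", "sharkl"],
}
busy = set()  # devices currently executing a test task

def assign(task_type):
    """Give the task to the first idle device of the matching pool,
    mirroring the scheduler's in-order traversal."""
    for dev in pools[task_type]:
        if dev not in busy:
            busy.add(dev)
            return dev
    return None  # every device of this type is occupied

# Tasks 1-4: M2 performance, M2 power, A15 performance, A15 power.
order = [assign(t) for t in
         ["cortex-M2", "cortex-M2", "cortex-A15", "cortex-A15"]]
```

Replaying the walkthrough, tasks 1 through 4 land on thunderx, upsquare, tegra, and sharkl in turn, so all four devices run in parallel exactly as the embodiment describes.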
As shown in fig. 8, during operation the device under test thunderx suffers a hang fault; after a health check with a one-hour period, its health state is set to faulty. When a new cortex-M2 test task is then created and reaches the scheduler of the cortex-M2 device pool, polling finds thunderx in the fault state and unable to execute the task, so polling continues to upsquare, which is healthy and idle and can execute it; the task is then distributed to that device for execution. Without the health-check function, the scheduler would distribute the task to the idle but faulty thunderx, blocking the task, wasting time, and reducing efficiency; worse, a test result affected by an equipment fault loses its reference value.
In an exemplary embodiment, the test content is the compatibility of an application on different mobile phone systems. As shown in fig. 9, a host PC is linked through a data line to an iPhone13 running the iOS15 system; through the network it can also reach an iPhone12 (iOS15 system) and a ZTE-S30 (MyOS11 system) linked to a slave PC, as well as a ZTE-Z40 (MyOS11 system) linked directly to the network. As shown in fig. 10, the host PC includes a test task manager, a task queue, and a test task issuer. The device pool iOS15 includes a scheduler and the devices under test iPhone12 and iPhone13, each provided with a test executor; the device pool MyOS11 includes a scheduler and the devices under test ZTE-S30 and ZTE-Z40, each provided with a test executor. This exemplary embodiment mainly shows that devices under test can still be managed with the testing method provided by the application even when their link modes differ, so the method applies to a wider range of scenarios.
In summary, compared with the prior art, the technical scheme provided by the application has the following advantages. The test system comprises a device pool and a test server, where the device pool includes a scheduler and at least one device under test corresponding to a target device type. Because a test task produces essentially the same result on any of several devices of the same type, the task only needs to specify the device type; it does not need to know which specific device it runs on. Accordingly, devices of the same type are grouped into one device pool for management, with one pool per device type. The test server obtains the target device type for a test task, finds the pool corresponding to that type, establishes a communication link with it, and sends a test task issuing request to the pool; the server never issues a task to a specific device under test. The scheduler inside the pool confirms whether a target device under test meeting the task's execution conditions exists, obtains a confirmation result, and sends feedback information to the test server accordingly.
After the server issues a task to a pool, it can immediately send further issuing requests while the target device executes the first task, so each device in a pool can run a different test task at the same time. Because test tasks execute in parallel, the utilization of the devices under test rises sharply, the total execution time of multiple test tasks falls, and the problem of low execution efficiency is solved. Moreover, the scheme needs only a single test server, so its cost is low and its use is flexible.
An embodiment of the application also provides a storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the method embodiments described above when run. In an exemplary embodiment, the storage medium may include, but is not limited to: a usb disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a removable hard disk, a magnetic disk, or an optical disk, or other various media capable of storing a computer program.
An embodiment of the application also provides an electronic device comprising a memory having stored therein a computer program and a processor arranged to run the computer program to perform the steps of any of the method embodiments described above.
It will be appreciated by those skilled in the art that the modules or steps of the present application described above may be implemented by a general-purpose computing device. They may be concentrated on a single computing device or distributed across a network of computing devices, and they may be implemented in program code executable by computing devices, so that they may be stored in a storage device and executed by computing devices. In some cases, the steps shown or described may be performed in an order different from that shown or described herein. Alternatively, the modules or steps may be separately fabricated into individual integrated circuit modules, or multiple modules or steps among them may be fabricated into a single integrated circuit module. Thus, the present application is not limited to any specific combination of hardware and software.
The above description covers only the preferred embodiments of the present application and is not intended to limit the present application; various modifications and variations may be made by those skilled in the art. Any modification, equivalent replacement, improvement, and the like made within the principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. A test method, applied to a device pool, the device pool comprising a scheduler and at least one device under test corresponding to a target device type, the method comprising:
acquiring a test task issuing request sent by a test server, wherein the test task issuing request comprises a target device type required by the test task;
confirming whether a target device under test satisfying the execution conditions of the test task exists in the device pool, and obtaining a confirmation result; and
sending feedback information to the test server according to the confirmation result.
2. The test method according to claim 1, wherein the sending of feedback information to the test server according to the confirmation result comprises:
under the condition that the target device under test exists in the device pool, sending first feedback information to the test server, wherein the first feedback information is used for indicating to issue the test task; or
under the condition that the target device under test does not exist in the device pool, sending second feedback information to the test server, wherein the second feedback information is used for indicating that issuing of the test task failed.
3. The test method according to claim 2, wherein after the sending of the first feedback information to the test server, the method further comprises:
acquiring the test task sent by the test server; and
sending the test task to the target device under test, wherein the test task is executed by the target device under test.
4. The test method according to claim 1, wherein the confirming whether a target device under test satisfying the test task execution conditions exists in the device pool, and obtaining a confirmation result, comprises:
polling the devices under test in the device pool according to a preset sequence;
in the polling process, judging whether the polled device under test satisfies the execution conditions of the test task;
under the condition that the polled device under test satisfies the execution conditions of the test task, stopping the step of polling the devices under test in the device pool according to the preset sequence, and taking the polled device under test as the target device under test; or
under the condition that the polled device under test does not satisfy the execution conditions of the test task, returning to execute the step of polling the devices under test in the device pool according to the preset sequence.
5. The test method according to any one of claims 1 to 4, wherein the test task execution conditions include that the device under test is fault-free, and that the device under test is in an idle state and is not executing a test task.
6. The test method according to claim 5, further comprising:
executing, by the device under test, a self-check test task at intervals of a preset duration to obtain an execution result;
under the condition that the execution result is that the self-check test task is executed successfully, determining that the device under test is fault-free; or
under the condition that the execution result is that the self-check test task fails to execute, determining that the device under test is faulty.
7. A test method, applied to a test server, the method comprising:
reading a test task from the head end of a task queue;
obtaining a target device type corresponding to the test task;
searching for a device pool corresponding to the target device type, and establishing a communication link with the device pool, wherein the device pool comprises at least one device under test corresponding to the target device type;
sending a test task issuing request to the device pool, wherein the test task issuing request comprises the target device type required by the test task; and
acquiring feedback information sent by the device pool.
8. The test method according to claim 7, wherein after the acquiring of the feedback information sent by the device pool, the method further comprises:
sending the test task to the device pool under the condition that the feedback information is first feedback information, wherein the first feedback information is used for indicating to issue the test task; or
inputting the test task to the tail end of the task queue under the condition that the feedback information is second feedback information, wherein the second feedback information is used for indicating that issuing of the test task failed; reading a new test task from the head end of the task queue; and taking the new test task as the test task, and returning to execute the step of obtaining the target device type corresponding to the test task.
9. A storage medium having a computer program stored therein, wherein the computer program, when run, is arranged to perform the test method of any one of claims 1 to 6 or the test method of any one of claims 7 to 8.
10. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, the processor being arranged to run the computer program to perform the test method of any of claims 1 to 6 or to perform the test method of any of claims 7 to 8.
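By way of illustration only, the execution-condition check of claims 4 to 6 — polling devices in a preset order for one that is fault-free and idle, with fault status set by a periodic self-check — can be sketched as follows. The function names and the callback style are illustrative assumptions, not part of the claims:

```python
def find_target_device(devices, is_faulty, is_busy):
    """One polling pass in the preset order (cf. claims 4-5): return the
    first device under test that is fault-free and idle, or None if the
    pass finds no target device."""
    for dev in devices:                          # preset polling order
        if not is_faulty(dev) and not is_busy(dev):
            return dev                           # stop polling: target found
    return None                                  # no target device in the pool


def self_check(run_self_test, dev):
    """Periodic self-check (cf. claim 6): a successful self-check test
    task marks the device fault-free, a failed one marks it faulty."""
    return "fault-free" if run_self_test(dev) else "faulty"
```

In this sketch the scheduler would call `find_target_device` on each issuing request, and each device would call `self_check` at the preset interval to refresh its fault status.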
CN202210322186.7A 2022-03-29 2022-03-29 Test method, storage medium and electronic device Pending CN116974878A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210322186.7A CN116974878A (en) 2022-03-29 2022-03-29 Test method, storage medium and electronic device
PCT/CN2023/081741 WO2023185482A1 (en) 2022-03-29 2023-03-15 Test method, storage medium and electronic apparatus


Publications (1)

Publication Number Publication Date
CN116974878A true CN116974878A (en) 2023-10-31

Family

ID=88199059




Legal Events

Date Code Title Description
PB01 Publication