CN113722237B - Device testing method and electronic device - Google Patents



Publication number: CN113722237B (application CN202111279334.3A)
Authority: CN (China)
Prior art keywords: test, cases, targets, case, equipment
Legal status: Active (an assumption, not a legal conclusion)
Application number: CN202111279334.3A
Language: Chinese (zh)
Other versions: CN113722237A (en)
Inventor: 周伟萍 (Zhou Weiping)
Current assignee: Beijing Honor Device Co Ltd
Original assignee: Honor Device Co Ltd
Application filed by Honor Device Co Ltd (priority CN202111279334.3A).
Published as CN113722237A (application) and, after grant, CN113722237B.
Current legal status: Active.

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; error correction; monitoring
    • G06F 11/36: Preventing errors by testing or debugging software
    • G06F 11/3668: Software testing
    • G06F 11/3672: Test management

Abstract

The application provides a device testing method and an electronic device, relating to the field of terminal technologies. The device testing method comprises the following steps: a test device acquires a test case set. The test case set comprises N test cases, N is an integer greater than 1, and the number of test targets in any test case is smaller than a first threshold and greater than or equal to a second threshold. The test device then tests a plurality of test prototypes, dispatching the N test cases in descending order of the number of test targets they contain. Because the number of test targets in each test case is smaller than the first threshold, the time to output the test result corresponding to a test case is short. In addition, because the number of test targets in each test case is greater than or equal to the second threshold, the number of test cases in the set can be reduced for a fixed total number of test targets, which saves the total time lost to dispatching all the test cases.

Description

Device testing method and electronic device
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a device testing method and an electronic device.
Background
Before a terminal device leaves the factory, fuzz testing (a form of security testing) is usually performed on a test prototype of the terminal device.
At present, a test prototype may be fuzz-tested as follows. The test device compiles a massive number of test targets into a plurality of test cases and selects a set of test prototypes to be tested. The test device checks whether any unexecuted test case exists; if so, it traverses the test prototype set to find an idle test prototype. If an idle prototype exists, a test case is assigned to it, and the test device tests the prototypes to which test cases have been assigned. Meanwhile, the test device records a test log while a test prototype executes the corresponding test function based on its test case. After any test case finishes, the test log corresponding to that case is analyzed and the test result is output, so that a tester can see the result.
In general, during this test process, either the time for the test device to output the test result corresponding to any single test case is long, or, by the time all test cases have been tested, the total time lost to scheduling all the test cases is large.
Disclosure of Invention
The application provides a device testing method and an electronic device, so that the time to output the test result corresponding to a test case is short and the total time lost to dispatching all the test cases is reduced.
In a first aspect, the present application provides a device testing method, including: a test device acquires a test case set. The test case set comprises N test cases, N is an integer greater than 1, and the number of test targets in any test case is smaller than a first threshold and greater than or equal to a second threshold. The test device then tests a plurality of test prototypes, dispatching the N test cases in descending order of the number of test targets they contain.
According to this device testing method, because the number of test targets in each test case is smaller than the first threshold, the time to output the test result corresponding to a test case is short. In addition, because the number of test targets in each test case is greater than or equal to the second threshold, the number of test cases in the set can be reduced for a fixed total number of test targets, which saves the total time lost to dispatching all the test cases.
In one possible implementation, the acquiring of the test case set by the test device includes: the test device acquires a test target set, and edits the test target set into M test case groups according to M preset number dimensions, obtaining the test case set. Test cases within the same group have the same number dimension, test cases from any two different groups have different number dimensions, and M is an integer greater than 1.
In this way, because any two test case groups have different number dimensions, the later the test progresses, the smaller the number of test targets in the test cases assigned to the test prototypes. Test cases can therefore be distributed more evenly across the test prototypes, improving load balance among them.
Further, the editing of the test target set into M test case groups according to the M preset number dimensions includes: the test device extracts the nth largest number dimension K from the M number dimensions, with n initially 1. The test device judges whether L1/(L2 + b) > K, where L1 is the number of remaining test targets, L2 is the number of test prototypes, and b is an integer adjustment factor. If yes, the test device edits N1 test cases of the nth largest number dimension, where N1 = [L1 - K(L2 + b)]/K; it then increments n by 1 and returns to the extraction step until all M number dimensions have been extracted. If no, the test device increments n by 1 and returns to the extraction step until all M number dimensions have been extracted.
It can be seen that, when editing test cases, both the multiple number dimensions and the number of test prototypes are taken into account. The number of test cases of each number dimension can therefore be allocated more reasonably, optimizing the balance between a short time to output each test case's result and a small time loss for scheduling the test cases.
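The grouping steps above can be sketched as follows. This is a hypothetical reading of the patent's loop: the function name, the `b` default, and the handling of leftover targets smaller than the smallest dimension are assumptions the text does not spell out.

```python
def edit_test_cases(targets, dims, num_prototypes, b=0):
    """Group `targets` into test cases following the greedy rule above.

    dims           -- the M number dimensions (targets per case), any order
    num_prototypes -- L2, the number of test prototypes
    b              -- integer adjustment factor

    Assumption: leftover targets smaller than the smallest dimension are
    emitted as one final case.
    """
    remaining = list(targets)
    cases = []
    for k in sorted(dims, reverse=True):      # nth largest number dimension
        l1 = len(remaining)                   # L1: remaining test targets
        if l1 / (num_prototypes + b) > k:     # is L1/(L2 + b) > K?
            n1 = (l1 - k * (num_prototypes + b)) // k   # N1 = [L1 - K(L2+b)]/K
            for _ in range(n1):
                cases.append(remaining[:k])
                remaining = remaining[k:]
    if remaining:                             # assumed: flush the tail
        cases.append(remaining)
    return cases
```

For 100 targets, dimensions 27/9/3/1 and 2 prototypes, this yields one 27-target case, six 9-target cases, four 3-target cases, five 1-target cases and a 2-target tail: larger cases are created first, and a reserve of roughly K·(L2 + b) targets is left for each smaller dimension.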
Furthermore, the M number dimensions form a geometric sequence.
In this way, the number dimensions vary by more than a preset amplitude and vary uniformly, which further optimizes the balance between a short time to output each test case's result and a small scheduling time loss.
In a possible implementation manner, the editing, by the test device, of the test target set into M test case groups according to the M preset number dimensions includes: the test device randomly selects a group of values N1, N2, ..., NM satisfying

N1·K1 + N2·K2 + ... + NM·KM = Q1

and edits the test target set into M test case groups accordingly. Here K1, K2, ..., KM are the different number dimensions, Ni is the number of test cases containing Ki test targets each, and Q1 is the number of test targets in the test target set.
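One way to randomly pick such a group of values can be sketched as follows, assuming the constraint is that the counts exactly cover the Q1 targets (N1·K1 + ... + NM·KM = Q1). The rejection-sampling approach and the function name are illustrative, not from the patent.

```python
import random

def random_case_counts(dims, q1, tries=10000):
    """Randomly pick counts N1..NM with sum(Ni * Ki) == q1.

    dims -- the number dimensions K1..KM
    q1   -- Q1, total number of test targets
    """
    for _ in range(tries):
        counts = []
        rest = q1
        for k in dims[:-1]:
            counts.append(random.randint(0, rest // k))  # random count for this dimension
            rest -= counts[-1] * k
        if rest % dims[-1] == 0:          # last dimension must absorb the remainder
            counts.append(rest // dims[-1])
            return counts
    raise ValueError("no solution found")
```

If the smallest dimension is 1, the first sample always succeeds, since any remainder divides evenly.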
In one possible implementation, the acquiring of the test case set by the test device includes: the test device acquires a test target set and edits test cases based on it to obtain the test case set, where the number of test cases satisfies N = Q1/Q2, Q1 being the number of test targets in the test target set and Q2 the number of test targets in any one test case.
Therefore, the test case set can be acquired more conveniently, and computing resources are saved.
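A minimal sketch of this simpler split, assuming Q1 is an exact multiple of Q2 as the formula N = Q1/Q2 implies:

```python
def uniform_split(targets, q2):
    """Split the target set into N = Q1/Q2 cases of Q2 targets each."""
    q1 = len(targets)
    assert q1 % q2 == 0, "N = Q1/Q2 assumes Q1 is an exact multiple of Q2"
    return [targets[i:i + q2] for i in range(0, q1, q2)]
```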
In a possible implementation manner, the testing of the plurality of test prototypes in descending order of the number of test targets in the N test cases includes: the test device judges whether any test cases remain. If so, the test device traverses the prototypes to find an idle test prototype. If an idle prototype exists, the test device assigns to it the remaining test case with the largest number of test targets. The test device tests the prototypes based on the assigned test cases until no test cases remain.
In this way, dynamic allocation of test cases can be achieved.
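The dispatch loop above can be sketched as follows. The synchronous `run` callback is a simplification (a real test device would poll prototypes asynchronously), and all names are illustrative.

```python
def schedule(cases, prototypes, run):
    """Dispatch remaining cases, largest first, to idle prototypes.

    cases      -- list of test cases (lists of test targets)
    prototypes -- prototype identifiers; run(proto, case) performs the test
    Every prototype is treated as idle again once `run` returns.
    """
    pending = sorted(cases, key=len, reverse=True)  # descending target count
    order = []
    while pending:                       # any remaining test cases?
        for proto in prototypes:         # traverse for an idle prototype
            if not pending:
                break
            case = pending.pop(0)        # largest remaining case
            run(proto, case)
            order.append((proto, len(case)))
    return order
```

Because cases are popped in descending size, the small cases land late in the run, which is what lets prototypes finish at nearly the same time.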
In one possible embodiment, the first threshold satisfies a first preset formula, and the second threshold satisfies a second preset formula. [The formula images from the original publication are not reproduced.]
in a possible implementation manner, before the test device obtains the test case set, the method provided by the present application further includes: the test equipment displays a first interface. The test equipment receives a plurality of input prototypes to be tested and initial test cases on a first interface. The method for acquiring the test case set by the test equipment comprises the following steps: and the test equipment decomposes the initial test case to obtain a test target set. And the test equipment edits the test targets in the test target set to obtain the test case set.
In a second aspect, the present application further provides a device testing apparatus, including:
a processing unit, configured to acquire a test case set, where the test case set comprises N test cases, N is an integer greater than 1, and the number of test targets in any test case is smaller than a first threshold and greater than or equal to a second threshold.
The processing unit is further configured to test a plurality of test prototypes, dispatching the N test cases in descending order of the number of test targets they contain.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor and a memory, where the memory is used for storing code instructions; the processor is configured to execute the code instructions to cause the electronic device to perform the device testing method as described in the first aspect or any implementation manner of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium storing instructions that, when executed, cause a computer to perform a device testing method as described in the first aspect or any implementation manner of the first aspect.
In a fifth aspect, the present application provides a computer program product comprising a computer program that, when executed, causes a computer to perform the device testing method described in the first aspect or any implementation manner of the first aspect.
It should be understood that the second aspect to the fifth aspect of the present application correspond to the technical solutions of the first aspect of the present application, and the beneficial effects achieved by the aspects and the corresponding possible implementations are similar and will not be described again.
Drawings
Fig. 1 is a schematic diagram of a hardware system architecture of an electronic device according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram of a software system architecture of an electronic device according to an embodiment of the present application;
fig. 3 is an interaction schematic diagram of a cloud server, a test prototype, and a test terminal provided in the embodiment of the present application;
fig. 4 is a flowchart of a device testing method provided in an embodiment of the present application;
fig. 5 is an interface schematic diagram of a first interface displayed by a test terminal according to an embodiment of the present disclosure;
FIG. 6 is a flowchart of one embodiment of S403 in FIG. 4;
FIG. 7 is a flowchart of one embodiment of S404 of FIG. 4;
fig. 8 is an interface schematic diagram of a second interface displayed by the test terminal according to the embodiment of the present disclosure;
FIG. 9 is a diagram illustrating distribution of test durations of various test targets according to an embodiment of the present disclosure;
fig. 10 is a schematic diagram of idle time duration distribution during a test process of each test prototype provided in the embodiment of the present application;
fig. 11 is a schematic view of load distribution of each test prototype after the test is completed according to the embodiment of the present application;
fig. 12 is a functional block diagram of an apparatus testing device according to an embodiment of the present application;
fig. 13 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present disclosure;
fig. 14 is a schematic structural diagram of a chip according to an embodiment of the present application.
Detailed Description
In the embodiments of the present application, terms such as "first" and "second" are used to distinguish the same or similar items having substantially the same function and action. For example, the first value and the second value are only used to distinguish different values, and the order of the values is not limited. Those skilled in the art will appreciate that the terms "first," "second," etc. do not denote any order or quantity, nor do the terms "first," "second," etc. denote any order or importance.
It is noted that, in the present application, words such as "exemplary" or "for example" are used to mean exemplary, illustrative, or descriptive. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
In the present application, "at least one" means one or more, "a plurality" means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone, wherein A and B can be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of the singular or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, a-b, a-c, b-c, or a-b-c, wherein a, b, c may be single or multiple.
Before a terminal device leaves the factory, fuzz testing (a form of security testing) needs to be performed on a test prototype of the terminal device.
At present, a test prototype may be fuzz-tested as follows. Assume there are 50000 test targets. The test device edits the 50000 test targets into 5 test cases of 10000 test targets each; or, it edits them into 50000 test cases of 1 test target each. The test device judges whether the number of unexecuted test cases is greater than 0; if so, it traverses the selected test prototype set for an idle test prototype and, if one exists, assigns it a test case. The test device then sends abnormal data, based on the test targets in the test case, to the interfaces under test of the corresponding test prototype, such as the four major system components (activities, services, content providers, and broadcast receivers) and/or an application package (APK) port. After the interface under test receives the abnormal data, it processes the abnormal data. After the prototype finishes testing a test target, the test device generates a sub-result; generating the sub-result for any one test target takes an average of 10 s. The test device generates a test log for each sub-result and, after any test case finishes, analyzes that case's test log and outputs the test result corresponding to the case. Generally, traversing for an idle prototype and analyzing the log for any one test case (i.e., scheduling the test case) also takes about 10 s.
It can be understood that when the 50000 test targets are edited into 5 test cases, the test device must work through the 10000 test targets in a test case before it can output that case's test result, which may take 5.78 days; the time to output a test result is thus long. When the 50000 test targets are edited into 50000 test cases, the total time consumed scheduling all the test cases is 5.78 days, so the total scheduling cost is also long, and the total duration until all test cases have been tested is longer still (total duration = 5.78 + 5.78 = 11.56 days).
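These figures can be checked numerically. A small helper, assuming as above that executing one test target takes 10 s and scheduling one test case takes 10 s, shows how intermediate case sizes shrink the scheduling total without a days-long wait for each case's result:

```python
def totals(total_targets=50000, case_size=1, sec_per_target=10, sec_per_case=10):
    """Return (hours until one case's result, days of scheduling overhead)."""
    n_cases = total_targets // case_size
    first_result_h = case_size * sec_per_target / 3600    # one case must fully run
    scheduling_days = n_cases * sec_per_case / 86400      # per-case dispatch cost
    return first_result_h, scheduling_days
```

For example, with 1-target cases the scheduling overhead alone is about 5.79 days; with cases of 100 targets each, the first result arrives after roughly 0.28 h and total scheduling overhead drops to about 0.06 days.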
In view of this, the present application provides a device testing method in which a test device acquires a test case set comprising N test cases, N being an integer greater than 1, where the number of test targets in any test case is smaller than a first threshold and greater than or equal to a second threshold. The test device then tests a plurality of test prototypes, dispatching the N test cases in descending order of the number of test targets they contain.
Because the number of test targets in each test case is smaller than the first threshold, the time to execute all the test targets in a test case is short, and so the time to output the test result corresponding to that case is short. In addition, because the number of test targets in each test case is greater than or equal to the second threshold, the number of test cases in the set can be reduced for a fixed total number of test targets. The time lost to test case scheduling is therefore reduced, and with it the total time to test all the test cases.
It will be appreciated that the test device described above may be a terminal device and/or a server. The server may be, but is not limited to, a network server, a database server, a cloud server, and the like. The terminal device may be, but is not limited to, an office computer.
In addition, the test prototype may be a mobile phone, a smart TV, a wearable device, a tablet computer (Pad), a computer with wireless transceiving function, a virtual reality (VR) electronic device, an augmented reality (AR) electronic device, a wireless terminal in industrial control, and so on. The embodiments of the application do not limit the specific technology or device form adopted by the test prototype.
Fig. 1 is a schematic structural diagram of a test prototype provided in an embodiment of the present application.
The test prototype may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charge management module 140, a power management module 141, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a sensor module 180, buttons 190, an indicator 192, a camera 193, a display screen 194, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation to the test prototype. In other embodiments of the present application, the test prototype may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units. The different processing units may be separate devices or may be integrated into one or more processors. A memory may also be provided in processor 110 for storing instructions and data.
The software system of the electronic device may employ a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture. The embodiments of the application take an Android system with a layered architecture as an example to illustrate the software structure of an electronic device. Fig. 2 is a block diagram of the software structure of an electronic device to which the embodiments apply. The layered architecture divides the software system into several layers, each with a clear role and division of labor, and the layers communicate through software interfaces. In some embodiments, the Android system may be divided into five layers: an application layer (applications), an application framework layer (application framework), an Android runtime (Android runtime) and system library layer, a hardware abstraction layer (HAL), and a kernel layer (kernel).
The application layer may include a series of application packages, and the application layer runs the application by calling an Application Programming Interface (API) provided by the application framework layer. As shown in fig. 3, the application package may include applications such as green, 58 city, UC browser, calendar, talk, map, navigation, WLAN, bluetooth, music, video, sms, etc. It will be appreciated that the ports of each of the applications described above may be used to receive data.
The application framework layer provides an API and programming framework for the applications of the application layer. The application framework layer includes a number of predefined functions. As shown in FIG. 2, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The android runtime includes a core library and a virtual machine and is responsible for scheduling and managing the android system. The core library comprises two parts: functions that the java language needs to call, and the core library of android. The application layer and the application framework layer run in the virtual machine, which executes the java files of those layers as binary files. The virtual machine performs object life cycle management, stack management, thread management, security and exception management, garbage collection, and the like. The system library may include a plurality of functional modules.
The hardware abstraction layer can contain a plurality of library modules, and the library modules can be camera library modules, motor library modules and the like. The Android system can load corresponding library modules for the equipment hardware, and then the purpose that the application program framework layer accesses the equipment hardware is achieved.
The kernel layer is the layer between hardware and software and is used to drive the hardware. It comprises at least a display driver, a camera driver, an audio driver, a sensor driver, a motor driver, and the like, which are not limited in the embodiments of the application. Each of these drivers may be regarded as a driver node, and each driver node includes an interface that can be used to receive data.
Interpretation of terms used in this application:
Fuzz testing: a type of security test intermediate between manual penetration testing and automated testing. Fuzz testing may proceed as follows: generate abnormal data in a random or semi-random manner based on a test case; send the abnormal data to the port of the application program or driver node under test; detect the state of the application program or driver node under test; and judge, from that state, whether a potential security vulnerability exists.
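A minimal sketch of one iteration of the four steps in the fuzz-testing definition above; `send` and `check_alive` stand in for the transport to the port under test and the state probe, and both are assumed helpers, not part of the patent.

```python
import random

def fuzz_once(seed_bytes, send, check_alive):
    """One fuzz iteration: mutate, send, probe state.

    send(data)    -- delivers data to the port under test (assumed helper)
    check_alive() -- True if the target is still healthy (assumed helper)
    Returns False when a potential security vulnerability is indicated.
    """
    data = bytearray(seed_bytes)
    for _ in range(max(1, len(data) // 10)):       # semi-random mutation
        data[random.randrange(len(data))] = random.randrange(256)
    send(bytes(data))                              # step 2: deliver abnormal data
    return check_alive()                           # steps 3-4: probe and judge
```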
Test Case: a description of a testing task performed on a specific software product, embodying the test scheme, method, technique, and strategy. A test case includes a test target, test environment, input data, test steps, expected results, test scripts, and the like, finally forming a document. Note that one test case may include one or more test targets; after all test targets in a test case have been executed, the test result corresponding to that case can be output.
The device testing method provided in the embodiment of the present application is described below by taking an example in which the testing device includes the testing terminal 100 and the cloud server 200, and the example does not limit the embodiment of the present application. The following embodiments may be combined with each other and are not described in detail with respect to the same or similar concepts or processes.
It can be understood that before test prototypes of a target model are shipped, fuzz testing is performed on them to improve their quality and usage safety. The target model may be, for example, HONOR-3C or HONOR-50, which is not limited here. Typically, testing a prototype of a target model involves tens of thousands of different test targets. If all test targets were tested on one prototype, the total test duration would be long; to shorten it, a plurality of test prototypes may be prepared so that the tens of thousands of test targets can be distributed across them. Since producing test prototypes has a cost, their number is typically kept below a set threshold (typically no more than 20).
As shown in fig. 3, before the fuzz test is performed on the test prototypes 300, the tester may connect the test terminal 100 to the cloud server 200 in a communication manner, and the cloud server 200 is connected to the plurality of test prototypes 300 in a communication manner. Wherein the number of test prototypes 300 is less than 20. In addition, the plurality of test prototypes 300 have different serial numbers (in order to distinguish the identities of the test prototypes 300), and the plurality of test prototypes 300 can be plugged into a USB interface of a network device (such as a host of a plurality of computers, or an Xbox) so as to be capable of being in communication connection with the cloud server 200.
Fig. 4 is a schematic flowchart of an embodiment of a device testing method according to an embodiment of the present application. As shown in fig. 4, the device testing method provided in the embodiment of the present application may include:
s401: the test terminal 100 displays a first interface 101. Therein, the first interface 101 includes a test control 102.
Illustratively, as shown in fig. 5, the test terminal 100 may be an office computer. The tester may enter test information in the first interface 101. Wherein the test information includes: a task name "fuzzy testing", a device type "android phone", a plurality of device serial numbers, and an initial test case "com. The terminal device 100 receives a task name "fuzzy testing", a device type "android phone", a plurality of device serial numbers, and an initial test case "com.
In addition, the plurality of device serial numbers, i.e., the identifiers of the selected plurality of test prototypes 300, the initial test case "com.
S402: In response to a trigger operation on the test control 102 by a tester, the test terminal 100 may send a test instruction carrying the test information to the cloud server 200.
It is to be understood that S401-S402 described above may be omitted.
S403: The cloud server 200 acquires the test case set in response to the test instruction. The test case set comprises N test cases, where N is an integer greater than 1. In addition, the number of test targets included in any test case is smaller than the first threshold and greater than or equal to the second threshold. Illustratively, the first threshold and the second threshold may each satisfy a preset formula. [The formula images from the original publication are not reproduced.]
illustratively, the specific implementation manner of S403 may be: the cloud server 200 receives the test instruction from the test terminal 100, and parses an initial test case from the test instruction. Further, the cloud server 200 decomposes Q1 test targets in the initial test case. In this way, the cloud server 200 compiles the Q1 test targets into N test cases. Wherein, the number of the test targets in any test case is less than 3000 and greater than 1. It will be appreciated that the value 3000 is the first threshold value described above and the value 1 is the second threshold value described above. Of course, the first threshold may also be other values, for example, 4000, 2500, etc., which are not limited herein; the second threshold may be other values, for example, 20, 10, or 5, and is not limited herein.
For example, the test case may be:
<secfuzz>
Toolname:iofuzz
name:/dev/acm,/dev/dam,...,/dev/ion
Para:null
<secfuzz>
It can be understood that /dev/acm, /dev/dam, ..., /dev/ion are each test targets, and the line name:/dev/acm,/dev/dam,...,/dev/ion indicates that the driver nodes /dev/acm, /dev/dam, ..., /dev/ion in the kernel layer are tested in sequence.
In addition, for example, the test case may be:
<secfuzz>
Toolname:itentfuzz
package:com.sina.new,com.wuba,...,com.ucm
Para:null
<secfuzz>
It can be understood that com.sina.new, com.wuba, ..., com.ucm are each test targets, and the line package:com.sina.new,com.wuba,...,com.ucm indicates that Sina News, 58 Tongcheng, ..., UC Browser in the application layer are tested in sequence.
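The two examples above share a simple line-based layout. A minimal sketch of how a scheduler might parse such a block into a tool name and a list of test targets (the function name and the handling of the <secfuzz> delimiters are assumptions, not part of the patent):

```python
def parse_secfuzz_case(text: str) -> dict:
    """Parse a <secfuzz> test-case block into a dict with the tool
    name, the list of test targets, and the parameters."""
    case = {"toolname": None, "targets": [], "para": None}
    for line in text.splitlines():
        line = line.strip()
        if not line or line == "<secfuzz>":
            continue  # skip blanks and the block delimiters
        key, _, value = line.partition(":")
        key = key.lower()
        if key == "toolname":
            case["toolname"] = value
        elif key in ("name", "package"):
            # both kernel driver nodes and application packages are
            # comma-separated lists of test targets
            case["targets"] = [t for t in value.split(",") if t]
        elif key == "para":
            case["para"] = None if value == "null" else value
    return case
```

Either form of test case (driver nodes or application packages) yields the same structure, so the scheduler downstream only sees a tool name and a target list.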
In the following, it is specifically described how the cloud server 200 edits the Q1 test targets into N test cases.
For example, the cloud server 200 may preset M number dimensions of test cases, where M is an integer greater than 1. For example, the M number dimensions may include a number dimension A, a number dimension B, a number dimension C, and a number dimension D (i.e., M = 4), where the number of test targets corresponding to number dimension A > the number corresponding to number dimension B > the number corresponding to number dimension C > the number corresponding to number dimension D. For example, a test case of number dimension A may include a1 test targets, a test case of number dimension B may include a2 test targets, a test case of number dimension C may include a3 test targets, and a test case of number dimension D may include a4 test targets, where a1 > a2 > a3 > a4, and a1, a2, a3, and a4 may form a geometric sequence (i.e., the M number dimensions form a geometric sequence). In this way, the number of test targets varies between number dimension A, number dimension B, number dimension C, and number dimension D by more than a preset amplitude, and the variation is uniform. On this basis, the cloud server 200 can edit the test cases respectively corresponding to the plurality of number dimensions based on the number of test targets, the plurality of number dimensions of the test cases, and the number of test prototypes 300.
As can be seen, the cloud server 200 may edit the test targets in the test target set into M test case groups, where the number dimension corresponding to any test case in the same test case group is the same, and the number dimension corresponding to the test cases between any two test case groups is different.
For example, when a1=1000, a2=100, a3=10, and a4=1, as shown in fig. 6, the method for editing Q1 test targets into N test cases by the cloud server 200 includes:
s601: the cloud server 200 determines whether L1/(L2+ b) is greater than 1000 (i.e., the remaining maximum number dimension K), and if so, performs S602; if not, S603 is directly performed.
Where L1 is the number of remaining test targets (i.e., the number of test targets that have not yet been edited into a test case, it can be understood that the value of L1 varies with the editing of a test case, and initially L1= Q1), L2 is the number of test prototypes 300, b is an adjustment factor, and b is an integer. It will be appreciated that b may be used to adjust the total number of test cases edited. When the value of b is larger, the total number of the test cases is larger; conversely, when the value of b is smaller, the total number of test cases is smaller, and the value can be determined according to the requirements of testers.
S602: the cloud server 200 compiles N1 test cases including 1000 test targets, where N1= [ L1-1000(L2+ b) ]/1000.
S603: the cloud server 200 determines whether L1/(L2+ b) is greater than 100 (i.e., the remaining maximum number dimension K), and if so, performs S604; if not, S605 is directly performed.
S604: the cloud server 200 compiles N2 test cases including 100 test targets, where N2= [ L1-100(L2+ b) ]/100.
S605: the cloud server 200 determines whether L1/(L2+ b) is greater than 10, and if so, performs S606; if not, S607 is directly executed.
S606: the cloud server 200 compiles N3 test cases (i.e., the remaining maximum number dimension K) including 10 test targets, where N3= [ L1-10(L2+ b) ]/10.
S607: the cloud server 200 determines whether L1/(L2+ b) is greater than 1 (i.e., the remaining maximum number dimension K), and if so, performs S608; if not, the process is ended.
S608: the cloud server 200 compiles N4 test cases including 1 test target, where N4= L.
It can be seen that, through the above S601-S608, the Q1 test targets can be edited into: N1 test cases each including 1000 test targets (i.e., test cases of number dimension A), N2 test cases each including 100 test targets (number dimension B), N3 test cases each including 10 test targets (number dimension C), and N4 test cases each including 1 test target (number dimension D), where N1, N2, N3, and N4 satisfy: 1000N1 + 100N2 + 10N3 + N4 = Q1. It can be understood that when N1 = [L1 - 1000(L2+b)]/1000, N2 = [L1 - 100(L2+b)]/100, N3 = [L1 - 10(L2+b)]/10, and N4 = L1 (with L1 taken as the number of test targets remaining at each step), the number of test cases of each of number dimensions A, B, C, and D can be assigned more reasonably based on the number of test prototypes 300.
It can be understood that when the values of Q1, L2, and b differ, the numbers of test cases of number dimension A, number dimension B, number dimension C, and number dimension D edited by the cloud server 200 also differ. Illustratively, Table 1 below lists, for b = 1, the number of test cases of each number dimension for different numbers Q1 of test targets and numbers L2 of test prototypes 300.
Test targets  Test prototypes  Cases of 1000 targets  Cases of 100 targets  Cases of 10 targets  Cases of 1 target  Total cases
300           4                0                      0                     25                   50                 75
300           8                0                      0                     21                   90                 111
800           8                0                      0                     71                   90                 161
3000          8                0                      21                    81                   90                 192
8000          8                0                      59                    189                  210                458
50000         8                41                     81                    81                   90                 293
50000         20               29                     189                   189                  210                617
TABLE 1
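The decomposition of S601-S608 can be sketched in Python as follows (function and parameter names are illustrative; with b = 1 the sketch reproduces several rows of Table 1 above):

```python
def edit_test_cases(q1: int, l2: int, b: int = 1,
                    dims: tuple = (1000, 100, 10, 1)) -> dict:
    """Split q1 test targets into test cases sized by the number
    dimensions in dims (largest first), given l2 test prototypes and
    adjustment factor b, following S601-S608."""
    remaining = q1          # L1: targets not yet edited into a case
    counts = {}
    *larger, smallest = dims
    for k in larger:
        if remaining / (l2 + b) > k:
            # N = [L1 - K(L2+b)] / K cases of K targets each;
            # K*(L2+b) targets carry over to the smaller dimensions
            n = (remaining - k * (l2 + b)) // k
        else:
            n = 0
        counts[k] = n
        remaining -= n * k
    # S607-S608: the smallest dimension takes every remaining target
    # (one per case when smallest == 1), provided L1/(L2+b) > 1
    counts[smallest] = (remaining // smallest
                        if remaining / (l2 + b) > smallest else 0)
    return counts
```

Increasing b shifts targets from the larger dimensions toward the smaller ones, which is exactly the knob the tester is given for trading total case count against balance.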
It can be understood that after the test cases are edited, the cloud server 200 may obtain a test case set including N test cases.
S404: the cloud server 200 tests the plurality of test prototypes 300 based on the N test cases in descending order of the number of test targets.
Illustratively, as shown in fig. 7, the specific implementation process of S404 described above may be:
s701: the cloud server 200 determines whether the number of the remaining test cases is greater than 0, if so, executes S702, and if not, ends the process.
Wherein, the rest test cases refer to: test cases that have not been used to perform test tasks.
S702: the cloud server 200 traverses whether the idle test prototypes 300 exist, and if so, allocates a test case containing the largest number of test targets to the idle test prototypes 300.
S703: the cloud server 200 tests the prototype 300 based on the allocated test case, and returns to execute S701.
In the above S702, an idle test prototype 300 refers to a test prototype 300 that is not executing a test task. It can be understood that after one test prototype 300 has finished testing a test case, the cloud server 200 traverses to that test prototype 300 in the idle state (this traversal incurs a certain time loss). The cloud server 200 then allocates the test case containing the largest number of test targets to the idle test prototype 300, realizing real-time dynamic allocation of test cases. In addition, since the cloud server 200 always selects the test case containing the largest number of test targets from the remaining test cases, the test cases are allocated in descending order of the number of test targets.
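The loop of S701-S703 amounts to always handing the largest remaining test case to the next idle test prototype. A minimal sketch (names are illustrative; the real server traverses live devices rather than a simulated pool):

```python
import heapq

def dispatch(case_sizes, prototypes):
    """Greedy dispatch of S701-S703: whenever a test prototype becomes
    idle, give it the remaining test case with the most test targets.
    Returns, per prototype, the list of case sizes it executed."""
    # remaining cases, largest number of test targets first (S702)
    remaining = sorted(case_sizes, reverse=True)
    # (time at which the prototype becomes idle, prototype index)
    idle = [(0, p) for p in range(prototypes)]
    heapq.heapify(idle)
    assigned = [[] for _ in range(prototypes)]
    for size in remaining:                   # S701: while cases remain
        t, p = heapq.heappop(idle)           # next idle prototype
        assigned[p].append(size)
        heapq.heappush(idle, (t + size, p))  # S703: run, then idle again
    return assigned
```

With one large case and several small ones, the large case pins one prototype while the small cases flow to the others, which is the balancing behavior the embodiment relies on.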
Next, a specific implementation of the above S703 is described, taking as an example the test case whose test targets include com.sina.new, com.wuba, ..., com.ucm.
The cloud server 200 sends abnormal data (such as boundary values, character strings, memory addresses, extremely large numbers, or negative numbers) to the application port of Sina News on the test prototype 300. Sina News on the test prototype 300 processes the abnormal data and feeds a sub-result back to the cloud server 200, where the sub-result indicates processing success or processing failure. The cloud server 200 records the sub-result in an sqlite database (i.e., generates a test log). The cloud server 200 then sends abnormal data to the application port of 58 Tongcheng on the test prototype 300; 58 Tongcheng processes the abnormal data and feeds a sub-result back, which the cloud server 200 again records in the sqlite database. Likewise, the cloud server 200 sends abnormal data to the application port of UC Browser on the test prototype 300; UC Browser processes the abnormal data and feeds a sub-result back, which the cloud server 200 records in the sqlite database. In this way, the test of all test targets in the test case can be completed.
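Recording each sub-result in an sqlite database and later folding them into a per-case result can be sketched as follows (the table name and schema are assumptions, not part of the patent):

```python
import sqlite3

def record_sub_result(conn, case_name, target, success):
    """Append one sub-result (processing success or failure of one
    test target) to the test-log table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS test_log "
        "(case_name TEXT, target TEXT, success INTEGER)")
    conn.execute("INSERT INTO test_log VALUES (?, ?, ?)",
                 (case_name, target, int(success)))
    conn.commit()

def case_pass_rate(conn, case_name):
    """Parse the sub-results of one test case into a pass rate."""
    rows = conn.execute(
        "SELECT success FROM test_log WHERE case_name = ?",
        (case_name,)).fetchall()
    return sum(r[0] for r in rows) / len(rows) if rows else None
```

The pass rate computed here corresponds to the "passing rate" field later displayed on the second interface 202.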
Furthermore, the cloud server 200 may parse the sub-result corresponding to each test target from the sqlite database and generate the test result corresponding to the test case. It should be noted that generating the test result corresponding to a test case also incurs a certain time loss. It can be understood that the sum of the time loss of traversing for an idle test prototype 300 and the time loss of generating the test result corresponding to the test case can be regarded as the time loss of the cloud server 200 scheduling one test case. In the embodiment of the present application, the time loss of scheduling any one test case is 10 s.
It can be understood that, for one test case, the duration for the cloud server 200 to output the test result is equal to the sum of the test durations of all the test targets in the test case. The more test targets a test case includes, the longer the duration of outputting the test result; conversely, the fewer test targets, the shorter the duration.
S405: the cloud server 200 sends the test result corresponding to each test case to the second interface 202 of the test terminal 100 for display.
It can be understood that, on the basis of the embodiment corresponding to fig. 7, whenever the cloud server 200 finishes testing the test prototype 300 based on any test case, it sends the test result corresponding to that test case to the test terminal 100. Therefore, the tester can view the test result corresponding to the test case on the second interface 202 in time.
Illustratively, as shown in fig. 8, when the cloud server 200 has tested the test prototype 300 based on two test cases, two test results are included in the second interface 202. One test result is: case name "test case 1", state "completed", device type "android phone", device model "honor-50", pass rate "100%", execution duration "1000 min", and pass state "1". The other test result is: case name "test case 2", state "completed", device type "android phone", device model "honor-50", pass rate "0%", error cause "unknown", execution duration "998 min", and pass state "0".
In summary, since the number of test targets included in each test case is smaller than the first threshold, the duration of executing all the test targets in a test case is short, and hence the duration of subsequently outputting the corresponding test result is also short. In addition, since each test case includes a number of test targets greater than or equal to the second threshold, the number of test cases in the test case set can be reduced while the total number of test targets is constant, which reduces the time loss of test case scheduling and, in turn, the total duration of testing all the test cases. Furthermore, because the plurality of test prototypes 300 are tested in descending order of the number of test targets in the N test cases, the later the test progresses, the fewer test targets the test cases allocated to the test prototypes 300 contain. Thus, the test cases can be distributed to the test prototypes 300 more evenly, improving the load balance among the test prototypes 300.
Furthermore, because the number of test targets varies between the test cases of number dimension A, number dimension B, number dimension C, and number dimension D by more than the preset amplitude, and the variation is uniform, a balance can be struck between a short duration of outputting the test result corresponding to each test case and a small time loss of scheduling the test cases.
In addition, on the basis of the embodiment corresponding to fig. 7, the specific implementation process of S702 and S703 may be:
Assume that the plurality of test prototypes 300 includes test prototype 1, test prototype 2, test prototype 3, and test prototype 4, that the first threshold equals 3000, and that the second threshold equals 1. The test case set acquired by the cloud server 200 includes 41 test cases A each containing 1000 test targets, 81 test cases B each containing 100 test targets, 81 test cases C each containing 10 test targets, and 90 test cases D each containing 1 test target. It can be understood that the test case set includes 41+81+81+90 = 293 test cases in total, and the total number of test targets is 41×1000 + 81×100 + 81×10 + 90×1 = 50000.
Initially, test prototype 1, test prototype 2, test prototype 3, and test prototype 4 were all in an idle state. Since the number of test targets in the test case a is the largest, the cloud server 200 allocates one test case a to each of the test prototype 1, the test prototype 2, the test prototype 3, and the test prototype 4. When the test prototype 2 finishes the test based on the allocated test case a, the cloud server 200 traverses to the test prototype 2 in the idle state. Further, the cloud server 200 continues to allocate one test case a to the test prototype 2, and the process is repeated until the allocation of 41 test cases a is completed.
If the cloud server 200 then traverses to test prototype 3 in the idle state, since the test cases A have all been allocated, the test cases B now contain the largest number of test targets. The cloud server 200 therefore allocates one test case B to test prototype 3, and the process repeats until the 81 test cases B are all allocated.
If the cloud server 200 then traverses to test prototype 1 in the idle state, since the test cases B have all been allocated, the test cases C now contain the largest number of test targets. The cloud server 200 therefore allocates one test case C to test prototype 1, and the process repeats until the 81 test cases C are all allocated.
If the cloud server 200 then traverses to test prototype 4 in the idle state, since the test cases C have all been allocated, the test cases D now contain the largest number of test targets. The cloud server 200 therefore allocates one test case D to test prototype 4, and the process repeats until the 90 test cases D are all allocated.
It should be noted that the distribution of the test duration of each test target may be as shown in fig. 9. In fig. 9, the average test duration of each test target is 10 s.
It can be seen that, in the above embodiment, the number of test targets included in each of the 293 test cases in the test case set is less than the first threshold 3000. In this way, the time for the cloud server 200 to test the test prototype 300 based on any test case is short, and further the time for the cloud server 200 to send the test result corresponding to any test case to the test terminal 100 is short. Therefore, the tester can timely look up the test result corresponding to any test case.
In the above embodiment, the test case set includes 41 test cases A containing 1000 test targets, 81 test cases B containing 100 test targets, and 81 test cases C containing 10 test targets. The number of test targets in each of these test cases is greater than or equal to the second threshold 1. In this way, with the total number of test targets unchanged at 50000, the number of test cases included in the test case set can be made smaller, so that the cloud server 200 also keeps the time loss of scheduling test cases small.
In addition, as shown in (a) of fig. 10, in the prior art, if the 50000 test targets are edited into 5 test cases each including 10000 test targets, then when 4 test cases have been executed and the last test case is allocated to test prototype 1, test prototype 1 needs to spend 100000 s to test the 10000 test targets in that last test case (at an average test duration of 10 s per test target), which is inefficient. Moreover, since all 5 test cases have then been allocated, test prototype 2, test prototype 3, and test prototype 4 remain in the idle state for up to 100000 s. As shown in (a) of fig. 11, after the 5 test cases have been tested, the test duration of test prototype 1 is 3300 min, that of test prototype 2 is 1600 min, that of test prototype 3 is 1660 min, and that of test prototype 4 is 1700 min. It can be seen that the load balance among test prototype 1, test prototype 2, test prototype 3, and test prototype 4 is low, the utilization rate of the equipment resources is low, and the test efficiency is accordingly low.
In contrast, in the embodiment of the present application, as shown in (b) of fig. 10, the cloud server 200 finally allocates the 90 test cases D each including 1 test target. These test cases D can be distributed to the 4 test prototypes 300 much more evenly. When 89 test cases D have been completed and the last test case D is assigned to test prototype 1, test prototype 1 takes only 10 s to complete it (at an average test duration of 10 s per test target). Thus, test prototype 2, test prototype 3, and test prototype 4 are in the idle state for only about 10 s. As shown in (b) of fig. 11, when all 293 test cases in the test case set have been tested, the test duration of test prototype 1 is 2084 min, that of test prototype 2 is 2083 min, that of test prototype 3 is 2083 min, and that of test prototype 4 is 2082 min. It can be seen that the load balance among test prototype 1, test prototype 2, test prototype 3, and test prototype 4 is high. Therefore, the utilization rate of equipment resources is improved, and the test efficiency is further improved.
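The contrast between fig. 10 (a) and (b) can be checked with a small simulation of the greedy largest-case-first dispatch. Assuming an average of 10 s per test target and ignoring the per-case scheduling loss (names are illustrative):

```python
import heapq

def finish_times(case_sizes_s, prototypes=4):
    """Simulate greedy dispatch (largest test case first) and return
    each prototype's total busy time in seconds, sorted ascending."""
    idle = [(0, p) for p in range(prototypes)]
    heapq.heapify(idle)
    for size in sorted(case_sizes_s, reverse=True):
        t, p = heapq.heappop(idle)           # next idle prototype
        heapq.heappush(idle, (t + size, p))  # busy until t + size
    return sorted(t for t, _ in idle)

# prior art: 5 cases of 10000 targets, 10 s per target
prior = finish_times([10000 * 10] * 5)
# embodiment: 41 cases of 1000 targets, 81 of 100, 81 of 10, 90 of 1
mixed = finish_times([1000 * 10] * 41 + [100 * 10] * 81
                     + [10 * 10] * 81 + [1 * 10] * 90)
```

Under these assumptions the coarse split leaves a 100000 s spread between the busiest and the idlest prototype, while the mixed-dimension split finishes all four prototypes at 125000 s ≈ 2083 min, consistent with the near-equal durations reported for fig. 11 (b); the 10 s per-case scheduling loss accounts for the small remaining differences in the figure.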
In the embodiment of the present application, in addition to the embodiment corresponding to fig. 6, S403 may also be implemented in the following two ways:
The first mode is as follows: assume that the total number of test targets is Q1, that the cloud server 200 presets the number of test targets included in any test case to be Q2, and that the first threshold = 3000 and the second threshold = 1. The cloud server 200 may then determine the number N of test cases based on the equation N = Q1/Q2.
For example, when Q1 = 50000 and Q2 = 1000, the 50000 test targets can be edited into 50 test cases, each including the same number Q2 = 1000 of test targets. Specifically, the 50000 test targets may each be marked with an identifier indicating the execution order, for example, the Arabic numerals 1-50000. The cloud server 200 edits the test targets marked 1-1000 into 1 test case, the test targets marked 1001-2000 into 1 test case, ..., and the test targets marked 49001-50000 into 1 test case. In this way, the cloud server 200 edits the 50000 test targets into 50 test cases. Of course, Q2 may also take other values, as long as it is smaller than the first threshold and greater than or equal to the second threshold, which is not limited herein.
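The first mode reduces to fixed-size chunking of the ordered target list. A sketch (the ordinal identifiers are kept implicit in the list order):

```python
def chunk_targets(targets, q2):
    """Edit the ordered test targets into test cases of q2 targets
    each (first mode, N = Q1 / Q2)."""
    return [targets[i:i + q2] for i in range(0, len(targets), q2)]
```

When Q1 is not an exact multiple of Q2, this sketch simply puts the leftover targets into one final shorter case, a detail the patent text does not specify.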
It can be understood that, since the number 1000 of test targets in each test case is smaller than the first threshold, the duration of subsequently outputting the test result corresponding to each test case is short. In addition, since 1000 is greater than the second threshold, the number of test cases in the test case set can be reduced while the total number of test targets is constant, which reduces the time loss of test case scheduling.
The second mode is as follows: as in the embodiment corresponding to fig. 6, N1, N2, and N3 may be determined by the cloud server 200 based on a plurality of preset number dimensions of test cases and the number of test prototypes 300. Unlike the embodiment corresponding to fig. 6, in the second mode the values of N1, N2, and N3 are randomly selected from a plurality of sets of values satisfying 2000N1 + 1000N2 + 500N3 = Q1.
Illustratively, still assuming that the total number of test targets is Q1, the first threshold = 3000, and the second threshold = 1, the cloud server 200 may preset M number dimensions of test cases, where M is an integer greater than 1. For example, the M number dimensions include a number dimension A, a number dimension B, and a number dimension C (i.e., M = 3). A test case of number dimension A includes 2000 test targets, a test case of number dimension B includes 1000 test targets, and a test case of number dimension C includes 500 test targets. The cloud server 200 may edit N1 test cases of number dimension A, N2 of number dimension B, and N3 of number dimension C, where N1, N2, and N3 satisfy: 2000N1 + 1000N2 + 500N3 = Q1. As can be seen, the cloud server 200 may edit the test targets in the test target set into M test case groups, where the number dimension corresponding to any test case in the same test case group is the same, and the number dimensions corresponding to the test cases of any two test case groups are different.
It is understood that N1, N2, and N3 have multiple sets of values. The cloud server 200 may randomly select one set of values from the plurality of sets and edit the test cases accordingly. For example, when Q1 = 50000, a randomly selected set of values may be N1 = 10, N2 = 20, N3 = 20, so that the number of test cases in the test case set is N = N1+N2+N3 = 50; as another example, a randomly selected set may be N1 = 15, N2 = 10, N3 = 20, so that N = N1+N2+N3 = 45. Of course, N1, N2, and N3 may take other values as long as 2000N1 + 1000N2 + 500N3 = 50000 is satisfied. It can be understood that the more test cases of number dimension A there are, the smaller the number N of test cases in the test case set; conversely, the fewer test cases of number dimension A, the larger N.
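Randomly picking a feasible (N1, N2, N3) can be sketched by enumerating the non-negative solutions of 2000·N1 + 1000·N2 + 500·N3 = Q1 and choosing one (a brute-force sketch; a real implementation might sample a solution directly):

```python
import random

def pick_dimension_counts(q1, dims=(2000, 1000, 500), seed=None):
    """Enumerate all non-negative (N1, N2, N3) satisfying
    dims[0]*N1 + dims[1]*N2 + dims[2]*N3 == q1, then pick one
    at random (second mode)."""
    k1, k2, k3 = dims
    solutions = []
    for n1 in range(q1 // k1 + 1):
        for n2 in range((q1 - k1 * n1) // k2 + 1):
            rest = q1 - k1 * n1 - k2 * n2
            if rest % k3 == 0:
                solutions.append((n1, n2, rest // k3))
    return random.Random(seed).choice(solutions)
```

Any returned tuple satisfies the constraint exactly, so the total number of test targets is preserved no matter which set of values the random draw selects.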
It can be seen that the scheme of the second mode can be summarized as follows: from the sets of values (N1, N2, ..., NM) satisfying K1·N1 + K2·N2 + ... + KM·NM = Q1, randomly select one set, and edit the test targets in the test target set into M test case groups accordingly. Here, K1, K2, ..., KM are the different number dimensions, N1 is the number of test cases each containing K1 test targets, N2 is the number of test cases each containing K2 test targets, ..., NM is the number of test cases each containing KM test targets, and Q1 is the number of test targets in the test target set.
Because the number of test targets in the test cases of number dimension A, number dimension B, and number dimension C is smaller than the first threshold, the duration of subsequently outputting the test results corresponding to these test cases is short. In addition, because the number of test targets in these test cases is greater than or equal to the second threshold, the number of test cases in the test case set can be reduced while the total number of test targets is constant, which reduces the time loss of test case scheduling. Furthermore, since the number of test targets in a test case of number dimension A > that of number dimension B > that of number dimension C, the later the test progresses, the fewer test targets the test cases allocated to the plurality of test prototypes 300 contain. Thus, the test cases can be distributed to the test prototypes 300 more evenly, improving the load balance among the test prototypes 300.
It should be noted that, in the second mode, the number of test targets corresponding to number dimension A, number dimension B, and number dimension C may also take other values. For example, a test case of number dimension A may include 1000 test targets, a test case of number dimension B may include 100 test targets, and a test case of number dimension C may include 10 test targets, which is not limited herein.
In addition, in the above embodiment, it is described that the test apparatus includes the test terminal 100 and the cloud server 200 as an example. In this embodiment, the test device may also be an independent test terminal 100 or an independent cloud server 200. When the test device is a stand-alone test terminal 100, the test terminal 100 may perform the above-described S401 to S405. When the test device is the independent cloud server 200, the cloud server 200 may perform S403 to S405 described above.
Referring to fig. 12, the present application further provides an apparatus testing device 1200, including:
the processing unit 1201 is configured to obtain a test case set. The test case set comprises N test cases, N is an integer greater than 1, and the number of test targets in any test case is smaller than a first threshold and greater than or equal to a second threshold.
The processing unit 1201 is further configured to test a plurality of test prototypes according to a descending order of the number of test targets in the N test cases.
In a possible implementation, the processing unit 1201 is specifically configured to edit the test target set into M test case groups according to M preset number dimensions, so as to obtain the test case set.
The number dimensions corresponding to the test cases in the same test case group are the same, the number dimensions corresponding to the test cases of any two test case groups are different, and M is an integer greater than 1.
Further, the processing unit 1201 is specifically configured to: extract the nth largest number dimension K from the M number dimensions, where the initial value of n is 1; judge whether L1/(L2+b) is greater than K, where L1 is the number of remaining test targets, L2 is the number of test prototypes, b is an adjustment factor, and b is an integer; if so, edit N1 test cases corresponding to the nth largest number dimension, where N1 = [L1 - K(L2+b)]/K, then add 1 to the value of n and return to the step of extracting the nth largest number dimension until the M number dimensions have all been extracted; if not, add 1 to the value of n and return to the step of extracting the nth largest number dimension until the M number dimensions have all been extracted.
Furthermore, the M number dimensions form a geometric sequence.
In a possible implementation, the processing unit 1201 is specifically configured to randomly select a set of values (N1, N2, ..., NM) satisfying K1·N1 + K2·N2 + ... + KM·NM = Q1, and edit the test target set into M test case groups accordingly, where K1, K2, ..., KM are the different number dimensions, N1 is the number of test cases each containing K1 test targets, N2 is the number of test cases each containing K2 test targets, ..., NM is the number of test cases each containing KM test targets, and Q1 is the number of test targets in the test target set.
In a possible implementation, the processing unit 1201 is specifically configured to acquire a test target set and edit test cases based on the test target set to obtain the test case set, where the number of test cases satisfies N = Q1/Q2, Q1 being the number of test targets in the test target set, Q2 being the number of test targets in any test case, and N being the number of test cases.
In a possible implementation, the processing unit 1201 is specifically configured to: determine whether there are remaining test cases; if there are remaining test cases, traverse the test prototypes to find one in the idle state; if an idle test prototype exists, allocate, from the remaining test cases, the test case containing the largest number of test targets to the idle test prototype; and test the test prototype based on the allocated test case, until no test cases remain.
In one possible embodiment, the first threshold and the second threshold each satisfy a preset formula. [Formula images not reproduced in this text.]
in addition, the device testing apparatus 1200 provided by the present application may further include:
the display unit 1202 is configured to display a first interface;
the display unit 1202 is further configured to receive, on the first interface, a plurality of input prototypes to be tested and an initial test case.
The processing unit 1201 is specifically configured to decompose the initial test case to obtain a test target set. And editing the test targets in the test target set to obtain the test case set.
Fig. 13 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present disclosure, and as shown in fig. 13, the electronic device includes a processor 1301, a communication line 1304, and at least one communication interface (an example of the communication interface 1303 in fig. 13 is described as an example).
The processor 1301 may be a central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of programs of the solutions of the present application.
The communication lines 1304 may include circuitry to communicate information between the above-described components.
The communication interface 1303 may be any apparatus such as a transceiver, and is used to communicate with other devices or communication networks, such as Ethernet or a wireless local area network (WLAN).
Possibly, the electronic device may further comprise a memory 1302.
The memory 1302 may be, but is not limited to, a read-only memory (ROM) or another type of static storage device that can store static information and instructions, a random access memory (RAM) or another type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including a compact disc, a laser disc, an optical disc, a digital versatile disc, a Blu-ray disc, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory may be separate and coupled to the processor via the communication line 1304. The memory may also be integrated with the processor.
The memory 1302 is configured to store computer-executable instructions for executing the solutions of the present application, and execution is controlled by the processor 1301. The processor 1301 is configured to execute the computer-executable instructions stored in the memory 1302, so as to implement the device testing method provided by the embodiments of the present application.
Possibly, the computer-executable instructions in the embodiments of the present application may also be referred to as application program code; this is not specifically limited in the embodiments of the present application.
In a specific implementation, as an embodiment, the processor 1301 may include one or more CPUs, such as CPU0 and CPU1 in fig. 13.
In particular implementations, an electronic device may include multiple processors, such as processor 1301 and processor 1305 in fig. 13, for example. Each of these processors may be a single-core (single-CPU) processor or a multi-core (multi-CPU) processor. A processor herein may refer to one or more devices, circuits, and/or processing cores for processing data (e.g., computer program instructions).
Exemplarily, fig. 14 is a schematic structural diagram of a chip provided in an embodiment of the present application. Chip 140 includes one or more (including two) processors 1410 and a communication interface 1430.
In some embodiments, memory 1440 stores the following elements: an executable module or a data structure, or a subset thereof, or an expanded set thereof.
In an embodiment of the present application, the memory 1440 may include a read-only memory and a random access memory, and provides instructions and data to the processor 1410. A portion of the memory 1440 may also include non-volatile random access memory (NVRAM).
In the illustrated embodiment, the processor 1410, the communication interface 1430, and the memory 1440 are coupled together via the bus system 1420. The bus system 1420 may include a power bus, a control bus, a status signal bus, and the like, in addition to the data bus. For ease of description, the various buses are labeled in FIG. 14 as the bus system 1420.
The method described in the embodiments of the present application may be applied to the processor 1410 or implemented by the processor 1410. The processor 1410 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be completed by integrated logic circuits of hardware or by instructions in the form of software in the processor 1410. The processor 1410 may be a general-purpose processor (e.g., a microprocessor or a conventional processor), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate, a transistor logic device, or a discrete hardware component, and the processor 1410 may implement or execute the methods, steps, and logic blocks disclosed in the embodiments of the present application.
The steps of the methods disclosed in connection with the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium mature in the field, such as a random access memory, a read-only memory, a programmable read-only memory, or an electrically erasable programmable read-only memory (EEPROM). The storage medium is located in the memory 1440, and the processor 1410 reads the information in the memory 1440 and completes the steps of the above method in combination with its hardware.
In the above embodiments, the instructions stored by the memory for execution by the processor may be implemented in the form of a computer program product. The computer program product may be written in the memory in advance, or may be downloaded in the form of software and installed in the memory.
The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of the present application are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another, for example from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, or digital subscriber line (DSL)) or wirelessly (e.g., infrared, radio, or microwave). The computer-readable storage medium may be any available medium that a computer can store, or a data storage device, such as a server or data center, integrating one or more available media. The available media may include magnetic media (e.g., a floppy disk, a hard disk, or a magnetic tape), optical media (e.g., a digital versatile disc (DVD)), or semiconductor media (e.g., a solid state disk (SSD)).
The embodiment of the application also provides a computer readable storage medium. The methods described in the above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. Computer-readable media may include computer storage media and communication media, and may include any medium that can communicate a computer program from one place to another. A storage medium may be any target medium that can be accessed by a computer.
As one possible design, the computer-readable medium may include a compact disc read-only memory (CD-ROM), RAM, ROM, EEPROM, or other optical disc storage; the computer-readable medium may include magnetic disk memory or another magnetic disk storage device. Also, any connecting line may properly be termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber-optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber-optic cable, twisted pair, DSL, or those wireless technologies are included in the definition of medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.
Combinations of the above should also be included within the scope of computer-readable media. The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (7)

1. A method for testing a device, the method comprising:
the method comprises the steps that a test device obtains a test case set, wherein the test case set comprises N test cases, N is an integer larger than 1, and the number of test targets in any test case is smaller than a first threshold and larger than or equal to a second threshold;
the method for acquiring the test case set by the test equipment comprises the following steps:
the test equipment acquires a test target set;
the test equipment edits the test target set into M test case groups according to M preset quantity dimensions to obtain the test case set, wherein the quantity dimension corresponding to the test cases in the same test case group is the same, the quantity dimensions corresponding to the test cases in any two different test case groups are different, and M is an integer larger than 1;
the test equipment editing the test target set into M test case groups according to the M preset quantity dimensions comprises:
the test equipment extracts the nth largest quantity dimension from the M quantity dimensions, wherein the initial value of n is 1;
the test equipment determines whether L1/(L2+b) is larger than the nth largest quantity dimension, wherein L1 is the number of remaining test targets, L2 is the number of test prototypes, b is an adjustment factor, and b is an integer;
if so, the test equipment edits N1 test cases corresponding to the nth largest quantity dimension, wherein N1 = [L1 - K(L2+b)]/K, and K is the nth largest quantity dimension;
the test equipment adds 1 to the value of n, and returns to the step of extracting the nth largest quantity dimension from the M quantity dimensions until all M quantity dimensions are extracted;
if not, the test equipment adds 1 to the value of n, and returns to the step of extracting the nth largest quantity dimension from the M quantity dimensions until all M quantity dimensions are extracted;
and the test equipment tests a plurality of test prototypes in descending order of the number of test targets in the N test cases.
2. The method of claim 1, wherein the M quantity dimensions form a geometric progression.
3. The method according to claim 1, wherein the test equipment testing the plurality of test prototypes in descending order of the number of test targets in the N test cases comprises:
the test equipment determines whether there are remaining test cases;
if there are remaining test cases, the test equipment traverses the test prototypes to determine whether an idle test prototype exists;
if an idle test prototype exists, the test equipment allocates, to the idle test prototype, the test case with the largest number of test targets among the remaining test cases;
and the test equipment tests the test prototypes based on the allocated test cases until no test cases remain.
4. The method according to any one of claims 1 to 3,
the first threshold satisfies: 1000 ≥ the first threshold ≥ 500;
the second threshold satisfies: 10 ≥ the second threshold ≥ 1.
5. The method according to any one of claims 1 to 3, wherein before the test device obtains the set of test cases, the method further comprises:
the test equipment displays a first interface;
the test equipment receives a plurality of input prototypes to be tested and initial test cases on the first interface;
the method for acquiring the test case set by the test equipment comprises the following steps:
the test equipment decomposes the initial test case to obtain a test target set;
and the test equipment edits the test targets in the test target set to obtain the test case set.
6. An electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, causes the electronic device to perform the method of any of claims 1 to 5.
7. A computer-readable storage medium, in which a computer program is stored which, when executed by a processor, causes a computer to carry out the method according to any one of claims 1 to 5.
CN202111279334.3A 2021-11-01 2021-11-01 Device testing method and electronic device Active CN113722237B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111279334.3A CN113722237B (en) 2021-11-01 2021-11-01 Device testing method and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111279334.3A CN113722237B (en) 2021-11-01 2021-11-01 Device testing method and electronic device

Publications (2)

Publication Number Publication Date
CN113722237A CN113722237A (en) 2021-11-30
CN113722237B true CN113722237B (en) 2022-02-08

Family

ID=78686212

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111279334.3A Active CN113722237B (en) 2021-11-01 2021-11-01 Device testing method and electronic device

Country Status (1)

Country Link
CN (1) CN113722237B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018036167A1 (en) * 2016-08-22 2018-03-01 平安科技(深圳)有限公司 Test task executor assignment method, device, server and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103298016B (en) * 2012-02-27 2016-12-14 展讯通信(上海)有限公司 the test system of mobile terminal
US9317410B2 (en) * 2013-03-15 2016-04-19 International Business Machines Corporation Testing functional correctness and idempotence of software automation scripts
US9482683B2 (en) * 2014-04-22 2016-11-01 Wipro Limited System and method for sequential testing across multiple devices
CN109359031B (en) * 2018-09-04 2023-08-22 中国平安人寿保险股份有限公司 Multi-device application program testing method and device, server and storage medium
CN112346965A (en) * 2020-10-12 2021-02-09 天津五八到家货运服务有限公司 Test case distribution method, device and storage medium

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018036167A1 (en) * 2016-08-22 2018-03-01 平安科技(深圳)有限公司 Test task executor assignment method, device, server and storage medium

Also Published As

Publication number Publication date
CN113722237A (en) 2021-11-30

Similar Documents

Publication Publication Date Title
US20140310050A1 (en) Methods And Apparatus For Project Portfolio Management
WO2019019975A1 (en) Method and device for cloud platform performance testing
CN112148610A (en) Test case execution method and device, computer equipment and storage medium
CN111026634A (en) Interface automation test system, method, device and storage medium
CN111381940B (en) Distributed data processing method and device
CN114531477A (en) Method and device for configuring functional components, computer equipment and storage medium
JP2007257588A (en) Verification system
CN111221721B (en) Automatic recording and executing method and device for unit test cases
CN112559525B (en) Data checking system, method, device and server
CN104598409A (en) Method and device for processing input and output requests
CN113722237B (en) Device testing method and electronic device
CN112561690A (en) Method, system, equipment and storage medium for testing credit card staging service interface
US20230409468A1 (en) Providing application error data for use by third-party library development systems
CN115525561A (en) Protocol interface testing method, device, terminal equipment and storage medium
CN115237889A (en) Database switching method and device, storage medium and computer equipment
CN115048107A (en) Code compiling method, system, electronic device and storage medium
CN113419957A (en) Rule-based big data offline batch processing performance capacity scanning method and device
CN113392010A (en) Common component testing method and device, electronic equipment and storage medium
CN113641628A (en) Data quality detection method, device, equipment and storage medium
CN111160403A (en) Method and device for multiplexing and discovering API (application program interface)
CN111562982B (en) Method and device for processing request data, computer readable storage medium and electronic equipment
CN112035425B (en) Log storage method and device and computer system
CN113254328B (en) White box testing method, system, mobile terminal and storage medium
CN111324542B (en) Web application regression test case selection system, method and equipment
CN110309038B (en) Performance test method and device, electronic equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20220607

Address after: 100095 floors 2-14, building 3, yard 5, honeysuckle Road, Haidian District, Beijing

Patentee after: Beijing Honor Device Co.,Ltd.

Address before: Unit 3401, unit a, building 6, Shenye Zhongcheng, No. 8089, Hongli West Road, Donghai community, Xiangmihu street, Futian District, Shenzhen, Guangdong 518040

Patentee before: Honor Device Co.,Ltd.
