CN112346965A - Test case distribution method, device and storage medium


Info

Publication number
CN112346965A
Authority
CN
China
Prior art keywords
test
test case
current
equipment
case
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011085824.5A
Other languages
Chinese (zh)
Other versions
CN112346965B (en)
Inventor
Gao Yan (高岩)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin May 8th Home Freight Service Co ltd
Original Assignee
Tianjin May 8th Home Freight Service Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin May 8th Home Freight Service Co ltd filed Critical Tianjin May 8th Home Freight Service Co ltd
Priority to CN202011085824.5A
Publication of CN112346965A
Application granted
Publication of CN112346965B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/36 Preventing errors by testing or debugging software
    • G06F11/3668 Software testing
    • G06F11/3672 Test management
    • G06F11/3688 Test management for test execution, e.g. scheduling of test suites
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Telephone Function (AREA)
  • Test And Diagnosis Of Digital Computers (AREA)

Abstract

The embodiments of the present application provide a test case distribution method, apparatus, and storage medium. When any current test case awaiting allocation in a test case combination matched with the current test requirement of application software is allocated, the selected scores of multiple test devices for the current test case are calculated from the test case information those devices have already executed together with their model data, and the selected scores are then considered jointly with the devices' current usage state information to choose a target test device from among them. In this way, the same test case is distributed, as far as possible, to different test devices across multiple test runs, so that test case coverage is relatively comprehensive; the situation where a test is assigned to a non-mainstream model while a mainstream model is idle is avoided as far as possible; and existing idle resources are used as fully as possible so that test case execution finishes as soon as possible, achieving reasonable test case distribution.

Description

Test case distribution method, device and storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a method and an apparatus for distributing test cases, and a storage medium.
Background
In application software testing, a plurality of test devices are typically configured, and the application's test cases are distributed among those devices for execution. In practice, the application software may face many different test requirements, in which case many test runs need to be performed on it. To improve the reliability of the application software, the same test case should be executed on as many different test devices as possible, that is, broader and more comprehensive test case coverage is better; in addition, when mainstream device models are idle, tests should preferentially run on them.
At present, test devices for executing test cases are mainly selected manually by testers. Manual allocation, however, can send the same test case to the same test device across multiple test runs, leaving test case coverage relatively incomplete, and can even assign a test case to a non-mainstream model while a mainstream model is idle. A more reasonable test case allocation scheme is therefore needed.
Disclosure of Invention
Aspects of the present disclosure provide a method, an apparatus, and a storage medium for distributing test cases, so as to improve rationality of test case distribution.
The embodiment of the application provides a test case distribution method, which comprises the following steps:
acquiring a test case combination matched with the current test requirement of the application software, wherein the test case combination comprises at least one test case;
for the current test case to be distributed in the test case combination, calculating selected scores of a plurality of test devices for the current test case according to test case information already executed by the plurality of test devices and model data of the plurality of test devices;
selecting a target test device from the plurality of test devices according to the selected scores of the plurality of test devices for the current test case and current usage state information of the plurality of test devices;
and distributing the current test case to the target test device so that the target test device can test the current test case.
An embodiment of the present application further provides a test case distribution device, including: a memory and a processor;
the memory for storing a computer program;
the processor, coupled with the memory, to execute the computer program to:
acquiring a test case combination matched with the current test requirement of the application software, wherein the test case combination comprises at least one test case;
for the current test case to be distributed in the test case combination, calculating selected scores of a plurality of test devices for the current test case according to test case information already executed by the plurality of test devices and model data of the plurality of test devices;
selecting a target test device from the plurality of test devices according to the selected scores of the plurality of test devices for the current test case and current usage state information of the plurality of test devices;
and distributing the current test case to the target test device so that the target test device can test the current test case.
Embodiments of the present application further provide a computer-readable storage medium storing a computer program, which, when executed by a processor, causes the processor to implement the steps in the test case distribution method.
According to the test case allocation method, apparatus, and storage medium provided by the embodiments of the present application, when any current test case awaiting allocation in a test case combination matched with the current test requirement of application software is allocated, the selected scores of multiple test devices for the current test case are calculated from the test case information those devices have already executed together with their model data, and the selected scores are considered jointly with the devices' current usage state information to obtain a target test device from among the multiple test devices. In this way, the same test case can be distributed, as far as possible, to different test devices across multiple test runs, so that test case coverage is more comprehensive; the situation where a test is assigned to a non-mainstream model while a mainstream model is idle can be avoided as far as possible; existing idle resources are used as fully as possible, the execution of test cases finishes as soon as possible, and reasonable test case distribution is achieved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
FIG. 1 is a flowchart of a test case distribution method according to an exemplary embodiment of the present application;
FIG. 2 is a flowchart of another test case distribution method according to an exemplary embodiment of the present application; and
FIG. 3 is a schematic structural diagram of a test case distribution device according to an exemplary embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
Fig. 1 is a flowchart of a test case distribution method according to an exemplary embodiment of the present application. The method is executed by a test case distribution device and, as shown in fig. 1, comprises the following steps:
step 101, obtaining a test case combination matched with the current test requirement of the application software, wherein the test case combination comprises at least one test case.
Step 102, for the current test case to be distributed in the test case combination, calculating selected scores of a plurality of test devices for the current test case according to test case information already executed by the plurality of test devices and model data of the plurality of test devices.
Step 103, selecting a target test device from the plurality of test devices according to the selected scores of the plurality of test devices for the current test case and current usage state information of the plurality of test devices.
Step 104, distributing the current test case to the target test device so that the target test device can test the current test case.
In step 101, for the current test requirement of the application software, a developer may use a suitable automated testing framework to write one or more test cases matched with the current test requirement of the application software, so as to form a test case combination matched with the current test requirement of the application software.
For example, when the application software is a new online package (an online package can be understood as application software released to major app stores for users to download), a large number of test cases need to be written to cover all functional points of the application software. For another example, when the application software is a new test package (a test package can be understood as application software that has not yet gone online), in order to quickly judge whether the package is usable and how well it works, a small number of test cases built around the main flow need to be written, so that a test result can be obtained quickly.
For example, the functional points of a certain application software include: 1. installation; 2. uninstallation; 3. overlay installation; 4. account login; 5. placing an order; 6. checking order details; 7. checking the user balance; 8. checking the user agreement; 9. traversing all integrated third-party services. When this application software is a new online package, test cases covering all nine functional points need to be written. When it is a test package, test cases covering the six functional points 1 through 6 need to be written.
In this embodiment, more than one automated testing framework may be available; different frameworks have different strengths, and developers select a suitable one for writing test cases according to the actual situation. Automated testing frameworks include, for example, but are not limited to: Appium and Monkey. Appium is an open-source, cross-platform automated testing framework that can be used to test native and lightweight mobile applications. The Monkey automated testing framework ships with the Android platform; Monkey is easy to use and practical, but using it well takes considerable skill, since the bare Monkey tool cannot necessarily accomplish a testing mission on its own, and in demanding tests Monkey often needs to be wrapped and adapted (the native Monkey can be modified) to meet the test requirements.
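As a rough illustration only, a test case written against Appium might look like the minimal sketch below; the server URL, capabilities, and element id are assumptions rather than values from this application, and the desired-capabilities call assumes an Appium Python client version that still supports that style (newer client versions use an options object instead).

```python
# Minimal sketch of an Appium-based test case; all concrete values are assumed.
from appium import webdriver
from appium.webdriver.common.appiumby import AppiumBy

caps = {
    "platformName": "Android",
    "automationName": "UiAutomator2",
    "deviceName": "test-device-01",    # hypothetical device identifier
    "appPackage": "com.example.app",   # hypothetical app under test
    "appActivity": ".MainActivity",
}

driver = webdriver.Remote("http://127.0.0.1:4723/wd/hub", desired_capabilities=caps)
try:
    # e.g. functional point 4 (account login): tap a hypothetical login button
    driver.find_element(AppiumBy.ID, "com.example.app:id/login").click()
finally:
    driver.quit()
```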
In this embodiment, the test case distribution device is capable of working with different automated testing frameworks, so developers can upload the test cases they have written to it. When the test stage is entered, the test case distribution device obtains the locally stored test case combination matched with the current test requirement of the application software and distributes it.
In step 102, when the test case distribution device allocates each current test case awaiting allocation in the test case combination, it first calculates the selected scores of the multiple test devices for the current test case; these selected scores serve as a reference factor for choosing a target test device from among the multiple test devices. The test case distribution device is communicatively connected to a plurality of test devices, which include, but are not limited to, mobile phones, tablet computers, desktop computers, wearable devices, in-vehicle devices, and so on, of the various models on the market.
The test case distribution device manages and maintains the executed-test-case information of each test device, which records which test cases each device has already executed. For example, after new application software goes online, it may require daily inspection testing; that is, the application software may undergo many test runs. The test case distribution device updates each device's executed-test-case information after every test run.
In addition, the test case distribution device manages and maintains the model data of the test devices connected to it; from a device's model data it can be determined whether the device is a mainstream model or a non-mainstream model. A mainstream model can be understood as a device model with a large number of users, and a non-mainstream model as one with a small number of users. Of course, the line between mainstream and non-mainstream models is drawn according to the actual situation.
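For illustration only, the per-device bookkeeping described above could be organized as in the following sketch; the field names are assumptions, not terms used in this application.

```python
# Hypothetical per-device record kept by the distribution device: which test
# cases the device has executed and how often, its model data, a user count
# derived from that model data, and its current usage state.
from dataclasses import dataclass, field

@dataclass
class DeviceRecord:
    device_id: str
    model: str                  # device model name
    user_count: int             # users of this model; proxy for "mainstream"
    busy: bool = False          # current usage state
    executed: dict[str, int] = field(default_factory=dict)  # case id -> run count

    def record_run(self, case_id: str) -> None:
        """Update the executed-test-case information after a test run."""
        self.executed[case_id] = self.executed.get(case_id, 0) + 1
```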
In this embodiment, combining the executed-test-case information and the model data of the multiple test devices allows the selected scores of those devices for the current test case to be calculated objectively. The higher a test device's selected score for the current test case, the more likely that device is to be chosen as the target test device.
Specifically, an initial score common to all test devices may be preset. For each test device, the number of times it has already executed the current test case is determined from its executed-test-case information, and points are deducted from the initial score according to that count; points are then added according to the device's model data, yielding the device's selected score for the current test case. Because more executions mean a larger deduction, devices that have never executed the case, or have executed it only a few times, end up with higher selected scores, which raises their probability of being chosen as the target device for the current test case and thereby spreads the same test case across different devices over multiple test runs. Because devices whose models have more users receive a larger bonus, mainstream models end up with higher selected scores, which raises their probability of being chosen as the target device for the current test case.
It can be understood that, when none of the test devices has yet executed the test case to be allocated, a mainstream model has a higher probability of being selected as the target test device than a non-mainstream model.
Therefore, based on the selected scores of the multiple test devices for the current test case, the distribution device can ensure, as far as possible, that the same test case is allocated to different test devices across multiple test runs, making test case coverage relatively comprehensive; in addition, the current test case is preferentially allocated to mainstream models, avoiding as far as possible the situation where it is assigned to a non-mainstream model while a mainstream model is idle.
In step 103, after calculating the selected scores of the multiple test devices for the current test case, the test case distribution device should, as far as possible, choose the device with the highest selected score as the target for the current test case. In practice, however, a device with a high selected score may be busy; if it were chosen as the target, the current test case could only run after that device is released from the busy state into the idle state, delaying the test and hurting execution efficiency. The selected scores must therefore be combined with the devices' current usage state information so that, as far as possible, a target device is chosen that both scores highly and is idle.
In step 104, after the test case allocating apparatus selects the target test device from the multiple test devices, the current test case to be allocated in the test case combination may be allocated to the target test device for testing.
According to the test case allocation method provided by this embodiment of the application, when any current test case awaiting allocation in the test case combination matched with the current test requirement of the application software is allocated, the selected scores of the multiple test devices for the current test case are calculated from the test case information those devices have already executed together with their model data, and the selected scores are considered jointly with the devices' current usage state information to obtain a target test device from among them. In this way, the same test case is allocated, as far as possible, to different test devices across multiple test runs, so that test case coverage is more comprehensive; the situation where a test is assigned to a non-mainstream model while a mainstream model is idle is avoided as far as possible; existing idle resources are used as fully as possible, test case execution finishes as soon as possible, and reasonable test case distribution is achieved.
FIG. 2 is a flowchart of another test case distribution method according to an exemplary embodiment of the present application. As shown in fig. 2, the method comprises the following steps:
step 201, obtaining a test case combination matched with the current test requirement of the application software, wherein the test case combination comprises at least one test case.
Step 202, for the current test case to be distributed in the test case combination, calculating selected scores of a plurality of test devices for the current test case according to test case information already executed by the plurality of test devices and model data of the plurality of test devices.
Step 203, selecting a target test device from the plurality of test devices according to the selected scores of the plurality of test devices for the current test case and current usage state information of the plurality of test devices.
Step 204, distributing the current test case to the target test device so that the target test device can test the current test case.
Steps 201 to 204 in this embodiment are implemented in the same manner as steps 101 to 104 in the embodiment above, respectively, and the details are not repeated here.
Step 205, determining whether the current test case is successfully tested on the target test device; if the test fails, executing step 206.
Step 206, reallocating a new target test device to the current test case.
Step 207, detecting the number of test failures of the current test case.
Step 208, if the number of test failures of the current test case is greater than a preset threshold, no longer allocating a new target test device to the current test case.
In practice, a test case may fail on the test device it was allocated to; the failure may be caused by an unreasonable test case design or by a fault in the test device itself.
In this embodiment, when the current test case fails on the target test device it was allocated to, a new target test device is reallocated to the current test case so that the case is tested again; whether the test fails on the newly allocated target device is then determined, and the number of test failures of the current test case is counted. If that number exceeds a preset threshold, no further target test device is allocated to the current test case. The threshold is set according to the actual situation and is, for example, 3.
According to the test case allocation method provided by this embodiment, after the current test case is allocated to one of the multiple test devices, it is further determined whether the case fails on that device; when it does, a new target test device is selected from the multiple test devices and the current test case is retested. If the current test case fails on several different target test devices, the failures are likely caused by an unreasonable test case design; if it succeeds on a new target test device, the earlier failure was likely caused by a fault in the previously allocated device. The cause of a test case failure can thus be located more accurately.
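A minimal sketch of this retry policy follows, assuming the example threshold of 3 given above; `pick_target` and `run_case` are hypothetical stand-ins for the selection and execution steps, passed in as callables.

```python
# Sketch of the retry policy: reallocate a failed case to fresh devices until
# it passes or the failure count reaches the (example) threshold.
from typing import Callable

MAX_FAILURES = 3  # example threshold from the text

def execute_with_retry(case_id: str, devices: list,
                       pick_target: Callable, run_case: Callable) -> bool:
    failures = 0
    tried: set[str] = set()   # ids of devices the case already failed on
    while failures < MAX_FAILURES:
        target = pick_target(case_id, devices, tried)  # hypothetical selector
        if run_case(case_id, target):                  # hypothetical executor
            # Success here suggests an earlier failure was a device fault.
            return True
        failures += 1
        tried.add(target.device_id)
    # Failing on several different devices points to the case design itself.
    return False
```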
On the basis of the foregoing embodiment, optionally, calculating the selected scores of the multiple test devices for the current test case includes: for each test device, searching the test case information already executed by that device for the current test case. If the current test case is not found, the number of users corresponding to the test device is determined from the device's model data, and a preset initial score is adjusted according to that user count to obtain the device's selected score for the current test case. If the current test case is found, the number of users corresponding to the test device is determined from the device's model data, the number of times the device has executed the current test case is acquired, and the preset initial score is adjusted according to both the user count and the execution count to obtain the device's selected score for the current test case.
In this embodiment, the preset initial score is set according to the actual situation; all test devices share the same initial score, which is, for example, 500 points.
In this embodiment, a correspondence between execution-count ranges and deduction values may be set according to the actual situation: the more executions, the larger the deduction. For example, when the execution count is 0, the deduction is 0 points; when it is 1 or 2, the deduction is 100 points; when it is 3 or more, the deduction is 200 points. When adjusting the preset initial score by the number of times a test device has executed the current test case, the corresponding deduction is determined from this correspondence and subtracted from the initial score. For example, with a deduction of 200 points and an initial score of 500 points, the adjusted score is 300 points.
In this embodiment, a correspondence between user-count ranges and bonus values may likewise be set according to the actual situation: the more users, the larger the bonus. For example, when the user count exceeds 2,000,000, the bonus is 50 points; when it is between 1,000,000 and 2,000,000, the bonus is 30 points; when it does not exceed 1,000,000, the bonus is 0 points. When adjusting the preset initial score by the number of users corresponding to a test device, the corresponding bonus is determined from this correspondence and added to the initial score. For example, with a bonus of 50 points and an initial score of 500 points, the adjusted score is 550 points.
When a test device has already executed the current test case, the corresponding bonus is determined from the device's user count and the corresponding deduction from the device's execution count, and the device's selected score for the current test case is the preset initial score plus the bonus minus the deduction. For example, with an initial score of 500 points, a bonus of 50 points, and a deduction of 200 points, the device's selected score for the current test case is 500 + 50 − 200 = 350 points.
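Putting the example values above together (initial score of 500 points; deductions of 0/100/200 points by execution count; bonuses of 50/30/0 points by user count), the selected score could be computed as in the sketch below, reusing the hypothetical DeviceRecord from earlier; the thresholds are the text's example values, not fixed parts of the method.

```python
# Sketch of the selected-score rule using the example values from the text.
INITIAL_SCORE = 500

def deduction(times_executed: int) -> int:
    # More prior executions of the case on this device -> larger deduction.
    if times_executed == 0:
        return 0
    if times_executed <= 2:
        return 100
    return 200

def bonus(user_count: int) -> int:
    # More users of this device model -> larger bonus (mainstream preference).
    if user_count > 2_000_000:
        return 50
    if user_count > 1_000_000:
        return 30
    return 0

def selected_score(device, case_id: str) -> int:
    runs = device.executed.get(case_id, 0)  # 0 when the case was never run here
    return INITIAL_SCORE + bonus(device.user_count) - deduction(runs)
```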
On the basis of the foregoing embodiment, optionally, selecting the target test device from the multiple test devices according to the selected scores and the current usage state information includes: selecting, from the multiple test devices, the device in an idle state with the highest selected score for the current test case.
For example, for the current test case, the selected scores in descending order are: test device 1 (busy), test device 3 (idle), test device 2 (busy), test device 5 (busy), test device 4 (idle). Combining the selected scores with the devices' current usage states, test device 3 should be selected as the target test device.
On the basis of the above embodiment, optionally, if all of the multiple test devices are in a busy state, the first test device among them to be released from the busy state into an idle state is taken as the target test device.
In practice it can happen that all test devices are busy; in that case the selected scores of the multiple test devices for the current test case are temporarily set aside when choosing the target. Taking the first device released from the busy state into the idle state as the target test device lets the current test case start as soon as possible, shortening test time and improving test execution efficiency.
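Combining the idle-preference rule with the busy-state fallback, target selection might look like the following sketch; the polling loop is a simple stand-in for whatever release notification the distribution device actually uses.

```python
import time

def choose_target(case_id: str, devices: list):
    """Prefer the idle device with the highest selected score for this case;
    if every device is busy, take the first one released into the idle state."""
    idle = [d for d in devices if not d.busy]
    if idle:
        return max(idle, key=lambda d: selected_score(d, case_id))
    # All devices busy: scores are temporarily set aside (see above).
    while True:
        for d in devices:
            if not d.busy:
                return d
        time.sleep(1)  # simple polling stand-in for a release notification
```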
It should be noted that the execution subjects of the steps of the methods provided in the above embodiments may all be the same device, or different devices may serve as the execution subjects. For example, the execution subject of steps 101 to 104 may be device A; for another example, the execution subject of steps 101 and 102 may be device A and that of steps 103 and 104 may be device B; and so on.
In addition, in some of the flows described in the above embodiments and drawings, multiple operations appear in a specific order, but it should be clearly understood that these operations may be executed out of the order in which they appear herein or in parallel. Sequence numbers such as 101 and 102 merely distinguish different operations and do not by themselves imply any execution order; such flows may also include more or fewer operations, which may be executed sequentially or in parallel. The terms "first", "second", and so on in this document distinguish different messages, devices, modules, etc.; they neither imply a sequential order nor require that the "first" and "second" items be of different types.
Fig. 3 is a schematic structural diagram of a test case distribution device according to an exemplary embodiment of the present application. As shown in fig. 3, the apparatus includes: a memory 11 and a processor 12.
The memory 11 is used for storing a computer program and may be configured to store other various data to support operations on the processor. Examples of such data include instructions for any application or method operating on the processor, contact data, phonebook data, messages, pictures, videos, and so forth.
The memory 11 may be implemented by any type or combination of volatile or non-volatile memory devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
A processor 12, coupled to the memory 11, for executing the computer program in the memory 11 for:
acquiring a test case combination matched with the current test requirement of the application software, wherein the test case combination comprises at least one test case;
for the current test case to be distributed in the test case combination, calculating selected scores of a plurality of test devices for the current test case according to test case information already executed by the plurality of test devices and model data of the plurality of test devices;
selecting a target test device from the plurality of test devices according to the selected scores of the plurality of test devices for the current test case and current usage state information of the plurality of test devices;
and distributing the current test case to the target test device so that the target test device can test the current test case.
Further, when calculating the selected scores of the plurality of test devices for the current test case, the processor 12 is specifically configured to:
for each test device, searching the test case information already executed by the test device for the current test case;
if the current test case is not found, determining the number of users corresponding to the test device according to the model data of the test device;
and adjusting a preset initial score according to the number of users corresponding to the test device to obtain the selected score of the test device for the current test case.
Further, the processor 12 is further configured to:
if the current test case is found, determining the number of users corresponding to the test device according to the model data of the test device;
acquiring the number of times the test device has executed the current test case;
and adjusting the preset initial score according to the number of users corresponding to the test device and the number of times the test device has executed the current test case, to obtain the selected score of the test device for the current test case.
Further, when selecting the target test device from the plurality of test devices, the processor 12 is specifically configured to:
selecting, from the plurality of test devices, a target test device that has the highest selected score and is in an idle state, according to the selected scores of the plurality of test devices for the current test case and the current usage state information.
Further, the processor 12 is further configured to:
and if the plurality of test equipment are in busy states, releasing the first test equipment in the plurality of test equipment from the busy state to the test equipment in an idle state as the target test equipment.
Further, the processor 12, after assigning the current test case to the target test device, is further configured to:
judging whether the current test case is successfully tested on the target test device;
if the test fails, reallocating a new target test device to the current test case;
detecting the number of test failures of the current test case;
and if the number of test failures of the current test case is greater than a preset threshold, allocating no new target test device to the current test case.
The apparatus shown in fig. 3 can perform the method of the above embodiment, and reference may be made to the related description of the above embodiment for a part of the embodiment that is not described in detail. The implementation process and technical effect of the technical solution refer to the description in the above embodiments, and are not described herein again.
Further, as shown in fig. 3, the apparatus may also include: a communication component 13, a display 14, a power supply component 15, an audio component 16, and the like. Only some components are shown schematically in fig. 3, which does not mean the apparatus includes only those components. The components shown by dashed boxes in fig. 3 are optional rather than mandatory, depending on the specific implementation form of the test case distribution device: if the device is implemented as a terminal device such as a notebook computer, tablet, or mobile phone, it may include the components shown by the dashed boxes; if it is implemented as a server-side device such as a conventional server, a cloud server, or a server array, it need not include them.
Accordingly, the present application further provides a computer readable storage medium storing a computer program, where the computer program is capable of implementing the steps that can be executed by a processor in the foregoing method embodiments when executed.
The communication component of fig. 3 is configured to facilitate wired or wireless communication between the device it belongs to and other devices. That device can access a wireless network based on a communication standard, such as WiFi, a 2G, 3G, 4G/LTE, or 5G mobile communication network, or a combination thereof. In an exemplary embodiment, the communication component receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component further includes a Near Field Communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
The display in fig. 3 described above includes a screen, which may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
The power supply assembly of fig. 3 described above provides power to the various components of the device in which the power supply assembly is located. The power components may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device in which the power component is located.
The audio component of fig. 3 described above may be configured to output and/or input an audio signal. For example, the audio component includes a Microphone (MIC) configured to receive an external audio signal when the device in which the audio component is located is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signal may further be stored in a memory or transmitted via a communication component. In some embodiments, the audio assembly further comprises a speaker for outputting audio signals.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read only memory (ROM), electrically erasable programmable read only memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (10)

1. A method for distributing test cases, comprising:
acquiring a test case combination matched with the current test requirement of the application software, wherein the test case combination comprises at least one test case;
for the current test case to be distributed in the test case combination, calculating selected scores of a plurality of test devices for the current test case according to test case information already executed by the plurality of test devices and model data of the plurality of test devices;
selecting a target test device from the plurality of test devices according to the selected scores of the plurality of test devices for the current test case and current usage state information of the plurality of test devices;
and distributing the current test case to the target test device so that the target test device can test the current test case.
2. The method of claim 1, wherein calculating the selected scores of the plurality of test devices for the current test case comprises:
for each test device, searching the test case information already executed by the test device for the current test case;
if the current test case is not found, determining a number of users corresponding to the test device according to the model data of the test device;
and adjusting a preset initial score according to the number of users corresponding to the test device to obtain the selected score of the test device for the current test case.
3. The method of claim 2, further comprising:
if the current test case is found, determining the number of users corresponding to the test device according to the model data of the test device;
acquiring a number of times the test device has executed the current test case;
and adjusting the preset initial score according to the number of users corresponding to the test device and the number of times the test device has executed the current test case, to obtain the selected score of the test device for the current test case.
4. The method of claim 1, wherein selecting a target test device from the plurality of test devices according to the selected scores of the plurality of test devices for the current test case and the current usage state information comprises:
selecting, from the plurality of test devices, a target test device that has the highest selected score and is in an idle state, according to the selected scores of the plurality of test devices for the current test case and the current usage state information.
5. The method of claim 4, further comprising:
and if the plurality of test equipment are in busy states, releasing the first test equipment in the plurality of test equipment from the busy state to the test equipment in an idle state as the target test equipment.
6. The method of any of claims 1 to 5, further comprising, after assigning a current test case to the target test device:
judging whether the current test case is successfully tested on the target test device;
if the test fails, reallocating a new target test device to the current test case;
detecting a number of test failures of the current test case;
and if the number of test failures of the current test case is greater than a preset threshold, allocating no new target test device to the current test case.
7. A test case assigning apparatus, comprising: a memory and a processor;
the memory for storing a computer program;
the processor, coupled with the memory, to execute the computer program to:
acquiring a test case combination matched with the current test requirement of the application software, wherein the test case combination comprises at least one test case;
for the current test case to be distributed in the test case combination, calculating selected scores of a plurality of test devices for the current test case according to test case information already executed by the plurality of test devices and model data of the plurality of test devices;
selecting a target test device from the plurality of test devices according to the selected scores of the plurality of test devices for the current test case and current usage state information of the plurality of test devices;
and distributing the current test case to the target test device so that the target test device can test the current test case.
8. The apparatus of claim 7, wherein, when calculating the selected scores of the plurality of test devices for the current test case, the processor is specifically configured to:
for each test device, searching the test case information already executed by the test device for the current test case;
if the current test case is not found, determining a number of users corresponding to the test device according to the model data of the test device;
and adjusting a preset initial score according to the number of users corresponding to the test device to obtain the selected score of the test device for the current test case.
9. The apparatus of claim 7, wherein the processor, after assigning a current test case to the target test device, is further configured to:
judging whether the current test case is successfully tested on the target test device;
if the test fails, reallocating a new target test device to the current test case;
detecting a number of test failures of the current test case;
and if the number of test failures of the current test case is greater than a preset threshold, allocating no new target test device to the current test case.
10. A computer-readable storage medium storing a computer program, which, when executed by a processor, causes the processor to carry out the steps of the test case distribution method of any one of claims 1 to 6.
CN202011085824.5A 2020-10-12 2020-10-12 Test case distribution method, device and storage medium Active CN112346965B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011085824.5A CN112346965B (en) 2020-10-12 2020-10-12 Test case distribution method, device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011085824.5A CN112346965B (en) 2020-10-12 2020-10-12 Test case distribution method, device and storage medium

Publications (2)

Publication Number Publication Date
CN112346965A (en) 2021-02-09
CN112346965B CN112346965B (en) 2024-05-17

Family

ID=74361771

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011085824.5A Active CN112346965B (en) 2020-10-12 2020-10-12 Test case distribution method, device and storage medium

Country Status (1)

Country Link
CN (1) CN112346965B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100058295A1 (en) * 2008-09-02 2010-03-04 International Business Machines Corporation Dynamic Test Coverage
CN105095063A (en) * 2014-05-12 2015-11-25 腾讯科技(深圳)有限公司 Application program testing method, apparatus and system
CN104699616A (en) * 2015-03-31 2015-06-10 北京奇虎科技有限公司 Method, device and system for testing application
CN111190810A (en) * 2019-08-26 2020-05-22 腾讯科技(深圳)有限公司 Method, device, server and storage medium for executing test task
CN110806981A (en) * 2019-11-05 2020-02-18 北京博睿宏远数据科技股份有限公司 Application program testing method, device, equipment and storage medium
CN111651358A (en) * 2020-06-05 2020-09-11 北京金山云网络技术有限公司 Method for generating test case, software testing method, device and server

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
JU, Weigang; OU, Linbao: "基于环境资源自动匹配的云测试框架研究与应用" [Research and application of a cloud test framework based on automatic matching of environment resources], 计算机应用与软件 [Computer Applications and Software], no. 01, 15 January 2018 (2018-01-15) *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112817869A (en) * 2021-02-25 2021-05-18 网易(杭州)网络有限公司 Test method, test device, test medium, and electronic apparatus
CN113778771A (en) * 2021-09-14 2021-12-10 百富计算机技术(深圳)有限公司 Terminal testing method, system and storage medium
CN113778771B (en) * 2021-09-14 2023-07-18 百富计算机技术(深圳)有限公司 Terminal testing method, system and storage medium
CN113722237A (en) * 2021-11-01 2021-11-30 荣耀终端有限公司 Device testing method and electronic device

Also Published As

Publication number Publication date
CN112346965B (en) 2024-05-17

Similar Documents

Publication Publication Date Title
CN112346965B (en) Test case distribution method, device and storage medium
EP3355187A1 (en) Loading method and device for terminal application (app)
CN109359118B (en) Data writing method and device
CN107229559B (en) Detection method and device for testing integrity of service system
CN107038120B (en) Software testing method and device
CN112463634B (en) Software testing method and device under micro-service architecture
CN107045475B (en) Test method and device
CN111897740A (en) User interface testing method and device, electronic equipment and computer readable medium
WO2020211360A1 (en) Mock test method and system, electronic device, and computer non-volatile readable storage medium
CN107391362A (en) Application testing method, mobile terminal and storage medium
CN110851204A (en) Application starting method and device and application packaging method and device
CN111158987B (en) Health check method and device for micro-service architecture
CN104809054A (en) Method and system for realizing program testing
CN114996134A (en) Containerized deployment method, electronic equipment and storage medium
CN112306857A (en) Method and apparatus for testing applications
CN109345249B (en) Payment failure processing method and device
CN109840109B (en) Method and apparatus for generating software development toolkit
US10176062B2 (en) Cloud servers and methods for handling dysfunctional cloud services
CN115373998A (en) Application program optimization method, device, equipment and medium
CN115348352A (en) Page access method and system
US20220122038A1 (en) Process Version Control for Business Process Management
CN112905449B (en) Target test method, device, equipment and storage medium
CN113760768A (en) Test method, monitoring platform, electronic equipment and storage medium
CN110297625B (en) Application processing method and device
CN112786034A (en) Voice interaction method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant