CN112540919A - Test equipment determination method and device - Google Patents

Test equipment determination method and device

Info

Publication number
CN112540919A
CN112540919A (application number CN202011422468.1A; granted publication CN112540919B)
Authority
CN
China
Prior art keywords
target
equipment
test
tested
model
Prior art date
Legal status
Granted
Application number
CN202011422468.1A
Other languages
Chinese (zh)
Other versions
CN112540919B (en)
Inventor
Li Jiong (李炯)
Current Assignee
Shanghai Bilibili Technology Co Ltd
Original Assignee
Shanghai Bilibili Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Bilibili Technology Co Ltd filed Critical Shanghai Bilibili Technology Co Ltd
Priority to CN202011422468.1A priority Critical patent/CN112540919B/en
Publication of CN112540919A publication Critical patent/CN112540919A/en
Application granted granted Critical
Publication of CN112540919B publication Critical patent/CN112540919B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/36: Preventing errors by testing or debugging software
    • G06F 11/3668: Software testing
    • G06F 11/3672: Test management
    • G06F 11/3676: Test management for coverage analysis
    • G06F 11/3684: Test management for test design, e.g. generating new test cases
    • G06F 11/3688: Test management for test execution, e.g. scheduling of test suites

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The present specification provides a test equipment determination method and apparatus. The method includes: when a test task for an object to be tested is detected, determining whether the object to be tested is an online object; when the object to be tested is an online object, acquiring the target devices corresponding to the object to be tested; when the object to be tested is an offline object, acquiring the target devices corresponding to the category to which the object to be tested belongs; and counting the target devices according to a statistical indicator, and determining the target test devices for the object to be tested according to the statistical result. In this way, operation data can be fully used to obtain, for the object to be tested, target devices with high user coverage; the acquired target devices related to the object to be tested are then counted and sorted, so that the test devices for the object to be tested are determined accurately.

Description

Test equipment determination method and device
Technical Field
This specification relates to the field of computer technology, and in particular to a test equipment determination method. This specification also relates to a test equipment determination apparatus, a computing device, and a computer-readable storage medium.
Background
With the rapid development of computer and network technology, a wide variety of applications have emerged and become indispensable tools in most people's work and daily life, such as communication applications, document applications, and games. Before such an application goes online or is updated, compatibility tests need to be carried out on different devices to ensure that each device can run the application normally.
In the prior art, relevant staff usually either query and compile statistics manually on the internet, or automatically collect and count the numbers of users of different devices based on a huge amount of user data, and then select the devices with the highest user coverage as test devices for testing the application to be tested.
However, manual query covers few actual samples, updates slowly, consumes a great deal of human resources, and is inefficient. Automatic query must be based on a huge amount of user data; when that data is insufficient the query result is inaccurate, and because the result is neither grounded in nor verified against real data, its reliability is low. Moreover, whether the query is manual or automatic, the test devices are determined directly from the overall user coverage, so the accuracy of test device determination is low, the actual test coverage of the compatibility test is low, and a compatibility test performed with such data deviates to some extent from the actual situation.
Disclosure of Invention
In view of this, the embodiments of the present specification provide a test equipment determination method. The specification also relates to a test equipment determination apparatus, a computing device, and a computer-readable storage medium, so as to solve the problem of low actual test coverage of compatibility testing in the prior art.
According to a first aspect of embodiments herein, there is provided a test device determination method, including:
when a test task for an object to be tested is detected, determining whether the object to be tested is an online object;
when the object to be tested is an online object, acquiring the target devices corresponding to the object to be tested; when the object to be tested is an offline object, acquiring the target devices corresponding to the category to which the object to be tested belongs;
and counting the target devices according to a statistical indicator, and determining the target test devices for the object to be tested according to the statistical result.
According to a second aspect of embodiments of the present specification, there is provided a test device determination apparatus including:
a first determination module configured to determine, when a test task for an object to be tested is detected, whether the object to be tested is an online object;
an acquisition module configured to acquire the target devices corresponding to the object to be tested when the object to be tested is an online object, and to acquire the target devices corresponding to the category to which the object to be tested belongs when the object to be tested is an offline object;
and a second determination module configured to count the target devices according to a statistical indicator and to determine the target test devices for the object to be tested according to the statistical result.
According to a third aspect of embodiments herein, there is provided a computing device comprising:
a memory and a processor;
the memory is configured to store computer-executable instructions, and the processor is configured to execute the computer-executable instructions to implement the following method:
when a test task for an object to be tested is detected, determining whether the object to be tested is an online object;
when the object to be tested is an online object, acquiring the target devices corresponding to the object to be tested; when the object to be tested is an offline object, acquiring the target devices corresponding to the category to which the object to be tested belongs;
and counting the target devices according to a statistical indicator, and determining the target test devices for the object to be tested according to the statistical result.
According to a fourth aspect of embodiments herein, there is provided a computer-readable storage medium storing computer-executable instructions which, when executed by a processor, implement the steps of any one of the test equipment determination methods described above.
In the test equipment determination method provided by this specification, when a test task for an object to be tested is detected, it is determined whether the object to be tested is an online object; when the object to be tested is an online object, the target devices corresponding to the object to be tested are acquired; when the object to be tested is an offline object, the target devices corresponding to the category to which the object to be tested belongs are acquired; and the target devices are counted according to a statistical indicator, and the target test devices for the object to be tested are determined according to the statistical result. In this way, operation data can be fully used to obtain, for the object to be tested, target devices with high user coverage, and the acquired target devices related to the object to be tested are then counted and sorted to determine the test devices for the object to be tested. The determined test devices are therefore devices related to the object to be tested, which improves the accuracy of test device determination and of the actual test coverage of the compatibility test, and also makes it easier for a test team to manage and purchase test devices.
Drawings
Fig. 1 is a flowchart of a test device determination method provided in an embodiment of the present specification;
FIG. 2 is a process flow diagram of another test equipment determination method provided in an embodiment of the present specification;
fig. 3 is a schematic structural diagram of a test equipment determination apparatus according to an embodiment of the present disclosure;
fig. 4 is a block diagram of a computing device according to an embodiment of the present disclosure.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present description. This description may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein, as those skilled in the art will be able to make and use the present disclosure without departing from the spirit and scope of the present disclosure.
The terminology used in the description of the one or more embodiments is for the purpose of describing the particular embodiments only and is not intended to be limiting of the description of the one or more embodiments. As used in one or more embodiments of the present specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used in one or more embodiments of the present specification refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It will be understood that, although the terms "first", "second", etc. may be used in one or more embodiments herein to describe various information, such information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, without departing from the scope of one or more embodiments of this specification, "first" may also be referred to as "second", and similarly "second" may be referred to as "first". Depending on the context, the word "if" as used herein may be interpreted as "when", "upon", or "in response to determining".
First, an application scenario of the present specification will be explained.
Compatibility testing is a type of functional testing, mainly applied to games and applications before they go online or are updated. Because existing device models and brands are heavily fragmented, a test team may buy a large number of test devices, or even build a cloud test platform, to improve the coverage of the compatibility test and thereby ensure that more users have a good experience; user coverage has accordingly become one of the key indicators for purchasing devices and improving compatibility testing.
Currently, the methods for obtaining the user coverage rate are divided into the following two types:
(1) Manual query: relevant staff generally query and compile statistics online, for example to find the top 100 mobile phones by user coverage on the market and their numbers of users. However, manual query covers few actual samples and updates slowly, so the query result is inaccurate; it also consumes a large amount of human resources, which is time-consuming, labor-intensive, and inefficient; in addition, because it neither relies on nor is verified against real data, the reliability of the query result data is low.
(2) Automatic collection: based on a huge amount of user data, the mobile phones ranked by user coverage and their numbers of users are collected and counted automatically. However, automatic collection depends on gathering huge volumes of user data, and an insufficient number of users makes the counted ranking of mobile phones and their user numbers inaccurate; furthermore, automatic collection does not extract and update the data in real time.
For example, take the acquisition of the top 100 mobile phones by user coverage on the market and their numbers of users: whether by manual query or automatic collection, what is generally obtained are the mobile phones ranked by overall user coverage and their user numbers, not the top 100 mobile phones by user coverage for the specific application or game to be tested. A compatibility test performed with such data therefore deviates to some extent from the actual situation.
Of these two existing approaches, manual query yields inaccurate results, collects data inefficiently, and produces result data of low reliability, so the coverage accuracy of the compatibility test is low. Automatic collection effectively overcomes the drawbacks of manual query, but it must be based on a huge amount of user data, and because the coverage it uses is the overall user coverage, with no screening for the application or game to be tested, the test result still deviates to some extent from the actual result.
Therefore, in order to improve the accuracy of test device determination and thus the accuracy of the actual test coverage of the compatibility test, the present application provides a test equipment determination method: when a test task for an object to be tested is detected, it is determined whether the object to be tested is an online object; when the object to be tested is an online object, the target devices corresponding to the object to be tested are acquired; when the object to be tested is an offline object, the target devices corresponding to the category to which the object to be tested belongs are acquired; and the target devices are counted according to a statistical indicator, and the target test devices for the object to be tested are determined according to the statistical result. In this way, operation data can be fully used to obtain, for the object to be tested, target devices with high user coverage, and the acquired target devices related to the object to be tested are counted and sorted to determine the test devices for the object to be tested. The determined test devices are therefore devices related to the object to be tested, which improves the accuracy of test device determination and hence the coverage accuracy of the compatibility test.
In the present specification, a test device determination method is provided, and the present specification relates to a test device determination apparatus, a computing device, and a computer-readable storage medium, which are described in detail one by one in the following embodiments.
Fig. 1 shows a flowchart of a test device determination method according to an embodiment of the present specification, which specifically includes the following steps:
step 102: and under the condition that a test task aiming at the object to be tested is detected, determining whether the object to be tested is an on-line object.
Specifically, the object to be tested is an object that is about to go online or be updated and needs to be compatibility-tested on different devices; for example, it may be an application to be launched, a game to be launched, an application to be updated, or a game to be updated. The test task is triggered by a preset operation of a worker and may carry an identifier of the object to be tested, so that the object to be tested can subsequently be determined from this identifier.
In a specific implementation, the program environment can be installed first and a compatibility test platform deployed; the platform includes devices with high market share and can connect to the operation database related to the object to be tested. The number of test devices required is then selected, for example the top 100, top 300, or top 600 devices, and a test task for the object to be tested is created and executed. That is, the worker may initiate a test task for the object to be tested on the compatibility test platform, and in addition to the identifier of the object to be tested, the test task may carry the number of test devices required.
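As an illustration of the kind of information such a test task might carry, the following is a minimal Python sketch; the field names (object_id, device_count) and the class itself are assumptions made for illustration and are not defined by this specification.

```python
from dataclasses import dataclass

@dataclass
class TestTask:
    """A hypothetical test-task record created on the compatibility test platform."""
    object_id: str      # identifier of the object to be tested (app or game)
    device_count: int   # number of test devices required, e.g. 100, 300, 600

# Example: a worker creates a task asking for the top 100 test devices.
task = TestTask(object_id="game_001", device_count=100)
```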
It should be noted that initiating a test task for the object to be tested makes it convenient to subsequently obtain the user data related to that object, so that the devices with high user coverage can be determined as test devices. Because corresponding user data exists only after the object to be tested has gone online, when a test task for the object to be tested is detected it is necessary to further determine whether the object to be tested is an online object, and then to take different measures to acquire the corresponding user data depending on whether it is online.
Step 104: and under the condition that the object to be detected is an online object, acquiring target equipment corresponding to the object to be detected.
Specifically, on the basis of determining whether the object to be tested is an online object or not under the condition that a test task for the object to be tested is detected, further, under the condition that the object to be tested is the online object, target equipment corresponding to the object to be tested is obtained. The target equipment is equipment which downloads or installs the object to be tested.
It should be noted that, if the object to be detected is an online object, corresponding user data exists in the object to be detected, that is, which users download or install the device to be detected through which devices, and then the target device with high user coverage rate for the object to be detected can be determined through statistical analysis of the user data.
In the application, the online object can be connected to the operation database of the object to be tested, the device model and the number of users of the target device which downloads or installs the object to be tested are inquired and listed, the user coverage rate of the target device obtained through subsequent statistics is facilitated, and therefore the test device for testing the object to be tested is further determined.
Step 106: and under the condition that the object to be detected is not on-line, acquiring target equipment corresponding to the class to which the object to be detected belongs.
Specifically, on the basis of determining whether the object to be tested is an online object or not under the condition that a test task for the object to be tested is detected, further, under the condition that the object to be tested is an offline object, target equipment corresponding to the category to which the object to be tested belongs is obtained. The target equipment is equipment which downloads or installs the object of the class to which the object to be detected belongs.
In an optional implementation manner of this embodiment, as for an offline object, it is required to acquire a target device corresponding to a category to which the object to be detected belongs, that is, it is required to acquire user data of an online object of the category to which the object to be detected belongs, so that all online objects need to be classified in advance, that is, before determining whether the object to be detected is an online object, the method further includes:
acquiring online objects;
and classifying the online objects to obtain at least one category of online object group.
For example, the online object groups may include role-playing games, shooting games, business games, racing games, music games, social applications, online shopping and payment applications, communication applications, consumer applications, and the like.
It should be noted that after the online objects are classified into at least one category of online object group, the user data of the corresponding online object group can be determined, for an offline object, according to the category to which the offline object belongs, for subsequent statistical analysis.
In an optional implementation of this embodiment, the target devices corresponding to the category to which the object to be tested belongs may be acquired as follows:
determining the category to which the object to be tested belongs;
acquiring the online object group of that category;
and determining the target devices corresponding to the online object group as the target devices corresponding to the category to which the object to be tested belongs.
It should be noted that if the object to be tested is an offline object, it has no corresponding user data of its own. The category to which it belongs can therefore be determined first, the user data of the online object group of that category can be obtained, and target devices with high user coverage for the object to be tested can then be determined by statistical analysis of the user data of the online objects of the same category.
In the present application, for an offline object, the operation database corresponding to the online object group of the category to which the object to be tested belongs can be connected to, and the target devices that downloaded or installed the online objects included in that group can be queried and listed. This facilitates the subsequent statistics of the user coverage of the target devices, and thus the further determination of the test devices for testing the object to be tested.
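The two acquisition branches of steps 104 and 106 could be sketched in Python as follows. This is only an illustrative sketch: OPERATION_DB, CATEGORY_INDEX, query_devices_for_object, and acquire_target_devices are assumed names standing in for queries against the operation database, not part of this specification.

```python
from typing import Dict, List, Tuple

Record = Tuple[str, str, str]  # (user_account, device_model, system_version)

# Hypothetical operation-database view: object id -> download/installation records.
OPERATION_DB: Dict[str, List[Record]] = {}
# Hypothetical classification of online objects: category -> list of online object ids.
CATEGORY_INDEX: Dict[str, List[str]] = {}

def query_devices_for_object(object_id: str) -> List[Record]:
    """Return the download/installation records of one online object."""
    return OPERATION_DB.get(object_id, [])

def acquire_target_devices(object_id: str, online: bool, category: str) -> List[Record]:
    """Steps 104/106: collect target-device records for the object to be tested."""
    if online:
        # Online object: use its own download/installation records.
        return query_devices_for_object(object_id)
    # Offline object: use the records of all online objects in the same category.
    records: List[Record] = []
    for online_object_id in CATEGORY_INDEX.get(category, []):
        records.extend(query_devices_for_object(online_object_id))
    return records
```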
Step 108: and counting the target equipment according to the statistical indexes, and determining the target test equipment for the object to be tested according to the statistical result.
Specifically, under the condition that the object to be detected is an online object, target equipment corresponding to the object to be detected is obtained; and on the basis of acquiring target equipment corresponding to the class to which the object to be tested belongs under the condition that the object to be tested is not on-line, further, counting the target equipment according to the statistical indexes, and determining target test equipment for the object to be tested according to the statistical result. The statistical indicator refers to an indicator for counting the number of target devices, and for example, the statistical indicator may be a device model, a system version number, or the like.
In an optional implementation of this embodiment, the statistical indicator is the device model; the target devices are counted according to this indicator and the target test devices for the object to be tested are determined according to the statistical result as follows:
counting and sorting the numbers of target devices by device model;
and selecting the top preset number of target devices as the target test devices for the object to be tested.
Specifically, the preset number is the number of test devices required for testing the object to be tested, that is, the number carried in the test task when the test task for the object to be tested is created; for example, the preset number may be 100, 300, or 600.
In an actual implementation, the device models of the acquired target devices can be sorted from high to low by number of users, the top preset number of device models can be extracted as test devices (for example, if top 100 is selected, the first 100 device models are extracted), and the extracted device models can then be listed and devices of those models used to perform the compatibility test on the object to be tested.
Illustratively, suppose the acquired target devices are: 100 A-brand model 1 mobile phones; 75 A-brand model 2 mobile phones; 120 A-brand model 3 mobile phones; 80 B-brand model 1 mobile phones; 250 B-brand model 2 mobile phones; 270 C-brand model 1 mobile phones; and 85 C-brand model 2 mobile phones. Sorting the user numbers corresponding to the device models of the target devices yields the ranking shown in Table 1 below; as Table 1 shows, the top 3 device models, namely the C-brand model 1 mobile phone, the B-brand model 2 mobile phone, and the A-brand model 3 mobile phone, are selected as the test devices for testing the object to be tested.
TABLE 1 Target device user-number ranking
Device model | Number of users
C-brand model 1 mobile phone | 270
B-brand model 2 mobile phone | 250
A-brand model 3 mobile phone | 120
A-brand model 1 mobile phone | 100
C-brand model 2 mobile phone | 85
B-brand model 1 mobile phone | 80
A-brand model 2 mobile phone | 75
In an optional implementation of this embodiment, counting and sorting the numbers of target devices by device model includes:
determining the device model of each target device;
for a target device model, counting the number of target devices of that model, the target device model being any one of the device models of the target devices;
and sorting the different device models of the target devices by the number of target devices.
It should be noted that each acquired target device has a corresponding device model. The device model of each target device is determined, and then the number of devices of each model is counted; this number is the number of users corresponding to the target devices of that model, and the device models can then be sorted from high to low by number of users.
For example, if 100 target devices are A-brand model 1 mobile phones, the number of users corresponding to the A-brand model 1 mobile phone is 100; if 75 target devices are A-brand model 2 mobile phones, the corresponding number of users is 75; if 120 target devices are A-brand model 3 mobile phones, the corresponding number of users is 120; if 80 target devices are B-brand model 1 mobile phones, the corresponding number of users is 80; if 250 target devices are B-brand model 2 mobile phones, the corresponding number of users is 250; if 270 target devices are C-brand model 1 mobile phones, the corresponding number of users is 270; and if 85 target devices are C-brand model 2 mobile phones, the corresponding number of users is 85.
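The counting and sorting just described amounts to a frequency count over device models followed by a top-N selection. A minimal Python sketch using the example numbers above might look like this; the function name top_models is our own and not defined by the specification.

```python
from collections import Counter
from typing import List, Tuple

def top_models(device_models: List[str], preset_number: int) -> List[Tuple[str, int]]:
    """Count target devices per device model and return the top models by number of users."""
    counts = Counter(device_models)
    # most_common sorts from high to low by count and keeps the first preset_number entries.
    return counts.most_common(preset_number)

# Example reproducing Table 1: the top 3 models are C1 (270), B2 (250) and A3 (120).
models = (["A-brand model 1"] * 100 + ["A-brand model 2"] * 75 + ["A-brand model 3"] * 120
          + ["B-brand model 1"] * 80 + ["B-brand model 2"] * 250
          + ["C-brand model 1"] * 270 + ["C-brand model 2"] * 85)
print(top_models(models, 3))
# [('C-brand model 1', 270), ('B-brand model 2', 250), ('A-brand model 3', 120)]
```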
In an optional implementation of this embodiment, the statistical indicators are the device model and the system version number; the target devices are counted according to these indicators and the target test devices for the object to be tested are determined according to the statistical result as follows:
counting and sorting the numbers of target devices by device model and system version number;
and selecting the top preset number of target devices as the target test devices for the object to be tested.
In an actual implementation, the different combinations of device model and system version number of the acquired target devices can be sorted from high to low by number of users, the top preset number of combinations can be extracted as test devices (for example, if top 100 is selected, the first 100 combinations of device model and system version number are extracted), and the extracted combinations can then be listed and devices of those models and system versions used to perform the compatibility test on the object to be tested. It should be noted that the device model and the system version number are treated as a whole: as long as either of them differs, the numbers are counted separately.
Illustratively, suppose the acquired target devices are: 85 A-brand model 1 mobile phones with system version 1; 15 A-brand model 1 mobile phones with system version 2; 10 A-brand model 2 mobile phones with system version 1; 65 A-brand model 2 mobile phones with system version 2; 80 B-brand model 2 mobile phones with system version 1; 100 B-brand model 2 mobile phones with system version 2; and 70 B-brand model 2 mobile phones with system version 3. Sorting the user numbers corresponding to the device model and system version number combinations of the target devices yields the ranking shown in Table 2 below; as Table 2 shows, the top 3 combinations, namely the B-brand model 2 mobile phone with system version 2, the A-brand model 1 mobile phone with system version 1, and the B-brand model 2 mobile phone with system version 1, are selected as the test devices for testing the object to be tested.
TABLE 2 Target device user-number ranking
Device model and system version number | Number of users
B-brand model 2 mobile phone, system version 2 | 100
A-brand model 1 mobile phone, system version 1 | 85
B-brand model 2 mobile phone, system version 1 | 80
B-brand model 2 mobile phone, system version 3 | 70
A-brand model 2 mobile phone, system version 2 | 65
A-brand model 1 mobile phone, system version 2 | 15
A-brand model 2 mobile phone, system version 1 | 10
In an optional implementation of this embodiment, counting and sorting the numbers of target devices by device model and system version number includes:
determining the device model and system version number of each target device;
for a target device model and system version number, counting the number of target devices with that model and system version number, the target device model and system version number being any one of the device model and system version number combinations of the target devices;
and sorting the different device model and system version number combinations of the target devices by the number of target devices.
It should be noted that each acquired target device has a corresponding device model and system version number. The device model and system version number of each target device are determined, and then the number of devices with each model and version combination is counted; this number is the number of users corresponding to the target devices with that model and version, and the combinations can then be sorted from high to low by number of users.
For example, if 85 target devices are A-brand model 1 mobile phones with system version 1, the corresponding number of users is 85; if 15 target devices are A-brand model 1 mobile phones with system version 2, the corresponding number of users is 15; if 10 target devices are A-brand model 2 mobile phones with system version 1, the corresponding number of users is 10; if 65 target devices are A-brand model 2 mobile phones with system version 2, the corresponding number of users is 65; if 80 target devices are B-brand model 2 mobile phones with system version 1, the corresponding number of users is 80; if 100 target devices are B-brand model 2 mobile phones with system version 2, the corresponding number of users is 100; and if 70 target devices are B-brand model 2 mobile phones with system version 3, the corresponding number of users is 70.
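When the device model and system version number are counted as a whole, the only change to the previous sketch is that the counting key becomes the (model, version) pair. Again a sketch under assumed names:

```python
from collections import Counter
from typing import List, Tuple

def top_model_versions(devices: List[Tuple[str, str]],
                       preset_number: int) -> List[Tuple[Tuple[str, str], int]]:
    """Count target devices per (device model, system version) pair and return the top pairs."""
    counts = Counter(devices)  # each element is a (device_model, system_version) tuple
    return counts.most_common(preset_number)

# Example reproducing Table 2: the top 3 pairs are (B2, v2) 100, (A1, v1) 85, (B2, v1) 80.
pairs = ([("A-brand model 1", "version 1")] * 85 + [("A-brand model 1", "version 2")] * 15
         + [("A-brand model 2", "version 1")] * 10 + [("A-brand model 2", "version 2")] * 65
         + [("B-brand model 2", "version 1")] * 80 + [("B-brand model 2", "version 2")] * 100
         + [("B-brand model 2", "version 3")] * 70)
print(top_model_versions(pairs, 3))
```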
In an optional implementation of this embodiment, before counting the number of target devices of a target device model, the method further includes:
determining, for the target device model, whether there are target devices corresponding to the same user account;
and if so, deduplicating the target devices with the same user account, and then performing the step of counting the number of target devices of that device model.
It should be noted that if a user downloads the object to be tested on a target device, deletes it, and downloads it again after some time, two download records exist for that target device even though the device actually corresponds to only one user. In this case the target device can be deduplicated, that is, only one of the two download records is kept. In addition, although the deduplication of the same user account is described here for the device model, it can likewise be performed for the device model and system version number; the process is similar and is not repeated here.
In the present application, deduplication is performed for downloads or installations made on the same device (with the same system version number) under the same user account. This prevents repeated downloads or installations from distorting the user coverage of a device, improves the accuracy of the user coverage of the target devices, and therefore improves the accuracy with which the test devices for the object to be tested are determined.
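The deduplication described here can be pictured as keeping, for each (user account, device model, system version) combination, only one record before counting. A minimal Python sketch, with an assumed record layout:

```python
from typing import List, Tuple

Record = Tuple[str, str, str]  # (user_account, device_model, system_version)

def deduplicate(records: List[Record]) -> List[Record]:
    """Keep one download/installation record per (user_account, device_model, system_version)."""
    seen = set()
    unique: List[Record] = []
    for record in records:
        if record not in seen:
            seen.add(record)
            unique.append(record)
    return unique

# A user who downloaded, deleted, and re-downloaded the object contributes a single record.
records = [("user_1", "A-brand model 1", "version 1"),
           ("user_1", "A-brand model 1", "version 1")]
print(len(deduplicate(records)))  # 1
```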
In an optional implementation of this embodiment, after the target test devices for the object to be tested are determined according to the statistical result, the method further includes:
determining the current test devices configured in the test platform;
and determining and returning the target test devices that are not among the current test devices.
Specifically, the test platform is the pre-built compatibility test platform. When the test platform is built, some test devices (i.e., the current test devices of the test platform) are configured in it in advance; these are the test devices the platform already holds. After the target test devices for the object to be tested are determined, the target test devices not yet configured in the test platform, i.e., the high-coverage test devices the test team does not yet hold, can be listed, so that the test team can subsequently purchase the missing devices and test the object to be tested.
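Listing the high-coverage devices the test team does not yet hold is essentially a set difference between the determined target test devices and the platform's current devices. A sketch under assumed names:

```python
from typing import List

def missing_devices(target_test_devices: List[str], current_devices: List[str]) -> List[str]:
    """Return the target test devices that are not yet configured in the test platform."""
    held = set(current_devices)
    # Preserve the coverage ranking of the target test devices.
    return [model for model in target_test_devices if model not in held]

print(missing_devices(["C-brand model 1", "B-brand model 2", "A-brand model 3"],
                      ["B-brand model 2"]))
# ['C-brand model 1', 'A-brand model 3']
```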
In an optional implementation of this embodiment, after the target devices are counted according to the statistical indicator and the target test devices for the object to be tested are determined according to the statistical result, the method further includes:
determining the device model and current system version number of a current test device configured in the test platform;
determining whether the current system version number is the same as the system version number that occurs most frequently among the target devices of that device model;
and if not, returning the most frequent system version number among the target devices of that device model.
It should be noted that, in addition to listing the target test devices not yet configured in the test platform (that is, the high-coverage test devices the test team does not hold), the device model of a test device already configured in the test platform (that is, a current test device) can be determined, the system version number with the highest user coverage for that device model can be looked up, and the system version number of the current test device can be compared with it. If they do not match, the model and the highest-coverage version number are listed, so that the test team can subsequently upgrade the system of the test device of that model and then test the object to be tested.
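The system-version check described in this optional implementation can be sketched as follows; the function name and the per-model version statistics are assumptions built on the earlier counting sketch.

```python
from collections import Counter
from typing import Dict, List, Tuple

def version_upgrade_hint(current: Dict[str, str],
                         records: List[Tuple[str, str]]) -> Dict[str, str]:
    """For each current test device, return the most common system version among target
    devices of the same model when it differs from the version currently installed."""
    per_model: Dict[str, Counter] = {}
    for device_model, system_version in records:
        per_model.setdefault(device_model, Counter())[system_version] += 1
    hints: Dict[str, str] = {}
    for device_model, current_version in current.items():
        versions = per_model.get(device_model)
        if not versions:
            continue
        best_version, _count = versions.most_common(1)[0]
        if best_version != current_version:
            hints[device_model] = best_version  # suggest upgrading to the top-coverage version
    return hints

# Example: the platform's B-brand model 2 phone runs version 1, but version 2 has more users.
print(version_upgrade_hint({"B-brand model 2": "version 1"},
                           [("B-brand model 2", "version 2")] * 100
                           + [("B-brand model 2", "version 1")] * 80))
# {'B-brand model 2': 'version 2'}
```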
In practical application, a game was tested both with the prior-art scheme that uses the overall market user coverage and with the approach described here, in which the user coverage is determined from the user data related to the game to be tested, and the two were compared experimentally. The conclusions are as follows. Table 3 is a comparison for an online game: when the game to be tested is already online and has user data, performing the compatibility test with the top 100 mobile phones by user coverage for that game, compared with using the mobile phones ranked by overall market user coverage, determines the user coverage and the test devices specifically for that game, covers the number of users more accurately, and yields more accurate coverage and test results.
TABLE 3 Online game comparison table (data provided as an image in the original; not reproduced here)
In addition, Table 4 is a comparison for a game that is not yet online: it compares the result of testing with the mobile phones ranked by overall market user coverage against the result of testing with the mobile phones ranked by the user coverage of online games of the same category. Using the top 100 mobile phones by user coverage of the same-category online games yields a higher actual user coverage in the test.
TABLE 4 Offline game comparison table (data provided as an image in the original; not reproduced here)
In summary, as can be seen from Tables 3 and 4, compared with the prior-art scheme of performing the compatibility test based on the overall market user coverage, the scheme of the present application, which uses the user coverage of the game under test itself or of games of the same category, achieves a higher and more accurate actual test coverage and actual user coverage, and effectively improves the reliability of the test results.
In the test equipment determination method provided by this specification, when a test task for an object to be tested is detected, it is determined whether the object to be tested is an online object; when the object to be tested is an online object, the target devices corresponding to the object to be tested are acquired; when the object to be tested is an offline object, the target devices corresponding to the category to which the object to be tested belongs are acquired; and the target devices are counted according to a statistical indicator, and the target test devices for the object to be tested are determined according to the statistical result. In this way, operation data can be fully used to obtain, for the object to be tested, target devices with high user coverage, and the acquired target devices related to the object to be tested are then counted and sorted to determine the test devices for the object to be tested. The determined test devices are therefore devices related to the object to be tested, which improves the accuracy of test device determination and of the actual test coverage of the compatibility test, and also makes it easier for a test team to manage and purchase test devices.
Fig. 2 shows a processing flow chart of another test device determination method provided in an embodiment of the present specification, which specifically includes the following steps:
step 202: and acquiring the online objects, and classifying the online objects to obtain at least one type of online object group.
Step 204: and under the condition that a test task aiming at the object to be tested is detected, determining whether the object to be tested is an on-line object.
Step 206: and under the condition that the object to be detected is an online object, acquiring target equipment corresponding to the object to be detected.
Step 208: under the condition that the object to be detected is an offline object, determining the class to which the object to be detected belongs, and acquiring an online object group of the class; and determining the target equipment corresponding to the on-line object group as the target equipment corresponding to the category of the object to be detected.
Step 210: and counting the target equipment according to the statistical indexes, and determining the target test equipment for the object to be tested according to the statistical result.
Step 212: and performing compatibility test on the object to be tested by using the target test equipment.
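Tying steps 202 to 212 together, the overall flow could be sketched as follows. All function and variable names are illustrative assumptions built on the earlier sketches, not an implementation defined by this specification, and step 212 (the compatibility test itself) is out of scope of the sketch.

```python
from collections import Counter
from typing import Dict, List, Tuple

Record = Tuple[str, str, str]  # (user_account, device_model, system_version)

def determine_test_devices(records: List[Record], preset_number: int) -> List[Tuple[str, int]]:
    """Steps 208-210: deduplicate, count by device model, and select the top devices."""
    unique = list(dict.fromkeys(records))             # one record per (user, model, version)
    counts = Counter(model for _user, model, _version in unique)
    return counts.most_common(preset_number)          # ranked target test devices

def run_task(object_id: str, online: bool, category: str, preset_number: int,
             operation_db: Dict[str, List[Record]],
             category_index: Dict[str, List[str]]) -> List[Tuple[str, int]]:
    # Steps 204-208: use the records of the object itself, or of its category's online objects.
    if online:
        records = operation_db.get(object_id, [])
    else:
        records = [r for oid in category_index.get(category, [])
                   for r in operation_db.get(oid, [])]
    # Step 210: statistics and selection of the target test devices.
    return determine_test_devices(records, preset_number)
```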
The test equipment determination method provided by this specification can make full use of operation data to obtain, for the object to be tested, target devices with high user coverage, and then counts and sorts the acquired target devices related to the object to be tested to determine the test devices for the object to be tested. The determined test devices are therefore devices related to the object to be tested, which improves the accuracy of test device determination and of the actual test coverage of the compatibility test, and also makes it easier for a test team to manage and purchase test devices.
The above is a schematic scheme of a test equipment determination method of this embodiment. It should be noted that the technical solution of the test device determining method belongs to the same concept as the technical solution of the test device determining method shown in fig. 1, and details of the technical solution of the test device determining method, which are not described in detail, can be referred to the description of the technical solution of the test device determining method shown in fig. 1.
Corresponding to the above method embodiment, the present specification further provides an embodiment of a test device determining apparatus, and fig. 3 shows a schematic structural diagram of the test device determining apparatus provided in an embodiment of the present specification. As shown in fig. 3, the apparatus includes:
a first determining module 302 configured to determine, when a test task for an object to be tested is detected, whether the object to be tested is an online object;
a first obtaining module 304 configured to obtain the target devices corresponding to the object to be tested when the object to be tested is an online object, and to obtain the target devices corresponding to the category to which the object to be tested belongs when the object to be tested is an offline object;
and a second determining module 306 configured to count the target devices according to a statistical indicator and to determine the target test devices for the object to be tested according to the statistical result.
Optionally, the first obtaining module 304 is further configured to:
determine the category to which the object to be tested belongs;
acquire the online object group of that category;
and determine the target devices corresponding to the online object group as the target devices corresponding to the category to which the object to be tested belongs.
Optionally, the apparatus further comprises:
a second obtaining module configured to acquire online objects;
and a classification module configured to classify the online objects to obtain at least one category of online object group.
Optionally, the second determining module 306 is further configured to:
count and sort the numbers of target devices by device model;
and select the top preset number of target devices as the target test devices for the object to be tested.
Optionally, the second determining module 306 is further configured to:
determine the device model of each target device;
for a target device model, count the number of target devices of that model, the target device model being any one of the device models of the target devices;
and sort the different device models of the target devices by the number of target devices.
Optionally, the second determining module 306 is further configured to:
count and sort the numbers of target devices by device model and system version number;
and select the top preset number of target devices as the target test devices for the object to be tested.
Optionally, the second determining module 306 is further configured to:
determine the device model and system version number of each target device;
for a target device model and system version number, count the number of target devices with that model and system version number, the target device model and system version number being any one of the device model and system version number combinations of the target devices;
and sort the different device model and system version number combinations of the target devices by the number of target devices.
Optionally, the second determining module 306 is further configured to:
determine, for a target device model, whether there are target devices corresponding to the same user account;
and if so, deduplicate the target devices with the same user account, and then perform the step of counting the number of target devices of that device model.
Optionally, the apparatus further comprises:
a third determining module configured to determine the current test devices configured in the test platform;
and a first returning module configured to determine and return the target test devices that are not among the current test devices.
Optionally, the apparatus further comprises:
a fourth determining module configured to determine the device model and current system version number of a current test device configured in the test platform;
and a fifth determining module configured to determine whether the current system version number is the same as the system version number occurring most frequently among the target devices of that device model, and if not, to return the most frequent system version number among the target devices of that device model.
When the test equipment determination apparatus provided by this specification detects a test task for an object to be tested, it determines whether the object to be tested is an online object; when the object to be tested is an online object, it acquires the target devices corresponding to the object to be tested; when the object to be tested is an offline object, it acquires the target devices corresponding to the category to which the object to be tested belongs; and it counts the target devices according to a statistical indicator and determines the target test devices for the object to be tested according to the statistical result. In this way, operation data can be fully used to obtain, for the object to be tested, target devices with high user coverage, and the acquired target devices related to the object to be tested are then counted and sorted to determine the test devices for the object to be tested. The determined test devices are therefore devices related to the object to be tested, which improves the accuracy of test device determination and of the actual test coverage of the compatibility test, and also makes it easier for a test team to manage and purchase test devices.
The above is an exemplary scheme of a test equipment determination apparatus of the present embodiment. It should be noted that the technical solution of the test device determining apparatus and the technical solution of the test device determining method belong to the same concept, and details that are not described in detail in the technical solution of the test device determining apparatus can be referred to the description of the technical solution of the test device determining method.
FIG. 4 illustrates a block diagram of a computing device 400 provided according to an embodiment of the present description. The components of the computing device 400 include, but are not limited to, a memory 410 and a processor 420. Processor 420 is coupled to memory 410 via bus 430 and database 450 is used to store data.
Computing device 400 also includes access device 440, access device 440 enabling computing device 400 to communicate via one or more networks 460. Examples of such networks include the Public Switched Telephone Network (PSTN), a Local Area Network (LAN), a Wide Area Network (WAN), a Personal Area Network (PAN), or a combination of communication networks such as the internet. The access device 440 may include one or more of any type of network interface (e.g., a Network Interface Card (NIC)) whether wired or wireless, such as an IEEE802.11 Wireless Local Area Network (WLAN) wireless interface, a worldwide interoperability for microwave access (Wi-MAX) interface, an ethernet interface, a Universal Serial Bus (USB) interface, a cellular network interface, a bluetooth interface, a Near Field Communication (NFC) interface, and so forth.
In one embodiment of the present description, the above-described components of computing device 400, as well as other components not shown in FIG. 4, may also be connected to each other, such as by a bus. It should be understood that the block diagram of the computing device architecture shown in FIG. 4 is for purposes of example only and is not limiting as to the scope of the present description. Those skilled in the art may add or replace other components as desired.
Computing device 400 may be any type of stationary or mobile computing device, including a mobile computer or mobile computing device (e.g., tablet, personal digital assistant, laptop, notebook, netbook, etc.), mobile phone (e.g., smartphone), wearable computing device (e.g., smartwatch, smartglasses, etc.), or other type of mobile device, or a stationary computing device such as a desktop computer or PC. Computing device 400 may also be a mobile or stationary server.
The processor 420 is configured to execute computer-executable instructions to implement the following method:
when a test task for an object to be tested is detected, determining whether the object to be tested is an online object;
when the object to be tested is an online object, acquiring the target devices corresponding to the object to be tested; when the object to be tested is an offline object, acquiring the target devices corresponding to the category to which the object to be tested belongs;
and counting the target devices according to a statistical indicator, and determining the target test devices for the object to be tested according to the statistical result.
The above is an illustrative scheme of a computing device of the present embodiment. It should be noted that the technical solution of the computing device and the technical solution of the test device determining method belong to the same concept, and details that are not described in detail in the technical solution of the computing device can be referred to the description of the technical solution of the test device determining method.
An embodiment of the present specification also provides a computer-readable storage medium storing computer instructions which, when executed by a processor, implement the following method:
when a test task for an object to be tested is detected, determining whether the object to be tested is an online object;
when the object to be tested is an online object, acquiring the target devices corresponding to the object to be tested; when the object to be tested is an offline object, acquiring the target devices corresponding to the category to which the object to be tested belongs;
and counting the target devices according to a statistical indicator, and determining the target test devices for the object to be tested according to the statistical result.
The above is an illustrative scheme of a computer-readable storage medium of the present embodiment. It should be noted that the technical solution of the storage medium belongs to the same concept as the technical solution of the test device determining method described above, and for details that are not described in detail in the technical solution of the storage medium, reference may be made to the description of the technical solution of the test device determining method described above.
The foregoing describes specific embodiments of the present specification. Other embodiments are within the scope of the appended claims. In some cases, the actions or steps recited in the claims may be performed in an order different from that in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or a sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or advantageous.
The computer instructions comprise computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or apparatus capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disc, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in a given jurisdiction; for example, in some jurisdictions, in accordance with legislation and patent practice, computer-readable media do not include electrical carrier signals and telecommunications signals.
It should be noted that, for simplicity of description, the foregoing method embodiments are described as a series of action combinations, but those skilled in the art should understand that the present specification is not limited by the described order of actions, because some steps may be performed in other orders or simultaneously according to the present specification. Further, those skilled in the art should also understand that the embodiments described in this specification are preferred embodiments, and that the actions and modules involved are not necessarily required by this specification.
In the above embodiments, the description of each embodiment has its own emphasis; for parts that are not described in detail in a certain embodiment, reference may be made to the related descriptions of other embodiments.
The preferred embodiments of the present specification disclosed above are intended only to aid in describing the specification. The alternative embodiments do not describe all details exhaustively, nor do they limit the invention to the specific implementations described. Obviously, many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the specification and its practical application, thereby enabling others skilled in the art to well understand and use the specification. The specification is limited only by the claims and their full scope and equivalents.

Claims (13)

1. A test equipment determination method, comprising:
in a case that a test task for an object to be tested is detected, determining whether the object to be tested is an online object;
in a case that the object to be tested is an online object, acquiring target equipment corresponding to the object to be tested; in a case that the object to be tested is an offline object, acquiring target equipment corresponding to a category to which the object to be tested belongs;
and performing statistics on the target equipment according to a statistical index, and determining target test equipment for the object to be tested according to the statistical result.
2. The test equipment determination method according to claim 1, wherein the acquiring target equipment corresponding to the category to which the object to be tested belongs comprises:
determining the category to which the object to be tested belongs;
acquiring an online object group of the category;
and determining target equipment corresponding to the online object group as the target equipment corresponding to the category to which the object to be tested belongs.
3. The test equipment determination method according to claim 1 or 2, wherein before the determining whether the object to be tested is an online object, the method further comprises:
acquiring online objects;
and classifying the online objects to obtain at least one category of online object group.
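As a non-limiting illustration of classifying online objects into category groups, the following Python sketch assumes an iterable of online objects and a hypothetical category_of function; neither name appears in the specification.

from collections import defaultdict

def group_online_objects(online_objects, category_of):
    """Classify online objects into per-category groups (illustrative sketch)."""
    groups = defaultdict(list)
    for obj in online_objects:
        # Each online object is placed into the group of its category.
        groups[category_of(obj)].append(obj)
    return dict(groups)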
4. The test equipment determination method according to claim 1 or 2, wherein the statistical index is a device model, and the performing statistics on the target equipment according to the statistical index and determining target test equipment for the object to be tested according to the statistical result comprises:
counting and sorting the target equipment by quantity according to the device model;
and selecting a preset number of target devices as the target test equipment for the object to be tested.
5. The test equipment determination method according to claim 4, wherein the counting and sorting the target equipment by quantity according to the device model comprises:
determining the device models of the target equipment;
for a target device model, counting the number of target devices corresponding to the target device model, wherein the target device model is any one of the device models of the target equipment;
and sorting the different device models of the target equipment according to the number of target devices.
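A minimal sketch of counting and sorting target devices by device model, assuming each target device is represented as a dictionary with a "model" field (an assumption made only for this example):

from collections import Counter

def rank_models(target_devices, preset_count=3):
    """Count target devices per device model and rank the models (illustrative)."""
    counts = Counter(d["model"] for d in target_devices)
    # most_common() sorts the models by device count in descending order.
    return counts.most_common(preset_count)

# Example usage: "Phone-A" is ranked first because two target devices share it.
devices = [{"model": "Phone-A"}, {"model": "Phone-A"}, {"model": "Phone-B"}]
print(rank_models(devices, 2))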
6. The test equipment determination method according to claim 1 or 2, wherein the statistical indexes are a device model and a system version number, and the performing statistics on the target equipment according to the statistical indexes and determining target test equipment for the object to be tested according to the statistical result comprises:
counting and sorting the target equipment by quantity according to the device model and the system version number;
and selecting a preset number of target devices as the target test equipment for the object to be tested.
7. The test equipment determination method according to claim 6, wherein the counting and sorting the target equipment by quantity according to the device model and the system version number comprises:
determining the device models and system version numbers of the target equipment;
for a target device model and system version number, counting the number of target devices corresponding to the target device model and system version number, wherein the target device model and system version number are any one of the device model and system version number combinations of the target equipment;
and sorting the different device model and system version number combinations of the target equipment according to the number of target devices.
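The same counting can be illustrated for the combined device model and system version number index; the "model" and "os_version" field names below are assumptions for the example only.

from collections import Counter

def rank_model_version_pairs(target_devices, preset_count=3):
    """Count target devices per (device model, system version) pair and rank the pairs."""
    counts = Counter((d["model"], d["os_version"]) for d in target_devices)
    # Pairs with more target devices come first in the ranking.
    return counts.most_common(preset_count)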
8. The test equipment determination method according to claim 5, wherein before the counting, for the target device model, the number of target devices corresponding to the target device model, the method further comprises:
determining, for the target device model, whether target devices corresponding to a same user account exist;
and if so, de-duplicating the target devices with the same user account, and then performing the step of counting the number of target devices corresponding to the target device model.
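A possible de-duplication step by user account, shown for illustration; the "account" field and the choice to keep the first device seen per account are assumptions of this sketch.

def dedupe_by_account(target_devices):
    """Keep at most one target device per user account before counting (illustrative)."""
    seen_accounts = set()
    unique_devices = []
    for device in target_devices:
        if device["account"] in seen_accounts:
            # Another device of the same user account was already kept.
            continue
        seen_accounts.add(device["account"])
        unique_devices.append(device)
    return unique_devices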
9. The test equipment determination method according to claim 1 or 2, wherein after the determining target test equipment for the object to be tested according to the statistical result, the method further comprises:
determining current test equipment configured in a test platform;
and determining and returning the test equipment, among the target test equipment, other than the current test equipment.
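For illustration, one way to return the target test devices that the test platform does not already provide, assuming the comparison is made by device model (an assumption of this sketch):

def missing_test_devices(target_test_devices, platform_devices):
    """Return target test devices whose device model the test platform lacks."""
    existing_models = {d["model"] for d in platform_devices}
    return [d for d in target_test_devices if d["model"] not in existing_models]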
10. The test equipment determination method according to claim 7, wherein after the performing statistics on the target equipment according to the statistical indexes and determining target test equipment for the object to be tested according to the statistical result, the method further comprises:
determining the device model and the current system version number of the current test equipment configured in the test platform;
determining whether the current system version number is the same as the most numerous system version number among the target devices of the device model;
and if not, returning the most numerous system version number among the target devices of the device model.
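An illustrative sketch of comparing the platform device's current system version number with the most numerous system version number among target devices of the same model; the "model" and "os_version" field names are assumptions.

from collections import Counter

def recommended_version(platform_device, target_devices):
    """Return the most numerous system version for the platform device's model,
    or None when it matches the version already installed (illustrative)."""
    versions = Counter(d["os_version"] for d in target_devices
                       if d["model"] == platform_device["model"])
    if not versions:
        return None  # no target device shares this device model
    top_version, _ = versions.most_common(1)[0]
    return top_version if top_version != platform_device["os_version"] else None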
11. A test equipment determination apparatus, comprising:
a first determination module configured to determine, in a case that a test task for an object to be tested is detected, whether the object to be tested is an online object;
an acquisition module configured to acquire target equipment corresponding to the object to be tested in a case that the object to be tested is an online object, and to acquire target equipment corresponding to a category to which the object to be tested belongs in a case that the object to be tested is an offline object;
and a second determination module configured to perform statistics on the target equipment according to a statistical index and determine target test equipment for the object to be tested according to the statistical result.
12. A computing device, comprising:
a memory and a processor;
the memory is configured to store computer-executable instructions, and the processor is configured to execute the computer-executable instructions to implement the method of:
in a case that a test task for an object to be tested is detected, determining whether the object to be tested is an online object;
in a case that the object to be tested is an online object, acquiring target equipment corresponding to the object to be tested; in a case that the object to be tested is an offline object, acquiring target equipment corresponding to a category to which the object to be tested belongs;
and performing statistics on the target equipment according to a statistical index, and determining target test equipment for the object to be tested according to the statistical result.
13. A computer-readable storage medium storing computer instructions which, when executed by a processor, implement the steps of the test equipment determination method according to any one of claims 1 to 10.
CN202011422468.1A 2020-12-08 2020-12-08 Test equipment determining method and device Active CN112540919B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011422468.1A CN112540919B (en) 2020-12-08 2020-12-08 Test equipment determining method and device

Publications (2)

Publication Number Publication Date
CN112540919A true CN112540919A (en) 2021-03-23
CN112540919B CN112540919B (en) 2024-02-23

Family

ID=75019194

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011422468.1A Active CN112540919B (en) 2020-12-08 2020-12-08 Test equipment determining method and device

Country Status (1)

Country Link
CN (1) CN112540919B (en)

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150363448A1 (en) * 2014-06-17 2015-12-17 Freescale Semiconductor, Inc. Method of, and a device for updating a multiple-processing entity packet management system, and associated computer program product
CN105320701A (en) * 2014-08-04 2016-02-10 腾讯科技(深圳)有限公司 Method and device for screening function point test implementing ways, and terminal
US20170075791A1 (en) * 2015-09-14 2017-03-16 Salesforce.Com, Inc. Methods and systems for executing tests using grouped/filtered test classes during testing of an application
CN106980573A (en) * 2016-10-26 2017-07-25 阿里巴巴集团控股有限公司 A kind of method for building test case request object, apparatus and system
CN107741903A (en) * 2017-09-11 2018-02-27 平安科技(深圳)有限公司 Application compatibility method of testing, device, computer equipment and storage medium
CN109522203A (en) * 2017-09-19 2019-03-26 中移(杭州)信息技术有限公司 A kind of evaluating method and device of software product
CN110119348A (en) * 2018-02-06 2019-08-13 福建天泉教育科技有限公司 A kind of method and terminal of software upgrading test
CN109446069A (en) * 2018-09-26 2019-03-08 平安普惠企业管理有限公司 Compatibility test method, device, computer equipment and medium
CN109446070A (en) * 2018-09-26 2019-03-08 深圳壹账通智能科技有限公司 Networking software upgrading test method, apparatus, electronic equipment and storage medium
WO2020224310A1 (en) * 2019-05-08 2020-11-12 口碑(上海)信息技术有限公司 Device use rights allocation method, device, storage medium and electronic apparatus
CN110232020A (en) * 2019-05-20 2019-09-13 平安普惠企业管理有限公司 Test result analysis method and relevant apparatus based on intelligent decision
US20200379891A1 (en) * 2019-05-29 2020-12-03 Intelliframe, Inc. Methods, systems and computer program products for automated software testing
CN110765025A (en) * 2019-10-31 2020-02-07 北京东软望海科技有限公司 Test method, test device, computer equipment and storage medium
CN111078646A (en) * 2019-12-30 2020-04-28 弭迺彬 Method and system for grouping software based on running data of Internet equipment

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115184717A (en) * 2022-09-15 2022-10-14 为准(北京)电子科技有限公司 Test method and device based on multiple devices to be tested and electronic device
CN115184717B (en) * 2022-09-15 2022-12-06 为准(北京)电子科技有限公司 Test method and device based on multiple devices to be tested and electronic device

Also Published As

Publication number Publication date
CN112540919B (en) 2024-02-23

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant