CN112003976A - Hard-encoding and hard-decoding test method and device


Info

Publication number
CN112003976A
CN112003976A (application CN202010759387.4A); granted as CN112003976B
Authority
CN
China
Prior art keywords: list, hard, test, client, target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010759387.4A
Other languages
Chinese (zh)
Other versions
CN112003976B (en)
Inventor
Zan Xiaofei (昝晓飞)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Dajia Internet Information Technology Co Ltd
Original Assignee
Beijing Dajia Internet Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Beijing Dajia Internet Information Technology Co Ltd
Priority to CN202010759387.4A
Publication of CN112003976A
Application granted; publication of CN112003976B
Legal status: Active

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04M — TELEPHONIC COMMUNICATION
    • H04M 1/00 — Substation equipment, e.g. for use by subscribers
    • H04M 1/24 — Arrangements for testing

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The present disclosure relates to a hard-encoding and hard-decoding (hardware codec) test method and device in the field of test technology. The method comprises the following steps: receiving a first request sent by a client; in response to the first request, sending benchmark configuration information or a pre-stored target list to the client; acquiring a first list for the client, where the first list is generated based on the client's test result and the test result is determined by the client from the benchmark configuration information and the target list; and updating the target list according to the first list. The method completes the hard-encoding and hard-decoding test and refines the target list automatically, reducing the workload of staff and making the test more convenient.

Description

Hard-encoding and hard-decoding test method and device
Technical Field
The present disclosure relates to the field of test technologies, and in particular to a hard-encoding and hard-decoding test method and apparatus.
Background
Electronic devices often encounter scenarios that require encoding and decoding, such as encoding while shooting or transcoding while editing. Because the hardware encoding and decoding capabilities of different device models vary, it is necessary in these scenarios to first determine whether the device model supports hardware encoding and decoding: if it does, hardware encoding and decoding are used; if it does not, software encoding and decoding are used instead.
In the related art, determining the hardware codec capability of a device model requires either manually constructing a black/white list of hardware-codec-capable models or manually operating devices to run hardware codec tests. Both approaches involve heavy workload and are inconvenient.
Disclosure of Invention
The present disclosure provides a hard-encoding and hard-decoding test method, apparatus, server, and medium, so as to at least solve the problems of heavy workload and inconvenience of hardware codec testing in the related art. The technical scheme of the disclosure is as follows:
According to a first aspect of the embodiments of the present disclosure, a hard-encoding and hard-decoding test method is provided, applied to a first server, and comprising:
receiving a first request sent by a client;
in response to the first request, sending benchmark configuration information or a pre-stored target list to the client;
acquiring a first list of the client, where the first list is generated based on a test result of the client, and the test result is determined by the client based on the benchmark configuration information and the target list;
and updating the target list according to the first list.
In an optional embodiment, the first request includes a test request or a target list acquisition request.
In an optional embodiment, before sending the benchmark configuration information to the client, the method further includes:
reading a pre-stored model candidate list when the first request is a test request;
and determining the benchmark configuration information corresponding to the target model according to whether the target model of the client belongs to the model candidate list.
In an optional embodiment, reading the pre-stored model candidate list includes:
determining whether to read the model candidate list according to whether the test request meets a preset test condition;
and reading the model candidate list when the test request meets the preset test condition.
In an optional embodiment, the preset test condition is determined based on a preset sampling rate and the number of daily active users.
In an optional embodiment, determining the preset test condition based on the preset sampling rate and the number of daily active users includes:
determining the number of device tests within a preset time period as the product of the number of daily active users and the preset sampling rate;
determining whether the test request is among the first N test requests received within the preset time period, where N is a positive integer less than or equal to the number of device tests;
and determining that the test request meets the preset test condition when it is among the first N test requests received within the preset time period.
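The sampling condition above can be sketched in Python as follows. This is an illustrative sketch, not the patent's implementation; the class and function names, and the simple in-memory counter, are assumptions.

```python
def device_test_quota(daily_active_users: int, sampling_rate: float) -> int:
    """Number of devices to test in the preset period: DAU x sampling rate."""
    return int(daily_active_users * sampling_rate)


class TestRequestGate:
    """Admits only the first N test requests received within the period,
    where N is the device-test quota derived from DAU and sampling rate."""

    def __init__(self, daily_active_users: int, sampling_rate: float):
        self.quota = device_test_quota(daily_active_users, sampling_rate)
        self.received = 0  # requests admitted so far in this period

    def admit(self) -> bool:
        """Return True if this request meets the preset test condition."""
        if self.received < self.quota:
            self.received += 1
            return True
        return False
```

A server would reset the counter at the start of each preset time period; that housekeeping is omitted here.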
In an optional embodiment, determining the benchmark configuration information corresponding to the target model according to whether the target model of the client belongs to the model candidate list includes:
when the target model belongs to the model candidate list, determining the configuration rule corresponding to the target model in the list, and generating the benchmark configuration information according to that rule;
and when the target model does not belong to the model candidate list, generating the benchmark configuration information according to pre-stored default configuration information.
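The branch above can be sketched as follows. The dictionary-based representation of configuration rules and the field names are illustrative assumptions; the patent does not specify a data format.

```python
def build_benchmark_config(target_model: str,
                           candidate_list: dict,
                           default_config: dict) -> dict:
    """Generate benchmark configuration information for a model.

    If the model is in the candidate list, apply its configuration rule
    on top of the defaults; otherwise fall back to the pre-stored
    default configuration unchanged.
    """
    rule = candidate_list.get(target_model)
    if rule is not None:
        config = dict(default_config)
        config.update(rule)  # model-specific rule overrides defaults
        return config
    return dict(default_config)
</antml_code_interrupted>```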
In an optional embodiment, the method further comprises:
reading the target list stored in a preset storage device when the first request is a target list acquisition request.
In an optional embodiment, acquiring the first list of the client includes:
periodically sending a list acquisition request to a second server;
and receiving the first list returned by the second server in response to the list acquisition request, the first list being generated by the second server based on the test result of the client.
In an optional embodiment:
when the target model does not belong to the target list, the client performs the hard-encoding and hard-decoding benchmark test according to the benchmark configuration information after a cold start, and uploads the test result to the second server.
In an optional embodiment, updating the target list according to the first list includes:
updating the target list according to the first list when the first version information of the first list is higher than the second version information of the target list.
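The version-guarded update can be sketched as follows. The `version`/`models` dictionary layout and the merge semantics are illustrative assumptions; the patent only requires that the target list be replaced when the incoming list's version is higher.

```python
def update_target_list(target_list: dict, first_list: dict) -> dict:
    """Update the target list only if the incoming first list is newer.

    Each list carries a 'version' field and a 'models' mapping; entries
    are merged so models already covered by the target list survive
    unless the first list overrides them.
    """
    if first_list.get("version", 0) > target_list.get("version", 0):
        merged = dict(target_list.get("models", {}))
        merged.update(first_list.get("models", {}))
        return {"version": first_list["version"], "models": merged}
    return target_list  # stale or same version: keep the current list
```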
In an optional embodiment, the target list includes at least one of: a hard-encoding/hard-decoding blacklist, a hard-encoding/hard-decoding whitelist, and a hard-decoding chip whitelist.
According to a second aspect of the embodiments of the present disclosure, a hard-encoding and hard-decoding test apparatus is provided, applied to a first server, and comprising:
a first receiving module configured to receive a first request sent by a client;
a first sending module configured to send, in response to the first request, benchmark configuration information or a pre-stored target list to the client;
a first list acquisition module configured to acquire a first list of the client, where the first list is generated based on a test result of the client, and the test result is generated by the client based on the benchmark configuration information and the target list;
and an update module configured to update the target list according to the first list.
In an optional embodiment, the apparatus further comprises:
a list reading module configured to read a pre-stored model candidate list when the first request is a test request;
and an information determining module configured to determine the benchmark configuration information corresponding to the target model according to whether the target model of the client belongs to the model candidate list.
In an optional embodiment, the list reading module is configured to: determine whether to read the model candidate list according to whether the test request meets a preset test condition; and read the model candidate list when the test request meets the preset test condition.
In an optional embodiment, the list reading module is specifically configured to: determine the number of device tests within a preset time period as the product of the number of daily active users and the preset sampling rate; determine whether the test request is among the first N test requests received within the preset time period, where N is a positive integer less than or equal to the number of device tests; and determine that the test request meets the preset test condition when it is among the first N test requests received within the preset time period.
In an optional embodiment, the information determining module is configured to: when the target model belongs to the model candidate list, determine the configuration rule corresponding to the target model in the list and generate the benchmark configuration information according to that rule; and when the target model does not belong to the model candidate list, generate the benchmark configuration information according to pre-stored default configuration information.
In an optional embodiment, the apparatus further comprises:
a target list acquisition module configured to read the target list stored in a preset storage device when the first request is a target list acquisition request.
In an optional embodiment, the first list acquisition module is configured to: periodically send a list acquisition request to a second server; and receive the first list returned by the second server in response to the request, the first list being generated by the second server based on the test result of the client.
In an optional embodiment, the update module is specifically configured to:
update the target list according to the first list when the first version information of the first list is higher than the second version information of the target list.
According to a third aspect of the embodiments of the present disclosure, there is provided a server, including:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the hard-encoding and hard-decoding test method according to the first aspect.
According to a fourth aspect of the embodiments of the present disclosure, a storage medium is provided; when instructions in the storage medium are executed by a processor of a server, the server is enabled to execute the hard-encoding and hard-decoding test method according to the first aspect.
According to a fifth aspect of the embodiments of the present disclosure, a computer program product is provided, comprising a program or instructions that, when executed, implement the hard-encoding and hard-decoding test method according to the first aspect.
The technical scheme provided by the embodiments of the present disclosure brings at least the following beneficial effects:
In this embodiment, when a client needs to determine the hardware codec capability of its own model, it can send a first request to the first server. The first server returns the benchmark configuration information and the target list to the client; the client performs a hard-encoding and hard-decoding benchmark test according to them, and the first server subsequently obtains a first list generated from the test result and updates the target list accordingly. No manual participation is needed: the client only has to send the first request to complete the benchmark test and determine its hardware codec capability, while the first server continuously refines the target list from the test results. The model coverage of the target list therefore keeps expanding, the workload of staff in the hardware codec test process is reduced, and the test becomes more convenient.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure and are not to be construed as limiting the disclosure.
FIG. 1 is a block diagram illustrating a hard-encoding and hard-decoding test system according to an exemplary embodiment.
FIG. 2 is an architecture diagram illustrating a hard-encoding and hard-decoding test method according to an exemplary embodiment.
FIG. 3 is a flow diagram illustrating the generation of benchmark configuration information according to an exemplary embodiment.
FIG. 4 is an architecture diagram illustrating another hard-encoding and hard-decoding test method according to an exemplary embodiment.
FIG. 5 is a flowchart illustrating a hard-encoding and hard-decoding test method on the first server side according to an exemplary embodiment.
FIG. 6 is a block diagram illustrating a hard-encoding and hard-decoding test apparatus according to an exemplary embodiment.
FIG. 7 is a block diagram illustrating a server according to an exemplary embodiment.
Detailed Description
In order to make the technical solutions of the present disclosure better understood by those of ordinary skill in the art, the technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings.
It should be noted that the terms "first," "second," and the like in the description and claims of the present disclosure and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the disclosure described herein are capable of operation in sequences other than those illustrated or otherwise described herein. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
With the development of electronic devices, device models have become increasingly diverse, and the hardware encoding and decoding capabilities of devices from different manufacturers, with different chips and different models, may differ. In scenarios that require encoding or decoding, for example playback on a consumer terminal, edit preview, encoding while shooting, or transcoding while editing, the hardware codec capability of the device model must therefore be determined first. Here, hard-encoding refers to hardware encoding, i.e., encoding performed by dedicated hardware rather than by general-purpose software on the CPU; hard-decoding likewise refers to decoding implemented in hardware.
The present disclosure provides a hard-encoding and hard-decoding test system. Referring to FIG. 1, FIG. 1 is a block diagram illustrating such a system according to an exemplary embodiment. The system may include a first server 110, a plurality of clients 120, and a preset storage device 140. The first server 110 and the clients 120 are communicatively connected through a network.
The first server 110 may be configured to generate benchmark configuration information based on a test request from a client 120 and send it to the client 120; to obtain a target list from the preset storage device 140 based on a list acquisition request from the client 120 and send the target list to the client 120; and to update the target list in the preset storage device 140 according to the test result of the client 120.
Optionally, the first server 110 may be a single server or a server cluster; specifically, it may be a service processing server. The present disclosure does not limit the specific type of the first server 110.
The client 120 may be configured to send a test request to the first server 110 to obtain the benchmark configuration information, or to send a list acquisition request to the first server 110 to obtain a target list, such as a test blacklist or a test whitelist. After the client 120 obtains the benchmark configuration information and the target list, if its model does not belong to the target list, the client 120 can perform the benchmark test based on the benchmark configuration information after a cold start and generate a test result.
Optionally, when the client 120 executes the hard-encoding and hard-decoding benchmark test, it is specifically a target application program in the client 120 that executes the benchmark test operation. The target application program may be an application in the client 120 that needs encoding and decoding, or an application installed in the client 120 specifically for performing the benchmark test.
In some embodiments, the client 120 may be a mobile electronic device, such as a mobile phone, tablet, laptop, or wearable smart device, or a non-mobile electronic device, such as a desktop computer. The present disclosure does not limit the specific type of the client 120.
The preset storage device 140 is used for storing the target list; for example, the preset storage device 140 may be a distributed store such as Redis, or another storage device. The preset storage device 140 may be deployed on the first server 110 or exist independently.
For ease of understanding, the first server 110 is described below as a server and the client 120 as a mobile phone.
The mobile phone sends a first request to the server. After receiving the first request, the server sends the benchmark configuration information and the target list to the mobile phone. If the phone's model does not belong to the target list, the phone performs the hard-encoding and hard-decoding benchmark test according to the benchmark configuration information to obtain a test result. The server subsequently obtains a first list generated from that test result and updates the target list according to the first list, so that information on whether the phone's model has hardware codec capability is added to the stored target list.
In some embodiments, the system may further include a second server 130. The client 120 is further configured to report the test result to the second server 130.
The second server 130 may be configured to receive the test result reported by the client 120 and generate a first list according to the test result, and to return the first list to the first server 110 based on a list acquisition request sent by the first server 110.
In the above embodiment, the operation of generating the first list according to the test result is executed by the second server 130, in other embodiments, the operation may also be executed by the first server 110, which is not limited in this disclosure.
Optionally, the second server 130 may be a single server or a server cluster; specifically, the second server 130 may be a Hive server, where Hive is a Hadoop-based data warehouse tool that can map structured data files to database tables and provides a complete Structured Query Language (SQL) query capability. The present disclosure does not limit the specific type of the second server 130.
In one embodiment of the present disclosure, the first server 110 is a business server, the second server 130 is a Hive server, the client 120 is a mobile phone, and the first list includes a hard-decoding whitelist Hive table and a hard-encoding whitelist Hive table.
The mobile phone sends a first request to the business server; upon receiving it, the server sends the benchmark configuration information and the target list to the phone. If the phone's model does not belong to the target list, the phone performs the hard-encoding and hard-decoding benchmark test according to the benchmark configuration information to obtain a test result and reports it to the Hive server, which generates the hard-decoding whitelist Hive table and the hard-encoding whitelist Hive table from the test results. The business server then periodically fetches these two Hive tables and updates the target list according to them, so that information on whether the phone's model has hardware codec capability is added to the stored target list.
Based on the above hard-encoding and hard-decoding test system, the present disclosure also provides a hard-encoding and hard-decoding test method. FIG. 2 is an architecture diagram of such a method according to an exemplary embodiment; as shown in FIG. 2, the method includes the following steps.
The client sends a first request to the first server.
The first request may include a test request or a target list acquisition request. The test request asks the first server for the benchmark configuration information; the target list acquisition request asks the first server for the target list. The client may send both requests at the same time, or sequentially: it may send the test request first and the list acquisition request after receiving the benchmark configuration information, or send the list acquisition request first and decide, based on the returned target list, whether to send a test request at all. In some embodiments, the test request and the list acquisition request may also carry the client's target model and its own identification information, such as a hardware identifier or IP address, so that the first server can return information to the client over the communication connection.
In addition, the client may send the first request to the first server upon receiving a test input from the user, or automatically, when a program scenario running on the client needs hardware encoding or decoding capability.
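The shape of the first request described above can be sketched as follows. The field names (`type`, `model`, `device_id`, `list_types`) are hypothetical; the patent does not define a wire format.

```python
def make_first_request(kind: str, target_model: str,
                       device_id: str, list_types=None) -> dict:
    """Build a first request: either a 'test' request (asks the first
    server for benchmark configuration information) or a 'list' request
    (asks for the pre-stored target list)."""
    if kind not in ("test", "list"):
        raise ValueError("kind must be 'test' or 'list'")
    request = {"type": kind, "model": target_model, "device_id": device_id}
    if kind == "list" and list_types:
        # Optionally restrict which list types to fetch, e.g. only the
        # hard-decoding blacklist.
        request["list_types"] = list_types
    return request
```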
The first server, in response to the first request, sends the benchmark configuration information or the pre-stored target list to the client.
In some embodiments, the first server performs different operations depending on the content of the first request, as follows:
and the first service terminal sends reference configuration information to the client terminal based on the first request as a test request. The reference configuration information is information for the client to complete the hard-coded and hard-solution reference test. The reference configuration information may include at least one or more of the following information: the hard Coding hard decoding benchmark test needs to depend on the minimum client version, the sampling rate, the automatic hard Coding test version information, the automatic hard decoding test version information, Advanced Video Coding (AVC) -MCS hard decoding test enabling information, AVC-MCBB hard decoding test enabling information, High Efficiency Video Coding (HEVC) -MCS hard Coding test enabling information, HEVC-MCBB hard decoding test enabling information, the maximum decoding number and other information which can be used for the client hard Coding hard decoding benchmark test. In some embodiments, the sampling rate is used to control the number of test devices. The automatic hard-editing test version information and the automatic hard-decoding test version information are respectively used for performing a forced hard-editing test and a forced hard-decoding test. The AVC-MCS hard solution test enabling information is used for starting AVC hard solution Surface (MCS) test, and the AVC-MCBB hard solution test enabling information is used for starting AVC hard solution Bytebuffer (MCBB) test. The HEVC-MCS hard coding test enabling information is used for starting an HEVC hard decoding MCS test, and the HEVC-MCBB hard decoding test enabling information is used for starting an HEVC hard decoding MCBB test. The AVC is a video with h264 format, the HEVC refers to a video with h265 format, and h264 and h265 refer to two compression coding modes. Where MediaCodec refers to a class for encoding and decoding audio and video, and Surface is a data type supported by MediaCodec.
In addition, the first server sends the pre-stored target list to the client when the first request is a target list acquisition request. In some embodiments, in response to such a request, the first server reads the target list stored in a preset storage device and sends it to the client. In some embodiments, the target list may include at least one of: a hard-encoding/hard-decoding blacklist, a hard-encoding/hard-decoding whitelist, and a hard-decoding chip whitelist.
Optionally, the target list acquisition request sent by the client may specify the type of target list to be obtained, and the first server can then fetch only lists of the corresponding type from the distributed store according to the type carried in the request. The client thus obtains only the lists it needs rather than all types, reducing the amount of data transferred between the first server and the client. For example, if the request carries the identifier of the hard-encoding/hard-decoding blacklist, the first server fetches only that blacklist from Redis and returns it to the client.
In other embodiments, the target list acquisition request may be limited to obtaining only whitelists or only blacklists; if whitelists are requested, all whitelist types (the hard-encoding/hard-decoding whitelist and the hard-decoding chip whitelist) are sent to the client. Alternatively, the client may not specify any type, and the first server returns all types of target lists, letting the client learn as much as possible about its hardware codec capability and reducing the number of tests it must run.
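The type-filtered fetch described above can be sketched as follows; a plain dictionary stands in for the Redis store, and the list-type keys are illustrative assumptions.

```python
def fetch_target_lists(store: dict, requested_types=None) -> dict:
    """Return only the requested list types from the store (a stand-in
    for Redis); with no types specified, return every stored list, so
    the client learns as much capability information as possible."""
    if not requested_types:
        return dict(store)
    return {t: store[t] for t in requested_types if t in store}
```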
In other embodiments, after reading the target list and before sending it to the client, the first server may further construct hardware configuration information (HardwareConfig) from the target list.
In this embodiment, the first server can generate the hardware configuration information from the client's list acquisition request and the target list it has read; the hardware configuration information may include the client's target model and whether that model is in the blacklist, in the whitelist, and so on. The client can then learn directly from the received hardware configuration information whether it is in the black and white lists, without performing the check itself, which is more convenient for the client. For example, the hardware configuration information may include: model: xxxx; in the hard-encoding/hard-decoding blacklist: no; in the hard-encoding/hard-decoding whitelist: yes; in the hard-decoding chip whitelist: no. Of course, the present disclosure does not limit the specific content of the hardware configuration information.
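The HardwareConfig construction can be sketched as follows; the field names and the set-based list representation are illustrative assumptions, not the patent's format.

```python
def build_hardware_config(target_model: str, lists: dict) -> dict:
    """Precompute list membership on the server so the client can read
    its hardware codec status directly instead of scanning the lists."""
    return {
        "model": target_model,
        "in_codec_blacklist":
            target_model in lists.get("codec_blacklist", ()),
        "in_codec_whitelist":
            target_model in lists.get("codec_whitelist", ()),
        "in_decode_chip_whitelist":
            target_model in lists.get("decode_chip_whitelist", ()),
    }
```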
The client determines the test result of the hard-encoding and hard-decoding benchmark test based on the benchmark configuration information and the target list.
In some embodiments, the client can determine whether its own target model is in the target list. If so, the client can directly determine the hard-coding and hard-decoding capability of its model from the information in the target list, and the hard-coding and hard-decoding benchmark test is no longer required; the number of tests is thus reduced, tested models share their test results, and not every device has to run the benchmark test. If the target model is not in the target list, the client needs to complete the hard-coding and hard-decoding benchmark test according to the benchmark configuration information to obtain a test result. For example, if the target list includes a hard-coding whitelist and the target model is in it, the client can directly determine that its model has hard-coding capability, so the hard-coding test is not required. Conversely, if the target list includes a hard-decoding blacklist and the target model is in it, the client can directly determine that its model lacks hard-decoding capability, so the hard-decoding test is not required.
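The skip-or-test decision described above can be sketched as follows; this is a minimal illustration, and the list keys and return convention are assumptions:

```python
def needs_benchmark(model, target_list):
    """Return (needs_test, known_capability): the capability is known,
    and the benchmark test skipped, only when the model is listed."""
    if model in target_list.get("whitelist", set()):
        return False, "hard codec supported"
    if model in target_list.get("blacklist", set()):
        return False, "hard codec unsupported"
    return True, None   # unlisted model: run the benchmark test

tl = {"whitelist": {"model-1"}, "blacklist": {"model-2"}}
```

Only the third case (an unlisted model) triggers an actual benchmark run; listed models reuse the shared result.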
In some embodiments, when the target model does not belong to the target list, the client cannot determine the hard-coding and hard-decoding capability of the target model, so after a cold start the client performs the hard-coding and hard-decoding benchmark test according to the benchmark configuration information to obtain a test result. In a specific embodiment, the client starts the test service after the cold start and when its system is idle, and then performs the benchmark test according to the benchmark configuration information, so that the test does not affect other functions of the client.
In other embodiments, if the client stores its hard-coding and hard-decoding test results locally, the client performs the benchmark test according to the benchmark configuration information after a cold start only when the target model does not belong to the target list and no test result exists locally. That is, the client runs the hard-coding and hard-decoding benchmark test only if its own model is not in the target list and no local test result is available, further reducing the number of tests.
The hard-coding and hard-decoding benchmark test may include a plurality of test stages. Optionally, it may include five stages: an AVC hard-decoding MCS test, an AVC hard-decoding MCBB test, an HEVC hard-decoding MCS test, an HEVC hard-decoding MCBB test, and an AVC hard-coding test. Each stage can be interrupted; if a stage is interrupted, the test is terminated, and after the next cold start the remaining stages continue. In some embodiments, the entire benchmark test may instead be re-run from the beginning at the next cold start after a termination.
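The interruptible, resumable staging could be sketched like this, assuming the client persists a set of completed stage names across cold starts; the stage names follow the five stages above, and `run_stage` is a hypothetical callback that returns False when a stage is interrupted:

```python
STAGES = ["AVC_decode_MCS", "AVC_decode_MCBB",
          "HEVC_decode_MCS", "HEVC_decode_MCBB", "AVC_encode"]

def run_benchmark(progress, run_stage):
    """Run stages not yet in `progress`; stop at the first interruption.
    The returned set is persisted and passed back in after the next
    cold start, so the test resumes where it left off."""
    for stage in STAGES:
        if stage in progress:
            continue
        if not run_stage(stage):      # interrupted: terminate the test
            return progress
        progress.add(stage)
    return progress
```

For example, a run interrupted at the third stage returns with only two stages completed; passing that same set back in after the next cold start finishes the remaining three.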
Optionally, the test result of the client is generated by performing the hard-coding and hard-decoding benchmark test after the client performs a cold start. After generating the test result, the client may store it locally, send it directly to the first server, or generate the first list from it. The present disclosure is not limited in this respect.
The first server obtains a first list of the client.
The first list includes, but is not limited to, the hard-coding capability information and the hard-decoding capability information of the client. The first list may include at least one of: a hard-coding/hard-decoding whitelist hive table, a hard-coding/hard-decoding blacklist hive table, and a hard-decoding chip whitelist hive table. For example, if the target model of the client is in the hard-coding/hard-decoding whitelist hive table, the target model has hard-coding capability.
In some embodiments, the first server may directly obtain the test result of the client and generate the first list from it. Alternatively, when the client generates the first list itself, the first server may obtain the first list directly from the client.
The first server updates the target list according to the first list.
The update may be performed by the first server directly adding the information in the first list to the corresponding list in the target list. For example, if the first list includes a hard-coding/hard-decoding whitelist hive table, the information in that table is added directly to the whitelist in the target list.
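A minimal sketch of this merge, assuming both lists are represented as dictionaries keyed by list name with sets of model identifiers (an illustrative data layout, not the disclosure's actual format):

```python
def merge_lists(target, first):
    """Add every entry of the first list to the matching list in the
    target list, creating the list when the target lacks it."""
    for name, models in first.items():
        target.setdefault(name, set()).update(models)
    return target

target = {"whitelist": {"model-a"}}
merged = merge_lists(target, {"whitelist": {"model-b"},
                              "blacklist": {"model-c"}})
```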
Because the previous target list does not include the hard-coding and hard-decoding capability information of the client's target model, after the client completes the test, the target list needs to be updated according to the test result so that the updated target list covers the target model. In this way the target list is continuously improved: when a device of the same model as the client is later tested, its hard-coding and hard-decoding capability can be read directly from the target list without testing, achieving the goal of sharing test results.
In some embodiments, the first server updates the target list according to the first list only when the first version information of the first list is higher than the second version information of the target list. The first list further includes the first version information, which is generated together with the first list and represents the version of the list information in the first list.
In this embodiment, when the first server updates the target list using the first list, it also updates the second version information of the target list with the first version information of the first list. For example, if the first version information of the first list is 02 and the original second version information of the target list is 01, then after the update the second version information of the target list becomes 02. Therefore, the target list is updated according to the first list only when the first version information of the first list is higher than the second version information of the target list, which indicates that the first list is newer than the target list. Setting version information in this way prevents the target list from being updated repeatedly or incorrectly updated with stale data, and ensures the accuracy of the target list.
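The version-gated update can be sketched as follows; the dictionary layout and the zero-padded string versions are assumptions for illustration:

```python
def update_if_newer(target, first):
    """Apply the first list only when its version is strictly higher,
    then raise the target's version to match.  Zero-padded version
    strings ("01", "02", ...) compare correctly lexicographically."""
    if first["version"] <= target["version"]:
        return False                      # stale or duplicate: skip
    for name, models in first["lists"].items():
        target["lists"].setdefault(name, set()).update(models)
    target["version"] = first["version"]
    return True

target = {"version": "01", "lists": {"whitelist": {"model-a"}}}
first = {"version": "02", "lists": {"whitelist": {"model-b"}}}
```

Applying the same first list a second time is a no-op, which is exactly the repeated-update situation the version check is meant to prevent.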
With reference to the above embodiments, referring to fig. 3, fig. 3 is an architecture diagram illustrating another hard-coding and hard-decoding test method according to an exemplary embodiment. The hard-coding and hard-decoding test method provided by the present disclosure may include:
the client sends a test request to the first server.
The first server responds to the first request and sends the benchmark configuration information to the client.
The client sends a target list acquisition request to the first server.
The above parts are similar to the scheme described in fig. 2 and are not described again here to avoid repetition.
The first server reads a target list stored in a preset storage device and sends the target list to the client.
The preset storage device stores the target list and may be a distributed store (e.g., Redis) or another storage device. The preset storage device may be deployed on the first server or exist independently.
Based on the target model not belonging to the target list, the client performs the hard-coding and hard-decoding benchmark test according to the benchmark configuration information after a cold start.
The client uploads the test result to the second server.
That is, after generating the test result, the client may upload it to the second server, where the second server is a server that obtains test results and generates the first list. For example, the second server may be a hive server.
The second server generates a first list based on the test result.
The second server generates the first list based on the test results reported by the client. The second server may store the reported test results in a test-result hive table and then periodically read that table to generate the first list. The first list may include a hard-decoding whitelist hive table and/or a hard-coding whitelist hive table, or may further include at least one of: a hard-coding/hard-decoding whitelist hive table, a hard-coding/hard-decoding blacklist hive table, and a hard-decoding chip whitelist hive table.
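The periodic aggregation step might look like the following sketch, assuming each row read from the test-result hive table carries a model name and two pass/fail flags; the row shape and the list names are illustrative assumptions:

```python
def build_first_list(rows):
    """Group (model, encode_ok, decode_ok) rows into white/black lists."""
    lists = {"hard_encode_whitelist": set(),
             "hard_decode_whitelist": set(),
             "hard_codec_blacklist": set()}
    for model, encode_ok, decode_ok in rows:
        if encode_ok:
            lists["hard_encode_whitelist"].add(model)
        if decode_ok:
            lists["hard_decode_whitelist"].add(model)
        if not encode_ok and not decode_ok:
            lists["hard_codec_blacklist"].add(model)
    return lists

rows = [("m1", True, True), ("m2", False, True), ("m3", False, False)]
first_list = build_first_list(rows)
```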
The first server periodically sends a list acquisition request to the second server.
The first server may send the list acquisition request to the second server at preset intervals, for example every 3 hours, or at a fixed time, for example at 12:00 every day.
The first server receives the first list returned by the second server according to the list acquisition request.
In this case, the first list is generated by the second server according to the client's test results, and the first server does not need to generate it; this simplifies the work of the first server and lets the first and second servers divide the work and cooperate.
In some embodiments, when the second server generates the first list, it may also generate first version information, such as a first version number, for the first list; the newer the first list, the larger its first version information. For example, the first version information of the previously generated first list is 01, and that of the currently generated first list is 02. Similarly, the target list carries corresponding second version information encoded in the same manner, i.e., the newer the target list, the larger the second version information. The version information records the order in which the data in the first list and the target list were generated, preventing the target list from being updated repeatedly with the same first list and ensuring the accuracy of the target list.
The first server updates the target list according to the first list. This part is similar to the scheme described in fig. 2 and is not described again here to avoid redundancy.
In this embodiment, the hard-coding and hard-decoding benchmark test is performed only when the target model of the client does not belong to the target list, which reduces the number of device tests and lets the client quickly learn its own hard-coding and hard-decoding capability. In addition, the first server can accurately distribute suitable benchmark configuration information according to the model of the client, so that the client executes a benchmark test matching its own requirements. Furthermore, the first server can update the target list according to the client's test result, so that other clients of the same model need not repeat the same test, sharing the test result among devices of the same model. This embodiment runs automatically without manual participation, reducing staff workload and improving test efficiency.
Based on the foregoing embodiments, in some specific implementations, referring to fig. 4, fig. 4 is a flowchart illustrating a method for generating benchmark configuration information according to an exemplary embodiment. The sending, by the first server, of the benchmark configuration information to the client in response to the first request may include the following steps.
In step S410, based on the first request being a test request, the first server reads a pre-stored model candidate list (Benchmark Candidate).
The model candidate list is a preset list of models to be tested; corresponding benchmark configuration information, special requirements during testing, and the like are set in advance for the models in the list.
In some implementations, reading the pre-stored model candidate list may include:
the first server determines whether to read the model candidate list according to whether the test request meets a preset test condition; and
the first server reads the model candidate list based on the test request meeting the preset test condition.
In this implementation, a preset test condition is set, and the model candidate list is read and the benchmark configuration information issued only when the condition is met. If the condition is not met, the model candidate list is not read and the benchmark configuration information is not issued, i.e., the client cannot perform the hard-coding and hard-decoding benchmark test. In this way, the clients that perform the benchmark test can be limited, for example by restricting the client model so that only clients of specific models can run the test, or by limiting the number of test requests accepted by the first server. Such limits protect the client or the first server, and the specific preset test conditions can be set according to actual requirements.
In one implementation, the preset test condition is determined based on a preset sampling rate and a daily active user count. The preset sampling rate may be a sampling rate in the kpn dimension and controls the number of devices performing the hard-coding and hard-decoding benchmark test. Its value is a preset fixed value that does not change by itself, for example 0.08, and may be set according to the actual situation of the first server. The daily active user count (DAU) may be obtained by the first server from a service server corresponding to a target application program on the client. Because the DAU changes over time, it may be re-acquired from the service server at fixed intervals, for example every 24 hours, to ensure accuracy. This approach decides whether the client is allowed to execute the hard-coding and hard-decoding test request according to the number of users using the service each day, taking into account the working capacity of the first server and avoiding excessive working pressure on it.
In another specific implementation, determining the preset test condition based on the preset sampling rate and the daily active user count may include:
the first server determines a device test count within a preset duration according to the product of the daily active user count and the preset sampling rate;
the first server determines whether the test request is among the first N test requests received within the preset duration, where N is a positive integer less than or equal to the device test count; and
the first server determines that the test request meets the preset test condition based on the test request being among the first N test requests received within the preset duration.
In this implementation, the product of the daily active user count and the preset sampling rate gives the number of devices allowed to perform the hard-coding and hard-decoding benchmark test within the preset duration, i.e., only a certain proportion of the daily active devices may perform the test. Thus, if the test request sent by the client is among the first N test requests, the client is allowed to perform the benchmark test. This limits the number of devices testing within the preset duration and reduces the working pressure of the first server.
For example, if the daily active user count obtained that day is 10000 and the preset sampling rate is 0.08, the device test count is 10000 × 0.08 = 800. With a preset duration of 1 day, only the clients corresponding to the first 800 test requests received that day may perform the hard-coding and hard-decoding benchmark test; the clients corresponding to the other test requests may not.
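The quota check in this example can be sketched as a counter reset once per preset duration; this is a simplified single-process illustration, and a real server would keep the counter in shared storage:

```python
def make_gate(dau, sampling_rate):
    """Admit only the first round(dau * rate) test requests per window."""
    quota = round(dau * sampling_rate)
    admitted = {"n": 0}

    def allow():
        if admitted["n"] >= quota:
            return False                 # quota exhausted: reject request
        admitted["n"] += 1
        return True

    return allow

allow = make_gate(10000, 0.08)           # quota of 800 for the day
```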
In step S420, the first server determines whether the target model belongs to the model candidate list.
In step S430, based on the target model belonging to the model candidate list, the first server determines the configuration rule corresponding to the target model in the model candidate list and generates benchmark configuration information (Benchmark configuration) according to that rule.
In step S440, based on the target model not belonging to the model candidate list, the first server generates benchmark configuration information according to pre-stored default configuration information.
In this implementation, a corresponding configuration rule is preset for each model in the model candidate list. If the target model belongs to the list, matching benchmark configuration information is generated according to its configuration rule; if not, the generic default configuration information is used directly as the benchmark configuration information. Because different models may need different aspects tested (for example, model 1 may only need its hard-coding capability tested while model 2 only needs its hard-decoding capability tested), the benchmark configuration information for model 1 need only include hard-coding configuration information, and likewise for model 2. This approach both satisfies the special test requirements of specific models and prevents the client from executing unnecessary test content, improving test pertinence and efficiency. The model candidate list can be pre-edited by a worker and stored on the first server or in a storage device connected to it, and workers can subsequently edit the list as required.
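A sketch of the rule selection, with hypothetical per-model rules and a generic default (the rule contents are illustrative assumptions):

```python
DEFAULT_CONFIG = {"test_encode": True, "test_decode": True}

# Hypothetical model candidate list: one configuration rule per model,
# mirroring the model 1 / model 2 example above.
CANDIDATE_LIST = {
    "model-1": {"test_encode": True, "test_decode": False},
    "model-2": {"test_encode": False, "test_decode": True},
}

def benchmark_config(model):
    """Model-specific rule when listed, generic default otherwise."""
    return dict(CANDIDATE_LIST.get(model, DEFAULT_CONFIG))
```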
In some embodiments, after the first server receives the first list, the target model of the client can be deleted from the model candidate list, so that workers no longer need to delete tested models manually, simplifying their operations.
Based on the above hard-coding and hard-decoding test method, the specific work of the first server is described below; part of this content has already been described in the method embodiments above and is not repeated. Fig. 5 is a flowchart illustrating a hard-coding and hard-decoding test method on the first server side according to an exemplary embodiment, including the following steps.
In step S510, the first server receives a first request sent by the client.
The first request may include a test request or a target list acquisition request. The client may send the two requests to the first server simultaneously or sequentially: it may send the test request first and send the target list acquisition request after receiving the benchmark configuration information returned by the first server, or send the target list acquisition request first and then decide, according to the returned target list, whether to send the test request. In some embodiments, the test request and the target list acquisition request may further include the target model of the client and its own identification information, such as a hardware identifier and an IP address, so that the first server can return information to the client through the communication connection.
In step S520, based on the first request being a test request, the first server sends the benchmark configuration information to the client.
The benchmark configuration information may include one or more of the following: the minimum client version required for the hard-coding and hard-decoding benchmark test, the sampling rate, automatic hard-coding test version information, automatic hard-decoding test version information, AVC-MCS hard-decoding test enabling information, AVC-MCBB hard-decoding test enabling information, HEVC-MCS hard-decoding test enabling information, HEVC-MCBB hard-decoding test enabling information, the maximum decoding count, and other information usable for the client's hard-coding and hard-decoding benchmark test.
In step S530, based on the first request being a target list acquisition request, the first server sends a pre-stored target list to the client.
In some embodiments, in response to the first request being a target list acquisition request, the first server reads a target list stored in a preset storage device and sends it to the client. In some embodiments, the target list may include at least one of: a hard-coding/hard-decoding blacklist, a hard-coding/hard-decoding whitelist, and a hard-decoding chip whitelist.
Optionally, the target list acquisition request sent by the client may include the type of target list to be acquired, and the first server may accordingly fetch the target list of the corresponding type from the distributed storage.
Steps S520 and S530 may be executed in parallel or sequentially, and the present disclosure does not limit the order of the two steps.
In step S540, the first server obtains a first list of the client; the first list is generated based on the test result of the client, and the test result is determined by the client based on the benchmark configuration information and the target list.
In some embodiments, the first server may directly obtain the test result of the client and generate the first list from it. Alternatively, when the client generates the first list itself, the first server may obtain the first list directly from the client.
In step S550, the first server updates the target list according to the first list.
The update may be performed by the first server directly adding the information in the first list to the corresponding list in the target list.
In this embodiment, when the client needs to determine the hard-coding and hard-decoding capability of its own model, it may send a first request to the first server; the first server returns the benchmark configuration information and the target list, the client performs the hard-coding and hard-decoding benchmark test according to them, and the first server subsequently obtains a first list generated from the test result and updates the target list according to the first list. Thus no manual participation is needed: the client only needs to send the first request to complete the benchmark test and determine its own hard-coding and hard-decoding capability, while the first server continuously improves the target list according to the test results, continually enlarging the target list's coverage of models, reducing staff workload in the test process, and improving test convenience.
In some embodiments, in S520, before the first server sends the benchmark configuration information to the client, the method may further include:
the first server reads a pre-stored model candidate list based on the first request being a test request; and
the first server determines the benchmark configuration information corresponding to the target model according to whether the target model of the client belongs to the model candidate list.
In this embodiment, the benchmark configuration information corresponding to a model in the model candidate list may differ from that corresponding to a model not in the list. In this case, the test method ensures that special test requirements for some models are satisfied during automatic testing, improving the test effect. The model candidate list can be pre-edited by a worker and stored on the first server or in a storage device connected to it, and workers can subsequently edit the list as required.
Based on the foregoing embodiment, in some specific implementations, reading the pre-stored model candidate list may include:
the first server determines whether to read the model candidate list according to whether the test request meets a preset test condition; and
the first server reads the model candidate list based on the test request meeting the preset test condition.
In this implementation, a preset test condition is set, and the model candidate list is read and the benchmark configuration information issued only when the condition is met. If the condition is not met, the model candidate list is not read and the benchmark configuration information is not issued, i.e., the client cannot perform the hard-coding and hard-decoding benchmark test. The clients performing the benchmark test can thus be limited, for example by restricting the client model so that only clients of specific models can run the test, or by limiting the number of test requests accepted by the first server; such limits protect the client or the first server.
Optionally, the preset test condition is determined based on a preset sampling rate and a daily active user count. This approach decides whether the client is allowed to execute the hard-coding and hard-decoding test request according to the number of users using the service each day, taking into account the working capacity of the first server and avoiding excessive working pressure on it.
In a specific embodiment, determining the preset test condition based on the preset sampling rate and the daily active user count may include:
the first server determines a device test count within a preset duration according to the product of the daily active user count and the preset sampling rate;
the first server determines whether the test request is among the first N test requests received within the preset duration, where N is a positive integer less than or equal to the device test count; and
the first server determines that the test request meets the preset test condition based on the test request being among the first N test requests received within the preset duration.
In this embodiment, the product of the daily active user count and the preset sampling rate gives the number of devices allowed to perform the hard-coding and hard-decoding benchmark test within the preset duration, i.e., only a certain proportion of the daily active devices may perform the test. Thus, if the test request sent by the client is among the first N test requests, the client is allowed to perform the benchmark test. This limits the number of devices testing within the preset duration and reduces the working pressure of the first server.
In some embodiments, determining, by the first server, the benchmark configuration information corresponding to the target model according to whether the target model of the client belongs to the model candidate list may include:
the first server determines, based on the target model belonging to the model candidate list, the configuration rule corresponding to the target model in the list, and generates the benchmark configuration information according to that rule; and
the first server generates the benchmark configuration information according to pre-stored default configuration information based on the target model not belonging to the model candidate list.
In this implementation, a corresponding configuration rule is preset for each model in the model candidate list. If the target model belongs to the list, matching benchmark configuration information is generated according to its configuration rule; if not, the generic default configuration information is used directly as the benchmark configuration information. The capabilities that need testing may vary from model to model. This approach both satisfies the special test requirements of specific models and prevents the client from executing unnecessary test content, improving test pertinence and efficiency.
In some embodiments, in S530, before the first server sends the target list to the client based on the first request being a target list acquisition request, the method further includes: the first server reads the target list stored in the preset storage device.
In this embodiment, storing the target list in the preset storage device prevents it from occupying the storage space of the first server, and the separate storage also makes it convenient for other devices to access the target list.
In other embodiments, S540 may include:
the first server periodically sends a list acquisition request to the second server, for example at preset intervals such as every 3 hours, or at a fixed time such as 12:00 every day; and
the first server receives a first list returned by the second server according to the list acquisition request, where the first list is generated by the second server based on the test result of the client.
In this embodiment, the first list is generated by the second server according to the test result of the client, without requiring the first server to generate it, which simplifies the work of the first server and allows the first server and the second server to divide the work and cooperate.
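The two timed-sending strategies described above (a fixed interval such as every 3 hours, and a fixed time such as 12 o'clock every day) can be sketched as follows; the function names and the choice of a 3-hour interval are illustrative assumptions:

```python
from datetime import datetime, timedelta

FETCH_INTERVAL = timedelta(hours=3)  # the "every 3 hours" example above

def should_fetch(last_fetch: datetime, now: datetime) -> bool:
    """Interval strategy: fetch once the preset interval has elapsed."""
    return now - last_fetch >= FETCH_INTERVAL

def next_fixed_fetch(now: datetime, hour: int = 12) -> datetime:
    """Fixed-time strategy: the next daily send time, e.g. 12 o'clock."""
    candidate = now.replace(hour=hour, minute=0, second=0, microsecond=0)
    return candidate if candidate > now else candidate + timedelta(days=1)
```

In a real service the first server would run one of these checks in a scheduler loop and issue the list acquisition request whenever it fires.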
In still other embodiments, the step S550 may include:
and under the condition that the first version information of the first list is higher than the second version information of the target list, the first server side updates the target list according to the first list.
In this embodiment, when the second server generates the first list, it may also generate first version information for the first list, for example a first version number, where a newer first list carries larger first version information. Similarly, the target list is provided with corresponding second version information that adopts the same encoding manner, that is, a newer target list carries larger second version information. When the first server updates the target list with the first list, the second version information of the target list is also updated with the first version information of the first list. Therefore, only when the first version information of the first list is higher than the second version information of the target list is the first list newer than the target list, in which case the target list can be updated according to the first list. Setting version information in this manner avoids the target list being repeatedly updated, or wrongly updated with old data, and ensures the accuracy of the target list.
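A minimal sketch of the version-gated update, assuming the target list and the first list are dictionaries with a numeric `version` field and a `models` list (an illustrative layout, not specified by the disclosure):

```python
def update_target_list(target: dict, first: dict) -> bool:
    """Apply the first list to the target list only when the first list
    carries newer version information, copying the newer version over.
    Returns True when an update actually happened."""
    if first["version"] <= target["version"]:
        return False  # stale or duplicate data: leave the target list alone
    target["models"] = sorted(set(target["models"]) | set(first["models"]))
    target["version"] = first["version"]
    return True
```

An older first list is rejected outright, which is exactly the repeated/erroneous-update protection described above.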
Based on the same inventive concept, an embodiment of the present disclosure further provides a hard-coding and hard-decoding testing apparatus, which is applied to the first server. FIG. 6 is a block diagram illustrating a hard-coding and hard-decoding testing apparatus according to an example embodiment. Referring to FIG. 6, the apparatus includes a first receiving module 610, a first sending module 620, a first list obtaining module 630, and an updating module 640.
The first receiving module 610 is configured to perform receiving a first request sent by a client.
The first request may include a test request or a target list acquisition request. The test request is used to request the reference configuration information from the first server, and the target list acquisition request is used to request the target list from the first server. The client may send the test request and the target list acquisition request to the first server at the same time, or sequentially: either the test request is sent first and the target list acquisition request is sent after the reference configuration information returned by the first server is received, or the target list acquisition request is sent first and, after the target list returned by the first server is received, whether to send the test request is decided according to the target list. In some embodiments, the test request and the target list acquisition request may further include the target model of the client and its own identification information, for example its hardware identifier and IP address, so that the first server can return information to the client through the communication connection.
A first sending module 620 configured to perform sending the reference configuration information or the pre-stored target list to the client in response to the first request.
Specifically, when the first request is a test request, the reference configuration information is generated and sent to the client; when the first request is a target list acquisition request, the pre-stored target list is sent to the client.
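The dispatch performed in response to the first request can be sketched as follows; the request-type tags and the default configuration contents are illustrative assumptions:

```python
DEFAULT_REFERENCE_CONFIG = {"codec": "h264", "resolution": "720p"}

def handle_first_request(request: dict, target_list: dict) -> dict:
    """Answer a test request with reference configuration information, and a
    target list acquisition request with the pre-stored target list."""
    if request["type"] == "test_request":
        # In the full scheme the config may depend on the client's model.
        return {"reference_config": DEFAULT_REFERENCE_CONFIG}
    if request["type"] == "target_list_request":
        return {"target_list": target_list}
    raise ValueError("unknown first-request type: " + str(request["type"]))
```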
A first list obtaining module 630 configured to perform obtaining a first list of the client; the first list is generated based on the test result of the client; and the test result is generated by the client through testing based on the reference configuration information and the target list.
In some embodiments, the first server may directly obtain the test result of the client and generate the first list according to the test result; alternatively, in the case that the client generates the first list by itself, the first server may directly obtain the first list from the client.
An update module 640 configured to perform updating the target list according to the first list.
The updating method may be that the first server directly adds the information included in the first list to the corresponding list in the target list.
In this embodiment, when the client needs to determine the hard-coding and hard-decoding capability of its own model, it may send a first request to the first server. The first server returns the reference configuration information and the target list to the client, the client performs a hard-coding and hard-decoding benchmark test according to the reference configuration information and the target list, and the first server subsequently obtains a first list generated based on the test result and updates the target list according to the first list. Therefore, no manual participation is needed: the client only needs to send the first request to the first server to complete the hard-coding and hard-decoding benchmark test and determine its hard-coding and hard-decoding capability, and the first server can continuously improve the target list according to the test results. This continuously enlarges the coverage of the target list over models, reduces the workload of workers in the hard-coding and hard-decoding test process, and improves test convenience.
In some embodiments, the apparatus may further comprise:
the list reading module is configured to perform reading a pre-stored model candidate list based on the first request being a test request;
and the information determining module is configured to determine reference configuration information corresponding to the target model according to whether the target model of the client belongs to the model candidate list.
In this embodiment, the reference configuration information corresponding to models in the model candidate list may be different from that corresponding to models not in the list. In this case, the test method can ensure that special test requirements for some models are fulfilled during the automatic test process, improving the test effect. The model candidate list may be pre-edited by a worker and then stored in the first server, or stored in a storage device connected to the first server, and workers can subsequently edit the model candidate list according to requirements.
Based on the foregoing embodiments, in some specific implementations, the list reading module may be configured to perform:
determining whether to read a model candidate list according to whether the test request meets a preset test condition; and reading the model candidate list based on the test request meeting the preset test condition.
In this implementation, a preset test condition is set, and the model candidate list is read and the reference configuration information is issued only when the preset test condition is met. If the preset test condition is not met, the model candidate list is not read and the reference configuration information is not issued, that is, the client cannot perform the hard-coding and hard-decoding benchmark test. In this way, a certain limitation may be imposed on the clients performing the hard-coding and hard-decoding benchmark test; for example, the type of client may be limited so that only clients of a specific type can perform the test, or the number of test requests received by the first server may be limited. Such limitation can protect the client or the first server to a certain extent.
In some embodiments, the condition determining unit may be configured to perform:
determining a device test number within a preset duration according to the product of the daily active user number and a preset sampling rate; determining whether the test request is among the first N test requests received within the preset duration, wherein N is a positive integer less than or equal to the device test number; and determining that the test request meets the preset test condition based on the test request being among the first N test requests received within the preset duration.
In this embodiment, the product of the daily active user number and the preset sampling rate indicates how many devices are allowed to perform the hard-coding and hard-decoding benchmark test within the preset duration, that is, only a certain proportion of the devices active that day are allowed to perform the test. Therefore, if a test request sent by the client belongs to the first N test requests, the client is allowed to perform the hard-coding and hard-decoding benchmark test. This manner can limit the number of devices performing the benchmark test within the preset duration, and reduce the working pressure of the first server.
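The admission check can be sketched as follows, assuming requests within the preset duration are numbered from 1 in arrival order (the numbers and function names are illustrative):

```python
import math

def device_test_quota(daily_active_users: int, sampling_rate: float) -> int:
    """N = daily active user number x preset sampling rate, rounded down."""
    return math.floor(daily_active_users * sampling_rate)

def meets_test_condition(request_index: int, daily_active_users: int,
                         sampling_rate: float) -> bool:
    """A test request qualifies when it is among the first N requests
    received within the preset duration (request_index starts at 1)."""
    return request_index <= device_test_quota(daily_active_users, sampling_rate)
```

For example, with 100,000 daily active users and a 0.1% sampling rate, only the first 100 test requests in the window are admitted.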
In some embodiments, the information determination module may be configured to perform:
determining a configuration rule corresponding to the target model in the model candidate list based on the target model belonging to the model candidate list, and generating reference configuration information according to the configuration rule; and generating reference configuration information according to pre-stored default configuration information based on the target model not belonging to the model candidate list.
In this implementation, a corresponding configuration rule is preset for each model in the model candidate list. If the target model belongs to the model candidate list, the matched reference configuration information is generated according to the corresponding configuration rule; if the target model does not belong to the model candidate list, the universal default configuration information is directly used as the reference configuration information. Since the performance that needs to be tested may vary from model to model, this manner can both meet the special test requirements of specific models and avoid the client executing unnecessary test content, thereby improving test pertinence and test efficiency.
In other embodiments, the apparatus may further comprise:
and the target list acquisition module is configured to perform, based on the first request being a target list acquisition request, reading the target list stored in the preset storage device.
In this embodiment, storing the target list in a preset storage device can prevent the target list from occupying the storage space of the first server, and the separate storage also makes it convenient for other devices to access the target list.
In other embodiments, the first list obtaining module 630 may be configured to perform:
regularly sending a list acquisition request to a second server; receiving a first list returned by the second server according to the list acquisition request; the first list is generated by the second server side based on the test result of the client side.
The first server may send the list acquisition request to the second server every preset interval, for example, every 3 hours; alternatively, the first server may send the list acquisition request at a fixed time.
In this embodiment, the first list is generated by the second server according to the test result of the client, without requiring the first server to generate it, which simplifies the work of the first server and allows the first server and the second server to divide the work and cooperate.
In still other embodiments, the update module 640 may be specifically configured to perform:
and under the condition that the first version information of the first list is higher than the second version information of the target list, updating the target list according to the first list.
In this embodiment, when the second server generates the first list, it may also generate first version information for the first list, for example a first version number, where a newer first list carries larger first version information. Similarly, the target list is provided with corresponding second version information that adopts the same encoding manner, that is, a newer target list carries larger second version information. When the first server updates the target list with the first list, the second version information of the target list is also updated with the first version information of the first list. Therefore, only when the first version information of the first list is higher than the second version information of the target list is the first list newer than the target list, in which case the target list can be updated according to the first list. Setting version information in this manner avoids the target list being repeatedly updated, or wrongly updated with old data, and ensures the accuracy of the target list.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
FIG. 7 is a block diagram illustrating a server in accordance with an example embodiment.
Based on the same inventive concept, in an exemplary embodiment, there is also provided a server 700, including: a processor 700, a communication interface 706, a memory 704 for storing instructions executable by the processor 700, and a communication bus 702; wherein the processor 700 is configured to execute the instructions to implement the steps implemented by the first service end or the second service end in the above method. Wherein the processor 700, communication interface 706, and memory 704 communicate with each other via a communication bus 702.
Based on the same inventive concept, in an exemplary embodiment, there is also provided a storage medium including instructions, for example, a memory including instructions, which are executable by the server 700 to perform the above-described method. Alternatively, the storage medium may be a non-transitory computer readable storage medium, which may be, for example, a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. A hard-coding and hard-decoding test method, applied to a first server, characterized by comprising the following steps:
receiving a first request sent by a client;
responding to the first request, and sending reference configuration information or a pre-stored target list to the client;
acquiring a first list of the client; wherein the first list is generated based on a test result of the client; and the test result is determined by the client through testing based on the reference configuration information and the target list;
and updating the target list according to the first list.
2. The hard-coding and hard-decoding test method of claim 1, wherein the first request comprises a test request or a target list acquisition request.
3. The hard-coding and hard-decoding test method according to claim 2, wherein before the sending of the reference configuration information to the client, the method further comprises:
reading a pre-stored model candidate list based on the first request being the test request;
and determining the reference configuration information corresponding to the target model according to whether the target model of the client belongs to the model candidate list.
4. The hard-coding and hard-decoding test method according to claim 3, wherein the reading of the pre-stored model candidate list comprises:
determining whether to read the model candidate list or not according to whether the test request meets a preset test condition or not;
and reading the model candidate list based on the test request meeting the preset test condition.
5. The hard-coding and hard-decoding test method of claim 4, wherein the preset test condition is determined based on a preset sampling rate and a daily active user number.
6. The hard-coding and hard-decoding test method according to claim 5, wherein the determining of the preset test condition based on the preset sampling rate and the daily active user number comprises:
determining a device test number within a preset duration according to the product of the daily active user number and the preset sampling rate;
determining whether the test request is among the first N test requests received within the preset duration, wherein N is a positive integer less than or equal to the device test number;
and determining that the test request meets the preset test condition based on the test request being among the first N test requests received within the preset duration.
7. The hard-coding and hard-decoding test method according to claim 3, wherein the determining of the reference configuration information corresponding to the target model according to whether the target model of the client belongs to the model candidate list comprises:
determining a configuration rule corresponding to the target model in the model candidate list based on the target model belonging to the model candidate list, and generating the reference configuration information according to the configuration rule;
and generating the reference configuration information according to pre-stored default configuration information based on the target model not belonging to the model candidate list.
8. The hard-coding and hard-decoding test method according to claim 1, wherein the acquiring of the first list of the client comprises:
regularly sending a list acquisition request to a second server;
receiving the first list returned by the second server according to the list acquisition request; the first list is generated by the second server based on the test result of the client.
9. The hard-coding and hard-decoding test method according to claim 8, wherein the updating of the target list according to the first list comprises:
and under the condition that the first version information of the first list is higher than the second version information of the target list, updating the target list according to the first list.
10. A hard-coding and hard-decoding testing device, applied to a first server, characterized by comprising:
the first receiving module is configured to execute receiving of a first request sent by a client;
a first sending module configured to execute sending, in response to the first request, reference configuration information or a pre-stored target list to the client;
a first list acquisition module configured to perform acquiring a first list of the client; wherein the first list is generated based on a test result of the client; and the test result is generated by the client through testing based on the reference configuration information and the target list;
an update module configured to perform updating the target list according to the first list.
CN202010759387.4A 2020-07-31 2020-07-31 Hard-coding and hard-decoding test method and device Active CN112003976B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010759387.4A CN112003976B (en) 2020-07-31 2020-07-31 Hard-coding and hard-decoding test method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010759387.4A CN112003976B (en) 2020-07-31 2020-07-31 Hard-coding and hard-decoding test method and device

Publications (2)

Publication Number Publication Date
CN112003976A true CN112003976A (en) 2020-11-27
CN112003976B CN112003976B (en) 2022-04-29

Family

ID=73463554

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010759387.4A Active CN112003976B (en) 2020-07-31 2020-07-31 Hard-coding and hard-decoding test method and device

Country Status (1)

Country Link
CN (1) CN112003976B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112527616A (en) * 2020-12-14 2021-03-19 北京达佳互联信息技术有限公司 Data processing method and device
CN114363001A (en) * 2021-12-06 2022-04-15 国网安徽省电力有限公司超高压分公司 Method, system and storage medium for client access limitation based on offline configuration
CN114501017A (en) * 2022-03-04 2022-05-13 百果园技术(新加坡)有限公司 Video coding adaptation method, device, equipment and storage medium

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103188119A (en) * 2011-12-27 2013-07-03 特克特朗尼克公司 Confidence intervals for key performance indicators in communication networks
CN104980788A (en) * 2015-02-11 2015-10-14 腾讯科技(深圳)有限公司 Video decoding method and device
CN104980797A (en) * 2015-05-27 2015-10-14 腾讯科技(深圳)有限公司 Video decoding method and client
CN106331765A (en) * 2015-06-30 2017-01-11 腾讯科技(深圳)有限公司 Hardware decoding test method, terminal and server
CN106559679A (en) * 2015-09-28 2017-04-05 腾讯科技(深圳)有限公司 Method, server and mobile terminal that video is decoded
CN108134956A (en) * 2016-12-01 2018-06-08 腾讯科技(深圳)有限公司 A kind of update method, terminal and the system of hard solution adaptation white list
CN110139104A (en) * 2018-02-09 2019-08-16 腾讯科技(深圳)有限公司 Video encoding/decoding method, device, computer equipment and storage medium
CN110647460A (en) * 2019-08-05 2020-01-03 微梦创科网络科技(中国)有限公司 Test resource management method and device and test client
CN110858920A (en) * 2018-08-23 2020-03-03 武汉斗鱼网络科技有限公司 Video decoding method, mobile terminal, server, system and storage medium


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112527616A (en) * 2020-12-14 2021-03-19 北京达佳互联信息技术有限公司 Data processing method and device
CN112527616B (en) * 2020-12-14 2024-07-12 北京达佳互联信息技术有限公司 Data processing method and device
CN114363001A (en) * 2021-12-06 2022-04-15 国网安徽省电力有限公司超高压分公司 Method, system and storage medium for client access limitation based on offline configuration
CN114501017A (en) * 2022-03-04 2022-05-13 百果园技术(新加坡)有限公司 Video coding adaptation method, device, equipment and storage medium
WO2023165590A1 (en) * 2022-03-04 2023-09-07 百果园技术(新加坡)有限公司 Video coding adaptation method and apparatus, device, and storage medium

Also Published As

Publication number Publication date
CN112003976B (en) 2022-04-29

Similar Documents

Publication Publication Date Title
CN112003976B (en) Hard-coding and hard-decoding test method and device
CN109168028B (en) Video generation method, device, server and storage medium
CN110659121B (en) Task data acquisition method and device, task configuration method and device and server
CN105357475A (en) Video playing method and device
CN112689170B (en) Content playing method of display terminal, display terminal and readable storage medium
CN112311902B (en) File sending method and device based on micro-service
CN113891114B (en) Transcoding task scheduling method and device
CN111274325B (en) Platform automatic test method and system
CN111698281B (en) Resource downloading method and device, electronic equipment and storage medium
CN111258902B (en) Performance test method and performance test system based on SockJS server
CN111767558A (en) Data access monitoring method, device and system
CN111405215A (en) Video storage method and device, cloud server and storage medium
KR20160026138A (en) Rapid sync method for cloud file system and cloud file system using the same
CN111008209B (en) Data reconciliation method, device and system, storage medium and electronic device
CN110446118B (en) Video resource preprocessing method and device and video resource downloading method and device
CN112169312A (en) Queuing scheduling method, device, equipment and storage medium for cloud game service
CN110851433B (en) Key optimization method for key value storage system, storage medium, electronic device and system
CN107766212B (en) Method and device for determining installation state of application program
CN114422576B (en) Session cleaning method and device, computer equipment and readable storage medium
CN112449209B (en) Video storage method and device, cloud server and computer readable storage medium
CN110134547B (en) Middleware-based repeated data deleting method and related device
CN115002097A (en) Application image display method and device, storage medium and electronic device
CN111104381A (en) Log management method, device and equipment and computer readable storage medium
CN110113390A (en) Network request processing method, device, computer equipment and storage medium
CN117376300B (en) Message all-channel sending method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant