CN116627789B - Model detection method and device, electronic equipment and storage medium - Google Patents
- Publication number
- CN116627789B (application CN202310889945.2A)
- Authority
- CN
- China
- Prior art keywords
- model
- detected
- sub
- evaluation sample
- client
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/30—Monitoring
- G06F11/34—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
- G06F11/3466—Performance evaluation by tracing or monitoring
- G06F11/3476—Data logging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/30—Monitoring
- G06F11/3003—Monitoring arrangements specially adapted to the computing system or computing system component being monitored
- G06F11/302—Monitoring arrangements specially adapted to the computing system or computing system component being monitored where the computing system component is a software system
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/30—Monitoring
- G06F11/34—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
- G06F11/3457—Performance evaluation by simulation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2201/00—Indexing scheme relating to error detection, to error correction, and to monitoring
- G06F2201/865—Monitoring of software
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Quality & Reliability (AREA)
- General Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- Computing Systems (AREA)
- Mathematical Physics (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
One or more embodiments of the present disclosure provide a model detection method and apparatus, an electronic device, and a storage medium. The method includes: dividing a plurality of clients into a plurality of experiment sub-buckets, and issuing a corresponding model to be detected to the clients in each of the plurality of experiment sub-buckets; in response to receiving a data acquisition request sent by a client, sending service data and the bucket label corresponding to the client, so that the client collects an evaluation sample while running the model to be detected on the service data and marks the evaluation sample with the bucket label corresponding to the client, the evaluation sample including an operation instruction entered by a user for the service data; and receiving the evaluation samples marked with bucket labels sent by the plurality of clients, and determining, according to the evaluation samples corresponding to each bucket label, the detection result of the model to be detected corresponding to that bucket label.
Description
Technical Field
One or more embodiments of the present disclosure relate to the field of model detection technology, and in particular, to a method and apparatus for model detection, an electronic device, and a storage medium.
Background
With the continuous development of artificial intelligence and big data technology, automated and intelligent services are becoming increasingly common and bring users a very good experience. The quality of these services depends on the quality of device-cloud collaboration and on the quality of the machine-learning network models used. At present, a model of better quality is usually selected for service processing through model detection: a plurality of network models are prepared, their quality is detected and compared under device-cloud collaboration, and the model of better quality is selected and applied to service processing. In the related art, however, when a client collects evaluation samples for evaluating model quality, the samples are poorly targeted, which results in poor accuracy of model detection.
Disclosure of Invention
In view of this, one or more embodiments of the present disclosure provide a model detection method and apparatus, an electronic device, and a storage medium.
In order to achieve the above object, one or more embodiments of the present disclosure provide the following technical solutions:
according to a first aspect of one or more embodiments of the present specification, there is provided a model detection method, the method comprising:
dividing a plurality of clients into a plurality of experiment sub-buckets, and issuing a corresponding model to be detected to the clients in each of the plurality of experiment sub-buckets, wherein each of the plurality of experiment sub-buckets has a corresponding model to be detected;
in response to receiving a data acquisition request sent by a client, sending service data and the bucket label corresponding to the client, so that the client collects an evaluation sample while running the model to be detected on the service data and marks the evaluation sample with the bucket label corresponding to the client, wherein the bucket label corresponding to the client is the label of the experiment sub-bucket to which the client belongs, and the evaluation sample includes an operation instruction entered by a user for the service data;
and receiving the evaluation samples marked with bucket labels sent by a plurality of clients, and determining, according to the evaluation samples corresponding to each bucket label, the detection result of the model to be detected corresponding to that bucket label.
In one embodiment of the present specification, the dividing a plurality of clients into a plurality of experiment sub-buckets includes:
determining, in response to receiving a model acquisition request sent by each of the plurality of clients, the experiment sub-bucket to which each of the plurality of clients belongs.
In one embodiment of the present specification, the dividing a plurality of clients into a plurality of experiment sub-buckets includes:
dividing the plurality of clients into a plurality of experiment sub-buckets according to the number of models to be detected, wherein the difference in the number of clients between any two of the experiment sub-buckets is smaller than a preset threshold, and the number of experiment sub-buckets is the same as the number of models to be detected.
In one embodiment of the present disclosure, the issuing a corresponding model to be detected to the clients in each of the plurality of experiment sub-buckets includes:
issuing, to the clients in each of the plurality of experiment sub-buckets, the corresponding model to be detected and a script corresponding to the model to be detected.
In one embodiment of the present specification, the method further includes:
determining a target model among the plurality of models to be detected according to the detection result of each model to be detected.
In one embodiment of the present disclosure, the dividing a plurality of clients into a plurality of experiment sub-buckets and issuing a corresponding model to be detected to the clients in each of the plurality of experiment sub-buckets includes:
dividing the plurality of clients into a plurality of experiment sub-buckets and a blank sub-bucket, and issuing a corresponding model to be detected to the clients in each of the plurality of experiment sub-buckets;
the method further includes:
in response to receiving a data acquisition request sent by a client in the blank sub-bucket, sending service data to the client in the blank sub-bucket, so that the client in the blank sub-bucket collects an evaluation sample and marks the evaluation sample with a blank label;
and receiving the evaluation sample sent by the client in the blank sub-bucket, and determining a reference result according to the evaluation sample sent by the client in the blank sub-bucket.
According to a second aspect of one or more embodiments of the present specification, there is provided a model detection method, the method comprising:
receiving a model to be detected sent by a server;
sending a data acquisition request to the server, and receiving service data and a bucket label sent by the server, wherein the bucket label is the label of the experiment sub-bucket corresponding to the model to be detected;
running the model to be detected on the service data, collecting an evaluation sample, and marking the evaluation sample with the bucket label, wherein the evaluation sample includes an operation instruction entered by a user for the service data;
and sending the evaluation sample marked with the bucket label to the server, so that the server determines the detection result of the model to be detected corresponding to the bucket label according to the evaluation sample.
In one embodiment of the present disclosure, the receiving a model to be detected sent by a server includes:
sending a model acquisition request to the server, and receiving the model to be detected sent by the server.
In one embodiment of the present specification, the running the model to be detected on the service data, collecting an evaluation sample, and marking the evaluation sample with the bucket label includes:
in response to a model detection scenario being triggered, running the model to be detected on the service data, collecting an evaluation sample, and marking the evaluation sample with the bucket label, wherein the detection scenario includes the service data reaching a preset state and/or a detection instruction entered by a user being received.
According to a third aspect of one or more embodiments of the present specification, there is provided a model detection method, the method comprising:
in the case that no model to be detected sent by the server has been received, sending a data acquisition request to the server, and receiving service data sent by the server;
and collecting an evaluation sample, marking the evaluation sample with a blank label, and sending the evaluation sample to the server, so that the server determines a reference result according to the evaluation sample, wherein the evaluation sample includes an operation instruction entered by a user for the service data.
According to a fourth aspect of one or more embodiments of the present specification, there is provided a model detection apparatus, the apparatus comprising:
The model issuing module is used for dividing a plurality of clients into a plurality of experiment sub-buckets and issuing a corresponding model to be detected to the clients in each of the plurality of experiment sub-buckets, wherein each of the plurality of experiment sub-buckets has a corresponding model to be detected;
the data issuing module is used for sending, in response to receiving a data acquisition request sent by a client, service data and the bucket label corresponding to the client, so that the client collects an evaluation sample while running the model to be detected on the service data and marks the evaluation sample with the bucket label corresponding to the client, wherein the bucket label corresponding to the client is the label of the experiment sub-bucket to which the client belongs, and the evaluation sample includes an operation instruction entered by a user for the service data;
the detection module is used for receiving the evaluation samples marked with bucket labels sent by a plurality of clients and determining, according to the evaluation samples corresponding to each bucket label, the detection result of the model to be detected corresponding to that bucket label.
In one embodiment of the present disclosure, when dividing a plurality of clients into a plurality of experiment sub-buckets, the model issuing module is specifically configured to:
determine, in response to receiving a model acquisition request sent by each of the plurality of clients, the experiment sub-bucket to which each of the plurality of clients belongs.
In one embodiment of the present disclosure, when dividing a plurality of clients into a plurality of experiment sub-buckets, the model issuing module is specifically configured to:
divide the plurality of clients into a plurality of experiment sub-buckets according to the number of models to be detected, wherein the difference in the number of clients between any two of the experiment sub-buckets is smaller than a preset threshold, and the number of experiment sub-buckets is the same as the number of models to be detected.
In one embodiment of the present disclosure, when issuing a corresponding model to be detected to the clients in each of the plurality of experiment sub-buckets, the model issuing module is specifically configured to:
issue, to the clients in each of the plurality of experiment sub-buckets, the corresponding model to be detected and a script corresponding to the model to be detected.
In one embodiment of the present specification, the apparatus further includes a target module, configured to:
determine a target model among the plurality of models to be detected according to the detection result of each model to be detected.
In one embodiment of the present specification, the model issuing module is specifically configured to:
divide the plurality of clients into a plurality of experiment sub-buckets and a blank sub-bucket, and issue a corresponding model to be detected to the clients in each of the plurality of experiment sub-buckets;
the apparatus further includes a reference module, configured to:
send, in response to receiving a data acquisition request sent by a client in the blank sub-bucket, service data to the client in the blank sub-bucket, so that the client in the blank sub-bucket collects an evaluation sample and marks the evaluation sample with a blank label;
and receive the evaluation sample sent by the client in the blank sub-bucket, and determine a reference result according to the evaluation sample sent by the client in the blank sub-bucket.
According to a fifth aspect of one or more embodiments of the present specification, there is provided a model detection apparatus, the apparatus comprising:
the model receiving module is used for receiving the model to be detected sent by the server;
the first data receiving module is used for sending a data acquisition request to the server and receiving service data and a bucket label sent by the server, wherein the bucket label is the label of the experiment sub-bucket corresponding to the model to be detected;
the sample collection module is used for running the model to be detected on the service data, collecting an evaluation sample, and marking the evaluation sample with the bucket label, wherein the evaluation sample includes an operation instruction entered by a user for the service data;
and the sample uploading module is used for sending the evaluation sample marked with the bucket label to the server, so that the server determines the detection result of the model to be detected corresponding to the bucket label according to the evaluation sample.
In one embodiment of the present specification, the model receiving module is specifically configured to:
and sending a model acquisition request to the server, and receiving the model to be detected sent by the server.
In one embodiment of the present specification, the sample collection module is specifically configured to:
in response to a model detection scenario being triggered, running the model to be detected on the service data, collecting an evaluation sample, and marking the evaluation sample with the bucket label, wherein the detection scenario includes the service data reaching a preset state and/or a detection instruction entered by a user being received.
According to a sixth aspect of one or more embodiments of the present specification, there is provided a model detection apparatus, the apparatus comprising:
The second data receiving module is used for sending, in the case that no model to be detected sent by the server has been received, a data acquisition request to the server, and receiving service data sent by the server;
the sample module is used for collecting an evaluation sample, marking the evaluation sample with a blank label, and sending the evaluation sample to the server, so that the server determines a reference result according to the evaluation sample, wherein the evaluation sample includes an operation instruction entered by a user for the service data.
According to a seventh aspect of one or more embodiments of the present specification, there is provided an electronic device, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor implements the method of the first or second aspect by executing the executable instructions.
According to an eighth aspect of one or more embodiments of the present description, there is provided a computer readable storage medium having stored thereon computer instructions which, when executed by a processor, implement the steps of the method according to the first or second aspect.
The technical solutions provided by the embodiments of the present specification may have the following beneficial effects:
In the model detection method provided by the embodiments of the present specification, the server may divide a plurality of clients into a plurality of experiment sub-buckets and issue a corresponding model to be detected to the clients in each of the experiment sub-buckets; in response to receiving a data acquisition request sent by a client, the server may send service data and the bucket label corresponding to that client; the client may collect an evaluation sample while running the model to be detected on the service data and mark the evaluation sample with the bucket label corresponding to the client; the server may then receive the evaluation samples marked with bucket labels sent by the clients and determine, according to the evaluation samples corresponding to each bucket label, the detection result of the model to be detected corresponding to that bucket label. Because the server sends a client's bucket label together with the service data, and the client collects an evaluation sample and marks it with the bucket label only when the model to be detected is run on that service data, the evaluation samples are strongly targeted, that is, they are generated under the operation of the model to be detected. The server can also group the evaluation samples by bucket label and determine the detection result of each model to be detected in a targeted manner, thereby improving the accuracy of model detection.
Drawings
Fig. 1 is a flowchart of a model detection method running on a server according to an exemplary embodiment.
FIG. 2 is a flowchart of a method for model detection running on a client, in accordance with an exemplary embodiment.
FIG. 3 is a flowchart of a method for model detection running on a client, in accordance with an exemplary embodiment.
Fig. 4 is a schematic diagram of a client-server architecture and a flowchart of a model detection method obtained by combining the above embodiments, according to an exemplary embodiment.
Fig. 5 is a schematic diagram of an apparatus according to an exemplary embodiment.
Fig. 6 is a block diagram of a model detection device operating on a server according to an exemplary embodiment.
Fig. 7 is a block diagram of a model detection apparatus operating on a client provided in an exemplary embodiment.
Fig. 8 is a block diagram of a model detection apparatus operating on a client provided in an exemplary embodiment.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with one or more embodiments of the present specification. Rather, they are merely examples of apparatus and methods consistent with aspects of one or more embodiments of the present description as detailed in the accompanying claims.
It should be noted that: in other embodiments, the steps of the corresponding method are not necessarily performed in the order shown and described in this specification. In some other embodiments, the method may include more or fewer steps than described in this specification. Furthermore, individual steps described in this specification, in other embodiments, may be described as being split into multiple steps; while various steps described in this specification may be combined into a single step in other embodiments.
With the continuous development of artificial intelligence and big data technology, automated and intelligent services are becoming increasingly common and bring users a very good experience. The quality of these services depends on the quality of device-cloud collaboration and on the quality of the machine-learning network models used. At present, a model of better quality is usually selected for service processing through model detection: a plurality of network models are prepared, their quality is detected and compared under device-cloud collaboration, and the model of better quality is selected and applied to service processing. In the related art, however, when a client collects evaluation samples for evaluating model quality, the samples are poorly targeted. For example, the client collects all evaluation samples related to the service data issued to it, so the evaluation samples contain a large number of samples unrelated to the model, which reduces the purity and pertinence of the evaluation samples and results in poor accuracy of model detection.
Based on this, in a first aspect, at least one embodiment of the present disclosure provides a model detection method for detecting a plurality of models to be detected that run on clients, so as to screen out the model of the best quality for service processing. The method is completed through cooperation between the clients and the server: a plurality of clients send model acquisition requests to the server; the server may divide the plurality of clients into a plurality of experiment sub-buckets and send a corresponding model to be detected to the clients in each of the plurality of experiment sub-buckets; in response to receiving a data acquisition request sent by a client, the server may send service data and the bucket label corresponding to that client; the client may collect an evaluation sample while running the model to be detected on the service data and mark the evaluation sample with the bucket label corresponding to the client; the server may then receive the evaluation samples marked with bucket labels sent by the clients and determine, according to the evaluation samples corresponding to each bucket label, the detection result of the model to be detected corresponding to that bucket label. Because the server sends a client's bucket label together with the service data, and the client collects an evaluation sample and marks it with the bucket label only when the model to be detected is run on that service data, the evaluation samples are strongly targeted, that is, they are generated under the operation of the model to be detected. The server can also group the evaluation samples by bucket label and determine the detection result of each model to be detected in a targeted manner, thereby improving the accuracy of model detection.
The model detection method is described in detail below from the server side and from the client side, respectively.
Referring to fig. 1, a flow of a model detection method running on a server is shown in an exemplary manner, and includes steps S101 to S103.
In step S101, a plurality of clients are divided into a plurality of experiment sub-buckets, and a corresponding model to be detected is issued to the clients in each of the plurality of experiment sub-buckets, where each of the plurality of experiment sub-buckets has a corresponding model to be detected.
Optionally, in response to receiving the model acquisition request sent by each of the plurality of clients, the experiment sub-bucket to which each of the plurality of clients belongs is determined. For example, after a user starts the client on a terminal device, the client may automatically send a model acquisition request to the server. For another example, the user enters a model acquisition operation on the client, and the client sends a model acquisition request to the server.
An experiment sub-bucket is the experiment pool of its corresponding model to be detected; that is, the model to be detected is experimented on and detected within its corresponding experiment sub-bucket, for example, evaluation samples are obtained there to determine the detection result of that model. An experiment sub-bucket can be viewed as a grouping of clients created so that the multiple models to be detected can be detected separately. The number of experiment sub-buckets may be determined by the number of models to be detected; that is, this step may divide the plurality of clients into a plurality of experiment sub-buckets according to the number of models to be detected, where the difference in the number of clients between any two experiment sub-buckets is smaller than a preset threshold and the number of experiment sub-buckets is the same as the number of models to be detected. When the preset threshold is 1, this divides the plurality of clients evenly into the plurality of experiment sub-buckets (that is, into a plurality of equally sized experiment groups). A minimal sketch of such a bucket assignment is given below.
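As an illustration only, the following Python sketch shows one way a server could assign clients to roughly equal-sized experiment sub-buckets; the hash-based assignment, the function name, and the client identifiers are assumptions made for the example and are not taken from the embodiments themselves.

```python
import hashlib

def assign_bucket(client_id: str, num_models: int) -> int:
    """Map a client to one of num_models experiment sub-buckets.

    Hashing the client id gives a stable, roughly uniform split, so the
    difference in client counts between sub-buckets stays small.
    """
    digest = hashlib.md5(client_id.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_models

# Example: three models to be detected -> three sub-buckets labelled 0, 1 and 2.
bucket_label = assign_bucket("client-00042", num_models=3)
```

Under such an assignment, the bucket label can double as the index of the model to be detected that is issued to the client.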
The server may be provided with a plurality of cloud systems and an experiment distribution platform. The experiment distribution platform is used to divide the plurality of terminal devices, and the cloud systems can generate service data for the clients. Each cloud system may have experiment parameters configured in it; the experiment distribution platform can obtain these experiment parameters from the cloud systems and, according to them, build the script of each experiment sub-bucket, that is, the script corresponding to each model to be detected. On this basis, in this step the experiment distribution platform may issue the corresponding model to be detected, together with the script corresponding to that model, to the clients.
It can be understood that the plurality of clients may send model acquisition requests to the server one after another. In this step, after receiving the model acquisition request sent by a client, the server may assign that client to an experiment sub-bucket and send the model to be detected corresponding to that experiment sub-bucket, together with the script corresponding to the model, to the client.
In this way, each of the plurality of clients obtains a model to be detected, and the server records the experiment sub-bucket to which each client belongs.
In step S102, in response to receiving a data acquisition request sent by a client, service data and the bucket label corresponding to the client are sent, so that the client collects an evaluation sample while running the model to be detected on the service data and marks the evaluation sample with the bucket label corresponding to the client, where the bucket label corresponding to the client is the label of the experiment sub-bucket to which the client belongs, and the evaluation sample includes an operation instruction entered by a user for the service data.
It should be noted that this step is performed when a client that received a model to be detected in step S101 sends a data acquisition request to the server. In other words, this step is repeated for each of the different clients.
For example, a user initiating a function of the client, or a user entering a data acquisition operation for the client, may cause the client to send a data acquisition request to the server. For example, the user starts the short video browsing function of the client, and the client sends a data acquisition request to the server to acquire the short video presented to the user.
The server may generate the service data using the plurality of cloud systems, mark the service data with a bucket label according to the recorded experiment sub-bucket of the client requesting the service data, and return the service data marked with the bucket label to the client. For example, the server generates a short-video sequence composed of a plurality of short videos, marks the short-video sequence with a bucket label, and returns it to the client running the short-video browsing function. A minimal sketch of such a handler follows.
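The following Python sketch illustrates, under the same assumptions as the earlier example, how a server-side handler might attach the recorded bucket label to the service data it returns; the data structures and the detail that blank-bucket clients receive no label are assumptions made for illustration.

```python
# client_id -> bucket label, recorded by the server when the models were issued.
bucket_of_client: dict[str, int] = {}

def handle_data_request(client_id: str, generate_service_data) -> dict:
    """Return service data, plus the requesting client's bucket label if it has one."""
    service_data = generate_service_data(client_id)   # e.g. a short-video sequence
    response = {"service_data": service_data}
    label = bucket_of_client.get(client_id)
    if label is not None:                              # experiment sub-bucket client
        response["bucket_label"] = label
    return response                                    # blank-bucket clients get no label
```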
After receiving the service data, the client may display it directly, or display it after running the model to be detected. For example, the client may run the model to be detected on the service data in response to a model detection scenario being triggered, where the detection scenario includes the service data reaching a preset state and/or a detection instruction entered by a user being received. Running the model to be detected on the service data means that the model to be detected processes the service data, for example, ranks and filters the short-video sequence. In this way, the service data processed by the model to be detected reflects the processing effect of that model, and the user's operations on the processed service data reflect the user's satisfaction with that effect.
Taking the scenario in which the client runs a short-video browsing function as an example, when a short video in the short-video sequence is liked, favorited, or commented on by the user, or has been played for a certain proportion of its total duration, the detection scenario is triggered. The client then runs the model to be detected on the remaining short videos in the sequence, filters and ranks them, collects the user's operation instructions on the processed short videos, such as likes, favorites, comments, and viewing duration, as an evaluation sample, and marks the evaluation sample with the label of the experiment sub-bucket to which the client belongs.
In step S103, the evaluation samples marked with bucket labels sent by the plurality of clients are received, and the detection result of the model to be detected corresponding to each bucket label is determined according to the evaluation samples corresponding to that bucket label.
It should be noted that the plurality of clients in this step are some or all of the clients for which step S102 has been performed. The evaluation samples corresponding to a bucket label are the evaluation samples marked with that bucket label, and the model to be detected corresponding to a bucket label is the model to be detected corresponding to the experiment sub-bucket that the bucket label identifies.
In this step, for each experiment sub-bucket, the effect of the model to be detected corresponding to that experiment sub-bucket is evaluated using the evaluation samples of that experiment sub-bucket, and the outcome is taken as the detection result. For example, different operation instructions of the user in an evaluation sample may carry different scores, for example, 5 points when the user comments on a short video, 5 points when the user favorites it, 3 points when the user likes it, 2 points when the playing duration of a short video reaches 50% of its total duration, 0 points when the playing duration reaches only 10% of the total duration, and so on; the sum of the scores of the operation instructions in all evaluation samples of an experiment sub-bucket may then be used as the detection result. A minimal scoring sketch under these example scores is given below.
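The following Python sketch shows one way such per-bucket scoring could be computed, using the example score values above; the sample fields and function names are assumptions made for illustration.

```python
# Hypothetical per-operation scores taken from the example above.
SCORES = {"comment": 5, "favorite": 5, "like": 3}

def score_sample(sample: dict) -> int:
    """Score one evaluation sample from the operation instructions it contains."""
    total = sum(SCORES.get(op, 0) for op in sample.get("operations", []))
    if sample.get("play_ratio", 0.0) >= 0.5:   # played at least 50% of total duration
        total += 2                             # 10% or less contributes 0
    return total

def detection_results(samples_by_bucket: dict) -> dict:
    """Sum the sample scores per bucket label to get each model's detection result."""
    return {label: sum(score_sample(s) for s in samples)
            for label, samples in samples_by_bucket.items()}
```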
It can be understood that a target model may also be determined among the plurality of models to be detected according to the detection result of each model to be detected. The target model may be the model with the best detection result among the plurality of models to be detected, in which case it can be issued to all clients for processing the service data. Alternatively, the target model may be the model with the worst detection result, in which case it is eliminated and model detection continues on the remaining models to be detected until the best model is finally selected and issued to all clients for processing the service data.
In the model detection method provided by the embodiments of the present specification, the server may divide a plurality of clients into a plurality of experiment sub-buckets and issue a corresponding model to be detected to the clients in each of the experiment sub-buckets; in response to receiving a data acquisition request sent by a client, the server may send service data and the bucket label corresponding to that client; the client may collect an evaluation sample while running the model to be detected on the service data and mark the evaluation sample with the bucket label corresponding to the client; the server may then receive the evaluation samples marked with bucket labels sent by the clients and determine, according to the evaluation samples corresponding to each bucket label, the detection result of the model to be detected corresponding to that bucket label. Because the server sends a client's bucket label together with the service data, and the client collects an evaluation sample and marks it with the bucket label only when the model to be detected is run on that service data, the evaluation samples are strongly targeted, that is, they are generated under the operation of the model to be detected. The server can also group the evaluation samples by bucket label and determine the detection result of each model to be detected in a targeted manner, thereby improving the accuracy of model detection.
In some embodiments of the present disclosure, step S101 may further include: dividing the plurality of clients into a plurality of experiment sub-buckets and a blank sub-bucket, and issuing a corresponding model to be detected to the clients in each of the plurality of experiment sub-buckets. That is, when the experiment groups are divided, not only is an experiment group created for each model to be detected, but a blank group is also created to serve as an experimental reference.
Based on this division, the method may, in response to receiving a data acquisition request sent by a client in an experiment sub-bucket, send service data and the bucket label corresponding to that client, so that the client in the experiment sub-bucket collects an evaluation sample while running the model to be detected on the service data and marks the evaluation sample with the bucket label corresponding to the client, where the bucket label corresponding to the client is the label of the experiment sub-bucket to which the client belongs, and the evaluation sample includes an operation instruction entered by a user for the service data; and, in response to receiving a data acquisition request sent by a client in the blank sub-bucket, the method may send service data to the client in the blank sub-bucket, so that the client in the blank sub-bucket collects an evaluation sample and marks the evaluation sample with a blank label.
The operations described above for the clients in the experiment sub-buckets are the same as step S102 and may be performed with reference to it. The operations for the clients in the blank sub-bucket differ from step S102 in that the server does not need to attach a bucket label when issuing the service data, the client collects evaluation samples directly rather than only when a detection scenario is triggered, and the client marks the evaluation samples with a blank label.
On this basis, the method may receive the evaluation samples marked with bucket labels sent by the plurality of clients and determine, according to the evaluation samples corresponding to each bucket label, the detection result of the model to be detected corresponding to that bucket label; and the method may receive the evaluation samples sent by the clients in the blank sub-bucket and determine a reference result according to those evaluation samples.
If the server selects the model with the best effect among the plurality of models to be detected according to the detection result of each model to be detected, that model is issued to all clients for processing the service data only when its detection result is better than the reference result.
In this embodiment, by adding the blank experiment group, the model detection result becomes more accurate, and a model is issued only after its effect is verified to be better than that of the blank experiment group, which prevents the model from degrading the user experience. A minimal sketch of this comparison follows.
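As a purely illustrative continuation of the scoring sketch above, the comparison between the best model's detection result and the blank-bucket reference result could look like this; the function and its return convention are assumptions.

```python
from typing import Optional

def pick_target_model(results: dict, reference: float) -> Optional[int]:
    """Return the bucket label of the best model only if it beats the
    blank-bucket reference result; otherwise return None and keep testing."""
    best_label = max(results, key=results.get)
    return best_label if results[best_label] > reference else None
```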
Referring to fig. 2, a flow of a model detection method running on a client is shown in an exemplary manner, and includes steps S201 to S204. It will be appreciated that the clients performing the method in this embodiment are clients within experimental sub-buckets, i.e. clients within non-blank sub-buckets.
In step S201, a model to be detected sent by a server is received.
The client sends a model acquisition request to the server and receives a model to be detected sent by the server. Optionally, a script sent by the server may also be received.
In step S202, a data acquisition request is sent to the server, and the service data and a bucket label sent by the server are received, where the bucket label is the label of the experiment sub-bucket corresponding to the model to be detected.
In step S203, the model to be detected is run on the service data, an evaluation sample is collected, and the evaluation sample is marked with the bucket label, where the evaluation sample includes an operation instruction entered by a user for the service data.
The client may include an on-device intelligence component, a service SDK, and a tracking point (buried point), where the on-device intelligence component is used to run the scripts and models, and the service SDK is used to interact with the user and display the service data. The service SDK can trigger the on-device intelligence component to run the script and the model to be detected on the service data; the model to be detected can process the service data and return the processing result, together with the bucket label carried by the service data, to the service SDK; the service SDK can receive the user's operation instructions for the processed service data; and the tracking point can collect the user's operation instructions as an evaluation sample, mark the evaluation sample with the bucket label, and send it to the sample database of the server. A minimal sketch of this client-side flow is given below.
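The following Python sketch outlines this client-side flow under the same assumptions as the earlier examples; the callback names and the upload endpoint are placeholders, not part of the embodiments.

```python
def on_detection_scenario_triggered(service_data, bucket_label,
                                    run_model, collect_ops, upload) -> None:
    """Run the model on the service data, gather the user's operation
    instructions on the processed data, then label and upload the sample."""
    processed = run_model(service_data)      # e.g. rank / filter the remaining short videos
    operations = collect_ops(processed)      # likes, favorites, comments, play ratio ...
    sample = {"operations": operations, "bucket_label": bucket_label}
    upload("/evaluation-samples", sample)    # send to the server's sample database
```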
For example, in response to a model detection scenario being triggered, the model to be detected is run on the service data, an evaluation sample is collected, and the evaluation sample is marked with the bucket label, where the detection scenario includes the service data reaching a preset state and/or a detection instruction entered by a user being received.
Taking the scenario in which the client runs a short-video browsing function as an example, when a short video in the short-video sequence is liked, favorited, or commented on by the user, or has been played for a certain proportion of its total duration, the detection scenario is triggered. The client then runs the model to be detected on the remaining short videos in the sequence, filters and ranks them, collects the user's operation instructions on the processed short videos, such as likes, favorites, comments, and viewing duration, as an evaluation sample, and marks the evaluation sample with the label of the experiment sub-bucket to which the client belongs.
In step S204, the evaluation sample marked with the bucket label is sent to the server, so that the server determines the detection result of the model to be detected corresponding to the bucket label according to the evaluation sample.
It can be understood that the server receives the evaluation samples marked with bucket labels sent by the plurality of clients and determines, according to the evaluation samples corresponding to each bucket label, the detection result of the model to be detected corresponding to that bucket label.
Further details of the above steps have been described in the method at the server side, and are not repeated here.
Referring to fig. 3, a flow of a model detection method running on a client is shown in an exemplary manner, and includes steps S301 to S303. It will be appreciated that the clients performing the method in this embodiment are clients within a blank sub-bucket, i.e. clients within a non-experimental sub-bucket.
In step S301, in the case that the model to be detected sent by the server is not received, a data acquisition request is sent to the server, and service data sent by the server is received.
In step S302, an evaluation sample is collected, the evaluation sample is marked with a blank label, and the evaluation sample is sent to the server, so that the server determines a reference result according to the evaluation sample, where the evaluation sample includes an operation instruction entered by a user for the service data.
Further details of the above steps have been described in the method at the server side, and are not repeated here.
Referring to fig. 4, a flowchart of a client-server architecture and a model detection method obtained by combining the above embodiments are shown, where the method includes the following steps:
s1: and obtaining experimental parameters.
S2: the model and script are obtained.
S3: and acquiring service data and a barrel mark.
S4: the user triggers the detection scenario.
S5: and (5) running the model.
S6: the user inputs an operation instruction.
S7: the evaluation sample (with the binning mark) is returned.
S8: and determining the detection result of the model according to the evaluation sample.
Fig. 5 is a schematic block diagram of an apparatus according to an exemplary embodiment. Referring to fig. 5, at the hardware level the device includes a processor 502, an internal bus 504, a network interface 506, a memory 508, and a non-volatile storage 510, and of course may also include hardware required by other services. One or more embodiments of the present specification may be implemented in software, for example by the processor 502 reading a corresponding computer program from the non-volatile storage 510 into the memory 508 and then running it. Of course, in addition to a software implementation, one or more embodiments of the present specification do not exclude other implementations, such as a logic device or a combination of software and hardware; that is, the execution subject of the following processing flow is not limited to logic units and may also be hardware or a logic device.
Referring to fig. 6, the model detection apparatus may be applied to the device shown in fig. 5 to implement the technical solution of the present specification. The device comprises:
the model issuing module 601 is configured to divide a plurality of clients into a plurality of experiment sub-buckets, and issue a corresponding model to be detected to the clients in each of the plurality of experiment sub-buckets, where each of the plurality of experiment sub-buckets has a corresponding model to be detected;
the data issuing module 602 is configured to send, in response to receiving a data acquisition request sent by a client, service data and the bucket label corresponding to the client, so that the client collects an evaluation sample while running the model to be detected on the service data and marks the evaluation sample with the bucket label corresponding to the client, where the bucket label corresponding to the client is the label of the experiment sub-bucket to which the client belongs, and the evaluation sample includes an operation instruction entered by a user for the service data;
the detection module 603 is configured to receive the evaluation samples marked with bucket labels sent by a plurality of clients and determine, according to the evaluation samples corresponding to each bucket label, the detection result of the model to be detected corresponding to that bucket label.
In one embodiment of the present disclosure, when dividing a plurality of clients into a plurality of experiment sub-buckets, the model issuing module is specifically configured to:
determine, in response to receiving a model acquisition request sent by each of the plurality of clients, the experiment sub-bucket to which each of the plurality of clients belongs.
In one embodiment of the present disclosure, when dividing a plurality of clients into a plurality of experiment sub-buckets, the model issuing module is specifically configured to:
divide the plurality of clients into a plurality of experiment sub-buckets according to the number of models to be detected, where the difference in the number of clients between any two of the experiment sub-buckets is smaller than a preset threshold, and the number of experiment sub-buckets is the same as the number of models to be detected.
In one embodiment of the present disclosure, when issuing a corresponding model to be detected to the clients in each of the plurality of experiment sub-buckets, the model issuing module is specifically configured to:
issue, to the clients in each of the plurality of experiment sub-buckets, the corresponding model to be detected and a script corresponding to the model to be detected.
In one embodiment of the present specification, the apparatus further includes a target module, configured to:
determine a target model among the plurality of models to be detected according to the detection result of each model to be detected.
In one embodiment of the present specification, the model issuing module is specifically configured to:
divide the plurality of clients into a plurality of experiment sub-buckets and a blank sub-bucket, and issue a corresponding model to be detected to the clients in each of the plurality of experiment sub-buckets;
the apparatus further includes a reference module, configured to:
send, in response to receiving a data acquisition request sent by a client in the blank sub-bucket, service data to the client in the blank sub-bucket, so that the client in the blank sub-bucket collects an evaluation sample and marks the evaluation sample with a blank label;
and receive the evaluation sample sent by the client in the blank sub-bucket, and determine a reference result according to the evaluation sample sent by the client in the blank sub-bucket.
Referring to fig. 7, the model detection apparatus may be applied to the device shown in fig. 5 to implement the technical solution of the present specification. The device comprises:
the model receiving module 701 is configured to receive a model to be detected sent by a server;
the first data receiving module 702 is configured to send a data acquisition request to the server and receive service data and a bucket label sent by the server, where the bucket label is the label of the experiment sub-bucket corresponding to the model to be detected;
the sample collection module 703 is configured to run the model to be detected on the service data, collect an evaluation sample, and mark the evaluation sample with the bucket label, where the evaluation sample includes an operation instruction entered by a user for the service data;
and the sample uploading module 704 is configured to send the evaluation sample marked with the bucket label to the server, so that the server determines the detection result of the model to be detected corresponding to the bucket label according to the evaluation sample.
In one embodiment of the present specification, the model receiving module is specifically configured to:
and sending a model acquisition request to the server, and receiving the model to be detected sent by the server.
In one embodiment of the present specification, the sample collection module is specifically configured to:
in response to a model detection scenario being triggered, running the model to be detected on the service data, collecting an evaluation sample, and marking the evaluation sample with the bucket label, wherein the detection scenario includes the service data reaching a preset state and/or a detection instruction entered by a user being received.
Referring to fig. 8, the model detection apparatus may be applied to the device shown in fig. 5 to implement the technical solution of the present specification. The device comprises:
The second data receiving module 801 is configured to send, in the case that no model to be detected sent by the server has been received, a data acquisition request to the server, and to receive service data sent by the server;
the sample module 802 is configured to collect an evaluation sample, mark the evaluation sample with a blank label, and send the evaluation sample to the server, so that the server determines a reference result according to the evaluation sample, where the evaluation sample includes an operation instruction entered by a user for the service data.
The system, apparatus, module or unit set forth in the above embodiments may be implemented in particular by a computer chip or entity, or by a product having a certain function. A typical implementation device is a computer, which may be in the form of a personal computer, laptop computer, cellular telephone, camera phone, smart phone, personal digital assistant, media player, navigation device, email device, game console, tablet computer, wearable device, or a combination of any of these devices.
In a typical configuration, a computer includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic disk storage, quantum memory, graphene-based storage media or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a …" does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element.
The foregoing describes specific embodiments of the present disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
The terminology used in the one or more embodiments of the specification is for the purpose of describing particular embodiments only and is not intended to be limiting of the one or more embodiments of the specification. As used in this specification, one or more embodiments and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any or all possible combinations of one or more of the associated listed items.
The user information (including but not limited to user equipment information, user personal information, etc.) and data (including but not limited to data for analysis, stored data, presented data, etc.) involved in the present application are information and data authorized by the user or fully authorized by all parties; the collection, use, and processing of the related data comply with the relevant laws, regulations, and standards of the relevant countries and regions, and corresponding operation entries are provided for the user to choose whether to authorize or refuse.
It should be understood that although the terms first, second, third, etc. may be used in one or more embodiments of the present description to describe various information, the information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of one or more embodiments of the present description. Depending on the context, the word "if" as used herein may be interpreted as "when", "upon", or "in response to determining".
The foregoing describes preferred embodiments of the present invention by way of illustration only and is not intended to limit the invention to the particular embodiments described.
Claims (15)
1. A method of model detection, the method comprising:
dividing a plurality of clients into a plurality of experimental sub-buckets, and issuing a corresponding model to be detected to the clients in each of the plurality of experimental sub-buckets, wherein each of the plurality of experimental sub-buckets is provided with a corresponding model to be detected;
in response to receiving a data acquisition request sent by a client, sending service data and a sub-bucket mark corresponding to the client, so that the client collects an evaluation sample while running the model to be detected on the service data and marks the evaluation sample with the sub-bucket mark corresponding to the client, wherein the sub-bucket mark corresponding to the client is a mark of the experimental sub-bucket to which the client belongs, and the evaluation sample comprises an operation instruction of a user on the service data;
and receiving evaluation samples marked with the sub-bucket marks sent by a plurality of clients, and determining, according to the evaluation samples corresponding to each sub-bucket mark, the detection result of the model to be detected corresponding to that sub-bucket mark.
2. The model detection method of claim 1, wherein the dividing of the plurality of clients into a plurality of experimental sub-buckets comprises:
determining the experimental sub-bucket to which each of the plurality of clients belongs in response to receiving a model acquisition request sent by each of the plurality of clients.
3. The model detection method of claim 1, wherein the dividing of the plurality of clients into a plurality of experimental sub-buckets comprises:
dividing the plurality of clients into the plurality of experimental sub-buckets according to the number of models to be detected, wherein the difference in the number of clients between different experimental sub-buckets is smaller than a preset threshold, and the number of experimental sub-buckets is the same as the number of models to be detected.
4. The model detection method of claim 1, wherein the issuing of the corresponding model to be detected to the clients in each of the plurality of experimental sub-buckets comprises:
issuing, to the clients in each of the plurality of experimental sub-buckets, the corresponding model to be detected and a script corresponding to the model to be detected.
5. The model detection method of claim 1, wherein the method further comprises:
determining a target model among the plurality of models to be detected according to the detection result of each model to be detected.
6. The model detection method of claim 1, wherein the dividing of the plurality of clients into a plurality of experimental sub-buckets comprises:
dividing a plurality of clients among all clients into the plurality of experimental sub-buckets, wherein the plurality of clients are a subset of all the clients;
and the method further comprises:
determining the clients other than the plurality of clients as a blank sub-bucket, wherein the blank sub-bucket serves as a reference experiment group;
in response to receiving a data acquisition request sent by a client in the blank sub-bucket, sending service data to the client in the blank sub-bucket, so that the client in the blank sub-bucket collects an evaluation sample and marks the evaluation sample with a blank mark;
and receiving the evaluation sample sent by the client in the blank sub-bucket, and determining a reference result according to the evaluation sample sent by the client in the blank sub-bucket.
7. A method of model detection, the method comprising:
receiving a model to be detected sent by a server;
sending a data acquisition request to the server, and receiving service data and a sub-bucket mark sent by the server, wherein the sub-bucket mark is a mark of the experimental sub-bucket corresponding to the model to be detected;
running the model to be detected on the service data, collecting an evaluation sample, and marking the evaluation sample with the sub-bucket mark, wherein the evaluation sample comprises an operation instruction of a user on the service data;
and sending the evaluation sample marked with the sub-bucket mark to the server, so that the server determines, according to the evaluation sample, the detection result of the model to be detected corresponding to the sub-bucket mark.
8. The model detection method of claim 7, wherein the receiving of the model to be detected sent by the server comprises:
sending a model acquisition request to the server, and receiving the model to be detected sent by the server.
9. The model detection method of claim 7, wherein the running of the model to be detected on the service data, the collecting of the evaluation sample, and the marking of the evaluation sample with the sub-bucket mark comprise:
in response to a model detection scene being triggered, running the model to be detected on the service data, collecting the evaluation sample, and marking the evaluation sample with the sub-bucket mark, wherein the detection scene comprises the service data reaching a preset state and/or a detection instruction input by a user being received.
10. A method of model detection, the method comprising:
in a case where no model to be detected sent by a server has been received, sending a data acquisition request to the server, and receiving service data sent by the server;
and collecting an evaluation sample, marking the evaluation sample with a blank mark, and sending the evaluation sample to the server, so that the server determines a reference result according to the evaluation sample, wherein the evaluation sample comprises an operation instruction of a user on the service data.
11. A model detection apparatus, the apparatus comprising:
a model issuing module, configured to divide a plurality of clients into a plurality of experimental sub-buckets and issue a corresponding model to be detected to the clients in each of the plurality of experimental sub-buckets, wherein each of the plurality of experimental sub-buckets is provided with a corresponding model to be detected;
a data issuing module, configured to send, in response to receiving a data acquisition request sent by a client, service data and a sub-bucket mark corresponding to the client, so that the client collects an evaluation sample while running the model to be detected on the service data and marks the evaluation sample with the sub-bucket mark corresponding to the client, wherein the sub-bucket mark corresponding to the client is a mark of the experimental sub-bucket to which the client belongs, and the evaluation sample comprises an operation instruction of a user on the service data;
and a detection module, configured to receive evaluation samples marked with the sub-bucket marks sent by a plurality of clients, and determine, according to the evaluation samples corresponding to each sub-bucket mark, the detection result of the model to be detected corresponding to that sub-bucket mark.
12. A model detection apparatus, the apparatus comprising:
a model receiving module, configured to receive a model to be detected sent by a server;
a first data receiving module, configured to send a data acquisition request to the server and receive service data and a sub-bucket mark sent by the server, wherein the sub-bucket mark is a mark of the experimental sub-bucket corresponding to the model to be detected;
a sample collection module, configured to run the model to be detected on the service data, collect an evaluation sample, and mark the evaluation sample with the sub-bucket mark, wherein the evaluation sample comprises an operation instruction of a user on the service data;
and a sample uploading module, configured to send the evaluation sample marked with the sub-bucket mark to the server, so that the server determines, according to the evaluation sample, the detection result of the model to be detected corresponding to the sub-bucket mark.
13. A model detection apparatus, the apparatus comprising:
a second data receiving module, configured to send a data acquisition request to a server and receive service data sent by the server, in a case where no model to be detected sent by the server has been received;
and a sample module, configured to collect an evaluation sample, mark the evaluation sample with a blank mark, and send the evaluation sample to the server, so that the server determines a reference result according to the evaluation sample, wherein the evaluation sample comprises an operation instruction of a user on the service data.
14. An electronic device, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to implement the method of any of claims 1-10 by executing the executable instructions.
15. A computer readable storage medium having stored thereon computer instructions which, when executed by a processor, implement the steps of the method of any of claims 1-10.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310889945.2A CN116627789B (en) | 2023-07-19 | 2023-07-19 | Model detection method and device, electronic equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116627789A CN116627789A (en) | 2023-08-22 |
CN116627789B true CN116627789B (en) | 2023-11-03 |
Family
ID=87621582
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310889945.2A Active CN116627789B (en) | 2023-07-19 | 2023-07-19 | Model detection method and device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116627789B (en) |
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106663224A (en) * | 2014-06-30 | 2017-05-10 | 亚马逊科技公司 | Interactive interfaces for machine learning model evaluations |
US10482376B1 (en) * | 2018-09-13 | 2019-11-19 | Sas Institute Inc. | User interface for assessment of classification model quality and selection of classification model cut-off score |
CN111555940A (en) * | 2020-04-28 | 2020-08-18 | 北京字节跳动网络技术有限公司 | Client test method and device, electronic equipment and computer readable storage medium |
CN113297277A (en) * | 2021-06-18 | 2021-08-24 | 北京有竹居网络技术有限公司 | Test statistic determination method, device, readable medium and electronic equipment |
CN113642662A (en) * | 2021-08-24 | 2021-11-12 | 凌云光技术股份有限公司 | Lightweight classification model-based classification detection method and device |
CN115933554A (en) * | 2022-11-25 | 2023-04-07 | 百度(中国)有限公司 | Path planning method, path planning model and training method for selection model |
CN116383533A (en) * | 2023-04-04 | 2023-07-04 | 拉扎斯网络科技(上海)有限公司 | AB experiment shunting processing method, device, medium and equipment |
Non-Patent Citations (2)
Title |
---|
Model-Checking Access Control Policies; Dimitar P. Guelev et al.; Information Security, 7th International Conference; 1-16 *
Design and Implementation of a Real-Time Smart-TV Video Recommendation System Based on Device-Cloud Collaborative Computing; Zhang Xudong; China Masters' Theses Full-text Database (Electronic Journal); Vol. 2023, No. 02; full text *
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||