CN108734293A - Task management system, method and apparatus - Google Patents
- Publication number: CN108734293A
- Application number: CN201710239897.7A
- Authority: CN (China)
- Prior art keywords: server, training, task, model, training task
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
This application discloses a task management system, method, and apparatus. One embodiment of the system includes: a training task management server that receives to-be-trained model information and sends it to a training task storage server; a training task storage server that selects a training task distribution server from a training task scheduling server cluster; a training task distribution server that selects a training task execution server from the cluster; a training task execution server that obtains sample data corresponding to the to-be-trained model information, trains a prediction model on the sample data using a deep learning algorithm, and sends the model to a model data storage server; and a prediction task execution server that obtains the prediction model, imports to-be-predicted task information into it, and obtains a prediction result corresponding to that task information. This embodiment improves the efficiency of task management.
Description
Technical field
This application relates to the field of computer technology, specifically to Internet technology, and in particular to a task management system, method, and apparatus.
Background art
Artificial intelligence (AI) is a new technological science that studies and develops theories, methods, techniques, and application systems for simulating, extending, and expanding human intelligence. As a branch of computer science, it attempts to understand the essence of intelligence and to produce a new kind of intelligent machine that can respond in a manner similar to human intelligence.
Deep learning in particular is at the forefront of current AI research. Because deep learning can combine low-level features into more abstract high-level representations of attribute categories or features, it is now widely used in machine translation, semantic mining, image recognition, face recognition, speech recognition, and other fields.
In existing task management approaches, model training is typically triggered manually, and the trained model is uploaded manually before prediction can take place. Because the whole process requires human involvement, task management is inefficient.
Summary of the invention
The purpose of this application is to propose an improved task management system, method, and apparatus that solve the technical problems mentioned in the background section above.
In a first aspect, an embodiment of the present application provides a task management system that includes: a training task management server, a training task storage server, a training task scheduling server cluster, a model data storage server, and a prediction task execution server. The training task management server receives to-be-trained model information sent by a first client and forwards it to the training task storage server, where the to-be-trained model information characterizes the function of the model to be trained. The training task storage server selects a training task scheduling server from the cluster to act as the training task distribution server. The training task distribution server selects a training task scheduling server from the cluster to act as the training task execution server. The training task execution server obtains the to-be-trained model information from the training task storage server, obtains sample data corresponding to that information from the model data storage server, trains a prediction model on the sample data using a deep learning algorithm, and sends the model to the model data storage server. The prediction task execution server obtains the prediction model from the model data storage server, imports the to-be-predicted task information received from a second client into the prediction model, and obtains a prediction result corresponding to that task information.
In some embodiments, each training task scheduling server in the cluster additionally obtains a mark from a mark set held in the training task storage server, where the mark set includes general marks and a target mark; the training task storage server then determines the scheduling server that received the target mark to be the training task distribution server.
In some embodiments, the training task storage server instead randomly selects a training task scheduling server from the cluster to act as the training task distribution server.
In some embodiments, the training task distribution server obtains a performance indicator for each training task scheduling server in the cluster and, based on these performance indicators, selects the training task execution server from the cluster.
In some embodiments, the training task distribution server instead randomly selects a training task scheduling server from the cluster to act as the training task execution server.
In some embodiments, the to-be-trained model information includes the model's category; the training task execution server selects, from a preset set of deep learning algorithms, the algorithm that matches the category, and trains the prediction model on the sample data using that algorithm.
In some embodiments, the training task management server additionally determines, periodically, whether the model on the training task execution server has finished training and, in response to determining that it has, sends a training-completed notification to the first client.
In some embodiments, the system further includes a model version number storage server. After obtaining the prediction model, the training task execution server sends the model's current version number to the model version number storage server; upon receiving it, the version number storage server updates the stored version number to the current one.
In some embodiments, the prediction task execution server additionally obtains, periodically, the version number stored in the model version number storage server, determines from it whether the prediction model has been updated, and, in response to determining that it has, obtains the updated prediction model from the model data storage server.
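The version-polling behavior described above can be sketched as follows. This is an illustrative assumption, not the patent's implementation: the names `VersionStore`, `PredictionServer`, and the loader callable are all hypothetical stand-ins for the servers the text describes.

```python
class VersionStore:
    """Stand-in for the model version number storage server."""
    def __init__(self, version=0):
        self.version = version

    def get_version(self):
        return self.version


class PredictionServer:
    """Stand-in for the prediction task execution server's polling loop body."""
    def __init__(self, store, loader):
        self.store = store
        self.loader = loader          # callable: version -> model object
        self.known_version = None
        self.model = None

    def poll_once(self):
        """Check the stored version; reload the model only if it changed."""
        current = self.store.get_version()
        if current != self.known_version:
            self.model = self.loader(current)
            self.known_version = current
            return True               # model was (re)loaded
        return False                  # version unchanged, nothing to do
```

A real deployment would call `poll_once` on a timer; only a version change triggers a fetch from the model data storage server, which keeps polling cheap.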
In a second aspect, an embodiment of the present application provides a task management method that includes: obtaining to-be-trained model information from a training task storage server, where the information characterizes the function of the model to be trained; obtaining sample data corresponding to that information from a model data storage server; training a prediction model on the sample data using a deep learning algorithm; and sending the prediction model to the model data storage server so that a prediction task execution server can obtain it, import the to-be-predicted task information received from a second client into the prediction model, and obtain a prediction result corresponding to that task information.
In a third aspect, an embodiment of the present application provides a task management apparatus that includes: a model information obtaining unit configured to obtain to-be-trained model information from a training task storage server, where the information characterizes the function of the model to be trained; a sample data obtaining unit configured to obtain the corresponding sample data from a model data storage server; a model training unit configured to train a prediction model on the sample data using a deep learning algorithm; and a model sending unit configured to send the prediction model to the model data storage server so that a prediction task execution server can obtain it, import the to-be-predicted task information received from a client into the prediction model, and obtain a prediction result corresponding to that task information.
In a fourth aspect, an embodiment of the present application provides an electronic device that includes: one or more processors; and a storage device storing one or more programs which, when executed by the one or more processors, cause the processors to implement the method described in the second aspect.
In a fifth aspect, an embodiment of the present application provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the program implements the method described in the second aspect.
In the task management system, method, and apparatus provided by the embodiments of the present application, the training task management server first receives the to-be-trained model information sent by the first client and forwards it to the training task storage server. The training task storage server then selects a training task distribution server from the training task scheduling server cluster, and the distribution server in turn selects a training task execution server from the cluster. The execution server obtains the to-be-trained model information from the storage server and the corresponding sample data from the model data storage server, trains a prediction model on the sample data using a deep learning algorithm, and sends the model to the model data storage server. Finally, the prediction task execution server obtains the prediction model from the model data storage server, imports the to-be-predicted task information received from the second client, and obtains a corresponding prediction result. A user only needs to send the to-be-trained model information through the first client and the to-be-predicted task information through the second client; the task management system automatically loads the model information, trains the model, and uploads the trained model for prediction, thereby improving the efficiency of task management.
Description of the drawings
Other features, objects, and advantages of the present application will become more apparent upon reading the following detailed description of non-restrictive embodiments with reference to the attached drawings:
Fig. 1 is an exemplary system architecture diagram of the task management system according to an embodiment of the present application;
Fig. 2 is a schematic diagram of one embodiment of the data exchange process of the task management system of the present application;
Fig. 3 is a flowchart of one embodiment of the task management method of the present application;
Fig. 4 is a structural schematic diagram of one embodiment of the task management apparatus of the present application;
Fig. 5 is a structural schematic diagram of a computer system suitable for implementing the electronic device of an embodiment of the present application.
Detailed description of embodiments
The present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here serve only to explain the relevant invention and do not limit it. It should also be noted that, for convenience of description, the drawings show only the parts relevant to the invention.
It should be noted that, in the absence of conflict, the embodiments of the present application and the features in those embodiments may be combined with one another. The application is described in detail below with reference to the drawings and in conjunction with the embodiments.
Fig. 1 shows an exemplary system architecture 100 for the task management system according to an embodiment of the present application.
As shown in Fig. 1, the system architecture 100 may include a first client 1011, a second client 1012, networks 1021, 1022, 1023, 1024, 1025, and 1026, a training task management server 103, a training task storage server 104, a training task scheduling server cluster 105, a model data storage server 106, and a prediction task execution server 107. The cluster 105 may include training task scheduling servers 1051, 1052, 1053, and 1054. Network 1021 provides the communication link between the first client 1011 and the training task management server 103; network 1022 provides the communication links among the management server 103, the storage server 104, and the cluster 105; network 1023 provides the communication links among the scheduling servers 1051, 1052, 1053, and 1054 within the cluster; network 1024 provides the communication link between the cluster 105 and the model data storage server 106; network 1025 provides the communication link between the model data storage server 106 and the prediction task execution server 107; and network 1026 provides the communication link between the prediction task execution server 107 and the second client 1012. The networks 1021-1026 may use various connection types, such as wired or wireless communication links or fiber-optic cables.
The first client 1011 and the second client 1012 can be various terminal devices, including but not limited to smartphones, tablet computers, laptop computers, and desktop computers.
The training task management server 103, training task storage server 104, training task scheduling server cluster 105, model data storage server 106, and prediction task execution server 107 each provide a different service. For example, the management server 103 can receive to-be-trained model information from the first client 1011 and send it to the storage server 104. The storage server 104 can select a training task distribution server (e.g., scheduling server 1051) from the cluster 105, and the distribution server can in turn select a training task execution server (e.g., scheduling server 1052) from the cluster. The execution server first obtains the to-be-trained model information from the storage server 104; it then obtains the corresponding sample data from the model data storage server 106 and trains a prediction model using a deep learning algorithm; finally, it sends the prediction model to the model data storage server 106. The prediction task execution server 107 can obtain the prediction model from the model data storage server 106 and use it to predict on the to-be-predicted task information received from the second client 1012, obtaining a corresponding prediction result.
It should be noted that the task management method provided by the embodiments of the present application is generally executed by the training task execution server; correspondingly, the task management apparatus is generally located in the training task execution server.
It should be understood that the numbers of first clients, second clients, networks, training task management servers, training task storage servers, model data storage servers, prediction task execution servers, training task scheduling server clusters, and training task scheduling servers shown in Fig. 1 are merely illustrative; any number of each may be present as required. It should also be noted that the first client 1011 and the second client 1012 may be the same client or different clients; the application places no restriction on this.
Continuing with Fig. 2, it shows a schematic diagram of one embodiment of the data exchange process of the task management system of the present application.
The task management system in this embodiment may include: a training task management server, a training task storage server, a training task scheduling server cluster, a model data storage server, and a prediction task execution server. The training task management server receives to-be-trained model information sent by a first client and forwards it to the training task storage server, where the information characterizes the function of the model to be trained. The training task storage server selects a training task scheduling server from the cluster to act as the training task distribution server, and the distribution server selects a scheduling server from the cluster to act as the training task execution server. The execution server obtains the to-be-trained model information from the storage server and the corresponding sample data from the model data storage server, trains a prediction model on the sample data using a deep learning algorithm, and sends the model to the model data storage server. The prediction task execution server obtains the prediction model from the model data storage server, imports the to-be-predicted task information received from a second client into the prediction model, and obtains a corresponding prediction result.
As shown in Fig. 2, in step 201, the training task management server receives the to-be-trained model information sent by the first client.
In this embodiment, a user can send to-be-trained model information to the training task management server (e.g., server 103 in Fig. 1) through the first client (e.g., client 1011 in Fig. 1). The to-be-trained model information characterizes the function of the model to be trained and may include, but is not limited to, one or more of: the model's category, the model's name, the algorithm the model uses, and the correspondence between the model's inputs and outputs.
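As an illustration only, the to-be-trained model information might be serialized as a record like the following. The field names and values are assumptions modeled on the categories the text lists (category, name, algorithm, input/output correspondence), not a format defined by the patent.

```python
# Hypothetical example of to-be-trained model information as a record.
to_be_trained_model_info = {
    "name": "news_classifier",            # model name
    "category": "text_classification",    # model category
    "algorithm": "cnn",                   # algorithm the model uses
    "io_spec": {                          # input/output correspondence
        "input": "news_text",
        "output": "topic_label",
    },
}

def describe(info):
    """Render a one-line summary of a to-be-trained model record."""
    return "{name} ({category}, algorithm={algorithm})".format(**info)
```

Such a record is what the first client would send in step 201 and what the storage server would hold for the execution server to pick up later.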
In step 202, the training task management server sends the to-be-trained model information to the training task storage server.
In this embodiment, based on the information received in step 201, the management server can send the to-be-trained model information to the training task storage server (e.g., server 104 in Fig. 1), which stores it.
In step 203, the training task storage server determines a training task scheduling server from the cluster to act as the training task distribution server.
In this embodiment, the storage server can select the distribution server from the training task scheduling server cluster (e.g., cluster 105 in Fig. 1), for example by designating the scheduling server 1051 shown in Fig. 1 as the distribution server.
In some optional implementations of this embodiment, each training task scheduling server in the cluster first obtains a mark from a mark set held in the training task storage server; the storage server then determines the scheduling server that received the target mark to be the training task distribution server. The mark set includes general marks and a target mark: for example, the target mark's value can be any number (i.e., it is non-empty), while a general mark's value is empty. Here, the storage server randomly distributes the marks in the set among the scheduling servers in the cluster, and the scheduling server that draws the target mark becomes the distribution server.
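The random mark-distribution selection can be sketched as below. This is a minimal illustrative sketch under assumed names (`assign_marks`, `pick_distributor`); the patent does not specify these interfaces.

```python
import random

def assign_marks(server_ids, target_mark=1, rng=random):
    """Randomly distribute one non-empty target mark and empty general
    marks (None) among the scheduling servers; return {server_id: mark}."""
    marks = [target_mark] + [None] * (len(server_ids) - 1)
    rng.shuffle(marks)
    return dict(zip(server_ids, marks))

def pick_distributor(assignment):
    """The scheduling server holding the non-empty mark becomes the
    training task distribution server."""
    for server_id, mark in assignment.items():
        if mark is not None:
            return server_id
    raise ValueError("no target mark was assigned")
```

Because exactly one mark in the set is non-empty, exactly one scheduling server is chosen, and the choice is uniformly random across the cluster.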
In some optional implementations of this embodiment, the training task storage server can instead randomly select a scheduling server from the cluster to act as the training task distribution server.
In step 204, the training task distribution server determines a training task scheduling server from the cluster to act as the training task execution server.
In this embodiment, the distribution server can select the execution server from the cluster (e.g., designating the scheduling server 1052 shown in Fig. 1 as the execution server).
In some optional implementations of this embodiment, the distribution server first obtains a performance indicator for each scheduling server in the cluster and then, based on these indicators, selects the execution server. Performance indicators may include, but are not limited to, CPU (Central Processing Unit) usage, available memory, and physical disk access times. As an example, the distribution server can obtain the CPU usage of each scheduling server in the cluster and select the one with the lowest CPU usage as the execution server.
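The lowest-CPU-usage example can be sketched in a few lines. The metrics mapping here is an assumed stand-in for whatever monitoring interface the cluster actually exposes, which the patent leaves unspecified.

```python
def pick_execution_server(cpu_usage_by_server):
    """Return the scheduling server with the lowest CPU usage.

    cpu_usage_by_server maps server id -> CPU usage fraction (0.0-1.0).
    """
    if not cpu_usage_by_server:
        raise ValueError("training task scheduling server cluster is empty")
    return min(cpu_usage_by_server, key=cpu_usage_by_server.get)
```

The same pattern works for any scalar indicator (available memory, disk access time) by swapping the metric and, where appropriate, using `max` instead of `min`.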
In some optional implementations of this embodiment, the distribution server can instead poll the performance indicators of the scheduling servers in the cluster, stopping as soon as it finds a scheduling server whose indicator meets a preset performance threshold and designating that server as the training task execution server.
In some optional implementations of this embodiment, the distribution server can also randomly select a scheduling server from the cluster to act as the training task execution server.
In step 205, the training task execution server obtains the to-be-trained model information from the training task storage server.
In this embodiment, obtaining the to-be-trained model information from the storage server triggers the execution server to carry out model training.
In step 206, the training task execution server obtains sample data corresponding to the to-be-trained model information from the model data storage server.
In this embodiment, based on the information obtained in step 205, the execution server can obtain the corresponding sample data from the model data storage server (e.g., server 106 in Fig. 1), which stores the sample data. As an example, suppose the to-be-trained model information includes the model's name, and the model data storage server stores model names together with the sample data corresponding to each name. The execution server can match the to-be-trained model's name against the model names in the storage server and, on a successful match, obtain the sample data corresponding to the matched name.
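The name-matching lookup in this example reduces to a keyed retrieval; a minimal sketch, assuming the storage server can be viewed as a mapping from model name to sample data:

```python
def fetch_samples(model_name, sample_store):
    """Return the sample data whose stored model name matches model_name.

    sample_store is an assumed stand-in for the model data storage server,
    mapping model name -> sample data.
    """
    if model_name in sample_store:
        return sample_store[model_name]
    raise KeyError("no sample data stored for model %r" % model_name)
```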
In step 207, the training task execution server trains a prediction model on the sample data using a deep learning algorithm.
In this embodiment, based on the sample data obtained in step 206, the execution server can use a deep learning algorithm, together with the sample input data and sample output data, to train a prediction model that establishes the correspondence between input data and output data. Here, the sample data may include sample input data and sample output data.
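The idea of learning an input-output correspondence from paired samples can be illustrated with a deliberately tiny stand-in. The patent uses deep learning; this toy is only a one-parameter least-squares fit, chosen so the "train on (input, output) pairs, get back a predictor" shape is visible in a few lines.

```python
def train_prediction_model(sample_inputs, sample_outputs):
    """Fit y = w * x by least squares over the sample pairs and
    return a predict function capturing the learned correspondence."""
    numerator = sum(x * y for x, y in zip(sample_inputs, sample_outputs))
    denominator = sum(x * x for x in sample_inputs)
    w = numerator / denominator

    def predict(x):
        return w * x

    return predict
```

A real execution server would substitute the matched deep learning algorithm (CNN, RNN, etc.) for the closed-form fit, but the contract is the same: samples in, callable prediction model out.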
In some optional implementations of this embodiment, the to-be-trained model information may include the model's category. The execution server can first select, from a preset set of deep learning algorithms, the algorithm that matches the category, and then train the prediction model on the sample data using that matching algorithm.
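The category-to-algorithm selection amounts to a lookup in the preset algorithm set. The category names and the particular mapping below are illustrative assumptions; the patent only says the set contains algorithms for different model categories.

```python
# Hypothetical preset deep learning algorithm set, keyed by model category.
DEEP_LEARNING_ALGORITHMS = {
    "word_embedding": "word2vec",
    "image_recognition": "cnn",
    "sequence_labeling": "rnn",
    "long_sequence": "lstm",
}

def select_algorithm(model_category, algorithms=DEEP_LEARNING_ALGORITHMS):
    """Return the deep learning algorithm that matches the model category."""
    try:
        return algorithms[model_category]
    except KeyError:
        raise ValueError("no algorithm matches category %r" % model_category)
```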
Here, the deep learning algorithm set may include a variety of algorithms for training different categories of models, for example the Word2Vec algorithm, the CNN (Convolutional Neural Network) algorithm, the RNN (Recurrent Neural Network) algorithm, and the LSTM (Long Short-Term Memory) algorithm. Word2Vec reduces the processing of text content to vector operations in a K-dimensional vector space, where similarity in the vector space can represent semantic similarity of the text; the word vectors it outputs can be used for clustering, finding synonyms, part-of-speech analysis, and so on. TensorFlow can support the CNN, RNN, and LSTM algorithms: "Tensor" denotes an N-dimensional array, "Flow" denotes computation based on a data flow graph, and TensorFlow describes the process of tensors flowing from one end of the flow graph to the other. TensorFlow passes complex data structures into an artificial neural network for analysis and processing, and can be used in many machine/deep learning fields such as speech recognition and image recognition.
In step 208, the training task execution server sends the prediction model to the model data storage server.
In this embodiment, based on the prediction model obtained in step 207, the execution server can send the model to the model data storage server, which also stores prediction models.
In step 209, the prediction task execution server obtains the prediction model from the model data storage server.
In this embodiment, the prediction task execution server (e.g., server 107 in Fig. 1) can obtain the prediction model from the model data storage server, realizing automatic upload of the model.
In step 210, the prediction task execution server imports the to-be-predicted task information received from the second client into the prediction model and obtains a corresponding prediction result.
In this embodiment, the prediction task execution server can first receive the to-be-predicted task information from the second client (e.g., client 1012 in Fig. 1), then import it into the prediction model to obtain a prediction result corresponding to that task information, and finally feed the result back to the second client.
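The receive-predict-respond cycle of step 210 can be sketched as a single handler. The handler name and the response layout are assumptions for illustration; the model is any callable obtained in step 209.

```python
def handle_prediction_request(task_info, model):
    """Import the to-be-predicted task information into the prediction
    model and wrap the result for return to the second client."""
    result = model(task_info)
    return {"task": task_info, "prediction": result}
```

In the architecture of Fig. 1, this handler would run on the prediction task execution server 107, with the return value sent back over network 1026 to the second client 1012.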
In some optional implementations of this embodiment, the training task management server can periodically determine whether the model on the training task execution server has finished training and, in response to determining that it has, send a training-completed message to the first client. As an example, the management server can check every 10 minutes whether the model on an execution server has finished training; once it has, the server sends a training-completed message to the first client. After receiving the message through the first client, the user can send to-be-predicted task information to the prediction task execution server through the second client to obtain predictions.
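One pass of that periodic check can be sketched as below. This is a hypothetical single-pass body; a real management server would invoke it from a scheduler (e.g., every 10 minutes as in the example), and the status and notification interfaces are assumed.

```python
def check_training_status(execution_servers, notify):
    """Call notify(server_id) for each execution server whose model has
    finished training; return the ids that were notified.

    execution_servers maps server id -> bool (training finished?).
    notify is the callback that sends the training-completed message
    to the first client.
    """
    finished = []
    for server_id, is_done in execution_servers.items():
        if is_done:
            notify(server_id)
            finished.append(server_id)
    return finished
```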
In the task management system provided by the embodiments of the present application, the training task management server first receives the to-be-trained model information sent by the first client and forwards it to the training task storage server. The training task storage server then selects a training task distribution server from the training task scheduling server cluster, and the distribution server in turn selects a training task execution server from the cluster. The execution server obtains the to-be-trained model information from the storage server and the corresponding sample data from the model data storage server, trains a prediction model on the sample data using a deep learning algorithm, and sends the model to the model data storage server. Finally, the prediction task execution server obtains the prediction model from the model data storage server, imports the to-be-predicted task information received from the second client, and obtains a corresponding prediction result. A user only needs to send the to-be-trained model information through the first client and the to-be-predicted task information through the second client; the task management system automatically loads the model information, trains the model, and uploads the trained model for prediction, thereby improving the efficiency of task management.
In some optional implementations of the present embodiment, the task management system may further include a model version number storage server. After obtaining the prediction model, the training task execution server may send the current version number of the prediction model to the model version number storage server; in response to receiving the current version number of the prediction model sent by the training task execution server, the model version number storage server may update the stored version number to the current version number. As an example, the model version number storage server may store the name of the prediction model and the version number of the prediction model as key-value pairs (key:value); when the model version number storage server receives the current version number of the prediction model sent by the training task execution server, it may write the current version number into the value corresponding to the name (key) of the prediction model.
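The key-value bookkeeping described in the example above can be sketched as follows; the class and method names are assumptions introduced for illustration only.

```python
# Sketch of the model version number storage server holding
# name -> version-number pairs and overwriting the stored value when a
# training task execution server reports a new current version.

class ModelVersionStore:
    def __init__(self):
        self._versions = {}  # key: prediction model name, value: version number

    def update(self, model_name, current_version):
        # Write the reported current version into the value for this key.
        self._versions[model_name] = current_version

    def get(self, model_name):
        # Return the stored version number, or None if the name is unknown.
        return self._versions.get(model_name)

# Usage: the training server reports version 1, then a retrained version 2.
store = ModelVersionStore()
store.update("demand_forecast", 1)
store.update("demand_forecast", 2)
print(store.get("demand_forecast"))  # prints 2
```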
In some optional implementations of the present embodiment, the prediction task execution server may periodically obtain the version number stored in the model version number storage server, determine based on the obtained version number whether the prediction model has been updated, and, in response to determining that the prediction model has been updated, obtain the updated prediction model from the model data storage server. As an example, the prediction task execution server may obtain the version number stored in the model version number storage server every 10 minutes and compare it with the version number of the prediction model stored in the prediction task execution server; if the two are identical, it determines that the prediction model has not been updated; if they differ, it determines that the prediction model has been updated and obtains the updated prediction model from the model data storage server.
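The version comparison above can be sketched as a single function; `fetch_model` is an assumed hook standing in for the download from the model data storage server.

```python
# Sketch of the prediction task execution server comparing its local
# version number with the one read from the version store, and fetching
# the updated prediction model only when the two differ.

def refresh_model_if_stale(local_version, stored_version, fetch_model):
    """Return (model_or_None, new_version); fetch only on a version change."""
    if stored_version == local_version:
        return None, local_version          # identical: model not updated
    return fetch_model(), stored_version    # different: download the update

# Usage: stored version 2 differs from local version 1, so fetch runs.
model, version = refresh_model_if_stale(
    local_version=1, stored_version=2,
    fetch_model=lambda: "prediction-model-v2")
print(model, version)  # prediction-model-v2 2
```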
A model version number storage server is thus added to the task management system. As a result, after the training task execution server obtains the prediction model, the model version number storage server can promptly update the stored version number, and the prediction task execution server can quickly determine from that stored version number whether the prediction model has been updated, so that the updated prediction model is loaded automatically as soon as an update is detected.
With further reference to FIG. 3, a flow 300 of one embodiment of the task management method of the present application is illustrated. The flow 300 of the task management method includes the following steps:
Step 301: obtain to-be-trained model information from the training task storage server.
In the present embodiment, the training task execution server on which the task management method runs (e.g., training task scheduling server 1052 shown in FIG. 1) may obtain the to-be-trained model information from the training task storage server (e.g., training task storage server 104 shown in FIG. 1) by a wired or wireless connection. Here, the to-be-trained model information characterizes the function of the to-be-trained model and may include, but is not limited to, the category of the to-be-trained model, the name of the to-be-trained model, the algorithm used by the to-be-trained model, the correspondence between the input and the output of the to-be-trained model, and so on.
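An illustrative record for the to-be-trained model information just enumerated (category, name, algorithm, input/output correspondence) might look as follows; the field names are assumptions for the sketch and are not defined by the patent.

```python
# Hypothetical record bundling the to-be-trained model information that
# the training task execution server fetches in step 301.

from dataclasses import dataclass, field

@dataclass
class ToBeTrainedModelInfo:
    category: str                 # e.g. "regression" or "classification"
    name: str                     # model name, later matched against storage
    algorithm: str                # deep learning algorithm to apply
    io_mapping: dict = field(default_factory=dict)  # input -> output spec

info = ToBeTrainedModelInfo(
    category="regression",
    name="demand_forecast",
    algorithm="mlp",
    io_mapping={"input": "historical sales", "output": "next-week sales"})
print(info.name)  # demand_forecast
```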
Step 302: obtain sample data corresponding to the to-be-trained model information from the model data storage server.
In the present embodiment, based on the to-be-trained model information obtained in step 301, the training task execution server may obtain sample data corresponding to the to-be-trained model information from the model data storage server (e.g., model data storage server 106 shown in FIG. 1). Here, the model data storage server may be used to store the sample data. As an example, the to-be-trained model information includes the name of the to-be-trained model, and the model data storage server stores model names together with the sample data corresponding to each name; the training task execution server may match the name of the to-be-trained model against the model names in the model data storage server and, on a successful match, obtain the sample data corresponding to the matched model name.
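The name-matching lookup in this example can be sketched as follows; the dictionary layout of the storage server is an assumption made for illustration.

```python
# Sketch of matching the to-be-trained model name against the names on
# the model data storage server and returning the sample data on a
# successful match (None when no name matches).

def fetch_sample_data(model_name, storage):
    """Return the sample data whose stored model name matches, else None."""
    for stored_name, sample_data in storage.items():
        if stored_name == model_name:   # successful match
            return sample_data
    return None                         # no match: nothing to train on

# Usage: sample data stored as (input, output) pairs under the model name.
storage = {"demand_forecast": [([1, 2], 3), ([2, 3], 5)]}
print(fetch_sample_data("demand_forecast", storage))
```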
Step 303: train a prediction model based on the sample data using a deep learning algorithm.
In the present embodiment, based on the sample data obtained in step 302, the training task execution server may use a deep learning algorithm to train, from the sample input data and the sample output data, a prediction model that establishes an accurate correspondence between input data and output data. Here, the sample data may include sample input data and sample output data.
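As a minimal stand-in for "train a prediction model from sample input data and sample output data," the sketch below fits a one-variable least-squares line y = w·x + b. A real deep learning model would replace this fit; only the train-then-predict flow is being illustrated.

```python
# Toy substitute for the deep-learning training step: fit y = w*x + b to
# (sample input, sample output) pairs and return a callable prediction model.

def train_prediction_model(samples):
    """samples: list of (x, y) pairs; returns a callable prediction model."""
    n = len(samples)
    mean_x = sum(x for x, _ in samples) / n
    mean_y = sum(y for _, y in samples) / n
    var_x = sum((x - mean_x) ** 2 for x, _ in samples)
    cov_xy = sum((x - mean_x) * (y - mean_y) for x, y in samples)
    w = cov_xy / var_x              # slope from least squares
    b = mean_y - w * mean_x         # intercept
    return lambda x: w * x + b      # the "prediction model"

# Usage: perfectly linear samples recover y = 2x.
model = train_prediction_model([(1, 2), (2, 4), (3, 6)])
print(round(model(4), 6))  # prints 8.0
```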
Step 304: send the prediction model to the model data storage server.
In the present embodiment, based on the prediction model obtained in step 303, the training task execution server may send the prediction model to the model data storage server, so that the prediction task execution server (e.g., prediction task execution server 107 shown in FIG. 1) obtains the prediction model from the model data storage server, imports the to-be-predicted task information received from the second client (e.g., second client 1012 shown in FIG. 1) into the prediction model for prediction, and obtains a prediction result corresponding to the to-be-predicted task information.
In the task management method provided by the embodiments of the present application, to-be-trained model information is first obtained from the training task storage server; sample data corresponding to the to-be-trained model information is then obtained from the model data storage server; a prediction model is then trained based on the sample data using a deep learning algorithm; finally, the prediction model is sent to the model data storage server, so that the prediction task execution server obtains the prediction model from the model data storage server, imports the to-be-predicted task information received from the second client into the prediction model for prediction, and obtains a prediction result corresponding to the to-be-predicted task information. The user only needs to send the to-be-trained model information through the first client and the to-be-predicted task information through the second client; the task management system automatically loads the to-be-trained model information, automatically trains the model, automatically loads the trained model for prediction, and, after the model is updated, automatically loads the updated model for prediction, thereby improving the efficiency of task management.
With further reference to FIG. 4, as an implementation of the method shown in FIG. 3 above, the present application provides an embodiment of a task management apparatus. This apparatus embodiment corresponds to the method embodiment shown in FIG. 3, and the apparatus may be applied to various electronic devices.
As shown in FIG. 4, the task management apparatus 400 of the present embodiment includes: a model information acquisition unit 401, a sample data acquisition unit 402, a model training unit 403, and a model transmission unit 404. The model information acquisition unit 401 is configured to obtain to-be-trained model information from the training task storage server, wherein the to-be-trained model information characterizes the function of the to-be-trained model; the sample data acquisition unit 402 is configured to obtain sample data corresponding to the to-be-trained model information from the model data storage server; the model training unit 403 is configured to train a prediction model based on the sample data using a deep learning algorithm; and the model transmission unit 404 is configured to send the prediction model to the model data storage server, so that the prediction task execution server obtains the prediction model from the model data storage server, imports the to-be-predicted task information received from the client into the prediction model for prediction, and obtains a prediction result corresponding to the to-be-predicted task information.
In the present embodiment, for the specific processing of the model information acquisition unit 401, the sample data acquisition unit 402, the model training unit 403, and the model transmission unit 404 of the task management apparatus 400, and the advantageous effects brought thereby, reference may be made to the descriptions of the implementations of step 301, step 302, step 303, and step 304 in the embodiment corresponding to FIG. 3, which are not repeated here.
Referring now to FIG. 5, a structural schematic diagram of a computer system 500 suitable for implementing the electronic device of the embodiments of the present application is shown. The electronic device shown in FIG. 5 is merely an example and should not impose any limitation on the function or scope of use of the embodiments of the present application.
As shown in FIG. 5, the computer system 500 includes a central processing unit (CPU) 501, which can perform various appropriate actions and processing according to a program stored in a read-only memory (ROM) 502 or a program loaded from a storage section 508 into a random access memory (RAM) 503. The RAM 503 also stores various programs and data required for the operation of the system 500. The CPU 501, the ROM 502, and the RAM 503 are connected to one another through a bus 504. An input/output (I/O) interface 505 is also connected to the bus 504.
The following components are connected to the I/O interface 505: an input section 506 including buttons, a touch screen, and the like; an output section 507 including a liquid crystal display (LCD), a speaker, and the like; a storage section 508 including a hard disk and the like; and a communication section 509 including a network interface card such as a LAN card or a modem. The communication section 509 performs communication processing via a network such as the Internet. A driver 510 is also connected to the I/O interface 505 as needed. A removable medium 511, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the driver 510 as needed, so that a computer program read therefrom can be installed into the storage section 508 as needed.
In particular, according to embodiments of the present disclosure, the process described above with reference to the flowchart may be implemented as a computer software program. For example, an embodiment of the present disclosure includes a computer program product comprising a computer program carried on a computer-readable medium, the computer program containing program code for executing the method shown in the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 509, and/or installed from the removable medium 511. When the computer program is executed by the central processing unit (CPU) 501, the above-described functions defined in the method of the present application are performed. It should be noted that the computer-readable medium described herein may be a computer-readable signal medium, a computer-readable storage medium, or any combination of the two. The computer-readable storage medium may be, for example, but is not limited to, an electric, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection with one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. In the present application, a computer-readable storage medium may be any tangible medium that contains or stores a program, which may be used by or in connection with an instruction execution system, apparatus, or device. In the present application, a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, carrying computer-readable program code. Such a propagated data signal may take many forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the above. A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium, and may send, propagate, or transmit a program for use by or in connection with an instruction execution system, apparatus, or device. The program code contained on the computer-readable medium may be transmitted over any suitable medium, including but not limited to wireless, wire, optical cable, RF, or any suitable combination of the above.
The flowcharts and block diagrams in the accompanying drawings illustrate the possible architectures, functions, and operations of the systems, methods, and computer program products according to various embodiments of the present application. In this regard, each block in a flowchart or block diagram may represent a module, a program segment, or a portion of code, which contains one or more executable instructions for implementing the specified logical function. It should also be noted that, in some alternative implementations, the functions marked in the blocks may occur in an order different from that marked in the drawings. For example, two blocks shown in succession may in fact be executed substantially in parallel, or sometimes in the reverse order, depending on the functions involved. It should also be noted that each block in the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, may be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
The units described in the embodiments of the present application may be implemented in software or in hardware. The described units may also be arranged in a processor; for example, they may be described as: a processor comprising a model information acquisition unit, a sample data acquisition unit, a model training unit, and a model transmission unit. The names of these units do not, in some cases, limit the units themselves; for example, the model information acquisition unit may also be described as "a unit that obtains to-be-trained model information from the training task storage server."
As another aspect, the present application also provides a computer-readable medium, which may be included in the electronic device described in the above embodiments, or may exist alone without being assembled into the electronic device. The computer-readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: obtain to-be-trained model information from the training task storage server, wherein the to-be-trained model information characterizes the function of the to-be-trained model; obtain sample data corresponding to the to-be-trained model information from the model data storage server; train a prediction model based on the sample data using a deep learning algorithm; and send the prediction model to the model data storage server, so that the prediction task execution server obtains the prediction model from the model data storage server, imports the to-be-predicted task information received from the second client into the prediction model for prediction, and obtains a prediction result corresponding to the to-be-predicted task information.
The above description is only a preferred embodiment of the present application and an explanation of the applied technical principles. Those skilled in the art should understand that the scope of the invention involved in the present application is not limited to technical solutions formed by the specific combination of the above technical features, and should also cover other technical solutions formed by any combination of the above technical features or their equivalents without departing from the above inventive concept, for example, technical solutions formed by replacing the above features with (but not limited to) technical features having similar functions disclosed in the present application.
Claims (13)
1. A task management system, characterized in that the system comprises: a training task management server, a training task storage server, a training task server cluster, a model data storage server, and a prediction task execution server;
the training task management server is configured to receive to-be-trained model information sent by a first client and send it to the training task storage server, wherein the to-be-trained model information characterizes a function of a to-be-trained model;
the training task storage server is configured to determine a training task scheduling server from the training task server cluster as a training task distribution server;
the training task distribution server is configured to determine a training task scheduling server from the training task server cluster as a training task execution server;
the training task execution server is configured to obtain the to-be-trained model information from the training task storage server, obtain sample data corresponding to the to-be-trained model information from the model data storage server, train a prediction model based on the sample data using a deep learning algorithm, and send the prediction model to the model data storage server;
the prediction task execution server is configured to obtain the prediction model from the model data storage server, import to-be-predicted task information received from a second client into the prediction model for prediction, and obtain a prediction result corresponding to the to-be-predicted task information.
2. The system according to claim 1, characterized in that
each training task scheduling server in the training task server cluster is further configured to:
obtain, respectively, each identifier in an identifier set in the training task storage server, wherein the identifier set includes a general identifier and a target identifier;
the training task storage server is specifically configured to:
determine the training task scheduling server in the training task server cluster that obtains the target identifier as the training task distribution server.
3. The system according to claim 1, characterized in that the training task storage server is specifically configured to:
randomly select a training task scheduling server from the training task server cluster as the training task distribution server.
4. The system according to claim 1, characterized in that the training task distribution server is specifically configured to:
obtain a performance indicator of each training task scheduling server in the training task server cluster;
select the training task execution server from the training task server cluster based on the performance indicator of each training task scheduling server.
5. The system according to claim 1, characterized in that the training task distribution server is specifically configured to:
randomly select a training task scheduling server from the training task server cluster as the training task execution server.
6. The system according to claim 1, characterized in that the to-be-trained model information includes a to-be-trained model category;
the training task execution server is specifically configured to:
select, based on the to-be-trained model category, a deep learning algorithm matching the to-be-trained model category from a preset set of deep learning algorithms;
train the prediction model based on the sample data using the deep learning algorithm matching the to-be-trained model category.
7. The system according to any one of claims 1-6, characterized in that the training task management server is further configured to:
periodically determine whether the model in the training task execution server has finished training;
in response to determining that the model training in the training task execution server is completed, send model-training-completion information to the first client.
8. The system according to any one of claims 1-6, characterized in that the system further comprises a model version number storage server;
the training task execution server is further configured to:
after obtaining the prediction model, send a current version number of the prediction model to the model version number storage server;
the model version number storage server is further configured to:
in response to receiving the current version number of the prediction model sent by the training task execution server, update the stored version number to the current version number.
9. The system according to claim 8, characterized in that the prediction task execution server is further configured to:
periodically obtain the version number stored in the model version number storage server;
determine whether the prediction model has been updated based on the obtained version number;
in response to determining that the prediction model has been updated, obtain the updated prediction model from the model data storage server.
10. A task management method, characterized in that the method comprises:
obtaining to-be-trained model information from a training task storage server, wherein the to-be-trained model information characterizes a function of a to-be-trained model;
obtaining sample data corresponding to the to-be-trained model information from a model data storage server;
training a prediction model based on the sample data using a deep learning algorithm;
sending the prediction model to the model data storage server, so that a prediction task execution server obtains the prediction model from the model data storage server, imports to-be-predicted task information received from a second client into the prediction model for prediction, and obtains a prediction result corresponding to the to-be-predicted task information.
11. A task management apparatus, characterized in that the apparatus comprises:
a model information acquisition unit configured to obtain to-be-trained model information from a training task storage server, wherein the to-be-trained model information characterizes a function of a to-be-trained model;
a sample data acquisition unit configured to obtain sample data corresponding to the to-be-trained model information from a model data storage server;
a model training unit configured to train a prediction model based on the sample data using a deep learning algorithm;
a model transmission unit configured to send the prediction model to the model data storage server, so that a prediction task execution server obtains the prediction model from the model data storage server, imports to-be-predicted task information received from a client into the prediction model for prediction, and obtains a prediction result corresponding to the to-be-predicted task information.
12. An electronic device, characterized in that the electronic device comprises:
one or more processors;
a storage device for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method according to claim 10.
13. A computer-readable storage medium on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the method according to claim 10.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710239897.7A CN108734293B (en) | 2017-04-13 | 2017-04-13 | Task management system, method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108734293A true CN108734293A (en) | 2018-11-02 |
CN108734293B CN108734293B (en) | 2023-05-02 |
Family
ID=63923682
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710239897.7A Active CN108734293B (en) | 2017-04-13 | 2017-04-13 | Task management system, method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108734293B (en) |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109685213A (en) * | 2018-12-29 | 2019-04-26 | 百度在线网络技术(北京)有限公司 | A kind of acquisition methods, device and the terminal device of training sample data |
CN109685160A (en) * | 2019-01-18 | 2019-04-26 | 创新奇智(合肥)科技有限公司 | A kind of on-time model trained and dispositions method and system automatically |
CN110175677A (en) * | 2019-04-16 | 2019-08-27 | 平安普惠企业管理有限公司 | Automatic update method, device, computer equipment and storage medium |
CN111078659A (en) * | 2019-12-20 | 2020-04-28 | 腾讯科技(深圳)有限公司 | Model updating method, model updating device, computer readable storage medium and computer equipment |
WO2020098414A1 (en) * | 2018-11-13 | 2020-05-22 | Oppo广东移动通信有限公司 | Data processing method for terminal, device, and terminal |
CN111738404A (en) * | 2020-05-08 | 2020-10-02 | 深圳市万普拉斯科技有限公司 | Model training task processing method and device, electronic equipment and storage medium |
CN111753997A (en) * | 2020-06-28 | 2020-10-09 | 北京百度网讯科技有限公司 | Distributed training method, system, device and storage medium |
CN111966382A (en) * | 2020-08-28 | 2020-11-20 | 上海寻梦信息技术有限公司 | Online deployment method and device of machine learning model and related equipment |
CN112087487A (en) * | 2020-07-30 | 2020-12-15 | 北京聚云科技有限公司 | Model training task scheduling method and device, electronic equipment and storage medium |
CN112102263A (en) * | 2020-08-31 | 2020-12-18 | 深圳思谋信息科技有限公司 | Defect detection model generation system, method and device and computer equipment |
CN112257874A (en) * | 2020-11-13 | 2021-01-22 | 腾讯科技(深圳)有限公司 | Machine learning method, device and system of distributed machine learning system |
CN113408745A (en) * | 2021-08-20 | 2021-09-17 | 北京瑞莱智慧科技有限公司 | Task scheduling method, device, equipment and storage medium |
CN113672500A (en) * | 2021-07-27 | 2021-11-19 | 浙江大华技术股份有限公司 | Deep learning algorithm testing method and device, electronic device and storage medium |
CN113780568A (en) * | 2020-06-09 | 2021-12-10 | 子长科技(北京)有限公司 | Automatic model training framework, device and storage medium |
CN113806624A (en) * | 2020-06-15 | 2021-12-17 | 阿里巴巴集团控股有限公司 | Data processing method and device |
CN115496444A (en) * | 2022-09-26 | 2022-12-20 | 重庆大学 | Method and system for intelligent distribution management of storeroom |
WO2022267066A1 (en) * | 2021-06-25 | 2022-12-29 | Oppo广东移动通信有限公司 | Model management method and communication device |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160098292A1 (en) * | 2014-10-03 | 2016-04-07 | Microsoft Corporation | Job scheduling using expected server performance information |
CN106227596A (en) * | 2016-07-13 | 2016-12-14 | 百度在线网络技术(北京)有限公司 | Mission Monitor method and apparatus for task scheduling server |
CN106230792A (en) * | 2016-07-21 | 2016-12-14 | 北京百度网讯科技有限公司 | Machine learning method based on mobile office, terminal unit and system |
2017-04-13: CN201710239897.7A filed; patent CN108734293B granted, status Active.
Non-Patent Citations (1)
Title |
---|
ZHANG Cui, "Research on a Prediction and Evaluation System for Fabric Drape Performance", China Master's Theses Full-text Database (Electronic Journal), Engineering Science and Technology I *
Cited By (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020098414A1 (en) * | 2018-11-13 | 2020-05-22 | Oppo广东移动通信有限公司 | Data processing method for terminal, device, and terminal |
CN109685213A (en) * | 2018-12-29 | 2019-04-26 | 百度在线网络技术(北京)有限公司 | A kind of acquisition methods, device and the terminal device of training sample data |
CN109685213B (en) * | 2018-12-29 | 2022-01-07 | 百度在线网络技术(北京)有限公司 | Method and device for acquiring training sample data and terminal equipment |
CN109685160A (en) * | 2019-01-18 | 2019-04-26 | 创新奇智(合肥)科技有限公司 | A kind of on-time model trained and dispositions method and system automatically |
CN110175677A (en) * | 2019-04-16 | 2019-08-27 | 平安普惠企业管理有限公司 | Automatic update method, device, computer equipment and storage medium |
CN111078659A (en) * | 2019-12-20 | 2020-04-28 | 腾讯科技(深圳)有限公司 | Model updating method, model updating device, computer readable storage medium and computer equipment |
CN111078659B (en) * | 2019-12-20 | 2023-04-21 | 腾讯科技(深圳)有限公司 | Model updating method, device, computer readable storage medium and computer equipment |
CN111738404A (en) * | 2020-05-08 | 2020-10-02 | 深圳市万普拉斯科技有限公司 | Model training task processing method and device, electronic equipment and storage medium |
CN111738404B (en) * | 2020-05-08 | 2024-01-12 | 深圳市万普拉斯科技有限公司 | Model training task processing method and device, electronic equipment and storage medium |
CN113780568B (en) * | 2020-06-09 | 2024-05-14 | 子长科技(北京)有限公司 | Automatic model training system, apparatus, and storage medium |
CN113780568A (en) * | 2020-06-09 | 2021-12-10 | 子长科技(北京)有限公司 | Automatic model training framework, device and storage medium |
CN113806624B (en) * | 2020-06-15 | 2024-03-08 | 阿里巴巴集团控股有限公司 | Data processing method and device |
CN113806624A (en) * | 2020-06-15 | 2021-12-17 | 阿里巴巴集团控股有限公司 | Data processing method and device |
CN111753997A (en) * | 2020-06-28 | 2020-10-09 | 北京百度网讯科技有限公司 | Distributed training method, system, device and storage medium |
CN112087487B (en) * | 2020-07-30 | 2023-08-18 | 北京聚云科技有限公司 | Scheduling method and device of model training task, electronic equipment and storage medium |
CN112087487A (en) * | 2020-07-30 | 2020-12-15 | 北京聚云科技有限公司 | Model training task scheduling method and device, electronic equipment and storage medium |
CN111966382A (en) * | 2020-08-28 | 2020-11-20 | 上海寻梦信息技术有限公司 | Online deployment method and device of machine learning model and related equipment |
CN112102263A (en) * | 2020-08-31 | 2020-12-18 | 深圳思谋信息科技有限公司 | Defect detection model generation system, method and device and computer equipment |
CN112257874A (en) * | 2020-11-13 | 2021-01-22 | 腾讯科技(深圳)有限公司 | Machine learning method, device and system of distributed machine learning system |
WO2022267066A1 (en) * | 2021-06-25 | 2022-12-29 | Oppo广东移动通信有限公司 | Model management method and communication device |
CN113672500A (en) * | 2021-07-27 | 2021-11-19 | 浙江大华技术股份有限公司 | Deep learning algorithm testing method and device, electronic device and storage medium |
CN113672500B (en) * | 2021-07-27 | 2024-05-07 | 浙江大华技术股份有限公司 | Deep learning algorithm testing method and device, electronic device and storage medium |
CN113408745A (en) * | 2021-08-20 | 2021-09-17 | 北京瑞莱智慧科技有限公司 | Task scheduling method, device, equipment and storage medium |
CN115496444A (en) * | 2022-09-26 | 2022-12-20 | 重庆大学 | Intelligent warehouse allocation management method and system |
CN115496444B (en) * | 2022-09-26 | 2023-06-09 | 重庆大学 | Warehouse intelligent allocation management method and system |
Also Published As
Publication number | Publication date |
---|---|
CN108734293B (en) | 2023-05-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108734293A (en) | Task management system, method and apparatus | |
CN110288049B (en) | Method and apparatus for generating image recognition model | |
CN109902186A (en) | Method and apparatus for generating neural network | |
CN109647719A (en) | Method and apparatus for sorting cargo | |
CN109325541A (en) | Method and apparatus for training a model | |
CN108960316B (en) | Method and apparatus for generating a model | |
CN110110811A (en) | Method and apparatus for training a model, and method and apparatus for predicting information | |
CN109472523A (en) | Method and apparatus for sorting cargo | |
CN107516090A (en) | Integrated face identification method and system | |
CN108287927B (en) | Method and device for obtaining information | |
CN110263938A (en) | Method and apparatus for generating information | |
CN109410253B (en) | Method, apparatus, electronic device and computer-readable medium for generating information | |
CN109360028A (en) | Method and apparatus for pushing information | |
CN109976997A (en) | Test method and device | |
CN109815365A (en) | Method and apparatus for handling video | |
CN109829164A (en) | Method and apparatus for generating text | |
CN108960110A (en) | Method and apparatus for generating information | |
CN109495552A (en) | Method and apparatus for updating a click-through rate prediction model | |
CN110457476A (en) | Method and apparatus for generating disaggregated model | |
CN108182472A (en) | Method and apparatus for generating information | |
CN109902446A (en) | Method and apparatus for generating information prediction model | |
CN110084317A (en) | Method and apparatus for image recognition | |
CN109117758A (en) | Method and apparatus for generating information | |
CN111738010A (en) | Method and apparatus for generating semantic matching model | |
CN109101309A (en) | Method and device for updating a user interface |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
GR01 | Patent grant ||