CN107766940A - Method and apparatus for generating a model - Google Patents

Method and apparatus for generating a model

Info

Publication number
CN107766940A
CN107766940A (application CN201711157646.0A)
Authority
CN
China
Prior art keywords
model
sample data
user
training
terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201711157646.0A
Other languages
Chinese (zh)
Other versions
CN107766940B (en)
Inventor
谢永康
施恩
胡鸣人
李曙鹏
李亚帅
臧硕
潘子豪
赵颖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201711157646.0A priority Critical patent/CN107766940B/en
Publication of CN107766940A publication Critical patent/CN107766940A/en
Application granted granted Critical
Publication of CN107766940B publication Critical patent/CN107766940B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Mathematical Physics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Medical Informatics (AREA)
  • Information Transfer Between Computers (AREA)
  • Image Analysis (AREA)

Abstract

The present application discloses a method and apparatus for generating a model. One embodiment of the method includes: in response to receiving a model generation request containing a user identifier sent by a user's terminal, looking up a model information set corresponding to the user identifier in a preset model table and sending the model information set to the terminal, where each piece of model information in the set includes a model category and model parameters, and the model table characterizes the correspondence between user identifiers and model information; in response to receiving the model category and model parameters selected by the user from the model information set and sent by the terminal, determining a neural network that matches the selected model category and model parameters; and in response to receiving a sample data set sent by the terminal, training a model based on the sample data set and the neural network using a machine learning method. This embodiment can generate a user-customized neural network model.

Description

Method and apparatus for generating a model
Technical field
The present application relates to the field of computer technology, in particular to the field of Internet technology, and more particularly to a method and apparatus for generating a model.
Background technology
Artificial intelligence (AI) is a branch of computer science that studies and develops theories, methods, techniques, and application systems for simulating, extending, and augmenting human intelligence. It attempts to understand the essence of intelligence and to produce new kinds of intelligent machines that respond in ways similar to human intelligence. Research in this field includes robotics, speech recognition, image recognition, natural language processing, and expert systems.
With the development of deep learning, existing AI technology tends to achieve good results in general-purpose domains, such as general text recognition, image classification over a finite label set, general word segmentation, and general speech recognition. In practical applications, however, the required AI capabilities are often customized, for example text recognition for particular documents, segmentation of medical terminology, or speech recognition for special scenarios.
Summary of the invention
Embodiments of the present application propose a method and apparatus for generating a model.
In a first aspect, an embodiment of the present application provides a method for generating a model, including: in response to receiving a model generation request containing a user identifier sent by a user's terminal, looking up a model information set corresponding to the user identifier in a preset model table, and sending the model information set to the terminal, where each piece of model information in the model information set includes a model category and model parameters, and the model table characterizes the correspondence between user identifiers and model information; in response to receiving the model category and model parameters selected by the user from the model information set and sent by the terminal, determining a neural network that matches the selected model category and model parameters; and in response to receiving a sample data set sent by the terminal, training a model based on the sample data set and the neural network using a machine learning method.
In some embodiments, each sample in the sample data set includes input sample data and output sample data, and the method further includes: obtaining a predetermined number of input samples from the sample data set together with the corresponding predetermined number of output samples; for each obtained input sample, feeding the input sample into the model to obtain an output result corresponding to that input sample, and, if the similarity between the output result and the output sample corresponding to that input sample exceeds a predetermined similarity threshold, incrementing a count of correct verifications; and taking the ratio of the number of correct verifications to the predetermined number as the accuracy rate.
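The verification procedure described above can be sketched as a short function. This is an illustrative sketch, not code from the patent: `model`, `similarity`, and the sample names are stand-ins for whatever trained model and similarity measure are actually used.

```python
def verify_model(model, samples, predetermined_number, similarity, threshold):
    """Return the accuracy rate over `predetermined_number` sample pairs.

    `samples` is a list of (input_sample, output_sample) pairs;
    `similarity` maps (model_output, output_sample) to a score in [0, 1].
    """
    chosen = samples[:predetermined_number]
    correct = 0
    for input_sample, output_sample in chosen:
        output_result = model(input_sample)
        if similarity(output_result, output_sample) > threshold:
            correct += 1  # accumulate the count of correct verifications
    return correct / predetermined_number
```

The accuracy rate is then compared against the predetermined accuracy threshold to decide whether the model may be published.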
In some embodiments, the method further includes: if the accuracy rate exceeds a predetermined accuracy threshold, converting the model into an application and publishing the application to a target server for at least one terminal to download and use.
In some embodiments, the method further includes: receiving feedback data sent by a terminal that has downloaded and used the application, and adding the feedback data to the sample data set, where the feedback data includes the actual input data entered into the application on that terminal, the actual output result corresponding to the actual input data, and the expected output result entered by the user of that terminal, the similarity between the actual output result and the expected output result being below the predetermined similarity threshold; and retraining the model based on the actual input data, the actual output result, and the expected output result, and republishing the updated model to the target server.
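One way to read this feedback step is as a filter that turns user corrections into new training pairs. The sketch below makes that concrete under assumed field names (`actual_input`, `actual_output`, `expected_output` are not the patent's names); only records where the model's actual output diverged from the user's expectation are fed back into training.

```python
def collect_feedback(feedback_records, similarity, threshold):
    """Return (input, expected_output) pairs worth retraining on."""
    new_samples = []
    for record in feedback_records:
        actual_in = record["actual_input"]
        actual_out = record["actual_output"]
        expected_out = record["expected_output"]
        if similarity(actual_out, expected_out) < threshold:
            # the model got this case wrong; retrain toward the expectation
            new_samples.append((actual_in, expected_out))
    return new_samples
```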
In some embodiments, the method further includes: if the accuracy rate is less than or equal to the predetermined accuracy threshold, determining recommended model information from the model information set according to the accuracy rate and the model category and model parameters selected by the user, and sending the recommended model information to the terminal so that the user can reselect a model category and model parameters.
In some embodiments, training the model based on the sample data set and the neural network includes: selecting sample data from the sample data set and performing the following training step: training the neural network with the sample data; adjusting the network parameters of the neural network based on the training result; and determining whether a training completion condition has been met; and, in response to determining that the condition is not met and no stop-training message has been received from the user's terminal, selecting other sample data from the sample data set and continuing the training step. The training completion condition includes at least one of: the number of training iterations of the neural network reaching a preset iteration threshold; or the loss values between the outputs of two adjacent training iterations being below a predetermined threshold.
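The training step with its two completion conditions can be sketched as a simple loop. This is a minimal sketch under stated assumptions: `train_once` stands in for one round of training plus parameter adjustment and returns a loss value, `stop_requested` stands in for the terminal's stop-training message, and the loss-based condition is interpreted here as the loss change between two adjacent iterations falling below a threshold.

```python
def train(samples, train_once, stop_requested,
          max_iterations=100, loss_threshold=1e-3):
    """Run the training step until a completion condition is met."""
    prev_loss = None
    iterations = 0
    for sample in samples:
        loss = train_once(sample)  # train on the sample, adjust parameters
        iterations += 1
        done = iterations >= max_iterations  # iteration-count condition
        if prev_loss is not None and abs(prev_loss - loss) < loss_threshold:
            done = True  # loss change between adjacent iterations is small
        if done or stop_requested():
            break
        prev_loss = loss
    return iterations
```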
In a second aspect, an embodiment of the present application provides an apparatus for generating a model, including: a receiving unit configured to, in response to receiving a model generation request containing a user identifier sent by a user's terminal, look up a model information set corresponding to the user identifier in a preset model table and send the model information set to the terminal, where each piece of model information in the model information set includes a model category and model parameters, and the model table characterizes the correspondence between user identifiers and model information; a determining unit configured to, in response to receiving the model category and model parameters selected by the user from the model information set and sent by the terminal, determine a neural network that matches the selected model category and model parameters; and a training unit configured to, in response to receiving a sample data set sent by the terminal, train a model based on the sample data set and the neural network using machine learning.
In some embodiments, each sample in the sample data set includes input sample data and output sample data, and the apparatus further includes a verification unit configured to: obtain a predetermined number of input samples from the sample data set together with the corresponding predetermined number of output samples; for each obtained input sample, feed the input sample into the model to obtain an output result corresponding to that input sample, and, if the similarity between the output result and the output sample corresponding to that input sample exceeds a predetermined similarity threshold, increment a count of correct verifications; and take the ratio of the number of correct verifications to the predetermined number as the accuracy rate.
In some embodiments, the apparatus further includes a publishing unit configured to: if the accuracy rate exceeds a predetermined accuracy threshold, convert the model into an application and publish the application to a target server for at least one terminal to download and use.
In some embodiments, the apparatus further includes a feedback unit configured to: receive feedback data sent by a terminal that has downloaded and used the application, and add the feedback data to the sample data set, where the feedback data includes the actual input data entered into the application on that terminal, the actual output result corresponding to the actual input data, and the expected output result entered by the user of that terminal, the similarity between the actual output result and the expected output result being below the predetermined similarity threshold; and retrain the model based on the actual input data, the actual output result, and the expected output result, and republish the updated model to the target server.
In some embodiments, the apparatus further includes a recommendation unit configured to: if the accuracy rate is less than or equal to the predetermined accuracy threshold, determine recommended model information from the model information set according to the accuracy rate and the model category and model parameters selected by the user, and send the recommended model information to the terminal so that the user can reselect a model category and model parameters.
In some embodiments, the training unit is further configured to: select sample data from the sample data set and perform the following training step: train the neural network with the sample data; adjust the network parameters of the neural network based on the training result; and determine whether a training completion condition has been met; and, in response to determining that the condition is not met and no stop-training message has been received from the user's terminal, select other sample data from the sample data set and continue the training step. The training completion condition includes at least one of: the number of training iterations of the neural network reaching a preset iteration threshold; or the loss values between the outputs of two adjacent training iterations being below a predetermined threshold.
In a third aspect, an embodiment of the present application provides a server, including: one or more processors; and a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement any of the methods of the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium on which a computer program is stored, where the program, when executed by a processor, implements any of the methods of the first aspect.
With the method and apparatus for generating a model provided by the embodiments of the present application, a customized model is trained from the model category and model parameters of the neural network selected by the user and the training sample data uploaded by the user, thereby making effective use of customized data to generate a customized model.
Brief description of the drawings
Other features, objects, and advantages of the present application will become more apparent upon reading the following detailed description of non-limiting embodiments made with reference to the accompanying drawings:
Fig. 1 is an exemplary system architecture diagram to which the present application can be applied;
Fig. 2 is a flowchart of one embodiment of the method for generating a model according to the present application;
Fig. 3 is a schematic diagram of an application scenario of the method for generating a model according to the present application;
Fig. 4 is a flowchart of another embodiment of the method for generating a model according to the present application;
Fig. 5 is a schematic structural diagram of one embodiment of the apparatus for generating a model according to the present application;
Fig. 6 is a schematic structural diagram of a computer system suitable for implementing the terminal device or server of the embodiments of the present application.
Detailed description of the embodiments
The present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are used only to explain the related invention and are not a limitation of the invention. It should also be noted that, for ease of description, the drawings show only the parts related to the invention.
It should be noted that, where no conflict arises, the embodiments in the present application and the features in the embodiments may be combined with each other. The present application is described in detail below with reference to the accompanying drawings and in conjunction with the embodiments.
Fig. 1 shows an exemplary system architecture 100 in which embodiments of the method for generating a model or the apparatus for generating a model of the present application can be applied.
As shown in Fig. 1, the system architecture 100 may include terminal devices 101, 102, and 103, a network 104, and a server 105. The network 104 provides the medium for communication links between the terminal devices 101, 102, 103 and the server 105, and may include various connection types, such as wired or wireless communication links or fiber optic cables.
A user may use the terminal devices 101, 102, 103 to interact with the server 105 over the network 104 to receive or send messages. Various communication client applications may be installed on the terminal devices 101, 102, 103, such as model generation applications, web browser applications, shopping applications, search applications, instant messaging tools, email clients, and social platform software.
The terminal devices 101, 102, 103 may be various electronic devices with display screens that support selecting model training parameters and uploading the sample data used for model training, including but not limited to smartphones, tablet computers, e-book readers, MP3 players (Moving Picture Experts Group Audio Layer III), MP4 players (Moving Picture Experts Group Audio Layer IV), laptop computers, and desktop computers.
The server 105 may be a server that provides various services, such as a background model generation server that supports the model information displayed on the terminal devices 101, 102, 103. The background model generation server may process (e.g., analyze) the received model generation request and sample data and generate a model.
It should be noted that the method for generating a model provided by the embodiments of the present application is generally performed by the server 105, and accordingly the apparatus for generating a model is generally provided in the server 105.
It should be understood that the numbers of terminal devices, networks, and servers in Fig. 1 are merely illustrative; any number of terminal devices, networks, and servers may be present as needed. In an optional embodiment, the method for generating a model may be performed on a server or on a server cluster composed of multiple servers, and the trained neural network model may run on various types of electronic devices such as servers, personal computers, mobile terminals, and in-vehicle terminals.
With continued reference to Fig. 2, a flow 200 of one embodiment of the method for generating a model according to the present application is shown. The method for generating a model comprises the following steps:
Step 201: in response to receiving a model generation request containing a user identifier sent by a user's terminal, look up the model information set corresponding to the user identifier in a preset model table, and send the model information set to the terminal.
In this embodiment, the electronic device on which the method for generating a model runs (e.g., the server shown in Fig. 1) may receive the model generation request from the user's terminal through a wired or wireless connection, then look up the model information set corresponding to the user identifier in the preset model table and send the model information set to the terminal for the user to select a model category and model parameters. Each piece of model information in the model information set includes a model category and model parameters, and the model table characterizes the correspondence between user identifiers and model information. Model categories may include, for example, convolutional neural networks and recurrent neural networks. Model parameters may include, but are not limited to, the number of network layers, the kernel function type, the error tolerance, and the learning rate. The electronic device may set model information for each user, and users may also be divided into different permission levels, with the model table then associated with user permissions. The server can thus offer different model categories and model parameters to users with different permissions. Permission levels may be assigned in advance according to the user identifier; for example, with two permission levels, a user with the higher permission may select neural networks with more than 10 layers, while a user with the lower permission may only select neural networks with 10 layers or fewer.
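The permission-based model table lookup could look like the following sketch. The table contents, field names, and two-level permission scheme are illustrative assumptions built around the example in the text; the patent does not specify the data structures.

```python
# Hypothetical user-identifier-to-permission mapping and model table.
PERMISSIONS = {"user_a": "advanced", "user_b": "basic"}

MODEL_TABLE = {
    "basic": [
        {"category": "cnn", "max_layers": 10},
        {"category": "feedforward", "max_layers": 5},
    ],
    "advanced": [
        {"category": "cnn", "max_layers": 50},
        {"category": "rnn", "max_layers": 20},
    ],
}

def model_information_set(user_id):
    """Look up the model information set for a user's permission level."""
    level = PERMISSIONS.get(user_id, "basic")  # unknown users get basic access
    return MODEL_TABLE[level]
```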
Step 202: in response to receiving the model category and model parameters selected by the user from the model information set and sent by the terminal, determine a neural network that matches the selected model category and model parameters.
In this embodiment, after the terminal receives the model information set, it displays the model categories and model parameters in the set for the user to select. The user may select model parameters from the model information list or enter them manually; if parameters are entered manually, the server also needs to verify the parameters entered by the user. The model categories and model parameters in the model information set are associated: once the user has selected a model category, the selectable model parameters depend on the chosen category. For example, if the user selects a convolutional neural network, the number of network layers may range from 3 to 10, whereas for a feedforward neural network the number of layers may range from 3 to 5. The server determines, from a set of candidate neural networks, the neural network matching the model category and model parameters selected by the user and sent by the terminal. A neural network in the candidate set is a mathematical model that performs information processing using a structure similar to the synaptic connections of neurons in the brain; in engineering and academia it is often referred to directly as a "neural network".
Optionally, the user may also enter a name for the model to be generated through the terminal. The name is user-defined, and the server may verify it to avoid duplication with the model names of other users.
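The category-dependent parameter check described above can be sketched as follows. The layer ranges for the two example categories come from the text; the function shape and the treatment of other categories are assumptions.

```python
# Admissible layer counts per model category (from the example above).
LAYER_RANGES = {"cnn": (3, 10), "feedforward": (3, 5)}

def validate_parameters(category, num_layers):
    """Return True if `num_layers` is admissible for `category`."""
    if category not in LAYER_RANGES:
        return False  # unknown category: reject the manual entry
    low, high = LAYER_RANGES[category]
    return low <= num_layers <= high
```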
Step 203: in response to receiving the sample data set sent by the terminal, train a model based on the sample data set and the neural network using a machine learning method.
In this embodiment, the terminal may send the sample data set together with the model category and model parameters. Alternatively, it may wait until the server has verified the model category and model parameters and determined the neural network, after which the server sends a message notifying the user via the terminal to upload the sample data set. This avoids wasting network traffic on uploading a sample data set in cases where the specified model category and model parameters cannot be used to generate a model. The training process is illustrated with the example of a neural network for identifying roads in remote sensing images and a remote sensing image sample data set. The sample data set includes original remote sensing images and the road information in those images. The electronic device may feed an original remote sensing image into the neural network to obtain an output result; if the similarity between the output result and the road information in the original image is below a predetermined similarity threshold, it adjusts the network parameters, feeds the original image into the neural network again, and compares the new output result with the road information. The network parameters are adjusted continually until the similarity between the output result and the road information in the original image exceeds the predetermined similarity threshold. Other original remote sensing images are then selected for training, until for a predetermined number of original images the similarity between the output result and the road information exceeds the threshold, at which point the trained model is obtained. What the electronic device trains may be an initial neural network, which may be an untrained neural network or a neural network whose training is not complete. Each layer of the initial neural network may be provided with initial parameters, which are adjusted continually during the training of the neural network. The initial neural network may be any of various types of untrained or not fully trained artificial neural networks, or a model obtained by combining several untrained or not fully trained artificial neural networks. In this way, the electronic device can feed an original remote sensing image into the input side of the initial neural network, process it layer by layer with the parameters of each layer, and obtain from the output side the road information in the original remote sensing image.
In some optional implementations of this embodiment, training the model based on the sample data set and the neural network includes performing the following training step for sample data in the sample data set: train the neural network with the sample data; adjust the network parameters of the neural network based on the training result; determine whether a training completion condition has been met; and, in response to determining that the condition is not met and no stop-training message has been received from the user's terminal, select other sample data from the sample data set and continue the training step. Setting a training completion condition avoids the situation where training the neural network loops indefinitely. The completion condition may include, but is not limited to, at least one of: the number of training iterations reaching a preset iteration threshold; or the loss values between the outputs of two adjacent training iterations being below a predetermined threshold. The training result refers to the output obtained after feeding the sample data into the neural network. An optional training process is again illustrated with the example of a neural network for identifying roads in remote sensing images and a remote sensing image sample data set. The sample data set includes original remote sensing images and the road information in them, where the road features may include color, texture, height, temperature, shadow, and direction changes. Selected original remote sensing images from the sample data set are fed into the neural network; the neural network extracts the road features of the input images; the loss value of road feature extraction is determined at least according to the difference between the extracted road features and the road information in the original images; and the network parameters of the neural network are adjusted according to the loss value. Before training starts, the kernel parameters of the neural network are initialized with different small random numbers. "Small" ensures that the network does not enter a saturated state because the kernel parameter values are too large, which would cause training to fail; "different" ensures that the network can learn normally. In fact, if the kernel parameters were all initialized with the same number, the network would be unable to learn. The result produced by network training is compared with the true category and the error is corrected; the kernel parameters are continually optimized by adjusting them to minimize the error.
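The point about initializing with *different small* random numbers can be demonstrated in a few lines. This is an illustrative toy, not the patent's initialization scheme: each weight acts as a one-input neuron, so identical initial weights produce identical outputs (and would receive identical updates), while distinct small random weights break that symmetry.

```python
import random

def init_weights(n, scale=0.01, identical=False):
    """Initialize n weights; `identical=True` shows the failure mode."""
    if identical:
        return [scale] * n  # same number everywhere: symmetric, cannot learn
    return [random.uniform(-scale, scale) for _ in range(n)]

def neuron_outputs(weights, x):
    """Outputs of n one-input neurons; they differ only if the weights do."""
    return [w * x for w in weights]
```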
Optionally, progress information is output to the user's terminal during training, and the user consults the progress information to decide whether to end the training process early. If the user wants to stop training, a stop-training message is sent through the terminal.
With continued reference to Fig. 3, Fig. 3 is a schematic diagram of an application scenario of the method for generating a model according to this embodiment. In the application scenario of Fig. 3, the user sends a model generation request containing the user identifier to the server through the terminal 300. The server finds the model information set matching the user identifier in the model table and sends it to the terminal 300. The terminal 300 displays the model information set for the user to enter a custom model name 301 and to select a model category 302 and a number of network layers 303. The terminal sends the custom model name 301, the selected model category 302, and the number of network layers 303 entered by the user to the server. The server determines the neural network according to the model category 302 and the number of network layers 303, and then trains a model based on the sample data set uploaded by the terminal and the neural network.
With the method provided by the above embodiment of the present application, the user uploads a customized sample data set and can generate a customized model without writing any code.
With further reference to Fig. 4, a flow 400 of another embodiment of the method for generating a model is shown. The flow 400 of the method for generating a model comprises the following steps:
Step 401: in response to receiving a model generation request containing a user identifier sent by a user's terminal, look up the model information set corresponding to the user identifier in a preset model table, and send the model information set to the terminal.
Step 402: in response to receiving the model category and model parameters selected by the user from the model information set and sent by the terminal, determine a neural network that matches the selected model category and model parameters.
Step 403: in response to receiving the sample data set sent by the terminal, train a model based on the sample data set and the neural network using a machine learning method.
Steps 401-403 are substantially identical to steps 201-203 and are therefore not described again.
Step 404: obtain a predetermined number of input samples from the sample data set together with the corresponding predetermined number of output samples.
In this embodiment, the sample data set can continue to be used to evaluate the model after training is complete. For example, for supervised learning, each sample in the sample data set includes input sample data and output sample data. If 10,000 pairs of input and output samples were used when training the model, 100 pairs may be chosen from those 10,000 pairs when evaluating the model.
Step 405: for each piece of acquired input sample data, input the input sample data into the model to obtain the output result corresponding to the input sample data; if the similarity between the output result and the output sample data corresponding to the input sample data is greater than a predetermined similarity threshold, increment the count of correct verifications.
In the present embodiment, in order to evaluate the effect of the generated model, part of the sample data may be selected from the sample data set and input into the model generated in step 203 to obtain output results. Each output result is compared with the corresponding output sample to determine the similarity between them. The similarity may be computed with common methods such as cosine similarity or Euclidean distance. If the similarity between the output result and the output sample data corresponding to the input sample data is greater than the predetermined similarity threshold, the verification is considered to have passed. Verification is performed the predetermined number of times using the predetermined number of input sample data, and each time a verification passes, the count of correct verifications is incremented.
Step 406: determine the ratio of the number of correct verifications to the predetermined number as the accuracy rate.
In the present embodiment, step 405 is repeated the predetermined number of times, accumulating the number of correct verifications. The ratio of the number of correct verifications to the predetermined number is taken as the accuracy rate. For example, if 100 verifications are performed and 90 of them are correct, the accuracy rate is 90%.
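Steps 405-406 can be sketched as follows. The cosine-similarity measure and the 90-out-of-100 example follow the text above; the `echo_model` stand-in and all function names are illustrative assumptions, not the trained network itself.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def accuracy_rate(model, eval_pairs, similarity_threshold):
    """Step 405: count verifications whose output result is similar enough
    to the output sample. Step 406: accuracy = correct count / total count."""
    correct = 0
    for input_sample, output_sample in eval_pairs:
        output_result = model(input_sample)
        if cosine_similarity(output_result, output_sample) > similarity_threshold:
            correct += 1
    return correct / len(eval_pairs)

def echo_model(v):
    # Toy stand-in for the trained model: echoes its input vector.
    return v

# 90 pairs the toy model reproduces exactly, 10 it gets entirely wrong.
pairs = [([1.0, 0.0], [1.0, 0.0])] * 90 + [([1.0, 0.0], [0.0, 1.0])] * 10
print(accuracy_rate(echo_model, pairs, 0.9))  # 0.9, i.e. 90 of 100 verifications pass
```

With a 90% accuracy rate threshold this toy model would sit exactly at the boundary of the example in the text.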
Step 407: if the accuracy rate is greater than a predetermined accuracy rate threshold, convert the model into an application, and publish the application to a destination server for download and use by at least one terminal.
In the present embodiment, if the accuracy rate is greater than the predetermined accuracy rate threshold, the model is considered to have passed evaluation and to be available for use by other users. The model may be converted into an application and published to the destination server through a container service using a RESTful framework (REST, Representational State Transfer, refers to a set of architectural constraints and principles; an application or design that satisfies these constraints and principles is RESTful). In the name "Representational State Transfer", the subject is omitted: the "representation" actually refers to the representation of a "resource". A so-called "resource" is an entity on the network, or a specific piece of information on the network, that is exposed to the client. Examples of resources include application objects, database records, algorithms, and so on. A URI (Universal Resource Identifier) can be used to point to a resource, and each kind of resource corresponds to a specific URI. To obtain a resource, one accesses its URI, so the URI becomes the address or unique identifier of each resource. All resources share a unified interface, through which state is transferred between clients and servers. The container service provides high-performance, scalable management of container applications, supports the life cycle management of containerized applications, provides a variety of application publishing methods and continuous delivery capabilities, and supports a microservice architecture. The container service simplifies the building of container management clusters and integrates cloud virtualization, storage, network, and security capabilities to provide an optimal container running environment in the cloud.
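The idea that every resource is addressed by a unique URI through a unified interface can be illustrated with a minimal stdlib-only sketch. The route path, the prediction endpoint, and the doubling "model" are all hypothetical, not specified by the patent:

```python
import json

# Hypothetical published model exposed as a REST resource: each resource has
# a unique URI, and the unified interface maps (method, URI) to a handler.
ROUTES = {}

def route(method, uri):
    def register(handler):
        ROUTES[(method, uri)] = handler
        return handler
    return register

@route("POST", "/models/custom-model/predict")
def predict(body):
    # Stand-in for the trained model: doubles every input value.
    return {"outputs": [2 * x for x in body["inputs"]]}

def handle(method, uri, payload):
    """Dispatch a request to the handler registered for its URI."""
    handler = ROUTES.get((method, uri))
    if handler is None:
        return 404, json.dumps({"error": "no such resource"})
    return 200, json.dumps(handler(json.loads(payload)))

status, body = handle("POST", "/models/custom-model/predict", '{"inputs": [1, 2]}')
print(status, body)
```

A real deployment would put such handlers behind an HTTP server inside a container image; the dispatch table above only illustrates the URI-to-resource mapping described in the text.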
Optionally, a model obtained by unsupervised learning can be published directly without evaluation.
In some optional implementations of the present embodiment, feedback data sent by a terminal that has downloaded and used the application is received and added to the sample data set, wherein the feedback data includes actual input data that the terminal using the application input into the application, the actual output result corresponding to the actual input data, and the desired output result input by the user of the terminal using the application, and the similarity between the actual output result and the desired output result is less than the predetermined similarity threshold; the model is retrained based on the actual input data, the actual output result, and the desired output result, and the updated model is published to the destination server again. Here, the desired output result is the correct output result for the actual input data, input by the user using the model. When the user judges, manually or by machine, that the actual output result corresponding to the actual input data is not the desired output result, the desired output result is fed back to the server together with the actual input data and the corresponding actual output result. The server receives the feedback data and adds it to the sample data set; through data set version management, the model is retrained, and continuous integration of models and services is supported.
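The feedback mechanism described above can be sketched as follows. The record fields and the similarity check mirror the text; the function name is illustrative, and for simplicity the similarity is assumed to be precomputed rather than derived from the outputs here.

```python
def collect_feedback(sample_data_set, actual_input, actual_output,
                     desired_output, similarity, similarity_threshold):
    """If the actual output is not similar enough to the output the user
    expected, the feedback is accepted and added to the sample data set,
    with the desired output serving as the new output sample."""
    if similarity < similarity_threshold:
        sample_data_set.append((actual_input, desired_output))
        return True  # the model should later be retrained on the grown set
    return False

samples = [([0.0], [0.0])]
added = collect_feedback(samples, [1.0], [0.2], [1.0],
                         similarity=0.3, similarity_threshold=0.9)
print(added, len(samples))  # True 2
```

Each accepted feedback record grows the sample data set, which is what enables the retraining and republishing step described in the text.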
Step 408: if the accuracy rate is less than or equal to the predetermined accuracy rate threshold, determine recommended model information from the model information set according to the accuracy rate and the model category and model parameters selected by the user, and send the recommended model information to the terminal so that the user can reselect a model category and model parameters.
In the present embodiment, if the accuracy rate is less than or equal to the predetermined accuracy rate threshold, the model has not passed evaluation, and the user is therefore advised to retrain the model. The adjusted model information to use as the recommended model information (for example, whether to adjust the model parameters or the model type) can be determined according to the gap between the accuracy rate and the predetermined accuracy rate threshold, and the amount of adjustment can be determined according to the characteristics of different model types. For example, if the predetermined accuracy rate threshold is 90% and the actual test accuracy rate is 89% when the current number of network layers is 3, it can be recommended to adjust the number of network layers to 5. If the actual test accuracy rate is only 20% when the current number of network layers is 3, merely changing the number of network layers is unlikely to improve the accuracy rate much, so it can be recommended to adjust the model to another type of network, for example from a feedforward neural network to a convolutional neural network.
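The recommendation heuristic in the examples above might be sketched like this. The concrete adjustments (layers 3 to 5 for a small gap, switching network type for a large one) follow the text's examples, while the gap cutoff of 0.1 and the function name are assumed values for illustration.

```python
def recommend(accuracy, threshold, model_category, num_layers, large_gap=0.1):
    """Suggest adjusted model information when evaluation fails.

    A small gap to the threshold suggests tuning parameters (more layers);
    a large gap suggests changing the model category altogether."""
    if accuracy > threshold:
        return None  # evaluation passed, nothing to recommend
    gap = threshold - accuracy
    if gap < large_gap:
        # e.g. 89% against a 90% threshold: recommend more network layers.
        return {"model_category": model_category, "num_layers": num_layers + 2}
    # e.g. 20% against 90%: recommend another network type entirely.
    return {"model_category": "convolutional", "num_layers": num_layers}

print(recommend(0.89, 0.90, "feedforward", 3))
print(recommend(0.20, 0.90, "feedforward", 3))
```

The first call reproduces the "adjust the layers to 5" example, the second the "switch to a convolutional network" example.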
As can be seen from Fig. 4, compared with the embodiment corresponding to Fig. 2, the flow 400 of the method for generating a model in the present embodiment highlights the step of expanding the sample data. Thus, the scheme described in the present embodiment can introduce more diverse sample data, thereby improving the speed and accuracy of model generation.
With further reference to Fig. 5, as an implementation of the methods shown in the above figures, the present application provides an embodiment of an apparatus for generating a model. The apparatus embodiment corresponds to the method embodiment shown in Fig. 2, and the apparatus can be applied in various electronic devices.
As shown in Fig. 5, the apparatus 500 for generating a model of the present embodiment includes: a receiving unit 501, a determining unit 502, and a training unit 503. The receiving unit 501 is configured to, in response to receiving a model generation request containing a user identifier sent by the terminal of a user, search a preset model table for the model information set corresponding to the user identifier and send the model information set to the terminal, wherein the model information in the model information set includes model categories and model parameters, and the model table is used to characterize the correspondence between user identifiers and model information. The determining unit 502 is configured to, in response to receiving the model category and model parameters selected by the user from the model information set and sent by the terminal, determine a neural network matching the model category and model parameters selected by the user. The training unit 503 is configured to, in response to receiving the sample data set sent by the terminal, use a machine learning method to train the neural network based on the sample data set to obtain a model.
In the present embodiment, for the specific processing of the receiving unit 501, the determining unit 502, and the training unit 503 of the apparatus 500 for generating a model, reference may be made to steps 201, 202, and 203 in the embodiment corresponding to Fig. 2.
In some optional implementations of the present embodiment, the sample data in the sample data set includes input sample data and output sample data, and the apparatus 500 further includes a verification unit configured to: obtain from the sample data set a predetermined number of input sample data and the corresponding predetermined number of output sample data; for each piece of acquired input sample data, input the input sample data into the model to obtain the output result corresponding to the input sample data, and if the similarity between the output result and the output sample data corresponding to the input sample data is greater than the predetermined similarity threshold, increment the count of correct verifications; and determine the ratio of the number of correct verifications to the predetermined number as the accuracy rate.
In some optional implementations of the present embodiment, the apparatus 500 further includes a publishing unit configured to: if the accuracy rate is greater than the predetermined accuracy rate threshold, convert the model into an application and publish the application to the destination server for download and use by at least one terminal.
In some optional implementations of the present embodiment, the apparatus 500 further includes a feedback unit configured to: receive feedback data sent by a terminal that has downloaded and used the application and add the feedback data to the sample data set, wherein the feedback data includes actual input data that the terminal using the application input into the application, the actual output result corresponding to the actual input data, and the desired output result input by the user of the terminal using the application, and the similarity between the actual output result and the desired output result is less than the predetermined similarity threshold; and retrain the model based on the actual input data, the actual output result, and the desired output result, and publish the updated model to the destination server again.
In some optional implementations of the present embodiment, the apparatus 500 further includes a recommendation unit configured to: if the accuracy rate is less than or equal to the predetermined accuracy rate threshold, determine recommended model information from the model information set according to the accuracy rate and the model category and model parameters selected by the user, and send the recommended model information to the terminal so that the user can reselect a model category and model parameters.
In some optional implementations of the present embodiment, the training unit 503 is further configured to: select sample data from the sample data set and perform the following training steps: train the neural network using the sample data; adjust the network parameters of the neural network based on the training result; and determine whether a training completion condition has been met. In response to determining that the condition is not met and that no stop-training message sent by the terminal of the user has been received, other sample data are selected from the sample data set and the training steps continue to be performed. The training completion condition includes at least one of the following: the number of training iterations of the neural network reaches a preset training iteration threshold; in two adjacent training iterations, the loss value between the outputs of the neural network is less than a predetermined threshold.
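The training-unit loop (train on a sample, adjust parameters, check the completion conditions, possibly stop on a user message) can be sketched as follows. The one-parameter gradient-descent model is a deliberately simple stand-in for the neural network, and the learning rate and thresholds are assumed values; only the loop structure and the three stopping conditions mirror the text.

```python
import random

def train_model(sample_data_set, max_iterations=1000, loss_delta_threshold=1e-6,
                stop_requested=lambda: False, lr=0.05, seed=0):
    """Fit y = w * x by repeatedly choosing a sample, training on it, and
    checking the completion conditions: iteration count reached, loss change
    between adjacent iterations below a threshold, or a stop-training
    message received from the user's terminal."""
    rng = random.Random(seed)
    w, prev_loss, iterations = 0.0, None, 0
    while True:
        x, y = rng.choice(sample_data_set)        # choose sample data
        pred = w * x
        loss = (pred - y) ** 2
        w -= lr * 2 * (pred - y) * x              # adjust network parameters
        iterations += 1
        if iterations >= max_iterations:          # iteration-count condition
            break
        if prev_loss is not None and abs(prev_loss - loss) < loss_delta_threshold:
            break                                 # adjacent-loss condition
        if stop_requested():                      # stop message from terminal
            break
        prev_loss = loss
    return w, iterations

data = [(x, 3.0 * x) for x in (1.0, 2.0)]
w, n = train_model(data)
print(round(w, 2))  # converges to approximately 3.0
```

Passing a `stop_requested` callback that returns `True` models the terminal's stop-training message ending training after the current step.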
Referring now to Fig. 6, a structural schematic diagram of a computer system 600 suitable for implementing the server of the embodiments of the present application is illustrated. The server shown in Fig. 6 is merely an example and should not impose any limitation on the function and scope of use of the embodiments of the present application.
As shown in Fig. 6, the computer system 600 includes a central processing unit (CPU) 601, which can perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 602 or a program loaded from a storage portion 608 into a random access memory (RAM) 603. The RAM 603 also stores various programs and data required for the operation of the system 600. The CPU 601, the ROM 602, and the RAM 603 are connected to each other through a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.
The following components are connected to the I/O interface 605: an input portion 606 including a keyboard, a mouse, etc.; an output portion 607 including a cathode ray tube (CRT), a liquid crystal display (LCD), etc., and a speaker; a storage portion 608 including a hard disk, etc.; and a communication portion 609 including a network interface card such as a LAN card or a modem. The communication portion 609 performs communication processing via a network such as the Internet. A driver 610 is also connected to the I/O interface 605 as needed. A removable medium 611, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the driver 610 as needed, so that a computer program read from it can be installed into the storage portion 608 as needed.
In particular, according to embodiments of the present disclosure, the process described above with reference to the flowchart may be implemented as a computer software program. For example, an embodiment of the disclosure includes a computer program product comprising a computer program carried on a computer-readable medium, the computer program containing program code for executing the method shown in the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication portion 609, and/or installed from the removable medium 611. When the computer program is executed by the central processing unit (CPU) 601, the above functions defined in the method of the present application are performed. It should be noted that the computer-readable medium described herein may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the two. A computer-readable storage medium may be, for example, but is not limited to, an electric, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above.
More specific examples of computer-readable storage media may include, but are not limited to: an electrical connection with one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. In the present application, a computer-readable storage medium may be any tangible medium that contains or stores a program that can be used by or in connection with an instruction execution system, apparatus, or device. In the present application, a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, carrying computer-readable program code. Such a propagated data signal may take various forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the above. A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium; such a medium can send, propagate, or transmit a program for use by or in connection with an instruction execution system, apparatus, or device. The program code contained on the computer-readable medium may be transmitted by any appropriate medium, including but not limited to wireless, wire, optical cable, RF, or any suitable combination of the above.
Computer program code for performing the operations of the present application may be written in one or more programming languages or a combination thereof, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the C language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case involving a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the accompanying drawings illustrate the possible architectures, functions, and operations of systems, methods, and computer program products according to various embodiments of the present application. In this regard, each block in the flowcharts or block diagrams may represent a module, a program segment, or a portion of code, which contains one or more executable instructions for implementing the specified logical function. It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur in an order different from that noted in the drawings. For example, two blocks shown in succession may in fact be executed substantially in parallel, or they may sometimes be executed in the reverse order, depending on the functions involved. It should also be noted that each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a special-purpose hardware-based system that performs the specified functions or operations, or by a combination of special-purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or by hardware. The described units may also be provided in a processor; for example, they may be described as: a processor including a receiving unit, a determining unit, and a training unit. The names of these units do not, under certain circumstances, constitute a limitation on the units themselves; for example, the receiving unit may also be described as "a unit that, in response to receiving a model generation request containing a user identifier sent by the terminal of a user, searches a preset model table for the model information set corresponding to the user identifier and sends the model information set to the terminal".
As another aspect, the present application also provides a computer-readable medium, which may be included in the apparatus described in the above embodiments, or may exist separately without being incorporated into the apparatus. The computer-readable medium carries one or more programs, and when the one or more programs are executed by the apparatus, the apparatus: in response to receiving a model generation request containing a user identifier sent by the terminal of a user, searches a preset model table for the model information set corresponding to the user identifier, and sends the model information set to the terminal, wherein the model information in the model information set includes model categories and model parameters, and the model table is used to characterize the correspondence between user identifiers and model information; in response to receiving the model category and model parameters selected by the user from the model information set and sent by the terminal, determines a neural network matching the model category and model parameters selected by the user; and in response to receiving the sample data set sent by the terminal, uses a machine learning method to train the neural network based on the sample data set to obtain a model.
The above description is only a preferred embodiment of the present application and an explanation of the applied technical principles. Those skilled in the art should understand that the scope of the invention involved in the present application is not limited to technical schemes formed by the particular combination of the above technical features, but should also cover other technical schemes formed by any combination of the above technical features or their equivalent features without departing from the inventive concept, for example, technical schemes formed by replacing the above features with (but not limited to) technical features with similar functions disclosed in the present application.

Claims (14)

1. A method for generating a model, comprising:
in response to receiving a model generation request containing a user identifier sent by the terminal of a user, searching a preset model table for the model information set corresponding to the user identifier, and sending the model information set to the terminal, wherein the model information in the model information set includes model categories and model parameters, and the model table is used to characterize the correspondence between user identifiers and model information;
in response to receiving the model category and model parameters selected by the user from the model information set and sent by the terminal, determining a neural network matching the model category and model parameters selected by the user;
in response to receiving the sample data set sent by the terminal, training the neural network based on the sample data set using a machine learning method to obtain a model.
2. The method according to claim 1, wherein the sample data in the sample data set includes input sample data and output sample data; and
the method further comprises:
obtaining from the sample data set a predetermined number of input sample data and the predetermined number of output sample data corresponding to the predetermined number of input sample data;
for each piece of acquired input sample data, inputting the input sample data into the model to obtain the output result corresponding to the input sample data, and if the similarity between the output result and the output sample data corresponding to the input sample data is greater than a predetermined similarity threshold, incrementing the count of correct verifications;
determining the ratio of the number of correct verifications to the predetermined number as an accuracy rate.
3. The method according to claim 2, wherein the method further comprises:
if the accuracy rate is greater than a predetermined accuracy rate threshold, converting the model into an application, and publishing the application to a destination server for download and use by at least one terminal.
4. The method according to claim 3, wherein the method further comprises:
receiving feedback data sent by a terminal that has downloaded and used the application and adding the feedback data to the sample data set, wherein the feedback data includes actual input data that the terminal using the application input into the application, the actual output result corresponding to the actual input data, and the desired output result input by the user of the terminal using the application, and the similarity between the actual output result and the desired output result is less than the predetermined similarity threshold;
retraining the model based on the actual input data, the actual output result, and the desired output result, and publishing the updated model to the destination server again.
5. The method according to claim 2, wherein the method further comprises:
if the accuracy rate is less than or equal to a predetermined accuracy rate threshold, determining recommended model information from the model information set according to the accuracy rate and the model category and model parameters selected by the user, and sending the recommended model information to the terminal so that the user can reselect a model category and model parameters.
6. The method according to any one of claims 1-5, wherein training the neural network based on the sample data set to obtain a model comprises:
selecting sample data from the sample data set and performing the following training steps: training the neural network using the sample data; adjusting the network parameters of the neural network based on the training result; determining whether a training completion condition has been met;
in response to determining that the condition is not met and that no stop-training message sent by the terminal of the user has been received, selecting other sample data from the sample data set and continuing to perform the training steps;
wherein the training completion condition includes at least one of the following:
the number of training iterations of the neural network reaches a preset training iteration threshold;
in two adjacent training iterations, the loss value between the outputs of the neural network is less than a predetermined threshold.
7. An apparatus for generating a model, comprising:
a receiving unit configured to, in response to receiving a model generation request containing a user identifier sent by the terminal of a user, search a preset model table for the model information set corresponding to the user identifier and send the model information set to the terminal, wherein the model information in the model information set includes model categories and model parameters, and the model table is used to characterize the correspondence between user identifiers and model information;
a determining unit configured to, in response to receiving the model category and model parameters selected by the user from the model information set and sent by the terminal, determine a neural network matching the model category and model parameters selected by the user;
a training unit configured to, in response to receiving the sample data set sent by the terminal, train the neural network based on the sample data set using a machine learning method to obtain a model.
8. The apparatus according to claim 7, wherein the sample data in the sample data set includes input sample data and output sample data; and
the apparatus further comprises a verification unit configured to:
obtain from the sample data set a predetermined number of input sample data and the predetermined number of output sample data corresponding to the predetermined number of input sample data;
for each piece of acquired input sample data, input the input sample data into the model to obtain the output result corresponding to the input sample data, and if the similarity between the output result and the output sample data corresponding to the input sample data is greater than a predetermined similarity threshold, increment the count of correct verifications;
determine the ratio of the number of correct verifications to the predetermined number as an accuracy rate.
9. The apparatus according to claim 8, wherein the apparatus further comprises a publishing unit configured to:
if the accuracy rate is greater than a predetermined accuracy rate threshold, convert the model into an application, and publish the application to a destination server for download and use by at least one terminal.
10. The apparatus according to claim 8, wherein the apparatus further comprises a feedback unit configured to:
receive feedback data sent by a terminal that has downloaded and used the application and add the feedback data to the sample data set, wherein the feedback data includes actual input data that the terminal using the application input into the application, the actual output result corresponding to the actual input data, and the desired output result input by the user of the terminal using the application, and the similarity between the actual output result and the desired output result is less than the predetermined similarity threshold;
retrain the model based on the actual input data, the actual output result, and the desired output result, and publish the updated model to the destination server again.
11. The apparatus according to claim 8, wherein the apparatus further comprises a recommendation unit configured to:
if the accuracy rate is less than or equal to a predetermined accuracy rate threshold, determine recommended model information from the model information set according to the accuracy rate and the model category and model parameters selected by the user, and send the recommended model information to the terminal so that the user can reselect a model category and model parameters.
12. The apparatus according to any one of claims 7-11, wherein the training unit is further configured to:
select sample data from the sample data set and perform the following training steps: train the neural network using the sample data; adjust the network parameters of the neural network based on the training result; determine whether a training completion condition has been met;
in response to determining that the condition is not met and that no stop-training message sent by the terminal of the user has been received, select other sample data from the sample data set and continue to perform the training steps;
wherein the training completion condition includes at least one of the following:
the number of training iterations of the neural network reaches a preset training iteration threshold;
in two adjacent training iterations, the loss value between the outputs of the neural network is less than a predetermined threshold.
13. A server, comprising:
one or more processors; and
a storage device for storing one or more programs,
wherein, when the one or more programs are executed by the one or more processors, the one or more processors implement the method according to any one of claims 1-6.
14. A computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the method according to any one of claims 1-6.
CN201711157646.0A 2017-11-20 2017-11-20 Method and apparatus for generating a model Active CN107766940B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711157646.0A CN107766940B (en) 2017-11-20 2017-11-20 Method and apparatus for generating a model


Publications (2)

Publication Number Publication Date
CN107766940A true CN107766940A (en) 2018-03-06
CN107766940B CN107766940B (en) 2021-07-23

Family

ID=61280122

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711157646.0A Active CN107766940B (en) 2017-11-20 2017-11-20 Method and apparatus for generating a model

Country Status (1)

Country Link
CN (1) CN107766940B (en)

Cited By (53)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108664610A (en) * 2018-05-11 2018-10-16 北京京东金融科技控股有限公司 Method and apparatus for handling data
CN108764369A (en) * 2018-06-07 2018-11-06 深圳市公安局公交分局 Character recognition method, device based on data fusion and computer storage media
CN108764487A (en) * 2018-05-29 2018-11-06 北京百度网讯科技有限公司 For generating the method and apparatus of model, the method and apparatus of information for identification
CN108805091A (en) * 2018-06-15 2018-11-13 北京字节跳动网络技术有限公司 Method and apparatus for generating model
CN108829518A (en) * 2018-05-31 2018-11-16 北京百度网讯科技有限公司 Method and apparatus for pushed information
CN108847066A (en) * 2018-05-31 2018-11-20 上海与德科技有限公司 A kind of content of courses reminding method, device, server and storage medium
CN108960316A (en) * 2018-06-27 2018-12-07 北京字节跳动网络技术有限公司 Method and apparatus for generating model
CN108984683A (en) * 2018-06-29 2018-12-11 北京百度网讯科技有限公司 Extracting method, system, equipment and the storage medium of structural data
CN108985386A (en) * 2018-08-07 2018-12-11 北京旷视科技有限公司 Obtain method, image processing method and the corresponding intrument of image processing model
CN109086789A (en) * 2018-06-08 2018-12-25 四川斐讯信息技术有限公司 A kind of image-recognizing method and system
CN109242025A (en) * 2018-09-14 2019-01-18 北京旷视科技有限公司 Model iterative correction methods, apparatus and system
CN109272116A (en) * 2018-09-05 2019-01-25 郑州云海信息技术有限公司 A kind of method and device of deep learning
CN109344885A (en) * 2018-09-14 2019-02-15 深圳增强现实技术有限公司 Deep learning identifying system, method and electronic equipment
CN109376267A (en) * 2018-10-30 2019-02-22 北京字节跳动网络技术有限公司 Method and apparatus for generating model
CN109492771A (en) * 2018-11-12 2019-03-19 北京百度网讯科技有限公司 Exchange method, device and system
CN109492698A (en) * 2018-11-20 2019-03-19 腾讯科技(深圳)有限公司 A kind of method of model training, the method for object detection and relevant apparatus
CN109635833A (en) * 2018-10-30 2019-04-16 银河水滴科技(北京)有限公司 A kind of image-recognizing method and system based on cloud platform and model intelligent recommendation
CN109643229A (en) * 2018-04-17 2019-04-16 深圳鲲云信息科技有限公司 The application and development method and Related product of network model
CN109711436A (en) * 2018-12-05 2019-05-03 量子云未来(北京)信息科技有限公司 A kind of artificial intelligence training pattern construction method, device and storage medium
CN109710819A (en) * 2018-12-29 2019-05-03 北京航天数据股份有限公司 A kind of model display method, apparatus, equipment and medium
CN109816114A (en) * 2018-12-29 2019-05-28 大唐软件技术股份有限公司 A kind of generation method of machine learning model, device
CN110163380A (en) * 2018-04-28 2019-08-23 腾讯科技(深圳)有限公司 Data analysing method, model training method, device, equipment and storage medium
CN110288089A (en) * 2019-06-28 2019-09-27 北京百度网讯科技有限公司 Method and apparatus for sending information
CN110532344A (en) * 2019-08-06 2019-12-03 北京如优教育科技有限公司 Automatic Selected Topic System based on deep neural network model
CN110543946A (en) * 2018-05-29 2019-12-06 百度在线网络技术(北京)有限公司 method and apparatus for training a model
CN110554047A (en) * 2019-09-06 2019-12-10 腾讯科技(深圳)有限公司 method, device, system and equipment for processing product defect detection data
CN110569313A (en) * 2018-05-17 2019-12-13 北京京东尚科信息技术有限公司 Method and device for judging grade of model table of data warehouse
WO2020029689A1 (en) * 2018-08-07 2020-02-13 阿里巴巴集团控股有限公司 Data processing model construction method and device, server and client
CN110837620A (en) * 2019-11-14 2020-02-25 帝国理工创新有限公司 Advanced online database system for publishing and running model and hosting data
CN110990870A (en) * 2019-11-29 2020-04-10 上海能塔智能科技有限公司 Operation and maintenance, processing method, device, equipment and medium using model library
CN111078984A (en) * 2019-11-05 2020-04-28 深圳奇迹智慧网络有限公司 Network model publishing method and device, computer equipment and storage medium
CN111199287A (en) * 2019-12-16 2020-05-26 北京淇瑀信息科技有限公司 Feature engineering real-time recommendation method and device and electronic equipment
CN111242317A (en) * 2020-01-09 2020-06-05 深圳供电局有限公司 Method and device for managing application, computer equipment and storage medium
CN111291882A (en) * 2018-12-06 2020-06-16 北京百度网讯科技有限公司 Model conversion method, device, equipment and computer storage medium
CN111428869A (en) * 2020-03-19 2020-07-17 北京源清慧虹信息科技有限公司 Model generation method and device, computer equipment and storage medium
WO2020155045A1 (en) * 2019-01-31 2020-08-06 西门子股份公司 Method and device for establishing communication model of network device
CN111797869A (en) * 2019-04-09 2020-10-20 Oppo广东移动通信有限公司 Model training method and device, storage medium and electronic equipment
CN112068854A (en) * 2019-06-10 2020-12-11 杭州海康威视数字技术股份有限公司 Intelligent device algorithm updating system, intelligent device and platform server
CN112088386A (en) * 2018-05-07 2020-12-15 希侬人工智能公司 Generating customized machine learning models to perform tasks using artificial intelligence
CN112130723A (en) * 2018-05-25 2020-12-25 第四范式(北京)技术有限公司 Method and system for performing feature processing on data
CN112381000A (en) * 2020-11-16 2021-02-19 深圳前海微众银行股份有限公司 Face recognition method, device, equipment and storage medium based on federal learning
CN112650528A (en) * 2020-12-31 2021-04-13 新奥数能科技有限公司 Personalized algorithm generation method and device, electronic equipment and computer readable medium
CN113241056A (en) * 2021-04-26 2021-08-10 标贝(北京)科技有限公司 Method, device, system and medium for training speech synthesis model and speech synthesis
CN113408634A (en) * 2021-06-29 2021-09-17 深圳市商汤科技有限公司 Model recommendation method and device, equipment and computer storage medium
CN113469364A (en) * 2020-03-31 2021-10-01 杭州海康威视数字技术股份有限公司 Inference platform, method and device
CN113554401A (en) * 2021-08-05 2021-10-26 杭州拼便宜网络科技有限公司 Inventory data management method, device, equipment and storage medium
WO2021244334A1 (en) * 2020-05-30 2021-12-09 华为技术有限公司 Information processing method and related device
WO2022077946A1 (en) * 2020-10-14 2022-04-21 新智数字科技有限公司 Data measurement method and apparatus, and electronic device and computer-readable medium
WO2022082516A1 (en) * 2020-10-21 2022-04-28 华为技术有限公司 Data transmission method and communication apparatus
CN114902622A (en) * 2020-01-03 2022-08-12 华为技术有限公司 Network entity for determining a model for digitally analyzing input data
CN115001559A (en) * 2022-03-17 2022-09-02 中国科学院计算技术研究所 User terminal distribution model construction method suitable for satellite network
EP4208991A4 (en) * 2021-02-17 2023-11-01 Huawei Technologies Co., Ltd. Entities and methods for trained data model selection in 5g mobile networks
WO2023212926A1 (en) * 2022-05-06 2023-11-09 Oppo广东移动通信有限公司 Communication methods, and devices

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101782976A (en) * 2010-01-15 2010-07-21 南京邮电大学 Automatic selection method for machine learning in cloud computing environment
CN105912500A (en) * 2016-03-30 2016-08-31 百度在线网络技术(北京)有限公司 Machine learning model generation method and machine learning model generation device
CN106230792A (en) * 2016-07-21 2016-12-14 北京百度网讯科技有限公司 Machine learning method based on mobile office, terminal unit and system
CN106250986A (en) * 2015-06-04 2016-12-21 波音公司 Advanced analysis base frame for machine learning
CN106779087A (en) * 2016-11-30 2017-05-31 福建亿榕信息技术有限公司 A kind of general-purpose machinery learning data analysis platform
CN108229686A (en) * 2016-12-14 2018-06-29 阿里巴巴集团控股有限公司 Model training, Forecasting Methodology, device, electronic equipment and machine learning platform
CN108604222A (en) * 2015-12-28 2018-09-28 云脑科技有限公司 System and method for deployment customized machine learning service
CN109656529A (en) * 2018-10-31 2019-04-19 北京大学 A kind of on-line customization method and system for client deep learning


Cited By (78)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019200544A1 (en) * 2018-04-17 2019-10-24 深圳鲲云信息科技有限公司 Method for implementing and developing network model and related product
US11954576B2 (en) 2018-04-17 2024-04-09 Shenzhen Corerain Technologies Co., Ltd. Method for implementing and developing network model and related product
CN109643229A (en) * 2018-04-17 2019-04-16 深圳鲲云信息科技有限公司 The application and development method and Related product of network model
CN110163380B (en) * 2018-04-28 2023-07-07 腾讯科技(深圳)有限公司 Data analysis method, model training method, device, equipment and storage medium
CN110163380A (en) * 2018-04-28 2019-08-23 腾讯科技(深圳)有限公司 Data analysing method, model training method, device, equipment and storage medium
CN112088386B (en) * 2018-05-07 2024-04-09 苹果公司 Generating customized machine learning models to perform tasks using artificial intelligence
CN112088386A (en) * 2018-05-07 2020-12-15 希侬人工智能公司 Generating customized machine learning models to perform tasks using artificial intelligence
CN108664610A (en) * 2018-05-11 2018-10-16 北京京东金融科技控股有限公司 Method and apparatus for handling data
CN110569313A (en) * 2018-05-17 2019-12-13 北京京东尚科信息技术有限公司 Method and device for judging grade of model table of data warehouse
CN110569313B (en) * 2018-05-17 2023-12-05 北京京东尚科信息技术有限公司 Model table level judging method and device of data warehouse
CN112130723B (en) * 2018-05-25 2023-04-18 第四范式(北京)技术有限公司 Method and system for performing feature processing on data
CN112130723A (en) * 2018-05-25 2020-12-25 第四范式(北京)技术有限公司 Method and system for performing feature processing on data
US11210608B2 (en) 2018-05-29 2021-12-28 Beijing Baidu Netcom Science And Technology Co., Ltd. Method and apparatus for generating model, method and apparatus for recognizing information
CN110543946B (en) * 2018-05-29 2022-07-05 百度在线网络技术(北京)有限公司 Method and apparatus for training a model
CN108764487A (en) * 2018-05-29 2018-11-06 北京百度网讯科技有限公司 For generating the method and apparatus of model, the method and apparatus of information for identification
CN110543946A (en) * 2018-05-29 2019-12-06 百度在线网络技术(北京)有限公司 method and apparatus for training a model
CN108829518B (en) * 2018-05-31 2020-01-03 北京百度网讯科技有限公司 Method and device for pushing information
CN108847066A (en) * 2018-05-31 2018-11-20 上海与德科技有限公司 A kind of content of courses reminding method, device, server and storage medium
CN108829518A (en) * 2018-05-31 2018-11-16 北京百度网讯科技有限公司 Method and apparatus for pushed information
CN108764369A (en) * 2018-06-07 2018-11-06 深圳市公安局公交分局 Character recognition method, device based on data fusion and computer storage media
CN108764369B (en) * 2018-06-07 2021-10-22 深圳市公安局公交分局 Figure identification method and device based on data fusion and computer storage medium
CN109086789A (en) * 2018-06-08 2018-12-25 四川斐讯信息技术有限公司 A kind of image-recognizing method and system
CN108805091A (en) * 2018-06-15 2018-11-13 北京字节跳动网络技术有限公司 Method and apparatus for generating model
WO2019237657A1 (en) * 2018-06-15 2019-12-19 北京字节跳动网络技术有限公司 Method and device for generating model
CN108960316B (en) * 2018-06-27 2020-10-30 北京字节跳动网络技术有限公司 Method and apparatus for generating a model
CN108960316A (en) * 2018-06-27 2018-12-07 北京字节跳动网络技术有限公司 Method and apparatus for generating model
CN108984683A (en) * 2018-06-29 2018-12-11 北京百度网讯科技有限公司 Extracting method, system, equipment and the storage medium of structural data
CN108984683B (en) * 2018-06-29 2021-06-25 北京百度网讯科技有限公司 Method, system, equipment and storage medium for extracting structured data
WO2020029689A1 (en) * 2018-08-07 2020-02-13 阿里巴巴集团控股有限公司 Data processing model construction method and device, server and client
CN108985386A (en) * 2018-08-07 2018-12-11 北京旷视科技有限公司 Obtain method, image processing method and the corresponding intrument of image processing model
TWI703458B (en) * 2018-08-07 2020-09-01 香港商阿里巴巴集團服務有限公司 Data processing model construction method, device, server and client
US11210569B2 (en) 2018-08-07 2021-12-28 Advanced New Technologies Co., Ltd. Method, apparatus, server, and user terminal for constructing data processing model
CN109272116A (en) * 2018-09-05 2019-01-25 郑州云海信息技术有限公司 A kind of method and device of deep learning
CN109344885A (en) * 2018-09-14 2019-02-15 深圳增强现实技术有限公司 Deep learning identifying system, method and electronic equipment
CN109242025A (en) * 2018-09-14 2019-01-18 北京旷视科技有限公司 Model iterative correction methods, apparatus and system
CN109376267A (en) * 2018-10-30 2019-02-22 北京字节跳动网络技术有限公司 Method and apparatus for generating model
CN109635833A (en) * 2018-10-30 2019-04-16 银河水滴科技(北京)有限公司 A kind of image-recognizing method and system based on cloud platform and model intelligent recommendation
CN109376267B (en) * 2018-10-30 2020-11-13 北京字节跳动网络技术有限公司 Method and apparatus for generating a model
CN109492771A (en) * 2018-11-12 2019-03-19 北京百度网讯科技有限公司 Exchange method, device and system
CN109492698A (en) * 2018-11-20 2019-03-19 腾讯科技(深圳)有限公司 A kind of method of model training, the method for object detection and relevant apparatus
CN109711436A (en) * 2018-12-05 2019-05-03 量子云未来(北京)信息科技有限公司 A kind of artificial intelligence training pattern construction method, device and storage medium
CN111291882A (en) * 2018-12-06 2020-06-16 北京百度网讯科技有限公司 Model conversion method, device, equipment and computer storage medium
CN109710819A (en) * 2018-12-29 2019-05-03 北京航天数据股份有限公司 A kind of model display method, apparatus, equipment and medium
CN109816114A (en) * 2018-12-29 2019-05-28 大唐软件技术股份有限公司 A kind of generation method of machine learning model, device
WO2020155045A1 (en) * 2019-01-31 2020-08-06 西门子股份公司 Method and device for establishing communication model of network device
CN111797869A (en) * 2019-04-09 2020-10-20 Oppo广东移动通信有限公司 Model training method and device, storage medium and electronic equipment
CN112068854B (en) * 2019-06-10 2023-09-01 杭州海康威视数字技术股份有限公司 Intelligent device algorithm updating system, intelligent device and platform server
CN112068854A (en) * 2019-06-10 2020-12-11 杭州海康威视数字技术股份有限公司 Intelligent device algorithm updating system, intelligent device and platform server
CN110288089B (en) * 2019-06-28 2021-07-09 北京百度网讯科技有限公司 Method and apparatus for transmitting information
CN110288089A (en) * 2019-06-28 2019-09-27 北京百度网讯科技有限公司 Method and apparatus for sending information
CN110532344A (en) * 2019-08-06 2019-12-03 北京如优教育科技有限公司 Automatic Selected Topic System based on deep neural network model
CN110554047A (en) * 2019-09-06 2019-12-10 腾讯科技(深圳)有限公司 method, device, system and equipment for processing product defect detection data
CN110554047B (en) * 2019-09-06 2021-07-02 腾讯科技(深圳)有限公司 Method, device, system and equipment for processing product defect detection data
CN111078984B (en) * 2019-11-05 2024-02-06 深圳奇迹智慧网络有限公司 Network model issuing method, device, computer equipment and storage medium
CN111078984A (en) * 2019-11-05 2020-04-28 深圳奇迹智慧网络有限公司 Network model publishing method and device, computer equipment and storage medium
CN110837620A (en) * 2019-11-14 2020-02-25 帝国理工创新有限公司 Advanced online database system for publishing and running model and hosting data
CN110990870A (en) * 2019-11-29 2020-04-10 上海能塔智能科技有限公司 Operation and maintenance, processing method, device, equipment and medium using model library
CN111199287A (en) * 2019-12-16 2020-05-26 北京淇瑀信息科技有限公司 Feature engineering real-time recommendation method and device and electronic equipment
CN114902622B (en) * 2020-01-03 2024-03-26 华为技术有限公司 Network entity for determining a model for digitally analysing input data
CN114902622A (en) * 2020-01-03 2022-08-12 华为技术有限公司 Network entity for determining a model for digitally analyzing input data
CN111242317A (en) * 2020-01-09 2020-06-05 深圳供电局有限公司 Method and device for managing application, computer equipment and storage medium
CN111242317B (en) * 2020-01-09 2023-11-24 深圳供电局有限公司 Method, device, computer equipment and storage medium for managing application
CN111428869A (en) * 2020-03-19 2020-07-17 北京源清慧虹信息科技有限公司 Model generation method and device, computer equipment and storage medium
CN113469364B (en) * 2020-03-31 2023-10-13 杭州海康威视数字技术股份有限公司 Reasoning platform, method and device
CN113469364A (en) * 2020-03-31 2021-10-01 杭州海康威视数字技术股份有限公司 Inference platform, method and device
WO2021244334A1 (en) * 2020-05-30 2021-12-09 华为技术有限公司 Information processing method and related device
WO2022077946A1 (en) * 2020-10-14 2022-04-21 新智数字科技有限公司 Data measurement method and apparatus, and electronic device and computer-readable medium
WO2022082516A1 (en) * 2020-10-21 2022-04-28 华为技术有限公司 Data transmission method and communication apparatus
CN112381000A (en) * 2020-11-16 2021-02-19 深圳前海微众银行股份有限公司 Face recognition method, device, equipment and storage medium based on federal learning
CN112650528A (en) * 2020-12-31 2021-04-13 新奥数能科技有限公司 Personalized algorithm generation method and device, electronic equipment and computer readable medium
CN112650528B (en) * 2020-12-31 2024-05-14 新奥数能科技有限公司 Personalized algorithm generation method, device, electronic equipment and computer readable medium
EP4208991A4 (en) * 2021-02-17 2023-11-01 Huawei Technologies Co., Ltd. Entities and methods for trained data model selection in 5g mobile networks
CN113241056B (en) * 2021-04-26 2024-03-15 标贝(青岛)科技有限公司 Training and speech synthesis method, device, system and medium for speech synthesis model
CN113241056A (en) * 2021-04-26 2021-08-10 标贝(北京)科技有限公司 Method, device, system and medium for training speech synthesis model and speech synthesis
CN113408634A (en) * 2021-06-29 2021-09-17 深圳市商汤科技有限公司 Model recommendation method and device, equipment and computer storage medium
CN113554401A (en) * 2021-08-05 2021-10-26 杭州拼便宜网络科技有限公司 Inventory data management method, device, equipment and storage medium
CN115001559A (en) * 2022-03-17 2022-09-02 中国科学院计算技术研究所 User terminal distribution model construction method suitable for satellite network
WO2023212926A1 (en) * 2022-05-06 2023-11-09 Oppo广东移动通信有限公司 Communication methods, and devices

Also Published As

Publication number Publication date
CN107766940B (en) 2021-07-23

Similar Documents

Publication Publication Date Title
CN107766940A (en) Method and apparatus for generation model
CN108898185A (en) Method and apparatus for generating image recognition model
CN107516090A (en) Integrated face identification method and system
CN107908789A (en) Method and apparatus for generating information
CN105912500A (en) Machine learning model generation method and machine learning model generation device
CN108830235A (en) Method and apparatus for generating information
US10614347B2 (en) Identifying parameter image adjustments using image variation and sequential processing
CN110288049A (en) Method and apparatus for generating image recognition model
CN106484766B (en) Searching method and device based on artificial intelligence
CN108121800A (en) Information generating method and device based on artificial intelligence
CN109981787B (en) Method and device for displaying information
CN107958247A (en) Method and apparatus for facial image identification
CN110263938A (en) Method and apparatus for generating information
CN110009059B (en) Method and apparatus for generating a model
CN107145395A (en) Method and apparatus for handling task
CN111104599B (en) Method and device for outputting information
CN111539903B (en) Method and device for training face image synthesis model
US20230206420A1 (en) Method for detecting defect and method for training model
CN109271556A (en) Method and apparatus for output information
CN110457476A (en) Method and apparatus for generating disaggregated model
CN107590484A (en) Method and apparatus for information to be presented
CN107330091A (en) Information processing method and device
CN109862100A (en) Method and apparatus for pushed information
CN111738010A (en) Method and apparatus for generating semantic matching model
CN110046571B (en) Method and device for identifying age

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant