CN106295697A - Classification method based on semi-supervised transfer learning - Google Patents
Classification method based on semi-supervised transfer learning Download PDF Info
- Publication number
- CN106295697A CN106295697A CN201610651405.0A CN201610651405A CN106295697A CN 106295697 A CN106295697 A CN 106295697A CN 201610651405 A CN201610651405 A CN 201610651405A CN 106295697 A CN106295697 A CN 106295697A
- Authority
- CN
- China
- Prior art keywords
- data
- label
- classifiers
- learning algorithm
- task learning
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
Abstract
The invention discloses a classification method based on semi-supervised transfer learning. The method includes: preprocessing the labeled data of a source data set to obtain a feature classifier for the source data set; using a multi-task learning algorithm to perform iterative transfer training on the unlabeled data of a target data set and the feature classifier of the source data set, obtaining a target classifier; and using the target classifier to classify features. The method saves resources and improves classification accuracy.
Description
Technical field
The present invention relates to the field of machine learning, and in particular to a classification method based on semi-supervised transfer learning.
Background art
Transfer learning is a learning paradigm that has risen in recent years and is widely used in machine learning and data mining. It aids learning of a target task in a new field by exploiting the large amount of data available in a similar field. Transfer learning does not require the training data and the test data to follow the same distribution, nor does it require a large amount of labeled data in the target domain; it can use previously learned models or outdated labeled data to help a new data field learn better. The goal of transfer learning is to migrate knowledge from the source domain into the target domain and thereby assist learning in the target domain.
In the field of machine learning there are two traditional learning approaches: supervised learning and unsupervised learning. Semi-supervised learning, an important research topic in pattern recognition and machine learning, combines the two; it mainly considers how to train and classify using a small amount of labeled samples together with a large amount of unlabeled samples. Semi-supervised learning is of great practical significance for reducing labeling cost and improving learner performance. Multi-task learning is another machine learning approach. Compared with single-task learning, it emphasizes the connections between tasks: through joint learning it fits regression functions for multiple different tasks simultaneously, taking into account both the differences and the connections between tasks, which is one of the most important ideas of multi-task learning.
At present, unsupervised learning uses large unlabeled data sets in both the source domain and the target domain. This ignores whatever labeled data is present in the data sets, which wastes resources, yields an inferior learning model, and leads to relatively low classification accuracy. Moreover, unsupervised processing requires extensive analysis of its results before reliable classification results can be obtained, which consumes substantial manpower and material resources. In addition, the clusters produced by unsupervised learning may or may not correspond to the true classes, and the widespread phenomena of "same object, different spectra" and "different objects, same spectrum" make matching clusters to classes difficult.
Summary of the invention
It is an object of the present invention to provide a classification method based on semi-supervised transfer learning, so as to save resources and improve classification accuracy.
To solve the above technical problem, the present invention provides a classification method based on semi-supervised transfer learning, including:
preprocessing the labeled data of a source data set to obtain a feature classifier for the source data set;
using a multi-task learning algorithm to perform iterative transfer training on the unlabeled data of a target data set and the feature classifier of the source data set, obtaining a target classifier; and
using the target classifier to classify features.
Preferably, the multi-task learning algorithm is applicable to supervised learning, and the multi-task learning algorithm is a feature selection algorithm based on multi-task learning.
Preferably, preprocessing the labeled data of the source data set to obtain the feature classifier of the source data set includes:
iterating continuously over the labeled data of the source data set to find the best-fitting parameter, obtaining the feature classifier f_s of the source data set.
Preferably, using the multi-task learning algorithm to perform iterative transfer training on the unlabeled data of the target data set and the feature classifier of the source data set, obtaining the target classifier, includes:
establishing the multi-task learning algorithm;
optimizing the parameters of the multi-task learning algorithm; and
obtaining the target classifier.
Preferably, the objective function of the multi-task learning algorithm is expressed as follows:
(formula omitted in the source)
where f_s denotes the feature classifier of the source data set, l_s denotes the loss function, w = w_0 + w_r denotes the classifier parameter, f_l denotes the target classifier of the target data set, γ, β, c and θ all denote regularization parameters, s_s denotes the feasible set of the source data set on (0,1), and n denotes the number of data samples in the target data set.
Preferably, optimizing the parameters of the multi-task learning algorithm includes:
introducing slack variables and updating the objective function of the multi-task learning algorithm;
introducing dual variables to complete the Lagrangian transformation of the objective function of the multi-task learning algorithm; and
obtaining the optimal parameters via the Lagrangian gradient.
Preferably, the updated objective function is expressed as follows:
(formula omitted in the source)
where ξ_i denotes the slack variable for labeled data and ξ'_i denotes the slack variable for unlabeled data.
In the classification method based on semi-supervised transfer learning provided by the present invention, the labeled data of the source data set is preprocessed to obtain a feature classifier for the source data set; a multi-task learning algorithm performs iterative transfer training on the unlabeled data of the target data set and the feature classifier of the source data set to obtain a target classifier; and the target classifier is used to classify features. As can be seen, a feature selection algorithm based on multi-task learning is used. Since the complex space contains complex and massive data, this algorithm can handle the correlations between the tasks within a domain, which other algorithms cannot do. The method migrates the feature classifier obtained from the labeled source data to the unlabeled data in the target data; continuous iteration yields the classifier of the target data set, with which the required features can be picked out of the complex feature space. In this way both the unlabeled data and the labeled data are considered and learned jointly, saving resources such as manpower and materials and avoiding waste; by making full use of the prior knowledge of the labeled data and jointly learning from a large amount of unlabeled data and a small amount of labeled data, classification accuracy is improved.
Brief description of the drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the accompanying drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only embodiments of the present invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart of the classification method based on semi-supervised transfer learning provided by the present invention;
Fig. 2 is a schematic diagram of the transfer learning self-training classification process.
Detailed description of the invention
The core of the present invention is to provide a classification method based on semi-supervised transfer learning, so as to save resources and improve classification accuracy.
In order to enable those skilled in the art to better understand the solution of the present invention, the technical solution in the embodiments of the present invention is described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some rather than all of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
Referring to Fig. 1, a flowchart of the classification method based on semi-supervised transfer learning provided by the present invention, the method includes:
S11: preprocessing the labeled data of a source data set to obtain a feature classifier for the source data set;
S12: using a multi-task learning algorithm to perform iterative transfer training on the unlabeled data of a target data set and the feature classifier of the source data set, obtaining a target classifier;
S13: using the target classifier to classify features.
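The three steps above can be sketched in miniature. The 1-D threshold classifier, the toy data, and all function names below are illustrative assumptions for exposition only, not the patent's actual algorithm:

```python
def fit_threshold(labeled):
    """S11: fit a classifier by placing a threshold midway between classes."""
    pos = [x for x, y in labeled if y == 1]
    neg = [x for x, y in labeled if y == -1]
    t = (min(pos) + max(neg)) / 2.0
    return lambda x: 1 if x >= t else -1

def transfer_train(f_s, unlabeled_target, rounds=3):
    """S12: iteratively pseudo-label the unlabeled target data with the
    current classifier, then refit on the pseudo-labels (self-training)."""
    f = f_s
    for _ in range(rounds):
        pseudo = [(x, f(x)) for x in unlabeled_target]  # migrate labels
        f = fit_threshold(pseudo)                        # refit on pseudo-labels
    return f

source = [(0.0, -1), (1.0, -1), (4.0, 1), (5.0, 1)]   # labeled source data
target = [0.5, 0.8, 4.2, 4.9]                          # unlabeled target data
f_t = transfer_train(fit_threshold(source), target)    # S12: target classifier
labels = [f_t(x) for x in target]                      # S13: classify features
```

The key point the sketch illustrates is that the source classifier is never applied directly: it only seeds pseudo-labels, and the classifier actually used is the one refit on target data.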
As can be seen, a feature selection algorithm based on multi-task learning is used. Since the complex space contains complex and massive data, this algorithm can handle the correlations between the tasks within a domain, which other algorithms cannot do. The method migrates the feature classifier obtained from the labeled source data to the unlabeled data in the target data; continuous iteration yields the classifier of the target data set, with which the required features can be picked out of the complex feature space. In this way both the unlabeled data and the labeled data are considered and learned jointly, saving resources such as manpower and materials and avoiding waste; by making full use of the prior knowledge of the labeled data and jointly learning from a large amount of unlabeled data and a small amount of labeled data, classification accuracy is improved.
Based on the above method, specifically, the multi-task learning algorithm is applicable to supervised learning and is a feature selection algorithm based on multi-task learning, namely a semi-supervised transfer multi-task (Semi-supervised-Based Transfer Multi-task) algorithm. This algorithm uses transfer from the data set of a related auxiliary domain to help learn the target-domain task, and uses the idea of semi-supervised iteration to train the classification model. The multi-task transfer learning algorithm under the semi-supervised framework solves the problem of classification in a complex feature space.
The process of step S11 is specifically: preprocessing the labeled data of the source data set, i.e. iterating continuously to find the parameter that best meets the requirement, thereby obtaining the feature classifier f_s of the source data set.
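The "find the best-fitting parameter by continuous iteration" step can be illustrated as a simple parameter sweep. The threshold rule, the candidate grid, and the accuracy criterion below are assumptions for illustration, not the patent's actual fitting procedure:

```python
def accuracy(t, labeled):
    """Fraction of labeled source samples a threshold rule gets right."""
    hits = sum(1 for x, y in labeled if (1 if x >= t else -1) == y)
    return hits / len(labeled)

def fit_fs(labeled, candidates):
    """Iterate over candidate parameters, keeping the best-fitting one."""
    return max(candidates, key=lambda t: accuracy(t, labeled))

data = [(0.2, -1), (0.9, -1), (3.1, 1), (4.0, 1)]    # labeled source data
best_t = fit_fs(data, [i * 0.5 for i in range(10)])  # candidates 0.0 .. 4.5
```

In practice the iteration would optimize a real objective rather than sweep a grid, but the shape is the same: repeatedly evaluate a candidate parameter against the labeled source data and keep the best.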
Let D_s denote the auxiliary data set, i.e. the source data set, which encapsulates a small amount of labeled data. The target data set is denoted D_l and contains a large amount of unlabeled data comprising the samples {x_i, i = 1, 2, 3, ..., n}, where n denotes the number of data samples in the target data set.
Step S12 employs the multi-task learning algorithm applicable to supervised learning: the unlabeled data of the target data set and the classifier of the source data set undergo extensive iterative transfer training via the multi-task learning algorithm to train the optimal target classifier, thereby realizing the classification of features. This step requires formulating the multi-task learning algorithm, optimizing the algorithm parameters, and proposing the evaluation criterion, i.e. obtaining the target classifier.
The process of step S12 specifically includes:
S21: establishing the multi-task learning algorithm;
S22: optimizing the parameters of the multi-task learning algorithm;
S23: obtaining the target classifier.
Specifically, the objective function of the multi-task learning algorithm is expressed as follows:
(formula omitted in the source)
where f_s denotes the feature classifier of the source data set, l_s denotes the loss function, w = w_0 + w_r denotes the classifier parameter, f_l denotes the target classifier of the target data set, γ, β, c and θ all denote regularization parameters, s_s denotes the feasible set of the source data set on (0,1), and n denotes the number of data samples in the target data set. In the expression, f_s is the classifier of the source data set; what the method ultimately needs to obtain is the classifier f_l of the target data set.
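The parameter decomposition w = w_0 + w_r in the objective can be read as a shared component plus a task-specific offset, the core multi-task idea. The vectors below are made-up numbers purely to illustrate the decomposition:

```python
def task_weights(w0, wr):
    """Combine the shared component w0 with a task-specific offset wr."""
    return [a + b for a, b in zip(w0, wr)]

w0 = [0.5, -0.25, 1.0]                          # w_0: shared across tasks
w_source = task_weights(w0, [0.25, 0.0, -0.5])  # w_r for the source task
w_target = task_weights(w0, [-0.5, 0.25, 0.0])  # w_r for the target task
```

Joint learning fits w_0 from all tasks together (capturing their connections) while each w_r absorbs what is specific to one task (their differences).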
Further, the process of step S22 specifically includes:
S31: introducing slack variables and updating the objective function of the multi-task learning algorithm.
In terms of parameter optimization, the slack variable ξ mentioned above is first introduced to replace the loss function l_s, because the loss function may exhibit instability; the loss function is therefore replaced by the slack variable ξ, and the updated objective function is expressed as follows:
(formula omitted in the source)
where ξ_i denotes the slack variable for labeled data and ξ'_i denotes the slack variable for unlabeled data.
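The role of a slack variable can be illustrated with the standard soft-margin form ξ = max(0, 1 - y·f(x)); this generic hinge-style slack is an assumption standing in for the patent's exact (omitted) formula:

```python
def slack(y, score):
    """Slack is zero when a sample is classified with margin >= 1, and
    grows linearly as the sample falls inside the margin or is wrong."""
    return max(0.0, 1.0 - y * score)

xi_good = slack(1, 2.0)    # confident correct prediction -> no slack
xi_margin = slack(1, 0.5)  # correct but inside the margin -> some slack
xi_wrong = slack(-1, 1.0)  # misclassified -> large slack
```

Replacing the raw loss by slack variables with linear constraints is what makes the later Lagrangian dual tractable.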
S32: introducing dual variables to complete the Lagrangian transformation of the objective function of the multi-task learning algorithm.
Dual variables are introduced to realize the Lagrangian transformation of the formula, which makes it convenient to find the optimal solution by gradients later. The formula of the Lagrangian transformation process is as follows:
(formula omitted in the source)
where a_i denotes the dual variable for the labeled data set, a'_i denotes the dual variable for the unlabeled data set, b denotes the bias parameter of the Lagrangian transformation, ω' denotes the classifier parameter (feature weights) of the unlabeled data set, ε denotes a constant value, α denotes the dual variable, ω denotes the classifier parameter, ξ denotes the slack variable, f_l denotes the target classifier of the target data set, and L_p denotes the Lagrangian.
S33: obtaining the optimal parameters via the Lagrangian gradient.
This step solves for the optimal parameters via the Lagrangian gradient. First, ω, b and ξ are optimized in the gradient computation; in this process the relation between f_s and f_l is obtained, expressed as follows:
(formula omitted in the source)
Then the Lagrangian transformation is likewise applied to the dual factors a_i and a'_i, and the final optimal-solution problem is expressed as follows:
(formula omitted in the source)
where K denotes the kernel matrix of each data set, one term denotes the ordered vector of values of the labeled data in the source data set, B denotes the bias of the classifier, and the last term refers to choosing the most suitable target classifier among the 1 to n samples. The ordered vector of the source-domain classifier f_s serves to rank the f_s value of each data point until the optimal f_s is selected.
The expression of the finally obtained target-domain classifier is as follows:
(formula omitted in the source)
where one term is the weight of the source data set classifier, i.e. the vector of classifier weights in the source domain (the labeled data set), and β_s denotes the regularization parameter of the labeled data set.
From the final f_l it can be seen that the migration from the labeled source-domain data set classifier to the unlabeled data set in the target domain is completed; continuous iteration yields the classifier of the target domain, with which the desired features can be picked out of the complex feature space. The whole process is shown in Fig. 2, a schematic diagram of the transfer learning self-training classification process; the source domain is the source data set and the target domain is the target data set.
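A target classifier that retains a weighted contribution from the source classifier alongside the fit on pseudo-labeled target data might be sketched as a convex blend of two decision scores. The blend weight beta_s (standing in for the source-side weighting; in the patent β_s is a regularization parameter) and both scoring functions are illustrative assumptions:

```python
def blend_score(fs_score, ft_score, beta_s=0.25):
    """Decision score mixing migrated source knowledge with the target fit."""
    return beta_s * fs_score + (1.0 - beta_s) * ft_score

def predict(x, f_s, f_t, beta_s=0.25):
    """Sign of the blended score gives the final label."""
    return 1 if blend_score(f_s(x), f_t(x), beta_s) >= 0.0 else -1

f_s = lambda x: x - 2.5   # decision score learned on the source domain
f_t = lambda x: x - 3.0   # score refit on pseudo-labeled target data
hi = predict(4.0, f_s, f_t)
lo = predict(2.0, f_s, f_t)
```

The design point is that the source classifier's influence is down-weighted rather than discarded, so transfer helps where target evidence is thin without dominating where the domains disagree.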
The present invention classifies a complex space based on a semi-supervised transfer learning algorithm. There are many transfer learning algorithms, but a feature selection algorithm based on multi-task learning is used here, because the complex space contains complex and massive data; this algorithm can handle the correlations between the tasks within a domain, which other algorithms cannot do.
The present invention uses a multi-task learning algorithm to perform transfer learning on a complex feature space under a semi-supervised framework, thereby classifying the features in the target domain. Schemes related to the present invention also perform transfer learning with multi-task algorithms, but they are mostly carried out under an unsupervised framework: a classifier is obtained by iterating over a large amount of unlabeled data in the source domain and is then applied to the unlabeled data set in the target domain to learn and obtain a learner.
In the present invention, the multi-task learning algorithm performs iterative transfer training on the unlabeled data of the target data set and the feature classifier of the source data set to obtain the target classifier. It can be seen that the present invention performs transfer learning under a semi-supervised framework and carries out both semi-supervised and multi-task algorithm optimization: the classifier f_s is obtained from the labeled data set in the auxiliary data domain, i.e. the source data domain; f_s is then migrated to the unlabeled data set in the target domain; and through continuous iterative optimization the required target-domain classifier f_l is finally obtained, all within the multi-task learning algorithm. Thus, through the process of semi-supervised transfer learning, including the optimization of the internal multi-task algorithm and the semi-supervised transfer learning iteration, the present invention obtains the target classifier and completes feature classification.
Using a small amount of labeled data combined with a large amount of unlabeled data is far better than using only a small amount of labeled data or only a large amount of unlabeled data. Semi-supervised learning has further advantages: it considers both the unlabeled data and the labeled data and lets them learn jointly, which saves resources such as manpower and materials and avoids waste; jointly learning from a large amount of unlabeled data and a small amount of labeled data reduces the difficulty of obtaining a trained data classifier; and the prior knowledge of the labeled data can be fully exploited, with the predetermined classes controlling the selection of training samples, which can be checked repeatedly to improve classification accuracy.
Compared with single-task learning models, the advantage of multi-task learning is that it emphasizes the connections between tasks: through joint learning of multiple tasks it takes into account both the differences and the connections between the tasks, which is one of the most important ideas of multi-task learning. It can mine the relations between the subtasks while at the same time distinguishing the differences between them.
There are still many algorithms for feature selection in semi-supervised transfer learning, such as the Self-Training SVM algorithm under the semi-supervised paradigm, self-taught learning algorithms, and multi-view learning algorithms. These algorithms can also complete the corresponding transfer learning using a small amount of labeled data and a large amount of unlabeled data, but for a complex feature space the multi-task learning algorithm can better mine the relations between the subtasks while also distinguishing the differences between them, which the other algorithms cannot achieve.
The classification method based on semi-supervised transfer learning provided by the present invention has been described in detail above. Specific examples are used herein to explain the principle and embodiments of the present invention; the description of the above embodiments is only intended to help understand the method of the present invention and its core idea. It should be pointed out that, for those of ordinary skill in the art, several improvements and modifications can be made to the present invention without departing from its principle, and these improvements and modifications also fall within the protection scope of the claims of the present invention.
Claims (7)
1. A classification method based on semi-supervised transfer learning, characterized by including:
preprocessing the labeled data of a source data set to obtain a feature classifier for the source data set;
using a multi-task learning algorithm to perform iterative transfer training on the unlabeled data of a target data set and the feature classifier of the source data set, obtaining a target classifier; and
using the target classifier to classify features.
2. The method of claim 1, characterized in that the multi-task learning algorithm is applicable to supervised learning and is a feature selection algorithm based on multi-task learning.
3. The method of claim 1, characterized in that preprocessing the labeled data of the source data set to obtain the feature classifier of the source data set includes:
iterating continuously over the labeled data of the source data set to find the best-fitting parameter, obtaining the feature classifier f_s of the source data set.
4. The method of claim 1, characterized in that using the multi-task learning algorithm to perform iterative transfer training on the unlabeled data of the target data set and the feature classifier of the source data set, obtaining the target classifier, includes:
establishing the multi-task learning algorithm;
optimizing the parameters of the multi-task learning algorithm; and
obtaining the target classifier.
5. The method of claim 4, characterized in that the objective function of the multi-task learning algorithm is expressed as follows:
(formula omitted in the source)
where f_s denotes the feature classifier of the source data set, l_s denotes the loss function, w = w_0 + w_r denotes the classifier parameter, f_l denotes the target classifier of the target data set, γ, β, c and θ all denote regularization parameters, s_s denotes the feasible set of the source data set on (0,1), and n denotes the number of data samples in the target data set.
6. The method of claim 5, characterized in that optimizing the parameters of the multi-task learning algorithm includes:
introducing slack variables and updating the objective function of the multi-task learning algorithm;
introducing dual variables to complete the Lagrangian transformation of the objective function of the multi-task learning algorithm; and
obtaining the optimal parameters via the Lagrangian gradient.
7. The method of claim 6, characterized in that the updated objective function is expressed as follows:
(formula omitted in the source)
where ξ_i denotes the slack variable for labeled data and ξ'_i denotes the slack variable for unlabeled data.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610651405.0A CN106295697A (en) | 2016-08-10 | 2016-08-10 | Classification method based on semi-supervised transfer learning |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106295697A true CN106295697A (en) | 2017-01-04 |
Family
ID=57667793
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610651405.0A Pending CN106295697A (en) | 2016-08-10 | 2016-08-10 | Classification method based on semi-supervised transfer learning |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106295697A (en) |
-
2016
- 2016-08-10 CN CN201610651405.0A patent/CN106295697A/en active Pending
Cited By (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106934462A (en) * | 2017-02-09 | 2017-07-07 | 华南理工大学 | Defence under antagonism environment based on migration poisons the learning method of attack |
WO2018196760A1 (en) * | 2017-04-27 | 2018-11-01 | Huawei Technologies Co., Ltd. | Ensemble transfer learning |
CN107909101A (en) * | 2017-11-10 | 2018-04-13 | 清华大学 | Semi-supervised transfer learning character identifying method and system based on convolutional neural networks |
CN108009593A (en) * | 2017-12-15 | 2018-05-08 | 清华大学 | A kind of transfer learning optimal algorithm choosing method and system |
CN108009593B (en) * | 2017-12-15 | 2018-12-11 | 清华大学 | A kind of transfer learning optimal algorithm choosing method and system |
CN110135185A (en) * | 2018-02-08 | 2019-08-16 | 苹果公司 | The machine learning of privatization is carried out using production confrontation network |
CN110135185B (en) * | 2018-02-08 | 2023-12-22 | 苹果公司 | Machine learning privatization using generative antagonism network |
CN108460134A (en) * | 2018-03-06 | 2018-08-28 | 云南大学 | The text subject disaggregated model and sorting technique of transfer learning are integrated based on multi-source domain |
CN108764269A (en) * | 2018-04-03 | 2018-11-06 | 华南理工大学 | A kind of cross datasets pedestrian recognition methods again based on space-time restriction incremental learning |
CN109284313A (en) * | 2018-08-10 | 2019-01-29 | Shenzhen Qianhai WeBank Co., Ltd. | Federated modeling method, device and readable storage medium based on semi-supervised learning |
CN109284313B (en) * | 2018-08-10 | 2021-08-27 | Shenzhen Qianhai WeBank Co., Ltd. | Federated modeling method, device and readable storage medium based on semi-supervised learning |
CN109272023B (en) * | 2018-08-27 | 2021-04-27 | Institute of Computing Technology, Chinese Academy of Sciences | Internet of Things transfer learning method and system |
CN109272023A (en) * | 2018-08-27 | 2019-01-25 | Institute of Computing Technology, Chinese Academy of Sciences | Internet of Things transfer learning method and system |
US11710035B2 (en) | 2018-09-28 | 2023-07-25 | Apple Inc. | Distributed labeling for supervised learning |
CN109657693B (en) * | 2018-10-22 | 2023-08-01 | Institute of Software, Chinese Academy of Sciences | Classification method based on correlation entropy and transfer learning |
CN109657693A (en) * | 2018-10-22 | 2019-04-19 | Institute of Software, Chinese Academy of Sciences | Classification method based on correlation entropy and transfer learning |
CN109635837A (en) * | 2018-11-10 | 2019-04-16 | Tianjin University | Scene-independent fall detection system based on commercial Wi-Fi |
CN109740676A (en) * | 2019-01-07 | 2019-05-10 | University of Electronic Science and Technology of China | Object detection transfer method based on similar targets |
CN109919324A (en) * | 2019-03-07 | 2019-06-21 | Guangdong University of Technology | Transfer learning classification method, system and device based on learning from label proportions |
CN110414624A (en) * | 2019-08-06 | 2019-11-05 | Guangdong University of Technology | Classification model construction method and device based on multi-task learning |
CN110414622B (en) * | 2019-08-06 | 2022-06-24 | 广东工业大学 | Classifier training method and device based on semi-supervised learning |
CN110414622A (en) * | 2019-08-06 | 2019-11-05 | 广东工业大学 | Classifier training method and device based on semi-supervised learning |
CN110569813A (en) * | 2019-09-12 | 2019-12-13 | Tianjin Huachun Smart Energy Technology Development Co., Ltd. | Fault diagnosis method for mobile heat supply units |
CN112309375A (en) * | 2020-10-28 | 2021-02-02 | Ping An Technology (Shenzhen) Co., Ltd. | Training and testing method, device, equipment and storage medium for a speech recognition model |
CN112309375B (en) * | 2020-10-28 | 2024-02-23 | Ping An Technology (Shenzhen) Co., Ltd. | Training and testing method, device, equipment and storage medium for a speech recognition model |
CN115879535A (en) * | 2023-02-10 | 2023-03-31 | Beijing Baidu Netcom Science and Technology Co., Ltd. | Training method, device, equipment and medium for an autonomous driving perception model |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106295697A (en) | Classification method based on semi-supervised transfer learning | |
CN107909101B (en) | Semi-supervised transfer learning character recognition method and system based on convolutional neural networks | |
Xue et al. | Modeling human-like decision-making for inbound smart ships based on fuzzy decision trees | |
CN108406767A (en) | Robot autonomous learning method for human-robot collaboration | |
Irawan | Implementation Of Data Mining For Determining Majors Using K-Means Algorithm In Students Of SMA Negeri 1 Pangkalan Kerinci | |
CN106022521A (en) | Short-term load forecasting method using a distributed BP neural network under the Hadoop framework | |
CN110348579A (en) | Domain-adaptive feature transfer method and system | |
CN103745233B (en) | Hyperspectral image classification method based on spatial information transfer | |
Panigrahi et al. | Deep learning approach for image classification | |
CN106203472A (en) | Zero-shot image classification method based on a direct prediction model of mixed attributes | |
CN105678381A (en) | Gender classification network training method, gender classification method and related device | |
CN113128620A (en) | Semi-supervised domain self-adaptive picture classification method based on hierarchical relationship | |
CN108197656A (en) | Attribute reduction method based on CUDA | |
CN109299753A (en) | Ensemble learning method and system for legal text information mining | |
CN103942214B (en) | Natural image classification method and device based on multi-modal matrix completion | |
CN105389471A (en) | Method for reducing training set of machine learning | |
Klimovets | Human resources make all the difference | |
CN109255025A (en) | Short text classification method | |
Boschetti et al. | Interactive modelling for natural resource management | |
CN106203634A (en) | Parallel probabilistic planning method based on causal graph heuristics | |
CN103198052A (en) | Active learning method based on support vector machine | |
Li | Research on English teaching ability evaluation algorithm based on big data fuzzy k-means clustering | |
Dixit et al. | An implementation of data pre-processing for small dataset | |
Yang et al. | Ouroboros: On accelerating training of transformer-based language models | |
Ward et al. | Improving Contrastive Learning on Visually Homogeneous Mars Rover Images |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20170104 |