CN107480690A - A multi-classification method, based on support vector machines, that includes an unknown category - Google Patents

A multi-classification method, based on support vector machines, that includes an unknown category

Info

Publication number
CN107480690A
CN107480690A
Authority
CN
China
Prior art keywords
classification
sample
support vector machine
multi-classification
belongs
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710536516.1A
Other languages
Chinese (zh)
Inventor
邢云冰
陈益强
忽丽莎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Institute of Computing Technology of CAS
Original Assignee
Institute of Computing Technology of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Institute of Computing Technology of CAS filed Critical Institute of Computing Technology of CAS
Priority to CN201710536516.1A priority Critical patent/CN107480690A/en
Publication of CN107480690A publication Critical patent/CN107480690A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2411Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/213Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The present invention provides a multi-classification method, based on support vector machines (SVMs), that includes an unknown category, comprising: 1) for each trained i-th category classifier, identifying whether a sample to be identified belongs to the i-th category; the i-th category classifier is a binary classifier obtained by training an SVM model whose positive sample set is built from the known samples of the i-th category and whose negative sample set is built from the known samples of the remaining N−1 categories, and which identifies whether an input sample belongs to the i-th category; 2) when the sample to be identified has been input into every binary classifier and every result in step 1) is negative, deeming the current sample to belong to an unknown category; when exactly one binary classifier in step 1) outputs a positive result, deeming the current sample to belong to the category corresponding to that binary classifier. The present invention can detect the unknown category and achieves both a high recall ratio and a high precision ratio.

Description

A multi-classification method, based on support vector machines, that includes an unknown category
Technical field
The present invention relates to the field of machine learning, and in particular to a multi-classification method, based on support vector machines, that includes an unknown category.
Background technology
The support vector machine (SVM) is a typical machine learning algorithm that, owing to its complete theoretical foundation, is widely applied to all kinds of classification problems. SVM assumes that the sample data xᵢ are generally not linearly separable in the original space and therefore maps them from the original space to a new feature space (usually of higher dimension); the point corresponding to sample xᵢ in the new feature space is φ(xᵢ). The goal of a two-class SVM can be described as finding, in the feature space, a linear hyperplane f(x) = wᵀφ(x) + b using the sample data, where w is the normal vector of the hyperplane and b is the offset; if f(xᵢ) > 0 then yᵢ = +1, meaning the sample belongs to the positive category, and if f(xᵢ) < 0 then yᵢ = −1, meaning the sample belongs to the negative category. The optimal hyperplane maximizes the minimum distance of the sample data from the hyperplane; in mathematical form, minimize (1/2)‖w‖² + C Σᵢ ξᵢ subject to the constraints yᵢ(wᵀφ(xᵢ) + b) ≥ 1 − ξᵢ and ξᵢ ≥ 0, where C is an empirical coefficient and ξᵢ are slack variables that relax the constraints for the few samples that cannot satisfy the hard constraints (at the cost of penalizing the objective function).
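The two-class decision rule described above can be sketched in a few lines. This is a minimal illustration, not the patent's method: φ is taken as the identity map, and the trained weight vector w and offset b are assumed to be given.

```python
import numpy as np

# Hedged sketch of the two-class SVM decision rule: f(x) = w . phi(x) + b,
# predict +1 (positive category) if f(x) > 0, else -1 (negative category).
# Here phi is the identity map and w, b are assumed already trained.

def svm_decision(w, b, x):
    """Return +1 or -1 according to the sign of f(x) = w.x + b."""
    f = np.dot(w, x) + b
    return 1 if f > 0 else -1

w = np.array([1.0, -1.0])  # normal vector of the separating hyperplane
b = 0.5                    # offset
print(svm_decision(w, b, np.array([2.0, 0.0])))   # positive side -> 1
print(svm_decision(w, b, np.array([-2.0, 0.0])))  # negative side -> -1
```
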
For multi-class SVM, any two categories of sample data are taken and trained in the above manner, yielding one classification model (linear hyperplane) per pair; the category of a new sample is finally decided by voting over all classification models. For example, an N-class SVM can be decomposed into N(N−1)/2 two-class SVMs; if all N(N−1)/2 two-class SVMs choose the same kernel function (feature-space mapping function) and related parameters, this is equivalent to partitioning the whole feature space with N(N−1)/2 linear hyperplanes, as shown in Fig. 1.
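The one-vs-one voting just described can be sketched as follows. The pairwise decision function here is a hypothetical stand-in for N(N−1)/2 trained binary SVMs, used only to show the vote-counting step.

```python
from itertools import combinations

# Sketch of one-vs-one decomposition: an N-class problem reduces to
# N*(N-1)/2 binary classifiers, and the final label is chosen by voting.

def ovo_vote(n_classes, pairwise_decision):
    """pairwise_decision(a, b) returns the winning class of the pair (a, b)."""
    votes = [0] * n_classes
    for a, b in combinations(range(n_classes), 2):
        votes[pairwise_decision(a, b)] += 1
    return max(range(n_classes), key=lambda c: votes[c])

# Toy rule in which the larger class index always wins its pairwise contests:
print(ovo_vote(3, lambda a, b: max(a, b)))  # class 2 wins both of its pairs
```
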
At present, almost all machine learning algorithms output a known category as the judgment result. Take handwritten recognition of the digits 0-9: whatever the input, the algorithm will pick one of the ten digits as the recognition result, whereas for invalid, random input the better result would be to output an unknown category representing an invalid output. In the field of human-computer interaction this defect causes no obvious problem, because a human can easily recognize an invalid recognition result and ignore the output. But in some fully automated fields, abnormal conditions exist, and the system must automatically judge whether the input data is normal or valid.
For SVM, this problem can currently be addressed in the following three ways.
The first way is to collect sample data of the unknown category during the training stage. For digit recognition, for example, one collects all kinds of arbitrary input, such as waves (~), check marks (√) and crosses (×), which do not belong to the digit categories 0-9, thereby obtaining sample data of an 11th category, and then trains the whole sample data set with a traditional SVM to obtain a classification model. The shortcoming of this approach is obvious: the number of sub-classes inside the unknown category is inherently infinite, so the collected sample data cannot cover its distribution.
The second way is to use a probabilistic form: training yields a classification model with probabilities, and in the prediction stage the output is the probability that each new sample belongs to each category rather than a single category result. This can partly solve the problem; in particular, when the top few probabilities output in the prediction stage are of comparable size, the new sample can essentially be assumed to belong to an unknown category, because a sample that belongs to a known category should have a probability for that category far larger than the probabilities of the other categories. But if some known category is relatively "close" to some sub-class of the unknown category, this criterion will clearly misjudge. For example, in recognition of the digits 1-9, when "0" is actually input, the classification model will still consider the probability of category "6" to be far larger than the probabilities of the other categories. This criterion is therefore a sufficient rather than a necessary condition.
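The probability-based rejection described above can be sketched as a margin test between the top two probabilities. The margin threshold of 0.5 is an illustrative choice, not a value from the patent.

```python
# Sketch of probability-based rejection: accept the top class only when its
# probability dominates the runner-up by a margin; otherwise answer "unknown".
# The margin value 0.5 is a hypothetical illustration.

def reject_by_probability(probs, margin=0.5):
    ranked = sorted(range(len(probs)), key=lambda c: probs[c], reverse=True)
    top, second = ranked[0], ranked[1]
    return top if probs[top] - probs[second] >= margin else "unknown"

print(reject_by_probability([0.9, 0.05, 0.05]))  # confident -> class 0
print(reject_by_probability([0.4, 0.35, 0.25]))  # ambiguous -> "unknown"
```

As the text notes, this test is sufficient but not necessary: a sample of the unknown category that happens to lie close to a known category still passes the margin test and is misjudged.
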
The third way is to combine SVM with a variant algorithm, support vector data description (SVDD). In the training stage, the sample data of all known categories are first taken together as the "normal" category; an SVDD model is trained on this normal category, and then a traditional SVM is trained on the whole sample data set to obtain an SVM model. In the prediction stage, the SVDD model first judges whether a new sample is normal; if it is, the SVM model judges which specific category it belongs to. The major defect of this approach lies in the SVDD stage: because only one category of sample data is available, the trained SVDD model is generally rather coarse, and, lacking the restriction and correction of sample data from other categories, it is also prone to over-fitting. The precision of judging whether a new sample is normal, i.e. whether it belongs to an unknown category, is therefore inherently poor.
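The two-stage scheme just described can be sketched with scikit-learn, using `OneClassSVM` as a stand-in for SVDD (the two are closely related but not identical). The data, the `gamma`/`nu` values and the class layout are all illustrative assumptions.

```python
import numpy as np
from sklearn.svm import OneClassSVM, SVC

# Sketch of the third approach: stage 1 decides "normal vs. abnormal" with a
# one-class model trained only on known-category data; stage 2 assigns a
# known category with a conventional multi-class SVM. Parameters are toy values.

rng = np.random.RandomState(0)
X0 = rng.randn(50, 2) + [4, 0]   # known class 0 cluster
X1 = rng.randn(50, 2) - [4, 0]   # known class 1 cluster
X = np.vstack([X0, X1])
y = np.array([0] * 50 + [1] * 50)

novelty = OneClassSVM(gamma=0.2, nu=0.05).fit(X)  # stage 1
clf = SVC(kernel="rbf").fit(X, y)                 # stage 2

def predict(x):
    if novelty.predict([x])[0] == -1:   # -1 means "abnormal"
        return "unknown"
    return int(clf.predict([x])[0])

print(predict([4.0, 0.0]))   # deep inside class 0 -> 0
print(predict([0.0, 50.0]))  # far from all known data -> "unknown"
```
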
Content of the invention
The task of the present invention is therefore to provide a multi-classification solution, based on support vector machines, that is better suited to application scenarios containing an unknown category.
According to one aspect of the invention, there is provided a multi-classification method, based on support vector machines, that includes an unknown category, comprising the following steps:
1) for a sample to be identified, identify, based on each trained i-th category classifier respectively, whether the sample to be identified belongs to the i-th category, where i = 1, 2, ..., N; the i-th category classifier is a binary classifier obtained by training an SVM model whose positive sample set is built from the known samples of the i-th category and whose negative sample set is built from the known samples of the remaining N−1 categories; this binary classifier identifies whether an input sample belongs to the i-th category;
2) when the sample to be identified has been input into every binary classifier and every result in step 1) is negative, deem the current sample to belong to an unknown category; when exactly one binary classifier in step 1) outputs a positive result, deem the current sample to belong to the category corresponding to that binary classifier.
Wherein, step 2) further includes: when more than one binary classifier in step 1) outputs a positive result, either directly deem the current sample to belong to an unknown category, or, within the range of categories whose binary classifiers output a positive result, further discriminate which category the current sample belongs to using another classification method.
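Steps 1)-2) above, including the tie-handling variant, amount to a simple decision rule over N binary outputs. A minimal sketch, with toy interval classifiers standing in for the trained SVM models:

```python
# Sketch of the claimed decision rule: run each of the N one-vs-rest binary
# classifiers; no hit -> "unknown", exactly one hit -> that category,
# several hits -> "unknown" or a tie-breaking classifier (both variants
# appear in step 2 above). The interval classifiers below are toy stand-ins.

def classify(sample, binary_classifiers, tie_breaker=None):
    hits = [i for i, clf in enumerate(binary_classifiers) if clf(sample)]
    if len(hits) == 0:
        return "unknown"
    if len(hits) == 1:
        return hits[0]
    return tie_breaker(sample, hits) if tie_breaker else "unknown"

# Toy classifiers: class i accepts values in the interval [10*i, 10*i + 5).
clfs = [lambda x, i=i: 10 * i <= x < 10 * i + 5 for i in range(3)]
print(classify(12, clfs))  # only classifier 1 fires -> 1
print(classify(7, clfs))   # no classifier fires -> "unknown"
```
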
Wherein, in step 1), the i-th category classifier is trained as follows:
11) take the i-th category as the positive category and the remaining N−1 categories as the negative category;
12) train, based on an SVM model, the binary classifier that serves as the i-th category classifier.
Wherein, in step 12), during training the known samples xᵢ are first mapped from the original space to a new feature space, giving the corresponding feature-space-transformed samples φ(xᵢ), so that the category boundaries of the SVM model become more regular.
Wherein, in step 12),
the objective function of the SVM model is configured as: find, in the feature space, a hypersphere f(x) = ‖φ(x) − o‖ using the sample data, where o is the centre of the hypersphere and R is its radius; if f(x) < R, output y = +1, meaning the sample belongs to the positive category; if f(x) > R, output y = −1, meaning the sample belongs to the negative category;
find the optimal hypersphere such that the surface area of the hypersphere is minimized and the minimum distance of the known sample data from the hypersphere is maximized; then obtain the optimal parameters of the SVM model from the optimal hypersphere.
Wherein, in step 12), maximizing the minimum distance of the known sample data from the hypersphere means: the minimum distances of the positive and negative category samples from the original hypersphere form two concentric hyperspheres, and the shell thickness of the hyperspherical shell formed by the concentric hyperspheres is maximized.
Wherein, the object function is:
Constraints isWherein C1And C2It is empirical coefficient, d is suitable In meeting the sample data of constraints from hyperspherical minimum range, ξiFor slack variable;
The optimized parameter of described SVM models is:
Wherein, in step 12), the optimal parameters of the SVM model are solved as follows:
121) obtain the Lagrangian function of the objective function by the method of Lagrange multipliers;
122) set the partial derivatives of the Lagrangian with respect to part of the variables to zero, obtaining the dual problem of the objective function; the kernel function in this dual problem is preferably a Gaussian kernel;
123) solve the dual problem with a quadratic-programming algorithm, training on the known sample data to obtain the parameters o and R. It should be appreciated that, in step 122), although the Gaussian kernel is optimal in most situations, it is not the only choice; in certain embodiments a polynomial kernel (using a polynomial kernel as the kernel function) is also effective for some scenarios.
Wherein, in step 12), the feature space transformation enables the categories of the SVM model to be linearly separable or separable by aggregation.
Compared with the prior art, the present invention has the following technical effects:
(1) Sample data of the same category cluster together rather than diverging, which matches the actual distribution of the data; the present invention is therefore closer to real data, and the precision of judging the unknown category is higher.
(2) The classification model can detect the unknown category; it not only has a recall ratio comparable to that of a traditional SVM but also has a higher precision ratio.
(3) The classification model is simple and similar to a traditional SVM: the final classification model depends only on the small number of samples lying on the interfaces of the hyperspherical shell or on its wrong side (i.e. the support vectors).
(4) The prediction-stage algorithm is simple and its implementation complexity is low, making it particularly suitable for occasions where the model is relatively stable, being trained once and used for many predictions.
Brief description of the drawings
Hereinafter, embodiments of the invention are described in detail with reference to the accompanying drawings, in which:
Fig. 1 is a schematic diagram of a traditional three-class SVM scheme of the prior art;
Fig. 2 is a schematic diagram of a two-class scheme in one embodiment of the invention;
Fig. 3 is a schematic diagram of a three-class scheme in one embodiment of the invention.
Embodiment
The present invention is described below with reference to the accompanying drawings and specific embodiments.
According to one embodiment of the present invention, a three-classification method including an unknown category is provided. This three-classification method can be decomposed into three two-classification methods, each comprising a training stage and a prediction stage.
(1) Training stage:
Step 1) choose one of the categories as the positive category and the remaining two categories as the negative category.
Step 2) map the sample data xᵢ from the original space to a new feature space; the point corresponding to sample xᵢ in the new feature space is φ(xᵢ), shown as ▲, ■ and ● in Fig. 2 and Fig. 3. Feature-space mapping is a data pre-processing step commonly adopted in SVM classification algorithms; its purpose is to make the mapped sample data easier to separate in the new (higher-dimensional) feature space, for example linearly separable or separable by aggregation. That is, in this step, feature-space mapping can make the boundaries between different sample categories more regular. In this embodiment, if the two categories of samples can be completely separated by a linear function, the samples are called linearly separable; if they can be completely separated by a function corresponding to a hypersphere, the samples are called separable by aggregation. In a specific implementation, the feature-space mapping may be a mapping from the x-y feature space to the r-θ feature space, with the transformation formulas:

r = √(x² + y²)
θ = arctan(y, x)

The feature-space mapping may also be a mapping from the x-y feature space to a u-v feature space, with the transformation formulas:

u = x⁻¹
v = y⁻¹

The above two feature-space mappings are merely exemplary; in other embodiments other feature-space mappings may be used, as long as they make the category boundaries of the known samples more regular.
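The polar mapping described above, assuming the standard form r = √(x² + y²), θ = arctan(y, x), can be written out directly; this sketch is illustrative and not part of the patent:

```python
import math

# Sketch of the illustrative x-y -> r-theta feature-space mapping.
# Two concentric rings of points are not linearly separable in x-y,
# but after this mapping they differ only in r and a single threshold
# on r separates them.

def to_polar(x, y):
    r = math.hypot(x, y)      # r = sqrt(x^2 + y^2)
    theta = math.atan2(y, x)  # theta = arctan(y, x)
    return r, theta

r, theta = to_polar(3.0, 4.0)
print(r)  # 5.0
```
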
Step 3) the objective function of the two-classification method can be described as: find, in the feature space, a hypersphere f(x) = ‖φ(x) − o‖ using the sample data, where o is the centre of the hypersphere and R is its radius; if f(x) < R then yᵢ = +1, meaning the sample belongs to the positive category; if f(x) > R then yᵢ = −1, meaning the sample belongs to the negative category. The optimal hypersphere should minimize the surface area of the hypersphere and maximize the minimum distance of the sample data from it; that is, the surface area of the hypersphere is as small as possible, the minimum distances of the positive and negative category samples from the original hypersphere form two concentric hyperspheres, and the shell thickness of the hyperspherical shell formed by the concentric hyperspheres is as large as possible, as shown in Fig. 2. In mathematical form this is expressed as

min over o, R, d, ξ:  R² − C₁d + C₂ Σᵢ ξᵢ    (1)

subject to the constraints yᵢ(R² − ‖φ(xᵢ) − o‖²) ≥ d − ξᵢ and ξᵢ ≥ 0, where C₁ and C₂ are empirical coefficients, d corresponds to the minimum distance from the hypersphere of the sample data that satisfy the constraints, and ξᵢ are slack variables that relax the constraints for the few samples that cannot satisfy the hard constraints (at the cost of penalizing the objective function).
Step 4) obtain the Lagrangian function of formula (1) by the method of Lagrange multipliers:

L(o, R, d, ξ, α, μ) = R² − C₁d + C₂ Σᵢ ξᵢ − Σᵢ αᵢ[yᵢ(R² − ‖φ(xᵢ) − o‖²) − d + ξᵢ] − Σᵢ μᵢξᵢ    (2)

where αᵢ ≥ 0 and μᵢ ≥ 0 are the Lagrange multipliers.
Step 5) setting the partial derivatives of L with respect to o, R, d and ξᵢ to zero gives Σᵢ αᵢyᵢ = 1, Σᵢ αᵢ = C₁, αᵢ + μᵢ = C₂ and o = Σᵢ αᵢyᵢφ(xᵢ); substituting these back yields the dual problem of formula (1):

max over α:  Σᵢ αᵢyᵢκ(xᵢ, xᵢ) − Σᵢ Σⱼ αᵢαⱼyᵢyⱼκ(xᵢ, xⱼ)    (3)

where κ(xᵢ, xⱼ) = φ(xᵢ)ᵀφ(xⱼ) denotes the kernel function, subject to the constraints Σᵢ αᵢyᵢ = 1, Σᵢ αᵢ = C₁ and 0 ≤ αᵢ ≤ C₂. Preferably, the kernel function may be chosen as the Gaussian kernel κ(xᵢ, xⱼ) = exp(−‖xᵢ − xⱼ‖² / (2σ²)).
Step 6) formula (3) is a typical quadratic programming problem and can be solved with a general quadratic-programming algorithm. Preferably, the sequential minimal optimization (SMO) algorithm may be used to solve formula (3); training on the sample data yields the parameters o and R, i.e. a classification model.
(2) Prediction stage
For each new sample, predict according to the classification model f(x) = ‖φ(x) − o‖: if f(x) < R the sample belongs to the positive category, and if f(x) > R the sample belongs to the negative category.
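The prediction rule above can be sketched in a few lines. This is an illustration only: φ is taken as the identity, and the trained centre o and radius R are assumed to be given.

```python
import numpy as np

# Sketch of the hypersphere prediction rule: f(x) = ||phi(x) - o||,
# positive category (+1) if f(x) < R, negative category (-1) otherwise.
# Here phi is the identity and o, R are assumed already trained.

def sphere_decision(o, R, x):
    f = np.linalg.norm(np.asarray(x) - np.asarray(o))
    return 1 if f < R else -1

o, R = [0.0, 0.0], 2.0
print(sphere_decision(o, R, [1.0, 0.0]))  # inside the sphere -> 1
print(sphere_decision(o, R, [3.0, 0.0]))  # outside the sphere -> -1
```
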
For the three-classification method of this embodiment, each category in turn is chosen as the positive category with all remaining categories as the negative category, and training is performed in the above two-classification manner, yielding one classification model per category. A new sample is predicted with each classification model separately, and it can finally be judged whether the sample belongs to a known category and, if so, to which one.
If the three two-classification problems choose the same kernel function and related parameters, this is equivalent to partitioning the whole feature space into four parts with three hyperspheres. If a new sample lies outside all three hyperspheres, it is considered to belong to an unknown category; if it lies inside more than one hypersphere (the hyperspheres intersect), its category can be judged by other means, such as a traditional SVM, or it can directly be considered to belong to an unknown category, as shown in Fig. 3.
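The three-sphere partition just described can be sketched as follows. The sphere centres and radii are toy values, and nearest-centre assignment stands in for the unspecified "other means" of resolving overlaps:

```python
import numpy as np

# Sketch of the three-sphere scheme: each class i has a trained hypersphere
# (centre o_i, radius R_i). Outside all spheres -> "unknown"; inside exactly
# one -> that class; inside several -> resolved here by nearest centre
# (a hypothetical tie-breaker standing in for e.g. a traditional SVM).

def sphere_predict(x, spheres):
    x = np.asarray(x, dtype=float)
    dist = lambda i: np.linalg.norm(x - np.asarray(spheres[i][0], dtype=float))
    inside = [i for i, (o, R) in enumerate(spheres) if dist(i) < R]
    if not inside:
        return "unknown"
    if len(inside) == 1:
        return inside[0]
    return min(inside, key=dist)

spheres = [([0, 0], 2.0), ([10, 0], 2.0), ([0, 10], 2.0)]  # toy spheres
print(sphere_predict([0.5, 0.0], spheres))  # inside sphere 0 -> 0
print(sphere_predict([5.0, 5.0], spheres))  # outside all spheres -> "unknown"
```
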
The effect of the present invention can be illustrated by the following experiment.
This experiment compares the classification effect of a traditional SVM with that of the present invention. The parameter configuration of the traditional SVM is: linear kernel function (no feature-space mapping), empirical parameter C = 1. The parameter configuration of the present invention is: linear kernel function (no feature-space mapping), empirical parameters C₁ = 10, C₂ = 1.
The data used in this experiment are the infrared data of a daily-behaviour data set; see "Jiang X, Chen Y, Liu J, et al. AIR: recognizing activity through IR-based distance sensing on feet [C] // Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct. ACM, 2016: 97-100". The data set contains 3372 sample data in 6 classes: walking (walk), going downstairs (downstair), going upstairs (upstair), running (run), standing (stay) and walking in place (walk_in_place); each sample is 38-dimensional.
This experiment uses the first 5 classes of the above data set as training data and all 6 classes as test data. The experimental environment is WIN7 with Matlab2016b, and the optimization algorithm (SMO) uses the CVX toolbox. Table 1 shows the classification results of the traditional SVM.
Table 1
Table 2 shows the classification results of one embodiment of the present invention.
Table 2
The experimental results show that the present invention gives the classification model the ability to detect the unknown category: it not only has a recall ratio comparable to that of a traditional support vector machine but also a higher precision ratio, so that the overall precision of classification is also improved.
Finally, it should be noted that the above embodiments are merely illustrative of the technical solution of the present invention and are not restrictive. Although the present invention has been described in detail with reference to the embodiments, those skilled in the art will understand that modifications or equivalent substitutions of the technical solution of the present invention that do not depart from its spirit and scope shall all be covered by the scope of the claims of the present invention.

Claims (9)

1. A multi-classification method, based on support vector machines, that includes an unknown category, comprising the following steps:
1) for a sample to be identified, identifying, based on each trained i-th category classifier respectively, whether the sample to be identified belongs to the i-th category, where i = 1, 2, ..., N; the i-th category classifier is a binary classifier obtained by training an SVM model whose positive sample set is built from the known samples of the i-th category and whose negative sample set is built from the known samples of the remaining N−1 categories, the binary classifier identifying whether an input sample belongs to the i-th category;
2) when the sample to be identified has been input into every binary classifier and every result in step 1) is negative, deeming the current sample to be identified to belong to an unknown category; when exactly one binary classifier in step 1) outputs a positive result, deeming the current sample to be identified to belong to the category corresponding to that binary classifier.
2. The multi-classification method including an unknown category based on support vector machines according to claim 1, characterised in that step 2) further includes: when more than one binary classifier in step 1) outputs a positive result, directly deeming the current sample to be identified to belong to an unknown category; or, within the range of categories whose binary classifiers output a positive result, further discriminating which category the current sample to be identified belongs to using another classification method.
3. The multi-classification method including an unknown category based on support vector machines according to claim 1, characterised in that, in step 1), the i-th category classifier is trained as follows:
11) taking the i-th category as the positive category and the remaining N−1 categories as the negative category;
12) training, based on an SVM model, the binary classifier serving as the i-th category classifier.
4. The multi-classification method including an unknown category based on support vector machines according to claim 3, characterised in that, in step 12), during training the known samples xᵢ are first mapped from the original space to a new feature space, giving the corresponding feature-space-transformed samples φ(xᵢ), so that the category boundaries of the known samples become more regular.
5. The multi-classification method including an unknown category based on support vector machines according to claim 4, characterised in that, in step 12),
the objective function of the SVM model is configured as: find, in the feature space, a hypersphere f(x) = ‖φ(x) − o‖ using the sample data, where o is the centre of the hypersphere and R is its radius; if f(x) < R, output y = +1, meaning the sample belongs to the positive category; if f(x) > R, output y = −1, meaning the sample belongs to the negative category;
the optimal hypersphere is found such that the surface area of the hypersphere is minimized and the minimum distance of the known sample data from the hypersphere is maximized; the optimal parameters of the SVM model are then obtained from the optimal hypersphere.
6. The multi-classification method including an unknown category based on support vector machines according to claim 5, characterised in that, in step 12), maximizing the minimum distance of the known sample data from the hypersphere means: the minimum distances of the positive and negative category samples from the original hypersphere form two concentric hyperspheres, and the shell thickness of the hyperspherical shell formed by the concentric hyperspheres is maximized.
7. The multi-classification method including an unknown category based on support vector machines according to claim 5, characterised in that the objective function is:

min over o, R, d, ξ:  R² − C₁d + C₂ Σᵢ ξᵢ

subject to the constraints yᵢ(R² − ‖φ(xᵢ) − o‖²) ≥ d − ξᵢ and ξᵢ ≥ 0, where C₁ and C₂ are empirical coefficients, d corresponds to the minimum distance from the hypersphere of the sample data that satisfy the constraints, and ξᵢ are slack variables;

the optimal parameters of the SVM model are the centre o = Σᵢ αᵢyᵢφ(xᵢ) and the radius R of the optimal hypersphere.
8. The multi-classification method including an unknown category based on support vector machines according to claim 7, characterised in that, in step 12), the optimal parameters of the SVM model are solved as follows:
121) obtaining the Lagrangian function of the objective function by the method of Lagrange multipliers;
122) setting the partial derivatives of the Lagrangian with respect to part of the variables to zero, obtaining the dual problem of the objective function, the kernel function in the dual problem being chosen as a Gaussian kernel or a polynomial kernel;
123) solving the dual problem with a quadratic-programming algorithm and training on the known sample data to obtain the parameters o and R.
9. The multi-classification method including an unknown category based on support vector machines according to claim 7, characterised in that, in step 12), the feature space transformation enables the categories of the SVM model to be linearly separable or separable by aggregation.
CN201710536516.1A 2017-07-04 2017-07-04 A multi-classification method, based on support vector machines, that includes an unknown category Pending CN107480690A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710536516.1A CN (en) 2017-07-04 2017-07-04 A multi-classification method, based on support vector machines, that includes an unknown category

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710536516.1A CN (en) 2017-07-04 2017-07-04 A multi-classification method, based on support vector machines, that includes an unknown category

Publications (1)

Publication Number Publication Date
CN107480690A true CN107480690A (en) 2017-12-15

Family

ID=60595341

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710536516.1A Pending CN107480690A (en) 2017-07-04 2017-07-04 A kind of more sorting techniques for including unknown classification based on SVMs

Country Status (1)

Country Link
CN (1) CN107480690A (en)

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113989603A * 2018-01-26 2022-01-28 Viavi Solutions Inc. Reduced false positive identification for spectral classification
CN108551709A * 2018-04-26 2018-09-18 Nanchang Hangkong University A streetlight dimming method controlled by a multi-kernel directed acyclic graph support vector machine in complex environments
CN108922583A * 2018-08-10 2018-11-30 Affiliated Hospital of Qingdao University A liquid handling and transfer device for medical testing, and a method thereof
CN109492096A * 2018-10-23 2019-03-19 East China University of Science and Technology An imbalanced data classification system based on geometric ensembling
CN109857849A * 2019-01-18 2019-06-07 Sanjiaoshou (Beijing) Technology Co., Ltd. Answering method, answering device, information processing device, and storage medium
CN109961032B * 2019-03-18 2022-03-29 Beijing ByteDance Network Technology Co., Ltd. Method and apparatus for generating a classification model
CN109961032A * 2019-03-18 2019-07-02 Beijing ByteDance Network Technology Co., Ltd. Method and apparatus for generating a classification model
CN111753583A * 2019-03-28 2020-10-09 Alibaba Group Holding Limited Identification method and device
CN110503160A * 2019-08-28 2019-11-26 Beijing Dajia Internet Information Technology Co., Ltd. Image recognition method and device, electronic equipment, and storage medium
US12008799B2 (en) 2020-05-14 2024-06-11 Kabushiki Kaisha Toshiba Classification system and learning system
JP2021179944A * 2020-05-14 2021-11-18 Kabushiki Kaisha Toshiba Classification system, program, and learning system
JP7354063 2020-05-14 2023-10-02 Kabushiki Kaisha Toshiba Classification system, program, and learning system
CN111724353A * 2020-05-30 2020-09-29 Tongji University Surface-mount LED defect labeling method based on One-Class SVM
CN111932035A * 2020-09-22 2020-11-13 Nanjing Fuyou Online E-commerce Co., Ltd. Multi-model-based data processing method and device, and classification modeling method
CN111932035B * 2020-09-22 2021-01-08 Nanjing Fuyou Online E-commerce Co., Ltd. Multi-model-based data processing method and device, and classification modeling method
CN114241526A * 2022-02-28 2022-03-25 Nanjing Zhenshi Intelligent Technology Co., Ltd. Classification model, training method, classification method, electronic device, and storage medium
CN114881110A * 2022-04-02 2022-08-09 Xi'an Jiaotong University Real-time detection method for total-pressure variation patterns in an on-orbit spacecraft cabin

Similar Documents

Publication Publication Date Title
CN107480690A (en) A multi-class classification method based on support vector machines that includes an unknown class
CN111144490A (en) Fine-grained recognition method based on an alternating knowledge distillation strategy
US11837329B2 (en) Method for classifying multi-granularity breast cancer genes based on double self-adaptive neighborhood radius
CN110717554A (en) Image recognition method, electronic device, and storage medium
CN111506729B (en) Information processing method, device and computer readable storage medium
Song et al. Classifier calibration: a survey on how to assess and improve predicted class probabilities
CN102034107A (en) Unhealthy image discrimination method based on robust visual attention features and sparse representation
WO2021114818A1 (en) Method, system, and device for oct image quality evaluation based on fourier transform
CN111241992A (en) Face recognition model construction method, recognition method, device, equipment and storage medium
CN111160538A (en) Method and system for updating margin parameter value in loss function
Wang et al. Mushroom toxicity recognition based on multigrained cascade forest
Owsiński et al. Reverse clustering
CN113724195B (en) Quantitative analysis model and establishment method of protein based on immunofluorescence image
Elwahsh et al. A new approach for cancer prediction based on deep neural learning
CN114511759A (en) Method and system for identifying categories and determining characteristics of skin state images
Vivona et al. Automated approach for indirect immunofluorescence images classification based on unsupervised clustering method
Mat Jizat et al. Evaluation of the transfer learning models in wafer defects classification
Kim et al. Multiple instance neural networks based on sparse attention for cancer detection using T-cell receptor sequences
CN110879821A (en) Method, device, equipment and storage medium for generating rating card model derivative label
CN115587884A (en) User loan default prediction method based on improved extreme learning machine
CN115472179A (en) Automatic detection method and system for digital audio deletion and insertion tampering operation
Han et al. Automatic used mobile phone color determination: Enhancing the used mobile phone recycling in China
Zhu et al. A Three‐step Method for Three‐way Clustering by Similarity‐based Sample’s Stability
Liu The alexnet-resnet-inception network for classifying fruit images
Cai et al. Application and research progress of machine learning in Bioinformatics
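The record above lists only bibliographic data for CN107480690A; its claims and algorithm are not reproduced here. As a hedged illustration of the general idea named in the title (SVM-based multi-class classification that can also flag samples as belonging to an unknown class), the sketch below thresholds the decision scores of a scikit-learn `SVC` and labels low-confidence samples `-1` ("unknown"). The synthetic data, the threshold value, and the `predict_with_unknown` helper are illustrative assumptions, not taken from the patent.

```python
import numpy as np
from sklearn.svm import SVC

# Two well-separated known classes as synthetic training data
# (illustrative only; not the patent's data or method).
rng = np.random.RandomState(0)
X0 = rng.normal(loc=[0.0, 0.0], scale=0.3, size=(50, 2))  # known class 0
X1 = rng.normal(loc=[4.0, 4.0], scale=0.3, size=(50, 2))  # known class 1
X = np.vstack([X0, X1])
y = np.array([0] * 50 + [1] * 50)

clf = SVC(kernel="rbf", gamma="scale")
clf.fit(X, y)

def predict_with_unknown(model, samples, threshold=0.5):
    """Predict class labels, returning -1 ("unknown") for samples whose
    decision score lies too close to the boundary (|score| < threshold).
    The threshold is an assumed tuning parameter."""
    scores = model.decision_function(samples)  # shape (n,) for 2 classes
    labels = model.predict(samples)
    return np.where(np.abs(scores) >= threshold, labels, -1)

# A point near each known cluster, plus one halfway between them,
# which should be rejected as unknown.
print(predict_with_unknown(clf, np.array([[0.0, 0.0], [4.0, 4.0], [2.0, 2.0]])))
```

With more than two known classes, the same idea applies to the per-class one-vs-rest scores (reject when the maximum score is below the threshold); a One-Class SVM per known class is another common variant.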

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20171215