CN112232417A - Classification method and device, storage medium and terminal - Google Patents

Classification method and device, storage medium and terminal

Info

Publication number
CN112232417A
Authority
CN
China
Prior art keywords
classification
category
models
result
categories
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011111887.3A
Other languages
Chinese (zh)
Inventor
廖智辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Spreadtrum Hi Tech Communications Technology Co Ltd
Beijing Ziguang Zhanrui Communication Technology Co Ltd
Original Assignee
Beijing Ziguang Zhanrui Communication Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Ziguang Zhanrui Communication Technology Co Ltd filed Critical Beijing Ziguang Zhanrui Communication Technology Co Ltd
Priority to CN202011111887.3A priority Critical patent/CN112232417A/en
Publication of CN112232417A publication Critical patent/CN112232417A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/243Classification techniques relating to the number of classes
    • G06F18/2431Multiple classes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

A classification method and device, a storage medium and a terminal are provided, wherein the classification method comprises the following steps: obtaining a plurality of classification results of a plurality of classification models aiming at the same data to be classified, wherein each classification result comprises a classification category, and the classification models are constructed by adopting different classification algorithms; and at least selecting the classification category with the largest quantity in the plurality of classification results as a final classification result. The technical scheme of the invention can improve the accuracy of data classification.

Description

Classification method and device, storage medium and terminal
Technical Field
The present invention relates to the field of computer technologies, and in particular, to a classification method and apparatus, a storage medium, and a terminal.
Background
In the prior art, data can be classified using machine-learning algorithms. Commonly used classification algorithms include: logistic regression, Support Vector Machines (SVM), random forests, multilayer perceptrons (MLP), Gaussian Bayes, and Bernoulli Bayes, among others.
However, in the prior art, a single algorithm model is usually adopted for data classification, and the generalization accuracy of a single algorithm model is not high, only about 70%, which cannot meet the requirement for higher accuracy in some scenarios.
Disclosure of Invention
The invention solves the technical problem of how to improve the accuracy of data classification.
In order to solve the above technical problem, an embodiment of the present invention provides a classification method, where the classification method includes: obtaining a plurality of classification results of a plurality of classification models aiming at the same data to be classified, wherein each classification result comprises a classification category, and the classification models are constructed by adopting different classification algorithms; and at least selecting the classification category with the largest quantity in the plurality of classification results as a final classification result.
Optionally, the at least selecting the classification category with the largest number from the multiple classification results includes: selecting the classification category with the largest quantity in the plurality of classification results; determining the most numerous classification categories as the final classification result if only a single of the most numerous classification categories exists.
Optionally, the at least selecting the classification category with the largest number from the multiple classification results includes: selecting the classification category with the largest quantity in the plurality of classification results; if the classification models have a plurality of classification categories with the largest number, comparing the classification success rates of the classification categories, wherein the classification models respectively have corresponding classification success rates for each classification category; and selecting the classification category with the highest classification success rate as the final classification result.
Optionally, the comparing the classification success rates of the plurality of classification categories includes: determining at least one classification model generating each of the plurality of classification categories, and determining a maximum value of the classification success rate of the at least one classification model for each of the plurality of classification categories; and comparing the maximum values of the classification success rates corresponding to each of the plurality of classification categories.
Optionally, the classification success rate is calculated by the following method: classifying the test samples by using each pre-trained classification model to obtain a test classification result, wherein the test samples comprise actual classification categories; and comparing the test classification result with the actual classification category to obtain the classification success rate of each classification model for each classification category.
Optionally, the plurality of classification models are trained in advance by using training samples, the number of samples in the training samples for each classification category is balanced, and the feature values of the sample data in the training samples correspond to the features of the classification category to which the sample data belongs.
Optionally, the following formula is adopted to represent the overall classification error rate of the plurality of classification models:

P(H(n) ≤ T/2) = Σ_{n=0}^{⌊T/2⌋} C(T, n) · p^n · q^(T−n) ≤ exp(−T(p − q)²/2)

wherein P(H(n) ≤ T/2) represents the overall classification error rate of the plurality of classification models, T represents the number of the plurality of classification models, h_i(x) represents whether a single classification model classifies correctly, H(x) = sign(Σ_{i=1}^{T} h_i(x)), sign() represents the sign function, p represents the correct rate of a single classification model, and q represents the error rate of a single classification model.
In order to solve the above technical problem, an embodiment of the present invention further discloses a classification apparatus, where the classification apparatus includes: the classification result acquisition module is used for acquiring a plurality of classification results of a plurality of classification models aiming at the same data to be classified, each classification result comprises a classification category, and the classification models are constructed by adopting different classification algorithms; and the selection module is used for at least selecting the classification category with the largest quantity in the plurality of classification results to serve as a final classification result.
The embodiment of the invention also discloses a storage medium, wherein a computer program is stored on the storage medium, and the computer program executes the steps of the classification method when being executed by a processor.
The embodiment of the invention also discloses a terminal which comprises a memory and a processor, wherein the memory is stored with a computer program which can be run on the processor, and the processor executes the steps of the classification method when running the computer program.
Compared with the prior art, the technical scheme of the embodiment of the invention has the following beneficial effects:
the technical scheme of the invention includes that a plurality of classification models are obtained aiming at a plurality of classification results of the same data to be classified, each classification result comprises a classification category, and the classification models are constructed by adopting different classification algorithms; and at least selecting the classification category with the largest quantity in the plurality of classification results as a final classification result. According to the technical scheme, the classification results of a plurality of classification models are screened, namely the classification categories with the largest quantity are selected; the final classification result determined at the moment is the same classification category obtained by classifying a plurality of classification models, so that the accuracy and reliability of the final classification result are ensured, and the generalization success rate of the classification method is further improved.
Further, selecting the classification category with the largest quantity in the plurality of classification results; if the classification models have a plurality of classification categories with the largest number, comparing the classification success rates of the classification categories, wherein the classification models respectively have corresponding classification success rates for each classification category; and selecting the classification category with the highest classification success rate as the final classification result. In the technical scheme of the invention, under the condition that a plurality of classification categories with the largest quantity exist, the classification category with the highest success rate is selected according to the classification success rate of each classification category of the classification model, so that the generalization accuracy rate of classification is further ensured.
Drawings
FIG. 1 is a flow chart of a classification method according to an embodiment of the present invention;
FIG. 2 is a flowchart of an embodiment of step S102 shown in FIG. 1;
fig. 3 is a schematic structural diagram of a sorting apparatus according to an embodiment of the present invention.
Detailed Description
As described in the background art, in the prior art, a single algorithm model is usually used for data classification, and the generalization accuracy of the single algorithm model is not high, and is only about 70%, which cannot meet the requirement of higher accuracy in some scenes.
According to the technical scheme, the classification results of a plurality of classification models are screened, namely the classification categories with the largest quantity are selected; the final classification result determined at the moment is the same classification category obtained by classifying a plurality of classification models, so that the accuracy and reliability of the final classification result are ensured, and the generalization success rate of the classification method is further improved.
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in detail below.
Fig. 1 is a flowchart of a classification method according to an embodiment of the present invention.
The method shown in fig. 1 may be used on the server side, i.e. the steps of the method may be performed by a server. The server may be any suitable terminal, such as, but not limited to, a computer, a smart phone, etc.
Specifically, the classification method may include the steps of:
step S101: obtaining a plurality of classification results of a plurality of classification models aiming at the same data to be classified, wherein each classification result comprises a classification category, and the classification models are constructed by adopting different classification algorithms;
step S102: and at least selecting the classification category with the largest quantity in the plurality of classification results as a final classification result.
It should be noted that the sequence numbers of the steps in this embodiment do not represent a limitation on the execution sequence of the steps.
In a specific implementation, the plurality of classification models may be previously constructed and trained by using different classification algorithms. And inputting the data to be classified into each classification model as input data, outputting corresponding classification results by each classification model, wherein the classification results comprise classification categories. For example, for the same to-be-classified data, the classification result generated by the classification model 1 is class 1, the classification result generated by the classification model 2 is class 2, the classification result generated by the classification model 3 is class 1, and the like.
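The construction and querying of the plurality of models can be sketched in Python with scikit-learn. This is a hypothetical illustration: the patent names no library, and the dataset, hyperparameters, and variable names below are assumptions; only the choice of six algorithms follows the text.

```python
# Illustrative sketch only: one model per algorithm named in the background
# section, all trained on the same data, then queried with the same sample.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.naive_bayes import GaussianNB, BernoulliNB

# Synthetic stand-in for the training samples (the patent's data are logs).
X_train, y_train = make_classification(n_samples=300, n_features=8,
                                       n_informative=5, n_classes=3,
                                       random_state=0)

models = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "svm": SVC(),
    "random_forest": RandomForestClassifier(random_state=0),
    "mlp": MLPClassifier(max_iter=2000, random_state=0),
    "gaussian_bayes": GaussianNB(),
    "bernoulli_bayes": BernoulliNB(),
}
for m in models.values():
    m.fit(X_train, y_train)

# Each model outputs one classification result for the same data to be
# classified; the resulting mapping is the "plurality of classification
# results" that step S102 votes over.
sample = X_train[:1]
results = {name: int(m.predict(sample)[0]) for name, m in models.items()}
print(results)
```

The dictionary of per-model predictions is then the input to the selection step described next.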
In one embodiment, the plurality of classification models may be preconfigured in the server. And when data classification is needed, directly calling the classification result of the classification model.
It should be noted that the data to be classified may be any data on which a classification operation can be performed, for example, log data generated by a mobile phone; in that case, the classification model classifies the log data according to the problems contained therein.
As described in the background, the accuracy of classification using a single classification model is not high. However, if a plurality of classification models are used for classification, the problem may occur that the classification results generated by different classification models are different, for example, the classification result generated by the classification model 1 is class 1, and the classification result generated by the classification model 2 is class 2. In this case, the final classification result may be determined by selecting the most numerous classification category of the plurality of classification results. That is, the classification category included in the final classification result is the largest number of classification categories.
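Selecting the most numerous classification category can be sketched with a simple counting function. This is a minimal illustration; the function name and the result encoding are assumptions, not from the patent.

```python
from collections import Counter

def majority_category(results):
    """Return the most numerous classification categories in `results`
    (a mapping of model name -> predicted category). A list is returned
    so the caller can tell whether a single winner exists or whether a
    tie still has to be broken."""
    counts = Counter(results.values())
    top = max(counts.values())
    return [cat for cat, n in counts.items() if n == top]

# Example from the description: models 1 and 3 output class 1, model 2 class 2.
results = {"model 1": "class 1", "model 2": "class 2", "model 3": "class 1"}
print(majority_category(results))  # ['class 1']
```

When the returned list has exactly one element, that category is the final classification result; the multi-element case is handled by the tie-breaking embodiment described later.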
Different classification algorithms (that is, the classification models constructed from them) adapt differently to different sample data distributions. At the same time, using a plurality of models for prediction gives better compatibility with the feature value data of the samples and stronger generalization capability, that is, a higher success rate.
According to the technical scheme, the classification results of a plurality of classification models are screened, namely the classification categories with the largest quantity are selected; the final classification result determined at the moment is the same classification category obtained by classifying a plurality of classification models, so that the accuracy and reliability of the final classification result are ensured, and the generalization success rate of the classification method is further improved.
In one non-limiting embodiment, the overall classification error rate of the plurality of classification models may be represented using the following formula:

P(H(n) ≤ T/2) = Σ_{n=0}^{⌊T/2⌋} C(T, n) · p^n · q^(T−n) ≤ exp(−T(p − q)²/2)

wherein P(H(n) ≤ T/2) represents the overall classification error rate of the plurality of classification models, T represents the number of the plurality of classification models, h_i(x) represents whether a single classification model classifies correctly, sign() represents the sign function, p represents the correct rate of a single classification model, and q represents the error rate of a single classification model.

In a specific embodiment, h_i(x) takes its value from {-1, 1}: a value of -1 indicates that the current classification model classifies incorrectly, and a value of 1 indicates that it classifies correctly. The overall decision is H(x) = sign(Σ_{i=1}^{T} h_i(x)); if more than half of the classification models predict correctly, H(x) is greater than 0, indicating that the overall classification is correct.

As can be seen from the Hoeffding inequality, the larger the number T of classification models, the smaller the value of exp(−T(p − q)²/2), and therefore the smaller the overall classification error rate of the plurality of classification models. For example, when the error rate q of single-model classification is 0.2, that is, when the classification success rate of each single classification model is above 80%: if T = 1, then P(H(n) ≤ T/2) ≤ 0.98; if T = 6, then P(H(n) ≤ T/2) ≤ 0.34.
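The binomial tail and its Hoeffding bound can be checked numerically with a short sketch (the function names are illustrative). With q = 0.2, the bound for T = 6 evaluates to exp(−1.08) ≈ 0.34, consistent with the example in the text, and both the exact tail and the bound shrink as T grows.

```python
import math

def exact_error_rate(T, q):
    """Exact overall error rate: the probability that at most T/2 of T
    independent models classify correctly, each with error rate q
    (the binomial sum in the formula above)."""
    p = 1.0 - q
    return sum(math.comb(T, n) * p**n * q**(T - n) for n in range(T // 2 + 1))

def hoeffding_bound(T, q):
    """Hoeffding upper bound exp(-T * (p - q)**2 / 2) on that tail."""
    p = 1.0 - q
    return math.exp(-T * (p - q) ** 2 / 2)

for T in (1, 3, 6):
    print(T, round(exact_error_rate(T, 0.2), 4), round(hoeffding_bound(T, 0.2), 4))
```

The exact tail is considerably smaller than the bound, so the bound is conservative: the real benefit of adding models is larger than the inequality alone suggests.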
In other words, the larger the number of classification models, the higher the success rate of the final classification result. Therefore, the classification method of the invention adopts a plurality of classification models to classify the data to be classified, so as to improve the classification accuracy.
In one non-limiting embodiment, step S102 shown in fig. 1 may include the following steps: selecting the classification category with the largest quantity in the plurality of classification results; determining the most numerous classification categories as the final classification result if only a single of the most numerous classification categories exists.
In this embodiment, there is one and only one classification category with the largest number.
Referring to table 1, six classification models are shown in table 1, which are classification models constructed by using logistic regression, Support Vector Machine (SVM), random forest, Multilayer Perceptron (MLP), Gaussian Bayes, and Bernoulli Bayes, respectively. The classification results of the models for the same data to be classified are classification one, classification two, classification one, classification three, classification one, and classification two, respectively. The most numerous classification category is classification one (appearing 3 times); classification two appears twice and classification three appears once. Therefore, the final classification result is classification one.
Classification model | Classification result
Logistic regression  | Classification one
SVM                  | Classification two
Random forest        | Classification one
MLP                  | Classification three
Gaussian Bayes       | Classification one
Bernoulli Bayes      | Classification two

TABLE 1
In a non-limiting embodiment, referring to fig. 2, step S102 shown in fig. 1 may include the following steps:
step S201: selecting the classification category with the largest quantity in the plurality of classification results;
step S202: if the classification models have a plurality of classification categories with the largest number, comparing the classification success rates of the classification categories, wherein the classification models respectively have corresponding classification success rates for each classification category;
step S203: and selecting the classification category with the highest classification success rate as the final classification result.
Unlike the foregoing embodiments, in the present embodiment, there are a plurality of categories of the largest number of classification categories. Under the condition, the classification category with the highest success rate can be selected according to the classification success rate of each classification category of the classification model, and the generalization accuracy rate of classification is further ensured.
In a specific implementation, the classification models respectively have corresponding classification success rates for each classification category. The classification success rate represents the accuracy of the classification model for classifying the classification category. The higher the classification success rate of the classification model for a certain classification category is, the higher the accuracy rate of the classification model for classifying the classification category is, and the higher the reliability is.
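Steps S201 to S203 can be sketched as a single voting function with a success-rate tie-break. This is an illustrative sketch: the function name and the data structures are assumptions, and only the example figures are taken from table 2 below.

```python
from collections import Counter

def classify_with_tiebreak(results, success_rate):
    """Majority vote with a success-rate tie-break (steps S201-S203).

    results: model name -> predicted category.
    success_rate: (model name, category) -> that model's classification
    success rate for the category. Both structures are illustrative.
    """
    counts = Counter(results.values())
    top = max(counts.values())
    tied = [cat for cat, n in counts.items() if n == top]
    if len(tied) == 1:
        return tied[0]  # only a single most numerous category exists
    # For each tied category, take the maximum success rate among the
    # models that produced it, then select the category whose maximum
    # success rate is highest.
    def best_rate(cat):
        return max(success_rate[(model, cat)]
                   for model, c in results.items() if c == cat)
    return max(tied, key=best_rate)

# Example from table 2: classification one and classification three each
# appear twice; classification three has the higher maximum success rate.
results = {
    "logistic_regression": "one", "svm": "two", "random_forest": "three",
    "mlp": "three", "gaussian_bayes": "one", "bernoulli_bayes": "four",
}
success_rate = {
    ("logistic_regression", "one"): 0.9863, ("gaussian_bayes", "one"): 0.7215,
    ("random_forest", "three"): 1.0, ("mlp", "three"): 1.0,
}
print(classify_with_tiebreak(results, success_rate))  # three
```

Only the success rates of the tied categories are consulted, matching the embodiment in which the comparison is performed after the tie is detected.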
Referring to table 2, six classification models are shown in table 2, which are respectively classification models constructed by using logistic regression, SVM, random forest, MLP, Gaussian Bayes and Bernoulli Bayes. The classification results of the models for the same data to be classified are classification one, classification two, classification three, classification three, classification one and classification four, respectively. The most numerous classification categories are classification one and classification three (each appearing twice). The classification success rates for classification one are 98.63% and 72.15%, the classification success rates for classification three are 100.0% and 100.0%, and classification three has the highest classification success rate. Therefore, the final classification result is classification three.
Classification model | Classification result | Classification success rate
Logistic regression  | Classification one    | 98.63%
SVM                  | Classification two    | 100.0%
Random forest        | Classification three  | 100.0%
MLP                  | Classification three  | 100.0%
Gaussian Bayes       | Classification one    | 72.15%
Bernoulli Bayes      | Classification four   | 81.28%

TABLE 2
In one non-limiting embodiment, step S202 shown in fig. 2 may include the following steps: determining at least one classification model generating each of the plurality of classes and determining a maximum value of a classification success rate of the at least one classification model for each of the plurality of classes; and comparing the maximum value of the classification success rate corresponding to each of the plurality of classes.
In this embodiment, when comparing the success rates of classification, the maximum value of the success rates of classification corresponding to classification categories is compared. And the classification category with the largest classification success rate is selected, so that the higher accuracy of the classification category in the final result can be ensured.
With continued reference to the embodiment shown in table 2, the success rates of the first classification are 98.63% and 72.15%, the success rates of the third classification are 100.0% and 100.0%, the maximum value of the success rates of the first classification is 98.63%, and the maximum value of the success rates of the third classification is 100%, so that the final classification result is the third classification.
In one non-limiting embodiment, the classification success rate is calculated as follows: classifying the test samples by using each pre-trained classification model to obtain a test classification result, wherein the test samples comprise actual classification categories; and comparing the test classification result with the actual classification category to obtain the classification success rate of each classification model for each classification category.
In this embodiment, the classification success rate may be obtained by pre-calculation, and specifically may be obtained by calculation using a test sample. The test sample includes test data and its corresponding actual classification category. And respectively inputting the test samples as input to each trained classification model, and outputting a corresponding classification result by each classification model according to each test data. The classification success rate of each classification model for each class can be obtained by comparing the classification result of each classification model for each test data with the actual classification class corresponding to the test data.
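The described calculation can be sketched as follows. This is one plausible reading, in which a model's success rate for a category is the fraction of test samples actually belonging to that category that the model classifies correctly (a per-category recall); the patent does not fix the exact definition, and the names below are illustrative.

```python
def per_category_success_rates(predictions, actual):
    """Compare each model's test classification results with the actual
    categories to get a success rate per (model, category) pair.

    predictions: model name -> list of predicted categories, one per
                 test sample.
    actual: list of actual classification categories of the test samples.
    """
    rates = {}
    for model, preds in predictions.items():
        for cat in set(actual):
            # Indices of test samples whose actual category is `cat`.
            idx = [i for i, a in enumerate(actual) if a == cat]
            correct = sum(1 for i in idx if preds[i] == cat)
            rates[(model, cat)] = correct / len(idx)
    return rates

actual = ["one", "one", "two", "two"]
predictions = {"svm": ["one", "two", "two", "two"]}
rates = per_category_success_rates(predictions, actual)
print(rates[("svm", "one")], rates[("svm", "two")])  # 0.5 1.0
```

The resulting table of (model, category) success rates is exactly what the tie-breaking step consults at classification time.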
In one non-limiting embodiment, the plurality of classification models are pre-trained by using training samples, the number of samples in the training samples for each classification category is balanced, and the feature values of the sample data in the training samples correspond to the features of the classification category to which the sample data belongs.
Specifically, the feature value data of each sample should reflect the features of the sample's classification category as far as possible, so that the trained model fits the existing sample data well and situations such as failure to converge or excessively long training times are avoided. At the same time, the number of samples per classification category should be as balanced as possible: keeping the per-category sample counts comparable avoids the situation where, because the sample data is distributed unevenly overall, the success rate of a certain classifier can never exceed 50%. More specifically, the numbers of training samples of the different categories are kept within a 10-fold gap of each other.
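The 10-fold balance criterion can be sketched as a simple check on the training labels (the function name and the threshold parameter are illustrative):

```python
from collections import Counter

def is_balanced(labels, max_ratio=10):
    """Check the balance heuristic: the largest per-category sample count
    in the training set stays within a factor of `max_ratio` (the text
    says a 10-fold gap) of the smallest."""
    counts = Counter(labels)
    return max(counts.values()) <= max_ratio * min(counts.values())

print(is_balanced(["a"] * 50 + ["b"] * 10))   # True  (ratio 5)
print(is_balanced(["a"] * 500 + ["b"] * 10))  # False (ratio 50)
```

A training set failing this check would be rebalanced (e.g. by collecting more samples of the rare categories) before the models are trained.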
In a specific application scenario, the server performs stability module problem classification on the log data of the smart phone side, wherein the classification categories are processor performance, reading performance, storage performance and system performance.
The inventor of the present application verified the accuracy of the classification method of the prior art and of the classification method of the present invention using test data. With a single classification algorithm, the accuracies calculated for different test data are 50%, 66%, 50% and 100%, respectively. With the classification algorithm of the embodiment of the invention, the accuracies calculated for different test data are 100%, 86% and 90%, respectively.
Referring to fig. 3, an embodiment of the present invention further discloses a classification device, where the classification device 30 may include:
a classification result obtaining module 301, configured to obtain multiple classification results of multiple classification models for the same data to be classified, where each classification result includes a classification category, and the multiple classification models are constructed by using different classification algorithms;
a selecting module 302, configured to select at least the most numerous classification categories from the multiple classification results as final classification results.
In one non-limiting embodiment, the selection module 302 may include: a first selecting unit (not shown) for selecting a most numerous classification category from the plurality of classification results; a first determining unit (not shown) for determining the most numerous classification categories as the final classification result if only a single type of the most numerous classification categories exists.
In one non-limiting embodiment, the selection module 302 may include: a second selecting unit (not shown), configured to select the most numerous classification categories from the plurality of classification results; a comparing unit (not shown), configured to, if there are a plurality of classification categories with the largest number, compare the classification success rates of the plurality of classification categories, wherein the plurality of classification models respectively have corresponding classification success rates for each classification category; and a second determining unit (not shown), configured to select the classification category with the highest classification success rate as the final classification result.
The embodiment of the invention screens the classification results of a plurality of classification models, namely selects the classification category with the largest quantity; the final classification result determined at the moment is the same classification category obtained by classifying a plurality of classification models, so that the accuracy and reliability of the final classification result are ensured, and the generalization success rate of the classification method is further improved.
For more details of the operation principle and the operation mode of the classification device 30, reference may be made to the related descriptions in fig. 1 to 2, which are not described herein again.
The embodiment of the invention also discloses a storage medium, which is a computer-readable storage medium and stores a computer program thereon, and the computer program can execute the steps of the classification method shown in fig. 1 or fig. 2 when running. The storage medium may include ROM, RAM, magnetic or optical disks, etc. The storage medium may further include a non-volatile memory (non-volatile) or a non-transitory memory (non-transient), and the like.
The embodiment of the invention also discloses a terminal which can comprise a memory and a processor, wherein the memory is stored with a computer program which can run on the processor. The processor, when running the computer program, may perform the steps of the classification method shown in fig. 1 or fig. 2. The terminal includes, but is not limited to, a mobile phone, a computer, a tablet computer and other terminal devices.
Specifically, in the embodiment of the present invention, the processor may be a Central Processing Unit (CPU), and the processor may also be other general-purpose processors, Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, and the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
It will also be appreciated that the memory in the embodiments of the subject application can be either volatile memory or nonvolatile memory, or can include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. Volatile memory can be Random Access Memory (RAM), which acts as external cache memory. By way of example and not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), and Direct Rambus RAM (DR RAM).
The "plurality" appearing in the embodiments of the present application means two or more.
The descriptions of the first, second, etc. appearing in the embodiments of the present application are only for illustrating and differentiating the objects, and do not represent the order or the particular limitation of the number of the devices in the embodiments of the present application, and do not constitute any limitation to the embodiments of the present application.
Although the present invention is disclosed above, the present invention is not limited thereto. Various changes and modifications may be effected therein by one skilled in the art without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (10)

1. A classification method, comprising:
obtaining a plurality of classification results from a plurality of classification models for the same data to be classified, wherein each classification result comprises a classification category, and the plurality of classification models are constructed using different classification algorithms;
and selecting at least the classification category that occurs most frequently among the plurality of classification results as a final classification result.
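The voting step of claim 1 can be sketched in Python. This is a hypothetical illustration, not the patent's implementation; the category names are invented:

```python
from collections import Counter

def majority_vote(results):
    """Return the classification category (or categories) that occur
    most often among the results of several classification models."""
    counts = Counter(results)
    top = max(counts.values())
    # More than one category can be tied for the largest count;
    # claims 2-4 describe how such ties are resolved.
    return [category for category, n in counts.items() if n == top]

# Three of four hypothetical models agree:
print(majority_vote(["cat", "dog", "cat", "cat"]))  # → ['cat']
```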
2. The classification method according to claim 1, wherein selecting at least the classification category that occurs most frequently among the plurality of classification results comprises:
selecting the classification category that occurs most frequently among the plurality of classification results;
and if only a single most frequent classification category exists, determining that classification category as the final classification result.
3. The classification method according to claim 1, wherein selecting at least the classification category that occurs most frequently among the plurality of classification results comprises:
selecting the classification category that occurs most frequently among the plurality of classification results;
if a plurality of classification categories are tied for the largest count, comparing the classification success rates of those categories, wherein each classification model has a corresponding classification success rate for each classification category;
and selecting the classification category with the highest classification success rate as the final classification result.
4. The classification method according to claim 3, wherein comparing the classification success rates of the plurality of categories comprises:
determining, for each of the plurality of categories, at least one classification model that produced that category, and determining the maximum classification success rate of the at least one classification model for that category;
and comparing the maximum classification success rates corresponding to the plurality of categories.
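The tie-breaking of claims 3 and 4 can be sketched as follows. This is a hedged illustration: the (model, category) success-rate table, the model identifiers, and the category names are all hypothetical:

```python
def break_tie(tied_categories, results, success_rate):
    """Resolve a voting tie: for each tied category, take the maximum
    classification success rate among the models that produced it,
    then pick the category whose maximum is highest.

    results      -- list of (model_id, category) pairs
    success_rate -- dict mapping (model_id, category) to a rate in [0, 1]
    """
    best = {}
    for category in tied_categories:
        producers = [m for m, c in results if c == category]
        best[category] = max(success_rate[(m, category)] for m in producers)
    return max(best, key=best.get)

# Hypothetical two-way tie between "cat" and "dog":
results = [("m1", "cat"), ("m2", "dog"), ("m3", "cat"), ("m4", "dog")]
rates = {("m1", "cat"): 0.70, ("m3", "cat"): 0.85,
         ("m2", "dog"): 0.80, ("m4", "dog"): 0.75}
print(break_tie(["cat", "dog"], results, rates))  # → cat
```

"cat" wins because its best producer (0.85) beats "dog"'s best producer (0.80), even though "cat"'s other producer is weaker than both of "dog"'s.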
5. The classification method according to claim 3 or 4, wherein the classification success rate is calculated as follows:
classifying test samples with each pre-trained classification model to obtain test classification results, wherein each test sample is labeled with its actual classification category;
and comparing the test classification results with the actual classification categories to obtain the classification success rate of each classification model for each classification category.
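The success-rate calculation of claim 5 can be sketched as follows. This is a minimal illustration under invented inputs: the dummy model and test samples are not from the patent:

```python
from collections import defaultdict

def per_category_success_rate(model, test_samples):
    """Compare a model's predictions with each test sample's actual
    category and return the success rate per classification category."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for features, actual_category in test_samples:
        total[actual_category] += 1
        if model(features) == actual_category:
            correct[actual_category] += 1
    return {c: correct[c] / total[c] for c in total}

# Dummy model: predicts "pos" whenever the feature value is positive.
model = lambda x: "pos" if x > 0 else "neg"
samples = [(1, "pos"), (2, "pos"), (-1, "neg"), (3, "neg")]
print(per_category_success_rate(model, samples))  # → {'pos': 1.0, 'neg': 0.5}
```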
6. The classification method according to claim 1, wherein the classification models are pre-trained using training samples, the number of samples for each classification category in the training samples is balanced, and the feature values of the sample data in the training samples correspond to the features of the classification category to which the sample data belongs.
7. The classification method according to claim 1, wherein the overall classification error rate of the plurality of classification models is expressed by the following formulas:

$$H(x) = \operatorname{sign}\!\left(\sum_{i=1}^{T} h_i(x)\right)$$

$$P\!\left(H(n \le T/2)\right) = \sum_{n=0}^{\lfloor T/2 \rfloor} \binom{T}{n}\, p^{n} q^{T-n}$$

wherein P(H(n ≤ T/2)) represents the overall classification error rate of the plurality of classification models, T represents the number of classification models, n represents the number of classification models that classify correctly, h_i(x) represents whether a single classification model classifies correctly, sign() represents the sign function, p represents the correct rate of a single classification model, and q represents the error rate of a single classification model.
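Under the (strong) assumption that the T models err independently, the formula of claim 7 can be evaluated numerically; this sketch simply sums the binomial terms:

```python
from math import comb

def ensemble_error(T, p):
    """Probability that at most T/2 of T independent classification
    models are correct, i.e. that the majority vote fails."""
    q = 1 - p
    return sum(comb(T, n) * p**n * q**(T - n) for n in range(T // 2 + 1))

# Five models, each 80% accurate individually: the voted ensemble's
# error rate drops well below the single-model error rate of 0.2.
print(round(ensemble_error(5, 0.8), 4))  # → 0.0579
```

This is why combining models built with different algorithms, as in claim 1, can outperform any single model, provided their errors are not strongly correlated.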
8. A classification apparatus, comprising:
a classification result acquisition module, configured to obtain a plurality of classification results from a plurality of classification models for the same data to be classified, wherein each classification result comprises a classification category, and the plurality of classification models are constructed using different classification algorithms;
and a selection module, configured to select at least the classification category that occurs most frequently among the plurality of classification results as a final classification result.
9. A storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, performs the steps of the classification method according to any one of claims 1 to 7.
10. A terminal comprising a memory and a processor, the memory having stored thereon a computer program operable on the processor, wherein the processor, when executing the computer program, performs the steps of the classification method according to any one of claims 1 to 7.
CN202011111887.3A 2020-10-16 2020-10-16 Classification method and device, storage medium and terminal Pending CN112232417A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011111887.3A CN112232417A (en) 2020-10-16 2020-10-16 Classification method and device, storage medium and terminal


Publications (1)

Publication Number Publication Date
CN112232417A true CN112232417A (en) 2021-01-15

Family

ID=74118831

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011111887.3A Pending CN112232417A (en) 2020-10-16 2020-10-16 Classification method and device, storage medium and terminal

Country Status (1)

Country Link
CN (1) CN112232417A (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109086825A * 2018-08-03 2018-12-25 北京邮电大学 A multi-classification-model fusion method based on adaptive model selection
CN109086790A * 2018-06-19 2018-12-25 歌尔股份有限公司 An iterative method, apparatus and electronic device for a classification model
CN109344869A * 2018-08-28 2019-02-15 东软集团股份有限公司 A classification model optimization method, apparatus, storage device and program product
CN109657159A * 2018-12-18 2019-04-19 哈尔滨工业大学 Method for determining the transfer learning boundary of heterogeneous relational data in public-opinion data role identification
US20200258008A1 (en) * 2019-02-12 2020-08-13 NEC Laboratories Europe GmbH Method and system for adaptive online meta learning from data streams


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Jiang Jie, Xiong Changzhen: "A fine-grained classification algorithm based on data augmentation and multi-model ensemble", Journal of Graphics (《图学学报》) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116719942A (en) * 2023-07-07 2023-09-08 北京亿赛通科技发展有限责任公司 Data asset classification method, apparatus, computer device and computer storage medium
CN116719942B (en) * 2023-07-07 2024-03-12 北京亿赛通科技发展有限责任公司 Data asset classification method, apparatus, computer device and computer storage medium

Similar Documents

Publication Publication Date Title
US11741361B2 (en) Machine learning-based network model building method and apparatus
WO2022042123A1 (en) Image recognition model generation method and apparatus, computer device and storage medium
US11587356B2 (en) Method and device for age estimation
KR20210110823A (en) Image recognition method, training method of recognition model, and related devices and devices
CN112257808B (en) Integrated collaborative training method and device for zero sample classification and terminal equipment
CN110377733B (en) Text-based emotion recognition method, terminal equipment and medium
CN109840413B (en) Phishing website detection method and device
US20190303943A1 (en) User classification using a deep forest network
US11403550B2 (en) Classifier
CN111639687A (en) Model training and abnormal account identification method and device
CN111027576A (en) Cooperative significance detection method based on cooperative significance generation type countermeasure network
CN111242319A (en) Model prediction result interpretation method and device
CN112232397A (en) Knowledge distillation method and device of image classification model and computer equipment
US9053434B2 (en) Determining an obverse weight
CN112232417A (en) Classification method and device, storage medium and terminal
CN113902944A (en) Model training and scene recognition method, device, equipment and medium
WO2024016949A1 (en) Label generation method and apparatus, image classification model method and apparatus, and image classification method and apparatus
CN112200666A (en) Feature vector processing method and related device
CN113641708B (en) Rule engine optimization method, data matching method and device, storage medium and terminal
CN113221662B (en) Training method and device of face recognition model, storage medium and terminal
CN113157987A (en) Data preprocessing method for machine learning algorithm and related equipment
CN112347893B (en) Model training method and device for video behavior recognition and computer equipment
CN109034207B (en) Data classification method and device and computer equipment
CN113486918B (en) Image recognition method and device based on dynamic adjustment feature vector distribution trend
KR102326972B1 (en) System for identifying articles and method to determine the authenticity of propositions reflecting the reliability of media

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210115