CN108229566A - Hierarchical classification method and device - Google Patents

Hierarchical classification method and device

Info

Publication number
CN108229566A
Authority
CN
China
Prior art keywords
class
score
coarse
fine class
coarse class
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810012074.5A
Other languages
Chinese (zh)
Other versions
CN108229566B (en)
Inventor
钟华堡
张帆
夏远祥
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xiamen Hualian Electronics Co Ltd
Original Assignee
Xiamen Hualian Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xiamen Hualian Electronics Co Ltd filed Critical Xiamen Hualian Electronics Co Ltd
Priority to CN201810012074.5A priority Critical patent/CN108229566B/en
Publication of CN108229566A publication Critical patent/CN108229566A/en
Application granted granted Critical
Publication of CN108229566B publication Critical patent/CN108229566B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/243 Classification techniques relating to the number of classes
    • G06F18/24323 Tree-organised classifiers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks

Abstract

The invention discloses a hierarchical classification method and device. The method includes: a neural network receives an input image and performs recognition processing on the image to output normalized scores; the maximum score among the normalized scores is found and its corresponding class is determined; whether the maximum score exceeds a threshold is judged; when the maximum score exceeds the threshold, the maximum score and its corresponding class are output as the recognition result; when the maximum score does not exceed the threshold, the score distribution is converted into a coarse-class score distribution, and the steps of finding the maximum score among the normalized scores, determining its corresponding class, and judging whether the maximum score exceeds the threshold are executed cyclically, until the maximum score exceeds the threshold, whereupon the loop stops and the maximum score and its corresponding class are output as the recognition result. By the above means, the present invention classifies hierarchically from fine to coarse, solving the problem of missed and erroneous recognition in existing image recognition tasks when the scores of all fine classes are low.

Description

Hierarchical classification method and device
Technical field
The present invention relates to the technical field of neural network learning, and in particular to a hierarchical classification method and device.
Background art
Neural networks are widely used in the field of image recognition. In general, a trained neural network can only classify a specific small set of things, such as apples and pears; these classes are required to be mutually independent, and the scores of all classes need to be normalized (all values lie between 0 and 1 and sum to 1). We may call these classes fine classes; one level above them are coarse classes. For example, apples and pears both belong to pomaceous fruits, one coarse level further up there is the fruit class, and so on. If a normalized coarse-class output score existed, then, because a coarse class covers a larger scope of classes, the coarse-class score of the target should in theory be higher than the scores of the fine classes it contains.
The image features extracted by a neural network are surface features of objects, such as shape, texture and color; they cannot fully capture the essential distinctions between things, so things with similar appearance are easily confused. The concrete symptom is that the scores of similar articles are close to one another while all being relatively low, making them hard to distinguish.
A general workflow of a multi-class image recognition system is as follows: the image to be classified is input into a neural network to obtain a normalized array, where each position in the array corresponds to a class and each value represents the score, or predicted probability, that the image belongs to that class; the maximum score and its class are then looked up in the array, and if this maximum score exceeds some threshold its value and class are output, otherwise nothing is output. In some open image recognition application scenarios, the number of types to be distinguished is very large, for example recognizing the articles inside a refrigerator or demonstrating a recognition function. Such scenarios frequently encounter object classes that are new, unknown, blurry, or similar in features, which causes the maximum score output by the neural network to remain small; the above thresholding method may then lead to missed or erroneous recognition, the accuracy drops, and user experience suffers.
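For concreteness, this baseline flow might look like the following sketch; the function name `model_predict`, the class list, and the threshold value are illustrative stand-ins, not part of the patent:

```python
import numpy as np

CLASS_NAMES = ["apple", "pear", "walnut", "chestnut"]  # illustrative class list
THRESHOLD = 0.7                                        # illustrative threshold

def model_predict(image):
    """Stand-in for a trained network; returns fixed normalized scores for the sketch."""
    return np.array([0.40, 0.30, 0.20, 0.10])

def baseline_recognize(image):
    scores = model_predict(image)        # normalized array, one score per class
    best = int(np.argmax(scores))        # maximum score and its class index
    if scores[best] > THRESHOLD:         # output only when confident enough
        return CLASS_NAMES[best], float(scores[best])
    return None                          # otherwise output nothing

print(baseline_recognize(None))          # -> None: the max score 0.40 does not clear 0.7
```

The printed `None` illustrates exactly the failure mode described above: when all fine-class scores are low, the thresholded baseline outputs nothing.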
Summary of the invention
The invention mainly solves the technical problem of providing a hierarchical classification method and device, so as to solve the technical problem in the prior art of missed and erroneous recognition in image recognition tasks when the scores of all fine classes are low.
To solve the above technical problem, one technical solution adopted by the present invention is to provide a hierarchical classification method, the method comprising: a neural network receives an input image and performs recognition processing on the image to output normalized scores, wherein the neural network has been trained in advance and the output normalized scores are predicted probabilities of the classes; the maximum score among the normalized scores is found and its corresponding class is determined; whether the maximum score exceeds a threshold is judged; when the maximum score exceeds the threshold, the maximum score and its corresponding class are output as the recognition result; when the maximum score does not exceed the threshold, the score distribution is converted into a coarse-class score distribution, and the steps of finding the maximum score among the normalized scores, determining its corresponding class, and judging whether the maximum score exceeds the threshold are executed cyclically, until the maximum score exceeds the threshold, whereupon the loop stops and the maximum score and its corresponding class are output as the recognition result.
Converting the score distribution into the coarse-class score distribution specifically includes: taking the score distribution input by the neural network as the fine-class score distribution; inputting the class hierarchy tree, wherein the class hierarchy tree is built in advance and stores the tree-shaped relational structure from fine classes to coarse classes; taking the currently input fine-class score distribution as child nodes and querying all parent nodes in the class hierarchy tree to obtain the corresponding coarse classes; calculating the scores of all coarse classes and normalizing them; and outputting the coarse-class score distribution.
The scores of all coarse classes are calculated and normalized as follows. The score of each coarse class is calculated using the formulas

$$r_c = \frac{m}{n\,m_c}, \qquad P(c) = r_c \sum_{x \in c} P(x)$$

where m is the number of fine classes, n is the number of coarse classes, c is the coarse-class index, m_c is the number of fine classes contained in coarse class c, r_c is the balance coefficient of coarse class c, P(c) is the score of coarse class c, x ranges over the indices of the fine classes contained in coarse class c, and P(x) is the fine-class score. The computed coarse-class scores are then normalized using the formula

$$P'(c) = \frac{P(c)}{\sum_{k=1}^{n} P(k)}$$
Building the class hierarchy tree specifically includes: selecting part of the data set as an evaluation set, and inputting the evaluation set; the neural network recognizing and processing the evaluation set to obtain normalized output results; combining the true labels, tallying the fine-classification results of the neural network, and building a fine-class confusion matrix; clustering easily-confused classes in the confusion matrix into groups, each group forming a coarse class; naming and numbering the coarse classes, and saving the relationship between fine classes and coarse classes in a tree data structure; judging whether the nodes saving the coarse classes have reached the root of the tree; if so, saving and outputting the class hierarchy tree; otherwise, calculating the coarse-class confusion matrix and restarting the loop from the step of clustering easily-confused classes in the confusion matrix into groups as coarse classes, stopping the loop when the node saving the coarse class reaches the root of the tree, and outputting the class hierarchy tree.
The coarse-class confusion matrix is calculated using the formula

$$c_{uv} = \sum_{i \in U} \sum_{j \in V} x_{ij}$$

where m is the number of fine classes, i and j are fine-class indices, x_ij is the element in row i and column j of the fine-class confusion matrix X, n is the number of coarse classes, u and v are coarse-class indices, c_uv is the element in row u and column v of the coarse-class confusion matrix C, and U and V are the sets of fine-class indices contained in coarse classes u and v respectively.
To solve the above technical problem, another technical solution adopted by the present invention is to provide a hierarchical classification device, the device comprising: an image processing unit for receiving an input image and performing recognition processing on the image to output normalized scores, wherein the output normalized scores are predicted probabilities of the classes; a search unit for finding the maximum score among the normalized scores and determining its corresponding class; a judging unit for judging whether the maximum score exceeds a threshold; a recognition result output unit for outputting the maximum score and its corresponding class as the recognition result when the maximum score exceeds the threshold; and a coarse-class conversion unit for converting the score distribution into a coarse-class score distribution when the maximum score does not exceed the threshold, and triggering the search unit to find the maximum score among the normalized scores to determine the corresponding class and triggering the judging unit to judge whether the maximum score exceeds the threshold.
The coarse-class conversion unit is specifically configured to: take the currently input fine-class score distribution as child nodes and query all parent nodes in the class hierarchy tree to obtain the corresponding coarse classes, wherein the class hierarchy tree is built in advance and stores the tree-shaped relational structure from fine classes to coarse classes; and calculate the scores of all coarse classes and normalize them to obtain the coarse-class score distribution.
The coarse-class conversion unit is further specifically configured to calculate the score of each coarse class using the formulas

$$r_c = \frac{m}{n\,m_c}, \qquad P(c) = r_c \sum_{x \in c} P(x)$$

where m is the number of fine classes, n is the number of coarse classes, c is the coarse-class index, m_c is the number of fine classes contained in coarse class c, r_c is the balance coefficient of coarse class c, P(c) is the score of coarse class c, x ranges over the indices of the fine classes contained in coarse class c, and P(x) is the fine-class score; and to normalize the computed coarse-class scores using the formula

$$P'(c) = \frac{P(c)}{\sum_{k=1}^{n} P(k)}$$
The device further includes a class hierarchy tree building unit configured to: select part of the data set as an evaluation set, and input the evaluation set; recognize and process the evaluation set to obtain normalized output results; combine the true labels and tally the fine-classification results of the neural network to build a fine-class confusion matrix; cluster easily-confused classes in the confusion matrix into groups, each group forming a coarse class; name and number the coarse classes, and save the relationship between fine classes and coarse classes in a tree data structure; judge whether the nodes saving the coarse classes have reached the root of the tree; if so, save and output the class hierarchy tree; otherwise, calculate the coarse-class confusion matrix and restart the loop from the step of clustering easily-confused classes in the confusion matrix into groups as coarse classes, stopping the loop when the node saving the coarse class reaches the root of the tree, and output the class hierarchy tree.
The class hierarchy tree building unit is specifically configured to calculate the coarse-class confusion matrix using the formula

$$c_{uv} = \sum_{i \in U} \sum_{j \in V} x_{ij}$$

where m is the number of fine classes, i and j are fine-class indices, x_ij is the element in row i and column j of the fine-class confusion matrix X, n is the number of coarse classes, u and v are coarse-class indices, c_uv is the element in row u and column v of the coarse-class confusion matrix C, and U and V are the sets of fine-class indices contained in coarse classes u and v respectively.
In the above solutions, the hierarchical classification method and device of the present invention convert the result to a coarse-class recognition result when the scores of all fine classes are low. Because a coarse-class score is generally higher than the scores of its fine classes, more results survive threshold screening and each result has a higher confidence, thereby reducing missed and erroneous recognition and improving accuracy.
Brief description of the drawings
Fig. 1 is a flow diagram of a hierarchical classification method in an embodiment of the present invention;
Fig. 2 is a flow diagram of the manner, shown in Fig. 1, of converting the score distribution, taken as a fine-class score distribution, into a coarse-class score distribution;
Fig. 3 is a flow diagram of the method for building the class hierarchy tree in an embodiment of the present invention;
Fig. 4 is a structural diagram of a hierarchical classification device in a first embodiment of the present invention;
Fig. 5 is a structural diagram of a hierarchical classification device in a second embodiment of the present invention.
Detailed description of the embodiments
To describe in detail the technical content, structural features, objects and effects of the invention, the invention is described in detail below with reference to the accompanying drawings and embodiments.
Referring to Fig. 1, which is a flow diagram of a hierarchical classification method in an embodiment of the present invention. The method includes the following steps:
Step S10: the neural network receives an input image and performs recognition processing on the image to output normalized scores.
The input image may come from a variety of sources, such as pictures, films, cameras and video cameras.
The neural network has been trained in advance and has classification capability; its output has a normalized distribution form, that is, all output values lie in the interval from 0 to 1 and sum to 1. The values output by the neural network represent the score for, or predicted probability of, each class.
Further, the structure of the neural network may be a linear classifier, a BP neural network, a deep convolutional neural network, a recurrent neural network, or the like. The neural network can be applied to classification-related tasks such as image recognition, object detection and image segmentation.
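As an illustration of this output form, such normalization is typically produced by a softmax layer; a minimal example follows (the logit values are made up):

```python
import numpy as np

def softmax(logits):
    """Normalize raw network outputs into scores in [0, 1] that sum to 1."""
    z = np.exp(logits - logits.max())   # subtract the max for numerical stability
    return z / z.sum()

scores = softmax(np.array([2.0, 1.1, 0.3, -0.5]))
print(scores, scores.sum())             # approx. [0.598 0.243 0.109 0.049], sum == 1.0
```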
Step S11: find the maximum score p among the normalized scores and its corresponding class c.
For example, p = 0.75 and c = apple.
Step S12: judge whether the maximum score p exceeds the threshold h; if p > h, go to step S13; otherwise, go to step S14.
Here 0 < h < 1, and the threshold h may take different values at different loop stages (that is, at different class levels); for example, h = 0.7 in the first loop, h = 0.8 in the second loop, and so on. The concrete value of h is generally set by the technician according to actual test conditions.
Step S13: output the values p and c as the final recognition result; the flow then ends.
Step S14: take the score distribution as a fine-class score distribution and convert it into a coarse-class score distribution; then return to step S11.
After fine classes are converted to coarse classes, each coarse-class score is greater than or equal to the scores of the fine classes it contains, so after a few loops the coarse-class score (at most 1.0) necessarily exceeds the threshold and the loop is exited.
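A minimal sketch of this loop (steps S11 to S14) is given below, assuming per-level thresholds as suggested above; the function and parameter names are illustrative, and `to_coarse`, the one-level conversion of Fig. 2, is sketched after the description of Fig. 2:

```python
import numpy as np

def hierarchical_classify(scores, labels, to_coarse, thresholds=(0.7, 0.8, 0.9)):
    """Fine-to-coarse recognition loop over steps S11 to S14.

    scores     -- normalized fine-class scores from step S10
    labels     -- class name for each entry of scores
    to_coarse  -- one-level fine-to-coarse conversion (Fig. 2)
    thresholds -- one threshold per class level, as the text suggests
    """
    for h in thresholds:
        best = int(np.argmax(scores))                  # step S11: max score p, class c
        if scores[best] > h:                           # step S12: p > h?
            return labels[best], float(scores[best])   # step S13: output p and c
        scores, labels = to_coarse(scores, labels)     # step S14: one level coarser
    # near the root a single class holds score 1.0, which exceeds any h < 1,
    # so with enough levels in thresholds the loop exits through step S13;
    # as a fallback, report the best class at the coarsest level reached
    best = int(np.argmax(scores))
    return labels[best], float(scores[best])
```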
Referring to Fig. 2, step S14, namely converting the score distribution, taken as a fine-class score distribution, into a coarse-class score distribution, specifically includes the following steps:
Step S140: input the fine-class score distribution.
Specifically, this score distribution may come from the output of the neural network, or it may come from the recursive output of the previous round of the coarse-class conversion module. The score distribution must be in normalized form, that is, all values lie between 0 and 1 and sum to 1. In the table below, every layer (except the top one) is normalized and can serve as input.
Table 1: Class hierarchy tree example
Step S141: input the class hierarchy tree.
The class hierarchy tree is a tree-shaped data organization structure. Table 1 shows one embodiment of a class hierarchy tree. In the tree, child nodes represent fine classes, for example pomaceous fruits and nut fruits, while parent nodes represent coarse classes, for example fruit. The leaf nodes of the tree thus represent the finest classes, for example apple, pear and walnut at the bottom of Table 1; these classes coincide with the output classes of the aforementioned neural network. The tree has exactly one root node, for example the physical-object class at the top of Table 1, which is the coarsest class and covers all fine classes.
Step S142: take the currently input fine classes as child nodes and query all parent nodes in the class hierarchy tree to obtain the corresponding coarse classes.
For example, given that the input is the distribution over pomaceous fruits, nut fruits, mineral water and Sprite in Table 1, the query yields the coarse classes fruit and beverage.
Step S143: calculate the scores of all coarse classes and normalize them.
Sample imbalance is a common problem in big data. One solution used in deep learning is to assign each class a balance coefficient and multiply the loss function of each class by its coefficient, where balance coefficient = average sample count per class / sample count of this class. In this embodiment the fine-class scores are assumed to be results obtained after sample balancing, but the numbers of fine classes contained in the coarse classes differ; for example, in Table 1 pomaceous fruits contain two fine classes while Sprite contains only one, so the samples of the coarse classes are still unbalanced. Observe that the sample count of a coarse class is just the number of fine classes it contains; by analogy, the balance coefficient of a coarse class can therefore be calculated. Specifically, the calculation formulas are as follows.
Suppose there are m fine classes in total and n coarse classes, and let c be a coarse-class index. If coarse class c contains m_c fine classes, its balance coefficient r_c is

$$r_c = \frac{m}{n\,m_c}$$

Let P(c) be the score of coarse class c, let x range over the indices of the fine classes it contains, and let P(x) be the fine-class score; then

$$P(c) = r_c \sum_{x \in c} P(x)$$

The computed coarse-class scores are normalized using the formula

$$P'(c) = \frac{P(c)}{\sum_{k=1}^{n} P(k)}$$

Taking Table 1 as an example: apple and pear are fine classes and pomaceous fruits is a coarse class, so the balance coefficient of pomaceous fruits is 7/(4×2) = 0.875, and P(pomaceous fruits) = 0.875 × [P(apple) + P(pear)] = 0.875 × (0.5 + 0.2) = 0.6125. Similarly, the scores of nut fruits, mineral water and Sprite are computed as 0.21875, 0.035 and 0.0175 respectively. Finally, renormalizing gives the coarse-class scores (rounded) of the third row of Table 1: [0.6930, 0.2475, 0.0396, 0.0198]. The scores of fruit and beverage in Table 1 can be calculated in the same way.
Step S144: output the coarse-class score distribution.
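Steps S140 to S144 can be sketched as follows. The child-to-parent map below encodes one level of the class hierarchy tree; it loosely follows Table 1 but is an illustrative assumption (the original table has seven fine classes, one of which is not named in the text), so the numbers it produces will not exactly reproduce the table:

```python
import numpy as np

# one level of the class hierarchy tree: fine class -> coarse class
PARENT = {"apple": "pomaceous fruits", "pear": "pomaceous fruits",
          "walnut": "nut fruits", "chestnut": "nut fruits",
          "mineral water": "mineral water", "Sprite": "Sprite"}

def to_coarse(scores, labels, parent=PARENT):
    """Convert a normalized fine-class distribution into a coarse-class one."""
    m = len(labels)                                   # number of fine classes
    coarse = sorted(set(parent[l] for l in labels))   # coarse-class names
    n = len(coarse)                                   # number of coarse classes
    raw = []
    for c in coarse:
        members = [i for i, l in enumerate(labels) if parent[l] == c]
        r_c = m / (n * len(members))                  # balance coefficient m/(n*m_c)
        raw.append(r_c * sum(scores[i] for i in members))  # P(c) = r_c * sum of P(x)
    raw = np.asarray(raw)
    return raw / raw.sum(), coarse                    # renormalize so scores sum to 1

fine_labels = ["apple", "pear", "walnut", "chestnut", "mineral water", "Sprite"]
fine_scores = np.array([0.5, 0.2, 0.15, 0.1, 0.03, 0.02])
print(to_coarse(fine_scores, fine_labels))
```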
Further, referring to Fig. 3: the class hierarchy tree is used to store the tree-shaped relational structure from fine classes to coarse classes. The method of building the class hierarchy tree includes the following steps:
Step S20: select part of the data set as an evaluation set, and input the evaluation set.
The evaluation set contains images of every class and their true labels; the number of images per class should not be too small, preferably more than 50.
Step S21: the neural network recognizes and processes the evaluation set to obtain normalized output results.
Step S22: combining the true labels, tally the fine-classification results of the neural network and build a fine-class confusion matrix.
A confusion matrix is a visualization tool for comparing predicted results against true label values. If the number of fine classes is m, the fine-class confusion matrix X has size m × m; the columns of the matrix correspond to the predicted fine classes and the rows to the true fine labels, that is, the element in row i and column j of the matrix is the number of samples whose true label is class i but whose predicted label is class j. Taking Table 2 as an example, the values on the main diagonal of the matrix are the counts of correct classifications and the rest are counts of misclassifications; the larger a misclassification count, the more easily the corresponding classes are confused.
Table 2: Fine-class confusion matrix example
Step S23: cluster easily-confused classes in the confusion matrix into groups, each group forming a coarse class.
Easily-confused classes are found by inspecting the confusion matrix; with the current classes taken as fine classes, multiple easily-confused fine classes are clustered, either manually or automatically, into one group that forms a coarse class. For example, apple and pear are manually merged into pomaceous fruits. A single fine class may form a coarse class by itself, and all fine classes together may also be merged into one coarse class.
Step S24: name and number the coarse classes, and save the relationship between fine classes and coarse classes in a tree data structure.
Specifically, each coarse class is saved as a parent node of the tree and its corresponding fine classes as child nodes; all these relational structures together form a class hierarchy tree.
Step S25: judge whether the nodes saving the coarse classes have reached the root of the tree; if so, go to step S26; otherwise, go to step S27.
When only one coarse class remains, the root node has been reached; that is, whether the root has been reached is determined by judging whether there is only one coarse class.
Because the number of coarse classes is always smaller than the number of fine classes, after several loops only one coarse class must remain (namely the aforementioned physical-object class), meaning the root node has been reached and the loop is exited.
Step S26: save and output the class hierarchy tree. The flow then ends.
Step S27: calculate the coarse-class confusion matrix; then return to step S23.
The coarse-class confusion matrix is calculated as follows. Suppose the number of fine classes is m, so the fine-class confusion matrix X has size m × m, with element x_ij in row i and column j, where i and j are fine-class indices; suppose the number of coarse classes is n, so the coarse-class confusion matrix C has size n × n, with element c_uv in row u and column v, where u and v are coarse-class indices and U and V are the sets of fine-class indices contained in coarse classes u and v respectively. Each c_uv is computed in turn by the formula

$$c_{uv} = \sum_{i \in U} \sum_{j \in V} x_{ij}$$
Taking Table 3 as an example: pomaceous fruits contain apple and pear, so the first value, 97, equals the sum of the values in the first two rows and first two columns of Table 2, i.e. 97 = 46 + 2 + 2 + 47; nut fruits contain walnut and Chinese chestnut, so the second value of the first row, 3, equals the sum of the values in the first two rows and the third and fourth columns of Table 2, i.e. 3 = 1 + 1 + 1 + 0; and so on.
Table 3: Coarse-class confusion matrix example
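Under these definitions, C is obtained from X by summing blocks of rows and columns. A minimal sketch follows; the fine-class matrix is illustrative, with its first two rows chosen to match the sums quoted above and its last two rows made up:

```python
import numpy as np

def coarse_confusion(X, groups):
    """Aggregate a fine-class confusion matrix X into a coarse-class matrix C.

    groups[u] is the set U of fine-class indices contained in coarse class u,
    so that c_uv is the sum over i in U, j in V of x_ij.
    """
    n = len(groups)
    C = np.zeros((n, n), dtype=X.dtype)
    for u, U in enumerate(groups):
        for v, V in enumerate(groups):
            C[u, v] = X[np.ix_(U, V)].sum()   # sum of the U-rows by V-columns block
    return C

# fine classes: apple, pear, walnut, chestnut; coarse: pomaceous fruits, nut fruits.
# only the first two rows are pinned down by the sums quoted in the text
# (97 = 46+2+2+47 and 3 = 1+1+1+0); the last two rows are made-up values
X = np.array([[46,  2,  1,  1],
              [ 2, 47,  1,  0],
              [ 0,  1, 48,  1],
              [ 1,  0,  2, 47]])
print(coarse_confusion(X, [[0, 1], [2, 3]]))  # [[97  3] [ 2 98]]
```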
Referring to Fig. 4, which is a structural diagram of a hierarchical classification device in a first embodiment of the present invention. The device can be applied in multi-class image recognition scenarios and may be provided in various computing devices, for example computers, handheld computers and microcontrollers. It may be a software unit, a hardware unit, or a combined software-and-hardware unit running in these devices, and it may also be integrated into these devices, or into application systems running on them, as an independent component.
The device 30 includes:
an image processing unit 31 for receiving an input image and performing recognition processing on the image to output normalized scores, wherein the output normalized scores are predicted probabilities of the classes. In this embodiment, the image processing unit is provided with a neural network module for processing and recognizing the image through the neural network to obtain the fine-class score prediction of the image. The neural network may be deployed in the cloud or in a local computing device. Further, the neural network has been trained and has classification capability, and its output must have a normalized distribution form;
a search unit 32 for finding the maximum score among the normalized scores and determining its corresponding class;
a judging unit 33 for judging whether the maximum score exceeds a threshold;
a recognition result output unit 34 for outputting the maximum score and its corresponding class as the recognition result when the maximum score exceeds the threshold; and
a coarse-class conversion unit 35 for converting the score distribution into a coarse-class score distribution when the maximum score does not exceed the threshold, and triggering the search unit 32 to find the maximum score among the normalized scores to determine the corresponding class and triggering the judging unit 33 to judge whether the maximum score exceeds the threshold.
Further, the coarse-class conversion unit 35 is specifically configured to:
take the currently input fine-class score distribution as child nodes and query all parent nodes in the class hierarchy tree to obtain the corresponding coarse classes, wherein the class hierarchy tree is built in advance and stores the tree-shaped relational structure from fine classes to coarse classes;
calculate the score of each coarse class using the formulas

$$r_c = \frac{m}{n\,m_c}, \qquad P(c) = r_c \sum_{x \in c} P(x)$$

where m is the number of fine classes, n is the number of coarse classes, c is the coarse-class index, m_c is the number of fine classes contained in coarse class c, r_c is the balance coefficient of coarse class c, P(c) is the score of coarse class c, x ranges over the indices of the fine classes contained in coarse class c, and P(x) is the fine-class score; and
normalize the computed coarse-class scores using the formula

$$P'(c) = \frac{P(c)}{\sum_{k=1}^{n} P(k)}$$
Further, the device also includes a class hierarchy tree building unit 36 for building the class hierarchy tree, which describes the relational structure from all fine classes to coarse classes. Specifically, the class hierarchy tree building unit 36 is configured to:
select part of the data set as an evaluation set, and input the evaluation set;
recognize and process the evaluation set to obtain normalized output results;
combine the true labels and tally the fine-classification results of the neural network to build a fine-class confusion matrix;
cluster easily-confused classes in the confusion matrix into groups, each group forming a coarse class;
name and number the coarse classes, and save the relationship between fine classes and coarse classes in a tree data structure; and
judge whether the nodes saving the coarse classes have reached the root of the tree; if so, save and output the class hierarchy tree; otherwise, calculate the coarse-class confusion matrix and restart the loop from the step of clustering easily-confused classes in the confusion matrix into groups as coarse classes, stopping the loop when the node saving the coarse class reaches the root of the tree, and output the class hierarchy tree.
Further, the class hierarchy tree building unit 36 calculates the coarse-class confusion matrix using the formula

$$c_{uv} = \sum_{i \in U} \sum_{j \in V} x_{ij}$$

where m is the number of fine classes, i and j are fine-class indices, x_ij is the element in row i and column j of the fine-class confusion matrix X, n is the number of coarse classes, u and v are coarse-class indices, c_uv is the element in row u and column v of the coarse-class confusion matrix C, and U and V are the sets of fine-class indices contained in coarse classes u and v respectively.
Referring to Fig. 5: further, the device 40 also includes a communication module 47 for implementing communication between the units, for example transmitting the recognition result to other modules, or carrying the control that other modules exert over the neural network. If the neural network is deployed in the cloud, the communication module takes forms such as Ethernet, WiFi or 3G; if the neural network is deployed locally, the communication module takes forms such as a LAN, a switch or a bus.
The device 40 further includes a memory 48 for storing and exchanging the data between the modules and for recording the above computer program itself. For example, these data include the class hierarchy tree structure, the neural network structure, the classification results of each stage from fine to coarse, preset threshold parameters, and the like. The medium of the memory may be, for example, a floppy disk, a hard disk, a CD-ROM or a semiconductor memory.
As described above, the hierarchical classification method and device of the present invention convert the result to a coarse-class recognition result when the scores of all fine classes are low. Because a coarse-class score is generally higher than the scores of its fine classes, more results survive threshold screening and each result has a higher confidence, thereby reducing missed and erroneous recognition and improving accuracy.
Further, the present invention can still predict a class for new or unknown object types; for example, when the network sees a picture of a mule and cannot be sure whether it is a donkey, it can still predict quadruped. The present invention only further processes the recognition result, adding a class hierarchy tree building module and a coarse-class conversion module, and requires no special neural network structure or training method; the method flow is simple and the computation small, giving it strong usability and practicality.
The foregoing is merely embodiments of the present invention and does not limit the patent scope of the invention; any equivalent structural or flow transformation made using the contents of the description and drawings of the invention, applied directly or indirectly in other related technical fields, is likewise included within the patent protection scope of the present invention.

Claims (10)

1. A hierarchical classification method, characterized in that the method comprises:
a neural network receiving an input image and performing recognition processing on the image to output normalized scores, wherein the neural network has been trained in advance and the output normalized scores are predicted probabilities of the classes;
finding the maximum score among the normalized scores, and determining its corresponding class;
judging whether the maximum score exceeds a threshold;
when the maximum score exceeds the threshold, outputting the maximum score and its corresponding class as the recognition result; and when the maximum score does not exceed the threshold, converting the score distribution into a coarse-class score distribution, and then cyclically executing the steps of finding the maximum score among the normalized scores, determining its corresponding class, and judging whether the maximum score exceeds the threshold, until the maximum score exceeds the threshold, whereupon the loop stops and the maximum score and its corresponding class are output as the recognition result.
2. The hierarchical classification method according to claim 1, characterized in that converting the score distribution into the coarse-class score distribution specifically comprises:
taking the score distribution input by the neural network as the fine-class score distribution;
inputting the class hierarchy tree, wherein the class hierarchy tree is built in advance and stores the tree-shaped relational structure from fine classes to coarse classes;
taking the currently input fine-class score distribution as child nodes, and querying all parent nodes in the class hierarchy tree to obtain the corresponding coarse classes;
calculating the scores of all coarse classes, and normalizing them; and
outputting the coarse-class score distribution.
3. The hierarchical classification method according to claim 2, characterized in that calculating the scores of all coarse classes and normalizing them is specifically:
calculating the score of each coarse class using the formulas

$$r_c = \frac{m}{n\,m_c}, \qquad P(c) = r_c \sum_{x \in c} P(x)$$

where m is the number of fine classes, n is the number of coarse classes, c is the coarse-class index, m_c is the number of fine classes contained in coarse class c, r_c is the balance coefficient of coarse class c, P(c) is the score of coarse class c, x ranges over the indices of the fine classes contained in coarse class c, and P(x) is the fine-class score; and
normalizing the computed coarse-class scores using the formula

$$P'(c) = \frac{P(c)}{\sum_{k=1}^{n} P(k)}$$
4. The hierarchical classification method according to claim 2, characterized in that building the class hierarchy tree specifically comprises:
selecting part of the data set as an evaluation set, and inputting the evaluation set;
the neural network recognizing and processing the evaluation set to obtain normalized output results;
combining the true labels, tallying the fine-classification results of the neural network, and building a fine-class confusion matrix;
clustering easily-confused classes in the confusion matrix into groups, each group forming a coarse class;
naming and numbering the coarse classes, and saving the relationship between fine classes and coarse classes in a tree data structure; and
judging whether the nodes saving the coarse classes have reached the root of the tree; if so, saving and outputting the class hierarchy tree; otherwise, calculating the coarse-class confusion matrix, and restarting the loop from the step of clustering easily-confused classes in the confusion matrix into groups as coarse classes, stopping the loop when the node saving the coarse class reaches the root of the tree, and outputting the class hierarchy tree.
5. The hierarchical classification method according to claim 4, characterized in that calculating the coarse-class confusion matrix is specifically:
calculating the coarse-class confusion matrix using the formula

$$c_{uv} = \sum_{i \in U} \sum_{j \in V} x_{ij}$$

where m is the number of fine classes, i and j are fine-class indices, x_ij is the element in row i and column j of the fine-class confusion matrix X, n is the number of coarse classes, u and v are coarse-class indices, c_uv is the element in row u and column v of the coarse-class confusion matrix C, and U and V are the sets of fine-class indices contained in coarse classes u and v respectively.
6. A hierarchical classification device, characterized in that the device comprises:
an image processing unit for receiving an input image and performing recognition processing on the image to output normalized scores, wherein the output normalized scores are predicted probabilities of the classes;
a search unit for finding the maximum score among the normalized scores and determining its corresponding class;
a judging unit for judging whether the maximum score exceeds a threshold;
a recognition result output unit for outputting the maximum score and its corresponding class as the recognition result when the maximum score exceeds the threshold; and
a coarse-class conversion unit for converting the score distribution into a coarse-class score distribution when the maximum score does not exceed the threshold, and triggering the search unit to find the maximum score among the normalized scores to determine the corresponding class, and triggering the judging unit to judge whether the maximum score exceeds the threshold.
7. The hierarchical classification device according to claim 6, characterized in that the coarse-class conversion unit is specifically configured to:
take the currently input fine-class score distribution as child nodes and query all parent nodes in the class hierarchy tree to obtain the corresponding coarse classes, wherein the class hierarchy tree is built in advance and stores the tree-shaped relational structure from fine classes to coarse classes; and
calculate the scores of all coarse classes, and normalize them to obtain the coarse-class score distribution.
8. The hierarchical classification device according to claim 7, characterized in that the coarse-class conversion unit is specifically configured to:
calculate the score of each coarse class using the formulas

$$r_c = \frac{m}{n\,m_c}, \qquad P(c) = r_c \sum_{x \in c} P(x)$$

where m is the number of fine classes, n is the number of coarse classes, c is the coarse-class index, m_c is the number of fine classes contained in coarse class c, r_c is the balance coefficient of coarse class c, P(c) is the score of coarse class c, x ranges over the indices of the fine classes contained in coarse class c, and P(x) is the fine-class score; and
normalize the computed coarse-class scores using the formula

$$P'(c) = \frac{P(c)}{\sum_{k=1}^{n} P(k)}$$
9. The hierarchical classification device according to claim 7, characterized in that the device further includes a class hierarchy tree building unit configured to:
select part of the data set as an evaluation set, and input the evaluation set;
recognize and process the evaluation set to obtain normalized output results;
combine the true labels and tally the fine-classification results of the neural network to build a fine-class confusion matrix;
cluster easily-confused classes in the confusion matrix into groups, each group forming a coarse class;
name and number the coarse classes, and save the relationship between fine classes and coarse classes in a tree data structure; and
judge whether the nodes saving the coarse classes have reached the root of the tree; if so, save and output the class hierarchy tree; otherwise, calculate the coarse-class confusion matrix and restart the loop from the step of clustering easily-confused classes in the confusion matrix into groups as coarse classes, stopping the loop when the node saving the coarse class reaches the root of the tree, and output the class hierarchy tree.
10. The hierarchical classification device according to claim 7, characterized in that the class hierarchy tree building unit is specifically configured to calculate the coarse-class confusion matrix using the formula

$$c_{uv} = \sum_{i \in U} \sum_{j \in V} x_{ij}$$

where m is the number of fine classes, i and j are fine-class indices, x_ij is the element in row i and column j of the fine-class confusion matrix X, n is the number of coarse classes, u and v are coarse-class indices, c_uv is the element in row u and column v of the coarse-class confusion matrix C, and U and V are the sets of fine-class indices contained in coarse classes u and v respectively.
CN201810012074.5A 2018-01-05 2018-01-05 Hierarchical classification method and device Active CN108229566B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810012074.5A CN108229566B (en) 2018-01-05 2018-01-05 Hierarchical classification method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810012074.5A CN108229566B (en) 2018-01-05 2018-01-05 Hierarchical classification method and device

Publications (2)

Publication Number Publication Date
CN108229566A true CN108229566A (en) 2018-06-29
CN108229566B CN108229566B (en) 2020-06-05

Family

ID=62643139

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810012074.5A Active CN108229566B (en) 2018-01-05 2018-01-05 Hierarchical classification method and device

Country Status (1)

Country Link
CN (1) CN108229566B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102135981A (en) * 2010-01-25 2011-07-27 安克生医股份有限公司 Method for multi-layer classifier
CN104978328A (en) * 2014-04-03 2015-10-14 北京奇虎科技有限公司 Hierarchical classifier obtaining method, text classification method, hierarchical classifier obtaining device and text classification device
CN105224960A (en) * 2015-11-04 2016-01-06 江南大学 Based on the corn seed classification hyperspectral imagery model of cognition update method of clustering algorithm
CN106611193A (en) * 2016-12-20 2017-05-03 太极计算机股份有限公司 Image content information analysis method based on characteristic variable algorithm

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
JIA DENG et al., "Hedging Your Bets: Optimizing Accuracy-Specificity Trade-offs in Large Scale Visual Recognition", 2012 IEEE Conference on Computer Vision and Pattern Recognition *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109685110A (en) * 2018-11-28 2019-04-26 北京陌上花科技有限公司 Training method, image classification method and device, the server of image classification network
CN109670480A (en) * 2018-12-29 2019-04-23 深圳市丰巢科技有限公司 Image discriminating method, device, equipment and storage medium
CN109670480B (en) * 2018-12-29 2023-01-24 深圳市丰巢科技有限公司 Image discrimination method, device, equipment and storage medium
CN109919177A (en) * 2019-01-23 2019-06-21 西北工业大学 Feature selection approach based on stratification depth network
CN109919177B (en) * 2019-01-23 2022-03-29 西北工业大学 Feature selection method based on hierarchical deep network
CN110309888A (en) * 2019-07-11 2019-10-08 南京邮电大学 A kind of image classification method and system based on layering multi-task learning
WO2021083105A1 (en) * 2019-10-29 2021-05-06 北京灵汐科技有限公司 Neural network mapping method and apparatus
US11769044B2 (en) 2019-10-29 2023-09-26 Lynxi Technologies Co., Ltd. Neural network mapping method and apparatus
CN111144522A (en) * 2019-12-16 2020-05-12 浙江大学 Power grid NFC equipment fingerprint authentication method based on hardware intrinsic difference
CN111753757A (en) * 2020-06-28 2020-10-09 浙江大华技术股份有限公司 Image recognition processing method and device
CN111753757B (en) * 2020-06-28 2021-06-18 浙江大华技术股份有限公司 Image recognition processing method and device

Also Published As

Publication number Publication date
CN108229566B (en) 2020-06-05

Similar Documents

Publication Publication Date Title
CN108229566A (en) A kind of hierarchy sorting technique and device
CN104573669B (en) Image object detection method
CN107527031B (en) SSD-based indoor target detection method
CN107480642A (en) A kind of video actions recognition methods based on Time Domain Piecewise network
CN108564029A (en) Face character recognition methods based on cascade multi-task learning deep neural network
WO2019179403A1 (en) Fraud transaction detection method based on sequence width depth learning
CN111695482A (en) Pipeline defect identification method
CN106980858A (en) The language text detection of a kind of language text detection with alignment system and the application system and localization method
CN110188209A (en) Cross-module state Hash model building method, searching method and device based on level label
CN104239712B (en) Real-time evaluation method for anti-interference performance of radar
CN104063719A (en) Method and device for pedestrian detection based on depth convolutional network
CN106228554B (en) Fuzzy coarse central coal dust image partition method based on many attribute reductions
CN108229550A (en) A kind of cloud atlas sorting technique that network of forests network is cascaded based on more granularities
CN112541532B (en) Target detection method based on dense connection structure
CN108363717B (en) Data security level identification and detection method and device
CN110490227A (en) A kind of few sample image classification method based on Feature Conversion
CN109961097A (en) Image classification dispatching method based on edge calculations under a kind of embedded scene
Wu et al. Fruit classification using convolutional neural network via adjust parameter and data enhancement
CN109492596A (en) A kind of pedestrian detection method and system based on K-means cluster and region recommendation network
CN109919045A (en) Small scale pedestrian detection recognition methods based on concatenated convolutional network
CN109903339A (en) A kind of video group personage&#39;s position finding and detection method based on multidimensional fusion feature
CN110377727A (en) A kind of multi-tag file classification method and device based on multi-task learning
CN110020669A (en) A kind of license plate classification method, system, terminal device and computer program
CN106611016B (en) A kind of image search method based on decomposable word packet model
CN110334775A (en) A kind of recognition methods of unmanned plane line fault and device based on width study

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant