CN105654100A - Method and device for identifying object through calculation device and electronic equipment - Google Patents


Info

Publication number
CN105654100A
CN105654100A (application number CN201410601359.4A)
Authority
CN
China
Prior art keywords
hypersphere
classification
transform characteristics
sample
transformation matrix
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201410601359.4A
Other languages
Chinese (zh)
Inventor
廉旭航
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nokia Oyj
Nokia Technologies Oy
Original Assignee
Nokia Technologies Oy
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nokia Technologies Oy
Priority to CN201410601359.4A
Publication of CN105654100A
Legal status: Pending

Landscapes

  • Image Analysis (AREA)

Abstract

An embodiment of the invention discloses a method for identifying an object using a computing device, comprising: extracting one or more biometric features of an object to be identified; transforming the one or more biometric features into a transform feature using a feature transformation matrix, wherein the feature transformation matrix has been used to pre-create multiple hyperspheres for multiple target objects, each of the hyperspheres uniquely corresponds to one of the target objects, and the hyperspheres are spatially separated from one another; and identifying whether the object to be identified is one of the target objects by determining whether the transform feature corresponds to one of the hyperspheres. Embodiments of the invention improve the accuracy of open-set biometric object recognition.

Description

Method, apparatus, and electronic device for identifying an object using a computing device
Technical field
Embodiments of the present invention relate to the field of pattern recognition, and in particular to a method, an apparatus, and an electronic device for identifying an object using a computing device.
Background art
Biometric object recognition based on features such as the face, fingerprint, and iris has great application value in devices such as smartphones, wearable devices, and bank authentication equipment, where it affects user experience, information security, and financial security.
In biometric object recognition, open-set recognition refers to the situation in which the object to be identified may be one of the target objects in an established biometric object database, but may also not be any target object in that database.
At present, open-set biometric object recognition generally requires the following two steps:
First, determining, in the established biometric object database, the target object closest to the object to be identified;
Second, computing a difference value between the object to be identified and the determined target object, and identifying the object to be identified by comparing this difference value with a preset threshold.
It can be seen that whether the threshold is set accurately is a key factor affecting open-set biometric object recognition.
However, because the threshold in the related art is mainly set according to human experience, it is often difficult to obtain an optimal threshold, and thus difficult to achieve good accuracy in open-set biometric object recognition.
Summary of the invention
To overcome the above problems in the prior art, embodiments of the present invention aim to provide a method, an apparatus, and an electronic device for identifying an object using a computing device.
According to one aspect of the present invention, a method for identifying an object using a computing device is provided, comprising: extracting one or more biometric features of an object to be identified; transforming the one or more biometric features into a transform feature using a feature transformation matrix, wherein the feature transformation matrix has been used to pre-create a plurality of hyperspheres for a plurality of target objects, each of the hyperspheres uniquely corresponds to one of the target objects, and the hyperspheres are spatially separated from one another; and identifying whether the object to be identified is one of the target objects by determining whether the transform feature corresponds to one of the hyperspheres.
According to another aspect of the present invention, an apparatus for identifying an object using a computing device is provided, comprising: a feature extraction unit configured to extract one or more biometric features of an object to be identified; a feature transformation unit configured to transform the one or more biometric features into a transform feature using a feature transformation matrix, wherein the feature transformation matrix has been used to pre-create a plurality of hyperspheres for a plurality of target objects, each of the hyperspheres uniquely corresponds to one of the target objects, and the hyperspheres are spatially separated from one another; and an object recognition unit configured to identify whether the object to be identified is one of the target objects by determining whether the transform feature corresponds to one of the hyperspheres.
As will be understood from the description below, embodiments of the present invention identify the object to be identified via a plurality of separated hyperspheres by using a feature transformation matrix. Because each hypersphere uniquely corresponds to one target object (that is, each target object lies within one hypersphere) and the hyperspheres are spatially separated from one another (that is, the target objects are spatially separated from one another), an extracted biometric feature, after transformation, either clearly belongs to one of the hyperspheres or belongs to none of them, and correspondingly either clearly belongs to one of the target objects or belongs to none of them. In this way, the accuracy of open-set biometric object recognition can be improved.
Brief description of the drawings
The above and other objects, features, and advantages of embodiments of the present invention will become easier to understand by reading the following detailed description with reference to the accompanying drawings. In the drawings, some embodiments of the present invention are illustrated by way of example and not by way of limitation, in which:
Fig. 1 is a schematic diagram of an electronic device in which embodiments of the present invention may be implemented;
Fig. 2 is a flowchart of a method 200 for identifying an object using a computing device according to an embodiment of the present invention;
Fig. 3 is another flowchart of the method 200 for identifying an object using a computing device according to an embodiment of the present invention;
Fig. 4 is a schematic diagram illustrating a method for face recognition according to an embodiment of the present invention;
Fig. 5 is a schematic diagram illustrating the relations between samples before feature transformation with the second feature transformation matrix A according to an embodiment of the present invention;
Fig. 6 is a schematic diagram illustrating the relations between samples after feature transformation with the second feature transformation matrix A according to an embodiment of the present invention;
Fig. 7 is a schematic diagram illustrating an apparatus 700 for identifying an object using a computing device according to an embodiment of the present invention.
In the figures, identical or corresponding reference numerals denote identical or corresponding parts.
Detailed description of the invention
The principles and spirit of the present invention are described below with reference to some illustrative embodiments shown in the accompanying drawings. It should be understood that these embodiments are described only to enable those skilled in the art to better understand and implement the present invention, and are not intended to limit the scope of the present invention in any way.
Reference is made to Fig. 1, which shows a schematic diagram of an electronic device 100 in which embodiments of the present invention may be implemented. According to embodiments of the present invention, the electronic device 100 may be a portable electronic device such as a smartphone. It should be understood, however, that this is merely exemplary and non-limiting. Other types of user device may also readily adopt embodiments of the present invention, such as wearable devices, bank authentication equipment, personal digital assistants (PDAs), pagers, mobile computers, mobile televisions, game devices, laptop computers, cameras, video cameras, GPS devices, and other types of voice and text communication system.
The electronic device 100 may have communication functions. To this end, as shown in Fig. 1, the electronic device 100 may include one or more antennas 112 operable to communicate with a transmitter 114 and a receiver 116. The electronic device 100 further includes at least one controller 120. It should be understood that the controller 120 includes the circuitry required to implement all functions of the electronic device 100. For example, the controller 120 may include a digital signal processor device, a microprocessor device, an A/D converter, a D/A converter, and other supporting circuits. The control and signal-processing functions of the electronic device 100 are distributed according to the respective capabilities of these devices. The electronic device 100 may also include a user interface, for example a ringer 122, a speaker 124, a microphone 126, a display or viewfinder 128, and a keypad 130, all of which are coupled to the controller 120.
The electronic device 100 further includes a battery 134, such as a vibrating battery pack, for powering the various circuits required to operate the electronic device 100 and optionally providing mechanical vibration as a detectable output. The electronic device 100 further includes a user identity module (UIM) 138. The UIM 138 is usually a memory device with a built-in processor. The UIM 138 may include, for example, a Subscriber Identity Module (SIM), a Universal Integrated Circuit Card (UICC), a Universal Subscriber Identity Module (USIM), or a Removable User Identity Module (R-UIM). The UIM 138 may include a card connection detecting apparatus according to embodiments of the present invention.
In addition, the electronic device 100 includes storage devices. For example, the electronic device 100 may include volatile memory 140, for example volatile random access memory (RAM) for temporarily storing data in a cache area. The electronic device 100 may also include other non-volatile memory 142, which may be embedded or removable. The non-volatile memory 142 may additionally or alternatively include, for example, EEPROM and flash memory. The memories may store any of a number of pieces of information and data used by the electronic device 100 so as to implement the functions of the electronic device 100, for example the functions described below with reference to one or more embodiments.
In particular, the electronic device 100 further includes one or more biometric feature extraction apparatuses, such as a camera 136 or a fingerprint extraction apparatus, for extracting one or more biometric features of an object to be identified. The electronic device may also include other biometric feature extraction apparatuses.
It should be understood that the structural block diagram in Fig. 1 is shown for illustrative purposes only and is not intended to limit the scope of embodiments of the present invention. In some cases, some components may be added or removed according to specific needs.
Fig. 2 is a flowchart of a method 200 for identifying an object using a computing device according to an embodiment of the present invention. Those skilled in the art will understand that the method 200 may be performed by the electronic device 100 described above with reference to Fig. 1, for example by the controller 120. For convenience of discussion, the method 200 is described below with reference to the electronic device 100 shown in Fig. 1.
After the method 200 starts, in step S202, one or more biometric features of an object to be identified are extracted.
Next, the method 200 proceeds to step S204, in which the one or more biometric features are transformed into a transform feature using a feature transformation matrix. The feature transformation matrix has been used to pre-create a plurality of hyperspheres for a plurality of target objects, each of the hyperspheres uniquely corresponds to one of the target objects, and the hyperspheres are spatially separated from one another.
The method 200 then proceeds to step S206, in which whether the object to be identified is one of the target objects is identified by determining whether the transform feature corresponds to one of the hyperspheres. That is, if the transform feature corresponds to one of the hyperspheres, the object to be identified is identified as the corresponding target object; otherwise, the object to be identified is identified as not belonging to the plurality of target objects.
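For illustration only, the sketch below shows this accept/reject decision at test time, assuming each hypersphere is summarized by a center and a radius in the transformed feature space. The names (`Hypersphere`, `identify`) and the use of NumPy are assumptions of this sketch, not part of the patent.

```python
import numpy as np
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Hypersphere:
    center: np.ndarray   # sphere center c'_i in the transformed feature space
    radius: float        # sphere radius r'_i
    target_id: str       # the target object this hypersphere uniquely corresponds to

def identify(z: np.ndarray, spheres: List[Hypersphere]) -> Optional[str]:
    """Return the target id whose hypersphere contains z, or None (open-set rejection)."""
    for s in spheres:
        if np.linalg.norm(z - s.center) <= s.radius:
            return s.target_id   # z corresponds to this hypersphere
    return None                  # z lies outside every hypersphere: not a known target
```

Because the hyperspheres are spatially separated, at most one hypersphere can contain the transform feature, so the first match is unambiguous.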
It can be seen that embodiments of the present invention use a feature transformation matrix and identify the object to be identified via a plurality of separated hyperspheres. Because each hypersphere uniquely corresponds to one target object (that is, each target object lies within one hypersphere) and the hyperspheres are spatially separated from one another (that is, the target objects are spatially separated from one another), an extracted biometric feature, after transformation, either clearly belongs to one of the hyperspheres or belongs to none of them, and correspondingly either clearly belongs to one of the target objects or belongs to none of them. In this way, the accuracy of open-set biometric object recognition can be improved.
From the above description of Fig. 2, those skilled in the art will understand that the method according to embodiments of the present invention includes at least an earlier training stage and a later test stage. The test stage is illustrated by steps S202 to S206 of the method 200 described above, and the training stage will be described in detail below with reference to Fig. 3.
The training stage is used to create the plurality of hyperspheres for the plurality of target objects using the feature transformation matrix. Specifically, the training stage may include steps S208 to S214.
In step S208, a plurality of classifications are created, each of which uniquely corresponds to one of the plurality of target objects.
In step S210, each training sample of a target object of the plurality of target objects is transformed into a transform feature sample using a first feature transformation matrix, and the transform feature sample is assigned to the classification corresponding to that target object. The first feature transformation matrix is determined such that the transform feature samples are close to the centers of their classifications and the plurality of classifications are spatially far away from one another.
In step S212, the plurality of hyperspheres are created using the transform feature samples.
It should be noted that steps S208 to S212 can be understood as the first level of the training stage, namely "feature transformation and data description oriented toward hypersphere description". Specifically, the purpose of this first level is to make the original features, after feature transformation, suitable for description by a hypersphere model (steps S208 and S210), and also to use a model (for example, a Support Vector Data Description model) to build a hypersphere that encloses the homogeneous data of each classification, whose boundary function is characterized by a hypersphere (step S212).
In step S214, the plurality of hyperspheres are transformed using a second feature transformation matrix such that each hypersphere uniquely corresponds to one of the plurality of target objects and the hyperspheres are spatially separated from one another.
It should be noted that step S214 can be understood as the second level of the training stage, namely "feature transformation and data description oriented toward hypersphere separation". Specifically, the purpose of this second level is to perform a further feature transformation so that overlapping hyperspheres are separated, and then to build the data description model.
It can be seen that, through steps S208 to S214, a plurality of hyperspheres are created such that each hypersphere uniquely corresponds to one target object (that is, each target object lies within one hypersphere) and the hyperspheres are spatially separated from one another (that is, the target objects are spatially separated from one another). Therefore, an extracted biometric feature, after transformation, either clearly belongs to one of the hyperspheres or belongs to none of them, and correspondingly either clearly belongs to one of the target objects or belongs to none of them. In this way, the accuracy of open-set biometric object recognition can be improved.
Fig. 4 illustrates a method for face recognition according to an embodiment of the present invention. Steps S208 to S214 are described in detail below with reference to Fig. 4.
First, corresponding to step S208, m classifications are created for m target objects. These classifications include, for example, classification A, classification B, and classification C illustrated in the upper left corner of Fig. 4, each of which uniquely corresponds to one target object (for example, one person). Classification A, for instance, includes three facial photographs of the same person, showing that person's facial features under different shooting angles and lighting.
Second, for the m created classifications, the training samples and outliers in the biometric object recognition database are labeled, and their features are extracted, thereby obtaining the original high-dimensional features x.
Third, corresponding to step S210, the original high-dimensional features x are transformed into transform feature samples y using a first feature transformation matrix B, and each transform feature sample y is assigned to the classification (for example classification A, B, or C) corresponding to its target object. The first feature transformation matrix is determined such that the transform feature samples are close to the centers of their classifications and the classifications are spatially far away from one another. Specifically, the first feature transformation matrix B can be determined by minimizing $\frac{1}{2}\sum_{i=1}^{m}\sum_{k=1}^{r_i}\|y_k-c_i\|^2$ and maximizing $\frac{1}{2}\sum_{i,j=1}^{m}\|c_i-c_j\|^2$, where m is the number of classifications, $r_i$ is the number of transform feature samples in the i-th classification, $y_k$ is the position of the k-th transform feature sample in the i-th classification, $c_i$ is the position of the center of the i-th classification, and $c_j$ is the position of the center of the j-th classification.
According to one embodiment of the present invention, step S210 may further include determining the first feature transformation matrix B such that the outlier samples among the transform feature samples are also far away from the centers of the corresponding classifications. Specifically, the first feature transformation matrix B is determined by not only minimizing $\frac{1}{2}\sum_{i=1}^{m}\sum_{k=1}^{r_i}\|y_k-c_i\|^2$ and maximizing $\frac{1}{2}\sum_{i,j=1}^{m}\|c_i-c_j\|^2$, but also maximizing $\frac{1}{2}\sum_{l=1}^{r'}\|z_l-c\|^2$, where r' is the number of outlier samples among the transform feature samples, $z_l$ is the position of the l-th outlier sample, and c is the position of the center of the classification corresponding to the l-th outlier sample.
The problem of solving for the first feature transformation matrix B can be formulated as the following optimization problem:

$$\max\; \frac{1}{2}\sum_{i,j=1}^{m}\|c_i-c_j\|^2 + \frac{1}{2}\sum_{l=1}^{r'}\|z_l-c\|^2 - \frac{1}{2}\sum_{i=1}^{m}\sum_{k=1}^{r_i}\|y_k-c_i\|^2$$

$$\text{s.t.}\quad B^{T}B=I \qquad (1)$$

After suitable simplification, the objective can be expanded as

$$\begin{aligned}
&\frac{1}{2}\sum_{i,j=1}^{m}\|c_i-c_j\|^2 + \frac{1}{2}\sum_{l=1}^{r'}\|z_l-c\|^2 - \frac{1}{2}\sum_{i=1}^{m}\sum_{k=1}^{r_i}\|y_k-c_i\|^2 \\
&= \sum_{i,j=1}^{m}\frac{1}{r_i^{2}}\sum_{i=1}^{r_i} y_i^{T}y_i + \sum_{i,j=1}^{m}\frac{1}{r_j^{2}}\sum_{j=1}^{r_j} y_j^{T}y_j - \sum_{i,j=1}^{m}\frac{2}{r_i r_j}\sum_{i=1}^{r_i}\sum_{j=1}^{r_j} y_i^{T}y_j + \sum_{l=1}^{r'} z_l^{T}z_l + \frac{r'}{r^{2}}\sum_{k=1}^{r} y_k^{T}y_k \\
&\quad - \sum_{l=1}^{r'}\frac{2}{r}\sum_{k=1}^{r} z_l^{T}y_k - \sum_{i=1}^{m}\Bigl(1+\frac{1}{r_i}\Bigr)\sum_{k=1}^{r_i} y_k^{T}y_k + \sum_{i=1}^{m}\frac{2}{r_i}\sum_{k,p=1}^{r_i} y_k^{T}y_p \\
&= UMU^{T}+UNU^{T}-UOU^{T}+UPU^{T}+UQU^{T}-URU^{T}-USU^{T}+UVU^{T} = ULU^{T},
\end{aligned}$$

where $L = M+N-O+P+Q-R-S+V$, $M=\sum_{i,j=1}^{m}\frac{1}{2r_i^{2}}$, $N=\sum_{i,j=1}^{m}\frac{1}{2r_j^{2}}$, $O=\sum_{i,j=1}^{m}\frac{1}{r_i r_j}$, $P=\frac{1}{2}$, $Q=\frac{r'}{2r^{2}}$, $R=\sum_{i=1}^{r'}\frac{1}{r}$, $S=\frac{1}{2}\sum_{i=1}^{m}\bigl(1+\frac{1}{r_i}\bigr)$, $V=\sum_{i=1}^{m}\frac{1}{r_i}$, and $U=B^{T}x$.

Therefore, formula (1) can be converted into:

$$\max\; B^{T}xLx^{T}B \quad \text{s.t.}\; B^{T}B=I. \qquad (2)$$

The Lagrangian of formula (2) can be written as:

$$L_p = B^{T}xLx^{T}B - \lambda\,(B^{T}B-I). \qquad (3)$$

Setting $\frac{\partial L_p}{\partial B}=0$ gives:

$$xLx^{T}B = \lambda B. \qquad (4)$$

The optimal solution of formula (4) can be obtained by solving the eigenvalue problem:

$$xLx^{T}b_i = \lambda_i b_i, \qquad (5)$$

where $\lambda_i$ is the i-th largest eigenvalue and $b_i$ is the eigenvector corresponding to $\lambda_i$. Letting $B=[b_1,\dots,b_d]\in\mathbb{R}^{n\times d}$, the transformed feature matrix is $Y=B^{T}X$.
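As a minimal illustration of equation (5), the sketch below solves for B by eigendecomposition, assuming the coefficient matrix L has already been assembled from the class memberships as defined above. The function name `solve_first_transform` and the use of NumPy are illustrative assumptions, not part of the patent.

```python
import numpy as np

def solve_first_transform(x: np.ndarray, L_mat: np.ndarray, d: int) -> np.ndarray:
    """Solve x L x^T b_i = lambda_i b_i and return B = [b_1, ..., b_d] (n x d).

    x      -- original high-dimensional features, one sample per column (n x r)
    L_mat  -- the r x r coefficient matrix L built from the class memberships
    d      -- target dimensionality after the first transformation
    """
    S = x @ L_mat @ x.T                      # n x n matrix of equation (5)
    S = (S + S.T) / 2.0                      # symmetrize against numerical noise
    eigvals, eigvecs = np.linalg.eigh(S)     # eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1][:d]    # keep the d largest eigenvalues
    B = eigvecs[:, order]                    # orthonormal columns, so B^T B = I
    return B

# Usage sketch: Y = solve_first_transform(x, L_mat, d).T @ x gives the transformed samples.
```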
Fourth, corresponding to step S212, Support Vector Data Description (SVDD) is applied to each classification using its transform feature samples to create a hypersphere, so that each hypersphere contains the corresponding transform feature samples and its radius is as small as possible.
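For illustration, the sketch below fits one enclosing ball per classification. It replaces a full SVDD solver with the simplest possible description (the class mean as center and the largest sample distance as radius), which is an assumption made here for brevity rather than the SVDD formulation used in the patent; the name `fit_ball` is likewise illustrative.

```python
import numpy as np
from typing import Tuple

def fit_ball(samples: np.ndarray) -> Tuple[np.ndarray, float]:
    """Fit a crude enclosing ball for one classification.

    samples -- transform feature samples of one classification, one per row (r_i x d)
    Returns (center, radius). A real SVDD solver would instead minimize the radius
    subject to slack-penalized containment constraints.
    """
    center = samples.mean(axis=0)
    radius = float(np.max(np.linalg.norm(samples - center, axis=1)))
    return center, radius

# One ball per classification:
# balls = {label: fit_ball(Y[labels == label]) for label in np.unique(labels)}
```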
Fifth, corresponding to step S214, the second feature transformation matrix A is determined such that, when the plurality of hyperspheres are transformed using A, the hyperspheres become separated from one another and each hypersphere uniquely corresponds to one target object of the plurality of target objects. That is, the second feature transformation matrix A is determined such that, after the transformation, the distance $\|c'_i-c'_j\|$ between the center of each hypersphere and the center of any other hypersphere is greater than the sum of their radii $(r'_i+r'_j)$, and, for each hypersphere, the distance $\|z_{ik}-c'_i\|$ between a transform feature sample inside the hypersphere and the center of the hypersphere is smaller than the distance $\|v'_{im}-c'_i\|$ between a transform feature sample on the surface of the hypersphere and the center of the hypersphere.
Assume $y_{ik}$ is a sample feature vector in a hypersphere and is a D×1 column vector (D being the feature dimensionality); then the feature matrix formed by all samples of the same classification is a matrix Y. This matrix Y is the output data of the first level illustrated in Fig. 4 and, at the same time, the input data of the second level illustrated in Fig. 4. Assume the transformation matrix A is a D×d matrix; then, by matrix multiplication, $z_{ik}$ is a d×1 column vector, where d is the feature dimensionality after the feature transformation.
Specifically, the second feature transformation matrix A is determined by making $(r'_i+r'_j)^2 < \|c'_i-c'_j\|^2$ (for $1\le i,j\le m$, $i\ne j$) and $\|z_{ik}-c'_i\|^2 < \|v'_{im}-c'_i\|^2$, where $r'_i$ is the radius of the i-th hypersphere, $r'_j$ is the radius of the j-th hypersphere, $c'_i$ is the position of the center of the i-th hypersphere, $c'_j$ is the position of the center of the j-th hypersphere, $z_{ik}$ is the position of the k-th transform feature sample in the i-th hypersphere, $v'_{im}$ is the position of the m-th transform feature sample on the surface of the i-th hypersphere, and $r_i'^2=\|v'_i-c'_i\|^2$ (for $1\le i\le m$).
Fig. 5 and Fig. 6 respectively illustrate the relations between the samples before and after the feature transformation with the second feature transformation matrix A according to an embodiment of the present invention. In Fig. 5, before the feature transformation, $r_1+r_2 > l$, $\|z_{11}-c_1\| < r_1$, and $\|z_{21}-c_2\| < r_2$, where l is the distance between the two centers. In Fig. 6, after the feature transformation, $r'_1+r'_2 < l'$, $\|z_{11}-c'_1\| < r'_1$, and $\|z_{21}-c'_2\| < r'_2$.
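For illustration, the sketch below checks whether a candidate transformation A satisfies the two conditions above for a pair of hyperspheres; the helper names `spheres_separated` and `sample_contained` and the choice of inputs are assumptions made for this example only.

```python
import numpy as np

def spheres_separated(A: np.ndarray,
                      c_i: np.ndarray, v_i: np.ndarray,
                      c_j: np.ndarray, v_j: np.ndarray) -> bool:
    """Check (r'_i + r'_j)^2 < ||c'_i - c'_j||^2 after applying z = A^T y."""
    c_i_t, c_j_t = A.T @ c_i, A.T @ c_j          # transformed centers
    r_i_t = np.linalg.norm(A.T @ v_i - c_i_t)    # radius = surface sample to center
    r_j_t = np.linalg.norm(A.T @ v_j - c_j_t)
    return (r_i_t + r_j_t) ** 2 < np.linalg.norm(c_i_t - c_j_t) ** 2

def sample_contained(A: np.ndarray, y_ik: np.ndarray,
                     c_i: np.ndarray, v_i: np.ndarray) -> bool:
    """Check ||z_ik - c'_i||^2 < ||v'_i - c'_i||^2 after applying z = A^T y."""
    c_i_t = A.T @ c_i
    return (np.linalg.norm(A.T @ y_ik - c_i_t) ** 2
            < np.linalg.norm(A.T @ v_i - c_i_t) ** 2)
```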
The problem of solving for the second feature transformation matrix A can be formulated as the joint solution of the following formulas (6) and (7):

$$\bigl(\|A^{T}v_i-A^{T}c_i\| + \|A^{T}v_j-A^{T}c_j\|\bigr)^{2} < \|A^{T}c_i-A^{T}c_j\|^{2} \qquad (6)$$

$$\|A^{T}y_{ik}-A^{T}c_i\|^{2} < \|A^{T}v_i-A^{T}c_i\|^{2} \qquad (7)$$

where the second feature transformation matrix A maps $y_{ik}$, $v_i$, and $c_i$ into the low-dimensional feature space: $z_{ik}=A^{T}y_{ik}$, $v'_i=A^{T}v_i$, and $c'_i=A^{T}c_i$ (for $1\le i\le m$).

Further, formula (7) can be expressed as formula (8):

$$(y_{ik}-c_i)^{T}AA^{T}(y_{ik}-c_i) < (v_i-c_i)^{T}AA^{T}(v_i-c_i) \qquad (8)$$

Letting $y_{ik}-c_i=p$, $v_i-c_i=q_1$, and $AA^{T}=T$, formula (8) can be rewritten as:

$$F_1(A)=p^{T}Tp-q_1^{T}Tq_1 < 0. \qquad (9)$$

Meanwhile, formula (6) can be expressed as formula (10):

$$(v_i-c_i)^{T}AA^{T}(v_i-c_i)+(v_j-c_j)^{T}AA^{T}(v_j-c_j)+(v_i-c_i)^{T}AA^{T}(v_j-c_j) < (c_i-c_j)^{T}AA^{T}(c_i-c_j) \qquad (10)$$

Letting $v_j-c_j=q_2$ and $c_i-c_j=\theta$, formula (10) can be rewritten as:

$$F_2(A)=q_1^{T}Tq_1+q_2^{T}Tq_2+q_1^{T}Tq_2-\theta^{T}T\theta < 0. \qquad (11)$$

For formulas (9) and (11), let $F(A)=\mathrm{diag}\{F_1(A),F_2(A)\}$; then $F_1(A)<0$ and $F_2(A)<0$ hold if and only if $F(A)<0$. Because $F(A)$ is linear in the matrix $T=AA^{T}$, the condition $F(A)<0$ can be converted into a set of linear matrix inequalities (LMIs), i.e. a convex optimization problem, and the second feature transformation matrix A can then be solved with an iterative interior-point method.
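The patent solves these LMIs with an iterative interior-point method. As one possible realization, the sketch below poses the feasibility problem over $T=AA^{T}$ as a semidefinite program using the CVXPY library and then recovers a rank-d factor A. CVXPY, the strict-inequality margin `eps`, and the function name `solve_second_transform` are assumptions of this sketch, not part of the patent.

```python
import numpy as np
import cvxpy as cp

def solve_second_transform(inner_pairs, pair_terms, D, d, eps=1e-3):
    """Find A (D x d) such that samples stay inside their hyperspheres and spheres separate.

    inner_pairs -- list of (p, q1) with p = y_ik - c_i and q1 = v_i - c_i, for (9)
    pair_terms  -- list of (q1, q2, theta) with theta = c_i - c_j, for (11)
    """
    T = cp.Variable((D, D), PSD=True)          # T stands in for A A^T
    cons = []
    for p, q1 in inner_pairs:                  # F_1(A) < 0, enforced with margin eps
        cons.append(p @ T @ p <= q1 @ T @ q1 - eps)
    for q1, q2, theta in pair_terms:           # F_2(A) < 0, enforced with margin eps
        cons.append(q1 @ T @ q1 + q2 @ T @ q2 + q1 @ T @ q2
                    <= theta @ T @ theta - eps)
    cp.Problem(cp.Minimize(0), cons).solve()   # SDP feasibility problem
    if T.value is None:
        raise RuntimeError("LMI feasibility problem could not be solved")
    w, V = np.linalg.eigh(T.value)             # recover a rank-d factor of T
    idx = np.argsort(w)[::-1][:d]
    return V[:, idx] * np.sqrt(np.clip(w[idx], 0.0, None))
```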
Turning now to Fig. 4, it also illustrates how the solved first feature transformation matrix B and second feature transformation matrix A are used to identify an object to be identified (for example, a facial photograph of a person). Specifically, one or more facial features are extracted, for example using LBP and Gabor methods; the facial features are transformed into a transform feature z using a feature transformation matrix U, where U is equal to the product of the first feature transformation matrix B and the second feature transformation matrix A; and it is determined whether the transform feature z corresponds to one of the plurality of hyperspheres. If so, the person is identified as one of the plurality of target objects and can, for example, be accepted and classified into classification A with a typical pattern recognition method (for example, the k-nearest-neighbor algorithm); otherwise, the person is rejected.
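A compact end-to-end sketch of this test phase is given below. The `extract_features` placeholder stands in for the LBP/Gabor extraction step, and all names and the (target_id, center, radius) representation of the hyperspheres are assumptions of this example.

```python
import numpy as np
from typing import Iterable, Optional, Tuple

def extract_features(image: np.ndarray) -> np.ndarray:
    # Placeholder for the LBP/Gabor feature extraction described above;
    # here it simply flattens the image into a raw feature vector.
    return np.asarray(image, dtype=float).ravel()

def recognize(image: np.ndarray,
              B: np.ndarray,
              A: np.ndarray,
              spheres: Iterable[Tuple[str, np.ndarray, float]]) -> Optional[str]:
    """Two-stage open-set recognition: transform with U = B A, then test membership.

    spheres -- (target_id, center, radius) triples describing the separated hyperspheres.
    """
    x = extract_features(image)
    U = B @ A                 # combined transformation matrix (first stage, then second)
    z = U.T @ x               # z = A^T B^T x, the transform feature
    for target_id, center, radius in spheres:
        if np.linalg.norm(z - center) <= radius:
            return target_id  # accepted; can be further classified, e.g. with k-NN
    return None               # rejected: the object is none of the target objects
```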
The innovations of embodiments of the present invention mainly include the following two aspects:
(1) An open-set biometric object recognition method based on hypersphere separation is proposed. The method ensures that each hypersphere model contains only the samples of a single data class, that the hyperspheres do not overlap, and that the relative distances of the data samples in the original feature space are preserved.
(2) A two-level biometric object recognition mechanism oriented toward hypersphere separation is proposed. In the first level, "feature transformation and data description oriented toward hypersphere description", a feature transformation method based on hypersphere description is proposed, so that the original features, after transformation, are suitable for description by a hypersphere model; a model (for example, a Support Vector Data Description model) is then used to build a hypersphere that encloses the homogeneous data, whose boundary function is characterized by a hypersphere. Then, in the second level, "feature transformation and data description oriented toward hypersphere separation", a further feature transformation is performed to separate overlapping hyperspheres, and the data description model is then built. The object to be identified is therefore classified and identified by passing it through the models constructed in the two-level transformation.
The advantages of embodiments of the present invention mainly include the following two aspects.
First, there is no need to set a threshold manually; instead, a model is built through training to obtain the optimal values for the target objects, which overcomes the shortcoming of traditional methods that require a manually set threshold and improves the accuracy of open-set biometric object recognition.
Second, the model covers all of the training samples and has features such as strong robustness, a small amount of computation, and good separation performance.
Reference is now made to Fig. 7, which illustrates a schematic diagram of an apparatus 700 for identifying an object using a computing device according to an embodiment of the present invention. As shown in Fig. 7, according to an embodiment of the present invention, the apparatus 700 includes: a feature extraction unit 702 configured to extract one or more biometric features of an object to be identified; a feature transformation unit 704 configured to transform the one or more biometric features into a transform feature using a feature transformation matrix, wherein the feature transformation matrix has been used to pre-create a plurality of hyperspheres for a plurality of target objects, each of the hyperspheres uniquely corresponds to one of the target objects, and the hyperspheres are spatially separated from one another; and an object recognition unit 706 configured to identify whether the object to be identified is one of the target objects by determining whether the transform feature corresponds to one of the hyperspheres.
In some embodiments, the apparatus 700 further includes a hypersphere creation unit configured to create the plurality of hyperspheres for the plurality of target objects using the feature transformation matrix, the hypersphere creation unit including: a classification creation unit configured to create a plurality of classifications, each of which uniquely corresponds to one of the plurality of target objects; a training sample transformation unit configured to transform each training sample of a target object into a transform feature sample using a first feature transformation matrix and to assign the transform feature sample to the classification corresponding to that target object, wherein the first feature transformation matrix is determined such that the transform feature samples are close to the centers of their classifications and the classifications are spatially far away from one another; a creation unit configured to create the plurality of hyperspheres using the transform feature samples; and a hypersphere transformation unit configured to transform the plurality of hyperspheres using a second feature transformation matrix such that each hypersphere uniquely corresponds to one of the plurality of target objects and the hyperspheres are spatially separated from one another.
In some embodiments, the apparatus 700 further includes a first feature transformation matrix determination unit configured to determine the first feature transformation matrix by minimizing $\frac{1}{2}\sum_{i=1}^{m}\sum_{k=1}^{r_i}\|y_k-c_i\|^2$ and maximizing $\frac{1}{2}\sum_{i,j=1}^{m}\|c_i-c_j\|^2$, where m is the number of the plurality of classifications, $r_i$ is the number of transform feature samples in the i-th classification of the plurality of classifications, $y_k$ is the position of the k-th transform feature sample in the i-th classification, $c_i$ is the position of the center of the i-th classification, and $c_j$ is the position of the center of the j-th classification.
In some embodiments, the apparatus 700 further includes a second feature transformation matrix determination unit configured to determine the first feature transformation matrix such that the outlier samples among the transform feature samples are also far away from the centers of the corresponding classifications of the plurality of classifications. Specifically, the second feature transformation matrix determination unit determines the first feature transformation matrix by minimizing $\frac{1}{2}\sum_{i=1}^{m}\sum_{k=1}^{r_i}\|y_k-c_i\|^2$, maximizing $\frac{1}{2}\sum_{i,j=1}^{m}\|c_i-c_j\|^2$, and maximizing $\frac{1}{2}\sum_{l=1}^{r'}\|z_l-c\|^2$, where m is the number of the plurality of classifications, $r_i$ is the number of transform feature samples in the i-th classification of the plurality of classifications, $y_k$ is the position of the k-th transform feature sample in the i-th classification, $c_i$ is the position of the center of the i-th classification, $c_j$ is the position of the center of the j-th classification, r' is the number of outlier samples among the transform feature samples, $z_l$ is the position of the l-th outlier sample, and c is the position of the center of the classification corresponding to the l-th outlier sample.
In some embodiments, the creation unit makes each hypersphere of the plurality of hyperspheres contain the corresponding transform feature samples with a radius that is as small as possible.
In some embodiments, the apparatus 700 further includes a third feature transformation matrix determination unit configured to determine the second feature transformation matrix such that: the distance between the center of each hypersphere of the plurality of hyperspheres and the center of every other hypersphere of the plurality of hyperspheres is greater than the sum of their radii; and, for a hypersphere of the plurality of hyperspheres, the distance between a transform feature sample inside the hypersphere and the center of the hypersphere is smaller than the distance between a transform feature sample on the surface of the hypersphere and the center of the hypersphere. Specifically, the third feature transformation matrix determination unit determines the second feature transformation matrix by making $\|z_{ik}-c'_i\|^2 < \|v'_{im}-c'_i\|^2$ and $(r'_i+r'_j)^2 < \|c'_i-c'_j\|^2$ (for $1\le i,j\le m$, $i\ne j$), where $z_{ik}$ is the position of the k-th transform feature sample in the i-th hypersphere, $c'_i$ is the position of the center of the i-th hypersphere, $v'_{im}$ is the position of the m-th transform feature sample on the surface of the i-th hypersphere, $r'_i$ is the radius of the i-th hypersphere, $r'_j$ is the radius of the j-th hypersphere, and $c'_j$ is the position of the center of the j-th hypersphere.
In some embodiments, the feature transformation unit includes: a first feature transformation unit configured to transform the one or more biometric features into a second transform feature using the first feature transformation matrix; and a second feature transformation unit configured to transform the second transform feature into the transform feature using the second feature transformation matrix.
It should be understood that the features described above with reference to Figs. 1 to 6 are equally applicable to the apparatus 700 and are therefore not repeated here.
It should also be understood that, for clarity, some optional units and sub-units of the apparatus 700 are not shown in Fig. 7. In addition, the term "unit" used herein may refer to either a hardware module or a software unit module. Accordingly, the apparatus 700 can be implemented in various ways. For example, in some embodiments, the apparatus 700 may be implemented partly or wholly using software and/or firmware, for example implemented as a computer program product embodied on a computer-readable medium. Alternatively or additionally, the apparatus 700 may be implemented partly or wholly on the basis of hardware, for example implemented as an integrated circuit (IC), an application-specific integrated circuit (ASIC), a system on chip (SoC), or a field programmable gate array (FPGA). The scope of the present invention is not limited in this respect.
Some exemplary embodiments of the present invention have been described above for illustrative purposes only. Embodiments of the present invention can be implemented in hardware, software, or a combination of software and hardware. The hardware part can be implemented using dedicated logic; the software part can be stored in a memory and executed by an appropriate instruction execution system, for example a microprocessor or specially designed hardware. In particular, the methods described above with reference to Figs. 1 to 6 can be implemented as a computer program product for identifying an object using a computing device, the computer program product being tangibly stored on a non-transitory computer-readable medium and including machine-executable instructions which, when executed, cause a machine to perform the steps of the method 200.
Those skilled in the art will understand that the above systems and methods can be implemented using computer-executable instructions and/or processor control code, for example provided on a carrier medium such as a magnetic disk, CD, or DVD-ROM, on a programmable memory such as a read-only memory (firmware), or on a data carrier such as an optical or electrical signal carrier. The systems of the present invention can be implemented by hardware circuitry such as very-large-scale integrated circuits or gate arrays, semiconductors such as logic chips and transistors, or programmable hardware devices such as field programmable gate arrays and programmable logic devices; they can also be implemented by software executed by various types of processors, or by a combination of such hardware circuitry and software, for example firmware.
It should be noted that although several units of the apparatus have been mentioned in the detailed description above, this division is merely exemplary and not mandatory. In practice, the features and functions of two or more units described above may be embodied in one unit according to embodiments of the present invention; conversely, the features and functions of one unit described above may be further divided and embodied by multiple units. Similarly, although the operations of the method of the present invention are described in a particular order in the drawings, this does not require or imply that these operations must be performed in that particular order, or that all of the illustrated operations must be performed to achieve the desired results. On the contrary, the steps described in the flowcharts may be executed in a different order; additionally or alternatively, some steps may be omitted, multiple steps may be combined into one step for execution, and/or one step may be decomposed into multiple steps for execution.
Although the present invention has been described with reference to several specific embodiments, it should be understood that the present invention is not limited to the specific embodiments disclosed. The present invention is intended to cover the various modifications and equivalent arrangements included within the spirit and scope of the appended claims, the scope of which is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.

Claims (20)

1. A method for identifying an object using a computing device, comprising:
extracting one or more biometric features of an object to be identified;
transforming the one or more biometric features into a transform feature using a feature transformation matrix, wherein the feature transformation matrix has been used to pre-create a plurality of hyperspheres for a plurality of target objects, each of the plurality of hyperspheres uniquely corresponds to one target object of the plurality of target objects, and the plurality of hyperspheres are spatially separated from one another; and
identifying whether the object to be identified is one target object of the plurality of target objects by determining whether the transform feature corresponds to one hypersphere of the plurality of hyperspheres.
2. The method according to claim 1, further comprising creating the plurality of hyperspheres for the plurality of target objects using the feature transformation matrix, including:
creating a plurality of classifications, wherein each classification of the plurality of classifications uniquely corresponds to one target object of the plurality of target objects;
transforming each training sample of a target object of the plurality of target objects into a transform feature sample using a first feature transformation matrix and assigning the transform feature sample to the classification corresponding to the target object, wherein the first feature transformation matrix is determined such that the transform feature samples are close to the centers of their classifications and the plurality of classifications are spatially far away from one another;
creating the plurality of hyperspheres using the transform feature samples; and
transforming the plurality of hyperspheres using a second feature transformation matrix such that each hypersphere of the plurality of hyperspheres uniquely corresponds to one target object of the plurality of target objects and the plurality of hyperspheres are spatially separated from one another.
3. The method according to claim 2, further comprising: determining the first feature transformation matrix by minimizing $\frac{1}{2}\sum_{i=1}^{m}\sum_{k=1}^{r_i}\|y_k-c_i\|^2$ and maximizing $\frac{1}{2}\sum_{i,j=1}^{m}\|c_i-c_j\|^2$, where m is the number of the plurality of classifications, $r_i$ is the number of transform feature samples in the i-th classification of the plurality of classifications, $y_k$ is the position of the k-th transform feature sample in the i-th classification, $c_i$ is the position of the center of the i-th classification, and $c_j$ is the position of the center of the j-th classification.
4. The method according to claim 2, further comprising: determining the first feature transformation matrix such that the outlier samples among the transform feature samples are also far away from the centers of the corresponding classifications of the plurality of classifications.
5. The method according to claim 4, wherein the first feature transformation matrix is determined by minimizing $\frac{1}{2}\sum_{i=1}^{m}\sum_{k=1}^{r_i}\|y_k-c_i\|^2$, maximizing $\frac{1}{2}\sum_{i,j=1}^{m}\|c_i-c_j\|^2$, and maximizing $\frac{1}{2}\sum_{l=1}^{r'}\|z_l-c\|^2$, where m is the number of the plurality of classifications, $r_i$ is the number of transform feature samples in the i-th classification of the plurality of classifications, $y_k$ is the position of the k-th transform feature sample in the i-th classification, $c_i$ is the position of the center of the i-th classification, $c_j$ is the position of the center of the j-th classification, r' is the number of outlier samples among the transform feature samples, $z_l$ is the position of the l-th outlier sample, and c is the position of the center of the classification corresponding to the l-th outlier sample.
6. The method according to claim 2, wherein the plurality of hyperspheres are created using the transform feature samples such that each hypersphere of the plurality of hyperspheres contains the corresponding transform feature samples and the radius of the hypersphere is as small as possible.
7. The method according to claim 2, further comprising determining the second feature transformation matrix such that:
the distance between the center of each hypersphere of the plurality of hyperspheres and the center of every other hypersphere of the plurality of hyperspheres is greater than the sum of their radii; and
for a hypersphere of the plurality of hyperspheres, the distance between a transform feature sample inside the hypersphere and the center of the hypersphere is smaller than the distance between a transform feature sample on the surface of the hypersphere and the center of the hypersphere.
8. The method according to claim 7, wherein the second feature transformation matrix is determined by making $\|z_{ik}-c'_i\|^2 < \|v'_{im}-c'_i\|^2$ and $(r'_i+r'_j)^2 < \|c'_i-c'_j\|^2$ (for $1\le i,j\le m$, $i\ne j$), where $z_{ik}$ is the position of the k-th transform feature sample in the i-th hypersphere, $c'_i$ is the position of the center of the i-th hypersphere, $v'_{im}$ is the position of the m-th transform feature sample on the surface of the i-th hypersphere, $r'_i$ is the radius of the i-th hypersphere, $r'_j$ is the radius of the j-th hypersphere, and $c'_j$ is the position of the center of the j-th hypersphere.
9. The method according to any one of claims 2 to 8, wherein transforming the one or more biometric features into the transform feature using the feature transformation matrix comprises:
transforming the one or more biometric features into a second transform feature using the first feature transformation matrix; and
transforming the second transform feature into the transform feature using the second feature transformation matrix.
10. An apparatus for identifying an object using a computing device, comprising:
a feature extraction unit configured to extract one or more biometric features of an object to be identified;
a feature transformation unit configured to transform the one or more biometric features into a transform feature using a feature transformation matrix, wherein the feature transformation matrix has been used to pre-create a plurality of hyperspheres for a plurality of target objects, each of the plurality of hyperspheres uniquely corresponds to one target object of the plurality of target objects, and the plurality of hyperspheres are spatially separated from one another; and
an object recognition unit configured to identify whether the object to be identified is one target object of the plurality of target objects by determining whether the transform feature corresponds to one hypersphere of the plurality of hyperspheres.
11. The apparatus according to claim 10, further comprising a hypersphere creation unit configured to create the plurality of hyperspheres for the plurality of target objects using the feature transformation matrix, the hypersphere creation unit including:
a classification creation unit configured to create a plurality of classifications, wherein each classification of the plurality of classifications uniquely corresponds to one target object of the plurality of target objects;
a training sample transformation unit configured to transform each training sample of a target object of the plurality of target objects into a transform feature sample using a first feature transformation matrix and to assign the transform feature sample to the classification corresponding to the target object, wherein the first feature transformation matrix is determined such that the transform feature samples are close to the centers of their classifications and the plurality of classifications are spatially far away from one another;
a creation unit configured to create the plurality of hyperspheres using the transform feature samples; and
a hypersphere transformation unit configured to transform the plurality of hyperspheres using a second feature transformation matrix such that each hypersphere of the plurality of hyperspheres uniquely corresponds to one target object of the plurality of target objects and the plurality of hyperspheres are spatially separated from one another.
12. The apparatus according to claim 11, further comprising a first feature transformation matrix determination unit configured to determine the first feature transformation matrix by minimizing $\frac{1}{2}\sum_{i=1}^{m}\sum_{k=1}^{r_i}\|y_k-c_i\|^2$ and maximizing $\frac{1}{2}\sum_{i,j=1}^{m}\|c_i-c_j\|^2$, where m is the number of the plurality of classifications, $r_i$ is the number of transform feature samples in the i-th classification of the plurality of classifications, $y_k$ is the position of the k-th transform feature sample in the i-th classification, $c_i$ is the position of the center of the i-th classification, and $c_j$ is the position of the center of the j-th classification.
13. The apparatus according to claim 11, further comprising a second feature transformation matrix determination unit configured to determine the first feature transformation matrix such that the outlier samples among the transform feature samples are also far away from the centers of the corresponding classifications of the plurality of classifications.
14. The apparatus according to claim 13, wherein the second feature transformation matrix determination unit is configured to determine the first feature transformation matrix by minimizing $\frac{1}{2}\sum_{i=1}^{m}\sum_{k=1}^{r_i}\|y_k-c_i\|^2$, maximizing $\frac{1}{2}\sum_{i,j=1}^{m}\|c_i-c_j\|^2$, and maximizing $\frac{1}{2}\sum_{l=1}^{r'}\|z_l-c\|^2$, where m is the number of the plurality of classifications, $r_i$ is the number of transform feature samples in the i-th classification of the plurality of classifications, $y_k$ is the position of the k-th transform feature sample in the i-th classification, $c_i$ is the position of the center of the i-th classification, $c_j$ is the position of the center of the j-th classification, r' is the number of outlier samples among the transform feature samples, $z_l$ is the position of the l-th outlier sample, and c is the position of the center of the classification corresponding to the l-th outlier sample.
15. The apparatus according to claim 11, wherein the creation unit is configured to make each hypersphere of the plurality of hyperspheres contain the corresponding transform feature samples with a radius that is as small as possible.
16. The apparatus according to claim 11, further comprising a third feature transformation matrix determination unit configured to determine the second feature transformation matrix such that:
the distance between the center of each hypersphere of the plurality of hyperspheres and the center of every other hypersphere of the plurality of hyperspheres is greater than the sum of their radii; and
for a hypersphere of the plurality of hyperspheres, the distance between a transform feature sample inside the hypersphere and the center of the hypersphere is smaller than the distance between a transform feature sample on the surface of the hypersphere and the center of the hypersphere.
17. The apparatus according to claim 16, wherein the third feature transformation matrix determination unit is configured to determine the second feature transformation matrix by making $\|z_{ik}-c'_i\|^2 < \|v'_{im}-c'_i\|^2$ and $(r'_i+r'_j)^2 < \|c'_i-c'_j\|^2$ (for $1\le i,j\le m$, $i\ne j$), where $z_{ik}$ is the position of the k-th transform feature sample in the i-th hypersphere, $c'_i$ is the position of the center of the i-th hypersphere, $v'_{im}$ is the position of the m-th transform feature sample on the surface of the i-th hypersphere, $r'_i$ is the radius of the i-th hypersphere, $r'_j$ is the radius of the j-th hypersphere, and $c'_j$ is the position of the center of the j-th hypersphere.
18. The apparatus according to any one of claims 11 to 17, wherein the feature transformation unit includes:
a first feature transformation unit configured to transform the one or more biometric features into a second transform feature using the first feature transformation matrix; and
a second feature transformation unit configured to transform the second transform feature into the transform feature using the second feature transformation matrix.
19. A computer program product for identifying an object using a computing device, the computer program product being tangibly stored on a non-transitory computer-readable medium and including machine-executable instructions which, when executed, cause a machine to perform the steps of the method according to any one of claims 1 to 9.
20. An electronic device comprising a memory and a processor, the processor being configured to perform the method according to any one of claims 1 to 9.
CN201410601359.4A 2014-10-30 2014-10-30 Method and device for identifying object through calculation device and electronic equipment Pending CN105654100A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410601359.4A CN105654100A (en) 2014-10-30 2014-10-30 Method and device for identifying object through calculation device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410601359.4A CN105654100A (en) 2014-10-30 2014-10-30 Method and device for identifying object through calculation device and electronic equipment

Publications (1)

Publication Number Publication Date
CN105654100A true CN105654100A (en) 2016-06-08

Family

ID=56483194

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410601359.4A Pending CN105654100A (en) 2014-10-30 2014-10-30 Method and device for identifying object through calculation device and electronic equipment

Country Status (1)

Country Link
CN (1) CN105654100A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1446344A (en) * 2000-06-19 2003-10-01 科雷洛吉克系统公司 Heuristic method of classification
CN101127029A (en) * 2007-08-24 2008-02-20 复旦大学 Method for training SVM classifier in large scale data classification
CN101809574A (en) * 2007-09-28 2010-08-18 日本电气株式会社 Method for classifying data and device for classifying data
WO2012039719A1 (en) * 2010-09-24 2012-03-29 Hewlett-Packard Development Company, L.P. Image registration
CN103295024A (en) * 2012-02-29 2013-09-11 佳能株式会社 Method and device for classification and object detection and image shoot and process equipment
CN104112143A (en) * 2014-07-23 2014-10-22 大连民族学院 Weighted hyper-sphere support vector machine algorithm based image classification method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
方景龙 (Fang Jinglong): "基于SVDD的单/多示例学习研究" ("Research on single/multi-instance learning based on SVDD"), 《中国博士学位论文全文数据库 信息科技辑》 (China Doctoral Dissertations Full-text Database, Information Science and Technology) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109612961A (en) * 2018-12-13 2019-04-12 温州大学 The opener recognition methods of the micro- plastics of coastal environment
CN109612961B (en) * 2018-12-13 2021-06-25 温州大学 Open set identification method of coastal environment micro-plastic

Similar Documents

Publication Publication Date Title
Celik et al. Content based image retrieval with sparse representations and local feature descriptors: A comparative study
US20200250461A1 (en) Target detection method, apparatus, and system
Paliwal et al. A tri-gram based feature extraction technique using linear probabilities of position specific scoring matrix for protein fold recognition
Wang et al. MARCH: Multiscale-arch-height description for mobile retrieval of leaf images
Huang et al. Multiple morphological profiles from multicomponent-base images for hyperspectral image classification
Inoue et al. A fast and accurate video semantic-indexing system using fast MAP adaptation and GMM supervectors
CN109426781A (en) Construction method, face identification method, device and the equipment of face recognition database
Iakovidou et al. Localizing global descriptors for content-based image retrieval
Bouadjenek et al. Histogram of Oriented Gradients for writer's gender, handedness and age prediction
Sicre et al. Identity documents classification as an image classification problem
CN112733590A (en) Pedestrian re-identification method based on second-order mixed attention
Wang et al. Multi-order co-occurrence activations encoded with Fisher Vector for scene character recognition
Xiong et al. RGB-D scene recognition via spatial-related multi-modal feature learning
Xu et al. Discriminative analysis for symmetric positive definite matrices on lie groups
Nguyen-Quoc et al. A revisit histogram of oriented descriptor for facial color image classification based on fusion of color information
Amato et al. Aggregating binary local descriptors for image retrieval
Li et al. Discriminative Hough context model for object detection
CN105654100A (en) Method and device for identifying object through calculation device and electronic equipment
Ambai et al. SPADE: scalar product accelerator by integer decomposition for object detection
Lahrache et al. Bag‐of‐features for image memorability evaluation
Karczmarek et al. Chain code-based local descriptor for face recognition
CN113239237A (en) Cross-media big data searching method and device
Huang et al. Local visual similarity descriptor for describing local region
Santoso et al. Learning-based human detection applied to RGB-D images
Gao et al. Scene text recognition by learning co‐occurrence of strokes based on spatiality embedded dictionary

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20160608