CN102521590A - Method for identifying left and right palm prints based on directions - Google Patents
- Publication number
- CN102521590A CN102521590A CN2011103723653A CN201110372365A CN102521590A CN 102521590 A CN102521590 A CN 102521590A CN 2011103723653 A CN2011103723653 A CN 2011103723653A CN 201110372365 A CN201110372365 A CN 201110372365A CN 102521590 A CN102521590 A CN 102521590A
- Authority
- CN
- China
- Prior art keywords
- image
- variance
- histogram
- right sides
- foreground area
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- Image Analysis (AREA)
- Collating Specific Patterns (AREA)
Abstract
The invention discloses a method for identifying left and right palm prints based on direction. The method comprises the following steps: (1) dividing a palm print image into foreground blocks containing the palm print and background blocks; (2) performing a two-dimensional Fourier transform on each foreground block to obtain the direction θ(x, y) of the foreground image; (3) computing the direction histogram g(i) of the foreground area from θ(x, y) and applying one-dimensional mean smoothing to g(i) to obtain a smoothed histogram; (4) computing the heights of all peaks in the smoothed histogram and selecting the direction of the highest peak as the direction of the outer palm print region; (5) deciding left versus right palm from that direction, judging a left palm when the direction is greater than 90 degrees and a right palm when it is less than 90 degrees. Because the method decides left versus right from the palm print direction alone, it is highly accurate (98.92%), effectively halves the number of palm prints that must be matched, greatly improves the efficiency of the whole recognition process, and also improves recognition accuracy.
Description
Technical field
The present invention relates to palm print recognition technology, and in particular to a direction-based method for identifying left and right palm prints. It belongs to the fields of digital image processing and biometric identification.
Background technology
As an important biometric identification technology, palm print recognition has recently begun to attract growing interest. Compared with fingerprint recognition, which has a long research history, palm print recognition is still in its early stages in both the depth and the breadth of research. However, the distinctive characteristics of palm prints give the technology great potential, and research on it is becoming increasingly thorough.
A high-resolution palm print image has much in common with a fingerprint image, so techniques similar to fingerprint processing and matching are generally used to recognize it. However, a palm print covers a far larger area and contains far more feature points than a fingerprint, so recognition is much slower. Palm prints also have a property that fingerprints lack: a palm print comes from either a left or a right hand. If left and right palms can be distinguished efficiently and accurately, the number of palm prints that must be matched is halved. Discriminating left from right before matching therefore effectively reduces the number of candidates, greatly improves the efficiency of the whole recognition process, and also improves recognition accuracy. For these reasons, left/right palm discrimination is of great practical significance.
Summary of the invention
The objective of the invention is to propose a fast and effective left/right palm discrimination method for automatic palm print recognition systems.
The invention distinguishes left from right palms using the palm print direction; the method is simple, efficient, and highly accurate. The underlying idea is this: the outer region of a palm print generally has a fairly consistent direction (as shown in Fig. 1), while the directions in other regions are comparatively disordered. After computing a histogram of the directions over the whole palm print, the direction corresponding to the highest peak is therefore generally the direction of the outer region, so the principal direction of the outer region can be obtained from the direction histogram. Because the captured palm print images have no obvious rotation, the outer-region directions of left and right palms differ markedly: with the positive x-axis pointing right and the positive y-axis pointing down, the outer-region direction of a left palm is about 135 degrees, and that of a right palm about 45 degrees. The principal direction of the outer region can therefore be used to decide left versus right.
The technical scheme of the invention is a direction-based method for identifying left and right palm prints, comprising the steps:
1) Divide the palm print image into a foreground area containing the palm print and a background area (everything outside the palm). The foreground is darker than the background. The purpose of this segmentation is to remove the influence of the background on the direction statistics: once the foreground and background areas are separated, the subsequent direction computation is carried out only on the foreground.
2) Perform a two-dimensional Fourier transform on each foreground block of the image to obtain the direction θ(x, y) of the foreground image.
3) Compute the direction histogram g(i) of the foreground area from the direction θ(x, y) of the foreground image, and apply one-dimensional mean smoothing to g(i) to obtain a smoothed histogram.
4) Compute the positions of all peaks in the smoothed histogram and select the direction of the highest peak as the direction of the outer palm print region.
5) Decide left versus right palm from the outer-region direction: a direction greater than 90 degrees is judged a left palm, and less than 90 degrees a right palm.
Further, the foreground and background areas of the palm print image are segmented using the gray-level variance as the feature.
The palm print image is segmented as follows:

A. Divide the palm print image into w × w image blocks;

B. Compute the gray-level variance within each block and use it as the block's feature, which yields a gray variance image. Using the gray-level variance as the feature allows efficient and accurate foreground/background segmentation.

C. Compute a segmentation threshold with an automatic threshold selection method; blocks whose variance is greater than or equal to the threshold are classified as the foreground area, and blocks whose variance is less than the threshold as the background area.
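The block-variance computation in steps A–C can be sketched as follows. This is a minimal NumPy sketch, not the patented implementation; the function name, the block size `w = 8`, and the small epsilon guarding the normalization are illustrative assumptions.

```python
import numpy as np

def variance_image(img, w=8):
    """Divide img into non-overlapping w x w blocks and return the
    per-block gray-level variance, normalized to integers in [0, 255]."""
    h, wd = img.shape
    # crop so both dimensions are multiples of w (assumed boundary handling)
    img = img[: h - h % w, : wd - wd % w].astype(np.float64)
    blocks = img.reshape(img.shape[0] // w, w, img.shape[1] // w, w)
    v = blocks.var(axis=(1, 3))          # v(x, y): variance of each block
    vmin, vmax = v.min(), v.max()
    # normalize to [0, 255] and round down, mirroring the patent's formula
    return np.floor(255.0 * (v - vmin) / (vmax - vmin + 1e-12)).astype(np.uint8)
```

Textured palm print blocks have high variance and map to bright values; flat background blocks map to dark values, which is what makes a single threshold sufficient in step C.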
The gray variance image of the palm print image is computed as follows:
First compute the gray-level mean m(x, y) of each image block:

m(x, y) = (1/w^2) * Σ_{(i, j) in block(x, y)} I(i, j)

where (x, y) are the coordinates in the variance image and I(i, j) is the gray level at point (i, j).
Then compute the variance image v(x, y):

v(x, y) = (1/w^2) * Σ_{(i, j) in block(x, y)} (I(i, j) - m(x, y))^2
Normalize the variance image to [0, 255] and convert it to integers:

v'(x, y) = [255 * (v(x, y) - v_min) / (v_max - v_min)]

where v_max is the maximum variance, v_min is the minimum variance, and [] denotes rounding down.
The automatic threshold selection uses the OTSU algorithm.
The segmentation threshold is computed as follows:

In the first step, compute the normalized histogram of the variance image:

p_i = card({(x, y) | v'(x, y) = i}) / M,  i = 0, 1, ..., 255

where card() denotes the number of points in a set and M is the total number of blocks.

In the second step, compute the between-class variance for each candidate threshold t:

g(t) = w0*(p0 - p)^2 + w1*(p1 - p)^2

where w0 and w1 are the proportions of blocks below and at or above t, p0 and p1 are the mean values of the two classes, and p is the overall mean.

The final threshold is the t that maximizes g(t):

T = argmax_{0 <= t <= 255} g(t)
The method of segmenting the foreground and background areas according to the threshold is:

Using the obtained threshold T, binarize the variance image to obtain the segmentation map:

B(x, y) = 1 if v'(x, y) >= T, and B(x, y) = 0 otherwise

where 1 denotes the foreground area and 0 the background area.
The direction of each image block is obtained as follows:

A. Divide the palm print image into w' × w' image blocks and apply a two-dimensional Fourier transform to each block that contains foreground.

B. Set the responses near the origin of the 2-D Fourier transform domain to 0, to remove the influence of the low-frequency components of the block.

C. Find the position (u0, v0) of the point with the maximum response, where u and v are the coordinates of the Fourier transform domain, with the positive u-axis pointing right and the positive v-axis pointing up.

D. Compute the direction of the image block, quantized to the range 0 to 179 degrees, where the direction coordinate axes take the positive x-axis pointing right and the positive y-axis pointing down.
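Steps A–D can be sketched as follows. This is a NumPy sketch under stated assumptions: image row indices grow downward, so the sign conventions may differ slightly from the patent's u/v axes, and the low-frequency cut-off radius `r = 2` is an illustrative choice; the exact direction formula is not reproduced in the source, so the perpendicular-to-dominant-frequency relation used here is a standard assumption.

```python
import numpy as np

def block_direction(block):
    """Estimate the dominant texture direction of a block, in [0, 180) degrees."""
    # 2-D FFT of the block; shift so the origin (DC) sits at the centre
    F = np.fft.fftshift(np.fft.fft2(block))
    mag = np.abs(F)
    c0, c1 = block.shape[0] // 2, block.shape[1] // 2
    r = 2  # assumed cut-off: zero a small neighbourhood of the origin
    mag[c0 - r:c0 + r + 1, c1 - r:c1 + r + 1] = 0
    # position of the strongest remaining response
    i0, j0 = np.unravel_index(np.argmax(mag), mag.shape)
    v0, u0 = i0 - c0, j0 - c1          # frequency offsets (rows, cols)
    # ridge direction is perpendicular to the dominant frequency vector
    return (np.degrees(np.arctan2(v0, u0)) + 90.0) % 180.0
```

Quantizing the returned angle with `int(round(theta)) % 180` gives the 0–179 degree bins used by the direction histogram.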
The statistical histogram is computed from the block directions as:
g(i)=card({(x,y)|θ(x,y)=i}),i=0,1,2,...,179
where card() denotes the number of points in a set and θ(x, y) is the direction of the image block at (x, y).
The histogram is smoothed with a one-dimensional mean filter:

g'(i) = (1/(2N+1)) * Σ_{j=-N..N} g(k),  k = i + j

where N is the smoothing parameter and k, i, j are index variables.
A peak of the smoothed histogram is a bin i satisfying:

g'(i) >= g'(i + j) for all j with |j| <= N'

where N' is the width parameter for judging peaks.
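The histogram smoothing and peak selection can be sketched together as follows. This is a sketch assuming the 180 bins wrap around, since directions are periodic modulo 180 degrees; the patent does not state the boundary handling, so the circular wrap is an assumption, and the function names are illustrative.

```python
import numpy as np

def smooth_hist(g, N=5):
    """One-dimensional mean smoothing over a (2N+1)-bin window,
    with indices taken modulo 180 (assumed circular boundary)."""
    g = np.asarray(g, dtype=np.float64)
    idx = (np.arange(180)[:, None] + np.arange(-N, N + 1)) % 180
    return g[idx].mean(axis=1)

def highest_peak(gs, Nprime=5):
    """Return the bin that is >= all neighbours within +/- Nprime bins
    and has the largest height among all such peaks."""
    peaks = [i for i in range(180)
             if all(gs[i] >= gs[(i + j) % 180]
                    for j in range(-Nprime, Nprime + 1))]
    return max(peaks, key=lambda i: gs[i])
```

The bin returned by `highest_peak` is the principal direction of the outer palm print region.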
Beneficial effects:

The method of the invention identifies left and right palm prints quickly and effectively, with an identification accuracy of 98.92%. Discriminating left from right palms before matching effectively reduces the number of palm prints that must be matched, greatly improves the efficiency of the whole recognition process, and also improves recognition accuracy. Left/right palm discrimination is therefore of great practical significance.
Description of drawings
Fig. 1 is a schematic diagram of a palm print image.

Fig. 2 is a flow chart of dividing the palm print image into foreground and background areas according to the method of the invention;

Fig. 3 is a flow chart of applying the Fourier transform to an image block according to the method of the invention;

Fig. 4 is a flow chart of computing the direction image according to the method of the invention;

Fig. 5 is a flow chart of computing the direction histogram according to the method of the invention;

Fig. 6 is a schematic diagram of histogram peak selection.
Detailed description of the embodiments
The invention is further described below through a specific embodiment with reference to the accompanying drawings.

Fig. 1 shows a palm print image; the arrows indicate the direction of the outer palm print region, where the palm print direction can be seen to be essentially consistent.
The embodiment of the invention comprises the following steps:
1. First, segment the palm print image (Fig. 2(a)):

1) Variance computation: divide the palm print image into non-overlapping w × w blocks, where w can be chosen according to the ridge-line width in the palm print image.
Compute the gray-level variance of each block and let that variance represent the block; all the block variances together form the variance image:

v(x, y) = (1/w^2) * Σ_{(i, j) in block(x, y)} (I(i, j) - m(x, y))^2

where I(i, j) is the gray level at point (i, j), m(x, y) is the mean gray level of the block, and (x, y) are the coordinates in the variance image.
Since image gray levels generally lie in 0-255, for ease of computation and representation the variance image is normalized to [0, 255] and converted to integers:

v'(x, y) = [255 * (v(x, y) - v_min) / (v_max - v_min)]

where v_max is the maximum variance, v_min is the minimum variance, and [] denotes rounding down.
Fig. 2(b) shows the variance image obtained from Fig. 2(a) by the above computation.
2) Automatic threshold selection: the threshold is selected automatically with the OTSU algorithm, specifically:

First compute the normalized histogram of the variance image:

p_i = card({(x, y) | v'(x, y) = i}) / M,  i = 0, 1, ..., 255

where card() denotes the number of points in a set and M is the total number of blocks.

Then compute the between-class variance:

g(t) = w0*(p0 - p)^2 + w1*(p1 - p)^2

where w0 and w1 are the proportions of blocks below and at or above t, p0 and p1 are the mean values of the two classes, and p is the overall mean.

The final segmentation threshold is:

T = argmax_{0 <= t <= 255} g(t)
3) Foreground/background segmentation: binarize the variance image with the obtained segmentation threshold T to obtain the segmentation map:

B(x, y) = 1 if v'(x, y) >= T, and B(x, y) = 0 otherwise

where 1 denotes the foreground and 0 the background.
Fig. 2(c) shows a segmentation result, in which white is the foreground area and black is the background area.

2. Direction computation:
Divide the whole palm print into w' × w' blocks; because the Fourier transform requires the block size to be a power of 2, the concrete implementation uses 32 × 32. Apply a 2-D Fourier transform to each block that falls in the foreground area. Set the responses near the origin of the 2-D Fourier transform domain to 0 to remove the influence of the low-frequency components of the block, then find the position of the maximum response, denoted (u0, v0) (with the positive u-axis pointing right and the positive v-axis pointing up). The direction of the block is then computed from (u0, v0), where the direction coordinate axes take the positive x-axis pointing right and the positive y-axis pointing down.
Fig. 3 shows the flow of the Fourier transform of an image block: Fig. 3(a) is an image block and Fig. 3(b) is the block after the Fourier transform, where the circle in Fig. 3(b) marks the position of the maximum response. Fig. 4 shows the directions of a palm print image: Fig. 4(a) is the original image and Fig. 4(b) shows its directions.
3. Direction histogram:
Compute the statistical histogram of the directions θ(x, y) obtained above:

g(i)=card({(x,y)|θ(x,y)=i}),i=0,1,2,...,179

where card() denotes the number of points in a set.
Then smooth the histogram with a one-dimensional mean filter:

g'(i) = (1/(2N+1)) * Σ_{j=-N..N} g(i + j)

where N is the smoothing parameter, taken as 5 in the concrete implementation.
Fig. 5 shows the direction image of a palm print and its corresponding smoothed direction histogram: Fig. 5(a) shows the palm print directions and Fig. 5(b) the smoothed direction histogram.
4. Computation of the outer principal direction:

Compute the positions of all peaks in the smoothed histogram, where a peak is defined as a bin i satisfying:

g'(i) >= g'(i + j) for all j with |j| <= N'

where N' is the width parameter for judging peaks. If this parameter is too large or too small, peaks in the histogram are judged inaccurately; through experiments it is taken as 5.
Find all peaks in the histogram according to the above definition and select the highest one; its corresponding direction is the direction of the outer region. Fig. 6 shows a peak-detection example: the round dots and the square mark the histogram peaks, the square marks the highest peak, which in this figure corresponds to 33 degrees.
5. Left/right palm discrimination

Once the principal direction of the outer region has been obtained, left and right palms can be distinguished: a principal direction greater than 90 degrees is judged a left palm, and less than 90 degrees a right palm. In this embodiment the outer-region principal direction is 33 degrees, so the palm is judged to be a right palm.
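The final decision step can be put in code form as follows. This is a trivial sketch; the behaviour at exactly 90 degrees is unspecified in the source, so the function reports that case as ambiguous.

```python
def classify_palm(theta_outer):
    """Decide left vs right palm from the outer-region principal direction,
    following the stated rule: > 90 deg -> left, < 90 deg -> right."""
    if theta_outer > 90:
        return "left"
    if theta_outer < 90:
        return "right"
    return "ambiguous"  # exactly 90 deg is not covered by the stated rule
```

For the embodiment's 33-degree example this returns "right", matching the judgment in the text.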
Claims (10)
1. A method for identifying left and right palm prints based on direction, comprising the steps of:

(1) dividing a palm print image into foreground blocks containing the palm print and background blocks;

(2) performing a two-dimensional Fourier transform on each foreground block of the image to obtain the direction θ(x, y) of the foreground image;

(3) computing the direction histogram g(i) of the foreground area from the direction θ(x, y) of the foreground image, and applying one-dimensional mean smoothing to g(i) to obtain a smoothed histogram;

(4) computing the heights of all peaks in the smoothed histogram and selecting the direction of the highest peak as the direction of the outer palm print region;

(5) deciding left versus right palm from the outer-region direction: a direction greater than 90 degrees is judged a left palm, and less than 90 degrees a right palm.
2. The method for identifying left and right palm prints based on direction according to claim 1, characterized in that the foreground and background areas of the palm print image are segmented using the gray-level variance as the feature.
3. The method for identifying left and right palm prints based on direction according to claim 1 or 2, characterized in that the palm print image is segmented by:

A. dividing the palm print image into w × w image blocks;

B. computing the gray-level variance of each image block to obtain a gray variance image;

C. computing a segmentation threshold T with an automatic threshold selection method, classifying image blocks whose variance is greater than or equal to T as the foreground area and image blocks whose variance is less than T as the background area.
4. The method for identifying left and right palm prints based on direction according to claim 3, characterized in that the gray variance image is computed by:

the first step, computing the gray-level mean m(x, y) of each image block:

m(x, y) = (1/w^2) * Σ_{(i, j) in block(x, y)} I(i, j)

where (x, y) are the coordinates in the variance image and I(i, j) is the gray level at point (i, j);

the second step, computing the variance image v(x, y):

v(x, y) = (1/w^2) * Σ_{(i, j) in block(x, y)} (I(i, j) - m(x, y))^2;

the third step, normalizing v(x, y) to [0, 255] and converting to integers:

v'(x, y) = [255 * (v(x, y) - v_min) / (v_max - v_min)]

where v_max is the maximum variance, v_min is the minimum variance, and [] denotes rounding down.
5. The method for identifying left and right palm prints based on direction according to claim 3, characterized in that the segmentation threshold is computed by:

the first step, computing the normalized histogram of the variance image:

p_i = card({(x, y) | v'(x, y) = i}) / M,  i = 0, 1, ..., 255

where card() denotes the number of points in a set and M is the total number of blocks;

the second step, computing the between-class variance:

g(t) = w0*(p0 - p)^2 + w1*(p1 - p)^2

where w0 and w1 are the proportions of blocks below and at or above t, p0 and p1 are the mean values of the two classes, and p is the overall mean;

the third step, computing the threshold T:

T = argmax_{0 <= t <= 255} g(t).
6. The method for identifying left and right palm prints based on direction according to claim 3, characterized in that the foreground and background areas are segmented according to the segmentation threshold T by binarizing the gray variance image with T to obtain the segmentation map:

B(x, y) = 1 if v'(x, y) >= T, and B(x, y) = 0 otherwise

where 1 denotes the foreground area and 0 the background area.
7. The method for identifying left and right palm prints based on direction according to claim 1, characterized in that the direction of each image block is computed by:

A. dividing the palm print image into w' × w' image blocks and applying a two-dimensional Fourier transform to each block that contains foreground;

B. setting the responses near the origin of the 2-D Fourier transform domain to 0;

C. finding the position (u0, v0) of the point with the maximum response, where u and v are the coordinates of the Fourier transform domain;

D. computing the direction of the image block, where the direction coordinate axes take the positive x-axis pointing right and the positive y-axis pointing down.
8. The method for identifying left and right palm prints based on direction according to claim 1, characterized in that the direction histogram is computed as:

g(i)=card({(x,y)|θ(x,y)=i}),i=0,1,2,...,179

where card() denotes the number of points in a set and θ(x, y) is the direction of the image block at (x, y).
9. The method for identifying left and right palm prints based on direction according to claim 1, characterized in that the direction histogram is smoothed by a one-dimensional mean filter:

g'(i) = (1/(2N+1)) * Σ_{j=-N..N} g(k),  k = i + j

where N is the smoothing parameter and k, i, j are index variables.
10. The method for identifying left and right palm prints based on direction according to claim 1, characterized in that the peaks of the smoothed histogram are computed as bins i satisfying:

g'(i) >= g'(i + j) for all j with |j| <= N'

where N' is the width parameter for judging peaks.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2011103723653A CN102521590B (en) | 2011-04-29 | 2011-11-21 | Method for identifying left and right palm prints based on directions |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201110110113.3 | 2011-04-29 | ||
CN201110110113 | 2011-04-29 | ||
CN2011103723653A CN102521590B (en) | 2011-04-29 | 2011-11-21 | Method for identifying left and right palm prints based on directions |
Publications (2)
Publication Number | Publication Date |
---|---|
CN102521590A true CN102521590A (en) | 2012-06-27 |
CN102521590B CN102521590B (en) | 2013-12-04 |
Family
ID=46292501
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN2011103723653A Expired - Fee Related CN102521590B (en) | 2011-04-29 | 2011-11-21 | Method for identifying left and right palm prints based on directions |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN102521590B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015192444A1 (en) * | 2014-06-16 | 2015-12-23 | 中兴通讯股份有限公司 | Page layout adjusting method and terminal |
CN106156587A (en) * | 2016-06-30 | 2016-11-23 | 广东小天才科技有限公司 | A kind of automatic unlocking method of mobile terminal and mobile terminal |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6195460B1 (en) * | 1996-11-01 | 2001-02-27 | Yamatake Corporation | Pattern extraction apparatus |
CN101763500A (en) * | 2008-12-24 | 2010-06-30 | 中国科学院半导体研究所 | Method applied to palm shape extraction and feature positioning in high-freedom degree palm image |
2011
- 2011-11-21 CN CN2011103723653A patent/CN102521590B/en not_active Expired - Fee Related
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6195460B1 (en) * | 1996-11-01 | 2001-02-27 | Yamatake Corporation | Pattern extraction apparatus |
CN101763500A (en) * | 2008-12-24 | 2010-06-30 | 中国科学院半导体研究所 | Method applied to palm shape extraction and feature positioning in high-freedom degree palm image |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015192444A1 (en) * | 2014-06-16 | 2015-12-23 | 中兴通讯股份有限公司 | Page layout adjusting method and terminal |
CN106156587A (en) * | 2016-06-30 | 2016-11-23 | 广东小天才科技有限公司 | A kind of automatic unlocking method of mobile terminal and mobile terminal |
Also Published As
Publication number | Publication date |
---|---|
CN102521590B (en) | 2013-12-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN102254188B (en) | Palmprint recognizing method and device | |
CN101350063B (en) | Method and apparatus for locating human face characteristic point | |
CN102629318B (en) | Fingerprint image segmentation method based on support vector machine | |
CN104463877B (en) | A kind of water front method for registering based on radar image Yu electronic chart information | |
CN104091157A (en) | Pedestrian detection method based on feature fusion | |
CN104417490B (en) | A kind of car belt detection method and device | |
CN110766689A (en) | Method and device for detecting article image defects based on convolutional neural network | |
CN101430763A (en) | Detection method for on-water bridge target in remote sensing image | |
CN103870808A (en) | Finger vein identification method | |
CN108986038A (en) | A kind of wheel-hub contour detection method based on Improved Hough Transform | |
CN104463814B (en) | Image enhancement method based on local texture directionality | |
CN102346850A (en) | DataMatrix bar code area positioning method under complex metal background | |
CN103413119A (en) | Single sample face recognition method based on face sparse descriptors | |
CN102903108B (en) | Edge detection method based on underwater image statistical property | |
CN104156723B (en) | A kind of extracting method with the most stable extremal region of scale invariability | |
CN107909083A (en) | A kind of hough transform extracting method based on outline optimization | |
CN102073872B (en) | Image-based method for identifying shape of parasite egg | |
CN103020953A (en) | Segmenting method of fingerprint image | |
CN102750531A (en) | Method for detecting handwriting mark symbols for bill document positioning grids | |
CN103425985B (en) | A kind of face wrinkles on one's forehead detection method | |
CN104008386A (en) | Method and system for identifying type of tumor | |
CN102521590B (en) | Method for identifying left and right palm prints based on directions | |
CN104021372A (en) | Face recognition method and device thereof | |
CN102254304A (en) | Method for detecting contour of target object | |
CN105550646A (en) | Generalized illumination invariant face feature description method based on logarithmic gradient histogram |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20131204 Termination date: 20161121 |
CF01 | Termination of patent right due to non-payment of annual fee |