CN104268527B - A kind of iris locating method based on gradient detection - Google Patents
- Publication number
- CN104268527B CN104268527B CN201410504448.7A CN201410504448A CN104268527B CN 104268527 B CN104268527 B CN 104268527B CN 201410504448 A CN201410504448 A CN 201410504448A CN 104268527 B CN104268527 B CN 104268527B
- Authority
- CN
- China
- Prior art keywords
- iris
- Legal status: Active (assumed; not a legal conclusion)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/193—Preprocessing; Feature extraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/24—Aligning, centring, orientation detection or correction of the image
- G06V10/245—Aligning, centring, orientation detection or correction of the image by locating a pattern; Special marks for positioning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
Abstract
The present invention discloses an iris locating method based on gradient detection. The method comprises: estimating the position of the pupil center and using it as a reference point to divide the iris region to be searched; expanding the iris region to be searched into a rectangular area of a certain size by polar coordinates; locating the inner boundary of the iris region with a gradient detection method; and locating the outer boundary of the iris region with the gradient detection method, the reference radius of the outer boundary being fine-tuned according to the gradients in its neighborhood. The technical solution of the present invention avoids a point-by-point three-dimensional search for the iris boundary parameters over a wide range as well as the adaptive determination of complex thresholds for edge detection and region segmentation, and is little affected by interference such as light-source image points and uneven illumination; it can achieve fast and accurate iris localization, thereby helping to improve the recognition speed and accuracy of an iris recognition system.
Description
Technical field
The present invention relates to an image segmentation method, and more particularly to a fast iris locating method for iris biometric recognition systems.
Background technology
With its significant advantages of accuracy, stability, security and contactless operation, iris recognition has gradually become a research hotspot and development trend in the field of biometrics. In an iris recognition system, iris preprocessing (including iris localization, interference detection, normalization and image enhancement) is the premise of iris recognition, and iris localization is the key step: only with accurate localization can more effective iris features be extracted and accurate recognition be achieved. In practical applications, however, because of hardware limitations of the capture device and factors such as illumination changes in the capture environment, the acquired iris images often suffer from varying degrees of noise interference, blurred detail and low contrast, which makes iris localization difficult.
At present, the iris localization algorithms commonly used in iris recognition systems fall into three classes. The first class, proposed by Daugman, performs iris localization with the parameters for which an integro-differential circle detector, iterated over candidate circles, maximizes the circular gradient integral. The second class, proposed by Wildes, performs iris localization with the parameters that receive the most votes when a Hough circle-detection transform searches the parameter space. The third class performs edge detection based on edge-gradient filtering and binarization, and then fits circles for iris localization. The speed and precision of iris localization are in tension; to strike the best balance between the real-time performance and the accuracy of the system, these algorithms all require the input iris image to be of high quality. In practical applications, however, the scenes in which iris images are captured are complex and changeable, and it is unrealistic to expect high-quality iris images at all times.
The conventional iris localization algorithms above exploit the obvious gray-level changes at the transitions from pupil to iris and from iris to sclera in the iris image, and use region segmentation to search the parameter space of the iris image's rectangular coordinate system for the optimal parameters of the inner and outer iris boundaries, thereby achieving iris localization. Although these algorithms can obtain good localization results under certain conditions, they have the following obvious shortcomings:
1. Daugman's iterative detection, Wildes' Hough transform, circle fitting and similar algorithms all require a point-by-point three-dimensional search (row and column coordinates of the circle center, plus the radius) for the iris boundary parameters within a certain range of the iris image; the complexity is high and the computation time is long. If the search range cannot be limited effectively, it is difficult to meet the real-time requirements of an iris recognition system.
2. These algorithms are all sensitive to noise interference and uneven illumination in real images. In particular, local gray-level jumps caused by light-source image points and eyelashes, and blurred region edges caused by uneven illumination, make iris boundary detection too inefficient and thus cause localization to fail. Without effective iris image quality assessment and noise interference detection algorithms, it is difficult to meet the accuracy requirements of an iris recognition system.
3. During iris localization, reasonable thresholds must be supplied for edge detection. The choice of threshold depends on the gray-level distribution statistics of the iris image, and these statistics differ markedly between the images collected by different systems, so adaptive threshold selection is difficult. In practice, several empirical values are usually determined from the imaging characteristics of the particular system and a large number of experiments, and used as candidate thresholds. Non-adaptive threshold selection increases the complexity of system and algorithm design, and makes it difficult to meet the practicality requirements of an iris recognition system.
Accordingly, it is desirable to provide a fast iris locating method for iris biometric recognition systems.
The content of the invention
It is an object of the present invention to provide an iris locating method based on gradient detection that solves the problems that iris localization is slow and that the iris cannot be accurately located because of local disturbances.
To achieve the above object, the present invention adopts the following technical solution:
An iris locating method based on gradient detection, comprising the steps of:
S1: estimating the possible rectangular coordinates (xp_p, yp_p) of the pupil center based on the relative positions of the light-source image points and the pupil;
S2: taking the estimated rectangular coordinates of the pupil center as the origin, expanding the iris region to be searched into a rectangular area by polar coordinates;
S3: in the rectangular area, locating the inner boundary of the iris region using the gradient detection method;
S4: in the rectangular area, locating the outer boundary of the iris region using the gradient detection method.
Preferably, step S1 further comprises the steps of:
S11: adjusting the arrangement of the light source array and the relative position of the light sources and the camera so that the light-source image points fall inside the pupil;
S12: searching for positions where light-source image points may exist;
S13: searching for positions where the pupil region may exist;
S14: estimating the rectangular coordinates of the center of the pupil region that contains the light-source image points, based on the relative positions of the possible light-source image points and the pupil.
Preferably, step S12 further comprises the steps of:
S121: smoothing the iris image as follows:
imgfil = imfilter(eyeimage, H);
where eyeimage is the captured iris image and H is a smoothing filter operator;
S122: detecting and searching for positions where light-source image points may exist using threshold segmentation and connected-component detection, as follows:
BWL = (imgfil >= T_light);
[label_light, numl] = bwlabel(BWL, 8);
where numl is the number of connected regions that may contain light-source image points, and T_light is the gray-level detection threshold for light-source image points;
S123: estimating the rectangular coordinates of each possible light-source image point as follows:
[xi, yi] = find(label_light == i), i = 1, 2, ..., numl
xl_i = round((max(xi) + min(xi))/2)
yl_i = round((max(yi) + min(yi))/2)
loc_light = {(xl_i, yl_i)}, i = 1, 2, ..., numl
where loc_light is the vector of row-column coordinate pairs of the rectangular coordinates corresponding to the centers of the possible light-source image points.
Preferably, step S13 further comprises the steps of:
S131: locally filling the iris image as follows:
imgcom = imcomplement(imfill(imcomplement(imgfil), 'holes'));
S132: detecting and searching for positions where the pupil region may exist using threshold segmentation and connected-component detection, as follows:
BWP = (imgcom < T_pupil);
[label_pupil, nump] = bwlabel(BWP, 8);
where nump is the number of connected regions that may contain the pupil, and T_pupil is the gray-level detection threshold for the pupil region;
S133: estimating the row-column coordinates of each possible pupil and its radius as follows:
[xj, yj] = find(label_pupil == j), j = 1, 2, ..., nump
xp_j = round((max(xj) + min(xj))/2)
yp_j = round((max(yj) + min(yj))/2)
where loc_pupil is the vector of row-column coordinate pairs of the rectangular coordinates corresponding to the possible pupil centers, and rad_pupil is the vector of radii corresponding to the possible pupil regions.
Preferably, step S14 further comprises the steps of:
S141: searching for the pupil region that contains light-source image points, and estimating the rectangular coordinates (xp_a, yp_a) of the pupil center as follows:
(xp_a, yp_a) = (xp, yp), (xp, yp) ∈ loc_pupil
s.t. √((xp − xl)² + (yp − yl)²) ≤ rp, (xl, yl) ∈ loc_light, rp ∈ rad_pupil;
S142: the possible rectangular coordinates (xp_p, yp_p) of the pupil center satisfy:
−3 < xp_p − xp_a < 3 ∩ −3 < yp_p − yp_a < 3.
Preferably, step S2 further comprises:
taking each possible pupil center coordinate (xp_p, yp_p) as the origin, expanding its N_h × N_w rectangular neighborhood into an N_r × N_θ rectangular area by polar coordinates, where N_h and N_w are the height and width of the neighborhood. The unfolding maps every point (x, y) within the N_h × N_w iris region into the N_r × N_θ rectangular area represented by polar coordinates (r, θ):
I(x(r, θ), y(r, θ)) → I(r, θ), I(x, y) ∈ imgcom
where the radius r indexes radial position over the range [1, R_m], R_m being the specified maximum radius of the iris boundary, with N_r radial samples; and the angle θ runs along the angular direction over the range [0°, 360°), with N_θ angular samples.
Preferably, step S3 further comprises:
letting the inner boundary of the iris region have center (xp, yp) and radius rp;
taking each possible inner-boundary center coordinate (xp_p, yp_p) as the origin, performing the polar unfolding of step S2 to obtain an N_r × N_θ rectangular sampling array I(r, θ, xp_p, yp_p), and computing on each I its gradient vector Grad(r, xp_p, yp_p) along the r direction:
Grad(r) = sum(I(r+1, :)) − sum(I(r, :));
where r = 1, 2, ..., N_r − 1;
counting the peak of each gradient vector Grad(r, xp_p, yp_p); the three-dimensional parameter vector (r, xp_p, yp_p) corresponding to the maximum peak gives the inner-boundary parameters:
(xp, yp, rp) = arg max Grad(r, xp_p, yp_p)
where (xp, yp) and rp are the center coordinates and radius of the iris inner boundary.
Preferably, step S4 further comprises:
letting the outer boundary have center (xo, yo) and radius ro;
the outer-boundary center (xo, yo) lies within an N_o × N_o neighborhood of the inner-boundary center (xp, yp):
where (xo_p, yo_p) are the possible coordinates of the outer-boundary center, N_o ≤ 5;
taking each possible outer-boundary center coordinate (xo_p, yo_p) as the origin, performing the polar unfolding of step S2 to obtain an N_r × N_θ rectangular sampling array I(r, θ, xo_p, yo_p), and computing on each I its gradient vector Grad(r′, xo_p, yo_p) along the r direction:
Grad(r′) = sum(I(r′+1, :)) − sum(I(r′, :));
where r′ = rp+1, rp+2, ..., N_r − 1;
counting the peak of each gradient vector Grad(r′, xo_p, yo_p); the three-dimensional parameter vector (r′, xo_p, yo_p) corresponding to the maximum peak gives the outer-boundary parameters:
(xo, yo, ro_ref) = arg max Grad(r′, xo_p, yo_p)
where (xo, yo) and ro_ref are the center coordinates and reference radius of the iris outer boundary.
Preferably, the method of locating the outer-boundary radius from the reference radius further comprises:
fine-tuning the reference radius ro_ref of the outer boundary according to the gradients within the neighborhood Δr of the iris outer boundary, as follows:
where r_Δ ∈ [ro_ref − Δr, ro_ref) ∪ (ro_ref, ro_ref + Δr], δ ∈ (0, 1) is a weight factor, and ro is the radius of the iris outer boundary.
The beneficial effects of the present invention are as follows:
the technical solution of the present invention avoids a point-by-point three-dimensional search for the iris boundary parameters over a wide range as well as the adaptive determination of complex thresholds for edge detection and region segmentation, and is little affected by interference such as light-source image points and uneven illumination; it can achieve fast and accurate iris localization, thereby helping to improve the recognition speed and accuracy of an iris recognition system.
Brief description of the drawings
The embodiment of the present invention is described in further detail below in conjunction with the accompanying drawings.
Fig. 1 shows a flow chart of the iris locating method based on gradient detection.
Embodiment
In order to illustrate the present invention more clearly, the present invention is further described below with reference to preferred embodiments and the accompanying drawings. Similar parts are indicated with the same reference signs in the drawings. Those skilled in the art will appreciate that the content specifically described below is illustrative rather than restrictive and should not limit the scope of the present invention.
The concrete steps of an iris locating method based on gradient detection are as follows:
First step: obtain an iris image, estimate the position of the pupil center, and use it as a reference point to divide the iris region to be searched.
First, search for positions where light-source image points may exist. Iris images captured in practice contain varying degrees of interference; to eliminate the influence of high-frequency interference on iris boundary localization, the iris image is smoothed:
imgfil = imfilter(eyeimage, H); (1)
where eyeimage is the captured iris image and H is a 3 × 3 Gaussian low-pass filter template. In an iris image, the gray values of the light-source image points are far larger than those of other pixels, so threshold segmentation and connected-component detection are used to search for positions where light-source image points may exist:
BWL = (imgfil >= T_light); [label_light, numl] = bwlabel(BWL, 8); (2)
where numl is the number of connected regions of light-source image points, and T_light = 225 is the gray-level detection threshold for light-source image points. The row-column coordinates of each possible light-source image point are estimated:
[xi, yi] = find(label_light == i), xl_i = round((max(xi) + min(xi))/2), yl_i = round((max(yi) + min(yi))/2), i = 1, 2, ..., numl (3)
where loc_light = {(xl_i, yl_i)} is the vector of row-column coordinate pairs of the centers of the possible light-source image points.
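The pipeline of equations (1)-(3) is MATLAB Image Processing Toolbox code (imfilter, bwlabel, find). As an illustrative sketch only, the thresholding and center estimation of equations (2)-(3) can be written in Python; the tiny synthetic frame, the pure-Python stand-in for bwlabel, and the MATLAB-style rounding helper are all assumptions of this sketch, not part of the patent:

```python
import numpy as np
from collections import deque

def mround(x):
    # MATLAB-style round: halves round away from zero
    return int(np.floor(x + 0.5))

def label_regions(bw):
    """8-connected component labeling, a small stand-in for MATLAB's bwlabel."""
    labels = np.zeros(bw.shape, dtype=int)
    num = 0
    for seed in zip(*np.nonzero(bw)):
        if labels[seed]:
            continue
        num += 1
        labels[seed] = num
        q = deque([seed])
        while q:
            r, c = q.popleft()
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    rr, cc = r + dr, c + dc
                    if (0 <= rr < bw.shape[0] and 0 <= cc < bw.shape[1]
                            and bw[rr, cc] and not labels[rr, cc]):
                        labels[rr, cc] = num
                        q.append((rr, cc))
    return labels, num

def glint_centers(imgfil, t_light=225):
    """Eqs. (2)-(3): threshold, label, then bounding-box midpoints per region."""
    label_light, numl = label_regions(imgfil >= t_light)
    loc_light = []
    for i in range(1, numl + 1):
        xi, yi = np.nonzero(label_light == i)
        loc_light.append((mround((xi.max() + xi.min()) / 2),
                          mround((yi.max() + yi.min()) / 2)))
    return loc_light

# tiny synthetic frame: one saturated 2x2 glint on a dark background
img = np.zeros((8, 8), dtype=np.uint8)
img[3:5, 4:6] = 255
print(glint_centers(img))  # -> [(4, 5)]
```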
Next, search for positions where the pupil region may exist. Because light-source image points lie inside the real pupil region, the iris image is locally filled to eliminate their interference:
imgcom = imcomplement(imfill(imcomplement(imgfil), 'holes')); (4)
In an iris image, the gray values of the pupil region are far smaller than those of other pixels, so threshold segmentation and connected-component detection are used to search for positions where the pupil region may exist:
BWP = (imgcom < T_pupil); [label_pupil, nump] = bwlabel(BWP, 8); (5)
where nump is the number of pupil-candidate connected regions, and T_pupil = 30 is the gray-level detection threshold for the pupil region. The row-column coordinates of each possible pupil and its radius are then estimated; loc_pupil denotes the vector of row-column coordinate pairs of the possible pupil centers, and rad_pupil the corresponding vector of radii. (6)
Finally, by examining the relative positions of the possible light-source image points and pupils, the pupil region that contains light-source image points internally is identified as the real pupil, and its center (xp_a, yp_a) is estimated. (7)
The horizontal or vertical deviation between the actual pupil center and the estimated position is generally within 3 pixels, i.e. the possible rectangular coordinates (xp_p, yp_p) of the pupil center satisfy: (8)
where (xp_r, yp_r) is the actual pupil center coordinate.
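The selection rule behind equation (7), picking the candidate dark region whose circle contains a light-source image point, can be sketched as follows; the candidate list, glint coordinates and function name are illustrative assumptions, not values from the patent:

```python
def pick_real_pupil(candidates, glints):
    """Eq. (7): the real pupil is the candidate whose circle contains a glint,
    i.e. sqrt((xp - xl)^2 + (yp - yl)^2) <= rp for some (xl, yl) in loc_light."""
    for (xp, yp, rp) in candidates:
        for (xl, yl) in glints:
            if (xp - xl) ** 2 + (yp - yl) ** 2 <= rp ** 2:
                return (xp, yp)
    return None

# illustrative candidates (center row, center col, radius) and glint locations:
# the first region might be an eyelash blob, the second the real pupil
loc_pupil = [(10, 40, 6), (25, 25, 12)]
loc_light = [(24, 27)]               # glint sits inside the second region
print(pick_real_pupil(loc_pupil, loc_light))  # -> (25, 25)
```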
Second step: expand the iris region to be searched into a rectangular area of a certain size by polar coordinates.
Let the resolution of the iris image imgcom be S_h × S_w, and the resolution of the iris region to be searched be N_s × N_s, where N_s = min{S_h, S_w}/2. Taking each possible pupil center coordinate (xp_p, yp_p) as the origin, expand its N_s × N_s rectangular neighborhood into an (N_s/2) × 360 rectangular area by polar coordinates; that is, by sampling along the radius and angle directions, map every point (x, y) in the iris region to be searched into the rectangular area represented by polar coordinates (r, θ):
I(x(r, θ), y(r, θ)) → I(r, θ), I(x, y) ∈ imgcom (9)
where r indexes radial position over the range [1, N_s/2] with N_s/2 radial samples, and θ runs along the angular direction over the range [0°, 360°) with 360 angular samples.
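The polar unfolding of equation (9) can be sketched in Python. The nearest-neighbor sampling and the synthetic concentric test image are assumptions of this sketch; the patent does not specify the interpolation scheme:

```python
import numpy as np

def unfold_polar(img, xp, yp, n_r, n_theta=360):
    """Sample I(r, theta) = img(xp + r*cos(theta), yp + r*sin(theta)) for
    r = 1..n_r and theta = 0..n_theta-1 degrees, with nearest-neighbor
    rounding; (xp, yp) are the (row, col) of the candidate pupil center."""
    out = np.zeros((n_r, n_theta), dtype=float)
    thetas = np.deg2rad(np.arange(n_theta))
    for r in range(1, n_r + 1):
        rows = np.clip(np.rint(xp + r * np.cos(thetas)).astype(int), 0, img.shape[0] - 1)
        cols = np.clip(np.rint(yp + r * np.sin(thetas)).astype(int), 0, img.shape[1] - 1)
        out[r - 1] = img[rows, cols]
    return out

# concentric test image: gray value equals the distance from (50, 50),
# so row r of the unfolded array should hover around the value r
yy, xx = np.meshgrid(np.arange(101), np.arange(101), indexing="ij")
img = np.hypot(yy - 50, xx - 50)
polar = unfold_polar(img, 50, 50, n_r=40)
print(polar.shape)  # (40, 360)
```

Each row of the unfolded array then corresponds to a fixed radius, which is what makes the row-sum gradient of the next step a circle detector.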
Third step: locate the inner boundary of the iris region using the gradient detection method.
Let the center of the inner boundary of the iris region be (xp, yp) and its radius rp; the gradient detection method locates the inner boundary (xp, yp, rp).
The center coordinate of the iris inner boundary estimated in the first step is (xp_a, yp_a), and (xp, yp) is assumed to lie within its 5 × 5 neighborhood. Taking each possible inner-boundary center coordinate (xp_p, yp_p) as the origin, an (N_s/2) × 360 rectangular sampling array I(r, θ, xp_p, yp_p) is obtained by the method of the second step, and on each I its gradient vector Grad(r, xp_p, yp_p) along the r (row) direction is computed:
Grad(r) = sum(I(r+1, :)) − sum(I(r, :)); (10)
where r = 1, 2, ..., N_r − 1.
The peak of each gradient vector Grad(r, xp_p, yp_p) is counted; the three-dimensional parameter vector (r, xp_p, yp_p) corresponding to the maximum peak (the gradient maximum) gives the inner-boundary parameters: (11)
where (xp, yp) and rp are the center coordinates and radius of the iris inner boundary.
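Equation (10) and the subsequent peak search can be sketched on a synthetic unfolded array; the gray levels and array sizes below are illustrative assumptions:

```python
import numpy as np

def radial_gradient(polar):
    """Eq. (10): Grad(r) = sum(I(r+1, :)) - sum(I(r, :)) along the row (r) axis."""
    row_sums = polar.sum(axis=1)
    return row_sums[1:] - row_sums[:-1]

# synthetic unfolded iris: dark pupil out to r = 15, brighter iris beyond,
# so the largest row-sum jump should sit at the pupil/iris transition
n_r, n_theta = 50, 360
polar = np.full((n_r, n_theta), 30.0)   # pupil gray level (assumed)
polar[15:] = 120.0                      # iris gray level (assumed)
grad = radial_gradient(polar)
rp = int(np.argmax(grad)) + 1           # 1-based radius, as in the text
print(rp)  # -> 15
```

In the full method this argmax is taken jointly over the candidate centers (xp_p, yp_p) as well, giving the three-dimensional parameter vector of equation (11).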
Fourth step: locate the outer boundary of the iris region using the gradient detection method.
Let the center of the outer boundary of the iris region be (xo, yo) and its radius ro; the gradient detection method locates the outer boundary (xo, yo, ro).
Although the inner and outer boundaries of the iris region are not concentric, the outer-boundary center (xo, yo) lies within an N_o × N_o neighborhood of the inner-boundary center (xp, yp), i.e.: (12)
where (xo_p, yo_p) are the possible coordinates of the outer-boundary center. In practice, N_o = 5.
Taking each possible outer-boundary center coordinate (xo_p, yo_p) as the origin, an (N_s/2) × 360 rectangular sampling array I(r, θ, xo_p, yo_p) is obtained by the method of the second step, and on each I its radial gradient vector Grad(r′, xo_p, yo_p) along the r (row) direction is computed:
Grad(r′) = sum(I(r′+1, :)) − sum(I(r′, :)); (13)
where r′ = rp+1, rp+2, ..., N_r − 1.
The peak of each gradient vector Grad(r′, xo_p, yo_p) is counted; the three-dimensional parameter vector (r′, xo_p, yo_p) corresponding to the maximum peak (the gradient maximum) gives the outer-boundary parameters:
where (xo, yo) is the center of the outer boundary and ro_ref is the reference radius.
In real iris images, the pixel gray levels within the neighborhood Δr of the iris outer boundary change more gently, and the region-segmentation boundary is less distinct than at the inner boundary, so the reference radius ro_ref of the outer boundary needs to be fine-tuned according to the gradients within the neighborhood Δr: (14)
where r_Δ ∈ [ro_ref − Δr, ro_ref) ∪ (ro_ref, ro_ref + Δr], δ ∈ (0, 1) is a weight factor, and ro is the radius of the iris outer boundary. In practice, Δr = 5 and δ = 0.75.
Through the above steps, the inner and outer boundary parameters (xp, yp, rp) and (xo, yo, ro) of the iris region are obtained, and iris localization is complete.
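The exact form of the fine-tuning formula (14) is not reproduced in this text, so the following is only one plausible reading, labeled entirely as an assumption: neighborhood radii whose gradient reaches at least δ times the gradient at ro_ref pull the final radius toward their mean.

```python
import numpy as np

def fine_tune_radius(grad, ro_ref, dr=5, delta=0.75):
    """HYPOTHETICAL reading of the fine-tuning step (eq. (14) is not given
    here): average ro_ref with neighborhood radii r in
    [ro_ref - dr, ro_ref + dr] whose gradient >= delta * Grad(ro_ref).
    grad[r - 1] holds Grad(r), i.e. radii are 1-based as in the text."""
    peak = grad[ro_ref - 1]
    near = [r for r in range(ro_ref - dr, ro_ref + dr + 1)
            if r != ro_ref and 1 <= r <= len(grad) and grad[r - 1] >= delta * peak]
    if not near:
        return ro_ref
    return int(round((ro_ref + float(np.mean(near))) / 2))

# gradient sharply peaked at r = 20 but with strong neighbors at 21 and 22
grad = np.zeros(40)
grad[19], grad[20], grad[21] = 100.0, 90.0, 80.0
print(fine_tune_radius(grad, 20, dr=5, delta=0.75))  # -> 21
```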
In summary, the technical solution of the present invention avoids a point-by-point three-dimensional search for the iris boundary parameters over a wide range as well as the adaptive determination of complex thresholds for edge detection and region segmentation, and is little affected by interference such as light-source image points and uneven illumination. In tests on images from the CASIA 1.0 iris database of the Institute of Automation, Chinese Academy of Sciences, the classical Wildes iris localization algorithm based on Hough circle detection took 2.97 s on average, while the above gradient-detection-based algorithm took only 0.14 s on average (computer: Intel i5-3470s CPU, 4 GB RAM; system: Windows XP SP3; test software: MATLAB R2012a). The above algorithm can therefore achieve fast and accurate iris localization, thereby helping to improve the recognition speed and accuracy of an iris recognition system.
Obviously, the above embodiments are merely examples given for clarity of illustration and are not intended to limit the embodiments of the present invention. Those of ordinary skill in the art may make other changes in different forms on the basis of the above description; the embodiments need not and cannot be exhaustively listed here, and any obvious changes or variations derived from the technical solution of the present invention remain within the protection scope of the present invention.
Claims (7)
1. An iris locating method based on gradient detection, characterized in that the method comprises the steps of:
S1: estimating the possible rectangular coordinates (xp_p, yp_p) of the pupil center based on the relative positions of the light-source image points and the pupil;
S2: taking the estimated rectangular coordinates of the pupil center as the origin, expanding the iris region to be searched into a rectangular area by polar coordinates;
S3: in the rectangular area, locating the inner boundary of the iris region using the gradient detection method;
S4: in the rectangular area, locating the outer boundary of the iris region using the gradient detection method;
step S1 further comprising the steps of:
S11: adjusting the arrangement of the light source array and the relative position of the light sources and the camera so that the light-source image points fall inside the pupil;
S12: searching for positions where light-source image points may exist;
S13: searching for positions where the pupil region may exist;
S14: estimating the rectangular coordinates of the center of the pupil region that contains the light-source image points, based on the relative positions of the possible light-source image points and the pupil;
step S14 further comprising the steps of:
S141: searching for the pupil region that contains light-source image points and estimating the rectangular coordinates (xp_a, yp_a) of the pupil center as follows:
(xp_a, yp_a) = (xp, yp), (xp, yp) ∈ loc_pupil
s.t. √((xp − xl)² + (yp − yl)²) ≤ rp, (xl, yl) ∈ loc_light, rp ∈ rad_pupil;
where loc_light is the vector of row-column coordinate pairs of the rectangular coordinates corresponding to the centers of the possible light-source image points, loc_pupil is the vector of row-column coordinate pairs of the rectangular coordinates corresponding to the possible pupil centers, and rad_pupil is the vector of radii corresponding to the possible pupil regions;
S142: the possible rectangular coordinates (xp_p, yp_p) of the pupil center satisfy:
−3 < xp_p − xp_a < 3 ∩ −3 < yp_p − yp_a < 3.
2. The iris locating method based on gradient detection according to claim 1, characterized in that step S12 further comprises the steps of:
S121: smoothing the iris image as follows:
imgfil = imfilter(eyeimage, H);
where eyeimage is the captured iris image and H is a smoothing filter operator;
S122: detecting and searching for positions where light-source image points may exist using threshold segmentation and connected-component detection, as follows:
BWL = (imgfil >= T_light);
[label_light, numl] = bwlabel(BWL, 8);
where numl is the number of connected regions that may contain light-source image points, and T_light is the gray-level detection threshold for light-source image points;
S123: estimating the rectangular coordinates of each possible light-source image point as follows:
[xi, yi] = find(label_light == i), i = 1, 2, ..., numl
xl_i = round((max(xi) + min(xi))/2)
yl_i = round((max(yi) + min(yi))/2)
loc_light = {(xl_i, yl_i)}, i = 1, 2, ..., numl.
3. The iris locating method based on gradient detection according to claim 2, characterized in that step S13 further comprises the steps of:
S131: locally filling the iris image as follows:
imgcom = imcomplement(imfill(imcomplement(imgfil), 'holes'));
S132: detecting and searching for positions where the pupil region may exist using threshold segmentation and connected-component detection, as follows:
BWP = (imgcom < T_pupil);
[label_pupil, nump] = bwlabel(BWP, 8);
In the formulas, nump is the number of connected regions that may contain the pupil, and Tpupil is the gray-level detection threshold for the pupil region;
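A hedged Python sketch of this thresholding and 8-connected labelling step (the patent's formulas use MATLAB's `bwlabel`; here `scipy.ndimage.label` with a 3×3 structuring element plays the same role, and `t_pupil` stands for the threshold Tpupil):

```python
import numpy as np
from scipy import ndimage

def pupil_candidates(imgcom, t_pupil):
    """Assumed analogue of step S132: threshold the filled image and label
    8-connected dark regions, as in [label_pupil, nump] = bwlabel(BWP, 8)."""
    bwp = imgcom < t_pupil                    # BWP = (imgcom < Tpupil)
    eight = np.ones((3, 3), dtype=int)        # 8-connectivity structuring element
    label_pupil, nump = ndimage.label(bwp, structure=eight)
    return label_pupil, nump
```

Each dark blob below the threshold receives a distinct integer label, and nump counts the candidate regions.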
S133, estimate the row and column coordinates of each candidate pupil and its radius, with the formulas:
[xj, yj] = find(label_pupil == j), j = 1, 2, ..., nump
xp_j = round((max(xj) + min(xj))/2)
yp_j = round((max(yj) + min(yj))/2)
rp_j = round(((max(xj) - min(xj)) + (max(yj) - min(yj)))/4)
loc_pupil = {(xp_j, yp_j)}, j = 1, 2, ..., nump
rad_pupil = {rp_j}, j = 1, 2, ..., nump.
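The S133 formulas amount to taking, for each labelled region, the bounding-box centre as the candidate pupil centre and a quarter of the summed bounding-box extents as the radius. A minimal sketch (function name illustrative, assuming NumPy):

```python
import numpy as np

def estimate_pupils(label_pupil, nump):
    """Assumed sketch of step S133: bounding-box centre (xp_j, yp_j) and
    radius rp_j for each labelled candidate region."""
    loc_pupil, rad_pupil = [], []
    for j in range(1, nump + 1):
        xj, yj = np.nonzero(label_pupil == j)   # [xj, yj] = find(label_pupil == j)
        xp = round((xj.max() + xj.min()) / 2)
        yp = round((yj.max() + yj.min()) / 2)
        # radius = mean of the two half-extents of the bounding box
        rp = round(((xj.max() - xj.min()) + (yj.max() - yj.min())) / 4)
        loc_pupil.append((int(xp), int(yp)))
        rad_pupil.append(int(rp))
    return loc_pupil, rad_pupil
```

For a roughly circular blob this bounding-box radius is close to the true radius, which is all the coarse localization stage needs.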
4. The gradient-detection-based iris locating method according to claim 1, characterized in that step S2 further comprises:
Taking each candidate pupil centre (xp_p, yp_p) as the origin, unwrap its Nh×Nw rectangular neighborhood into an Nr×Nθ rectangular region by polar coordinates, where Nh and Nw are respectively the height and width of the neighborhood. The unwrapping maps every point (x, y) in the Nh×Nw iris region to the Nr×Nθ rectangular region indexed by polar coordinates (r, θ), with the formula:
I(x(r,θ),y(r,θ))→I(r,θ),I(x,y)∈imgcom
In the formula, the radius r ranges over [1, Rm] in the radial direction, where Rm is the prescribed maximum iris-boundary radius, and the number of radial samples is Nr; the angle θ ranges over [0°, 360°) in the angular direction, and the number of angular samples is Nθ.
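The polar unwrapping above can be sketched as nearest-neighbour sampling on an Nr×Nθ grid (a simplified illustration, assuming NumPy; the patent does not specify the interpolation scheme):

```python
import numpy as np

def unwrap_polar(imgcom, xp_p, yp_p, n_r, n_theta, r_m):
    """Hedged sketch of step S2: sample the neighborhood of (xp_p, yp_p)
    onto an Nr x Ntheta polar grid, r in [1, Rm], theta in [0, 360) degrees."""
    h, w = imgcom.shape
    out = np.zeros((n_r, n_theta))
    radii = np.linspace(1.0, r_m, n_r)
    thetas = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    for i, r in enumerate(radii):
        for k, th in enumerate(thetas):
            x = int(round(xp_p + r * np.cos(th)))   # row index
            y = int(round(yp_p + r * np.sin(th)))   # column index
            if 0 <= x < h and 0 <= y < w:
                out[i, k] = imgcom[x, y]            # I(x(r,θ), y(r,θ)) -> I(r,θ)
    return out
```

In the unwrapped array, each row holds the pixels at one radius, so a circular boundary in the image becomes a horizontal edge, which is what the radial gradient of steps S3 and S4 detects.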
5. The gradient-detection-based iris locating method according to claim 1, characterized in that step S3 further comprises:
Let the centre of the inner boundary of the iris region be (xp, yp) and its radius be rp;
Taking each candidate iris inner-boundary centre (xp_p, yp_p) as the origin, perform the polar-coordinate unwrapping of step S2 to obtain an Nr×Nθ rectangular sample array I(r, θ, xp_p, yp_p); on each array I, compute the radial gradient vector Grad(r, xp_p, yp_p) along the r direction, with the formula:
Grad(r) = sum(I(r+1,:)) - sum(I(r,:));
In the formula, r = 1, 2, ..., Nr-1;
Find the peak of each gradient vector Grad(r, xp_p, yp_p); the three-dimensional parameter vector (r, xp_p, yp_p) corresponding to the largest peak gives the iris inner-boundary parameters, with the formulas:
(xp, yp) = argmax_(xp_p, yp_p) ( max_r ( Grad(r, xp_p, yp_p) ) ),
rp = argmax_r ( Grad(r, xp, yp) )
In the formulas, (xp, yp) and rp are respectively the centre coordinates and the radius of the iris inner boundary.
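The inner-boundary search of step S3 can be sketched as follows (assuming each candidate centre already has its Nr×Nθ polar sample array; the dict-based interface is an illustration, not the patent's data layout):

```python
import numpy as np

def locate_inner_boundary(sample_arrays):
    """Hedged sketch of step S3. sample_arrays maps each candidate centre
    (xp_p, yp_p) to its Nr x Ntheta polar sample array I. The radial gradient
    Grad(r) = sum(I(r+1,:)) - sum(I(r,:)) peaks at the pupil/iris edge; the
    candidate with the largest peak gives the inner-boundary centre and radius."""
    best_peak, best_centre, best_r = -np.inf, None, None
    for (xp_p, yp_p), I in sample_arrays.items():
        row_sums = I.sum(axis=1)                 # sum over theta for each r
        grad = row_sums[1:] - row_sums[:-1]      # Grad(r), r = 1 .. Nr-1
        r_idx = int(np.argmax(grad))
        if grad[r_idx] > best_peak:
            best_peak = grad[r_idx]
            best_centre, best_r = (xp_p, yp_p), r_idx + 1  # 1-based radial sample
    (xp, yp), rp = best_centre, best_r
    return xp, yp, rp
```

A candidate whose polar array shows a sharp dark-to-bright step across rows produces a much larger gradient peak than a candidate over a flat region, so the argmax jointly selects the centre and the radius.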
6. The gradient-detection-based iris locating method according to claim 4, characterized in that step S4 further comprises:
Let the centre of the external boundary be (xo, yo) and its radius be ro;
the centre (xo, yo) of the external boundary lies within an No×No neighborhood of the inner-boundary centre (xp, yp), where (xo_p, yo_p) are the candidate centre coordinates of the exterior iris boundary and No ≤ 5;
Taking each candidate exterior iris-boundary centre (xo_p, yo_p) as the origin, perform the polar-coordinate unwrapping of step S2 to obtain an Nr×Nθ rectangular sample array I(r, θ, xo_p, yo_p); on each array I, compute the radial gradient vector Grad(r', xo_p, yo_p) along the r direction, with the formula:
Grad(r') = sum(I(r'+1,:)) - sum(I(r',:));
In the formula, r' = rp+1, rp+2, ..., Nr-1;
Find the peak of each gradient vector Grad(r', xo_p, yo_p); the three-dimensional parameter vector (r', xo_p, yo_p) corresponding to the largest peak gives the exterior iris-boundary parameters, with the formulas:
(xo, yo) = argmax_(xo_p, yo_p) ( max_r' ( Grad(r', xo_p, yo_p) ) ),
ro_ref = argmax_r' ( Grad(r', xo, yo) )
In the formulas, (xo, yo) and ro_ref are respectively the centre coordinates and the reference radius of the exterior iris boundary.
7. The gradient-detection-based iris locating method according to claim 6, characterized in that the method for locating the iris external-boundary reference radius further comprises:
fine-tuning the reference radius ro_ref of the external boundary according to the gradient within the neighborhood Δr of the iris-region external boundary, with the formula:
ro = round(mean(ro_ref + Σ rΔ)), if ∃ Grad(rΔ, xo, yo) ≥ δ·Grad(ro_ref, xo, yo)
ro = ro_ref, otherwise
In the formula, rΔ ∈ [ro_ref − Δr, ro_ref) ∪ (ro_ref, ro_ref + Δr], δ ∈ (0, 1) is a weight factor, and ro is the iris external-boundary radius.
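One reading of this fine-tuning rule is: collect every radius in the ±Δr neighborhood whose gradient reaches δ times the gradient at ro_ref, and average them with ro_ref; otherwise keep ro_ref. A sketch under that assumption (the 1-D array `grad` stands for the profile Grad(r, xo, yo)):

```python
import numpy as np

def refine_outer_radius(grad, ro_ref, delta_r, delta):
    """Hedged sketch of claim 7: average the reference radius with every
    neighbouring radius whose gradient reaches delta times the gradient at
    ro_ref; otherwise keep ro_ref unchanged."""
    lo = max(0, ro_ref - delta_r)
    hi = min(len(grad) - 1, ro_ref + delta_r)
    strong = [r for r in range(lo, hi + 1)
              if r != ro_ref and grad[r] >= delta * grad[ro_ref]]
    if strong:
        # round(mean(ro_ref + sum of qualifying r_delta)) in the claim's notation
        return int(round(np.mean([ro_ref] + strong)))
    return ro_ref
```

Averaging over near-threshold neighbours compensates for the limbic boundary being a soft, gradual edge rather than a single sharp gradient peak.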
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410504448.7A CN104268527B (en) | 2014-09-26 | 2014-09-26 | A kind of iris locating method based on gradient detection |
Publications (2)
Publication Number | Publication Date |
---|---|
CN104268527A CN104268527A (en) | 2015-01-07 |
CN104268527B true CN104268527B (en) | 2017-12-12 |
Family
ID=52160047
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410504448.7A Active CN104268527B (en) | 2014-09-26 | 2014-09-26 | A kind of iris locating method based on gradient detection |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN104268527B (en) |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105141938B (en) * | 2015-08-18 | 2017-12-01 | 深圳先进技术研究院 | Sight positioner |
CN105260698B (en) * | 2015-09-08 | 2019-01-25 | 山东眼神智能科技有限公司 | The method and apparatus that iris image is positioned |
CN105488487B (en) * | 2015-12-09 | 2018-11-02 | 湖北润宏科技股份有限公司 | A kind of iris locating method and device |
CN105631816B (en) * | 2015-12-22 | 2018-04-03 | 北京无线电计量测试研究所 | A kind of iris image noise classification detection method |
CN106485754B (en) * | 2016-09-12 | 2019-06-14 | 微鲸科技有限公司 | Fish-eye scaling method and equipment |
CN107871322B (en) * | 2016-09-27 | 2020-08-28 | 北京眼神科技有限公司 | Iris image segmentation method and device |
CN109684915B (en) * | 2018-11-12 | 2021-01-01 | 温州医科大学 | Pupil tracking image processing method |
CN109598209A (en) * | 2018-11-15 | 2019-04-09 | 北京无线电计量测试研究所 | A kind of detection method of definition of iris image |
CN110781745B (en) * | 2019-09-23 | 2022-02-11 | 杭州电子科技大学 | Tail eyelash detection method based on composite window and gradient weighted direction filtering |
CN110929570B (en) * | 2019-10-17 | 2024-03-29 | 珠海虹迈智能科技有限公司 | Iris rapid positioning device and positioning method thereof |
CN111178189B (en) * | 2019-12-17 | 2024-04-09 | 北京无线电计量测试研究所 | Network learning auxiliary method and system |
CN112434675B (en) * | 2021-01-26 | 2021-04-09 | 西南石油大学 | Pupil positioning method for global self-adaptive optimization parameters |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6089711A (en) * | 1997-11-05 | 2000-07-18 | Blankenbecler; Richard | Radial gradient contact lenses |
CN101576951A (en) * | 2009-05-20 | 2009-11-11 | 电子科技大学 | Iris external boundary positioning method based on shades of gray and classifier |
Non-Patent Citations (1)
Title |
---|
Research on Color Iris Localization Algorithms; Yin Yu; Wanfang Data Enterprise Knowledge Service Platform; 2014-02-25; sections 2.2.2, 3.2, 3.4, 4.2.3, 4.3; Fig. 2.3 *
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||