CN105956536A - Pretreatment method and device for iris recognition - Google Patents

Pretreatment method and device for iris recognition

Info

Publication number
CN105956536A
CN105956536A (application CN201610265148.7A)
Authority
CN
China
Prior art keywords
iris
point
image
circle
gray
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610265148.7A
Other languages
Chinese (zh)
Inventor
韩桂明
李钊
周斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Senke Syed Technology Co Ltd
Original Assignee
Beijing Senke Syed Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Senke Syed Technology Co Ltd
Priority to CN201610265148.7A
Publication of CN105956536A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/193Preprocessing; Feature extraction

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

An iris recognition pretreatment method comprises the following steps: (a) acquiring an original grayscale image containing an iris image by capturing an image of the iris of a human eye through a lens; (b) using the Haar wavelet method to obtain the iris image portion of the low-frequency sub-band of the original grayscale image; (c) using the Canny operator to perform edge detection on the iris edges of the iris image portion; (d) using sub-pixel circle localization to precisely locate the iris inner circle; and (e) using a circle detection operator, combined with an increasing search-radius step, to precisely locate the iris outer circle. The iris recognition pretreatment process can accurately identify the inner and outer boundaries of an iris and locate the iris position in an iris image quickly and precisely, which provides a clean, noise-free image signal foundation for subsequent feature point recognition and feature matching. The corresponding iris recognition terminal has high recognition capability, short image acquisition time and high processing efficiency.

Description

Preprocessing method and device for iris recognition
Technical field
The present invention relates to a signal recognition method and device, and in particular to a signal recognition method and device for accurate image boundary recognition.
Background technology
The safety of kindergartens is currently a matter of broad public concern. The issue drawing the most attention is pick-up safety: making sure that children are released only to the right people is the foremost problem in kindergarten safety, and its root is accurately identifying the persons related to each child; accurately verifying the identity of those authorized to pick a child up is the primary task of the present system. Commonly used identification technologies include fingerprint recognition, face recognition and iris recognition. Fingerprint recognition and face recognition meet practical obstacles in use: identification becomes troublesome when the person is wearing gloves, a mask or similar items. Iris recognition can, in a sense, overcome such trouble, because the eyes must remain exposed when a person moves about and cannot be sealed off; this makes iris acquisition convenient and is one of the reasons the present system adopts iris-based identity recognition.
Basic principle of iris technology: the iris controls the size of the pupil and gives the eyeball its color. The iris forms its complex and unique structure during the prenatal development stage and remains unchanged throughout the whole course of life. This is the real reason iris-based biometric systems are effective: every person's iris is different, which establishes a feasible basis for using iris-based biometric systems for human identity recognition. In a typical iris scan, a camera captures an image of the subject's eyes under near-infrared illumination (specifically infrared light of three particular wavelengths); software identifies the attributes of the eyeball's iris and resolves them into 1024 sample regions, detecting with great delicacy the stroma structure of the eye that reflects the light. This unique information can be used to generate a binary code.
Biological activity: the iris is a visible part of the human eye, is under the protection of the sclera, and has extremely strong biological activity. For example, the size of the pupil changes with light intensity; there is an automatic adjustment process when viewing objects; the pupil undergoes more than ten involuntary contractions and dilations per second. At brain death, in a deep coma, or when the eyeball tissue is detached from the body, the iris tissue contracts completely and the pupil dilates. These biological activities coexist with the phenomena of human life, so substituting a photograph, a video recording, or the iris of a corpse for a live iris image is impossible, which guarantees the authenticity of the physiological tissue.
Non-contact: a digital iris image can be acquired from a certain distance without the user touching the device; it causes no harm to the person and is therefore readily accepted by the public.
Uniqueness: uniqueness means that the information contained in each iris is different; the probability of two irises appearing with identical form is far lower than for other tissues. The fibrous structure of the iris is complex and rich in detail, and its formation is related to the physical and chemical conditions of the local tissue during the embryonic stage, giving it great randomness; a given iris cannot be replicated even with cloning technology. The iris textures of identical twins are different, and the irises of a person's left and right eyes do not match each other.
Stability: the iris is extremely stable. It has already formed in the womb (around the seventh month of gestation), takes its final shape six to eighteen months after birth, and thereafter does not change. Ordinary diseases do not damage the iris tissue, and it does not wear because of occupational or other factors.
Anti-counterfeiting: it is impossible to change the iris features surgically without severe impairment of vision, let alone alter one person's iris tissue to match the features of a particular target; substituting a photograph, a video recording, or the iris of a corpse for a live iris image can be detected in every case.
However, the signal acquisition and signal recognition in iris recognition are costly to process, and iris data, like fingerprint data, requires strong protection against leakage. Part of the high cost stems from the efficiency and accuracy of image signal processing.
Summary of the invention
It is an object of the present invention to provide a preprocessing method for iris recognition, solving the technical problem of the reliability of later-stage signal processing in iris recognition.
Another object of the present invention is to provide an iris recognition terminal applying the preprocessing method for iris recognition, solving the technical problem of the limited preprocessing capability of existing recognition terminals.
The preprocessing method for iris recognition of the present invention comprises the following steps:
a. acquiring an image of the iris of a human eye through a lens to obtain an original grayscale image containing the iris image;
b. using the Haar wavelet method to obtain the iris image portion of the low-frequency sub-band of the original grayscale image;
c. using the Canny operator to perform edge detection on the iris edges of the iris image portion;
d. using sub-pixel circle localization to precisely locate the iris inner circle;
e. using a circular template, combined with an increasing search-radius step, to precisely locate the iris outer circle.
Step a comprises the following step:
denoising the acquired original grayscale image with a median filtering algorithm.
Step b comprises the following step:
applying the Haar wavelet transform to the original grayscale image to form four sub-bands, and selecting the grayscale image data formed by the low-frequency sub-band.
Step d comprises the following step:
first obtaining, for each pixel edge point, the gray values of its neighboring points along the gradient direction, and dividing the edge points into regions.
Step d comprises the following step:
then obtaining the gray difference between the edge point and its neighboring points along the gradient direction, by performing gray-level interpolation and gray-difference calculation.
Step d comprises the following step:
then using a parameter fitting method to obtain the precise sub-pixel edge point, by performing Gaussian curve fitting.
Step e comprises the following steps:
1) taking a 10 × 10 matrix of points centered on the pupil center as the range of candidate iris centers;
2) taking each point in turn as a center and, starting from the inner-boundary radius, increasing the radius by a fixed step, thereby producing a series of rings;
3) selecting points on each ring at a fixed angular interval, so that every ring yields the same number of points at the same angles; reading the gray values of these points and adding them up, so that each ring has a gray-value sum;
4) comparing the gray-value sums of adjacent rings, and recording the center and radius of the ring where the change of the gray-value sum is largest;
5) obtaining the maximum gray-value-sum change for every candidate center, and selecting the center and radius corresponding to the largest of these changes as the center and radius of the iris.
The method also includes normalization of the iris data.
An iris recognition terminal uses the preprocessing method for iris recognition.
The preprocessing process for iris recognition can accurately identify the inner and outer boundaries of the iris and locate the iris position in the iris image quickly and precisely, providing a clean, noise-free image signal foundation for subsequent feature point recognition and feature matching. The corresponding iris recognition terminal has high recognition capability, short acquisition time and high processing efficiency.
Brief description of the drawings
Fig. 1 is a structural schematic diagram of an iris recognition terminal using the preprocessing method for iris recognition of the present invention;
Fig. 2 is a structural schematic diagram of a kindergarten pick-up safety system using the iris recognition terminal of the present invention;
Fig. 3 is a structural schematic diagram of the iris fill-light device of the iris recognition terminal of the present invention;
Fig. 4 is a schematic diagram of the identity recognition process of the iris recognition terminal of the present invention;
Fig. 5 is a schematic diagram of the identity maintenance process of the kindergarten pick-up safety system using the iris recognition terminal of the present invention;
Fig. 6 is a schematic diagram of the gradient directions and region division of circular edge points in the preprocessing method for iris recognition of the present invention;
Fig. 7 is a schematic diagram of the neighboring points defined in the preprocessing method for iris recognition of the present invention;
Fig. 8 is a schematic diagram of gray-level interpolation in the preprocessing method for iris recognition of the present invention;
Fig. 9 is a schematic diagram of the Gaussian curve fitting algorithm in the preprocessing method for iris recognition of the present invention;
Fig. 10 shows the inner-circle detection and localization result of the prior-art Canny operator;
Fig. 11 shows the sub-pixel inner-circle localization result of the preprocessing method for iris recognition of the present invention.
Detailed description of the invention
Specific embodiments of the present invention are described in detail below with reference to the accompanying drawings.
As shown in Fig. 1, the iris recognition terminal 01 includes a touch screen 12 connected to a preprocessor 11 by a data link, a left-eye iris acquisition camera 13, a left-eye iris fill-light device 14, a right-eye iris acquisition camera 15, a right-eye iris fill-light device 16, an RFID interface 17, a network interface 18 and a Bluetooth interface 19, wherein:
the preprocessor 11 converts the acquired iris images into data to form iris fingerprint data, completes comparison and forms matching results; controls the cameras and fill-light devices to acquire iris images cooperatively; forms data links through the connected data interfaces; and forms the human-computer interaction interface data;
the touch screen 12 displays the human-computer interaction interface data and provides the sensor for human-computer interaction;
the left-eye iris acquisition camera 13 acquires the biometric image of the left-eye iris;
the left-eye iris fill-light device 14 adjusts the illumination parameters for the left-eye iris and cooperates with the camera to form biometric images of the left-eye iris under different wavelengths and brightness levels;
the right-eye iris acquisition camera 15 acquires the biometric image of the right-eye iris;
the right-eye iris fill-light device 16 adjusts the illumination parameters for the right-eye iris and cooperates with the camera to form biometric images of the right-eye iris under different wavelengths and brightness levels;
the RFID interface 17 provides a wireless communication interface matched to near-field communication and establishes a data communication link;
the network interface 18 provides a wired or wireless communication interface matched to the TCP/IP network and establishes a data communication link;
the Bluetooth interface 19 provides a wireless communication interface matched to Bluetooth communication and establishes a data communication link.
The iris recognition terminal of this embodiment can establish data connections with the server side, store iris fingerprint data in a distributed manner and set up a debugging environment for biometric images, while integrating the human-computer interaction interface and using near-field communication to identify the individual iris recognition terminal, ensuring the reliability of data acquisition, human-computer interaction and data response.
As shown in Fig. 2, the iris recognition terminal 01 establishes data links through the network interface 18 with an iris recognition server 03 and a pick-up safety service response server 05, wherein:
the pick-up safety service response server 05 establishes the business process data between the server side and the terminals and responds to data requests from the iris recognition terminal 01, turning the business process data into human-computer interaction data;
the iris recognition server 03 stores the iris fingerprint data and matching results of the iris recognition terminal 01, distributes the iris fingerprint data within the recognition range of the iris recognition terminal 01, and verifies the iris recognition terminal 01.
The iris-recognition-based kindergarten pick-up safety system of this embodiment combines distributed data storage with distributed data response and integrates the iris recognition terminals and the iris recognition business with the attached attendance business, so that personnel and devices organically form a whole within the kindergarten pick-up safety system and the effects of the dispersion of personnel and devices are avoided.
The iris fill-light devices of the iris recognition terminal 01 have a strong influence on the formation of the iris image and on data recognition in the subsequent conversion process.
As shown in Fig. 3, the (right-eye or left-eye) iris fill-light device includes an annular fixing body whose axis coincides with the optical axis of the camera. The annular fixing body is a ring comprising a coaxial rear annular plate 42, front annular plate 43, inner annular riser 44 and outer annular riser 45; the outer edge of the front end of the rear annular plate 42 is connected to the rear end of the outer annular riser 45, the front end of the outer annular riser 45 is connected to the outer edge of the rear end of the front annular plate 43, the inner edge of the rear end of the front annular plate 43 is connected to the front end of the inner annular riser 44, and the rear end of the inner annular riser 44 is connected to the inner edge of the front end of the rear annular plate 42;
taking the rear annular plate 42 as reference, the height of the inner annular riser 44 is less than the height of the outer annular riser 45, so that the front annular plate 43 forms a curved surface in the radial direction, with the center of the arc lying far in front of the front annular plate 43;
on the front face of the front annular plate 43, a first group of projection light sources 46 and a second group of projection light sources 47 are evenly distributed; the distance from the first group of projection light sources 46 to the axis is greater than the distance from the second group of projection light sources 47 to the axis, and the first group of projection light sources 46 is arranged alternately with the second group of projection light sources 47;
the first group of projection light sources 46 and the second group of projection light sources 47 each include light sources of three wavelengths.
This embodiment can compensate for signal source errors caused by the biological variability of the iris during iris image acquisition.
Fig. 4 shows the basic steps of identity recognition with the iris recognition terminal of this system; the position and movement nodes of parents, children and teachers and the attendance information can be well combined with iris recognition, forming a reliable business process and ensuring the reliability of the complete system and of the business process.
As shown in Fig. 5, this system can centrally manage the identity verification and authentication of the various roles in the business process, effectively ensuring the reliability of the business process.
With reference to Fig. 2, the main functions of the iris recognition server 03 include:
storing all data of all kindergarten students, parents and teachers (mainly student information, accurate information on the parents who pick up each child, teacher attendance information, etc.);
completing the classification and integration of the data with the kindergarten as the basic unit;
identifying the legitimacy of the accessing iris pick-up devices, and refusing connection to the cloud server by iris devices not authorized by our company;
backing up all data related to the kindergartens in real time.
The main functions of the iris recognition terminal 01 include:
automatically downloading the kindergarten data relevant to its own kindergarten from the cloud server according to the entered kindergarten information;
completing key operations such as kindergarten pick-up and teacher attendance in offline or networked mode;
completing iris acquisition for students, parents and teachers; updating the cloud server in real time after acquisition (when networked); storing more than 1000 iris template records (when offline); and automatically uploading all locally stored iris template data once the network connection is restored;
displaying pick-up data and attendance information;
copying all pick-up information and attendance information to a removable disk.
The main functions of the web-page interaction data formed by the pick-up safety service response server 05 include:
completing the detailed data entry of students, parents and teachers;
acquiring the iris template data of students, parents and teachers with a hand-held iris acquisition device;
querying the full information of all kindergartens.
The general flow of iris recognition obtains the tested iris features through iris image acquisition, iris preprocessing and feature extraction, and matches them against a sample iris feature database formed from iris samples to obtain a matching result.
The iris recognition preprocessing process is in turn divided into iris image denoising, pupil localization, iris inner-circle localization, iris outer-circle localization, iris normalization and iris image enhancement. In the iris image denoising stage, a median filter is generally used to remove image noise, but this filter does not work well for small noise points with a wide distribution, and because it uses a fixed threshold there is also a conflict between preserving detail and smoothing.
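As an illustration only (not code from the patent), this kind of median-filter denoising of the acquired grayscale image could be written with OpenCV; the file name and the 5 × 5 kernel size are assumptions:

```python
# Minimal sketch: median filtering of the acquired grayscale iris image.
# "eye.png" and the kernel size 5 are assumptions, not values from the patent.
import cv2

gray = cv2.imread("eye.png", cv2.IMREAD_GRAYSCALE)
denoised = cv2.medianBlur(gray, 5)   # removes isolated (salt-and-pepper) noise
```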
For iris localization, the following three algorithms are currently the most popular:
1. Combining the Canny operator with the Hough transform to locate the inner and outer circles of the iris and the pupil. This algorithm first performs edge detection on the inner and outer circles of the iris with the Canny edge detection operator, and then determines the inner and outer boundaries of the iris with the Hough transform. The localization accuracy is high, but the speed is slow.
2. Combining a binarization threshold with a least-squares operator for iris localization. This algorithm first segments the pupil with a binarization threshold to obtain the inner circle of the iris, then performs edge detection with the Canny algorithm and fits the outer circle with least squares. This algorithm is fast, but its localization accuracy for the inner and outer circles of the iris is not high.
3. Combining wavelets with a ring detection algorithm for iris localization. This algorithm first obtains the frequency-domain information of the iris and pupil through the different channels of the Haar wavelet, uses the Canny algorithm to detect the edges of the inner and outer circles of the iris, then locates the inner circle with the least-squares method and the outer circle with a circular template. In this algorithm the accuracy with which least squares determines the inner circle is not high enough, and the ring detection of the outer circle is slow, so the algorithm needs improvement.
To solve the problems of insufficient edge-detection precision, insufficient inner-circle accuracy and slow outer-circle localization faced by the wavelet plus ring-detection algorithm, a sub-pixel circle localization algorithm is adopted to locate the inner circle precisely, and the method of increasing the search-radius step is used to raise the detection speed of the circular template. Combining these two improvements yields a more precise and efficient iris preprocessing algorithm.
The precise and efficient iris image preprocessing method in the iris-recognition-based iris recognition terminal of this embodiment comprises the following steps:
a. acquiring an image of the iris of a human eye through a lens to obtain an original grayscale image containing the iris image;
b. using the Haar wavelet method to obtain the iris image portion of the low-frequency sub-band of the original grayscale image;
c. using the Canny operator to perform edge detection on the iris edges of the iris image portion;
d. using sub-pixel circle localization to precisely locate the iris inner circle;
e. using a circular template, combined with an increasing search-radius step, to precisely locate the iris outer circle.
This method can accurately identify the inner and outer boundaries of the iris and locate the iris position in the iris image quickly and precisely, providing a clean, noise-free image signal foundation for subsequent feature point recognition and feature matching.
Denoising the acquired original grayscale image with a median filtering algorithm between step a and step b is a useful but not mandatory step.
The specific data processing procedures are as follows.
1. Haar wavelet decomposition
Traditional signal analysis is built on the Fourier transform, but since the Fourier transform is a global transform it cannot analyze a signal locally in time and frequency, and local analysis is the key to analyzing non-stationary signals. The wavelet transform is a time-frequency analysis method with the characteristics of multiresolution analysis; it can characterize local features of a signal in both the time and frequency domains, and it is a time-frequency localization analysis method whose time window and frequency window can both change. In the low-frequency part it has a larger time window, i.e. lower time resolution and higher frequency resolution; in the high-frequency part it has higher time resolution and lower frequency resolution. For this reason the wavelet is called the microscope of signal analysis.
Because of the physiological characteristics of the iris itself, the edges of the iris are not very distinct, while iris localization needs a clear edge image. The wavelet's property of high-resolution analysis at low frequencies and low-resolution analysis at high frequencies can be exploited to eliminate high-frequency noise, reduce unnecessary computation on the iris image and emphasize the edge structure.
The present invention uses the Haar wavelet to decompose the iris image into four sub-bands, of which the low-frequency sub-band shows the edges of the pupil and the iris relatively clearly.
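As an illustration only (not code from the patent), a single-level 2-D Haar decomposition that keeps the low-frequency (LL) sub-band could be written with PyWavelets; the file name is an assumption:

```python
# Minimal sketch: one level of 2-D Haar wavelet decomposition; the LL sub-band
# is kept as a half-resolution grayscale image for the later localization steps.
import cv2
import pywt

gray = cv2.imread("eye.png", cv2.IMREAD_GRAYSCALE)
LL, (LH, HL, HH) = pywt.dwt2(gray, "haar")          # the four sub-bands
low_freq = cv2.normalize(LL, None, 0, 255, cv2.NORM_MINMAX).astype("uint8")
```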
2. Edge detection with the Canny operator
The edge contour of the iris image is a typical step edge, and step edges are detected effectively with first-derivative operators. The Canny edge detection operator is a good first-order operator; it can be approximated by the gradient of a Gaussian function and is, in theory, very close to the optimal edge operator formed by a linear combination of four exponential functions. The Canny operator has the advantages of high localization precision, single-pixel edges and good detection performance. With the Canny operator the edges of the sclera image and the iris image can be detected comparatively clearly.
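As an illustration only, the Canny step on the low-frequency sub-band image could look as follows; the file name and the 50/150 thresholds are assumptions, not values given in the patent:

```python
# Minimal sketch: Canny edge detection producing a binary map of the iris and
# pupil boundaries from the low-frequency sub-band image.
import cv2

sub_band = cv2.imread("eye_lowfreq.png", cv2.IMREAD_GRAYSCALE)
edges = cv2.Canny(sub_band, 50, 150)
```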
3. Sub-pixel inner-circle fine localization algorithm
This method uses a sub-pixel inner-circle fine localization algorithm. The algorithm first obtains, for each pixel edge point, the gray values of its neighboring points along the gradient direction, then computes the gray differences between the edge point and its neighbors along the gradient direction, and finally obtains the precise sub-pixel edge point by parameter fitting. The algorithm is divided into three parts: edge-point region division, gray-level interpolation and gray-difference calculation, and Gaussian curve fitting.
3.1 Edge-point region division
As shown in Fig. 6, according to the gray distribution characteristics of the circular edge, the gradient direction of an edge pixel of the circular image is the direction of the line joining the circle center and that edge pixel, and the coarsely located center is used to obtain the gradient direction of each edge point. O is the coarsely located center; let P(x, y) be any pixel edge point. According to the gray distribution characteristics of the circle, the gradient direction of P is the direction from the center to that point, i.e. the direction of the line OP. For convenience of analysis and computation, the circular edge points are divided into two regions as follows: the lines L1 and L2 make angles of 45° and -45° with the positive x-axis and intersect the circle at N, K, J and M. The circular edge points are thus divided into two parts: the edge pixels corresponding to ∠NOM and ∠KOJ, called the edge pixels of region 1, and the edge pixels corresponding to ∠NOK and ∠JOM, called the edge pixels of region 2.
3.2 Gray-level interpolation and gray-difference calculation
3.2.1 Definition of the neighboring points of an edge pixel along the gradient direction
As shown in Fig. 7, let the edge pixel be P(x_p, y_p) and assume that P lies in region 2; as in Fig. 6, L2 is the gradient-direction line through P. The lines y1, y2, y3, y4 are the four horizontal lines nearest to P, one pixel apart, intersecting the line L2 at the points B, A, C, D; the lines x1, x2, x3, x4 are the four vertical lines nearest to P, one pixel apart, intersecting the line L2 at the points D', C', B', A'. The two groups of points {B', A', C', D'} and {B, A, C, D} are chosen as candidate neighboring points of the edge point along the gradient direction. Since P lies in region 2, the gradient direction is mainly along the y axis, i.e. the absolute value of the slope of L2 is greater than 1; therefore the distances from the candidate neighbors {B', A', C', D'} to P are generally larger than the distances from the candidate neighbors {B, A, C, D} to P. If {B', A', C', D'} were chosen as the neighbors of P along the gradient direction, the edge localization would not be accurate enough, so {B, A, C, D} are chosen as its neighbors. Similarly, if P lies in region 1, its neighbors along the gradient direction should be {B', A', C', D'}.
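As an illustration only, the region test and the neighbor positions reduce to comparing the slope of the line from the coarse center to the edge point with 1; the function name and the two-pixel reach are assumptions:

```python
# Minimal sketch: compute the four neighbor points of edge pixel P = (xp, yp)
# along the gradient line from the coarse center O = (cx, cy). In region 2
# (|slope| > 1) the neighbors lie on the horizontal lines yp±1, yp±2; in
# region 1 they lie on the vertical lines xp±1, xp±2.
def gradient_neighbors(cx, cy, xp, yp):
    dx, dy = xp - cx, yp - cy
    if abs(dy) >= abs(dx):                       # region 2: mostly vertical
        step = dx / dy                           # x change per unit y along OP
        return [(xp + step * k, yp + k) for k in (-2, -1, 1, 2)]
    step = dy / dx                               # region 1: y change per unit x
    return [(xp + k, yp + step * k) for k in (-2, -1, 1, 2)]
```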
3.2.2 Obtaining the gray values of the neighboring points by gray-level interpolation
As shown in Fig. 8, O is the circle center, P is a pixel edge point, and the neighboring points of P along the gradient direction are labelled as above. Assume P lies in region 2; the center of each dashed square represents a whole pixel. The line OP is the gradient direction of P, and the intersection of the line L: y = y_p + 1 with the line OP is A, the first neighboring point on one side of P. As the position of P changes, the line through P and the center changes, so the coordinates of its intersection with the line L change and the gray value of the intersection changes accordingly. Since the intersection is not a whole pixel, its gray value must be obtained by linear gray interpolation. Take the gray value of the point A in region 2 as an example. The line OP intersects the line y = y_p + 1 at the point A; let the coordinates of A be (x_a, y_a), where y_a = y_p + 1 and x_a is obtained from the slope of the line OP and the ordinate y_a. The gray value of A is obtained by linear interpolation of the gray values of the two whole pixels A2 and A3 closest to A:
f(x_a, y_a) = (1 - λ) · f([x_a], y_a) + λ · f([x_a] + 1, y_a)    (formula 1)
where λ = x_a - [x_a], f(x, y) denotes the gray value of the pixel at coordinates (x, y), and [·] denotes the integer part, in units of pixels.
Similarly, the gray values of B, C and D in region 2 are obtained as f(x_b, y_b), f(x_c, y_c) and f(x_d, y_d).
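As an illustration only, the interpolation of formula 1 for a region-2 neighbor (integer y, non-integer x) could be written as below; the array name img is an assumption:

```python
# Minimal sketch: linear gray interpolation along x for a neighbor point whose
# y coordinate is an integer (region 2 case), following formula 1.
import math

def interp_gray(img, xa, ya):
    x0 = math.floor(xa)          # [xa], the integer part
    lam = xa - x0                # fractional part λ
    return (1.0 - lam) * float(img[int(ya), x0]) + lam * float(img[int(ya), x0 + 1])
```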
3.2.3 Gray differences between the edge point and its neighbors along the gradient direction
After the gray values of the neighboring points D, C, A, B of the pixel edge point P in the image have been obtained, gray differencing is carried out on P and the four points A, B, C, D. Considering the compactness of the differences, the mean of the forward difference and the backward difference is taken as the gray-difference value of each point. For example, for the point P(x_p, y_p) with gray value f(x_p, y_p), whose preceding neighbor along the gradient direction is C with gray value f(x_c, y_c) and whose following neighbor is A with gray value f(x_a, y_a), the gray-difference value of P(x_p, y_p) is
Δf(x_p, y_p) = [f(x_a, y_a) − f(x_c, y_c)] / 2    (formula 2)
3.3 Gaussian curve fitting
As shown in Fig. 9, for a circular image the gradient direction of a pixel edge point is the direction of the line from the center to that point, so a Gaussian curve fit of the pixel edge point and its neighbors along the gradient direction is enough to obtain the position of the sub-pixel edge point; the two-dimensional Gaussian surface fit can therefore be converted into a one-dimensional Gaussian curve fit. P is the pixel edge point and D, C, A, B are its neighbors along the gradient direction; the point P' corresponding to the peak M of the Gaussian curve is the true edge point position, and the distance between P and P' is δ. The expression of the one-dimensional Gaussian curve is:
f(x) = K · exp(−(x − μ)² / (2σ²))    (formula 3)
where μ is the mean and σ is the standard deviation. For convenience of computation, taking the logarithm of both sides of formula (3) converts it into the quadratic form
ln f(x) = A·x² + B·x + C    (formula 4)
whose vertex lies at x = μ.
According to the square-aperture sampling theorem, the gray (difference) value recorded at a sample point is the integral of the continuous distribution over the unit pixel aperture, i.e.
f_i = ∫ from i−1/2 to i+1/2 of f(x) dx    (formula 5)
Let the index of the edge point P be 0, with gray difference f_0, and let its neighbors D, C, A, B have indices −2, −1, 1, 2 and gray differences f_−2, f_−1, f_1, f_2 respectively. Substituting the five points D, C, P, A, B into formula (5) gives equations (6)–(10).
From equations (6)–(10), solving the simultaneous equations by least squares yields A, B and C as expressions in f_−2, f_−1, f_1, f_2 and f_0; substituting them into the parabola vertex coordinate δ = −B/(2A) gives the sub-pixel offset δ (formula 11).
From δ and the slope of the gradient-direction line through the pixel edge point P, the coordinate differences δ_x and δ_y of P' relative to P in the x and y directions can be obtained. Thus, for the pixel P(x_p, y_p), the corresponding sub-pixel edge point is P'(x_p + δ_x, y_p + δ_y).
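As an illustration only, the vertex of the log-parabola can be obtained numerically instead of through the closed form of formula (11); the helper name and the small epsilon are assumptions:

```python
# Minimal sketch: estimate the sub-pixel offset δ from the gray differences at
# positions -2, -1, 0, 1, 2 along the gradient direction by least-squares
# fitting a parabola to the logarithm of their magnitudes and taking the
# vertex -B/(2A). The closed-form expression of formula (11) is not reproduced.
import numpy as np

def subpixel_offset(f_m2, f_m1, f_0, f_p1, f_p2):
    x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
    y = np.log(np.abs(np.array([f_m2, f_m1, f_0, f_p1, f_p2])) + 1e-12)
    A, B, C = np.polyfit(x, y, 2)      # least-squares fit of ln f ≈ A x² + B x + C
    return -B / (2.0 * A)              # vertex of the fitted parabola = δ
```

The offset δ is then projected onto the x and y axes with the slope of the line OP (δ_x = δ·cosθ, δ_y = δ·sinθ, where θ is the angle of the gradient direction).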
3.4 Steps of the sub-pixel localization algorithm
1) For each pixel edge point, obtain the gray values of its neighboring points along the gradient direction by gray-level interpolation.
2) Compute the gray differences of the pixel edge point and of its neighbors along the gradient direction according to formula (2).
3) Substitute the gray differences of the points in 2) into formula (11) to obtain the offset δ; combined with the slope of the gradient-direction line through the pixel edge point P, the coordinate differences of P' and P in the x and y directions can be obtained.
4) For P(x_p, y_p), the corresponding sub-pixel edge point is P'(x_p + δ_x, y_p + δ_y).
5) Process all pixel edge points in turn to obtain the coordinates of their corresponding true sub-pixel edge points (a sketch combining the earlier helper routines follows these steps).
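As an illustration only, the loop over the edge points could combine the helper sketches above (gradient_neighbors, interp_gray, subpixel_offset — all assumptions, not the patent's code); using one-sided differences at the end samples is a simplification of formula (2):

```python
# Minimal sketch: refine every Canny edge point of the inner circle to sub-pixel
# precision. `img` is the grayscale image, `edge_points` the (x, y) pixel edge
# points and (cx, cy) the coarsely located pupil center.
import math
import numpy as np

def refine_edge_points(img, edge_points, cx, cy):
    refined = []
    for xp, yp in edge_points:
        nbrs = gradient_neighbors(cx, cy, xp, yp)              # D, C, A, B
        g = [interp_gray(img, x, y) for x, y in nbrs[:2]]      # f(-2), f(-1)
        g.append(float(img[yp, xp]))                           # f(0)
        g += [interp_gray(img, x, y) for x, y in nbrs[2:]]     # f(+1), f(+2)
        d = np.gradient(np.array(g))                           # gray differences
        delta = subpixel_offset(d[0], d[1], d[2], d[3], d[4])
        norm = math.hypot(xp - cx, yp - cy) or 1.0             # project δ onto x, y
        refined.append((xp + delta * (xp - cx) / norm,
                        yp + delta * (yp - cy) / norm))
    return refined
```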
3.5 Effect of the sub-pixel inner-circle localization
As shown in Fig. 10 and Fig. 11, the sub-pixel inner-circle localization is clearly better than the Canny-operator and ring-detection localization; the sub-pixel inner-circle localization precision is visibly higher.
4. Outer-circle localization with the improved circular template
Because the iris has a fairly good circular shape, the following operator can be used to locate the outer edge:
max over (r, x0, y0) of | G_σ(r) * ∂/∂r ∮ over (r, x0, y0) of I(x, y)/(2πr) ds |
where I(x, y) is the gray value of a point on the circle of radius r centered at (x0, y0); this circular integral reflects the gray change at the boundary between the iris and the sclera. Convolution with a Gaussian function further removes noise and sharpens the gradient change at the boundary, which benefits parameter extraction. The above formula is essentially a circular edge detector at one scale, and its localization process is a continuous iterative search for the optimal solution in the (r, x0, y0) parameter space. This operator has to search the center and radius over a large range, and the search is carried out on the basis of evaluating gray differences, so its computation cost is obviously large. The present invention improves it by narrowing the search range of the operator (a code sketch of the reduced search follows the steps below). The specific steps are as follows:
1) taking a 10 × 10 matrix of points centered on the pupil center as the range of candidate iris centers;
2) taking each point in turn as a center and, starting from the inner-boundary radius, increasing the radius by a fixed step, thereby producing a series of rings;
3) selecting points on each ring at a fixed angular interval, so that every ring yields the same number of points at the same angles; reading the gray values of these points and adding them up, so that each ring has a gray-value sum;
4) comparing the gray-value sums of adjacent rings, and recording the center and radius of the ring where the change of the gray-value sum is largest;
5) obtaining the maximum gray-value-sum change for every candidate center, and selecting the center and radius corresponding to the largest of these changes as the center and radius of the iris.
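As an illustration only (not the patent's code), this coarse-to-fine search could be written as below; the grid spacing, the radius step of 2 pixels and the 64 sample angles are assumptions:

```python
# Minimal sketch: outer-circle search over a 10 x 10 grid of candidate centers
# around the pupil center; the radius grows in fixed steps, the gray values on
# each ring are summed, and the ring with the largest change of the sum between
# adjacent radii marks the outer boundary.
import numpy as np

def locate_outer_circle(img, pupil_x, pupil_y, inner_r, max_r, r_step=2, n_angles=64):
    h, w = img.shape
    angles = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
    best = (0.0, pupil_x, pupil_y, inner_r)                # (change, cx, cy, r)
    for cx in range(pupil_x - 5, pupil_x + 5):             # 10 x 10 candidate centers
        for cy in range(pupil_y - 5, pupil_y + 5):
            prev_sum = None
            for r in range(inner_r, max_r, r_step):
                xs = np.clip((cx + r * np.cos(angles)).astype(int), 0, w - 1)
                ys = np.clip((cy + r * np.sin(angles)).astype(int), 0, h - 1)
                ring_sum = float(img[ys, xs].sum())        # gray-value sum of the ring
                if prev_sum is not None and abs(ring_sum - prev_sum) > best[0]:
                    best = (abs(ring_sum - prev_sum), cx, cy, r)
                prev_sum = ring_sum
    _, cx, cy, r = best
    return cx, cy, r
```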
5. Iris normalization
The purpose of iris normalization is to adjust every original image to the same size and corresponding position, thereby eliminating the effects of translation, scaling and rotation on iris recognition. The present invention uses a conventional iris normalization algorithm.
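The patent does not specify which conventional normalization algorithm is used; as an illustration only, a common choice is to unwrap the annulus between the inner and outer circles into a fixed-size polar rectangle. The concentric-circle assumption and the 64 × 512 output size below are assumptions:

```python
# Minimal sketch: unwrap the iris ring between the inner circle (cx, cy, r_in)
# and an (assumed concentric) outer circle of radius r_out into a fixed-size
# radial x angular strip.
import numpy as np

def normalize_iris(img, cx, cy, r_in, r_out, radial=64, angular=512):
    h, w = img.shape
    thetas = np.linspace(0.0, 2.0 * np.pi, angular, endpoint=False)
    radii = np.linspace(r_in, r_out, radial)
    rr, tt = np.meshgrid(radii, thetas, indexing="ij")
    xs = np.clip((cx + rr * np.cos(tt)).astype(int), 0, w - 1)
    ys = np.clip((cy + rr * np.sin(tt)).astype(int), 0, h - 1)
    return img[ys, xs]          # normalized radial x angular iris strip
```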
The above are only preferred specific embodiments of the present invention, but the scope of protection of the present invention is not limited thereto. Any change or replacement that a person familiar with the art can readily conceive within the technical scope disclosed by the invention shall be covered by the scope of protection of the present invention. Therefore, the scope of protection of the present invention shall be determined by the scope of protection of the claims.

Claims (9)

1. A preprocessing method for iris recognition, comprising the following steps:
a. acquiring an image of the iris of a human eye through a lens to obtain an original grayscale image containing the iris image;
b. using the Haar wavelet method to obtain the iris image portion of the low-frequency sub-band of the original grayscale image;
c. using the Canny operator to perform edge detection on the iris edges of the iris image portion;
d. using sub-pixel circle localization to precisely locate the iris inner circle;
e. using a circular template, combined with an increasing search-radius step, to precisely locate the iris outer circle.
2. The preprocessing method for iris recognition of claim 1, wherein step a comprises the following step:
denoising the acquired original grayscale image with a median filtering algorithm.
3. The preprocessing method for iris recognition of claim 1, wherein step b comprises the following step:
applying the Haar wavelet transform to the original grayscale image to form four sub-bands, and selecting the grayscale image data formed by the low-frequency sub-band.
4. The preprocessing method for iris recognition of claim 1, wherein step d comprises the following step:
first obtaining, for each pixel edge point, the gray values of its neighboring points along the gradient direction, and dividing the edge points into regions.
5. The preprocessing method for iris recognition of claim 1, wherein step d comprises the following step:
then obtaining the gray difference between the edge point and its neighboring points along the gradient direction, by performing gray-level interpolation and gray-difference calculation.
6. The preprocessing method for iris recognition of claim 1, wherein step d comprises the following step:
then using a parameter fitting method to obtain the precise sub-pixel edge point, by performing Gaussian curve fitting.
7. The preprocessing method for iris recognition of claim 1, wherein step e comprises the following steps:
1) taking a 10 × 10 matrix of points centered on the pupil center as the range of candidate iris centers;
2) taking each point in turn as a center and, starting from the inner-boundary radius, increasing the radius by a fixed step, thereby producing a series of rings;
3) selecting points on each ring at a fixed angular interval, so that every ring yields the same number of points at the same angles; reading the gray values of these points and adding them up, so that each ring has a gray-value sum;
4) comparing the gray-value sums of adjacent rings, and recording the center and radius of the ring where the change of the gray-value sum is largest;
5) obtaining the maximum gray-value-sum change for every candidate center, and selecting the center and radius corresponding to the largest of these changes as the center and radius of the iris.
8. The preprocessing method for iris recognition of any one of claims 1 to 7, further comprising normalization of the iris data.
9. An iris recognition terminal using the preprocessing method for iris recognition of any one of claims 1 to 8.
CN201610265148.7A 2016-04-26 2016-04-26 Pretreatment method and device for iris recognition Pending CN105956536A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610265148.7A CN105956536A (en) 2016-04-26 2016-04-26 Pretreatment method and device for iris recognition


Publications (1)

Publication Number Publication Date
CN105956536A (en) 2016-09-21

Family

ID=56916795

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610265148.7A Pending CN105956536A (en) 2016-04-26 2016-04-26 Pretreatment method and device for iris recognition

Country Status (1)

Country Link
CN (1) CN105956536A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20020023011A (en) * 2000-09-22 2002-03-28 박형근 Human iris recognition method using harr wavelet transform and lvq
CN101334263A (en) * 2008-07-22 2008-12-31 东南大学 Circular target circular center positioning method
CN105225216A (en) * 2014-06-19 2016-01-06 江苏天穗农业科技有限公司 Based on the Iris preprocessing algorithm of space apart from circle mark rim detection

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
何玉峰: "虹膜识别预处理算法的研究及实现" (Research and Implementation of Iris Recognition Preprocessing Algorithms), China Master's Theses Full-text Database, Information Science and Technology *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106778499B (en) * 2016-11-24 2020-06-26 江苏大学 Method for rapidly positioning human iris in iris acquisition process
CN106778499A (en) * 2016-11-24 2017-05-31 江苏大学 A kind of method of quick positioning people's eye iris during iris capturing
CN108830239A (en) * 2018-06-21 2018-11-16 朱秀娈 Vehicle fingerprint identifies trigger mechanism
CN109325462B (en) * 2018-10-11 2021-03-12 深圳斐视沃德科技有限公司 Face recognition living body detection method and device based on iris
CN109325462A (en) * 2018-10-11 2019-02-12 深圳斐视沃德科技有限公司 Recognition of face biopsy method and device based on iris
CN110223473A (en) * 2019-05-08 2019-09-10 苏州凸现信息科技有限公司 A kind of safety defense monitoring system and its working method based on the identification of multinomial changing features
CN110309814A (en) * 2019-07-11 2019-10-08 中国工商银行股份有限公司 A kind of iris identification method and device based on edge detection
CN110619272A (en) * 2019-08-14 2019-12-27 中山市奥珀金属制品有限公司 Iris image segmentation method
CN112163507A (en) * 2020-09-25 2021-01-01 北方工业大学 Lightweight iris recognition system facing mobile terminal
CN112163507B (en) * 2020-09-25 2024-03-05 北方工业大学 Mobile-end-oriented lightweight iris recognition system
CN112414552A (en) * 2020-11-24 2021-02-26 西南交通大学 Body temperature detection device for elevator and body temperature calculation method thereof
CN113192120A (en) * 2021-04-25 2021-07-30 无锡信捷电气股份有限公司 Circle positioning algorithm based on two-dimensional edge measurement and least square principle
CN116740796A (en) * 2022-05-24 2023-09-12 湖南金康光电有限公司 Iris recognition method and system

Similar Documents

Publication Publication Date Title
CN105956536A (en) Pretreatment method and device for iris recognition
CN205750807U (en) A kind of kindergarten ensuring pick-up safety system based on iris identification
US11263432B2 (en) Systems and methods for performing fingerprint based user authentication using imagery captured using mobile devices
US10339362B2 (en) Systems and methods for performing fingerprint based user authentication using imagery captured using mobile devices
KR102561723B1 (en) System and method for performing fingerprint-based user authentication using images captured using a mobile device
US20190388182A1 (en) Tracking surgical items with prediction of duplicate imaging of items
WO2020125499A1 (en) Operation prompting method and glasses
Neal et al. Measuring shape
CN103093215B (en) Human-eye positioning method and device
US20160019421A1 (en) Multispectral eye analysis for identity authentication
US20170091550A1 (en) Multispectral eye analysis for identity authentication
US20160019420A1 (en) Multispectral eye analysis for identity authentication
CN110210276A (en) A kind of motion track acquisition methods and its equipment, storage medium, terminal
CN107438854A (en) The system and method that the image captured using mobile device performs the user authentication based on fingerprint
CN107292242A (en) A kind of iris identification method and terminal
CN103902958A (en) Method for face recognition
US20210089763A1 (en) Animal identification based on unique nose patterns
CN106980852A (en) Based on Corner Detection and the medicine identifying system matched and its recognition methods
CN1885314A (en) Pre-processing method for iris image
CN110766656B (en) Method, device, equipment and storage medium for screening fundus macular region abnormality
CN109993090B (en) Iris center positioning method based on cascade regression forest and image gray scale features
Rafik et al. Application of metaheuristic for optimization of iris Image segmentation by using evaluation Hough Transform and methods Daugman
She et al. Lawn plant identification and segmentation based on least squares support vector machine and multifeature fusion
CN206363347U (en) Based on Corner Detection and the medicine identifying system that matches
Hashim et al. Fast Iris localization based on image algebra and morphological operations

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20160921