CN104732202A - Method for eliminating influence of glasses frame during human eye detection - Google Patents
- Publication number
- CN104732202A CN104732202A CN201510074487.2A CN201510074487A CN104732202A CN 104732202 A CN104732202 A CN 104732202A CN 201510074487 A CN201510074487 A CN 201510074487A CN 104732202 A CN104732202 A CN 104732202A
- Authority
- CN
- China
- Prior art keywords
- eye
- human eye
- binarization
- circle
- glasses frame
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Abstract
The invention relates to a method for eliminating the influence of a glasses frame during human eye detection. In current human eye detection, the glasses frame often interferes with eye localization. In this method, starting from the face eye region obtained with an AdaBoost detection algorithm, vertical and horizontal gray projection is first applied to the eye region, and strip-shaped regions of the left and right eyes are extracted using the inherent features of the glasses frame. Next, initial binarization is applied to each strip region; exploiting the near-circular shape of the iris and pupil, adaptive binarization with circle fitting is used to adjust the binarization threshold, and the strip regions are binarized again. This process is iterated, and the circle centres finally obtained by circle fitting on the binarized left and right strip regions are taken as the position coordinates of the left and right eyes. The method fully exploits the structural information of the eyes and the inherent features of the glasses frame, effectively avoiding eye localization errors caused by the frame.
Description
Technical field
The invention belongs to the technical field of biometric recognition and information security, and specifically relates to a method for eliminating the influence of a glasses frame in human eye detection.
Background technology
Biometric recognition technology, owing to the intrinsic properties of biological characteristics (fingerprints, irises, face images, etc.), has made it possible to surpass and replace traditional identity recognition techniques, and it has begun to be promoted and deployed in specific application areas in some countries and regions. Eye-based recognition can identify a target without the target noticing the observation, so identification remains possible even when the target is uncooperative, giving face recognition technology the potential to help build a safer social environment. The human eye, as a key part of the face, is important evidence for biometric recognition, and the accuracy of eye detection and localization directly affects the reliability of face recognition and fatigue detection. However, eye localization accuracy is affected by many factors, and under the practical conditions of unconstrained face recognition the glasses frame is one of the main causes of localization errors. The projection function method and the Hough transform method, two typical eye localization approaches, both have limitations in handling glasses frames. The projection function method locates the eye from gray-level changes in the image; although it is simple to implement and computationally cheap, the frame lies close to the eye, so the frame's gray-level changes interfere with eye localization and the results are poor. The Hough transform method locates the eye using the circular shape of the iris; although it is accurate, it works well only when the iris is complete, and a complete iris is hard to capture because of lens reflections, the frame, and eyelash occlusion.
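To make the projection method's weakness concrete, here is a minimal sketch (NumPy, with a toy 8x8 image invented for illustration, not data from the patent) showing how the gray projection of a dark frame edge can out-compete the eye row:

```python
import numpy as np

def horizontal_gray_projection(img):
    """Mean gray value of each row (projection perpendicular to the nose bridge)."""
    return img.mean(axis=1)

# Toy "eye region": bright background, one dark row for a glasses-frame
# edge and one dark row for the eye itself (values are illustrative).
img = np.full((8, 8), 200.0)
img[2, :] = 40.0   # glasses-frame edge (darker than the eye row)
img[5, :] = 60.0   # eye row

proj = horizontal_gray_projection(img)
print(int(proj.argmin()))  # 2 -- the frame row wins, not the eye row (5)
```

Because the frame sits so close to the eye and is often darker, the global projection minimum lands on the frame, which is exactly the mislocalization the invention targets.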
To date, no published literature addresses eliminating the interference of glasses frames with eye localization.
Summary of the invention
The object of the present invention is to solve the problem of glasses-frame interference with eye localization in human eye detection, by providing a method for eliminating the influence of the glasses frame.
The method comprises the following concrete steps:
Step (1): On the basis of the face eye region obtained with the AdaBoost detection algorithm, apply vertical gray projection to the detected eye region, i.e., project along the nose-bridge direction; split the region into left and right eye regions at the maximum point in the central part of the projection distribution. Non-grayscale images are first converted to grayscale.
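Step (1) can be sketched as follows (NumPy; the middle-half search range for the nose-bridge maximum is an assumption, since the source only says "the central part of the projection distribution"):

```python
import numpy as np

def split_eye_regions(face_eye_region):
    """Split a detected eye region into left/right halves at the nose bridge.

    Vertical gray projection (one mean per column, projecting along the
    nose-bridge direction); the bright nose bridge yields a maximum near
    the centre, which is used as the dividing column.
    """
    proj = face_eye_region.mean(axis=0)          # one value per column
    w = face_eye_region.shape[1]
    lo, hi = w // 4, 3 * w // 4                  # assumed central search range
    split = lo + int(np.argmax(proj[lo:hi]))     # brightest column = nose bridge
    return face_eye_region[:, :split], face_eye_region[:, split:]

# Toy region: two dark eyes with a bright bridge column between them.
region = np.full((6, 12), 120.0)
region[:, 2:4] = 30.0     # left eye
region[:, 8:10] = 30.0    # right eye
region[:, 6] = 220.0      # bright nose bridge
left, right = split_eye_regions(region)
print(left.shape[1], right.shape[1])  # 6 6 (split at column 6)
```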
Step (2): Apply horizontal gray projection to the left and right eye regions separately, i.e., project perpendicular to the nose-bridge direction; using the minimum points in the central part of the projection distribution, combined with the geometric features of the human eye, determine the horizontal band containing the eye and extract a strip-shaped region for each eye.
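A sketch of step (2), assuming a simple 3-tap moving average as the low-pass filter and an illustrative strip half-height (neither value is specified at this point in the source):

```python
import numpy as np

def eye_strip(eye_region, strip_half_height=2):
    """Extract a horizontal strip around the eye.

    Horizontal gray projection (mean of each row, projecting perpendicular
    to the nose bridge) is smoothed with a small moving average, then the
    minimum nearest the vertical centre is taken as the eye row.
    """
    proj = eye_region.mean(axis=1)
    smooth = np.convolve(proj, np.ones(3) / 3.0, mode="same")  # low-pass filter
    h = eye_region.shape[0]
    mids = np.arange(h // 4, 3 * h // 4)        # candidate minima near the centre
    row = int(mids[np.argmin(smooth[mids])])
    top = max(0, row - strip_half_height)
    bot = min(h, row + strip_half_height + 1)
    return eye_region[top:bot, :], row

# Toy eye region: two dark rows near the vertical centre mark the eye.
region = np.full((12, 10), 180.0)
region[6:8, :] = 30.0
strip, row = eye_strip(region)
print(row, strip.shape[0])  # 6 5
```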
Step (3): Apply initial binarization to the left and right eye strip regions; then, exploiting the near-circular structure of the iris and pupil, fit circles in the binary image, adaptively adjust the binarization threshold according to the circle fitting degree, and binarize again, iterating until the fitting degree reaches a preset value.
The binarization starts from an initial threshold, which is adaptively adjusted using binary-image circle fitting; the centre of the circle whose fitting degree finally meets the requirement is taken as the eye position coordinate.
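The threshold iteration of step (3) can be sketched as below. The connected-component search, the equivalent-radius circle fit, and the "fitting degree" metric (fraction of component pixels inside the fitted circle) are illustrative stand-ins, since the patent does not prescribe particular algorithms for them; the default parameters mirror the embodiment (initial threshold 1, step 2, minimum 40 pixels, target degree 0.9).

```python
import numpy as np
from collections import deque

def components(mask):
    """4-connected components of True pixels (plain BFS; a stand-in for
    any connected-region search)."""
    seen = np.zeros_like(mask, dtype=bool)
    comps = []
    h, w = mask.shape
    for i in range(h):
        for j in range(w):
            if mask[i, j] and not seen[i, j]:
                seen[i, j] = True
                queue, comp = deque([(i, j)]), []
                while queue:
                    y, x = queue.popleft()
                    comp.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            queue.append((ny, nx))
                comps.append(np.array(comp))
    return comps

def circle_fit_degree(comp):
    """Fit a circle as centroid + equivalent radius; return the fraction of
    the component's pixels inside it (1.0 for a solid disc, lower for
    elongated frame fragments). Illustrative metric only."""
    centre = comp.mean(axis=0)
    r = np.sqrt(len(comp) / np.pi)
    inside = np.sum(np.hypot(comp[:, 0] - centre[0], comp[:, 1] - centre[1]) <= r)
    return inside / len(comp), centre

def locate_eye(strip, t0=1, step=2, min_pixels=40, target=0.9, t_max=255):
    """Raise the binarization threshold until the largest dark component
    fits a circle well enough; return (centre, final threshold)."""
    t = t0
    while t <= t_max:
        dark = strip < t                      # 0-valued pixels after binarization
        big = [c for c in components(dark) if len(c) >= min_pixels]
        if big:
            degree, centre = circle_fit_degree(max(big, key=len))
            if degree >= target:
                return centre, t
        t += step
    return None, t

# Toy strip: bright background with a dark disc (radius 5) as the pupil/iris.
strip = np.full((20, 40), 200.0)
yy, xx = np.mgrid[0:20, 0:40]
strip[(yy - 10) ** 2 + (xx - 15) ** 2 <= 25] = 20.0

centre, t = locate_eye(strip)
print(int(centre[0]), int(centre[1]), t)  # 10 15 21
```

A strip-shaped frame fragment never reaches the target degree, so the threshold keeps rising until the round pupil/iris blob dominates, which is the core of the adaptive scheme.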
The AdaBoost algorithm used to detect the eye region is trained with eye regions and non-eye regions as positive and negative samples, and the glasses-frame elimination is carried out within the detected eye region. The key of the invention is to exploit the difference between the near-circular eye and the strip-shaped frame: through adaptive binarization with circle fitting, the influence of the frame on eye localization is effectively eliminated.
The invention uses the inherent features of the glasses frame, with the near-circular shape of the eye as an auxiliary cue, to eliminate the adverse effect of the frame on localization; it fully exploits the structural information of the eye and, combined with the fixed frame shape of the glasses, effectively avoids eye localization errors caused by the frame.
Embodiment:
The invention is further described below with reference to an embodiment.
A method for eliminating the influence of a glasses frame in human eye detection, as follows:
Step 1: On the basis of the face eye region obtained with the AdaBoost detection algorithm, first apply vertical and horizontal gray projection within the eye region, and extract the left and right eye strip regions using the inherent features of the frame. Next, apply initial binarization to each strip region; using the near-circular shape of the iris and pupil, adjust the binarization threshold by adaptive binarization with circle fitting, and binarize the strip regions again. Iterate this process; the circle centres finally obtained by circle fitting on the binarized left and right strip regions are the position coordinates of the left and right eyes.
In this embodiment, the detailed process of step 1 (splitting the left and right eye regions) is as follows: the width of the eye region obtained by the AdaBoost detection algorithm (generally a rectangle) is measured perpendicular to the nose-bridge direction, and its height along the nose-bridge direction; after vertical gray projection, the maximum within a central range of the projection is taken as the dividing line between the left and right eyes, and the region is split into left and right eye regions accordingly.
Step 2, extraction of the eye strip region: first apply horizontal gray projection and, after low-pass filtering, extract all minimum points of the projection. Each minimum point corresponds to a U-shaped valley; select the minimum point with the largest U-shaped opening that lies near the vertical centre of the region, and take a strip region around it as the eye strip region, according to the point's position and the width of its opening. The left and right eye regions are processed identically.
Step 3, locating the eye by adaptive circle fitting: binarize the extracted eye strip region, starting from a small binarization threshold (initially 1 in this embodiment); pixels whose value is below the threshold are set to 0, all others to 1. Then search for all connected regions of 0-valued pixels and check whether the number of pixels in a region exceeds a preset count (usually the square of half the strip height, rounded to an integer; 40 in this embodiment). For regions that qualify, compute the circle fitting degree: if the region containing the most pixels has a fitting degree of at least 0.9, take the centre of the fitted circle as the eye centre and mark the eye there. Otherwise, increase the binarization threshold by a fixed step (2 in this embodiment) and repeat the above decision process, iterating until the requirement is met. The left and right eye strip regions are processed identically, yielding the two eye centres and thus locating the eyes.
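The "circle fitting" itself is not pinned to a specific algorithm in the embodiment; one standard choice is the Kåsa least-squares fit, sketched here (NumPy, with synthetic points invented for the check):

```python
import numpy as np

def kasa_circle_fit(points):
    """Least-squares (Kasa) circle fit: solve x^2 + y^2 = a*x + b*y + c,
    then centre = (a/2, b/2) and radius = sqrt(c + a^2/4 + b^2/4)."""
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = x ** 2 + y ** 2
    a, b, c = np.linalg.lstsq(A, rhs, rcond=None)[0]
    cx, cy = a / 2.0, b / 2.0
    return (cx, cy), np.sqrt(c + cx ** 2 + cy ** 2)

# 36 synthetic points on a circle of radius 3 centred at (10, 5).
theta = np.linspace(0.0, 2.0 * np.pi, 36, endpoint=False)
pts = np.column_stack([10 + 3 * np.cos(theta), 5 + 3 * np.sin(theta)])
(cx, cy), r = kasa_circle_fit(pts)
print(round(cx, 3), round(cy, 3), round(r, 3))  # 10.0 5.0 3.0
```

Applied to the pixels of a qualifying connected region, such a fit yields the circle centre that the embodiment takes as the eye position.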
Claims (1)
1. A method for eliminating the influence of a glasses frame in human eye detection, characterized in that the method comprises the following concrete steps:
Step (1): On the basis of the face eye region obtained with the AdaBoost detection algorithm, apply vertical gray projection to the detected eye region, i.e., project along the nose-bridge direction; split the region into left and right eye regions at the maximum point in the central part of the projection distribution; non-grayscale images are first converted to grayscale.
Step (2): Apply horizontal gray projection to the left and right eye regions separately, i.e., project perpendicular to the nose-bridge direction; using the minimum points in the central part of the projection distribution, combined with the geometric features of the human eye, determine the horizontal band containing the eye and extract a strip-shaped region for each eye.
Step (3): Apply initial binarization to the left and right eye strip regions; then, exploiting the near-circular structure of the iris and pupil, fit circles in the binary image, adaptively adjust the binarization threshold according to the circle fitting degree, and binarize again, iterating until the fitting degree reaches a preset value.
The binarization starts from an initial threshold, which is adaptively adjusted using binary-image circle fitting; the centre of the circle whose fitting degree finally meets the requirement is taken as the eye position coordinate.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510074487.2A CN104732202A (en) | 2015-02-12 | 2015-02-12 | Method for eliminating influence of glasses frame during human eye detection |
Publications (1)
Publication Number | Publication Date |
---|---|
CN104732202A true CN104732202A (en) | 2015-06-24 |
Family
ID=53456075
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510074487.2A Pending CN104732202A (en) | 2015-02-12 | 2015-02-12 | Method for eliminating influence of glasses frame during human eye detection |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN104732202A (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101383001A (en) * | 2008-10-17 | 2009-03-11 | 中山大学 | Quick and precise front human face discriminating method |
CN101984453A (en) * | 2010-11-02 | 2011-03-09 | 中国科学技术大学 | Human eye recognition system and method |
CN102324166A (en) * | 2011-09-19 | 2012-01-18 | 深圳市汉华安道科技有限责任公司 | Fatigue driving detection method and device |
CN103035931A (en) * | 2011-10-04 | 2013-04-10 | 住友电气工业株式会社 | Cell frame, cell stack and redox flow battery |
CN103093215A (en) * | 2013-02-01 | 2013-05-08 | 北京天诚盛业科技有限公司 | Eye location method and device |
WO2014169441A1 (en) * | 2013-04-16 | 2014-10-23 | Thomson Licensing | Method and system for eye tracking using combination of detection and motion estimation |
- 2015-02-12: application CN201510074487.2A filed in China, published as CN104732202A; status Pending
Non-Patent Citations (3)
Title |
---|
吴国龙 et al.: "Iris localization method under non-ideal imaging conditions", 《计算机工程与设计》 (Computer Engineering and Design) * |
孙艳秋: "A simple and fast human eye localization method", 《赤峰学院学报(自然科学版)》 (Journal of Chifeng University, Natural Science Edition) * |
李爱平 et al.: "Human eye localization algorithm based on gray projection and improved Hough transform", 《电子设计工程》 (Electronic Design Engineering) * |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019144710A1 (en) * | 2018-01-23 | 2019-08-01 | 北京七鑫易维信息技术有限公司 | Method and apparatus for determining position of pupil |
TWI714952B (en) * | 2018-01-23 | 2021-01-01 | 大陸商北京七鑫易維信息技術有限公司 | Method and device for determining pupil position |
US10949991B2 (en) | 2018-01-23 | 2021-03-16 | Beijing 7Invensun Technology Co., Ltd. | Method and apparatus for determining position of pupil |
CN109101856A (en) * | 2018-09-25 | 2018-12-28 | 广东工业大学 | A kind of image in 2 D code recognition methods and device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Gu et al. | Feature points extraction from faces | |
CN106339702A (en) | Multi-feature fusion based face identification method | |
CN105956578A (en) | Face verification method based on identity document information | |
CN104036278B (en) | The extracting method of face algorithm standard rules face image | |
CN103679118A (en) | Human face in-vivo detection method and system | |
Abate et al. | BIRD: Watershed based iris detection for mobile devices | |
CN107066969A (en) | A kind of face identification method | |
Oh et al. | Extracting sclera features for cancelable identity verification | |
CN102867179A (en) | Method for detecting acquisition quality of digital certificate photo | |
CN103902962A (en) | Shielding or light source self-adaption human face recognition method and device | |
KR20090093223A (en) | Removal Eye Glasses using Variable Mask and Inpainting for Improved Performance of Face Recognition System | |
CN104915656A (en) | Quick human face recognition method based on binocular vision measurement technology | |
CN105160331A (en) | Hidden Markov model based face geometrical feature identification method | |
CN106203338B (en) | Human eye state method for quickly identifying based on net region segmentation and threshold adaptive | |
CN110705454A (en) | Face recognition method with living body detection function | |
CN105631285A (en) | Biological feature identity recognition method and apparatus | |
Chen et al. | A robust segmentation approach to iris recognition based on video | |
CN111860453A (en) | Face recognition method for mask | |
CN106778499B (en) | Method for rapidly positioning human iris in iris acquisition process | |
Dong et al. | Eye detection based on integral projection and hough round transform | |
CN104573628A (en) | Three-dimensional face recognition method | |
CN104732202A (en) | Method for eliminating influence of glasses frame during human eye detection | |
CN109409223A (en) | A kind of iris locating method | |
CN103020599A (en) | Identity authentication method based on face | |
CN105335695A (en) | Glasses detection based eye positioning method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | ||
Application publication date: 20150624 |