CN109614858A - Pupil center detection method and device - Google Patents

Pupil center detection method and device

Info

Publication number
CN109614858A
CN109614858A, CN201811291708.1A
Authority
CN
China
Prior art keywords
pupil
region
point
boundary point
center
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811291708.1A
Other languages
Chinese (zh)
Other versions
CN109614858B (en)
Inventor
李峰
赵亚丽
曹凯
文斌
高扬
王建璞
冯涛
朱江
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Spaceflight Morning Letter Technology Co Ltd
Original Assignee
Beijing Spaceflight Morning Letter Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Spaceflight Morning Letter Technology Co Ltd filed Critical Beijing Spaceflight Morning Letter Technology Co Ltd
Priority to CN201811291708.1A priority Critical patent/CN109614858B/en
Publication of CN109614858A publication Critical patent/CN109614858A/en
Application granted granted Critical
Publication of CN109614858B publication Critical patent/CN109614858B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 - Eye characteristics, e.g. of the iris
    • G06V40/193 - Preprocessing; Feature extraction
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/20 - Image preprocessing
    • G06V10/26 - Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V10/267 - Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion by performing operations on regions, e.g. growing, shrinking or watersheds
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/40 - Extraction of image or video features
    • G06V10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Human Computer Interaction (AREA)
  • Image Analysis (AREA)

Abstract

An embodiment of the invention provides a pupil center detection method and device. Specifically, the pupil region where the pupil is located is extracted from a face image; multiple boundary points of the pupil region are preliminarily extracted; false boundary interference points are rejected from the multiple boundary points to obtain a true pupil boundary point set; and the parameters of an ellipse are determined from the true pupil boundary point set by an ellipse fitting method, the center of the ellipse being the center point of the pupil. Because noise points are rejected from the extracted pupil boundary points before the pupil center position is calculated by ellipse fitting, the accuracy of pupil center detection under occlusion is improved.

Description

Pupil center detection method and device
Technical field
The present invention relates to the field of gaze tracking technology, and in particular to a pupil center detection method and device.
Background technique
The key to gaze tracking technology is the accurate detection of the pupil center. However, in actually captured images the pupil shape is irregular, and the pupil is often interfered with by eyelids, eyelashes, light spots, blinking, and the like. The method commonly used for pupil detection is to perform ellipse fitting on the boundary points of the pupil region; however, for a pupil in an occluded state, some of the extracted boundary points are distorted. If ellipse fitting is performed directly with these boundary points, the extracted pupil center is inaccurate, which in turn causes a large deviation in the calculated gaze direction.
Summary of the invention
In view of this, the present invention provides a pupil center detection method and device, so as to solve the problem that occlusion during actual gaze tracking causes a large error in pupil center detection.
To solve the above problem, the invention discloses a pupil center detection method, applied in a gaze tracking system, the pupil center detection method comprising the steps of:
extracting, from a face image, the pupil region where the pupil is located;
preliminarily extracting multiple boundary points of the pupil region;
rejecting false boundary interference points from the multiple boundary points to obtain a true pupil boundary point set; and
determining the parameters of an ellipse from the true pupil boundary point set by an ellipse fitting method, the center of the ellipse being the center point of the pupil.
Optionally, extracting the pupil region where the pupil is located from the face image comprises:
performing binarization processing on the face image to obtain a binary image; and
extracting from the binary image to obtain the binarized pupil region.
Optionally, preliminarily extracting multiple boundary points of the pupil region comprises:
preliminarily locating the center of the pupil according to the pupil region to obtain a rough center point;
segmenting the pupil region using an eight-connectivity labeling algorithm to obtain multiple connected regions, and calculating a credible pupil region according to the area proportion of each connected region and the distance between its region centroid and the rough center point; and
extracting the contour of the credible pupil region, and taking multiple points on the boundary of the contour as the multiple boundary points.
Optionally, rejecting false boundary interference points from the multiple boundary points comprises:
for each boundary point in the multiple boundary points, detecting in turn within a preset neighborhood and extracting a first maximal edge intensity, and, if the first maximal edge intensity is greater than a first preset threshold, taking the boundary point as a preliminarily determined boundary point; and
detecting each preliminarily determined boundary point within the preset neighborhood and extracting a second maximal edge intensity, and, if the second maximal edge intensity is greater than a second preset threshold, recording the preliminarily determined boundary point as an element of the true pupil boundary point set.
A pupil center detection device, applied in a gaze tracking system, is also provided, the pupil center detection device comprising:
a region extraction module, configured to extract, from a face image, the pupil region where the pupil is located;
a boundary point extraction module, configured to preliminarily extract multiple boundary points of the pupil region;
a noise point rejection module, configured to reject false boundary interference points from the multiple boundary points to obtain a true pupil boundary point set; and
a center point calculation module, configured to determine the parameters of an ellipse from the true pupil boundary point set by an ellipse fitting method, the center of the ellipse being the center point of the pupil.
Optionally, the region extraction module includes:
a binarization unit, configured to perform binarization processing on the face image to obtain a binary image; and
an extraction execution unit, configured to extract from the binary image to obtain the binarized pupil region.
Optionally, the boundary point extraction module includes:
a center point coarse positioning unit, configured to preliminarily locate the center of the pupil according to the pupil region to obtain a rough center point;
a region calculation unit, configured to segment the pupil region using an eight-connectivity labeling algorithm to obtain multiple connected regions, and to calculate a credible pupil region according to the area proportion of each connected region and the distance between its region centroid and the rough center point; and
a contour extraction unit, configured to extract the contour of the credible pupil region, and to take multiple points on the boundary of the contour as the multiple boundary points.
Optionally, the noise point rejection module includes:
a first selection unit, configured to, for each boundary point in the multiple boundary points, detect in turn within a preset neighborhood and extract a first maximal edge intensity, and, if the first maximal edge intensity is greater than a first preset threshold, take the boundary point as a preliminarily determined boundary point; and
a second selection unit, configured to detect each preliminarily determined boundary point within the preset neighborhood and extract a second maximal edge intensity, and, if the second maximal edge intensity is greater than a second preset threshold, record the preliminarily determined boundary point as an element of the true pupil boundary point set.
As can be seen from the above technical solution, the present invention provides a pupil center detection method and device. Specifically, the pupil region where the pupil is located is extracted from a face image; multiple boundary points of the pupil region are preliminarily extracted; false boundary interference points are rejected from the multiple boundary points to obtain a true pupil boundary point set; and the parameters of an ellipse are determined from the true pupil boundary point set by an ellipse fitting method, the center of the ellipse being the center point of the pupil. Because noise points are rejected from the extracted pupil boundary points and the pupil center position is then calculated by ellipse fitting, the accuracy of pupil center detection under occlusion is improved.
Brief description of the drawings
In order to explain the technical solutions in the embodiments of the present invention or in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and other drawings can be obtained from them by those of ordinary skill in the art without creative effort.
Fig. 1 is a flowchart of a pupil center detection method provided by an embodiment of the present application;
Fig. 2 is a block diagram of a pupil center detection device provided by an embodiment of the present application.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of the present invention.
Embodiment one
Fig. 1 is a flowchart of the pupil center detection method provided by an embodiment of the present application.
Referring to Fig. 1, the detection method provided in this embodiment is used to detect the center of a pupil in a face image, so that the gaze in the image can be tracked according to that center. The detection method specifically includes the following steps:
S1. Extract, from the face image, the pupil region where the pupil is located.
In the present embodiment, a binarization method is used to process the face image captured by a camera or other imaging device and obtain the pupil region.
In the original human-eye grayscale image, the pupil appears as an ellipse with very low gray values, or as a partial elliptical region when it is partly occluded, while the gray values elsewhere are higher. The noise points formed by eyelashes also have very low gray values, but since they are few in number they do not affect the preliminary estimation of the pupil position.
When extracting the pupil region, image binarization is first performed with a selected fixed threshold T, so that the contour of the pupil region is highlighted.
Here f(x, y) is the eye grayscale image and g(x, y) is the face image after binarization.
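The binarization formula referred to above is not reproduced in this text. A standard thresholding form consistent with these definitions would be (an assumed reconstruction, with the dark pupil mapped to the foreground value 1):
g(x, y) = 1 if f(x, y) ≤ T, and g(x, y) = 0 if f(x, y) > T.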
After the binarized face image is obtained, the binarized pupil region is extracted from it by image segmentation, denoising, and edge detection.
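As an illustration of step S1, the following minimal Python sketch binarizes the eye image and removes small noise (the use of OpenCV, the threshold value, and the kernel size are assumptions chosen for illustration, not values specified by the patent):

import cv2

def extract_pupil_region(eye_gray, T=50):
    """Step S1 sketch: fixed-threshold binarization followed by simple denoising;
    returns a binary mask of the candidate pupil region."""
    # Pixels darker than the fixed threshold T become foreground: the pupil
    # (and some eyelash noise) is the darkest structure in the eye image.
    _, mask = cv2.threshold(eye_gray, T, 255, cv2.THRESH_BINARY_INV)
    # Morphological opening removes the small noise points formed by eyelashes.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    return mask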
S2. Preliminarily extract multiple boundary points of the pupil region.
First, coarse localization of the pupil center is performed on the pupil image obtained after binarization, giving a rough center point. Let the position of the rough center point be O(x, y), and let (xn, yn) denote the positions of the pupil pixels in the image; then:
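The formula that follows is not reproduced in this text. A centroid over the N binarized pupil pixels is consistent with the surrounding description and is assumed here:
x = (x1 + x2 + ... + xN) / N, y = (y1 + y2 + ... + yN) / N,
that is, O(x, y) is the mean position of the pupil pixels (xn, yn).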
Next, the interference regions formed after binarization by eyelashes and the like are rejected: the image is segmented into several connected regions with an eight-connectivity labeling algorithm, and the credible pupil region is determined by comparing the area proportion of each connected region and the distance between each region's centroid and the rough center point O(x, y).
Then the contour of the credible pupil region is extracted with the Sobel edge detection algorithm, and the multiple boundary points of the contour are stored as a point set C.
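The following sketch illustrates step S2 under stated assumptions: the OpenCV connected-components and Sobel functions, the minimum area ratio, and the distance-based scoring are illustrative choices, not the patent's exact criteria.

import cv2
import numpy as np

def extract_boundary_points(mask, rough_center, min_area_ratio=1e-4):
    """Step S2 sketch: keep the connected region whose size and centroid
    distance to the rough center O best match a pupil, then take its Sobel
    edge pixels as the preliminary boundary point set C."""
    ox, oy = rough_center
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask, connectivity=8)
    total = mask.shape[0] * mask.shape[1]
    best, best_score = None, None
    for i in range(1, n):                      # label 0 is the background
        area = stats[i, cv2.CC_STAT_AREA]
        if area / total < min_area_ratio:      # assumed: discard tiny eyelash blobs
            continue
        cx, cy = centroids[i]
        score = np.hypot(cx - ox, cy - oy)     # assumed: prefer the region closest to O
        if best_score is None or score < best_score:
            best, best_score = i, score
    region = np.uint8(labels == best) * 255    # the credible pupil region
    # Sobel gradient magnitude of the binary region marks its contour pixels.
    gx = cv2.Sobel(region, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(region, cv2.CV_32F, 0, 1, ksize=3)
    ys, xs = np.nonzero(np.hypot(gx, gy) > 0)
    return list(zip(xs, ys))                   # point set C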
S3. Reject the boundary interference points among the multiple boundary points.
The so-called boundary interference points are the false boundary points, formed by occlusion, among the multiple boundary points obtained above. True pupil boundary points lie between the pupil and the iris, whereas distorted pupil boundary points usually lie between the pupil and an eyelid or a facula (light spot). True pupil boundary points therefore have two attributes:
1. inward consistency;
2. an outward gray-level step whose magnitude is smaller than that of a distorted pupil boundary point, i.e. P(iris) − P(pupil) < P(eyelid) − P(pupil) and P(iris) − P(pupil) < P(facula) − P(pupil), where:
Inward: the direction from a boundary point toward the pupil center.
Outward: the direction opposite to inward.
P(iris): the gray value of the iris.
P(eyelid): the gray value of the eyelid.
P(pupil): the gray value of the pupil.
P(facula): the gray value of the facula.
Define the threshold T1 as P(eyelid) − P(pupil) and T2 as P(facula) − P(pupil), and let the rough position of the pupil center point be O(x, y). The following detection process is applied to the boundary points:
First, a boundary point Ci is chosen as the starting point, detection is performed in the neighborhood of Ci along the direction of O, and the first maximal edge intensity Ei1 is extracted.
If Ei1 is less than the first preset threshold T1, detection is then performed in the neighborhood of Ci along the OCi direction, and the second maximal edge intensity Ei2 is extracted. If Ei2 is less than the second preset threshold T2, Ci is saved into the true pupil boundary point set NC as one of its elements.
Each boundary point is subjected to the above detection in turn; a boundary point that does not satisfy the above conditions is rejected as an interference boundary point, until all boundary points have been traversed.
Through this boundary point detection, the noise points produced by eyelid and facula occlusion are rejected, and the point set NC retained after rejection is the set of effective pupil boundary points.
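A minimal sketch of this rejection step follows. Assumptions: the neighborhood is sampled along a ray of fixed length, and the edge intensity is taken as the largest absolute gray-level difference between consecutive samples; neither choice is specified by the patent, and the comparisons against T1 and T2 follow the description above.

import numpy as np

def reject_interference_points(points, gray, O, T1, T2, radius=6):
    """Step S3 sketch: keep a boundary point Ci only if the maximal edge
    intensity along the direction toward O is below T1 and the maximal edge
    intensity along the opposite (outward) direction is below T2."""
    ox, oy = O
    h, w = gray.shape
    NC = []
    for (cx, cy) in points:
        d = np.hypot(ox - cx, oy - cy)
        if d == 0:
            continue
        ux, uy = (ox - cx) / d, (oy - cy) / d     # unit vector from Ci toward O

        def max_edge(dirx, diry):
            # Sample the gray image along the given direction and return the
            # largest absolute difference between consecutive samples.
            samples = []
            for k in range(radius + 1):
                x, y = int(round(cx + k * dirx)), int(round(cy + k * diry))
                if 0 <= x < w and 0 <= y < h:
                    samples.append(float(gray[y, x]))
            if len(samples) < 2:
                return 0.0
            return float(np.max(np.abs(np.diff(samples))))

        Ei1 = max_edge(ux, uy)          # neighborhood of Ci in the direction of O
        if Ei1 < T1:
            Ei2 = max_edge(-ux, -uy)    # neighborhood of Ci in the OCi direction
            if Ei2 < T2:
                NC.append((cx, cy))     # Ci is kept as a true pupil boundary point
    return NC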
S4. Calculate the center point of the pupil from the true pupil boundary point set.
Ellipse fitting is performed on the true pupil boundary point set NC to find the position of the pupil center point. The pupil boundary point set NC is taken as the ellipse-fitting boundary points, the pupil shape is represented by an ellipse, and the implicit equation of the ellipse is:
ax² + bxy + cy² + dx + ey + f = 0
By imposing a specified constraint and taking the minimization of the sum of squared algebraic distances from the boundary points to the ellipse as the criterion, the coefficient vector [a, b, c, d, e, f]ᵀ is found by the least-squares principle, from which the position of the ellipse center point E(x, y) is calculated:
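The formula for E(x, y) is not reproduced in this text. For the implicit conic above, setting the partial derivatives with respect to x and y to zero gives the standard closed form, reconstructed here as an assumption consistent with the ellipse equation:
x = (2cd − be) / (b² − 4ac), y = (2ae − bd) / (b² − 4ac).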
The position of this ellipse center point is the pupil center point position obtained after the interference has been rejected.
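As a concrete illustration of step S4, the sketch below fits the conic coefficients by least squares and then evaluates the center formula above. The unit-norm constraint on the coefficient vector is one simple choice of the "specified constraint"; the patent does not name a particular constraint.

import numpy as np

def fit_pupil_center(points):
    """Step S4 sketch: least-squares fit of ax^2 + bxy + cy^2 + dx + ey + f = 0
    to the true pupil boundary points NC, then the closed-form conic center."""
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    # Design matrix: one row [x^2, xy, y^2, x, y, 1] per boundary point.
    D = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    # Minimizing the sum of squared algebraic distances ||D p||^2 under the
    # constraint ||p|| = 1 gives the right singular vector associated with the
    # smallest singular value.
    _, _, vt = np.linalg.svd(D)
    a, b, c, d, e, f = vt[-1]
    # Center of the fitted conic (valid when b^2 - 4ac != 0, i.e. a proper ellipse).
    den = b * b - 4.0 * a * c
    return (2.0 * c * d - b * e) / den, (2.0 * a * e - b * d) / den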
As can be seen from the above technical solution, this embodiment provides a pupil center detection method. Specifically, the pupil region where the pupil is located is extracted from a face image; multiple boundary points of the pupil region are preliminarily extracted; false boundary interference points are rejected from the multiple boundary points to obtain a true pupil boundary point set; and the parameters of an ellipse are determined from the true pupil boundary point set by an ellipse fitting method, the center of the ellipse being the center point of the pupil. Because interference points are rejected from the extracted pupil boundary points and the pupil center position is then calculated by ellipse fitting, the accuracy of pupil center detection under occlusion is improved.
It should be noted that, for simplicity of description, the method embodiment is described as a series of action combinations, but those skilled in the art should understand that the embodiments of the present invention are not limited by the described sequence of actions, because according to the embodiments of the present invention some steps may be performed in other orders or simultaneously. Furthermore, those skilled in the art should also know that the embodiments described in the specification are all preferred embodiments, and the actions involved are not necessarily required by the embodiments of the present invention.
Embodiment two
Fig. 2 is a block diagram of the pupil center detection device provided by an embodiment of the present application.
Referring to Fig. 2, the detection device provided in this embodiment is used to detect the center of a pupil in a face image, so that the gaze in the image can be tracked according to that center. The detection device specifically includes a region extraction module 10, a boundary point extraction module 20, a noise point rejection module 30, and a center point calculation module 40.
The region extraction module is configured to extract, from the face image, the pupil region where the pupil is located.
In the present embodiment, a binarization method is used to process the face image captured by a camera or other imaging device and obtain the pupil region.
In the original human-eye grayscale image, the pupil appears as an ellipse with very low gray values, or as a partial elliptical region when it is partly occluded, while the gray values elsewhere are higher. The noise points formed by eyelashes also have very low gray values, but since they are few in number they do not affect the preliminary estimation of the pupil position.
The module includes a binarization unit and an extraction execution unit. When the pupil region is extracted, the binarization unit performs image binarization with a selected fixed threshold T, so that the contour of the pupil region is highlighted.
Here f(x, y) is the eye grayscale image and g(x, y) is the face image after binarization.
The extraction execution unit is configured to extract, after the binarized face image is obtained, the binarized pupil region from it by image segmentation, denoising, and edge detection.
The boundary point extraction module is configured to preliminarily extract multiple boundary points of the pupil region.
This module specifically includes a center point coarse positioning unit, a region calculation unit, and a contour extraction unit. The center point coarse positioning unit is configured to perform coarse localization of the pupil center on the pupil image obtained after binarization, giving a rough center point. Let the position of the rough center point be O(x, y), and let (xn, yn) denote the positions of the pupil pixels in the image; the rough center point is then computed from these positions as in Embodiment one.
The region calculation unit is configured to reject the interference regions formed after binarization by eyelashes and the like: it segments the image into several connected regions with an eight-connectivity labeling algorithm, and determines the credible pupil region by comparing the area proportion of each connected region and the distance between each region's centroid and the rough center point O(x, y).
The contour extraction unit is configured to extract the contour of the pupil region with the Sobel edge detection algorithm and to store the multiple boundary points of the contour as a point set C.
The noise point rejection module is configured to reject the boundary interference points among the multiple boundary points.
The so-called boundary interference points are the false boundary points, formed by occlusion, among the multiple boundary points obtained above. True pupil boundary points lie between the pupil and the iris, whereas distorted pupil boundary points usually lie between the pupil and an eyelid or a facula. True pupil boundary points therefore have two attributes:
1. inward consistency;
2. an outward gray-level step whose magnitude is smaller than that of a distorted pupil boundary point, i.e. P(iris) − P(pupil) < P(eyelid) − P(pupil) and P(iris) − P(pupil) < P(facula) − P(pupil), where:
Inward: the direction from a boundary point toward the pupil center.
Outward: the direction opposite to inward.
P(iris): the gray value of the iris.
P(eyelid): the gray value of the eyelid.
P(pupil): the gray value of the pupil.
P(facula): the gray value of the facula.
Define the threshold T1 as P(eyelid) − P(pupil) and T2 as P(facula) − P(pupil), and let the rough position of the pupil center point be O(x, y). The module specifically includes a first selection unit and a second selection unit.
The first selection unit is configured to choose a boundary point Ci as the starting point, detect in the neighborhood of Ci along the direction of O, and extract the first maximal edge intensity Ei1.
The second selection unit is configured to, if Ei1 is less than the first preset threshold T1, detect in the neighborhood of Ci along the OCi direction and extract the second maximal edge intensity Ei2, and, if Ei2 is less than the second preset threshold T2, save Ci into the true pupil boundary point set NC as one of its elements.
In this way, each boundary point is detected by the above two units; a boundary point that does not satisfy the above conditions is rejected as an interference boundary point, until all boundary points have been traversed.
Through this boundary point detection, the noise points produced by eyelid and facula occlusion are rejected, and the point set NC retained after rejection is the set of effective pupil boundary points.
The center point calculation module is configured to calculate the center point of the pupil from the true pupil boundary point set.
Ellipse fitting is performed on the true pupil boundary point set NC to find the position of the pupil center point. The pupil boundary point set NC is taken as the ellipse-fitting boundary points, the pupil shape is represented by an ellipse, and the implicit equation of the ellipse is:
ax² + bxy + cy² + dx + ey + f = 0
By imposing a specified constraint and taking the minimization of the sum of squared algebraic distances from the boundary points to the ellipse as the criterion, the coefficient vector [a, b, c, d, e, f]ᵀ is found by the least-squares principle, from which the position of the ellipse center point E(x, y) is calculated as in Embodiment one.
The position of this ellipse center point is the pupil center point position obtained after the interference has been rejected.
As can be seen from the above technical solution, this embodiment provides a pupil center detection device. Specifically, the pupil region where the pupil is located is extracted from a face image; multiple boundary points of the pupil region are preliminarily extracted; false boundary interference points are rejected from the multiple boundary points to obtain a true pupil boundary point set; and the parameters of an ellipse are determined from the true pupil boundary point set by an ellipse fitting method, the center of the ellipse being the center point of the pupil. Because interference points are rejected from the extracted pupil boundary points and the pupil center position is then calculated by ellipse fitting, the accuracy of pupil center detection under occlusion is improved.
As for the device embodiment, since it is basically similar to the method embodiment, its description is relatively simple; for relevant details, reference may be made to the corresponding parts of the description of the method embodiment.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and the same or similar parts of the embodiments may be referred to one another.
Those skilled in the art should understand that the embodiments of the present invention may be provided as a method, a device, or a computer program product. Therefore, the embodiments of the present invention may take the form of a complete hardware embodiment, a complete software embodiment, or an embodiment combining software and hardware aspects. Moreover, the embodiments of the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical memory, and the like) containing computer-usable program code.
The embodiments of the present invention are described with reference to flowcharts and/or block diagrams of the method, terminal device (system), and computer program product according to the embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing terminal device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing terminal device produce a device for realizing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing terminal device to operate in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device which realizes the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal device, so that a series of operation steps are executed on the computer or other programmable terminal device to produce computer-implemented processing; the instructions executed on the computer or other programmable terminal device thus provide steps for realizing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
Although preferred embodiments of the present invention have been described, those skilled in the art can make additional changes and modifications to these embodiments once they learn of the basic inventive concept. Therefore, the appended claims are intended to be construed as covering the preferred embodiments and all changes and modifications falling within the scope of the embodiments of the present invention.
Finally, it should also be noted that, herein, relational terms such as first and second are used only to distinguish one entity or operation from another, and do not necessarily require or imply any actual relationship or order between these entities or operations. Moreover, the terms "include", "comprise", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or terminal device that includes a list of elements includes not only those elements but also other elements not expressly listed, or further includes elements inherent to such a process, method, article, or terminal device. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or terminal device that includes the element.
The technical solution provided by the present invention has been described in detail above. Specific examples are used herein to explain the principle and implementation of the present invention, and the description of the above embodiments is only intended to help understand the method of the present invention and its core idea. At the same time, for those of ordinary skill in the art, changes can be made to the specific implementation and the scope of application according to the idea of the present invention. In summary, the contents of this specification should not be construed as limiting the present invention.

Claims (8)

1. A pupil center detection method, applied in a gaze tracking system, characterized in that the pupil center detection method comprises the steps of:
extracting, from a face image, the pupil region where the pupil is located;
preliminarily extracting multiple boundary points of the pupil region;
rejecting false boundary interference points from the multiple boundary points to obtain a true pupil boundary point set; and
determining the parameters of an ellipse from the true pupil boundary point set by an ellipse fitting method, the center of the ellipse being the center point of the pupil.
2. The detection method according to claim 1, characterized in that extracting the pupil region where the pupil is located from the face image comprises:
performing binarization processing on the face image to obtain a binary image; and
extracting from the binary image to obtain the binarized pupil region.
3. The detection method according to claim 1, characterized in that preliminarily extracting multiple boundary points of the pupil region comprises:
preliminarily locating the center of the pupil according to the pupil region to obtain a rough center point;
segmenting the pupil region using an eight-connectivity labeling algorithm to obtain multiple connected regions, and calculating a credible pupil region according to the area proportion of each connected region and the distance between its region centroid and the rough center point; and
extracting the contour of the credible pupil region, and taking multiple points on the boundary of the contour as the multiple boundary points.
4. The detection method according to claim 1, characterized in that rejecting false boundary interference points from the multiple boundary points comprises:
for each boundary point in the multiple boundary points, detecting in turn within a preset neighborhood and extracting a first maximal edge intensity, and, if the first maximal edge intensity is greater than a first preset threshold, taking the boundary point as a preliminarily determined boundary point; and
detecting each preliminarily determined boundary point within the preset neighborhood and extracting a second maximal edge intensity, and, if the second maximal edge intensity is greater than a second preset threshold, recording the preliminarily determined boundary point as an element of the true pupil boundary point set.
5. A pupil center detection device, applied in a gaze tracking system, characterized in that the pupil center detection device comprises:
a region extraction module, configured to extract, from a face image, the pupil region where the pupil is located;
a boundary point extraction module, configured to preliminarily extract multiple boundary points of the pupil region;
a noise point rejection module, configured to reject false boundary interference points from the multiple boundary points to obtain a true pupil boundary point set; and
a center point calculation module, configured to determine the parameters of an ellipse from the true pupil boundary point set by an ellipse fitting method, the center of the ellipse being the center point of the pupil.
6. The detection device according to claim 5, characterized in that the region extraction module includes:
a binarization unit, configured to perform binarization processing on the face image to obtain a binary image; and
an extraction execution unit, configured to extract from the binary image to obtain the binarized pupil region.
7. The detection device according to claim 5, characterized in that the boundary point extraction module includes:
a center point coarse positioning unit, configured to preliminarily locate the center of the pupil according to the pupil region to obtain a rough center point;
a region calculation unit, configured to segment the pupil region using an eight-connectivity labeling algorithm to obtain multiple connected regions, and to calculate a credible pupil region according to the area proportion of each connected region and the distance between its region centroid and the rough center point; and
a contour extraction unit, configured to extract the contour of the credible pupil region, and to take multiple points on the boundary of the contour as the multiple boundary points.
8. The detection device according to claim 5, characterized in that the noise point rejection module includes:
a first selection unit, configured to, for each boundary point in the multiple boundary points, detect in turn within a preset neighborhood and extract a first maximal edge intensity, and, if the first maximal edge intensity is greater than a first preset threshold, take the boundary point as a preliminarily determined boundary point; and
a second selection unit, configured to detect each preliminarily determined boundary point within the preset neighborhood and extract a second maximal edge intensity, and, if the second maximal edge intensity is greater than a second preset threshold, record the preliminarily determined boundary point as an element of the true pupil boundary point set.
CN201811291708.1A 2018-10-31 2018-10-31 Pupil center detection method and device Active CN109614858B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811291708.1A CN109614858B (en) 2018-10-31 2018-10-31 Pupil center detection method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811291708.1A CN109614858B (en) 2018-10-31 2018-10-31 Pupil center detection method and device

Publications (2)

Publication Number Publication Date
CN109614858A true CN109614858A (en) 2019-04-12
CN109614858B CN109614858B (en) 2021-01-15

Family

ID=66002747

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811291708.1A Active CN109614858B (en) 2018-10-31 2018-10-31 Pupil center detection method and device

Country Status (1)

Country Link
CN (1) CN109614858B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112434675A (en) * 2021-01-26 2021-03-02 西南石油大学 Pupil positioning method for global self-adaptive optimization parameters
CN113342161A (en) * 2021-05-27 2021-09-03 常州工学院 Sight tracking method based on near-to-eye camera
CN115546143A (en) * 2022-09-30 2022-12-30 杭州长川科技股份有限公司 Method and device for positioning center point of wafer, storage medium and electronic equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040174496A1 (en) * 2003-03-06 2004-09-09 Qiang Ji Calibration-free gaze tracking under natural head movement
CN103996020A (en) * 2014-04-10 2014-08-20 中航华东光电(上海)有限公司 Head mounted eye tracker detection method
CN104182720A (en) * 2013-05-22 2014-12-03 北京三星通信技术研究有限公司 Pupil detection method and device
CN105955465A (en) * 2016-04-25 2016-09-21 华南师范大学 Desktop portable sight line tracking method and apparatus
CN106022315A (en) * 2016-06-17 2016-10-12 北京极创未来科技有限公司 Pupil center positioning method for iris recognition
US9760774B2 (en) * 2014-08-29 2017-09-12 Alps Electric Co., Ltd. Line-of-sight detection apparatus

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040174496A1 (en) * 2003-03-06 2004-09-09 Qiang Ji Calibration-free gaze tracking under natural head movement
CN104182720A (en) * 2013-05-22 2014-12-03 北京三星通信技术研究有限公司 Pupil detection method and device
CN103996020A (en) * 2014-04-10 2014-08-20 中航华东光电(上海)有限公司 Head mounted eye tracker detection method
US9760774B2 (en) * 2014-08-29 2017-09-12 Alps Electric Co., Ltd. Line-of-sight detection apparatus
CN105955465A (en) * 2016-04-25 2016-09-21 华南师范大学 Desktop portable sight line tracking method and apparatus
CN106022315A (en) * 2016-06-17 2016-10-12 北京极创未来科技有限公司 Pupil center positioning method for iris recognition

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
SALAH RABBA et al.: "Pupil localization for gaze estimation using unsupervised graph-based model", 2017 IEEE International Symposium on Circuits and Systems (ISCAS) *
HU Pan et al.: "Eye-controlled mouse based on gaze tracking technology", Journal of Tianjin Normal University (Natural Science Edition) *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112434675A (en) * 2021-01-26 2021-03-02 西南石油大学 Pupil positioning method for global self-adaptive optimization parameters
CN113342161A (en) * 2021-05-27 2021-09-03 常州工学院 Sight tracking method based on near-to-eye camera
CN113342161B (en) * 2021-05-27 2022-10-14 常州工学院 Sight tracking method based on near-to-eye camera
CN115546143A (en) * 2022-09-30 2022-12-30 杭州长川科技股份有限公司 Method and device for positioning center point of wafer, storage medium and electronic equipment

Also Published As

Publication number Publication date
CN109614858B (en) 2021-01-15

Similar Documents

Publication Publication Date Title
CN101317183B (en) Method for localizing pixels representing an iris in an image acquired of an eye
KR101808467B1 (en) Feature extraction and matching and template update for biometric authentication
US7929734B2 (en) Method and apparatus for detecting eyes in face region
Kang et al. Real-time image restoration for iris recognition systems
Sutra et al. The Viterbi algorithm at different resolutions for enhanced iris segmentation
CN101317184A (en) Method for extracting features of an iris in images
CN111914665B (en) Face shielding detection method, device, equipment and storage medium
CN109614858A (en) A kind of detection method and device of pupil center
CN103778406B (en) Method for checking object and equipment
CN106203358B (en) A kind of iris locating method and equipment
JP2007188504A (en) Method for filtering pixel intensity in image
CN106570447B (en) Based on the matched human face photo sunglasses automatic removal method of grey level histogram
CN103198301B (en) iris locating method and device
Asmuni et al. An improved multiscale retinex algorithm for motion-blurred iris images to minimize the intra-individual variations
CN109446935B (en) Iris positioning method for iris recognition in long-distance traveling
Fazilov et al. Algorithm for Extraction of the Iris Region in an Eye Image
Soelistio et al. Circle-based eye center localization (CECL)
KR100794361B1 (en) The eyelid detection and eyelash interpolation method for the performance enhancement of iris recognition
KR20180072517A (en) Method for detecting borderline between iris and sclera
Kovoor et al. Iris biometric recognition system employing canny operator
Sreecholpech et al. A robust model-based iris segmentation
CN110502996A (en) A kind of dynamic identifying method towards fuzzy finger vein image
KR102466084B1 (en) Image-based pupil detection method
James A Review of Daugman’s Algorithm in Iris Segmentation
JP7452677B2 (en) Focus determination device, iris authentication device, focus determination method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant