CN112434675B - Pupil positioning method for global self-adaptive optimization parameters - Google Patents
- Publication number
- CN112434675B CN112434675B CN202110101150.1A CN202110101150A CN112434675B CN 112434675 B CN112434675 B CN 112434675B CN 202110101150 A CN202110101150 A CN 202110101150A CN 112434675 B CN112434675 B CN 112434675B
- Authority
- CN
- China
- Prior art keywords
- formula
- pupil
- pixel value
- eye image
- point
- Prior art date
- Legal status
- Expired - Fee Related
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/25—Determination of region of interest [ROI] or a volume of interest [VOI]
Abstract
The invention relates to a pupil positioning method with global self-adaptive optimization parameters, belonging to the field of graphic image processing. The method solves the problem that current pupil positioning algorithms cannot accurately position the pupil in an eye image. The technical scheme is as follows: preprocess the eye image with a Gaussian filter formula; select a region of interest and an initial search point; search the eye image in a near-L neighborhood to obtain the minimum pixel value of the pupil separation parameter in the eye image; perform binarization parameter optimization on the eye image, solve the maximum pixel value of the pupil separation parameter, and obtain the final pupil separation parameter through a limit value formula; fit the pupil connected domain by ellipse fitting, finally realizing pupil positioning. The invention has the following beneficial effects: the morphological characteristics and related parameters of the pupil are considered, adaptive optimization can be carried out based on the actual image, the robustness is good, and the pupil positioning accuracy is high.
Description
Technical Field
The invention relates to a pupil positioning method for global self-adaptive optimization parameters, and belongs to the field of graphic image processing.
Background
Pupil positioning plays an important role in gaze tracking and medical diagnostics. Accurate pupil positioning can effectively improve the accuracy of gaze tracking and medical diagnosis, so pupil positioning has great research value. Currently, human eye detection algorithms are mainly divided into image-processing-based methods and statistics-based methods. Image-processing-based pupil positioning methods mainly include the gray-scale integral projection method, the symmetric transformation method, and methods based on the Hough transform; the main statistical-learning method is the AdaBoost algorithm. The recognition principles, applicable conditions and existing problems of these methods are systematically explained below, and a method capable of accurately positioning the pupil is pointed out.
The gray-scale integral projection method uses the fact that the gray value of the pupil region is lower to obtain a curve with an obvious trough region; this trough region is the approximate projected position of the pupil in the vertical direction. The gray-scale integral projection method requires little computation, but its pupil positioning precision is poor. The human eye positioning method based on the generalized symmetric transformation has good robustness, but suffers from a large computation load. The Hough-transform-based method mainly uses edge detection to locate the boundary points of the pupil and then fits those boundary points with the Hough transform, which does not extract the pupil accurately. The AdaBoost algorithm likewise cannot accurately locate the pupil.
In general, current pupil positioning algorithms are only suitable for pupil positioning when the eye image quality is high; their pupil positioning effect in complex environments is not ideal.
Disclosure of Invention
The invention aims to solve the problems of inaccurate pupil positioning and poor robustness: the pupil separation parameter is obtained based on the pixel value difference between the pupil and the region outside the pupil in the eye image, combined with an image preprocessing method, and accurate pupil positioning is finally realized.
In order to achieve the above object, the present invention provides a pupil location method with global adaptive optimization parameters, which comprises the following steps:
s100, preprocessing the eye image, and reducing and eliminating noise in the eye image through a Gaussian filter formula, so that the eye image is smoother;
S101, reducing and eliminating noise in the eye image, wherein the Gaussian filter formula is f(x) = (1/(√(2π)·δ))·e^(−(x−μ)²/(2δ²)). In the formula: μ is the mean of the pixel values of the eye image, a dimensionless quantity; δ is the standard deviation of the pixel values of the eye image, a dimensionless quantity; x is a single pixel value in the eye image, dimensionless; f(x) is the corresponding pixel value in the processed eye image, dimensionless; e is the base of the natural logarithm, a constant; π is the circumference ratio, a constant;
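As an illustrative sketch (not part of the claimed method), the Gaussian preprocessing of S100–S101 can be written in numpy as follows; the kernel size and δ used here are assumed values, not parameters stated in the patent:

```python
import numpy as np

def gaussian_kernel(size: int, delta: float) -> np.ndarray:
    """Build a normalized 2-D Gaussian kernel from the formula f(x) above."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2.0 * delta**2))
    return k / k.sum()  # normalize so the smoothed image keeps its brightness scale

def gaussian_smooth(img: np.ndarray, size: int = 5, delta: float = 1.0) -> np.ndarray:
    """Convolve the eye image with the kernel (edge-replicated borders)."""
    pad = size // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    k = gaussian_kernel(size, delta)
    out = np.zeros(img.shape, dtype=float)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(padded[i:i + size, j:j + size] * k)
    return out
```

In practice a library routine such as `scipy.ndimage.gaussian_filter` would replace the explicit loops; the loop form is kept only to mirror the formula.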
s200, selecting an interested region and an initial search point of the processed eye image, and specifically dividing the selection of the interested region and the initial search point of the eye image into the following steps;
s201, converting the eye image processed by the Gaussian filter formula into a two-dimensional matrix by adopting a Python programming language;
S202, positioning the minimum pixel value point in the eye image according to the minimum-value positioning formula for the matrix elements, P_low_all = min over i, j of P(X_i, Y_j). In the formula: P_low_all is the minimum pixel value of the eye image; X_low_all, Y_low_all are the horizontal and vertical coordinates of the minimum pixel value point in the eye image, dimensionless quantities; min is the minimum-value function; i, j are the row and column indices of the two-dimensional matrix, dimensionless; P(X_i, Y_j) is the pixel value of the point with coordinates (X_i, Y_j) in the eye image, dimensionless;
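A minimal sketch of S202 in numpy (illustrative, with the function name chosen here for clarity):

```python
import numpy as np

def min_pixel_point(img: np.ndarray) -> tuple:
    """Return the (row, col) coordinates of the minimum pixel value,
    i.e. (X_low_all, Y_low_all) in the patent's notation."""
    i, j = np.unravel_index(np.argmin(img), img.shape)
    return int(i), int(j)
```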
S203, characterizing an ROI region (the region of interest) according to the ROI selection formula. In the formula: (X_center, Y_center) are the horizontal and vertical coordinates of the center point of the whole eye image, dimensionless; Range_H is the height of the ROI region, a dimensionless quantity; Range_W is the width of the ROI region, dimensionless; A, B, C, D are the coordinates of the four vertices of the ROI region, dimensionless; H is the height of the eye image, dimensionless; W is the width of the eye image, dimensionless;
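The patent does not reproduce the ROI selection formula itself, but a centered crop of size Range_H × Range_W is one plausible reading of S203. The sketch below assumes Range defaults of half the image dimensions, which are illustrative values only:

```python
import numpy as np

def select_roi(img: np.ndarray, range_h: int = None, range_w: int = None):
    """Crop an ROI centered on (X_center, Y_center); the H//2, W//2 defaults
    are assumptions, not values stated in the patent."""
    h, w = img.shape
    range_h = range_h if range_h is not None else h // 2
    range_w = range_w if range_w is not None else w // 2
    yc, xc = h // 2, w // 2
    top, left = yc - range_h // 2, xc - range_w // 2
    return img[top:top + range_h, left:left + range_w], (top, left)
```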
S204, determining the starting search coordinates by using the initial-search-point selection formula, combining the ROI region with the minimum pixel value point of the whole eye image. In the formula: X_init, Y_init are the horizontal and vertical coordinates of the initial search point; X_low_all, Y_low_all are the horizontal and vertical coordinates of the minimum pixel value point in the whole eye image, dimensionless quantities; X_low_roi, Y_low_roi are the horizontal and vertical coordinates of the minimum pixel value point in the ROI region, dimensionless quantities;
s300, searching a near L neighborhood of the eye image to obtain a minimum pixel value of a pupil separation parameter in the eye image; the step of solving the minimum pixel value of the pupil separation parameter is specifically divided into the following steps;
S301, establishing the near-L neighborhood search mode, setting the search step length L to 20, and establishing the jump limit condition formula. In the formula: P_m_later is the next searched pixel value, dimensionless; P_m_last is the current searched pixel value, dimensionless;
S302, establishing the exception handling formulas, which comprise an exception handling formula for the case in which all eight directions of the initial search point simultaneously satisfy the jump limit condition formula, and an exception handling formula for the case in which the eight-direction search from the initial search point returns equal pixel values. In the exception handling formula for the case in which all eight directions simultaneously satisfy the jump limit condition: the left-hand side gives the horizontal and vertical coordinates of the next search point; P_LT is the upper-left pixel value of the initial search point, dimensionless; P_T is the upper pixel value of the initial search point, dimensionless; P_RT is the upper-right pixel value of the initial search point, dimensionless; P_L is the left pixel value of the initial search point, dimensionless; P_R is the right pixel value of the initial search point, dimensionless; P_LD is the lower-left pixel value of the initial search point, dimensionless; P_D is the lower pixel value of the initial search point, dimensionless; P_RD is the lower-right pixel value of the initial search point, dimensionless; the Point function returns the horizontal and vertical coordinates of the point corresponding to a pixel value;
the exception handling formula of the pixel value equality returned by the eight-direction search of the initial search point isIn the formula: l isextraIs an additional search step, dimensionless; k is the number of anomalies, dimensionless; pmAnd PnIs the search pixel value from 8 directions, dimensionless;
S303, according to the near-L neighborhood search mode and the jump limit condition formula, programming in the Python language and using the pupil-separation-parameter acquisition formula together with the maximum-pixel-value function, the relative maximum among the pixel values returned by the eight-direction search is selected as the minimum pixel value of the pupil separation parameter. In the pupil-separation-parameter acquisition formula: P_m is the searched pixel value in each of the 8 directions, dimensionless; P_m_later is the next searched pixel value, dimensionless; P_m_last is the current searched pixel value, dimensionless; T_L_nearest is the search step length, with an initial value of 1.5, dimensionless. In the maximum-pixel-value function: P_L_nearest is the minimum pixel value of the pupil separation parameter, dimensionless; max is the maximum-value function;
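The structure of S301–S303 (step outward in eight directions from the initial point until a jump condition fires, then take the maximum of the eight stopping values) can be sketched as follows. The jump condition itself is not reproduced in the text, so the default used here (a pixel-value rise greater than 20) is a stand-in assumption, as is the function name:

```python
import numpy as np

# Eight unit directions (dy, dx): up-left, up, up-right, left, right, down-left, down, down-right.
DIRS = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

def near_L_search(img, start, L=20, jump=lambda last, nxt: nxt - last > 20):
    """From the initial search point, step outward with step length L in each of
    the 8 directions until the (hypothetical) jump condition fires; return the
    relative maximum of the 8 stopping values, read here as the minimum pixel
    value of the pupil separation parameter (P_L_nearest)."""
    h, w = img.shape
    returns = []
    for dy, dx in DIRS:
        y, x = start
        last = float(img[y, x])
        while True:
            ny, nx = y + dy * L, x + dx * L
            if not (0 <= ny < h and 0 <= nx < w):
                returns.append(last)  # ran off the image: keep the last value
                break
            nxt = float(img[ny, nx])
            if jump(last, nxt):
                returns.append(last)  # jump detected: pupil/background boundary
                break
            y, x, last = ny, nx, nxt
    return max(returns)
```

This omits the S302 exception handling (all eight directions jumping at once, or all returning equal values), which the patent treats separately.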
S400, according to the obtained minimum pixel value of the pupil separation parameter, performing binarization parameter optimization on the eye image, solving the maximum pixel value of the pupil separation parameter, and obtaining the final pupil separation parameter through the limit value formula, specifically comprising the following steps;
S401, setting the maximum-value jump judgment formula of the pupil separation parameter. In the formula: P_m_later is the next searched pixel value, dimensionless; P_m_last is the current searched pixel value, dimensionless;
S402, establishing the one-way parameter search value formula; combining the eight-direction search values of the initial point with the row-column coordinates of the eight-direction pixel points in the matrix, each of the eight-direction pixel points of the initial point is set in turn, through Python programming, as a new starting search coordinate; according to the maximum-value formula of the pupil separation parameter, the relative minimum among the values returned by the eight-direction searches is selected as the maximum pixel value of the pupil separation parameter. In the one-way parameter search value formula: P_m is the searched pixel value in each of the 8 directions of the search point; P_m_later is the next searched pixel value, dimensionless; P_m_last is the current searched pixel value, dimensionless; T_L_optimization is the jump judgment value, taken as 10, dimensionless. In the maximum-value formula of the pupil separation parameter: P_m is the searched pixel value in each of the 8 directions of the search points, dimensionless; P_L_optimization is the maximum pixel value of the pupil separation parameter, dimensionless;
S403, obtaining the final pupil separation parameter according to the limit value formula. In the formula: P_pupil is the pupil separation parameter, dimensionless; P_L_nearest is the minimum pixel value of the pupil separation parameter, dimensionless; P_L_optimization is the maximum pixel value of the pupil separation parameter, dimensionless; M_bound is the constraint on the maximum value of the parameter, taken as 80;
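The limit value formula itself is not reproduced in the text. One hypothetical reading, consistent with M_bound = 80 acting as an upper constraint on the chosen parameter, is a simple clamp; this is an assumption, not the patent's formula:

```python
def limit_value(p_nearest: float, p_optimization: float, m_bound: float = 80.0) -> float:
    """Hypothetical reading of the limit value formula: take the larger of the
    two candidate separation parameters, capped at the constraint M_bound."""
    return min(max(p_nearest, p_optimization), m_bound)
```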
s500, performing binarization processing on the eye image, performing pupil connected domain contour screening, fitting the pupil in an ellipse fitting mode, and finally positioning the pupil, wherein the method specifically comprises the following steps;
S501, according to the image binarization formula, writing a program in the Python language, the eye image is binarized to obtain the binarized eye image. In the formula: P_pupil is the final pupil separation parameter, dimensionless; P_(i,j) is the element in the i-th row and j-th column of the two-dimensional matrix;
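A sketch of S501 thresholding at P_pupil; the polarity (dark pupil pixels mapped to 255) is an assumption, since the binarization formula image is not reproduced:

```python
import numpy as np

def binarize(img: np.ndarray, p_pupil: float) -> np.ndarray:
    """Threshold the eye image at the pupil separation parameter:
    pixels at or below P_pupil (the darker pupil region) -> 255, others -> 0."""
    return np.where(img <= p_pupil, 255, 0).astype(np.uint8)
```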
S502, establishing the pupil connected-domain screening formula based on the obtained binarized image, and solving the pupil connected domain. In the formula: D_i is the average distance between the contour points and the center of the eye image, dimensionless; n is the number of contours, a dimensionless quantity; x_i, y_i are the horizontal and vertical coordinates of the i-th contour point; X_center, Y_center are the horizontal and vertical coordinates of the center point of the eye image; S_i is the area of the eye contour; X_R is the horizontal coordinate of the rightmost contour point of the pupil contour, dimensionless; X_L is the horizontal coordinate of the leftmost contour point of the pupil contour, dimensionless; Y_T is the vertical coordinate of the highest contour point of the pupil contour, dimensionless; Y_D is the vertical coordinate of the lowest contour point of the pupil contour, dimensionless; O_i is the set of connected-domain contour points satisfying the conditions; M_D is the distance constraint value, dimensionless; M_S is the area constraint value, dimensionless; C_i is the number of contour points of the connected domain with subscript i among all connected-domain contours, dimensionless; M_C is the constraint value on the number of contour points, dimensionless; S_c is the area of each contour in the set of connected-domain contour points satisfying the conditions, dimensionless; c is a contour subscript in the set of connected-domain contour points satisfying the conditions, dimensionless; C_pupil is the connected threshold value of the pupil, dimensionless;
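The screening of S502 keeps contours that are near the image center (M_D), have a plausible area (M_S) and enough points (M_C). A sketch of that filter follows; every threshold value and the bounding-box area proxy for S_i are illustrative assumptions, not the patent's constraint values:

```python
import numpy as np

def screen_contours(contours, center, m_d=100.0, m_s=(50.0, 5000.0), m_c=20):
    """Keep contours close to the image center, with area inside an assumed
    range and at least m_c points. contours: list of (N, 2) arrays of (x, y)."""
    xc, yc = center
    kept = []
    for pts in contours:
        d = np.hypot(pts[:, 0] - xc, pts[:, 1] - yc).mean()  # D_i analogue
        w = pts[:, 0].max() - pts[:, 0].min()
        h = pts[:, 1].max() - pts[:, 1].min()
        area = w * h  # bounding-box area used as a stand-in for S_i
        if d <= m_d and m_s[0] <= area <= m_s[1] and len(pts) >= m_c:
            kept.append(pts)
    return kept
```

In a full pipeline the contour list would come from e.g. `cv2.findContours` on the binarized image.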
S503, substituting the contour coordinate points of the pupil connected domain into the ellipse fitting formula Ax² + Bxy + Cy² + Dx + Ey + F = 0, and fitting the pupil connected domain to realize pupil positioning. In the formula, x and y are the coordinates of the pupil connected-domain contour points, and A, B, C, D, E, F are the constants of the ellipse fitting formula, determined by the pupil connected-domain contour points.
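A least-squares fit of the general conic to the contour points can be sketched as below. Normalizing F = −1 is a common convention assumed here; the patent does not specify its solver:

```python
import numpy as np

def fit_ellipse(x: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Least-squares fit of A x^2 + B xy + C y^2 + D x + E y + F = 0 to contour
    points, with the normalization F = -1 (an assumed convention).
    Returns the coefficient vector [A, B, C, D, E, F]."""
    M = np.column_stack([x**2, x * y, y**2, x, y])
    # Solve M @ [A, B, C, D, E] = 1, i.e. the conic with F = -1.
    coeffs, *_ = np.linalg.lstsq(M, np.ones_like(x, dtype=float), rcond=None)
    A, B, C, D, E = coeffs
    return np.array([A, B, C, D, E, -1.0])
```

For points on the circle x² + y² = 25 this recovers A = C = 1/25 with B, D, E ≈ 0, i.e. the same conic scaled so that F = −1.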
The pupil positioning method for the global adaptive optimization parameters is characterized in that: all parameters in the method can be adjusted and optimized in a global self-adaptive manner according to the actual eye image, and the pupil position of the actual eye image can be accurately identified.
Compared with the prior art, the invention has the following beneficial effects: (1) the morphological characteristics of the pupil are considered, and ellipse fitting is adopted; (2) the problem that the pixel value of the pupil changes along with the change of the image quality is considered; (3) the method can be suitable for different eye images and has better robustness; (4) pupil positioning accuracy is high.
Drawings
FIG. 1 is a process flow diagram of the present method.
FIG. 2 is a schematic view of the ROI area of the method.
FIG. 3 is a schematic diagram of the near L neighborhood search method.
Fig. 4 is a low quality eye image without occlusions and without multiple light sources.
FIG. 5 is a diagram of an adaptive threshold algorithm binarized for low quality eye images without occlusions and without multiple light sources.
FIG. 6 is a diagram of the method after binarization of a low-quality eye image without occlusion and without multiple light sources.
Fig. 7 is a low quality eye image with glasses blocking.
FIG. 8 is a diagram of an adaptive threshold algorithm binarizing a low-quality eye image under the occlusion of glasses.
FIG. 9 is a schematic diagram of the method after binarization of a low-quality eye image shielded by glasses.
Fig. 10 is a low quality eye image under multiple light source illumination.
Fig. 11 is a schematic diagram of the adaptive threshold algorithm after binarization of the low-quality eye image under multi-light source irradiation.
Fig. 12 is a schematic diagram of the method after binarization of a low-quality eye image under multi-light source irradiation.
Fig. 13 is a diagram of the results of the Hough circle transform algorithm on the positioning of a low quality eye image without occlusions and without multiple light sources.
FIG. 14 is a diagram of the positioning results of the present method for a low quality eye image without obstructions and without multiple light sources.
Fig. 15 is a diagram of the positioning result of the Hough circle transformation algorithm on the low-quality eye image under the shielding of the glasses.
FIG. 16 is a diagram of the positioning result of the method for the low-quality eye image under the occlusion of the glasses.
Fig. 17 is a diagram of the positioning result of the Hough circle transform algorithm on the low-quality eye image under the irradiation of multiple light sources.
FIG. 18 is a diagram of the positioning results of the method for low quality eye images under multiple light source illumination.
FIG. 19 is a graph of the accuracy of the present method.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be further described in detail with reference to the following embodiments and accompanying drawings. The exemplary embodiments of the present invention and the description thereof are provided herein for the purpose of explanation, not limitation, of the present invention.
As shown in fig. 1, it is a technical flowchart of a pupil location method with global adaptive optimization parameters, and the method includes the following steps:
S100, taking a low-quality eye image without occlusion and without multiple light sources as an example, as shown in fig. 4, preprocessing the eye image, and reducing and eliminating noise in the eye image by the Gaussian filter formula, so as to make the eye image smoother;
S101, reducing and eliminating noise in the eye image, wherein the Gaussian filter formula is f(x) = (1/(√(2π)·δ))·e^(−(x−μ)²/(2δ²)). In the formula: μ is the mean of the pixel values of the eye image, a dimensionless quantity; δ is the standard deviation of the pixel values of the eye image, a dimensionless quantity; x is a single pixel value in the eye image, dimensionless; f(x) is the corresponding pixel value in the processed eye image, dimensionless; e is the base of the natural logarithm, a constant; π is the circumference ratio, a constant;
s200, selecting an interested region and an initial search point of the processed eye image, and specifically dividing the selection of the interested region and the initial search point of the eye image into the following steps;
s201, converting the eye image processed by the Gaussian filter formula into a two-dimensional matrix by adopting a Python programming language;
S202, positioning the minimum pixel value point in the eye image according to the minimum-value positioning formula for the matrix elements, P_low_all = min over i, j of P(X_i, Y_j). In the formula: P_low_all is the minimum pixel value of the eye image; X_low_all, Y_low_all are the horizontal and vertical coordinates of the minimum pixel value point in the eye image, dimensionless quantities; min is the minimum-value function; i, j are the row and column indices of the two-dimensional matrix, dimensionless; P(X_i, Y_j) is the pixel value of the point with coordinates (X_i, Y_j) in the eye image, dimensionless;
S203, characterizing an ROI region (the region of interest) according to the ROI selection formula, as shown in fig. 2. In the formula: (X_center, Y_center) are the horizontal and vertical coordinates of the center point of the whole eye image, dimensionless; Range_H is the height of the ROI region, a dimensionless quantity; Range_W is the width of the ROI region, dimensionless; A, B, C, D are the coordinates of the four vertices of the ROI region, dimensionless; H is the height of the eye image, dimensionless; W is the width of the eye image, dimensionless;
S204, determining the starting search coordinates by using the initial-search-point selection formula, combining the ROI region with the minimum pixel value point of the whole eye image. In the formula: X_init, Y_init are the horizontal and vertical coordinates of the initial search point; X_low_all, Y_low_all are the horizontal and vertical coordinates of the minimum pixel value point in the whole eye image, dimensionless quantities; X_low_roi, Y_low_roi are the horizontal and vertical coordinates of the minimum pixel value point in the ROI region, dimensionless quantities;
s300, searching a near L neighborhood of the eye image to obtain a minimum pixel value of a pupil separation parameter in the eye image; the step of solving the minimum pixel value of the pupil separation parameter is specifically divided into the following steps;
S301, establishing the near-L neighborhood search mode, setting the search step length L to 20, and establishing the jump limit condition formula, as shown in fig. 3. In the formula: P_m_later is the next searched pixel value, dimensionless; P_m_last is the current searched pixel value, dimensionless;
S302, establishing the exception handling formulas, which comprise an exception handling formula for the case in which all eight directions of the initial search point simultaneously satisfy the jump limit condition formula, and an exception handling formula for the case in which the eight-direction search from the initial search point returns equal pixel values. In the exception handling formula for the case in which all eight directions simultaneously satisfy the jump limit condition: the left-hand side gives the horizontal and vertical coordinates of the next search point; P_LT is the upper-left pixel value of the initial search point, dimensionless; P_T is the upper pixel value of the initial search point, dimensionless; P_RT is the upper-right pixel value of the initial search point, dimensionless; P_L is the left pixel value of the initial search point, dimensionless; P_R is the right pixel value of the initial search point, dimensionless; P_LD is the lower-left pixel value of the initial search point, dimensionless; P_D is the lower pixel value of the initial search point, dimensionless; P_RD is the lower-right pixel value of the initial search point, dimensionless; the Point function returns the horizontal and vertical coordinates of the point corresponding to a pixel value. In the exception handling formula for equal pixel values returned by the eight-direction search from the initial search point: L_extra is an additional search step length, dimensionless; K is the number of exceptions, dimensionless; P_m and P_n are searched pixel values from the 8 directions, dimensionless;
S303, according to the near-L neighborhood search mode and the jump limit condition formula, programming in the Python language and using the pupil-separation-parameter acquisition formula together with the maximum-pixel-value function, the relative maximum among the pixel values returned by the eight-direction search is selected as the minimum pixel value of the pupil separation parameter. In the pupil-separation-parameter acquisition formula: P_m is the searched pixel value in each of the 8 directions, dimensionless; P_m_later is the next searched pixel value, dimensionless; P_m_last is the current searched pixel value, dimensionless; T_L_nearest is the search step length, with an initial value of 1.5, dimensionless. In the maximum-pixel-value function: P_L_nearest is the minimum pixel value of the pupil separation parameter, dimensionless; max is the maximum-value function;
s400, according to the obtained minimum pixel value of the pupil separation parameter, carrying out binarization parameter optimization on the eye image, solving the maximum pixel value of the pupil separation parameter, and obtaining a final pupil separation parameter through a limit value taking formula, wherein the method specifically comprises the following steps;
S401, setting the maximum-value jump judgment formula of the pupil separation parameter. In the formula: P_m_later is the next searched pixel value, dimensionless; P_m_last is the current searched pixel value, dimensionless;
S402, establishing the one-way parameter search value formula; combining the eight-direction search values of the initial point with the row-column coordinates of the eight-direction pixel points in the matrix, each of the eight-direction pixel points of the initial point is set in turn, through Python programming, as a new starting search coordinate; according to the maximum-value formula of the pupil separation parameter, the relative minimum among the values returned by the eight-direction searches is selected as the maximum pixel value of the pupil separation parameter. In the one-way parameter search value formula: P_m is the searched pixel value in each of the 8 directions of the search point; P_m_later is the next searched pixel value, dimensionless; P_m_last is the current searched pixel value, dimensionless; T_L_optimization is the jump judgment value, taken as 10, dimensionless. In the maximum-value formula of the pupil separation parameter: P_m is the searched pixel value in each of the 8 directions of the search points, dimensionless; P_L_optimization is the maximum pixel value of the pupil separation parameter, dimensionless;
S403, obtaining the final pupil separation parameter according to the limit value formula. In the formula: P_pupil is the pupil separation parameter, dimensionless; P_L_nearest is the minimum pixel value of the pupil separation parameter, dimensionless; P_L_optimization is the maximum pixel value of the pupil separation parameter, dimensionless; M_bound is the constraint on the maximum value of the parameter, taken as 80;
s500, performing binarization processing on the eye image, performing pupil connected domain contour screening, fitting the pupil in an ellipse fitting mode, and finally positioning the pupil, wherein the method specifically comprises the following steps;
S501, according to the image binarization formula, writing a program in the Python language, the eye image is binarized to obtain the binarized image; fig. 6 is a schematic diagram of the method after binarization of a low-quality eye image without occlusion and without multiple light sources. In the formula: P_pupil is the final pupil separation parameter, dimensionless; P_(i,j) is the element in the i-th row and j-th column of the two-dimensional matrix;
S502, establishing the pupil connected-domain screening formula based on the obtained binarized image, and solving the pupil connected domain. In the formula: D_i is the average distance between the contour points and the center of the eye image, dimensionless; n is the number of contours, a dimensionless quantity; x_i, y_i are the horizontal and vertical coordinates of the i-th contour point; X_center, Y_center are the horizontal and vertical coordinates of the center point of the eye image; S_i is the area of the eye contour; X_R is the horizontal coordinate of the rightmost contour point of the pupil contour, dimensionless; X_L is the horizontal coordinate of the leftmost contour point of the pupil contour, dimensionless; Y_T is the vertical coordinate of the highest contour point of the pupil contour, dimensionless; Y_D is the vertical coordinate of the lowest contour point of the pupil contour, dimensionless; O_i is the set of connected-domain contour points satisfying the conditions; M_D is the distance constraint value, dimensionless; M_S is the area constraint value, dimensionless; C_i is the number of contour points of the connected domain with subscript i among all connected-domain contours, dimensionless; M_C is the constraint value on the number of contour points, dimensionless; S_c is the area of each contour in the set of connected-domain contour points satisfying the conditions, dimensionless; c is a contour subscript in the set of connected-domain contour points satisfying the conditions, dimensionless; C_pupil is the connected threshold value of the pupil, dimensionless;
S503, substituting the contour coordinate points of the pupil connected domain into the ellipse fitting formula Ax² + Bxy + Cy² + Dx + Ey + F = 0, and fitting the pupil connected domain to realize pupil positioning. In the formula, x and y are the coordinates of the pupil connected-domain contour points, and A, B, C, D, E, F are the constants of the ellipse fitting formula, determined by the pupil connected-domain contour points. Fig. 14 shows the positioning result of the method for a low-quality eye image without occlusion and without multiple light sources.
Furthermore, all parameters in the method can be adjusted and optimized in a global self-adaptive manner according to the actual eye image, and the pupil position of the actual eye image can be accurately identified.
Comparing the binarized schematic diagrams and the positioning result diagrams of the method across three conditions — a low-quality eye image without obstructions and without multiple light sources, a low-quality eye image occluded by glasses, and a low-quality eye image under multi-light-source illumination — yields the result diagrams for the different methods and photo qualities: FIG. 4 is a low-quality eye image without obstructions and without multiple light sources; FIG. 5 shows that image binarized by the adaptive threshold algorithm; FIG. 6 shows that image binarized by the present method. FIG. 7 is a low-quality eye image occluded by glasses; FIG. 8 shows that image binarized by the adaptive threshold algorithm; FIG. 9 shows that image binarized by the present method. FIG. 10 is a low-quality eye image under multi-light-source illumination; FIG. 11 shows that image binarized by the adaptive threshold algorithm; FIG. 12 shows that image binarized by the present method. FIG. 13 shows the positioning result of the Hough circle transform algorithm for the low-quality eye image without obstructions and without multiple light sources; FIG. 14 shows the positioning result of the present method for that image; FIG. 15 shows the positioning result of the Hough circle transform algorithm for the low-quality eye image occluded by glasses; FIG. 16 shows the positioning result of the present method for that image; FIG. 17 shows the positioning result of the Hough circle transform algorithm for the low-quality eye image under multi-light-source illumination; FIG. 18 shows the positioning result of the present method for that image.
From this comparative analysis, an accuracy curve for the method is obtained; as shown in FIG. 19, the accuracy of the method exceeds 96.1% in every case, clearly outperforming both the adaptive threshold algorithm and the Hough circle transform algorithm.
Compared with the prior art, the invention has the following beneficial effects: (1) the morphological characteristics of the pupil are considered, and ellipse fitting is adopted; (2) the problem that the pixel value of the pupil changes along with the change of the image quality is considered; (3) the method can be suitable for different eye images and has better robustness; (4) pupil positioning accuracy is high.
Finally, it should be noted that although the present invention has been described in detail with reference to the above embodiments, those skilled in the art will understand that modifications and equivalents may be made without departing from the spirit and scope of the invention, which is defined by the appended claims.
Claims (2)
1. A pupil localization method for global adaptive optimization parameters, the method comprising the steps of:
s100, preprocessing the eye image, and reducing and eliminating noise in the eye image through a Gaussian filter formula, so that the eye image is smoother;
s101, reducing and eliminating noise in the eye image, wherein the Gaussian filter formula is f(x) = (1 / (√(2π)·δ)) · e^(−(x − μ)² / (2δ²)), in which: μ is the mean of the eye image pixel values, dimensionless; δ is the standard deviation of the eye image pixel values, dimensionless; x is a single pixel value in the eye image, dimensionless; f(x) is the corresponding single pixel value in the processed eye image, dimensionless; e is the base of the natural logarithm, a constant; π is the circle ratio, a constant;
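Steps S100–S101 amount to standard Gaussian smoothing. A minimal NumPy sketch follows; the kernel size of 5 and sigma of 1.5 are illustrative assumptions, since the patent specifies only the Gaussian formula, not the filter parameters:

```python
import numpy as np

def gaussian_kernel1d(size: int, sigma: float) -> np.ndarray:
    """Discrete samples of f(x) = 1/(sqrt(2*pi)*sigma) * exp(-x^2/(2*sigma^2))."""
    x = np.arange(size) - (size - 1) / 2.0
    k = np.exp(-x ** 2 / (2.0 * sigma ** 2)) / (np.sqrt(2.0 * np.pi) * sigma)
    return k / k.sum()  # normalise so smoothing preserves overall brightness

def gaussian_smooth(img: np.ndarray, size: int = 5, sigma: float = 1.5) -> np.ndarray:
    """Separable Gaussian smoothing of a 2-D grayscale image (steps S100-S101)."""
    k = gaussian_kernel1d(size, sigma)
    # filter rows, then columns; mode="same" keeps the image dimensions
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, rows)
```

In practice an equivalent library call (e.g. OpenCV's GaussianBlur) would normally be used instead.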
s200, selecting an interested region and an initial search point of the processed eye image, and specifically dividing the selection of the interested region and the initial search point of the eye image into the following steps;
s201, converting the eye image processed by the Gaussian filter formula into a two-dimensional matrix by adopting a Python programming language;
s202, positioning the minimum pixel value point in the eye image according to a minimum value positioning formula over the matrix elements, P(X_low_all, Y_low_all) = min over i, j of P(X_i, Y_j), in which: P(X_low_all, Y_low_all) is the minimum pixel value point of the eye image; X_low_all, Y_low_all are the abscissa and ordinate of the minimum pixel value point in the eye image, dimensionless; min is the minimum value function; i, j are the row and column indices of the two-dimensional matrix, dimensionless; P(X_i, Y_j) is the pixel value of the point with abscissa X_i and ordinate Y_j in the eye image, dimensionless;
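Locating the minimum pixel value point of the matrix is a one-liner with NumPy:

```python
import numpy as np

def locate_min_pixel(img: np.ndarray) -> tuple:
    """Return (row, col) of the minimum pixel value in the image (step S202)."""
    i, j = np.unravel_index(np.argmin(img), img.shape)
    return int(i), int(j)
```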
s203, representing the ROI (region of interest) according to an ROI selection formula, in which: (X_center, Y_center) are the abscissa and ordinate of the center point of the whole eye image, dimensionless; Range_H is the ROI region height, dimensionless; Range_W is the ROI region width, dimensionless; A, B, C, D are the coordinates of the four vertices of the ROI, dimensionless; H is the height of the eye image, dimensionless; W is the width of the eye image, dimensionless;
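A sketch of the ROI selection: a rectangle of height Range_H and width Range_W centered on the image midpoint. The patent's vertex formula is not reproduced in this text, so the half-extent arithmetic below is an assumption:

```python
import numpy as np

def select_roi(img: np.ndarray, range_h: int, range_w: int) -> np.ndarray:
    """Crop a centered ROI of height range_h and width range_w (step S203).

    Centering on (X_center, Y_center) follows the patent; the concrete
    vertex formula is an image in the source, so the half-extent
    arithmetic here is an assumption.
    """
    h, w = img.shape
    yc, xc = h // 2, w // 2
    top = max(yc - range_h // 2, 0)
    left = max(xc - range_w // 2, 0)
    return img[top:top + range_h, left:left + range_w]
```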
s204, determining the search starting coordinates by using an initial search point selection formula, combining the ROI region with the minimum pixel value point of the whole eye image, in which: X_init, Y_init are the abscissa and ordinate of the initial search point; X_low_all, Y_low_all are the abscissa and ordinate of the minimum pixel value point in the eye image, dimensionless; X_low_roi, Y_low_roi are the abscissa and ordinate of the minimum pixel value point in the ROI, dimensionless;
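The exact selection formula is rendered as an image in the source and is not reproduced here; the sketch below encodes one plausible reading — prefer the whole-image minimum when it falls inside the ROI, and fall back to the ROI minimum otherwise — purely as an assumption:

```python
def choose_initial_point(global_min, roi_min, roi_bounds):
    """Pick the initial search point (X_init, Y_init) (step S204).

    Assumption (the patent's selection formula is not reproduced in this
    text): use the whole-image minimum when it lies inside the ROI,
    otherwise fall back to the ROI minimum.
    """
    top, left, bottom, right = roi_bounds
    y, x = global_min
    if top <= y < bottom and left <= x < right:
        return global_min
    return roi_min
```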
s300, searching a near L neighborhood of the eye image to obtain a minimum pixel value of a pupil separation parameter in the eye image; the step of solving the minimum pixel value of the pupil separation parameter is specifically divided into the following steps;
s301, establishing a near-L neighborhood search mode with the search step length L set to 20, and establishing a jump limit condition formula, in which: P_m_later is the next pixel value searched, dimensionless; P_m_last is the current pixel value searched, dimensionless;
s302, establishing exception handling formulas, comprising one for the case where all eight directions of the initial search point simultaneously satisfy the jump limit condition formula, and one for the case where the eight-direction searches of the initial search point return equal pixel values. In the former, the formula gives the abscissa and ordinate of the next search point, in which: P_LT is the pixel value above-left of the initial search point, dimensionless; P_T is the pixel value above the initial search point, dimensionless; P_RT is the pixel value above-right of the initial search point, dimensionless; P_L is the pixel value to the left of the initial search point, dimensionless; P_R is the pixel value to the right of the initial search point, dimensionless; P_LD is the pixel value below-left of the initial search point, dimensionless; P_D is the pixel value below the initial search point, dimensionless; P_RD is the pixel value below-right of the initial search point, dimensionless; the Point function returns the abscissa and ordinate of the point corresponding to a pixel value. In the latter: L_extra is an additional search step length, dimensionless; k is the number of anomalies, dimensionless; P_m and P_n are pixel values searched from the 8 directions, dimensionless;
s303, according to the near-L neighborhood search mode and the jump limit condition formula, programmed in the Python language, using the pupil separation parameter calculation formula combined with a maximum pixel value function, searching in the eight directions and taking the relative maximum of the returned pixel values as the minimum pixel value of the pupil separation parameter, P_L_nearest = max(P_m), in which: P_m are the pixel values searched in the 8 directions, dimensionless; P_m_later is the next pixel value searched, dimensionless; P_m_last is the current pixel value searched, dimensionless; T_L_nearest is the search step length, with initial value 1.5, dimensionless; P_L_nearest is the minimum pixel value of the pupil separation parameter, dimensionless; max is the maximum value function;
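Steps S300–S303 can be sketched as eight rays walking outward from the initial search point until the jump limit condition fires, then taking the maximum of the stopping values. The stop rule used here — halt a ray when the next pixel exceeds the current one by more than T_L_nearest — is a hedged reading of the patent's jump-limit condition, whose exact formula is rendered as an image in the source:

```python
import numpy as np

DIRS = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

def min_separation_pixel(img, start, max_steps=20, jump=1.5):
    """Search outward in 8 directions from `start` and return the maximum
    of the pixel values at which each ray stops (steps S300-S303).

    `max_steps` mirrors the patent's L = 20 and `jump` its T_L_nearest =
    1.5; the exact stop rule is an assumption.
    """
    h, w = img.shape
    y0, x0 = start
    stops = []
    for dy, dx in DIRS:
        y, x = y0, x0
        last = float(img[y, x])
        for _ in range(max_steps):
            ny, nx = y + dy, x + dx
            if not (0 <= ny < h and 0 <= nx < w):
                break
            nxt = float(img[ny, nx])
            if nxt - last > jump:  # jump-limit condition: stop this ray
                break
            y, x, last = ny, nx, nxt
        stops.append(last)
    return max(stops)  # relative maximum across the 8 rays (S303)
```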
s400, according to the obtained minimum pixel value of the pupil separation parameter, carrying out binarization parameter optimization on the eye image, solving the maximum pixel value of the pupil separation parameter, and obtaining a final pupil separation parameter through a limit value taking formula, wherein the method specifically comprises the following steps;
s401, setting a maximum value jump judgment formula for the pupil separation parameter, in which: P_m_later is the next pixel value searched, dimensionless; P_m_last is the current pixel value searched, dimensionless;
s402, establishing a one-way parameter search value formula; combining the eight-direction search values of the initial point with the row and column coordinates of the eight direction pixel points in the matrix, setting each of the eight direction pixel points of the initial point as new initial search coordinates via Python programming, selecting the relative minimum of the eight-direction search return values as the maximum pixel value of the pupil separation parameter according to the maximum value formula of the pupil separation parameter, and thereby solving the maximum pixel value of the pupil separation parameter. In the unidirectional parameter search value formula: P_m are the pixel values searched in the 8 directions from the search point; P_m_later is the next pixel value searched, dimensionless; P_m_last is the current pixel value searched, dimensionless; T_L_optimization is the jump judgment value, taken as 10, dimensionless. In the maximum value formula of the pupil separation parameter, P_L_optimization = min(P_m), in which: P_m are the pixel values searched in the 8 directions from the search point, dimensionless; P_L_optimization is the maximum pixel value of the pupil separation parameter, dimensionless;
s403, obtaining the final pupil separation parameter according to a limit value formula, in which: P_pupil is the pupil separation parameter, dimensionless; P_L_nearest is the minimum pixel value of the pupil separation parameter, dimensionless; P_L_optimization is the maximum pixel value of the pupil separation parameter, dimensionless; M_bound is the constraint on the maximum value of the parameter, taken as 80;
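The limit value formula itself is rendered as an image in the source. The sketch below encodes one plausible reading — combine the two search results and cap the result at M_bound = 80 — purely as an assumption:

```python
def pupil_separation_parameter(p_nearest: float, p_optimization: float,
                               m_bound: float = 80.0) -> float:
    """Combine the two search results into the final separation parameter
    P_pupil (step S403).

    Assumption: the patent's limit value formula is not reproduced in
    this text; taking the midpoint of P_L_nearest and P_L_optimization,
    capped at M_bound, is an illustrative guess at its shape.
    """
    return min((p_nearest + p_optimization) / 2.0, m_bound)
```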
s500, performing binarization processing on the eye image, performing contour screening on a pupil connected domain, fitting the pupil connected domain in an ellipse fitting mode, and finally positioning the pupil, wherein the method specifically comprises the following steps;
s501, according to an image binarization formula, writing a program in the Python language to binarize the eye image and obtain its binarized image, in which: P_pupil is the final pupil separation parameter, dimensionless; P_(i,j) is the element in the i-th row and j-th column of the two-dimensional matrix;
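Thresholding with P_pupil can be sketched in one NumPy expression. Mapping pixels at or below the threshold to 255 (so the dark pupil becomes a white blob for the subsequent connected domain step) is an assumption, since the binarization formula is an image in the source:

```python
import numpy as np

def binarize(img: np.ndarray, p_pupil: float) -> np.ndarray:
    """Threshold the eye image with the pupil separation parameter (S501).

    Assumption: pixels at or below P_pupil (the dark pupil) map to 255
    and the rest to 0; the patent's exact formula is not reproduced here.
    """
    return np.where(img <= p_pupil, 255, 0).astype(np.uint8)
```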
s502, establishing a pupil connected domain screening formula based on the obtained binary image and solving the pupil connected domain, in which: D_i is the average distance between the contour points and the center of the eye image, dimensionless; n is the number of contour points, dimensionless; x_i, y_i are the abscissa and ordinate of the i-th contour point; X_center, Y_center are the abscissa and ordinate of the center point of the eye image; S_i is the area of the eye contour; X_R is the abscissa of the rightmost contour point of the pupil contour, dimensionless; X_L is the abscissa of the leftmost contour point of the pupil contour, dimensionless; Y_T is the ordinate of the highest contour point of the pupil contour, dimensionless; Y_D is the ordinate of the lowest contour point of the pupil contour, dimensionless; O_i is the set of connected domain contour points meeting the conditions; M_D is the distance constraint value, dimensionless; M_S is the area constraint value, dimensionless; C_i is the number of contour points of the connected domain with subscript i among all connected domain contours, dimensionless; M_C is the constraint value on the number of contour points, dimensionless; S_c is the area of each contour in the qualifying connected domain contour point set, dimensionless; c is the contour subscript within the qualifying connected domain contour point set, dimensionless; C_pupil is the connected domain threshold of the pupil, dimensionless;
s503, substituting the contour coordinate points of the pupil connected domain into an ellipse fitting formula and fitting the connected domain to realize pupil positioning, wherein the ellipse fitting formula is Ax² + Bxy + Cy² + Dx + Ey + F = 0, in which x and y are respectively the abscissa and ordinate of the pupil connected domain contour points, and A, B, C, D, E, F are constants of the ellipse fitting formula, determined by the pupil connected domain contour points.
2. The pupil location method with global adaptive optimization parameters as claimed in claim 1, wherein: all parameters in the method can be adjusted and optimized in a global self-adaptive manner according to the actual eye image, and the pupil position of the actual eye image can be accurately identified.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110101150.1A CN112434675B (en) | 2021-01-26 | 2021-01-26 | Pupil positioning method for global self-adaptive optimization parameters |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110101150.1A CN112434675B (en) | 2021-01-26 | 2021-01-26 | Pupil positioning method for global self-adaptive optimization parameters |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112434675A CN112434675A (en) | 2021-03-02 |
CN112434675B true CN112434675B (en) | 2021-04-09 |
Family
ID=74697273
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110101150.1A Expired - Fee Related CN112434675B (en) | 2021-01-26 | 2021-01-26 | Pupil positioning method for global self-adaptive optimization parameters |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112434675B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112926536B (en) * | 2021-04-06 | 2024-04-16 | 科大讯飞股份有限公司 | Deformed pupil positioning method, device and equipment |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101359365A (en) * | 2008-08-07 | 2009-02-04 | 电子科技大学中山学院 | Iris positioning method based on Maximum between-Cluster Variance and gray scale information |
CN101246544B (en) * | 2008-01-24 | 2010-06-16 | 电子科技大学中山学院 | Iris positioning method based on boundary point search and minimum kernel value similarity region edge detection |
CN102129686A (en) * | 2011-03-24 | 2011-07-20 | 西北工业大学 | Method for detecting sub-voxel surface based on voxel level outline rough positioning |
CN102510734A (en) * | 2010-07-20 | 2012-06-20 | 松下电器产业株式会社 | Pupil detection device and pupil detection method |
CN103475838A (en) * | 2013-06-21 | 2013-12-25 | 青岛海信信芯科技有限公司 | Deinterlacing method based on edge self adaption |
CN106919933A (en) * | 2017-03-13 | 2017-07-04 | 重庆贝奥新视野医疗设备有限公司 | The method and device of Pupil diameter |
KR101942759B1 (en) * | 2017-07-28 | 2019-01-28 | 계명대학교 산학협력단 | Eye pupil detection using ensemble of random forest and fast radial symmetry transform with near infrared camera and system thereof |
CN111666847A (en) * | 2020-05-26 | 2020-09-15 | 张彦龙 | Iris segmentation, feature extraction and matching method based on local 0-1 quantization technology |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103488990B (en) * | 2013-09-29 | 2016-05-18 | 武汉虹识技术有限公司 | A kind of cross neighborhood method is extracted method and the device of eyelashes image and location pupil |
US10038691B2 (en) * | 2013-10-08 | 2018-07-31 | Princeton Identity, Inc. | Authorization of a financial transaction |
CN104268527B (en) * | 2014-09-26 | 2017-12-12 | 北京无线电计量测试研究所 | A kind of iris locating method based on gradient detection |
CN105389574B (en) * | 2015-12-25 | 2019-03-22 | 成都品果科技有限公司 | The method and system of human eye iris in a kind of detection picture |
JP6930223B2 (en) * | 2017-05-31 | 2021-09-01 | 富士通株式会社 | Pupil detection computer program, pupil detection device and pupil detection method |
CN109389033B (en) * | 2018-08-28 | 2022-02-11 | 江苏理工学院 | Novel pupil rapid positioning method |
KR102164686B1 (en) * | 2018-09-05 | 2020-10-13 | 트러스트팜모바일 주식회사 | Image processing method and apparatus of tile images |
CN109614858B (en) * | 2018-10-31 | 2021-01-15 | 北京航天晨信科技有限责任公司 | Pupil center detection method and device |
CN109766818B (en) * | 2019-01-04 | 2021-01-26 | 京东方科技集团股份有限公司 | Pupil center positioning method and system, computer device and readable storage medium |
CN110472521B (en) * | 2019-07-25 | 2022-12-20 | 张杰辉 | Pupil positioning calibration method and system |
CN111339982A (en) * | 2020-03-05 | 2020-06-26 | 西北工业大学 | Multi-stage pupil center positioning technology implementation method based on features |
CN111738195A (en) * | 2020-06-30 | 2020-10-02 | 湖南文理学院 | Iris positioning method and computer readable storage medium |
- 2021-01-26 CN CN202110101150.1A patent/CN112434675B/en not_active Expired - Fee Related
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101246544B (en) * | 2008-01-24 | 2010-06-16 | 电子科技大学中山学院 | Iris positioning method based on boundary point search and minimum kernel value similarity region edge detection |
CN101359365A (en) * | 2008-08-07 | 2009-02-04 | 电子科技大学中山学院 | Iris positioning method based on Maximum between-Cluster Variance and gray scale information |
CN102510734A (en) * | 2010-07-20 | 2012-06-20 | 松下电器产业株式会社 | Pupil detection device and pupil detection method |
CN102129686A (en) * | 2011-03-24 | 2011-07-20 | 西北工业大学 | Method for detecting sub-voxel surface based on voxel level outline rough positioning |
CN103475838A (en) * | 2013-06-21 | 2013-12-25 | 青岛海信信芯科技有限公司 | Deinterlacing method based on edge self adaption |
CN106919933A (en) * | 2017-03-13 | 2017-07-04 | 重庆贝奥新视野医疗设备有限公司 | The method and device of Pupil diameter |
KR101942759B1 (en) * | 2017-07-28 | 2019-01-28 | 계명대학교 산학협력단 | Eye pupil detection using ensemble of random forest and fast radial symmetry transform with near infrared camera and system thereof |
CN111666847A (en) * | 2020-05-26 | 2020-09-15 | 张彦龙 | Iris segmentation, feature extraction and matching method based on local 0-1 quantization technology |
Non-Patent Citations (6)
Title |
---|
Accurate and Robust Pupil Positioning Algorithm Using Adaboost Cascade Detector and Improved Starburst Model;Bing Liu 等;《5th International Conference on Computer Sciences and Automation Engineering》;20160229;第877-883页 * |
An Adaptive Algorithm for Precise Pupil Boundary Detection using Entropy of Contour Gradients;Cihan Topal 等;《arXiv:1709.06366v1》;20170920;第1-17页 * |
Robust pupil center detection using a curvature algorithm;Danjie Zhu 等;《Computer Methods and Programs in Biomedicine》;19990429;第59卷(第3期);第145-157页 * |
Robust Pupil Tracking Algorithm Based on Ellipse Fitting;Thoriq Satriya 等;《2016 International Symposium on Electronics and Smart Devices》;20161130;第253-257页 * |
基于隐含马尔可夫模型的计算机唇读算法研究;闫龙;《中国优秀硕士学位论文全文数据库信息科技辑》;20150315;第I138-2052页 * |
虹膜图像分割算法研究;任月庆;《中国优秀硕士学位论文全文数据库信息科技辑》;20161115;第I138-249页 * |
Also Published As
Publication number | Publication date |
---|---|
CN112434675A (en) | 2021-03-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111626190B (en) | Water level monitoring method for scale recognition based on clustering partition | |
CN106446896B (en) | Character segmentation method and device and electronic equipment | |
CN110097536B (en) | Hexagonal bolt looseness detection method based on deep learning and Hough transform | |
CN111091095B (en) | Method for detecting ship target in remote sensing image | |
CN110490913B (en) | Image matching method based on feature description operator of corner and single line segment grouping | |
CN112907520B (en) | Single tree crown detection method based on end-to-end deep learning method | |
CN114240845B (en) | Light cutting method surface roughness measurement method applied to cutting workpiece | |
CN109961065B (en) | Sea surface ship target detection method | |
CN111860587B (en) | Detection method for small targets of pictures | |
CN111476804B (en) | Efficient carrier roller image segmentation method, device, equipment and storage medium | |
CN112991283A (en) | Flexible IC substrate line width detection method based on super-pixels, medium and equipment | |
CN115331245A (en) | Table structure identification method based on image instance segmentation | |
CN112434675B (en) | Pupil positioning method for global self-adaptive optimization parameters | |
CN111915628A (en) | Single-stage instance segmentation method based on prediction target dense boundary points | |
CN111368573A (en) | Positioning method based on geometric feature constraint | |
CN105654042B (en) | The proving temperature character identifying method of glass-stem thermometer | |
CN116863463A (en) | Egg assembly line rapid identification and counting method | |
CN112991395B (en) | Vision tracking method based on foreground condition probability optimization scale and angle | |
CN116469085A (en) | Monitoring method and system for risk driving behavior | |
Zheng et al. | Improvement of grayscale image segmentation based on pso algorithm | |
CN117237657A (en) | RSCD arc detection method based on hough transformation | |
US20130238985A1 (en) | Methods and devices for eliminating cracks within pages | |
CN114529818A (en) | Cultivated land plot extraction method and system | |
CN112396648A (en) | Target identification method and system capable of positioning mass center of target object | |
CN118334019B (en) | Injection quality detection method and system for injection part |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20210409 Termination date: 20220126 |