CN115565224A - Method, device, medium and equipment for continuously positioning pupil center in real time - Google Patents

Method, device, medium and equipment for continuously positioning pupil center in real time

Info

Publication number
CN115565224A
CN115565224A (Application number CN202211197791.2A)
Authority
CN
China
Prior art keywords
eyeball area
determining
eyeball
image
integral projection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211197791.2A
Other languages
Chinese (zh)
Inventor
叶姗姗
张勇
牛霄鹏
王文熹
陈宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhuhai Shixi Technology Co Ltd
Original Assignee
Zhuhai Shixi Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhuhai Shixi Technology Co Ltd
Priority to CN202211197791.2A
Publication of CN115565224A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 Detection; Localisation; Normalisation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10 Complex mathematical operations
    • G06F17/18 Complex mathematical operations for evaluating statistical data, e.g. average values, frequency distributions, probability functions, regression analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G06V20/41 Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • Human Computer Interaction (AREA)
  • General Health & Medical Sciences (AREA)
  • Computational Mathematics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Mathematical Optimization (AREA)
  • Software Systems (AREA)
  • Mathematical Analysis (AREA)
  • Health & Medical Sciences (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computational Linguistics (AREA)
  • Operations Research (AREA)
  • Probability & Statistics with Applications (AREA)
  • Ophthalmology & Optometry (AREA)
  • Algebra (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Analysis (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The application discloses a method, a device, a medium and equipment for continuously positioning a pupil center in real time, which are used for realizing pupil center positioning based on image gray values and improving the accuracy and real-time performance of pupil center positioning. The method comprises the following steps: detecting a face image from a video frame, and converting the face image into a gray image; calculating an integral projection of the gray-scale image; determining an eyeball area in the gray-scale image according to the integral projection; and detecting a plurality of corner points in the eyeball area, and jointly positioning the pupil center position through the plurality of corner points.

Description

Method, device, medium and equipment for continuously positioning pupil center in real time
Technical Field
The present application relates to the field of image processing, and in particular, to a method, an apparatus, a medium, and a device for continuously positioning a pupil center in real time.
Background
The sight line refers to the gazing direction of the human eyes and reflects the focus of a person's attention; studying a person's gaze reveals characteristics of their mental activity and behavior, and tracking the sight line to realize human-computer interaction is a popular application at present. Eye tracking cannot be achieved without locating the pupils of the eyes; pupil positioning is an indispensable technology in computer vision applications such as virtual reality and augmented reality, and whether the pupil center can be extracted from real-time video correctly and quickly is one of the key factors affecting the accuracy of an eye tracking system.
In the prior art, the main methods for eye tracking are direct observation, electrooculography and eye trackers, of which the eye tracker method is the mainstream: the technology is relatively mature and highly commercialized, with head-mounted eye trackers, bracket-mounted eye trackers and the like, but the infrared light they rely on can cause a certain degree of damage to the human eyes, so such devices cannot be worn for long periods. To address this problem, the prior art also proposes eye tracking with a depth camera, but this suffers from complicated system configuration, a complex user calibration process and low gaze estimation accuracy. In summary, the existing eye tracking solutions have the problems of high technical cost, intrusive hardware and inaccurate measurement.
Disclosure of Invention
The application provides a method, a device, a medium and equipment for continuously positioning a pupil center in real time, which are used for realizing the pupil center positioning based on an image gray value and improving the accuracy and the real-time performance of the pupil center positioning.
The application provides a method for continuously positioning the center of a pupil in real time in a first aspect, which comprises the following steps:
detecting a face image from a video frame, and converting the face image into a gray image;
calculating an integral projection of the gray scale image;
determining an eyeball area in the gray-scale image according to the integral projection;
and detecting a plurality of corner points in the eyeball area, and jointly positioning the pupil center position through the plurality of corner points.
Optionally, the calculating an integral projection of the grayscale image includes:
respectively calculating a vertical integral projection and a horizontal integral projection of the gray level image;
the determining an eyeball region in the grayscale image according to the integral projection comprises:
and determining the abscissa of the eyeball area according to the vertical integral projection of the gray level image, and determining the ordinate of the eyeball area according to the horizontal integral projection of the gray level image.
Optionally, the determining the ordinate of the eyeball area according to the horizontal integral projection of the grayscale image includes:
determining a first extreme value and a second extreme value of a trough in a horizontal integral projection of the gray image;
respectively carrying out differential calculation on adjacent pixels in the horizontal direction on the first extreme value and the second extreme value to obtain a differential calculation result;
and determining the ordinate of the eyeball area according to the difference calculation result.
Optionally, the difference calculation result includes a plurality of difference values, and determining the ordinate of the eyeball area according to the difference calculation result includes:
accumulating and calculating the absolute values of the plurality of difference values to obtain difference accumulated values;
and determining the vertical coordinate of the eyeball area according to the difference accumulated value.
Optionally, the determining the ordinate of the eyeball area according to the difference accumulated value includes:
calculating a target projection through a preset formula according to the horizontal integral projection and the difference accumulated value;
determining the ordinate of the eyeball area according to the extreme value of the wave trough in the target projection;
the preset formula is as follows:
T(y)=a×H(y)-b×C(y);
where a and b are preset weights, a, b ∈ (0,1), H (y) represents the horizontal integral projection, and C (y) represents the difference accumulation value.
Optionally, after determining the horizontal coordinate of the eyeball area according to the vertical integral projection of the gray scale image, and determining the vertical coordinate of the eyeball area according to the horizontal integral projection of the gray scale image, the method further includes:
judging whether a set threshold condition is met, wherein the threshold condition is that the horizontal coordinate is smaller than a horizontal threshold and the vertical coordinate is smaller than a vertical threshold;
and if so, determining the eyeball area according to the horizontal coordinate and the vertical coordinate.
Optionally, after determining an eyeball region in the grayscale image according to the integral projection, the method further includes:
judging whether the current frame has data interruption of an eyeball area;
if yes, predicting the eyeball area of the current frame according to the eyeball area of at least one past frame, detecting a plurality of corner points in the predicted eyeball area, and jointly positioning the pupil center position through the corner points;
and if not, directly detecting a plurality of corner points in the eyeball area, and jointly positioning the pupil center position through the corner points.
Optionally, the determining whether the data interruption of the eyeball area occurs in the current frame includes:
determining a diameter of the eyeball region;
if the diameter of the eyeball area is smaller than the preset diameter, determining that data interruption of the eyeball area occurs in the current frame;
and if the diameter of the eyeball area is larger than or equal to the preset diameter, determining that the data interruption of the eyeball area does not occur in the current frame.
Optionally, the predicting the eyeball area of the current frame according to the eyeball area of at least one past frame includes:
acquiring an eyeball area of at least one past frame;
and predicting the eyeball area of the current frame by a linear approximation method according to the eyeball area of at least one past frame.
Optionally, the formula of the linear approximation is:
G(t) ≈ k0 + k1 × t;
wherein k0 and k1 are parameters to be solved;
the mean square error of the measured value and the approximation value of n points in the past i frames is as follows:
E(Δεi) = (1/n) × Σ [G(ti) - k0 - k1 × ti]², summed over the n points;
k0 and k1 are solved by minimizing E(Δεi), and the eyeball area of the current frame is predicted according to k0, k1 and the eyeball areas of the past i frames.
Optionally, the detecting a plurality of corner points in the eyeball area and jointly positioning the pupil center position through the plurality of corner points includes:
detecting a plurality of corner points in the eyeball area through a Shi-Tomasi algorithm;
and jointly positioning the pupil center position according to the detected corner points.
A second aspect of the present application provides a gaze estimation method, including:
detecting facial features of a human face in a video frame;
determining the pupil center position of the face in the video frame by the method for continuously positioning the pupil center in real time according to the first aspect or any optional implementation of the first aspect;
and estimating the gazing direction of the human face in the video frame according to the facial features and the pupil center position.
The third aspect of the present application provides a device for continuously locating the pupil center in real time, comprising:
the detection unit is used for detecting a face image from a video frame and converting the face image into a gray image;
a calculation unit for calculating an integral projection of the grayscale image;
a determining unit, configured to determine an eyeball area in the grayscale image according to the integral projection;
and the positioning unit is used for detecting a plurality of corner points in the eyeball area and jointly positioning the pupil center position through the corner points.
Optionally, the computing unit is specifically configured to:
respectively calculating a vertical integral projection and a horizontal integral projection of the gray level image;
the determination unit includes:
the first determination module is used for determining the abscissa of the eyeball area according to the vertical integral projection of the gray level image;
and the second determination module is used for determining the vertical coordinate of the eyeball area according to the horizontal integral projection of the gray level image.
Optionally, the second determining module is specifically configured to:
determining a first extreme value and a second extreme value of a trough in a horizontal integral projection of the gray image;
respectively carrying out differential calculation on adjacent pixels in the horizontal direction on the first extreme value and the second extreme value to obtain a differential calculation result;
and determining the ordinate of the eyeball area according to the difference calculation result.
Optionally, the second determining module is further specifically configured to:
accumulating and calculating the absolute values of the plurality of difference values to obtain difference accumulated values;
and determining the vertical coordinate of the eyeball area according to the difference accumulated value.
Optionally, the second determining module is further specifically configured to:
calculating a target projection through a preset formula according to the horizontal integral projection and the difference accumulated value;
determining the vertical coordinate of the eyeball area according to the extreme value of the wave trough in the target projection;
the preset formula is as follows:
T(y)=a×H(y)-b×C(y);
wherein a and b are preset weights, a, b ∈ (0,1), H (y) represents the horizontal integral projection, and C (y) represents the difference accumulation value.
Optionally, the apparatus further comprises:
the first judging unit is used for judging whether a set threshold condition is met, wherein the threshold condition is that the horizontal coordinate is smaller than a horizontal threshold and the vertical coordinate is smaller than a vertical threshold;
the determining unit is specifically configured to determine the eyeball area according to the horizontal coordinate and the vertical coordinate when the judgment result of the first judging unit is yes.
Optionally, the apparatus further comprises:
the second judging unit is used for judging whether data interruption of the eyeball area occurs in the current frame or not;
the prediction unit is used for predicting the eyeball area of the current frame according to the eyeball area of at least one past frame when the judgment result of the second judging unit is yes, and the positioning unit is specifically used for detecting a plurality of corner points in the predicted eyeball area and jointly positioning the pupil center position through the corner points;
and the positioning unit is further used for directly detecting a plurality of corner points in the eyeball area and jointly positioning the pupil center position through the corner points when the judgment result of the second judging unit is negative.
Optionally, the second judging unit is specifically configured to:
determining a diameter of the eyeball region;
if the diameter of the eyeball area is smaller than the preset diameter, determining that data interruption of the eyeball area occurs in the current frame;
and if the diameter of the eyeball area is larger than or equal to the preset diameter, determining that the data interruption of the eyeball area does not occur in the current frame.
Optionally, the prediction unit is specifically configured to:
acquiring an eyeball area of at least one past frame;
and predicting the eyeball area of the current frame by a linear approximation method according to the eyeball area of at least one past frame.
Optionally, the formula of the linear approximation is:
G(t) ≈ k0 + k1 × t;
wherein k0 and k1 are parameters to be solved;
the mean square error of the measured value and the approximation value of n points in the past i frames is as follows:
E(Δεi) = (1/n) × Σ [G(ti) - k0 - k1 × ti]², summed over the n points;
k0 and k1 are solved by minimizing E(Δεi), and the eyeball area of the current frame is predicted according to k0, k1 and the eyeball areas of the past i frames.
Optionally, the positioning unit is specifically configured to:
detecting a plurality of corner points in the eyeball area through a Shi-Tomasi algorithm;
and jointly positioning the pupil center position according to the detected corner points.
The fourth aspect of the present application provides a device for continuously locating the center of a pupil in real time, the device comprising:
the device comprises a processor, a memory, an input and output unit and a bus;
the processor is connected with the memory, the input and output unit and the bus;
the memory holds a program, and the processor calls the program to execute the method for continuously positioning the pupil center in real time according to the first aspect or any optional implementation of the first aspect.
A fifth aspect of the present application provides a computer-readable storage medium having a program stored thereon, where the program, when executed on a computer, performs the method for continuously positioning the pupil center in real time according to the first aspect or any optional implementation of the first aspect.
A sixth aspect of the present application provides a gaze positioning device, which includes a processor and a camera, where the processor, when running, executes the method for continuously positioning the pupil center in real time according to the first aspect or any optional implementation of the first aspect.
According to the technical scheme, the method has the following advantages:
firstly, face detection is performed on a video frame and the face is converted into a gray image; an eyeball area is preliminarily determined by calculating the gray integral projection; corner detection is then performed in the eyeball area, and the pupil center position in the eyeball area is jointly positioned through the plurality of detected corners. The algorithm is simple and accurate: it can quickly find the human eyes in the input video image and can position and track the change of the pupil position as the eyes move. The whole process requires only gray values to compute the pupil center position and is non-invasive, so even an ordinary camera (such as a smart phone or a network camera) can realize eye tracking with this method, ensuring the accuracy and real-time performance of pupil center positioning under limited conditions; the method therefore has good prospects for popularization and application.
Drawings
In order to more clearly illustrate the technical solutions in the present application, the drawings needed to be used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings without creative efforts.
Fig. 1 is a schematic flow chart illustrating an embodiment of a method for continuously positioning a pupil center in real time according to the present disclosure;
fig. 2 is a schematic diagram of a gray scale image in the method for continuously positioning the pupil center in real time according to the present application;
fig. 3 is a schematic diagram illustrating a pupil center position in the method for continuously positioning the pupil center in real time according to the present application;
FIG. 4 is a schematic flowchart illustrating a method for continuously positioning a pupil center in real time according to another embodiment of the present disclosure;
FIG. 5-a is a schematic view of a vertical integral projection of a gray scale image in the method for real-time continuous pupil center positioning provided herein;
FIG. 5-b is a schematic diagram illustrating the horizontal coordinate positioning of the eyeball area in the method for real-time continuous pupil center positioning provided by the present application;
FIG. 6-a is a schematic view of a horizontal integral projection of a gray scale image in the method for continuously positioning the pupil center in real time according to the present application;
FIG. 6-b is a schematic diagram illustrating the positioning of the vertical coordinate of the eyeball area in the method for continuously positioning the pupil center in real time according to the present application;
FIG. 7 is a schematic view of a target projection in the method for continuously positioning the pupil center in real time according to the present application;
fig. 8 is a schematic view illustrating positioning of an eyeball area in the method for continuously positioning the pupil center in real time according to the present application;
FIG. 9 is a schematic structural diagram illustrating an embodiment of a device for real-time continuous pupil center positioning provided herein;
fig. 10 is a schematic structural diagram of another embodiment of the device for real-time continuous pupil center location provided in the present application.
Detailed Description
The application provides a method, a device, a medium and equipment for continuously positioning a pupil center in real time, which are used for realizing the pupil center positioning based on an image gray value and improving the accuracy and the real-time performance of the pupil center positioning.
It should be noted that the method for continuously positioning the pupil center in real time provided by the present application may be applied to a terminal or to a server; the terminal may be, for example, a smart phone, a tablet computer, a smart television, a smart watch or a portable computer, or a fixed terminal such as a desktop computer. For convenience of explanation, the terminal is taken as the execution subject in the following description.
Referring to fig. 1, fig. 1 is a diagram illustrating an embodiment of a method for continuously locating a pupil center in real time according to the present application, the method including:
101. detecting a face image from the video frame, and converting the face image into a gray image;
the terminal obtains video stream through a camera, obtains video frames of color images frame by frame from the video stream, performs face detection on the video frames by using a dlib library in OpenCV (open virtual circuit library), namely a trained 68 feature point face detection model, to obtain face images, and converts the detected face images into gray images. The gray image is also called a gray scale image, which is obtained by dividing the white and black into several levels according to a logarithmic relationship, called gray scale, and the gray scale can be specifically divided into 256 levels. Referring to fig. 2, fig. 2 is a face image after gray processing, that is, a gray image in the present application, and the conversion of the face image into the gray image can reduce subsequent computation and simplify the processing process.
It should be noted that, in the present application, a video frame may be extracted from a real-time video stream of a camera or from an offline video stream, which is not limited herein.
102. Calculating an integral projection of the gray-scale image;
the terminal calculates integral projection of a gray level image of a human face, wherein the integral projection means that the image forms two integral projection vectors along the horizontal direction and/or the vertical direction, in image analysis, the horizontal integral projection and the vertical integral projection can be used for describing a gray level distribution structure of the image, and the gray level distribution characteristics of the image in the corresponding direction can be known through the horizontal integral projection and the vertical integral projection.
103. Determining an eyeball area in the gray level image according to the integral projection;
referring to fig. 2, in the gray image of the face, the eyeball area is darker than the surrounding area, i.e. the gray value is lower, and the eyeball area can be located by using the feature, specifically, the eyeball area is located according to the distribution of the valleys in the integral projection of the gray image.
104. And detecting a plurality of angular points in the eyeball area, and jointly positioning the central position of the TONG-type hole through the angular points.
The pupil is a small circular hole at the center of the iris of an animal or human eye through which light enters the eye; in a gray image, the pupil appears as a dense black circular area. Eye tracking follows the movement of the eyes, and accurate eye tracking is impossible without determining the pupil center position. After determining the eyeball area in the gray image, the terminal detects a plurality of corner points in the eyeball area and then positions the pupil center through multi-corner joint positioning; please refer to fig. 3, which is a schematic diagram of the finally positioned pupil center position in the present application. Once the pupil center position is located, the person's fixation point can be calculated, and continuously locating the pupil center over a period of time realizes eye tracking.
Specifically, a corner is usually defined as an intersection of two edges, or a local neighborhood of the corner should have boundaries of two different regions in different directions, and for detecting the corner of the gray-scale image, a gray-scale change of a pixel neighborhood point, i.e., a change in image brightness, is mainly considered, and a point having a sufficiently large contrast with the brightness of the neighboring point is defined as the corner. The pupil center in the gray image has a sufficiently high brightness contrast with respect to the eyeball area, so that the pupil center can be located by performing corner detection in the eyeball area, and the location accuracy can be further improved by joint location of multiple corners.
Most video-based eye tracking systems in the prior art rely on an infrared camera that illuminates the human eyes with infrared light; light shining into the eyes affects them to some degree and may even cause damage. In this embodiment, pupil center positioning is achieved using only the gray values of the face image, which is non-invasive to the human eyes.
In this embodiment, face detection is first performed on a video frame and the face is converted into a gray image; an eyeball area is preliminarily determined by calculating the gray integral projection; corner detection is then performed in the eyeball area, and the pupil center position in the eyeball area is jointly positioned through the plurality of detected corners. The algorithm is simple and accurate: it can quickly find the human eyes in the input video image and can position and track the change of the pupil position as the eyes move. The whole process requires only gray values to compute the pupil center position and is non-invasive, so even an ordinary camera (such as a smart phone or a network camera) can realize eye tracking with this method, ensuring the accuracy and real-time performance of pupil center positioning under limited conditions; the method therefore has good prospects for popularization and application.
Please refer to fig. 4, and fig. 4 is another embodiment of the method for continuously positioning a pupil center in real time provided by the present application, which includes:
401. detecting a face image from the video frame, and converting the face image into a gray image;
step 401 in this embodiment is similar to step 101 in the previous embodiment, and is not described again here.
402. Respectively calculating a vertical integral projection and a horizontal integral projection of the gray level image;
the terminal respectively calculates integral projection of the gray level image of the face: assuming that the image size is m × n, G (x, y) represents a gray value at (x, y) in the image.
The horizontal integral projection of the image over the region [x1, x2] is H(y), where x1, x2 ∈ [0, m] and x2 > x1:
H(y) = Σ G(x, y), summed over x from x1 to x2    (1)
The vertical integral projection of the image over the region [y1, y2] is V(x), where y1, y2 ∈ [0, n] and y2 > y1:
V(x) = Σ G(x, y), summed over y from y1 to y2    (2)
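The projections of formulas (1) and (2) can be computed directly from the gray matrix. The sketch below (NumPy assumed) sums over the full image for simplicity and also locates the left and right eye abscissas from the trough minima of V(x), as used in step 403 below:

```python
import numpy as np

def integral_projections(gray):
    """H(y): sum of gray values along each row; V(x): sum along each column."""
    g = gray.astype(np.float64)
    H = g.sum(axis=1)  # horizontal integral projection, one value per row y
    V = g.sum(axis=0)  # vertical integral projection, one value per column x
    return H, V

def eye_abscissas(V):
    """Abscissas of the left and right eyeball areas: trough minima of V(x),
    taken separately in the left and right halves of the face image."""
    mid = len(V) // 2
    return int(np.argmin(V[:mid])), mid + int(np.argmin(V[mid:]))
```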
403. Determining the abscissa of the eyeball area according to the vertical integral projection of the gray level image;
referring to fig. 5-a and 5-b, fig. 5-a is a schematic diagram of vertical integral projection of a gray scale image, where the coordinates of the minimum value of the vertical integral projection of the left and right sides, i.e., the coordinates corresponding to the extreme values of the valleys, are used to locate the abscissa, and the coordinates of the minimum value of the vertical integral projection of the gray scale image are shown in fig. 5-b, so that the abscissas of the left and right eyeball areas can be determined.
404. Determining a first extreme value and a second extreme value of a trough in a horizontal integral projection of the gray level image;
steps 404 to 406 are processes of determining an ordinate of an eyeball region, and when determining the ordinate of the eyeball region, considering that the gray level of eyebrows is lower than the gray level of eyeballs under certain conditions, if positioning is directly performed through the trough extremum of the horizontal integral projection, the eyebrows may be mistakenly determined as eyeballs, which may cause inaccurate ordinate positioning, so when determining the ordinate of the eyeball region, determination needs to be performed on the basis of integral projection in combination with differential calculation. Specifically, the minimum two wave troughs in the horizontal integral projection of the gray image are determined, that is, the first extreme value and the second extreme value of the wave troughs are determined, and the two extreme values correspond to the longitudinal coordinates of the eyebrow region and the eyeball region respectively.
Referring to fig. 6-a and 6-b, fig. 6-a is a schematic diagram of the horizontal integral projection of the gray image. It can be seen that the two troughs of the horizontal integral projection correspond to the eyebrow and the eye respectively, but the difference between them is small; if the ordinate were located directly by the trough extremum, the eyebrow position might be selected, as in the positioning result shown in fig. 6-b.
405. Respectively carrying out differential calculation on adjacent pixels in the horizontal direction on the first extreme value and the second extreme value to obtain a differential calculation result;
because the gray scale change of the eyeball area in the horizontal direction is rich, namely the eye white is passed through by skin to the pupil, and then the eye white is passed through by the pupil to the skin, the gray scale distribution of the eyebrow area is concentrated, the change rate in the horizontal direction is small, and the eyebrow area and the eyeball area can be distinguished through the characteristic. Therefore, the terminal respectively carries out difference calculation on adjacent pixels in the horizontal direction on the first extreme value and the second extreme value to obtain a difference calculation result, and the vertical coordinates of the eyeball area and the eyebrow area are distinguished by using the difference calculation result.
Specifically, the calculation formula of the difference between adjacent pixels in the horizontal direction is as follows:
ΔG(x+1,y)=G(x+1,y)-G(x,y) (3)
further, the terminal may further perform an accumulated calculation of absolute values on difference calculation results between adjacent pixels in the horizontal direction of the first extreme value and the second extreme value, and accurately reflect a change rate of the gray value in the horizontal direction of the first extreme value and the second extreme value through the difference accumulated value, where a calculation formula of the difference accumulated value is as follows:
C(y) = Σ |ΔG(x + 1, y)|, summed over x from x1 to x2 - 1    (4)
where C (y) represents a difference accumulation value, the greater the gradation value change rate, the greater the difference accumulation value.
406. Determining the vertical coordinate of the eyeball area according to the difference calculation result;
the terminal can directly determine the ordinate of the eyeball area through the differential accumulated value, namely, the differential accumulated value calculation in the horizontal direction is carried out on the first extreme value y1 and the second extreme value y2, and the determination that the differential accumulated value is larger is determined as the ordinate of the eyeball area.
Further, in order to improve the positioning accuracy of the longitudinal coordinate of the eyeball area, the terminal can calculate the target projection according to the horizontal integral projection and the differential accumulated value through a preset formula, and then determine the extreme value of the trough in the target projection as the longitudinal coordinate of the eyeball area, wherein the preset formula is as follows:
T(y)=a×H(y)-b×C(y) (5)
the values a and b are preset weights, a, b are (0,1), aH (y) represents horizontal integral projection, C (y) represents a difference accumulated value, and the specific values of a and b can be adjusted according to different application scenes and precision requirements.
For the first and second extreme values y1 and y2, the difference accumulated values S(y1) and S(y2) are calculated respectively. If S(y1) < S(y2), H(y1) is assigned the value of H(y2), ensuring that T(y) attains its minimum at the eye; the y at that minimum is determined as the ordinate of the eyeball area. The target projection T(y) calculated by the preset formula is shown in fig. 7, and the positioning result of the eyeball area is shown in fig. 8.
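A compact sketch of steps 404 to 406 under the same NumPy assumption; the weights a and b are illustrative placeholders for the preset weights of formula (5):

```python
import numpy as np

def eye_ordinate(gray, y1, y2, a=0.5, b=0.5):
    """Pick the eyeball-row ordinate from the two candidate trough rows y1, y2."""
    g = gray.astype(np.float64)
    H = g.sum(axis=1)                            # horizontal integral projection H(y)
    C = np.abs(np.diff(g, axis=1)).sum(axis=1)   # difference accumulation C(y), formula (4)
    T = a * H - b * C                            # target projection T(y), formula (5)
    # The eyeball row varies more in the horizontal direction, so T is smaller there.
    return y1 if T[y1] < T[y2] else y2
```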
It should be noted that, in some specific embodiments, after the horizontal and vertical coordinates of the eyeball area are determined through steps 402 to 406, it must also be checked whether they satisfy the set threshold condition. The thresholds are given experimentally: a horizontal threshold τx and a vertical threshold τy; when x < τx and y < τy, the eyeball region E(x, y) is obtained.
407. Judging whether the current frame has data interruption of the eyeball area, if so, executing step 408, otherwise, directly executing step 409;
when the eyeball area is located, the condition that eyelashes or eyelids partially or even severely cover the eyeballs and the condition that the current frame cannot be located in the eyeball area or the located eyeball area is incomplete due to factors such as intrinsic shake and blinking of the eyes often occur, and such a condition can cause data interruption in the application. The influence of blinking is the largest, according to statistics, normal people need not blink ten times per minute on average, normally blink once in 2-6 s, and each blink needs 0.2-0.4 s, which may cause the discontinuity of pupil positioning and even failure of positioning, and further cause the inaccurate realization of eye tracking.
Therefore, in the present embodiment, it is necessary to determine for each frame of image whether the data of the eyeball area is interrupted, that is, whether the current frame successfully locates the eyeball area. If data interruption of the eyeball area occurs in the current frame, step 408 is executed first and then step 409; if no data interruption of the eyeball area occurs, step 409 is executed directly.
It should be noted that, the condition for determining whether data interruption occurs may be specifically set according to different application scenarios and different precision requirements, and in some specific embodiments, the terminal may determine whether data interruption occurs by using the diameter of the eyeball area detected by the current frame. Since the diameter of the eyeball of an adult is usually a fixed value, by setting a preset diameter, if the diameter of the eyeball area detected by the current frame is smaller than the preset diameter, it can be determined that data interruption occurs, and then step 408 is executed; if the diameter of the eyeball area detected by the current frame is greater than or equal to the preset diameter, it is determined that no data interruption occurs, and then step 409 is executed.
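The interruption check can be sketched as a simple diameter test; the preset diameter below is an illustrative value, since the patent leaves its choice to the implementation:

```python
PRESET_DIAMETER = 20  # pixels; illustrative threshold, tune per camera and resolution

def data_interrupted(eye_diameter):
    """True when the detected eyeball area is smaller than the preset diameter,
    i.e. the current frame's eyeball data is considered interrupted."""
    return eye_diameter < PRESET_DIAMETER
```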
408. Acquiring an eyeball area of at least one past frame, and predicting the eyeball area of the current frame by a linear approximation method according to the eyeball area of at least one past frame;
when the data interruption of the eyeball area of the current frame is determined, the eyeball area of at least one past frame is used for predicting the interrupted eyeball area of the current frame, and the data interruption of the current frame is compensated. It should be noted that, when predicting the eyeball area of the current frame according to the eyeball area of at least one past frame, a deep learning method may be specifically adopted, and a mathematical approximation method may also be adopted, where the eyeball area of the current frame is predicted mainly by using the relevant features of the eyeball area of at least one past frame, and the details are not limited herein.
In some specific embodiments, if data interruption occurs in the current frame, the terminal acquires the eyeball area of at least one past frame and determines the interrupted eyeball area of the current frame by a linear approximation method, i.e. the eyeball area during the blink (the (n + 1)-th frame) is predicted from the past n frames.
The following describes the linear approximation method provided in the present application in detail:
assume that the value of the gray function G (x, y) at n sequential times is G ti (x, y) is denoted as G (t) i ) (i =1,2,.., n). The gray value G (t) at time t is optimally linearly approximated by the following formula:
Figure BDA0003871130770000131
t i the error between the time measurement and the approximation is: delta epsilon i =G(t i )-k 0 -k 1 t i The mean square error estimated for n points is:
Figure BDA0003871130770000132
in order to test the relation between the performance of the linear approximation algorithm and the actual position, the performance of the linear approximation algorithm is verified by means of mean square error. The best approximation is to make the above formula E (Delta epsilon) i ) The minimum value is obtained. The process of calculating the coefficients of the approximation function using the least squares method is as follows:
E(Δεi) is a binary function of the independent variables k0 and k1; for E(Δεi) to attain its minimum value, the partial derivatives with respect to k0 and k1 must be zero:
∂E(Δεi)/∂k0 = 0, ∂E(Δεi)/∂k1 = 0    (8)
so that k can be calculated using the elimination method or the claime rule 0 ,k 1
The solutions of the best linear approximation predictor for n = 1, 2, 3, 4, 5 are given in Table 1.
(Table 1 is provided as an image in the original publication.)
Table 1: best linear approximation predictor solution
For this kind of extrapolation, a larger n generally gives a better approximation. Experimental tests show that the accuracy already reaches 95% at n = 3, so, considering the computational cost, n = 3 is selected for prediction in the present application, namely:
(The n = 3 predictor, formula (9), is given as an image in the original publication.)
and when the terminal determines that the data interruption occurs in the current frame, calculating the eyeball area E (x, y) of the current frame from the eyeball areas of the last 3 frames by the above expression.
409. And detecting a plurality of angular points in the eyeball area, and jointly positioning the central position of the TONG-type hole through the angular points.
After determining the eyeball area E(x, y), the terminal applies the Shi-Tomasi algorithm in the obtained eye area: the window is shifted by small displacements u and v in the x and y directions respectively, and corners are detected from the resulting change in gray level. When E(x, y) is translated by (u, v), the gray level variation is:
E(u, v) = Σ ω(x, y) × [G(x + u, y + v) - G(x, y)]²    (10)
where ω(x, y) is a weighting function, generally a Gaussian weighting function.
The Taylor formula is used for G (x + u, y + v):
G(x + u, y + v) ≈ G(x, y) + u × Gx + v × Gy    (11)
where Gx and Gy represent the gradients of the image gray level in the x and y directions respectively. Equation (11) is further derived as:
E(u, v) ≈ [u  v] × M × [u  v]ᵀ    (12)
where:
M = Σ ω(x, y) × [Gx², Gx×Gy; Gx×Gy, Gy²]    (13)
as can be seen from equation (13), the magnitude of the change in the gray level value depends on the autocorrelation function M, and two eigenvalues λ of the calculation matrix M 12 And (4) comparing. Given a threshold τ by experiment, the corner response function R is calculated:
R=min(λ 12 )>τ (14)
that is, if the smaller eigenvalue is still larger than the given threshold, the point is a Shi-Tomasi corner.
Pupil positioning is performed by using a mode of jointly positioning a plurality of Shi-Tomasi angular points so as to improve the accuracy of pupil positioning. The multi-corner joint positioning formula is as follows:
(Formula (15) for the multi-corner joint positioning is given as an image in the original publication.)
where Ri is the smaller of the two eigenvalues of the autocorrelation matrix M for the i-th corner, n is the number of corner points, and C is the final corner, i.e. the pupil center position finally located by the terminal, as shown in fig. 3.
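A sketch of step 409, assuming OpenCV's goodFeaturesToTrack (its implementation of the Shi-Tomasi detector). Because formula (15) is only available as an image in the original, the corners are combined here by a plain average; this is an illustrative stand-in for the joint positioning, not the patent's exact weighting:

```python
import cv2
import numpy as np

def pupil_center(eye_gray, max_corners=10, quality=0.01, min_dist=3):
    """Detect Shi-Tomasi corners in the eyeball region and combine them
    into a single pupil-center estimate (plain averaging shown here)."""
    corners = cv2.goodFeaturesToTrack(eye_gray, maxCorners=max_corners,
                                      qualityLevel=quality, minDistance=min_dist)
    if corners is None:
        return None
    pts = corners.reshape(-1, 2)   # (x, y) coordinates of the detected corners
    return pts.mean(axis=0)        # jointly positioned pupil center
```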
In this embodiment, because the inherent jitter of the eyes and the data interruption caused by blinking and other factors may occur when processing continuous eye diagram data, it is necessary to determine whether data interruption occurs in each frame, predict the eyeball area of the current frame by using the eyeball area of at least one past frame when data interruption occurs, and then perform the positioning of the pupil center, thereby compensating for the pupil center positioning failure and the eye movement tracking failure caused by data interruption, and ensuring the accuracy, continuity and real-time performance of the pupil center positioning. Further, the diameter of the eyeball area is adopted for the data interruption condition to judge, when the data interruption occurs, the eyeball area of which the current frame is interrupted is determined by a linear approximation method by using the eyeball area of at least one frame in the past, and the pupil center position is determined in the eyeball area by using a multi-angle point joint positioning method based on Shi-Tomasi, so that the continuity of the pupil center positioning is further improved.
Further, the present application also provides a gaze estimation method, specifically, the embodiment shown in fig. 1 or fig. 4 of the present application first locates the pupil center position in the video stream, and then performs gaze estimation by combining the facial features of the face in the video stream, so as to obtain the current gaze direction of the user, thereby implementing human-computer interaction.
Referring to fig. 9, fig. 9 is a diagram illustrating an embodiment of the apparatus for continuously positioning a pupil center in real time according to the present application, where the apparatus includes:
a detection unit 901, configured to detect a face image from a video frame, and convert the face image into a grayscale image;
a calculation unit 902 for calculating an integral projection of the gray-scale image;
a determination unit 903 for determining an eyeball region in the grayscale image from the integral projection;
and a positioning unit 904, configured to detect several corner points in the eyeball area, and jointly position the pupil center position through the several corner points.
Optionally, the calculating unit 902 is specifically configured to:
respectively calculating vertical integral projection and horizontal integral projection of the gray level image;
the determination unit 903 includes:
the first determining module 9031 is configured to determine an abscissa of the eyeball area according to a vertical integral projection of the grayscale image;
and the second determining module 9032 is configured to determine a vertical coordinate of the eyeball area according to the horizontal integral projection of the grayscale image.
Optionally, the second determining module 9032 is specifically configured to:
determining a first extreme value and a second extreme value of a trough in a horizontal integral projection of the gray level image;
respectively carrying out differential calculation on adjacent pixels in the horizontal direction on the first extreme value and the second extreme value to obtain a differential calculation result;
and determining the vertical coordinate of the eyeball area according to the difference calculation result.
Optionally, the second determining module 9032 is further specifically configured to:
accumulating and calculating the absolute values of the plurality of difference values to obtain a difference accumulated value;
and determining the vertical coordinate of the eyeball area according to the difference accumulated value.
Optionally, the second determining module 9032 is further specifically configured to:
calculating target projection through a preset formula according to the horizontal integral projection and the difference accumulated value;
determining the ordinate of the eyeball area according to the extreme value of the wave trough in the target projection;
the preset formula is as follows:
T(y)=a×H(y)-b×C(y);
where a and b are preset weights, a, b ∈ (0,1), H (y) represents a horizontal integral projection, and C (y) represents a difference accumulation value.
Optionally, the apparatus further comprises:
a first judging unit 905 configured to judge whether a set threshold condition is satisfied, where the threshold condition is that the horizontal coordinate is smaller than the horizontal threshold and the vertical coordinate is smaller than the vertical threshold;
the determining unit 903 is specifically configured to determine the eyeball area according to the horizontal coordinate and the vertical coordinate when the judgment result of the first judging unit 905 is yes.
Optionally, the apparatus further comprises:
a second judging unit 906, configured to judge whether data interruption of the eyeball area occurs in the current frame;
a predicting unit 907, configured to predict the eyeball area of the current frame according to the eyeball area of at least one past frame when the judgment result of the second judging unit 906 is yes, where the positioning unit 904 is specifically configured to detect a plurality of corner points in the predicted eyeball area, and jointly position the pupil center position through the plurality of corner points;
the positioning unit 904 is further configured to, when the judgment result of the second judging unit 906 is negative, directly detect a plurality of corner points in the eyeball area, and jointly position the pupil center position through the plurality of corner points.
Optionally, the second determining unit 906 is specifically configured to:
determining the diameter of an eyeball area;
if the diameter of the eyeball area is smaller than the preset diameter, determining that data interruption of the eyeball area occurs in the current frame;
and if the diameter of the eyeball area is larger than or equal to the preset diameter, determining that the data interruption of the eyeball area does not occur in the current frame.
Optionally, the prediction unit 907 is specifically configured to:
acquiring an eyeball area of at least one past frame;
and predicting the eyeball area of the current frame by a linear approximation method according to the eyeball area of at least one past frame.
Optionally, the formula of the linear approximation is:
G(t) ≈ k0 + k1 × t;
wherein k0 and k1 are parameters to be solved;
the mean square error of the measured value and the approximation value of n points in the past i frames is as follows:
E(Δεi) = (1/n) × Σ [G(ti) - k0 - k1 × ti]², summed over the n points;
k0 and k1 are solved by minimizing E(Δεi), and the eyeball area of the current frame is predicted according to k0, k1 and the eyeball areas of the past i frames.
Optionally, the positioning unit 904 is specifically configured to:
detecting a plurality of corner points in the eyeball area by using a Shi-Tomasi algorithm;
and jointly positioning the pupil center position according to the detected corner points.
In the device of this embodiment, the functions of each unit and each module correspond to the steps in the method embodiments shown in fig. 1 or fig. 4, and are not described herein again.
Referring to fig. 10, fig. 10 is a schematic diagram illustrating an embodiment of a device for continuously positioning a pupil center in real time according to the present invention, the device includes:
a processor 1001, a memory 1002, an input-output unit 1003, a bus 1004;
the processor 1001 is connected to the memory 1002, the input-output unit 1003, and the bus 1004;
the memory 1002 holds a program that the processor 1001 invokes to perform any of the above methods for continuously locating the pupil center in real time.
The present application also relates to a computer-readable storage medium having a program stored thereon, which when run on a computer causes the computer to perform any of the above methods for continuously locating the center of a pupil in real time.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be substantially implemented or contributed to by the prior art, or all or part of the technical solution may be embodied in a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a read-only memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and the like.

Claims (16)

1. A method for continuously locating the center of a pupil in real time, the method comprising:
detecting a face image from a video frame, and converting the face image into a gray image;
calculating an integral projection of the gray scale image;
determining an eyeball area in the gray-scale image according to the integral projection;
and detecting a plurality of corner points in the eyeball area, and jointly positioning the pupil center position through the plurality of corner points.
2. The method of claim 1, wherein the calculating an integral projection of the grayscale image comprises:
respectively calculating a vertical integral projection and a horizontal integral projection of the gray level image;
the determining an eye region in the grayscale image from the integrated projection comprises:
and determining the abscissa of the eyeball area according to the vertical integral projection of the gray level image, and determining the ordinate of the eyeball area according to the horizontal integral projection of the gray level image.
3. The method of claim 2, wherein the determining the ordinate of the eyeball area according to the horizontal integral projection of the grayscale image comprises:
determining a first extreme value and a second extreme value of a trough in the horizontal integral projection of the grayscale image;
respectively performing difference calculation on adjacent pixels in the horizontal direction for the first extreme value and the second extreme value to obtain a difference calculation result;
and determining the ordinate of the eyeball area according to the difference calculation result.
4. The method according to claim 3, wherein the difference calculation result comprises a plurality of difference values, and the determining the ordinate of the eyeball area according to the difference calculation result comprises:
accumulating the absolute values of the plurality of difference values to obtain a difference accumulated value;
and determining the ordinate of the eyeball area according to the difference accumulated value.
5. The method of claim 4, wherein the determining the ordinate of the eyeball area according to the difference accumulated value comprises:
calculating a target projection through a preset formula according to the horizontal integral projection and the difference accumulated value;
determining the ordinate of the eyeball area according to the extreme value of the trough in the target projection;
the preset formula is as follows:
T(y) = a×H(y) - b×C(y);
wherein T(y) represents the target projection, a and b are preset weights, a, b ∈ (0, 1), H(y) represents the horizontal integral projection, and C(y) represents the difference accumulated value.
6. The method of claim 2, wherein after the determining the abscissa of the eyeball area according to the vertical integral projection of the grayscale image and determining the ordinate of the eyeball area according to the horizontal integral projection of the grayscale image, the method further comprises:
judging whether a set threshold condition is met, wherein the threshold condition is that the abscissa is smaller than a horizontal threshold and the ordinate is smaller than a vertical threshold;
and if so, determining the eyeball area according to the abscissa and the ordinate.
7. The method of claim 1, wherein after the determining an eyeball area in the grayscale image according to the integral projection, the method further comprises:
judging whether data interruption of the eyeball area occurs in the current frame;
if yes, predicting an eyeball area of the current frame according to the eyeball area of at least one past frame, detecting a plurality of corner points in the predicted eyeball area, and jointly locating the pupil center position through the plurality of corner points;
and if not, directly detecting a plurality of corner points in the eyeball area, and jointly locating the pupil center position through the plurality of corner points.
8. The method of claim 7, wherein the judging whether data interruption of the eyeball area occurs in the current frame comprises:
determining a diameter of the eyeball area;
if the diameter of the eyeball area is smaller than the preset diameter, determining that data interruption of the eyeball area occurs in the current frame;
and if the diameter of the eyeball area is larger than or equal to the preset diameter, determining that the data interruption of the eyeball area does not occur in the current frame.
9. The method of claim 7, wherein the predicting an eyeball area of the current frame according to the eyeball area of at least one past frame comprises:
acquiring the eyeball area of the at least one past frame;
and predicting the eyeball area of the current frame by a linear approximation method according to the eyeball area of the at least one past frame.
10. The method of claim 9, wherein the linear approximation is formulated as:
Figure FDA0003871130760000031
wherein k0 and k1 are parameters to be solved;
the mean square error of the measured value and the approximation value of n points in the past i frames is as follows:
Figure FDA0003871130760000032
and k0 and k1 are solved by minimizing E(Δε_i), and the eyeball area of the current frame is predicted according to k0, k1 and the eyeball areas of the past i frames.
11. The method according to any one of claims 1 to 10, wherein the detecting a plurality of corner points in the eyeball area and jointly locating the pupil center position through the plurality of corner points comprises:
detecting a plurality of corner points in the eyeball area through a Shi-Tomasi algorithm;
and jointly locating the pupil center position according to the detected corner points.
12. A gaze estimation method, characterized in that the gaze estimation method comprises:
detecting facial features of a human face in a video frame;
determining a pupil center position of a face in the video frame by the method of any one of claims 1 to 11;
and estimating the gazing direction of the human face in the video frame according to the facial features and the pupil center position.
13. An apparatus for continuously locating the center of a pupil in real time, the apparatus comprising:
the detection unit is used for detecting a face image from a video frame and converting the face image into a gray image;
a calculation unit for calculating an integral projection of the gray scale image;
a determining unit, configured to determine an eyeball area in the grayscale image according to the integral projection;
and the positioning unit is used for detecting a plurality of corner points in the eyeball area and jointly locating the pupil center position through the plurality of corner points.
14. An apparatus for continuously locating the center of a pupil in real time, the apparatus comprising:
the device comprises a processor, a memory, an input and output unit and a bus;
the processor is connected with the memory, the input and output unit and the bus;
the memory holds a program that the processor calls to perform the method of any of claims 1 to 11.
15. A computer-readable storage medium having a program stored thereon, the program, when executed on a computer, performing the method of any one of claims 1 to 11.
16. A gaze location device, comprising a processor and a camera, the processor performing the method of any of claims 1 to 11.
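The projection-based row localization of claims 2 to 5 can be illustrated with a short sketch. The following Python/NumPy fragment is only an illustration under stated assumptions: the weight values a and b, the per-curve normalization, and the use of a single global trough of T(y) are choices made here for clarity, not details specified by the claims.

```python
import numpy as np

def locate_eyeball_row(gray: np.ndarray, a: float = 0.6, b: float = 0.4) -> int:
    """Illustrative sketch of the projection step in claims 2-5.

    gray: 2-D uint8 grayscale face/eye image.
    Returns the row index (ordinate) taken as the eyeball row.
    The weights a, b in (0, 1) are assumed values, not from the patent.
    """
    g = gray.astype(np.float64)

    # Horizontal integral projection H(y): mean intensity of each row.
    H = g.mean(axis=1)

    # Vertical integral projection V(x): mean intensity of each column
    # (claim 2 uses it for the abscissa; it is not used further here).
    V = g.mean(axis=0)

    # Difference accumulated value C(y): sum of absolute horizontal
    # pixel-to-pixel differences in each row (claims 3-4, simplified
    # here to all rows instead of only the two trough extrema).
    C = np.abs(np.diff(g, axis=1)).sum(axis=1)

    # Normalize both curves so the weighted combination is comparable.
    H = (H - H.min()) / (np.ptp(H) + 1e-9)
    C = (C - C.min()) / (np.ptp(C) + 1e-9)

    # Target projection from claim 5: T(y) = a*H(y) - b*C(y).
    T = a * H - b * C

    # Take the trough (minimum) of T(y) as the eyeball ordinate.
    return int(np.argmin(T))
```

Intuitively, the dark iris and pupil pull H(y) down while their edges push C(y) up, so subtracting b×C(y) deepens the trough of T(y) at the eye row, which is why the trough extreme value is used in claim 5.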
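Claims 9 and 10 predict the eyeball area of the current frame from past frames by a linear approximation with parameters k0 and k1 fitted against a mean square error E(Δε_i). The formula images in claim 10 are not reproduced above, so the sketch below assumes the common form p(t) ≈ k0 + k1·t fitted by ordinary least squares; this matches the stated roles of k0, k1 and E(Δε_i) but is an assumption, not the claim's exact formula.

```python
import numpy as np

def predict_center(past_centers):
    """Predict the eyeball-area center of the current frame from past frames.

    past_centers: list of (x, y) centers of the eyeball area in the last
    i frames, oldest first.  Each coordinate is modeled as a linear
    function of the frame index, p(t) ~ k0 + k1 * t, with k0 and k1
    chosen by least squares, i.e. by minimizing the mean square error
    over the past points (cf. claims 9-10).
    """
    if len(past_centers) < 2:
        # Not enough history for a line fit; reuse the latest center.
        return past_centers[-1]

    pts = np.asarray(past_centers, dtype=np.float64)
    t = np.arange(len(pts), dtype=np.float64)   # frame indices 0 .. i-1
    t_next = float(len(pts))                    # index of the current frame

    predicted = []
    for coord in (pts[:, 0], pts[:, 1]):        # fit x and y independently
        k1, k0 = np.polyfit(t, coord, deg=1)    # returns [slope, intercept]
        predicted.append(k0 + k1 * t_next)

    return predicted[0], predicted[1]
```

For example, given the centers from the last five frames, predict_center returns an extrapolated center for the current frame, which can serve as the eyeball area when the projection step is interrupted (claim 7).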
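Claim 11 detects corner points with the Shi-Tomasi algorithm and locates the pupil center jointly from them. OpenCV's goodFeaturesToTrack implements Shi-Tomasi corner detection; how the corners are combined is not detailed in the claim, so the centroid used below, as well as the qualityLevel and minDistance values, are illustrative assumptions.

```python
import cv2
import numpy as np

def pupil_center_from_corners(gray, eye_rect, max_corners=10):
    """Detect Shi-Tomasi corners inside the eyeball area and combine them.

    gray:     full grayscale face image (uint8).
    eye_rect: (x, y, w, h) eyeball area in image coordinates.
    Returns an estimated pupil center in full-image coordinates, or None
    if no corners are found.  Using the centroid of the corners is an
    assumption; the claim only states that the corners jointly locate
    the center.
    """
    x, y, w, h = eye_rect
    roi = gray[y:y + h, x:x + w]

    # Shi-Tomasi corner detection (claim 11).
    corners = cv2.goodFeaturesToTrack(roi, maxCorners=max_corners,
                                      qualityLevel=0.01, minDistance=3)
    if corners is None:
        return None

    # Jointly locate the center: here simply the centroid of the corners.
    cx, cy = corners.reshape(-1, 2).mean(axis=0)
    return float(x + cx), float(y + cy)
```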
CN202211197791.2A 2022-09-29 2022-09-29 Method, device, medium and equipment for continuously positioning pupil center in real time Pending CN115565224A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211197791.2A CN115565224A (en) 2022-09-29 2022-09-29 Method, device, medium and equipment for continuously positioning pupil center in real time

Publications (1)

Publication Number Publication Date
CN115565224A true CN115565224A (en) 2023-01-03

Family

ID=84742416

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211197791.2A Pending CN115565224A (en) 2022-09-29 2022-09-29 Method, device, medium and equipment for continuously positioning pupil center in real time

Country Status (1)

Country Link
CN (1) CN115565224A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130043366A (en) * 2011-10-20 2013-04-30 경북대학교 산학협력단 Gaze tracking apparatus, display apparatus and method therof
CN103996020A (en) * 2014-04-10 2014-08-20 中航华东光电(上海)有限公司 Head mounted eye tracker detection method
CN105930762A (en) * 2015-12-02 2016-09-07 中国银联股份有限公司 Eyeball tracking method and device
CN114764944A (en) * 2020-12-30 2022-07-19 中国科学院长春光学精密机械与物理研究所 Pupil positioning method and device based on angular point detection

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
HOU Xiangdan; ZHAO Dan; LIU Hongpu; GU Junhua: "Human eye localization based on integral projection and difference projection", Computer Engineering & Science (《计算机工程与科学》), vol. 39, no. 3, pages 534-539 *

Similar Documents

Publication Publication Date Title
CN108427503B (en) Human eye tracking method and human eye tracking device
Fogelton et al. Eye blink detection based on motion vectors analysis
Zhu et al. Subpixel eye gaze tracking
US11715231B2 (en) Head pose estimation from local eye region
JP2016515242A (en) Method and apparatus for gazing point estimation without calibration
US20150092983A1 (en) Method for calibration free gaze tracking using low cost camera
CN115482574B (en) Screen gaze point estimation method, device, medium and equipment based on deep learning
CN103885589A (en) Eye movement tracking method and device
CN110807427A (en) Sight tracking method and device, computer equipment and storage medium
WO2020232855A1 (en) Method and apparatus for adjusting screen display on the basis of subtle expression
Ghani et al. GazePointer: A real time mouse pointer control implementation based on eye gaze tracking
JPWO2018078857A1 (en) Gaze estimation apparatus, gaze estimation method, and program recording medium
CN112232128B (en) Eye tracking based method for identifying care needs of old disabled people
Jafari et al. Eye-gaze estimation under various head positions and iris states
CN114005167A (en) Remote sight estimation method and device based on human skeleton key points
Wan et al. Robust and accurate pupil detection for head-mounted eye tracking
CN112863453B (en) Holographic display method and holographic display system
Sadri et al. Particle filtering in the design of an accurate pupil tracking system
Abdulin et al. Method to detect eye position noise from video-oculography when detection of pupil or corneal reflection position fails
CN115565224A (en) Method, device, medium and equipment for continuously positioning pupil center in real time
JP2004157778A (en) Nose position extraction method, program for operating it on computer, and nose position extraction device
CN112528714B (en) Single-light-source-based gaze point estimation method, system, processor and equipment
Elahi et al. Webcam-based accurate eye-central localization
CN114764944A (en) Pupil positioning method and device based on angular point detection
KR100338805B1 (en) Method for detecting drowsiness level

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination