CN114360043A - Model parameter calibration method, sight tracking method, device, medium and equipment - Google Patents

Model parameter calibration method, sight tracking method, device, medium and equipment


Publication number
CN114360043A (application CN202210267130.6A)
Authority
CN
China
Prior art keywords: pupil, eyeball, center, point, model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210267130.6A
Other languages: Chinese (zh)
Other versions: CN114360043B (en)
Inventor
郭岩松
孙其民
付阳
沈忱
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanchang Virtual Reality Institute Co Ltd
Original Assignee
Nanchang Virtual Reality Institute Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanchang Virtual Reality Institute Co Ltd
Priority to CN202210267130.6A
Publication of CN114360043A
Application granted; publication of CN114360043B
Legal status: Active

Abstract

A model parameter calibration method, a sight tracking method, a device, a medium and equipment are provided. The model parameter calibration method comprises: acquiring the images shot while a user gazes at each preset fixation point on a screen, to obtain a human eye image corresponding to each preset fixation point; performing contour point detection on the pupil projection in the human eye image corresponding to the current preset fixation point to obtain a pupil contour point set; performing corneal refraction correction on the pupil contour point set, and determining the intersection point of the user's line of sight with the screen according to the corrected pupil contour point set and the eyeball model; calculating the distance between the position of this intersection point and the real position of the current preset fixation point, and establishing a nonlinear equation for the current preset fixation point by taking each unknown parameter in the eyeball model as a variable and the distance as the objective function; and solving the equation set formed by the nonlinear equations of all the preset fixation points. The method makes model parameter calibration convenient and yields a more accurate gaze direction.

Description

Model parameter calibration method, sight tracking method, device, medium and equipment
Technical Field
The invention relates to the technical field of image processing, in particular to a model parameter calibration method, a sight tracking method, a device, a medium and equipment.
Background
An eye tracking system for a virtual reality headset consists mainly of an eye, a camera and a screen. Near-eye display systems in such headsets typically place an eyepiece between the eye and the screen to magnify the screen. The eye tracking algorithm estimates the coordinates of the fixation point on the screen, or the gaze direction, from the eyeball images captured by the camera.
Existing eye tracking algorithms that estimate the gaze direction usually require building a system model and either measuring the model parameters or using preset values. Different users have different eye characteristics, so their lines of sight differ when gazing at the same point; adopting uniform model parameters for every user therefore introduces gaze-direction estimation errors. A convenient method for calibrating the model parameters on site for each user is thus needed to improve gaze estimation accuracy.
Disclosure of Invention
In view of the above, it is necessary to provide a model parameter calibration method, a sight tracking method, and a corresponding device, medium and equipment that allow the model parameters to be calibrated on site, so as to improve the accuracy of gaze estimation.
A method for calibrating model parameters comprises the following steps:
respectively acquiring images shot when a user watches each preset fixation point on a screen to obtain human eye images corresponding to each preset fixation point, wherein the number of the preset fixation points is set according to the number of unknown parameters in an eyeball model;
carrying out contour point detection on pupil projection in a human eye image corresponding to a current preset fixation point to obtain a pupil contour point set;
performing corneal refraction correction on the pupil contour point set, and determining an intersection point of the sight of the user and the screen according to the corrected pupil contour point set and the eyeball model;
calculating the distance between the position of the intersection point and the real position of the current preset fixation point, and establishing a nonlinear equation corresponding to the current preset fixation point by taking each unknown parameter in the eyeball model as a variable and taking the distance as a target function;
and acquiring the nonlinear equation corresponding to each preset fixation point, and solving the equation set formed by the nonlinear equations corresponding to the preset fixation points to obtain the value of each unknown parameter in the eyeball model.
Further, in the above method for calibrating model parameters, the step of performing corneal refraction correction on the set of pupil contour points includes:
establishing a first optimization function related to the pupil contour points according to the intersection condition of rays respectively emitted by each contour point in the pupil contour point set through the center of a camera and the eyeball model;
and establishing a second optimization function related to the pupil center according to the first optimization function and the center coordinates of the ellipse fitted to the pupil contour points.
Further, in the above method for calibrating model parameters, the first optimization function sums, over the contour points of a frame, a residual determined by how the camera ray through each contour point intersects the eyeball model, and the second optimization function adds to the first a pupil-center consistency term weighted by a Lagrange multiplier, i.e. F2(s) = F1(s) + λ‖P̂_i − P_i‖.
In case I, the ray intersects the cornea; according to the law of refraction, at the corneal surface point q_ij the normal vector n̂, the ray direction d̂ and the refracted-light vector r̂ are coplanar, i.e. r̂·(n̂ × d̂) = 0. The corresponding refracted-light parametric equation is x(t2) = q_ij + t2·r̂, where t2 is the equation parameter, t2 ∈ ℝ, and ℝ denotes the set of real numbers. In case I, p_ij is defined as the intersection of the refracted ray with the pupil plane, and the residual is | ‖p_ij − P_i‖ − r_i |.
In case II, the ray intersects the eyeball but not the cornea, at intersection point q_ij; the residual is defined as ‖q_ij − P_i‖.
Case III means that the ray has no intersection with either the eyeball or the cornea; the ray parameter t0 is then selected to minimize the distance between the ray and the eyeball center, and the residual is defined as that minimum distance plus δ.
In these expressions, i denotes the frame number of the human eye image, j denotes the j-th contour point in the pupil contour point set, E is the position coordinate of the eyeball center, r_i denotes the pupil radius, P_i denotes the coordinates of the pupil center, o_ij denotes the j-th pupil image contour point of the i-th pupil, q_ij denotes the intersection of the camera ray corresponding to that contour point with the eyeball model, ‖·‖ is the norm sign, · denotes the vector inner product, c_i denotes the center of the ellipse fitted to the pupil contour points, P̂_i denotes the pupil center coordinates calculated from c_i, n is the refractive index of the cornea, n_air is the refractive index of air, F denotes the optimization function with respect to the model state, s_1, …, s_N denote the eyeball model states in the 1st to N-th frame images, λ is a Lagrange multiplier, and δ is a positive number chosen so that a residual satisfying case II is always smaller than a residual satisfying case III.
Further, in the above method for calibrating model parameters, the step of determining the intersection point of the user's sight line and the screen according to the corrected pupil contour point set and the eyeball model comprises:
carrying out ellipse fitting according to the corrected pupil contour point set, and calculating the normal direction of the pupil according to the fitted ellipse to obtain the optical axis direction under a camera coordinate system;
calculating a two-dimensional projection vector of the optical axis direction on the corresponding human eye image under a camera coordinate system;
determining the projection of the eyeball center according to the intersection point of the two-dimensional projection vectors of at least two frames of human eye images;
determining the coordinates of the pupil center under a camera coordinate system according to the projection of the eyeball center, the position coordinates of the eyeball center and the pupil-eyeball distance, wherein the pupil-eyeball distance is the distance between the pupil center and the eyeball center;
converting the coordinates of the pupil center under the camera coordinate system into a world coordinate system;
converting the optical axis direction under the camera coordinate system into a world coordinate system, and correcting the optical axis direction converted under the world coordinate system into the visual axis direction under the world coordinate system according to the visual axis optical axis included angle, wherein the visual axis optical axis included angle is the included angle between the visual axis of the eyes and the optical axis;
and determining the sight of the user according to the coordinates of the pupil center in the world coordinate system obtained through conversion and the visual axis direction in the world coordinate system obtained through conversion, and calculating the intersection point of the sight and the screen.
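The final step above reduces to a line–plane intersection. A minimal sketch, assuming the screen is the plane z = 0 of the world coordinate system (the patent takes the screen or its virtual image as the reference frame); `pupil_center` and `visual_axis` are illustrative names, not identifiers from the patent:

```python
import numpy as np

def gaze_screen_intersection(pupil_center, visual_axis, plane_point=None, plane_normal=None):
    """Intersect the gaze ray (pupil_center + t * visual_axis) with the screen plane.

    The screen is modeled as the plane through `plane_point` with normal
    `plane_normal` (defaults: the z = 0 plane). Returns the 3D intersection
    point, or None if the gaze is parallel to the screen.
    """
    plane_point = np.zeros(3) if plane_point is None else np.asarray(plane_point, float)
    plane_normal = np.array([0.0, 0.0, 1.0]) if plane_normal is None else np.asarray(plane_normal, float)
    o = np.asarray(pupil_center, float)
    d = np.asarray(visual_axis, float)
    denom = plane_normal @ d
    if abs(denom) < 1e-12:          # gaze parallel to the screen: no intersection
        return None
    t = plane_normal @ (plane_point - o) / denom
    return o + t * d

# Example: eye 50 units in front of the screen, looking straight at it.
hit = gaze_screen_intersection([10.0, 5.0, 50.0], [0.0, 0.0, -1.0])
```

With the model's pupil center and visual axis already converted to world coordinates, `hit` gives the on-screen gaze point compared against the preset fixation point.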
Further, in the above method for calibrating model parameters, the step of determining the coordinates of the pupil center in the camera coordinate system according to the projection of the eyeball center, the position coordinates of the eyeball center and the pupil-eyeball distance comprises:
calculating the direction of the eyeball center in the camera coordinate system according to the projection of the eyeball center;
determining the distance between the camera and the eyeball center in the world coordinate system according to the position of the eyeball center and the translation vector of the camera;
calculating the coordinates of the eyeball center in the camera coordinate system according to the direction of the eyeball center in the camera coordinate system, the distance between the camera and the eyeball center, and the pupil-eyeball distance;
and determining the coordinates of the pupil center in the camera coordinate system according to the coordinates of the eyeball center in the camera coordinate system and the optical axis direction in the camera coordinate system.
Further, in the above method for calibrating model parameters, the unknown parameters of the eyeball model include: the position coordinates of the eyeball center, the distance between the pupil center and the eyeball center, the distance between the cornea curvature center and the eyeball center, the cornea curvature radius and the included angle between the visual axis and the optical axis.
The invention also discloses a sight tracking method, which comprises the following steps:
correcting parameters of the eyeball model according to the value of the unknown parameters of the eyeball model obtained by the model parameter calibration method;
acquiring a to-be-detected human eye image of the user, and carrying out contour detection on pupil projection in the to-be-detected human eye image to obtain a contour point set;
and performing corneal refraction correction on the contour point set, and determining the intersection point of the sight of the user and the screen according to the corrected contour point set and the eyeball model after parameter correction.
The invention also discloses a model parameter calibration device, which comprises:
the first acquisition module is used for respectively acquiring images shot when a user watches each preset fixation point on a screen so as to obtain human eye images corresponding to each preset fixation point, wherein the number of the preset fixation points is set according to the number of unknown parameters in an eyeball model;
the first pupil detection module is used for detecting contour points of pupil projections in a human eye image corresponding to a current preset fixation point to obtain a pupil contour point set;
the first correction module is used for performing corneal refraction correction on the pupil contour point set and determining an intersection point of the sight of the user and the screen according to the corrected pupil contour point set and the eyeball model;
the equation establishing module is used for calculating the distance between the position of the intersection point and the real position of the current preset fixation point, and establishing a nonlinear equation corresponding to the current preset fixation point by taking each unknown parameter in the eyeball model as a variable and taking the distance as a target function;
and the equation solving module is used for acquiring the nonlinear equation corresponding to each preset gazing point and solving an equation set formed by the nonlinear equations corresponding to the preset gazing points so as to obtain the value of each unknown parameter in the eyeball model.
The invention also discloses a sight tracking device, comprising:
the correction module is used for correcting the parameters of the eyeball model according to the value of the unknown parameters of the eyeball model obtained by the model parameter calibration method;
the second acquisition module is used for acquiring the to-be-detected eye image of the user;
the second pupil detection module is used for carrying out contour detection on the pupil projection in the human eye image to be detected to obtain a contour point set;
a second correction module for performing corneal refractive correction on the set of contour points;
and the determining module is used for determining the intersection point of the sight of the user and the screen according to the corrected contour point set and the eyeball model after parameter correction.
The invention also discloses a readable storage medium on which a computer program is stored, which program, when executed by a processor, performs the method of any of the above.
The invention also discloses an electronic device, which comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the processor realizes the method of any one of the above items when executing the computer program.
The method guides the user's gaze with a plurality of preset fixation points on a screen, establishes a system of nonlinear equations in the unknown parameters of the eyeball model according to measurements taken from the eye images, and solves the system to obtain the model parameters. Since each user's model parameters are corrected during the calibration process, a more accurate gaze direction can be obtained.
Drawings
FIG. 1 is a flow chart of a method for calibrating model parameters according to a first embodiment of the present invention;
FIG. 2 is a flowchart illustrating the corneal refraction correction procedure for the set of pupil contour points according to an embodiment of the present invention;
FIG. 3 is a flowchart illustrating a gaze tracking method according to a third embodiment of the present invention;
FIG. 4 is a block diagram of a model parameter calibration apparatus according to a fourth embodiment of the present invention;
FIG. 5 is a block diagram of a gaze tracking device in a fifth embodiment of the present invention;
fig. 6 is a schematic structural diagram of an electronic device in an embodiment of the invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
These and other aspects of embodiments of the invention will be apparent with reference to the following description and attached drawings. In the description and drawings, particular embodiments of the invention have been disclosed in detail as being indicative of some of the ways in which the principles of the embodiments of the invention may be practiced, but it is understood that the scope of the embodiments of the invention is not limited correspondingly. On the contrary, the embodiments of the invention include all changes, modifications and equivalents coming within the spirit and terms of the claims appended hereto.
According to the model parameter calibration method, a plurality of preset fixation points on a screen guide the user to gaze; a system of nonlinear equations in the model parameters is established according to measurements taken from the eye images and solved to obtain the model parameters, thereby realizing on-site calibration of the model parameters and improving the accuracy of gaze direction estimation.
Example 1
Referring to fig. 1, a method for calibrating model parameters according to a first embodiment of the present invention includes steps S11-S15.
Step S11, images shot when a user watches each preset fixation point on the screen are respectively obtained to obtain human eye images corresponding to each preset fixation point, and the number of the preset fixation points is set according to the number of unknown parameters in the eyeball model.
The present embodiment relates to a system formed by a screen, a camera and an eyeball, in which the known parameters are as follows:
taking the screen, or the virtual image of the screen, as the reference coordinate system: the width w (pixels) and the height h (pixels) of the camera image;
camera intrinsic parameters: the focal length f (pixels), the horizontal translation c_x and the vertical translation c_y;
camera extrinsic parameters: the rotation angle vector r = (r_x, r_y, r_z), the rotation matrix corresponding to the rotation angle vector, and the translation vector T = (T_x, T_y, T_z);
where the subscripts x, y and z respectively denote the three axis directions in three-dimensional coordinates.
The camera internal parameters are calibrated before assembly, and the camera external parameters are calibrated after assembly. The parameters of the model corresponding to the eyeball in the system are unknown parameters. The eyeball model is a physiological model of the eyeball, and the number of the preset fixation points is set according to the number of unknown parameters in the eyeball model. For example, the number of the unknown parameters of the eyeball model in this embodiment is 8, which are as follows:
position of eyeball center
Figure 708877DEST_PATH_IMAGE033
Distance between pupil center and eyeball center
Figure 856962DEST_PATH_IMAGE034
Distance between corneal curvature center and eyeball center
Figure 475025DEST_PATH_IMAGE035
Radius of curvature of cornea
Figure 558519DEST_PATH_IMAGE036
The angle between the visual axis of the eye and the optical axis is expressed as
Figure 262032DEST_PATH_IMAGE037
And
Figure 354753DEST_PATH_IMAGE038
by calibrating at least 8 preset fixation points, the 8 model parameters to be estimated can be solved.
In the model parameter calibration stage, the user is guided to gaze at the 8 preset fixation points in sequence, while human eye images of the user are collected in turn to obtain the human eye image corresponding to each preset fixation point.
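The later solving step pairs one nonlinear equation with each fixation point and solves for the unknowns jointly, which is naturally posed as nonlinear least squares. The sketch below is a toy stand-in: a 3-parameter forward model replaces the 8-parameter eyeball model, and a generic Gauss-Newton loop replaces whatever solver an implementation would use; only the structure (one residual per fixation point, solved jointly) mirrors the text.

```python
import numpy as np

def gauss_newton(residual, p0, iters=50):
    """Minimize ||residual(p)||^2 by Gauss-Newton with a numerical Jacobian."""
    p = np.asarray(p0, float)
    for _ in range(iters):
        r = residual(p)
        J = np.empty((r.size, p.size))
        eps = 1e-6
        for k in range(p.size):
            dp = np.zeros_like(p)
            dp[k] = eps
            J[:, k] = (residual(p + dp) - r) / eps   # forward-difference column
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)
        p = p + step
        if np.linalg.norm(step) < 1e-10:
            break
    return p

# Toy forward model: predicted on-screen gaze point = scale * target + offset.
targets = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.], [0.5, 0.5]])
true_p = np.array([1.2, 0.1, -0.05])                 # scale, offset_x, offset_y

def predict(p, t):
    return p[0] * t + p[1:]

observed = np.array([predict(true_p, t) for t in targets])

def residual(p):
    # One 2D "predicted gaze point minus fixation point" residual per target.
    return np.concatenate([predict(p, t) - o for t, o in zip(targets, observed)])

p_hat = gauss_newton(residual, np.array([1.0, 0.0, 0.0]))
```

In the patent's setting the residual would instead be the screen-space distance from step S14, evaluated with the full eyeball model.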
And step S12, performing contour point detection on the pupil projection in the human eye image corresponding to the current preset gazing point to obtain a pupil contour point set.
For the 8 fixation points, 8 human eye images are obtained respectively, and contour detection is performed on the pupil projection in each human eye image to obtain the corresponding pupil contour point set. The set of pupil-projection contour points detected in the current human eye image is denoted {o_ij}.
Step S13, performing corneal refraction correction on the pupil contour point set, and determining an intersection point of the user' S gaze and the screen according to the corrected pupil contour point set and the eyeball model.
Analysis shows that in existing gaze estimation algorithms, corneal refraction usually introduces a systematic deviation into the measured data related to the eyeball position and gaze direction, which lowers gaze estimation accuracy. Considering this influence of corneal refraction on estimation accuracy, the present embodiment corrects the contour points in the pupil contour point set for corneal refraction.
Specifically, as shown in fig. 2, in one embodiment of the present invention, the step of performing corneal refraction correction on the set of pupil contour points includes steps S131 to S133.
Step S131, establishing a first optimization function related to the pupil contour points according to the intersection condition of the rays respectively emitted from each contour point in the pupil contour point set through the center of the camera and the eyeball model.
Step S132, establishing a second optimization function related to the pupil center according to the first optimization function and the ellipse center coordinate fitted by the pupil contour point.
Step S133, correcting the contour points in the pupil contour point set according to the second optimization function.
Before corneal refraction correction is carried out, the mapping from camera pixel coordinates to space coordinates needs to be established, involving the world, camera, image and pixel coordinate systems. The pixel coordinate system is first mapped to the image coordinate system according to the camera intrinsic parameters, the image coordinate system is then mapped to the camera coordinate system, and finally the camera coordinate system is mapped to the world coordinate system through the calibrated camera extrinsic parameters.
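The chain of mappings above can be sketched as follows. The intrinsic model (f, c_x, c_y) and the extrinsic convention X_c = R·X_w + T are standard pinhole-camera assumptions, not taken verbatim from the patent:

```python
import numpy as np

def pixel_to_camera_ray(u, v, f, cx, cy):
    """Back-project pixel (u, v) to a unit ray direction in the camera frame."""
    d = np.array([(u - cx) / f, (v - cy) / f, 1.0])
    return d / np.linalg.norm(d)

def camera_to_world(points, R, T):
    """Map camera-frame points to the world frame: X_w = R^T (X_c - T).

    Assumes extrinsics of the form X_c = R X_w + T; conventions vary between
    calibration tools, so check the calibration output.
    """
    pts = np.atleast_2d(np.asarray(points, float))
    return (pts - np.asarray(T, float)) @ np.asarray(R, float)  # row form of R^T (x - T)

# Example: the principal point maps to the camera's forward axis.
ray = pixel_to_camera_ray(320, 240, f=500.0, cx=320.0, cy=240.0)
```

With identity extrinsics the two frames coincide, so `camera_to_world(x, np.eye(3), np.zeros(3))` returns `x` unchanged.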
After coordinate mapping is completed, a ray is cast from the camera center (0, 0, 0) through the 3D point corresponding to each contour point of the human eye image, toward the cornea. It can be understood that a pupil contour point in the human eye image is the perspective projection, through corneal refraction, of a point on the real pupil contour onto the camera imaging plane; the ray in this step is the ray corresponding to that perspective projection. The intersection of such a ray with the eyeball model falls into three cases: in case I, the intersection point of the ray and the eyeball model is on the corneal surface; in case II, the intersection point of the ray and the eyeball model is on the eyeball; in case III, the ray has no intersection with either the eyeball or the cornea.
Specifically, the first optimization function for the contour points is established according to the intersection of the rays with the eyeball model, as follows.
For the j-th pupil image contour point o_ij of the i-th pupil, the ray from the camera center through the 3D point corresponding to o_ij is defined as g(t1) = t1·d̂_ij, where t1 is the equation parameter, t1 ∈ ℝ, and ℝ denotes the set of real numbers.
case I: the intersection point q_ij of the ray and the eyeball model is on the corneal surface. According to the law of refraction, at the corneal surface point q_ij the normal vector n̂, the ray direction d̂ and the refracted-light vector r̂ are coplanar, i.e. r̂·(n̂ × d̂) = 0. The corresponding refracted-light parametric equation is x(t2) = q_ij + t2·r̂, where t2 is the equation parameter and t2 ∈ ℝ.
Here i denotes the frame number of the human eye image, j denotes the j-th contour point in the pupil contour point set, r̂ denotes the refracted-light vector, n is the refractive index of the cornea, n_air is the refractive index of air (typically 1), d̂ denotes the ray direction, n̂ denotes the normal vector at the intersection point q_ij, · denotes the vector inner product, and q_ij denotes the intersection of the ray with the corneal surface.
The intersection point p_ij of the refracted ray and the pupil plane is calculated according to the refracted-light parametric equation, and the case I residual is defined as | ‖p_ij − P_i‖ − r_i |.
case II: the intersection point q_ij of the ray and the eyeball model is on the eyeball and not on the cornea. The residual is then defined as ‖q_ij − P_i‖.
case III: the ray has no intersection with either the eyeball or the cornea. The ray parameter t0 is then selected to minimize the distance between the ray and the eyeball center, and the residual is defined as that minimum distance plus δ.
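The case I refraction can be computed with the vector form of Snell's law, which automatically satisfies the coplanarity condition r̂·(n̂ × d̂) = 0 stated above. This is the standard computer-graphics formulation under the stated index ratio, not code from the patent; 1.376 is a commonly cited corneal refractive index:

```python
import numpy as np

eta = 1.0 / 1.376   # n_air / n_cornea for a ray entering the cornea

def refract(d, n, eta):
    """Vector form of Snell's law.

    d: unit incident direction; n: unit surface normal pointing against d;
    eta: ratio of refractive indices (incident / transmitted).
    Returns the unit refracted direction, or None on total internal reflection.
    """
    d = np.asarray(d, float)
    n = np.asarray(n, float)
    cos_i = -n @ d
    sin2_t = eta * eta * (1.0 - cos_i * cos_i)
    if sin2_t > 1.0:                     # total internal reflection
        return None
    r = eta * d + (eta * cos_i - np.sqrt(1.0 - sin2_t)) * n
    return r / np.linalg.norm(r)

# Normal incidence passes straight through unchanged.
r = refract([0.0, 0.0, -1.0], [0.0, 0.0, 1.0], eta)
```

Since the result lies in the span of d̂ and n̂, its inner product with n̂ × d̂ is zero, matching the coplanarity condition in the text.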
Correspondingly, the first optimization function sums these case-dependent residuals over all contour points of the frame, F1(s_i) = Σ_j e_ij, where E is the position coordinate of the eyeball center, r_i denotes the pupil radius, P_i denotes the coordinates of the pupil center, and δ is a positive number chosen so that a residual satisfying case II is always smaller than a residual satisfying case III.
In summary, in case I the ray intersects the cornea, and the distance between the corresponding point on the contour circle and the circle center should be as close as possible to the radius; in case II the ray has no intersection with the cornea but intersects the eyeball, and the distance between the point on the contour circle and the circle center should be as small as possible; in case III the ray intersects neither the cornea nor the eyeball, i.e. it deviates far, and the distance between the point on the contour circle and the eyeball sphere center should be as small as possible.
In practice, the case where the ray does not intersect the cornea may be left out of consideration. According to the equation, it is checked whether the distance from the contour point projected onto the iris to the pupil center is sufficiently close to the pupil radius; when this residual is sufficiently small, the estimated pupil center is close to the true value.
It should be noted that, for case I, when a ray intersects the cornea, the ray involves three points: the camera center, the pupil contour point and the corneal refraction point. When the cornea model is a sphere, the intersection of the ray with the corneal sphere has two possibilities: one intersection point or two intersection points. When the ray is tangent there is one intersection point, i.e. only one corneal refraction point, and in this case that point is the target refraction point; when the ray produces two intersection points with the cornea, there are two corneal refraction points, each producing its own refracted ray, and in this case the intersection point closer to the camera is selected as the target refraction point. The refracted-light vector is the vector of the refracted ray generated at the target refraction point.
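Finding the target refraction point, i.e. the corneal sphere intersection nearer the camera, is a standard ray-sphere test. A sketch with illustrative numbers (7.8 mm is a typical corneal curvature radius, not a value from the patent):

```python
import numpy as np

def ray_sphere_nearest(origin, direction, center, radius):
    """Nearest intersection of the ray origin + t*direction (t >= 0) with a sphere.

    Returns the intersection point closest to the ray origin (the choice of
    target refraction point described above when there are two hits), or None
    if the ray misses the sphere.
    """
    o = np.asarray(origin, float)
    d = np.asarray(direction, float)
    d = d / np.linalg.norm(d)
    oc = o - np.asarray(center, float)
    b = oc @ d
    disc = b * b - (oc @ oc - radius * radius)
    if disc < 0.0:                        # no intersection with the sphere
        return None
    root = np.sqrt(disc)
    for t in (-b - root, -b + root):      # smaller t first => point nearer the camera
        if t >= 0.0:
            return o + t * d
    return None

# Camera at the origin looking down +z at a cornea sphere centered 40 mm away.
hit = ray_sphere_nearest([0, 0, 0], [0, 0, 1], center=[0, 0, 40.0], radius=7.8)
```

A `None` return corresponds to the ray leaving case I, to be handled by the case II/III residuals instead.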
The intersection point p_ij of the refracted ray and the pupil plane is calculated as follows:
the refracted-light parametric equation, the iris plane equation and the corneal sphere equation are solved simultaneously to obtain the intersection of the refracted ray with the pupil plane.
The corneal sphere equation is determined by the center of the corneal sphere and the corneal curvature radius, where the center of the corneal sphere is C = E + d_c·ω; here E is the position coordinate of the eyeball center, d_c is the distance between the corneal curvature center and the eyeball sphere center, and ω is the optical axis direction.
The iris plane equation is determined by the position coordinate P of the pupil center and the optical axis.
For a given eyeball center E, the corneal sphere center C = E + d_c·ω can be obtained from d_c and the optical axis ω; then, from the corneal sphere center C and the corneal curvature radius r_c, the corneal sphere equation can be determined.
From the iris circle center P (i.e. the pupil center) and the iris normal ω (i.e. the optical axis), the iris plane equation in point-normal form can be determined.
At any time, the state of the eyeball model is determined by the position coordinates E of the eyeball center and the state vector s_i = (θ_i, φ_i, r_i), where θ_i and φ_i represent the zenith angle and azimuth angle of the optical axis in the spherical coordinate system of the eyeball, and r_i represents the pupil radius.
Combining the refracted-ray parametric equation, the iris plane equation, and the corneal sphere equation, the coordinates of the intersection point of the refracted ray with the iris plane can be calculated, i.e., the corresponding point on the pupil contour circle under the current model parameter values.
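The Case I computation described above (refract the camera ray at the corneal sphere, then intersect the refracted ray with the iris plane) can be sketched as follows. This is a minimal illustration under assumed names: E is the eyeball center, g the optical axis, d_ce the cornea-center-to-eyeball-center distance, r_c the corneal curvature radius, and d_pe the pupil-eyeball distance; the refractive-index constants are placeholders, not values taken from this document.

```python
import numpy as np

# Assumed illustrative constants; the calibrated values of E, g, d_ce, r_c,
# d_pe come from the model parameter calibration described in this document.
N_AIR = 1.0       # refractive index of air
N_CORNEA = 1.376  # a commonly used corneal refractive index (placeholder)

def refract(d, n_hat, n1, n2):
    """Vector form of Snell's law: refract unit direction d at a surface with
    unit normal n_hat facing the incoming ray, from medium n1 into n2."""
    cos_i = -np.dot(n_hat, d)
    eta = n1 / n2
    k = 1.0 - eta ** 2 * (1.0 - cos_i ** 2)
    if k < 0.0:
        return None  # total internal reflection
    return eta * d + (eta * cos_i - np.sqrt(k)) * n_hat

def ray_sphere_near(o, d, c, r):
    """Parameter t of the ray-sphere intersection closer to the ray origin
    (the target refraction point of Case I), or None if the ray misses."""
    oc = o - c
    b = np.dot(oc, d)
    disc = b ** 2 - (np.dot(oc, oc) - r ** 2)
    if disc < 0.0:
        return None
    t = -b - np.sqrt(disc)
    return t if t > 0.0 else None

def pupil_plane_point(o, d, E, g, d_ce, r_c, d_pe):
    """Refract the camera ray (origin o, direction d) at the corneal sphere
    and intersect the refracted ray with the iris (pupil) plane."""
    d = d / np.linalg.norm(d)
    g = g / np.linalg.norm(g)
    c = E + d_ce * g                    # center of the corneal sphere
    t = ray_sphere_near(o, d, c, r_c)
    if t is None:
        return None                     # Cases II/III are handled separately
    s = o + t * d                       # target refraction point on the cornea
    n_hat = (s - c) / r_c               # outward normal (faces the camera)
    r_vec = refract(d, n_hat, N_AIR, N_CORNEA)
    if r_vec is None:
        return None
    P = E + d_pe * g                    # pupil center on the optical axis
    denom = np.dot(g, r_vec)
    if abs(denom) < 1e-12:
        return None                     # refracted ray parallel to the plane
    t2 = np.dot(g, P - s) / denom       # point-normal plane: g·(x − P) = 0
    return s + t2 * r_vec
```

For an on-axis ray the refracted ray continues undeviated and lands at the pupil center, which gives a quick sanity check of the geometry.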
After the projections of the pupil contour points are calculated, the projection of the pupil center point is calculated. Taking further into account that the spatial coordinates of the pupil center P_i should also be close to the pupil center recovered from the fitted ellipse, a second optimization function can be obtained. The second optimization function is:

D₂ = D(s_1, …, s_N) + λ Σ_i ||P(c_i) − P_i||²

wherein c_i denotes the center of the ellipse fitted to the pupil contour points, P(c_i) denotes the pupil center coordinates calculated from c_i, D(s_1, …, s_N) represents the optimization function with respect to the model state, and λ is the Lagrange multiplier used when the constrained objective function is constructed by the Lagrange multiplier method. When the optimization function D₂ is small enough, the found pupil centers are considered to be the real pupil centers, and therefore more accurate coordinates of the pupil contour points can be determined.
Further, in an embodiment of the present invention, the step of determining an intersection point of the line of sight of the user and the screen according to the corrected pupil contour point set and the eyeball model includes:
carrying out ellipse fitting according to the corrected pupil contour point set, and calculating the normal direction of the pupil according to the fitted ellipse to obtain the optical axis direction under a camera coordinate system;
calculating a two-dimensional projection vector of the optical axis direction on the corresponding human eye image under a camera coordinate system;
determining the projection of the eyeball center according to the intersection point of the two-dimensional projection vectors of at least two frames of human eye images;
determining the coordinates of the pupil center under a camera coordinate system according to the projection of the eyeball center, the position coordinates of the eyeball center and the pupil-eyeball distance, wherein the pupil-eyeball distance is the distance between the pupil center and the eyeball center;
converting the coordinates of the pupil center under the camera coordinate system into a world coordinate system;
converting the optical axis direction under the camera coordinate system into a world coordinate system, and correcting the optical axis direction converted under the world coordinate system into the visual axis direction under the world coordinate system according to the visual axis optical axis included angle, wherein the visual axis optical axis included angle is the included angle between the visual axis of the eyes and the optical axis;
and determining the sight of the user according to the coordinates of the pupil center in the world coordinate system obtained through conversion and the visual axis direction in the world coordinate system obtained through conversion, and calculating the intersection point of the sight and the screen.
The step of determining the coordinates of the pupil center under the camera coordinate system according to the projection of the eyeball center, the position coordinates of the eyeball center and the pupil-eyeball distance comprises the following steps:
calculating the direction of the eyeball center in a camera coordinate system according to the projection of the eyeball center;
determining the distance between the camera and the eyeball center under the world coordinate according to the position of the eyeball center and the translation vector of the camera;
calculating the coordinates of the eyeball center under a camera coordinate system according to the direction of the eyeball center in the camera coordinate system, the distance between the camera and the eyeball center and the pupil-eyeball distance;
and determining the coordinates of the pupil center in the camera coordinate system according to the coordinates of the eyeball center in the camera coordinate system and the optical axis direction in the camera coordinate system.
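The steps above can be sketched under a pinhole-camera assumption. The function names, and the convention that the extrinsic translation vector is used directly as the camera position when measuring the camera-to-eyeball distance, are illustrative assumptions rather than the document's exact formulation.

```python
import numpy as np

def eyeball_dir_cam(u, v, cx, cy, f):
    """Back-project the eyeball-center projection pixel (u, v) through a
    pinhole camera with principal point (cx, cy) and focal length f (pixels)
    to a normalized direction in the camera coordinate system."""
    d = np.array([(u - cx) / f, (v - cy) / f, 1.0])
    return d / np.linalg.norm(d)

def pupil_center_cam(u, v, cx, cy, f, E_w, t_cam, g_cam, d_pe):
    """Pupil center in camera coordinates: scale the back-projected direction
    by the camera-to-eyeball distance, then step d_pe along the optical axis."""
    dist = np.linalg.norm(E_w - t_cam)             # camera-to-eyeball distance
    E_c = dist * eyeball_dir_cam(u, v, cx, cy, f)  # eyeball center, camera frame
    return E_c + d_pe * g_cam / np.linalg.norm(g_cam)
```

With the eyeball centered on the optical axis of the camera, the pupil center lands d_pe closer to the camera along the eye's optical axis, as expected.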
Step S14, calculating the distance between the position of the intersection point and the real position of the current preset fixation point, and establishing a nonlinear equation corresponding to the current preset fixation point by taking each unknown parameter in the eyeball model as a variable and taking the distance as a target function.
And step S15, acquiring a nonlinear equation corresponding to each preset gazing point, and solving an equation set formed by the nonlinear equations corresponding to the preset gazing points to obtain the values of the unknown parameters in the eyeball model.
Specifically, let the intersection point of the user's line of sight g with the screen S be t_i. The coordinates of the current preset fixation point in the world coordinate system are t̂_i, which is the real position of the current preset fixation point.

The distance e_i between t_i and t̂_i is calculated as:

e_i = ||t_i − t̂_i||

where || || is the norm symbol.
A system of nonlinear equations is established with the 8 fixation points t̂_1, …, t̂_8 and solved to obtain the 8 model parameters to be estimated,

x = (E, d_pe, d_ce, r_c, α, β)

wherein E contains the parameters of the three coordinate axes, d_pe is the distance between the pupil center and the eyeball center, d_ce is the distance between the corneal curvature center and the eyeball center, r_c is the corneal curvature radius, and (α, β) is the included angle between the visual axis and the optical axis. The system of nonlinear equations may be solved using the Levenberg-Marquardt algorithm.
In particular, with e_1, …, e_8 denoting the distances e_i corresponding to the 8 fixation points, the calibration solves

x* = argmin_x ||(e_1, …, e_8)||₂²

where || ||₂ denotes the two-norm; that is, the model parameter x that minimizes the sum of the squared fixation-point errors is sought.
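As a sketch of the solving step, a minimal Levenberg-Marquardt loop with a finite-difference Jacobian is shown below; in practice a library least-squares solver would be applied to the residual vector (e_1, …, e_8). The loop structure and damping schedule are illustrative assumptions, not this document's implementation.

```python
import numpy as np

def levenberg_marquardt(residual, x0, iters=200, lam=1e-3):
    """Minimal Levenberg-Marquardt sketch for x* = argmin_x sum(residual(x)**2),
    of the kind used to fit the 8 eyeball-model parameters from the 8
    fixation-point errors e_1..e_8."""
    x = np.asarray(x0, dtype=float)

    def jac(x, eps=1e-6):
        # Forward-difference Jacobian of the residual vector.
        r0 = residual(x)
        J = np.empty((r0.size, x.size))
        for k in range(x.size):
            xk = x.copy()
            xk[k] += eps
            J[:, k] = (residual(xk) - r0) / eps
        return r0, J

    for _ in range(iters):
        r, J = jac(x)
        A = J.T @ J + lam * np.eye(x.size)    # damped normal equations
        step = np.linalg.solve(A, -J.T @ r)
        x_new = x + step
        if np.sum(residual(x_new) ** 2) < np.sum(r ** 2):
            x, lam = x_new, lam * 0.5          # accept step, reduce damping
        else:
            lam *= 10.0                        # reject step, increase damping
        if np.linalg.norm(step) < 1e-10:
            break
    return x
```

Passing a residual function built from the fixation-point errors then yields the calibrated parameter vector x.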
In this embodiment, a plurality of preset fixation points on the screen guide the user's gaze, a system of nonlinear equations in the unknown parameters of the eyeball model is established from the eye-image measurements, and the system is solved to obtain the model parameters. The model parameters are thus corrected for each user during the calibration process, so that a more accurate gaze direction can be obtained.
Example 2
The following describes, as a concrete working example, the step of determining the intersection point of the user's line of sight with the screen from the corrected pupil contour point set and the eyeball model.
The calculation involves a world coordinate system and a camera coordinate system. Let the x-axis and y-axis of the world coordinate system lie in the screen plane and the z-axis be perpendicular to the screen plane, with the origin of the world coordinate system at the center of the screen, the x-axis pointing horizontally to the right, the y-axis pointing vertically down, and the z-axis pointing toward the eyeball. The camera coordinate system approximately coincides with the world coordinate system but is tilted up about the x-axis, with slight rotations about the y-axis and z-axis. The subscript c denotes the camera coordinate system and the subscript w denotes the world coordinate system; the superscript o denotes the optical axis of the eye and the superscript v denotes the visual axis of the eye.
First, the pupil projection is fitted to an ellipse. In a specific implementation, the contour of the corneal-refraction-corrected pupil projection is fitted to an ellipse ε_i. From the fitted ellipse ε_i, the pupil normal, i.e., the direction g_c^o of the optical axis in the camera coordinate system, is calculated, and the two-dimensional projection vector p_g of the three-dimensional optical axis direction on the image is calculated.
Secondly, the eyeball-center projection is initialized. Specifically, the intersection point of the two-dimensional projection vectors p_g of at least two images is computed and taken as the eyeball-center projection (u, v). From the camera intrinsic parameters and (u, v), the direction of the eyeball center in the camera coordinate system is calculated as

ê_c ∝ ((u − c_x)/f, (v − c_y)/f, 1)

and normalized, where c_x and c_y are respectively the abscissa and ordinate of the center pixel of the image sensor, and f is the focal length of the camera.
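The eyeball-center projection is the common intersection of the optical-axis projection vectors; with measurement noise the vectors will not meet exactly, so a least-squares intersection of the 2D lines is a natural formulation. The following sketch assumes that formulation (the document only states that the intersection point is taken):

```python
import numpy as np

def intersect_projection_lines(points, dirs):
    """Least-squares intersection of 2D lines (point p_i, direction v_i),
    used to initialize the eyeball-center projection from the optical-axis
    projection vectors of at least two eye images."""
    A = np.zeros((2, 2))
    b = np.zeros(2)
    for p, v in zip(points, dirs):
        v = v / np.linalg.norm(v)
        # Projector onto the normal space of the line direction: (I - v v^T).
        P = np.eye(2) - np.outer(v, v)
        A += P
        b += P @ p
    return np.linalg.solve(A, b)
```

With exactly two non-parallel lines this reduces to their ordinary intersection point; with more lines it returns the point minimizing the summed squared distances to all of them.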
Finally, the model parameters are fitted. Specifically:

the fixation-point coordinates in the world coordinate system are set to t̂;

the direction g_c^o of the eye's optical axis in the camera coordinate system is calculated from the corresponding eye image;

the distance between the camera and the eyeball center is calculated as d_E = ||E_w − t||, where E_w is the eyeball-center coordinate in the world coordinate system and t represents the translation vector in the camera extrinsic parameters;

the eyeball-center coordinates in the camera coordinate system are calculated as E_c = d_E · ê_c;

the pupil-center coordinates in the camera coordinate system are calculated as P_c = E_c + d_pe · g_c^o, where d_pe is the distance between the pupil center and the eyeball center;

the pupil center P_c in the camera coordinate system is converted to the world coordinate system, obtaining P_w = R⁻¹(P_c − t), where R is the camera extrinsic parameter, i.e., the rotation matrix corresponding to the rotation-angle vector θ;

the optical-axis direction g_c^o in the camera coordinate system is converted to the world coordinate system as g_w^o = R⁻¹ g_c^o;

the optical-axis direction g_w^o in the world coordinate system is corrected to the visual-axis direction g_w^v in the world coordinate system by offsetting its zenith and azimuth angles by (α, β), where (α, β) is the included angle between the visual axis and the optical axis of the eye;

the line of sight g(t) = P_w + t · g_w^v is defined by the pupil center P_w and the visual-axis direction g_w^v, and the intersection point of the line of sight g with the screen plane S is calculated.
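The last two steps (optical-to-visual axis correction and ray-screen intersection) can be sketched as follows. Applying (α, β) as offsets to the zenith and azimuth angles of the optical axis is an assumed convention consistent with the spherical-angle state description earlier in this document; the screen is taken as the plane z = 0 of the world coordinate system, matching the setup of this example.

```python
import numpy as np

def to_spherical(v):
    """Unit vector -> (zenith, azimuth) angles."""
    v = v / np.linalg.norm(v)
    return np.arccos(v[2]), np.arctan2(v[1], v[0])

def from_spherical(theta, phi):
    s = np.sin(theta)
    return np.array([s * np.cos(phi), s * np.sin(phi), np.cos(theta)])

def visual_axis(g_opt, alpha, beta):
    """Correct the optical axis to the visual axis by offsetting the zenith
    and azimuth angles by the calibrated per-user pair (alpha, beta)."""
    theta, phi = to_spherical(g_opt)
    return from_spherical(theta + alpha, phi + beta)

def screen_intersection(P_w, g_v, plane_z=0.0):
    """Intersect the line of sight g(t) = P_w + t*g_v with the screen plane
    z = plane_z (the world x-y plane lies in the screen in this setup)."""
    if abs(g_v[2]) < 1e-12:
        return None          # line of sight parallel to the screen
    t = (plane_z - P_w[2]) / g_v[2]
    if t < 0.0:
        return None          # screen is behind the eye
    return P_w + t * g_v
```

With zero correction angles the visual axis equals the optical axis, and a gaze straight down the z-axis hits the screen directly below the pupil center.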
Example 3
Referring to fig. 3, a gaze tracking method according to a third embodiment of the present invention includes steps S31 to S33.
Step S31, correcting the parameters of the eyeball model according to the values of the unknown parameters of the eyeball model obtained by the model parameter calibration method.
And step S32, acquiring the human eye image to be detected of the user, and carrying out contour detection on the pupil projection in the human eye image to be detected to obtain a contour point set.
Step S33, performing corneal refraction correction on the contour point set, and determining an intersection point between the user' S sight line and the screen according to the corrected contour point set and the eyeball model after parameter correction.
The parameters of the eyeball model are calibrated by the model parameter calibration method of the above embodiments. The current eye image of the user to be measured is acquired, and contour detection is performed on the pupil projection in the image to obtain a contour point set. Corneal refraction correction is performed on the contour point set to eliminate the effect of corneal refraction. The corrected contour point set is input into the system formed by the screen, the camera, and the eyeball, and the intersection point of the user's line of sight with the screen is calculated.
Referring to fig. 4, a model parameter calibration apparatus according to a fourth embodiment of the present invention includes:
a first obtaining module 41, configured to obtain images captured when a user watches each preset gazing point on a screen, respectively, so as to obtain a human eye image corresponding to each preset gazing point, where the number of the preset gazing points is set according to the number of unknown parameters in an eyeball model;
the first pupil detection module 42 is configured to perform contour point detection on pupil projections in a human eye image corresponding to a current preset gaze point to obtain a pupil contour point set;
a first correction module 43, configured to perform corneal refraction correction on the pupil contour point set, and determine an intersection point between the user's gaze and the screen according to the corrected pupil contour point set and the eyeball model;
an equation establishing module 44, configured to calculate a distance between the position of the intersection and a real position of the current preset gaze point, and establish a nonlinear equation corresponding to the current preset gaze point by using each unknown parameter in the eyeball model as a variable and using the distance as a target function;
and the equation solving module 45 is configured to obtain a nonlinear equation corresponding to each preset gazing point, and solve an equation set formed by the nonlinear equations corresponding to each preset gazing point to obtain values of each unknown parameter in the eyeball model.
Referring to fig. 5, a gaze tracking apparatus according to a fifth embodiment of the present invention includes:
a correction module 51, configured to correct parameters of the eyeball model according to values of unknown parameters of the eyeball model obtained by the model parameter calibration method;
a second obtaining module 52, configured to obtain an image of a human eye to be detected of the user;
the second pupil detection module 53 is configured to perform contour detection on a pupil projection in the human eye image to be detected, so as to obtain a contour point set;
a second correction module 54 for performing corneal refractive correction on the set of contour points;
and a determining module 55, configured to determine an intersection point between the line of sight of the user and the screen according to the corrected contour point set and the eyeball model after parameter modification.
The model parameter calibration apparatus provided by this embodiment of the present invention has the same implementation principle and technical effects as the method embodiments; for brevity, where the apparatus embodiments omit details, refer to the corresponding content in the method embodiments.
In another aspect, the present invention further provides an electronic device, please refer to fig. 6, which includes a processor 10, a memory 20, and a computer program 30 stored on the memory and executable on the processor, wherein the processor 10 executes the computer program 30 to implement the method in the above embodiment.
The electronic device may be, but is not limited to, a virtual reality headset, a computer, a server, and the like. Processor 10 may be, in some embodiments, a Central Processing Unit (CPU), controller, microcontroller, microprocessor or other data Processing chip that executes program code stored in memory 20 or processes data.
The memory 20 includes at least one type of readable storage medium, which includes a flash memory, a hard disk, a multimedia card, a card type memory (e.g., SD or DX memory, etc.), a magnetic memory, a magnetic disk, an optical disk, and the like. The memory 20 may in some embodiments be an internal storage unit of the electronic device, for example a hard disk of the electronic device. The memory 20 may also be an external storage device of the electronic device in other embodiments, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, provided on the electronic device. Further, the memory 20 may also include both an internal storage unit and an external storage device of the electronic apparatus. The memory 20 may be used not only to store application software installed in the electronic device and various types of data, but also to temporarily store data that has been output or will be output.
Optionally, the electronic device may further comprise a user interface, a network interface, a communication bus, etc., the user interface may comprise a Display (Display), an input unit such as a Keyboard (Keyboard), and the optional user interface may further comprise a standard wired interface, a wireless interface. Alternatively, in some embodiments, the display may be an LED display, a liquid crystal display, a touch-sensitive liquid crystal display, an OLED (Organic Light-Emitting Diode) touch device, or the like. The display, which may also be referred to as a display screen or display unit, is suitable, among other things, for displaying information processed in the electronic device and for displaying a visualized user interface. The network interface may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface), typically used to establish a communication link between the device and other electronic devices. The communication bus is used to enable connection communication between these components.
It should be noted that the configuration shown in fig. 6 does not constitute a limitation of the electronic device, and in other embodiments the electronic device may include fewer or more components than shown, or some components may be combined, or a different arrangement of components.
The invention also proposes a computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method as in the above-mentioned embodiments.
Those skilled in the art will understand that the logic and/or steps represented in the flowcharts or otherwise described herein, for example an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device (such as a computer-based system, a system including a processor, or another system that can fetch and execute instructions from the instruction execution system, apparatus, or device). For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
The above-mentioned embodiments only express several embodiments of the present invention, and the description thereof is more specific and detailed, but not construed as limiting the scope of the present invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the inventive concept, which falls within the scope of the present invention. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (11)

1. A method for calibrating model parameters is characterized by comprising the following steps:
respectively acquiring images shot when a user watches each preset fixation point on a screen to obtain human eye images corresponding to each preset fixation point, wherein the number of the preset fixation points is set according to the number of unknown parameters in an eyeball model;
carrying out contour point detection on pupil projection in a human eye image corresponding to a current preset fixation point to obtain a pupil contour point set;
performing corneal refraction correction on the pupil contour point set, and determining an intersection point of the sight of the user and the screen according to the corrected pupil contour point set and the eyeball model;
calculating the distance between the position of the intersection point and the real position of the current preset fixation point, and establishing a nonlinear equation corresponding to the current preset fixation point by taking each unknown parameter in the eyeball model as a variable and taking the distance as a target function;
and acquiring a nonlinear equation corresponding to each preset gazing point, and solving an equation set formed by the nonlinear equations corresponding to the preset gazing points to obtain the value of each unknown parameter in the eyeball model.
2. The method for calibrating the model parameters of claim 1, wherein said step of performing corneal refraction correction on said set of pupil contour points comprises:
establishing a first optimization function related to the pupil contour points according to the intersection condition of rays respectively emitted by each contour point in the pupil contour point set through the center of a camera and the eyeball model;
and establishing a second optimization function related to the pupil center according to the first optimization function and the ellipse center coordinate fitted by the pupil contour point.
3. The model parameter calibration method according to claim 2, wherein the first optimization function is:

D(s_1, …, s_N) = Σ_i Σ_j δ_ij²

and the second optimization function is:

D₂ = D(s_1, …, s_N) + λ Σ_i ||P(c_i) − P_i||²

wherein Case I is the case in which the ray intersects the cornea; according to the law of refraction, the normal vector n̂ of the corneal surface at the intersection point s, the ray direction d, and the refracted-light vector r are coplanar, i.e., ⟨n̂ × d, r⟩ = 0, and the corresponding refracted-ray parametric equation is p = s + t₂ · r, t₂ ∈ ℝ, t₂ being the equation parameter and ℝ denoting the set of real numbers; in Case I, define δ_ij = ||p_ij − P_i|| − r_i, where p_ij is the intersection point of the refracted ray with the pupil plane;

Case II is the case in which the ray intersects the eyeball but not the cornea; define δ_ij = ||q_ij − P_i|| − r_i;

Case III means that the ray intersects neither the eyeball nor the cornea; in this case the ray parameter t₀ is selected to minimize the distance between the ray and the eyeball center, and define δ_ij as that minimum ray-to-eyeball-center distance plus w;

wherein i denotes the frame number of the human eye image, j denotes the j-th contour point in the pupil contour point set, E is the position coordinate of the eyeball center, r_i denotes the pupil radius, P_i denotes the coordinates of the pupil center, w is a positive number, u_ij denotes the j-th pupil image contour point of the i-th pupil, q_ij denotes the intersection point of the camera ray corresponding to the pupil image contour point with the eyeball model, || || is the norm symbol, ⟨·,·⟩ denotes the vector inner product, s denotes the intersection of the ray with the corneal surface, c_i denotes the center of the ellipse fitted to the pupil contour points, P(c_i) denotes the pupil center coordinates calculated from c_i, n is the refractive index of the cornea, n₀ is the refractive index of air, D denotes the optimization function with respect to the model state, s_1, …, s_N denote the states of the eyeball model in the 1st to N-th frame images, and λ is the Lagrange multiplier.
4. The method for calibrating model parameters according to claim 1, wherein the step of determining the intersection point of the user's gaze and the screen based on the corrected set of pupil contour points and the eyeball model comprises:
carrying out ellipse fitting according to the corrected pupil contour point set, and calculating the normal direction of the pupil according to the fitted ellipse to obtain the optical axis direction under a camera coordinate system;
calculating a two-dimensional projection vector of the optical axis direction on the corresponding human eye image under a camera coordinate system;
determining the projection of the eyeball center according to the intersection point of the two-dimensional projection vectors of at least two frames of human eye images;
determining the coordinates of the pupil center under a camera coordinate system according to the projection of the eyeball center, the position coordinates of the eyeball center and the pupil-eyeball distance, wherein the pupil-eyeball distance is the distance between the pupil center and the eyeball center;
converting the coordinates of the pupil center under the camera coordinate system into a world coordinate system;
converting the optical axis direction under the camera coordinate system into a world coordinate system, and correcting the optical axis direction converted under the world coordinate system into the visual axis direction under the world coordinate system according to the visual axis optical axis included angle, wherein the visual axis optical axis included angle is the included angle between the visual axis of the eyes and the optical axis;
and determining the sight of the user according to the coordinates of the pupil center in the world coordinate system obtained through conversion and the visual axis direction in the world coordinate system obtained through conversion, and calculating the intersection point of the sight and the screen.
5. The method for calibrating model parameters according to claim 4, wherein the step of determining the coordinates of the pupil center in the camera coordinate system according to the projection of the eyeball center, the position coordinates of the eyeball center, and the pupil-eyeball distance comprises:
calculating the direction of the eyeball center in a camera coordinate system according to the projection of the eyeball center;
determining the distance between the camera and the eyeball center under the world coordinate according to the position of the eyeball center and the translation vector of the camera;
calculating the coordinates of the eyeball center under a camera coordinate system according to the direction of the eyeball center in the camera coordinate system, the distance between the camera and the eyeball center and the pupil-eyeball distance;
and determining the coordinates of the pupil center in the camera coordinate system according to the coordinates of the eyeball center in the camera coordinate system and the optical axis direction in the camera coordinate system.
6. The method for calibrating model parameters according to claim 1, wherein the unknown parameters of the eyeball model comprise: the position coordinates of the eyeball center, the distance between the pupil center and the eyeball center, the distance between the cornea curvature center and the eyeball center, the cornea curvature radius and the included angle between the visual axis and the optical axis.
7. A gaze tracking method, comprising:
correcting the parameters of the eyeball model according to the value of the unknown parameters of the eyeball model obtained by the model parameter calibration method according to any one of claims 1 to 6;
acquiring a to-be-detected human eye image of the user, and carrying out contour detection on pupil projection in the to-be-detected human eye image to obtain a contour point set;
and performing corneal refraction correction on the contour point set, and determining the intersection point of the sight of the user and the screen according to the corrected contour point set and the eyeball model after parameter correction.
8. A model parameter calibration device, comprising:
a first acquisition module, configured to acquire images captured while a user gazes at each preset fixation point on a screen, so as to obtain a human eye image corresponding to each preset fixation point, wherein the number of preset fixation points is set according to the number of unknown parameters in an eyeball model;
a first pupil detection module, configured to perform contour point detection on the pupil projection in the human eye image corresponding to the current preset fixation point to obtain a pupil contour point set;
a first correction module, configured to perform corneal refraction correction on the pupil contour point set, and determine the intersection point of the user's line of sight with the screen according to the corrected pupil contour point set and the eyeball model;
an equation establishing module, configured to calculate the distance between the position of the intersection point and the real position of the current preset fixation point, and establish a nonlinear equation corresponding to the current preset fixation point, with each unknown parameter in the eyeball model as a variable and the distance as the objective function;
an equation solving module, configured to obtain the nonlinear equation corresponding to each preset fixation point, and solve the system of nonlinear equations formed by them, so as to obtain the value of each unknown parameter in the eyeball model.
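The equation-solving module faces a small nonlinear least-squares problem: one residual (intersection-to-fixation-point distance) per preset fixation point, minimized over the model parameters. The sketch below uses a bare-bones Gauss-Newton iteration with a numeric Jacobian; a production calibrator would more likely use a robust least-squares library, and `residual_fn` here stands in for the patent's per-point distance computation:

```python
import numpy as np

def solve_calibration(residual_fn, x0, iters=50, eps=1e-6):
    """Minimal Gauss-Newton solver for a system of nonlinear residuals.

    residual_fn maps the parameter vector to the vector of residuals
    (one per preset fixation point); x0 is the initial parameter guess.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        r = residual_fn(x)
        # Numeric Jacobian of the residuals w.r.t. the model parameters.
        J = np.empty((r.size, x.size))
        for j in range(x.size):
            dx = np.zeros_like(x)
            dx[j] = eps
            J[:, j] = (residual_fn(x + dx) - r) / eps
        # Gauss-Newton step: least-squares solution of J @ step = -r.
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)
        x = x + step
        if np.linalg.norm(step) < 1e-10:
            break
    return x
```

On a toy linear system (fitting y = 2t + 1) the solver recovers the parameters in one step; the real calibration residuals are nonlinear in the eyeball-model parameters, so several iterations would typically be needed.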
9. A gaze tracking device, comprising:
a correction module, configured to correct parameters of an eyeball model according to the values of the unknown parameters of the eyeball model obtained by the model parameter calibration method according to any one of claims 1 to 6;
a second acquisition module, configured to acquire a human eye image of the user to be detected;
a second pupil detection module, configured to perform contour detection on the pupil projection in the human eye image to be detected to obtain a contour point set;
a second correction module, configured to perform corneal refraction correction on the contour point set;
a determining module, configured to determine the intersection point of the user's line of sight with the screen according to the corrected contour point set and the parameter-corrected eyeball model.
10. A readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the method according to any one of claims 1 to 7.
11. An electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the method according to any one of claims 1 to 7 when executing the computer program.
CN202210267130.6A 2022-03-18 2022-03-18 Model parameter calibration method, sight tracking method, device, medium and equipment Active CN114360043B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210267130.6A CN114360043B (en) 2022-03-18 2022-03-18 Model parameter calibration method, sight tracking method, device, medium and equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210267130.6A CN114360043B (en) 2022-03-18 2022-03-18 Model parameter calibration method, sight tracking method, device, medium and equipment

Publications (2)

Publication Number Publication Date
CN114360043A true CN114360043A (en) 2022-04-15
CN114360043B CN114360043B (en) 2022-06-17

Family

ID=81094291

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210267130.6A Active CN114360043B (en) 2022-03-18 2022-03-18 Model parameter calibration method, sight tracking method, device, medium and equipment

Country Status (1)

Country Link
CN (1) CN114360043B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115546876A (en) * 2022-11-07 2022-12-30 广州图语信息科技有限公司 Pupil tracking method and device
CN116052264A (en) * 2023-03-31 2023-05-02 广州视景医疗软件有限公司 Sight estimation method and device based on nonlinear deviation calibration

Citations (7)

Publication number Priority date Publication date Assignee Title
JP2003079577A (en) * 2001-09-12 2003-03-18 Nippon Telegr & Teleph Corp <Ntt> Visual axis measuring apparatus and method, visual axis measuring program, and recording medium recording the same
CN101901485A (en) * 2010-08-11 2010-12-01 华中科技大学 3D free head moving type gaze tracking system
CN102520796A (en) * 2011-12-08 2012-06-27 华南理工大学 Sight tracking method based on stepwise regression analysis mapping model
CN108351514A (en) * 2015-11-02 2018-07-31 欧库勒斯虚拟现实有限责任公司 Use the eye tracks of structure light
CN108968907A (en) * 2018-07-05 2018-12-11 四川大学 The bearing calibration of eye movement data and device
CN110263745A (en) * 2019-06-26 2019-09-20 京东方科技集团股份有限公司 A kind of method and device of pupil of human positioning
CN113729611A (en) * 2017-09-08 2021-12-03 托比股份公司 Eye tracking using eyeball center position

Non-Patent Citations (2)

Title
ARANTXA VILLANUEVA et al.: "A Novel Gaze Estimation System With One Calibration Point", IEEE Transactions on Systems, 16 July 2008 (2008-07-16) *
LIU Dong: "Research on Gaze Estimation Methods Based on Stereo Vision", China Excellent Master's Degree Theses, vol. 2017, no. 5, 15 May 2017 (2017-05-15) *

Cited By (3)

Publication number Priority date Publication date Assignee Title
CN115546876A (en) * 2022-11-07 2022-12-30 广州图语信息科技有限公司 Pupil tracking method and device
CN115546876B (en) * 2022-11-07 2023-12-19 广州图语信息科技有限公司 Pupil tracking method and device
CN116052264A (en) * 2023-03-31 2023-05-02 广州视景医疗软件有限公司 Sight estimation method and device based on nonlinear deviation calibration

Also Published As

Publication number Publication date
CN114360043B (en) 2022-06-17

Similar Documents

Publication Publication Date Title
CN109690553A (en) The system and method for executing eye gaze tracking
CN114360043B (en) Model parameter calibration method, sight tracking method, device, medium and equipment
US9244529B2 (en) Point-of-gaze estimation robust to head rotations and/or device rotations
US9291834B2 (en) System for the measurement of the interpupillary distance using a device equipped with a display and a camera
JP2010259605A (en) Visual line measuring device and visual line measuring program
CN109271914A (en) Detect method, apparatus, storage medium and the terminal device of sight drop point
EP3339943A1 (en) Method and system for obtaining optometric parameters for fitting eyeglasses
JP6594129B2 (en) Information processing apparatus, information processing method, and program
CN112102389A (en) Method and system for determining spatial coordinates of a 3D reconstruction of at least a part of a physical object
US20150029322A1 (en) Method and computations for calculating an optical axis vector of an imaged eye
US10216010B2 (en) Determining user data based on image data of a selected eyeglass frame
CN113808160B (en) Sight direction tracking method and device
WO2019010959A1 (en) Method and device for determining sight line, and computer readable storage medium
US11181978B2 (en) System and method for gaze estimation
JP6840697B2 (en) Line-of-sight direction estimation device, line-of-sight direction estimation method, and line-of-sight direction estimation program
US20220003632A1 (en) Method and device for measuring the local refractive power and/or the refractive power distribution of a spectacle lens
JP2019215688A (en) Visual line measuring device, visual line measurement method and visual line measurement program for performing automatic calibration
KR20200006621A (en) Methods, apparatus, and computer programs for determining near vision points
CN116051631A (en) Light spot labeling method and system
US10866635B2 (en) Systems and methods for capturing training data for a gaze estimation model
JP2018101212A (en) On-vehicle device and method for calculating degree of face directed to front side
Kang et al. A robust extrinsic calibration method for non-contact gaze tracking in the 3-D space
WO2022032911A1 (en) Gaze tracking method and apparatus
JP6906943B2 (en) On-board unit
US20230157539A1 (en) Computer-implemented method for determining a position of a center of rotation of an eye using a mobile device, mobile device and computer program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant