CN112101064A - Sight tracking method, device, equipment and storage medium - Google Patents
- Publication number
- CN112101064A CN112101064A CN201910521500.2A CN201910521500A CN112101064A CN 112101064 A CN112101064 A CN 112101064A CN 201910521500 A CN201910521500 A CN 201910521500A CN 112101064 A CN112101064 A CN 112101064A
- Authority
- CN
- China
- Prior art keywords
- user
- information
- thermal imaging
- infrared thermal
- determining
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
- G06V40/193—Preprocessing; Feature extraction
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Ophthalmology & Optometry (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Image Processing (AREA)
- Studio Devices (AREA)
Abstract
The embodiment of the invention discloses a sight tracking method, a sight tracking device, sight tracking equipment and a storage medium. The method comprises the following steps: acquiring an infrared thermal imaging eye pattern and user state information while the user is gazing; determining corresponding characteristic information based on the temperature value of each pixel point in the infrared thermal imaging eye pattern; acquiring pre-stored sight line calibration data corresponding to the user position information included in the user state information; and determining gazing information based on the characteristic information and the pre-stored sight line calibration data. This technical scheme solves the technical problem that sight tracking takes a long time because the user must first be recorded gazing at a plurality of calibration points to calibrate the sight line; it shortens the sight tracking time, improves the sight tracking efficiency and enhances the user experience.
Description
Technical Field
The embodiment of the invention relates to the field of image processing, in particular to a sight tracking method, a sight tracking device, sight tracking equipment and a storage medium.
Background
Gaze tracking techniques, also known as eye tracking techniques, estimate the gaze direction and/or fixation point of the eye, primarily by measuring eye movement data. With the development of human-computer interaction technology, controlling a display screen by means of gaze tracking has gradually matured.
In the prior art, the user is usually required to perform sight line calibration by gazing at a plurality of calibration points on a display screen, so as to obtain an eye image for each gazed point on the display screen. A gaze point mapping function, which establishes the mapping relationship between the user's eye image and the user's gaze point information, is then determined from the image features of each eye image and the position of the calibration point corresponding to each eye image. The terminal device subsequently computes the gaze point coordinates on the display screen from this mapping function and a newly acquired eye image of the gazing user, thereby realizing sight tracking.
However, the number of calibration points the user must gaze at is usually 9 or 16. Because of this large number of calibration points, the tracking time consumed each time the user performs gaze calibration is long, which degrades the user experience.
Disclosure of Invention
Embodiments of the present invention provide a method, an apparatus, a device, and a storage medium for tracking a line of sight, so as to reduce a line of sight tracking time, improve tracking efficiency, and enhance user experience.
In a first aspect, an embodiment of the present invention provides a gaze tracking method, where the gaze tracking method includes:
acquiring an infrared thermal imaging graph and user state information while the user is gazing; wherein the user state information comprises user position information;
determining corresponding characteristic information based on the temperature value of each pixel point in the infrared thermal imaging graph;
acquiring prestored sight line calibration data corresponding to the user position information;
and determining gazing information based on the characteristic information and the pre-stored sight line calibration data.
In a second aspect, an embodiment of the present invention further provides a gaze tracking apparatus, including:
the acquisition module is used for acquiring an infrared thermal imaging image and user state information while the user is gazing; wherein the user state information comprises user position information;
the characteristic information determining module is used for determining corresponding characteristic information based on the temperature value of each pixel point in the infrared thermal imaging graph;
the calibration data searching module is used for acquiring prestored sight line calibration data corresponding to the user position information;
and the gazing information determining module is used for determining gazing information based on the characteristic information and the pre-stored sight line calibration data.
In a third aspect, an embodiment of the present invention further provides a terminal device, which includes an input apparatus and further includes:
one or more processors;
storage means for storing one or more programs;
the one or more programs are executed by the one or more processors to cause the one or more processors to implement a gaze tracking method provided by any of the embodiments of the invention.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements a gaze tracking method provided in any embodiment of the present invention.
The embodiment of the invention acquires the infrared thermal imaging graph and the user state information while the user is gazing; determines the corresponding characteristic information based on the temperature value of each pixel point in the infrared thermal imaging graph; acquires the pre-stored sight line calibration data corresponding to the user position information included in the user state information; and determines the gazing information based on the characteristic information and the pre-stored sight line calibration data. This technical scheme solves the technical problem that sight tracking takes a long time because the user must first be recorded gazing at a plurality of calibration points to calibrate the sight line; it shortens the sight tracking time, improves the sight tracking efficiency and enhances the user experience.
Drawings
Fig. 1A is a schematic flowchart of a gaze tracking method according to a first embodiment of the present invention;
FIG. 1B is a schematic view of a horizontal angle in one embodiment of the present invention;
FIG. 1C is a schematic illustration of vertical angles in a first embodiment of the present invention;
fig. 2 is a schematic flowchart of a gaze tracking method according to a second embodiment of the present invention;
fig. 3 is a schematic structural diagram of a gaze tracking apparatus according to a third embodiment of the present invention;
fig. 4 is a schematic diagram of a hardware structure of a terminal device in the fourth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1A is a schematic flowchart of a gaze tracking method according to an embodiment of the present invention. The embodiment may be applicable to a case where gaze tracking is performed on a gazing user, and the method may be performed by a gaze tracking apparatus, which is implemented by software and/or hardware and configured in a terminal device. The sight line tracking method includes:
s110, acquiring an infrared thermal imaging graph and user state information when the user looks at the eye.
Wherein the infrared thermal imaging graph comprises an image of the heat, or temperature, radiated by the user's eye region. The eye region may include the pupil, the iris, the white of the eye, the upper eyelid and the lower eyelid.
In nature, all objects above absolute zero (-273 °C) radiate infrared rays. Therefore, when the user gazes at the display plane, an infrared thermal imaging graph can be obtained because the infrared radiation of the user differs from that of the background. Different colors in the infrared thermal imaging image reflect the different temperature distributions over the surface of the captured target; that is to say, each pixel point in the infrared thermal imaging image carries temperature information. Generally, the higher the target surface temperature, the brighter the color in the infrared thermography image.
For example, the acquisition area of the image acquisition device may be set to the face area, so that an infrared thermal imaging image of the face area is captured directly while the user is gazing. Of course, the acquisition region can also be restricted to the eye region only, which reduces the amount of data calculation in the subsequent operation of determining the characteristic information and, at the same time, improves the accuracy of the determined pupil pixel points.
When a user gazes at the display plane of the terminal device, the image acquisition device of the terminal device is triggered to capture, periodically or in real time, the infrared light radiated by the eye region and to form the corresponding infrared thermal imaging graph. Alternatively, the terminal device wirelessly acquires an infrared thermal imaging image captured by the image acquisition device; or remotely calls the infrared thermal imaging images in other terminals; or acquires, from a cloud platform storage system, an infrared thermal imaging image stored in advance or uploaded by the image acquisition device.
Wherein the user state information comprises user position information, and the user position information comprises a horizontal angle and a vertical angle. Illustratively, the horizontal angle may be the angle by which the straight line formed by the center of the user's eyes and the center of the gazed plane deviates from the vertical median plane of the gazed plane; the vertical angle may be the angle between the gazed plane and the horizontal plane.
For example, the center of the user's eyes may be determined as the midpoint between the critical position at which the user can just be detected and the critical position at which the user can no longer be detected, as an infrared sensor arranged on the terminal device scans from left to right or from right to left. Alternatively, the center of the user's eyes can be determined from the critical value of the shooting angle of a camera arranged on the terminal device, or from the position of the user's image within the area captured by a camera shooting at a fixed angle. The terminal device then determines the horizontal angle as the angle by which the straight line formed by the center of the user's eyes and the center of the gazed plane deviates from the vertical median plane of the gazed plane.
As shown in the horizontal angle diagram of FIG. 1B, the angle Φ by which the straight line formed by the eye center of user 12 and the center O of the gazed plane 10 deviates from the vertical median plane 11 is a horizontal angle; likewise, the angle ψ formed for user 13 is a horizontal angle. Whether the user stands to the left or to the right in front of the gazed plane (i.e., the aforementioned display plane) may be represented by the sign of the horizontal angle; for example, angles on the left may be defined as positive and angles on the right as negative, so that in FIG. 1B the horizontal angle Φ > 0 and the horizontal angle ψ < 0. It should be noted that the horizontal angle lies in the range [-90°, 90°].
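Under this geometry, the horizontal angle follows from simple trigonometry. A minimal sketch (the function name and the centimeter units are illustrative, not from the patent), assuming the lateral offset of the eye center from the screen's vertical median plane and the perpendicular viewing distance are already known:

```python
import math

def horizontal_angle(eye_offset_cm: float, eye_distance_cm: float) -> float:
    """Angle (degrees) by which the line from the user's eye center to the
    screen center deviates from the screen's vertical median plane.

    eye_offset_cm:   lateral offset of the eye center from the median plane
                     (positive = user is to the left of the screen center).
    eye_distance_cm: perpendicular distance from the eye center to the screen.
    Returns a value in [-90, 90], positive on the left and negative on the
    right, matching the sign convention described above.
    """
    return math.degrees(math.atan2(eye_offset_cm, eye_distance_cm))
```

For instance, a user 10 cm to the left at a 10 cm viewing distance would be at a horizontal angle of 45°.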
For example, the vertical angle may be a deflection angle of the display plane of the terminal device sensed by a gyroscope provided in the terminal device.
As shown in FIG. 1C, the vertical angle is the included angle between the horizontal plane and the gazed (display) plane. Illustratively, when the gazed plane 21 is vertical, the included angle α between the horizontal plane 20 and the gazed plane 21 is the vertical angle, with α = 90°. When the user rotates the gazed plane 21 counterclockwise to the position of the gazed plane 22, the included angle β between the horizontal plane 20 and the gazed plane 22 is the vertical angle, with 0° < β < 90°; when the user rotates the gazed plane 21 clockwise to the position of the gazed plane 23, the included angle γ between the horizontal plane 20 and the gazed plane 23 is the vertical angle, with 90° < γ < 180°. It should be noted that the vertical angle lies in the range [0°, 180°].
S120, determining corresponding characteristic information based on the temperature value of each pixel point in the infrared thermal imaging graph.
The characteristic information is eye characteristic information and comprises one or more of the user's pupil information, cornea information, light spot information, iris information, eyelid information, eye image information and gazing depth information. The pupil information at least comprises pupil pixel points and pupil position information, and the pupil position information at least comprises the pupil center position and the pupil diameter. The cornea information at least comprises corneal speckle reflection information, and the iris information at least comprises iris edge information. The pupil pixel points are the pixel points in the infrared thermal imaging image that the terminal device identifies as belonging to the pupil.
The terminal device determines the pixel point type, pupil or non-pupil, based on the temperature value of each pixel point in the infrared thermal imaging graph. It determines the pixel points whose type is pupil as pupil pixel points, and correspondingly takes the coordinate values of the pupil pixel points as their position information.
The method for determining the pixel type by the terminal device based on the temperature value of each pixel in the infrared thermal imaging graph may be, but is not limited to, the following two methods:
the first method is as follows: and determining pixel points meeting the preset temperature threshold as pupil pixel points. Exemplarily, the temperature interval of the pupil can be determined as a preset temperature threshold value through a large number of known temperature values of all pixel points of the infrared thermal imaging graph and corresponding pixel point types; and can be set by the technician according to experience or requirements.
The second method comprises the following steps: classifying the pixel point types of all the pixel points in the infrared thermal imaging image through a pupil determination model, and acquiring the pixel point with the pixel point type of a pupil as a pupil pixel point. Illustratively, the pupil determination model may be a linear determination model constructed according to temperature values of pixel points of a large number of known infrared thermography images and corresponding pixel point types. The linear discriminant model can be one or more of linear discriminant models such as linear discriminant analysis, partial least squares, principal component analysis, support vectors and the like.
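The first, threshold-based method can be sketched as follows. The concrete temperature band below is a placeholder assumption; as the text notes, in practice it would be fitted from labeled thermograms or set by the technician:

```python
import numpy as np

# Hypothetical temperature band for pupil pixels (degrees Celsius); the
# patent leaves the actual interval to be fitted from known thermograms
# or chosen by the technician.
PUPIL_TEMP_RANGE = (34.0, 36.0)

def pupil_mask(thermal_eye_image: np.ndarray,
               temp_range: tuple = PUPIL_TEMP_RANGE) -> np.ndarray:
    """Method one: mark every pixel whose temperature value falls inside
    the preset band as a pupil pixel. Returns a boolean mask with the
    same shape as the input temperature image."""
    lo, hi = temp_range
    return (thermal_eye_image >= lo) & (thermal_eye_image <= hi)
```

The coordinates of the `True` entries in the mask are then the pupil pixel points used in the later steps.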
It can be understood that, when the image acquisition device captures an infrared thermal imaging image of the user's entire facial region, the infrared thermal imaging eye diagram can first be determined from the acquired image, and the corresponding pupil pixel points then determined directly from the temperature values of the pixel points in the eye diagram; this reduces the amount of data calculation when determining the pupil pixel points and improves the accuracy of the determined pupil pixel points.
Optionally, the infrared thermal imaging eye pattern is determined from the acquired infrared thermal imaging image by delineating a region of set shape around the eye; illustratively, a rectangle of fixed size may be used.
Optionally, the infrared thermal imaging eye pattern may also be determined by performing eye recognition directly on the acquired infrared thermal imaging image containing the eyes.
Optionally, the infrared thermal imaging eye diagram may also be determined by performing human-eye localization on an ordinary (visible-light) facial image, taking the coordinates of the pixel points of the eye region as the target region, and extracting that target region from the infrared thermal imaging image to obtain the infrared thermal imaging eye pattern. The ordinary facial image is captured by the image acquisition device synchronously with the infrared thermal imaging image.
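The last option above amounts to a plain crop, assuming the visible-light facial image and the thermal image are pixel-registered so that eye coordinates found in one apply to the other (the function name and bounding-box convention are illustrative; any off-the-shelf eye detector could supply the box):

```python
import numpy as np

def crop_thermal_eye(thermal_image: np.ndarray, eye_box: tuple) -> np.ndarray:
    """Cut the eye region out of the thermal image using a bounding box
    (x, y, w, h) found by running an eye detector on the visible-light
    facial image captured at the same instant. Assumes the two images
    are registered, i.e. pixel (x, y) refers to the same facial point
    in both."""
    x, y, w, h = eye_box
    return thermal_image[y:y + h, x:x + w]
```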
And S130, acquiring prestored sight line calibration data corresponding to the user position information.
The pre-stored sight line calibration data comprises pre-stored historical tracking data used for calibrating the user's sight line, where the historical tracking data includes known historical feature information and the corresponding historical gaze point coordinates. Illustratively, a gaze point mapping function, which establishes the mapping relationship between the eye features in the user's infrared thermal imaging image and the gaze point coordinates on the gazed plane, may be determined from the historical feature information and the corresponding historical gaze point coordinates. The historical characteristic information includes information on historical pupil pixel points, for example the historical pupil pixel points and/or the historical pupil center pixel points.
The terminal equipment can directly search and obtain prestored sight line calibration data corresponding to the user position information locally; the pre-stored sight line calibration data corresponding to the user position information can be acquired from other terminal equipment in a wireless connection or remote calling mode; and pre-stored sight line calibration data corresponding to the user position information can be searched and acquired from the cloud platform storage system.
It should be noted that S130 may be executed before S120, after S120, or simultaneously with S120, and this embodiment does not limit any specific operation sequence of S120 and S130.
And S140, determining gazing information based on the characteristic information and the pre-stored sight line calibration data.
Wherein, the gazing information is the gazing point coordinate when the user gazes at the gazed area. Exemplarily, the gazed area may be the aforementioned display plane.
A mapping coefficient between the infrared thermal imaging image and the plane of the gazed region is determined from the pre-stored sight line calibration data, and the corresponding gaze point coordinates are then determined from the characteristic information and the determined mapping coefficient. The pupil center pixel point can be calculated from the position coordinates of the pupil pixel points. Exemplarily, the mapping coefficient between the infrared thermal imaging graph and the plane of the gazed region can be determined from the position information of the pupil pixel points and/or the pupil center pixel point together with the pre-stored calibration data.
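The patent does not fix the functional form of the gaze point mapping. A common choice in regression-based gaze estimation, shown here purely as an assumption, is a second-order polynomial in the pupil-center coordinates, fitted by least squares to the stored calibration pairs (historical pupil centers and their gaze points):

```python
import numpy as np

def fit_gaze_mapping(pupil_xy: np.ndarray, gaze_xy: np.ndarray) -> np.ndarray:
    """Fit mapping coefficients from stored calibration pairs.

    pupil_xy: (N, 2) historical pupil-center coordinates in the thermal image.
    gaze_xy:  (N, 2) corresponding historical gaze-point coordinates on screen.
    Uses a 2nd-order polynomial basis; at least 6 non-degenerate pairs
    are needed to determine the 6 coefficients per axis.
    """
    x, y = pupil_xy[:, 0], pupil_xy[:, 1]
    basis = np.stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2], axis=1)
    coeffs, *_ = np.linalg.lstsq(basis, gaze_xy, rcond=None)
    return coeffs  # shape (6, 2): one column per screen axis

def map_gaze(coeffs: np.ndarray, pupil_center) -> tuple:
    """Apply fitted coefficients to a newly measured pupil center."""
    x, y = pupil_center
    basis = np.array([1.0, x, y, x * y, x ** 2, y ** 2])
    gx, gy = basis @ coeffs
    return float(gx), float(gy)
```

Because the calibration pairs are retrieved per user position, a separate coefficient set can be stored for each (horizontal angle, vertical angle) bucket.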
Wherein, the gaze point can be understood as the intersection of the user's sight line vector with the gazed object, where the gazed object includes an actual object, a virtual object, a display screen, and the like; the gazed area is a circular (or otherwise shaped) area centered on the gaze point, corresponding to the conical region swept out by rotating the sight line vector about the user's pupil by a certain angle.
The embodiment of the invention acquires the infrared thermal imaging graph and the user state information while the user is gazing; determines the corresponding characteristic information based on the temperature value of each pixel point in the infrared thermal imaging graph; retrieves the pre-stored sight line calibration data corresponding to the user position information included in the user state information; and determines the gazing information based on the characteristic information and the pre-stored sight line calibration data. This technical scheme solves the technical problem that sight tracking takes a long time because the user must first be recorded gazing at a plurality of calibration points to calibrate the sight line; it shortens the sight tracking time, improves the sight tracking efficiency and enhances the user experience.
In addition to the technical solutions of the foregoing embodiments, in order to give the user control over the sight tracking scheme, a sight tracking instruction may additionally be received before the infrared thermal imaging image is acquired, so that the image is acquired only after the instruction has been received.
When the user puts the terminal device into the sight tracking mode by voice, gesture, touch or a remote controller, the terminal device receives the corresponding sight tracking instruction.
Example two
Fig. 2 is a schematic flow chart of a gaze tracking method according to a second embodiment of the present invention, which further refines and extends the technical solutions of the above embodiments.
Further, after "determining the gazing information based on the characteristic information and the pre-stored sight line calibration data", the step "updating the pre-stored sight line calibration data according to the characteristic information and the user state information" is added, so that the pre-stored sight line calibration data is updated over time.
Further, after "acquiring the infrared thermal imaging graph while the user is gazing", the step "determining the infrared thermal imaging eye diagram based on the infrared thermal imaging graph" is added, to restrict the region used for subsequent processing and further refine the manner of acquiring the infrared thermal imaging eye diagram.
The gaze tracking method shown in fig. 2 specifically includes the following steps:
s211, acquiring an infrared thermal imaging graph and user state information when the user looks at the eye.
The terminal equipment can acquire an infrared thermal imaging image of a face area when a user watches a display plane of the terminal equipment through the infrared image acquisition device to serve as an infrared thermal imaging image. Meanwhile, the terminal equipment can determine the user state information through an infrared sensor or a camera. Wherein the user state information comprises user location information.
S212, determining an infrared thermal imaging eye pattern when the user looks at the eye based on the infrared thermal imaging image.
Exemplarily, each pixel point located in a first preset area in the infrared thermal imaging graph can be obtained to form an infrared thermal imaging left eye graph; and/or obtaining each pixel point in the infrared thermal imaging image in a second preset area to form an infrared thermal imaging right eye image.
The rough position of the eyes is determined in the acquired infrared thermal imaging map on the basis of anthropometric relations. For the infrared thermal imaging image of the left eye, the first pixel point at the upper-left corner of the image is taken as the origin, the pixel point at (20% × 30%) of the face area is taken as the starting point, a rectangular frame of size 25% × 20% is extended rightward along the direction parallel to the image edge, and the area inside the rectangular frame is determined to be the infrared thermal imaging left-eye image. For the right eye, with the same origin, the pixel point at (60% × 30%) of the face area is taken as the starting point, a rectangular frame of the same 25% × 20% size is extended rightward in the same manner, and the area inside it is determined to be the infrared thermal imaging right-eye image.
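The fixed percentage boxes described above can be written out directly, assuming (as the text suggests) that 25% refers to the image width and 20% to the image height; the function name and return convention are illustrative:

```python
def eye_boxes(width: int, height: int) -> dict:
    """Fixed anthropometric crop boxes for a face thermogram of the given
    size: starting points at (20%, 30%) and (60%, 30%) of the image, each
    extended into a rectangle 25% of the width by 20% of the height.
    Returns (x, y, w, h) boxes for the left and right eye."""
    def box(start_x_frac: float) -> tuple:
        return (round(width * start_x_frac), round(height * 0.30),
                round(width * 0.25), round(height * 0.20))
    return {"left_eye": box(0.20), "right_eye": box(0.60)}
```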
S220, determining corresponding characteristic information based on the temperature value of each pixel point in the infrared thermal imaging graph.
The characteristic information comprises pupil center pixel points of the user.
In an optional implementation manner of the embodiment of the present invention, each pixel point in the infrared thermal imaging graph may be traversed, and a temperature value of each pixel point may be obtained; based on each temperature value and in combination with a pupil judgment model, determining pixel points of which the pixel point types belong to pupils as target pixel points; and determining the central coordinates of the target pixel points as pupil central pixel points based on the coordinate values of the target pixel points.
Optionally, after the infrared thermal imaging image is obtained, the temperature value of each pixel point may be read by traversing the image in a set direction. For example, the upper-left vertex of the image may be taken as the starting point and the lower-right vertex as the ending point: the starting column (the column containing the starting point) is traversed from top to bottom while the temperature value of each pixel point is read, then the remaining columns are traversed rightward in the same manner, until the ending point of the ending column (the column containing the ending point) is reached and its temperature value read, at which point the traversal ends. Of course, the pixel points in the infrared thermal imaging graph can also be traversed according to other rules, which are not described herein again.
Exemplarily, the pixel point category of each traversed pixel point is determined according to a linear discriminant model constructed in advance from the temperature values and known pixel point types of a large number of infrared thermal imaging images; the pixel points classified as pupil are determined as target pixel points, and their pixel coordinates and temperature values are recorded. The linear discriminant model can be one or more of linear discriminant analysis, partial least squares, principal component analysis and support vector models. It should be noted that, when the number of target pixel points is 1, that target pixel point can directly serve as the pupil pixel point; when the number is greater than 1, one target pixel point must be selected, or the target pixel points combined according to a set rule, to determine the pupil pixel point.
Illustratively, the center pixel point of the pattern formed by the target pixel points may be determined as the pupil pixel point from the pixel coordinates of each target pixel point.
Exemplarily, the weighted average of the coordinate values of the target pixel points can be determined by taking the temperature value of each target pixel point as the weight, so as to obtain the position information of the pupil center pixel point.
Specifically, X = Σ_i(I_i·x_i) / Σ_i I_i and Y = Σ_i(I_i·y_i) / Σ_i I_i; wherein (X, Y) is the coordinate of the pupil center pixel point; (x_i, y_i) is the coordinate of the i-th target pixel point; I_i is the temperature value of the i-th target pixel point; and i indexes the target pixel points, wherein i ≥ 1.
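A small numeric sketch of this temperature-weighted average (coordinates and temperature values are invented for illustration):

```python
import numpy as np

# Candidate pupil pixels: coordinates (x_i, y_i) and temperature weights I_i.
coords = np.array([[10.0, 20.0], [12.0, 20.0], [11.0, 22.0]])
temps = np.array([34.0, 35.0, 36.0])

# X = sum(I_i * x_i) / sum(I_i),  Y = sum(I_i * y_i) / sum(I_i)
X, Y = (temps[:, None] * coords).sum(axis=0) / temps.sum()
```

Warmer pixels pull the estimated pupil center toward themselves, which is the point of weighting by temperature rather than taking a plain centroid.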
In another optional implementation manner of the embodiment of the present invention, a temperature value of each pixel point in the infrared thermal imaging graph may also be obtained, and a pixel point whose temperature value meets a first threshold is determined as a first pixel point; and determining the weighted average value of the position information of each first pixel point by taking the temperature value of each first pixel point as the weight to obtain the pupil center pixel point.
Optionally, acquiring the temperature value of each pixel point in the infrared thermal imaging graph and determining the pixel points whose temperature values meet the first threshold as first pixel points may be implemented as: acquiring the temperature value of each pixel point in the infrared thermal imaging left eye diagram, and determining the pixel points whose category belongs to the pupil as left target pixel points in combination with a left-eye pupil discrimination model; and/or acquiring the temperature value of each pixel point in the infrared thermal imaging right eye diagram, and determining the pixel points whose category belongs to the pupil as right target pixel points in combination with a right-eye pupil discrimination model.
Or optionally, obtaining the temperature value of each pixel point in the infrared thermal imaging graph, and determining the pixel point with the temperature value meeting the first threshold as the first pixel point, and may further be: acquiring temperature values of all pixel points in the infrared thermal imaging left eye diagram, and determining a left first pixel point with a temperature value meeting a first threshold value as a left target pixel point; and/or obtaining the temperature value of each pixel point in the infrared thermal imaging right eye diagram, and determining a right first pixel point with the temperature value meeting a first threshold value as a right target pixel point.
It can be understood that, in order to improve the accuracy of the determined pupil pixel, it may be further determined that a left second pixel, whose temperature value of each pixel in the infrared thermal imaging left eye diagram satisfies the second threshold, is a left target pixel, and/or that a right second pixel, whose temperature value of each pixel in the infrared thermal imaging right eye diagram satisfies the second threshold, is a right target pixel.
The first threshold value may be determined by a large number of temperature values of each pixel point in the known infrared thermography image and the pupil or non-pupil type to which each pixel point belongs, or may be set by a technician according to experience or demand.
The second threshold value may be determined by a large number of temperature values of each pixel point in the known infrared thermal imaging image and the type of the white of the eye or the non-white of the eye to which each pixel point belongs, or may be set by a technician according to experience or demand.
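As a hedged sketch of the threshold-based selection: the threshold values and the small eye-region array below are purely illustrative, whereas in practice the first and second thresholds would be learned from labeled thermal images or set empirically as described above:

```python
import numpy as np

PUPIL_THRESHOLD = 34.0   # "first threshold" (pupil region); illustrative value
SCLERA_THRESHOLD = 33.0  # "second threshold" (white of the eye); illustrative

# Tiny stand-in for an infrared thermal eye image (temperatures in °C).
thermal_eye = np.array([[31.0, 32.5, 31.8],
                        [33.2, 34.6, 33.4],
                        [31.5, 32.9, 31.2]])

# First pixel points: temperature meets the first (pupil) threshold.
pupil_coords = np.argwhere(thermal_eye >= PUPIL_THRESHOLD)

# Second pixel points: temperature meets the second (white-of-eye) threshold
# but not the first; shown only to illustrate the second selection.
sclera_coords = np.argwhere((thermal_eye >= SCLERA_THRESHOLD)
                            & (thermal_eye < PUPIL_THRESHOLD))
```

The resulting `pupil_coords` are the candidates fed into the weighted-average step that follows.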
Exemplarily, the temperature value of each first pixel is used as a weight, and a weighted average of the position information of each first pixel is determined to obtain the pupil center pixel, which may be: determining a weighted average value of coordinate values of each left target pixel point by taking the temperature value of each left target pixel point as a weight to obtain position information of a central pixel point of a left pupil; and/or determining the weighted average value of the coordinate values of the right target pixel points by taking the temperature value of each right target pixel point as the weight to obtain the position information of the right pupil center pixel point.
Specifically, the position information of the left pupil center pixel point is determined according to the formula X_l = Σ_li(I_li·x_li) / Σ_li I_li, Y_l = Σ_li(I_li·y_li) / Σ_li I_li;
wherein (X_l, Y_l) is the coordinate of the left pupil center pixel point; (x_li, y_li) is the coordinate of a left target pixel point; I_li is the temperature value of that left target pixel point; and li indexes the left target pixel points, wherein li ≥ 1.
And/or the position information of the right pupil center pixel point is determined according to the formula X_r = Σ_ri(I_ri·x_ri) / Σ_ri I_ri, Y_r = Σ_ri(I_ri·y_ri) / Σ_ri I_ri;
wherein (X_r, Y_r) is the coordinate of the right pupil center pixel point; (x_ri, y_ri) is the coordinate of a right target pixel point; I_ri is the temperature value of that right target pixel point; and ri indexes the right target pixel points, wherein ri ≥ 1.
And S230, acquiring pre-stored sight line calibration data corresponding to the user position information.
And S240, determining gazing information based on the characteristic information and the pre-stored sight line calibration data.
According to the position information of known pupil center pixel points in a large number of infrared thermal imaging graphs and the known fixation point coordinates on the corresponding gazed plane, the mapping coefficient between the infrared thermal imaging graph and the gazed plane is determined and stored in advance in the terminal device or a cloud platform storage system. The gazed plane is understood to be the plane in which the gazed area is located.
In an optional implementation manner of the embodiment of the present invention, the fixation point coordinate corresponding to the pupil center pixel point may be determined according to the determined position information of the pupil center pixel point and a predetermined mapping coefficient between the infrared thermal imaging graph and the gazed plane. It should be noted that the mapping coefficient is related to the gazing user, the size of the gazed plane, and the distance between the gazing user and the gazed plane.
In another optional implementation manner of the embodiment of the present invention, when the known pupil center pixel point is the left pupil center pixel point, the corresponding fixation point coordinate is the left-eye fixation point coordinate, and the pre-stored sight line calibration data corresponding to the left eye is acquired to determine a left-eye mapping coefficient; when the known pupil center pixel point is the right pupil center pixel point, the corresponding fixation point coordinate is the right-eye fixation point coordinate, and the pre-stored sight line calibration data corresponding to the right eye is acquired to determine a right-eye mapping coefficient. The mapping coefficients are related to the gazing user, the size of the gazed plane, the left and right eyes, and the distance between the gazing user and the gazed plane.
A left-eye fixation point coordinate on the gazed plane is determined according to the position information of the left pupil center pixel point and the predetermined left-eye mapping coefficient; and/or a right-eye fixation point coordinate on the gazed plane is determined according to the position information of the right pupil center pixel point and the predetermined right-eye mapping coefficient.
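The patent does not specify the functional form of the mapping coefficients. One common, hedged choice is an affine map from pupil-center pixel coordinates to gaze-point coordinates on the gazed plane, fitted by least squares from calibration pairs; all numbers below are invented for illustration:

```python
import numpy as np

# Calibration pairs (illustrative): pupil-center pixel coordinates in the
# thermal image and the corresponding known gaze points on the gazed plane.
pupil_pts = np.array([[10.0, 10.0], [50.0, 10.0], [10.0, 40.0], [50.0, 40.0]])
gaze_pts = np.array([[0.0, 0.0], [1920.0, 0.0], [0.0, 1080.0], [1920.0, 1080.0]])

# Fit an affine mapping  g = [px, py, 1] @ A  by least squares; the entries
# of A play the role of the "mapping coefficients" between the thermal
# image and the gazed plane.
P = np.hstack([pupil_pts, np.ones((len(pupil_pts), 1))])
A, *_ = np.linalg.lstsq(P, gaze_pts, rcond=None)

def to_gaze_point(pupil_xy):
    """Map a pupil-center pixel coordinate to a gaze-point coordinate."""
    return np.array([pupil_xy[0], pupil_xy[1], 1.0]) @ A

# A pupil center midway through the calibrated range maps to mid-screen.
center = to_gaze_point([30.0, 25.0])
```

Separate coefficient matrices would be fitted per eye (and per user, plane size, and viewing distance, as the text notes) to obtain the left-eye and right-eye mapping coefficients.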
It should be noted that, when the left-eye fixation point coordinate and the right-eye fixation point coordinate are both determined, the average of the two coordinates may be taken as the finally determined fixation point coordinate, that is, the finally determined gazing information.
And S250, updating the pre-stored sight line calibration data according to the characteristic information and the user state information.
Specifically, the terminal device acquires the historical tracking data from the previous time a gaze tracking instruction was received, where the historical tracking data includes historical feature information and historical user state information. If the feature information and the user state information meet the change threshold relative to the historical tracking data, the pre-stored sight line calibration data is updated according to the historical pre-stored sight line calibration data corresponding to the feature information and the user state information. If the feature information and the user state information do not meet the change threshold, the historical pre-stored sight line calibration data corresponding to the feature information and the user state information is searched for among the historical pre-stored sight line calibration data corresponding to all historical tracking data, and the pre-stored sight line calibration data is updated accordingly. Illustratively, the feature information may be the position information of the pupil center pixel point.
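The update logic of step S250 might be sketched as follows; the data shapes (tuples and dicts), the Euclidean change measure, and the threshold value are all assumptions, since the patent leaves them unspecified:

```python
import math

CHANGE_THRESHOLD = 5.0  # pixels; an illustrative value for the change threshold

def _close(a, b, tol=CHANGE_THRESHOLD):
    """Whether two pupil-center positions differ by less than the threshold."""
    return math.hypot(a[0] - b[0], a[1] - b[1]) < tol

def select_calibration(feature, state, last_feature, last_state, store):
    """Choose the pre-stored calibration data to refresh with.

    feature: current pupil-center (x, y); state: hashable user-state info.
    last_feature/last_state: historical tracking data from the previous
    gaze-tracking instruction. store maps (feature, state) -> calibration data.
    """
    changed = not _close(feature, last_feature) or state != last_state
    if changed:
        # State changed enough: take the historical record stored for the
        # new feature/state combination, if one exists.
        return store.get((feature, state))
    # Otherwise search all historical records for one whose feature is close
    # to the current one under the same state, and reuse it.
    for (f, s), data in store.items():
        if s == state and _close(f, feature):
            return data
    return None

store = {((10.0, 20.0), 'seated'): 'cal-1', ((40.0, 60.0), 'standing'): 'cal-2'}
# Small movement, same state: the matching historical record is reused.
result = select_calibration((11.0, 21.0), 'seated', (10.0, 20.0), 'seated', store)
```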
The embodiment of the present invention perfects the gaze tracking scheme by adding a step of updating the pre-stored sight line calibration data; and, by adding a step of receiving a gaze tracking instruction, it limits when the infrared thermal imaging graph and the user state information are acquired and perfects how the infrared thermal imaging graph is acquired. This technical scheme solves the technical problem that gaze tracking takes a long time because the user must first be made to gaze at multiple calibration points to calibrate the line of sight; it shortens the gaze tracking time, improves gaze tracking efficiency, and enhances the user experience.
On the basis of the technical solutions of the above embodiments, further, the user state information further includes iris information. Iris information includes, but is not limited to, eye features such as spots, filaments, coronas, stripes, and crypts.
According to the embodiment of the present invention, by adding iris information to the user state information, gaze tracking is also achieved when different users use the same terminal device, and the tracking error that arises when different users are calibrated with the same pre-stored sight line calibration data is reduced. This technical scheme solves the technical problem that gaze tracking takes a long time because the user must first be made to gaze at multiple calibration points to calibrate the line of sight; it shortens the gaze tracking time, improves gaze tracking efficiency, and enhances the user experience.
Example three
Fig. 3 is a schematic structural diagram of a gaze tracking apparatus according to a third embodiment of the present invention. The present embodiment is applicable to a case where gaze tracking is performed on a gazing user, and the apparatus includes: an acquisition module 310, a characteristic information determination module 320, a calibration data lookup module 330, and a gaze information determination module 340.
The acquiring module 310 is configured to acquire an infrared thermal imaging image and user state information when a user gazes; wherein the user state information comprises user location information;
the characteristic information determining module 320 is configured to determine corresponding characteristic information based on a temperature value of each pixel point in the infrared thermal imaging graph;
a calibration data searching module 330, configured to obtain pre-stored sight calibration data corresponding to the user location information;
a gaze information determination module 340 configured to determine gaze information based on the characteristic information and the pre-stored gaze calibration data.
In the embodiment of the present invention, the acquiring module acquires the infrared thermal imaging graph and the user state information when the user gazes; the characteristic information determining module determines corresponding characteristic information based on the temperature value of each pixel point in the infrared thermal imaging graph; the calibration data searching module searches for and acquires the pre-stored sight line calibration data corresponding to the user position information included in the user state information; and the gazing information determining module determines gazing information based on the characteristic information and the pre-stored sight line calibration data. This technical scheme solves the technical problem that gaze tracking takes a long time because the user must first be made to gaze at multiple calibration points to calibrate the line of sight; it shortens the gaze tracking time, improves gaze tracking efficiency, and enhances the user experience.
On the basis of the technical solutions of the above embodiments, further, the apparatus further includes:
and the calibration data updating module is used for updating the pre-stored sight calibration data according to the characteristic information and the user state information.
On the basis of the technical solutions of the above embodiments, further, the apparatus further includes:
a thermal imaging eye pattern determining module, configured to determine the infrared thermal imaging eye pattern when the user gazes based on the infrared thermal imaging image.
Further, the thermal imaging eye diagram determining module specifically includes:
the thermal imaging left eye diagram determining unit is used for acquiring each pixel point located in a first preset area in the infrared thermal imaging diagram to form an infrared thermal imaging left eye diagram; and/or,
and the thermal imaging right eye diagram determining unit is used for acquiring each pixel point positioned in a second preset area in the infrared thermal imaging diagram to form an infrared thermal imaging right eye diagram.
On the basis of the technical solutions of the foregoing embodiments, further, the calibration data updating module includes:
the acquisition unit is used for acquiring historical tracking data when receiving the sight tracking instruction at the previous time; wherein the historical tracking data comprises historical feature information and historical user state information;
the updating unit is used for updating the pre-stored sight calibration data according to historical pre-stored sight calibration data corresponding to the characteristic information and the user state information when the characteristic information and the user state information meet the change threshold of the historical tracking data;
and the updating unit is further used for searching and acquiring historical pre-stored sight calibration data corresponding to the characteristic information and the user state information from all historical pre-stored sight calibration data corresponding to the historical tracking data and updating the pre-stored sight calibration data when the characteristic information and the user state information do not meet the change threshold.
On the basis of the technical solutions of the foregoing embodiments, further, the characteristic information determining module 320 includes:
the first acquisition unit is used for acquiring the temperature value of each pixel point in the infrared thermal imaging image and determining the pixel point with the temperature value meeting a first threshold value as a first pixel point;
and the first determining unit is used for determining the weighted average value of the position information of each first pixel point by taking the temperature value of each first pixel point as the weight to obtain the characteristic information.
On the basis of the technical solutions of the foregoing embodiments, further, the gaze information determining module 340 includes:
and the fixation point coordinate determination unit is used for determining the fixation point coordinate corresponding to the characteristic information and/or the pupil center pixel point according to the characteristic information, the pre-stored sight line calibration data, and a mapping coefficient between the infrared thermal imaging graph and the plane where the gazed region is located.
On the basis of the technical solutions of the foregoing embodiments, further, the user location information includes: horizontal and vertical angles;
wherein the horizontal angle is the angle by which the straight line formed by the center of the user's eyes and the center of the gazed plane deviates from the vertical mid-perpendicular plane of the gazed plane;
wherein the vertical angle is the angle by which the gazed plane deviates from the horizontal plane.
On the basis of the technical solutions of the above embodiments, further, the user state information further includes iris information.
The gaze tracking device can execute the gaze tracking method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of executing the gaze tracking method.
Example four
Fig. 4 is a schematic diagram of a hardware structure of a terminal device according to a fourth embodiment of the present invention. As shown in fig. 4, the terminal device includes an input device 410, a processor 420, and a storage device 430.
The input device 410 is used for acquiring an infrared thermal imaging graph and user state information when a user gazes;
one or more processors 420;
a storage device 430 for storing one or more programs.
In fig. 4, one processor 420 is taken as an example. The input device 410 in the terminal device may be connected to the processor 420 and the storage device 430 through a bus or by other means, and the processor 420 and the storage device 430 are likewise connected through a bus or by other means; connection through a bus is taken as the example in fig. 4.
In this embodiment, the processor 420 in the terminal device may determine corresponding characteristic information based on the temperature value of each pixel point in the infrared thermal imaging graph acquired by the input device 410; pre-stored sight line calibration data corresponding to the user position information can be searched and obtained; gaze information may also be determined based on the characteristic information and pre-stored gaze calibration data.
The storage device 430 in the terminal device, which is a computer-readable storage medium, may be used to store one or more programs, which may be software programs, computer-executable programs, and modules, such as program instructions/modules corresponding to the gaze tracking method in the embodiment of the present invention (for example, the obtaining module 310, the feature information determining module 320, the calibration data searching module 330, and the gaze information determining module 340 shown in fig. 3). The processor 420 executes various functional applications of the terminal device and data processing by executing software programs, instructions, and modules stored in the storage 430, that is, implements the gaze tracking method in the above-described method embodiments.
The storage device 430 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data (such as the infrared thermal imaging map, the user status information, the temperature value of each pixel point, the pre-stored sight line calibration data, the pupil pixel point, and the fixation point coordinate in the above embodiment). Further, the storage 430 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some examples, storage 430 may further include memory located remotely from processor 420, which may be connected to the terminal device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
Furthermore, an embodiment of the present invention provides a computer-readable storage medium on which a computer program is stored. When executed by a gaze tracking apparatus, the computer program implements the gaze tracking method provided by the present invention, the method including: acquiring an infrared thermal imaging graph and user state information when a user gazes, wherein the user state information comprises user location information; determining corresponding characteristic information based on the temperature value of each pixel point in the infrared thermal imaging graph; searching for and acquiring pre-stored sight line calibration data corresponding to the user position information; and determining gazing information based on the characteristic information and the pre-stored sight line calibration data.
From the above description of the embodiments, it will be apparent to those skilled in the art that the present invention may be implemented by software plus the necessary general-purpose hardware, and certainly may also be implemented by hardware alone, but in many cases the former is the better embodiment. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which may be stored in a computer-readable storage medium, such as a floppy disk, a read-only memory (ROM), a random access memory (RAM), a flash memory (FLASH), a hard disk, or an optical disk of a computer, and which includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) to execute the gaze tracking method according to the embodiments of the present invention.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.
Claims (11)
1. A gaze tracking method, comprising:
acquiring an infrared thermal imaging graph and user state information when a user gazes; wherein the user state information comprises user location information;
determining corresponding characteristic information based on the temperature value of each pixel point in the infrared thermal imaging graph;
acquiring prestored sight line calibration data corresponding to the user position information;
and determining gazing information based on the characteristic information and the pre-stored sight line calibration data.
2. The method of claim 1, further comprising, after said determining gaze information based on said feature information and said pre-stored gaze calibration data:
and updating the pre-stored sight line calibration data according to the characteristic information and the user state information.
3. The method of claim 2, further comprising, after said acquiring the infrared thermal imaging graph when the user gazes:
determining an infrared thermal imaging eye graph when the user gazes based on the infrared thermal imaging graph;
wherein the determining the infrared thermal imaging eye graph when the user gazes based on the infrared thermal imaging graph comprises:
acquiring all pixel points located in a first preset area in the infrared thermal imaging graph to form an infrared thermal imaging left eye graph; and/or
acquiring all pixel points located in a second preset area in the infrared thermal imaging graph to form an infrared thermal imaging right eye graph.
4. The method of claim 3, wherein said updating the pre-stored gaze calibration data based on the characteristic information and the user status information comprises:
acquiring historical tracking data when a sight tracking instruction is received last time; wherein the historical tracking data comprises historical feature information and historical user state information;
if the characteristic information and the user state information meet the change threshold of the historical tracking data, updating the pre-stored sight line calibration data according to historical pre-stored sight line calibration data corresponding to the characteristic information and the user state information;
if the characteristic information and the user state information do not meet the change threshold, searching and acquiring historical pre-stored sight line calibration data corresponding to the characteristic information and the user state information from historical pre-stored sight line calibration data corresponding to all historical tracking data, and updating the pre-stored sight line calibration data.
5. The method according to claim 1, wherein the determining corresponding feature information based on the temperature value of each pixel point in the infrared thermal imaging graph comprises:
acquiring temperature values of all pixel points in the infrared thermal imaging graph, and determining the pixel points with the temperature values meeting a first threshold value as first pixel points;
and determining the weighted average value of the position information of each first pixel point by taking the temperature value of each first pixel point as the weight, and obtaining a pupil center pixel point as the characteristic information.
6. The method of claim 1, wherein determining gaze information based on the feature information and the pre-stored gaze calibration data comprises:
determining a mapping coefficient between the infrared thermal imaging graph and a plane of a watched area according to the pre-stored sight line calibration data;
and determining the fixation point coordinate corresponding to the characteristic information according to the characteristic information and the mapping coefficient.
7. The method according to any of claims 1-6, wherein the user location information comprises: horizontal and vertical angles;
wherein the horizontal angle is an angle of a straight line formed by the center of the user's eyes and the center of the watched plane deviating from the vertical middle vertical plane of the watched plane;
wherein the vertical angle is an angle of the gazed plane deviating from a horizontal plane.
8. The method of any of claims 1-6, wherein the user state information further comprises iris information.
9. A gaze tracking device, comprising:
the acquisition module is used for acquiring an infrared thermal imaging image and user state information when a user gazes; wherein the user state information comprises user location information;
the characteristic information determining module is used for determining corresponding characteristic information based on the temperature value of each pixel point in the infrared thermal imaging graph;
the calibration data searching module is used for acquiring prestored sight line calibration data corresponding to the user position information;
and the gazing information determining module is used for determining gazing information based on the characteristic information and the pre-stored sight line calibration data.
10. A terminal device, comprising an input device, and further comprising:
one or more processors;
storage means for storing one or more programs;
the one or more programs are executed by the one or more processors to cause the one or more processors to implement a gaze tracking method of any of claims 1-8.
11. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out a gaze tracking method according to any one of claims 1-8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910521500.2A CN112101064B (en) | 2019-06-17 | 2019-06-17 | Sight tracking method, device, equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910521500.2A CN112101064B (en) | 2019-06-17 | 2019-06-17 | Sight tracking method, device, equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112101064A true CN112101064A (en) | 2020-12-18 |
CN112101064B CN112101064B (en) | 2024-07-05 |
Family
ID=73748544
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910521500.2A Active CN112101064B (en) | 2019-06-17 | 2019-06-17 | Sight tracking method, device, equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112101064B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113128417A (en) * | 2021-04-23 | 2021-07-16 | 南开大学 | Double-region eye movement tracking method based on head posture |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009064395A (en) * | 2007-09-10 | 2009-03-26 | Hiroshima Univ | Pointing device, program for making computer to correct error between operator's gaze position and cursor position, and computer-readable recording medium with the program recorded |
CN102662476A (en) * | 2012-04-20 | 2012-09-12 | 天津大学 | Gaze estimation method |
JP2014052758A (en) * | 2012-09-06 | 2014-03-20 | Hiroshima City Univ | Sight line measurement method |
CN105867603A (en) * | 2015-12-08 | 2016-08-17 | 乐视致新电子科技(天津)有限公司 | Eye-controlled method and device |
JP2017102731A (en) * | 2015-12-02 | 2017-06-08 | 国立大学法人静岡大学 | Gaze detection device and gaze detection method |
CN108259887A (en) * | 2018-04-13 | 2018-07-06 | 宁夏大学 | Watch point calibration method and device, blinkpunkt scaling method and device attentively |
JP2018205819A (en) * | 2017-05-30 | 2018-12-27 | 富士通株式会社 | Gazing position detection computer program, gazing position detection device, and gazing position detection method |
CN109522887A (en) * | 2019-01-24 | 2019-03-26 | 北京七鑫易维信息技术有限公司 | A kind of Eye-controlling focus method, apparatus, equipment and storage medium |
Non-Patent Citations (4)
Title |
---|
TAYLOR R. HAYES et al.: "Mapping and correcting the influence of gaze position on pupil size measurements", Behavior Research Methods, 1 June 2017 (2017-06-01), pages 1 - 31 * |
YONGSHENG ZHOU et al.: "Study on the Calibration Method of Gaze Point in Gaze Tracking", International Conference on Frontier Computing, 19 May 2019 (2019-05-19), pages 820 - 830 * |
CUI YAO: "Research and Implementation of Gaze-Controlled Human-Computer Interaction System Technology" (in Chinese), China Masters' Theses Full-text Database, Information Science and Technology Series, 15 January 2014 (2014-01-15), pages 140 - 2 * |
ZHANG YUANHUI et al.: "3D Gaze Point Estimation with Eyeball Optical Center Calibration and Distance Correction" (in Chinese), Journal of Image and Graphics, vol. 24, no. 8, 31 August 2019 (2019-08-31), pages 1369 - 1380 * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113128417A (en) * | 2021-04-23 | 2021-07-16 | 南开大学 | Double-region eye movement tracking method based on head posture |
CN113128417B (en) * | 2021-04-23 | 2023-04-07 | 南开大学 | Double-region eye movement tracking method based on head posture |
Also Published As
Publication number | Publication date |
---|---|
CN112101064B (en) | 2024-07-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9785233B2 (en) | Systems and methods of eye tracking calibration | |
EP3571673B1 (en) | Method for displaying virtual image, storage medium and electronic device therefor | |
CN103870796B (en) | Eye sight evaluation method and device | |
WO2020125499A1 (en) | Operation prompting method and glasses | |
US20220301218A1 (en) | Head pose estimation from local eye region | |
CN109472189B (en) | Pupil radius compensation | |
CN109690553A (en) | The system and method for executing eye gaze tracking | |
US20150029322A1 (en) | Method and computations for calculating an optical axis vector of an imaged eye | |
CN104809424B (en) | Method for realizing sight tracking based on iris characteristics | |
US9514524B2 (en) | Optical distortion compensation | |
JP2016173313A (en) | Visual line direction estimation system, visual line direction estimation method and visual line direction estimation program | |
CN108090463B (en) | Object control method, device, storage medium and computer equipment | |
WO2023011339A1 (en) | Line-of-sight direction tracking method and apparatus | |
TW202009786A (en) | Electronic apparatus operated by head movement and operation method thereof | |
CN108369744B (en) | 3D gaze point detection through binocular homography mapping | |
JP2022538669A (en) | Improved eye tracking latency | |
CN110341617B (en) | Eyeball tracking method, device, vehicle and storage medium | |
WO2015051605A1 (en) | Image collection and locating method, and image collection and locating device | |
US11080888B2 (en) | Information processing device and information processing method | |
CN115840502B (en) | Three-dimensional sight tracking method, device, equipment and storage medium | |
CN109522887A (en) | Gaze tracking method, apparatus, device and storage medium | |
CN112099622B (en) | Sight tracking method and device | |
CN110537897A (en) | Sight tracking method and device, computer readable storage medium and electronic equipment | |
CN112446251A (en) | Image processing method and related device | |
CN110338750B (en) | Eyeball tracking equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||