CN113116291A - Calibration and calibration method and device for eyeball tracking, mobile terminal and storage medium - Google Patents

Info

Publication number
CN113116291A
Authority
CN
China
Prior art keywords
target
calibration
eye
confidence
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911416784.5A
Other languages
Chinese (zh)
Inventor
杨平平
陈岩
方攀
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201911416784.5A
Publication of CN113116291A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement

Abstract

The embodiment of the application provides a calibration and calibration method and device for eyeball tracking, a mobile terminal and a storage medium, wherein the method comprises the following steps: detecting a touch operation and determining a touch area corresponding to the touch operation; if the touch area is located in the target calibration area, shooting a target human eye image corresponding to the touch area; extracting target eye feature data from the target human eye image, and inputting the target eye feature data into an eyeball tracking model to obtain a target fixation point coordinate; if the difference degree between the target fixation point coordinate and the coordinate corresponding to the target calibration area is smaller than a first threshold value, taking the target eye feature data and the target fixation point coordinate as confidence point data and recording the confidence point data in a confidence data set; and when the number of the confidence point data in the confidence data set reaches a second threshold value, re-fitting the eyeball tracking model according to the confidence point data in the confidence data set to obtain the calibrated eyeball tracking model. The embodiment of the application can improve the calibration effect of eyeball tracking.

Description

Calibration and calibration method and device for eyeball tracking, mobile terminal and storage medium
Technical Field
The application relates to the technical field of image processing, in particular to a calibration and calibration method and device for eyeball tracking, a mobile terminal and a storage medium.
Background
Currently, the most common eyeball tracking technology is Pupil Center Corneal Reflection (PCCR). The principle of the PCCR technology is that a light source irradiates the eye to form highly visible reflections, a camera of the eye tracking device captures images of these reflections, the reflection of the light source on the cornea and in the pupil is determined from the images, and the gaze direction of the human eye is finally calculated from the vector formed between the corneal reflection and the pupil center together with other geometric features.
Before the gaze direction of the human eye is calculated, a parameter calibration process is usually performed. During parameter calibration, the user is required to gaze at a calibration point and then click the screen to confirm the gaze; the camera then shoots the image of the user at that moment, so that the image corresponds one-to-one with the current gaze point. With this conventional calibration method, the eyeball tracking effect depends heavily on the effect of the first calibration; once the first calibration is poor, the subsequent eyeball tracking effect is also poor.
Disclosure of Invention
The embodiment of the application provides a calibration and calibration method and device for eyeball tracking, a mobile terminal and a storage medium, which can automatically calibrate the result of the last calibration after the last calibration is completed, thereby improving the calibration effect of eyeball tracking.
A first aspect of the embodiments of the present application provides a calibration and calibration method for eyeball tracking, including:
after the last calibration is completed, detecting touch operation and determining a touch area corresponding to the touch operation;
if the touch area is located in the target calibration area, shooting a target human eye image corresponding to the touch area;
extracting target eye characteristic data from the target eye image, inputting the target eye characteristic data into an eyeball tracking model, and obtaining a target fixation point coordinate; the eye tracking model is determined based on the last calibration;
if the difference degree between the target fixation point coordinate and the coordinate corresponding to the target calibration area is smaller than a first threshold value, taking the target eye feature data and the target fixation point coordinate as confidence point data, and recording the confidence point data in a confidence data set;
and when the number of the confidence point data in the confidence data set reaches a second threshold value, re-fitting the eyeball tracking model according to the confidence point data in the confidence data set to obtain the calibrated eyeball tracking model.
A second aspect of the embodiments of the present application provides a calibration apparatus for eyeball tracking, including:
the detection unit is used for detecting touch operation after the last calibration is finished;
the determining unit is used for determining a touch area corresponding to the touch operation;
the shooting unit is used for shooting a target human eye image corresponding to the touch area under the condition that the touch area is positioned in a target calibration area;
an extraction unit for extracting target eye feature data from the target eye image;
the model calculation unit is used for inputting the target eye characteristic data into an eyeball tracking model to obtain a target fixation point coordinate; the eye tracking model is determined based on the last calibration;
the processing unit is used for taking the target eye feature data and the target fixation point coordinate as confidence point data and recording the confidence point data in a confidence data set under the condition that the difference degree between the target fixation point coordinate and the coordinate corresponding to the target calibration area is smaller than a first threshold value;
and the calibration unit is used for re-fitting the eyeball tracking model according to the confidence point data in the confidence data set under the condition that the number of the confidence point data in the confidence data set reaches a second threshold value to obtain the calibrated eyeball tracking model.
A third aspect of an embodiment of the present application provides a mobile terminal, including a processor and a memory, where the memory is used to store a computer program, and the computer program includes program instructions, and the processor is configured to call the program instructions to execute the step instructions in the first aspect of the embodiment of the present application.
A fourth aspect of embodiments of the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, where the computer program makes a computer perform part or all of the steps as described in the first aspect of embodiments of the present application.
A fifth aspect of embodiments of the present application provides a computer program product, wherein the computer program product comprises a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps as described in the first aspect of embodiments of the present application. The computer program product may be a software installation package.
In the embodiment of the application, after the last calibration is completed, the last calibration can be calibrated. After the last calibration is completed, a touch operation is detected and a touch area corresponding to the touch operation is determined; if the touch area is located in the target calibration area, a target human eye image corresponding to the touch area is shot; target eye feature data are extracted from the target human eye image and input into an eyeball tracking model to obtain a target fixation point coordinate, wherein the eyeball tracking model is determined based on the last calibration; if the difference degree between the target fixation point coordinate and the coordinate corresponding to the target calibration area is smaller than a first threshold value, the target eye feature data and the target fixation point coordinate are taken as confidence point data, and the confidence point data are recorded in a confidence data set; and when the number of the confidence point data in the confidence data set reaches a second threshold value, the eyeball tracking model is re-fitted according to the confidence point data in the confidence data set to obtain the calibrated eyeball tracking model. In this way, after the last calibration, confidence point data are collected through touch operations and recorded in the confidence data set, and when the number of the confidence point data in the confidence data set reaches the second threshold value, the eyeball tracking model is re-fitted according to the confidence point data in the confidence data set, so that the calibrated eyeball tracking model is obtained. The calibration method can automatically calibrate the result of the last calibration after the last calibration, and improve the eyeball tracking accuracy of the calibrated eyeball tracking model, thereby improving the calibration effect of eyeball tracking.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a schematic flowchart of a calibration method for eye tracking according to an embodiment of the present disclosure;
FIG. 2a is a schematic diagram of a calibration area provided in an embodiment of the present application;
fig. 2b is a schematic diagram illustrating a pupil center calculation according to an embodiment of the present disclosure;
fig. 2c is a schematic diagram of extracting a pupil eye angle vector according to an embodiment of the present disclosure;
fig. 3 is a schematic flowchart of another calibration method for eye tracking according to an embodiment of the present disclosure;
fig. 4 is a schematic specific flowchart of calibration provided in the embodiment of the present application;
fig. 5 is a schematic specific flowchart of calibration according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of a calibration apparatus for eye tracking according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of a mobile terminal according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The mobile terminal according to the embodiments of the present application may include various handheld devices, vehicle-mounted devices, wearable devices, computing devices or other processing devices connected to a wireless modem, and various forms of User Equipment (UE), Mobile Stations (MS), terminal devices (terminal device), and so on. For convenience of description, the above-mentioned devices are collectively referred to as a mobile terminal.
Referring to fig. 1, fig. 1 is a schematic flowchart illustrating a calibration method for eye tracking according to an embodiment of the present disclosure. As shown in fig. 1, the calibration method for eye tracking may include the following steps.
101, after the last calibration is completed, the mobile terminal detects the touch operation and determines a touch area corresponding to the touch operation.
In the embodiment of the application, the eyeball tracking accuracy of the mobile terminal greatly depends on the calibration effect of the eyeball tracking. If a certain error occurs in calibration during eyeball tracking, the subsequent eyeball tracking effect is affected.
The mobile terminal may provide an eyeball tracking function to the user. The user can manually turn the eyeball tracking function on or off, and in some application scenarios the mobile terminal can also turn it on automatically. For example, in reading mode, the mobile terminal may automatically turn on the eyeball tracking function.
When the user uses the eyeball tracking for the first time, the mobile terminal guides the user to carry out calibration of the eyeball tracking. The calibration process for eyeball tracking may specifically include the following steps:
(11) the mobile terminal generates N calibration points in N calibration areas of the touch display screen;
(12) the mobile terminal respectively collects N human eye images under the condition that the sight of a user respectively watches the N calibration points;
(13) the mobile terminal extracts N eye feature data from the N eye images respectively;
(14) and fitting the model to be fitted by the mobile terminal based on the N eye characteristic data to obtain the eyeball tracking model.
In the embodiment of the application, in the calibration process, the mobile terminal guides a user to sequentially watch the N calibration areas of the touch display screen to generate N calibration points. And respectively collecting N human eye images by the mobile terminal under the condition that the sight of a user respectively watches the N calibration points.
Specifically, a touch display screen of the mobile terminal generates a calibration point in a calibration area each time to prompt a user to look at the calibration point, the mobile terminal collects at least one eye image under the condition that the user watches the calibration point, and selects one eye image from the at least one eye image as an eye image corresponding to the calibration point, or synthesizes at least two eligible images selected from the at least one eye image to obtain the eye image corresponding to the calibration point. Then, another calibration area of the touch display screen of the mobile terminal generates a calibration point, and the human eye image is continuously acquired. And respectively acquiring N human eye images corresponding to the N calibration points by the mobile terminal under the condition that the sight of the user respectively watches the N calibration points.
The color of the calibration point is different from that of the background area, and the color of the background area outside the calibration point is uniform. For example, the color of the calibration point is black and the color of the background area is white; the larger the color difference between the calibration point and the background area, the better.
The eye characteristic data may include eyeball pupil center coordinates, eyeball cornea reflected light spot center coordinates, left eye angular coordinates, right eye angular coordinates, and the like.
The model to be fitted may comprise a multivariate polynomial function model. For example, take a binary quadratic polynomial function model as an example:
Zx = a0 + a1·vx + a2·vy + a3·vx·vy + a4·vx^2 + a5·vy^2
Zy = b0 + b1·vx + b2·vy + b3·vx·vy + b4·vx^2 + b5·vy^2
wherein, in the eyeball tracking process, Zx refers to the X-axis coordinate and Zy refers to the Y-axis coordinate of the gaze point at which the user looks at the display screen. During calibration, Zx is the X-axis coordinate of the calibration point and Zy is the Y-axis coordinate of the calibration point. vx refers to the X-axis component of the eye feature data (for example, the difference between the X-axis coordinate of the left eye corner and the X-axis coordinate of the eyeball pupil center); vy refers to the Y-axis component of the eye feature data (for example, the difference between the Y-axis coordinate of the left eye corner and the Y-axis coordinate of the eyeball pupil center). a0, b0, a1, b1, a2, b2, a3, b3, a4, b4, a5 and b5 are the calibration parameters of the function model to be fitted.
In order to obtain better calibration parameters during the calibration process, the eye feature data of at least 9 human eye images are generally needed. Wherein, N may be an integer greater than or equal to 9.
The calibration process is to determine calibration parameters of the function model to be fitted, and after the calibration parameters of the function model to be fitted are determined, the eyeball tracking model is determined. The eye tracking model may then be used to calculate coordinates of the point of gaze of the user's eye on the touch sensitive display screen.
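As an illustrative sketch only (not part of the original description), the following shows how such a binary quadratic polynomial model could be fitted by least squares from N calibration samples and then used to predict a gaze point; the function names, the use of NumPy, and the exact ordering of the six terms are assumptions.

import numpy as np

def fit_gaze_model(v, z):
    # Fit Z = c0 + c1*vx + c2*vy + c3*vx*vy + c4*vx^2 + c5*vy^2 by least squares.
    # v: (N, 2) array of eye feature components (vx, vy), one row per calibration sample.
    # z: (N, 2) array of calibration point coordinates (Zx, Zy).
    # Returns (a, b): the X-axis coefficients a0..a5 and the Y-axis coefficients b0..b5.
    vx, vy = v[:, 0], v[:, 1]
    # Design matrix with the six terms of the binary quadratic polynomial.
    A = np.column_stack([np.ones_like(vx), vx, vy, vx * vy, vx ** 2, vy ** 2])
    a, *_ = np.linalg.lstsq(A, z[:, 0], rcond=None)  # a0..a5
    b, *_ = np.linalg.lstsq(A, z[:, 1], rcond=None)  # b0..b5
    return a, b

def predict_gaze(a, b, vx, vy):
    # Apply fitted coefficients to one eye feature vector to get a gaze point (Zx, Zy).
    terms = np.array([1.0, vx, vy, vx * vy, vx ** 2, vy ** 2])
    return float(terms @ a), float(terms @ b)

With eye feature data from at least 9 calibration samples, the 6 coefficients per axis are well constrained, which matches the requirement above of at least 9 human eye images.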
After the first calibration is finished, the mobile terminal can regularly remind the user to perform calibration again. Each calibration process is similar to the first calibration process and is not described herein again.
After the mobile terminal detects the touch operation, the area where the touch operation is in contact with the touch display screen can be determined, and the area where the touch operation is in contact with the touch display screen is determined to be the touch area corresponding to the touch operation.
The touch operation may include a click operation or a slide operation.
The sliding operation may include an operation in a sliding region, and if the sliding region is located in the target calibration region, points in the entire sliding region may be fitted.
And 102, if the touch area is located in the target calibration area, shooting a target human eye image corresponding to the touch area by the mobile terminal.
In the embodiment of the application, the target calibration area is an area used for generating a calibration point in a calibration process. For example, please refer to fig. 2a, fig. 2a is a schematic diagram of a calibration area according to an embodiment of the present disclosure. As shown in fig. 2a, the touch display screen may include 9 calibration areas, and each calibration area includes a calibration point. The target calibration area may be any one of the 9 calibration areas. And if the touch area is located in any one of the 9 calibration areas, determining that the touch area is located in the target calibration area.
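As an illustrative sketch only (the patent does not prescribe a particular layout or implementation), the following shows how a touch point could be mapped to one of 9 calibration areas arranged as a 3 x 3 grid, as in fig. 2a, and how the calibration point of an area could be taken as its center; the grid assumption and all names are hypothetical.

def locate_calibration_area(touch_x, touch_y, screen_w, screen_h, rows=3, cols=3):
    # Return the index (0..rows*cols-1) of the calibration area containing the touch
    # point, assuming the screen is divided into a rows x cols grid of equal areas.
    col = min(int(touch_x / screen_w * cols), cols - 1)
    row = min(int(touch_y / screen_h * rows), rows - 1)
    return row * cols + col

def calibration_point_of(area_index, screen_w, screen_h, rows=3, cols=3):
    # Coordinate of the calibration point, taken here as the center of the area.
    row, col = divmod(area_index, cols)
    return ((col + 0.5) * screen_w / cols, (row + 0.5) * screen_h / rows)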
If the touch area is detected to be located in the target calibration area, the mobile terminal can shoot a target human eye image through the front camera, and the target human eye image is a human eye image corresponding to the touch area.
The target eye image can be any one of a plurality of eye images continuously shot by the mobile terminal through the front-facing camera. For example, when the touch area is located in the target calibration area, the mobile terminal may continuously capture N human eye images corresponding to the touch area, and for each human eye image in the N human eye images, the operations in steps 103 to 105 may be performed.
The mobile terminal may include an infrared (IR) camera or a red-green-blue (RGB) camera. The IR camera can emit infrared light; when the infrared light irradiates the human eye, a light spot appears on the eye, and the IR camera can capture a grayscale human eye image.
The RGB camera can capture color human eye images.
Generally, an IR camera captures the infrared pupil reflection more accurately than an RGB camera; the RGB scheme requires more image processing, so the calculation with an IR camera is more accurate than with an RGB camera. In terms of versatility, the structure and scheme design of an RGB camera are more versatile than those of an IR camera.
103, extracting target eye characteristic data from the target eye image by the mobile terminal, and inputting the target eye characteristic data into an eyeball tracking model to obtain a target fixation point coordinate; the eye tracking model is determined based on the last calibration.
In this embodiment, the target eye characteristic data may include an eyeball pupil center coordinate, an eyeball cornea reflected light spot center coordinate, a left eye angle coordinate, a right eye angle coordinate, and the like.
The eyeball tracking model is the model determined by the last calibration, and the parameters of the eyeball tracking model are those determined by the last calibration. The parameters of the eyeball tracking model may be re-determined each time calibration is performed (for example, the parameters of the eyeball tracking model may include a0, b0, a1, b1, a2, b2, a3, b3, a4, b4, a5 and b5 in the above formula).
Optionally, in step 103, the mobile terminal extracts target eye feature data from the target eye image, which may specifically include the following steps:
(21) the mobile terminal determines a region of interest (ROI) in the target human eye image;
(22) the mobile terminal extracts a pupil eye angle vector from the ROI, and the target eye feature data comprises the pupil eye angle vector.
In the embodiment of the present application, the region of interest may be a left eye region or a right eye region in the human eye image. The pupil canthus vector is the vector from the pupil center to the left canthus or the vector from the pupil center to the right canthus.
Optionally, in step (22), the mobile terminal extracts a pupil and eye corner vector from the ROI region, and may further include the following steps:
(221) the mobile terminal determines the pupil center coordinates in the ROI by adopting a kernel-gradient pupil positioning method;
(222) the mobile terminal determines a first eye corner coordinate and a second eye corner coordinate in the ROI area by adopting an edge detection and kernel-gradient eye corner positioning method;
(223) the mobile terminal determines a pupil-eye corner vector according to the pupil center coordinate, the first eye corner coordinate and the second eye corner coordinate.
In the embodiment of the present application, please refer to fig. 2b; fig. 2b is a schematic diagram illustrating the pupil center point calculation according to the embodiment of the present application. As shown in fig. 2b, the gray area in the left image of fig. 2b is the eyeball, c is a candidate point for the pupil center (c is located in the gray area), x_i is a point to be found on the eyeball edge, d_i is the unit displacement vector (in the direction from c to x_i), and g_i is the unit gradient direction (perpendicular to the tangential direction of the eyeball edge at x_i and pointing outward from the eyeball). When c and the points on the eyeball edge are traversed in all directions, the mean of the squared dot products of the displacement vectors and the gradient directions is obtained, and the position corresponding to the maximum value is taken as the pupil center position. Specifically, the pupil center position may be calculated as follows.
C* = argmax_c { (1/N) * Σ_{i=1..N} (d_i^T · g_i)^2 }
Wherein, C* is the pupil center position, d_i^T is the transposed vector of d_i, d_i = x_i - c (normalized to unit length as described above), and N is the number of traversed edge points x_i. Here g is the kernel gradient; by using it, the embodiment of the present application can enhance the robustness of the center point when processing the edges of low-resolution or blurred images as well as images containing eyelashes and hair.
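A minimal, illustrative sketch of this objective (maximizing the mean squared dot product between unit displacement vectors and unit gradient directions over candidate centers) is given below; the brute-force search, the NumPy-based gradient, and the selection of edge points by gradient magnitude are assumptions made for the example and are not taken from the patent.

import numpy as np

def pupil_center(gray):
    # Locate the pupil center in a small grayscale eye ROI by maximizing
    # mean_i (d_i^T g_i)^2 over candidate centers c, as described above.
    gray = gray.astype(float)
    gy, gx = np.gradient(gray)                              # image gradients
    mag = np.hypot(gx, gy)
    ys, xs = np.nonzero(mag > np.percentile(mag, 90))       # keep strong edge points x_i
    g = np.stack([gx[ys, xs], gy[ys, xs]], axis=1)
    g /= np.linalg.norm(g, axis=1, keepdims=True)           # unit gradient directions g_i

    h, w = gray.shape
    best_score, best_c = -1.0, (w // 2, h // 2)
    for cy in range(h):
        for cx in range(w):                                 # candidate centers c
            d = np.stack([xs - cx, ys - cy], axis=1).astype(float)
            norms = np.linalg.norm(d, axis=1)
            keep = norms > 0
            d = d[keep] / norms[keep, None]                 # unit displacement vectors d_i
            score = np.mean((d * g[keep]).sum(axis=1) ** 2)
            if score > best_score:
                best_score, best_c = score, (cx, cy)
    return best_c                                           # (x, y) pupil center estimate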
The following illustrates the difference between the kernel gradient of the embodiment of the present application and the conventional gradient.
For example, consider a certain matrix M (the example matrix is shown only as a figure in the original). The conventional gradient is computed by central differences in the x and y directions, namely:
grad_x(x, y) = (M(x+1, y) - M(x-1, y)) / 2,
grad_y(x, y) = (M(x, y+1) - M(x, y-1)) / 2.
The kernel gradient of the embodiment of the present application is instead computed by convolving the matrix with a gradient kernel (the specific kernels are shown only as figures in the original and are not reproduced here). The gradient is then normalized as:
grad'(x, y) = abs(grad_x(x, y)^2 + grad_y(x, y)^2).
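For illustration, the sketch below implements the conventional central-difference gradient and the squared-magnitude normalization described above; since the patent's own kernel is given only as a figure, a 3 x 3 Sobel kernel is used here as one common kernel-based stand-in, which is an assumption rather than the patent's actual kernel.

import numpy as np
from scipy.ndimage import convolve

def conventional_gradient(m):
    # Central differences: grad_x(x, y) = (M(x+1, y) - M(x-1, y)) / 2, likewise for y.
    m = m.astype(float)
    grad_x = np.zeros_like(m)
    grad_y = np.zeros_like(m)
    grad_x[:, 1:-1] = (m[:, 2:] - m[:, :-2]) / 2.0
    grad_y[1:-1, :] = (m[2:, :] - m[:-2, :]) / 2.0
    return grad_x, grad_y

def kernel_gradient(m):
    # One common kernel-based gradient (3 x 3 Sobel); the patent's own kernel is not
    # reproduced in the text, so this particular kernel is an assumption.
    sobel_x = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    m = m.astype(float)
    return convolve(m, sobel_x), convolve(m, sobel_x.T)

def normalized_gradient(grad_x, grad_y):
    # grad'(x, y) = abs(grad_x(x, y)^2 + grad_y(x, y)^2), as in the description above.
    return np.abs(grad_x ** 2 + grad_y ** 2)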
The eye corner detection in the embodiment of the present application uses the kernel gradient together with SUSAN edge detection. Using the kernel gradient alone, the gradient at the eye corners is easier to detect than that at the pupil, and combined with edge detection up to 2 eye corners can be filtered out.
Referring to fig. 2c, fig. 2c is a schematic diagram illustrating extraction of a pupil-eye corner vector according to an embodiment of the present disclosure. As shown in fig. 2c, after the ROI area of the human eye is obtained, pupil positioning based on the kernel-gradient method is carried out on the ROI area, and the pupil center coordinates (x, y) are output. Edge detection and kernel-gradient positioning are performed on the ROI area to obtain a first eye corner coordinate (X1, Y1) and a second eye corner coordinate (X2, Y2), and two candidate pupil-eye corner vectors vec1 and vec2 are calculated, wherein vec1 = (X1 - x, Y1 - y) and vec2 = (X2 - x, Y2 - y). The vector with the smaller modulus among |vec1| and |vec2| is determined as the pupil-eye corner vector (for example, if |vec1| is greater than |vec2|, vec2 is determined as the pupil-eye corner vector).
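A minimal sketch of this vector selection follows; the function name and the tuple-based inputs are illustrative assumptions.

import numpy as np

def pupil_corner_vector(pupil_xy, corner1_xy, corner2_xy):
    # Given the pupil center (x, y) and two detected eye corner coordinates,
    # return the candidate pupil-eye corner vector with the smaller modulus,
    # i.e. the smaller of |vec1| and |vec2| described above.
    pupil = np.asarray(pupil_xy, dtype=float)
    vec1 = np.asarray(corner1_xy, dtype=float) - pupil   # (X1 - x, Y1 - y)
    vec2 = np.asarray(corner2_xy, dtype=float) - pupil   # (X2 - x, Y2 - y)
    return vec1 if np.linalg.norm(vec1) <= np.linalg.norm(vec2) else vec2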
And 104, if the difference degree between the target fixation point coordinate and the coordinate corresponding to the target calibration area is smaller than a first threshold value, the mobile terminal takes the target eye feature data and the target fixation point coordinate as confidence point data, and the confidence point data is recorded in a confidence data set.
In the embodiment of the application, if the difference between the target fixation point coordinate and the coordinate corresponding to the target calibration area is smaller than a first threshold, the target eye feature data is regarded as reliable data, the target eye feature data and the target fixation point coordinate are used as confidence point data, and the confidence point data is recorded in a confidence data set. The coordinates corresponding to the target calibration area may include coordinates of a calibration point within the target calibration area.
Each target calibration area corresponds to a calibration point coordinate, which can be specifically shown in fig. 2 a. The coordinates of the calibration points within each target calibration area are determined.
In the embodiment of the application, if the difference between the target fixation point coordinate and the coordinate corresponding to the target calibration area is smaller than the first threshold, the target fixation point corresponding to the target fixation point coordinate is considered to be close to the coordinate of the calibration point in the target calibration area, and the data is considered to be reliable data. Otherwise, the data is considered unreliable.
For example, the confidence point data of the present application is premised on consistency between the hands and eyes of the user. That is, the area touched by the finger of the user is the area watched by the eyes of the user. However, in some cases, if the user's eyes look at another area when the user's finger touches a certain area, and at this time, the degree of difference between the target gaze point coordinate and the coordinate corresponding to the target calibration area is large, the target eye feature data and the target gaze point coordinate are not data desired in the embodiment of the present application, and are considered to be unreliable data.
The first threshold may be preset and stored in a memory (e.g., a non-volatile memory) of the mobile terminal.
The eyeball tracking model may include a multivariate polynomial function model. For example, take a binary quadratic polynomial function model as an example:
Zx = a0 + a1·vx + a2·vy + a3·vx·vy + a4·vx^2 + a5·vy^2
Zy = b0 + b1·vx + b2·vy + b3·vx·vy + b4·vx^2 + b5·vy^2
If the coordinate corresponding to the target calibration area is (Zx1, Zy1) and the target fixation point coordinate is (Zx2, Zy2), the difference degree between the target fixation point coordinate and the coordinate corresponding to the target calibration area can be calculated by the following formula:
deta = sqrt(detaZx^2 + detaZy^2)
wherein deta is the difference degree between the fixation point coordinate and the coordinate corresponding to the target calibration area, detaZx = Zx1 - Zx2, and detaZy = Zy1 - Zy2.
Optionally, if the difference degree between the target fixation point coordinate and the coordinate corresponding to the target calibration area is not smaller than the first threshold, the mobile terminal discards the target eye feature data.
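A minimal sketch of this accept/discard decision follows, assuming the Euclidean form of the difference degree given above; the function names and the list-based confidence data set are illustrative assumptions.

import math

def difference_degree(calib_xy, gaze_xy):
    # deta between the calibration point coordinate (Zx1, Zy1) and the predicted
    # target fixation point coordinate (Zx2, Zy2).
    deta_zx = calib_xy[0] - gaze_xy[0]
    deta_zy = calib_xy[1] - gaze_xy[1]
    return math.hypot(deta_zx, deta_zy)

def record_if_confident(features, gaze_xy, calib_xy, first_threshold, confidence_set):
    # Record (features, gaze_xy) as one piece of confidence point data if the
    # difference degree is smaller than the first threshold; otherwise discard it.
    if difference_degree(calib_xy, gaze_xy) < first_threshold:
        confidence_set.append((features, gaze_xy))
        return True
    return False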
105, when the number of the confidence point data in the confidence data set reaches a second threshold value, the mobile terminal performs refitting on the eyeball tracking model according to the confidence point data in the confidence data set to obtain the calibrated eyeball tracking model.
In the embodiment of the application, when the number of the confidence point data in the confidence data set reaches the second threshold, the number of the confidence point data in the confidence data set is considered to reach the standard of fitting again, and then the eyeball tracking model is fitted again according to the confidence point data in the confidence data set, so that the calibrated eyeball tracking model is obtained.
After the re-fitting, the parameters of the calibrated eyeball tracking model (for example, the parameters may include a0, b0, a1, b1, a2, b2, a3, b3, a4, b4, a5 and b5 in the above formula) are recalculated and replace the parameters of the last calibrated eyeball tracking model.
The second threshold may be preset and stored in a memory (e.g., a non-volatile memory) of the mobile terminal. For example, the second threshold may be set to be greater than 9. For example, the second threshold may be set to 10, 100, 200, 1000, etc.
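As an illustrative sketch only, the refit trigger of step 105 could look as follows, reusing a fitting helper such as the fit_gaze_model sketched earlier; clearing the counter after refitting mirrors the fig. 5 description, and all names are assumptions.

import numpy as np

def maybe_recalibrate(confidence_set, second_threshold, fit_model):
    # When the number of confidence point data reaches the second threshold, refit
    # the eyeball tracking model on the confidence data set, clear the set, and
    # return the new coefficients; otherwise return None.
    if len(confidence_set) < second_threshold:
        return None
    v = np.array([features for features, _ in confidence_set])  # (vx, vy) per sample
    z = np.array([gaze for _, gaze in confidence_set])           # recorded gaze points
    a, b = fit_model(v, z)       # e.g. fit_gaze_model; replaces the old a0..a5, b0..b5
    confidence_set.clear()
    return a, b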
In the embodiment of the application, at most one piece of confidence point data is collected per touch operation. After calibration is completed, confidence point data can be collected during the user's touch operations and used as the data source for subsequent correction, so that re-fitting can be performed, the eyeball tracking accuracy of the calibrated eyeball tracking model can be improved, and the calibration effect of eyeball tracking can be improved.
Optionally, after step 105 is executed, the following steps may also be executed:
(31) under the condition of eyeball tracking, the mobile terminal acquires a target user eye image;
(32) the mobile terminal extracts eye feature data of a target user from a human eye image of the target user;
(33) and the mobile terminal inputs the eye characteristic data of the target user into the calibrated eyeball tracking model to obtain the coordinates of the fixation point of the target user.
In the embodiment of the present application, step (32) and step (33) may refer to the above detailed description of step 103, and are not described herein again.
By adopting the calibrated eyeball tracking model for eyeball tracking, the embodiment of the application can accurately calculate the fixation point coordinate of the human eye, thereby improving the eyeball tracking accuracy.
In the embodiment of the application, after the previous calibration, the confidence point data is collected through touch operation and recorded in the confidence data set, and when the number of the confidence point data in the confidence data set reaches a second threshold value, the eyeball tracking model is re-fitted according to the confidence point data in the confidence data set to obtain the calibrated eyeball tracking model. The calibration method can automatically calibrate the last calibration result after the last calibration, and improve the eyeball tracking accuracy of the calibrated eyeball tracking model, thereby improving the calibration effect of eyeball tracking.
Referring to fig. 3, fig. 3 is a schematic flowchart illustrating another calibration method for eye tracking according to an embodiment of the present disclosure. As shown in fig. 3, the calibration method for eye tracking may include the following steps.
301, the mobile terminal starts a calibration mode, and generates N calibration points in N calibration areas of the touch display screen.
302, under the condition that the sight of the user respectively watches the N calibration points, the mobile terminal respectively collects N human eye images.
303, the mobile terminal extracts N eye feature data from the N eye images respectively.
And 304, fitting the model to be fitted by the mobile terminal based on the N eye characteristic data to obtain the eyeball tracking model.
In the embodiment of the application, in the calibration process, the mobile terminal guides a user to sequentially watch the N calibration areas of the touch display screen to generate N calibration points. And respectively collecting N human eye images by the mobile terminal under the condition that the sight of a user respectively watches the N calibration points.
Specifically, a touch display screen of the mobile terminal generates a calibration point in a calibration area each time to prompt a user to look at the calibration point, the mobile terminal collects at least one eye image under the condition that the user watches the calibration point, and selects one eye image from the at least one eye image as an eye image corresponding to the calibration point, or synthesizes at least two eligible images selected from the at least one eye image to obtain the eye image corresponding to the calibration point. Then, another calibration area of the touch display screen of the mobile terminal generates a calibration point, and the human eye image is continuously acquired. And respectively acquiring N human eye images corresponding to the N calibration points by the mobile terminal under the condition that the sight of the user respectively watches the N calibration points.
The color of the calibration point is different from that of the background area, and the color of the background area outside the calibration point is uniform. For example, the color of the calibration point is black and the color of the background area is white; the larger the color difference between the calibration point and the background area, the better.
The eye characteristic data may include eyeball pupil center coordinates, eyeball cornea reflected light spot center coordinates, left eye angular coordinates, right eye angular coordinates, and the like.
The model to be fitted may comprise a multivariate polynomial function model. For example, take a binary quadratic polynomial function model as an example:
Zx = a0 + a1·vx + a2·vy + a3·vx·vy + a4·vx^2 + a5·vy^2
Zy = b0 + b1·vx + b2·vy + b3·vx·vy + b4·vx^2 + b5·vy^2
wherein, in the eyeball tracking process, Zx refers to the X-axis coordinate and Zy refers to the Y-axis coordinate of the gaze point at which the user looks at the display screen. During calibration, Zx is the X-axis coordinate of the calibration point and Zy is the Y-axis coordinate of the calibration point. vx refers to the X-axis component of the eye feature data (for example, the difference between the X-axis coordinate of the left eye corner and the X-axis coordinate of the eyeball pupil center); vy refers to the Y-axis component of the eye feature data (for example, the difference between the Y-axis coordinate of the left eye corner and the Y-axis coordinate of the eyeball pupil center). a0, b0, a1, b1, a2, b2, a3, b3, a4, b4, a5 and b5 are the calibration parameters of the function model to be fitted.
In order to obtain better calibration parameters during the calibration process, the eye feature data of at least 9 human eye images are generally needed. Wherein, N may be an integer greater than or equal to 9.
The calibration process is to determine calibration parameters of the function model to be fitted, and after the calibration parameters of the function model to be fitted are determined, the eyeball tracking model is determined. The eye tracking model may then be used to calculate coordinates of the point of gaze of the user's eye on the touch sensitive display screen.
The following describes a specific calibration process, taking the mobile terminal as a mobile phone and N equal to 9 as an example, with reference to fig. 4. Referring to fig. 4, fig. 4 is a schematic diagram illustrating a specific calibration process according to an embodiment of the present disclosure. As shown in fig. 4, after sight line calibration is started, the human eyes gaze at calibration areas 1 to 9 in the mobile phone screen in sequence; the front camera (IR camera, i.e. infrared camera) of the mobile phone shoots 9 images corresponding to calibration areas 1 to 9; for each image, human eye detection outputs an ROI (region of interest) area, pupil positioning based on the kernel-gradient method is carried out on the ROI area, and the pupil center coordinates (x, y) are output; edge detection and kernel-gradient positioning output the eye corner coordinates (including a first eye corner coordinate (X1, Y1) and a second eye corner coordinate (X2, Y2)), wherein the first eye corner coordinate may be the left eye corner coordinate and the second eye corner coordinate may be the right eye corner coordinate; the pupil-eye corner vectors vec1 = (X1 - x, Y1 - y) and vec2 = (X2 - x, Y2 - y) are obtained, and the vector with the smaller modulus among |vec1| and |vec2| is used for the calculation; the calibration point coordinates (Zx1, Zy1) to (Zx9, Zy9) corresponding to calibration areas 1 to 9 in the mobile phone screen are determined; the 9 pairs of coordinates and the corresponding pupil-eye corner vectors are input into the nonlinear polynomial model, and a0 to a5 and b0 to b5 are calculated to obtain the calibrated eyeball tracking model.
The following describes a specific procedure of calibration with reference to fig. 5, taking the mobile terminal as a mobile phone, N equal to 9, and the second threshold equal to 1000 as an example. Referring to fig. 5, fig. 5 is a schematic diagram illustrating a specific calibration process according to an embodiment of the present disclosure. As shown in fig. 5, after calibration of the sight line is started, if the human eye gazes at any one of calibration areas 1 to 9 in the mobile phone screen, the front camera (IR camera, i.e. infrared camera) of the mobile phone shoots the human eye image corresponding to that calibration area; human eye detection outputs an ROI area from the human eye image, pupil positioning based on the kernel-gradient method is carried out on the ROI area, and the pupil center coordinates (x, y) are output; edge detection and kernel-gradient positioning output the eye corner coordinates (including a first eye corner coordinate (X1, Y1) and a second eye corner coordinate (X2, Y2)), wherein the first eye corner coordinate may be the left eye corner coordinate and the second eye corner coordinate may be the right eye corner coordinate; the pupil-eye corner vectors vec1 = (X1 - x, Y1 - y) and vec2 = (X2 - x, Y2 - y) are obtained, and the vector with the smaller modulus among |vec1| and |vec2| is used for the calculation; the calibration point coordinates (Zx1, Zy1) to (Zx9, Zy9) corresponding to calibration areas 1 to 9 in the mobile phone screen are known; the feature vector is substituted into the nonlinear polynomial model with a0 to a5 and b0 to b5, and the difference deta is calculated; if the deta of the current frame is smaller than the average deta of the previous 9 times, the data is recorded as a new confidence point and the total number n of new confidence points is increased; when the total number of new confidence points reaches the second threshold (for example, 1000 sets), a0 to a5 and b0 to b5 are refitted, the old values are replaced, and n is cleared, that is, n is set equal to 0.
305, after the last calibration is completed, the mobile terminal detects the touch operation and determines a touch area corresponding to the touch operation.
And 306, if the touch area is located in the target calibration area, the mobile terminal shoots a target human eye image corresponding to the touch area.
307, the mobile terminal extracts target eye characteristic data from the target eye image, inputs the target eye characteristic data into an eyeball tracking model, and obtains a target fixation point coordinate; the eye tracking model is determined based on the last calibration.
308, if the difference between the target fixation point coordinate and the coordinate corresponding to the target calibration area is smaller than a first threshold, the mobile terminal takes the target eye feature data and the target fixation point coordinate as confidence point data, and records the confidence point data in the confidence data set.
309, when the number of the confidence point data in the confidence data set reaches a second threshold value, the mobile terminal performs refitting on the eyeball tracking model according to the confidence point data in the confidence data set to obtain the calibrated eyeball tracking model.
The specific implementation of steps 305 to 309 can refer to steps 101 to 105 shown in fig. 1, and is not described herein again.
In the embodiment of the application, after the previous calibration, the confidence point data is collected through touch operation and recorded in the confidence data set, and when the number of the confidence point data in the confidence data set reaches a second threshold value, the eyeball tracking model is re-fitted according to the confidence point data in the confidence data set to obtain the calibrated eyeball tracking model. The calibration method can automatically calibrate the last calibration result after the last calibration, and improve the eyeball tracking accuracy of the calibrated eyeball tracking model, thereby improving the calibration effect of eyeball tracking.
The above description has introduced the solution of the embodiment of the present application mainly from the perspective of the method-side implementation process. It is understood that the mobile terminal includes hardware structures and/or software modules for performing the respective functions in order to implement the above-described functions. Those of skill in the art will readily appreciate that the present application is capable of hardware or a combination of hardware and computer software implementing the various illustrative elements and algorithm steps described in connection with the embodiments provided herein. Whether a function is performed as hardware or computer software drives hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiment of the present application, the mobile terminal may be divided into the functional units according to the method example, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit. It should be noted that the division of the unit in the embodiment of the present application is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
In accordance with the above, referring to fig. 6, fig. 6 is a schematic structural diagram of an eyeball tracking calibration apparatus 600 provided in the embodiment of the present application, which may include a detection unit 601, a determination unit 602, a shooting unit 603, an extraction unit 604, a model calculation unit 605, a processing unit 606, and a calibration unit 607, wherein:
a detection unit 601, configured to detect a touch operation after the last calibration is completed;
a determining unit 602, configured to determine a touch area corresponding to the touch operation;
a shooting unit 603, configured to shoot a target human eye image corresponding to the touch area when the touch area is located in a target calibration area;
an extracting unit 604, configured to extract target eye feature data from the target eye image;
a model calculation unit 605, configured to input the target eye feature data into an eyeball tracking model, and obtain a target fixation point coordinate; the eye tracking model is determined based on the last calibration;
the processing unit 606 is configured to, when a difference between the target fixation point coordinate and a coordinate corresponding to the target calibration area is smaller than a first threshold, take the target eye feature data and the target fixation point coordinate as a piece of confidence point data, and record the confidence point data in a confidence data set;
the calibration unit 607 is configured to, when the number of the confidence point data in the confidence data set reaches a second threshold, re-fit the eyeball tracking model according to the confidence point data in the confidence data set to obtain a calibrated eyeball tracking model.
Optionally, the extracting unit 604 extracts the target eye feature data from the target human eye image specifically by: determining a region of interest (ROI) in the target human eye image; and extracting a pupil-eye corner vector from the ROI area, wherein the target eye feature data comprises the pupil-eye corner vector.
Optionally, the extracting unit 604 extracts the pupil-eye corner vector from the ROI area specifically by: determining the pupil center coordinates in the ROI area by adopting a kernel-gradient pupil positioning method; determining a first eye corner coordinate and a second eye corner coordinate in the ROI area by adopting an edge detection and kernel-gradient eye corner positioning method; and determining the pupil-eye corner vector according to the pupil center coordinate, the first eye corner coordinate and the second eye corner coordinate.
Optionally, the processing unit 606 is further configured to discard the target eye feature data when the difference degree between the target fixation point coordinate and the coordinate corresponding to the target calibration area is not smaller than the first threshold.
Optionally, the touch operation includes a click operation or a slide operation.
Optionally, the calibration apparatus 600 for eye tracking may further include a calibration unit 608;
the calibration unit 608 is configured to start a calibration mode before the last calibration is completed, and generate N calibration points in N calibration areas of the touch display screen; under the condition that the sight of a user respectively watches the N calibration points, respectively collecting N human eye images; extracting N pieces of eye feature data from the N pieces of eye images respectively; and fitting a model to be fitted based on the N eye characteristic data to obtain the eyeball tracking model.
Optionally, the calibration apparatus 600 for eye tracking may further include an eye tracking unit 609;
the eyeball tracking unit 609 is configured to obtain an image of the human eyes of the target user when the eyeball tracking is performed after the calibration unit 607 performs refitting on the eyeball tracking model according to the confidence point data in the confidence data set to obtain a calibrated eyeball tracking model; extracting target user eye feature data from the target user eye image; and inputting the eye feature data of the target user into the calibrated eyeball tracking model to obtain the coordinates of the fixation point of the target user.
The shooting unit 603 may be a camera of the mobile terminal, the detection unit 601 may be a touch display of the mobile terminal, and the determination unit 602, the extraction unit 604, the model calculation unit 605, the processing unit 606, the calibration unit 607, the calibration unit 608, and the eye tracking unit 609 may be a processor of the mobile terminal.
In the embodiment of the application, after the previous calibration, the confidence point data is collected through touch operation and recorded in the confidence data set, and when the number of the confidence point data in the confidence data set reaches a second threshold value, the eyeball tracking model is re-fitted according to the confidence point data in the confidence data set to obtain the calibrated eyeball tracking model. The calibration method can automatically calibrate the last calibration result after the last calibration, and improve the eyeball tracking accuracy of the calibrated eyeball tracking model, thereby improving the calibration effect of eyeball tracking.
Referring to fig. 7, fig. 7 is a schematic structural diagram of a mobile terminal according to an embodiment of the present disclosure. As shown in fig. 7, the mobile terminal 700 includes a processor 701 and a memory 702, and the processor 701 and the memory 702 may be connected to each other through a communication bus 703. The communication bus 703 may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus 703 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in fig. 7, but this is not intended to represent only one bus or one type of bus. The memory 702 is used for storing a computer program comprising program instructions, and the processor 701 is configured to call the program instructions to perform the methods shown in fig. 1 to 5.
The processor 701 may be a general purpose Central Processing Unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of programs according to the above schemes.
The Memory 702 may be, but is not limited to, a Read-Only Memory (ROM) or other type of static storage device that can store static information and instructions, a Random Access Memory (RAM) or other type of dynamic storage device that can store information and instructions, an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Compact Disc Read-Only Memory (CD-ROM) or other optical Disc storage, optical Disc storage (including Compact Disc, laser Disc, optical Disc, digital versatile Disc, blu-ray Disc, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory may be self-contained and coupled to the processor via a bus. The memory may also be integral to the processor.
The mobile terminal 700 may also include a camera 704 and a display 705. The cameras 704 may include front-facing cameras, rear-facing cameras, and the like. The display 705 may include a liquid crystal display, an LED display, an OLED display, or other touch display.
In addition, the mobile terminal 700 may also include general-purpose components such as a communication interface, an antenna, and the like, which will not be described in detail herein.
In the embodiment of the application, after the previous calibration, the confidence point data is collected through touch operation and recorded in the confidence data set, and when the number of the confidence point data in the confidence data set reaches a second threshold value, the eyeball tracking model is re-fitted according to the confidence point data in the confidence data set to obtain the calibrated eyeball tracking model. The calibration method can automatically calibrate the last calibration result after the last calibration, and improve the eyeball tracking accuracy of the calibrated eyeball tracking model, thereby improving the calibration effect of eyeball tracking.
Embodiments of the present application also provide a computer storage medium, wherein the computer storage medium stores a computer program for electronic data exchange, and the computer program enables a computer to execute part or all of the steps of any one of the calibration methods for eye tracking as described in the above method embodiments.
Embodiments of the present application also provide a computer program product, which includes a non-transitory computer-readable storage medium storing a computer program, and the computer program causes a computer to execute some or all of the steps of any of the calibration methods for eye tracking as described in the above method embodiments.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative: the division of the units is only a division of logical functions, and other divisions are possible in actual implementation; for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not implemented. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices, or units, and may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software program module.
The integrated unit, if implemented in the form of a software program module and sold or used as a stand-alone product, may be stored in a computer-readable memory. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned memory includes various media capable of storing program code, such as a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disc.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by related hardware instructed by a program, and the program may be stored in a computer-readable memory, which may include a flash drive, a Read-Only Memory, a Random Access Memory, a magnetic disk, an optical disc, and the like.
The embodiments of the present application have been described in detail above to illustrate the principles and implementations of the present application. The description of the above embodiments is provided only to help understand the method and core concept of the present application; meanwhile, a person skilled in the art may, based on the idea of the present application, make changes to the specific embodiments and the application scope. In summary, the content of this specification should not be construed as limiting the present application.

Claims (10)

1. A calibration and calibration method for eyeball tracking, characterized by comprising the following steps:
after the last calibration is completed, detecting touch operation and determining a touch area corresponding to the touch operation;
if the touch area is located in the target calibration area, shooting a target human eye image corresponding to the touch area;
extracting target eye feature data from the target eye image, inputting the target eye feature data into an eyeball tracking model, and obtaining a target fixation point coordinate, wherein the eyeball tracking model is determined based on the last calibration;
if the difference degree between the target fixation point coordinate and the coordinate corresponding to the target calibration area is smaller than a first threshold value, taking the target eye feature data and the target fixation point coordinate as confidence point data, and recording the confidence point data in a confidence data set;
and when the number of the confidence point data in the confidence data set reaches a second threshold value, re-fitting the eyeball tracking model according to the confidence point data in the confidence data set to obtain the calibrated eyeball tracking model.
2. The method of claim 1, wherein extracting target eye feature data from the target eye image comprises:
determining a region of interest (ROI) in the target human eye image;
and extracting a pupil eye corner vector from the ROI, wherein the target eye feature data comprises the pupil eye corner vector.
3. The method of claim 2, wherein the extracting a pupil eye corner vector from the ROI comprises:
determining a pupil center coordinate in the ROI by using a kernel gradient pupil positioning method;
determining a first eye corner coordinate and a second eye corner coordinate in the ROI by using edge detection and a kernel gradient eye corner positioning method;
and determining the pupil eye corner vector according to the pupil center coordinate, the first eye corner coordinate and the second eye corner coordinate.
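For illustration only, the sketch below builds a pupil eye corner vector from an already-located pupil center and the two eye corner coordinates (the kernel gradient localization itself is not shown). Using the corner midpoint as the reference point and normalizing by the inter-corner distance are assumptions of this example, since claim 3 does not specify how the three coordinates are combined.

```python
import numpy as np

def pupil_corner_vector(pupil_center, first_corner, second_corner):
    """Combine the three located landmarks into a single eye feature vector.

    The midpoint of the two eye corners serves as a reference point, and the
    offset from that reference to the pupil center, scaled by the inter-corner
    distance, gives a feature that tolerates small changes in eye size on screen.
    """
    pupil = np.asarray(pupil_center, dtype=float)
    c1 = np.asarray(first_corner, dtype=float)
    c2 = np.asarray(second_corner, dtype=float)
    reference = (c1 + c2) / 2.0                 # midpoint of the two eye corners
    scale = np.linalg.norm(c2 - c1)             # inter-corner distance
    if scale == 0.0:                            # degenerate case: corners coincide
        scale = 1.0
    return (pupil - reference) / scale          # normalized pupil eye corner vector
```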
4. The method of claim 1, further comprising:
and if the difference degree between the target fixation point coordinate and the coordinate corresponding to the target calibration area is greater than or equal to the first threshold value, discarding the target eye feature data.
5. The method of claim 1, wherein the touch operation comprises a click operation or a slide operation.
6. The method according to any one of claims 1 to 5, wherein before the last calibration is completed, the method further comprises:
starting a calibration mode, and generating N calibration points in N calibration areas of the touch display screen;
collecting N human eye images respectively while the user's gaze is fixed on each of the N calibration points;
extracting N pieces of eye feature data from the N human eye images respectively;
and fitting a to-be-fitted model based on the N pieces of eye feature data to obtain the eyeball tracking model.
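One plausible realization of this fitting step, offered only as a sketch, is a least-squares regression from the N eye feature vectors to the N calibration point coordinates. The second-order polynomial feature map below is an assumption of the example; the claim does not fix the form of the model to be fitted.

```python
import numpy as np

def fit_gaze_model(eye_features, screen_points):
    """Least-squares fit of a second-order polynomial gaze mapping.

    eye_features:  (N, 2) array of pupil eye corner vectors collected at the N calibration points
    screen_points: (N, 2) array of the corresponding calibration point coordinates
    Returns a (6, 2) coefficient matrix W such that phi(feature) @ W approximates the screen point.
    """
    f = np.asarray(eye_features, dtype=float)
    design = np.column_stack([
        np.ones(len(f)), f[:, 0], f[:, 1],
        f[:, 0] * f[:, 1], f[:, 0] ** 2, f[:, 1] ** 2,
    ])
    W, *_ = np.linalg.lstsq(design, np.asarray(screen_points, dtype=float), rcond=None)
    return W

def predict_gaze(W, eye_feature):
    """Map a single pupil eye corner vector to a screen fixation point with the fitted model."""
    x, y = float(eye_feature[0]), float(eye_feature[1])
    phi = np.array([1.0, x, y, x * y, x ** 2, y ** 2])
    return phi @ W
```

Applying `predict_gaze` with re-fitted coefficients would correspond to gaze estimation with the calibrated eyeball tracking model.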
7. The method according to any one of claims 1 to 6, wherein after the re-fitting of the eyeball tracking model according to the confidence point data in the confidence data set to obtain the calibrated eyeball tracking model, the method further comprises:
under the condition of eyeball tracking, acquiring a target user eye image;
extracting target user eye feature data from the target user eye image;
and inputting the eye feature data of the target user into the calibrated eyeball tracking model to obtain the coordinates of the fixation point of the target user.
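Tying the sketches together, a hypothetical end-to-end usage might look as follows, assuming the `ConfidenceCalibrator`, `fit_gaze_model`, and `predict_gaze` examples above are in scope; the calibration data here are random placeholders rather than real measurements.

```python
import numpy as np

rng = np.random.default_rng(0)
calib_features = rng.normal(size=(9, 2))           # pupil eye corner vectors at 9 calibration points
calib_points = rng.uniform(0, 2000, size=(9, 2))   # coordinates of the 9 calibration points (pixels)

# Initial calibration: fit the gaze mapping from the N calibration samples.
W = fit_gaze_model(calib_features, calib_points)

# After that calibration, touches in target calibration areas feed the confidence data set.
calibrator = ConfidenceCalibrator(
    predict_fn=lambda f: predict_gaze(W, f),
    refit_fn=lambda pts: fit_gaze_model([f for f, _ in pts], [g for _, g in pts]),
    second_threshold=5,     # small value so this toy run triggers a re-fit
)

W_calibrated = None
for _ in range(5):
    feature = rng.normal(size=2)
    touch_center = predict_gaze(W, feature)        # toy touch placed exactly at the predicted gaze point
    W_calibrated = calibrator.on_touch(touch_center, feature)

# Gaze estimation with the re-fitted (calibrated) model, as in the tracking stage.
if W_calibrated is not None:
    print(predict_gaze(W_calibrated, rng.normal(size=2)))
```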
8. An eyeball tracking calibration device, comprising:
the detection unit is used for detecting touch operation after the last calibration is finished;
the determining unit is used for determining a touch area corresponding to the touch operation;
the shooting unit is used for shooting a target human eye image corresponding to the touch area under the condition that the touch area is positioned in a target calibration area;
an extraction unit for extracting target eye feature data from the target eye image;
the model calculation unit is used for inputting the target eye feature data into an eyeball tracking model to obtain a target fixation point coordinate, wherein the eyeball tracking model is determined based on the last calibration;
the processing unit is used for taking the target eye feature data and the target fixation point coordinate as confidence point data and recording the confidence point data in a confidence data set under the condition that the difference degree between the target fixation point coordinate and the coordinate corresponding to the target calibration area is smaller than a first threshold value;
and the calibration unit is used for re-fitting the eyeball tracking model according to the confidence point data in the confidence data set under the condition that the number of the confidence point data in the confidence data set reaches a second threshold value to obtain the calibrated eyeball tracking model.
9. A mobile terminal comprising a processor and a memory, the memory for storing a computer program comprising program instructions, the processor being configured to invoke the program instructions to perform the method of any of claims 1 to 7.
10. A computer-readable storage medium, characterized in that the computer storage medium stores a computer program comprising program instructions that, when executed by a processor, cause the processor to carry out the method according to any one of claims 1 to 7.
CN201911416784.5A 2019-12-31 2019-12-31 Calibration and calibration method and device for eyeball tracking, mobile terminal and storage medium Pending CN113116291A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911416784.5A CN113116291A (en) 2019-12-31 2019-12-31 Calibration and calibration method and device for eyeball tracking, mobile terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911416784.5A CN113116291A (en) 2019-12-31 2019-12-31 Calibration and calibration method and device for eyeball tracking, mobile terminal and storage medium

Publications (1)

Publication Number Publication Date
CN113116291A (en) 2021-07-16

Family

ID=76769149

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911416784.5A Pending CN113116291A (en) 2019-12-31 2019-12-31 Calibration and calibration method and device for eyeball tracking, mobile terminal and storage medium

Country Status (1)

Country Link
CN (1) CN113116291A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102749991A (en) * 2012-04-12 2012-10-24 广东百泰科技有限公司 Non-contact free space eye-gaze tracking method suitable for man-machine interaction
US20160029883A1 (en) * 2013-03-28 2016-02-04 Eye Tracking Analysts Ltd Eye tracking calibration
US20180348861A1 (en) * 2017-05-31 2018-12-06 Magic Leap, Inc. Eye tracking calibration techniques
WO2019221724A1 (en) * 2018-05-16 2019-11-21 Tobii Ab Method to reliably detect correlations between gaze and stimuli
CN110502099A (en) * 2018-05-16 2019-11-26 托比股份公司 Reliably detect the associated method between watching attentively and stimulating
CN109947253A (en) * 2019-03-25 2019-06-28 京东方科技集团股份有限公司 The method for establishing model of eyeball tracking, eyeball tracking method, equipment, medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116661587A (en) * 2022-12-29 2023-08-29 荣耀终端有限公司 Eye movement data processing method and electronic equipment
CN116661587B (en) * 2022-12-29 2024-04-12 荣耀终端有限公司 Eye movement data processing method and electronic equipment

Similar Documents

Publication Publication Date Title
CN105933607B Photographing effect adjustment method for a mobile terminal, and mobile terminal
US10616475B2 (en) Photo-taking prompting method and apparatus, an apparatus and non-volatile computer storage medium
US10284817B2 (en) Device for and method of corneal imaging
US10048749B2 (en) Gaze detection offset for gaze tracking models
US10380421B2 (en) Iris recognition via plenoptic imaging
CN110706283B (en) Calibration method and device for sight tracking, mobile terminal and storage medium
US20160162673A1 (en) Technologies for learning body part geometry for use in biometric authentication
CN110807427B (en) Sight tracking method and device, computer equipment and storage medium
JP2016515242A (en) Method and apparatus for gazing point estimation without calibration
US10254831B2 (en) System and method for detecting a gaze of a viewer
WO2012137801A1 (en) Input device, input method, and computer program
Takemura et al. Estimation of a focused object using a corneal surface image for eye-based interaction
EP4095744A1 (en) Automatic iris capturing method and apparatus, computer-readable storage medium, and computer device
WO2019153927A1 (en) Screen display method, device having display screen, apparatus, and storage medium
CN111580665B (en) Method and device for predicting fixation point, mobile terminal and storage medium
CN110908511B (en) Method for triggering recalibration and related device
WO2018076172A1 (en) Image display method and terminal
CN113116291A (en) Calibration and calibration method and device for eyeball tracking, mobile terminal and storage medium
Schneider et al. Towards around-device interaction using corneal imaging
NL2004878C2 System and method for detecting a person's direction of interest, such as a person's gaze direction.
CA3091068A1 (en) Methods and systems for displaying a visual aid
CN113342157A (en) Eyeball tracking processing method and related device
US11973927B2 (en) Detecting eye tracking calibration errors
US20220021867A1 (en) Detecting Eye Tracking Calibration Errors
CN116489500A (en) Image processing method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination