CN113495613A - Eyeball tracking calibration method and device - Google Patents

Eyeball tracking calibration method and device

Info

Publication number
CN113495613A
Authority
CN
China
Prior art keywords
user
calibration
target
calibration coefficient
personal
Prior art date
Legal status
Granted
Application number
CN202010191024.5A
Other languages
Chinese (zh)
Other versions
CN113495613B (en)
Inventor
张朕
路伟成
Current Assignee
Beijing 7Invensun Technology Co Ltd
Original Assignee
Beijing 7Invensun Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing 7Invensun Technology Co Ltd
Priority to CN202010191024.5A (granted as CN113495613B)
Priority to PCT/CN2021/079596 (WO2021185110A1)
Priority to JP2022555765A (JP2023517380A)
Publication of CN113495613A
Application granted
Publication of CN113495613B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 Eye tracking input arrangements
    • G06F 18/00 Pattern recognition

Abstract

The invention discloses an eye tracking calibration method and device for completing calibration within a scene that uses gaze positioning interaction, without a separate calibration session. The method comprises: in one interactive operation, if the user selects a non-gaze positioning interaction mode, starting a background calibration process; in the background calibration process, collecting eye feature information of the user; acquiring the position coordinates obtained by positioning in the non-gaze positioning interaction mode as calibration point coordinates; and calculating the user's current personal calibration coefficient from the collected eye feature information and the calibration point coordinates. Thus, in the embodiments of the invention, if the user chooses the non-gaze positioning interaction mode for positioning, the background calibration process is started at the same time, and the position coordinates obtained by that positioning are used as calibration point coordinates in the background calibration process to calculate the personal calibration coefficient. The background calibration process of the embodiments of the invention is hidden from the user and does not require leaving the current scene.

Description

Eyeball tracking calibration method and device
Technical Field
The present invention relates to the field of eye tracking technologies, and in particular, to a method and an apparatus for calibrating eye tracking.
Background
With advances in science and technology, eye tracking has come into wide use. For example, gaze-based positioning interaction can be performed on terminal devices related to gaze positioning (gaze positioning devices for short), such as Virtual Reality (VR) devices, Augmented Reality (AR) devices, and eye-controlled tablet computers.
Because the physiological structure of the eyes differs somewhat from user to user, in the prior art a user usually has to go through a calibration session before using a gaze positioning device with an eye tracking function: the user's eye feature information is collected while the user fixes attention on one or more calibration points, and the user's personal calibration coefficient is then computed from the calibration point coordinates and the eye feature information. After calibration, a point-set screen is displayed so that the user can evaluate the calibration effect; if the user subjectively considers that the calibration effect cannot meet the gaze positioning requirement, recalibration is needed.
Only after calibration is completed can the user enter a scene that uses gaze for positioning. If, during use, the user finds that gaze positioning is inaccurate, or the relative position of the head-mounted display (head display) and the eyes changes, for example because the user adjusts the head display, the user has to quit the scene and enter the calibration session again, which is inconvenient.
Disclosure of Invention
In view of the above problems, the present invention provides an eye tracking calibration method and device that complete calibration within a scene that uses gaze positioning interaction, without requiring a separate calibration session.
To this end, the invention provides the following technical solutions:
an eye tracking calibration method, comprising:
in one interactive operation, if the interaction mode selected by the user is a non-gaze positioning interaction mode, starting a background calibration process;
in the background calibration process, collecting eye feature information of the user;
acquiring the position coordinates obtained by positioning in the non-gaze positioning interaction mode as calibration point coordinates; and
calculating the user's current personal calibration coefficient from the collected eye feature information and the calibration point coordinates.
Optionally, before starting the background calibration process, the method further includes: collecting eye feature information of the user; calculating the user's gaze point coordinates from the collected eye feature information and a target calibration coefficient, the target calibration coefficient being a system default calibration coefficient or a personal calibration coefficient associated with the user's user identifier; displaying an interaction effect on the functional area to which the gaze point coordinates belong; and if it is monitored that a non-gaze positioning interaction device is used and the device positions to a different functional area, determining that the interaction mode adopted by the current interactive operation is the non-gaze positioning mode.
Optionally, the user's user identifier is a target user identifier, and before the gaze point coordinates are calculated using the target calibration coefficient, the method further includes: if the target user identifier is associated with a personal calibration coefficient, determining that personal calibration coefficient as the target calibration coefficient; and if the target user identifier is not associated with a personal calibration coefficient, determining the system default calibration coefficient as the target calibration coefficient.
Optionally, after the current personal calibration coefficient is calculated, the method further includes: if the target user identifier is associated with a personal calibration coefficient, updating that coefficient to the current personal calibration coefficient; and if the target user identifier is not associated with a personal calibration coefficient, associating the current personal calibration coefficient with the target user identifier.
Optionally, user identifiers are bound to biometric features, and before the gaze point coordinates are calculated using the target calibration coefficient, the method further includes: acquiring a biometric feature of the user; matching the acquired biometric feature against the biometric features of established user identifiers; if the matching succeeds, determining the matched user identifier as the user's user identifier; and if the matching fails, establishing a user identifier for the user and binding it to the acquired biometric feature.
An eye tracking calibration device, comprising:
a background calibration starting unit, configured to: in one interactive operation, if the interaction mode selected by the user is a non-gaze positioning interaction mode, start a background calibration process;
a collection unit, configured to: in the background calibration process, collect eye feature information of the user;
a calibration unit, configured to:
acquire the position coordinates obtained by positioning in the non-gaze positioning interaction mode as calibration point coordinates; and
calculate the user's current personal calibration coefficient from the collected eye feature information and the calibration point coordinates.
Optionally, the device further includes: a calculating unit, configured to collect eye feature information of the user before the background calibration process is started, and calculate the user's gaze point coordinates from the collected eye feature information and a target calibration coefficient, the target calibration coefficient being a system default calibration coefficient or a personal calibration coefficient associated with the user's user identifier; a display unit, configured to display an interaction effect on the functional area to which the gaze point coordinates belong; and a monitoring unit, configured to determine that the interaction mode adopted by the current interactive operation is the non-gaze positioning mode if it is monitored that a non-gaze positioning interaction device is used and the device positions to a different functional area.
Optionally, the user's user identifier is a target user identifier, and before the gaze point coordinates are calculated using the target calibration coefficient, the calculating unit is further configured to: if the target user identifier is associated with a personal calibration coefficient, determine that personal calibration coefficient as the target calibration coefficient; and if the target user identifier is not associated with a personal calibration coefficient, determine the system default calibration coefficient as the target calibration coefficient.
Optionally, after the current personal calibration coefficient is calculated, the calibration unit is further configured to: if the target user identifier is associated with a personal calibration coefficient, update that coefficient with the current personal calibration coefficient; and if the target user identifier is not associated with a personal calibration coefficient, associate the current personal calibration coefficient with the target user identifier.
Optionally, user identifiers are bound to biometric features, and the device further includes a biometric unit configured to: acquire a biometric feature of the user before the gaze point coordinates are calculated using the target calibration coefficient; match the acquired biometric feature against the biometric features of established user identifiers; if the matching succeeds, determine the matched user identifier as the user's user identifier; and if the matching fails, establish a user identifier for the user and bind it to the acquired biometric feature.
An eye tracking calibration device, comprising: a processor, a memory, and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the eye tracking calibration method according to any one of the above when executing the program.
A storage medium having stored thereon computer-executable instructions which, when loaded and executed by a processor, implement the steps of the eye tracking calibration method according to any one of the above.
Therefore, in the embodiments of the invention, in a scene that uses eye tracking for interaction, if the user chooses the non-gaze positioning interaction mode for positioning, the background calibration process is started at the same time, and the position coordinates obtained by that positioning are used as calibration point coordinates in the background calibration process to calculate the personal calibration coefficient. The background calibration process is hidden from the user, the user does not perceive it, and, unlike the prior art, there is no need to quit the current scene for calibration, which saves calibration time and improves the user experience.
Drawings
To illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only embodiments of the present invention, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is an exemplary flowchart of an eyeball tracking calibration method according to an embodiment of the present invention;
fig. 2a is a schematic diagram of obtaining coordinates of a gaze point by using a gaze positioning interaction manner according to an embodiment of the present invention;
fig. 2b is a schematic diagram illustrating an expansion of a functional area to which a fixation point coordinate belongs according to an embodiment of the present invention;
fig. 3 is another exemplary flowchart of an eye tracking calibration method according to an embodiment of the present invention;
fig. 4 is a flowchart illustrating an eyeball tracking calibration method according to another embodiment of the present invention;
fig. 5 is a flowchart illustrating an eyeball tracking calibration method according to another embodiment of the present invention;
fig. 6 is a flowchart illustrating an eyeball tracking calibration method according to another embodiment of the present invention;
fig. 7 is an exemplary structure of an eye tracking calibration apparatus according to an embodiment of the present invention;
fig. 8 is another exemplary structure of an eye tracking calibration apparatus according to an embodiment of the present invention;
fig. 9 is a further exemplary structure of the eyeball tracking calibration apparatus according to the embodiment of the invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first", "second" and the like in the description, the claims and the drawings of the present invention are used to distinguish different objects, not to describe a particular order. Furthermore, the terms "comprising" and "having", and any variations thereof, are intended to cover non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to the listed steps or elements, but may include steps or elements not listed.
The eye tracking calibration method and device provided by the embodiments of the invention are applied in the field of eye tracking, which may also be called gaze tracking: a technology for estimating the gaze direction and/or gaze point of the eyes by measuring eye movement.
The eye tracking calibration device provided by the embodiments of the invention can be deployed in terminal devices related to gaze positioning (gaze positioning devices for short), such as VR systems, AR devices, and eye-controlled tablet computers.
A VR system generally comprises several devices, for example: a VR all-in-one headset plus a handle/remote controller; a PC plus a VR head-mounted display (VR software on the PC communicates with the head display) plus a handle/remote controller/mouse; or a smart mobile terminal plus a head-mounted display (VR software on the mobile terminal communicates with the head display).
The eye tracking calibration device may also be deployed in an eye-controlled tablet computer or the like. To collect the user's eye image, the tablet may additionally be equipped with an infrared light source and a camera (such as an environment camera or an infrared camera) for capturing eye images. An eye image here refers to an image that includes the eyes, such as a frontal or side head portrait, or an image containing only the eyes.
Alternatively, the eye tracking calibration device may be deployed on the VR/AR head-mounted display described above. A gaze tracking device (or eye tracking device) is also installed on the head-mounted display. Besides the combination of a camera for capturing eye images and an infrared light source, other gaze tracking devices may be used. For example, a MEMS (micro-electro-mechanical system) device comprising a MEMS infrared scanning mirror, an infrared light source and an infrared receiver can detect eye movement from the captured eye images. As another example, the gaze tracking device may be a capacitive sensor that detects eye movement through the capacitance between the eye and a capacitive plate. As yet another example, it may be a myoelectric detector that places electrodes at the bridge of the nose, the forehead, the ears or the earlobes and detects eye movement from the pattern of myoelectric signals.
Embodiments of the present invention will be described in further detail below based on the common aspects of the invention referred to in the above description.
Example one
The existing calibration approach has the following problem: if, in a scene that uses gaze positioning interaction, the user finds that gaze positioning is inaccurate, or the relative position of the head display and the eyes changes, for example because the user adjusts the head display, the user has to quit the scene and recalibrate.
To address this, the first embodiment of the present application provides an eye tracking calibration method that completes calibration within the scene that uses gaze positioning interaction, without a separate calibration session.
Referring to fig. 1, the eyeball tracking calibration method includes:
S0: the eye tracking function is activated.
In one example, the eye tracking functionality of the gaze location device is turned on by default.
Of course, the eye tracking function may be activated according to the user's operation.
S1: in one interactive operation, if the interaction mode selected by the user is a non-gaze positioning interaction mode, start a background calibration process.
One interactive operation may refer to the following: the user inputs information and operation commands through the input/output system, the system receives and processes them, and the processing result is displayed through the input/output system.
The user can then input further information and operation commands according to the processing result.
Taking a VR scene as an example, the gaze point coordinates are generally obtained by default using the gaze positioning interaction mode (see fig. 2a).
The functional area to which the gaze point coordinates belong is then given an interaction effect such as enlarging (see fig. 2b), changing color, or brightening. Functional areas here include, but are not limited to: icon controls, spatial regions, virtual objects (for example, in a VR game, selecting a virtual object to grab or walking to a designated spatial region through gaze positioning), and the like.
The aforementioned mouse, remote controller, handle, and even a keyboard on the PC side can also be used for human-computer interaction. In contrast to the gaze positioning interaction mode, these can be collectively referred to as non-gaze positioning interaction modes, and the mouse, remote controller, handle, keyboard and the like can be collectively referred to as non-gaze positioning interaction devices.
If the user uses a non-gaze positioning interaction mode in one interactive operation, a background calibration process is entered; this process is hidden from the user and the user does not perceive it.
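The gaze-driven part of such an interaction reduces to a hit test: find which functional area contains the gaze point and apply the interaction effect to it. The following minimal sketch illustrates the idea; the rectangular regions, their names and the print statement standing in for the highlight effect are all made up for illustration and are not part of the patent's disclosure.

```python
from dataclasses import dataclass
from typing import Iterable, Optional, Tuple

@dataclass
class FunctionalRegion:
    name: str
    x: float       # top-left corner, screen coordinates
    y: float
    width: float
    height: float

    def contains(self, px: float, py: float) -> bool:
        # Rectangular hit test for the gaze point.
        return self.x <= px <= self.x + self.width and self.y <= py <= self.y + self.height

def region_under_gaze(regions: Iterable[FunctionalRegion],
                      gaze_xy: Tuple[float, float]) -> Optional[FunctionalRegion]:
    """Return the functional region containing the gaze point, or None."""
    px, py = gaze_xy
    for region in regions:
        if region.contains(px, py):
            return region
    return None

# Usage: whichever region the gaze falls in would be enlarged, recolored or brightened.
regions = [FunctionalRegion("menu_icon", 100, 80, 64, 64),
           FunctionalRegion("exit_icon", 300, 80, 64, 64)]
hit = region_under_gaze(regions, (120.5, 96.0))
if hit is not None:
    print(f"highlight {hit.name}")   # the real interaction effect is device/UI specific
```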
S2: in the background calibration process, the eye feature information of the user is acquired.
In one example, the eye characteristic information may include pupil position coordinates, pupil shape, iris position, iris shape, eyelid position, canthus position, position coordinates of corneal reflection spots, and the like.
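To make this concrete, the sketch below shows one very rough way a single such feature, the pupil center, might be estimated from a grayscale eye image (as the centroid of the darkest pixels). This is only an illustration under simplifying assumptions; real eye tracking devices use far more robust detectors (ellipse fitting, corneal glint localization and the like), and the threshold value here is arbitrary.

```python
import numpy as np

def estimate_pupil_center(eye_image: np.ndarray, dark_threshold: int = 40):
    """Very rough pupil-center estimate: centroid of the darkest pixels.

    eye_image is assumed to be a grayscale (H, W) uint8 array; real systems use
    far more robust detectors (ellipse fitting, corneal glint detection, etc.).
    """
    mask = eye_image < dark_threshold            # the pupil is usually the darkest region
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None                              # no pupil candidate found
    return float(xs.mean()), float(ys.mean())    # (x, y) in image coordinates

# Demo on a synthetic image with a dark disc standing in for the pupil.
img = np.full((120, 160), 200, dtype=np.uint8)
yy, xx = np.mgrid[0:120, 0:160]
img[(xx - 90) ** 2 + (yy - 60) ** 2 < 15 ** 2] = 10
print(estimate_pupil_center(img))                # roughly (90.0, 60.0)
```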
Specifically, if the eye tracking calibration device is deployed in a mobile terminal device, an eye image can be captured by the camera of the mobile terminal device and processed to obtain the eye feature information.
If the eye tracking calibration device is deployed on a VR/AR head-mounted display, the eye image can be captured by the gaze tracking device on the head-mounted display and processed to obtain the eye feature information.
S3: acquire the position coordinates obtained by positioning in the non-gaze positioning interaction mode as the calibration point coordinates.
Taking the handle as an example, if the user moves the cursor to a certain position by using the handle and clicks the confirmation, the position coordinate obtained at this time is the coordinate of the calibration point.
Unlike the prior art, the embodiments of the invention display no fixed calibration points and do not require the user to concentrate the gaze on fixed calibration points. When the user operates a non-gaze positioning interaction device such as a handle or a mouse, the eyes reach the target position first, so the position coordinates obtained by that positioning are points the user is actually gazing at.
S4: calculate the user's current personal calibration coefficient from the collected eye feature information and the calibration point coordinates.
The personal calibration coefficients are parameters used for calculating the final result of the sight line in the sight line estimation algorithm, and are related to data such as the pupil radius, the corneal curvature, the angle difference between the visual axis and the optical axis of the user and the like.
The personal calibration coefficient is calculated as follows: from the collected eye feature information and the calibration point coordinates, the personal calibration coefficient can be solved inversely using the gaze estimation algorithm. Once the current personal calibration coefficient is obtained, it can overwrite the previous personal calibration coefficient or replace the default calibration coefficient, making eye tracking more accurate.
Therefore, in the embodiments of the invention, in a scene that uses eye tracking for interaction, if the user chooses the non-gaze positioning interaction mode for positioning, the background calibration process is started at the same time, and the position coordinates obtained by that positioning are used as calibration point coordinates to calculate the personal calibration coefficient. The background calibration process can continuously update the user's personal calibration coefficient, making eye tracking more accurate. In addition, the calibration is hidden from the user, the user does not perceive it, and there is no need to quit the current scene for calibration as in the prior art, which saves calibration time and improves the user experience.
Example two
Eye tracking systems generally have default calibration coefficients. The default calibration coefficients are coefficients that give relatively good accuracy for most people.
Of course, because of individual differences such as the radius of the user's eyeball, the default calibration coefficients may still be inaccurate for a particular user, which is exactly why calibration is performed to obtain personal calibration coefficients.
The target calibration coefficient is defined as either the system default calibration coefficient or the personal calibration coefficient associated with the user's user identifier.
Initially, the target calibration coefficient is the system default calibration coefficient; after at least one background calibration, it is updated to the latest personal calibration coefficient.
This embodiment describes an exemplary process of eye tracking calibration based on the target calibration coefficient, in a scene that adopts the gaze positioning interaction mode by default. Referring to fig. 3, the process may include:
S31: acquire the eye feature information of the user, and calculate the user's gaze point coordinates from the acquired eye feature information and the target calibration coefficient.
The obtaining manner can be seen in step S2 of the first embodiment.
There are various eye tracking algorithms for calculating the gaze point coordinates. For example, an eye tracking algorithm based on pupil position and shape calculates the gaze information from the direction of the pupil's principal axis and the pupil position.
Another example is the feature vector fitting method, which works as follows:
extract the pupil center position, the left eye corner center position and the right eye corner center position;
subtract the left eye corner center position from the pupil center position to obtain feature vector A;
subtract the right eye corner center position from the pupil center position to obtain feature vector B;
construct a mapping function (a set of equations) between vector A, vector B and the gaze point;
given feature vectors A and B and the known gaze information, perform a polynomial fit to obtain the unknown coefficients in the mapping function (solving the unknown coefficients is done during calibration);
once the unknown coefficients are obtained, feed the currently extracted feature vectors into the mapping function to obtain the current gaze point (this is the tracking process).
Specifically, the feature data may include: pupil location, pupil shape, iris location, iris shape, eyelid location, canthus location, spot location, and the like.
Different eye tracking algorithms may require different kinds of feature data. For example, the aforementioned algorithm based on pupil position and shape needs feature data including the pupil's principal axis direction, the pupil position, the pupil major-axis length, the pupil minor-axis length, and the like.
For another example, in the aforementioned feature vector fitting method, the feature data to be extracted includes the pupil center position, the left eye corner center position, and the right eye corner center position.
Since the eye tracking algorithm is of many kinds, it is not described in detail herein.
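A compact sketch of the feature vector fitting method described above is given below. The second-order polynomial basis and the sample data are arbitrary choices made for illustration; a real gaze estimation algorithm may use different basis functions and feature sets.

```python
import numpy as np

def feature_vectors(pupil_center, left_corner, right_corner):
    """Build the feature vectors described above: A = pupil - left corner, B = pupil - right corner."""
    p = np.asarray(pupil_center, float)
    a = p - np.asarray(left_corner, float)
    b = p - np.asarray(right_corner, float)
    return np.concatenate([a, b])                      # 4-dimensional feature

def design_row(f):
    # A simple second-order polynomial basis over the feature vector (one of many possible mapping functions).
    ax, ay, bx, by = f
    return np.array([1.0, ax, ay, bx, by, ax * ay, bx * by, ax ** 2, ay ** 2])

def fit_mapping(features, gaze_points):
    """Calibration step: solve the unknown coefficients of the mapping function by least squares."""
    X = np.vstack([design_row(f) for f in features])
    Y = np.asarray(gaze_points, float)
    coeffs, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return coeffs

def predict_gaze(coeffs, f):
    """Tracking step: feed the current feature vector into the fitted mapping."""
    return design_row(f) @ coeffs

# Hypothetical samples: (pupil, left corner, right corner) plus the known gaze point for each.
samples = [((52, 40), (30, 42), (78, 41), (320, 240)),
           ((58, 38), (31, 43), (79, 40), (900, 210)),
           ((50, 46), (29, 41), (77, 42), (260, 600)),
           ((60, 44), (30, 42), (78, 41), (1000, 560)),
           ((55, 42), (31, 42), (79, 41), (640, 400)),
           ((48, 39), (30, 43), (78, 42), (150, 180)),
           ((57, 47), (29, 42), (77, 41), (880, 640)),
           ((53, 36), (30, 41), (78, 40), (400, 120)),
           ((49, 44), (31, 42), (79, 41), (200, 500))]
feats = [feature_vectors(p, l, r) for p, l, r, _ in samples]
coeffs = fit_mapping(feats, [g for *_, g in samples])
print(predict_gaze(coeffs, feats[0]))                  # close to (320, 240) on training data
```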
It should be noted that if the user is a new user, or the user's user identifier is not associated with a personal calibration coefficient, the target calibration coefficient is the system default calibration coefficient.
S32: display the interaction effect on the functional area to which the gaze point coordinates belong.
Specifically, the functional area to which the gazing point coordinate belongs may be displayed with an interactive effect such as enlarging (see fig. 2b), changing color, and brightening. The functional area can be an icon control, a space area, a virtual object and the like.
Steps S31 to S32 constitute one interactive operation.
If the user considers the functional area showing the interaction effect to be the intended one, the user will subsequently confirm entering that functional area, for example by continued gazing or pressing a key.
If the user considers the functional area showing the interaction effect not to be the intended one (which indicates that the accuracy of the gaze point calculation cannot meet the usage requirement), the user can subsequently use a non-gaze positioning interaction device such as a mouse, remote controller, handle or even a PC keyboard to move the focus on the screen and reposition.
S33: judge whether the background calibration start condition is met; if so, go to S34, otherwise go to S31.
The background calibration start condition may be: it is monitored that a non-gaze positioning interaction device is used, and the device positions to a functional area other than the one selected by gaze.
When the start condition is met, the interaction mode adopted by the current interactive operation is determined to be the non-gaze positioning mode, the background calibration process is executed subsequently, and the latest personal calibration coefficient obtained is assigned to the target calibration coefficient.
If the condition is not met, the current gaze positioning accuracy satisfies the user's requirement, the target calibration coefficient does not need to be changed, and the process returns directly to step S31.
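The start condition itself reduces to a small predicate, roughly as sketched below; the region names are hypothetical and only illustrate the comparison between the gaze-selected area and the area selected by the non-gaze device.

```python
from typing import Optional

def should_start_background_calibration(non_gaze_device_used: bool,
                                        gaze_region: Optional[str],
                                        device_region: Optional[str]) -> bool:
    """Start condition: a non-gaze positioning device was used AND it positioned to a
    functional region different from the one the gaze point had selected."""
    return non_gaze_device_used and device_region is not None and device_region != gaze_region

# Gaze picked "menu_icon" but the handle confirmed "exit_icon": calibrate in the background.
print(should_start_background_calibration(True, "menu_icon", "exit_icon"))   # True
print(should_start_background_calibration(True, "menu_icon", "menu_icon"))   # False: gaze was accurate enough
print(should_start_background_calibration(False, "menu_icon", None))         # False: gaze interaction only
```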
S34: start the background calibration process, and calculate the user's current personal calibration coefficient from the acquired eye feature information and the calibration point coordinates.
For a detailed description, please refer to the aforementioned S2-S4, which are not described herein.
S35: update the target calibration coefficient with the calculated personal calibration coefficient, and return to S31.
After positioning with the non-gaze positioning interaction device, the flow returns to the gaze positioning interaction mode.
In this embodiment, in each interactive operation based on gaze positioning, the gaze point coordinates are calculated using the default calibration coefficients or the personal calibration coefficients associated with the user, and the functional area to which the gaze point coordinates belong is highlighted.
If the highlighted functional area is the functional area expected by the user, the user can confirm to enter the functional area by means of continuous watching, pressing keys and the like.
If the highlighted functional area is not the one the user expects, the user can reposition with a non-gaze positioning interaction device such as a handle, and background calibration is started at the same time. Thus, besides completing the human-computer interaction itself, this embodiment also determines whether to start background calibration.
Example three
The second embodiment mentioned the user identifier of the user. For ease of reference, the user identifier of the current user may be referred to as the target user identifier.
Before the target calibration coefficient is used to calculate the user's gaze point coordinates, the method may further include the following steps:
if the target user identification is associated with the personal calibration coefficient, determining the personal calibration coefficient associated with the target user identification as the target calibration coefficient;
and if the user identification of the user is not related to the personal calibration coefficient, determining the default calibration coefficient of the system as the target calibration coefficient.
Furthermore, after calculating the current personal calibration coefficient, the following operations can be performed:
if the target user identification is associated with the personal calibration coefficient, updating the personal calibration coefficient associated with the target user identification into the current personal calibration coefficient;
and if the target user identifier is not associated with the personal calibration coefficient, associating the current personal calibration coefficient with the target user identifier.
In a third embodiment, how to perform the eye tracking calibration when the initial target user identifier is not associated with the personal calibration coefficient will be described, please refer to fig. 4, which may exemplarily include:
S41: the target user identifier is not associated with a personal calibration coefficient, so the system default calibration coefficient is determined as the target calibration coefficient.
S42-S45 are similar to S31-S34, and are not repeated herein.
S46: associate the calculated current personal calibration coefficient with the target user identifier.
S47: update the target calibration coefficient with the calculated personal calibration coefficient, and return to S41.
After positioning with the non-gaze positioning interaction device, the flow returns to the gaze positioning interaction mode.
This embodiment achieves the following: when the user is not yet associated with a personal calibration coefficient, the default calibration coefficient is determined as the target calibration coefficient to calculate the gaze point coordinates, and the functional area to which the gaze point coordinates belong is highlighted. If the highlighted functional area is not the one the user expects, the user can reposition with a non-gaze positioning interaction device such as a handle, and background calibration is started at the same time.
The personal calibration coefficient obtained by background calibration is associated with the target user identifier and used in the next interactive operation, so the next time the gaze positioning mode is used, the gaze point coordinates are calculated with the most recently computed personal calibration coefficient. This realizes the transition from using the default calibration coefficient to using a more accurate personal calibration coefficient, and thus provides the user with a more personalized and accurate gaze positioning service.
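The association and update rules of this and the following embodiments amount to a per-user lookup table with a fallback to the system default, roughly as sketched below; the coefficient values and identifier format are placeholders, not values defined by the patent.

```python
class CalibrationRegistry:
    """Keeps personal calibration coefficients per user identifier (a sketch;
    the coefficient type and the default value are placeholders)."""
    def __init__(self, system_default):
        self.system_default = system_default
        self.personal = {}                      # user identifier -> personal coefficient

    def target_coefficient(self, user_id):
        # Use the personal coefficient if the identifier is associated with one,
        # otherwise fall back to the system default.
        return self.personal.get(user_id, self.system_default)

    def store_current(self, user_id, coefficient):
        # Either updates the associated coefficient or creates the association.
        self.personal[user_id] = coefficient

registry = CalibrationRegistry(system_default="DEFAULT_COEFFICIENT")
print(registry.target_coefficient("000010"))     # system default: no personal coefficient yet
registry.store_current("000010", "PERSONAL_COEFFICIENT_V1")
print(registry.target_coefficient("000010"))     # now the user's own coefficient
```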
Example four
The fourth embodiment describes eye tracking calibration when the initial target user identifier is already associated with a personal calibration coefficient. Referring to fig. 5, the process may exemplarily include:
S51: the target user identifier is associated with a personal calibration coefficient, and that personal calibration coefficient is determined as the target calibration coefficient.
S52-S55 are similar to S31-S34, and are not repeated herein.
S56: update the personal calibration coefficient associated with the target user identifier to the current personal calibration coefficient.
S57: update the target calibration coefficient with the calculated personal calibration coefficient, and return to S51.
After positioning with the non-gaze positioning interaction device, the flow returns to the gaze positioning interaction mode.
The embodiment can realize that: when the user is associated with the personal calibration coefficient, the personal calibration coefficient associated with the user is determined as the target calibration coefficient, so that more personalized and accurate sight line positioning service can be provided for the user.
In addition, after the background calibration is started, the calculated personal calibration coefficient is associated with the target user identifier, and the personal calibration coefficient is used in the next interactive operation, so that the personal calibration coefficient can be updated in a self-adaptive manner, and further more personalized and accurate sight line positioning service can be provided for a user.
EXAMPLE five
By default, it may be assumed that the same person uses a given gaze positioning device. Alternatively, considering that several users may share the same device, different users can be distinguished by biometric recognition or other techniques.
In specific implementation, the user identifier and the biometric feature can be bound, different biometric features correspond to different user identifiers, and different user identifiers represent different users, so that different users can be distinguished.
Referring to fig. 6, an exemplary eye tracking calibration method based on biometric features includes the following steps:
S60: acquire the biometric feature of the current user.
Exemplary biometric features here include the iris, fingerprint, voiceprint, or even facial features.
S61: match the acquired biometric feature against the biometric features of established user identifiers.
S62: if the matching succeeds, determine the matched user identifier as the user identifier of the current user (the target user identifier).
For example, if the matched user identifier is 000010, "000010" is determined as the user identifier of the current user.
S63: if the matching fails, establish a user identifier (target user identifier) for the user and bind the established user identifier to the acquired biometric feature.
The user identifier matched in step S62, or the user identifier established in step S63, is the target user identifier of the foregoing embodiments.
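A minimal sketch of this matching flow is shown below; cosine similarity over made-up feature vectors stands in for a real iris or fingerprint matcher, and the threshold and identifier format are assumptions made only for the example.

```python
import numpy as np

class UserDirectory:
    """Maps biometric feature vectors to user identifiers (sketch only)."""
    def __init__(self, match_threshold: float = 0.95):
        self.match_threshold = match_threshold
        self.templates = {}                     # user identifier -> stored feature vector
        self._next_id = 10

    def _similarity(self, a, b):
        a, b = np.asarray(a, float), np.asarray(b, float)
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

    def identify_or_enroll(self, feature):
        # S61-S62: match against the biometric features of established identifiers.
        best_id, best_sim = None, 0.0
        for user_id, template in self.templates.items():
            sim = self._similarity(feature, template)
            if sim > best_sim:
                best_id, best_sim = user_id, sim
        if best_id is not None and best_sim >= self.match_threshold:
            return best_id                      # matching succeeded: reuse that identifier
        # S63: matching failed, establish a new identifier bound to this feature.
        new_id = f"{self._next_id:06d}"
        self._next_id += 1
        self.templates[new_id] = np.asarray(feature, float)
        return new_id

directory = UserDirectory()
member_a = directory.identify_or_enroll([0.9, 0.1, 0.3])    # new user: gets e.g. "000010"
member_b = directory.identify_or_enroll([0.1, 0.8, 0.5])    # different biometrics: new identifier
same_a   = directory.identify_or_enroll([0.9, 0.11, 0.29])  # close to member A: same identifier again
print(member_a, member_b, same_a == member_a)
```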
S64: judge whether the target user identifier is associated with a personal calibration coefficient; if so, go to S65, otherwise go to S66.
S65: determine the personal calibration coefficient associated with the target user identifier as the target calibration coefficient.
S66: determine the system default calibration coefficient as the target calibration coefficient.
S67: calculate the user's gaze point coordinates using the target calibration coefficient.
Specifically, the eye feature information of the user is acquired first, and the fixation point coordinates of the user are calculated according to the acquired eye feature information and the target calibration coefficient.
For a detailed description, refer to the above description, and are not repeated herein.
S68-S610 are similar to S32-S34 and are not repeated here.
S611: if the target user identifier is associated with a personal calibration coefficient, update the personal calibration coefficient associated with the target user identifier to the current personal calibration coefficient.
S612: if the target user identifier is not associated with a personal calibration coefficient, associate the current personal calibration coefficient with the target user identifier.
S613: update the target calibration coefficient with the calculated personal calibration coefficient, and return to S67.
In this embodiment, after the user's biometric feature (e.g. the iris) is extracted, it is matched against the biometric features of established user identifiers to determine whether the user is new. If it matches an established user identifier, that identifier is determined as the user identifier of the current user, and in the interactive operation the gaze point coordinates are calculated with the personal calibration coefficient of the matched identifier.
If the user is a new user, a user identifier is established for the new user, and the system default calibration coefficient is used to calculate the gaze point coordinates in the first interactive operation.
Distinguishing users by biometric features helps provide personalized and more accurate gaze positioning services for different users. Consider the following scenario:
suppose a mobile device is shared by several family members. Member A performs interactive operations first, and a personal calibration coefficient is calculated for A. Later, member B uses the same device. If users are not distinguished, B directly uses A's personal calibration coefficient, and positioning may be inaccurate; if background calibration is then triggered, the newly calculated personal calibration coefficient is actually B's.
If member A then uses the device again, without user distinction the gaze point coordinates would be calculated with B's personal calibration coefficient, and positioning would still be inaccurate.
If different users are distinguished by biometric recognition, their personal calibration coefficients are not mixed together, and personalized gaze positioning services can be provided for each of them.
EXAMPLE six
The present embodiment provides an eyeball-tracking calibration apparatus, referring to fig. 7, the apparatus comprising:
the background calibration starting unit 1 is used for starting a background calibration flow of the calibration unit 3 if the interaction mode selected by the user is a non-sight line positioning interaction mode in one interaction operation;
the acquisition unit 2 is used for acquiring the eye characteristic information of the user in a background calibration process;
in the background calibration procedure, the calibration unit 3 is configured to:
acquiring position coordinates obtained by positioning in the non-sight positioning interaction mode as calibration point coordinates;
and calculating to obtain the current personal calibration coefficient of the user according to the acquired eye feature information and the calibration point coordinates.
The eyeball tracking calibration device related to the embodiment of the invention can be deployed in terminal equipment (sight line positioning equipment for short) related to sight line positioning, such as VR systems, AR equipment, eye control tablet computers and the like, and executes the eyeball tracking calibration method.
For details and advantages, please refer to the above description, which is not repeated herein.
In other embodiments, referring to fig. 8, the eyeball tracking calibration device in all the embodiments may further include:
the calculating unit 4 is configured to, before starting the background calibration process, acquire eye feature information of the user, and calculate a gaze point coordinate of the user according to the acquired eye feature information and the target calibration coefficient;
wherein the target calibration coefficient includes: the system default calibration coefficient, or a personal calibration coefficient associated with the user's user identifier.
The display unit 5 is configured to display the interaction effect on the functional area to which the gaze point coordinates belong.
The monitoring unit 6 is configured to determine that the interaction mode adopted by the current interactive operation is the non-gaze positioning mode if it is monitored that a non-gaze positioning interaction device is used and the device positions to a different functional area.
For details and advantages, please refer to the above description, which is not repeated herein.
The user identifier of the user may be referred to as the target user identifier.
In other embodiments of the present invention, after calculating the current personal calibration coefficient, the calibration unit 3 in all the above embodiments may further be configured to:
if the target user identification is associated with the personal calibration coefficient, updating the personal calibration coefficient associated with the target user identification by using the current personal calibration coefficient;
and if the target user identifier is not associated with the personal calibration coefficient, associating the current personal calibration coefficient with the target user identifier.
For details and advantages, please refer to the above description, which is not repeated herein.
In other embodiments of the present invention, before the gaze point coordinates of the user are calculated using the target calibration coefficient, the calculating unit 4 in all the above embodiments may further be configured to:
if the target user identifier is associated with a personal calibration coefficient, determine the personal calibration coefficient associated with the target user identifier as the target calibration coefficient;
and if the target user identifier is not associated with a personal calibration coefficient, determine the system default calibration coefficient as the target calibration coefficient.
For details and advantages, please refer to the above description, which is not repeated herein.
The user identification may be bound to a biometric feature. In another embodiment of the present invention, referring to fig. 9, the apparatus may further include:
a biometric unit 7 for:
acquiring biological identification characteristics of a user before a target calibration coefficient is used for calculating and obtaining the fixation point coordinates of the user;
matching the acquired biological identification features with the biological identification features of the established user identification;
if the matching is successful, determining the successfully matched user identifier as the user identifier of the user;
and if the matching fails, establishing a user identifier for the user, and binding the established user identifier with the acquired biological identification features.
For details and advantages, please refer to the above description, which is not repeated herein.
EXAMPLE seven
The embodiment of the present invention provides a storage medium, where computer-executable instructions are stored, and when the computer-executable instructions are loaded and executed by a processor, the steps of the eyeball tracking calibration method according to any one of the first to sixth embodiments are implemented.
Example eight
An embodiment of the present invention provides a processor, where the processor is configured to execute a program, where the program executes the steps of the eyeball tracking calibration method according to any one of the first to sixth embodiments when the program is executed.
Example nine
An eye tracking calibration device provided by an embodiment of the invention comprises a processor, a memory, and a program stored on the memory and executable on the processor; when the processor executes the program, the steps of the eye tracking calibration method of any one of embodiments one to six can be performed.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). The memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The above are merely examples of the present application and are not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (10)

1. An eye tracking calibration method, comprising:
in one interactive operation, if the interactive mode selected by the user is a non-sight line positioning interactive mode, starting a background calibration process;
in the background calibration process, acquiring the eye characteristic information of the user;
acquiring position coordinates obtained by adopting the non-sight positioning interaction mode for positioning as calibration point coordinates;
and calculating to obtain the current personal calibration coefficient of the user according to the acquired eye feature information and the calibration point coordinates.
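To make the calculation in claim 1 concrete, the following is a minimal sketch and not the patented implementation: it assumes the eye feature information is a pupil-glint offset vector per sample, that the calibration point coordinates come from the non-gaze interaction (for example a mouse click or controller selection), and that the personal calibration coefficient is the coefficient matrix of a second-order polynomial mapping fitted by least squares. All function and variable names are invented for illustration.

```python
import numpy as np

def feature_terms(v):
    """Second-order polynomial terms of a pupil-glint offset vector (vx, vy)."""
    vx, vy = v
    return np.array([1.0, vx, vy, vx * vy, vx * vx, vy * vy])

def fit_personal_calibration(eye_features, calibration_points):
    """Fit a personal calibration coefficient matrix by least squares.

    eye_features       : list of (vx, vy) vectors collected in the background
                         calibration process
    calibration_points : list of (x, y) screen coordinates obtained by the
                         non-gaze positioning
    Returns a 6x2 coefficient matrix mapping feature terms to screen coordinates.
    """
    A = np.stack([feature_terms(v) for v in eye_features])   # N x 6 design matrix
    B = np.asarray(calibration_points, dtype=float)           # N x 2 target coordinates
    coeffs, *_ = np.linalg.lstsq(A, B, rcond=None)            # 6 x 2 coefficient matrix
    return coeffs

def estimate_gaze_point(eye_feature, coeffs):
    """Map one eye feature sample to a gaze point using a calibration coefficient."""
    return feature_terms(eye_feature) @ coeffs                 # (x, y)
```

Under these assumptions, each non-gaze selection contributes one (eye feature, calibration point) pair, so the fit can be refreshed silently as samples accumulate, without a dedicated calibration screen.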
2. The method of claim 1,
before starting the background calibration process, the method further comprises:
acquiring eye feature information of a user;
calculating to obtain the fixation point coordinate of the user according to the obtained eye feature information and the target calibration coefficient; the target calibration coefficients include: a system default calibration factor, or, a personal calibration factor associated with the user identification of the user;
displaying the interaction effect of the functional area to which the fixation point coordinate belongs;
and if it is monitored that non-sight line positioning interaction equipment is used, determining, according to the fact that the non-sight line positioning interaction equipment is positioned to another functional area, that the interaction mode adopted by the current interaction operation is the non-sight line positioning mode.
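The trigger condition in claim 2 — the gaze-based interaction effect is shown first, and the operation is reclassified as non-gaze positioning when a non-gaze device is used and lands in a different functional area — might be organised roughly as below. This is an illustrative sketch only; the rectangular functional areas, the `highlight` and `start_background_calibration` hooks, and the return labels are assumptions, not elements of the claim.

```python
def region_of(point, regions):
    """Return the id of the functional area containing a point, or None."""
    x, y = point
    for rid, (left, top, right, bottom) in regions.items():
        if left <= x <= right and top <= y <= bottom:
            return rid
    return None

def highlight(region_id):
    """Placeholder: display the interaction effect of the gazed functional area."""
    print(f"highlight functional area {region_id}")

def start_background_calibration(calibration_point):
    """Placeholder hook: begin collecting eye features against this calibration point."""
    print(f"background calibration sample at {calibration_point}")

def classify_interaction(gaze_xy, device_xy, regions):
    """Classify one interactive operation.

    gaze_xy   : gaze point computed from eye features and the target calibration coefficient
    device_xy : position reported by a non-gaze device (mouse, controller), or None if unused
    regions   : {area_id: (left, top, right, bottom)} functional areas on screen
    """
    gaze_area = region_of(gaze_xy, regions)
    highlight(gaze_area)

    if device_xy is not None and region_of(device_xy, regions) != gaze_area:
        # Non-gaze equipment was used and positioned to another functional area:
        # treat the operation as non-gaze positioning and calibrate in the background.
        start_background_calibration(device_xy)
        return "non_gaze_positioning"
    return "gaze_positioning"
```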
3. The method of claim 2,
the user identification of the user is a target user identification;
before the step of calculating the gaze point coordinates of the user by using the target calibration coefficients, the method further comprises:
if the target user identification is associated with the personal calibration coefficient, determining the personal calibration coefficient associated with the target user identification as the target calibration coefficient;
and if the target user identification is not associated with the personal calibration coefficient, determining the default calibration coefficient of the system as the target calibration coefficient.
4. The method of claim 3, after calculating the current personal calibration coefficient, further comprising:
if the target user identifier is associated with a personal calibration coefficient, updating the personal calibration coefficient associated with the target user identifier to be the current personal calibration coefficient;
and if the target user identifier is not associated with a personal calibration coefficient, associating the current personal calibration coefficient with the target user identifier.
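Claims 3 and 4 amount to a look-up-then-update policy on a per-user coefficient store. Below is a minimal sketch under the assumption that coefficients are kept in an in-memory dictionary keyed by user identifier; the store, the default coefficient placeholder, and the function names are hypothetical.

```python
SYSTEM_DEFAULT_COEFFS = "default-calibration"   # stand-in for the system default calibration coefficient
personal_coeffs = {}                            # user identifier -> personal calibration coefficient

def target_calibration_coefficient(user_id):
    """Claim 3: the personal coefficient if one is associated with the identifier, else the default."""
    return personal_coeffs.get(user_id, SYSTEM_DEFAULT_COEFFS)

def store_current_coefficient(user_id, current_coeffs):
    """Claim 4: update the associated coefficient, or create the association if absent."""
    if user_id in personal_coeffs:
        personal_coeffs[user_id] = current_coeffs   # update the existing association
    else:
        personal_coeffs[user_id] = current_coeffs   # associate the coefficient with the identifier
```

With a plain dictionary both branches reduce to the same assignment; they are written out only to mirror the two cases distinguished in claim 4.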
5. The method of any one of claims 2 to 4,
the user identifier is bound with a biometric feature;
before the step of calculating the gaze point coordinates of the user by using the target calibration coefficients, the method further comprises:
acquiring a biometric feature of the user;
matching the acquired biometric feature with the biometric features bound to established user identifiers;
if the matching is successful, determining the successfully matched user identifier as the user identifier of the user;
and if the matching fails, establishing a user identifier for the user, and binding the established user identifier with the acquired biometric feature.
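Claim 5's binding of user identifiers to biometric features can be pictured as a small registry with a similarity match. The sketch below assumes the biometric feature is a fixed-length embedding vector (for example from an iris or face model) compared by cosine similarity; the threshold value, the UUID identifiers, and the registry structure are illustrative assumptions only.

```python
import uuid
import numpy as np

biometric_registry = {}   # user identifier -> biometric feature vector
MATCH_THRESHOLD = 0.9     # illustrative similarity threshold

def resolve_user_id(feature):
    """Return the matching user identifier, or establish and bind a new one."""
    feature = np.asarray(feature, dtype=float)
    for user_id, registered in biometric_registry.items():
        sim = float(np.dot(feature, registered) /
                    (np.linalg.norm(feature) * np.linalg.norm(registered)))
        if sim >= MATCH_THRESHOLD:
            return user_id                    # matching succeeded
    new_id = str(uuid.uuid4())                # matching failed: establish a new identifier
    biometric_registry[new_id] = feature      # bind the acquired feature to it
    return new_id
```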
6. An eyeball tracking calibration device, characterized by comprising:
a background calibration initiation unit to: in one interactive operation, if the interactive mode selected by the user is a non-sight line positioning interactive mode, starting a background calibration process;
a collection unit for: in the background calibration process, acquiring the eye characteristic information of the user;
a calibration unit for:
acquiring, as calibration point coordinates, the position coordinates obtained by positioning in the non-sight positioning interaction mode;
and calculating to obtain the current personal calibration coefficient of the user according to the acquired eye feature information and the calibration point coordinates.
7. The apparatus of claim 6, further comprising:
a computing unit to:
before starting the background calibration process, acquiring eye characteristic information of a user;
calculating to obtain the fixation point coordinate of the user according to the obtained eye feature information and the target calibration coefficient; the target calibration coefficients include: a system default calibration factor, or, a personal calibration factor associated with the user identification of the user;
the display unit is used for displaying the interaction effect of the functional area to which the fixation point coordinate belongs;
and the monitoring unit is used for: if it is monitored that non-sight positioning interaction equipment is used and the non-sight positioning interaction equipment is positioned to another functional area, determining that the interaction mode adopted by the current interaction operation is the non-sight positioning mode.
8. The apparatus of claim 7,
the user identification of the user is a target user identification;
before the calculating using the target calibration coefficients to obtain the user's gaze point coordinates, the calculating unit is further configured to:
if the target user identifier is associated with a personal calibration coefficient, determining the personal calibration coefficient associated with the target user identifier as the target calibration coefficient;
and if the target user identifier is not associated with a personal calibration coefficient, determining the default calibration coefficient of the system as the target calibration coefficient.
9. The apparatus of claim 8, wherein after calculating the current personal calibration coefficient, the calibration unit is further configured to:
if the target user identifier is associated with a personal calibration coefficient, updating the personal calibration coefficient associated with the target user identifier to be the current personal calibration coefficient;
and if the target user identifier is not associated with a personal calibration coefficient, associating the current personal calibration coefficient with the target user identifier.
10. The apparatus of any one of claims 7-9,
the user identifier is bound with a biometric feature;
the device further comprises:
a biometric unit to:
acquiring a biometric feature of the user before the target calibration coefficient is used for calculating the fixation point coordinates of the user;
matching the acquired biometric feature with the biometric features bound to established user identifiers;
if the matching is successful, determining the successfully matched user identifier as the user identifier of the user;
and if the matching fails, establishing a user identifier for the user, and binding the established user identifier with the acquired biometric feature.
CN202010191024.5A 2020-03-18 2020-03-18 Eyeball tracking calibration method and device Active CN113495613B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202010191024.5A CN113495613B (en) 2020-03-18 2020-03-18 Eyeball tracking calibration method and device
PCT/CN2021/079596 WO2021185110A1 (en) 2020-03-18 2021-03-08 Method and device for eye tracking calibration
JP2022555765A JP2023517380A (en) 2020-03-18 2021-03-08 Eye tracking calibration method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010191024.5A CN113495613B (en) 2020-03-18 2020-03-18 Eyeball tracking calibration method and device

Publications (2)

Publication Number Publication Date
CN113495613A true CN113495613A (en) 2021-10-12
CN113495613B CN113495613B (en) 2023-11-21

Family

ID=77769930

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010191024.5A Active CN113495613B (en) 2020-03-18 2020-03-18 Eyeball tracking calibration method and device

Country Status (3)

Country Link
JP (1) JP2023517380A (en)
CN (1) CN113495613B (en)
WO (1) WO2021185110A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116700500A (en) * 2023-08-07 2023-09-05 江西科技学院 Multi-scene VR interaction method, system and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105045374A (en) * 2014-04-22 2015-11-11 联想(新加坡)私人有限公司 Automatic gaze calibration
CN106406509A (en) * 2016-05-16 2017-02-15 上海青研科技有限公司 Head-mounted eye control virtual reality device
CN109410285A (en) * 2018-11-06 2019-03-01 北京七鑫易维信息技术有限公司 A kind of calibration method, device, terminal device and storage medium
CN110427101A (en) * 2019-07-08 2019-11-08 北京七鑫易维信息技术有限公司 Calibration method, device, equipment and the storage medium of eyeball tracking

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9760772B2 (en) * 2014-03-20 2017-09-12 Lc Technologies, Inc. Eye image stimuli for eyegaze calibration procedures
CN105843397A (en) * 2016-04-12 2016-08-10 公安部上海消防研究所 Virtual reality interactive system based on pupil tracking technology
CN109976535B (en) * 2019-05-05 2022-12-02 北京七鑫易维信息技术有限公司 Calibration method, device, equipment and storage medium

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114047822A (en) * 2021-11-24 2022-02-15 京东方科技集团股份有限公司 Near-to-eye display method and system
CN114047822B (en) * 2021-11-24 2023-12-19 京东方科技集团股份有限公司 Near-to-eye display method and system
CN116311396A (en) * 2022-08-18 2023-06-23 荣耀终端有限公司 Method and device for fingerprint identification
CN116311396B (en) * 2022-08-18 2023-12-12 荣耀终端有限公司 Method and device for fingerprint identification

Also Published As

Publication number Publication date
JP2023517380A (en) 2023-04-25
WO2021185110A1 (en) 2021-09-23
CN113495613B (en) 2023-11-21

Similar Documents

Publication Publication Date Title
AU2021202479B2 (en) Head mounted display system configured to exchange biometric information
CN110460837B (en) Electronic device with foveal display and gaze prediction
US10488925B2 (en) Display control device, control method thereof, and display control system
US9311527B1 (en) Real time eye tracking for human computer interaction
CN113495613B (en) Eyeball tracking calibration method and device
CN112567287A (en) Augmented reality display with frame modulation
CN112667069A (en) Method for automatically identifying at least one user of an eye tracking device and eye tracking device
WO2014155133A1 (en) Eye tracking calibration
KR101638095B1 (en) Method for providing user interface through head mount display by using gaze recognition and bio-signal, and device, and computer-readable recording media using the same
US20140085189A1 (en) Line-of-sight detection apparatus, line-of-sight detection method, and program therefor
WO2021119212A1 (en) Systems and methods for operating a head-mounted display system based on user identity
CN109976528A (en) A kind of method and terminal device based on the dynamic adjustment watching area of head
CN112578903A (en) Eye tracking method, eye tracker, and computer program
De Buyser et al. Exploring the potential of combining smart glasses and consumer-grade EEG/EMG headsets for controlling IoT appliances in the smart home
CN114967128B (en) Sight tracking system and method applied to VR glasses
CN110018733A (en) Determine that user triggers method, equipment and the memory devices being intended to
CN113491502A (en) Eyeball tracking calibration inspection method, device, equipment and storage medium
CN109960412A (en) A kind of method and terminal device based on touch-control adjustment watching area
KR102396519B1 (en) Method for providing 360° panoramic video based on real-time field-of-view changes with eye-gaze tracking in mri environment, recording medium and device for performing the method
CN116056623A (en) Method and device for determining at least one astigmatic effect of at least one eye

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant