WO2021185110A1 - Method and device for eye tracking calibration - Google Patents

Method and device for eye tracking calibration

Info

Publication number
WO2021185110A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
calibration
calibration coefficient
target
personal
Prior art date
Application number
PCT/CN2021/079596
Other languages
French (fr)
Chinese (zh)
Inventor
张朕
路伟成
Original Assignee
北京七鑫易维信息技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京七鑫易维信息技术有限公司 filed Critical 北京七鑫易维信息技术有限公司
Priority to JP2022555765A priority Critical patent/JP2023517380A/en
Publication of WO2021185110A1 publication Critical patent/WO2021185110A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • The present invention relates to the technical field of eye tracking, and in particular to an eye tracking calibration method and device.
  • In the prior art, a user usually needs to enter a calibration procedure before using a gaze positioning device with an eye tracking function.
  • While the user concentrates on gazing at one or more calibration points, the user's eye feature information is acquired, and the user's personal calibration coefficient is then computed from the calibration point coordinates and the eye feature information.
  • After calibration, a point-set screen is displayed so the user can evaluate the calibration effect; if the user subjectively judges that the effect does not meet the gaze positioning requirements, recalibration is required.
  • After the calibration is completed, the user enters the scene where gaze positioning is used. If the user finds the positioning inaccurate during use, or the relative position of the head-mounted display and the eyes changes (for example, because the display's position was adjusted), the user must exit the scene and re-enter the calibration procedure, which is inconvenient.
  • At least some embodiments of the present invention provide an eye tracking calibration method and device that complete calibration within the scene where gaze positioning interaction is applied, without requiring a separate calibration step.
  • In one embodiment of the present invention, an eye tracking calibration method is provided, including:
  • in one interactive operation, if the interaction mode selected by the user is a non-line-of-sight positioning interaction mode, starting a background calibration process;
  • in the background calibration process, acquiring the user's eye feature information; acquiring the position coordinates obtained by the non-line-of-sight positioning interaction mode as calibration point coordinates;
  • calculating the user's current personal calibration coefficient according to the acquired eye feature information and calibration point coordinates.
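  • As a rough illustration of this claimed flow, the Python sketch below treats the position selected by a non-gaze input device as a calibration point. All type and function names here are hypothetical illustrations, not APIs defined by the patent.
```python
from dataclasses import dataclass

# Hypothetical data carriers; the patent does not define concrete types.
@dataclass
class EyeFeatures:
    pupil_center: tuple   # (x, y) pupil position in the eye image
    glint_center: tuple   # (x, y) corneal reflection spot position

@dataclass
class InteractionEvent:
    input_mode: str       # "gaze", or a non-line-of-sight device such as "mouse"
    position_xy: tuple    # on-screen coordinates selected by the device

def background_calibrate(event, eye_features, solve_coefficient):
    """S1-S4: if a non-gaze device positioned the cursor, treat that position
    as a calibration point and silently solve the personal coefficient."""
    if event.input_mode == "gaze":
        return None  # gaze positioning was used; no background calibration
    # S3: the device position is taken as the calibration point;
    # S4: compute the current personal calibration coefficient.
    return solve_coefficient(eye_features, event.position_xy)
```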
  • In an optional embodiment, before starting the background calibration process, the method further includes: acquiring the user's eye feature information; calculating the user's gaze point coordinates according to the acquired eye feature information and a target calibration coefficient, where the target calibration coefficient includes the system default calibration coefficient or the personal calibration coefficient associated with the user's user ID; displaying the interaction effect of the functional area to which the gaze point coordinates belong; and, if it is detected that a non-line-of-sight positioning interactive device is used and positions to another functional area, determining that the interaction mode used in this interactive operation is the non-line-of-sight positioning mode.
  • In an optional embodiment, the user's user ID is the target user ID. Before the gaze point coordinates are calculated with the target calibration coefficient, the method further includes: if the target user ID is already associated with a personal calibration coefficient, determining that personal calibration coefficient as the target calibration coefficient; and if the user's user ID is not associated with a personal calibration coefficient, determining the system default calibration coefficient as the target calibration coefficient.
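  • A minimal sketch of this selection logic, assuming a hypothetical `personal_coeffs` mapping from user IDs to previously stored personal calibration coefficients:
```python
# Hypothetical per-user coefficient store; the patent does not prescribe
# a storage format. The default value below is only a placeholder.
DEFAULT_COEFF = (1.0, 0.0)

def target_coefficient(target_user_id, personal_coeffs):
    # Use the personal coefficient when the target user ID has one,
    # otherwise fall back to the system default calibration coefficient.
    return personal_coeffs.get(target_user_id, DEFAULT_COEFF)
```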
  • In an optional embodiment, after the current personal calibration coefficient is calculated, the method further includes: if the target user ID is already associated with a personal calibration coefficient, updating that coefficient to the current personal calibration coefficient; and if not, associating the current personal calibration coefficient with the target user ID.
  • In an optional embodiment, user IDs are bound to biometric features. Before the gaze point coordinates are calculated with the target calibration coefficient, the method further includes: acquiring the user's biometric feature; matching the acquired biometric feature against the biometric features of established user IDs; if the match succeeds, determining the matched user ID as the user's user ID; and if the match fails, creating a user ID for the user and binding it to the acquired biometric feature.
  • In one embodiment of the present invention, an eye tracking calibration device is also provided, including:
  • a background calibration starting unit, configured to start a background calibration process if, in one interactive operation, the interaction mode selected by the user is a non-line-of-sight positioning interaction mode;
  • an acquisition unit, configured to acquire the user's eye feature information in the background calibration process;
  • a calibration unit, configured to acquire the position coordinates obtained by the non-line-of-sight positioning interaction mode as calibration point coordinates, and to calculate the user's current personal calibration coefficient according to the acquired eye feature information and calibration point coordinates.
  • In an optional embodiment, the device further includes: a calculation unit, configured to acquire the user's eye feature information before the background calibration process is started, and to calculate the user's gaze point coordinates according to the acquired eye feature information and a target calibration coefficient, where the target calibration coefficient includes the system default calibration coefficient or the personal calibration coefficient associated with the user's user ID; a display unit, configured to display the interaction effect of the functional area to which the gaze point coordinates belong; and a monitoring unit, configured to determine that the interaction mode used in this interactive operation is the non-line-of-sight positioning mode if it detects that a non-line-of-sight positioning interactive device is used and positions to another functional area.
  • In an optional embodiment, the user's user ID is the target user ID. Before the gaze point coordinates are calculated with the target calibration coefficient, the calculation unit is further configured to: if the target user ID is already associated with a personal calibration coefficient, determine that personal calibration coefficient as the target calibration coefficient; and if the user's user ID is not associated with a personal calibration coefficient, determine the system default calibration coefficient as the target calibration coefficient.
  • In an optional embodiment, after the current personal calibration coefficient is calculated, the calibration unit is further configured to: if the target user ID is already associated with a personal calibration coefficient, update that coefficient to the current personal calibration coefficient; and if not, associate the current personal calibration coefficient with the target user ID.
  • In an optional embodiment, user IDs are bound to biometric features, and the device further includes: a biometric identification unit, configured to acquire the user's biometric feature before the gaze point coordinates are calculated with the target calibration coefficient; match the acquired biometric feature against the biometric features of established user IDs; if the match succeeds, determine the matched user ID as the user's user ID; and if the match fails, create a user ID for the user and bind it to the acquired biometric feature.
  • In one embodiment of the present invention, an eye tracking calibration device is also provided, including a processor, a memory, and a computer program stored in the memory and runnable on the processor, where the processor, when executing the program, implements the steps of any of the eye tracking calibration methods described above.
  • In one embodiment of the present invention, a storage medium is also provided, storing computer-executable instructions that, when loaded and executed by a processor, implement the steps of any of the eye tracking calibration methods described above.
  • In at least some embodiments of the present invention, in an interactive scene that uses eye tracking, if the user selects the non-line-of-sight positioning interaction mode for positioning, the background calibration process is started at the same time; within it, the position coordinates obtained by that positioning are used as calibration point coordinates to calculate the personal calibration coefficient.
  • The background calibration process of these embodiments is hidden from the user, who does not perceive it; unlike the prior art, the user need not exit the current scene to recalibrate, which saves calibration time and improves the user experience.
  • Fig. 1 is a flowchart of an eye tracking calibration method according to one embodiment of the present invention.
  • Fig. 2a is a schematic diagram of obtaining the coordinates of a gaze point in an interactive way of gaze positioning according to one of the embodiments of the present invention.
  • Fig. 2b is a schematic diagram of expanding the functional area to which the coordinates of the gaze point belong according to one embodiment of the present invention.
  • Fig. 3 is a flowchart of another eye tracking calibration method according to one embodiment of the present invention.
  • Fig. 4 is a flowchart of yet another eye tracking calibration method according to one of the embodiments of the present invention.
  • Fig. 5 is a flowchart of still another eye tracking calibration method according to one of the embodiments of the present invention.
  • Fig. 6 is a flowchart of still another eye tracking calibration method according to one of the embodiments of the present invention.
  • Fig. 7 is a structural block diagram of an eye tracking calibration device according to one of the embodiments of the present invention.
  • Fig. 8 is a structural block diagram of another eye tracking calibration device according to one embodiment of the present invention.
  • Fig. 9 is a structural block diagram of still another eye tracking calibration device according to one of the embodiments of the present invention.
  • The eye tracking calibration method and device provided in the embodiments of the present invention are applied to the field of eye tracking.
  • Eye tracking, also called gaze tracking, is a technique for estimating the line of sight and/or gaze point of the eye by measuring eye movement.
  • The eye tracking calibration device involved in the embodiments of the present invention can be deployed in terminal devices related to gaze positioning (gaze positioning devices for short), such as VR systems, AR devices, and eye-controlled tablet computers.
  • VR systems generally comprise multiple devices, for example: a VR all-in-one headset plus handle/remote control; a PC plus a VR head-mounted display (VR software installed on the PC communicates with the display) plus handle/remote control/mouse; or a smart mobile terminal plus a head-mounted display (VR software installed on the terminal communicates with the display).
  • The eye tracking calibration device can be deployed in an eye-controlled tablet computer or the like; to collect the user's eye images, the tablet can also be equipped with an infrared light source and a camera for capturing eye images (such as an environment camera or an infrared camera).
  • An eye image here means an image that includes the eyes, such as a frontal or profile portrait, or an image containing only the eye region.
  • Alternatively, the eye tracking calibration device can be deployed on the aforementioned VR/AR head-mounted display, which is also equipped with a gaze tracking device (eye tracking device).
  • Besides the eye tracking device described above, which uses an eye-image camera plus an infrared light source, other eye tracking devices can also be used, for example a MEMS micro-electromechanical system, which may include a MEMS infrared scanning mirror, an infrared light source, and an infrared receiver, and which can detect eye movement by capturing eye images.
  • As another example, the gaze tracking device can be a capacitive sensor, which detects eye movement through the capacitance value between the eyeball and a capacitor plate.
  • As yet another example, the gaze tracking device can be a muscle-current detector; more specifically, electrodes placed on the bridge of the nose, the forehead, the ears, or the earlobes detect patterns of muscle current signals to detect eye movement.
  • In the first embodiment, an eye tracking calibration method is provided to complete calibration in the scene where gaze positioning interaction is applied, without requiring a separate calibration step.
  • Referring to Fig. 1, the eye tracking calibration method includes the following steps.
  • S0: Start the eye tracking function. The eye tracking function of the gaze positioning device may be turned on by default, or it can be activated according to the user's operation.
  • An interactive operation means: the user inputs information and operation commands through the input/output system, the system receives and processes them, and displays the processing result through the input/output system.
  • By default, the gaze point coordinates are generally obtained using the gaze positioning interaction mode (see Fig. 2a).
  • The functional areas include, but are not limited to: icon controls, spatial areas, virtual objects (for example, in a VR game, moving to a designated spatial area or selecting the virtual object to grab through gaze), and so on.
  • Besides gaze, a mouse, remote control, handle, or even a PC keyboard can be used for human-computer interaction.
  • These interaction modes can be collectively referred to as non-line-of-sight positioning interaction modes, and mice, remote controls, handles, keyboards, and the like can be collectively referred to as non-line-of-sight positioning interactive devices.
  • When such a device is used, the background calibration process is entered; it is hidden from the user, who does not perceive it.
  • The eye feature information may include pupil position coordinates, pupil shape, iris position, iris shape, eyelid position, eye corner position, position coordinates of corneal reflection spots, and the like.
  • The eye image can be captured by a camera of the terminal device, for example, and processed to obtain the eye feature information.
  • If the eye tracking calibration device is deployed on a VR/AR head-mounted display, the eye tracking device on the head-mounted display can capture eye images and process them to obtain the eye feature information.
  • The position coordinates obtained at this time serve as the calibration point coordinates.
  • The embodiments of the present invention display no fixed calibration point and do not require the user to stare at one; they rely on the fact that, when using a non-line-of-sight positioning interactive device such as a handle or mouse, the user is looking at the position being selected, so the position coordinates obtained by such positioning are points the user is gazing at.
  • The personal calibration coefficient is a parameter used by the gaze estimation algorithm to compute the final gaze result, and it is related to the user's pupil radius, corneal curvature, and the angular difference between the visual axis and the optical axis.
  • The personal calibration coefficient can be solved inversely through the gaze estimation algorithm. Once the current personal calibration coefficient is obtained, it can overwrite the previous personal calibration coefficient or replace the default calibration coefficient, making eye tracking more accurate.
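  • One way such an inverse solve could look is sketched below, assuming (purely for illustration) a linear mapping from eye-feature vectors to screen coordinates; the patent does not fix the model form.
```python
import numpy as np

def solve_personal_coefficients(features, points):
    """features: (n, k) array of eye-feature vectors collected during
    background calibration; points: (n, 2) calibration point coordinates
    obtained from non-line-of-sight positioning. Returns a (k+1, 2)
    coefficient matrix mapping a feature vector (plus bias) to screen x, y."""
    phi = np.hstack([features, np.ones((len(features), 1))])  # add bias term
    coeffs, *_ = np.linalg.lstsq(phi, np.asarray(points), rcond=None)
    return coeffs
```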
  • In an interactive scene that uses eye tracking, if the user selects the non-line-of-sight positioning interaction mode for positioning, the background calibration process is started at the same time; within it, the position coordinates obtained by that positioning are used as calibration point coordinates to calculate the personal calibration coefficient.
  • The background calibration process of the embodiments of the present invention can continuously update the user's personal calibration coefficient, making eye tracking increasingly accurate.
  • The calibration is hidden from the user, who does not perceive it; unlike the prior art, there is no need to exit the current scene and recalibrate, which saves calibration time and improves the user experience.
  • In the second embodiment, eye tracking technology has a default calibration coefficient.
  • The default calibration coefficient is the one that yields relatively high accuracy for most users.
  • A target calibration coefficient can be defined, which includes: the system default calibration coefficient, or the personal calibration coefficient associated with the user's user ID.
  • Initially, the target calibration coefficient is the system default calibration coefficient; after at least one background calibration, it is updated to the latest personal calibration coefficient.
  • The method, as shown in Fig. 3, may include:
  • S31: Obtain the user's eye feature information, and calculate the user's gaze point coordinates according to the acquired eye feature information and the target calibration coefficient.
  • For the acquisition method, refer to step S2 of the foregoing first embodiment.
  • For example, an eye tracking algorithm based on pupil position and shape calculates gaze information from the principal axis direction of the pupil and the pupil position.
  • Another example is a feature-vector fitting method, whose working principle is as follows:
  • the center position of the pupil minus the center position of the left eye corner is used as feature vector A, and the center position of the pupil minus the center position of the right eye corner is used as feature vector B;
  • a polynomial fit between the feature vectors and the known point coordinates yields the unknown coefficients in the mapping function (the unknown coefficients are solved during calibration);
  • substituting subsequent feature vectors into the fitted mapping function then yields the gaze point; this is the tracking process.
  • The feature data extracted from eye images may include: pupil position, pupil shape, iris position, iris shape, eyelid position, eye corner position, light-spot position, and so on.
  • For the aforementioned algorithm based on pupil position and shape, the feature data to extract include the principal axis direction of the pupil, the pupil position, the pupil major axis length, the pupil minor axis length, and so on.
  • For the feature-vector fitting method, the feature data to extract include the center position of the pupil, the center position of the left eye corner, and the center position of the right eye corner.
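  • The feature-vector fitting method above can be sketched as follows, assuming a second-order polynomial mapping (the patent does not specify the polynomial order):
```python
import numpy as np

def poly_terms(a, b):
    """Second-order polynomial terms of feature vectors A and B."""
    ax, ay = a
    bx, by = b
    return [1.0, ax, ay, bx, by, ax * ay, bx * by, ax**2, ay**2, bx**2, by**2]

def fit_mapping(vecs_a, vecs_b, gaze_points):
    """Calibration: solve the unknown mapping coefficients by least squares.
    vecs_a / vecs_b: lists of (x, y) feature vectors (pupil center minus
    left / right eye-corner center); gaze_points: known point coordinates."""
    X = np.array([poly_terms(a, b) for a, b in zip(vecs_a, vecs_b)])
    Y = np.asarray(gaze_points)                    # shape (n, 2)
    coeffs, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return coeffs                                  # shape (11, 2)

def predict_gaze(coeffs, a, b):
    """Tracking: substitute new feature vectors into the fitted mapping."""
    return np.array(poly_terms(a, b)) @ coeffs
```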
  • If the user is a new user, or the user's user ID is not associated with a personal calibration coefficient, the target calibration coefficient is the system default calibration coefficient.
  • S32: Display the interaction effect of the functional area to which the gaze point coordinates belong.
  • For example, the functional area to which the gaze point coordinates belong can be enlarged (see Fig. 2b), change color, brighten, or show other interaction effects.
  • The functional area here can be an icon control, a spatial area, a virtual object, and so on.
  • Steps S31 to S32 constitute one interactive operation.
  • If the displayed functional area is not the one the user intended, the user can subsequently use a mouse, remote control, handle, or even a PC keyboard (a non-line-of-sight positioning interactive device) to move the focus on the screen and reposition it.
  • The background calibration start condition may include: detecting that a non-line-of-sight positioning interactive device is used and that it positions to a functional area other than the displayed one.
  • S33: When the background calibration start condition is met, it is determined that the interaction mode used in this interactive operation is the non-line-of-sight positioning mode; the background calibration process is then executed, and the latest personal calibration coefficient obtained is assigned to the target calibration coefficient.
  • If the background calibration start condition is not satisfied, the current gaze positioning accuracy meets the user's needs, the target calibration coefficient need not be changed, and the flow returns directly to step S31.
  • S34: Start the background calibration process, and calculate the user's current personal calibration coefficient according to the acquired eye feature information and calibration point coordinates.
  • S35: Use the calculated personal calibration coefficient to update the target calibration coefficient, and return to S31.
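  • Putting S31 to S35 together, one possible shape of the loop is sketched below; `tracker`, `ui`, and `store` are hypothetical objects standing in for the eye tracker, the interface, and the coefficient store, none of them defined by the patent.
```python
def interaction_loop(tracker, ui, store, user_id, default_coeff):
    target = store.get(user_id, default_coeff)          # initial target coefficient
    while ui.running():
        features = tracker.capture_eye_features()
        gaze = tracker.estimate_gaze(features, target)  # S31
        ui.show_interaction_effect_at(gaze)             # S32
        event = ui.poll_input()
        # S33: start condition - a non-line-of-sight device was used and it
        # positioned to a functional area other than the highlighted one.
        if (event is not None and event.device != "gaze"
                and ui.region_at(event.position_xy) != ui.region_at(gaze)):
            current = tracker.solve_calibration(features, event.position_xy)  # S34
            store[user_id] = current
            target = current                            # S35: update the target
```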
  • After the non-line-of-sight positioning interactive device is used for positioning, the system returns to the gaze positioning interaction mode.
  • In this embodiment, the default calibration coefficient or the user's associated personal calibration coefficient is used to calculate the gaze point coordinates, and the functional area to which the gaze point coordinates belong is highlighted.
  • The user can confirm entry into the functional area by means of continuous gaze, key presses, and so on.
  • This embodiment can also determine whether to start background calibration.
  • The second embodiment mentioned the user ID of the user; the user ID of the current user can be referred to as the target user ID.
  • If the target user ID is already associated with a personal calibration coefficient, that personal calibration coefficient is determined as the target calibration coefficient; if not, the system default calibration coefficient is determined as the target calibration coefficient.
  • After a background calibration, if the target user ID is not yet associated with a personal calibration coefficient, the current personal calibration coefficient is associated with the target user ID.
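  • In code, the two branches (update an existing association, or create a new one) collapse into a single assignment on a hypothetical mapping from user IDs to coefficients, since assignment both creates and updates an entry:
```python
def save_personal_coefficient(store, target_user_id, current_coeff):
    # Updates the entry when the target user ID already has a personal
    # calibration coefficient, and creates the association when it does not.
    store[target_user_id] = current_coeff
```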
  • The third embodiment introduces how eye tracking calibration proceeds when the target user ID is initially not associated with a personal calibration coefficient.
  • The method, as shown in Fig. 4, may include:
  • S41: The target user ID is not associated with a personal calibration coefficient, so the system default calibration coefficient is determined as the target calibration coefficient.
  • S42-S45 are similar to the aforementioned S31-S34, and will not be repeated here.
  • After the non-line-of-sight positioning interactive device is used for positioning, the system returns to the gaze positioning interaction mode.
  • This embodiment realizes the following: when the user is not associated with a personal calibration coefficient, the default calibration coefficient is determined as the target calibration coefficient to calculate the gaze point coordinates, and the functional area to which the gaze point coordinates belong is highlighted. If the highlighted functional area is not the one the user expected, the user repositions with a handle or other non-line-of-sight positioning interactive device, and the background calibration is started at the same time.
  • The personal calibration coefficient obtained by the background calibration is associated with the target user ID and used in the next interactive operation. In this way, the next time gaze positioning is used, the gaze point coordinates are calculated with the previously computed personal calibration coefficient, realizing the transition from the default calibration coefficient to a more accurate personal calibration coefficient and further providing the user with more personalized and accurate gaze positioning services.
  • The fourth embodiment introduces the eye tracking calibration method for the case where the target user ID is already associated with a personal calibration coefficient at the beginning.
  • The method, as shown in Fig. 5, may include:
  • S51: The target user ID is already associated with a personal calibration coefficient, so that personal calibration coefficient is determined as the target calibration coefficient.
  • S52-S55 are similar to the aforementioned S31-S34, and will not be repeated here.
  • After the non-line-of-sight positioning interactive device is used for positioning, the system returns to the gaze positioning interaction mode.
  • This embodiment realizes that, when the user is already associated with a personal calibration coefficient, that coefficient is determined as the target calibration coefficient, providing the user with more personalized and accurate gaze positioning services.
  • After background calibration, the newly calculated personal calibration coefficient is associated with the target user ID and used in the next interactive operation, realizing adaptive updating of the personal calibration coefficient and further providing more personalized and accurate gaze positioning services.
  • The user of the gaze positioning device can by default be assumed to be the same person. Alternatively, considering that multiple people may use the same device, different users can be distinguished through technologies such as biometric identification.
  • To this end, user IDs can be bound to biometric features: different biometric features correspond to different user IDs, and different user IDs represent different users, so different users can be distinguished.
  • The eye tracking calibration method based on biometric features exemplarily includes the following process.
  • Exemplary biometric features here may include at least one of iris, fingerprint, voiceprint, and even facial features.
  • S61: Acquire the user's biometric feature. S62: Match the acquired biometric feature against the biometric features of established user IDs; if the match succeeds, determine the matched user ID as the user's user ID. S63: If the match fails, create a user ID for the user and bind it to the acquired biometric feature.
  • The user ID matched successfully in step S62, or the user ID created in step S63, is the target user ID of the foregoing embodiments.
  • S64: Determine whether the target user ID is already associated with a personal calibration coefficient; if yes, go to S65, otherwise go to S66.
  • S65: Determine the personal calibration coefficient associated with the target user ID as the target calibration coefficient. S66: Determine the system default calibration coefficient as the target calibration coefficient.
  • S67: Acquire the user's eye feature information, and calculate the user's gaze point coordinates according to the acquired eye feature information and the target calibration coefficient.
  • S68-S610 are similar to the aforementioned S32-S34, and will not be repeated here.
  • When a user's biometric feature (for example, an iris) is acquired, it is matched against the biometric features of established user IDs to identify whether this is a new user. If it matches an established user ID, the matched user ID is determined as the current user's ID, and in this interactive operation the gaze point coordinates are calculated using that user ID's personal calibration coefficient.
  • If the user is identified as a new user, a user ID is created for them, and in the first interactive operation the system default calibration coefficient is used to calculate the gaze point coordinates.
  • For example, suppose family member A first performs interactive operations and A's personal calibration coefficient is calculated. If member B then uses the same device without being distinguished from A, the gaze points are calculated directly with A's personal calibration coefficient and positioning may be inaccurate; moreover, if background calibration is triggered, the coefficient actually computed belongs to B.
  • When A subsequently uses the device again, B's personal calibration coefficient would be used to calculate the gaze point coordinates, and positioning accuracy would still not be achieved. Distinguishing users by biometric features avoids this.
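  • A sketch of the biometric user-identification step follows; the registry layout, matcher, and similarity threshold are illustrative assumptions, not specified by the patent.
```python
import uuid

def identify_user(bio_feature, registry, match_score, threshold=0.8):
    """registry: hypothetical {user_id: enrolled biometric template} mapping;
    match_score: function returning a similarity score in [0, 1]."""
    for user_id, template in registry.items():
        if match_score(bio_feature, template) >= threshold:
            return user_id                    # existing user matched (S62)
    new_id = str(uuid.uuid4())                # new user: create an ID (S63)
    registry[new_id] = bio_feature            # bind the ID to the biometric
    return new_id
```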
  • This embodiment provides an eye tracking calibration device.
  • As shown in Fig. 7, the device includes:
  • a background calibration starting unit 1, configured to start the background calibration process of the calibration unit 3 if, in one interactive operation, the interaction mode selected by the user is a non-line-of-sight positioning interaction mode;
  • an acquisition unit 2, configured to acquire the user's eye feature information in the background calibration process;
  • a calibration unit 3, configured to acquire the position coordinates obtained by the non-line-of-sight positioning interaction mode as calibration point coordinates, and to calculate the user's current personal calibration coefficient according to the acquired eye feature information and calibration point coordinates.
  • The eye tracking calibration device can be deployed in terminal devices related to gaze positioning (gaze positioning devices for short), such as VR systems, AR devices, and eye-controlled tablet computers, to perform the above eye tracking calibration method.
  • The eye tracking calibration device in all the above embodiments may further include:
  • a calculation unit 4, configured to acquire the user's eye feature information before the background calibration process is started, and to calculate the user's gaze point coordinates according to the acquired eye feature information and the target calibration coefficient;
  • the target calibration coefficient includes: the system default calibration coefficient, or the personal calibration coefficient associated with the user's user ID;
  • a display unit 5, configured to display the interaction effect of the functional area to which the gaze point coordinates belong;
  • a monitoring unit 6, configured to determine that the interaction mode used in this interactive operation is the non-line-of-sight positioning mode when it detects that a non-line-of-sight positioning interactive device is used and positions to another functional area.
  • The user ID of the above-mentioned user may be referred to as the target user ID.
  • The calibration unit 3 in all the above embodiments may also be configured to: if the target user ID is already associated with a personal calibration coefficient, update that coefficient to the current personal calibration coefficient; and if not, associate the current personal calibration coefficient with the target user ID.
  • The calculation unit 4 in all the above embodiments may also be configured to: if the target user ID is already associated with a personal calibration coefficient, determine that personal calibration coefficient as the target calibration coefficient; and if the user's user ID is not associated with a personal calibration coefficient, determine the system default calibration coefficient as the target calibration coefficient.
  • The foregoing device may further include:
  • a biometric identification unit 7, configured to acquire the user's biometric feature before the gaze point coordinates are calculated with the target calibration coefficient; match the acquired biometric feature against the biometric features of established user IDs; if the match succeeds, determine the matched user ID as the user's user ID; and if the match fails, create a user ID for the user and bind it to the acquired biometric feature.
  • An embodiment of the present invention provides a storage medium that stores computer-executable instructions; when the instructions are loaded and executed by a processor, the steps of the eye tracking calibration method described in any one of the first to sixth embodiments are implemented.
  • An embodiment of the present invention provides a processor configured to run a program, wherein the steps of the eye tracking calibration method described in any one of the first to the sixth embodiments are executed when the program is running.
  • An embodiment of the present invention provides an eye tracking calibration device, including a processor, a memory, and a program stored in the memory and runnable on the processor; when the processor executes the program, the steps of the eye tracking calibration method described in any one of the first to sixth embodiments are executed.
  • This application can be provided as a method, a system, or a computer program product. Therefore, this application may take the form of a complete hardware embodiment, a complete software embodiment, or an embodiment combining software and hardware. Moreover, this application may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, and optical storage) containing computer-usable program code.
  • These computer program instructions can also be stored in a computer-readable memory that can direct a computer or other programmable data processing equipment to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including the instruction device, which implements the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
  • These computer program instructions can also be loaded onto a computer or other programmable data processing equipment, so that a series of operational steps are executed on the computer or other programmable equipment to produce computer-implemented processing; the instructions executed on the computer or other programmable equipment thus provide steps for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
  • The computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
  • The memory may include non-permanent memory in computer-readable media, random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM).
  • Computer-readable media include permanent and non-permanent, removable and non-removable media, and can realize information storage by any method or technology.
  • The information can be computer-readable instructions, data structures, program modules, or other data.
  • Examples of computer storage media include, but are not limited to: phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, CD-ROM, digital versatile disc (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media such as modulated data signals and carrier waves.

Abstract

A method and device for eye tracking calibration, calibration being completed in a scenario applying sight positioning interaction, obviating the need for a separate calibration segment. The method comprises: starting an eye tracking function (S0); if an interaction scheme selected by a user is a non-line-of-sight positioning interaction scheme, starting a background calibration process (S1); during the background calibration process, acquiring ocular feature information of the user (S2); acquiring position coordinates produced by employing the non-line-of-sight positioning interaction scheme for positioning to serve as calibration point coordinates (S3); and calculating to produce a current personal calibration coefficient of the user on the basis of the ocular feature information and of the calibration point coordinates acquired (S4). Hence, in the method, if the user selects the non-line-of-sight positioning interaction scheme for positioning, the background calibration process is started at the same time, during the background calibration process, the position coordinates produced by using the non-line-of-sight positioning interaction scheme for positioning serve as the calibration point coordinates to produce by calculation the personal calibration coefficient. The background calibration process of the method is hidden from the user, and the need to exit from a current scenario is obviated.

Description

Eye tracking calibration method and device

Technical field

The present invention relates to the technical field of eye tracking, and in particular to an eye tracking calibration method and device.
Background art

With the development of science and technology, eye tracking technology has gained increasingly widespread application; for example, it enables positioning interaction through gaze in virtual reality (VR), augmented reality (AR), eye-controlled tablet computers, and other terminal devices involving gaze positioning (gaze positioning devices for short).

Because there are some differences in the physiological structure of each user's eyes, in the prior art the user usually needs to enter a calibration procedure before using a gaze positioning device with an eye tracking function. While the user concentrates on gazing at one or more calibration points, the user's eye feature information is acquired, and the user's personal calibration coefficient is then computed from the calibration point coordinates and the eye feature information. After calibration, a point-set screen is displayed so the user can evaluate the calibration effect; if the user subjectively judges that the effect does not meet the gaze positioning requirements, recalibration is required.

After the calibration is completed, the user enters the scene where gaze positioning is used. If the user finds the positioning inaccurate during use, or the relative position of the head-mounted display and the eyes changes because the display's position was adjusted, the user must exit the scene and re-enter the calibration procedure, which is inconvenient.
Summary of the invention

At least some embodiments of the present invention provide an eye tracking calibration method and device that complete calibration within the scene where gaze positioning interaction is applied, without requiring a separate calibration step.
In one embodiment of the present invention, an eye tracking calibration method is provided, including:

in one interactive operation, if the interaction mode selected by the user is a non-line-of-sight positioning interaction mode, starting a background calibration process;

in the background calibration process, acquiring the user's eye feature information;

acquiring the position coordinates obtained by the non-line-of-sight positioning interaction mode as calibration point coordinates;

calculating the user's current personal calibration coefficient according to the acquired eye feature information and calibration point coordinates.
In an optional embodiment, before starting the background calibration process, the method further includes: acquiring the user's eye feature information; calculating the user's gaze point coordinates according to the acquired eye feature information and a target calibration coefficient, where the target calibration coefficient includes the system default calibration coefficient or the personal calibration coefficient associated with the user's user ID; displaying the interaction effect of the functional area to which the gaze point coordinates belong; and, if it is detected that a non-line-of-sight positioning interactive device is used and positions to another functional area, determining that the interaction mode used in this interactive operation is the non-line-of-sight positioning mode.

In an optional embodiment, the user's user ID is the target user ID. Before the gaze point coordinates are calculated with the target calibration coefficient, the method further includes: if the target user ID is already associated with a personal calibration coefficient, determining that personal calibration coefficient as the target calibration coefficient; and if the user's user ID is not associated with a personal calibration coefficient, determining the system default calibration coefficient as the target calibration coefficient.

In an optional embodiment, after the current personal calibration coefficient is calculated, the method further includes: if the target user ID is already associated with a personal calibration coefficient, updating that coefficient to the current personal calibration coefficient; and if not, associating the current personal calibration coefficient with the target user ID.

In an optional embodiment, user IDs are bound to biometric features. Before the gaze point coordinates are calculated with the target calibration coefficient, the method further includes: acquiring the user's biometric feature; matching the acquired biometric feature against the biometric features of established user IDs; if the match succeeds, determining the matched user ID as the user's user ID; and if the match fails, creating a user ID for the user and binding it to the acquired biometric feature.
In one embodiment of the present invention, an eye tracking calibration device is also provided, including:

a background calibration starting unit, configured to start a background calibration process if, in one interactive operation, the interaction mode selected by the user is a non-line-of-sight positioning interaction mode;

an acquisition unit, configured to acquire the user's eye feature information in the background calibration process;

a calibration unit, configured to acquire the position coordinates obtained by the non-line-of-sight positioning interaction mode as calibration point coordinates, and to calculate the user's current personal calibration coefficient according to the acquired eye feature information and calibration point coordinates.
In an optional embodiment, the device further includes: a calculation unit, configured to acquire the user's eye feature information before the background calibration process is started, and to calculate the user's gaze point coordinates according to the acquired eye feature information and a target calibration coefficient, where the target calibration coefficient includes the system default calibration coefficient or the personal calibration coefficient associated with the user's user ID; a display unit, configured to display the interaction effect of the functional area to which the gaze point coordinates belong; and a monitoring unit, configured to determine that the interaction mode used in this interactive operation is the non-line-of-sight positioning mode if it detects that a non-line-of-sight positioning interactive device is used and positions to another functional area.

In an optional embodiment, the user's user ID is the target user ID. Before the gaze point coordinates are calculated with the target calibration coefficient, the calculation unit is further configured to: if the target user ID is already associated with a personal calibration coefficient, determine that personal calibration coefficient as the target calibration coefficient; and if the user's user ID is not associated with a personal calibration coefficient, determine the system default calibration coefficient as the target calibration coefficient.

In an optional embodiment, after the current personal calibration coefficient is calculated, the calibration unit is further configured to: if the target user ID is already associated with a personal calibration coefficient, update that coefficient to the current personal calibration coefficient; and if not, associate the current personal calibration coefficient with the target user ID.

In an optional embodiment, user IDs are bound to biometric features, and the device further includes: a biometric identification unit, configured to acquire the user's biometric feature before the gaze point coordinates are calculated with the target calibration coefficient; match the acquired biometric feature against the biometric features of established user IDs; if the match succeeds, determine the matched user ID as the user's user ID; and if the match fails, create a user ID for the user and bind it to the acquired biometric feature.
In one embodiment of the present invention, an eye tracking calibration device is also provided, including a processor, a memory, and a computer program stored in the memory and runnable on the processor, where the processor, when executing the program, implements the steps of any of the eye tracking calibration methods described above.

In one embodiment of the present invention, a storage medium is also provided, storing computer-executable instructions that, when loaded and executed by a processor, implement the steps of any of the eye tracking calibration methods described above.

In at least some embodiments of the present invention, in an interactive scene that uses eye tracking, if the user selects the non-line-of-sight positioning interaction mode for positioning, the background calibration process is started at the same time; within it, the position coordinates obtained by that positioning are used as calibration point coordinates to calculate the personal calibration coefficient. The background calibration process of these embodiments is hidden from the user, who does not perceive it; unlike the prior art, the user need not exit the current scene to recalibrate, which saves calibration time and improves the user experience.
Description of the drawings

To explain the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are merely embodiments of the present invention; for those of ordinary skill in the art, other drawings can be obtained from the provided drawings without creative work.

Fig. 1 is a flowchart of an eye tracking calibration method according to one embodiment of the present invention.

Fig. 2a is a schematic diagram of obtaining gaze point coordinates using the gaze positioning interaction mode according to one embodiment of the present invention.

Fig. 2b is a schematic diagram of enlarging the functional area to which the gaze point coordinates belong according to one embodiment of the present invention.

Fig. 3 is a flowchart of another eye tracking calibration method according to one embodiment of the present invention.

Fig. 4 is a flowchart of yet another eye tracking calibration method according to one embodiment of the present invention.

Fig. 5 is a flowchart of still another eye tracking calibration method according to one embodiment of the present invention.

Fig. 6 is a flowchart of still another eye tracking calibration method according to one embodiment of the present invention.

Fig. 7 is a structural block diagram of an eye tracking calibration device according to one embodiment of the present invention.

Fig. 8 is a structural block diagram of another eye tracking calibration device according to one embodiment of the present invention.

Fig. 9 is a structural block diagram of still another eye tracking calibration device according to one embodiment of the present invention.
具体实施方式Detailed ways
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some rather than all of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative work fall within the protection scope of the present invention.
The terms "first" and "second" in the specification, claims, and accompanying drawings of the present invention are used to distinguish different objects rather than to describe a specific order. In addition, the terms "including" and "having", and any variants thereof, are intended to cover non-exclusive inclusion. For example, a process, method, system, product, or device that includes a series of steps or units is not limited to the listed steps or units, but may include steps or units that are not listed.
The eye tracking calibration method and eye tracking calibration device provided in the embodiments of the present invention are applied to the field of eye tracking. Eye tracking, also called gaze tracking, is a technique for estimating the gaze direction and/or gaze point of the eyes by measuring eye movement.
The eye tracking calibration device in the embodiments of the present invention can be deployed in terminal devices that involve gaze positioning (referred to as gaze positioning devices for short), such as VR systems, AR devices, and eye-controlled tablet computers.
A VR system generally includes multiple devices, for example: a VR all-in-one headset plus a handle/remote control; a PC plus a VR head-mounted display (with VR software installed on the PC to communicate with the headset) plus a handle/remote control/mouse; or a smart mobile terminal plus a head-mounted display (with VR software installed on the smart mobile terminal to communicate with the headset).
The above eye tracking calibration device can be deployed in an eye-controlled tablet computer or the like. To capture the user's eye image, the eye-controlled tablet may also be equipped with an infrared light source and a camera configured to capture eye images (for example, an ambient camera or an infrared camera). An eye image here refers to an image that includes the eyes, such as a frontal or profile portrait, or an image containing only the eye region.
Alternatively, the eye tracking calibration device can be deployed on the aforementioned VR/AR head-mounted display. A gaze tracking device (or eye tracking device) is also installed on the head-mounted display. Besides the eye tracking device described above, which uses an eye-image camera plus an infrared light source, other eye tracking devices can also be used. For example, a MEMS (micro-electromechanical system) device, which may specifically include a MEMS infrared scanning mirror, an infrared light source, and an infrared receiver, can detect eye movement by capturing eye images. As another example, the gaze tracking device may be a capacitive sensor, which detects eye movement through the capacitance between the eyeball and a capacitor plate. As yet another example, the gaze tracking device may be a myoelectric detector, which detects eye movement from the pattern of myoelectric signals by placing electrodes on the bridge of the nose, the forehead, the ears, or the earlobes.
Based on the common aspects of the present invention described above, the embodiments of the present invention are further described in detail below.
Embodiment One
Existing calibration approaches have the following problem: if the user finds gaze positioning inaccurate in a scene applying gaze-positioning interaction, or the relative position of the head-mounted display and the eyes changes (for example, because the position of the head-mounted display is adjusted), the user needs to exit the scene and recalibrate.
To solve this problem, Embodiment One of the present application provides an eye tracking calibration method that completes calibration within the scene applying gaze-positioning interaction, without requiring a separate calibration step.
Referring to Fig. 1, the eye tracking calibration method includes:
S0: Start the eye tracking function.
In one example, the eye tracking function of the gaze positioning device is enabled by default.
Of course, the eye tracking function can also be started according to a user operation.
S1: In an interactive operation, if the interaction mode selected by the user is a non-line-of-sight positioning interaction mode, start the background calibration process.
An interactive operation may refer to the following: the user inputs information and operation commands through the input/output system, the system processes them after receiving them, and the processing result is displayed through the input/output system.
The user can then further input information and operation commands based on the processing result.
Taking a VR scene as an example, gaze point coordinates are generally obtained by default through the gaze-positioning interaction mode (see Fig. 2a).
Interactive effects such as enlarging (see Fig. 2b), changing color, or brightening are then displayed for the functional area to which the gaze point coordinates belong. Functional areas here include, but are not limited to: icon controls, spatial regions, and virtual objects (for example, in a VR game, the user can walk to a designated spatial region or select a virtual object to grab through gaze positioning).
The aforementioned mouse, remote control, handle, and even the PC keyboard can also be used for human-computer interaction. In contrast to the gaze-positioning interaction mode, these can be collectively referred to as non-line-of-sight positioning interaction modes, and the mouse, remote control, handle, keyboard, etc. can be collectively referred to as non-line-of-sight positioning interaction devices.
If the user uses a non-line-of-sight positioning interaction mode in an interactive operation, the background calibration process is entered. This process is hidden from the user, and the user does not perceive it.
S2: In the background calibration process, obtain the user's eye feature information.
In one example, the eye feature information may include pupil position coordinates, pupil shape, iris position, iris shape, eyelid position, eye corner position, position coordinates of corneal reflection glints, and the like.
Specifically, if the eye tracking calibration device is deployed in a mobile terminal device, an eye image can be captured by the camera of the mobile terminal device or the like, and the eye feature information is obtained after the eye image is processed.
If the eye tracking calibration device is deployed on a VR/AR head-mounted display, the gaze tracking device on the VR/AR head-mounted display can capture eye images and process them to obtain the eye feature information.
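As an illustration of how an eye image can be processed into a feature such as the pupil center, the following is a minimal sketch, assuming an infrared eye image in which the pupil appears as the darkest blob; the use of OpenCV and the fixed threshold value are illustrative assumptions, not part of the embodiments.

```python
import cv2

def extract_pupil_center(eye_image_gray):
    """Estimate the pupil center from a grayscale IR eye image.

    Returns (x, y) in pixel coordinates, or None if no pupil-like
    blob is found. The threshold of 40 is an illustrative assumption.
    """
    # Under IR illumination, the pupil is typically the darkest region.
    _, mask = cv2.threshold(eye_image_gray, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    # Take the largest dark blob as the pupil candidate.
    pupil = max(contours, key=cv2.contourArea)
    (x, y), _radius = cv2.minEnclosingCircle(pupil)
    return (x, y)
```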
S3: Obtain the position coordinates obtained by positioning through the non-line-of-sight positioning interaction mode as the calibration point coordinates.
Taking the handle as an example, if the user moves the cursor to a certain position with the handle and clicks to confirm, the position coordinates obtained at that moment are the calibration point coordinates.
Unlike the prior art, the embodiments of the present invention display no fixed calibration points and do not require the user to concentrate on staring at fixed calibration points. Instead, they rely on the fact that when the user operates a non-line-of-sight positioning interaction device such as a handle or a mouse, the eyes follow the hand; therefore, the position coordinates obtained through the non-line-of-sight positioning interaction mode are exactly the point the user is gazing at.
S4: Calculate the user's current personal calibration coefficient based on the obtained eye feature information and the calibration point coordinates.
The personal calibration coefficient is a parameter used by the gaze estimation algorithm to compute the final gaze result; it is related to data such as the user's pupil radius, corneal curvature, and the angle between the visual axis and the optical axis.
The principle for calculating the personal calibration coefficient is as follows:
Based on the obtained eye feature information and the calibration point coordinates, the personal calibration coefficient can be solved inversely using the gaze estimation algorithm. After the current personal calibration coefficient is obtained, it can overwrite the previous personal calibration coefficient or replace the default calibration coefficient, making eye tracking more accurate.
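To make the inverse solve concrete, the following is a minimal sketch assuming a simple affine gaze model gaze = A·f + b, where f is a two-dimensional eye feature vector; the model form and the use of NumPy least squares are assumptions for illustration, not the gaze estimation algorithm of the embodiments.

```python
import numpy as np

def solve_calibration(features, calib_points):
    """Solve personal calibration coefficients for gaze = A @ f + b.

    features:     (N, 2) eye feature vectors collected in step S2.
    calib_points: (N, 2) calibration point coordinates from step S3.
    Returns A (2x2) and b (2,), solved by linear least squares.
    """
    f = np.asarray(features, dtype=float)
    p = np.asarray(calib_points, dtype=float)
    # Homogeneous design matrix [f_x, f_y, 1]; b is absorbed as a column.
    design = np.hstack([f, np.ones((len(f), 1))])
    coef, *_ = np.linalg.lstsq(design, p, rcond=None)  # shape (3, 2)
    return coef[:2].T, coef[2]  # A, b
```

A single non-line-of-sight interaction yields one (feature, calibration point) pair, so in practice such pairs would be accumulated over several interactions until the solve is well determined; until then, the previous or default coefficient would be kept.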
It can be seen that in the embodiments of the present invention, in a scene that uses eye tracking for interaction, if the user selects a non-line-of-sight positioning interaction mode for positioning, the background calibration process is started at the same time, and in the background calibration process the position coordinates obtained through the non-line-of-sight positioning interaction mode are used as calibration point coordinates to calculate the personal calibration coefficient. The background calibration process of the embodiments of the present invention can continuously update the user's personal calibration coefficient, making eye tracking more accurate. In addition, the calibration method is hidden from the user, who does not perceive it and, unlike in the prior art, does not need to exit the current scene and recalibrate, which saves calibration time and improves the user experience.
Embodiment Two
Generally, eye tracking technologies have a default calibration coefficient. The default calibration coefficient is one that yields relatively high accuracy for most users.
Of course, due to individual differences such as the user's eyeball radius, using the default calibration coefficient may still lead to inaccurate positioning, which is exactly why calibration is performed to obtain a personal calibration coefficient.
The target calibration coefficient can be defined to include: the system default calibration coefficient, or the personal calibration coefficient associated with the user's user ID.
At the very beginning, the target calibration coefficient is the system default calibration coefficient; after at least one background calibration, the target calibration coefficient is updated to the latest personal calibration coefficient.
This embodiment introduces an exemplary flow of eye tracking calibration based on the target calibration coefficient in a scene where the gaze-positioning interaction mode is used by default. Referring to Fig. 3, the flow may include:
S31: Obtain the user's eye feature information, and calculate the user's gaze point coordinates based on the obtained eye feature information and the target calibration coefficient.
For the acquisition method, refer to step S2 of Embodiment One.
There are many eye tracking algorithms for calculating gaze point coordinates. For example, an eye tracking algorithm based on pupil position and shape calculates gaze information from the principal axis direction and the position of the pupil.
Another example is a feature vector fitting method, which works as follows:
Extract the center position of the pupil, the center position of the left eye corner, and the center position of the right eye corner.
Subtract the center position of the left eye corner from the center position of the pupil to obtain feature vector A.
Subtract the center position of the right eye corner from the center position of the pupil to obtain feature vector B.
Construct a mapping function (a set of equations) between vector A, vector B, and the gaze point.
Based on given feature vectors A and B and known gaze information, perform polynomial fitting to obtain the unknown coefficients of the mapping function (solving for the unknown coefficients can be completed during calibration).
After the unknown coefficients are obtained, feed the currently extracted feature vectors into the mapping function to obtain the current gaze point (this is the tracking process).
Specifically, the feature data may include: pupil position, pupil shape, iris position, iris shape, eyelid position, eye corner position, glint position, and so on.
Different eye tracking algorithms may have different requirements on the types of feature data. For example, the aforementioned eye tracking algorithm based on pupil position and shape needs to extract feature data including the principal axis direction of the pupil, the pupil position, the pupil major axis length, the pupil minor axis length, and so on.
As another example, the aforementioned feature vector fitting method needs to extract feature data including the center position of the pupil, the center position of the left eye corner, and the center position of the right eye corner.
Since there are many kinds of eye tracking algorithms, they are not enumerated one by one here.
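As an illustration of the feature vector fitting method described above, the following is a minimal sketch assuming a first-order polynomial mapping from the concatenated vectors A and B to screen coordinates; the polynomial order and function names are assumptions for illustration.

```python
import numpy as np

def fit_gaze_mapping(vecs_a, vecs_b, gaze_points):
    """Calibration step: fit gaze = [A, B, 1] @ W by least squares.

    vecs_a, vecs_b: (N, 2) arrays of feature vectors A and B.
    gaze_points:    (N, 2) known gaze coordinates.
    """
    x = np.hstack([vecs_a, vecs_b, np.ones((len(vecs_a), 1))])  # (N, 5)
    w, *_ = np.linalg.lstsq(x, np.asarray(gaze_points, dtype=float),
                            rcond=None)
    return w  # (5, 2): the unknown coefficients of the mapping function

def estimate_gaze(w, vec_a, vec_b):
    """Tracking step: apply the fitted mapping to one sample."""
    x = np.concatenate([vec_a, vec_b, [1.0]])
    return x @ w  # (2,) current gaze point coordinates
```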
It should be noted that if the user is a new user, or the user's user ID is not associated with a personal calibration coefficient, the target calibration coefficient is the default calibration coefficient.
S32: Display the interactive effect of the functional area to which the gaze point coordinates belong.
Specifically, interactive effects such as enlarging (see Fig. 2b), changing color, or brightening can be displayed for the functional area to which the gaze point coordinates belong. The functional area here can be an icon control, a spatial region, a virtual object, and so on.
S31 to S32 constitute one interactive operation.
If the user considers the functional area displaying the interactive effect to be the intended one, the user then confirms entry into that functional area by continuous gazing, pressing a key, or the like.
If the user considers that the functional area displaying the interactive effect is not the intended one (indicating that the accuracy of the gaze point coordinate calculation cannot meet the usage requirements at this time), the user can then use a mouse, remote control, handle, or even the PC keyboard or another non-line-of-sight positioning interaction device to move the focus on the screen and reposition it.
S33: Determine whether the background calibration start condition is met; if so, proceed to S34; otherwise, go to S31.
The background calibration start condition may include: it is detected that a non-line-of-sight positioning interaction device is used, and positioning to another functional area is performed through the non-line-of-sight positioning interaction device.
When the background calibration start condition is met, it is determined that the interaction mode used in this interactive operation is the non-line-of-sight positioning mode. The background calibration process will then be executed, and the latest personal calibration coefficient obtained will be assigned to the target calibration coefficient.
If the background calibration start condition is not met, it indicates that the current gaze positioning accuracy meets the user's needs; the target calibration coefficient does not need to be changed, and the flow returns directly to step S31.
S34: Start the background calibration process, and calculate the user's current personal calibration coefficient based on the obtained eye feature information and the calibration point coordinates.
For details, refer to the aforementioned S2–S4, which are not repeated here.
S35: Update the target calibration coefficient with the calculated personal calibration coefficient, and return to S31.
After positioning with the non-line-of-sight positioning interaction device, the flow returns to the gaze-positioning interaction mode.
In this embodiment, in every interactive operation based on gaze positioning, the default calibration coefficient, or the personal calibration coefficient already associated with the user, is used to calculate the gaze point coordinates, and the functional area to which the gaze point coordinates belong is highlighted.
If the highlighted functional area is the one the user intended, the user can confirm entry into that functional area by continuous gazing, pressing a key, or the like.
If the highlighted functional area is not the one the user intended, the user will reposition using a handle or another non-line-of-sight positioning interaction device, and background calibration is started at the same time. Therefore, besides realizing human-computer interaction, this embodiment can also determine whether background calibration should be started.
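Putting S31–S35 together, the following is a minimal sketch of the interaction loop of this embodiment; the helper names (session_active, get_eye_features, estimate_gaze_point, highlight_region, controller_reposition, solve_calibration) are hypothetical placeholders for the steps described above, not an API of the embodiments.

```python
def interaction_loop(user, default_coef):
    """One gaze-interaction session with hidden background calibration."""
    # Use the associated personal coefficient if one exists, else the default.
    target_coef = user.personal_coef or default_coef
    while session_active():
        features = get_eye_features()                      # S31: capture
        gaze = estimate_gaze_point(features, target_coef)  # S31: estimate
        highlight_region(gaze)                             # S32: feedback
        calib_point = controller_reposition()              # S33: condition
        if calib_point is None:
            continue   # gaze hit the intended region; keep the coefficient
        # S34: background calibration, invisible to the user; the
        # controller-confirmed position serves as the calibration point.
        personal = solve_calibration([features], [calib_point])
        user.personal_coef = personal  # associate with the user ID
        target_coef = personal         # S35: update the target coefficient
```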
Embodiment Three
Embodiment Two mentioned the user's user ID. For convenience, the user ID of the current user can be referred to as the target user ID.
Before the user's gaze point coordinates are calculated using the target calibration coefficient, the following steps may also be included:
if the target user ID has been associated with a personal calibration coefficient, determining the personal calibration coefficient associated with the target user ID as the target calibration coefficient;
if the user's user ID is not associated with a personal calibration coefficient, determining the system default calibration coefficient as the target calibration coefficient.
In addition, after the current personal calibration coefficient is calculated, the following operations may also be performed (see the sketch after this list):
if the target user ID has been associated with a personal calibration coefficient, updating the personal calibration coefficient associated with the target user ID to the current personal calibration coefficient;
if the target user ID is not associated with a personal calibration coefficient, associating the current personal calibration coefficient with the target user ID.
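A minimal sketch of this selection-and-update logic, assuming a simple in-memory dictionary keyed by user ID; the storage choice is an assumption for illustration.

```python
personal_coefs = {}   # user_id -> personal calibration coefficient
DEFAULT_COEF = None   # placeholder for the system default coefficient

def target_coefficient(user_id):
    """Pick the coefficient used for gaze estimation in this operation."""
    return personal_coefs.get(user_id, DEFAULT_COEF)

def commit_calibration(user_id, current_coef):
    """After background calibration: update the association, or create it
    if the target user ID had no personal coefficient yet."""
    personal_coefs[user_id] = current_coef
```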
Embodiment Three describes how eye tracking calibration is performed when the target user ID is initially not associated with a personal calibration coefficient. Referring to Fig. 4, the flow may exemplarily include:
S41: The target user ID is not associated with a personal calibration coefficient; determine the system default calibration coefficient as the target calibration coefficient.
S42–S45 are similar to the aforementioned S31–S34 and are not repeated here.
S46: Associate the calculated current personal calibration coefficient with the target user ID.
S47: Update the target calibration coefficient with the calculated personal calibration coefficient, and return to S41.
After positioning with the non-line-of-sight positioning interaction device, the flow returns to the gaze-positioning interaction mode.
This embodiment achieves the following: when the user is not yet associated with a personal calibration coefficient, the default calibration coefficient is determined as the target calibration coefficient to calculate the gaze point coordinates, and the functional area to which the gaze point coordinates belong is highlighted. If the highlighted functional area is not the one the user intended, the user will reposition using a handle or another non-line-of-sight positioning interaction device, and background calibration is started at the same time.
The personal calibration coefficient obtained by background calibration is associated with the target user ID and is used in the next interactive operation. In this way, the next time positioning is performed in the gaze-positioning mode, the personal calibration coefficient obtained from the previous calculation can be used to calculate the gaze point coordinates. This realizes a transition from using the default calibration coefficient to using a more accurate personal calibration coefficient, and further provides the user with a more personalized and accurate gaze positioning service.
Embodiment Four
Embodiment Four introduces the eye tracking calibration method for the case where the target user ID is initially already associated with a personal calibration coefficient. Referring to Fig. 5, the flow may exemplarily include:
S51: The target user ID has been associated with a personal calibration coefficient; determine the personal calibration coefficient associated with the target user ID as the target calibration coefficient.
S52–S55 are similar to the aforementioned S31–S34 and are not repeated here.
S56: Update the personal calibration coefficient associated with the target user ID to the current personal calibration coefficient.
S57: Update the target calibration coefficient with the calculated personal calibration coefficient, and return to S51.
After positioning with the non-line-of-sight positioning interaction device, the flow returns to the gaze-positioning interaction mode.
This embodiment achieves the following: when the user is already associated with a personal calibration coefficient, the personal calibration coefficient associated with the user is determined as the target calibration coefficient, which can provide the user with a more personalized and accurate gaze positioning service.
In addition, if background calibration is started, the calculated personal calibration coefficient is associated with the target user ID and used in the next interactive operation. This realizes adaptive updating of the personal calibration coefficient and further provides the user with a more personalized and accurate gaze positioning service.
Embodiment Five
The users of a gaze positioning device can be assumed by default to be the same person. Alternatively, considering that multiple people may use the same device, different users can also be distinguished through technologies such as biometric recognition.
In a specific implementation, a user ID can be bound to a biometric feature. Different biometric features correspond to different user IDs, and different user IDs represent different users, so that different users can be distinguished.
Referring to Fig. 6, the biometric-based eye tracking calibration method exemplarily includes the following flow:
S60: Obtain the biometric feature of the current user.
Exemplary biometric features here may include at least one of: iris, fingerprint, voiceprint, and even facial features.
S61: Match the obtained biometric feature against the biometric features of established user IDs.
S62: If the matching succeeds, determine the successfully matched user ID as the user ID of the current user (the target user ID).
For example, if the successfully matched user ID is 000010, then "000010" is determined as the user ID of the current user.
S63: If the matching fails, establish a user ID (the target user ID) for the user, and bind the established user ID to the obtained biometric feature.
The user ID successfully matched in step S62, or the user ID established in step S63, is the target user ID in the foregoing embodiments.
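A minimal sketch of S60–S63, assuming biometric templates are numeric vectors compared by cosine similarity against a fixed threshold; the template format, similarity measure, and threshold value are assumptions for illustration.

```python
import uuid
import numpy as np

users = {}             # user_id -> enrolled biometric template
MATCH_THRESHOLD = 0.8  # illustrative; real matchers tune this per modality

def match_score(a, b):
    """Cosine similarity of template vectors; stands in for a real matcher."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def resolve_user(biometric):
    """Return the target user ID, creating one when no match is found."""
    for user_id, template in users.items():
        if match_score(biometric, template) >= MATCH_THRESHOLD:
            return user_id                 # S62: existing user matched
    user_id = str(uuid.uuid4())            # S63: establish a new user ID
    users[user_id] = biometric             # bind the ID to the biometric
    return user_id
```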
S64: Determine whether the target user ID has been associated with a personal calibration coefficient; if so, proceed to S65; otherwise, proceed to S66.
S65: Determine the personal calibration coefficient associated with the target user ID as the target calibration coefficient.
S66: Determine the system default calibration coefficient as the target calibration coefficient.
S67: Calculate the user's gaze point coordinates using the target calibration coefficient.
Specifically, the user's eye feature information is obtained first, and the user's gaze point coordinates are calculated based on the obtained eye feature information and the target calibration coefficient.
For details, refer to the foregoing description; they are not repeated here.
S68–S610 are similar to the aforementioned S32–S34 and are not repeated here.
S611: If the target user ID has been associated with a personal calibration coefficient, update the personal calibration coefficient associated with the target user ID to the current personal calibration coefficient.
S612: If the target user ID is not associated with a personal calibration coefficient, associate the current personal calibration coefficient with the target user ID.
S613: Update the target calibration coefficient with the calculated personal calibration coefficient, and return to S67.
In this embodiment, after the user's biometric feature (for example, the iris) is extracted, it is matched against the biometric features of established user IDs to identify whether the user is new. If it matches an established user ID, the successfully matched user ID is determined as the user ID of the current user. In this interactive operation, the gaze point coordinates are calculated using the personal calibration coefficient of the successfully matched user ID.
If the user is new, a user ID is established for the user, and in the first interactive operation the system default calibration coefficient is used to calculate the gaze point coordinates.
Distinguishing users by biometric features helps provide personalized and more accurate gaze positioning services for different users. Consider the following scenario:
Suppose a mobile device is used by multiple family members. Member A interacts first, and a personal calibration coefficient is calculated. Later, member B uses the same device. Without distinguishing users, member A's calculated personal calibration coefficient would be used directly, and positioning may be inaccurate. If background calibration is triggered, the personal calibration coefficient obtained actually belongs to member B.
Later, if member A uses the device again, then without distinguishing users, member B's personal calibration coefficient would in turn be used to calculate the gaze point coordinates, and accurate positioning still could not be achieved.
If, instead, different users are distinguished by biometrics, the personal calibration coefficients of different users are no longer mixed together, and personalized gaze positioning services can be provided for different users.
Embodiment Six
This embodiment provides an eye tracking calibration device. Referring to Fig. 7, the device includes:
a background calibration starting unit 1, configured to, in an interactive operation, start the background calibration process of a calibration unit 3 if the interaction mode selected by the user is a non-line-of-sight positioning interaction mode;
an acquisition unit 2, configured to obtain the user's eye feature information in the background calibration process; and
the calibration unit 3, configured to, in the background calibration process, obtain the position coordinates obtained by positioning through the non-line-of-sight positioning interaction mode as the calibration point coordinates, and calculate the user's current personal calibration coefficient based on the obtained eye feature information and the calibration point coordinates.
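A minimal sketch of how these three units could be composed, with hypothetical class and method names mirroring the unit names of Fig. 7; this is an illustrative structure, not the claimed device.

```python
class EyeTrackingCalibrationDevice:
    """Composes unit 1 (start trigger), unit 2 (acquisition), unit 3 (calibration)."""

    def __init__(self, acquisition_unit, calibration_unit):
        self.acquisition_unit = acquisition_unit
        self.calibration_unit = calibration_unit

    def on_interaction(self, used_non_gaze_device, confirmed_position):
        # Background calibration starting unit 1: trigger only when a
        # non-line-of-sight positioning interaction mode was used.
        if not used_non_gaze_device:
            return None
        features = self.acquisition_unit.eye_features()  # unit 2
        # Unit 3: confirmed position serves as the calibration point.
        return self.calibration_unit.solve(features, confirmed_position)
```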
The eye tracking calibration device in the embodiments of the present invention can be deployed in terminal devices involving gaze positioning (gaze positioning devices for short), such as VR systems, AR devices, and eye-controlled tablet computers, to perform the eye tracking calibration method.
For details and beneficial effects, refer to the foregoing description; they are not repeated here.
In other embodiments, referring to Fig. 8, the eye tracking calibration device in all the above embodiments may further include:
a calculation unit 4, configured to obtain the user's eye feature information before the background calibration process is started, and calculate the user's gaze point coordinates based on the obtained eye feature information and the target calibration coefficient,
wherein the target calibration coefficient includes: the system default calibration coefficient, or the personal calibration coefficient associated with the user's user ID;
a display unit 5, configured to display the interactive effect of the functional area to which the gaze point coordinates belong; and
a monitoring unit 6, configured to determine that the interaction mode used in this interactive operation is the non-line-of-sight positioning mode if it is detected that a non-line-of-sight positioning interaction device is used and positioning to another functional area is performed through the non-line-of-sight positioning interaction device.
For details and beneficial effects, refer to the foregoing description; they are not repeated here.
The user ID of the above user can be referred to as the target user ID.
In other embodiments of the present invention, after the current personal calibration coefficient is calculated, the calibration unit 3 in all the above embodiments may be further configured to: if the target user ID has been associated with a personal calibration coefficient, update the personal calibration coefficient associated with the target user ID with the current personal calibration coefficient; and if the target user ID is not associated with a personal calibration coefficient, associate the current personal calibration coefficient with the target user ID.
For details and beneficial effects, refer to the foregoing description; they are not repeated here.
In other embodiments of the present invention, before the user's gaze point coordinates are calculated using the target calibration coefficient, the calculation unit 4 in all the above embodiments may be further configured to: if the target user ID has been associated with a personal calibration coefficient, determine the personal calibration coefficient associated with the target user ID as the target calibration coefficient; and if the user's user ID is not associated with a personal calibration coefficient, determine the system default calibration coefficient as the target calibration coefficient.
For details and beneficial effects, refer to the foregoing description; they are not repeated here.
A user ID can be bound to a biometric feature. In other embodiments of the present invention, referring to Fig. 9, the above device may further include:
a biometric identification unit 7, configured to: before the user's gaze point coordinates are calculated using the target calibration coefficient, obtain the user's biometric feature; match the obtained biometric feature against the biometric features of established user IDs; if the matching succeeds, determine the successfully matched user ID as the user ID of the user; and if the matching fails, establish a user ID for the user and bind the established user ID to the obtained biometric feature.
For details and beneficial effects, refer to the foregoing description; they are not repeated here.
Embodiment Seven
An embodiment of the present invention provides a storage medium storing computer-executable instructions. When the computer-executable instructions are loaded and executed by a processor, the steps of the eye tracking calibration method described in any one of Embodiments One to Six are implemented.
Embodiment Eight
An embodiment of the present invention provides a processor configured to run a program, wherein the steps of the eye tracking calibration method described in any one of Embodiments One to Six are executed when the program runs.
Embodiment Nine
An embodiment of the present invention provides an eye tracking calibration device. The device includes a processor, a memory, and a program stored on the memory and runnable on the processor. When the processor executes the program, the steps of the eye tracking calibration method described in any one of Embodiments One to Six can be executed.
Those skilled in the art should understand that the embodiments of the present application can be provided as a method, a system, or a computer program product. Therefore, the present application may take the form of a complete hardware embodiment, a complete software embodiment, or an embodiment combining software and hardware. Moreover, the present application may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, and optical storage) containing computer-usable program code.
The present application is described with reference to flowcharts and/or block diagrams of the method, device (system), and computer program product according to the embodiments of the present application. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions can be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor, or other programmable data processing equipment to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing equipment produce a device for implementing the functions specified in one or more flows of a flowchart and/or one or more blocks of a block diagram.
These computer program instructions can also be stored in a computer-readable memory that can direct a computer or other programmable data processing equipment to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device, and the instruction device implements the functions specified in one or more flows of a flowchart and/or one or more blocks of a block diagram.
These computer program instructions can also be loaded onto a computer or other programmable data processing equipment, so that a series of operation steps are executed on the computer or other programmable equipment to produce computer-implemented processing; the instructions executed on the computer or other programmable equipment thus provide steps for implementing the functions specified in one or more flows of a flowchart and/or one or more blocks of a block diagram.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include non-persistent memory, random access memory (RAM), and/or non-volatile memory in computer-readable media, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include persistent and non-persistent, removable and non-removable media, and can store information by any method or technology. The information can be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium, and can be used to store information that can be accessed by a computing device. As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
It should also be noted that the terms "include", "comprise", or any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, commodity, or device that includes a series of elements includes not only those elements but also other elements that are not explicitly listed, or elements inherent to such a process, method, commodity, or device. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, commodity, or device that includes the element.
Those skilled in the art should understand that the embodiments of the present application can be provided as a method, a system, or a computer program product. Therefore, the present application may take the form of a complete hardware embodiment, a complete software embodiment, or an embodiment combining software and hardware. Moreover, the present application may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, and optical storage) containing computer-usable program code.
The above are only embodiments of the present application and are not intended to limit the present application. For those skilled in the art, the present application may have various modifications and changes. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present application shall be included within the scope of the claims of the present application.

Claims (10)

1. An eye tracking calibration method, comprising:
    in an interactive operation, if an interaction mode selected by a user is a non-line-of-sight positioning interaction mode, starting a background calibration process;
    in the background calibration process, obtaining eye feature information of the user;
    obtaining position coordinates obtained by positioning through the non-line-of-sight positioning interaction mode as calibration point coordinates; and
    calculating a current personal calibration coefficient of the user based on the obtained eye feature information and the calibration point coordinates.
2. The method according to claim 1, wherein before starting the background calibration process, the method further comprises:
    obtaining eye feature information of the user;
    calculating gaze point coordinates of the user based on the obtained eye feature information and a target calibration coefficient, wherein the target calibration coefficient comprises: a system default calibration coefficient, or a personal calibration coefficient associated with a user ID of the user;
    displaying an interactive effect of a functional area to which the gaze point coordinates belong; and
    if it is detected that a non-line-of-sight positioning interaction device is used and positioning to another functional area is performed through the non-line-of-sight positioning interaction device, determining that the interaction mode used in the current interactive operation is the non-line-of-sight positioning mode.
3. The method according to claim 2, wherein the user ID of the user is a target user ID, and before the gaze point coordinates of the user are calculated using the target calibration coefficient, the method further comprises:
    if the target user ID has been associated with a personal calibration coefficient, determining the personal calibration coefficient associated with the target user ID as the target calibration coefficient; and
    if the target user ID is not associated with a personal calibration coefficient, determining the system default calibration coefficient as the target calibration coefficient.
4. The method according to claim 3, wherein after the current personal calibration coefficient is calculated, the method further comprises:
    if the target user ID has been associated with a personal calibration coefficient, updating the personal calibration coefficient associated with the target user ID to the current personal calibration coefficient; and
    if the target user ID is not associated with a personal calibration coefficient, associating the current personal calibration coefficient with the target user ID.
5. The method according to any one of claims 2 to 4, wherein a user ID is bound to a biometric feature, and before the gaze point coordinates of the user are calculated using the target calibration coefficient, the method further comprises:
    obtaining a biometric feature of the user;
    matching the obtained biometric feature against biometric features of established user IDs;
    if the matching succeeds, determining the successfully matched user ID as the user ID of the user; and
    if the matching fails, establishing a user ID for the user, and binding the established user ID to the obtained biometric feature.
6. An eye tracking calibration device, comprising:
    a background calibration starting unit, configured to, in an interactive operation, start a background calibration process if an interaction mode selected by a user is a non-line-of-sight positioning interaction mode;
    an acquisition unit, configured to obtain eye feature information of the user in the background calibration process; and
    a calibration unit, configured to obtain position coordinates obtained by positioning through the non-line-of-sight positioning interaction mode as calibration point coordinates, and calculate a current personal calibration coefficient of the user based on the obtained eye feature information and the calibration point coordinates.
7. The device according to claim 6, further comprising:
    a calculation unit, configured to obtain eye feature information of the user before the background calibration process is started, and calculate gaze point coordinates of the user based on the obtained eye feature information and a target calibration coefficient, wherein the target calibration coefficient comprises: a system default calibration coefficient, or a personal calibration coefficient associated with a user ID of the user;
    a display unit, configured to display an interactive effect of a functional area to which the gaze point coordinates belong; and
    a monitoring unit, configured to determine that the interaction mode used in the current interactive operation is the non-line-of-sight positioning mode if it is detected that a non-line-of-sight positioning interaction device is used and positioning to another functional area is performed through the non-line-of-sight positioning interaction device.
8. The device according to claim 7, wherein the user ID of the user is a target user ID, and the calculation unit is further configured to: before the gaze point coordinates of the user are calculated using the target calibration coefficient, determine the personal calibration coefficient associated with the target user ID as the target calibration coefficient if the target user ID has been associated with a personal calibration coefficient; and determine the system default calibration coefficient as the target calibration coefficient if the user ID of the user is not associated with a personal calibration coefficient.
9. The device according to claim 8, wherein the calibration unit is further configured to: after the current personal calibration coefficient is calculated, update the personal calibration coefficient associated with the target user ID to the current personal calibration coefficient if the target user ID has been associated with a personal calibration coefficient; and associate the current personal calibration coefficient with the target user ID if the target user ID is not associated with a personal calibration coefficient.
10. The device according to any one of claims 7 to 9, wherein a user ID is bound to a biometric feature, and the device further comprises: a biometric identification unit, configured to: before the gaze point coordinates of the user are calculated using the target calibration coefficient, obtain a biometric feature of the user; match the obtained biometric feature against biometric features of established user IDs; if the matching succeeds, determine the successfully matched user ID as the user ID of the user; and if the matching fails, establish a user ID for the user and bind the established user ID to the obtained biometric feature.
PCT/CN2021/079596 2020-03-18 2021-03-08 Method and device for eye tracking calibration WO2021185110A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2022555765A JP2023517380A (en) 2020-03-18 2021-03-08 Eye tracking calibration method and device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010191024.5 2020-03-18
CN202010191024.5A CN113495613B (en) 2020-03-18 2020-03-18 Eyeball tracking calibration method and device

Publications (1)

Publication Number Publication Date
WO2021185110A1 true WO2021185110A1 (en) 2021-09-23

Family

ID=77769930

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/079596 WO2021185110A1 (en) 2020-03-18 2021-03-08 Method and device for eye tracking calibration

Country Status (3)

Country Link
JP (1) JP2023517380A (en)
CN (1) CN113495613B (en)
WO (1) WO2021185110A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116700500A (en) * 2023-08-07 2023-09-05 江西科技学院 Multi-scene VR interaction method, system and storage medium

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114047822B (en) * 2021-11-24 2023-12-19 京东方科技集团股份有限公司 Near-to-eye display method and system
CN116311396B (en) * 2022-08-18 2023-12-12 荣耀终端有限公司 Method and device for fingerprint identification

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150269729A1 (en) * 2014-03-20 2015-09-24 Lc Technologies, Inc. Eye Image Stimuli for Eyegaze Calibration Procedures
CN105843397A (en) * 2016-04-12 2016-08-10 公安部上海消防研究所 Virtual reality interactive system based on pupil tracking technology
CN109976535A (en) * 2019-05-05 2019-07-05 北京七鑫易维信息技术有限公司 A kind of calibration method, device, equipment and storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150302585A1 (en) * 2014-04-22 2015-10-22 Lenovo (Singapore) Pte. Ltd. Automatic gaze calibration
CN106406509B (en) * 2016-05-16 2023-08-01 上海青研科技有限公司 Head-mounted eye-control virtual reality equipment
CN109410285B (en) * 2018-11-06 2021-06-08 北京七鑫易维信息技术有限公司 Calibration method, calibration device, terminal equipment and storage medium
CN110427101A (en) * 2019-07-08 2019-11-08 北京七鑫易维信息技术有限公司 Calibration method, device, equipment and the storage medium of eyeball tracking

Also Published As

Publication number Publication date
CN113495613B (en) 2023-11-21
JP2023517380A (en) 2023-04-25
CN113495613A (en) 2021-10-12

Similar Documents

Publication Publication Date Title
WO2021185110A1 (en) Method and device for eye tracking calibration
EP3781896B1 (en) System for locating and identifying an object in unconstrained environments
JP7191714B2 (en) Systems and methods for direct pointing detection for interaction with digital devices
US11436625B2 (en) Head mounted display system configured to exchange biometric information
US9953214B2 (en) Real time eye tracking for human computer interaction
US20160029883A1 (en) Eye tracking calibration
Harezlak et al. Towards accurate eye tracker calibration–methods and procedures
KR102092931B1 (en) Method for eye-tracking and user terminal for executing the same
CN108681399B (en) Equipment control method, device, control equipment and storage medium
WO2016111880A1 (en) Gaze detection offset for gaze tracking models
US11047691B2 (en) Simultaneous localization and mapping (SLAM) compensation for gesture recognition in virtual, augmented, and mixed reality (xR) applications
AU2017430168A1 (en) Detailed eye shape model for robust biometric applications
Sun et al. Real-time gaze estimation with online calibration
US10488918B2 (en) Analysis of user interface interactions within a virtual reality environment
JP6898234B2 (en) Reflection-based control activation
US20180004287A1 (en) Method for providing user interface through head mounted display using eye recognition and bio-signal, apparatus using same, and computer readable recording medium
US10824223B2 (en) Determination apparatus and determination method
Pires et al. Unwrapping the eye for visible-spectrum gaze tracking on wearable devices
KR20200107520A (en) Smart mirror system and method for controlling smart mirror
RU2818028C1 (en) Method and device for calibration in oculography
CN113491502A (en) Eyeball tracking calibration inspection method, device, equipment and storage medium
WO2022022220A1 (en) Morphing method and morphing apparatus for facial image
Modi et al. A novel framework for camera-based eye gaze estimation
Zhu et al. A Framework for Gaze-Contingent Interfaces
CN114661158A (en) Processing method and electronic equipment

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21772008

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022555765

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21772008

Country of ref document: EP

Kind code of ref document: A1