CN112578901A - Eyeball tracking calibration method and related equipment - Google Patents


Info

Publication number
CN112578901A
CN112578901A
Authority
CN
China
Prior art keywords
calibration
groups
determining
function
fixation point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910945827.2A
Other languages
Chinese (zh)
Inventor
韩世广
陈岩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201910945827.2A priority Critical patent/CN112578901A/en
Publication of CN112578901A publication Critical patent/CN112578901A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/19Sensors therefor


Abstract

The application discloses an eyeball tracking calibration method and related devices, applied to an electronic device. The method comprises the following steps: collecting M groups of fitting data while a first game is being played, and determining a fixation point function from the M groups of fitting data, where M is an integer greater than 0; collecting N groups of test data during the first game, where N is an integer greater than 0; and, when the N groups of test data show that the fixation point function meets the calibration requirement, adopting the fixation point function as the function for determining the eyeball fixation point. With the method and device, more data can be collected to calibrate the fixation point function without degrading the user experience, so that a more accurate fixation point function is obtained.

Description

Eyeball tracking calibration method and related equipment
Technical Field
The present application relates to the field of electronic technologies, and in particular, to an eyeball tracking calibration method and related devices.
Background
Eyeball tracking is a machine vision technology in which a device captures images of a user's eyes, analyzes them with an algorithm, and finally obtains the position at which the user is gazing. When a user uses eyeball-tracking-related equipment for the first time, the user's fixation point needs to be calibrated to ensure accurate fixation point positioning during use. To improve the user experience, manufacturers aim to use as little calibration data as possible, which leads to under-fitting of the fixation point function and thus reduces the accuracy of fixation point positioning.
Disclosure of Invention
The embodiment of the application provides an eyeball tracking calibration method and related devices, which are used for collecting more data to calibrate the fixation point function without degrading the user experience, so as to obtain a more accurate fixation point function.
In a first aspect, an embodiment of the present application provides an eyeball tracking calibration method applied to an electronic device, where the method includes:
acquiring M groups of fitting data in the process of playing a first game, and determining a fixation point function according to the M groups of fitting data, wherein M is an integer greater than 0;
acquiring N groups of test data in the process of the first game, wherein N is an integer greater than 0;
and when the fixation point function is determined to meet the calibration requirement based on the N groups of test data, defining the fixation point function as the fixation point function for determining the eyeball fixation point.
In a second aspect, an embodiment of the present application provides an eyeball tracking calibration apparatus, which is applied to an electronic device, the apparatus includes:
the acquisition unit is used for acquiring M groups of fitting data in the process of the first game;
a determining unit, configured to determine a point of regard function according to the M sets of fitting data, where M is an integer greater than 0;
the acquisition unit is further used for acquiring N groups of test data in the process of the first game, wherein N is an integer greater than 0;
and the defining unit is used for defining the fixation point function as the fixation point function for determining the eyeball fixation point when the fixation point function is determined to meet the calibration requirement based on the N groups of test data.
In a third aspect, embodiments of the present application provide an electronic device, which includes a processor, a memory, a communication interface, and one or more programs, stored in the memory and configured to be executed by the processor, the programs including instructions for performing some or all of the steps described in the method according to the first aspect of the embodiments of the present application.
In a fourth aspect, the present application provides a computer-readable storage medium, where the computer-readable storage medium is used to store a computer program, where the computer program is executed by a processor to implement part or all of the steps described in the method according to the first aspect of the present application.
In a fifth aspect, the present application provides a computer program product, where the computer program product includes a non-transitory computer-readable storage medium storing a computer program, where the computer program is operable to cause a computer to perform some or all of the steps described in the method according to the first aspect of the present application. The computer program product may be a software installation package.
It can be seen that, in the embodiment of the application, M groups of fitting data are collected while the first game is being played and are used to determine the fixation point function; N groups of test data are then collected to test whether the fixation point function meets the calibration requirement, and if it does, the fixation point function is adopted as the function for determining the eyeball fixation point. Because both the fitting data and the test data are collected during the game, the collection process is not tedious for the user, so more data can be collected to determine and calibrate the fixation point function, and a more accurate fixation point function is obtained without harming, and possibly even improving, the user experience.
These and other aspects of the present application will be more readily apparent from the following description of the embodiments.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a schematic structural diagram of hardware of an electronic device according to an embodiment of the present disclosure;
fig. 2 is a software architecture diagram of an eyeball tracking calibration method provided in the embodiment of the present application;
fig. 3 is a schematic flowchart of an eyeball tracking calibration method according to an embodiment of the present application;
FIG. 4 is a diagram illustrating human-computer interaction of an eye tracking calibration method according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of a first calibration vector in an eyeball tracking calibration method according to an embodiment of the present application;
fig. 6 is a schematic view of a scene of an eyeball tracking calibration method according to an embodiment of the present application;
fig. 7 is a schematic flowchart of an eyeball tracking calibration method according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of an eyeball tracking calibration apparatus according to an embodiment of the application.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only partial embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The following are detailed below.
The terms "first," "second," "third," and "fourth," etc. in the description and claims of this application and in the accompanying drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
Hereinafter, some terms in the present application are explained to facilitate understanding by those skilled in the art.
Electronic devices may include a variety of handheld devices, vehicle-mounted devices, wearable devices (e.g., smartwatches, smartbands, pedometers, etc.), computing devices or other processing devices communicatively connected to wireless modems, as well as various forms of User Equipment (UE), Mobile Stations (MS), terminal Equipment (terminal device), and so forth having wireless communication capabilities. For convenience of description, the above-mentioned devices are collectively referred to as electronic devices.
As shown in fig. 1, fig. 1 is a schematic structural diagram of hardware of an electronic device according to an embodiment of the present disclosure. The electronic device includes a processor, a memory, a signal processor, a transceiver, a display screen, a speaker, a microphone, a Random Access Memory (RAM), a camera, a sensor, and an infrared light (IR), among others. The memory, the signal processor, the display screen, the speaker, the microphone, the RAM, the camera, the sensor and the IR are connected with the processor, and the transceiver is connected with the signal processor.
The display screen may be a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, an Active-Matrix Organic Light-Emitting Diode (AMOLED) display, or the like.
The camera may be a common camera or an infrared camera, and is not limited herein. The camera may be a front camera or a rear camera, and is not limited herein.
Wherein the sensor comprises at least one of: light-sensitive sensors, gyroscopes, infrared proximity sensors, fingerprint sensors, pressure sensors, etc. Among them, the light sensor, also called an ambient light sensor, is used to detect the ambient light brightness. The light sensor may include a light sensitive element and an analog to digital converter. The photosensitive element is used for converting collected optical signals into electric signals, and the analog-to-digital converter is used for converting the electric signals into digital signals. Optionally, the light sensor may further include a signal amplifier, and the signal amplifier may amplify the electrical signal converted by the photosensitive element and output the amplified electrical signal to the analog-to-digital converter. The photosensitive element may include at least one of a photodiode, a phototransistor, a photoresistor, and a silicon photocell.
The processor is the control center of the electronic device. It connects the various parts of the device through interfaces and lines, and performs the functions of the device and processes its data by running or executing the software programs and/or modules stored in the memory and calling the data stored in the memory, thereby monitoring the device as a whole.
The processor may integrate an application processor and a modem processor, wherein the application processor mainly handles operating systems, user interfaces, application programs, and the like, and the modem processor mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor.
The memory is used for storing software programs and/or modules, and the processor executes various functional applications and data processing of the electronic equipment by operating the software programs and/or modules stored in the memory. The memory mainly comprises a program storage area and a data storage area, wherein the program storage area can store an operating system, a software program required by at least one function and the like; the storage data area may store data created according to use of the electronic device, and the like. Further, the memory may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
Wherein, the IR is used for irradiating the human eye to generate a bright spot (glint) on the human eye, and the camera is used for shooting the human eye to obtain an image comprising the bright spot and a pupil (pupil).
As shown in fig. 2, fig. 2 is a software architecture diagram of an eyeball tracking calibration method according to an embodiment of the present application. The architecture includes four layers. The first layer includes applications such as electronic books, browsers, launchers, systems, unlocking, mobile payment, point-of-interest tracking, and the like. The second layer is the eyeball tracking service (OEyeTrackerService), which includes eyeball tracking authorization (OEyeTrackerAuthentication), an eyeball tracking strategy (OEyeTrackerStrategy), an eyeball tracking algorithm (OEyeTrackerAlgo), eyeball tracking parameters (OEyeTrackerParams), and the like; the OEyeTrackerService is connected with the applications of the first layer through an eyeball tracking SDK (OEyeTrackerSDK) interface. The second layer further includes a camera NDK interface (CameraNDKInterface) and a camera service (CameraService); the CameraNDKInterface is connected with the OEyeTrackerService, and the CameraService is connected with the CameraNDKInterface. The third layer includes a Google HAL Interface, a Qualcomm HAL Interface, CamX, Chi-cdk, and the like; the Google HAL Interface is connected with the CameraService of the second layer, the Qualcomm HAL Interface is connected with the Google HAL Interface, and CamX is connected with the Qualcomm HAL Interface and with Chi-cdk. The fourth layer includes an RGB sensor, a Digital Signal Processor (DSP), an infrared sensor (IR sensor), a laser, a Light-Emitting Diode (LED), and the like; the IR sensor is connected with the CamX of the third layer. The connections between OEyeTrackerService and OEyeTrackerSDK, between CameraService and CameraNDKInterface, and between the Google HAL Interface and CameraService all go through the Binder architecture.
The OEyeTrackerSDK is responsible for providing gaze point acquisition and input APIs to ordinary applications, in the form of a jar/aar package. OEyeTrackerService is responsible for managing the gaze point algorithm, gaze point post-processing, input processing, authentication, and parameter setting. OEyeTrackerAlgo is the core eyeball tracking algorithm, including the algorithm for determining the fixation point function in this application. OEyeTrackerStrategy handles algorithmic post-processing such as filtering, gaze point jumping, gaze point shift monitoring, and gaze point input. The OEyeTrackerAuthentication callback module is responsible for authenticating whether a requester is allowed. OEyeTrackerParams is responsible for parsing configuration and hot-updating configuration.
As shown in fig. 3, fig. 3 is a schematic flowchart of an eyeball tracking calibration method provided in the embodiment of the present application, and is applied to the electronic device shown in fig. 1 and fig. 2, where the method includes:
step 301: during the process of the first game, M groups of fitting data are collected, and a point-of-regard function is determined according to the M groups of fitting data, wherein M is an integer larger than 0.
The first game may be a game of catching a target, such as "whack-a-mole", "crab catching", "catching the mouse with your eyes", or "catching the thieving rabbit".
In an implementation manner of the present application, the electronic device includes a camera and an infrared lamp, and the acquiring of M groups of fitting data includes:
illuminating a human eye through the infrared lamp, wherein the infrared lamp is used for generating bright spots on the human eye;
shooting human eyes including bright spots through the camera to obtain M first images;
and analyzing the M first images to obtain M groups of fitting data, wherein the M groups of fitting data correspond to the M first images one by one, and each group of fitting data comprises a first bright spot coordinate and a first pupil coordinate.
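The patent states only that each group of fitting data contains a first bright spot coordinate and a first pupil coordinate; it does not disclose the image analysis algorithm. As an illustrative sketch under that assumption, one group of fitting data could be obtained by taking the centroid of the brightest pixels of the first image as the glint and the centroid of the darkest pixels as the pupil:

```python
import numpy as np

def extract_fitting_data(eye_image: np.ndarray):
    """Derive one group of fitting data (glint, pupil) from a first image.

    Centroid-of-extreme-pixels is an illustrative assumption, not the
    claimed algorithm: the IR glint is near-saturated (brightest), and
    the pupil is the darkest region of the eye image.
    """
    # Glint: centroid of the brightest pixels.
    ys, xs = np.nonzero(eye_image >= eye_image.max())
    glint = (xs.mean(), ys.mean())

    # Pupil: centroid of the darkest pixels.
    ys, xs = np.nonzero(eye_image <= eye_image.min())
    pupil = (xs.mean(), ys.mean())
    return glint, pupil
```

A production implementation would instead fit circles or ellipses to the glint and pupil contours; the sketch above only illustrates what one group of fitting data contains.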
As shown in fig. 4, fig. 4 is a human-computer interaction diagram of an eyeball tracking calibration method provided in the embodiment of the application. The hardware in fig. 4 is shown in fig. 1, and the software in fig. 4 is shown in fig. 2. The user first starts the first game, and the first game displays a first interface, i.e., performs the eyeball tracking calibration activity. The activity triggers OEyeTrackerService, which starts OEyeTrackerAlgo; OEyeTrackerAlgo starts the CameraService and the camera HAL, then drives the infrared light and the camera through the corresponding hardware driver (IR & Camera Driver), so that the infrared illumination produces a bright spot on the user's eye, and calls a camera to photograph the eye. The camera uploads each captured first image to OEyeTrackerAlgo for processing to obtain the first group of fitting data; OEyeTrackerAlgo then notifies the calibration activity to display a second interface, and the process is repeated to obtain the second group of fitting data, until M groups of fitting data have been collected.
In an implementation manner of the present application, the determining a point-of-regard function through the M sets of fitting data includes:
determining M first calibration vectors from the M groups of fitting data, wherein the M first calibration vectors correspond to the M groups of fitting data one to one, and each first calibration vector is determined based on the first bright spot coordinate and the first pupil coordinate;
determining fitting parameters based on the M first calibration vectors;
a point of regard function is determined based on the fitting parameters.
Specifically, each of the first calibration vectors is determined based on the first bright spot coordinate and the first pupil coordinate as follows: if the first bright spot coordinate is (x1, y1) and the first pupil coordinate is (x2, y2), then the first calibration vector is V(Vx, Vy), where Vx = x1 - x2 and Vy = y1 - y2.
As shown in fig. 5, fig. 5 is a schematic structural diagram of a first calibration vector in an eyeball tracking calibration method according to an embodiment of the present application. Fig. 5 shows the target image obtained by processing the first image. Assuming the first bright spot coordinate is (x1, y1) and the pupil coordinate is (x2, y2), the calibration vector is V(x1 - x2, y1 - y2).
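The calibration vector arithmetic is a direct subtraction of the pupil coordinate from the bright spot coordinate, and can be sketched as follows (the coordinates in the check are illustrative):

```python
def calibration_vector(bright_spot, pupil):
    """First calibration vector V(Vx, Vy) with Vx = x1 - x2, Vy = y1 - y2,
    where (x1, y1) is the bright spot and (x2, y2) is the pupil."""
    (x1, y1), (x2, y2) = bright_spot, pupil
    return (x1 - x2, y1 - y2)
```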
Specifically, the determining fitting parameters based on the M first calibration vectors and determining a fixation point function based on the fitting parameters includes:
setting the fitting function to
Xgaze = a0 + a1*Vx + a2*Vy + a3*Vx*Vx and Ygaze = b0 + b1*Vx + b2*Vy + b3*Vy*Vy,
where Xgaze and Ygaze are respectively the abscissa and the ordinate of the fixation point, Vx and Vy are respectively the abscissa and the ordinate of the first calibration vector, and a0, a1, a2, a3, b0, b1, b2 and b3 are the fitting parameters;
determining M equations based on the M first calibration vectors and the fitting function;
solving the fitting parameters based on the M equations;
and substituting the fitting parameters into the fitting function to obtain the fixation point function.
Wherein M is an integer greater than or equal to 7.
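The fitting step above can be sketched as an ordinary least-squares solve. Note the quadratic Vx term in the X equation is a reconstruction (the original text is garbled at that point; the form below mirrors the legible Vy*Vy term of the Y equation), so the polynomial basis here is an assumption:

```python
import numpy as np

def fit_gaze_function(vectors, gaze_points):
    """Solve fitting parameters a0..a3, b0..b3 by least squares, assuming
    Xgaze = a0 + a1*Vx + a2*Vy + a3*Vx*Vx and
    Ygaze = b0 + b1*Vx + b2*Vy + b3*Vy*Vy.
    `vectors` are the M first calibration vectors; `gaze_points` are the
    M known calibration-target (fixation point) coordinates."""
    V = np.asarray(vectors, dtype=float)      # shape (M, 2)
    G = np.asarray(gaze_points, dtype=float)  # shape (M, 2)
    ones = np.ones(len(V))
    Ax = np.column_stack([ones, V[:, 0], V[:, 1], V[:, 0] ** 2])
    Ay = np.column_stack([ones, V[:, 0], V[:, 1], V[:, 1] ** 2])
    a = np.linalg.lstsq(Ax, G[:, 0], rcond=None)[0]
    b = np.linalg.lstsq(Ay, G[:, 1], rcond=None)[0]
    return a, b

def predict_gaze(a, b, vector):
    """Evaluate the fitted fixation point function on one calibration vector."""
    vx, vy = vector
    return (a[0] + a[1] * vx + a[2] * vy + a[3] * vx * vx,
            b[0] + b[1] * vx + b[2] * vy + b[3] * vy * vy)
```

With 4 unknowns per coordinate, M >= 7 groups comfortably over-determine the system, which is consistent with the text's requirement.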
Step 302: and in the process of the first game, collecting N groups of test data, wherein N is an integer greater than 0.
It should be noted that, the process of acquiring N sets of test data in step 302 can be referred to fig. 4, and is not described in detail here.
In an implementation manner of the present application, after acquiring the N sets of test data, the method further includes:
determining whether the point of regard function meets calibration requirements based on the N sets of test data.
Specifically, the determining whether the gaze point function meets calibration requirements based on the N sets of test data includes:
determining N second calibration vectors through the N groups of test data, wherein the N second calibration vectors correspond to the N groups of test data one to one;
determining N predicted fixation point coordinates based on the N second calibration vectors and the fixation point function, wherein the N predicted fixation point coordinates correspond to the N second calibration vectors one to one;
determining N first distances, wherein each first distance is a distance between a corresponding predicted fixation point coordinate and a corresponding actual fixation point coordinate, and the corresponding actual fixation point coordinate is determined based on the corresponding predicted fixation point coordinate;
determining whether the point of gaze function meets the calibration requirements based on the N first distances.
It should be noted that, the method for determining the N second calibration vectors can be referred to as the method for determining the M first calibration vectors in fig. 5, and will not be described in detail here.
In one implementation of the application, the first game includes a game of catching a target; in the first game, the coordinates of the position where the target currently appears are the coordinates of the actual fixation point.
Further, if the coordinate of the actual fixation point is O0(x0, y0) and the coordinate of the predicted fixation point is O(x, y), the first distance is [(x - x0)^2 + (y - y0)^2]^(1/2).
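The first distance is simply the Euclidean distance between the predicted and actual fixation points:

```python
import math

def first_distance(predicted, actual):
    """Distance between predicted fixation point O(x, y) and actual
    fixation point O0(x0, y0): [(x - x0)^2 + (y - y0)^2]^(1/2)."""
    (x, y), (x0, y0) = predicted, actual
    return math.hypot(x - x0, y - y0)
```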
Step 303: and when the fixation point function is determined to meet the calibration requirement based on the N groups of test data, defining the fixation point function as the fixation point function for determining the eyeball fixation point.
In an implementation manner of the present application, the method further includes:
and when it is determined through the N groups of test data that the fixation point function does not meet the calibration requirement, collecting M groups of fitting data again during the first game and re-determining the fixation point function from them.
In an implementation of the present application, the determining whether the gaze point function meets the calibration requirement based on the N first distances comprises:
determining that the fixation point function meets the calibration requirement under a first condition, the first condition comprising at least one of: the N first distances are all smaller than or equal to a first threshold; the number of the N first distances that are smaller than or equal to a second threshold is larger than or equal to a third threshold; the mean of the N first distances is smaller than or equal to a fourth threshold; and the variance of the N first distances is smaller than or equal to a fifth threshold;
determining that the fixation point function does not meet the calibration requirement under a second condition, the second condition comprising at least one of: at least one of the N first distances is greater than the first threshold; the number of the N first distances that are smaller than or equal to the second threshold is less than the third threshold; the mean of the N first distances is greater than the fourth threshold; and the variance of the N first distances is greater than the fifth threshold.
The first threshold may be the same as or different from the second threshold, the first threshold may be 0.1mm, 0.5mm, 1mm, 1.5mm or other values, and the second threshold may be 0.1mm, 0.5mm, 1mm, 1.5mm or other values, which are not limited herein.
The third threshold may be, for example, 3, 5, 7, 10 or other values; the fourth threshold may be, for example, 0.1 mm, 0.5 mm, 1 mm, 1.5 mm or other values; and the fifth threshold may be, for example, 0.1 mm², 0.5 mm², 1 mm², 1.5 mm² or other values, which are not limited herein.
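The first condition above can be sketched as follows. The default threshold values are taken from the examples in the text and are otherwise arbitrary; per the "at least one of" wording, any single sub-condition passing is treated as sufficient:

```python
import numpy as np

def meets_calibration_requirement(distances,
                                  first_threshold=1.0,    # mm, example value
                                  second_threshold=1.0,   # mm, example value
                                  third_threshold=5,      # count, example value
                                  fourth_threshold=1.0,   # mm, example value
                                  fifth_threshold=1.0):   # mm^2, example value
    """Check the first condition on the N first distances: all within the
    first threshold, OR enough within the second threshold, OR mean within
    the fourth threshold, OR variance within the fifth threshold."""
    d = np.asarray(distances, dtype=float)
    return bool(
        np.all(d <= first_threshold)
        or np.count_nonzero(d <= second_threshold) >= third_threshold
        or d.mean() <= fourth_threshold
        or d.var() <= fifth_threshold
    )
```

Which sub-conditions are combined, and whether "at least one" or "all" is intended for a given product, is a design choice the patent leaves open.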
As shown in fig. 6, fig. 6 is a schematic view of a scene of an eyeball tracking calibration method according to an embodiment of the present application. The electronic device in fig. 6 may be the electronic device in fig. 1, and its software architecture is shown in fig. 2. In the embodiment of the application, the electronic device first displays a start interface of the first game, a "whack-a-mole" game. As shown in fig. 6, the start interface includes an icon of the "whack-a-mole" game, displays the text "Ready? Start eyeball tracking calibration with me!", and also includes a game start button. When the user presses the game start button, the "whack-a-mole" game runs, a mole is displayed on the display screen, and the user's eyes look at the mole. The electronic device illuminates the user's eyes with infrared light, and then photographs them with the camera to obtain a first image. The electronic device processes the first image in the background to obtain fitting data, uses the fitting data to determine the fitting parameters, and thereby determines the fixation point function. The electronic device repeats the collection process to obtain test data, and uses the test data to test whether the fixation point function meets the requirement.
It can be seen that, in the embodiment of the application, M groups of fitting data are collected while the first game is being played and are used to determine the fixation point function; N groups of test data are then collected to test whether the fixation point function meets the calibration requirement, and if it does, the fixation point function is adopted as the function for determining the eyeball fixation point. Because both the fitting data and the test data are collected during the game, the collection process is not tedious for the user, so more data can be collected to determine and calibrate the fixation point function, and a more accurate fixation point function is obtained without harming, and possibly even improving, the user experience.
In an implementation of the present application, when it is determined from the N groups of test data that the fixation point function does not meet the calibration requirement, the method further comprises:
determining a first duration for which the target is displayed, wherein the first duration is longer than a second duration, and the second duration is the duration for which the target is displayed when the fixation point function is determined, through the N groups of test data, to meet the calibration requirement.
The second duration may be, for example, 0.5 s, 1 s, 1.5 s or other values, and the first duration may be, for example, 0.8 s, 1.3 s, 1.8 s or other values, which are not limited herein.
It can be seen that, in the embodiment of the application, the time length of the target appearing in the first game is prolonged, so that the user can more accurately position the target, and more accurate data can be obtained.
Referring to fig. 7, fig. 7 is a schematic flowchart of an eyeball tracking calibration method provided by the embodiment of the present application, and the method is applied to an electronic device, where the electronic device includes a camera and an infrared lamp, and the method includes:
Step 701: during the first game, illuminating the human eye with the infrared lamp, wherein the infrared lamp is used for generating a bright spot on the human eye.
Step 702: shooting human eyes including bright spots through the camera to obtain M first images, wherein M is an integer larger than 0.
Step 703: and analyzing the M first images to obtain M groups of fitting data, wherein the M groups of fitting data correspond to the M first images one by one, and each group of fitting data comprises a first bright spot coordinate and a first pupil coordinate.
Step 704: determining M first calibration vectors through the M groups of fitting data, wherein the M first calibration vectors correspond to the M groups of fitting data one to one, and each first calibration vector is determined based on the first bright spot coordinates and the first pupil coordinates.
Step 705: determining fitting parameters based on the M first calibration vectors.
Step 706: a point of regard function is determined based on the fitting parameters.
Step 707: and in the process of the first game, collecting N groups of test data, wherein N is an integer greater than 0.
Step 708: and determining N second calibration vectors according to the N groups of test data, wherein the N second calibration vectors correspond to the N groups of test data one to one.
Step 709: and determining N predicted fixation point coordinates based on the N second calibration vectors and the fixation point function, wherein the N predicted fixation point coordinates correspond to the N second calibration vectors one to one.
Step 710: determining N first distances, wherein each first distance is a distance between the corresponding predicted gazing point coordinate and the corresponding actual gazing point coordinate, and the corresponding actual gazing point coordinate is determined based on the corresponding predicted gazing point coordinate.
Step 711: determining whether the point of gaze function meets the calibration requirements based on the N first distances.
If yes, go to step 712;
if not, go to step 701.
Step 712: and defining the fixation point function as a fixation point function for determining the fixation point of the eyeball.
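The flow of steps 701-712 can be sketched as a fit-test-retry loop. The helper callables, the single-threshold acceptance test, and the bounded number of rounds below are illustrative assumptions; the patent leaves the concrete gaze point function and threshold values open.

```python
import math

def calibrate(collect_fit_samples, collect_test_samples, fit, predict,
              first_threshold=50.0, max_rounds=5):
    """Fit-test-retry loop sketching steps 701-712.

    collect_fit_samples  -> (M calibration vectors, M target coordinates)
    collect_test_samples -> (N calibration vectors, N actual gaze coordinates)
    fit / predict        -> hypothetical stand-ins for determining and
                            evaluating the gaze point function
    """
    for _ in range(max_rounds):
        vectors, targets = collect_fit_samples()         # steps 701-704
        params = fit(vectors, targets)                   # steps 705-706
        test_vectors, actual = collect_test_samples()    # steps 707-708
        predicted = [predict(params, v) for v in test_vectors]  # step 709
        distances = [math.dist(p, a)                     # step 710
                     for p, a in zip(predicted, actual)]
        # Step 711: one variant of the first condition -- all N first
        # distances within the first threshold.
        if all(d <= first_threshold for d in distances):
            return params                                # step 712
    # Calibration kept failing: the method returns to step 701.
    return None
```

With a toy scale-factor model for `fit` and `predict`, `calibrate` returns the fitted parameters as soon as the N test distances fall within the threshold.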
It should be noted that, for the specific implementation process of the present embodiment, reference may be made to the specific implementation process described in the above method embodiment, and a description thereof is omitted here.
In accordance with the embodiments shown in fig. 3 and fig. 7, please refer to fig. 8. Fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in the figure, the electronic device includes a processor, a memory, a communication interface, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the processor, and the programs include instructions for performing the following steps:
acquiring M groups of fitting data in the process of playing a first game, and determining a fixation point function according to the M groups of fitting data, wherein M is an integer greater than 0;
acquiring N groups of test data in the process of the first game, wherein N is an integer greater than 0;
and when the fixation point function is determined to meet the calibration requirement based on the N groups of test data, defining the fixation point function as the fixation point function for determining the eyeball fixation point.
In an implementation manner of the present application, the electronic device includes a camera and an infrared lamp, and in terms of acquiring M sets of fitting data, the program includes instructions specifically configured to perform the following steps:
illuminating a human eye through the infrared lamp, wherein the infrared lamp is used for generating bright spots on the human eye;
shooting human eyes including bright spots through the camera to obtain M first images;
and analyzing the M first images to obtain M groups of fitting data, wherein the M groups of fitting data correspond to the M first images one by one, and each group of fitting data comprises a first bright spot coordinate and a first pupil coordinate.
In an implementation of the application, in determining the point of regard function from the M sets of fitting data, the program comprises instructions specifically for performing the steps of:
determining M first calibration vectors through the M sets of fitting data, wherein the M first calibration vectors correspond to the M sets of fitting data one to one, and each first calibration vector is determined based on the first bright spot coordinates and the first pupil coordinates;
determining fitting parameters based on the M first calibration vectors;
a point of regard function is determined based on the fitting parameters.
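As a concrete sketch of these three steps: the patent does not fix the form of the gaze point function, so the example below assumes an affine mapping from the first calibration vector (pupil coordinates minus bright spot coordinates) to screen coordinates, fitted by least squares. The function names and the model form are illustrative assumptions, not the patent's prescribed implementation.

```python
import numpy as np

def first_calibration_vectors(fitting_data):
    """One vector per group of fitting data: pupil minus bright spot
    coordinates (a common definition; the patent leaves it open)."""
    return np.array([[px - gx, py - gy]
                     for (gx, gy), (px, py) in fitting_data])

def fit_parameters(vectors, screen_points):
    """Least-squares fit of gaze = [vx, vy, 1] @ W.

    W plays the role of the "fitting parameters" determined from the
    M first calibration vectors."""
    X = np.hstack([vectors, np.ones((len(vectors), 1))])
    W, *_ = np.linalg.lstsq(X, np.asarray(screen_points, dtype=float),
                            rcond=None)
    return W

def gaze_point(W, vector):
    """The resulting gaze point function: calibration vector -> screen point."""
    return np.append(vector, 1.0) @ W
```

In practice a higher-order polynomial in the vector components is often used; the affine model keeps the sketch short.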
In an implementation manner of the present application, after acquiring N sets of test data, the program includes instructions further for performing the following steps:
determining N second calibration vectors through the N groups of test data, wherein the N second calibration vectors correspond to the N groups of test data one to one;
determining N predicted fixation point coordinates based on the N second calibration vectors and the fixation point function, wherein the N predicted fixation point coordinates correspond to the N second calibration vectors one to one;
determining N first distances, wherein each first distance is a distance between a corresponding predicted fixation point coordinate and a corresponding actual fixation point coordinate, and the corresponding actual fixation point coordinate is determined based on the corresponding predicted fixation point coordinate;
determining whether the point of gaze function meets the calibration requirements based on the N first distances.
In an implementation of the application, in determining whether the gaze point function meets the calibration requirements based on the N first distances, the program comprises instructions specifically for performing the steps of:
determining that the point of gaze function meets the calibration requirements under a first condition, the first condition comprising at least one of: the N first distances are all smaller than or equal to a first threshold; the number of the N first distances that are smaller than or equal to a second threshold is greater than or equal to a third threshold; the mean value of the N first distances is smaller than or equal to a fourth threshold; and the variance of the N first distances is smaller than or equal to a fifth threshold;
determining that the point of regard function does not meet the calibration requirements under a second condition, the second condition comprising at least one of: at least one of the N first distances is greater than the first threshold; the number of the N first distances that are smaller than or equal to the second threshold is less than the third threshold; the mean value of the N first distances is greater than the fourth threshold; and the variance of the N first distances is greater than the fifth threshold.
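A minimal sketch of this check, assuming that every sub-check the implementer enables must pass (the text only requires "at least one of" them to be configured). The parameter names, the `None`-disables convention, and the use of population variance are illustrative assumptions:

```python
import statistics

def meets_calibration(distances, t1=None, t2=None, t3=None, t4=None, t5=None):
    """Evaluate the first condition over the N first distances.

    t1: per-distance cap          t2/t3: count of small distances
    t4: cap on the mean           t5: cap on the variance
    A threshold left as None disables that sub-check.
    """
    checks = []
    if t1 is not None:
        checks.append(all(d <= t1 for d in distances))
    if t2 is not None and t3 is not None:
        checks.append(sum(d <= t2 for d in distances) >= t3)
    if t4 is not None:
        checks.append(statistics.mean(distances) <= t4)
    if t5 is not None:
        checks.append(statistics.pvariance(distances) <= t5)
    # The second condition is the negation: any enabled sub-check failing
    # means the gaze point function does not meet the calibration requirement.
    return bool(checks) and all(checks)
```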
In one implementation of the application, the first game comprises a game in which a target is captured, and in the first game, coordinates of a position where the target currently appears are coordinates of the actual gaze point.
In an implementation manner of the present application, the program includes instructions for further performing the following steps:
and when the gazing point function is determined to be not in accordance with the calibration requirement through the N groups of test data, acquiring M groups of fitting data in the process of the first game, and determining the gazing point function through the M groups of fitting data.
It should be noted that, for the specific implementation process of the present embodiment, reference may be made to the specific implementation process described in the above method embodiment, and a description thereof is omitted here.
The above embodiments mainly describe the scheme of the embodiments of the present application from the perspective of the method-side implementation process. It is understood that, to realize the above functions, the electronic device comprises corresponding hardware structures and/or software modules for performing the respective functions. Those of skill in the art will readily appreciate that the various illustrative units and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or as a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and the design constraints of the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiment of the present application, the electronic device may be divided into the functional units according to the method example, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
It should be noted that the division of the unit in the embodiment of the present application is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
The following is an apparatus embodiment of the present application, which is used to execute the methods of the method embodiments above. Referring to fig. 9, fig. 9 is a schematic structural diagram of an eye tracking calibration apparatus applied to an electronic device according to an embodiment of the present application, the apparatus including:
the acquisition unit 901 is used for acquiring M groups of fitting data in the process of the first game;
a determining unit 902, configured to determine a point of regard function according to the M sets of fitting data, where M is an integer greater than 0;
the collecting unit 901 is further configured to collect N groups of test data during the process of playing the first game, where N is an integer greater than 0;
a defining unit 903, configured to define the gaze point function as a gaze point function for determining a gaze point of an eyeball when it is determined that the gaze point function meets calibration requirements based on the N sets of test data.
In an implementation manner of the present application, the electronic device includes a camera and an infrared lamp, and in terms of acquiring M groups of fitting data, the acquisition unit 901 is specifically configured to:
illuminating a human eye through the infrared lamp, wherein the infrared lamp is used for generating bright spots on the human eye;
shooting human eyes including bright spots through the camera to obtain M first images;
and analyzing the M first images to obtain M groups of fitting data, wherein the M groups of fitting data correspond to the M first images one by one, and each group of fitting data comprises a first bright spot coordinate and a first pupil coordinate.
In an implementation manner of the present application, in determining the point of regard function through the M sets of fitting data, the determining unit 902 is specifically configured to:
determining M first calibration vectors through the M sets of fitting data, wherein the M first calibration vectors correspond to the M sets of fitting data one to one, and each first calibration vector is determined based on the first bright spot coordinates and the first pupil coordinates;
determining fitting parameters based on the M first calibration vectors;
a point of regard function is determined based on the fitting parameters.
In an implementation manner of the present application, after acquiring N sets of test data, the determining unit 902 is further configured to determine N second calibration vectors according to the N sets of test data, where the N second calibration vectors correspond to the N sets of test data one to one; determining N predicted fixation point coordinates based on the N second calibration vectors and the fixation point function, wherein the N predicted fixation point coordinates correspond to the N second calibration vectors one to one; determining N first distances, wherein each first distance is a distance between a corresponding predicted fixation point coordinate and a corresponding actual fixation point coordinate, and the corresponding actual fixation point coordinate is determined based on the corresponding predicted fixation point coordinate; determining whether the point of gaze function meets the calibration requirements based on the N first distances.
In an implementation manner of the present application, in determining whether the gazing point function meets the calibration requirement based on the N first distances, the determining unit 902 is specifically configured to:
determining that the point of gaze function meets the calibration requirements under a first condition, the first condition comprising at least one of: the N first distances are all smaller than or equal to a first threshold; the number of the N first distances that are smaller than or equal to a second threshold is greater than or equal to a third threshold; the mean value of the N first distances is smaller than or equal to a fourth threshold; and the variance of the N first distances is smaller than or equal to a fifth threshold;
determining that the point of regard function does not meet the calibration requirements under a second condition, the second condition comprising at least one of: at least one of the N first distances is greater than the first threshold; the number of the N first distances that are smaller than or equal to the second threshold is less than the third threshold; the mean value of the N first distances is greater than the fourth threshold; and the variance of the N first distances is greater than the fifth threshold.
In one implementation of the application, the first game comprises a game in which a target is captured, and in the first game, coordinates of a position where the target currently appears are coordinates of the actual gaze point.
In an implementation manner of the present application, when it is determined through the N groups of test data that the gaze point function does not meet the calibration requirement, the acquisition unit 901 is further configured to acquire M groups of fitting data during the first game, and the determining unit 902 is further configured to determine the gaze point function through the M groups of fitting data.
It should be noted that the acquisition unit 901, the determination unit 902, and the definition unit 903 may be implemented by a processor.
Embodiments of the present application also provide a computer storage medium, where the computer storage medium stores a computer program for electronic data exchange, the computer program enabling a computer to execute part or all of the steps of any one of the methods described in the above method embodiments, and the computer includes an electronic device.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any of the methods as described in the above method embodiments. The computer program product may be a software installation package, the computer comprising an electronic device.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative; the division of the units is only one type of logical function division, and other divisions may be used in practice: a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices, or units, and may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit may be stored in a computer-readable memory if it is implemented in the form of a software functional unit and sold or used as a standalone product. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned memory includes: a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, an optical disk, and other media capable of storing program code.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable memory, which may include: flash Memory disks, Read-Only memories (ROMs), Random Access Memories (RAMs), magnetic or optical disks, and the like.
The foregoing detailed description of the embodiments of the present application has been presented to illustrate the principles and implementations of the present application, and the above description of the embodiments is only provided to help understand the method and the core concept of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (10)

1. An eye tracking calibration method applied to an electronic device, the method comprising:
acquiring M groups of fitting data in the process of playing a first game, and determining a fixation point function according to the M groups of fitting data, wherein M is an integer greater than 0;
acquiring N groups of test data in the process of the first game, wherein N is an integer greater than 0;
and when the fixation point function is determined to meet the calibration requirement based on the N groups of test data, defining the fixation point function as the fixation point function for determining the eyeball fixation point.
2. The method of claim 1, wherein the electronic device comprises a camera and an infrared lamp, and wherein the acquiring M sets of fitting data comprises:
illuminating a human eye through the infrared lamp, wherein the infrared lamp is used for generating bright spots on the human eye;
shooting human eyes including bright spots through the camera to obtain M first images;
and analyzing the M first images to obtain M groups of fitting data, wherein the M groups of fitting data correspond to the M first images one by one, and each group of fitting data comprises a first bright spot coordinate and a first pupil coordinate.
3. The method of claim 2, wherein said determining a point of regard function from said M sets of fitting data comprises:
determining M first calibration vectors through the M sets of fitting data, wherein the M first calibration vectors correspond to the M sets of fitting data one to one, and each first calibration vector is determined based on the first bright spot coordinates and the first pupil coordinates;
determining fitting parameters based on the M first calibration vectors;
a point of regard function is determined based on the fitting parameters.
4. The method of any one of claims 1-3, wherein after said collecting N sets of test data, the method further comprises:
determining N second calibration vectors through the N groups of test data, wherein the N second calibration vectors correspond to the N groups of test data one to one;
determining N predicted fixation point coordinates based on the N second calibration vectors and the fixation point function, wherein the N predicted fixation point coordinates correspond to the N second calibration vectors one to one;
determining N first distances, wherein each first distance is a distance between a corresponding predicted fixation point coordinate and a corresponding actual fixation point coordinate, and the corresponding actual fixation point coordinate is determined based on the corresponding predicted fixation point coordinate;
determining whether the point of gaze function meets the calibration requirements based on the N first distances.
5. The method of claim 4, wherein said determining whether the gaze point function meets the calibration requirements based on the N first distances comprises:
determining that the point of gaze function meets the calibration requirements under a first condition, the first condition comprising at least one of: the N first distances are all smaller than or equal to a first threshold; the number of the N first distances that are smaller than or equal to a second threshold is greater than or equal to a third threshold; the mean value of the N first distances is smaller than or equal to a fourth threshold; and the variance of the N first distances is smaller than or equal to a fifth threshold;
determining that the point of regard function does not meet the calibration requirements under a second condition, the second condition comprising at least one of: at least one of the N first distances is greater than the first threshold; the number of the N first distances that are smaller than or equal to the second threshold is less than the third threshold; the mean value of the N first distances is greater than the fourth threshold; and the variance of the N first distances is greater than the fifth threshold.
6. The method according to claim 4 or 5, wherein the first game includes a game in which a target is captured, and in the first game, coordinates of a position where a target currently appears are coordinates of the actual gazing point.
7. The method according to any one of claims 1-6, further comprising:
and when the gazing point function is determined to be not in accordance with the calibration requirement through the N groups of test data, acquiring M groups of fitting data in the process of the first game, and determining the gazing point function through the M groups of fitting data.
8. An eye tracking calibration device, applied to an electronic device, the device comprising:
the acquisition unit is used for acquiring M groups of fitting data in the process of the first game;
a determining unit, configured to determine a point of regard function according to the M sets of fitting data, where M is an integer greater than 0;
the acquisition unit is further used for acquiring N groups of test data in the process of the first game, wherein N is an integer greater than 0;
and the defining unit is used for defining the fixation point function as the fixation point function for determining the eyeball fixation point when the fixation point function is determined to meet the calibration requirement based on the N groups of test data.
9. An electronic device comprising a processor, a memory, a communication interface, and one or more programs stored in the memory and configured to be executed by the processor, the programs comprising instructions for performing the steps in the method of any of claims 1-7.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which is executed by a processor to implement the method of any one of claims 1 to 7.
CN201910945827.2A 2019-09-30 2019-09-30 Eyeball tracking calibration method and related equipment Pending CN112578901A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910945827.2A CN112578901A (en) 2019-09-30 2019-09-30 Eyeball tracking calibration method and related equipment


Publications (1)

Publication Number Publication Date
CN112578901A true CN112578901A (en) 2021-03-30

Family

ID=75117087

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910945827.2A Pending CN112578901A (en) 2019-09-30 2019-09-30 Eyeball tracking calibration method and related equipment

Country Status (1)

Country Link
CN (1) CN112578901A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150049013A1 (en) * 2013-08-19 2015-02-19 Qualcomm Incorporated Automatic calibration of eye tracking for optical see-through head mounted display
CN105850113A (en) * 2014-01-06 2016-08-10 欧库勒斯虚拟现实有限责任公司 Calibration of virtual reality systems
CN108259887A (en) * 2018-04-13 2018-07-06 宁夏大学 Watch point calibration method and device, blinkpunkt scaling method and device attentively



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210330