CN113491502A - Eyeball tracking calibration inspection method, device, equipment and storage medium - Google Patents

Eyeball tracking calibration inspection method, device, equipment and storage medium

Info

Publication number
CN113491502A
CN113491502A (application CN202010260984.2A)
Authority
CN
China
Prior art keywords
feedback
calibration
user
area
eye
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010260984.2A
Other languages
Chinese (zh)
Inventor
张朕 (Zhang Zhen)
路伟成 (Lu Weicheng)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing 7Invensun Technology Co Ltd
Original Assignee
Beijing 7Invensun Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing 7Invensun Technology Co Ltd filed Critical Beijing 7Invensun Technology Co Ltd
Priority to CN202010260984.2A priority Critical patent/CN113491502A/en
Publication of CN113491502A publication Critical patent/CN113491502A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement

Abstract

The invention discloses an eyeball tracking calibration inspection method, device, equipment and storage medium. The method comprises the following steps: determining a fixation point of a user according to eye feature information and a calibration coefficient of the user, wherein the eye feature information is feature change information of the eyeball and the eyeball periphery during eyeball movement; if the fixation point is in a feedbackable area and the fixation time reaches a first preset duration, displaying a feedback picture in the feedbackable area, wherein the feedbackable area is used for making a feedback response to the fixation action of the user; and if the feedbackable area displaying the feedback picture is correct, determining that the eyeball tracking calibration is qualified. The method and the device solve the problems that existing eyeball tracking calibration lacks an evaluation of the calibration effect and that the user cannot judge the calibration effect promptly and accurately; they provide timely, accurate and visual feedback of the calibration effect after eyeball tracking calibration, and improve both the user experience and the accuracy of experimental analysis.

Description

Eyeball tracking calibration inspection method, device, equipment and storage medium
Technical Field
The embodiment of the invention relates to the technical field of electronics, in particular to an eyeball tracking calibration inspection method, device, equipment and storage medium.
Background
Because eye tracking technology requires high accuracy, users generally need to perform a calibration before using a device with an eye tracking function.
In the existing eyeball tracking calibration process, the calibration step ends once the user's eye feature information has been collected and the calibration coefficient has been computed; there is no evaluation of the calibration effect. A few eyeball tracking devices do evaluate the calibration effect, but they are limited to giving a calibration score or displaying a picture of the collected gaze points after calibration, from which the user can hardly judge how far the estimated gaze positions deviate from the positions actually gazed at. If the computed calibration coefficient deviates substantially because the user operated unskillfully or uncooperatively during calibration, the calibration becomes counterproductive and seriously degrades the user experience or the experimental analysis in subsequent use.
Disclosure of Invention
The invention provides an eyeball tracking calibration inspection method, device, equipment and storage medium, which are used for realizing evaluation and visual feedback of eyeball tracking calibration effect.
In a first aspect, an embodiment of the present invention provides an eyeball tracking calibration verification method, including:
determining a fixation point of a user according to eye characteristic information and a calibration coefficient of the user, wherein the eye characteristic information is characteristic change information of eyeballs and the peripheries of the eyeballs during eyeball movement;
if the gazing point is in a feedback-capable area and the gazing time reaches a first preset duration, displaying a feedback picture in the feedback-capable area, wherein the feedback-capable area is used for making a feedback response to the gazing action of the user;
and if the feedback area of the displayed feedback picture is correct, determining that the eyeball tracking calibration is qualified.
Optionally, before determining the gaze point of the user according to the eye feature information and the calibration coefficient, the method further includes:
and acquiring the eye feature information when the calibration point appears, and determining the calibration coefficient according to the eye feature information and the calibration point.
Optionally, the method further includes:
if the fixation point is not in the feedback-capable area within a second preset duration, or the fixation time does not reach the first preset duration, re-determining the calibration coefficient; or,
and if the feedback area of the displayed feedback picture is wrong, re-determining the calibration coefficient.
Optionally, the feedbackable region, the first preset duration and the second preset duration are set according to the usage scenario.
In a second aspect, an embodiment of the present invention further provides an eyeball tracking calibration verification apparatus, where the apparatus includes:
the fixation point determining module is used for determining the fixation point of the user according to the eye feature information and the calibration coefficient of the user, wherein the eye feature information is feature change information of eyeballs and the peripheries of the eyeballs during eyeball movement;
the feedback display module is used for displaying a feedback picture in a feedback area if the fixation point is in the feedback area and the fixation time reaches a first preset duration, wherein the feedback area is used for making a feedback response to the fixation action of the user;
and the inspection and determination module is used for determining that the eyeball tracking calibration is qualified if the feedback area displaying the feedback picture is correct.
Optionally, the apparatus further comprises:
and the first calibration coefficient determining module is used for acquiring the eye feature information when the calibration point appears, and determining the calibration coefficient according to the eye feature information and the calibration point.
Optionally, the apparatus further comprises:
the second calibration coefficient determining module is used for re-determining the calibration coefficient if the fixation point is not in the feedback-capable area within a second preset time or the fixation time does not reach the first preset time; or, if the feedback area of the feedback picture is displayed wrongly, the calibration coefficient is determined again.
Optionally, the feedbackable region, the first preset duration and the second preset duration are set according to the usage scenario.
In a third aspect, an embodiment of the present invention further provides a computer device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the processor executes the computer program, the eyeball tracking calibration verification method according to any embodiment of the present invention is implemented.
In a fourth aspect, embodiments of the present invention further provide a storage medium containing computer-executable instructions; a computer program is stored thereon, and when the computer program is executed by a processor, the eyeball tracking calibration verification method according to any embodiment of the present invention is implemented.
After eyeball tracking calibration, the invention acquires the user's calibration coefficient and enters an intelligent verification scene. The user's fixation point is determined by combining the user's eye feature information with the calibration coefficient, and when the fixation point falls into a feedbackable area and the fixation duration is met, a feedback picture is displayed in that area, so that the user can visually see the feedback of the eyeball tracking calibration effect and judge the accuracy of the calibration in time; when the calibration effect is poor, calibration and verification are performed again. This solves the problems that existing eyeball tracking calibration lacks an evaluation of the calibration effect, that the user cannot judge the calibration effect promptly and accurately, and that a large deviation in the obtained calibration coefficient degrades the user experience or the experimental analysis in subsequent use. After eyeball tracking calibration, the visual calibration effect is fed back promptly and accurately, improving the user experience and the accuracy of experimental analysis.
Drawings
Fig. 1 is a flowchart illustrating an eyeball tracking calibration verification method according to an embodiment of the present invention;
fig. 2 is a flowchart of an eyeball tracking calibration verification method according to a second embodiment of the present invention;
fig. 3 is a block diagram of an eyeball tracking calibration verification apparatus according to a third embodiment of the present invention;
fig. 4 is a block diagram of a computer device according to a fourth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a flowchart of an eyeball tracking calibration verification method according to an embodiment of the present invention, which is applicable to a case where calibration effects are verified after eyeball tracking calibration.
As shown in fig. 1, the method specifically includes the following steps:
and step 110, determining the fixation point of the user according to the eye feature information and the calibration coefficient of the user.
The eye feature information is feature change information of the eyeball and the eyeball periphery during eyeball movement. When a person looks in different directions, the eyes change slightly, and these changes produce extractable features. A computer can extract these features through image capture or scanning and, combined with calibration points, calibrate the eyeball tracking, so that changes in the line of sight are tracked in real time, the user's state and intent are predicted and responded to, and the device can be controlled with the eyes. The user's eye feature information is acquired by the eyeball tracking device, and the calibration coefficient is obtained by the eyeball tracking device performing eyeball tracking calibration on the user.
The embodiment of the present invention does not limit the eyeball tracking calibration method, and different calibration methods collect different calibration data. For example, the eyeball tracking device may be a MEMS system, for example one including a MEMS infrared scanning mirror, an infrared light source and an infrared receiver, which detects eye movement by capturing eye images; it may be a capacitance sensor that detects eyeball movement through the capacitance between the eyeball and capacitor plates; it may also be a myoelectric current detector that places electrodes at the bridge of the nose, the forehead, the ears or the earlobes and detects eye movement from the pattern of the detected myoelectric signals; the most widely used approach, however, is the pupil-corneal reflection method. Accordingly, the eye feature information acquired from the user during calibration may include an eye feature image and/or a capacitance value and/or a myoelectric signal, and the eye feature image may include the pupil position, pupil shape, iris position, iris shape, eyelid position, canthus position, light spot (glint) position, and so on.
Specifically, after eyeball tracking calibration is performed on a user, a calibration coefficient of the user is obtained, an intelligent feedback scene is entered, eye feature information of the user is obtained, and a user's point of regard is calculated according to the eye feature information and the calibration coefficient.
In a specific example, the eyeball tracking device obtains the calibration coefficient after eyeball tracking calibration, enters the intelligent feedback scene, and acquires the user's eye feature information, which may be the pupil position coordinates, the position coordinates of the Purkinje spots, and the like. Based on a gaze estimation algorithm, the user's eye feature information and calibration coefficient are taken as input, the offset of the pupil position coordinates relative to the Purkinje spot position coordinates is calculated, a spatial coordinate transformation is applied to the offset, and the real-time fixation point coordinates are output.
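As a rough illustration of this step, the following Python sketch maps a pupil-glint offset to screen coordinates using a second-order polynomial per-user calibration; the polynomial form, function name and coefficient layout are assumptions made for illustration, since the patent does not prescribe a specific gaze estimation model.

import numpy as np

def estimate_gaze_point(pupil_xy, purkinje_xy, calib_coeffs):
    """Map the pupil-glint offset to screen coordinates with a per-user
    second-order polynomial (illustrative; the patent does not fix a model)."""
    dx, dy = np.asarray(pupil_xy, dtype=float) - np.asarray(purkinje_xy, dtype=float)
    # Second-order polynomial feature vector of the offset.
    features = np.array([1.0, dx, dy, dx * dy, dx ** 2, dy ** 2])
    cx, cy = calib_coeffs                  # two 6-element coefficient vectors
    return float(features @ cx), float(features @ cy)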
And 120, if the gazing point is in the feedback-capable area and the gazing time reaches a first preset duration, displaying a feedback picture in the feedback-capable area.
The intelligent feedback scene can contain a plurality of feedbackable areas; a feedbackable area is an area that gives a feedback response when the user performs the corresponding fixation action. The feedback response may be a feedback picture: any feedback mode that makes a pattern, a graphic transformation, a dynamic color map or the like appear in the feedbackable area and produces a visual interaction with the user is sufficient for the user to judge whether the fixation point has accurately fallen into that area. For example, fireworks bloom when the user gazes at the feedbackable area; a fruit is cut or bitten when the user gazes at it; a star lights up or enlarges when the user gazes at it; and so on.
Specifically, when the fixation point falls within a certain feedbackable area, timing starts, and when the fixation duration within that area reaches the first preset duration, the feedback picture is displayed in that area.
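The dwell check described here can be pictured with the sketch below; the rectangular region shape, the class name and the polling interface are illustrative assumptions, not the patent's implementation.

import time

class FeedbackRegion:
    """Rectangular feedbackable area with a dwell timer (illustrative; the
    patent does not fix the region shape or the timing mechanism)."""

    def __init__(self, x, y, w, h, dwell_s):
        self.x, self.y, self.w, self.h = x, y, w, h
        self.dwell_s = dwell_s            # the "first preset duration"
        self._enter_time = None

    def contains(self, gx, gy):
        return self.x <= gx <= self.x + self.w and self.y <= gy <= self.y + self.h

    def update(self, gx, gy, now=None):
        """Feed in the current fixation point; return True once the gaze has
        stayed inside the region for dwell_s seconds."""
        now = time.monotonic() if now is None else now
        if not self.contains(gx, gy):
            self._enter_time = None       # gaze left the region: reset the timer
            return False
        if self._enter_time is None:
            self._enter_time = now        # gaze just entered: start timing
        return now - self._enter_time >= self.dwell_s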
The eye feature information of different users is different, and the calibration coefficients are obtained by combining the eye feature information of the users based on an eyeball tracking calibration method, so that the calibration coefficients obtained by different users are different. In addition, according to different requirements of application scenes on eyeball tracking calibration accuracy, calibration coefficients obtained by the same user in different scenes are different. Therefore, the criteria for verifying the accuracy of the eye tracking calibration are different.
Optionally, the feedback area and the preset duration are set according to the usage scenario.
Specifically, the intelligent feedback scene can change the coordinate-range threshold and the time threshold of the dynamic feedback picture according to the eye movement interaction requirements of the actual usage scenario. The size, shape and position of the feedbackable area and the preset duration required to trigger feedback are set according to how accurate the eyeball tracking calibration needs to be in different usage scenarios. In a specific example, if in the actual usage scenario the minimum trigger range for an eye movement interaction is a circle with a radius of n pixels and the longest trigger duration is m seconds, then in the intelligent feedback scene of the calibration verification stage the radius of the feedbackable area is set to n pixels and the preset duration to m seconds. When the actual usage scenario demands high eye movement interaction accuracy (for example eye-controlled typing or gaze-point rendering), the conditions for generating the feedback picture become stricter (the feedbackable interaction area is reduced, the time threshold is increased, and so on).
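As a hedged illustration of this scenario-dependent configuration, the sketch below keeps per-scenario presets for the feedbackable-area radius (n pixels) and dwell threshold (m seconds); the scenario names and numeric values are made-up examples, not values taken from the patent.

# Hypothetical per-scenario presets: the feedbackable-area radius (pixels) and
# dwell threshold (seconds) are tightened for high-accuracy interactions such
# as eye-controlled typing or gaze-point rendering. All names and numbers here
# are illustrative assumptions.
SCENARIO_PRESETS = {
    "coarse_menu_selection": {"radius_px": 120, "dwell_s": 0.8},
    "eye_controlled_typing": {"radius_px": 40,  "dwell_s": 1.5},
    "gaze_point_rendering":  {"radius_px": 30,  "dwell_s": 1.2},
}

def configure_verification(scenario: str):
    """Return (radius n in pixels, dwell m in seconds) for the usage scenario."""
    preset = SCENARIO_PRESETS[scenario]
    return preset["radius_px"], preset["dwell_s"]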
And step 130, if the feedback area of the feedback picture is displayed correctly, determining that the eyeball tracking calibration is qualified.
Specifically, after the feedback picture is displayed in a feedbackable area, the user can judge whether the area displaying the feedback picture is the area being gazed at. If it is, the calibration effect meets the requirements of the subsequent use of the eyeball tracking function, and the eyeball tracking calibration can be determined to be qualified.
According to the technical scheme of this embodiment, after eyeball tracking calibration the user's calibration coefficient is acquired, the intelligent verification scene is entered, and the user's fixation point is determined by combining the user's eye feature information. When the fixation point falls into a feedbackable area and the fixation duration is met, a feedback picture is displayed in that area, so that the user can visually see the feedback of the eyeball tracking calibration effect and judge the accuracy of the calibration in time; when the calibration effect is poor, calibration and verification are performed again. This solves the problems that existing eyeball tracking calibration lacks an evaluation of the calibration effect, that the user cannot judge the calibration effect promptly and accurately, and that a large deviation in the calibration coefficient degrades the user experience or the experimental analysis in subsequent use; the visual calibration effect is fed back promptly and accurately after eyeball tracking calibration, and the user experience and the accuracy of experimental analysis are improved.
Example two
Fig. 2 is a flowchart of an eyeball tracking calibration verification method according to a second embodiment of the present invention. In this embodiment, the eyeball tracking calibration verification method is further optimized based on the above embodiments.
As shown in fig. 2, the method specifically includes:
step 210, obtaining eye feature information, and determining a calibration coefficient according to the eye feature information and the calibration point.
Specifically, when a calibration point appears on the display screen, the eyeball tracking device acquires the user's eye feature information. The calibration points may be displayed in several ways: several calibration points may appear on the screen in sequence, or a single calibration point may appear and move while the user is instructed to gaze at and follow it, and so on. Based on a gaze estimation algorithm, the personalized calibration coefficient for the user is computed by combining the user's eye feature information with the position coordinates of the calibration points.
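A minimal sketch of how such a calibration coefficient could be computed is given below, assuming a least-squares fit of the same second-order polynomial used in the earlier gaze estimation sketch; the regression form is an assumption, not the patent's prescribed method.

import numpy as np

def fit_calibration(pupil_glint_offsets, target_points):
    """Least-squares fit of per-user calibration coefficients from the
    pupil-glint offsets recorded while the user fixates known calibration
    points (illustrative regression form; not prescribed by the patent)."""
    offsets = np.asarray(pupil_glint_offsets, dtype=float)   # shape (N, 2)
    targets = np.asarray(target_points, dtype=float)         # shape (N, 2)
    dx, dy = offsets[:, 0], offsets[:, 1]
    # Same second-order polynomial features as in estimate_gaze_point.
    A = np.column_stack([np.ones_like(dx), dx, dy, dx * dy, dx ** 2, dy ** 2])
    cx, *_ = np.linalg.lstsq(A, targets[:, 0], rcond=None)
    cy, *_ = np.linalg.lstsq(A, targets[:, 1], rcond=None)
    return cx, cy   # usable with estimate_gaze_point from the earlier sketch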
And step 220, determining the fixation point of the user according to the eye feature information and the calibration coefficient of the user.
Step 230, determining whether the gazing point is in the feedback-enabled area and the gazing time reaches the first preset duration within the second preset duration.
Specifically, after the calibration coefficient is obtained, the eyeball tracking calibration verification stage is entered: an intelligent feedback scene appears, the eyeball tracking device collects the user's eye feature information, and the user's fixation point is calculated with the calibration coefficient just obtained. If within the second preset duration the determined fixation point never falls into a feedbackable area, or the fixation duration in any feedbackable area does not reach the first preset duration, return to step 210; if the fixation point is in a feedbackable area and the fixation duration reaches the first preset duration, proceed to step 240.
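The control flow of this verification step can be sketched as follows; get_gaze_point, the polling rate and the FeedbackRegion helper from the earlier sketch are assumptions used only to make the flow concrete.

import time

def run_verification(get_gaze_point, regions, timeout_s):
    """Wait, within the second preset duration (timeout_s), for the gaze to
    dwell in any feedbackable region. Returns the triggered region, or None to
    signal that the calibration coefficients should be re-determined.
    get_gaze_point() is assumed to return the current (x, y) fixation point;
    regions are FeedbackRegion objects from the earlier sketch."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        gx, gy = get_gaze_point()
        for region in regions:
            if region.update(gx, gy):
                return region             # caller shows the feedback picture here
        time.sleep(0.01)                  # roughly 100 Hz polling, an arbitrary choice
    return None                           # timed out: go back to step 210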
And 240, displaying a feedback picture in the feedback area.
Specifically, when the fixation point is in a certain feedbackable area and the fixation duration reaches the first preset duration, the feedback picture is displayed in that area. The feedback picture is displayed as described in step 120 and is not repeated here.
Step 250, judging whether the feedback area displaying the feedback picture is correct.
Specifically, if the feedback-enabled area displaying the feedback picture is the same as the area actually watched by the user, the calibration may be considered to be qualified, and step 260 is performed; otherwise, go to step 210.
Specifically, whether the feedbackable area displaying the feedback picture is the same as the area actually gazed at may be confirmed manually by the user: for example, a keyboard key is preset or a confirmation point appears on the screen, and the user presses the key or clicks the confirmation point to confirm that the two areas are the same. If the user presses no key and gives no confirmation within a preset time, the calibration is determined to be unqualified and the system returns to step 210 to re-determine the calibration coefficient. For a user with limited mobility, the system may instead treat the user keeping the eyes closed for a certain time (generally longer than a normal blink, for example 2 seconds) as confirmation that the feedbackable area displaying the feedback picture is the same as the area actually gazed at; if the user does not close the eyes within the preset time, the calibration is determined to be unqualified and the system returns to step 210 to re-determine the calibration coefficient. In other words, the eyeball tracking calibration is determined to be qualified when the user gives a confirmation instruction, and unqualified when the user gives none. Conversely, the system may also be configured so that the user gives an "unqualified" instruction to mark the calibration as unqualified, and the calibration is determined to be qualified if the user gives no instruction.
The user may also confirm by voice: on finding that the feedbackable area displaying the feedback picture is the same as, or different from, the area actually gazed at, the user can say so to confirm the result.
Verifying the calibration in these ways lets both able-bodied users and users with limited mobility confirm conveniently whether the calibration is qualified, making the device easy to use and operate.
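A hedged sketch of this confirmation step, covering both key-press confirmation and the eye-closure confirmation for users with limited mobility, follows; the callback interfaces and all timing values are assumptions made for illustration.

import time

def wait_for_confirmation(key_pressed, eyes_closed,
                          confirm_window_s=5.0, blink_hold_s=2.0):
    """Accept the calibration if the user presses a key, or keeps the eyes
    closed for blink_hold_s seconds, within confirm_window_s. The callbacks
    and all timing values are assumptions made for this sketch."""
    closed_since = None
    deadline = time.monotonic() + confirm_window_s
    while time.monotonic() < deadline:
        if key_pressed():                     # manual confirmation path
            return True
        if eyes_closed():                     # eye-closure path for limited mobility
            if closed_since is None:
                closed_since = time.monotonic()
            if time.monotonic() - closed_since >= blink_hold_s:
                return True
        else:
            closed_since = None               # eyes reopened: reset the hold timer
        time.sleep(0.02)
    return False                              # no confirmation: recalibrate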
And step 260, the eyeball tracking calibration is qualified.
Generally, once the eyeball tracking calibration is qualified, the subsequent practical application proceeds, and the specific application program is controlled with the eyes.
After eyeball tracking calibration, the user's calibration coefficient is acquired, the intelligent verification scene is entered, and the user's fixation point is determined by combining the user's eye feature information. When the fixation point falls into a feedbackable area and the fixation duration is met, a feedback picture is displayed in that area, so that the user can visually see the feedback of the eyeball tracking calibration effect and judge the accuracy of the calibration in time; when the calibration effect is poor, calibration and verification are performed again. Meanwhile, the feedbackable area and the preset durations are adjusted intelligently for different application scenarios to meet different eye movement interaction requirements. This solves the problems that existing eyeball tracking calibration lacks an evaluation of the calibration effect, that the user cannot judge the calibration effect promptly and accurately, and that a large deviation in the calibration coefficient degrades the user experience or the experimental analysis in subsequent use; after eyeball tracking calibration, the visual calibration effect is fed back promptly and accurately, and the user experience and the accuracy of experimental analysis are improved.
EXAMPLE III
The eyeball tracking calibration verification apparatus provided by the embodiment of the present invention can execute the eyeball tracking calibration verification method provided by any embodiment of the present invention, and has the functional modules and beneficial effects corresponding to the executed method. Fig. 3 is a block diagram of an eyeball tracking calibration verification apparatus according to a third embodiment of the present invention; as shown in fig. 3, the apparatus includes a gaze point determination module 310, a feedback display module 320, and a verification determination module 330.
The gaze point determining module 310 is configured to determine a gaze point of the user according to the eye feature information of the user and the calibration coefficient, where the eye feature information is feature change information of the eyeball and the periphery of the eyeball during the eyeball movement.
The feedback display module 320 is configured to display a feedback picture in the feedback-enabled area if the gazing point is in the feedback-enabled area and the gazing time reaches a first preset duration, where the feedback-enabled area is used for making a feedback response to the gazing action of the user.
The verification determining module 330 is configured to determine that the eyeball tracking calibration is qualified if the feedbackable region displaying the feedback frame is correct.
According to the technical scheme of this embodiment, after eyeball tracking calibration the user's calibration coefficient is acquired, the intelligent verification scene is entered, and the user's fixation point is determined by combining the user's eye feature information. When the fixation point falls into a feedbackable area and the fixation duration is met, a feedback picture is displayed in that area, so that the user can visually see the feedback of the eyeball tracking calibration effect and judge the accuracy of the calibration in time; when the calibration effect is poor, calibration and verification are performed again. This solves the problems that existing eyeball tracking calibration lacks an evaluation of the calibration effect, that the user cannot judge the calibration effect promptly and accurately, and that a large deviation in the calibration coefficient degrades the user experience or the experimental analysis in subsequent use; the visual calibration effect is fed back promptly and accurately after eyeball tracking calibration, and the user experience and the accuracy of experimental analysis are improved.
Optionally, the apparatus further comprises:
the first calibration coefficient determining module 340 is configured to obtain eye feature information when a calibration point occurs, and determine a calibration coefficient according to the eye feature information and the calibration point.
Optionally, the apparatus further comprises:
a second calibration coefficient determining module 350, configured to re-determine the calibration coefficient if the fixation point is not in the feedbackable area within the second preset duration or the fixation time does not reach the first preset duration; or, if the feedbackable area displaying the feedback picture is wrong, re-determine the calibration coefficient.
Optionally, the feedback-enabled area, the first preset time and the second preset time are set according to a usage scenario.
After eyeball tracking calibration, the invention acquires the user's calibration coefficient and enters an intelligent verification scene. The user's fixation point is determined by combining the user's eye feature information with the calibration coefficient, and when the fixation point falls into a feedbackable area and the fixation duration is met, a feedback picture is displayed in that area, so that the user can visually see the feedback of the eyeball tracking calibration effect and judge the accuracy of the calibration in time; when the calibration effect is poor, calibration and verification are performed again. This solves the problems that existing eyeball tracking calibration lacks an evaluation of the calibration effect, that the user cannot judge the calibration effect promptly and accurately, and that a large deviation in the obtained calibration coefficient degrades the user experience or the experimental analysis in subsequent use. After eyeball tracking calibration, the visual calibration effect is fed back promptly and accurately, improving the user experience and the accuracy of experimental analysis.
Example four
Fig. 4 is a block diagram of a computer apparatus according to a fourth embodiment of the present invention, as shown in fig. 4, the computer apparatus includes a processor 410, a memory 420, an input device 430, and an output device 440; the number of the processors 410 in the computer device may be one or more, and one processor 410 is taken as an example in fig. 4; the processor 410, the memory 420, the input device 430 and the output device 440 in the computer apparatus may be connected by a bus or other means, and the connection by the bus is exemplified in fig. 4.
The memory 420 serves as a computer-readable storage medium for storing software programs, computer-executable programs, and modules, such as program instructions/modules corresponding to the eye tracking calibration verification method in the embodiment of the present invention (for example, the gazing point determination module 310, the feedback display module 320, and the verification determination module 330 in the eye tracking calibration verification apparatus). The processor 410 executes software programs, instructions and modules stored in the memory 420 to execute various functional applications and data processing of the computer device, i.e. to implement the eyeball tracking calibration verification method described above.
The memory 420 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the terminal, and the like. Further, the memory 420 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some examples, memory 420 may further include memory located remotely from processor 410, which may be connected to a computer device through a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 430 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the computer apparatus. The output device 440 may include a display device such as a display screen.
EXAMPLE five
An embodiment of the present invention further provides a storage medium containing computer-executable instructions, which when executed by a computer processor, perform a method for eye tracking calibration verification, the method comprising:
determining a fixation point of the user according to the eye characteristic information and the calibration coefficient of the user, wherein the eye characteristic information is characteristic change information of eyeballs and the peripheries of the eyeballs during eyeball movement;
if the gazing point is in the feedback area and the gazing time reaches a first preset duration, displaying a feedback picture in the feedback area, wherein the feedback area is used for making a feedback response to the gazing action of the user;
and if the feedback area of the displayed feedback picture is correct, determining that the eyeball tracking calibration is qualified.
Of course, the storage medium containing computer-executable instructions provided by the embodiments of the present invention is not limited to the method operations described above, and can also perform related operations of the eyeball tracking calibration verification method provided by any embodiment of the present invention.
From the above description of the embodiments, it will be clear to those skilled in the art that the present invention can be implemented by software plus the necessary general-purpose hardware, and can certainly also be implemented by hardware, but the former is the preferred implementation in many cases. Based on this understanding, the technical solution of the present invention can be embodied in the form of a software product that is stored in a computer-readable storage medium, such as a floppy disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a FLASH memory, a hard disk or an optical disk of a computer, and that includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) to execute the methods of the embodiments of the present invention.
It should be noted that, in the embodiment of the above verification apparatus, the included units and modules are divided only according to functional logic and are not limited to this division, as long as the corresponding functions can be implemented; in addition, the specific names of the functional units are only for convenience of distinguishing them from each other and are not intended to limit the protection scope of the present invention.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. An eye tracking calibration verification method, comprising:
determining a fixation point of a user according to eye characteristic information and a calibration coefficient of the user, wherein the eye characteristic information is characteristic change information of eyeballs and the peripheries of the eyeballs during eyeball movement;
if the gazing point is in a feedback-capable area and the gazing time reaches a first preset duration, displaying a feedback picture in the feedback-capable area, wherein the feedback-capable area is used for making a feedback response to the gazing action of the user;
and if the feedback area of the displayed feedback picture is correct, determining that the eyeball tracking calibration is qualified.
2. The eye tracking calibration verification method according to claim 1, further comprising, before determining the user's gaze point based on the eye feature information and the calibration coefficients:
and acquiring the eye feature information when the calibration point appears, and determining the calibration coefficient according to the eye feature information and the calibration point.
3. The eye tracking calibration verification method according to any one of claims 1-2, further comprising:
if the fixation point is not in the feedback-capable area within a second preset duration, or the fixation time does not reach the first preset duration, re-determining the calibration coefficient; or,
and if the feedback area of the displayed feedback picture is wrong, re-determining the calibration coefficient.
4. The method according to claim 1, wherein the feedback area, the first preset time and the second preset time are set according to a usage scenario.
5. An eye tracking calibration verification device, said device comprising:
the fixation point determining module is used for determining the fixation point of the user according to the eye feature information and the calibration coefficient of the user, wherein the eye feature information is feature change information of eyeballs and the peripheries of the eyeballs during eyeball movement;
the feedback display module is used for displaying a feedback picture in a feedback area if the fixation point is in the feedback area and the fixation time reaches a first preset duration, wherein the feedback area is used for making a feedback response to the fixation action of the user;
and the inspection and determination module is used for determining that the eyeball tracking calibration is qualified if the feedback area displaying the feedback picture is correct.
6. The eye tracking calibration verification device of claim 5, further comprising:
and the first calibration coefficient determining module is used for acquiring the eye feature information when the calibration point appears, and determining the calibration coefficient according to the eye feature information and the calibration point.
7. The eye tracking calibration verification device according to any one of claims 5-6, further comprising:
the second calibration coefficient determining module is used for re-determining the calibration coefficient if the fixation point is not in the feedback-capable area within a second preset time or the fixation time does not reach the first preset time; or, if the feedback area of the feedback picture is displayed wrongly, the calibration coefficient is determined again.
8. The eye tracking calibration verification device of claim 5, wherein the feedback area, the first predetermined duration and the second predetermined duration are set according to a usage scenario.
9. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor when executing the program implements the eye tracking calibration verification method according to any one of claims 1-4.
10. A storage medium containing computer executable instructions for performing the eye tracking calibration verification method according to any one of claims 1-4 when executed by a computer processor.
CN202010260984.2A 2020-04-03 2020-04-03 Eyeball tracking calibration inspection method, device, equipment and storage medium Pending CN113491502A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010260984.2A CN113491502A (en) 2020-04-03 2020-04-03 Eyeball tracking calibration inspection method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010260984.2A CN113491502A (en) 2020-04-03 2020-04-03 Eyeball tracking calibration inspection method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113491502A true CN113491502A (en) 2021-10-12

Family

ID=77995232

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010260984.2A Pending CN113491502A (en) 2020-04-03 2020-04-03 Eyeball tracking calibration inspection method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113491502A (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0055338A1 (en) * 1980-12-31 1982-07-07 International Business Machines Corporation Eye controlled user-machine communication
KR20140051480A (en) * 2012-10-04 2014-05-02 삼성전자주식회사 Apparatus and method for display
CN103850582A (en) * 2012-11-30 2014-06-11 由田新技股份有限公司 Eye-movement operation password input method and safe using same
CN105247447A (en) * 2013-02-14 2016-01-13 眼球控制技术有限公司 Systems and methods of eye tracking calibration
CN206026294U (en) * 2016-07-12 2017-03-22 吴越 Wear -type field of vision detector
CN107390863A (en) * 2017-06-16 2017-11-24 北京七鑫易维信息技术有限公司 Control method and device, electronic equipment, the storage medium of equipment
CN110187855A (en) * 2019-05-28 2019-08-30 武汉市天蝎科技有限公司 The intelligent adjusting method for avoiding hologram block vision of near-eye display device
CN110427101A (en) * 2019-07-08 2019-11-08 北京七鑫易维信息技术有限公司 Calibration method, device, equipment and the storage medium of eyeball tracking
CN110623629A (en) * 2019-07-31 2019-12-31 毕宏生 Visual attention detection method and system based on eyeball motion

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113992907A (en) * 2021-10-29 2022-01-28 南昌虚拟现实研究院股份有限公司 Eyeball parameter checking method, system, computer and readable storage medium
CN113992907B (en) * 2021-10-29 2023-11-07 南昌虚拟现实研究院股份有限公司 Eyeball parameter verification method, eyeball parameter verification system, computer and readable storage medium

Similar Documents

Publication Publication Date Title
US10488925B2 (en) Display control device, control method thereof, and display control system
CN108427503B (en) Human eye tracking method and human eye tracking device
CN109410285B (en) Calibration method, calibration device, terminal equipment and storage medium
CN107392120B (en) Attention intelligent supervision method based on sight line estimation
CN108681399B (en) Equipment control method, device, control equipment and storage medium
WO2020015468A1 (en) Image transmission method and apparatus, terminal device, and storage medium
CN109032351B (en) Fixation point function determination method, fixation point determination device and terminal equipment
TW201814572A (en) Facial recognition-based authentication
CN109976535B (en) Calibration method, device, equipment and storage medium
JPH11175246A (en) Sight line detector and method therefor
CN109254662A (en) Mobile device operation method, apparatus, computer equipment and storage medium
CN113495613B (en) Eyeball tracking calibration method and device
CN112987910B (en) Testing method, device, equipment and storage medium of eyeball tracking equipment
CN112733619A (en) Pose adjusting method and device for acquisition equipment, electronic equipment and storage medium
CN106681509A (en) Interface operating method and system
CN110174937A (en) Watch the implementation method and device of information control operation attentively
US10108259B2 (en) Interaction method, interaction apparatus and user equipment
CN113491502A (en) Eyeball tracking calibration inspection method, device, equipment and storage medium
WO2016131337A1 (en) Method and terminal for detecting vision
CN111610886A (en) Method and device for adjusting brightness of touch screen and computer readable storage medium
CN114281236B (en) Text processing method, apparatus, device, medium, and program product
Kathpal et al. iChat: interactive eyes for specially challenged people using OpenCV Python
CN116413915A (en) Luminance adjusting method of near-eye display device, near-eye display device and medium
CN114740966A (en) Multi-modal image display control method and system and computer equipment
CN112631424A (en) Gesture priority control method and system and VR glasses thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination