CN113534945A - Method, device and equipment for determining eyeball tracking calibration coefficient and storage medium - Google Patents

Method, device and equipment for determining eyeball tracking calibration coefficient and storage medium

Info

Publication number
CN113534945A
CN113534945A (application CN202010304327.3A)
Authority
CN
China
Prior art keywords
calibration coefficient
determining
eye
gazing
calibration
Legal status: Pending
Application number
CN202010304327.3A
Other languages
Chinese (zh)
Inventor
杨飞 (Yang Fei)
赖建军 (Lai Jianjun)
Current Assignee
Beijing 7Invensun Technology Co Ltd
Original Assignee
Beijing 7Invensun Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing 7Invensun Technology Co Ltd
Priority to CN202010304327.3A
Publication of CN113534945A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements

Abstract

Embodiments of the invention provide a method, apparatus, device, and storage medium for determining an eye tracking calibration coefficient. The method comprises: collecting an eye image of a user gazing at a set target point; determining the computed gaze point information corresponding to each stored calibration coefficient from that coefficient and the eye image; determining the gaze accuracy of each calibration coefficient from the computed gaze point information and the set target point information; and determining the calibration coefficient whose gaze accuracy satisfies a set condition as the target calibration coefficient. Because the target calibration coefficient suited to the user is selected using only an eye image of the user gazing at the set target point and the stored calibration coefficients, the user's calibration coefficient can be determined quickly, which improves eye tracking efficiency.

Description

Method, device and equipment for determining eyeball tracking calibration coefficient and storage medium
Technical Field
The embodiments of the present invention relate to the technical field of eye tracking, and in particular to a method, apparatus, device, and storage medium for determining an eye tracking calibration coefficient.
Background
When a user uses an eye tracking device, calibration must be performed first to achieve the best tracking result: calibration yields a calibration coefficient for the user, and that coefficient is the one that gives the best eye tracking result for the user in the current software and hardware environment. Obtaining the user's calibration coefficient is therefore essential.
In the prior art, the user must recalibrate the eye tracking device on every use; the calibration process is cumbersome, which harms both eye tracking efficiency and user experience. Alternatively, the user's identity is bound to a stored calibration coefficient and identity recognition is performed at use time to retrieve that coefficient; this approach, however, is constrained by software, hardware, and the user's biometric information, and is prone to misidentification.
Disclosure of Invention
Embodiments of the present invention provide a method, apparatus, device, and storage medium for determining an eye tracking calibration coefficient, which can quickly determine a user's calibration coefficient and thereby improve eye tracking efficiency.
In a first aspect, an embodiment of the present invention provides a method for determining an eye tracking calibration coefficient, comprising:
collecting an eye image of a user gazing at a set target point;
determining the computed gaze point information corresponding to each stored calibration coefficient from that coefficient and the eye image;
determining the gaze accuracy of each calibration coefficient from the computed gaze point information and the set target point information;
and determining the calibration coefficient whose gaze accuracy satisfies a set condition as the target calibration coefficient.
In a second aspect, an embodiment of the present invention further provides an apparatus for determining an eye tracking calibration coefficient, comprising:
an eye image collection module, configured to collect an eye image of a user gazing at a set target point;
a computed gaze point information acquisition module, configured to determine the computed gaze point information corresponding to each stored calibration coefficient from that coefficient and the eye image;
a gaze accuracy determination module, configured to determine the gaze accuracy of each calibration coefficient from the computed gaze point information and the set target point information;
and a target calibration coefficient determination module, configured to determine the calibration coefficient whose gaze accuracy satisfies a set condition as the target calibration coefficient.
In a third aspect, an embodiment of the present invention further provides a computer device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the method for determining an eye tracking calibration coefficient according to the embodiments of the present invention.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processing device, implements the method for determining an eye tracking calibration coefficient according to the embodiments of the present invention.
In these embodiments, an eye image of a user gazing at a set target point is collected; the computed gaze point information corresponding to each stored calibration coefficient is then determined from that coefficient and the eye image; the gaze accuracy of each calibration coefficient is determined from the computed gaze point information and the set target point information; and finally, the calibration coefficient whose gaze accuracy satisfies a set condition is determined as the target calibration coefficient. Because the target calibration coefficient suited to the user is selected using only an eye image of the user gazing at the set target point and the stored calibration coefficients, the user's calibration coefficient can be determined quickly, improving both eye tracking efficiency and user experience.
Drawings
Fig. 1 is a flowchart of a method for determining an eye tracking calibration coefficient according to Embodiment One of the present invention;
Fig. 2 is a schematic structural diagram of an apparatus for determining an eye tracking calibration coefficient according to Embodiment Two of the present invention;
Fig. 3 is a schematic structural diagram of a computer device according to Embodiment Three of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Gaze tracking, also called eye movement tracking, is a technique for estimating the gaze direction and/or gaze point of an eye by measuring eye movement. Concretely, an eye image of the user to be tracked may be captured in real time and the relative positions of eye features analyzed from it to obtain the user's gaze point information; or eyeball movement may be detected through the capacitance between the eyeball and a capacitor plate to obtain the user's gaze point information; or electrodes may be placed at the bridge of the nose, the forehead, the ears, or the earlobes, and eyeball movement detected from the measured myoelectric signals to obtain the user's gaze point information. Of course, other methods of acquiring the user's gaze point information in real time may also be adopted; all fall within the scope of the present invention.
Eye tracking can be implemented by optical recording. The principle of the optical recording method is to use an infrared camera to record the subject's eye movement, i.e., to capture eye images that reflect how the eye moves, and to extract eye features from the captured images in order to build a gaze estimation model. The eye features may include pupil position, pupil shape, iris position, iris shape, eyelid position, eye corner position, and light spot position (the Purkinje image), among others. Optical recording methods include the pupil-corneal reflection method, whose principle is as follows: a near-infrared light source illuminates the eye while an infrared camera photographs it, capturing the reflection point of the light source on the cornea (the light spot, or glint) and thereby obtaining an eye image containing the spot.
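By way of illustration only, since the patent does not fix the mathematical form of a calibration coefficient, the following Python sketch assumes the formulation common in pupil-corneal reflection systems, in which the calibration coefficient is the weight matrix of a second-order polynomial mapping pupil-center-minus-spot-center vectors to gaze coordinates. All names here are hypothetical:

    # Sketch only: the "calibration coefficient" is assumed to be the weight
    # matrix of a 2nd-order polynomial from pupil-glint vectors to gaze points.
    import numpy as np

    def design_matrix(pupil_glint):
        # Polynomial terms of the (N, 2) pupil-center-minus-spot-center vectors.
        x, y = pupil_glint[:, 0], pupil_glint[:, 1]
        return np.stack([np.ones_like(x), x, y, x * y, x**2, y**2], axis=1)

    def fit_calibration(pupil_glint, targets):
        # Least-squares fit of the coefficient matrix; targets is (N, 2).
        A = design_matrix(pupil_glint)
        coeffs, *_ = np.linalg.lstsq(A, targets, rcond=None)
        return coeffs  # shape (6, 2): one column per gaze-coordinate axis

    def estimate_gaze(coeffs, pupil_glint):
        # Computed gaze points for the given features under one coefficient.
        return design_matrix(pupil_glint) @ coeffs

Under this assumption, each stored calibration coefficient is one such weight matrix, and estimate_gaze() plays the role of step 120 in the embodiment below.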
Of course, the eye tracking device may also be a MEMS (micro-electro-mechanical system), comprising, for example, a MEMS infrared scanning mirror, an infrared light source, and an infrared receiver; or a capacitive sensor that detects eyeball movement through the capacitance between the eyeball and a capacitor plate; or a myoelectric detector that places electrodes at the bridge of the nose, the forehead, the ears, or the earlobes and detects eye movement from the measured myoelectric signals.
There are currently many ways for gaze tracking technology to acquire the user's gaze information; they are not described in detail here.
Embodiment One
Fig. 1 is a flowchart of a method for determining an eye tracking calibration coefficient according to Embodiment One of the present invention. The embodiment is applicable to determining a calibration coefficient when a user uses an eye tracking device. The method may be performed by an apparatus for determining an eye tracking calibration coefficient, which may be implemented in hardware and/or software and is generally integrated into a device having the function of determining an eye tracking calibration coefficient, such as a server or a server cluster. As shown in Fig. 1, the method specifically comprises the following steps:
and step 110, collecting an eye image of a user gazing at a set target point.
The set point may be a point displayed at a set position on a screen, or may be a position point of a certain marker designated in the real world, which is collectively referred to as a target point for the user to watch. The eye images may be a plurality of consecutive eye images collected when the user gazes at the set target point. In this embodiment, when the user uses the eye-movement tracking device, the user's eyes need to be calibrated first to obtain the calibration coefficient of the user, so that the user's sight line is tracked according to the calibration coefficient. The user uses an eye tracking device that displays or indicates a point for the user to gaze and collects a plurality of eye images of the user gazing at the target point.
Step 120: determine the computed gaze point information corresponding to each stored calibration coefficient from that coefficient and the eye image.
A stored calibration coefficient is a coefficient previously obtained by calibrating the eyeballs of a different user, whose calibration score exceeded a set threshold, i.e., an optimal calibration coefficient. In this embodiment, one of the stored calibration coefficients can be selected as the current user's calibration coefficient, so that the current user does not need to be recalibrated. The computed gaze point information may be the coordinates of the gaze point computed from a calibration coefficient and the eye image.
Specifically, the computed gaze point information corresponding to each stored calibration coefficient may be determined as follows: perform feature extraction on the eye image to obtain eye feature information, then compute, from each stored calibration coefficient and the eye feature information, the computed gaze point information corresponding to that coefficient.
The eye feature information may include pupil center position information and spot center position information, and may comprise one or more pieces of sub-feature information, i.e., the feature information corresponding to one or more eye images. In this embodiment, feature extraction may proceed as follows: if there are multiple eye images, perform feature extraction on each eye image to obtain multiple pieces of sub-feature information; apply a clustering algorithm to select at least one piece of optimal sub-feature information from them; and determine the selected optimal sub-feature information as the eye feature information.
A piece of sub-feature information may include sub-pupil center position information and sub-spot center position information. In this embodiment, if there are multiple eye images of the user gazing at the set target point, feature extraction is performed on each image to obtain the sub-feature information corresponding to that image, a clustering algorithm then selects one or more pieces of optimal sub-feature information, and finally the computed gaze point information is obtained from the calibration coefficient and the optimal sub-feature information.
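The patent does not name the clustering algorithm used to select the optimal sub-feature information. As one plausible reading, the sketch below uses DBSCAN, which labels outlier frames (e.g., blinks or failed detections) directly; the eps value and the feature layout are assumptions:

    # Sketch only: keep the sub-features in the densest cluster. Each row of
    # sub_features is the feature vector extracted from one eye image (e.g., a
    # pupil-glint vector); eps is in pixels and must be tuned per device.
    import numpy as np
    from sklearn.cluster import DBSCAN

    def select_optimal_subfeatures(sub_features):
        labels = DBSCAN(eps=3.0, min_samples=3).fit_predict(sub_features)
        valid = labels[labels >= 0]          # -1 marks outlier frames
        if valid.size == 0:                  # everything flagged: fall back
            return sub_features
        best = np.bincount(valid).argmax()   # largest (densest) cluster
        return sub_features[labels == best]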
Step 130: determine the gaze accuracy of each calibration coefficient from the computed gaze point information and the set target point information.
The gaze accuracy may be the distance between the computed gaze point and the set target point, or the angle between the line connecting the computed gaze point to the user's eye and the line connecting the set target point to the user's eye.
Optionally, after the computed gaze point corresponding to each stored calibration coefficient is obtained, the distance between the computed gaze point and the set target point is calculated and taken as the gaze accuracy.
Optionally, after the computed gaze point corresponding to each stored calibration coefficient is obtained, the angle between the line connecting the computed gaze point to the user's eye and the line connecting the set target point to the user's eye is calculated and taken as the gaze accuracy.
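A minimal sketch of the two accuracy measures just described; the coordinate conventions (all points expressed in one common frame, with the eye position supplied by the device) are assumptions:

    # Sketch only: gaze accuracy as point distance or as visual angle.
    import numpy as np

    def accuracy_distance(gaze_pt, target_pt):
        # Distance between the computed gaze point and the set target point.
        return float(np.linalg.norm(np.asarray(gaze_pt) - np.asarray(target_pt)))

    def accuracy_angle_deg(gaze_pt, target_pt, eye_pos):
        # Angle between the eye-to-gaze-point and eye-to-target-point lines.
        v1 = np.asarray(gaze_pt) - np.asarray(eye_pos)
        v2 = np.asarray(target_pt) - np.asarray(eye_pos)
        cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
        return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))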
Step 140: determine the calibration coefficient whose gaze accuracy satisfies a set condition as the target calibration coefficient.
Specifically, if the gaze accuracy is the distance between the computed gaze point and the set target point, a calibration coefficient whose distance is smaller than a first set value may be determined as the target calibration coefficient, where the first set value may be a value between 0.6 mm and 2 mm.
If the gaze accuracy is the angle described above, a calibration coefficient whose angle is smaller than a second set value may be determined as the target calibration coefficient, where the second set value may be a value between 0.5 degrees and 2 degrees.
It should be noted that the specific values of the first and second set values depend on the accuracy of the eye tracking device itself and the accuracy required by the actual application, and may be set and changed according to actual requirements.
In this embodiment, when the target calibration coefficient is determined, it may be selected from all stored calibration coefficients, or from a subset of them (e.g., the calibration coefficients whose scores rank highest), as in the sketch below.
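Putting steps 120-140 together, a hedged end-to-end sketch, reusing estimate_gaze() and accuracy_distance() from the sketches above; preferring the smallest error among qualifying coefficients is an assumed tie-breaking rule:

    # Sketch only: score every stored coefficient on the verification image(s)
    # and return one whose gaze accuracy meets the set condition.
    def select_target_coefficient(stored_coeffs, features, target_pt,
                                  threshold_mm=1.0):  # within the 0.6-2 mm range above
        best = None
        for coeffs in stored_coeffs:
            gaze_pt = estimate_gaze(coeffs, features).mean(axis=0)  # step 120
            err = accuracy_distance(gaze_pt, target_pt)             # step 130
            if err < threshold_mm and (best is None or err < best[1]):
                best = (coeffs, err)                                # step 140
        return None if best is None else best[0]  # None: fall back to full calibration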
In the technical solution of this embodiment, an eye image of a user gazing at a set target point is first collected; the computed gaze point information corresponding to each stored calibration coefficient is then determined from that coefficient and the eye image; the gaze accuracy of each calibration coefficient is determined from the computed gaze point information and the set target point information; and finally, the calibration coefficient whose gaze accuracy satisfies a set condition is determined as the target calibration coefficient. Because the target calibration coefficient suited to the user is selected using only an eye image of the user gazing at the set target point and the stored calibration coefficients, the user's calibration coefficient can be determined quickly, improving both eye tracking efficiency and user experience.
Optionally, before the eye image of the user gazing at the set target point is collected, the method further comprises: performing eye tracking calibration on a plurality of users to obtain a plurality of calibration coefficients; calculating a score for each calibration coefficient and determining the calibration coefficients whose scores exceed a set threshold as optimal calibration coefficients; and storing the optimal calibration coefficients.
The process of performing eye tracking calibration on a user to obtain a calibration coefficient may be: collect eye images of the user gazing at each of a plurality of target points; process the eye images corresponding to each target point with a clustering algorithm to obtain at least one optimal eye image corresponding to that target point; and compute the calibration coefficient from the at least one optimal eye image corresponding to each target point, as sketched below.
Specifically, the clustering may proceed as follows: for the eye images corresponding to the current target point, first extract the feature information of each eye image to obtain multiple pieces of sub-feature information, then cluster the sub-feature information to obtain one or more pieces of optimal sub-feature information for the current target point. Correspondingly, the calibration coefficient may be computed from the one or more pieces of optimal sub-feature information corresponding to each target point.
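A sketch of this calibration procedure under the same assumptions as above, reusing select_optimal_subfeatures() and fit_calibration():

    # Sketch only: fit one user's calibration coefficient from the optimal
    # sub-features retained for each calibration target point.
    import numpy as np

    def calibrate(per_target_subfeatures, target_points):
        # per_target_subfeatures: one (N_i, 2) array of pupil-glint vectors per
        # target point; target_points: iterable of (x, y) target positions.
        feats, targets = [], []
        for subfeats, pt in zip(per_target_subfeatures, target_points):
            best = select_optimal_subfeatures(np.asarray(subfeats))
            feats.append(best)
            targets.append(np.tile(np.asarray(pt, dtype=float), (len(best), 1)))
        return fit_calibration(np.vstack(feats), np.vstack(targets))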
The score of a calibration coefficient may be calculated as follows: determine the gaze accuracy of the computed gaze point for each target point from the calibration coefficient and the at least one optimal eye image corresponding to that target point, then compute the score of the calibration coefficient from these gaze accuracies. The lower the gaze accuracy value (i.e., the smaller the error), the higher the score of the corresponding calibration coefficient.
As above, the gaze accuracy may be the distance between the computed gaze point and the target point, or the angle between the line connecting the computed gaze point to the user's eye and the line connecting the target point to the user's eye. Specifically, for the current target point, the computed gaze point information is obtained from the one or more pieces of optimal sub-feature information of the current target point and the calibration coefficient, and the gaze accuracy of the current computed gaze point is then calculated from the computed gaze point information and the current target point information.
In this embodiment, the score of a calibration coefficient may be computed from the gaze accuracies of the individual computed gaze points by averaging them and mapping the average to a score. A correspondence between gaze accuracy and score may be preset, and a mapping table from gaze accuracy to calibration coefficient score created from it. After the calibration coefficient is obtained, the gaze accuracy of each computed gaze point is determined from the coefficient, the average gaze accuracy is computed, and the score corresponding to the average is looked up in the mapping table. For example, if the gaze accuracies of the computed gaze points are 0.3, 0.33, 0.44, 0.35, and 0.46, the average gaze accuracy is 0.376; if the mapping table assigns 0.376 a score of 92 and the set threshold is 90, the calibration coefficient with score 92 is determined as an optimal calibration coefficient and stored.
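A sketch of the scoring step, consistent with the worked example above (mean accuracy 0.376 mapping to a score of 92 against a threshold of 90); the mapping-table breakpoints other than the one implied by that example are assumptions:

    # Sketch only: map the average gaze accuracy (lower error = better) of the
    # computed gaze points to a score via an assumed lookup table.
    def score_from_accuracies(accuracies,
                              mapping=((0.3, 95), (0.4, 92), (0.5, 88))):
        mean_acc = sum(accuracies) / len(accuracies)
        for upper_bound, score in mapping:
            if mean_acc <= upper_bound:
                return score
        return 0  # worse than the table covers

    coeff_score = score_from_accuracies([0.3, 0.33, 0.44, 0.35, 0.46])  # -> 92
    SET_THRESHOLD = 90
    store_coefficient = coeff_score > SET_THRESHOLD  # True: store as optimal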
In this embodiment, calibration coefficients whose scores exceed the set threshold are stored for subsequent users of the eye tracking device, so those users do not need to be recalibrated, which improves eye tracking efficiency.
Embodiment Two
Fig. 2 is a schematic structural diagram of an apparatus for determining an eye tracking calibration coefficient according to Embodiment Two of the present invention. As shown in Fig. 2, the apparatus comprises: an eye image collection module 210, a computed gaze point information acquisition module 220, a gaze accuracy determination module 230, and a target calibration coefficient determination module 240.
The eye image collection module 210 is configured to collect an eye image of a user gazing at a set target point.
The computed gaze point information acquisition module 220 is configured to determine the computed gaze point information corresponding to each stored calibration coefficient from that coefficient and the eye image.
The gaze accuracy determination module 230 is configured to determine the gaze accuracy of each calibration coefficient from the computed gaze point information and the set target point information.
The target calibration coefficient determination module 240 is configured to determine the calibration coefficient whose gaze accuracy satisfies a set condition as the target calibration coefficient.
Optionally, the computed gaze point information acquisition module 220 is further configured to:
perform feature extraction on the eye image to obtain eye feature information, the eye feature information including pupil center position information and spot center position information;
and compute, from each stored calibration coefficient and the eye feature information, the computed gaze point information corresponding to that coefficient.
Optionally, the computed gaze point information acquisition module 220 is further configured to:
if there are multiple eye images, perform feature extraction on each eye image to obtain multiple pieces of sub-feature information;
apply a clustering algorithm to select at least one piece of optimal sub-feature information, and determine the at least one piece of optimal sub-feature information as the eye feature information.
Optionally, the gaze accuracy determination module 230 is further configured to:
obtain the distance between the computed gaze point and the set target point, and determine the distance as the gaze accuracy.
Correspondingly, the target calibration coefficient determination module 240 is further configured to:
determine a calibration coefficient whose distance is smaller than a first set value as the target calibration coefficient.
Optionally, the gaze accuracy determination module 230 is further configured to:
determine the angle between the line connecting the computed gaze point to the user's eye and the line connecting the set target point to the user's eye, and determine the angle as the gaze accuracy.
Correspondingly, the target calibration coefficient determination module 240 is further configured to:
determine a calibration coefficient whose angle is smaller than a second set value as the target calibration coefficient.
Optionally, the apparatus further comprises an optimal calibration coefficient acquisition module, configured to:
perform eye tracking calibration on a plurality of users to obtain a plurality of calibration coefficients;
calculate a score for each calibration coefficient, and determine the calibration coefficients whose scores exceed a set threshold as optimal calibration coefficients;
and store the optimal calibration coefficients.
Optionally, the optimal calibration coefficient acquisition module is further configured to:
collect eye images of the user gazing at each of a plurality of target points;
process the eye images corresponding to each target point with a clustering algorithm to obtain at least one optimal eye image corresponding to that target point;
and compute the calibration coefficient from the at least one optimal eye image corresponding to each target point.
Optionally, the optimal calibration coefficient acquisition module is further configured to:
calculate the gaze accuracy for each target point from the calibration coefficient and the at least one optimal eye image corresponding to that target point;
and calculate the score of the calibration coefficient from the gaze accuracies of the target points.
The apparatus can perform the methods provided by all the embodiments of the present invention and has the corresponding functional modules and beneficial effects. For details not described in this embodiment, reference may be made to the methods provided in the foregoing embodiments of the present invention.
Embodiment Three
Fig. 3 is a schematic structural diagram of a computer device according to Embodiment Three of the present invention. Fig. 3 illustrates a block diagram of a computer device 312 suitable for implementing embodiments of the present invention. The computer device 312 shown in Fig. 3 is only an example and should not impose any limitation on the functionality or scope of use of the embodiments. Computer device 312 is a typical computing device having the function of determining eye tracking calibration coefficients.
As shown in FIG. 3, computer device 312 is in the form of a general purpose computing device. The components of computer device 312 may include, but are not limited to: one or more processors 316, a storage device 328, and a bus 318 that couples the various system components including the storage device 328 and the processors 316.
Bus 318 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an enhanced ISA bus, a Video Electronics Standards Association (VESA) local bus, and a Peripheral Component Interconnect (PCI) bus.
Computer device 312 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by computer device 312 and includes both volatile and nonvolatile media, removable and non-removable media.
Storage 328 may include computer system readable media in the form of volatile memory, such as random access memory (RAM) 330 and/or cache memory 332. The computer device 312 may further include other removable/non-removable, volatile/non-volatile computer system storage media. By way of example only, storage system 334 may be used to read from and write to non-removable, non-volatile magnetic media (not shown in Fig. 3, commonly referred to as a "hard drive"). Although not shown in Fig. 3, a magnetic disk drive for reading from and writing to a removable, non-volatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, non-volatile optical disk (e.g., a compact disc read-only memory (CD-ROM), digital video disc (DVD-ROM), or other optical media) may be provided. In these cases, each drive may be connected to bus 318 via one or more data media interfaces. Storage 328 may include at least one program product having a set (e.g., at least one) of program modules configured to carry out the functions of embodiments of the invention.
A program 336 having a set (at least one) of program modules 326 may be stored, for example, in storage 328. Such program modules 326 include, but are not limited to, an operating system, one or more application programs, other program modules, and program data; each of these examples, or some combination thereof, may include an implementation of a network environment. Program modules 326 generally carry out the functions and/or methodologies of the embodiments of the invention described herein.
The computer device 312 may also communicate with one or more external devices 314 (e.g., a keyboard, a pointing device, a camera, or a display 324), with one or more devices that enable a user to interact with the computer device 312, and/or with any device (e.g., a network card or modem) that enables the computer device 312 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interfaces 322. The computer device 312 may also communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) via the network adapter 320. As shown, the network adapter 320 communicates with the other modules of the computer device 312 via the bus 318. It should be appreciated that, although not shown, other hardware and/or software modules may be used in conjunction with the computer device 312, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems.
The processor 316 executes the programs stored in the storage device 328 to perform various functional applications and data processing, for example implementing the method for determining an eye tracking calibration coefficient provided by the above embodiments of the present invention.
Embodiment Four
Embodiments of the present invention provide a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processing device, implements a method for determining an eyeball tracking calibration coefficient according to an embodiment of the present invention. The computer readable medium of the present invention described above may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the client and server may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with digital data communication in any form or medium (e.g., a communication network). Examples of communication networks include local area networks ("LAN"), wide area networks ("WAN"), internetworks (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: collect an eye image of a user gazing at a set target point; determine the computed gaze point information corresponding to each stored calibration coefficient from that coefficient and the eye image; determine the gaze accuracy of each calibration coefficient from the computed gaze point information and the set target point information; and determine the calibration coefficient whose gaze accuracy satisfies a set condition as the target calibration coefficient.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages or combinations thereof, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the latter case, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. Where the name of an element does not in some cases constitute a limitation on the element itself.
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on a Chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (11)

1. A method for determining an eye tracking calibration coefficient, comprising:
collecting an eye image of a user gazing at a set target point;
determining the computed gaze point information corresponding to each stored calibration coefficient from that coefficient and the eye image;
determining the gaze accuracy of each calibration coefficient from the computed gaze point information and the set target point information;
and determining the calibration coefficient whose gaze accuracy satisfies a set condition as the target calibration coefficient.
2. The method of claim 1, wherein determining the computed gaze point information corresponding to each stored calibration coefficient from that coefficient and the eye image comprises:
performing feature extraction on the eye image to obtain eye feature information, the eye feature information comprising pupil center position information and spot center position information;
and computing, from each stored calibration coefficient and the eye feature information, the computed gaze point information corresponding to that coefficient.
3. The method of claim 2, wherein performing feature extraction on the eye image to obtain the eye feature information comprises:
if there are multiple eye images, performing feature extraction on each eye image to obtain multiple pieces of sub-feature information;
applying a clustering algorithm to select at least one piece of optimal sub-feature information, and determining the at least one piece of optimal sub-feature information as the eye feature information.
4. The method of claim 1, wherein determining the gaze accuracy of each calibration coefficient from the computed gaze point information and the set target point information comprises:
obtaining the distance between the computed gaze point and the set target point, and determining the distance as the gaze accuracy;
and correspondingly, determining the calibration coefficient whose gaze accuracy satisfies the set condition as the target calibration coefficient comprises:
determining a calibration coefficient whose distance is smaller than a first set value as the target calibration coefficient.
5. The method of claim 1, wherein determining the gaze accuracy of each calibration coefficient from the computed gaze point information and the set target point information comprises:
determining the angle between the line connecting the computed gaze point to the user's eye and the line connecting the set target point to the user's eye, and determining the angle as the gaze accuracy;
and correspondingly, determining the calibration coefficient whose gaze accuracy satisfies the set condition as the target calibration coefficient comprises:
determining a calibration coefficient whose angle is smaller than a second set value as the target calibration coefficient.
6. The method of claim 1, further comprising, before collecting the eye image of the user gazing at the set target point:
performing eye tracking calibration on a plurality of users to obtain a plurality of calibration coefficients;
calculating a score for each calibration coefficient, and determining the calibration coefficients whose scores exceed a set threshold as optimal calibration coefficients;
and storing the optimal calibration coefficients.
7. The method of claim 6, wherein performing eye tracking calibration on a user to obtain a calibration coefficient comprises:
collecting eye images of the user gazing at each of a plurality of target points;
processing the eye images corresponding to each target point with a clustering algorithm to obtain at least one optimal eye image corresponding to that target point;
and computing the calibration coefficient from the at least one optimal eye image corresponding to each target point.
8. The method of claim 7, wherein calculating the score for each calibration coefficient comprises:
calculating the gaze accuracy for each target point from the calibration coefficient and the at least one optimal eye image corresponding to that target point;
and calculating the score of the calibration coefficient from the gaze accuracies of the target points.
9. An apparatus for determining an eye tracking calibration coefficient, comprising:
an eye image collection module, configured to collect an eye image of a user gazing at a set target point;
a computed gaze point information acquisition module, configured to determine the computed gaze point information corresponding to each stored calibration coefficient from that coefficient and the eye image;
a gaze accuracy determination module, configured to determine the gaze accuracy of each calibration coefficient from the computed gaze point information and the set target point information;
and a target calibration coefficient determination module, configured to determine the calibration coefficient whose gaze accuracy satisfies a set condition as the target calibration coefficient.
10. A computer device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the method for determining an eye tracking calibration coefficient according to any one of claims 1-8.
11. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processing device, implements the method for determining an eye tracking calibration coefficient according to any one of claims 1-8.
CN202010304327.3A 2020-04-17 2020-04-17 Method, device and equipment for determining eyeball tracking calibration coefficient and storage medium Pending CN113534945A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010304327.3A CN113534945A (en) 2020-04-17 2020-04-17 Method, device and equipment for determining eyeball tracking calibration coefficient and storage medium

Publications (1)

Publication Number Publication Date
CN113534945A (en) 2021-10-22

Family

ID=78123278

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010304327.3A Pending CN113534945A (en) 2020-04-17 2020-04-17 Method, device and equipment for determining eyeball tracking calibration coefficient and storage medium

Country Status (1)

Country Link
CN (1) CN113534945A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023071952A1 (en) * 2021-10-29 2023-05-04 南昌虚拟现实研究院股份有限公司 Eyeball parameter checking method and system, and computer and readable storage medium
CN115857678A (en) * 2022-11-21 2023-03-28 北京中科睿医信息科技有限公司 Eye movement testing method, device, equipment and storage medium
CN115857678B (en) * 2022-11-21 2024-03-29 北京中科睿医信息科技有限公司 Eye movement testing method, device, equipment and storage medium

Similar Documents

Publication Title
AU2018385433B2 (en) Digital visual acuity eye examination for remote physician assessment
US8913789B1 (en) Input methods and systems for eye positioning using plural glints
US9489574B2 (en) Apparatus and method for enhancing user recognition
US20140099623A1 (en) Social graphs based on user bioresponse data
CN110807427B (en) Sight tracking method and device, computer equipment and storage medium
WO2023011339A1 (en) Line-of-sight direction tracking method and apparatus
US20220301217A1 (en) Eye tracking latency enhancements
CN113534945A (en) Method, device and equipment for determining eyeball tracking calibration coefficient and storage medium
CN110458441A (en) Checking method, device, system and the storage medium of quality inspection
CN112987910B (en) Testing method, device, equipment and storage medium of eyeball tracking equipment
CN111916203A (en) Health detection method and device, electronic equipment and storage medium
JP5719216B2 (en) Gaze measurement apparatus and gaze measurement program
CN112836685A (en) Reading assisting method, system and storage medium
CN113741682A (en) Method, device and equipment for mapping fixation point and storage medium
CN115624315B (en) Eye movement tracking method and device, electronic equipment, computer storage medium and product
CN112651270A (en) Gaze information determination method and apparatus, terminal device and display object
EP3018558B1 (en) Method and system for detecting objects of interest
CN111933277A (en) Method, device, equipment and storage medium for detecting 3D vertigo
JP2015123262A (en) Sight line measurement method using corneal surface reflection image, and device for the same
CN112433664A (en) Man-machine interaction method and device used in book reading process and electronic equipment
US20240013431A1 (en) Image capture devices, systems, and methods
CN115857678B (en) Eye movement testing method, device, equipment and storage medium
CN115762772B (en) Method, device, equipment and storage medium for determining emotional characteristics of target object
CN114428547A (en) Sight tracking method, device, equipment and storage medium
CN117058148B (en) Imaging quality detection method, device and equipment for nystagmus patient

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination