CN118170250A - Eyeball tracking device and eyeball tracking method - Google Patents

Eyeball tracking device and eyeball tracking method

Info

Publication number
CN118170250A
Authority
CN
China
Prior art keywords
iris
image
eyeball
infrared
user
Prior art date
Legal status (an assumption, not a legal conclusion; Google has not performed a legal analysis)
Pending
Application number
CN202410177228.1A
Other languages
Chinese (zh)
Inventor
刘木
Current Assignee (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (an assumption, not a legal conclusion; Google has not performed a legal analysis)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202410177228.1A
Publication of CN118170250A
Legal status: Pending

Landscapes

  • Eye Examination Apparatus (AREA)

Abstract

The application discloses an eyeball tracking device and an eyeball tracking method, belonging to the technical field of electronic equipment. The eyeball tracking device includes: a housing; a lens arranged on the housing; a first light source for emitting first infrared light toward the eyes of a user; a second light source for emitting second infrared light toward the eyes of the user, the wavelength of the first infrared light being different from the wavelength of the second infrared light; a first infrared camera for receiving the first infrared light reflected by the eyes of the user and filtering out the second infrared light to form a first eyeball image; and a second infrared camera for receiving the second infrared light reflected by the eyes of the user and filtering out the first infrared light to form a second eyeball image.

Description

Eyeball tracking device and eyeball tracking method
Technical Field
The present application belongs to the technical field of electronic equipment, and in particular relates to an eyeball tracking device and an eyeball tracking method.
Background
With the continuous development of technology and growing demands for convenience and security, many MR and AR devices now provide both eyeball tracking and iris recognition. Eyeball tracking typically works by photographing, with an infrared camera, the infrared reflection points contained in an eyeball image (i.e., the points where infrared lamps reflect off the eyeball), locating the pupil position from the positions of these reflection points, and determining the gaze point of the user's eyeball through a suitable algorithm.
During eyeball tracking, an infrared reflection point can easily fall on the pupil edge, in the iris region, or on both, making the eyeball features difficult to identify and degrading the eyeball tracking effect.
Disclosure of Invention
Embodiments of the present application aim to provide an eyeball tracking device, an eyeball tracking method, and related equipment, which can at least solve the problem that the position of an infrared reflection point interferes with the identification of eyeball features during eyeball tracking.
In a first aspect, an embodiment of the present application provides an eye tracking apparatus, including:
a housing;
a lens provided in the housing;
a first light source for emitting first infrared light toward the eyes of a user;
a second light source for emitting second infrared light toward the eyes of the user, the wavelength of the first infrared light being different from the wavelength of the second infrared light;
a first infrared camera for receiving the first infrared light reflected by the eyes of the user and filtering out the second infrared light to form a first eyeball image; and
a second infrared camera for receiving the second infrared light reflected by the eyes of the user and filtering out the first infrared light to form a second eyeball image.
In a second aspect, an embodiment of the present application provides an eye tracking method, which is applied to the eye tracking device in the first aspect, including:
acquiring a first eyeball image captured by the first infrared camera and a second eyeball image captured by the second infrared camera;
determining position information of a highlight reflection point in the first eyeball image and positioning information of a pupil in the second eyeball image; and
determining the gaze point and/or gaze direction of the user according to the position information of the highlight reflection point, the position information of the first light source, and the positioning information of the pupil.
In a third aspect, an embodiment of the present application provides an electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the method as described in the second aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium having stored thereon a program or instructions which when executed by a processor perform the steps of the method according to the second aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and where the processor is configured to execute a program or instructions to implement a method according to the second aspect.
In a sixth aspect, embodiments of the present application provide a computer program product stored in a storage medium, the program product being executable by at least one processor to implement the method according to the second aspect.
In an embodiment of the present application, an eyeball tracking device includes: a housing; a lens arranged on the housing; a first light source for emitting first infrared light toward the eyes of a user; a second light source for emitting second infrared light toward the eyes of the user, the wavelength of the first infrared light being different from the wavelength of the second infrared light; a first infrared camera for receiving the first infrared light reflected by the eyes of the user and filtering out the second infrared light to form a first eyeball image; and a second infrared camera for receiving the second infrared light reflected by the eyes of the user and filtering out the first infrared light to form a second eyeball image. Because the first light source and the second light source emit infrared light of different wavelengths toward the user's eyes, the first infrared camera and the second infrared camera can each receive only the infrared light of the corresponding wavelength reflected by the eyes. This reduces the influence of the infrared reflection points on the identification of eyeball features in each eyeball image and improves the eyeball tracking effect.
Drawings
Figs. 1a-1c are schematic diagrams illustrating possible locations of the highlight reflection points formed by infrared light during eye tracking;
Fig. 2 is a schematic structural diagram of an eye tracking device according to an embodiment of the present application;
Fig. 3 is a schematic diagram of camera layout positions according to an embodiment of the present application;
Fig. 4 is a schematic structural diagram of the front surface of the optical engine lens barrel according to an embodiment of the present application;
Fig. 5 is a flowchart of an eye tracking method according to an embodiment of the present application;
Fig. 6 is a flowchart of a gaze point determining method according to an embodiment of the present application;
Fig. 7 is a schematic diagram of a gaze point determining method according to an embodiment of the present application;
Fig. 8 is a flowchart of an iris recognition method according to an embodiment of the present application.
Detailed Description
The technical solutions of the embodiments of the present application will be clearly described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which are obtained by a person skilled in the art based on the embodiments of the present application, fall within the scope of protection of the present application.
The terms "first", "second", and the like in the description and claims are used to distinguish between similar elements and do not necessarily describe a particular sequence or chronological order. It should be understood that terms so used are interchangeable under appropriate circumstances, so that the embodiments of the application can operate in sequences other than those illustrated or described herein. The objects identified by "first", "second", etc. are generally of one type, and the number of objects is not limited; for example, the first object may be one or more. Furthermore, in the description and claims, "and/or" means at least one of the connected objects, and the character "/" generally means that the associated objects are in an "or" relationship.
The electronic device provided by the embodiment of the application is described in detail below through specific embodiments and application scenes thereof with reference to the accompanying drawings.
Eyeball tracking performs image processing with instruments and equipment to locate the pupil position, obtain pupil position coordinates, and calculate the point at which the eye is gazing through a suitable algorithm. Iris recognition is based on the iris texture information in the eye. Many current MR and AR devices provide both eyeball tracking and iris recognition. However, during eyeball tracking and iris recognition, the following situations typically occur:
Case 1: as shown in fig. 1a, a highlight reflection point 1 formed by infrared light appears at the pupil edge 2. This interferes with the identification of pupil edge information, so the position of the pupil center cannot be accurately located, directly degrading both eyeball tracking and iris recognition.
Case 2: as shown in fig. 1b, the highlight reflection point 1 formed by infrared light appears in the iris region 3. Since the iris region 3 contains a large number of texture and color features, a highlight reflection point 1 located in the iris region 3 interferes with iris feature extraction and recognition, making the iris recognition result inaccurate.
Case 3: as shown in fig. 1c, highlight reflection points 1 formed by infrared light appear at both the pupil edge 2 and the iris region 3, degrading both eyeball tracking and iris recognition.
In order to solve the above-mentioned problems of MR or AR devices in the process of eye tracking and iris recognition, an embodiment of the present application provides an eye tracking device, as shown in fig. 2, which includes:
a housing;
a lens 210 provided in the housing;
a first light source 220 for emitting first infrared light toward the user's eyes 5;
a second light source 230 for emitting second infrared light toward the user's eyes 5, the wavelength of the first infrared light being different from the wavelength of the second infrared light;
a first infrared camera 240 for receiving the first infrared light reflected by the user's eyes 5 and filtering out the second infrared light to form a first eyeball image; and
a second infrared camera 250 for receiving the second infrared light reflected by the user's eyes 5 and filtering out the first infrared light to form a second eyeball image.
In the embodiment of the application, the eye tracking device includes a housing in which a lens 210 is disposed, with a first light source 220 and a second light source 230 arranged at the edges of the lens. Because the first light source 220 and the second light source 230 emit infrared light of different wavelengths toward the user's eyes 5, the first infrared camera 240 and the second infrared camera 250 can each receive only the infrared light of the corresponding wavelength reflected by the eyes. This reduces the influence of the position of a highlight reflection point 1 formed by infrared light on the identification of eye features in each eye image: for example, it reduces the influence of a highlight reflection point 1 formed by the second infrared light on the pupil edge 2 in the first eyeball image, and the influence of a highlight reflection point 1 formed by the first infrared light on the iris region 3 in the second eyeball image. Clearer eye features can therefore be identified from the first and second eyeball images, improving the eye tracking effect.
In one possible implementation, the wavelength of the first infrared light is adjustable within a first band of wavelengths, and the wavelength of the second infrared light is adjustable within a second band of wavelengths; the first band range and the second band range do not overlap.
In the embodiment of the present application, the first band corresponding to the first infrared light emitted from the first light source 220 is denoted as band A, and the second band corresponding to the second infrared light emitted from the second light source 230 is denoted as band B. The wavelength of the first infrared light is adjustable within band A, and the wavelength of the second infrared light is adjustable within band B. The infrared range is typically 780 nm to 2500 nm; bands A and B both lie within this range and do not overlap.
In one possible implementation, the first infrared camera 240 includes a first infrared filter for transmitting the first infrared light and filtering the second infrared light;
the second infrared camera 250 includes a second infrared filter for transmitting the second infrared light and filtering the first infrared light.
In the embodiment of the present application, the first infrared camera 240 and the second infrared camera 250 operate in different wavebands: the first infrared camera 240 receives the first infrared light reflected by the user's eyes 5 and filters out the second infrared light to form a first eyeball image, while the second infrared camera 250 receives the second infrared light reflected by the user's eyes 5 and filters out the first infrared light to form a second eyeball image. A first infrared filter may be disposed at the first infrared camera 240, which transmits the first infrared light and filters out the second infrared light; a second infrared filter is disposed at the second infrared camera 250, which transmits the second infrared light and filters out the first infrared light. In this way, the first infrared camera 240 is prevented from receiving the second infrared light reflected by the user's eye 5, so the first eyeball image does not include a highlight reflection point 1 of the second infrared light; likewise, the second infrared camera 250 is prevented from receiving the first infrared light reflected by the user's eye 5, so the second eyeball image does not include a highlight reflection point 1 of the first infrared light.
In one possible implementation, the device may include a plurality of first light sources 220 disposed around the optical axis 211 of the lens 210, with the second light source 230 disposed on the side of the first light sources 220 away from the optical axis 211 of the lens 210.
In practice, an MR or AR device typically includes six first light sources 220, arranged in a ring along the edge of the lens and facing the user's eyes 5. The second light source 230 is disposed on the side of the first light sources 220 away from the optical axis 211 of the lens 210, so that the reflection of the second infrared light off the user's eye 5 is not received by the second infrared camera 250 and does not form a highlight reflection point 1 on the second eyeball image.
The device may also include a plurality of second light sources 230 disposed on opposite sides of the lens 210.
In practical applications, an MR or AR device generally places second light sources 230 on the upper and lower sides of the lens edge, so that the second infrared camera 250 can capture, from the second infrared light, a second eyeball image that is clear and free of highlight reflection points 1.
In one possible implementation, the second light source 230 is disposed on the first side and/or the second side of the lens 210, and the second infrared camera 250 is disposed on the third side and/or the fourth side of the lens 210; the first side and the second side are one opposite side of the lens 210, and the third side and the fourth side are the other opposite side of the lens 210; the first infrared camera 240 is disposed on the first side or the second side of the lens 210.
In the embodiment of the present application, the second light sources 230 are disposed on the upper and lower sides of the lens 210, and the second infrared cameras 250 may be disposed on the left and right sides of the lens 210. Preferably, as shown in fig. 3, the second infrared cameras 250 may be disposed so that their projections on the plane of the lens 210 lie on the target horizontal line l, for example at position 6 and position 7. The first infrared camera 240 is disposed on the upper or lower side of the lens 210. Because the user may blink, during which the upper eyelid moves down while the lower eyelid stays in place, the projection of the first infrared camera 240 on the plane of the lens 210 is preferably located below the target horizontal line l, for example at position 8, position 9, or position 10. The target horizontal line l may be a horizontal line passing through the optical center of the lens 210.
In one possible implementation, the first infrared light causes the first eyeball image to contain a plurality of highlight reflection points 1, while the second infrared light leaves the pupil image and iris image in the second eyeball image free of highlight reflection points 1.
In the embodiment of the present application, the first infrared camera 240 receives the first infrared light reflected by the user's eyes 5 and filters out the second infrared light to form a first eyeball image. Because the plurality of first light sources 220 are disposed around the optical axis 211 of the lens 210, the first infrared light reflected by the user's eyes 5 is incident on the first infrared camera 240 and forms light spots (i.e., highlight reflection points) on the first eyeball image it captures. The second infrared camera 250 receives the second infrared light reflected by the user's eyes and filters out the first infrared light to form a second eyeball image. Because the second light source 230 is arranged on the side of the first light sources 220 away from the optical axis 211 of the lens 210, very little of the second infrared light reflected by the user's eyes 5 reaches the second infrared camera 250, and the probability of a light spot forming on the second eyeball image is very low; that is, the pupil image and iris image in the second eyeball image have no highlight reflection point 1.
During acquisition of the first eyeball image and the second eyeball image, the first light source 220 and the second light source 230 are turned on synchronously and turned off synchronously;
The first infrared camera 240 and the second infrared camera 250 are synchronously exposed, the output frame rate of the first infrared camera 240 is the same as the output frame rate of the second infrared camera 250, and the exposure time of the first infrared camera 240 is the same as the exposure time of the second infrared camera 250.
Thus, the first eyeball image and the second eyeball image can be acquired simultaneously, and timeliness of eyeball tracking is ensured.
In a possible implementation manner, a first preset angle r1 is formed between the outgoing direction of the first infrared light and the optical axis 211; a second preset angle r2 is formed between the emergent direction of the second infrared light and the optical axis 211, and the second preset angle r2 is smaller than the first preset angle r1.
In the embodiment of the present application, the first preset angle r1 may be set according to actual needs. In the worn state, the point where the first infrared light intersects the optical axis 211 may be the center of the eyeball, so the first eyeball image formed by the first infrared camera 240 from the first infrared light includes clear highlight reflection points 1. The second preset angle r2 is smaller than the first preset angle r1, which reduces the probability that the second infrared light forms a light spot on the second eyeball image captured by the second infrared camera 250, so the pupil image and iris image in the second eyeball image have no highlight reflection point 1. In practical application, as shown in fig. 4, the eye tracking device includes a lens 210 embedded in an optical engine housing 260. A light ring frame 250 is disposed at the edge of the lens 210, and the first light source 220 and the second light source 230 are disposed on the light ring frame 250, with the second light source 230 farther from the lens edge than the first light source 220. The first light source 220 points toward the center of the eyeball, while the second light source 230 points toward the eyelid position: the deployed horizontal tilt angle of the second light source 230 is large relative to that of the first light source 220. As a result, the chief ray of the second infrared light is not reflected by the user's eyes 5 toward the second infrared camera 250; only a small amount of edge light can reach it, so the probability of a light spot forming on the second eyeball image is greatly reduced.
Therefore, both the first eyeball image and the second eyeball image can be used for eyeball tracking: the reflection points are available from one image and a clear eyeball image from the other, reducing the influence of the infrared reflection points on the identification of eyeball features and improving the eyeball tracking effect.
In one possible implementation manner, the eye tracking may be performed according to the first eye image and the second eye image, including:
Determining position information of a highlight reflection point 1 in the first eyeball image and positioning information of a pupil 4 in the second eyeball image; the gaze point and/or gaze direction of the user is determined from the positional information of the highlighting spot 1, the positional information of the first light source 220, and the positioning information of the pupil 4.
In one possible implementation, the method further includes:
Determining an iris feature image in the second eyeball image; and extracting target iris characteristics in the iris characteristic image, and matching the target iris characteristics with pre-stored iris characteristics to obtain an iris recognition result of the user.
That is, eyeball tracking and iris recognition can both be performed on the first and second eyeball images captured by the eyeball tracking device, reducing the influence of the position of the highlight reflection point 1 of infrared light on eyeball feature recognition in each eyeball image and improving the effect of both eyeball tracking and iris recognition.
Referring to fig. 5, fig. 5 is a flowchart illustrating an eye tracking method according to an embodiment of the application. The eye tracking method 500 may be applied to the eye tracking device described above, and specifically may include the following steps:
S501: a first eyeball image captured by the first infrared camera 240 and a second eyeball image captured by the second infrared camera 250 are acquired.
In the embodiment of the present application, during eye tracking, a first eyeball image captured by the first infrared camera 240 and a second eyeball image captured by the second infrared camera 250 are obtained. The first infrared light causes the first eyeball image to contain a plurality of highlight reflection points, while the second infrared light leaves the pupil image and iris image in the second eyeball image free of highlight reflection points.
S502: position information of a highlight reflection point in the first eyeball image and positioning information of a pupil in the second eyeball image are determined.
In the embodiment of the application, the first eyeball image may be preprocessed, where preprocessing includes stretching, compression, graying, binarization, and the like. Specifically, an adaptive threshold may be set to binarize the first eyeball image into a binarized image, from which the position information of the highlight reflection points is extracted. Meanwhile, an edge detection algorithm (such as the Canny edge detector) may be used to obtain the feature edges of the pupil region in the second eyeball image, and Hough circle detection may be used to detect the circular or elliptical pupil boundary, yielding the positioning information of the pupil in the second eyeball image.
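The binarize-then-locate idea for the highlight reflection points can be sketched in a few lines. This is an illustrative sketch, not the patent's implementation: it assumes a grayscale image as a NumPy array, uses one fixed global threshold rather than an adaptive one, and returns a single centroid instead of per-glint connected components.

```python
import numpy as np

def detect_glints(gray, thresh=240):
    """Binarize a grayscale eye image and return the centroid of the
    bright (glint) pixels, or None if no pixel exceeds the threshold.
    A real pipeline would use an adaptive threshold and label each
    connected component separately."""
    mask = gray >= thresh                       # binarized image
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())   # (x, y) centroid

# Synthetic 64x64 eye image with one 3x3 bright glint centred at (20, 30)
img = np.zeros((64, 64), dtype=np.uint8)
img[29:32, 19:22] = 255
print(detect_glints(img))  # (20.0, 30.0)
```

A production version would follow this with connected-component labeling so that each first light source 220 maps to its own reflection point.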
S503: and determining the fixation point and/or the sight line direction of the user according to the position information of the highlight reflection point, the position information of the first light source 220 and the positioning information of the pupil.
In the embodiment of the present application, as shown in fig. 6, preprocessing and pupil detection are performed on the second eyeball image to obtain positioning information of the pupil; detecting the reflection point of the first eyeball image to obtain the position information of the highlight reflection point; and further determining the gaze point and/or gaze direction of the user based on the spatial geometrical relationship.
In a possible implementation manner, in the step S503, determining the gaze point and/or the gaze direction of the user according to the position information of the highlight reflection point, the position information of the first light source 220, and the positioning information of the pupil includes:
Determining the eyeball center point coordinates according to the position information of the highlight reflection points and the position information of the first light sources 220 corresponding to those reflection points; determining the optical axis of the user's eyeball according to the eyeball center point coordinates and the positioning information of the pupil; determining the visual axis according to the optical axis of the user's eyeball and preset visual axis conversion information, where the visual axis conversion information is pre-calibrated conversion information between the pupil optical axis and the visual axis; and determining the gaze point and/or gaze direction of the user according to the visual axis.
Here, the positioning information of the pupil may be position coordinates of a pupil center point.
In the embodiment of the present application, as shown in fig. 7, the eyeball center point coordinates may be determined from the position coordinates of the highlight reflection points in the first eyeball image, the pre-calibrated intrinsic and extrinsic parameters of the first infrared camera, the pinhole imaging principle, and the position information of the first light sources 220 corresponding to the reflection points. The position coordinates of the pupil center in the image are fitted using the pupil edge information in the second eyeball image and the pre-calibrated intrinsic and extrinsic parameters of the second infrared camera, and the spatial coordinates of the pupil center point are determined using the pinhole imaging principle and eyeball refraction (light passing from the air into the eye toward the pupil center). The optical axis of the user's eyeball is then determined from the eyeball center point coordinates and the pupil center point coordinates, the visual axis is determined from the optical axis and the preset visual axis conversion information, and the gaze point and/or gaze direction of the user is determined from the visual axis.
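The final spatial-geometry step — extending the optical axis from the estimated eyeball center through the pupil center until it meets a gaze plane — can be sketched as below. All coordinates are hypothetical, and the per-user optical-to-visual-axis conversion that the method applies is deliberately omitted here.

```python
import numpy as np

def gaze_point_on_plane(eye_center, pupil_center, plane_z=0.0):
    """Treat the optical axis as the ray from the eyeball center
    through the pupil center, and return its intersection with the
    plane z = plane_z as the approximate gaze point."""
    eye_center = np.asarray(eye_center, dtype=float)
    direction = np.asarray(pupil_center, dtype=float) - eye_center
    # Assumes the axis is not parallel to the gaze plane (direction[2] != 0)
    t = (plane_z - eye_center[2]) / direction[2]
    return eye_center + t * direction

# Hypothetical coordinates in millimetres: eyeball center 600 mm from
# the display plane, pupil center 12 mm in front of it and off-axis.
p = gaze_point_on_plane([0.0, 0.0, 600.0], [5.0, 2.0, 588.0])
print(p)
```

In the method described above, the returned optical-axis intersection would still be rotated by the pre-calibrated visual axis conversion information before being reported as the gaze point.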
Thus, the first eyeball image and the second eyeball image are used for carrying out eyeball tracking, so that the influence of infrared light reflection points on eyeball characteristic recognition can be reduced, and the eyeball tracking effect is improved.
In one possible implementation manner, the above-mentioned eye tracking method 500 further includes:
S504: and determining an iris characteristic image in the second eyeball image.
In the embodiment of the application, the iris region is rich in biological features that can be used for identity verification. The iris feature image in the second eyeball image is therefore obtained, so that subsequent feature extraction can be performed on it and the user's identity can be verified.
In a possible implementation manner, in step S504, determining an iris feature image in the second eyeball image includes:
acquiring a feature extraction region between a first edge curve of a pupil region and a second edge curve of an iris region in the second eyeball image; and performing polar coordinate transformation on the feature extraction area to obtain the iris feature image.
In the embodiment of the application, the polar coordinate transformation converts between the rectangular and polar coordinate systems of an image, so that a circular (annular) region can be transformed into a rectangular image. After the polar transformation, image processing operations such as filtering and feature recognition can be applied.
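The unwrapping of the annulus between the pupil edge and the iris edge can be written directly with NumPy. The circle parameters, output size, and nearest-neighbour sampling are illustrative assumptions; a production pipeline would first segment the two boundary curves and would interpolate bilinearly.

```python
import numpy as np

def unwrap_iris(gray, cx, cy, r_pupil, r_iris, out_h=8, out_w=32):
    """Sample the annulus between the pupil circle and the iris circle
    on a polar grid: radius maps to rows, angle maps to columns,
    yielding a rectangular iris strip (nearest-neighbour sampling)."""
    strip = np.zeros((out_h, out_w), dtype=gray.dtype)
    for i in range(out_h):
        # radius at the middle of row i, between the two boundaries
        r = r_pupil + (r_iris - r_pupil) * (i + 0.5) / out_h
        for j in range(out_w):
            theta = 2.0 * np.pi * j / out_w
            x = int(round(cx + r * np.cos(theta)))
            y = int(round(cy + r * np.sin(theta)))
            strip[i, j] = gray[y, x]
    return strip

# Synthetic "iris": pixel value encodes distance from the image centre,
# so each row of the unwrapped strip should be roughly constant.
yy, xx = np.mgrid[0:64, 0:64]
dist = np.hypot(xx - 32, yy - 32).astype(np.uint8)
strip = unwrap_iris(dist, cx=32, cy=32, r_pupil=8, r_iris=20)
print(strip.shape)  # (8, 32)
```

Because the strip's columns correspond to angle, a head or eye rotation becomes a horizontal shift of the strip, which is what makes the later code matching tolerant to rotation.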
S505: and extracting target iris characteristics in the iris characteristic image, and matching the target iris characteristics with pre-stored iris characteristics to obtain an iris recognition result of the user.
In the embodiment of the application, target iris features (such as Gabor wavelet features) are extracted from the iris feature image, and the target iris features are matched with the pre-stored iris features to obtain the iris recognition result of the user.
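For illustration, a real-valued Gabor kernel and its response to an image patch might be sketched as follows; the kernel parameters (size, wavelength, sigma, orientation) are arbitrary assumptions, not values specified by this application:

```python
import numpy as np

def gabor_kernel(size=15, wavelength=6.0, sigma=3.0, theta=0.0):
    """A real-valued Gabor kernel: a cosine carrier under a Gaussian envelope."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)     # rotate coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + yr**2) / (2.0 * sigma**2))
    return envelope * np.cos(2.0 * np.pi * xr / wavelength)

def gabor_response(patch, kernel):
    """Scalar response of one same-sized patch to one kernel."""
    return float(np.sum(patch * kernel))
```

A feature vector would typically collect such responses over a bank of orientations and wavelengths applied across the normalized iris image.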
In a possible implementation manner, in step S505, the matching the target iris feature with a pre-stored iris feature to obtain an iris recognition result of the user includes:
performing quantization coding on the target iris features to obtain a target iris feature code;
and determining an iris recognition result of the user according to the Hamming distance between the target iris feature code and the iris feature code corresponding to the pre-stored iris features.
In the embodiment of the application, as shown in fig. 8, the second eyeball image is first preprocessed and subjected to positioning and segmentation to obtain the iris feature image. The iris feature image is normalized, the target iris features are extracted, and the target iris features are quantization-coded to obtain the target iris feature code. The iris recognition result of the user is then determined by feature matching; specifically, it can be determined according to the Hamming distance between the target iris feature code and the iris feature code corresponding to the pre-stored iris features.
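The quantization-coding and Hamming-distance matching steps can be sketched as follows. The 1-bit sign quantization and the 0.32 acceptance threshold are common illustrative choices in the iris-recognition literature, used here as assumptions rather than values specified by this application:

```python
import numpy as np

def quantize(features):
    """1-bit quantization: keep only the sign of each (e.g. Gabor) response."""
    return (np.asarray(features) >= 0).astype(np.uint8)

def hamming_distance(code_a, code_b):
    """Fraction of differing bits between two equal-length iris codes."""
    a, b = np.asarray(code_a), np.asarray(code_b)
    return np.count_nonzero(a != b) / a.size

def match(code, stored_code, threshold=0.32):
    """Accept when the normalized Hamming distance falls below the threshold."""
    return hamming_distance(code, stored_code) < threshold
```

Identical codes give a distance of 0, complementary codes a distance of 1; genuine comparisons cluster near 0 while impostor comparisons cluster near 0.5, which is what makes a fixed threshold workable.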
The embodiment of the application provides an eyeball tracking method, which includes: acquiring a first eyeball image captured by a first infrared camera and a second eyeball image captured by a second infrared camera; determining position information of a highlight reflection point in the first eyeball image and positioning information of a pupil in the second eyeball image; and determining the gaze point and/or gaze direction of the user according to the position information of the highlight reflection point, the position information of the first light source, and the positioning information of the pupil. Because light interference in the pupil region of the second eyeball image is small, the pupil region can be located more accurately, and the gaze point and/or gaze direction of the user can be accurately identified from the position information of the highlight reflection point in the first eyeball image together with the positioning information of the pupil in the second eyeball image. This reduces the influence of reflected infrared light on the eyeball tracking result and improves the accuracy and robustness of eyeball tracking.
Furthermore, an iris feature image is determined in the second eyeball image, target iris features are extracted from it, and the target iris features are matched with the pre-stored iris features to obtain the iris recognition result of the user. Because light interference in the pupil and iris regions of the second eyeball image is small, these regions can be located more accurately and richer effective texture information of the iris region can be extracted, improving the accuracy and robustness of iris recognition.
The embodiment of the application also provides eyeball tracking equipment which comprises the eyeball tracking device.
In the embodiment of the application, the eyeball tracking device may be wearable, such as AR glasses, or non-wearable, such as a mobile phone or tablet equipped with a lens.
The embodiment of the application also provides a readable storage medium on which a program or instructions are stored; when executed by a processor, the program or instructions implement each process of the above method embodiment and achieve the same technical effects. To avoid repetition, details are not described again here.
The embodiment of the application further provides a chip comprising a processor and a communication interface, the communication interface being coupled with the processor. The processor is used to run programs or instructions to implement the processes of the above method embodiment and achieve the same technical effects; to avoid repetition, details are not described again here.
It should be understood that the chip referred to in the embodiments of the present application may also be called a system-level chip, a chip system, a system-on-chip, or the like.
Embodiments of the present application provide a computer program product stored in a storage medium, the program product being executed by at least one processor to implement the respective processes of the above method embodiments and achieve the same technical effects; to avoid repetition, details are not described again here.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element. Furthermore, it should be noted that the scope of the methods and apparatus in the embodiments of the present application is not limited to performing the functions in the order shown or discussed; the functions may also be performed in a substantially simultaneous manner or in the reverse order, depending on the functions involved. For example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. Additionally, features described with reference to certain examples may be combined in other examples.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by means of software plus a necessary general-purpose hardware platform, or by hardware alone, although in many cases the former is the preferred implementation. Based on such understanding, the technical solution of the present application, or the part of it contributing to the prior art, may be embodied in the form of a computer software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) and comprising instructions for causing a terminal (which may be a mobile phone, a computer, a server, a network device, or the like) to perform the methods according to the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above-described embodiments, which are merely illustrative and not restrictive. In light of the present application, those of ordinary skill in the art may devise many further forms without departing from the spirit of the present application and the scope of the claims, all of which fall within its protection.

Claims (19)

1. An eye tracking device, comprising:
A housing;
a lens (210) provided in the housing;
a first light source (220) for emitting a first infrared light towards the eyes of a user;
A second light source (230) for emitting a second infrared light towards the eyes of the user, the first infrared light having a wavelength different from the wavelength of the second infrared light;
a first infrared camera (240) that receives the first infrared light reflected by the user's eyes and filters out the second infrared light to form a first eyeball image;
and a second infrared camera (250) for receiving the second infrared light reflected by the eyes of the user and filtering the first infrared light to form a second eyeball image.
2. The eye tracking device according to claim 1, wherein the wavelength of the first infrared light is adjustable in a first band of wavelengths and the wavelength of the second infrared light is adjustable in a second band of wavelengths; the first band range and the second band range do not overlap.
3. The eye tracking device according to claim 1, wherein the first infrared camera (240) comprises a first infrared filter for transmitting the first infrared light and filtering the second infrared light;
The second infrared camera (250) comprises a second infrared filter, and the second infrared filter is used for transmitting the second infrared light and filtering the first infrared light.
4. The eye tracking device according to claim 1, characterized in that it comprises a plurality of said first light sources (220), said plurality of first light sources (220) being arranged around the optical axis (211) of said lens (210);
The second light source (230) is arranged at one side of the first light source (220) away from the optical axis (211) of the lens (210).
5. The eye tracking device according to claim 4, comprising a plurality of said second light sources (230), said plurality of said second light sources (230) being arranged on opposite sides of said lens (210).
6. The eye tracking device according to claim 4, wherein the second light source (230) is arranged at a first side and/or a second side of the lens (210), and the second infrared camera (250) is arranged at a third side and/or a fourth side of the lens (210); the first side and the second side are one opposite side of the lens (210), and the third side and the fourth side are the other opposite side of the lens (210);
The first infrared camera (240) is disposed on a first side or a second side of the lens (210).
7. The eye tracking device according to claim 1, wherein the first infrared light causes an eye image in the first eyeball image to have a plurality of highlight reflection points; the second infrared light causes pupil images and iris images in the second eyeball image to have no highlight reflection points.
8. The eye tracking device according to claim 1, wherein the first light source (220) and the second light source (230) are illuminated synchronously during the acquisition of the first eye image and the second eye image, the first light source (220) and the second light source (230) being turned off synchronously;
the first infrared camera (240) and the second infrared camera (250) are synchronously exposed, the output frame rate of the first infrared camera (240) is the same as the output frame rate of the second infrared camera (250), and the exposure time of the first infrared camera (240) is the same as the exposure time of the second infrared camera (250).
9. The eye tracking device according to claim 1, characterized in that a first predetermined angle (r 1) is present between the outgoing direction of the first infrared light and the optical axis (211);
A second preset angle (r 2) is formed between the emergent direction of the second infrared light and the optical axis (211), and the second preset angle (r 2) is smaller than the first preset angle (r 1).
10. The eye tracking device according to claim 7, wherein eye tracking from the first eye image and the second eye image comprises:
determining position information of the highlight reflection point in the first eyeball image and positioning information of a pupil in the second eyeball image;
And determining the fixation point and/or the sight line direction of the user according to the position information of the highlight reflection point, the position information of the first light source (220) and the positioning information of the pupil.
11. The eye tracking apparatus according to claim 10, wherein the determining the gaze point and/or gaze direction of the user based on the positional information of the highlight reflection point, the positional information of the first light source (220), and the positioning information of the pupil, comprises:
Determining eyeball center point coordinates according to the position information of the highlight reflection point and the position information of the first light source (220) corresponding to the highlight reflection point;
Determining an optical axis of the eyeball of the user according to the eyeball center point coordinates and the positioning information of the pupil;
determining a visual axis according to the optical axis of the eyeball of the user and preset visual axis conversion information; the sight line axis conversion information is conversion information between a pupil optical axis and a sight line axis which are calibrated in advance;
And determining the gaze point and/or the gaze direction of the user according to the gaze axis.
12. An eye tracking device according to claim 7, further comprising:
determining an iris feature image in the second eyeball image;
And extracting target iris characteristics in the iris characteristic image, and matching the target iris characteristics with pre-stored iris characteristics to obtain an iris recognition result of the user.
13. The eye tracking apparatus according to claim 12, wherein the determining an iris feature image in the second eye image comprises:
acquiring a feature extraction region between a first edge curve of a pupil region and a second edge curve of an iris region in the second eyeball image;
and performing polar coordinate transformation on the feature extraction area to obtain the iris feature image.
14. The eye tracking device according to claim 12, wherein the matching the target iris feature with a pre-stored iris feature to obtain an iris recognition result of the user comprises:
performing quantization coding on the target iris characteristics to obtain target iris characteristic codes;
And determining an iris recognition result of the user according to the Hamming distance between the target iris feature code and the iris feature code corresponding to the pre-stored iris feature.
15. An eye tracking method, applied to the eye tracking device according to any one of claims 1 to 14, comprising:
acquiring a first eyeball image shot by the first infrared camera (240) and a second eyeball image shot by the second infrared camera (250);
determining position information of a highlight reflection point in the first eyeball image and positioning information of a pupil in the second eyeball image;
And determining the fixation point and/or the sight line direction of the user according to the position information of the highlight reflection point, the position information of the first light source (220) and the positioning information of the pupil.
16. The method according to claim 15, wherein said determining the gaze point and/or gaze direction of the user based on the positional information of the highlight reflection point, the positional information of the first light source (220), and the positioning information of the pupil comprises:
Determining eyeball center point coordinates according to the position information of the highlight reflection point and the position information of the first light source (220) corresponding to the highlight reflection point;
Determining a pupil optical axis according to the eyeball center point coordinates and the positioning information of the pupil;
determining a visual axis according to the pupil optical axis and preset visual axis conversion information; the sight line axis conversion information is conversion information between a pupil optical axis and a sight line axis which are calibrated in advance;
And determining the gaze point and/or the gaze direction of the user according to the gaze axis.
17. The method of claim 15, further comprising, after said acquiring a first eyeball image captured by said first infrared camera (240) and a second eyeball image captured by said second infrared camera (250):
determining an iris feature image in the second eyeball image;
And extracting target iris characteristics in the iris characteristic image, and matching the target iris characteristics with pre-stored iris characteristics to obtain an iris recognition result of the user.
18. The method of claim 17, wherein said determining an iris feature image in said second eye image comprises:
acquiring a feature extraction region between a first edge curve of a pupil region and a second edge curve of an iris region in the second eyeball image;
and performing polar coordinate transformation on the feature extraction area to obtain the iris feature image.
19. The method of claim 17, wherein the matching the target iris feature with a pre-stored iris feature to obtain an iris recognition result of the user comprises:
performing quantization coding on the target iris characteristics to obtain target iris characteristic codes;
And determining an iris recognition result of the user according to the Hamming distance between the target iris feature code and the iris feature code corresponding to the pre-stored iris feature.
CN202410177228.1A 2024-02-08 2024-02-08 Eyeball tracking device and eyeball tracking method Pending CN118170250A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410177228.1A CN118170250A (en) 2024-02-08 2024-02-08 Eyeball tracking device and eyeball tracking method


Publications (1)

Publication Number Publication Date
CN118170250A true CN118170250A (en) 2024-06-11

Family

ID=91357502

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410177228.1A Pending CN118170250A (en) 2024-02-08 2024-02-08 Eyeball tracking device and eyeball tracking method

Country Status (1)

Country Link
CN (1) CN118170250A (en)

Similar Documents

Publication Publication Date Title
EP3453316B1 (en) Eye tracking using eyeball center position
US10387724B2 (en) Iris recognition via plenoptic imaging
US6088470A (en) Method and apparatus for removal of bright or dark spots by the fusion of multiple images
US20160019421A1 (en) Multispectral eye analysis for identity authentication
US20170091550A1 (en) Multispectral eye analysis for identity authentication
US20160019420A1 (en) Multispectral eye analysis for identity authentication
US8755607B2 (en) Method of normalizing a digital image of an iris of an eye
US8854446B2 (en) Method of capturing image data for iris code based identification of vertebrates
US10360450B2 (en) Image capturing and positioning method, image capturing and positioning device
CN108354584A (en) Eyeball tracking module and its method for tracing, virtual reality device
KR102393298B1 (en) Method and apparatus for iris recognition
US10891479B2 (en) Image processing method and system for iris recognition
CN103577801A (en) Quality metrics method and system for biometric authentication
US11163994B2 (en) Method and device for determining iris recognition image, terminal apparatus, and storage medium
CA2833740A1 (en) Method of generating a normalized digital image of an iris of an eye
Reddy et al. A robust scheme for iris segmentation in mobile environment
US10288879B1 (en) Method and system for glint/reflection identification
WO2022005336A1 (en) Noise-resilient vasculature localization method with regularized segmentation
CN118170250A (en) Eyeball tracking device and eyeball tracking method
JP3848953B2 (en) Living body eye determination method and living body eye determination device
CN116091750A (en) Glasses degree identification method and device, electronic equipment and storage medium
JP4527088B2 (en) Living body eye determination method and living body eye determination device
JP2648198B2 (en) Gaze direction detection method
CN114495247A (en) Iris positioning method, device and equipment
JP3595164B2 (en) Imaging device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination