CN109919117B - Eyeball detection device and pupil detection method - Google Patents


Info

Publication number
CN109919117B
CN109919117B (application number CN201910196821.XA)
Authority
CN
China
Prior art keywords
eyeball
image
bright
pupil
light
Prior art date
Legal status
Active
Application number
CN201910196821.XA
Other languages
Chinese (zh)
Other versions
CN109919117A (en)
Inventor
黄昱豪
Current Assignee
Pixart Imaging Inc
Original Assignee
Pixart Imaging Inc
Priority date
Filing date
Publication date
Application filed by Pixart Imaging Inc filed Critical Pixart Imaging Inc
Priority to CN201910196821.XA priority Critical patent/CN109919117B/en
Publication of CN109919117A publication Critical patent/CN109919117A/en
Application granted granted Critical
Publication of CN109919117B publication Critical patent/CN109919117B/en


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; body parts, e.g. hands
    • G06V 40/18: Eye characteristics, e.g. of the iris
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00: Apparatus for testing the eyes; instruments for examining the eyes
    • A61B 3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/113: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for determining or recording eye movement


Abstract

The invention provides an eyeball detection device, a pupil detection method and an iris identification method. The eyeball detection device includes an optical assembly, an image sensor and an operation unit. The optical assembly provides a plurality of incident lights to the eyeball and forms a plurality of bright spots on the eyeball, at least some of which are located near the pupil. The image sensor captures an eyeball image that includes images of the bright spots. The operation unit analyzes the gray values of the eyeball image to obtain the distribution positions of the bright spots, and judges the position of the pupil according to those distribution positions.

Description

Eyeball detection device and pupil detection method
This application is a divisional application of the patent application titled "Eyeball detection device, pupil detection method and iris identification method", filed by PixArt Imaging Inc. on November 25, 2013, with application number 201310607136.4.
Technical Field
The present invention relates to an eyeball detection device, and more particularly, to a detection method and an iris recognition method for performing eyeball tracking by using the eyeball detection device.
Background
Current eyeball detection devices can be used to detect the direction of eyeball movement or to identify the iris boundary. Most eyeball detection devices exploit the fact that the pupil's position shifts as the line of sight changes, and use this characteristic to detect the gaze direction of the eyeball.
Generally, conventional eyeball detection devices detect the gazing direction of the eyeball by means of bright spots (glints) formed by incident light projected onto the eyeball, and typically use these bright spots as reference points for locating the eyeball.
In detail, after capturing an image of the eyeball, a conventional eyeball detection device recognizes the pupil and the bright spots from the image of the cornea. To identify the pupil, the whole eyeball image is scanned, and the pupil and the bright spots are identified by analyzing the gray value (gray scale value) distribution of the entire image. The eyeball detection device then obtains the relative position between the pupil and the bright spots, and judges the gazing direction of the eyeball according to that relative position.
Disclosure of Invention
The invention aims to provide an eyeball detection device capable of identifying pupils quickly.
Still another objective of the present invention is to provide a pupil detection method and an iris recognition method that do not need to determine the gray value distribution of the whole first or second eyeball image to obtain the position of the pupil.
Still another object of the present invention is to provide an iris recognition method for accelerating the boundary search speed of iris images.
The embodiment of the invention provides an eyeball detection device which judges the position of a pupil according to the distribution position of at least one bright point on an eyeball.
The embodiment of the invention provides an eyeball detection device which comprises an optical assembly, an image sensor and an operation unit. The optical assembly provides a plurality of incident lights to the eyeball and forms a plurality of bright spots on the eyeball, and at least part of the bright spots are positioned near the pupil of the eyeball. The image sensor is used for capturing an image of the eyeball, and the image of the eyeball comprises the bright spots. The operation unit analyzes the gray level of the eyeball image, obtains the distribution positions of the bright spots through the gray level of the eyeball image, and judges the positions of pupils according to the distribution positions of the bright spots.
The embodiment of the invention provides a method for detecting pupils, which judges the positions of the pupils according to the distribution positions of one or more formed bright spots.
The embodiment of the invention provides a method for detecting pupils, which comprises the steps of providing one or more incident lights to the eyeball to form one or more first bright spots on the eyeball, at least some of the first bright spots being located near the pupil. A first eyeball image is captured from the eyeball, wherein the first eyeball image comprises images of the first bright spots and of the pupil. The gray values of the first eyeball image are analyzed to obtain the distribution positions of the first bright spots. The position of the pupil is then judged according to the distribution positions of the first bright spots.
The embodiment of the invention provides a method for identifying an iris, which can judge the deformation of an iris image during eyeball displacement.
The embodiment of the invention provides a method for identifying an iris. When the eyeball is at a reference position, a plurality of incident lights are emitted to the eyeball to form a first reference point, a second reference point and a third reference point, which serve as marks of the eyeball being at the reference position; the positions of the three reference points correspond to the emission positions of the incident lights. When the eyeball moves from the reference position to a measurement position, the incident lights form a first, a second and a third measurement bright spot on the eyeball, located near the pupil. An eyeball image of the eyeball is captured, wherein the eyeball image comprises the bright spots and the iris image. The gray values of the eyeball image are analyzed to obtain the positions of the first, second and third measurement bright spots. The displacements of the measurement bright spots relative to the corresponding reference points are then calculated, so as to obtain the deformation of the iris image when the eyeball is at the measurement position relative to the iris image when the eyeball is at the reference position.
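The displacement step described above can be sketched as follows (a minimal Python illustration, not the patented implementation; the function name, coordinate convention and sample values are assumptions):

```python
def glint_displacements(reference_points, measured_spots):
    """Displacement vector (dx, dy) of each measurement bright spot
    relative to its corresponding reference point. These displacements
    are what the method uses to estimate how the iris image deforms
    when the eyeball moves from the reference position to the
    measurement position.
    """
    return [(mx - rx, my - ry)
            for (rx, ry), (mx, my) in zip(reference_points, measured_spots)]

# Three reference points, and the bright spots measured after the
# eyeball shifted 3 pixels right and 1 pixel down (toy values):
refs = [(10, 10), (20, 10), (15, 18)]
meas = [(13, 11), (23, 11), (18, 19)]
```

A uniform displacement of all three spots, as here, indicates a rigid shift; unequal displacements indicate the elliptical deformation of the iris image that the method estimates.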
The embodiment of the invention provides a method for identifying an iris, which can judge the resolution variation of an iris image.
The embodiment of the invention provides a method for identifying an iris, which comprises the steps of providing a plurality of incident lights incident on the eyeball. A first reference point, a second reference point and a third reference point are set as marks of the eyeball at the reference position, wherein the positions of the three reference points correspond to the emission positions of the incident lights. The incident lights form a first, a second and a third measurement bright spot on the eyeball, located near the pupil, wherein the positions of the measurement bright spots correspond to the positions of the first, second and third reference points. An eyeball image of the eyeball is captured, wherein the eyeball image comprises the bright spots and the iris image. The gray values of the eyeball image are analyzed to obtain the positions of the three measurement bright spots. The change of the distance between the first and second measurement bright spots relative to the distance between the first and second reference points is calculated, as is the change of the distance between the second and third measurement bright spots relative to the distance between the second and third reference points, thereby obtaining the resolution change of the iris image relative to when the eyeball is at the reference position.
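The distance-ratio computation can be sketched as follows (an illustrative Python sketch; the function name and the interpretation comments are assumptions, not the patented implementation):

```python
import math

def spacing_changes(reference_points, measured_spots):
    """Ratios of the measured glint spacings to the reference spacings:
    (|m1-m2| / |r1-r2|, |m2-m3| / |r2-r3|). A ratio above 1 means the
    glints, and hence the iris image, span more pixels than at the
    reference position (higher resolution); below 1 means fewer.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    r1, r2, r3 = reference_points
    m1, m2, m3 = measured_spots
    return (dist(m1, m2) / dist(r1, r2),
            dist(m2, m3) / dist(r2, r3))
```

For example, if the measured spots are spread twice as far apart as the reference points, both ratios are 2.0, i.e. the iris image is resolved with twice as many pixels per unit length.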
In summary, the embodiments of the present invention provide an eyeball detection device and methods for detecting and identifying eyeball features (such as the pupil and the iris). The eyeball detection device comprises an optical component, an image sensor and an operation unit. With the pupil detection method, the operation unit only needs to judge the gray value distribution of the peripheral area near the bright spots in the first eyeball image to obtain the position of the pupil. Therefore, the operation unit does not need to judge the gray value distribution of the whole first eyeball image. Compared with the prior art, the eyeball detection device of the embodiment of the invention can recognize the pupil faster.
The embodiment of the invention provides an eyeball detection device and a detection method for eyeball tracking. The control unit makes the incident lights enter at different positions at different time points, so that the bright-spot positions in the eyeball images captured at different time points can be adjusted; the bright-spot positions can then be confirmed more reliably from the gray values and the distinctive pattern remaining after the image subtraction step, which helps reduce the probability of locating a wrong bright-spot position. The operation unit may judge only the gray value distribution of the peripheral region near the distribution positions of the bright spots in the difference image, so the position of the pupil P1 can be obtained more quickly. Compared with the prior art, the operation unit does not need to judge the gray value distribution of the whole first or second eyeball image to obtain the position of the pupil.
The embodiment of the invention provides a method for identifying an iris, which is characterized in that the displacement generated by the positions of a first measurement bright spot, a second measurement bright spot and a third measurement bright spot relative to the positions of a first reference spot, a second reference spot and a third reference spot is calculated by an operation unit, so that the elliptical length and the elliptical short axis of an iris image can be calculated according to the displacement, the elliptical boundary of the iris image can be estimated, and the boundary searching speed of the iris image can be further accelerated.
The embodiment of the invention provides a method for identifying an iris, which comprises the steps of calculating a first variation, a second variation and a third variation by an operation unit, estimating the boundary of an iris image, and further accelerating the boundary searching speed of the iris image.
So that the manner in which the techniques, methods and functions of the present invention can be understood in detail, a more particular description of the invention, briefly summarized above, may be had by reference to the embodiments, some of which are illustrated in the appended drawings.
Drawings
Fig. 1A is a schematic side view of an eyeball detecting device according to a first embodiment of the present invention.
Fig. 1B is a schematic front view of an eyeball detecting device according to a first embodiment of the present invention.
Fig. 1C is a functional block diagram of an eyeball detecting device according to a first embodiment of the present invention.
Fig. 1D is a flowchart of a pupil detection method according to a first embodiment of the present invention.
Fig. 2A is a front view of an eyeball detecting device according to a second embodiment of the present invention.
Fig. 2B is a flowchart of a pupil detection method according to a second embodiment of the present invention.
Fig. 2C is a flowchart of a pupil detection method according to a second embodiment of the present invention.
Fig. 3A is a functional block diagram of an eyeball detecting device according to a third embodiment of the present invention.
Fig. 3B is a flowchart of a pupil detection method according to a third embodiment of the present invention.
Fig. 4 is a flowchart of an iris recognition method according to an embodiment of the invention.
Fig. 5 is a flowchart of an iris recognition method according to an embodiment of the invention.
Reference numerals
100, 200, 300: eyeball detection device
110, 210: optical component
112, 212: light source
114, 214: light splitting component
120: image sensor
130, 230: operation unit
150: carrying frame
152: frame
154: temple
340: control unit
L1: incident light
M1: viewing area
E1: eyeball
G1: bright spot
G1a: first bright spot
G1b: second bright spot
P1: pupil
I1: iris
S101-S104: steps
S201-S204: steps
S301-S307: steps
S401-S405: steps
S501-S506: steps
Detailed Description
Fig. 1A is a schematic side view of an eyeball detecting device according to a first embodiment of the present invention, fig. 1B is a schematic front view of the eyeball detecting device according to the first embodiment, and fig. 1C is a functional block diagram of the eyeball detecting device according to the first embodiment. Referring to fig. 1A to 1C, the eyeball detecting device 100 includes an optical component 110, an image sensor 120 and an operation unit 130. The optical assembly 110 provides at least one incident light L1 to form at least one bright spot G1 in the vicinity of the pupil P1. The image sensor 120 is used for capturing an eyeball image, and the eyeball image includes an image of the bright spot G1. The operation unit 130 analyzes the gray values of the eyeball image and accordingly obtains the distribution position of the bright spot G1. In addition, the operation unit 130 can determine the position of the pupil P1 according to the distribution position of the bright spot G1.
The eyeball detection device 100 may be mounted on a glasses frame, or on the panel of a notebook computer or a smart phone. In the present embodiment, the eyeball detection device 100 is a glasses-type device, and the optical component 110 and the image sensor 120 are both mounted on the carrying frame 150. The carrying frame 150 is wearable by a user and places the optical assembly 110 and the image sensor 120 in front of the eyeball E1. In other embodiments, the eyeball detection device 100 may be mounted on a mobile device, for example near the front lens or front panel of a notebook computer or a smart phone. However, the present invention is not limited thereto.
In practice, the carrying frame 150 may be a glasses frame, and includes two frames 152 and two temples 154 respectively connected to the frames 152. The user rests the temples 154 on the ears so that the frames 152 sit in front of the eyeball E1. However, this design of the carrying frame 150 is merely illustrative, and the present invention is not limited thereto.
The optical component 110 can emit at least one incident light L1 to the eyeball E1. The incident light L1 falls on the eyeball E1 and is reflected at the iris I1 to form at least one bright spot G1, wherein the bright spot G1 is located near the pupil P1. In the present embodiment, one incident light L1 is incident on the eyeball E1, so the number of bright spots G1 is one. It is noted that the incident light L1 is invisible light, such as infrared (IR) or near-infrared (NIR) light, and that the cornea covering the outer layer of the iris I1 forms a smooth curved surface, so that incident light from each direction forms a reflected bright spot on the cornea along the optical path of the image sensor; incident light L1 arriving from more than one direction would therefore form more than one bright spot G1.
Specifically, the optical component 110 may include at least one light source 112 and at least one light splitting component 114, and provides the at least one incident light L1 through them. In practice, the light source 112 may be a light-emitting diode (LED), and the light splitting component 114 may have a light-guiding function and a plurality of optical microstructures; the optical microstructures may be printed patterns, grooves or ribs, the grooves being, for example, V-grooves (V-cut). When the light provided by the light source 112 is incident on the light splitting component 114, it can be reflected, refracted or scattered by the optical microstructures, and is thereby emitted from the light-emitting portion of the light splitting component 114.
The image sensor 120 is used for capturing an eyeball image. It should be noted that the wavelength range sensed by the image sensor 120 covers the wavelength range of the incident light L1. The captured eyeball image shows the user's eye, such as the sclera (eye white), the iris I1 and the pupil P1, and further includes the bright spot G1. Specifically, the image sensor 120 senses the incident light L1 through a photosensitive element, which may be a complementary metal-oxide-semiconductor (CMOS) sensor or a charge-coupled device (CCD).
The operation unit 130 may be a digital signal processor (DSP) or a central processing unit (CPU). The operation unit 130 analyzes the gray value distribution of the eyeball image captured by the image sensor 120, obtains the distribution position of the bright spot G1 from that distribution, and determines the position of the pupil P1 accordingly.
Fig. 1D is a flowchart of a pupil detection method according to a first embodiment of the present invention. Please refer to fig. 1B, fig. 1C, and fig. 1D.
In step S101, when the user uses the eyeball detection device 100, for example when the user wears the carrying frame 150 holding the device, the optical component 110 provides an incident light L1 to the eyeball E1. The incident light L1 falls on the eyeball E1 and is reflected to form a bright spot G1 near the pupil P1, for example in the nearby iris I1 region.
It should be noted that, by the placement of the light source 112, or the arrangement of the light source 112 and the light splitting component 114, the incident light L1 can be adjusted to fall on the iris I1 area near the pupil P1. The position of the bright spot G1 changes with the emission position of the incident light L1; that is, the position of the bright spot G1 corresponds to the position of the incident light L1.
Next, step S102 is performed: the image sensor 120 captures a first eyeball image of the eyeball E1. The first eyeball image shows the user's eye and the image of the bright spot G1. The image sensor 120 then transmits the captured first eyeball image data to the operation unit 130.
Next, step S103 is performed: the operation unit 130 analyzes the gray values of the first eyeball image to obtain the distribution position of the bright spot G1. Taking 8-bit, 256-level gray values as an example, the change from pure black through gray to pure white is quantized into 256 levels, so gray values range from 0 to 255. The gray value of the bright spot G1 is mostly close or equal to 255, while the gray value of the pupil P1 is much closer to 0. Through the gray value distribution of the first eyeball image, the operation unit 130 can learn the position, shape and extent of the pixels whose gray values are close to the maximum, and thereby estimate which pixels in the first eyeball image correspond to the position of the bright spot G1.
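The glint search in step S103 amounts to scanning for near-saturated pixels. A minimal sketch, assuming a nested-list gray image and a near-white threshold of 250 (both assumptions, not the patented implementation):

```python
def find_glints(image, near_white=250):
    """Return (row, col) positions whose 8-bit gray value is close to
    the maximum of 255. `image` is a list of rows of gray values
    (0-255); the near-saturated pixels are the candidate bright-spot
    (glint) positions described in step S103.
    """
    return [(r, c)
            for r, row in enumerate(image)
            for c, v in enumerate(row)
            if v >= near_white]

# Toy 5x5 eyeball image: one saturated glint at row 1, column 3.
img = [[30, 30, 30, 30, 30],
       [30, 10, 10, 255, 30],
       [30, 10, 10, 10, 30],
       [30, 10, 10, 10, 30],
       [30, 30, 30, 30, 30]]
```

On this toy image, `find_glints(img)` returns the single glint position `[(1, 3)]`; the dark pixels (value 10) are left for the later pupil search.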
Then, step S104 is performed: the operation unit 130 determines the position of the pupil P1 according to the distribution position of the bright spot G1. In detail, the operation unit 130 selects an appropriate threshold gray value; the gray value of the pupil P1 in the first eyeball image is smaller than this threshold, and the gray value of the bright spot G1 is larger than it.
After confirming the distribution position of the bright spot G1, the operation unit 130 scans the gray value distribution in the viewing area M1 (as shown in fig. 1B) near the distribution position of the bright spot G1, and finds the portion of the viewing area M1 in which the gray value of the first eyeball image is smaller than the threshold gray value. The viewing area M1 may be defined by the position of the bright spot G1; for example, it may be set as a region slightly larger than, and covering, the distribution position of the bright spot G1 and the pupil. The bright spot G1 may lie on the boundary of the viewing area M1 or inside it. The user can set the extent of the viewing area M1 through the operation unit 130 according to the size of the region within which the pupil P1 is to be searched. The invention does not limit the extent of the viewing area M1.
After the operation unit 130 finds that the gray value of a specific area within the viewing area M1 is smaller than the threshold gray value, it further judges whether the shape of that area conforms to the shape of the pupil P1, to reduce the probability of misjudging the pupil position. For example, if the areas whose gray values fall below the threshold include a rectangular region and a circular region, the circular region matches the shape of the pupil P1 better than the rectangular one. In addition, to further reduce the probability of misjudgment, an expected range for the area of the pupil P1 in the first eyeball image may be established beforehand, and the operation unit 130 may judge whether the size of the specific area falls within that range.
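Steps S104's restricted search can be sketched as follows. This is an illustrative sketch, not the patented implementation: the viewing window is taken as the glint bounding box plus a margin, and a simple pixel-count test stands in for the patent's shape and area checks; all names and thresholds are assumptions.

```python
def find_pupil(image, glints, margin=2, dark_thresh=20,
               min_area=3, max_area=50):
    """Search for the pupil only inside a viewing window (area M1)
    around the glint positions, never scanning the full image.
    Pixels darker than `dark_thresh` are pupil candidates; the count
    of dark pixels must fall in [min_area, max_area]. Returns the
    centroid (row, col) of the dark region, or None on failure.
    """
    rows = [r for r, _ in glints]
    cols = [c for _, c in glints]
    r0 = max(min(rows) - margin, 0)
    r1 = min(max(rows) + margin, len(image) - 1)
    c0 = max(min(cols) - margin, 0)
    c1 = min(max(cols) + margin, len(image[0]) - 1)
    dark = [(r, c) for r in range(r0, r1 + 1)
                   for c in range(c0, c1 + 1)
                   if image[r][c] < dark_thresh]
    if not (min_area <= len(dark) <= max_area):
        return None  # region too small or too large to be the pupil
    return (sum(r for r, _ in dark) / len(dark),
            sum(c for _, c in dark) / len(dark))

# Toy 5x5 eyeball image: dark pupil pixels (10) with a saturated
# glint (255) beside them; everything else mid-gray (30).
img = [[30, 30, 30, 30, 30],
       [30, 10, 10, 255, 30],
       [30, 10, 10, 10, 30],
       [30, 10, 10, 10, 30],
       [30, 30, 30, 30, 30]]
```

With the glint at `(1, 3)`, the window covers only the central patch, and the centroid of the eight dark pixels lands inside the pupil region.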
It should be noted that, with the above pupil detection method, the operation unit 130 only needs to analyze the gray value distribution of the viewing area M1 near the bright spot G1 in the first eyeball image, narrowing the search range and thus speeding up finding the position of the pupil P1. Compared with the prior art, the operation unit 130 does not need to analyze the gray value distribution of the entire first eyeball image to find the position of the pupil P1.
Fig. 2A is a schematic front view of an eyeball detecting device according to a second embodiment of the present invention. Referring to fig. 2A, the eyeball detecting device 200 of the second embodiment is similar in structure to the eyeball detecting device 100 of the first embodiment; for example, both include an optical component and the image sensor 120. However, there are still differences between the two devices. The differences between the eyeball detecting device 200 and the eyeball detecting device 100 are described in detail below, and the common features are not repeated.
The eyeball detecting device 200 of the second embodiment includes the optical component 210, the image sensor 120 and the operation unit 130. The optical component 210 provides a plurality of incident lights L1 to form a plurality of bright spots G1 near the pupil P1. The image sensor 120 captures an eyeball image that includes the images of the bright spots G1. The operation unit 130 analyzes the gray values of the eyeball image and accordingly obtains the distribution positions of the bright spots G1. In addition, the operation unit 130 can determine the position of the pupil P1 according to these distribution positions.
The optical assembly 210 can emit a plurality of incident light beams L1 to the eyeball E1. These incident light L1 falls on the eyeball E1 and can be reflected on the iris I1 to form a plurality of bright spots G1, wherein at least part of the bright spots G1 are located in the vicinity of the pupil P1.
The present embodiment may use only one light source 212 (or a few) together with the light splitting component 214 to split the light into a plurality of incident lights L1. Alternatively, the optical component 210 may include a plurality of light sources 212 without any light splitting component 214, the light sources 212 themselves providing the plurality of incident lights L1. The present invention is therefore not limited in the number of light sources 212 or the structure of the light splitting component 214.
Fig. 2C is a flowchart of a pupil detection method according to the second embodiment of the present invention. Please refer to fig. 2A, 2B and 2C.
In step S201, when the user uses the eyeball detection device 200, the optical component 210 provides a plurality of incident lights L1 to the eyeball E1, which are reflected at the iris I1 region near the pupil P1 to form a plurality of bright spots G1.
It is noted that the positions of the bright spots G1 change with the emission positions of the incident lights L1. For example, assume there are four emission positions of the incident light L1, arranged in a rectangle with an aspect ratio of about 2 to 1; four bright spots G1 will then appear in the iris I1 region near the pupil P1, and in principle their arrangement is also rectangular with an aspect ratio of 2 to 1.
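The correspondence between the emitter layout and the glint layout can be checked numerically; a minimal sketch (the function name and coordinates are assumptions for illustration):

```python
def glint_aspect_ratio(glints):
    """Width-to-height aspect ratio of the bounding box of the glint
    positions. Since the glint layout mirrors the layout of the light
    emission positions, this ratio should match the emitters' ratio.
    """
    xs = [x for x, _ in glints]
    ys = [y for _, y in glints]
    return (max(xs) - min(xs)) / (max(ys) - min(ys))

# Four emitters arranged in a 2:1 rectangle should, in principle,
# produce four glints in the same 2:1 arrangement:
glints = [(0, 0), (4, 0), (0, 2), (4, 2)]
```

Here `glint_aspect_ratio(glints)` yields `2.0`, matching the 2-to-1 emitter rectangle in the example above; a large deviation from the expected ratio would signal a wrongly identified glint.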
Next, step S202 is performed: the image sensor 120 captures a first eyeball image of the eyeball E1. The first eyeball image shows the user's eye and the images of the bright spots G1. The image sensor 120 then transmits the captured first eyeball image data to the operation unit 130.
Next, step S203 is executed: the operation unit 130 obtains, from the gray value distribution of the first eyeball image, the position, shape and extent of the pixels whose gray values are close to the maximum, and thereby estimates which pixels in the first eyeball image correspond to the positions of the bright spots G1.
Then, step S204 is performed: the operation unit 130 determines the position of the pupil P1 according to the distribution positions of the bright spots G1. In detail, the operation unit 130 selects an appropriate threshold gray value; the gray values of the bright spots G1 in the first eyeball image are greater than this threshold. After confirming the distribution positions of the bright spots G1, the operation unit 130 scans the gray value distribution in the viewing area M1 (as shown in fig. 2A) near the distribution positions of the bright spots G1, and finds the portion of the viewing area M1 in which the gray value of the first eyeball image is smaller than the threshold gray value.
It should be noted that the viewing area M1 may be defined by the positions of the bright spots G1. For example, the viewing area M1 may be a slightly larger region covering the distribution positions of the bright spots G1 and the pupil, or may be the region enclosed by the bright spots G1.
Similarly, to reduce the probability of misjudging the position of the pupil P1, after the computing unit 130 finds a specific region within the inspection area M1 whose gray values are smaller than the threshold gray value, it further checks whether the shape of that region matches the shape of the pupil P1 and whether its area falls within the expected range of a pupil P1 image.
It should be noted that, with the above pupil detection method, the computing unit 130 can define the range or shape of the inspection area M1 from the bright spots G1 alone, narrowing the search range for the pupil P1 and thereby accelerating the determination of its position.
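The inspection-area approach of steps S203 and S204 might look like the following sketch. The bounding-box construction, and the `threshold` and `margin` values, are hypothetical choices for illustration, not values given by the patent.

```python
import numpy as np

def locate_pupil(image, spots, threshold=60, margin=3):
    """Sketch of step S204: bound an inspection area M1 by the bright
    spots (plus a small margin), then return the coordinates of pixels
    inside M1 that are darker than the threshold gray value."""
    ys = [int(p[0]) for p in spots]
    xs = [int(p[1]) for p in spots]
    # Inspection area: slightly larger box covering the bright spots.
    y0 = max(min(ys) - margin, 0)
    y1 = min(max(ys) + margin + 1, image.shape[0])
    x0 = max(min(xs) - margin, 0)
    x1 = min(max(xs) + margin + 1, image.shape[1])
    region = image[y0:y1, x0:x1]
    # Candidate pupil pixels: below the threshold gray value.
    dark = np.argwhere(region < threshold)
    return [(y + y0, x + x0) for y, x in dark]
```

A shape and area check on the returned pixel set, as the text describes, would follow to reject non-pupil dark regions.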
Fig. 3A is a functional block diagram of an eyeball detecting device according to a third embodiment of the present invention. Referring to fig. 3A, the eyeball detecting device 300 of the third embodiment is similar in structure to the eyeball detecting device 200 of the second embodiment; for example, both include the optical assembly 210 and the image sensor 120. The differences between the eyeball detecting devices 300 and 200 are described in detail below, and the shared features are not repeated.
The eyeball detecting device 300 of the third embodiment includes an optical assembly 210, an image sensor 120, an operation unit 230 and a control unit 340. The optical assembly 210 provides a plurality of incident lights L1 to form a plurality of bright spots G1 near the pupil P1. The control unit 340 controls the time points at which the incident light reaches the eyeball E1; that is, the control unit 340 can control the optical assembly 210 to provide the incident light L1 to the eyeball E1 at different time points. The image sensor 120 captures eyeball images at these different time points, and the eyeball images respectively contain the bright points G1a or G1b, i.e. the bright points G1a or G1b appear in the eyeball images captured at different time points. The operation unit 230 analyzes the gray values of the eyeball images at the different time points, obtains the distribution positions of the bright spots accordingly, and determines the position of the pupil P1 from the distribution positions of the bright spots G1a and G1b.
Specifically, the control unit 340 is configured to control the optical assembly 210 to provide the incident light L1 to the eyeball E1 at each time point; that is, the control unit 340 controls the optical assembly 210 to provide the incident light L1 at different time points. In addition to analyzing the gray values of the eyeball images at the different time points, the operation unit 230 may transmit an instruction to the control unit 340, so that the control unit 340 controls the optical assembly 210 to provide the incident light L1 at the desired time point.
Fig. 3B is a flowchart of a pupil detection method according to a second embodiment of the present invention. Please refer to fig. 3A and 3B.
In step S301, the control unit 340 controls the optical assembly 210 to provide multiple incident light beams L1 at a first time point. These incident lights L1 strike the iris I1 area near the pupil P1 and are reflected to form a plurality of first bright spots G1a, whose positions correspond to the positions from which the incident lights L1 are emitted. It should be noted that the present embodiment may employ an optical assembly 210 that includes the plurality of light sources 212 without any light splitting assembly 214.
Next, step S302 is performed to capture a first eyeball image at the first time point through the image sensor 120. The first eyeball image, captured at the first time point, shows the image of the user's eye and the images of the first bright points G1a. The image sensor 120 transmits the captured first eyeball image data to the operation unit 230.
Next, step S303 is performed: the control unit 340 controls the optical assembly 210 to provide the plurality of incident light beams L1 to the iris I1 area near the pupil P1 at a second time point. The incident light L1 is again reflected, forming a plurality of second bright points G1b whose positions correspond to the positions from which the incident lights L1 are emitted. It is noted that the second time point differs from the first time point, and the positions of the second bright points G1b formed at the second time point differ from those of the first bright points G1a formed at the first time point. In detail, at the first time point only some of the light sources 212 emit the incident light L1, and at the second time point only a different subset of the light sources 212 emit it.
For example, the number of light sources 212 may be four, arranged in a rectangle with an aspect ratio of 2 to 1. The control unit 340 may turn on only the two light sources 212 at one pair of diagonal corners at the first time point, and the two light sources 212 at the other pair of diagonal corners at the second time point. However, the present invention does not limit the positions or the number of the light sources 212 providing the incident light L1 at different time points, nor their light emitting sequence.
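The alternating diagonal-pair illumination of steps S301 and S303 can be expressed as a tiny scheduling sketch. The source indexing and the function name are hypothetical; as the text notes, any non-overlapping split of the sources across time points would serve.

```python
# Hypothetical indexing: sources 0..3 sit at the rectangle's corners in
# the order (top-left, top-right, bottom-right, bottom-left), so the
# diagonal pairs are {0, 2} and {1, 3}.
DIAGONAL_PAIRS = ({0, 2}, {1, 3})

def sources_to_light(time_index):
    """Alternate the two diagonal pairs across successive capture
    times, as in steps S301/S303 of the flow of fig. 3B."""
    return DIAGONAL_PAIRS[time_index % 2]
```

With this schedule, the bright spots of consecutive frames never overlap, which is what makes the image-subtraction step below produce clean positive and negative extrema.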
Next, step S304 is performed to capture a second eyeball image at the second time point through the image sensor 120. The second eyeball image shows the image of the user's eye and the images of the second bright points G1b. The image sensor 120 transmits the second eyeball image data to the computing unit 230.
It should be noted that the first time point is a certain time point when the user starts to use the eyeball detection device 300, and the second time point is another time point different from the first time point. The first eyeball image is an eyeball image captured by the image sensor 120 at a first time point, and the second eyeball image is an eyeball image captured by the image sensor 120 at a second time point.
Next, step S305 is performed to analyze the gray values of the first and second eyeball images to obtain the distribution positions of the first and second bright spots G1a and G1b. Specifically, the computing unit 230 analyzes the gray-value distributions of the two eyeball images and obtains the positions, shapes and extents of the pixels whose gray values are close to the maximum in each image. By comparing these, it estimates the positions of all the first bright spots G1a and second bright spots G1b in the first and second eyeball images.
Next, step S306 is executed to perform image subtraction (Image Subtraction) on the first eyeball image and the second eyeball image. It should be noted that in the present embodiment the number of the light sources 212 is four; in the first eyeball image the first bright points G1a are provided by only the two diagonal light sources 212, and in the second eyeball image the second bright points G1b are provided by the other two diagonal light sources 212. The gray values of corresponding pixels of the first and second eyeball images are subtracted to obtain a difference image whose gray values lie, for example, between -255 and 255.
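The signed subtraction of step S306 can be sketched directly. This is a generic NumPy illustration, assuming 8-bit grayscale frames; the function name is not from the patent.

```python
import numpy as np

def difference_image(first, second):
    """Pixel-wise signed subtraction of two 8-bit eyeball images.

    Values lie in [-255, 255]: bright spots present only in the first
    image map to large positive extrema, and bright spots present only
    in the second image map to large negative extrema.
    """
    # Widen to a signed type first so negative differences survive.
    return first.astype(np.int16) - second.astype(np.int16)
```

Casting to `int16` before subtracting is the essential detail: subtracting `uint8` arrays directly would wrap around and destroy the negative extrema the method relies on.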
It should be noted that, since the positions of the first bright spots G1a in the first eyeball image and the second bright spots G1b in the second eyeball image do not overlap, the gray values at those positions in the difference image generated by the image subtraction are extrema. For example, the gray values corresponding to the first bright points G1a (from the first eyeball image) are the highest, while those corresponding to the second bright points G1b (from the second eyeball image) are the lowest (i.e., negative gray values).
Accordingly, in the difference image the gray values at the positions of the two first bright points G1a and the two second bright points G1b form a special pattern defined by two brightest points and two darkest points. In other embodiments, the first eyeball image may instead be subtracted from the second eyeball image, so that in the difference image the gray values at the bright points of the second eyeball image are the highest and those at the bright points of the first eyeball image are the lowest. The invention is not limited in this regard.
Further, when determining the positions of the first bright points G1a and the second bright points G1b, the computing unit 230 analyzes the positions, shapes and extents of the pixels whose gray values are close to the maximum (255) and the minimum (-255) in the difference image, and estimates the positions of the first and second bright points accordingly. The computing unit 230 then checks whether the estimated arrangement of the first bright points G1a and second bright points G1b matches the special pattern, thereby confirming their positions in the difference image.
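The extremum search in the difference image, yielding the two brightest and two darkest points of the special pattern, might be sketched as follows. The function name and the argsort-based selection are illustrative assumptions; the geometric check against the expected diagonal arrangement would follow separately.

```python
import numpy as np

def extremum_spots(diff, k=2):
    """From a signed difference image, return the k most-positive pixel
    positions (candidate first bright spots G1a) and the k most-negative
    positions (candidate second bright spots G1b) as (row, col) tuples."""
    flat = diff.ravel()
    order = np.argsort(flat)
    hi = order[-k:]   # largest values -> spots from the first image
    lo = order[:k]    # smallest values -> spots from the second image
    to_yx = lambda idx: [tuple(np.unravel_index(i, diff.shape)) for i in idx]
    return to_yx(hi), to_yx(lo)
```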
Accordingly, by having the control unit 340 control the incident positions of the incident light L1 at different time points, the positions of the first bright points G1a and second bright points G1b in the eyeball images change between time points, and their positions can be further confirmed through the gray values and the special pattern after the image subtraction step, which helps reduce the probability of locating a wrong bright point.
Next, step S307 is performed: the computing unit 230 determines the position of the pupil P1 according to the distribution positions of the first bright points G1a and the second bright points G1b. In detail, the operation unit 230 selects an appropriate threshold gray value such that the gray value of the pupil P1 in the original first eyeball image is smaller than the threshold, while the absolute gray values of the first bright points G1a and second bright points G1b in the difference image are larger than it. Using this threshold, the operation unit 230 confirms the distribution positions of the first bright points G1a or second bright points G1b, then scans the gray-value distribution within the inspection area M1 near those positions and identifies the portion of the original first eyeball image within the inspection area M1 whose gray values are smaller than the threshold.
For example, in the difference image the gray values at the positions of the first bright points G1a and second bright points G1b are extrema: the values corresponding to the first bright points G1a (from the first eyeball image) are the highest and those corresponding to the second bright points G1b (from the second eyeball image) are the lowest. The operation unit 230 thus confirms the positions of the first bright points G1a via the threshold gray value, and scans the gray-value distribution in the neighboring inspection area M1 of the first bright points G1a to determine the position of the pupil P1.
It should be noted that the inspection area M1 may be defined by the first bright points G1a and/or the second bright points G1b. The inspection area M1 may be a slightly larger area covering the region enclosed by the first bright points G1a or second bright points G1b, or it may be exactly the area they enclose. The user can set the extent of the inspection area M1 through the computing unit 230 according to how large a region must be searched for the pupil P1; the invention does not limit the extent of the inspection area M1.
When the computing unit 230 finds a specific region within the inspection area M1 whose gray values are smaller than the threshold gray value, it checks whether the shape and scale of that region are close to those of the pupil P1 in the difference image, so as to reduce the probability of misjudging the pupil P1 position.
Based on the above, with the pupil detection method of the present invention, the operation unit 230 only needs to evaluate the gray-value distribution of the inspection area M1 near the first bright points G1a or second bright points G1b in the difference image, which accelerates finding the position of the pupil P1. In contrast to the prior art, the computing unit 230 does not need to evaluate the gray-value distribution of the entire first or second eyeball image to obtain the pupil P1 position.
Fig. 4 is a flowchart of an iris recognition method according to an embodiment of the invention, and the iris recognition method of the embodiment may employ the eyeball detection device of fig. 2A. Please refer to fig. 4 and refer to fig. 2A in conjunction therewith.
Step S401 is performed: when the eyeball E1 is located at a reference position, a plurality of incident light beams L1 are emitted onto the eyeball E1 to form a plurality of bright spots G1 near the pupil P1 of the eyeball E1, and the distribution positions of these bright spots G1 serve respectively as a first reference point, a second reference point and a third reference point.
Specifically, the multiple incident light beams L1 can be provided by the light source 212 and the light splitting component 214, in which case the emitting positions of the incident light beams L1 are the light outlets of the light splitting component 214. Alternatively, the multiple incident light beams L1 may be provided by at least three light sources 212 without any light splitting component 214, so that the incident light beams L1 are emitted directly from the light sources 212. The placement of the light sources 212, or the arrangement of the light source 212 and the light splitting component 214, determines where the incident light L1 strikes the iris I1 area near the pupil P1.
The first reference point, the second reference point and the third reference point serve as markers of the eyeball E1 at the reference position, and the reference position provides a set of reference datums for the subsequent iris recognition step. In the present embodiment, with the eyeball E1 looking straight ahead, the user presets the positions of the bright spots G1 corresponding to the emission positions of the incident lights L1 as the positions of the first, second and third reference points. In detail, a first reference axis is formed between the first reference point and the second reference point, a second reference axis is formed between the second reference point and the third reference point, and a reference included angle is formed between the first reference axis and the second reference axis. In addition, a fourth reference point or further reference points may be set to mark the reference position more precisely; that is, the present invention does not limit the number of reference points.
In the present embodiment, the incident light L1 is emitted from three positions arranged in a right triangle whose two legs have a length ratio of 2 to 1. Three bright spots G1 will appear in the iris I1 area near the pupil P1, and the arrangement of these bright spots G1 is in principle the same as the arrangement of the emission positions of the incident light L1. That is, in principle, the ratio of the length of the first reference axis to that of the second reference axis is about 2 to 1, and the reference included angle is 90 degrees.
Next, step S402 is performed: when the eyeball E1 moves from the reference position to a measurement position, the incident light L1 forms a first measurement bright spot, a second measurement bright spot and a third measurement bright spot on the eyeball E1, located near the pupil P1 of the eyeball E1. A first axis is formed between the first and second measurement bright spots, a second axis is formed between the second and third measurement bright spots, and an included angle is formed between the first axis and the second axis.
Specifically, because the eyeball E1 is generally a three-dimensional sphere with the iris I1 protruding from it, when the eyeball E1 rotates relative to the reference position the positions of the bright spots G1 formed by the incident light L1 on the iris I1 change; the bright spots G1 at the changed positions are the first, second and third measurement bright spots. That is, when the line of sight of the eyeball E1 deviates from straight ahead, the bright spots G1 shift from the positions of the first, second and third reference points to the positions of the first, second and third measurement bright spots.
Next, step S403 is performed to capture an eyeball image through the image sensor 120. The eyeball image shows the image of the user's eye, the first, second and third measurement bright spots, and the image of the iris I1. The image sensor 120 then transmits the captured eyeball image data to the operation unit 130 or 230.
Then, step S404 is performed to analyze the gray values of the eyeball image to obtain the positions of the first, second and third measurement bright spots. In detail, the computing unit 130 or 230 analyzes the gray-value distribution of the eyeball image and obtains the position, shape and extent of the pixels whose gray values are close to the maximum (for example, a gray value of 255). By comparing these, the computing unit 130 or 230 estimates the positions of all three measurement bright spots in the eyeball image.
Then, step S405 is performed to calculate the displacements of the first, second and third measurement bright spots relative to the first, second and third reference points, so as to obtain the deformation of the iris I1 image at the measurement position relative to the iris I1 image at the reference position. In detail, the calculating unit 130 or 230 computes the change in length and angle of the first axis relative to the first reference axis to obtain a first variation, the change in length and angle of the second axis relative to the second reference axis to obtain a second variation, and the change of the included angle relative to the reference included angle to obtain a third variation. From the first, second and third variations, the computing unit 130 or 230 calculates the deformation of the iris image. Furthermore, the deformation ratio can be estimated simply from the ratio of the first axis to the second axis compared with the original reference ratio, and the distance from the sensor to the eyeball can be estimated from the length of the first or second axis, allowing the size of the iris image to be estimated and the search accelerated.
It should be noted that when the eyeball E1 looks straight ahead at the reference position, the iris image captured by the image sensor 120 is close to circular. When the measurement position coincides with the reference position, i.e., the eyeball E1 still looks straight ahead, the iris image remains unchanged and close to circular. When the measurement position differs from the reference position, i.e., the line of sight deviates from straight ahead, the iris image captured by the image sensor 120 becomes closer to elliptical.
After computing the first, second and third variations, the computing unit 130 or 230 can calculate the major and minor axes of the iris-image ellipse from them, estimate the elliptical boundary of the iris image, and thereby accelerate the search for the iris image boundary.
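The length and angle variations of step S405 can be sketched from the bright-spot coordinates alone. This is a minimal illustration under the assumption that the three reference points and three measurement spots are given as (row, col) pairs; the function name and return layout are not from the patent.

```python
import math

def axis_variations(ref_pts, meas_pts):
    """Length ratio and angle change of the first axis (point 1 -> 2)
    and second axis (point 2 -> 3) relative to their reference
    counterparts, plus the change of the included angle between them.

    Returns ((len_ratio1, dtheta1), (len_ratio2, dtheta2), d_included).
    """
    def axis(p, q):
        dy, dx = q[0] - p[0], q[1] - p[1]
        return math.hypot(dx, dy), math.atan2(dy, dx)

    rl1, ra1 = axis(ref_pts[0], ref_pts[1])   # first reference axis
    rl2, ra2 = axis(ref_pts[1], ref_pts[2])   # second reference axis
    ml1, ma1 = axis(meas_pts[0], meas_pts[1])  # first axis
    ml2, ma2 = axis(meas_pts[1], meas_pts[2])  # second axis
    d_included = (ma2 - ma1) - (ra2 - ra1)     # third variation
    return (ml1 / rl1, ma1 - ra1), (ml2 / rl2, ma2 - ra2), d_included
```

For a uniformly scaled triangle (the wear-shift case described later), both length ratios are equal and the included angle is unchanged; unequal ratios or an angle change indicate the foreshortening that makes the iris image elliptical.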
Fig. 5 is a flowchart of a method for identifying an iris according to an embodiment of the invention. Referring to fig. 5 in conjunction with fig. 2A, the iris recognition method of the embodiment of fig. 5 is similar to the iris recognition method of the embodiment of fig. 4, and the differences between the two iris recognition methods will be described in detail below.
In the iris recognition method of the present embodiment, first, step S501 is performed to provide a plurality of incident lights L1 incident on the eyeball E1 so as to form a plurality of bright spots G1 on the eyeball E1. The multiple incident lights L1 can be provided by the light source 212 and the light splitting component 214, in which case the incident lights L1 are emitted from the light outlets of the light splitting component 214. Alternatively, the multiple incident lights L1 may be provided by at least three light sources 212 without any light splitting component 214, so that the incident lights L1 are emitted directly from the light sources 212. In addition, the placement of the light sources 212, or the arrangement of the light source 212 and the light splitting component 214, determines where the incident light L1 strikes the iris I1 area near the pupil P1.
Step S502 is performed to set the first, second and third reference points as markers for the eyeball E1 at the reference position. In the present embodiment, the reference position is the position in which the eyeball E1 looks straight ahead; with the eyeball E1 in that state, the user presets the positions of the bright spots G1 corresponding to the emission positions of the incident lights L1 as the positions of the first, second and third reference points. However, the reference position is not limited to the straight-ahead position; it may also be a position in which the line of sight of the eyeball E1 deviates from straight ahead.
In detail, a first reference axis is formed between the first reference point and the second reference point, a second reference axis is formed between the second reference point and the third reference point, and a reference included angle is formed between the first reference axis and the second reference axis. In addition, the present invention may further include setting a fourth reference point or more other reference points for more clearly marking the reference positions. That is, the present invention is not limited to the number of reference points.
In the present embodiment, the incident light L1 is emitted from three positions arranged in a right triangle whose two legs have a length ratio of 2 to 1. Three bright spots G1 will appear in the iris I1 area near the pupil P1, and the arrangement of these bright spots G1 is in principle the same as the arrangement of the emission positions of the incident light L1. That is, in principle, the ratio of the length of the first reference axis to that of the second reference axis is 2 to 1, and the reference included angle is 90 degrees. It is noted that the positions of the bright spots G1 corresponding to the emission positions of the incident light L1 are preset by the user as the positions of the first, second and third reference points.
In step S503, when the eyeball E1 is located at the measurement position, the incident light L1 forms a first measurement bright spot, a second measurement bright spot and a third measurement bright spot on the eyeball E1. These measurement bright spots are located near the pupil P1 of the eyeball E1 and correspond to the positions of the first, second and third reference points. A first axis is formed between the first and second measurement bright spots, a second axis is formed between the second and third measurement bright spots, and an included angle is formed between the first axis and the second axis.
Specifically, when the eyeball detection device 200 or 300 is worn by different users, the distance between the optical assembly 210 and the eyeball E1 varies with facial shape and nose height, so the positions of the bright spots G1 formed by the incident light L1 on the eyeball E1 change; the changed bright spot positions are the first, second and third measurement bright spots. That is, when the gaze of the eyeball E1 remains unchanged, the bright spots G1 scale in equal proportion from the positions of the first, second and third reference points to the positions of the first, second and third measurement bright spots, and the included angle equals the reference included angle.
In this embodiment, the ratio of the length of the first axis to that of the second axis is 2 to 1, matching the 2-to-1 ratio of the first reference axis to the second reference axis. The reference included angle is 90 degrees, and accordingly the included angle is also 90 degrees.
Step S504 is performed to capture an image of the eyeball E1 through the image sensor 120. The eyeball image captured by the image sensor 120 shows the image of the eye, the first, second and third measurement bright spots, and the image of the iris I1. The image sensor 120 then transmits the captured eyeball image data to the operation unit 130 or 230.
Step S505 is performed to analyze the gray values of the eyeball image to obtain the positions of the first, second and third measurement bright spots. In detail, the operation unit 130 or 230 analyzes the gray-value distribution of the eyeball image to find the position, shape and extent of the pixels whose gray values are close to the maximum, and from these estimates the positions of all three measurement bright spots. It should be noted that, since the positions of the bright spots G1 correspond to the emission positions of the incident light L1, the bright spot positions in the eyeball image can be estimated from the position, shape and extent of the near-maximum pixels.
Step S506 is performed to calculate the displacements of the first, second and third measurement bright spots relative to the first, second and third reference points, so as to obtain the change in resolution of the iris I1 image relative to when the eyeball E1 was at the reference position. In detail, the computing unit 130 or 230 obtains the eyeball image from the image sensor 120, calculates the variation of the first axis relative to the first reference axis to obtain a first variation, and calculates the variation of the second axis relative to the second reference axis to obtain a second variation. The computing unit 130 or 230 then calculates the resolution change of the iris image from the first and second variations.
For example, suppose the first reference axis spans 20 pixels and the second reference axis spans 10 pixels, a ratio of 2 to 1. If the operation unit 130 or 230 then measures that the first axis spans 10 pixels and the second axis spans 5 pixels, it can conclude that the first and second variations are each a two-fold reduction relative to the pixel lengths of the first and second reference axes. The boundary of the iris image can then be estimated accordingly, further accelerating the search for the iris image boundary.
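The worked 20-to-10 and 10-to-5 pixel example above reduces to a one-line scale computation. The function name is hypothetical; it simply expresses the first and second variations of step S506 as length ratios.

```python
def iris_resolution_change(ref_first, ref_second, meas_first, meas_second):
    """First and second variations from step S506, expressed as the
    factor by which each axis length (in pixels) changed. When the two
    factors agree, the iris image can be taken to scale uniformly by
    that factor, which bounds the iris boundary search window."""
    return meas_first / ref_first, meas_second / ref_second
```

In the example of the text, both factors come out to 0.5, i.e. the iris image has shrunk two-fold, so the boundary search can be confined to a correspondingly smaller window.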
In summary, the embodiments of the present invention provide an eyeball detection device and a pupil detection method for eyeball tracking. The eyeball detection device includes an optical assembly, an image sensor and an operation unit. With the pupil detection method, the operation unit only needs to evaluate the gray-value distribution of the inspection area near the bright points in the first eyeball image, so the pupil position can be obtained quickly. In contrast to the prior art, the operation unit does not need to evaluate the gray-value distribution of the entire first eyeball image to obtain the pupil position.
An embodiment of the invention provides an eyeball detection device and a detection method for eyeball tracking in which a control unit controls the incidence positions of the incident light at different time points, so that the bright point positions in the eyeball images at different time points can be varied; the bright point positions can then be confirmed through the gray values and the special pattern after the image subtraction step, which helps reduce the probability of locating a wrong bright point. The operation unit only needs to evaluate the gray-value distribution of the inspection area near the bright spots in the difference image, accelerating the determination of the pupil position. In contrast to the prior art, the operation unit does not need to evaluate the gray-value distribution of the entire first or second eyeball image to obtain the pupil position.
An embodiment of the invention further provides a method for identifying an iris in which the operation unit calculates the displacements of the first, second and third measurement bright spots relative to the first, second and third reference spots, and from these displacements derives the major and minor axes of the ellipse of the iris image. The elliptical boundary of the iris image can thus be estimated, which in turn accelerates the search for the iris image boundary.
Another embodiment provides a method for identifying an iris in which the operation unit calculates a first variation, a second variation and a third variation, estimates the boundary of the iris image from them, and thereby accelerates the search for the iris image boundary.
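One way to picture the axis estimation is the sketch below, under the simplifying assumption (mine, not the patent's) that the ellipse axes scale in proportion to the mean radial distance of the measurement spots from their centroid relative to the reference spots. All names here are hypothetical.

```python
import math

def estimate_axes(ref_spots, meas_spots, ref_major, ref_minor):
    """Estimate the iris-ellipse axes from the displacement of the measurement
    bright spots relative to the reference spots, assuming the axes scale with
    the mean radial distance of the spots from their centroid."""
    def centroid(spots):
        return (sum(x for x, _ in spots) / len(spots),
                sum(y for _, y in spots) / len(spots))

    def mean_radius(spots):
        c = centroid(spots)
        return sum(math.dist(p, c) for p in spots) / len(spots)

    scale = mean_radius(meas_spots) / mean_radius(ref_spots)
    return ref_major * scale, ref_minor * scale

# Measurement spots sit at half the reference distance from the centroid,
# so both axes are estimated at half their reference length.
a, b = estimate_axes([(10, 0), (0, 10), (-10, -10)],
                     [(5, 0), (0, 5), (-5, -5)],
                     ref_major=20, ref_minor=12)
print(round(a, 6), round(b, 6))  # 10.0 6.0
```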
The foregoing describes merely exemplary embodiments of the present invention and is not intended to limit its scope. Those skilled in the art will appreciate that many modifications and substitutions can be made without departing from the spirit and scope of the invention.

Claims (14)

1. An eyeball detection device, characterized in that the eyeball detection device is an eyeglass-type eyeball detection device comprising:
a wearable bearing frame;
an optical component, arranged on the wearable bearing frame and used for providing a plurality of incident lights to an eyeball so as to form a plurality of bright spots on the eyeball, at least one of the plurality of bright spots being located near a pupil of the eyeball, wherein the optical component comprises a light source and a light splitting component, the light source provides a light ray, the light ray passes through the light splitting component to form the plurality of incident lights, the light splitting component is provided with a plurality of optical microstructures, and when the light ray provided by the light source is incident on the light splitting component, the light ray is reflected, refracted or scattered by the plurality of optical microstructures and the plurality of incident lights are emitted respectively from a plurality of light outlets of the light splitting component;
an image sensor, arranged on the wearable bearing frame and used for capturing an eyeball image of the eyeball, wherein the eyeball image comprises an image of the at least one bright spot and an image of the pupil; and
an operation unit for analyzing the gray values of the eyeball image and obtaining the distribution positions of the plurality of bright spots from the gray values of the eyeball image, wherein the operation unit scans the gray-value distribution in a viewing area defined by the distribution positions of the plurality of bright spots and judges only whether each gray value in the viewing area is smaller than a critical gray value, so as to determine the position of the pupil, wherein the viewing area is slightly larger than or equal to the area surrounded by the distribution positions of the plurality of bright spots and is located within the iris of the eyeball,
wherein the light source and the light splitting component are disposed on a lens frame of the wearable bearing frame, the light source is located at a side edge of the lens frame, and the light splitting component is located at the bottom and at both sides of the lens frame simultaneously.
2. The eyeball detection device according to claim 1, wherein the plurality of incident lights are infrared light.
3. The eyeball detection device according to claim 1, wherein the gray value of the pupil in the eyeball image is smaller than a critical gray value, and the gray values of the bright spots in the eyeball image are larger than the critical gray value.
4. The eyeball detection device according to claim 1, wherein the optical component comprises a plurality of light sources for providing the plurality of incident lights.
5. The eyeball detection device according to claim 1, wherein a control unit controls the time points at which the plurality of incident lights are incident on the eyeball, the image sensor captures eyeball images of a user at different time points, and the operation unit analyzes the gray values of the eyeball images at the different time points.
6. The eyeball detection device according to claim 5, wherein the operation unit controls the time points at which the control unit makes the plurality of incident lights incident on the eyeball.
7. A method for detecting a pupil, the method comprising:
providing a plurality of incident lights incident on an eyeball so as to form a plurality of first bright spots on the eyeball, the plurality of first bright spots being located near a pupil, wherein the plurality of incident lights are emitted from an optical component, the optical component is arranged on a wearable bearing frame and comprises a light source and a light splitting component, the light source provides a light ray, the light ray passes through the light splitting component to form the plurality of incident lights, the light splitting component is provided with a plurality of optical microstructures, and when the light ray provided by the light source is incident on the light splitting component, the light ray is reflected, refracted or scattered by the plurality of optical microstructures and the plurality of incident lights are emitted respectively from a plurality of light outlets of the light splitting component;
capturing a first eyeball image from the eyeball, wherein the first eyeball image comprises images of the plurality of first bright spots and an image of the pupil;
analyzing the gray values of the first eyeball image to obtain the distribution positions of the plurality of first bright spots; and
scanning the gray-value distribution only in a viewing area defined by the distribution positions of the plurality of first bright spots, and judging whether each gray value in the viewing area is smaller than a critical gray value so as to determine the position of the pupil, wherein the viewing area is slightly larger than or equal to the area surrounded by the distribution positions of the plurality of first bright spots and is located within the iris of the eyeball,
wherein the light source and the light splitting component are disposed on a lens frame of the wearable bearing frame, the light source is located at a side edge of the lens frame, and the light splitting component is located at the bottom and at both sides of the lens frame simultaneously.
8. The method of detecting a pupil according to claim 7, wherein the first eyeball image is captured by an image sensor.
9. The method of detecting a pupil according to claim 7, wherein the gray values of the first eyeball image are analyzed by an operation unit.
10. The method of detecting a pupil according to claim 7, wherein the step of determining the position of the pupil according to the distribution position of at least one of the plurality of first bright spots comprises:
selecting a critical gray value;
analyzing the gray-value distribution in the viewing area defined by the plurality of first bright spots;
selecting, within the viewing area, a region whose gray values are smaller than the critical gray value as a specific region; and
judging whether the shape of the specific region conforms to the shape of the pupil.
11. The method of detecting a pupil according to claim 10, wherein the viewing area is surrounded by the plurality of first bright spots.
12. The method of detecting a pupil according to claim 10, wherein the gray values of the plurality of first bright spots in the first eyeball image are larger than the critical gray value.
13. The method of detecting a pupil according to claim 7, wherein the first eyeball image is captured at a first time point and comprises the images of the plurality of first bright spots, and the method further comprises:
providing the plurality of incident lights to the eyeball at a second time point to form a plurality of second bright spots on the eyeball, at least some of the second bright spots being located near the pupil;
capturing a second eyeball image at the second time point, wherein the second eyeball image comprises images of the plurality of second bright spots and an image of the pupil;
analyzing the gray values of the second eyeball image to obtain the distribution positions of the plurality of second bright spots, wherein the distribution positions of the plurality of first bright spots in the first eyeball image are different from the distribution positions of the plurality of second bright spots in the second eyeball image; and
performing image subtraction on the first eyeball image and the second eyeball image to generate a difference image.
14. The method of detecting a pupil according to claim 13, wherein the plurality of incident lights are made incident on the eyeball at the first time point and at the second time point by a control unit.
CN201910196821.XA 2013-11-25 2013-11-25 Eyeball detection device and pupil detection method Active CN109919117B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910196821.XA CN109919117B (en) 2013-11-25 2013-11-25 Eyeball detection device and pupil detection method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201310607136.4A CN104657702B (en) 2013-11-25 2013-11-25 Eyeball arrangement for detecting, pupil method for detecting and iris discrimination method
CN201910196821.XA CN109919117B (en) 2013-11-25 2013-11-25 Eyeball detection device and pupil detection method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201310607136.4A Division CN104657702B (en) 2013-11-25 2013-11-25 Eyeball arrangement for detecting, pupil method for detecting and iris discrimination method

Publications (2)

Publication Number Publication Date
CN109919117A CN109919117A (en) 2019-06-21
CN109919117B true CN109919117B (en) 2024-05-03

Family

ID=53248804

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201910196821.XA Active CN109919117B (en) 2013-11-25 2013-11-25 Eyeball detection device and pupil detection method
CN201310607136.4A Active CN104657702B (en) 2013-11-25 2013-11-25 Eyeball arrangement for detecting, pupil method for detecting and iris discrimination method

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201310607136.4A Active CN104657702B (en) 2013-11-25 2013-11-25 Eyeball arrangement for detecting, pupil method for detecting and iris discrimination method

Country Status (1)

Country Link
CN (2) CN109919117B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108567405A (en) * 2017-04-28 2018-09-25 分界线(天津)网络技术有限公司 A kind of measuring system and method for myopic refractive degree
CN108567406A (en) * 2017-04-28 2018-09-25 分界线(天津)网络技术有限公司 A kind of analytical measurement system and method for human eye diopter
CN108567411A (en) * 2017-09-15 2018-09-25 分界线(天津)网络技术有限公司 A kind of judgement system and method for human eye health state
CN110929570B (en) * 2019-10-17 2024-03-29 珠海虹迈智能科技有限公司 Iris rapid positioning device and positioning method thereof
CN111781722A (en) * 2020-07-01 2020-10-16 业成科技(成都)有限公司 Eyeball tracking structure, electronic device and intelligent glasses

Citations (3)

Publication number Priority date Publication date Assignee Title
CN101344919A (en) * 2008-08-05 2009-01-14 华南理工大学 Sight tracing method and disabled assisting system using the same
CN101539991A (en) * 2008-03-20 2009-09-23 中国科学院自动化研究所 Effective image-region detection and segmentation method for iris recognition
TW201016185A (en) * 2008-10-27 2010-05-01 Utechzone Co Ltd Method and system for positioning pupil, and storage media thereof

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
JP2006048204A (en) * 2004-08-02 2006-02-16 Matsushita Electric Ind Co Ltd Pupil detecting device and pupil authenticating device
EP1983884B1 (en) * 2006-01-26 2016-10-26 Nokia Technologies Oy Eye tracker device
CN101589329B (en) * 2007-11-21 2011-10-12 松下电器产业株式会社 Display
US8723798B2 (en) * 2011-10-21 2014-05-13 Matthew T. Vernacchia Systems and methods for obtaining user command from gaze direction


Also Published As

Publication number Publication date
CN104657702B (en) 2019-04-12
CN109919117A (en) 2019-06-21
CN104657702A (en) 2015-05-27

Similar Documents

Publication Publication Date Title
TWI533224B (en) Eye detecting device and methodes of detecting pupil and identifying iris
CN109919117B (en) Eyeball detection device and pupil detection method
EP2571257B1 (en) Projector device and operation detecting method
US10916025B2 (en) Systems and methods for forming models of three-dimensional objects
KR101471488B1 (en) Device and Method for Tracking Sight Line
JP2017182739A (en) Gaze detection device, gaze detection method and computer program for gaze detection
US10628964B2 (en) Methods and devices for extended reality device training data creation
US10630890B2 (en) Three-dimensional measurement method and three-dimensional measurement device using the same
US10866635B2 (en) Systems and methods for capturing training data for a gaze estimation model
US11375133B2 (en) Automatic exposure module for an image acquisition system
JP2018205819A (en) Gazing position detection computer program, gazing position detection device, and gazing position detection method
JP6870474B2 (en) Gaze detection computer program, gaze detection device and gaze detection method
KR20200035003A (en) Information processing apparatus, information processing method, and program
CN113557519A (en) Information processing apparatus, information processing system, information processing method, and recording medium
WO2018164104A1 (en) Eye image processing device
US11675429B2 (en) Calibration, customization, and improved user experience for bionic lenses
JP7509285B2 (en) Biometric authentication device, biometric authentication method, and program thereof
JP2004038531A (en) Method and device for detecting position of object
JP7228885B2 (en) Pupil detector
JP2005296383A (en) Visual line detector
JP2020129187A (en) Contour recognition device, contour recognition system and contour recognition method
US20230092593A1 (en) Detection device detecting gaze point of user, control method therefor, and storage medium storing control program therefor
JP4852454B2 (en) Eye tilt detection device and program
KR20070069577A (en) Method for determining moving direction of robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant