AU727389B2 - Apparatus for acquiring iris images - Google Patents
- Publication number
- AU727389B2 AU43282/97A AU4328297A
- Authority
- AU
- Australia
- Prior art keywords
- camera
- light
- illuminator
- iris
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Links
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/117—Identification of persons
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/12—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for looking at the eye fundus, e.g. ophthalmoscopes
- A61B3/1216—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for looking at the eye fundus, e.g. ophthalmoscopes for diagnostics of the iris
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C9/00—Individual registration on entry or exit
- G07C9/30—Individual registration on entry or exit not involving the use of a pass
- G07C9/32—Individual registration on entry or exit not involving the use of a pass in combination with an identity check
- G07C9/37—Individual registration on entry or exit not involving the use of a pass in combination with an identity check using biometric data, e.g. fingerprints, iris scans or voice recognition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/14—Arrangements specially adapted for eye photography
- A61B3/145—Arrangements specially adapted for eye photography by video means
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/117—Identification of persons
- A61B5/1171—Identification of persons based on the shapes or appearances of their bodies or parts thereof
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Veterinary Medicine (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Biophysics (AREA)
- Public Health (AREA)
- Ophthalmology & Optometry (AREA)
- Multimedia (AREA)
- Pathology (AREA)
- Human Computer Interaction (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
- Investigating Or Analysing Biological Materials (AREA)
- Sampling And Sample Adjustment (AREA)
- Apparatus Associated With Microorganisms And Enzymes (AREA)
- Measuring Or Testing Involving Enzymes Or Micro-Organisms (AREA)
- Eye Examination Apparatus (AREA)
- Image Input (AREA)
Description
WO 98/08439 PCT/US97/14873
TITLE
APPARATUS FOR THE IRIS ACQUIRING IMAGES BACKGROUND OF THE INVENTION 1. Field of the Invention The invention relates to a method and apparatus for illuminating the eye to obtain an image of the iris.
2. Background of the Invention
There are several methods, known as biometrics, for recognizing or identifying an individual. These methods include analyzing a signature, obtaining and analyzing an image of a fingerprint, and imaging and analyzing the retinal vascular patterns of a human eye. Recently the art has used the iris of the eye, which contains a highly detailed pattern that is unique for each individual and stable over many years, as a non-contact, non-obtrusive biometric. This technique is described in United States Patent No. 4,641,349 to Flom et al. and United States Patent No. 5,291,560 to Daugman. The systems described in these references require the person being identified to hold at least one of their eyes in a fixed position with respect to an imaging camera which takes a picture of the iris. While this procedure is satisfactory for some applications, it is not satisfactory for quick transactional activities such as using an automated teller machine, unobtrusive access control or automated dispensing. Other examples are immigration control, point of sale verification, welfare check dispensing, internet banking, bank loan or account opening and other financial transactions.
The iris identification techniques disclosed by Flom and Daugman require a clear, well-focused image of the iris portion of the eye. Once that image is obtained a comparison of that image with a coded file image of the iris of the person to be identified can be accomplished quite rapidly. However, prior to the present invention there has not been an optical system which could rapidly acquire a sufficiently clear image of an iris of the person to be identified unless that person positioned his eye in a fixed position relatively close to an imaging camera. There is a need for a system which will rapidly obtain a clear picture of the iris of a person or animal remotely from the optical system and in an uncertain position. This system would be particularly useful to identify users of automated teller machines as well as individuals seeking access to a restricted area or facility or other applications requiring user identification. The system could also be used to identify patients, criminal suspects and others who are unable or unwilling to be otherwise identified.
Automated teller machines, often called ATMs, are widely used for banking transactions. Users are accustomed to receiving relatively fast verification of their identity after inserting their identification card and entering an identification number. However, anyone who knows the identification number associated with a given card can use that card. Should a robber learn the identification number by watching the owner use the card, finding the number written on the card or otherwise, he can easily draw funds from the owner's account. Consequently, banks have been searching for other more reliable ways of verifying the identity of ATM users.
Since the iris identification methods disclosed by Flom et al. have proved to be very reliable, the use of iris identification to verify the identity of ATM users and other remote user recognition or verification application has been proposed.
However, for such use to be commercially available, there must be a rapid, reliable and unobtrusive way to obtain iris images of sufficient resolution to permit verification and recognition from an ATM user standing in front of the teller machine. To require the user to position his head a predetermined distance from the camera, such as by using an eyepiece or other fixture or without fixturing is impractical. Thus, there is a need for a system which rapidly locates the iris of an ATM user and obtains a quality image of the iris that can be used for verification and identification. This system should be suitable for use in combination with an access card or without such a card. The system should also be able to obtain such an image from users who are wearing eyeglasses or contact lenses or ski masks or other occluding apparel.
SUMMARY OF THE INVENTION
We provide a method and apparatus which can obtain a clear image of an iris of a person to be identified whose head is located in front of the portion of our optical system which receives light reflected from the iris. The system includes at least one camera, with or without ambient illumination, and preferably one or more illuminators. We also prefer to provide a pan/tilt mirror or gimbal device and at least one lens. Light reflected from the subject is captured by the gimbaled camera or mirror and directed through the lens to the camera. In a preferred embodiment, a narrow field of view (NFOV) camera receives the light reflected from the pan/tilt mirror through the
lens or directly via a gimbaled mounted camera. A second camera and preferably a third camera are provided to obtain a wide field of view (WFOV) image of the subject.
In some cases, the WFOV cameras may be superfluous, if the user is always known to be in the field of view of the NFOV camera or could be located by moving the NFOV camera. Images from these WFOV cameras are processed to determine the coordinates of the specific location of interest, such as the head and shoulders and the iris of a person to be identified. Based upon an analysis of those images the pan/tilt mirror or gimbal is adjusted to receive light reflected from the iris or other area of interest and direct that reflected light to a narrow field of view camera. That camera produces an image of sufficient quality to permit iris identification.
The preferred embodiment contains a wide field of view illuminator which illuminates the face of the person to be identified. The illuminator preferably contains a plurality of infrared light emitting diodes positioned around the lens of the wide field of view camera or cameras.
We also prefer to provide two or more narrow field of view illuminators each comprised of an array of light emitting diodes. These arrays are mounted so as to be rotatable about both a horizontal axis and a vertical axis. By using at least two arrays we are able to compensate for specular reflection and reflection from eyeglasses or contact lenses or other artifacts which obscure portions of the iris.
We further prefer to construct the arrays so that one set of light emitting diodes has center lines normal to the base, a second set has centerlines at an acute angle to the base, and a third set has centerlines at an obtuse angle relative to the base. This gives the array a wider field of illumination. We further prefer to provide a control system which enables us to separately illuminate each group of light emitting diodes. The control system may permit selective activation of individual diodes. An alternative is to provide a single illuminator that is directed in a coordinated manner with the image steering device.
An image processor is provided to analyze the images from the wide field of view camera and thereby specify the location of a point or area of interest on the object or person being identified. A preferred technique for identifying the position of the user is stereographic image analysis. Alternatively, visible or non-visible range imaging or distance finding devices such as ultrasonic, radar, spread spectrum microwave or thermal imaging or sensing or other optical means could be used.
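The stereographic image analysis mentioned above can be illustrated with a simple triangulation sketch. This is an assumed reconstruction, not the patent's implementation; the function name and the calibration values (camera baseline, focal length in pixels, principal point) are hypothetical.

```python
# Illustrative stereo triangulation: two WFOV cameras separated by a
# baseline see the same eye at slightly different horizontal pixel
# positions; the disparity between those positions yields depth.
def eye_position_from_stereo(u_left, u_right, v, baseline_m, focal_px, cx, cy):
    """Return (x, y, z) in metres relative to the left camera.

    u_left/u_right: horizontal pixel column of the eye in each image.
    v: vertical pixel row (assumed equal in rectified images).
    baseline_m, focal_px, cx, cy: assumed calibration values.
    """
    disparity = u_left - u_right
    if disparity <= 0:
        raise ValueError("eye must appear shifted between the two views")
    z = focal_px * baseline_m / disparity   # depth from disparity
    x = (u_left - cx) * z / focal_px        # horizontal offset
    y = (v - cy) * z / focal_px             # vertical offset
    return x, y, z
```

In practice the disparity would come from the pyramid-based image processing the patent describes, not from hand-picked pixel columns.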
The present system is particularly useful for verifying the identity of users of automated teller machines. The system can be readily combined with most conventional automated teller machines and many other financial transaction machines.
Image acquisition and identification can generally be accomplished in less than five seconds and in less than two seconds in many cases.
Other configurations of cameras, such as one NFOV camera, one WFOV and one NFOV camera, two NFOV cameras, multiple NFOV cameras and multiple WFOV cameras, can be utilized for other special purpose applications such as more or less restricted movement or position scenarios. For example, iris imaging in a telephone booth or for hands free telephone use, multiple iris imaging in a crowd of
people, iris imaging of people in a moving or stationary vehicle, iris imaging of a race horse, or use at a point of sale site.
Other objects and advantages will become apparent from a description of certain present preferred embodiments shown in the drawings.
BRIEF DESCRIPTION OF THE FIGURES
Figure 1 is a front view of a present preferred embodiment of our device for obtaining images of irises.
Figure 2 is a side view of the embodiment of Figure 1.
Figure 3 is a top plan view of a first present preferred embodiment of our narrow field of view illuminator.
Figure 4 is a side view of the illuminator of Figure 3 with the area of illumination shown in chainline.
Figure 5 is a side view similar to Figure 4 of a second present preferred embodiment of our narrow field of view illuminator.
Figure 6 is a top view of a first present preferred illuminator bracket.
Figure 7 is a side view showing the illuminator bracket of Figure 6 with the position of the illuminator shown in chainline.
Figure 8 is a top view of a second present preferred illuminator bracket.
Figure 9 is a side view showing the illuminator bracket of Figure 8 with the position of the illuminator shown in chainline.
Figure 10 is a block diagram showing a preferred control architecture for the embodiment of Figure 1.
Figure 11 is a front view showing the eyes and glasses of the person to be identified on which a reflection of the wide field of view illuminator appears.
Figure 12 is a front view showing the eyes and glasses of the person to be identified on which a reflection from the narrow field of view illuminators appears.
DESCRIPTION OF THE PREFERRED EMBODIMENTS
Referring to Figures 1 and 2, a present preferred embodiment of our device is contained in housing 1 which is 14.5 inches wide, 15 inches high, and 18 inches deep. A housing of this size can be easily placed within or near an automated teller machine or other limited access machine or entry place. When incorporated within or placed near the housing of an automated teller machine, our unit is located behind a light transmissive bezel. Typically, the bezel would be smoked or other visually opaque glass, or a comparable plastic, which obscures our device from being easily seen. Our device would be positioned at about the eye level of most users.
In Figure 2 we show the head of a user, the person to be identified, in front of our device.
In the preferred embodiment of our device shown in Figures 1 and 2, we provide two wide field of view (WFOV) cameras 3, each having a lens 2 and 4. The orientation and placement of lenses 2 and 4 and other components may be changed to accommodate available space or other applications. A wide field of view illuminator 6 surrounds the lens. Hoods 5 and 7 are provided around the lenses 2 and 4 to prevent light emitted from the illuminator 6 from passing directly into the camera. The wide field of view illuminator is comprised of sets of light emitting diodes 10. For ease of
construction, these sets of diodes may be mounted on small circuit boards 12. These boards are then mounted on a housing 13 which surrounds the lens 2 and 4. A sufficient number of LED containing circuit boards 12 are provided and positioned to illuminate the head of the person to be identified who is standing in front of our device as shown in Figure 2. Consequently, we prefer that the wide field of view illuminator provide a field of illumination which encompasses a region of about two feet in diameter at a distance of about one foot from the illuminator. This field of illumination is indicated by the solid lines extending from the wide field of view illuminator 6 in Figure 2.
Portions of the wide field of view illuminator are positioned around the WFOV camera to provide nearly on-axis illumination. On-axis illumination, nearly on-axis illumination, and oblique illumination allow surface features to be imaged with minimal shadows, which can generate false edges or other artifacts. Such illumination of the eye produces a shadow free image with good surface features. This type of lighting may cause the pupil to be bright, making the iris easier to locate, although this feature can be disabled if desired. Any shadows produced by other light sources and camera angles are minimized or washed out. Illumination control of this type can be used to stimulate parasympathetic autonomic nervous system reflexes that cause eye blinks, pupil variation or other reactions. These changes may be useful to determine subject awareness, to establish certain life signs and to reduce pupil size for improving imaging resolution.
Light from the wide field of view illuminator is reflected from the user's face into the wide field of view camera lenses 2 and 4. This enables the WFOV cameras to create images from which an x, y, z coordinate location can be determined for one of the user's eyes. The WFOV cameras may also be used to provide security video images for monitoring transactions or other activities. We take an image of the right eye; however, either the left eye or the right eye or both eyes can be selected. The WFOV cameras could utilize a number of techniques, such as stereo, stereo with structured light, and depth from focus with or without structured light, to determine both the x-y location and distance to the object of interest. We prefer to use stereo processing techniques which compare at least a portion of the images from the two wide field of view cameras to determine the x, y, z coordinate locations. We can also provide a gaze director 9 which assists in identifying the location and motion of the eye by attracting the attention of the user.
After we know the position of the selected eye we must illuminate the iris portion of the eye and obtain an iris image which can be used for iris verification and recognition. We prefer to be able to obtain an image having approximately 200 pixels across the iris portion of the image so that the image recognition algorithms used to perform identification and verification can operate reliably.
The iris recognition algorithms compare features of the iris in the image with the same features in a file image. Verification is considered to have been made when there is a match of a predetermined number of the compared features. For some situations a match of at least 75% of the compared features may be required. It is therefore important that there be no specularities, spurious light reflections or dark shadows covering a significant portion of the image. When the user is wearing glasses, this can easily occur. To overcome this problem we provide at least two spaced apart illuminators for use in conjunction with the NFOV camera. These illuminators, in conjunction with the WFOV camera, may be selectively switched in an orchestrated combination to enhance these image structures. For example, a structured specularity can be created on eyeglasses to make eye finding easier, and then disabled for iris image acquisition.
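The threshold comparison described above might be sketched as follows. The feature representation here is a toy stand-in (production iris codes, such as Daugman's, compare phase bits by Hamming distance); only the 75% threshold comes from the text, and occlusion handling via `None` entries is an assumption.

```python
# Toy verification sketch: accept the subject when at least a fixed
# fraction of the compared iris features match the file image.
def verify(live_features, file_features, threshold=0.75):
    """Compare equal-length feature lists; None marks an occluded
    feature (e.g. hidden by a specularity) and is skipped."""
    compared = [(a, b) for a, b in zip(live_features, file_features)
                if a is not None and b is not None]
    if not compared:
        return False                      # nothing usable to compare
    matches = sum(1 for a, b in compared if a == b)
    return matches / len(compared) >= threshold
```

Skipping occluded features is why the patent works to keep specularities and shadows off the iris: every occluded feature shrinks the evidence available for the match.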
In the embodiment shown in Figure 1 a single NFOV camera 16 is positioned behind mirror 18. Light emitted from one or more of the NFOV illuminators 21, 22 or 23 is reflected from the selected eye to an optical subsystem which directs the reflected light to the NFOV camera. We can provide a sensor 14 which senses the level of ambient light surrounding the person to be identified.
Information from this sensor can be used to determine what, if any, illumination must be provided by the illuminators 21, 22 and 23. Alternatively, any one or a plurality of the cameras themselves may be used for such light sensing.
The optical subsystem 30 in the embodiment shown in Figure 1 contains a pan/tilt mirror attached to rod 33 which extends from motor 34. This enables the pan/tilt mirror to be rotated about a tilt axis corresponding to a centerline through rod 33. Motor 34 is mounted on arm 35 which is pivotably attached to base 36 by rod 37.
This arm 35 can be moved around a pan axis corresponding to a centerline through rod 37. Light emitted from any of the NFOV illuminators 21, 22, or 23 is reflected from
the subject iris to the pan/tilt mirror 32. That mirror is positioned to direct the reflected light to mirror 18, from which the light is reflected to NFOV camera 16. The lens of the NFOV camera can be moved to change the focus or zoom, and the aperture of the camera is adjustable. One can also mount NFOV camera 16 on a movable platform so that the NFOV camera can be turned toward the eye. Then, the portions of the optical subsystem shown in Figure 1 may not be needed. Our preferred optical subsystem has five degrees of freedom: the pan axis, the tilt axis, the focus axis, the aperture axis and the zoom axis. A system with fewer degrees of freedom could also be used. The pan and tilt axes are used to position the pan/tilt mirror 32 so that the correct narrow field is imaged onto the sensing array of NFOV camera 16. The ability to control movements along the focus axis, aperture axis and zoom axis allows us to be certain that the imaged object is in focus. In some cases, only the NFOV camera is needed and the WFOV camera may optionally not be used.
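The five-degree-of-freedom subsystem can be modeled abstractly as below. This is a hypothetical sketch, not the patent's controller: axis names come from the text, while the rate-limited stepping and the class names are assumptions. Note that all axes advance on every tick, mirroring the simultaneous motion the text later describes.

```python
# Abstract model of a five-axis optical subsystem (pan, tilt, focus,
# aperture, zoom), each axis slewing toward its set point at a bounded
# rate, with all axes moving simultaneously.
from dataclasses import dataclass, field

@dataclass
class Axis:
    position: float = 0.0
    setpoint: float = 0.0
    max_step: float = 1.0                 # largest move per control tick

    def step(self):
        """Advance one tick; return True when the set point is reached."""
        err = self.setpoint - self.position
        self.position += max(-self.max_step, min(self.max_step, err))
        return abs(self.setpoint - self.position) < 1e-9

@dataclass
class OpticalSubsystem:
    axes: dict = field(default_factory=lambda: {
        n: Axis() for n in ("pan", "tilt", "focus", "aperture", "zoom")})

    def move_to(self, **targets):
        for name, value in targets.items():
            self.axes[name].setpoint = value
        while True:
            done = [ax.step() for ax in self.axes.values()]  # all axes each tick
            if all(done):
                break
```

A real controller would run the ticks from a timer interrupt rather than a blocking loop, but the structure (per-axis set points, bounded steps, concurrent motion) is the same idea.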
The design of the optics (resolution, magnification, focusing and size of the imager) dictates the distance between the camera and lens and the distance between the lens and object to be imaged. The size of the imager is of paramount importance in defining the distance from lens to imager and contributes to the depth of focus. Those versed in the art will recognize that NFOV camera 16 may be solid state or of vidicon nature and that the sensing array size can vary from generic industry sizes of 1/4, 1/3, 1/2, 2/3 or 1 inch diagonal measurement. In an optical system, the introduction of a mirror in an optical path allows the path to be redirected without affecting the optical path length. The use of these mirrors allows the optical path to be folded back on itself,
thus reducing the overall required physical length needed to implement the optical design. Those skilled in the art will recognize that a gimbaled camera can also be used to perform image steering.
In the embodiment of Figure 1 the illuminators are positioned to provide a field of illumination illustrated by the dotted lines in Figure 2. These fields must be directed and sized so that light will be reflected from the eye of the user to the pan/tilt mirror 32. To achieve this result we provide three illuminators 21, 22 and 23 placed at different locations on the housing 1. These illuminators are oriented to direct light to the areas where the user's eye is most likely to be based upon information received from the WFOV cameras or other position detectors that could be used. The illuminators may be mounted in a permanent location and orientation or placed on manually adjustable or motorized brackets such as those shown in Figures 6, 7, 8 and 9. The illuminators could also be attached to a sliding mechanism for translation along an axis.
The light source preferably is a light emitting diode or other device that emits infrared or near infrared light, or a combination of these. A lens and diffuser (not shown) can be used to ensure uniform illumination. We have found infrared light to be particularly useful because it penetrates eyeglasses and sunglasses more easily than visible light or colored light within the visible spectrum. Infrared light is also invisible to the user and extremely unobtrusive. Optical filters may be placed in the light path in front of the camera to reduce any undesirable ambient light wavelengths that corrupt the desired images. Different wavelength priority filters may be used to permit different wavelengths to be used to optimize each camera's performance. For example,
longer wave IR could be used for WFOV camera imaging and shorter IR could be used for NFOV camera imaging. Then the NFOV cameras would not respond to WFOV illumination. This "speed of light" processing can be used to great advantage. If desired, the LED light source could be strobed. Strobing provides the capability to freeze motion. Strobing also provides the capability to overwhelm ambient light by using a high intensity source for a brief period of time, and exposing the camera accordingly to wash out the background ambient illumination which would otherwise cause interference. Strobing NFOV and WFOV illuminators, perhaps at different times, and allowing the cameras to integrate photons over appropriate, possibly disparate, time slices permits optimum usage of the specific spatial and time characteristics of each device.
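Why strobing overwhelms ambient light can be seen with a small energy calculation. All numbers here are illustrative assumptions, not values from the patent.

```python
# Energy bookkeeping for a gated strobe: when the exposure is matched
# to a short, intense strobe, the sensor collects the full strobe
# energy but only a sliver of the ambient energy it would accumulate
# over a whole frame with the shutter left open.
def ambient_suppression(strobe_intensity, ambient_intensity,
                        strobe_s, frame_s=1 / 30):
    """Return (gated_ratio, open_ratio): strobe-to-ambient energy
    ratio with the exposure gated to the strobe vs. a full-frame
    exposure. Intensities are in arbitrary, consistent units."""
    strobe_energy = strobe_intensity * strobe_s
    gated = strobe_energy / (ambient_intensity * strobe_s)
    open_shutter = strobe_energy / (ambient_intensity * frame_s)
    return gated, open_shutter
```

With an assumed 1 ms strobe at 50x ambient intensity, the gated ratio stays at 50:1 while the open-shutter ratio collapses to about 1.5:1, a factor of frame_s/strobe_s in favor of gating.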
As shown in Figures 1, 3, 4 and 5, the NFOV illuminators are preferably comprised of a 6 x 6 array of light emitting diodes 20 mounted on a circuit board 24. If the light emitting diodes are mounted normal to the circuit board, the illuminator will illuminate an area of illumination having some diameter b as indicated in Figure 4. We have discovered that the area of illumination can be increased, using the exact same components as are used for the illuminator of Figure 4, by repositioning or otherwise reconfiguring the light emitting diodes. This is shown in the embodiment of Figure 5 by larger diameter a. In that illuminator the top two rows of diodes are positioned at an acute angle relative to the board 24 and the lower two rows of diodes are mounted at an obtuse angle. Circular ring illuminators or other shapes may also be used, which could be placed around the NFOV lens. Other optical elements such as polarizers or
birefringent analysis devices may be used to minimize artifacts or enhance other biologically relevant characteristics such as corneal curvature, eye separation, iris diameter, skin reflectance and scleral vasculature.
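The widening effect of the angled rows (diameter a versus diameter b) follows from simple beam geometry. The beam half-angle and tilt values below are assumed for illustration; the patent does not specify them.

```python
import math

# Geometry of the angled-row illuminator: tilting the outer LED rows
# outward by +/- tilt widens the combined illuminated spot, because
# the outermost ray now leaves at (tilt + beam half-angle) instead of
# just the beam half-angle.
def illuminated_diameter(distance, led_half_angle_deg, row_tilt_deg=0.0):
    """Diameter of the spot at `distance` covered by rows tilted
    outward by +/- row_tilt_deg, each LED having the given half-angle
    (illuminator size itself neglected for simplicity)."""
    outer = math.tan(math.radians(row_tilt_deg + led_half_angle_deg))
    return 2 * distance * outer
```

With a 10 degree half-angle at half a metre, flat mounting covers roughly 0.18 m while a 15 degree row tilt widens that to roughly 0.47 m, the b-to-a growth sketched in Figures 4 and 5.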
In Figures 6 and 7 we show a bracket which we have used to attach the NFOV illuminators 21, 22, and 23 to the housing 1. That bracket 30 has a U-shaped base 31 having a hole 32 through which a screw attaches the bracket to the housing. A first pair of gripper arms 33 and 34 with attached pin 42 are pivotably attached to one upright of the base. A similar second pair of gripper arms 35 and 36 are pivotably connected to the opposite upright through collar 37. The illuminator indicated in chain line in Figure 7 is held between the gripper arms by screws or pins 38. A series of holes (not visible) are provided along the uprights so that the gripper arms can be positioned at any selected one of several positions. A locking tab 39 extends from the collar 37 into an adjacent slot in the upright to prevent rotation of the collar. Set screw 38 is tightened against the pin 42 extending from gripper arms 35 and 36 to prevent rotation of the gripper arms.
A second bracket 44 which we have used to attach the NFOV illuminators 21, 22, and 23 to the housing 1 is shown in Figures 8 and 9. That bracket has a base 51 which attaches to the housing. A rod 52 extends upward from the base 51. Collar 53 slides along rod 52 and can be held at any desired location on the rod by set screw 54. Rod 55 extends from collar 53 and holds carrier 56. This carrier is slidably attached to the rod 55 in the same manner as collar 53. The illuminator indicated in chain line in Figure 9 is attached to the carrier 56 by fasteners or snap fit on pins 57 extending from the carrier. Both this bracket 44 and the other illustrated bracket 30 permit the attached illuminator to be repositioned or adjusted along a pan axis and a tilt axis.
We prefer to connect each array of light emitting diodes through a distribution board 60 to an illumination controller 62 as shown in Figure 10. Since there can be one or more illuminator arrays these arrays are designated as ILLUMINATOR 1 through ILLUMINATOR X in the drawing. The distribution board 60 and illumination controller 62 enable us to selectively light the WFOV illuminator 6 and the NFOV illuminators 21, 22 and 23 in the embodiment of Figure 1. Furthermore, we can selectively illuminate sets of light emitting diodes within each array or selectively light individual diodes.
Each set of light emitting diodes may emit a different wavelength of light. It has been noted that the irises of different people respond better to certain wavelengths and worse to others. This could be accomplished using illuminators with different wavelength LEDs, or by populating a single illuminator with different wavelength LEDs next to each other. These could be strobed and the better image selected. Additionally, LEDs of different beam widths could be mounted side by side or in different illuminators for illumination intensity control or for specularity control: the tighter the beamwidth, the smaller the specularity.
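The selective addressing described here (whole arrays, groups within an array, or individual diodes) can be sketched as a small controller model. The class and method names are hypothetical; only the addressing granularity comes from the text.

```python
# Toy model of the illumination controller behind the distribution
# board: each named illuminator array holds per-diode on/off state,
# addressable as a whole array, a group of diodes, or one diode.
class IlluminationController:
    def __init__(self, arrays):
        """arrays: mapping of illuminator name -> number of diodes."""
        self.state = {name: [False] * n for name, n in arrays.items()}

    def light_array(self, name, on=True):
        self.state[name] = [on] * len(self.state[name])

    def light_group(self, name, indices, on=True):
        for i in indices:
            self.state[name][i] = on       # e.g. one angled row

    def lit_count(self, name):
        return sum(self.state[name])
```

Wavelength selection would sit one level above this: light one wavelength group, grab a frame, light the other, and keep whichever image scores better.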
We can also control the duration and the intensity of the light which is emitted. To prevent burnout of the illuminators caused by prolonged illumination we
can provide timers 63 for each illuminator as indicated by the dotted blocks labeled "T" in Figure 10. The timers will cut the power to the array after a predetermined period of illumination. The WFOV cameras 3 provide images to an image processor 64 which we call the PV-I. That processor 64 tells the computer 65 the x, y, z coordinates of the selected eye or eyes of the person to be identified. The image processor may also assess the quality of the image and contain algorithms that compensate for motion of the subject. This processor may also perform image enhancement. The PC 65 has access to information from the ambient light level detector 69 and the NFOV camera.
These data can be used to modify illumination strategies. The x, y, z coordinates for the expected position of the eye enable the computer to direct the illumination controller as to which illuminators should be lighted and to direct the pan/tilt controller 66 to properly position the pan/tilt unit 67 so that a useful image of the iris can be obtained. These functions may be selected by results from the WFOV camera processing. Commands are sent from the computer 65 to the motors to change the location of the pan/tilt axes or to adjust focus. In the simplest case, one may consider that a WFOV image is acquired, the data is processed and then passed through the image processor 64 and computer 65 to the pan/tilt controller 66 or a gimbaled controller. In order to minimize motion time and control settling time, there can be simultaneous motion in the optical subsystem along all five axes.
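Converting the eye's x, y, z coordinates into pointing commands reduces to two arctangents. This sketch assumes a coordinate frame centred on the steering optics with z straight out toward the user; the function name and frame are illustrative, and a real mirror would be set to half these angles to fold the optical path correctly.

```python
import math

# Pointing angles toward an eye at (x, y, z): pan rotates about the
# vertical axis, tilt about the horizontal axis, both in degrees.
def pan_tilt_for_eye(x, y, z):
    """x: lateral offset, y: vertical offset, z: distance out from
    the device (all in the same units)."""
    pan = math.degrees(math.atan2(x, z))
    tilt = math.degrees(math.atan2(y, math.hypot(x, z)))
    return pan, tilt
```

The computer would issue these as macro commands; the pan/tilt controller then interpolates intermediate set points along the path, as described below.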
The pan/tilt controller 66 accepts macro level commands from the computer and generates the proper set points and/or commands for use by the illumination control or each axis supervisor. The intermediate continuous path set
points for the axis are generated here and then sent to each axis supervisory controller.
A command interpreter decodes the commands from the image analysis and formats responses using positioning information from the optical devices. A real time interrupt produces a known clock signal every n milliseconds. This signal is a requirement for the implementation of a sampled data system for the position controller of each axis and allows synchronization via the supervisory controller for continuous path motion.
A diagnostic subsystem performs health checks for the control system.
Besides the choreography of the five axes, the microprocessor controller must also provide illumination control. The illumination controller will accept commands, similar to the commands associated with motion control, to activate selected illuminators in a timely manner or synchronously with camera frame capture.
Images from the WFOV are transmitted as analog signals to the image processor 64. The image processor preferably contains two pyramid processors, a memory capable of storing at least two frames, one LUT, an ALU device, a digitizer which digitizes the analog video signal, a Texas Instruments TMS320 C-31 or C-32 processor and a serial/parallel processor. The image is processed using the pyramid processors as described in United States Patent No. 5,359,574 to van der Wal. The Texas Instruments processor computes disparities between images. The WFOV images define a region or point in the field of view of the WFOV cameras where the subject's right eye or left eye or both are located. Using stereo processing techniques on the disparities will result in x, y, z coordinates for points on the subject relative to the WFOV cameras. That information is then further processed to define an area of interest
such as the head or an eye. The coordinates of the area of interest are used to direct the NFOV optical system. These position coordinates are transferred from the image processor to a NFOV image and iris image processor 65. This unit 65 contains a 486, PENTIUM or other microprocessor system and associated memory. In the memory are programs and algorithms for directing the optical platform and doing iris identification.
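The geometry behind this stereo step can be illustrated with a minimal triangulation sketch. This is not the patent's implementation (which uses pyramid processors and a TI DSP); it simply shows, under a rectified pinhole-camera assumption, how a pixel disparity becomes x, y, z coordinates relative to the cameras. All parameter names are assumptions:

```python
def stereo_to_xyz(u_left, v_left, disparity, focal_px, baseline, cx, cy):
    """Triangulate a point from a rectified WFOV stereo pair.

    focal_px -- focal length in pixels (assumed equal for both cameras)
    baseline -- separation between the two WFOV cameras, in metres
    cx, cy   -- principal point of the left camera, in pixels
    """
    if disparity <= 0:
        raise ValueError("disparity must be positive for a finite range")
    z = focal_px * baseline / disparity      # range from the cameras
    x = (u_left - cx) * z / focal_px         # lateral offset
    y = (v_left - cy) * z / focal_px         # vertical offset
    return x, y, z
```

For example, with an 800-pixel focal length and a 0.1 m baseline, a 40-pixel disparity places the eye 2 m from the cameras.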
Additionally, WFOV video images can be stored as a security video record.
The focus axes of the NFOV system may be controlled in an open loop fashion. In this case, the x,y,z coordinate from stereo processing defines, via table lookup or analytic computation, the focus axis position so that the lens properly focuses the NFOV camera on the object of interest. A closed loop focus method could also be used. In this case, NFOV video would be processed by image processor 64 to obtain a figure of merit defining whether the axis was in focus. From the figure of merit the axis could be commanded forward or backward and then a new image acquired. The process would continue in a closed loop form until the image is in focus. Other information, such as iris size and location as measured in camera units, and eye separation from WFOV camera images, can be combined with stereo and other focus information into multivariate features that can be used to refine range information by fusion of direct or derived sensory information. This sensor fusion may encompass other information as well.
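The closed-loop variant amounts to hill-climbing on the figure of merit. The following sketch assumes three interfaces not specified in the patent: `move_axis(pos)` positions the focus axis, `grab_image()` returns a frame, and `figure_of_merit(img)` scores sharpness (e.g. gradient energy):

```python
def closed_loop_focus(move_axis, grab_image, figure_of_merit,
                      start, step, max_steps=50):
    """Hill-climb the focus axis until the figure of merit peaks.

    A sketch of the closed-loop method described above; interface
    names are assumptions. Returns the best axis position found.
    """
    pos = start
    move_axis(pos)
    best_pos, best_score = pos, figure_of_merit(grab_image())
    direction = 1
    for _ in range(max_steps):
        pos += direction * step
        move_axis(pos)
        score = figure_of_merit(grab_image())
        if score > best_score:
            best_pos, best_score = pos, score
        else:
            direction = -direction      # overshot: reverse direction
            step /= 2                   # and refine the step size
            if step < 1e-3:
                break
    move_axis(best_pos)
    return best_pos
```

A real implementation would bound the search to the depth range predicted by the WFOV stereo result rather than searching blind.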
Since the object of interest, namely the eye, may be moving, there is a requirement that the NFOV camera track the trajectory seen by the WFOV. When motion ceases to blur the image, a quality image may be acquired via the NFOV camera
and optics. By tracking the eye, the optics directing light to the NFOV camera are aligned so that when it is desired to obtain an iris quality image little or no additional motion may be required.
In this case, the x,y,z coordinates from analysis of the WFOV images are sent to the NFOV controller at some uniform sample rate (such as every 100 ms). A continuous path algorithm, such as described in Robotic Engineering: An Integrated Approach by Klafter, Chmielewski and Negin (Prentice Hall, 1989), would be used to provide intermediate sets of set points to the axes so that the axes remain in motion during the tracking phase. To define the last end position, either a macro level command can be given or the same command can be continually sent at the sample periods.
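As a minimal illustration of the intermediate set points, a linear continuous-path profile can be generated per sample period. This is a simplification of the richer profiles in Klafter et al. (which shape velocity and acceleration); the patent does not mandate a particular interpolation:

```python
def intermediate_setpoints(current, target, sample_ms, travel_ms):
    """Generate per-sample set points that keep an axis in motion.

    current/target -- axis positions (units arbitrary)
    sample_ms      -- controller sample period in milliseconds
    travel_ms      -- time allotted for the move
    Returns a list of positions, one per sample period, ending at target.
    """
    n = max(1, travel_ms // sample_ms)
    return [current + (target - current) * k / n for k in range(1, n + 1)]
```

Each axis supervisor would consume one entry per real-time clock tick, so all five axes stay in coordinated motion while tracking the eye.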
It is important to recognize that as the NFOV axes move, the associated imager may not have sufficient time to perform the required integration to get a non-blurred image. Additionally, depending on the camera used (interlaced or progressive scan) there may be field to field displacement or horizontal displacement of the image, all of which can be wholly or partially corrected by computation. Thus, it is easily seen why the WFOV camera provides the information necessary for directing the NFOV stage. It should be noted that certain eye tracking algorithms (such as those based on specularity or iris configuration or pattern matching) may be capable of providing sufficient information (even if the image is slightly blurred due to focus or exhibits some blur caused by motion) to provide a reasonable estimate of the eye location in the NFOV camera. Hence, it is conceptually possible to use the WFOV data for coarse movement and the processed NFOV data (during motion) as additional information for finer resolution. This fusion of data can provide a better estimate than one WFOV camera image alone in positioning the NFOV image to acquire a quality iris image.
To acquire a quality iris image, the NFOV axes must settle to a point where the residual motion is less than that which can be detected by the imager. Once this occurs, any remaining images must be purged from the imager (typically there is a delay between image integration and readout via RS-170) and the proper integration time allowed to acquire a non-blurred image. See Robotic Engineering: An Integrated Approach for a timing scenario. This can be accomplished in a number of ways, the simplest being a time delay which occurs after the cessation of motion until a good quality RS-170 image is captured. Multiple iris images which may be partially obscured may be collected and fused into a single composite, less obscured iris image using normalization and fusion methods.
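One simple way to realize the fusion step mentioned above is a masked pixel-wise average: each frame carries a validity mask marking pixels judged unobscured, and each output pixel averages the valid samples. This is a stand-in sketch, not the patent's specific normalization-and-fusion method:

```python
def fuse_iris_images(frames, masks):
    """Fuse several partially obscured iris frames into one composite.

    frames -- list of equally sized 2-D lists of pixel values
    masks  -- parallel list of 2-D lists holding 1 where the pixel is
              valid (not covered by a specularity or eyelid), else 0
    """
    rows, cols = len(frames[0]), len(frames[0][0])
    fused = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            samples = [f[r][c] for f, m in zip(frames, masks) if m[r][c]]
            # fall back to the first frame where no sample is valid
            fused[r][c] = (sum(samples) / len(samples)) if samples else frames[0][r][c]
    return fused
```

In practice the frames would first be normalized (aligned and scaled to a common iris coordinate system) before fusing.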
We have found that any light source will cause a reflection on eyeglasses of the person to be identified. The eyes as seen from the WFOV cameras 3 are shown in Figure 11. There is a reflection 70 from the WFOV illuminator 6 on both lenses 72 of the person's eyeglasses 74. The reflection 70 partially covers the iris 76 of the person's eye making iris identification difficult if not impossible. To overcome this problem we use illuminators 21, 22 and 23 located off-axis from the optical axis of the NFOV camera 16. By carefully positioning and sometimes using only some of the light emitting diodes we can achieve adequate illumination without creating an obscuring reflection. This result is shown in Figure 12 where light from only a few
light emitting diodes in the NFOV illuminators 21, 22 and 23 has created a reflection that appears in the image. That reflection does not cover any part of the iris 76. In some cases, especially with glasses, the WFOV specularity makes finding the head and the eye more expeditious.
Multiple illuminators also enable us to determine the shape of eyeglasses worn by the subject in the images. We illuminate the eyeglasses sequentially using two spaced apart illuminators. The specularity will be in one position during the first illumination and in a different position during the second illumination. The amount of specularity change is then calculated to determine appropriate eyeglass shape. From that information we can determine the minimum movement of illumination required to move the specularity off of the iris.
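The two-flash measurement reduces to comparing the specularity centroids under each illuminator. The sketch below illustrates the displacement computation only; relating the shift magnitude to lens curvature, and hence to the minimum illuminator movement, would require the system's illuminator geometry, which is an assumption here:

```python
def specularity_shift(spot_a, spot_b):
    """Displacement of the specular highlight between two illuminations.

    spot_a, spot_b -- (x, y) image centroids of the specularity under
                      illuminator A and illuminator B respectively.
    Returns (dx, dy, magnitude); a flatter eyeglass lens produces a
    larger shift for the same illuminator separation.
    """
    dx = spot_b[0] - spot_a[0]
    dy = spot_b[1] - spot_a[1]
    return dx, dy, (dx * dx + dy * dy) ** 0.5
```

The direction of (dx, dy) also tells the controller which way to move the active illumination to push the specularity off the iris.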
A calibration procedure must be used to correlate the center of the NFOV camera's field of view with pan/tilt and focus axis positions for a series of coordinates in 3 dimensional space as defined by the wide field of view. Given a set of WFOV coordinates defining the position of a user's eye somewhere in the working volume in front of the cameras, a transformation or table look up can be used to define the coordinates of the pan, tilt and focus axes that make the center of the NFOV camera's field of view coincident with x,y coordinates and in focus on the z plane. We prefer to use a series of targets to assist in calibration. These targets have partially filled circles corresponding to iris positions at known locations on a page. The targets are placed a known distance from the housing and the device is activated to attempt to find an iris and produce a calibration image.
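The table-lookup form of this correlation can be sketched as a nearest-neighbour map from WFOV coordinates to axis positions. The patent allows either an analytic transformation or a table lookup; this minimal version uses the nearest calibrated point, whereas a production system would interpolate between neighbouring entries:

```python
def nearest_calibration(table, x, y, z):
    """Map a WFOV eye coordinate to pan/tilt/focus axis positions.

    table -- dict mapping (x, y, z) tuples, measured during the
             target-based calibration procedure, to (pan, tilt, focus)
             axis positions. Units are whatever the calibration used.
    """
    def dist2(p):
        return (p[0] - x) ** 2 + (p[1] - y) ** 2 + (p[2] - z) ** 2
    return table[min(table, key=dist2)]
```

Each calibration target placed at a known distance contributes one entry to `table`.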
When NFOV and WFOV cameras are used, they must be calibrated together. This may be accomplished by manual or automatic procedures; optical targets or projected targets may be used. An automatic procedure would require a calibration object to be automatically recognized by the computational support equipment operating on the camera images. Another possibility is to use the NFOV camera motion capabilities or other motion generation capability to project a target into the calibration volume, and then recognize the target.
Although we have shown certain present preferred embodiments of our compact image steering and focusing device and methods of using that device, it should be distinctly understood that our invention is not limited thereto but may be variously embodied within the scope of the following claims.
Claims (27)
1. An apparatus for acquiring images of irises including: means for determining a spatial location of an eye of a subject located at any point within a field of view of said determining means; at least one camera positioned to take an image of the eye, the camera having optics which can be adjusted so that the image will contain a representation of the iris which is of sufficient resolution to be used for iris verification and identification; and at least one illuminator positioned to illuminate the iris and including a plurality of light emitting elements which can be selectively illuminated and concurrently illuminated.
2. The apparatus of claim 1 also including a second illuminator positioned to illuminate the iris, the second illuminator being positioned apart from the first illuminator.
3. The apparatus of claim 2 wherein the second illuminator includes a plurality of light emitting elements which can be selectively illuminated.
4. The apparatus of claim 1 wherein light is reflected from the iris to the at least one camera along a camera axis and the light travels from the at least one illuminator along a path which intersects the camera axis.
5. The apparatus of claim 1 wherein the at least one illuminator includes at least one array of light emitting diodes mounted on a base.
6. The apparatus of claim 5 wherein the array of light emitting diodes include: a first set of light emitting diodes which is attached to the base in a manner to emit light along a first path; and a second set of light emitting diodes which is attached to the base in a manner to emit light along a second path which is not parallel to the first path.
7. The apparatus of claim 6 wherein the first set of light emitting diodes and the second set of light emitting diodes can be separately illuminated.
8. The apparatus of claim 5 also including an array bracket to which the base of the at least one array of light emitting diodes is pivotally attached for rotation about an array axis.
9. The apparatus of claim 8 also including a second bracket to which the array bracket is movably attached in a manner to permit movement of the array along a line normal to the array axis.
10. The apparatus of claim 5 wherein at least some of the light emitting diodes emit light of a wavelength which is different from light wavelengths emitted from other light emitting diodes.
11. The apparatus of claim 1 wherein the at least one illuminator can emit at least one of infrared light, visible light, near infrared light, a select band of light frequencies, and both visible light and infrared light.
12. The apparatus of claim 1 wherein the at least one illuminator can emit light of varying intensity.
13. The apparatus of claim 12 also including a power source connected to the at least one illuminator which can emit light of varying intensity and a controller connected to the power source for changing power output to the at least one illuminator thereby changing intensity of the light which is emitted by that illuminator.
14. The apparatus of claim 1 wherein the at least one of the illuminators can emit different wavelengths of light.
15. The apparatus of claim 2 also including a controller connected to the at least one illuminator and the second illuminator, the controller containing a program for selectively illuminating the illuminators according to a dynamically predetermined pattern.
16. The apparatus of claim 1 also including a wide field of view camera positioned to take an image which includes the eye.
17. The apparatus of claim 16 also including a wide field of view illuminator.
18. The apparatus of claim 17 wherein the wide field of view illuminator includes a ring of light emitting elements arranged around the wide field of view camera.
19. The apparatus of claim 17 also including a hood surrounding the wide field of view camera to prevent light from the illuminators from directly entering the camera.
20. The apparatus of claim 1 also including a power source connected to the at least one of the illuminators and a timer connected to the power supply which timer turns off the power source after a selected period of time.
21. The apparatus of claim 1 also including an ambient light sensor and a controller connected to the ambient light sensor and the at least one illuminator the controller containing a program for shutting off the illuminators when the ambient light is at a predetermined level.
22. The apparatus of claim 1 also including a motor connected to the at least one illuminator.
23. The apparatus of claim 1 also including an optical system which receives light reflected from the iris and directs the light to the at least one camera.
24. The apparatus of claim 1 wherein the at least one camera is movable.
25. A method for acquiring an image of an iris including the steps of: locating within a field of view of a camera a three dimensional coordinate position of an eye containing the iris to be imaged; positioning the camera so that at least some light reflected from the iris will be reflected to the camera, the camera having optics which can be adjusted so that an image of the iris will be of sufficient resolution to be used for iris identification; illuminating the eye with at least one illuminator so that light is reflected from the iris to the camera along a camera axis and the light travels from the at least one illuminator along at least one path which intersects the camera axis wherein the at least one illuminator includes a plurality of light emitting elements which can be selectively illuminated; and creating at least one image of the iris with the camera during illumination which image is of sufficient resolution to be used for iris verification and identification.
26. The method of claim 25 wherein the at least one illuminator includes a first illuminator and a second illuminator which are illuminated sequentially and the camera creates a first image and a second image during the sequential illumination, which images are used to create the image of sufficient resolution to distinguish among identifying features within the iris.
27. The method of claim 25 wherein the at least one illuminator includes two sets each set containing a plurality of light emitting elements which sets are selectively and sequentially illuminated.
28. The method of claim 25 wherein the at least one illuminator can emit at least one of infrared light, visible light, near infrared light, a select band of light frequencies, and both visible light and infrared light.
29. The method of claim 25 wherein the image includes a plurality of pixels and that portion of the image which contains the iris includes at least 200 pixels.
30. The method of claim 25 wherein the three dimensional coordinate position of the eye is located by: using a first wide field of view camera to create a first image of a region in which the eye is believed to be located; using a second wide field of view camera spaced apart from the first wide field of view camera to create a second image of a region in which the eye is believed to be located; and combining the first image and the second image in a manner to establish the three dimensional coordinate position of the eye.
31. The method of claim 30 wherein the images are combined using stereographic image analysis.
32. The method of claim 30 also including the step of illuminating the region using nearly on axis illumination.
33. The method of claim 25 wherein the camera is mounted within an optical subsystem containing a pan/tilt mirror from which reflected light is directed to the camera and also including the step of adjusting the pan/tilt mirror toward the three dimensional coordinate position of the eye to direct light reflected from the iris to the camera.
34. The method of claim 25 wherein the camera is gimbal mounted and also including the step of adjusting the camera toward the three dimensional coordinate position of the eye to direct light reflected from the iris to the camera. DATED this 22nd day of September, 2000. SENSAR INC and THE SARNOFF CORPORATION, WATERMARK PATENT & TRADEMARK ATTORNEYS, 4TH FLOOR "DURACK CENTRE", 263 ADELAIDE TERRACE, PERTH WA 6000
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US70292396A | 1996-08-25 | 1996-08-25 | |
US08/702923 | 1996-08-26 | ||
PCT/US1997/014873 WO1998008439A1 (en) | 1996-08-25 | 1997-08-22 | Apparatus for the iris acquiring images |
Publications (2)
Publication Number | Publication Date |
---|---|
AU4328297A AU4328297A (en) | 1998-03-19 |
AU727389B2 true AU727389B2 (en) | 2000-12-14 |
Family
ID=24823180
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
AU43282/97A Ceased AU727389B2 (en) | 1996-08-25 | 1997-08-22 | Apparatus for the iris acquiring images |
Country Status (6)
Country | Link |
---|---|
EP (1) | EP0959769A1 (en) |
JP (2) | JP2002514098A (en) |
KR (1) | KR100342159B1 (en) |
AU (1) | AU727389B2 (en) |
CA (1) | CA2264029A1 (en) |
WO (1) | WO1998008439A1 (en) |
Families Citing this family (54)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6289113B1 (en) | 1998-11-25 | 2001-09-11 | Iridian Technologies, Inc. | Handheld iris imaging apparatus and method |
US6532298B1 (en) | 1998-11-25 | 2003-03-11 | Iridian Technologies, Inc. | Portable authentication device and method using iris patterns |
US6377699B1 (en) | 1998-11-25 | 2002-04-23 | Iridian Technologies, Inc. | Iris imaging telephone security module and method |
GB9907515D0 (en) * | 1999-04-01 | 1999-05-26 | Ncr Int Inc | Self service terminal |
DK1285409T3 (en) * | 2000-05-16 | 2005-08-22 | Swisscom Mobile Ag | Process of biometric identification and authentication |
KR100374708B1 (en) * | 2001-03-06 | 2003-03-04 | 에버미디어 주식회사 | Non-contact type human iris recognition method by correction of rotated iris image |
KR100447403B1 (en) * | 2001-05-12 | 2004-09-04 | 엘지전자 주식회사 | Focusing angle and distance display in iris recognition system |
KR100434370B1 (en) * | 2001-05-12 | 2004-06-04 | 엘지전자 주식회사 | Focusing distance measurement in iris recognition system |
JP2002341406A (en) * | 2001-05-11 | 2002-11-27 | Matsushita Electric Ind Co Ltd | Method and device for imaging object to be authenticated |
KR100954640B1 (en) * | 2002-02-05 | 2010-04-27 | 파나소닉 주식회사 | Personal authentication method and device |
KR100924271B1 (en) * | 2002-05-20 | 2009-11-03 | 신성복 | Identification system and method using a iris, and media that can record computer program sources thereof |
DE10225943A1 (en) | 2002-06-11 | 2004-01-08 | Basf Ag | Process for the preparation of esters of polyalcohols |
JP2007504562A (en) | 2003-09-04 | 2007-03-01 | サーノフ コーポレーション | Method and apparatus for performing iris authentication from a single image |
FR2864290B1 (en) * | 2003-12-18 | 2006-05-26 | Sagem | METHOD AND DEVICE FOR RECOGNIZING IRIS |
CA2580795A1 (en) * | 2004-09-22 | 2006-04-06 | Tripath Imaging, Inc. | Methods and compositions for evaluating breast cancer prognosis |
US7542628B2 (en) | 2005-04-11 | 2009-06-02 | Sarnoff Corporation | Method and apparatus for providing strobed image capture |
US7634114B2 (en) | 2006-09-01 | 2009-12-15 | Sarnoff Corporation | Method and apparatus for iris biometric systems for use in an entryway |
KR100728657B1 (en) | 2006-04-27 | 2007-06-14 | 서울통신기술 주식회사 | Unmanned system and method for controlling entrance and exit using of face recognition with multiple infrared cameras |
US20100291091A1 (en) * | 2007-07-30 | 2010-11-18 | ONCOTHERAPY SCIENCE ,inc. | Cancer associated gene ly6k |
US8189879B2 (en) | 2008-02-14 | 2012-05-29 | Iristrac, Llc | System and method for animal identification using IRIS images |
US8437825B2 (en) | 2008-07-03 | 2013-05-07 | Cercacor Laboratories, Inc. | Contoured protrusion for improving spectroscopic measurement of blood constituents |
US8515509B2 (en) | 2008-08-04 | 2013-08-20 | Cercacor Laboratories, Inc. | Multi-stream emitter for noninvasive measurement of blood constituents |
US20100232654A1 (en) * | 2009-03-11 | 2010-09-16 | Harris Corporation | Method for reconstructing iris scans through novel inpainting techniques and mosaicing of partial collections |
US8306288B2 (en) | 2009-08-19 | 2012-11-06 | Harris Corporation | Automatic identification of fingerprint inpainting target areas |
JP5691669B2 (en) | 2011-03-08 | 2015-04-01 | 富士通株式会社 | Biological information processing apparatus, biological information processing method, and biological information processing program |
JP5751019B2 (en) | 2011-05-30 | 2015-07-22 | 富士通株式会社 | Biological information processing apparatus, biological information processing method, and biological information processing program |
GB2495324B (en) | 2011-10-07 | 2018-05-30 | Irisguard Inc | Security improvements for Iris recognition systems |
GB2495323B (en) | 2011-10-07 | 2018-05-30 | Irisguard Inc | Improvements for iris recognition systems |
JP5915664B2 (en) | 2011-12-15 | 2016-05-11 | 富士通株式会社 | Vein authentication method and vein authentication apparatus |
KR101620774B1 (en) | 2012-03-28 | 2016-05-12 | 후지쯔 가부시끼가이샤 | Biometric authentication device, biometric authentication method, and storage medium |
JP5846291B2 (en) | 2012-03-28 | 2016-01-20 | 富士通株式会社 | Biometric authentication device, biometric authentication method, and biometric authentication program |
JP6075069B2 (en) | 2013-01-15 | 2017-02-08 | 富士通株式会社 | Biological information imaging apparatus, biometric authentication apparatus, and manufacturing method of biometric information imaging apparatus |
US10018804B2 (en) * | 2013-06-18 | 2018-07-10 | Delta Id, Inc. | Apparatus and method for multiple mode image acquisition for iris imaging |
KR102412290B1 (en) | 2014-09-24 | 2022-06-22 | 프린스톤 아이덴티티, 인크. | Control of wireless communication device capability in a mobile device with a biometric key |
MX2017007139A (en) | 2014-12-03 | 2017-11-10 | Princeton Identity Inc | System and method for mobile device biometric add-on. |
EP3403217A4 (en) | 2016-01-12 | 2019-08-21 | Princeton Identity, Inc. | Systems and methods of biometric analysis |
US10373008B2 (en) | 2016-03-31 | 2019-08-06 | Princeton Identity, Inc. | Systems and methods of biometric analysis with adaptive trigger |
US10366296B2 (en) | 2016-03-31 | 2019-07-30 | Princeton Identity, Inc. | Biometric enrollment systems and methods |
CN106407964B (en) * | 2016-11-15 | 2023-11-07 | 刘霁中 | Device, method and terminal equipment for acquiring iris by using visible light source |
WO2018187337A1 (en) | 2017-04-04 | 2018-10-11 | Princeton Identity, Inc. | Z-dimension user feedback biometric system |
KR20180133076A (en) | 2017-06-05 | 2018-12-13 | 삼성전자주식회사 | Image sensor and electronic apparatus including the same |
KR102372809B1 (en) | 2017-07-04 | 2022-03-15 | 삼성전자주식회사 | Imaging sensor assembly having tilting structure and electronic device including the same |
KR102573482B1 (en) | 2017-07-26 | 2023-08-31 | 프린스톤 아이덴티티, 인크. | Biometric security system and method |
EP3474328B1 (en) | 2017-10-20 | 2021-09-29 | Samsung Electronics Co., Ltd. | Combination sensors and electronic devices |
US11202567B2 (en) | 2018-07-16 | 2021-12-21 | Verily Life Sciences Llc | Retinal camera with light baffle and dynamic illuminator for expanding eyebox |
CN111126145B (en) * | 2018-10-18 | 2024-05-10 | 天目爱视(北京)科技有限公司 | Iris 3D information acquisition system capable of avoiding influence of light source image |
WO2020149981A1 (en) * | 2019-01-17 | 2020-07-23 | Gentex Corporation | Alignment system |
DE102019124127B4 (en) * | 2019-09-09 | 2024-08-14 | Bundesdruckerei Gmbh | DEVICE AND METHOD FOR DETECTING BIOMETRIC FEATURES OF A PERSON’S FACE |
CN115066203A (en) | 2020-01-13 | 2022-09-16 | 梅西莫股份有限公司 | Wearable device with physiological parameter monitoring |
WO2022066816A2 (en) * | 2020-09-25 | 2022-03-31 | Sterling Labs Llc | Pose optimization in biometric authentication systems |
EP4141820A1 (en) * | 2021-08-25 | 2023-03-01 | Tools for Humanity Corporation | Controlling a two-dimensional mirror gimbal for purposes of iris scanning |
JPWO2023032051A1 (en) | 2021-08-31 | 2023-03-09 | ||
CN118076985A (en) * | 2021-10-13 | 2024-05-24 | 指纹卡安娜卡敦知识产权有限公司 | Methods and systems configured to reduce the impact of impairment data in captured iris images |
US20230300470A1 (en) * | 2022-02-03 | 2023-09-21 | Facebook Technologies, Llc | Techniques for producing glints and iris illumination for eye tracking |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4641349A (en) | 1985-02-20 | 1987-02-03 | Leonard Flom | Iris recognition system |
DE3777222D1 (en) * | 1986-04-04 | 1992-04-16 | Applied Science Group Inc | METHOD AND DEVICE FOR DEVELOPING THE PRESENTATION OF THE VISION DISTRIBUTION WHEN PEOPLE WATCHING TELEVISION ADVERTISING. |
IT1226526B (en) * | 1988-08-05 | 1991-01-24 | Mario Angi | PARTICULARMEWN INFRARED LIGHT VIDEOFRACTOMETER FOR APPLICATIONS IN PEDIATRIC OPHTHALMOLOGY |
US5291560A (en) | 1991-07-15 | 1994-03-01 | Iri Scan Incorporated | Biometric personal identification system based on iris analysis |
US5572596A (en) * | 1994-09-02 | 1996-11-05 | David Sarnoff Research Center, Inc. | Automated, non-invasive iris recognition system and method |
-
1997
- 1997-08-22 CA CA002264029A patent/CA2264029A1/en not_active Abandoned
- 1997-08-22 EP EP97941356A patent/EP0959769A1/en not_active Withdrawn
- 1997-08-22 WO PCT/US1997/014873 patent/WO1998008439A1/en not_active Application Discontinuation
- 1997-08-22 KR KR1019997001528A patent/KR100342159B1/en not_active IP Right Cessation
- 1997-08-22 AU AU43282/97A patent/AU727389B2/en not_active Ceased
- 1997-08-22 JP JP51177498A patent/JP2002514098A/en active Pending
-
2003
- 2003-10-01 JP JP2003343663A patent/JP2004073880A/en active Pending
Also Published As
Publication number | Publication date |
---|---|
JP2002514098A (en) | 2002-05-14 |
JP2004073880A (en) | 2004-03-11 |
AU4328297A (en) | 1998-03-19 |
KR20000035840A (en) | 2000-06-26 |
WO1998008439A1 (en) | 1998-03-05 |
KR100342159B1 (en) | 2002-06-27 |
EP0959769A1 (en) | 1999-12-01 |
CA2264029A1 (en) | 1998-03-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
AU727389B2 (en) | Apparatus for the iris acquiring images | |
US5717512A (en) | Compact image steering and focusing device | |
US8077914B1 (en) | Optical tracking apparatus using six degrees of freedom | |
US8983146B2 (en) | Multimodal ocular biometric system | |
US6296358B1 (en) | Ocular fundus auto imager | |
US7025459B2 (en) | Ocular fundus auto imager | |
US6064752A (en) | Method and apparatus for positioning subjects before a single camera | |
US8170293B2 (en) | Multimodal ocular biometric system and methods | |
US6299306B1 (en) | Method and apparatus for positioning subjects using a holographic optical element | |
JP5297486B2 (en) | Device for detecting and tracking the eye and its gaze direction | |
WO2000039760A1 (en) | Compact imaging device | |
JP2008104628A (en) | Conjunctiva and sclera imaging apparatus | |
US20240315563A1 (en) | System and method for eye tracking | |
US20110170060A1 (en) | Gaze Tracking Using Polarized Light | |
WO2000004820A1 (en) | Acquiring, analyzing and imaging three-dimensional retinal data | |
CN109964230A (en) | Method and apparatus for eyes measurement acquisition | |
JP2006318374A (en) | Glasses determination device, authentication device, and glasses determination method | |
WO2023187780A1 (en) | Eye tracking device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FGA | Letters patent sealed or granted (standard patent) |