WO2012174453A2 - Systems and methods for binocular iris imaging - Google Patents


Info

Publication number
WO2012174453A2
WO2012174453A2 (PCT/US2012/042780)
Authority
WO
WIPO (PCT)
Prior art keywords
camera
subject
iris
range
imager
Prior art date
Application number
PCT/US2012/042780
Other languages
French (fr)
Other versions
WO2012174453A3 (en)
Inventor
Evan Ronald SMITH
Hsiang-Yi Yu
Original Assignee
Sybotics, Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sybotics, Llc filed Critical Sybotics, Llc
Publication of WO2012174453A2 publication Critical patent/WO2012174453A2/en
Publication of WO2012174453A3 publication Critical patent/WO2012174453A3/en

Links

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/12 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for looking at the eye fundus, e.g. ophthalmoscopes
    • A61B 3/1216 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for looking at the eye fundus, e.g. ophthalmoscopes for diagnostics of the iris
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/10 Image acquisition
    • G06V 10/12 Details of acquisition arrangements; Constructional details thereof
    • G06V 10/14 Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V 10/145 Illumination specially adapted for pattern recognition, e.g. using gratings
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/18 Eye characteristics, e.g. of the iris
    • G06V 40/19 Sensors therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/60 Static or dynamic means for assisting the user to position a body part for biometric acquisition
    • G06V 40/67 Static or dynamic means for assisting the user to position a body part for biometric acquisition by interactive indications to the user
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 63/00 Network architectures or network communication protocols for network security
    • H04L 63/08 Network architectures or network communication protocols for network security for authentication of entities
    • H04L 63/0861 Network architectures or network communication protocols for network security for authentication of entities using biometrical features, e.g. fingerprint, retina-scan

Definitions

  • the invention relates to improved image capture equipment useful in the field of biometric iris identification, and to methods for using such equipment.
  • Iris identification, a long-established technology, is more accurate than fingerprint identification and requires no physical contact. Iris ID has been proven in a wide range of applications and is capable of essentially zero error performance. Nations around the world are implementing iris identification in government ID and border control programs, and companies are using the iris to identify employees and customers.
  • Iris identification systems have two main elements: cameras and software.
  • the camera is used to take a close-up picture of the iris, the colored ring in the eye.
  • the software provides user and system interfaces, stores iris patterns of people to be identified, and implements a matching algorithm that identifies a person looking at the camera by comparing the image data with stored iris patterns.
  • iris patterns are more visible under near-infrared illumination.
  • the iris is illuminated with one or more wavelengths in the range from 700nm to 900nm.
  • a standard design approach in the iris identification industry is to equip the camera with one or more LEDs that produce near-infrared illumination.
  • the LEDs can have a single wavelength output, or more than one emitter can be provided in a single package to produce two or three output wavelengths.
  • the IriMagic 1000BK™ iris camera manufactured by Iritech, Inc. of Fairfax, Virginia is a binocular (two-eye) iris camera with two imagers, one aimed at each eye. Each of the two imagers has an LED illuminator.
  • the LS-1 camera is a single-eye camera and has two LEDs, one located on each side of the imager, to illuminate the eye.
  • Figure 6 shows a typical illumination output of a camera with two LEDs.
  • LEDs 602 produce circular illumination patterns 604 that overlap in the center of the camera's field of view 606.
  • the illumination patterns 604 of typical prior art devices are not uniform through field of view 606; there is significantly less illumination in corners 608 than in the central region of field of view 606. Further, the projected illumination patterns 604 do not coincide with field of view 606.
  • if the position of the subject item to be imaged (such as an iris) does not coincide with the illumination patterns 604, it may not be well-illuminated or may not be evenly illuminated.
  • the LS-1 camera images a single eye at close range, so within the imaging area, illumination is fairly even. However, much of the light output falls outside the imaging area.
  • Each LED illuminator projects a generally conical beam of light toward the eye region.
  • LEDs typically produce "hot spots," that is, an uneven light output across the illuminated area.
  • Many cameras such as the Iritech and Oki® cameras noted above, do not have a diffuser or other optical element between the LEDs and the targeted eyes.
  • a standard diffuser of this type can substantially eliminate hot spots, enhancing eye safety and image quality, but does not change the generally conical distribution of light from standard LEDs.
  • LEDs are available with different angular specifications for distribution of light; typically the light is distributed within a cone at an angle between 3 degrees and 60 degrees from the central axis, depending on the LED selected. The light tends to dissipate with distance.
  • the outer range at which the eye can be illuminated and imaged will depend on the number, brightness, and angular light distribution specifications of the LEDs.
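As a rough illustration of how the angular specification limits coverage, the diameter of the illuminated spot grows with distance as a simple function of the cone's half-angle. The formula and the 25 cm working distance below are illustrative, not taken from the patent:

```python
import math

def spot_diameter_cm(distance_cm: float, half_angle_deg: float) -> float:
    """Diameter of the illuminated circle at a given distance, for an LED
    whose output falls within a cone of the given half-angle."""
    return 2 * distance_cm * math.tan(math.radians(half_angle_deg))

# At a 25 cm working distance (the operating range discussed later):
print(round(spot_diameter_cm(25, 3), 1))   # narrow 3-degree LED: ~2.6 cm spot
print(round(spot_diameter_cm(25, 60), 1))  # wide 60-degree LED: ~86.6 cm spot
```

A narrow-angle LED concentrates its output on the eye region, while a wide-angle LED spills most of its light outside the imaging area, consistent with the LS-1 observation above.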
  • Typical portable iris cameras have a USB interface and are powered from the USB cable. Thus, for the USB 2.0 specification these devices are limited to drawing a total current of 0.5 Amperes.
  • a typical near-infrared LED circuit draws 0.08 to 0.10 Amperes of current.
  • one or more LEDs may be arranged in series in each LED circuit.
  • the voltage and current available in portable or low- power devices establishes a hard practical limit on the number of LEDs that can be activated simultaneously in these devices.
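The arithmetic behind this hard limit can be sketched as follows. The 0.5 A USB 2.0 budget and the per-circuit LED draw come from the text; the base draw of the imager and logic is an assumed figure for illustration only:

```python
USB2_BUDGET_A = 0.5        # total current available from a USB 2.0 port
CAMERA_BASE_DRAW_A = 0.15  # assumed draw of imager + logic (illustrative)
LED_CIRCUIT_DRAW_A = 0.10  # worst-case draw per near-infrared LED circuit

def max_simultaneous_led_circuits(budget: float = USB2_BUDGET_A,
                                  base: float = CAMERA_BASE_DRAW_A,
                                  per_circuit: float = LED_CIRCUIT_DRAW_A) -> int:
    """Number of LED circuits that can be active at once within the budget."""
    return int((budget - base) // per_circuit)

print(max_simultaneous_led_circuits())  # 3 with these assumed figures
```

This also explains why sequential activation of LED circuits, discussed below, is attractive for bus-powered cameras.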
  • eye safety standards limit the amount of near-infrared radiation that is applied to the eye, so the amount and distribution of light provided by an iris camera must be maintained within safe limits.
  • Some prior art cameras such as the LG Electronics model 2200 and the Oki IrisPass, have sequentially activated LED illumination circuits rather than activating all LEDs at the same time. This approach, when applied to portable cameras, has the effect of reducing the total instantaneous current draw of the illuminating circuits. Changing the illuminator pattern is also known to provide different illumination angles if the illuminators are spaced sufficiently apart, which tends to reduce reflections from eyeglasses.
  • the size of the region in which a given camera can find and image an eye is referred to as the "capture volume" for that camera. It is a generally desirable engineering goal for iris cameras to have a larger capture volume, i.e. a wider field of view, so that aiming and positioning relative to the target eye is less critical.
  • the capture volume of a typical portable iris camera is limited by optics, available image transmission bandwidth, available illumination current, and eye safety standards.
  • the imagers used in many portable iris cameras have VGA resolution (640x480 pixels, or 0.3 megapixels). Previous portable cameras with larger capture volumes, such as the Eye Controls product, have used imagers as large as two megapixels.
  • a binocular iris identification camera includes a single electronic imager and a lens unit for capturing images of one or both irises of a subject.
  • a horizontally elongated viewfinder orifice passes through the camera housing and allows the operator to see both irises of the subject as an indication of correct aim.
  • An electronic range detector signals when the camera is in range of the iris and may include a series of icons to indicate required distance adjustment.
  • An ergonomically correct handle allows easy operator positioning of the camera, or the camera can be placed in a stand designed to receive the handle and angle the camera to facilitate desktop use with self-positioning by the subject.
  • In operator mode, the camera is positioned between the operator and the subject, and the operator aims the camera by looking through the elongated viewfinder orifice to simultaneously see both irises of the subject. The operator adjusts the camera's distance relative to the subject until the electronic range indicator shows that the distance is in the correct range, and images are captured.
  • In self-identification mode, the camera is placed in the stand or is handheld by the subject, who faces the camera. The subject looks directly at the camera and adjusts the distance of the camera from the face, as prompted by the range indicating system, to achieve a focused image.
  • LEDs of different wavelengths illuminate the iris through an engineered diffuser that uniformly distributes light in a substantially rectangular light output pattern.
  • Figure 1a is a front plan view of an embodiment of an iris camera;
  • Figure 1b is a side sectional view of the camera of Figure 1a;
  • Figure 1c is a back plan view of the iris camera of Figure 1a;
  • Figure 1d is a bottom sectional view of the camera of Figure 1a;
  • Figures 2a and 2b are diagrams showing examples of indicator icons used in certain embodiments for positioning at the correct range from the eye;
  • Figures 3a through 3f are perspective views showing additional features of a preferred embodiment of the camera housing.
  • Figure 4 is a screen shot showing an example video display generated by the output software of the camera.
  • Figure 5 is a flow chart showing an example embodiment of an operating method for capturing an iris image
  • Figure 6 is an illustration of a prior art illumination pattern
  • Figure 7a is a front view of an example embodiment of an iris camera having a novel illumination method and apparatus
  • Figure 7b is a side view of an illumination assembly included in the example embodiment of Figure 7a;
  • Figure 7c is a schematic diagram of an example LED driving circuit
  • Figure 8 is a view of the illumination pattern generated by the example illumination assembly of Figure 7b.
  • References to “one embodiment”, “an embodiment”, “an example embodiment”, etc. indicate that the embodiment(s) described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, persons skilled in the art may implement such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
  • Embodiments of the invention may be implemented in hardware, firmware, software, or any combination thereof, or may be implemented without automated computing equipment. Embodiments of the invention may also be implemented as instructions stored on a machine-readable medium, which may be read and executed by one or more processors.
  • a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g. a computing device).
  • a machine-readable medium may include read only memory (ROM); random access memory (RAM); hardware memory in handheld computers, PDAs, mobile telephones, and other portable devices; magnetic disk storage media; optical storage media; thumb drives and other flash memory devices; and electrical, optical, acoustical, or other forms of propagated signals.
  • firmware, software, routines, instructions may be described herein as performing certain actions. However, it should be appreciated that such descriptions are merely for convenience and that such actions in fact result from computing devices, processors, controllers or other devices executing the firmware, software, routines, instructions, etc.
  • Figures la through Id are views of a first embodiment of an iris camera.
  • a front side of camera 100 includes faceplate recess 102, light distribution elements 104 (sometimes referred to as diffusers), icon cutouts 108, viewfinder 110, and optical window 112.
  • Faceplate recess 102 receives a faceplate 106 (shown in section in Figures lb and Id) that provides a finished front surface for camera 100 and includes translucent icons in the area of icon cutouts 108.
  • Viewfinder 110 is an elongated slot passing through the body of camera 100.
  • viewfinder 110 is sized so that the operator can see both of the subject's eyes through viewfinder 110 when the camera 100 is correctly positioned with the eyes in view of the camera's imager.
  • viewfinder 110 may be 48mm to 51mm wide, 6mm high, and 27mm in depth, to provide an appropriate view at a distance of about 25cm from the subject's eyes.
  • the size and shape of the viewfinder can be varied depending on the desired operating distance.
  • the viewfinder can have less depth (i.e. the housing of camera 100 can be thinner) if the height of the viewfinder is correspondingly reduced.
  • the inventors have found that the elongated shape of viewfinder 110 is well suited to comfortable viewing in the desired operating position of camera 100.
  • a conventional viewfinder, designed for a single-eye view, requires that the operator close one eye or squint, which is less comfortable for the operator.
  • the viewfinder shape provided in this embodiment allows the operator to use both eyes in aiming the device, while looking at both eyes of the subject. This is a more comfortable and natural method of viewing the subject and simultaneously aiming the camera, as compared to methods previously used in the industry.
  • the slotted or elongated shape of the viewfinder tends to produce a high level of precision in getting the subject's eyes into a horizontal plane where the camera imager can view them.
  • the aim will be centered on the subject's nose so that both eyes are in view of the imager.
  • the viewfinder may optionally be provided with internal or external lateral aiming aids that show the user whether the slot is centered on the nose.
  • brightly colored projections may be provided in the center of the slot, front and back, which can be centered on the nose. When both projections are visually aligned with the subject's nose the camera imager will be centered on the correct vertical axis.
  • optical window 112 provides a view port for an imager (omitted for clarity) mounted on circuit board 114 behind optical window 112.
  • An example of an appropriate imager is the Aptina 5-megapixel monochrome CMOS imager, model MT9P031I12STM.
  • the imager is combined on the circuit board 114 with appropriate digital signal processing circuits to receive image data, process it, and format it for transmission through an interface.
  • the imager has a lens mounted in an optical path between the imager and the subject eyes, selected to sharply image the subject eyes at the desired operating distance.
  • the lens is selected with an aperture that provides a reasonable depth of field at the designed focal distance.
  • the operating distance is from 20 to 25 cm
  • the lens is between 8.5mm and 10.0mm focal length, with an aperture of f/2.0 to f/4.0.
  • a suitable lens is model number DSL935A-NIR, distributed by Sunex Inc. of Carlsbad, California, USA. This example lens has a 9.6mm focal length and an aperture in the f/2.0 to f/4.0 range noted above.
  • circuit board 114 has a central cutout to accommodate viewfinder 110 passing through the plane of circuit board 114.
  • circuit board 114 has an interface for connecting camera 100 to a computing device, such as a notebook computer, netbook, desktop computer, smart phone, or other mobile computing device.
  • This interface may be a USB 2.0, USB 3.0, wireless, or other standard or proprietary interface appropriate to the computing device to be connected.
  • the interface is a USB 2.0 full speed interface and a cable (omitted from Fig. 1 for clarity) is connected to circuit board 114, passing through an opening in the bottom of camera 100 or another convenient location to connect to the computer.
  • Circuit board 114 also has near-infrared illuminating LEDs mounted directly behind light distribution elements 104 to illuminate the subject eyes.
  • the LEDs may be of a single wavelength, such as 830nm, or may have different wavelengths to provide multispectral illumination such as 770nm, 830nm, and 850nm.
  • An example embodiment of the illumination arrangement is described in more detail below with particular reference to Figures 7 through 9.
  • circuit board 114 has four indicator LEDs mounted on each side of circuit board 114, aligned with openings 108 and 118 respectively. These indicator LEDs are preferably color coded and aligned to light up the icons in the front and back faceplates, shown in Figures 2a and 2b, to indicate desired positional movements of the camera.
  • the icons are controlled by input/output ports on the circuit board, which are in turn controlled in response to control signals generated either on the camera board (in response to on-board electronic ranging or firmware -based image analysis to determine position and distance), or in response to control signals sent over the USB interface from a computing device that receives the video output of the camera and performs analysis of position and distance.
  • the camera is connected via a USB cable to a computing device, such as a PC, that has software for receiving and processing a high speed USB video stream to deliver iris images to the computing device and to other software operating in the computing device or in another device networked to the computing device.
  • the camera's video frame output may be dynamically windowed and/or cropped to reduce the amount of data transmitted in each frame and provide an increased frame rate.
  • the inventors have found that with the geometry shown in Figure 1, where the camera is located slightly (for example, about 22mm) below the viewfinder, the subject's eyes tend to be uniformly in the upper portion of the frame because they have been aligned by the operator and/or subject with the viewfinder rather than the camera lens. Thus, in the embodiment shown, the subject's eyes virtually never appear in the bottom 20-25% of the full 5-megapixel video frame. In a preferred embodiment, the bottom 20-25% of the video frame is cropped and that data is not transmitted to the computing device, thus allowing an increased frame rate.
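A minimal sketch of that bottom-crop, assuming the 2592x1944 resolution of the Aptina MT9P031 imager mentioned earlier (the function name is illustrative):

```python
import numpy as np

def crop_bottom(frame: np.ndarray, fraction: float = 0.25) -> np.ndarray:
    """Drop the bottom `fraction` of rows from a frame before transmission."""
    keep_rows = int(frame.shape[0] * (1.0 - fraction))
    return frame[:keep_rows, :]

full = np.zeros((1944, 2592), dtype=np.uint8)  # full 5-megapixel frame
cropped = crop_bottom(full)
print(cropped.shape)  # (1458, 2592): 25% less data per frame
```

Transmitting only the cropped rows reduces the per-frame payload over USB by the same 25%, which is where the frame-rate gain comes from.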
  • In step 502 the iris or irises to be imaged are illuminated, preferably using the uniform illumination systems described herein.
  • In step 504 an image is captured using the imager in the camera.
  • In step 506 the video frame is transmitted from the camera to the processing device.
  • In step 508 the frame is analyzed to determine iris location and size.
  • In step 510 range indicator icons are activated based on the apparent distance to the subject. The distance to the subject may be determined based on one or more image analysis factors, including the diameter of the iris, the distance in pixels between images of the illuminator reflections on the iris, and the like.
  • the computing device generates a monitor display frame that can be displayed on a screen of the computing device to show the camera operator what the imager sees.
  • the monitor display frame is a 640x480 pixel image, generated by shrinking the received camera image frame to 640x360 and adding a 640x120 bitmap simulated icon display to the top of the monitor display frame.
  • the icon display mimics the icon display on the back of the camera so that whether the operator looks at the camera or at the screen, the operator sees the same range indication information.
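The monitor-frame composition described above can be sketched as follows; the subsampling resize is a crude stand-in for a proper image scaler, and the blank strip stands in for the rendered icon bitmap:

```python
import numpy as np

def make_monitor_frame(camera_frame: np.ndarray) -> np.ndarray:
    """Shrink the camera frame to 640x360 and stack a 640x120 icon strip
    on top, yielding the 640x480 monitor display frame."""
    h, w = camera_frame.shape
    rows = np.linspace(0, h - 1, 360).astype(int)
    cols = np.linspace(0, w - 1, 640).astype(int)
    preview = camera_frame[np.ix_(rows, cols)]       # crude 360x640 resize
    icon_strip = np.zeros((120, 640), dtype=camera_frame.dtype)
    return np.vstack([icon_strip, preview])          # icons above the preview

monitor = make_monitor_frame(np.zeros((1944, 2592), dtype=np.uint8))
print(monitor.shape)  # (480, 640)
```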
  • In step 512, if the subject is not in range, the process skips further image processing steps and loops back to step 504. If the subject is in range, in step 514 the image is processed to locate the iris or irises of interest, select a 640x480 region centered on each iris, and analyze focus and image quality of the iris in that region.
  • the camera may be operated in a mode to collect a left eye image, right eye image, both eyes, or either eye.
  • In step 516, if the image quality of an iris of interest is acceptable according to ISO standards, and is better than any previously obtained image for that iris, the selected 640x480 iris image region is retained for further use.
  • In step 518, if the images collected are not complete and acceptable, the process repeats starting at step 504 and another image is captured and analyzed. If the images are acceptable and complete, the process continues at step 520, and the image or images stored are returned to the control program for further processing. The illuminators may then be turned off and the camera may enter an idle or low-power mode where no images are captured.
  • the camera may be selectively operated to capture both left and right iris images in a single operation, left iris only, right iris only, or either iris (whichever is captured first).
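The Figure 5 loop can be sketched end to end. The locate, in_range, and quality callables are hypothetical stand-ins for the image-analysis routines the text describes, driven here by canned frames so the control flow can be exercised:

```python
def capture_loop(frames, locate, in_range, quality, wanted=("left", "right")):
    """Run the Figure 5 capture loop over an iterator of frames and return
    the best acceptable iris region per requested eye (step 520)."""
    best = {}                                   # eye -> (score, region)
    for frame in frames:                        # steps 504-506: capture/transmit
        irises = locate(frame)                  # step 508: find irises
        if not in_range(irises):                # steps 510-512: out of range,
            continue                            # loop back for another frame
        for eye, region in irises.items():      # step 514: select iris regions
            score = quality(region)
            if score >= 0.8 and score > best.get(eye, (0.0, None))[0]:
                best[eye] = (score, region)     # step 516: keep the best image
        if all(eye in best for eye in wanted):  # step 518: complete?
            break
    return {eye: region for eye, (score, region) in best.items()}

# Canned frames: only the second is in range with acceptable quality.
frames = [{"left": ("L1", 0.5)},
          {"left": ("L2", 0.9), "right": ("R2", 0.85)}]
result = capture_loop(frames,
                      locate=lambda f: f,
                      in_range=lambda irises: len(irises) == 2,
                      quality=lambda region: region[1])
print(sorted(result))  # ['left', 'right']
```

The `wanted` parameter models the left-only, right-only, or both-eyes capture modes described above.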
  • Figure 2a shows an embodiment of an indicator area 200 of the camera's back faceplate.
  • Area 200 includes icons 202, 204, 206 and 208 that are preferably translucent so as to be selectively illuminated by corresponding indicator LEDs on circuit board 114.
  • icon 202 is a "back arrow" used to indicate that the camera is too close to the subject and has red illumination when activated.
  • Icon 204 is a camera icon that has green illumination, used to indicate that the camera is the correct distance from the subject.
  • Icon 206 is a "forward arrow" with yellow or orange illumination, used to indicate that the camera is too distant from the subject and should be moved closer.
  • icon 208 is a head icon that is illuminated green to indicate that the image capture is complete.
  • these icons are selectively illuminated to provide a simple real-time indication to the operator or user to assist in establishing the correct distance to subject for a good iris image.
  • This range indication system, combined with the viewfinder described herein, provides a complete, highly intuitive guidance system for positioning the camera relative to the subject.
  • the operator looks through the viewfinder so that he or she sees the subject's eyes. Then, the operator moves the camera forward until the range indicators show that the camera is at the right distance. The camera is then held in position until imaging is complete, as shown by the indicators.
  • the inventors have found that virtually all intended operators, even small children, are able to use this positioning system effectively after only minimal instruction.
  • the positioning system comprising the icon-based range indicators and viewfinder has the benefit of simplicity and durability.
  • the range indicators use LED illuminators that have an expected lifetime exceeding the life of the device.
  • the elongated slot viewfinder has no electronic or optical elements that can fail or be damaged.
  • the positioning system features of this device are particularly durable and contribute to superior impact and drop- resistance of the device.
  • many conventional iris cameras use a video screen on the back of the camera as a positioning aid. This component is more expensive to produce than the aiming and positioning elements disclosed herein.
  • Video viewfinders are also subject to damage if the device is dropped or there is an impact to the screen, unless the screen is mounted and protected in a manner that further increases the already higher cost of that approach.
  • forward arrow icon 206 is illuminated with a yellow backlight as the camera is moved, starting outside of the imaging range, toward the subject.
  • the LED illuminating the forward arrow icon 206 will be turned off, and camera icon 204 will be illuminated with a green backlight to indicate "in range”. If the operator continues to move the camera toward the subject, when the camera reaches the inside boundary of the acceptable imaging range, icon 204 will be turned off and the "too close" arrow icon 202 will be illuminated, preferably with a red backlight.
  • the "too close” icon 202 will be turned off and the camera icon 204 will again be illuminated in green to indicate "in range”.
  • the head icon 208 will be illuminated to indicate that the operation is complete.
  • a similar set of icons is provided on the front of the camera, facing the subject.
  • the icons are arranged in a different manner when facing the subject, to correctly maintain the intuitive nature of the display.
  • the head icon 208 is positioned between the arrow icons 202 and 206, and the camera icon 204 is positioned to the right of the other three icons.
  • the icon arrangement is different for front and back indicators because a different operating mode is used with the two sets of indicators.
  • the front indicator of Fig. 2b faces the subject, who typically moves his or her head relative to a camera held in a fixed position.
  • the range indicator icons for the camera front reflect this type of movement, showing the head located between the arrows that imply movement of the head relative to the camera. This provides an intuitive range indicator to the subject that is language-universal and easy to understand.
  • the operator is typically holding the camera and moving it relative to a fixed position of the subject.
  • the back indicators have the camera icon between the two arrows, to reflect the expected movement of the camera forward and backward relative to the fixed head position.
  • the head icon is positioned to the right side on the back indicator array.
  • the fixed item (either camera or head) is shown on the right while the other of the two (the moving item) is shown between the arrow indicators.
  • a mirror image of the icon arrangements shown in Figs. 2a and 2b can also be used, and this may be desirable if the users are culturally accustomed to reading from right to left.
  • the indicators shown in Fig. 2b are actuated in a manner generally similar to that described for Fig. 2a.
  • when the camera is "in range", icon 208, the head icon, is illuminated instead of icon 204, the camera icon.
  • the arrow icons operate in the same manner described with reference to Fig. 2a, and when the imaging operation is complete, both the head and camera icons will illuminate in the same manner described with reference to Fig. 2a. While it is possible to use either the head or camera icon or both as the in-range indicators, the inventors have found that having the moving item between the arrows illuminate when the camera is in range provides an intuitive indication to users. Having a change in illumination (such as the additional illumination of the head or camera icon when the other of the two is already illuminated as an in-range indicator) provides an intuitive indication that the operation is complete.
  • Front icons only, rear icons only, or both sets of icons can be provided depending on the intended operation of the camera. For example, if the camera will be used only for self- imaging, only the front icons are needed. If the camera will always be positioned by an operator, only the back icons are needed. Maximum versatility is achieved by providing both front and back sets of range indicators so that the camera can be used either in self-operated or operator-operated modes.
  • Having both modes available facilitates operation where the camera is held in a fixed position (such as in a desk stand) and the subject is told to position his head using the feedback from the indicators, while allowing the operator to take the camera out of the stand and manually position it in cases where due to age, disability, or language barrier, the subject is physically unable to engage with the camera or unable to understand and comply with instructions for use.
  • both front and back sets of indicators are provided, it is possible to selectively turn the front and back indicators on and off, depending on the current operating mode.
  • the front indicators may be activated and the back indicators deactivated, while for the operator-operated mode, only the back indicators are activated.
  • the activation of front, back, or both sets of indicators may be configurable by the operator if desired.
  • the range indicator illumination is controlled, in an example embodiment, by software operating in a computer attached to the camera via the USB connector.
  • the software in the computer receives the video output of the camera imager, locates the eye or eyes in the digital image data using known image processing techniques for finding generally circular shapes, and determines the diameter of the iris.
  • the lens is set so that the iris will be in focus within a desired diameter range for the iris image, for example 210 to 250 pixels.
  • the range indicators are then actuated in real time. This may be done, for example, based on the apparent diameter of the iris in the video image. If no iris is found or the iris diameter is less than the lower boundary (210 pixels in this example) the camera is assumed to be too far away and the "move forward" indicator is activated. If the iris diameter is within the target range, the camera is "in range” and the in-range indicator is illuminated. If the iris diameter is larger than the upper boundary, the camera is too close to the subject and the "move back" indicator is illuminated.
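The diameter-based range logic described above can be sketched directly, using the 210-250 pixel window from the text (the indicator names are illustrative):

```python
def range_indicator(iris_diameter_px, lo=210, hi=250):
    """Map a measured iris diameter in pixels (None if no iris was found)
    to the range indicator icon that should be lit."""
    if iris_diameter_px is None or iris_diameter_px < lo:
        return "move_forward"   # too far away, or no iris located
    if iris_diameter_px > hi:
        return "move_back"      # too close to the subject
    return "in_range"

print(range_indicator(None))  # move_forward
print(range_indicator(230))   # in_range
print(range_indicator(300))   # move_back
```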
  • software that finds the center of an eye in an image and measures iris diameter is normally included in commercial iris matching software. This software can be used for range determination, or a customized algorithm can be created. The algorithm for detecting and measuring the iris can be optimized, through reasonable experimentation, for the image characteristics produced by the selected imager and circuits. Algorithms provided with commercial iris identification software will perform these functions effectively.
  • the diameter determination can also be tuned to a particular camera output by testing a range of published algorithms for locating eyes and measuring their diameters, and using the algorithm that is found to provide optimal performance for the particular contrast and image size produced by the combination of the selected imager, illumination, camera arrangement and frame rate.
  • the algorithm for determining distance to the subject uses, at least in part, the measured pixel distance between the reflections of the two illuminators on the camera imaged in the pupil area.
  • a range of distances, for example 18-22 pixels, corresponding to good focus can be determined experimentally based on the specific imager, lens, and illuminator spacing.
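The reflection-spacing check described in the two bullets above can be sketched similarly. The 18-22 pixel band is the example from the text; the function names and coordinate convention are assumptions.

```python
import math

def reflection_separation_px(p1, p2):
    """Euclidean pixel distance between the two specular reflections of
    the illuminators imaged in the pupil area, given (x, y) positions."""
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

def in_focus(p1, p2, lower=18.0, upper=22.0):
    """True when the reflection spacing falls within the experimentally
    determined band corresponding to good focus."""
    return lower <= reflection_separation_px(p1, p2) <= upper
```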
  • the range detection and control functionalities described herein may be implemented in the computing device connected to the camera via USB, in another computing device connected via a connection or network, or in firmware running in a processor mounted in the camera, for example on the camera circuit board.
  • Figures 3a-3f show a further, preferred embodiment of an enclosure for the present camera that includes an ergonomic handle and is provided with accessories such as a desk stand and a removable reflection shade.
  • Figure 3a is a view of an enclosure 300 that houses the functional components shown in Figure 1 and provides substantially the same relative geometries and positions of components shown in Figure 1.
  • Enclosure 300 has a handle 302 sized for a typical operator's hand and preferably angled slightly back (for example, by 10 to 15 degrees) to enable a more natural operator wrist position when holding the face of the camera in a vertical plane and moving it toward the subject's face.
  • USB cable 304 exits from the bottom of handle 302 and is connected to a computing device (not shown) that receives the camera's output.
  • Figure 3b is a view of the camera mounted in a desk stand 306.
  • Desk stand 306 is provided with a receptacle that mates with the handle 302 and supports enclosure 300 in a fixed position.
  • the desk stand 306 holds the camera face at an upward angle, to allow persons of various heights to lean over the camera and position their eyes correctly relative to the camera imager when it is placed in the desk stand 306.
  • the upward angle for the camera face can be selected as desired; the inventors have found that preferred ergonomics are obtained if the desk stand holds the camera so that its face leans back within a range of 30 to 45 degrees from vertical.
  • with desk stand 306, it is easy for a subject seated at a desk across from the operator to lean over the camera and move his or her head closer to the camera until the range indicator icons illuminate to indicate that the correct distance has been reached.
  • a typical instruction to the subject in the method of taking iris pictures using the camera with the desk stand is "Please look through the slot in the middle of the camera and move forward until the head icon turns green, then hold still."
  • the operator is preferably able to observe what the camera sees through a user interface on the screen of the computing device that receives the camera output.
  • the camera provides a high-speed USB video stream to the computer.
  • Software operating in the computer receives this video stream and generates a monitor output stream that can be displayed as video element 400 on the computer screen, as shown in Figure 4.
  • the monitor video stream is reduced in size from the full frame output of the camera, for example, to 640 x 480 pixels.
  • the top 25% of the monitor display, for example, is allocated to a bitmap simulation 402 of the rear range indicators on the camera (as described with reference to Figure 2a).
  • alternatively, the front range indicator icons (described with reference to Figure 2b) may be displayed.
  • the indicator set to be displayed may also be operator-selectable based on personal preference and the current intended use. For example, the icons to be displayed may be switched between the front and back configurations by the operator for more intuitive operation, depending on whether the camera is being used handheld or in a fixed position.
  • bitmap simulation graphics 402 in the simulated display change to show colored versions of the icons, corresponding to the current illumination state of the electronic icons on the camera itself. That is, whatever icons are illuminated on the camera are shown in the corresponding color on the simulated display, and whatever icons are not illuminated on the camera are shown in dull grey or a color that similarly contrasts with the colored, activated states of the icons.
  • the "move closer" arrows are illuminated on the camera, and on the screen the corresponding arrow region in the bitmap image is colored yellow.
  • Figure 3c is a front view of the enclosure. Viewfinder 110, diffusers 108, and IR high pass filter 112 are located in the front of the enclosure as shown. Further, enclosure 300 has reinforced mounting slots 308 on the top and sides of the enclosure for receiving a reflection reduction hood as will be described in more detail below.
  • an arrow between the camera and the head and a numeric indication of the target distance may be printed on the display to indicate the general target distance range of the camera to the first-time user.
  • Figure 3d is a back view of the enclosure 300.
  • Figures 3e and 3f are perspective views of the camera enclosure 300 mounted in desk stand 306 with an optional reflection reduction hood 314 attached thereto.
  • Hood 314 has mounting tabs 316 corresponding to and received by mounting slots 308 in the enclosure 300.
  • the mounting tabs may be friction fit, folded back and taped or connected by hook-and-loop fasteners, or otherwise removably mounted to enclosure 300.
  • hood 314 extends outward toward the position of the subject to a distance slightly beyond the target iris distance (for example, 9 to 12 inches from the camera front face).
  • the hood in the example embodiment is formed in the shape of a truncated, three-sided pyramid.
  • the top and sides of hood 314 are angled away from the optical axis of the imager so that reflections from the IR illuminators located behind diffusers 108 are not located on the subject iris or irises.
  • the angling of the top and sides of the hood may be varied within limits based on experimentation and the type of material selected for the hood; what is important is that the angle is sufficient to avoid generating reflections from the inside surface of hood 314 onto the iris.
  • An appropriate angle used in the example embodiment is
  • the optical axis of the camera is the line extending from the center of the imager through the lens to the subject, perpendicular to the plane of the imager.
  • the hood may be formed from any desired material. Various thicknesses of paperboard, both coated and uncoated, and various thermoplastics may be used as desired.
  • the hood may be printed with advertising material and/or the logo of the camera manufacturer, camera operator, or agency responsible for the identification program. In use, the subject places his head within the area of the end of the hood and moves forward slightly until the range indicators show correct distance from the camera. The hood prevents windows, direct sunlight, artificial lights, and other items in the camera operating area from producing reflections on the iris that would interfere with obtaining a complete image of the iris pattern.
  • the hood can be attached easily when needed, and removed for transport, storage, or when it is not needed.
  • the camera automatically scans the region within the hood, identifies the iris locations as the user moves into position, and stores the iris pictures when they are obtained. Typically, by the time the subject gets his head in position, the operation is complete.
  • Figure 7a shows an example embodiment of an iris illumination system.
  • iris illumination system 700 includes LEDs 702, 704, 706 and 707 and distribution elements 104.
  • two assemblies 710 are provided, each incorporating one of each of LEDs 702, 704, 706 and 707 and a distribution element 104.
  • the assemblies 710 are positioned in this example on each side of a lens 712 for the imager (not shown).
  • LEDs 702, 704, 706 and 707 may be single-wavelength near-infrared LEDs selected in the range from 700 nm to 900 nm wavelength, although visible light LEDs may also be included in some applications. In a preferred embodiment, two of the LEDs in each assembly 710 have nominal wavelengths of 830nm and 850nm respectively.
  • four LEDs are shown in each assembly 710 in this example. Using four LEDs provides advantages as noted in this disclosure; among other things, this arrangement makes it easy to provide light having three or more different wavelength components. However, one, two, three, or any other number of LEDs can be provided in each assembly 710 within the scope of the invention. Also, while LEDs are a preferred illuminating element in these example embodiments, other devices that generate appropriate near-infrared illumination can be substituted within the scope of the invention.
  • Illumination distribution element 104 is preferably both a light diffusing and distributing device.
  • Distribution element 104 is preferably effective to distribute light at least within the range of wavelengths selected for LEDs 702, 704, 706 and 707.
  • element 104 has a light distribution pattern that is generally square or rectangular, and not circular. Thus, preferably the distribution pattern corresponds generally to the shape of the field of view of the associated imager within the camera system.
  • One preferred example of the illumination distribution element 104 is the EDS-50A
  • the light output of this example element 104 is a substantially square, pyramidal beam of light with the sides of the pyramid approximately 25 degrees from the central axis. Thus, the output is a square beam of light projecting outward into a 50-degree by 50-degree space.
  • the characteristics of the microstructures can be customized to produce different beam dimensions.
  • the beam is designed to illuminate the field of view of the imager and lens used in the camera.
  • the camera field of view will be approximately 50 degrees so the standard 50 degree square diffuser provides good results.
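As a rough geometry check (an illustrative sketch, not from the patent), a square beam whose sides lie 25 degrees off the central axis illuminates a square patch of half-width d·tan(25°) at distance d:

```python
import math

def beam_half_width(distance, half_angle_deg=25.0):
    """Half-width of the square illuminated patch at a given distance,
    for a beam whose sides lie half_angle_deg off the central axis."""
    return distance * math.tan(math.radians(half_angle_deg))

# At a roughly 10-inch working distance, the 50-degree square beam
# spans about a 9.3-inch square (2 * 10 * tan(25 deg)).
full_width_in = 2 * beam_half_width(10.0)
```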
  • the LEDs may all be positioned behind a single distribution element (for example, 2, 4 or 6 LEDs may be grouped together behind one distribution element).
  • the LEDs may be arranged behind a plurality of distribution elements, such as 2, 3, 4 or more distribution elements 104.
  • two assemblies 710 are provided, each consisting of four LEDs and a distribution element 104.
  • the distribution elements 104 that are selected preferably produce a highly uniform distribution of light, with no hot spots and minimal variation in intensity, within the area or areas of the view field where one or more irises will appear.
  • the model EDS-50A diffuser described above is preferred because it produces a substantially uniform light distribution throughout the camera's field of view, not just in one or more areas of interest.
  • LEDs 702, 704, 706 and 707 may be selected from a wide range of available devices.
  • the LEDs are T1.75 form factor devices with relatively narrow beams (such as a half angle of 6 to 10 degrees).
  • the LED position behind element 104 depends on the type of LED and the shape of its beam. With narrow beams the LEDs may be from 1mm to 5mm, for example, behind element 104 so that element 104 can be relatively small and still receive and distribute light from multiple LEDs.
  • the LEDs behind each element 104 may, for example, include two 770nm LEDs, an 830nm LED, and an 850nm LED to provide multispectral illumination in the 700-900 nm range.
  • the illumination is provided in a range of 750nm to 850nm.
  • Suitable LEDs can be selected from products manufactured by Epitex (distributed by Marubeni USA) and Vishay (distributed by Digikey USA) to produce the desired amount of light output.
  • the LEDs are driven by a constant current or by current pulses.
  • pairs of LEDs 702, 704 are arranged in series with a current limiting resistor 716 and supplied with 5VDC power.
  • Typical Epitex and Vishay T1.75 near-infrared LEDs have voltage drops of 1.7-1.8VDC, so a series pair as shown will have a total voltage drop of approximately 3.5VDC.
  • the remaining 1.5V DC is typically used in part by a switching device 718 (such as a transistor switch) for controlling the LED circuit, and the remainder of the voltage drop is taken up by current limiting resistor 716.
  • the value of resistor 716 is selected based on the voltage drop it must take up and the desired current I through the circuit.
  • for example, if the desired drive current I is 70 mA and switching circuit 718 has a voltage drop of 0.45VDC, the voltage drop across resistor 716 is 1.05V, giving a resistance of 15 ohms.
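The arithmetic in the preceding bullets can be checked with plain Ohm's law; the supply voltage, device drops, and drive current are taken from the text, while the variable names are illustrative:

```python
supply_v = 5.0               # USB 5 VDC supply
led_pair_drop_v = 2 * 1.75   # two series LEDs at ~1.7-1.8 V each -> ~3.5 V
switch_drop_v = 0.45         # drop across switching device 718
current_a = 0.070            # desired 70 mA drive current

resistor_drop_v = supply_v - led_pair_drop_v - switch_drop_v  # ~1.05 V
resistor_ohms = resistor_drop_v / current_a                   # ~15 ohms

# Four such circuits draw about 4 * 70 mA = 280 mA, consistent with the
# USB illumination budget mentioned in this disclosure.
total_current_a = 4 * current_a
```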
  • four circuits 720 are provided to drive an example total of eight LEDs as shown in Figure 7a. The types of LEDs selected and the driving power level depend on available power and the desired range of illumination.
  • the example given is suitable for a USB-powered portable camera, allocating a total of 280 mA to near-infrared illumination.
  • the circuit shown is merely provided as an example; for cameras with a longer imaging range or with more available power, a variety of arrangements can be provided.
  • LEDs can be driven at higher power in appropriate situations, using intermittent current pulses rather than a fixed voltage source, in a manner specified in data sheets for the LEDs.
  • the camera and its illuminators are intended for use with the eye positioned within a defined distance range from the camera.
  • the range can be any designed value.
  • the eye is typically intended to be from six to twelve inches (15cm to 30cm) from the camera and its illuminators for proper imaging.
  • the design distance may be greater.
  • the amount of near-infrared ("NIR") radiation exposure to the eye must be kept within established safe limits, which are subject to revision but currently include standards such as ISO 15004-2:2007, ANSI/IESNA RP-27.3-2007, and IEC 62471.
  • the amount of NIR light on the eye from up to eight LEDs at a distance of 10-12 inches is normally well within established safety limits.
  • an illumination angle is formed between a line extending from the LED group to the eye, and another line extending from the eye to the imager. If this illumination angle does not exceed a minimum value, typically 5 degrees under current ISO standards, a red-eye effect may result that effectively lights up the pupil and diminishes the image contrast between the pupil and the iris. Images having diminished pupil-iris contrast may not function properly with some eye finding and iris identification algorithms.
  • An appropriate illumination angle can be determined experimentally to avoid this effect. In a preferred embodiment, the illumination angle is in the range of 9 to 11 degrees.
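The illumination angle described above can be estimated with simple trigonometry for an eye on the optical axis; this is an illustrative sketch with assumed names, not the patent's method:

```python
import math

def illumination_angle_deg(led_offset, eye_distance):
    """Approximate angle, in degrees, between the LED-to-eye line and the
    eye-to-imager line, for an illuminator offset led_offset from the lens
    and an eye on the optical axis at range eye_distance (same units)."""
    return math.degrees(math.atan2(led_offset, eye_distance))

# Example (assumed numbers): an illuminator about 1.8 inches from the
# lens at a 10-inch range gives roughly a 10-degree angle, inside the
# preferred 9-11 degree band and above the ~5-degree red-eye threshold.
```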
  • Figure 8 shows a typical illumination distribution for the illumination system disclosed in the example embodiment.
  • the beam 810 is generally more rectangular than that of conventional iris illumination systems such as the prior art system shown in Figure 6.
  • conventional approaches tend to produce a series of overlapping circular patterns (shown in Figure 6), resulting in significant variation in light distribution across the view field of the camera. Further, the inventors have found that it is difficult to evenly illuminate the corners of the view field by combining circular beams.
  • each beam 810 preferably extends across more than 50% of the visual range of the imager.
  • each of the beams 810 extends across substantially all of the imager's visual field, so that the illumination is almost completely uniform regardless of where the iris is positioned within the view field.
  • smooth, consistent illumination of substantially all of the camera view field by the overlapping beams 810 facilitates simultaneous imaging of two irises.
  • a system that distributes light in a pattern more closely corresponding to the field of view of the imager provides particular unobvious advantages.
  • One such advantage results from providing a substantially consistent amount of illumination throughout a field of view of the camera associated with the illumination system.
  • as iris identification technology has matured, starting in the late 1990s, a number of manufacturers have developed iris biometric camera systems. Generally these camera systems have used LEDs in the 700-to-900 nm range for illumination. There has been considerable experimentation regarding optimal wavelengths, but relatively little attention has been paid to perfecting illumination systems as a component of prior art camera systems.
  • Iris identification is performed by matching patterns in gray scale values in an image with previously collected image patterns. Variations in lighting effectively create "noise" in the image data. LEDs, particularly when not evenly diffused, produce highly variable light patterns. In conventional iris cameras, there is also variability between cameras in the light output of the LEDs and their position. Further, even if taken with the same camera, the imager position relative to the eye cannot be exactly reproduced from image to image, and from time to time. Thus, different images of the same eye will be taken at different angles and distances.
  • the lighting system of the example embodiment also enables a significantly increased usable field of view for the associated camera.
  • this illumination system allows capture of a usable, matchable image from the far corners of the camera's view area.
  • in conventional cameras, the useful capture space is limited to a central region where there is both sufficient light available and sufficient uniformity to support reasonably repeatable template generation.
  • the disclosed system and method thus makes possible a significant increase in the capture space for a given camera, compared to prior art approaches.
  • the disclosed example illumination system has been tested with a 5-megapixel camera having a lens suitable for capture at a distance of about 10 inches from the camera.
  • the resulting capture space extends throughout the view area of the imager. Irises located in the far corners of the view field are illuminated at substantially the same level as irises located in the center of the view field. This results in more usable capture space than in conventional portable iris ID cameras. Images are typically captured with portable cameras by either moving the camera or having the person move their head so that the eyes are within the capture space. A larger capture space makes it much easier to aim the camera or to position the head, as appropriate.
  • the system and method disclosed in the example embodiment produce more illumination within the camera's field of view than prior art systems, because the light is directed evenly into the field of view and relatively little light is directed outside the field of view.
  • a number of prior art portable iris camera devices require two USB ports for operation, because the device cannot operate within the 0.5A power limit of a single USB port.
  • the LED illuminators are typically responsible for the majority of the power used by portable iris cameras.
  • more efficient distribution of the light produced by a given number of LEDs makes it possible to use fewer LEDs than would otherwise be required to light a given field of view, reducing the power used and the number of USB ports or other power sources required to drive the camera system.
  • Improved eye safety is another benefit of the system and method disclosed in the example embodiment. Because the system uniformly distributes the illumination over a broader area, the amount of energy hitting an eye positioned anywhere within the view area is more predictable, less variable, and on average less energy per area than would be experienced with a prior art system.
  • the system and method disclosed in the example embodiment also facilitate easy deployment of a multispectral illumination solution for iris cameras.
  • the present invention provides a multispectral approach to iris illumination, with highly desirable performance characteristics.

Abstract

A binocular iris identification camera includes a single electronic imager and a lens unit for capturing images of one or both irises of a subject. A horizontally elongated viewfinder orifice passes through the camera housing and allows the operator to see both irises of the subject as an indication of correct aim. An electronic range detector signals when the camera is in range of the iris and may include a series of icons to indicate required distance adjustment. An ergonomically correct handle allows easy operator positioning of the camera, or the camera can be placed in a stand designed to receive the handle and angle the camera to facilitate desktop use with self-positioning by the subject. In operator mode, the camera is positioned between the operator and the subject, and the operator aims the camera by looking through the elongated viewfinder orifice to simultaneously see two irises of the subject. The operator adjusts the camera's distance relative to the subject until the electronic range indicator indicates that the distance is in the correct range, and images are captured. LEDs of different wavelengths illuminate the iris through an engineered diffuser that uniformly distributes light in a substantially rectangular light output pattern.

Description

SYSTEMS AND METHODS FOR BINOCULAR IRIS IMAGING
This application claims the benefit of the following U.S. Provisional Patent Applications: Serial No. 61/497,168 filed June 15, 2011 titled "Method and Apparatus for Enhanced Iris Camera Illumination" and Serial No. 61/497,179 filed June 15, 2011 titled "Systems and Methods for Binocular Iris Imaging," the disclosures of which are incorporated herein by reference.
Technical Field
[0001] The invention relates to improved image capture equipment useful in the field of biometric iris identification, and to methods for using such equipment.
Background Art
[0002] Iris identification, a long-established technology, is more accurate than fingerprints and requires no physical contact. Iris ID has been proven in a wide range of applications and is capable of essentially zero error performance. Nations around the world are implementing iris identification in government ID and border control programs, and companies are using the iris to identify employees and customers.
[0003] Iris identification systems have two main elements: cameras and software. The camera is used to take a close-up picture of the iris, the colored ring in the eye. The software provides user and system interfaces, stores iris patterns of people to be identified, and implements a matching algorithm that identifies a person looking at the camera by comparing the image data with stored iris patterns.
[0004] In developing cameras for imaging the iris, it has been determined that iris patterns are more visible under near-infrared illumination. Typically the iris is illuminated with one or more wavelengths in the range from 700nm to 900nm. A standard design approach in the iris identification industry is to equip the camera with one or more LEDs that produce near-infrared illumination. The LEDs can have a single wavelength output, or more than one emitter can be provided in a single package to produce two or three output wavelengths. For example, the IriMagic 1000BK™ iris camera manufactured by Iritech, Inc. of Fairfax, Virginia is a binocular (two-eye) iris camera with two imagers, one aimed at each eye. Each of the two imagers has an LED illuminator.
[0005] Another prior art camera, the LifeSaver® LS-1 camera manufactured by Eye Controls, LLC of Rockville, Maryland, was designed by the inventors of the present application. The LS-1 camera is a single-eye camera and has two LEDs, one located on each side of the imager, to illuminate the eye. Figure 6 shows a typical illumination output of a camera with two LEDs. Generally, LEDs 602 produce circular illumination patterns 604 that overlap in the center of the camera's field of view 606. The illumination patterns 604 of typical prior art devices are not uniform through field of view 606; there is significantly less illumination in corners 608 than in the central region of field of view 606. Further, the projected illumination patterns 604 do not coincide with field of view 606. Thus, if the position of the subject item to be imaged (such as an iris) does not coincide with the illumination patterns 604, it may not be well-illuminated or may not be evenly illuminated. The LS-1 camera images a single eye at close range, so within the imaging area, illumination is fairly even. However, much of the light output falls outside the imaging area.
[0006] Other prior art cameras have more than two LEDs. The handheld Oki® IrisPass-h™ made by Oki of Japan, images a single eye using two LEDs on each side, for a total of four LEDs. Some cameras have six or more near-infrared LEDs.
[0007] Each LED illuminator projects a generally conical beam of light toward the eye region. LEDs typically produce "hot spots," that is, an uneven light output across the illuminated area. Many cameras, such as the Iritech and Oki® cameras noted above, do not have a diffuser or other optical element between the LEDs and the targeted eyes. Some, such as the Eye Controls camera, have a plastic diffuser covering the LED. A standard diffuser of this type can substantially eliminate hot spots, enhancing eye safety and image quality, but does not change the generally conical distribution of light from standard LEDs.
[0008] LEDs are available with different angular specifications for distribution of light; typically the light is distributed within a cone at an angle between 3 degrees and 60 degrees from the central axis, depending on the LED selected. The light tends to dissipate with distance. The outer range at which the eye can be illuminated and imaged will depend on the number, brightness, and angular light distribution specifications of the LEDs.
[0009] Typical portable iris cameras have a USB interface and are powered from the USB cable. Thus, for the USB 2.0 specification these devices are limited to drawing a total current of 0.5 Amperes. A typical near-infrared LED circuit draws 0.08 to 0.10 Amperes of current. Depending on the operating voltage and the voltage required by each LED, one or more LEDs may be arranged in series in each LED circuit. The voltage and current available in portable or low- power devices establishes a hard practical limit on the number of LEDs that can be activated simultaneously in these devices. Further, eye safety standards limit the amount of near-infrared radiation that is applied to the eye, so the amount and distribution of light provided by an iris camera must be maintained within safe limits. Some prior art cameras, such as the LG Electronics model 2200 and the Oki IrisPass, have sequentially activated LED illumination circuits rather than activating all LEDs at the same time. This approach, when applied to portable cameras, has the effect of reducing the total instantaneous current draw of the illuminating circuits. Changing the illuminator pattern is also known to provide different illumination angles if the illuminators are spaced sufficiently apart, which tends to reduce reflections from eyeglasses.
[0010] The size of the region in which a given camera can find and image an eye is referred to as the "capture volume" for that camera. It is a generally desirable engineering goal for iris cameras to have a larger capture volume, i.e. a wider field of view, so that aiming and positioning relative to the target eye is less critical. The capture volume of a typical portable iris camera is limited by optics, available image transmission bandwidth, available illumination current, and eye safety standards. The imagers used in many portable iris cameras have VGA resolution (640x480 pixels, or 0.3 megapixels). Previous portable cameras with larger capture volumes, such as the Eye Controls product, have used imagers as large as two megapixels.
[0011] The various engineering and safety limits noted herein impose limitations on small portable, low-power camera designs. Cameras that are more forgiving in terms of aiming accuracy, such as the Eye Controls product, have been designed for a closer operating distance such as five inches (12.5 cm). Prior art cameras that can image at a longer range, such as 10 to 12 inches (25 to 30 cm), have a relatively narrow region where the camera and illuminators are aimed and where the eye must be positioned. The tradeoffs imposed by safety considerations, data rates, and illumination requirements have required compromises and prevented designers from achieving ideal results in all respects.
[0012] While a number of small, portable iris camera systems have been developed over the last 15 years, many commercially available iris identification cameras, particularly those that can simultaneously image both eyes, remain complex, expensive, and difficult to use. Many existing products require physical contact with a hood or visor to position the subject relative to the camera. There is a long-felt need for an improved iris identification camera that is versatile, easy to use, rugged, inexpensive to manufacture, and that provides a non-contact interface while still being easy to position relative to the subject to obtain a good image.
Disclosure of the Invention
[0013] It is to be understood that both the following summary and the detailed description are exemplary and explanatory and are intended to provide further explanation of the invention as claimed. Neither the summary nor the description that follows is intended to define or limit the scope of the invention to the particular features mentioned in the summary or in the description.
[0014] In example embodiments, a binocular iris identification camera includes a single electronic imager and a lens unit for capturing images of one or both irises of a subject. A horizontally elongated viewfinder orifice passes through the camera housing and allows the operator to see both irises of the subject as an indication of correct aim. An electronic range detector signals when the camera is in range of the iris and may include a series of icons to indicate required distance adjustment. An ergonomically correct handle allows easy operator positioning of the camera, or the camera can be placed in a stand designed to receive the handle and angle the camera to facilitate desktop use with self-positioning by the subject.
[0015] In operator mode, the camera is positioned between the operator and the subject, and the operator aims the camera by looking through the elongated viewfinder orifice to simultaneously see two irises of the subject. The operator adjusts the camera's distance relative to the subject until the electronic range indicator indicates that the distance is in the correct range, and images are captured. In a self-identification mode, the camera is placed in the stand or is handheld by the subject, who is facing the camera. The subject looks directly at the camera and adjusts the distance of the camera from the face, as prompted by the range indicating system, to achieve a focused image.
[0016] In preferred embodiments, LEDs of different wavelengths illuminate the iris through an engineered diffuser that uniformly distributes light in a substantially rectangular light output pattern.
Brief Description of Drawings
[0017] The accompanying drawings, which are incorporated herein and form a part of the specification, illustrate various exemplary embodiments of the present invention and, together with the description, further serve to explain various principles and to enable a person skilled in the pertinent art to make and use the invention.
[0018] Figure 1a is a front plan view of an embodiment of an iris camera;
[0019] Figure 1b is a side sectional view of the camera of Figure 1a;
[0020] Figure 1c is a back plan view of the iris camera of Figure 1a;
[0021] Figure 1d is a bottom sectional view of the camera of Figure 1a;
[0022] Figures 2a and 2b are diagrams showing examples of indicator icons used in certain embodiments for positioning at the correct range from the eye;
[0023] Figures 3a through 3f are perspective views showing additional features of a preferred embodiment of the camera housing; and
[0024] Figure 4 is a screen shot showing an example video display generated by the output software of the camera.
[0025] Figure 5 is a flow chart showing an example embodiment of an operating method for capturing an iris image;
[0026] Figure 6 is an illustration of a prior art illumination pattern;
[0027] Figure 7a is a front view of an example embodiment of an iris camera having a novel illumination method and apparatus;
[0028] Figure 7b is a side view of an illumination assembly included in the example embodiment of Figure 7a;
[0029] Figure 7c is a schematic diagram of an example LED driving circuit; and
[0030] Figure 8 is a view of the illumination pattern generated by the example illumination assembly of Figure 7b.
Best Mode for Carrying Out the Invention
[0031] The present invention will be described in terms of one or more examples, with reference to the accompanying drawings. In the drawings, some like reference numbers indicate identical or functionally similar elements. Additionally, the left-most digit(s) of most reference numbers may identify the drawing in which the reference numbers first appear.
[0032] The present invention will also be explained in terms of exemplary embodiments.
This specification discloses one or more embodiments that incorporate the features of this invention. The disclosure herein will provide examples of embodiments, including examples of data analysis from which those skilled in the art will appreciate various novel approaches and features developed by the inventors. These various novel approaches and features, as they may appear herein, may be used individually, or in combination with each other as desired.
[0033] In particular, the embodiment(s) described, and references in the specification to "one embodiment", "an embodiment", "an example embodiment", etc., indicate that the embodiment(s) described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, persons skilled in the art may implement such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
[0034] Embodiments of the invention may be implemented in hardware, firmware, software, or any combination thereof, or may be implemented without automated computing equipment. Embodiments of the invention may also be implemented as instructions stored on a machine-readable medium, which may be read and executed by one or more processors. A machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g. a computing device). For example, a machine-readable medium may include read only memory (ROM); random access memory (RAM); hardware memory in handheld computers, PDAs, mobile telephones, and other portable devices; magnetic disk storage media; optical storage media; thumb drives and other flash memory devices; electrical, optical, acoustical, or other forms of propagated signals (e.g. carrier waves, infrared signals, digital signals, analog signals, etc.), and others. Further, firmware, software, routines, or instructions may be described herein as performing certain actions. However, it should be appreciated that such descriptions are merely for convenience and that such actions in fact result from computing devices, processors, controllers or other devices executing the firmware, software, routines, instructions, etc.
[0035] Figures 1a through 1d are views of a first embodiment of an iris camera. Referring first to Figure 1a, a front side of camera 100 includes faceplate recess 102, light distribution elements 104 (sometimes referred to as diffusers), icon cutouts 108, viewfinder 110, and optical window 112. Faceplate recess 102 receives a faceplate 106 (shown in section in Figures 1b and 1d) that provides a finished front surface for camera 100 and includes translucent icons in the area of icon cutouts 108.
[0036] The details of the icons for the front side of the camera are not shown in Figure 1; they are shown in Figure 2b and will be discussed below with reference to that figure. Viewfinder 110 is an elongated slot passing through the body of camera 100. In an embodiment, viewfinder 110 is sized so that the operator can see both of the subject's eyes through viewfinder 110 when the camera 100 is correctly positioned with the eyes in view of the camera's imager. For example, viewfinder 110 may be 48mm to 51mm wide, 6mm high, and 27mm in depth, to provide an appropriate view at a distance of about 25cm from the subject's eyes. The size and shape of the viewfinder can be varied depending on the desired operating distance. The viewfinder can have less depth (i.e. the housing of camera 100 can be thinner) if the height of the viewfinder is correspondingly reduced. The inventors have found that the elongated shape of viewfinder 110 is ideal for comfortable viewing in the desired operating position of camera 100. A conventional viewfinder, designed for a single eye view, requires that the operator close one eye or squint, which is less comfortable for the operator. The viewfinder shape provided in this embodiment allows the operator to use both eyes in aiming the device, while looking at both eyes of the subject. This is a more comfortable and natural method of viewing the subject and simultaneously aiming the camera, as compared to methods previously used in the industry.
[0037] The slotted or elongated shape of the viewfinder tends to produce a high level of precision in getting the subject's eyes into a horizontal plane where the camera imager can view them. However, with a basic slot it is more difficult to detect whether the camera is centered on the correct vertical axis. Ideally the aim will be centered on the subject's nose so that both eyes are in view of the imager. To aid in that effort, the viewfinder may optionally be provided with internal or external lateral aiming aids that show the user whether the slot is centered on the nose. As an example, brightly colored projections may be provided in the center of the slot, front and back, which can be centered on the nose. When both projections are visually aligned with the subject's nose, the camera imager will be centered on the correct vertical axis.
[0038] Referring to Figure 1c, on the back of camera 100 there are indicator icon cutouts 118 and a recessed area 116 to accommodate an adhesive overlay containing translucent icons in the area of cutouts 118. A preferred design of icons for the back side of the camera is shown in Figure 2a and will be described in detail below with reference to that figure.
[0039] Referring to the side sectional view of camera 100 in Figure 1b, optical window 112 provides a view port for an imager (omitted for clarity) mounted on circuit board 114 behind optical window 112. An example of an appropriate imager is the Aptina 5-megapixel monochrome CMOS imager, model MT9P031I12STM. The imager is combined on the circuit board 114 with appropriate digital signal processing circuits to receive image data, process it, and format it for transmission through an interface. The imager has a lens mounted in an optical path between the imager and the subject eyes, selected to sharply image the subject eyes at the desired operating distance. The lens is selected with an aperture that provides a reasonable depth of field at the designed focal distance. In an embodiment, the operating distance is from 20 to 25 cm, and the lens is between 8.5mm and 10.0mm focal length, with an aperture of f2.0 to f4.0. One example of a suitable lens is model number DSL935A-NIR distributed by Sunex Inc., USA of Carlsbad, California. This example lens has a 9.6mm focal length and an aperture within the range noted above.
[0040] The operating circuits of camera 100 are preferably mounted on a single circuit board 114. Circuit board 114 has a central cutout to accommodate viewfinder 110 passing through the plane of circuit board 114. Besides the imager and digital processing circuits for the imager, circuit board 114 has an interface for connecting camera 100 to a computing device, such as a notebook computer, netbook, desktop computer, smart phone, or other mobile computing device. This interface may be a USB 2.0, USB 3.0, wireless, or other standard or proprietary interface appropriate to the computing device to be connected. In an embodiment, the interface is a USB 2.0 full speed interface and a cable (omitted from Fig. 1 for clarity) is connected to circuit board 114, passing through an opening in the bottom of camera 100 or another convenient location to connect to the computer.
[0041] Circuit board 114 also has near-infrared illuminating LEDs mounted directly behind light distribution elements 104 to illuminate the subject eyes. The LEDs may be of a single wavelength, such as 830nm, or may have different wavelengths to provide multispectral illumination such as 770nm, 830nm, and 850nm. In a preferred embodiment there are two 770nm LEDs, one 830nm LED and one 850nm LED on each side of the camera, behind each light distribution element 104, for a total of eight near-infrared LEDs. An example embodiment of the illumination arrangement is described in more detail below with particular reference to Figures 7 through 9.
[0042] Finally, circuit board 114 has four indicator LEDs mounted on each side of the board, aligned with openings 108 and 118 respectively. These indicator LEDs are preferably color coded and aligned to light up the icons in the front and back faceplates, shown in Figures 2a and 2b, to indicate desired positional movements of the camera. The icons are controlled by input/output ports on the circuit board, which are in turn controlled in response to control signals generated either on the camera board (in response to on-board electronic ranging or firmware-based image analysis to determine position and distance), or in response to control signals sent over the USB interface from a computing device that receives the video output of the camera and performs analysis of position and distance.
[0043] In a preferred embodiment, the camera is connected via a USB cable to a computing device, such as a PC, that has software for receiving and processing a high speed USB video stream to deliver iris images to the computing device and to other software operating in the computing device or in another device networked to the computing device. In an embodiment, the camera's video frame output may be dynamically windowed and/or cropped to reduce the amount of data transmitted in each frame and provide an increased frame rate. The inventors have found that with the geometry shown in Figure 1, where the camera is located slightly (for example, about 22 mm) below the viewfinder, the subject's eyes tend to be uniformly in the upper portion of the frame because they have been aligned by the operator and/or subject with the viewfinder rather than the camera lens. Thus, in the embodiment shown, the subject's eyes virtually never appear in the bottom 20-25% of the full 5-megapixel video frame. In a preferred embodiment, the bottom 20-25% of the video frame is cropped and that data is not transmitted to the computing device, thus allowing an increased frame rate.
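The cropping strategy described above can be sketched in a few lines. This is an illustrative sketch only: the 5-megapixel frame dimensions (2592 x 1944) and the exact crop fraction are assumptions for the example, not values taken from the camera firmware.

```python
import numpy as np

def crop_frame(frame: np.ndarray, crop_fraction: float = 0.25) -> np.ndarray:
    """Drop the bottom portion of a video frame before transmission.

    Because the imager sits below the viewfinder, the subject's eyes
    reliably land in the upper portion of the frame, so the bottom
    rows carry no iris data and need not be transmitted.
    """
    rows = frame.shape[0]
    keep = rows - int(rows * crop_fraction)
    return frame[:keep, :]

# Example: an assumed 5 MP frame (2592 x 1944) cropped by 25%
full_frame = np.zeros((1944, 2592), dtype=np.uint8)
cropped = crop_frame(full_frame)
print(cropped.shape)  # (1458, 2592)
```

Transmitting only the retained rows reduces per-frame data by the crop fraction, which is what yields the increased frame rate over the fixed-bandwidth USB link.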
[0044] An example method for obtaining the image and processing it in the computing device is illustrated in the flow chart of Figure 5. This example embodiment includes the following steps: In step 502, the iris or irises to be imaged are illuminated, preferably using the uniform illumination systems described herein. In step 504, an image is captured using the imager in the camera. Then, in step 506, the video frame is transmitted from the camera to the processing device.
In step 508, the frame is analyzed to determine iris location and size. Next, in step 510, range indicator icons are activated based on the apparent distance to the subject. The distance to the subject may be determined based on one or more image analysis factors, including considering the diameter of the iris, the distance in pixels between images of the illuminator reflections on the iris, and the like. In step 512, the computing device generates a monitor display frame that can be displayed on a screen of the computing device to show the camera operator what the imager sees. In an embodiment, the monitor display frame is a 640x480 pixel image, generated by shrinking the received camera image frame to 640x360 and adding a 640x120 bitmap simulated icon display to the top of the monitor display frame. The icon display mimics the icon display on the back of the camera so that whether the operator looks at the camera or at the screen, the operator sees the same range indication information. In step 513, if the subject is not in range, the process skips further image processing steps and loops back to step 504. If the subject is in range, in step 514 the image is processed to locate the iris or irises of interest, select a 640x480 region centered on each iris, and analyze focus and image quality of the iris in that region. The camera may be operated in a mode to collect a left eye image, right eye image, both eyes, or either eye. In step 516, if the image quality of an iris of interest is acceptable according to ISO standards, and is better than any previously obtained image for that iris, the selected 640x480 iris image region is retained for further use. In step 518, if the images collected are not complete and acceptable, the process repeats starting at step 504 and another image is captured and analyzed. 
If the images are acceptable and complete, the process continues at step 520, and the image or images stored are returned to the control program for further processing. The illuminators may then be turned off and the camera may enter an idle or low-power mode where no images are captured.
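The capture-and-retain loop of Figure 5 can be summarized in a short sketch. All function parameters here (capture_frame, locate_irises, in_range, quality) are hypothetical stand-ins for the camera and analysis operations described in the text, and the quality threshold is an illustrative assumption rather than the ISO criterion itself.

```python
def capture_iris_images(capture_frame, locate_irises, in_range, quality,
                        mode_sides=("left", "right"),
                        max_frames=100, threshold=0.8):
    """Sketch of the Figure 5 loop: capture frames, keep the best-quality
    region per iris, and stop once every iris of interest is acceptable."""
    best = {}  # side -> (quality score, retained iris region)
    for _ in range(max_frames):
        frame = capture_frame()                 # steps 504-506: capture/transmit
        irises = locate_irises(frame)           # step 508: locate irises
        if not in_range(irises):                # steps 510-513: range gating
            continue
        for side in mode_sides:                 # step 514: per-iris regions
            region = irises.get(side)
            if region is None:
                continue
            score = quality(region)
            if score > best.get(side, (0.0, None))[0]:
                best[side] = (score, region)    # step 516: keep if better
        if all(side in best and best[side][0] >= threshold
               for side in mode_sides):
            break                               # step 518: images complete
    return {side: region for side, (score, region) in best.items()}  # step 520
```

Passing a single-element `mode_sides` tuple models the left-only, right-only, or either-eye capture modes mentioned in the text.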
[0045] The camera may be selectively operated to capture both left and right iris images in a single operation, left iris only, right iris only, or either iris (whichever is captured first).
[0046] Figure 2a shows an embodiment of an indicator area 200 of the camera's back faceplate. Area 200 includes icons 202, 204, 206 and 208 that are preferably translucent so as to be selectively illuminated by corresponding indicator LEDs on circuit board 114. In the preferred embodiment, icon 202 is a "back arrow" used to indicate that the camera is too close to the subject and has red illumination when activated. Icon 204 is a camera icon that has green illumination, used to indicate that the camera is the correct distance from the subject. Icon 206 is a "forward arrow" with yellow or orange illumination, used to indicate that the camera is too distant from the subject and should be moved closer. Finally, icon 208 is a head icon that is illuminated green to indicate that the image capture is complete.
[0047] During an example imaging operation, these icons are selectively illuminated to provide a simple real-time indication to the operator or user to assist in establishing the correct distance to subject for a good iris image. This range indication system, combined with the viewfinder described herein, provides a complete, highly intuitive guidance system for positioning the camera relative to the subject. In use, the operator looks through the viewfinder so that he or she sees the subject's eyes. Then, the operator moves the camera forward until the range indicators show that the camera is at the right distance. The camera is then held in position until imaging is complete as shown by the indicators. The inventors have found that virtually all intended operators, even small children, are able to use this positioning system effectively after only minimal instruction. In addition to ease of use, the positioning system comprising the icon-based range indicators and viewfinder has the benefit of simplicity and durability. The range indicators use LED illuminators that have an expected lifetime exceeding the life of the device. The elongated slot viewfinder has no electronic or optical elements that can fail or be damaged. Thus, the positioning system features of this device are particularly durable and contribute to superior impact and drop-resistance of the device. In contrast, many conventional iris cameras use a video screen on the back of the camera as a positioning aid. This component is more expensive to produce than the aiming and positioning elements disclosed herein. Video viewfinders are also subject to damage if the device is dropped or there is an impact to the screen, unless the screen is mounted and protected in a manner that further increases the already higher cost of that approach.
[0048] In this example embodiment, for the icon arrangement of Figure 2a, forward arrow icon 206 is illuminated with a yellow backlight as the camera is moved, starting outside of the imaging range, toward the subject. When the camera enters the appropriate range for imaging, the LED illuminating the forward arrow icon 206 will be turned off, and camera icon 204 will be illuminated with a green backlight to indicate "in range". If the operator continues to move the camera toward the subject, when the camera reaches the inside boundary of the acceptable imaging range, icon 204 will be turned off and the "too close" arrow icon 202 will be illuminated, preferably with a red backlight. As the operator moves the camera back into the proper range, the "too close" icon 202 will be turned off and the camera icon 204 will again be illuminated in green to indicate "in range". When the desired images have been captured, the head icon 208 will be illuminated to indicate that the operation is complete. Thus, when the operator sees both icons 204 and 208 backlit in green, the operator can put down the camera and continue the enrollment or identification process.
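The icon sequencing described above reduces to a simple function of measured distance and capture state. The following sketch assumes the 20-25 cm operating range given earlier for one embodiment; the icon names and distance bounds are illustrative, not part of the camera's actual firmware interface.

```python
def range_icons(distance_cm, capture_complete,
                near_cm=20.0, far_cm=25.0):
    """Return the set of icons to illuminate for a given subject distance.

    Mirrors the Figure 2a behavior: forward arrow (yellow) when too far,
    back arrow (red) when too close, camera icon (green) when in range,
    and the head icon (green) added when capture is complete.
    """
    icons = set()
    if distance_cm < near_cm:
        icons.add("too_close_arrow")      # icon 202: move camera back
    elif distance_cm > far_cm:
        icons.add("forward_arrow")        # icon 206: move camera closer
    else:
        icons.add("camera_in_range")      # icon 204: hold position
        if capture_complete:
            icons.add("head_complete")    # icon 208: operation finished
    return icons

print(sorted(range_icons(30.0, False)))  # ['forward_arrow']
print(sorted(range_icons(22.0, True)))   # ['camera_in_range', 'head_complete']
```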
[0049] Referring to Figure 2b, in a preferred embodiment a similar set of icons is provided on the front of the camera, facing the subject. However, the icons are arranged in a different manner when facing the subject, to correctly maintain the intuitive nature of the display. As shown in Figure 2b, on the front of the camera the head icon 208 is positioned between the arrow icons 202 and 206, and the camera icon 204 is positioned to the right of the other three icons.
[0050] The icon arrangement is different for front and back indicators because a different operating mode is used with the two sets of indicators. When the front indicator of Fig. 2b (facing the subject) is used for positioning, typically the subject is moving his head forward and backward relative to a fixed camera position. Therefore, the range indicator icons for the camera front reflect this type of movement, showing the head located between the arrows that imply movement of the head relative to the camera. This provides an intuitive range indicator to the subject that is language-universal and easy to understand.
[0051] When the camera back indicators of Fig. 2a (facing the operator) are used for positioning, the operator is typically holding the camera and moving it relative to a fixed position of the subject. Thus, the back indicators have the camera icon between the two arrows, to reflect the expected movement of the camera forward and backward relative to the fixed head position. In the example embodiment, the head icon is positioned to the right side on the back indicator array. As the arrows and camera are illuminated to indicate the required movement (forward, backward, or hold still), the operator is provided with a simple, intuitive indication of how to position the camera.
[0052] In the example embodiment, on front and back indicators, the fixed item (either camera or head) is shown on the right while the other of the two (the moving item) is shown between the arrow indicators. However, a mirror image of the icon arrangements shown in Figs. 2a and 2b can also be used, and this may be desirable if the users are culturally accustomed to reading from right to left.
[0053] In operation, the indicators shown in Fig. 2b are actuated in a manner generally similar to that described for Fig. 2a. However, in the example embodiment, when the camera is "in range", icon 208, the head icon, is illuminated instead of icon 204, the camera icon. The arrow icons operate in the same manner described with reference to Fig. 2a, and when the imaging operation is complete, both the head and camera icons will illuminate in the same manner described with reference to Fig. 2a. While it is possible to use either the head or camera icon or both as the in-range indicators, the inventors have found that having the moving item between the arrows illuminate when the camera is in range provides an intuitive indication to users. Having a change in illumination (such as the additional illumination of the head or camera icon when the other of the two is already illuminated as an in-range indicator) provides an intuitive indication that the operation is complete.
[0054] Front icons only, rear icons only, or both sets of icons can be provided depending on the intended operation of the camera. For example, if the camera will be used only for self-imaging, only the front icons are needed. If the camera will always be positioned by an operator, only the back icons are needed. Maximum versatility is achieved by providing both front and back sets of range indicators so that the camera can be used either in self-operated or operator-operated modes. Having both modes available, for example, facilitates operation where the camera is held in a fixed position (such as in a desk stand) and the subject is told to position his head using the feedback from the indicators, while allowing the operator to take the camera out of the stand and manually position it in cases where, due to age, disability, or language barrier, the subject is physically unable to engage with the camera or unable to understand and comply with instructions for use.
[0055] Further, if both front and back sets of indicators are provided, it is possible to selectively turn the front and back indicators on and off, depending on the current operating mode. Thus, in a subject-operated mode, the front indicators may be activated and the back indicators deactivated, while for the operator-operated mode, only the back indicators are activated. The activation of front, back, or both sets of indicators may be configurable by the operator if desired.
[0056] The range indicator illumination is controlled, in an example embodiment, by software operating in a computer attached to the camera via the USB connector. The software in the computer receives the video output of the camera imager, locates the eye or eyes in the digital image data using known image processing techniques for finding generally circular shapes, and determines the diameter of the iris. The lens is set so that the iris will be in focus within a desired diameter range for the iris image, for example 210 to 250 pixels.
[0057] The range indicators are then actuated in real time. This may be done, for example, based on the apparent diameter of the iris in the video image. If no iris is found or the iris diameter is less than the lower boundary (210 pixels in this example) the camera is assumed to be too far away and the "move forward" indicator is activated. If the iris diameter is within the target range, the camera is "in range" and the in-range indicator is illuminated. If the iris diameter is larger than the upper boundary, the camera is too close to the subject and the "move back" indicator is illuminated. There are many available prior art algorithms for identifying eyes in a digital image.
Software that finds the center of an eye in an image and measures iris diameter is normally included in commercial iris matching software. This software can be used for range determination, or a customized algorithm can be created. The algorithm for detecting and measuring the iris can be optimized, through reasonable experimentation, for the image characteristics produced by the selected imager and circuits. Algorithms provided with commercial iris identification software will perform these functions effectively. In an embodiment, the diameter determination can also be tuned to a particular camera output by testing a range of published algorithms for locating eyes and measuring their diameters, and using the algorithm that is found to provide optimal performance for the particular contrast and image size produced by the combination of the selected imager, illumination, camera arrangement and frame rate.
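The diameter-based range test described above amounts to a threshold comparison against the in-focus diameter band. A minimal sketch, using the 210-250 pixel bounds given in the text; the function name and the convention of None meaning "no iris found" are assumptions for this example:

```python
def range_from_iris_diameter(diameter_px, lo=210, hi=250):
    """Classify camera range from the apparent iris diameter in pixels.

    lo/hi default to the 210-250 pixel in-focus range given as an
    example in the text; None means no iris was located in the frame.
    """
    if diameter_px is None or diameter_px < lo:
        return "move_forward"   # no iris, or iris too small: too far away
    if diameter_px > hi:
        return "move_back"      # iris too large: camera too close
    return "in_range"

print(range_from_iris_diameter(230))  # in_range
```

The same three-way result can then drive the indicator icons, regardless of whether the diameter came from commercial iris-matching software or a customized detector.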
[0058] In a preferred embodiment, the algorithm for determining distance to the subject uses, at least in part, the measured pixel distance between the reflections of the two illuminators on the camera imaged in the pupil area. A range of distances, for example 18-22 pixels, corresponding to good focus can be determined experimentally based on the specific imager, lens, and illuminator spacing.
[0059] Other algorithms and methods may also be implemented to control the illuminators, such as providing an infrared electronic distance measuring device on the front of the camera, and using the indicated distance to the subject to indicate the correct range.
[0060] The range detection and control functionalities described herein may be implemented in the computing device connected to the camera via USB, in another computing device connected via a connection or network, or in firmware running in a processor mounted in the camera, for example on the camera circuit board.
[0061] The range indicator icon illumination colors, shapes, and purposes have been described herein in terms of a preferred embodiment that is easy for even untrained users and operators to understand. However, it will be understood that the indicator illumination color-coding, the shape and number of the icons, and other features of this display are not limited to this specific example, but can be varied while remaining within the scope of the invention.
[0062] Figures 3a-3f show a further, preferred embodiment of an enclosure for the present camera that includes an ergonomic handle and is provided with accessories such as a desk stand and a removable reflection shade. Figure 3a is a view of an enclosure 300 that houses the functional components shown in Figure 1 and provides substantially the same relative geometries and positions of components shown in Figure 1. Enclosure 300 has a handle 302 sized for a typical operator's hand and preferably angled slightly back (for example, by 10 to 15 degrees) to enable a more natural operator wrist position when holding the face of the camera in a vertical plane and moving it toward the subject's face. USB cable 304 exits from the bottom of handle 302 and is connected to a computing device (not shown) that receives the camera's output.
[0063] Figure 3b is a view of the camera mounted in a desk stand 306. Desk stand 306 is provided with a receptacle that mates with the handle 302 and supports enclosure 300 in a fixed position. Preferably the desk stand 306 holds the camera face at an upward angle, to allow persons of various heights to lean over the camera and position their eyes correctly relative to the camera imager when it is placed in the desk stand 306. The upward angle for the camera face can be selected as desired; the inventors have found that preferred ergonomics are obtained if the desk stand holds the camera so that its face leans back within a range of 30 to 45 degrees from vertical. With the design shown for desk stand 306, it is easy for a subject seated at a desk across from the operator to lean over the camera and move his or her head closer to the camera until the range indicator icons illuminate to indicate the correct distance has been reached. A typical instruction to the subject in the method of taking iris pictures using the camera with the desk stand is "Please look through the slot in the middle of the camera and move forward until the head turns green, then hold still."
[0064] During self-positioning of the subject, the operator is preferably able to observe what the camera sees through a user interface on the screen of the computing device that receives the camera output. The camera provides a high-speed USB video stream to the computer. Software operating in the computer receives this video stream and generates a monitor output stream that can be displayed as video element 400 on the computer screen, as shown in Figure 4. In the embodiment shown, the monitor video stream is reduced in size from the full frame output of the camera, for example, to 640 x 480 pixels. The top 25% of the monitor display, for example, is allocated to a bitmap simulation 402 of the rear range indicators on the camera (as described with reference to Figure 2a). However, the front range indicator icons (Fig. 2b) can be displayed within the monitor video frame instead, if desired. The indicator to be displayed may also be operator-selectable based on personal preference and the current intended use. For example, the icons to be displayed may be switched between the front and back configurations by the operator for more intuitive operation, depending on whether handheld or fixed position use of the camera is occurring.
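The monitor-frame composition described above (a 640x120 icon strip stacked over the camera frame shrunk to 640x360) can be sketched with NumPy. The resize step is shown as simple nearest-neighbor subsampling for illustration; the actual scaler used by the camera software is not specified in the text.

```python
import numpy as np

def compose_monitor_frame(camera_frame: np.ndarray,
                          icon_strip: np.ndarray) -> np.ndarray:
    """Build the 640x480 monitor image: icon bitmap on top of the
    camera video shrunk to 640x360 (nearest-neighbor subsampling)."""
    h, w = camera_frame.shape[:2]
    rows = np.arange(360) * h // 360   # source row for each output row
    cols = np.arange(640) * w // 640   # source column for each output column
    shrunk = camera_frame[rows][:, cols]
    return np.vstack([icon_strip, shrunk])

frame = np.zeros((1458, 2592), dtype=np.uint8)    # assumed cropped camera frame
icons = np.full((120, 640), 128, dtype=np.uint8)  # simulated 640x120 icon bitmap
monitor = compose_monitor_frame(frame, icons)
print(monitor.shape)  # (480, 640)
```

Because the icon strip is rendered into the same 640x480 stream, any host software that can display a standard camera monitor output also displays the range indicators with no custom integration.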
[0065] The bitmap simulation graphics 402 in the simulated display change to show colored versions of the icons, corresponding to the current illumination state of the electronic icons on the camera itself. That is, whatever icons are illuminated on the camera are shown in the corresponding color on the simulated display, and whatever icons are not illuminated on the camera are shown in dull grey or a color that similarly contrasts with the colored, activated states of the icons. In the example screen shot of Figure 4, the "move closer" arrows are illuminated on the camera, and on the screen the corresponding arrow region in the bitmap image is colored yellow.
[0066] In this manner, the operator can see what the camera sees and also see range indication information at the same time. Depending on whether the camera is being used in a fixed mount or in a handheld mode, the operator can move the camera and/or give movement instructions to the subject while monitoring the combined display that shows both actual camera video and range instructions. [0067] Another advantage of this integrated monitor display is that it supports use of the camera as a generalized peripheral for connection to software developed by systems integrators for processing iris enrollments and recognitions. Typically software programs designed to integrate with commercially available iris cameras have the capacity to receive and display a 640x480 camera monitor output. Because such software is designed to work with at least several different cameras that have a variety of range indicators, or no range indicators at all, there is no industry standard interface for such software programs to receive and display range positioning information. The novel integration of this information into the camera video monitor output stream as disclosed, in easily understood graphic icon form consistent with the indicators on the camera itself, is thus a significant advantage. This approach simplifies camera integration into a wide range of control software, requiring no customization for this particular camera, while still providing this useful range information to the operator.
[0068] Figure 3c is a front view of the enclosure. Viewfinder 110, diffusers 108, and IR high pass filter 112 are located in the front of the enclosure as shown. Further, enclosure 300 has reinforced mounting slots 308 on the top and sides of the enclosure for receiving a reflection reduction hood as will be described in more detail below.
[0069] In the embodiment shown, at position 310 on the indicator icon display, an arrow between the camera and the head and a numeric indication of the target distance (for example, "10 in/25 cm") may be printed on the display to indicate the general target distance range for the camera to the first time user. Figure 3d is a back view of the enclosure 300.
[0070] Figures 3e and 3f are perspective views of the camera enclosure 300 mounted in desk stand 306 with an optional reflection reduction hood 314 attached thereto. Hood 314 has mounting tabs 316 corresponding to and received by mounting slots 308 in the enclosure 300. The mounting tabs may be friction fit, folded back and taped or connected by hook-and-loop fasteners, or otherwise removably mounted to enclosure 300. In the example embodiment, hood 314 extends outward toward the position of the subject to a distance slightly beyond the target iris distance (for example, 9 to 12 inches from the camera front face). The hood in the example embodiment is formed in the shape of a truncated, three-sided pyramid. The top and side surfaces of hood 314 are angled away from the optical axis of the imager so that reflections from the IR illuminators located behind diffusers 108 are not located on the subject iris or irises. The angling of the top and sides of the hood may be varied within limits based on experimentation and the type of material selected for the hood; what is important is that the angle is sufficient to avoid generating reflections from the inside surface of hood 314 onto the iris. An appropriate angle used in the example embodiment is
10 to 15 degrees measured from a line parallel to the optical axis of the camera. The optical axis of the camera is the line extending from the center of the imager through the lens to the subject, perpendicular to the plane of the imager.
[0071 ] The hood may be formed from any desired material. Various thicknesses of paperboard, both coated and uncoated, and various thermoplastics may be used as desired. The hood may be printed with advertising material and/or the logo of the camera manufacturer, camera operator, or agency responsible for the identification program. In use, the subject places his head within the area of the end of the hood and moves forward slightly until the range indicators show correct distance from the camera. The hood prevents windows, direct sunlight, artificial lights, and other items in the camera operating area from producing reflections on the iris that would interfere with obtaining a complete image of the iris pattern. The hood can be attached easily when needed, and removed for transport, storage, or when it is not needed.
[0072] In operation, synergies are realized as a result of the complementary design of the camera and its range indicators, desk stand, and hood. Referring to Figure 3f, the ergonomic position provided by the camera and hood assembly when mounted on the desk stand (and thus slightly reclined) makes it easy for an untrained subject to lean over the camera into an imaging position. The hood inherently indicates to the untrained user where his or her head is to be located, and the green light range indicator provides a simple message to the user confirming proper distance. When these elements are combined, they enable a highly effective and rapid method of obtaining iris images for large numbers of untrained subjects. As the subject moves his or her head toward the hood and into the imaging region that is intuitively suggested by the position of the hood, the camera automatically scans the region within the hood, identifies the iris locations as the user moves into position, and stores the iris pictures when they are obtained. Typically, by the time the subject gets his head in position, the operation is complete.
[0073 ] A novel illumination system provided in some embodiments and mentioned briefly above will now be described in more detail. Figure 7a shows an example embodiment of an iris illumination system. In this example, iris illumination system 700 includes LEDs 702, 704, 706 and 707 and distribution elements 104. In the embodiment shown, two assemblies 710 are provided, each incorporating one of each of LEDs 702, 704, 706 and 707 and a distribution element 104. The assemblies 710 are positioned in this example on each side of a lens 712 for the imager (not shown).
[0074] LEDs 702, 704, 706 and 707 may be single-wavelength near-infrared LEDs selected in the range from 700 nm to 900 nm wavelength, although visible light LEDs may also be included in some applications. In a preferred embodiment two of the LEDs have nominal wavelengths of
770nm, and the other two have nominal wavelengths of 830nm and 850nm respectively. Four LEDs are shown in each assembly 710 in this example. Using four LEDs provides advantages as noted in this disclosure; among other things this arrangement makes it easy to provide light having three or more different wavelength components. However, one, two, three, or any other number of LEDs can be provided in each assembly 710 within the scope of the invention. Also, while LEDs are a preferred illuminating element in these example embodiments, other devices that generate appropriate near-infrared illumination can be substituted within the scope of the invention.
[0075] Illumination distribution element 104 is preferably both a light diffusing and distributing device. Distribution element 104 is preferably effective to distribute light at least within the range of wavelengths selected for LEDs 702, 704, 706 and 707. Preferably element 104 has a light distribution pattern that is generally square or rectangular, and not circular. Thus, preferably the distribution pattern corresponds generally to the shape of the field of view of the associated imager within the camera system.
[0076] One preferred example of the illumination distribution element 104 is the EDS-50A
Engineered Diffuser™ manufactured by RPC Photonics of Rochester, New York. This element is made by applying a polymer to a clear substrate and etching into it a large number of microstructures, such as microlenses, each one individually designed to produce a specific pattern of light. These elements and the microlens patterns can also be injection molded in polycarbonate or other thermoplastic materials. The light output of this example element 104 is a substantially square, pyramidal beam of light with the sides of the pyramid approximately 25 degrees from the central axis. Thus, the output is a square beam of light projecting outward in a 50-degree by 50-degree space. The characteristics of the microstructures can be customized to produce different beam dimensions. Preferably the beam is designed to illuminate the field of view of the imager and lens used in the camera. For a 5 megapixel camera with optics geared to an approximately 16cm square capture region at a 10 inch distance, the camera field of view will be approximately 50 degrees so the standard 50 degree square diffuser provides good results.
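The "approximately 50 degrees" figure can be checked with basic geometry, sketched below in Python. Treating the quoted field of view as the diagonal of the 16 cm square capture region at the 10 inch (25.4 cm) working distance is an assumption about which axis the disclosure intends; the horizontal angle alone works out closer to 35 degrees.

```python
import math

def field_of_view_deg(extent_cm, distance_cm):
    """Full angle subtended by a linear extent centered on the optical axis."""
    return math.degrees(2 * math.atan((extent_cm / 2) / distance_cm))

side = 16.0                 # capture region side in cm, from the text
dist = 25.4                 # 10 inches expressed in cm
diagonal = side * math.sqrt(2)

horizontal_fov = field_of_view_deg(side, dist)    # roughly 35 degrees
diagonal_fov = field_of_view_deg(diagonal, dist)  # roughly 48 degrees
```

Under this assumption the diagonal field of view comes out near 48 degrees, consistent with the text's "approximately 50 degrees" and with matching the standard 50-degree square diffuser to the imager.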
[0077] Depending on the output level of the LEDs, the camera's sensitivity, and the illumination requirements of the overall system, a wide variety of arrangements may be acceptable, and are contemplated within the scope of the invention. For example, if they provide sufficient illumination a smaller number of LEDs may be used. Also, the LEDs may all be positioned behind a single distribution element (for example, 2, 4 or 6 LEDs may be grouped together behind one distribution element). Alternatively, the LEDs may be arranged behind a plurality of distribution elements, such as 2, 3, 4 or more distribution elements 104. In the example camera embodiments, two assemblies 710 are provided, each consisting of four LEDs and a distribution element 104. [0078] The distribution elements 104 that are selected preferably produce a highly uniform distribution of light within an area or areas of interest where one or more irises will appear in the view field, with no hot spots and minimal variation in intensity within the area of interest. The model EDS-50A diffuser described above is preferred because it produces an absolutely uniform light distribution throughout the camera's field of view, not just in one or more areas of interest.
[0079] LEDs 702, 704, 706 and 707 may be selected from a wide range of available devices. In one example embodiment suitable for small portable cameras, the LEDs are T1.75 form factor devices with relatively narrow beams (such as a half angle of 6 to 10 degrees). The LED position behind element 104 depends on the type of LED and the shape of its beam. With narrow beams the LEDs may be from 1mm to 5mm, for example, behind element 104 so that element 104 can be relatively small and still receive and distribute light from multiple LEDs. In this embodiment, the LEDs behind each element 104 may, for example, include two 770nm LEDs, an 830nm LED, and an 850nm LED to provide multispectral illumination in the 700-900 nm range. In this example embodiment, the illumination is provided in a range of 750nm to 850nm. Suitable LEDs can be selected from products manufactured by Epitex (distributed by Marubeni USA) and Vishay (distributed by Digikey USA) to produce the desired amount of light output.
[0080] The LEDs are driven by a constant current or by current pulses. In an example embodiment of an LED driving circuit 720, shown in Figure 7c, pairs of LEDs 702, 704 are arranged in series with a current limiting resistor 716 and supplied with 5VDC power. Typical Epitex and Vishay T1.75 near-infrared LEDs have voltage drops of 1.7-1.8VDC, so a series pair as shown will have a total voltage drop in the range of 3.5V DC. The remaining 1.5V DC is typically used in part by a switching device 718 (such as a transistor switch) for controlling the LED circuit, and the remainder of the voltage drop is taken up by current limiting resistor 716. The value of resistor 716 is selected in view of the voltage drop to be taken up by resistor 716, and the desired current I through the circuit. As an example, if the desired drive current I is 70 mA, and the switching circuit 718 has a voltage drop of 0.45V DC, the voltage drop for the resistor 716 is 1.05V. V=IR (1.05V = 0.070A * R) so R=15 ohms to produce the desired 70 mA drive current. In this example embodiment, four circuits 720 are provided to drive an example total of eight LEDs as shown in Figure 7A. The types of LEDs selected and the driving power level depend on available power and the desired range of illumination. The example given is suitable for a USB-powered portable camera, allocating a total of 280 mA to near-infrared illumination. The circuit shown is merely provided as an example; for cameras with a longer imaging range or with more available power, a variety of arrangements can be provided. In particular, LEDs can be driven at higher power in appropriate situations, using intermittent current pulses rather than a fixed voltage source, in a manner specified in data sheets for the LEDs.
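The resistor selection arithmetic in the worked example above can be expressed as a short Ohm's-law calculation; the function name and parameterization are illustrative only.

```python
def led_resistor_ohms(supply_v, led_drops_v, switch_drop_v, current_a):
    """Current-limiting resistor for a series LED chain: the resistor
    absorbs whatever supply voltage the LEDs and switch do not (V = I*R)."""
    resistor_v = supply_v - sum(led_drops_v) - switch_drop_v
    return resistor_v / current_a

# Values from the worked example: 5 V USB supply, two LEDs at ~1.75 V each,
# 0.45 V across the transistor switch, 70 mA target drive current.
r = led_resistor_ohms(5.0, [1.75, 1.75], 0.45, 0.070)  # about 15 ohms

# Four such series circuits give the stated illumination budget.
total_ma = 4 * 70  # 280 mA for eight LEDs
```

This reproduces the 15-ohm value and the 280 mA total allocated to near-infrared illumination in the USB-powered example.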
[0081] The camera and its illuminators are intended for use with the eye positioned within a defined distance range from the camera. The range can be any designed value. For portable cameras the eye is typically intended to be from six to twelve inches (15cm to 30cm) from the camera and its illuminators for proper imaging. For fixed imagers the design distance may be greater. The amount of near-infrared ("NIR") radiation exposure to the eye must be kept within established safe limits, which are subject to revision but currently include standards such as ISO 15004-2:2007, ANSI/IESNA RP-27.3-2007, and IEC 62471. The amount of NIR light on the eye from up to eight LEDs at a distance of 10-12 inches is normally well within established safety limits. However, possible misuse of the device must also be considered. For example, while there would be no value in doing so, a person might put the device very close to his eye for a brief period. The separation of a multi-LED illumination package into two groups of LEDs behind physically separated distribution elements halves the potential radiation exposure to a single eye in a close-contact misuse scenario, thus increasing the overall safety of the system. Separation into a larger number of groups such as 3 or 4 may be desirable in some cases to further decrease the potential exposure during close-contact misuse. As the number of groups increases, separation produces progressively diminishing safety returns and increases the cost and manufacturing complexity of the system. While the present invention contemplates any number of physically separated LED groups, the optimal selection for a given design will vary, involving a tradeoff based on the desired device characteristics. Further, as is known in iris imaging, an illumination angle is formed between a line extending from the LED group to the eye, and another line extending from the eye to the imager.
If this illumination angle does not exceed a minimum value, typically 5 degrees under current ISO standards, a red-eye effect may result that effectively lights up the pupil and diminishes the image contrast between the pupil and the iris. Images having diminished pupil-iris contrast may not function properly with some eye finding and iris identification algorithms. An appropriate illumination angle can be determined experimentally to avoid this effect. In a preferred embodiment, the illumination angle is in the range of 9 to 11 degrees.
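The illumination angle geometry described above can be sketched with a small-angle approximation, shown below in Python. The 4.5 cm lateral offset of the LED group from the lens axis is a hypothetical mounting dimension chosen to land in the preferred 9 to 11 degree band, not a dimension from this disclosure.

```python
import math

def illumination_angle_deg(led_offset_cm, eye_distance_cm):
    """Approximate angle at the eye between the line to the illuminator and
    the line to the imager, for an LED group offset laterally from the lens."""
    return math.degrees(math.atan(led_offset_cm / eye_distance_cm))

# At the nominal 10 inch (25.4 cm) working distance, a hypothetical LED group
# mounted about 4.5 cm off the lens axis yields roughly a 10 degree angle:
# inside the preferred 9-11 degree band and safely above the ~5 degree
# red-eye floor mentioned in the text.
angle = illumination_angle_deg(4.5, 25.4)
```

Because the angle shrinks as the subject moves farther away, a designer would verify it experimentally across the full designed distance range, as the text suggests.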
[0082] Figure 8 shows a typical illumination distribution for the illumination system disclosed in the example embodiment. As can be seen, the beam 810 is generally more rectangular than conventional iris illumination systems such as the prior art system shown in Figure 1. Conventional approaches tend to produce a series of overlapping circle patterns (shown in Figure 1), resulting in significant variation in light distribution across the view field of the camera. Further, the inventors have found that it is difficult to evenly illuminate the corners of the view field by combining circular beams. Thus, in conventional cameras, typically either the corners of the view field are not well-illuminated, or else in order to get light into the corners, the light beams are extended significantly outside the view field so that a good portion of the limited amount of light generated by the LEDs is wasted illuminating areas that are not of interest, and that are not visible to the camera. In the example embodiment, each beam 810 preferably extends across more than 50% of the visual range of the imager. In a preferred embodiment, each of the beams 810 extends across substantially all of the imager's visual field, so that the illumination is almost completely uniform regardless of where the iris is positioned within the view field. In this preferred embodiment, smooth, consistent illumination of substantially all of the camera view field by the overlapping beams 810 facilitates simultaneous imaging of two irises.
[0083 ] A system that distributes light in a pattern more closely corresponding to the field of view of the imager, such as the system described in the example embodiment, provides particular unobvious advantages. One such advantage results from providing a substantially consistent amount of illumination throughout a field of view of the camera associated with the illumination system. As iris identification technology has matured, starting in the late 1990s, a number of manufacturers have developed iris biometric camera systems. Generally these camera systems have used LEDs in the 700-to-900 nm range for illumination. There has been considerable experimentation regarding optimal wavelengths, but relatively little attention has been paid to perfecting illumination systems as a component of the prior art camera systems.
[0084] The inventors have determined that uneven illumination directly reduces the accuracy of matching in an iris identification system and therefore increases the error rates associated with the system. Iris identification is performed by matching patterns in gray scale values in an image with previously collected image patterns. Variations in lighting effectively create "noise" in the image data. LEDs, particularly when not evenly diffused, produce highly variable light patterns. In conventional iris cameras, there is also variability between cameras in the light output of the LEDs and their position. Further, even if taken with the same camera, the imager position relative to the eye cannot be exactly reproduced from image to image, and from time to time. Thus, different images of the same eye will be taken at different angles and distances. The result of these various factors may be a significant variation in light intensity in different areas of the iris across the camera's view field. In different images of the same eye taken without absolutely even illumination, even taken with similar cameras or the same camera, these lighting variations will generate variations in the gray scale values registered for specific parts of the iris pattern, depending on what part of the LED light pattern falls on the iris. This will result in increased variability of pixel values within the identification templates generated for images of the same eye. The result will be increased Hamming distances (less certain matching) in the matching system operating with the images. In other words, more even, repeatable lighting generates lower Hamming distances and increases the ability of an identification system to successfully differentiate between a large number of irises without error.
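The link between lighting-induced bit flips and match certainty can be illustrated with a standard fractional Hamming distance over masked binary templates, sketched below in Python. The mask handling follows common practice in iris matching systems and is an assumption here; the disclosure does not specify a particular matching algorithm.

```python
def hamming_distance(template_a, template_b, mask_a, mask_b):
    """Fractional Hamming distance between two binary iris templates,
    counting only bits that are valid (unmasked) in both images.
    Uneven illumination flips template bits, raising this value."""
    valid = [ma and mb for ma, mb in zip(mask_a, mask_b)]
    usable = sum(valid)
    if usable == 0:
        return None  # no overlapping valid bits to compare
    disagreements = sum(1 for a, b, v in zip(template_a, template_b, valid)
                        if v and a != b)
    return disagreements / usable
```

Two images of the same eye under identical, even lighting would ideally yield a distance near zero; each gray-scale perturbation from uneven illumination pushes bits apart and the distance upward, which is why more even, repeatable lighting improves differentiation between large numbers of irises.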
[0085] The lighting system of the example embodiment also enables a significantly increased usable field of view for the associated camera. By distributing light evenly into the corners of the camera's field of view, this illumination system allows capture of a usable, matchable image from the far corners of the camera's view area. In cameras where the light is not so evenly distributed, the useful capture space of the camera is limited to a central region where there is both sufficient light available and sufficient uniformity to support reasonably repeatable template generation. The disclosed system and method thus makes possible a significant increase in the capture space for a given camera, compared to prior art approaches. For example, the disclosed example illumination system has been tested with a 5-megapixel camera having a lens suitable for capture at a distance of about 10 inches from the camera. The resulting capture space extends throughout the view area of the imager. Irises located in the far corners of the view field are illuminated at substantially the same level as irises located in the center of the view field. This results in more usable capture space than in conventional portable iris ID cameras. Images are typically captured with portable cameras by either moving the camera or having the person move their head so that the eyes are within the capture space. A larger capture space makes it much easier to aim the camera or to position the head, as appropriate.
[0086] Further, the system and method disclosed in the example embodiment produce more illumination within the camera's field of view than prior art systems, because the light is directed evenly into the field of view and relatively little light is directed outside the field of view. A number of prior art portable iris camera devices require two USB ports for operation, because the device cannot operate within the 0.5 A power limit of a single USB port. The LED illuminators are typically responsible for the majority of the power used by portable iris cameras. Thus, more efficient distribution of the light produced by a given number of LEDs makes it possible to use fewer LEDs than would otherwise be required to light a given field of view, reducing the power used and the number of USB ports or other power sources required to drive the camera system. [0087] Improved eye safety is another benefit of the system and method disclosed in the example embodiment. Because the system uniformly distributes the illumination over a broader area, the amount of energy hitting an eye positioned anywhere within the view area is more predictable, less variable, and on average less energy per area than would be experienced with a prior art system.
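The single-port power budget in paragraph [0086] reduces to simple arithmetic, sketched below in Python. The figure of 220 mA remaining for the imager and logic is an inference from the stated values, not a number given in the disclosure.

```python
USB2_PORT_LIMIT_MA = 500  # bus-powered limit of one USB 2.0 port (the text's 0.5 A)

def illumination_budget_ma(series_circuits, drive_ma):
    """Total illumination current for LEDs driven as parallel series pairs."""
    return series_circuits * drive_ma

led_ma = illumination_budget_ma(4, 70)       # 280 mA for eight LEDs in four pairs
remaining_ma = USB2_PORT_LIMIT_MA - led_ma   # inferred headroom for imager/logic
```

So long as the imager, processor, and indicator electronics fit within the remaining headroom, the whole camera can run from one port, avoiding the two-port requirement of some prior art devices.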
[0088] The system and method disclosed in the example embodiment also facilitate easy deployment of a multispectral illumination solution for iris cameras. With the arrangement shown, it is possible to select LEDs with a plurality of different primary wavelengths and place them behind a single distribution element. Because of the exceptionally even distribution pattern of the preferred distribution element disclosed in the example embodiment, the light output of all four LEDs will be distributed with substantial uniformity across the imaging area. This uniformity of distribution is produced by the example embodiment configuration, even though the four LEDs are spaced apart (by at least their diameter) and constitute separate point sources of light. As a result, the entire imaging field of the camera is evenly illuminated with light of multiple distinct wavelengths, maximizing the camera system's ability to resolve different colored irises. Two wavelengths, three wavelengths, or four or more wavelengths can be provided as desired, by making corresponding changes to the number and characteristics of the LEDs placed behind the distribution element(s). Thus, in certain preferred embodiments, the present invention provides a multispectral approach to iris illumination, with highly desirable performance characteristics.
[0089] Although illustrative embodiments have been described herein in detail, it should be noted and understood that the descriptions and drawings have been provided for purposes of illustration only and that other variations both in form and detail can be added thereto without departing from the spirit and scope of the invention. The terms and expressions in this disclosure have been used as terms of description and not terms of limitation. Thus, the breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the claims and their equivalents. The terms and expressions herein should not be interpreted to exclude any equivalents of features shown and described, or portions thereof.

Claims

1. A camera device for imaging at least one ocular iris of a subject, comprising:
A housing;
An optical subsystem installed in the housing including an electronic imager and at least one lens arranged to project at least one subject iris pattern on the imager when the camera is aimed at the subject iris;
A horizontally elongated viewfinder orifice passing through the housing and peripherally enclosed by the housing, arranged so that an operator positioning the camera between the operator and the subject iris and looking through the viewfinder orifice can see two irises of the subject as an indication of correct camera aim; and
An electronic range indicator that provides an indication signal to the operator when the camera is held at a distance from the subject iris that produces a focused iris pattern image at the imager.
2. The device of claim 1 further comprising an illumination subsystem including at least two LEDs of different wavelengths and an engineered diffuser that uniformly distributes light from said LEDs across a target iris area in a substantially rectangular light output pattern.
3. The device of claim 2 including a plurality of said illumination subsystems.
4. The device of claim 1 wherein the optical subsystem has a single imager and lens arrangement that simultaneously images both subject irises.
5. The device of claim 1 wherein the electronic range indicator includes an indicator visible to the operator that illuminates when a subject iris is in range.
6. The device of claim 5 wherein the indicator comprises a set of at least three illuminated icons, indicating too close, too distant, and in-range.
7. The device of claim 1 further comprising an image processor connected to the imager and to the electronic range indicator, operating to electronically analyze images from the imager to determine whether the imager distance to the subject is within a predetermined focus range for imaging, and when the subject is in range, activating the electronic range indicator to inform the operator.
8. The device of claim 1 further comprising a handle mounted on the camera.
9. The device of claim 8 further comprising a stand that receives the handle to position the device for tabletop use, with a front face of the camera reclined at an angle relative to the tabletop so that the imager is aimed along an upward path.
10. A method for imaging an ocular iris of a subject, comprising the steps of: Providing a camera with at least one electronic imager and a horizontally elongated viewfinder orifice passing through a housing of the camera and peripherally enclosed by the housing;
Providing an electronic range detection mechanism that automatically activates an electronic range indicator in the camera to signal when the camera is held at a distance from the subject iris that produces a focused iris pattern image at the imager;
Positioning the camera between the operator and the subject, so that the operator looks through the elongated viewfinder orifice to simultaneously see two irises of the subject, as an indication of correct camera aim;
Adjusting the camera's distance relative to the subject until the electronic range indicator indicates that the distance will produce a focused iris pattern image; and
Processing at least one iris image from the imager as a biometric identifier of the subject.
11. The method of claim 10 comprising the further step of illuminating the subject iris with a substantially rectangular light pattern by providing at least one illumination subsystem integral to the camera comprising at least two LEDs of different wavelengths and an engineered diffuser that uniformly distributes light from said LEDs across the area of the subject iris in said substantially rectangular light pattern.
12. The method of claim 11 wherein said step of illuminating the subject iris is performed using at least two of said illumination subsystems.
13. The method of claim 11 wherein in the step of providing the camera, the camera has a single imager and lens arrangement that simultaneously images both subject irises.
14. The method of claim 11 wherein in the step of providing the camera, said electronic range indicator incorporates a visual indicator to the operator that illuminates when a subject iris is in range.
15. The method of claim 14 wherein in said step of adjusting the camera's distance, the electronic range indicator guiding the operator comprises a set of at least three illuminated icons, indicating too close, too distant, and in-range.
16. The method of claim 10 including the further steps of:
automatically analyzing image output in real time during camera positioning, using an image processor in the camera connected to the imager and the electronic range indicator, to determine whether the imager distance to the subject is within a predetermined focus range for imaging; and when the subject is in range based on said automatically analyzing step, activating the electronic range indicator to inform the operator of the in-range condition.
17. The method of claim 10 wherein the step of processing at least one iris image from the imager includes transmitting at least part of the image to a computing device connected to the camera and processing the raw image to produce an international standard iris image.
18. The method of claim 17 wherein the international standard iris image is processed in the computing device using a segmentation algorithm to produce an iris identification template.
19. The method of claim 15 including the further step of adding a digital simulation of the illuminated icon display on the camera to frames of a live video output showing the view field of the imager, and displaying said live video output, whereby the operator may view the status of the range icons by looking either at the camera or at the display of the live video output.
20. The method of claim 10, comprising the further steps of:
Providing a handle on the camera;
Providing a surface stand that removably receives the handle to hold the camera in a position relative to the surface where a front face of the camera is reclined at an angle so that the imager is aimed along an upward path;
Positioning the subject within the imager field of view and having the subject adjust his distance to the camera until the range indicator indicates correct range.
PCT/US2012/042780 2011-06-15 2012-06-15 Systems and methods for binocular iris imaging WO2012174453A2 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201161497179P 2011-06-15 2011-06-15
US201161497168P 2011-06-15 2011-06-15
US61/497,168 2011-06-15
US61/497,179 2011-06-15

Publications (2)

Publication Number Publication Date
WO2012174453A2 true WO2012174453A2 (en) 2012-12-20
WO2012174453A3 WO2012174453A3 (en) 2013-02-21

Family

ID=47357779

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2012/042780 WO2012174453A2 (en) 2011-06-15 2012-06-15 Systems and methods for binocular iris imaging

Country Status (1)

Country Link
WO (1) WO2012174453A2 (en)

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014205145A1 (en) * 2013-06-18 2014-12-24 Avedro, Inc. Systems and methods for determining biomechanical properties of the eye for applying treatment
US9020580B2 (en) 2011-06-02 2015-04-28 Avedro, Inc. Systems and methods for monitoring time based photo active agent delivery or photo active marker presence
US9044308B2 (en) 2011-05-24 2015-06-02 Avedro, Inc. Systems and methods for reshaping an eye feature
US20160253558A1 (en) * 2015-02-27 2016-09-01 Fujitsu Limited Iris authentication apparatus and electronic device
US9498114B2 (en) 2013-06-18 2016-11-22 Avedro, Inc. Systems and methods for determining biomechanical properties of the eye for applying treatment
US9498642B2 (en) 2009-10-21 2016-11-22 Avedro, Inc. Eye therapy system
US9707126B2 (en) 2009-10-21 2017-07-18 Avedro, Inc. Systems and methods for corneal cross-linking with pulsed light
US10028657B2 (en) 2015-05-22 2018-07-24 Avedro, Inc. Systems and methods for monitoring cross-linking activity for corneal treatments
US10114205B2 (en) 2014-11-13 2018-10-30 Avedro, Inc. Multipass virtually imaged phased array etalon
US10258809B2 (en) 2015-04-24 2019-04-16 Avedro, Inc. Systems and methods for photoactivating a photosensitizer applied to an eye
US10350111B2 (en) 2014-10-27 2019-07-16 Avedro, Inc. Systems and methods for cross-linking treatments of an eye
US10631726B2 (en) 2017-01-11 2020-04-28 Avedro, Inc. Systems and methods for determining cross-linking distribution in a cornea and/or structural characteristics of a cornea
US11179576B2 (en) 2010-03-19 2021-11-23 Avedro, Inc. Systems and methods for applying and monitoring eye therapy
US11207410B2 (en) 2015-07-21 2021-12-28 Avedro, Inc. Systems and methods for treatments of an eye with a photosensitizer
US11642244B2 (en) 2019-08-06 2023-05-09 Avedro, Inc. Photoactivation systems and methods for corneal cross-linking treatments
US11766356B2 (en) 2018-03-08 2023-09-26 Avedro, Inc. Micro-devices for treatment of an eye
US12016794B2 (en) 2018-10-09 2024-06-25 Avedro, Inc. Photoactivation systems and methods for corneal cross-linking treatments

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6254044B1 (en) * 2000-01-18 2001-07-03 Lee Li-Hwa Tabletop tripod
US20020024633A1 (en) * 1999-04-09 2002-02-28 Daehoon Kim Pupil evaluation system
US20100183199A1 (en) * 2007-09-28 2010-07-22 Eye Controls, Llc Systems and methods for biometric identification
Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9498642B2 (en) 2009-10-21 2016-11-22 Avedro, Inc. Eye therapy system
US9707126B2 (en) 2009-10-21 2017-07-18 Avedro, Inc. Systems and methods for corneal cross-linking with pulsed light
US11179576B2 (en) 2010-03-19 2021-11-23 Avedro, Inc. Systems and methods for applying and monitoring eye therapy
US9044308B2 (en) 2011-05-24 2015-06-02 Avedro, Inc. Systems and methods for reshaping an eye feature
US10137239B2 (en) 2011-06-02 2018-11-27 Avedro, Inc. Systems and methods for monitoring time based photo active agent delivery or photo active marker presence
US9020580B2 (en) 2011-06-02 2015-04-28 Avedro, Inc. Systems and methods for monitoring time based photo active agent delivery or photo active marker presence
WO2014205145A1 (en) * 2013-06-18 2014-12-24 Avedro, Inc. Systems and methods for determining biomechanical properties of the eye for applying treatment
US9498114B2 (en) 2013-06-18 2016-11-22 Avedro, Inc. Systems and methods for determining biomechanical properties of the eye for applying treatment
US9498122B2 (en) 2013-06-18 2016-11-22 Avedro, Inc. Systems and methods for determining biomechanical properties of the eye for applying treatment
US11219553B2 (en) 2014-10-27 2022-01-11 Avedro, Inc. Systems and methods for cross-linking treatments of an eye
US10350111B2 (en) 2014-10-27 2019-07-16 Avedro, Inc. Systems and methods for cross-linking treatments of an eye
US10114205B2 (en) 2014-11-13 2018-10-30 Avedro, Inc. Multipass virtually imaged phased array etalon
US10079967B2 (en) * 2015-02-27 2018-09-18 Fujitsu Limited Iris authentication apparatus and electronic device
US20160253558A1 (en) * 2015-02-27 2016-09-01 Fujitsu Limited Iris authentication apparatus and electronic device
US10258809B2 (en) 2015-04-24 2019-04-16 Avedro, Inc. Systems and methods for photoactivating a photosensitizer applied to an eye
US11167149B2 (en) 2015-04-24 2021-11-09 Avedro, Inc. Systems and methods for photoactivating a photosensitizer applied to an eye
US10028657B2 (en) 2015-05-22 2018-07-24 Avedro, Inc. Systems and methods for monitoring cross-linking activity for corneal treatments
US11207410B2 (en) 2015-07-21 2021-12-28 Avedro, Inc. Systems and methods for treatments of an eye with a photosensitizer
US10631726B2 (en) 2017-01-11 2020-04-28 Avedro, Inc. Systems and methods for determining cross-linking distribution in a cornea and/or structural characteristics of a cornea
US11529050B2 (en) 2017-01-11 2022-12-20 Avedro, Inc. Systems and methods for determining cross-linking distribution in a cornea and/or structural characteristics of a cornea
US12004811B2 (en) 2017-01-11 2024-06-11 Avedro, Inc. Systems and methods for determining cross-linking distribution in a cornea and/or structural characteristics of a cornea
US11766356B2 (en) 2018-03-08 2023-09-26 Avedro, Inc. Micro-devices for treatment of an eye
US12016794B2 (en) 2018-10-09 2024-06-25 Avedro, Inc. Photoactivation systems and methods for corneal cross-linking treatments
US11642244B2 (en) 2019-08-06 2023-05-09 Avedro, Inc. Photoactivation systems and methods for corneal cross-linking treatments

Also Published As

Publication number Publication date
WO2012174453A3 (en) 2013-02-21

Similar Documents

Publication Publication Date Title
WO2012174453A2 (en) Systems and methods for binocular iris imaging
US11665333B2 (en) Systems and methods for calibrating image sensors in wearable apparatuses
EP2710516B1 (en) Systems and methods for identifying gaze tracking scene reference locations
CA2954286C (en) Imaging and peripheral enhancements for mobile devices
CN113729611B (en) Eye tracking using center position of eyeball
US6850631B1 (en) Photographing device, iris input device and iris image input method
CN108205374B (en) Eyeball tracking module and method of video glasses and video glasses
KR102669768B1 (en) Event camera system for pupil detection and eye tracking
US20140267668A1 (en) Portable fundus camera
FI125445B (en) Gaze control device
CN104956377A (en) Device for capturing person-specific data
CN213844155U (en) Biological characteristic acquisition and identification system and terminal equipment
EP2466896A2 (en) Integrated camera-projection device
JP3504177B2 (en) Photographing device and iris image input device
CN213844156U (en) Biological characteristic acquisition and identification system and terminal equipment
CN213844158U (en) Biological characteristic acquisition and identification system and terminal equipment
KR101492832B1 (en) Method for controlling display screen and display apparatus thereof
JP3848312B2 (en) Iris image input device
KR101433788B1 (en) Portable Iris Image Capture Device For Single-Eye
KR20240093875A (en) Event camera system for pupil detection and eye tracking

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12800207

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 01/04/2014)

122 Ep: pct application non-entry in european phase

Ref document number: 12800207

Country of ref document: EP

Kind code of ref document: A2