WO2018064563A1 - Éclairage de profil bas dans un capteur d'empreinte digitale optique - Google Patents

Éclairage de profil bas dans un capteur d'empreinte digitale optique

Info

Publication number
WO2018064563A1
WO2018064563A1 (PCT/US2017/054480)
Authority
WO
WIPO (PCT)
Prior art keywords
electrode layer
layer
top surface
cover glass
collimator filter
Prior art date
Application number
PCT/US2017/054480
Other languages
English (en)
Inventor
Patrick Smith
Pascale El Kallassi
Young Seen Lee
Original Assignee
Synaptics Incorporated
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US15/280,623 (US10181070B2)
Application filed by Synaptics Incorporated filed Critical Synaptics Incorporated
Publication of WO2018064563A1 publication Critical patent/WO2018064563A1/fr

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/13Sensors therefor

Definitions

  • This disclosure generally relates to optical sensors and, more particularly, to small size optical fingerprint sensors.
  • Biometric recognition systems are used for authenticating and/or verifying users of devices incorporating the recognition systems.
  • Biometric sensing technology provides a reliable, non-intrusive way to verify individual identity for recognition purposes.
  • Fingerprints, like various other biometric characteristics, are based on distinctive personal characteristics and thus are a reliable mechanism for recognizing an individual.
  • fingerprint sensors may be used to provide access control in stationary applications, such as security checkpoints.
  • Electronic fingerprint sensors may also be used to provide access control in mobile devices, such as cell phones, wearable smart devices (e.g., smart watches and activity trackers), tablet computers, personal data assistants (PDAs), navigation devices, and portable gaming devices. Accordingly, some applications, in particular applications related to mobile devices, may require recognition systems that are both small in size and highly reliable.
  • Fingerprint sensors are based on optical or capacitive sensing technologies. Unfortunately, conventional optical fingerprint sensors are too bulky to be packaged in mobile devices and other common consumer electronic devices, confining their use to door access control terminals and similar applications where sensor size is not a restriction. As a result, fingerprint sensors in most mobile devices are capacitive sensors having a sensing array configured to sense ridge and valley features of a fingerprint.
  • these fingerprint sensors either detect absolute capacitance (sometimes known as “self-capacitance”) or trans-capacitance (sometimes known as “mutual capacitance”). In either case, capacitance at each pixel in the array varies depending on whether a ridge or valley is present, and these variations are electrically detected to form an image of the fingerprint.
  • While capacitive fingerprint sensors provide certain advantages, most commercially available capacitive fingerprint sensors have difficulty sensing fine ridge and valley features through large distances, requiring the fingerprint to contact a sensing surface that is close to the sensing array. As a result, it remains a significant challenge for a capacitive sensor to detect fingerprints through thick layers, such as the thick cover glass (sometimes referred to herein as a "cover lens") that protects the display of many mobile devices.
  • a cutout is often formed in the cover glass in an area beside the display, and a discrete capacitive fingerprint sensor (often integrated with a mechanical button) is placed in the cutout area so that it can detect fingerprints without having to sense through the cover glass.
  • The need for a cutout makes it difficult to form a flush surface on the face of the device, detracting from the user experience.
  • the existence of mechanical buttons takes up valuable device real estate.
  • an optical fingerprint sensor comprising: a cover glass comprising a top surface and a bottom surface opposite to the top surface, wherein the top surface of the cover glass is configured to receive a finger; a transparent electrode layer positioned below the bottom surface of the cover glass; an organic light emitting diode (OLED) layer positioned below the transparent electrode layer, wherein the OLED layer is configured to emit an illumination light beam towards the top surface of the cover glass when the finger is positioned on the top surface of the cover glass, and wherein the OLED layer is configured to emit the illumination light beam through the transparent electrode layer and through the cover glass layer; a metal electrode layer positioned below the OLED layer, wherein the metal electrode layer comprises an opening configured to transmit a reflected light beam that is reflected from the top surface of the cover glass when the finger is positioned on the top surface of the cover glass; a collimator filter positioned below the metal electrode layer, wherein the collimator filter comprises a top surface and a bottom surface opposite to the top surface, wherein the
  • an optical fingerprint sensor comprising: a cover glass comprising a top surface and a bottom surface opposite to the top surface, wherein the top surface of the cover glass is configured to receive a finger; a transparent electrode layer positioned below the bottom surface of the cover glass; an organic light emitting diode (OLED) layer positioned below the transparent electrode layer, wherein the OLED layer is configured to emit illumination light towards the top surface of the cover glass when the finger is positioned over the top surface of the cover glass, wherein the OLED layer is configured to emit the illumination light through the transparent electrode layer and through the cover glass layer; a metal electrode layer positioned below the OLED layer; a plurality of openings extending through the transparent electrode layer, through the OLED layer, and through the metal electrode layer, wherein the plurality of openings are configured to transmit reflected light that is reflected from the top surface of the cover glass when the finger is positioned over the top surface of the cover glass; a collimator filter positioned below the metal electrode layer, wherein the collim
  • an optical fingerprint sensor comprising: a cover glass comprising a top surface and a bottom surface opposite to the top surface, wherein the top surface of the cover glass is configured to receive a finger; a transparent electrode layer positioned below the bottom surface of the cover glass; an organic light emitting diode (OLED) layer positioned below the transparent electrode layer, wherein the OLED layer is configured to emit illumination light towards the top surface of the cover glass when the finger is positioned over the top surface of the cover glass, wherein the OLED layer is configured to emit the illumination light through the transparent electrode layer and through the cover glass layer; a metal electrode layer positioned below the OLED layer; a plurality of openings extending through the transparent electrode layer, through the OLED layer, and through the metal electrode layer, wherein the plurality of openings are configured to transmit reflected light that is reflected from the top surface of the cover glass when the finger is positioned over the top surface of the cover glass; a collimator filter or pinhole filter positioned below the metal electrode layer,
  • FIG. 1 is a block diagram of an example of a device that includes an optical sensor and a processing system according to an embodiment of the disclosure.
  • FIG. 2 illustrates an example of a mobile device that includes an optical sensor according to an embodiment of the disclosure.
  • FIG. 3 illustrates an example of an optical sensor with a collimator filter layer according to an embodiment of the disclosure.
  • FIG. 4 illustrates an example of light interacting with an optical sensor having a collimator filter layer according to an embodiment of the disclosure.
  • FIG. 5 illustrates an alternative embodiment of a collimator filter layer according to an embodiment of the disclosure.
  • FIG. 6 illustrates an example of an optical sensor with a collimator filter layer and an OLED illumination layer according to an embodiment of the disclosure.
  • FIG. 7 illustrates an example of an OLED illumination layer and a collimator filter structure according to an embodiment of the disclosure.
  • FIG. 8 illustrates a method of imaging an input object according to an embodiment of the disclosure.
  • an optical sensor includes a collimator filter layer which operates as a light conditioning layer, interposed between a light illumination layer and an image sensor array.
  • Transmitted light from the illumination layer reflects from an input object in a sensing region above a cover layer.
  • the reflected light is filtered by the collimator filter layer such that only certain of the reflected light beams reach optical sensing elements in the image sensor array.
  • the collimator filter layer of the present disclosure prevents blurring while allowing for a lower-profile image sensor, such as a fingerprint sensor, than is possible with purely lens-based or pinhole camera based imaging sensors.
  • the image sensor can be made thin for use in mobile devices such as cell phones. Placing individual collimator apertures over each optical sensing element, or group of elements, provides better sensitivity than purely pinhole based imagers by transmitting more light to the optical sensing elements.
  • the present disclosure describes the use of the collimator filter layer to enable optical sensing through a large range of thicknesses of cover layers.
  • FIG. 1 is a block diagram of an example of an electronic device 100 that includes an optical sensor device 102 and a processing system 104, according to an embodiment of the disclosure.
  • the processing system 104 includes a processor(s) 106, a memory 108, a template storage 110, an operating system (OS) 112, and a power source(s) 114.
  • Each of the processor(s) 106, the memory 108, the template storage 110, and the operating system 112 are interconnected physically, communicatively, and/or operatively for inter-component communications.
  • the power source 114 is interconnected to the various system components to provide electrical power as necessary.
  • processor(s) 106 are configured to implement functionality and/or process instructions for execution within electronic device 100 and the processing system 104.
  • processor 106 executes instructions stored in memory 108 or instructions stored on template storage 110 to identify a biometric object or determine whether a biometric authentication attempt is successful or unsuccessful.
  • Memory 108 which may be a non-transitory, computer-readable storage medium, is configured to store information within electronic device 100 during operation.
  • memory 108 includes a temporary memory, an area for information not to be maintained when the electronic device 100 is turned off. Examples of such temporary memory include volatile memories such as random access memories (RAM), dynamic random access memories (DRAM), and static random access memories (SRAM).
  • Template storage 110 comprises one or more non-transitory computer-readable storage media.
  • the template storage 110 is generally configured to store enrollment views for fingerprint images for a user's fingerprint or other enrollment information. More generally, the template storage 110 may be used to store information about an object. The template storage 110 may further be configured for long-term storage of information.
  • the template storage 110 includes nonvolatile storage elements. Non-limiting examples of non-volatile storage elements include magnetic hard discs, solid-state drives (SSD), optical discs, floppy discs, flash memories, or forms of electrically programmable memories (EPROM) or electrically erasable and programmable (EEPROM) memories, among others.
  • the processing system 104 also hosts an operating system (OS) 112.
  • the operating system 112 controls operations of the components of the processing system 104.
  • the operating system 112 facilitates the interaction of the processor(s) 106, memory 108, and template storage 110.
  • the processor(s) 106 implement hardware and/or software to obtain data describing an image of an input object.
  • the processor(s) 106 may also align two images and compare the aligned images to one another to determine whether there is a match.
  • the processor(s) 106 may also operate to reconstruct a larger image from a series of smaller partial images or sub-images, such as fingerprint images when multiple partial fingerprint images are collected during a biometric process, such as an enrollment or matching process for verification or identification.
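  • As a loose illustration of the "align two images and compare" processing described above (not the matcher actually claimed here), the sketch below brute-force searches small translational offsets and scores each candidate with a normalized correlation; the array sizes, shift range, and threshold are assumed example values.

```python
import numpy as np

def normalized_correlation(a: np.ndarray, b: np.ndarray) -> float:
    """Pearson-style correlation between two equally sized images."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float((a * b).mean())

def align_and_compare(probe: np.ndarray, enrolled: np.ndarray, max_shift: int = 8) -> float:
    """Try small x/y shifts of the probe image and return the best
    correlation score against the enrolled view."""
    best = -1.0
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(probe, dy, axis=0), dx, axis=1)
            best = max(best, normalized_correlation(shifted, enrolled))
    return best

enrolled = np.random.rand(64, 64)            # stand-in for a stored enrollment view
probe = np.roll(enrolled, 3, axis=1)         # same view, shifted a few pixels
print(align_and_compare(probe, enrolled) > 0.95)   # True: alignment recovers the match
# A score above some enrollment-dependent threshold would be treated as a match.
```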
  • the processing system 104 includes one or more power sources 114 to provide power to the electronic device 100.
  • Examples of power source 114 include single-use power sources, rechargeable power sources, and/or power sources developed from nickel-cadmium, lithium-ion, or other suitable material, as well as power cords and/or adapters which are in turn connected to electrical power.
  • Optical sensor device 102 can be implemented as a physical part of the electronic device 100, or can be physically separate from the electronic device 100. As appropriate, the optical sensor device 102 may communicate with parts of the electronic device 100 using any one or more of the following: buses, networks, and other wired or wireless interconnections.
  • optical sensor device 102 is implemented as a fingerprint sensor to capture a fingerprint image of a user.
  • the optical sensor device 102 uses optical sensing for the purpose of object imaging including imaging biometrics such as fingerprints.
  • the optical sensor device 102 can be incorporated as part of a display, for example, or may be a discrete sensor.
  • Some non-limiting examples of electronic devices 100 include personal computers of all sizes and shapes, such as desktop computers, laptop computers, netbook computers, tablets, web browsers, e-book readers, and personal digital assistants (PDAs). Additional example electronic devices 100 include composite input devices, such as physical keyboards and separate joysticks or key switches. Further example electronic devices 100 include peripherals such as data input devices (including remote controls and mice) and data output devices (including display screens and printers). Other examples include remote terminals, kiosks, video game machines (e.g., video game consoles, portable gaming devices, and the like), communication devices (including cellular phones, such as smart phones), and media devices (including recorders, editors, and players such as televisions, set-top boxes, music players, digital photo frames, and digital cameras).
  • the optical sensor device 102 may provide illumination to the sensing region. Reflections from the sensing region in the illumination wavelength(s) are detected to determine input information corresponding to the input object.
  • the optical sensor device 102 may utilize principles of direct illumination of the input object, which may or may not be in contact with a sensing surface of the sensing region depending on the configuration.
  • One or more light sources and/or light guiding structures may be used to direct light to the sensing region. When an input object is present, this light is reflected from surfaces of the input object, which reflections can be detected by the optical sensing elements and used to determine information about the input object.
  • the optical sensor device 102 may also utilize principles of internal reflection, which includes total internal reflection (TIR) and non-TIR internal reflection (collectively Fresnel reflections), to detect input objects in contact with a sensing surface.
  • One or more light sources may be used to direct light in a light guiding element at an angle at which it is internally reflected at the sensing surface of the sensing region, due to different refractive indices at opposing sides of the boundary defined by the sensing surface.
  • Contact of the sensing surface by the input object causes the refractive index to change across this boundary, which alters the internal reflection characteristics at the sensing surface, causing light reflected from the input object to be weaker at portions where it is in contact with the sensing surface.
  • the light may be directed to the sensing surface at an angle of incidence at which it is totally internally reflected, except where the input object is in contact with the sensing surface and causes the light to partially transmit across this interface.
  • This phenomenon is sometimes referred to as frustrated total internal reflection (FTIR).
  • An example of this is the presence of a finger introduced to an input surface defined by a glass-to-air interface.
  • The higher refractive index of human skin compared to air causes light incident at the sensing surface at the critical angle of the interface to air to be partially transmitted through the finger, where it would otherwise be totally internally reflected at the glass-to-air interface.
  • This optical response can be detected by the system and used to determine spatial information. In some embodiments, this can be used to image small scale fingerprint features, where the internal reflectivity of the incident light differs depending on whether a ridge or valley is in contact with that portion of the sensing surface.
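  • As a rough numeric companion to the internal-reflection discussion above, the sketch below computes the critical angles at a glass-to-air and a glass-to-skin boundary from Snell's law; the refractive indices (glass ≈ 1.5, air ≈ 1.0, skin ≈ 1.4) and the 50° test ray are assumed typical values, not figures from this disclosure.

```python
import math

def critical_angle_deg(n_dense: float, n_rare: float) -> float:
    """Critical angle (degrees) for light travelling from the denser medium
    toward the rarer medium; beyond it, light is totally internally reflected."""
    return math.degrees(math.asin(n_rare / n_dense))

n_glass, n_air, n_skin = 1.5, 1.0, 1.4          # assumed typical refractive indices

theta_c_air = critical_angle_deg(n_glass, n_air)    # ~41.8 degrees
theta_c_skin = critical_angle_deg(n_glass, n_skin)  # ~69.0 degrees

# A ray hitting the cover surface at 50 degrees is totally internally reflected
# where air (a valley) is above, but partially transmitted where skin (a ridge)
# is in contact -- the contrast the optical sensor images.
theta = 50.0
print(theta > theta_c_air)    # True  -> TIR at valleys
print(theta > theta_c_skin)   # False -> transmission at ridges
```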
  • FIG. 2 illustrates an example of an electronic device 116, such as a mobile phone, which includes cover glass 118 over a display 120.
  • the disclosed method and system may be implemented by using the display 120 as the optical sensor to image an input object.
  • a separate discrete component 122 provides the optical sensing capabilities.
  • a discrete sensor may provide more flexibility in designing the optical components of the sensor for optimum illumination and/or signal conditioning than when attempting to integrate the optical sensor components on a display substrate, such as a TFT (thin-film transistor) backplane.
  • FIG. 3 illustrates an example of a stack-up for an optical image sensor device 200 used to image an object 216, such as a fingerprint.
  • the sensor device 200 includes an image sensor array 202, a collimator filter layer or light conditioning layer 204 disposed above the image sensor array 202, an illumination layer 207 disposed above the collimator filter layer 204, a light source 208, and a cover layer 210.
  • a blocking layer 214 may also be provided.
  • the cover layer 210 protects the inner components of the sensor device 200, such as the image sensor array 202.
  • the cover layer 210 may include a cover glass or cover lens that protects inner components of a display in addition to the sensor device 200.
  • a sensing region for the input object is defined above the cover layer 210.
  • a top surface 218 of the cover layer 210 may form a sensing surface, which provides a contact area for the input object 216 (e.g., fingerprint).
  • the cover layer 210 is made of any material such as glass, transparent polymeric materials and the like.
  • the input object 216 is any object to be imaged.
  • the input object 216 has various features.
  • the input object 216 has ridges and valleys. Due to their protruding nature, the ridges contact the sensing surface 218 of the cover 210 layer. In contrast, the valleys do not contact the sensing surface 218 and instead form an air gap between the input object 216 and the sensing surface 218.
  • the input object 216 may have other features, such as stain, ink, and the like that do not create significant structural differences in portions of the input object 216, but which affect its optical properties.
  • the methods and systems disclosed herein are suitable for imaging such structural and nonstructural features of the input object 216.
  • the illumination layer 207 includes a light source 208 and/or a light guiding element 206 that directs illumination to the sensing region in order to image the input object 216.
  • the light source 208 transmits beams or rays of light 212 into the light guiding element 206 and the transmitted light propagates through the light guiding element 206.
  • the light guiding element 206 may utilize total internal reflection, or may include reflecting surfaces that extract light up towards the sensing region. Some of the light in the illumination layer 207 may become incident at the sensing surface 218 in an area that is in contact with the input object 216. The incident light is in turn reflected back towards the collimator filter layer 204.
  • the light source 208 is disposed adjacent to the light guiding element 206.
  • the light source 208 may be positioned anywhere within the sensor 200 provided that emitted light reaches the light guiding element 206.
  • the light source 208 may be disposed below the image sensor array 202.
  • a separate light guiding element 206 is not required.
  • the light transmitted from the light source 208 can be transmitted directly into the cover layer 210 in which case the cover layer 210 also serves as the light guiding element.
  • the light transmitted from the light source 208 can be transmitted directly to the sensing region, in which case the light source 208 itself serves as the illumination layer.
  • a discrete light source is also not required.
  • the method and system contemplate using the light provided by a display or the backlighting from an LCD as suitable light sources.
  • the light provided by the illumination layer 207 to image the object 216 may be in the near infrared (NIR) or the visible spectrum.
  • the light can have a narrow band of wavelengths, a broad band of wavelengths, or operate in several bands.
  • the image sensor array 202 detects light passing through the collimator filter layer 204.
  • Suitable sensor arrays include complementary metal oxide semiconductor (CMOS) sensor arrays, charge coupled device (CCD) sensor arrays, and thin film sensor arrays.
  • the sensor array 202 includes a plurality of individual optical sensing elements capable of detecting the intensity of incident light.
  • the collimator filter layer 204 is provided with an array of apertures, or collimator holes, 220 with each aperture being directly above one or more optical sensing elements on the image sensor array 202.
  • the apertures 220 are formed using any suitable technique, such as laser drilling, etching and the like.
  • the collimator filter layer 204 only allows light rays reflected from the input object 216 (e.g., finger) at normal or near-normal incidence to the collimator filter layer 204 to pass and reach the optical sensing elements of the image sensor array 202.
  • the collimator filter layer 204 is an opaque layer with an array of holes 220.
  • the collimator filter layer 204 is laminated, stacked, or built directly above the image sensor array 202.
  • the collimator filter layer 204 may be made of plastics such as polycarbonate, PET, polyimide, carbon black, inorganic insulating or metallic materials, silicon, or SU-8.
  • the collimator filter layer 204 is monolithic.
  • A blocking layer 214 is optionally provided as part of the optical sensor 200.
  • the blocking layer 214 is a semitransparent or opaque layer that may be disposed above the collimator filter layer 204.
  • the blocking layer 214 may be disposed between the cover layer 210 and the illumination layer 207, as shown in FIG. 3.
  • the blocking layer 214 may be disposed between the illumination layer 207 and the collimator filter layer 204.
  • the blocking layer 214 obscures components of the sensor device 200, such as the apertures in the collimator filter layer 204, from ambient light illumination, while still allowing the sensor device 200 to operate.
  • the blocking layer 214 may include a number of different materials or sub-layers.
  • a thin metal or electron conducting layer may be used where the layer thickness is less than the skin depth of light penetration in the visible spectrum.
  • the blocking layer 214 may include a dye and/or pigment or several dyes and/or pigments that absorb light, for example, in the visible spectrum.
  • the blocking layer 214 may include several sub-layers or nano-sized features designed to cause interference with certain wavelengths, such as visible light for example, so as to selectively absorb or reflect different wavelengths of light.
  • the light absorption profile of the blocking layer 214 may be formulated to give a particular appearance of color, texture, or reflective quality thereby allowing for particular aesthetic matching or contrasting with the device into which the optical sensor device 200 is integrated. If visible illumination wavelengths are used, a semitransparent layer may be used to allow sufficient light to pass through the blocking layer to the sensing region, while still sufficiently obscuring components below.
  • FIG. 4 illustrates a closer view of the collimator filter layer 204 disposed between the illumination layer 207 and the image sensor array 202, and the interaction of light within the sensor device 200. Portions 226 of the cover layer 210 are in contact with ridges of the input object 216 and portion 228 of the cover layer 210 is in contact with air due to the presence of a valley of object 216.
  • Image sensor array 202 includes optical sensing elements 230, 232, 234 and 236 disposed below apertures or holes 220 of the collimator filter layer 204.
  • Also shown in FIG. 4 are a series of light rays reflected at the cover layer 210.
  • light rays 238 reflect from the cover layer 210 at portions occupied by ridges or valleys of the object 216. Because the light rays 238 are above collimator apertures 220 and are relatively near normal, the light rays 238 pass through the apertures 220 in the collimator filter layer 204 and become incident on optical sensing elements 232 and 236, for example. The optical sensing elements can then be used to measure the intensity of light and convert the measured intensity into image data of the input object 216.
  • Light beams 240 and 242, which have a larger angle from normal, strike the collimator filter layer 204, either on its top surface or at a surface within an aperture (e.g., an aperture sidewall), and are blocked and prevented from reaching optical sensing elements in the image sensor array 202.
  • A metric of the collimator filter layer 204 is the aspect ratio of the apertures or holes 220.
  • The aspect ratio is the height ("h") 244 of the holes in the collimator filter layer 204 divided by the hole diameter ("d") 246.
  • the aspect ratio should be sufficiently large to prevent "stray" light from reaching the optical sensing elements directly under each collimator hole.
  • An example of stray light is light ray 242 reflected from portion 228 of the cover layer 210 (e.g., a valley), which would reach sensing elements underneath a ridge in the absence of the collimator filter layer. Larger aspect ratios restrict the light acceptance cone to smaller angles, improving the optical resolution of the system.
  • The minimum aspect ratio can be estimated as the distance from the collimator filter layer 204 to the object being imaged (e.g., the finger) divided by the desired optical resolution at the finger.
  • the collimator apertures 220 are cylindrical or conical in shape.
  • the sidewalls of the collimator apertures 220 may include grooves or other structures to prevent stray light from reflecting off the walls and reaching the optical sensing elements.
  • the effective aspect ratio is determined by the average hole diameter along the height of the collimator holes.
  • Suitable aspect ratios are in the range of about 3:1 to 100:1, and more typically in the range of about 5:1 to 20:1.
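  • A minimal numeric sketch of the aspect-ratio sizing rule above; the cover-stack distance, target resolution, film thickness, and hole diameter are assumed example values chosen to land inside the quoted 5:1 to 20:1 range.

```python
def aspect_ratio(hole_height_um: float, hole_diameter_um: float) -> float:
    """Aspect ratio of a collimator aperture (height / diameter)."""
    return hole_height_um / hole_diameter_um

def min_aspect_ratio(finger_distance_um: float, optical_resolution_um: float) -> float:
    """Rough minimum aspect ratio: distance from the collimator filter to the
    object being imaged divided by the desired optical resolution."""
    return finger_distance_um / optical_resolution_um

# Assumed example: finger ~500 um above the collimator (cover glass plus
# illumination layers), 50 um desired resolution at the finger.
required = min_aspect_ratio(500.0, 50.0)       # 10.0 -> inside the 5:1 to 20:1 range
chosen = aspect_ratio(hole_height_um=300.0,    # assumed collimator film thickness
                      hole_diameter_um=25.0)   # assumed hole diameter
print(required, chosen, chosen >= required)    # 10.0 12.0 True
```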
  • It is generally desirable to make the height 244 of the collimator apertures 220 as small as possible to provide the most flexibility for fabricating the collimator filter layer 204 and integrating it with the underlying image sensor array 202, such as a CMOS or CCD image sensor.
  • a small aperture diameter 246 may be used to maintain the desired collimator aspect ratio. However, if the aperture is made too small (less than a few times the wavelength of light being used), diffraction effects can contribute to additional blurring as the light rays exiting the collimator apertures 220 diverge.
  • Such diffraction effects can be mitigated by placing the collimator filter layer 204 as close to the image sensor array 202 as possible, ideally much closer than the Fraunhofer far-field distance (i.e., r²/λ, where r is the aperture radius and λ is the light wavelength).
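  • The far-field criterion can be sanity-checked with a few lines of arithmetic; the aperture size, wavelength, and collimator-to-imager gap below are assumed values used only for illustration.

```python
def fraunhofer_distance_um(aperture_radius_um: float, wavelength_um: float) -> float:
    """Fraunhofer far-field distance r^2 / lambda for a small aperture."""
    return aperture_radius_um ** 2 / wavelength_um

r = 12.5      # assumed: 25 um hole diameter -> 12.5 um radius
wl = 0.85     # assumed: 850 nm near-infrared illumination
gap = 20.0    # assumed collimator-to-imager gap in um

d_far = fraunhofer_distance_um(r, wl)   # ~183.8 um
# Keeping the gap well below the far-field distance limits diffraction blur.
print(gap < d_far)                      # True
```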
  • It is also generally desirable to minimize the distance between the collimator filter layer 204 and the image sensor array 202 to allow the light reaching the optical sensing elements of the image sensor array 202 to be as concentrated as possible. In addition, if this sensor array 202 to collimator filter layer 204 distance is too large, stray light from adjacent holes may reach a particular optical sensing element, contributing to image blurring. If the image sensor array 202 is a CCD or CMOS image sensor, where the optical sensing element pitch (distance between elements) may be smaller than the collimator hole pitch (distance between holes), the light passing through a single collimator aperture 220 may illuminate more than one optical sensing element.
  • Such an arrangement is shown by optical sensing elements 234 and 236 in FIG. 4.
  • the processing system may combine the light intensity recorded by all the optical sensing elements corresponding to a given collimator aperture.
  • the resulting fingerprint image after processing raw data from the image sensor array 202 may have a resolution corresponding to the array of collimator apertures. It will be noted that the arrangement of apertures 220 in the collimator filter layer 204 may result in some optical sensing elements in the sensor array 202 going unused.
  • An example of unused optical sensing elements is sensing elements 240. Because optical sensing elements 240 are not underneath a collimator hole, reflected rays will be blocked before reaching them. Image processing may remove the unused sensor elements and scale the image appropriately before the data is used in image reconstruction or image matching, for example.
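  • The per-aperture combining and unused-element removal described above can be sketched as a masked block average; this is a simplified stand-in with a hypothetical pixel count and hole mask, not the claimed processing.

```python
import numpy as np

def bin_to_aperture_grid(raw: np.ndarray, under_hole: np.ndarray, k: int) -> np.ndarray:
    """Average, within each k x k pixel block, only the pixels that lie under a
    collimator hole (under_hole is a boolean mask); pixels outside any hole are
    ignored, so the output resolution matches the collimator aperture array."""
    h, w = (raw.shape[0] // k) * k, (raw.shape[1] // k) * k   # drop edge pixels not under a full block
    r = raw[:h, :w].reshape(h // k, k, w // k, k)
    m = under_hole[:h, :w].reshape(h // k, k, w // k, k)
    sums = (r * m).sum(axis=(1, 3))
    counts = m.sum(axis=(1, 3))
    return sums / np.maximum(counts, 1)        # avoid divide-by-zero for empty blocks

frame = np.random.rand(200, 200)               # stand-in for a raw sensor frame
mask = np.zeros((200, 200), dtype=bool)
mask[::4, ::4] = True                          # assumed: one lit pixel region per 4x4 block
image = bin_to_aperture_grid(frame, mask, k=4)
print(image.shape)                             # (50, 50): one sample per collimator aperture
```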
  • The imaging resolution (in dpi) of the optical sensor 200 is defined by the resolution of the apertures 220 in the collimator filter layer 204, whereas the pitch is the distance between adjacent apertures.
  • each aperture 220 in the collimator filter layer 204 corresponds to a sample of a feature of the object 216 being imaged, such as a sample from a ridge or valley within a fingerprint.
  • the sampling density (which is equal to the aperture density) should be large enough such that multiple samples are taken of each feature of interest.
  • The pitch may be on the order of 50 to 100 microns, since the pitch of the ridges themselves is on the order of 150 to 250 microns. If it is desired to capture more granular features, such as pores in a fingerprint, a smaller pitch such as 25 microns would be appropriate. Conversely, a larger pitch can be used to capture larger features of the input object.
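  • The pitch figures above translate directly into sampling rates; the helper below converts an aperture pitch to dots per inch and counts samples per ridge period, using the approximate feature sizes quoted in the text.

```python
def pitch_to_dpi(pitch_um: float) -> float:
    """Convert aperture pitch in microns to samples per inch (25.4 mm per inch)."""
    return 25400.0 / pitch_um

def samples_per_feature(feature_period_um: float, pitch_um: float) -> float:
    """How many samples land across one feature period at a given pitch."""
    return feature_period_um / pitch_um

print(round(pitch_to_dpi(50)))    # ~508 dpi with a 50 um aperture pitch
print(round(pitch_to_dpi(25)))    # ~1016 dpi with a 25 um pitch (pore-level detail)
# A 50 um pitch gives 3 to 5 samples across a 150-250 um ridge period,
# i.e. several samples per feature of interest.
print(samples_per_feature(150, 50), samples_per_feature(250, 50))   # 3.0 5.0
```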
  • The optical sensor 200 performs similarly over a wide range of distances between the collimator filter layer 204 and the sensing surface 218, because the filtering of reflected light is generally thickness-independent, as long as the aspect ratio of the holes in the collimator filter layer 204 is chosen to support the desired optical resolution.
  • FIG. 5 shows an alternative embodiment of the collimator filter layer 204.
  • the collimator filter layer 204 is made of light-absorbing materials and includes an array of apertures 220.
  • the top surface of the collimator filter layer 204 further includes a reflecting layer 250.
  • the reflecting layer 250 allows light beams that would normally be absorbed by the collimator filter layer 204 to be reflected back upwards towards the sensing region. Redirecting the light back to the sensing region allows the reflected light to be recycled so that some of the recycled light can be reflected off the input object to be imaged and transmitted through the collimator filter layer apertures.
  • the reflecting layer 250 minimizes light loss by reflecting the stray light back to the input object 216 without requiring a high level of illumination in the overall sensor package.
  • the top of the light-absorbing collimator filter layer body may be roughened using various texturizing techniques, including, but not limited to, sandblasting, coating with fillers, UV embossing, or dry etching. This roughened top may then be covered with a thin layer of metal, which creates a surface that is multifaceted in a randomized fashion.
  • the reflecting layer 250 may be made of any suitable material that will reflect light such as aluminum, chromium, and silver to name a few examples.
  • the method and system disclosed contemplate various ways to include the collimator filter layer 204 into the overall structure of the optical sensor device 200.
  • the collimator filter layer 204 may be a pre-patterned structure that is laminated or stacked onto the image sensor array 202, as generally depicted in FIGs. 3-4.
  • Alternative embodiments are contemplated by the present disclosure.
  • one alternative embodiment is to pattern or create the collimator filter layer 204 directly onto a CMOS image sensor die or wafer, as generally depicted in FIG. 5.
  • a wafer-level collimator layer may be formed by micro-fabrication.
  • Instead of placing a separate collimator filter layer 204 on top of the image sensor array 202, back-end processes may be added to the CMOS image sensor array fabrication. With this technique, no separate manufacturing of the collimator filter layer is required.
  • A liquid-type polymer resin with light-absorbing dyes, such as carbon black, may be coated first and then cured to form the collimator filter layer body.
  • Metal may optionally be sputtered onto the top of the cured resin to act as a reflective layer.
  • The aperture pattern may be made through photolithography, followed by etching of the metal and the underlying polymer layer to create the apertures. As a final step, the metal layer can be roughened to create a reflecting/diffusing layer.
  • the collimator filter layer 204 is replaced or supplemented with an optical interference filter that blocks "stray" light at angles of incidence that are relatively far from normal to the imaging plane.
  • Multilayer optical filters can be used that transmit only light at near-normal incidence, in much the same way that such a filter can be constructed to transmit only light at specific wavelengths.
  • Such an angle-specific filter may be designed to work for specific light wavelengths.
  • Such an interference filter may be used to reject the stray light coming from adjacent ridges and valleys.
  • the collimator filter layer 204 may also be a transparent glass collimator filter with round openings on top and bottom.
  • This type of collimator filter layer is made using a double-sided alignment technique to create top and bottom openings that are aligned, but without physically hollow holes through the glass body.
  • The top surface of the collimator filter layer can be textured to act as a diffuser for the entering light, while the bottom surface can be metallic to recycle light by reflecting it back into the transparent glass body.
  • This method makes lamination simpler since there are no physically hollow apertures. With this glass collimator filter layer, the cover glass, light guide film, and glass filter can be laminated with readily available lamination equipment.
  • an opaque glass collimator filter with drilled apertures can be used. This is similar to the previously described collimator filter film.
  • the manufacturing method may be the same, except for the fact that the body is glass.
  • the aperture density is determined based on the required dpi.
  • FIGs. 6-7 depict ways to illuminate the underside of a finger (or other object) placed on the cover glass of a mobile electronics device so that an optical fingerprint imager integrated with the light source can acquire an image of the fingerprint. Integrating an OLED light source above an image sensor provides both the illumination and pinhole or collimator array useful for a low profile lens-less fingerprint imaging device.
  • FIG. 6 depicts an example of an optical fingerprint sensor 600 with an OLED illumination layer in cross section view.
  • the optical fingerprint sensor 600 is configured to image a fingerprint of a finger 216 provided on or over a top surface 218 of a cover glass 210.
  • the illumination layer includes an OLED stack 607a-c positioned below a bottom surface 619 of the cover glass 210, opposite to the top surface 218 that provides an input surface for the finger 216.
  • the OLED stack 607a-c includes a transparent electrode layer 607a made of indium tin oxide (ITO) positioned below the bottom surface 619 of the cover glass layer 210.
  • While ITO is an ideal choice of conductive material for the upper transparent electrode layer 607a due to its conductive properties, transparency, and manufacturability, it will be understood that any other suitable transparent conductor may be used instead of ITO, provided that the chosen transparent conductor allows the illumination light 212 to pass through to the sensing region above.
  • An organic light emitting diode (OLED) layer 607b is positioned below the transparent electrode layer 607a.
  • the OLED layer 607b includes an emissive organic layer that emits illumination light 212 of the desired wavelength(s) towards the top surface 218 of the cover glass 210 during fingerprint sensing.
  • a metal electrode layer 607c is provided below the OLED layer 607b.
  • The metal electrode layer 607c may be made of any suitable metal conductor and does not need to be transparent.
  • a collimator filter layer 204 is provided below the metal electrode layer 607c.
  • the collimator filter layer 204 includes a plurality of collimator holes or apertures 220 configured to transmit reflected light 238 that is reflected from the top surface of the cover glass when the finger 216 is positioned on the top surface.
  • The collimator structure 204 and corresponding apertures 220 can be configured as described above. In particular, similar to as shown and described with respect to FIG. 4 above, the collimator holes 220 transmit reflected light 238 that is within an acceptance angle from normal incidence to the collimator filter 204 (e.g., relative to a plane defined by the top surface of the collimator filter).
  • As shown in FIG. 6, the normal of the collimator filter 204 may coincide with the normal of the top surface 218 of the cover glass 210.
  • Interior surfaces within the collimator apertures (e.g., sidewalls and/or other intermediate surfaces) block stray light that falls outside of the acceptance angle with respect to normal incidence, thereby limiting the light detected by the imager 202 and imager pixels 230 to within an acceptance angle determined by the collimator structure 204.
  • a gap is shown between a bottom surface of the collimator filter structure 204 and a top surface of the imager 202 and imager pixels 230; however, it is understood that this gap may be omitted and the collimator filter structure 204 may be positioned directly on top of the imager 202 (e.g., through wafer level fabrication as described above with respect to FIG. 5).
  • the transparent electrode layer 607a, OLED layer 607b, and metal electrode layer 607c may each include openings in areas corresponding to the collimator holes 220 to allow reflected light 238 from the finger 216 to pass through and be detected by the imager pixels 230 below.
  • FIG. 6 also shows light paths for the illumination light beams 212 and the reflected light beams 238 during operation of the sensor 600 for fingerprint imaging.
  • a voltage is applied across the OLED layer 607b by the electrode layers 607a, 607c, causing the OLED layer 607b to emit an illumination light beam 212 through the upper transparent electrode layer 607a and through the transparent cover glass layer 210, towards the finger 216 that is positioned on the upper surface 218 of cover glass.
  • the illumination light beam 212 is then reflected from the finger 216 (at the top surface of the cover glass), and this reflected light beam 238 is transmitted back through the cover glass layer 210, after which it is transmitted through the openings in the illumination layers 607a-c.
  • Because the metal electrode layer 607c is provided below the OLED layer 607b that emits the illumination light 212 upwards towards the finger 216, the metal electrode need not be transparent, provided that it includes openings that allow the reflected light 238 to pass through to the collimator holes 220 below.
  • the reflected light beam 238 is transmitted through a collimator aperture 220 if it is within the acceptance angle determined by the aperture geometry, whereas an intermediate surface in the collimator aperture 220 (e.g., a sidewall or other surface between the top surface and the bottom surface of the collimator filter) blocks a stray light beam that is outside of an acceptance angle from normal (see, e.g., FIG. 4).
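  • A simplified geometric model of this acceptance test is sketched below; the tan(θ) ≤ d/h rule and the 300 µm by 25 µm hole geometry are assumptions used for illustration, not the claimed aperture design.

```python
import math

def passes_aperture(angle_from_normal_deg: float,
                    hole_height_um: float,
                    hole_diameter_um: float) -> bool:
    """A ray entering at one edge of the hole just exits the opposite edge when
    tan(angle) == d / h, so treat atan(d / h) as the acceptance half-angle."""
    acceptance_deg = math.degrees(math.atan(hole_diameter_um / hole_height_um))
    return abs(angle_from_normal_deg) <= acceptance_deg

# Assumed geometry: 300 um tall holes, 25 um diameter -> ~4.8 degree half-angle.
print(passes_aperture(2.0, 300.0, 25.0))    # True: near-normal ray reaches the pixel
print(passes_aperture(15.0, 300.0, 25.0))   # False: stray ray is blocked by the sidewall
```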
  • the reflected light beam 238 is detected by a pixel 230 of the imager 202.
  • a plurality of collimator apertures 220 and a plurality of openings in OLED stack 607a-c can be used to capture multiple spots on the finger 216.
  • FIG. 6 depicts an example of spacing between collimator holes 220 relative to a spacing between fingerprint ridges on a finger 216. As can be seen in FIG. 6, which shows only one fingerprint ridge and one fingerprint valley on the finger 216 but multiple collimator apertures 220, the spacing between collimator apertures 220 can be smaller than a spacing between fingerprint ridges in order to sufficiently sample the fingerprint. It will be appreciated that the exact dimensions can vary depending on the implementation.
  • FIG. 7 depicts an example of an OLED illumination layer and collimator structure in top view (plan view).
  • the sensor has a collimator filter layer 204 that includes collimator holes 220 to allow light to transmit through to an imager (not shown).
  • the areas corresponding to the collimator holes 220 are free of the OLED light emitting layer 607b to allow light reflected from the finger to pass through the OLED layer 607b before passing through the collimator holes 220.
  • both the openings in the OLED layer 607b and the openings in the collimator filter layer 204 are arranged in a regular array and aligned to each other, although other patterns are possible.
  • the OLED illumination layer may be deposited on top of the pinhole or collimator substrate (Collimator / Substrate 204 in FIG. 6). This can be done after the holes are formed in the substrate as long as the patterning for all of the OLED stack layers is done by shadow masking or printing rather than subtractive wet chemical processing.
  • the entire pinhole/collimator substrate could be covered by a single OLED device, or the OLED could be separated into individual pixels that are driven/addressed passively or with direct connections.
  • the OLED stack could be made on a separate piece of glass or plastic (or any other transparent material), or even on the underside of the cover glass in the case of a mobile electronics device such as a cell phone or tablet.
  • the OLED device fabricated on a separate substrate could still have the same hole pattern as the light conditioning layer, and these holes could be aligned to the holes in the pinhole/collimator substrate during assembly so that light reflected from the finger can travel through the apertures to the underlying imager.
  • the collimator structures could then be deposited on top of the OLED stack (while leaving the same holes open), so that when the transparent substrate is inverted, the collimator structures would also be integrated on the OLED substrate, removing the need for a separate pinhole/collimator structure that must be aligned to the OLED structures.
  • the "Collimator / Substrate" structures depicted in FIG. 6 could be deposited after the OLED stack is deposited on the glass (or "Cover glass" 210 in FIG. 6).
  • The light provided by the illumination layer to sense the finger may be in the near infrared (NIR) or in the visible spectrum. It can have a narrow band of wavelengths, a broad band of wavelengths, or operate in several bands.
  • FIG. 8 shows a method 800 of imaging in accordance with the present disclosure.
  • The sensing region is illuminated using an illumination layer having a light source and/or a light guiding element. As previously described, this may be done by using a light source directing light into a separate light guiding element or by transmitting light directly into the cover layer. The transmitted light is directed towards a sensing region above the cover layer and reflected from the object towards the light collimator layer.
  • In step 804, some of the reflected light is blocked at the collimator filter layer while other light passes through apertures in the collimator filter layer.
  • light rays at relatively near normal incidence to the collimator filter layer will pass through the apertures while light rays further from normal incidence will be blocked.
  • Light may be blocked by the top surface of the collimator layer, an intermediate layer of the collimator, a bottom layer of the collimator, or sidewalls of the collimator aperture.
  • In step 806, the light which passes through the collimator filter layer becomes incident on one or more optical sensing elements on the sensor array below the light collimator layer.
  • the detected light at the sensing elements may be averaged or otherwise combined.
  • the image data may be adjusted to account for sensing elements that are not below an aperture.
  • In step 808, the detected light at the image sensor array is processed to form an image or a partial image of the input object.
  • processing may include, for example, stitching partial images together, relating various partial images to one another in a template, and/or comparing captured image data to previously stored image data as part of an identification or verification process.
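  • A toy, end-to-end version of this flow is sketched below: a synthetic raw frame is reduced to the aperture grid and compared against a stored template with a normalized correlation. All of the data, dimensions, and the 0.8 threshold are made up for illustration; the real capture and matching pipeline is obviously more involved.

```python
import numpy as np

def reduce_to_aperture_grid(raw: np.ndarray, k: int) -> np.ndarray:
    """Average each k x k pixel block so the image resolution matches the
    collimator aperture array (steps 804/806 happen optically; this is the
    per-aperture combining of whatever the imager recorded)."""
    h, w = (raw.shape[0] // k) * k, (raw.shape[1] // k) * k
    return raw[:h, :w].reshape(h // k, k, w // k, k).mean(axis=(1, 3))

def match_score(image: np.ndarray, template: np.ndarray) -> float:
    """Normalized correlation in [-1, 1] between a probe image and a template."""
    a = (image - image.mean()) / (image.std() + 1e-9)
    b = (template - template.mean()) / (template.std() + 1e-9)
    return float((a * b).mean())

rng = np.random.default_rng(0)
enrolled_raw = rng.random((200, 200))                     # synthetic "enrollment" capture
template = reduce_to_aperture_grid(enrolled_raw, k=4)     # stored in template storage

new_raw = enrolled_raw + 0.05 * rng.random((200, 200))    # noisy re-capture of the same finger
probe = reduce_to_aperture_grid(new_raw, k=4)             # step 808-style processed image

print(match_score(probe, template) > 0.8)                 # True for this synthetic example
```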
  • While this disclosure describes optical object imaging in the context of fingerprint image sensing, the method and system may be used to image any object.
  • a high resolution image of a palm or hand may be acquired by placing the hand directly on the cover layer.
  • Imaging of non-biometric objects is also within the scope of this disclosure.
  • this illumination scheme could be used to illuminate any object that is to be placed directly over a low-profile imaging device that uses arrays of apertures to form its image. Larger imagers could be used to image/scan something larger than a fingerprint, such as an entire hand or palm.

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Input (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

An optical fingerprint sensor includes: a cover glass for receiving a finger; a transparent electrode layer below a bottom surface of the cover glass; an organic light emitting diode (OLED) layer below the transparent electrode layer; and a metal electrode layer below the OLED layer. Multiple openings extend through each of the transparent electrode layer, the OLED layer, and the metal electrode layer, and transmit reflected light that is reflected from a top surface of the cover glass when a finger is positioned on the top surface of the cover glass. A collimator or pinhole filter is below the metal electrode layer and includes multiple openings for transmitting the reflected light after the reflected light has passed through the multiple openings. An imager is below a bottom surface of the collimator or pinhole filter and includes a pixel array that detects the reflected light after the reflected light has been transmitted through the multiple openings.
PCT/US2017/054480 2016-09-29 2017-09-29 Éclairage de profil bas dans un capteur d'empreinte digitale optique WO2018064563A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US15/280,623 2016-09-29
US15/280,623 US10181070B2 (en) 2015-02-02 2016-09-29 Low profile illumination in an optical fingerprint sensor

Publications (1)

Publication Number Publication Date
WO2018064563A1 true WO2018064563A1 (fr) 2018-04-05

Family

ID=61760162

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2017/054480 WO2018064563A1 (fr) 2016-09-29 2017-09-29 Éclairage de profil bas dans un capteur d'empreinte digitale optique

Country Status (1)

Country Link
WO (1) WO2018064563A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6259108B1 (en) * 1998-10-09 2001-07-10 Kinetic Sciences Inc. Fingerprint image optical input apparatus
US6750955B1 (en) * 2002-03-14 2004-06-15 Ic Media Corporation Compact optical fingerprint sensor and method
US20140047706A1 (en) * 2010-11-02 2014-02-20 Jalil SHAIKH Capacitive Touch Sensor for Identifying a Fingerprint
US20160132712A1 (en) * 2014-11-12 2016-05-12 Shenzhen Huiding Technology Co., Ltd. Fingerprint sensors having in-pixel optical sensors
US20160224816A1 (en) * 2015-02-02 2016-08-04 Synaptics Incorporated Optical sensor using collimator

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020046188A1 (fr) * 2018-08-29 2020-03-05 Fingerprint Cards Ab Capteur optique d'empreinte digitale dans l'écran avec masque à orifice codé
US10733413B2 (en) 2018-08-29 2020-08-04 Fingerprint Cards Ab Optical in-display fingerprint sensor and method for manufacturing such a sensor
TWI703349B (zh) * 2018-11-22 2020-09-01 世界先進積體電路股份有限公司 半導體裝置及其形成方法
US10763288B1 (en) 2019-02-15 2020-09-01 Vanguard International Semiconductor Corporation Semiconductor device and method for forming the same
WO2020233601A1 (fr) * 2019-05-22 2020-11-26 印象认知(北京)科技有限公司 Couche d'imagerie, appareil d'imagerie, dispositif électronique, structure de plaque de zone et élément d'image photosensible
CN111598068A (zh) * 2020-07-24 2020-08-28 深圳市汇顶科技股份有限公司 指纹识别装置和电子设备
CN111598068B (zh) * 2020-07-24 2020-11-20 深圳市汇顶科技股份有限公司 指纹识别装置和电子设备
US11783619B2 (en) 2020-07-24 2023-10-10 Shenzhen GOODIX Technology Co., Ltd. Fingerprint identification apparatus and electronic device

Similar Documents

Publication Publication Date Title
US10181070B2 (en) Low profile illumination in an optical fingerprint sensor
US11372143B2 (en) Optical fingerprint sensor
CN108885693B (zh) 具有发散光学元件的生物计量传感器
US11281336B2 (en) Optical sensor having apertures
US10891460B2 (en) Systems and methods for optical sensing with angled filters
CN107271404B (zh) 具有衍射光学元件的光学生物计量传感器
US10303919B2 (en) Display integrated optical fingerprint sensor with angle limiting reflector
US10936840B2 (en) Optical sensor with angled reflectors
WO2018064563A1 (fr) Éclairage de profil bas dans un capteur d'empreinte digitale optique
EP3360162B1 (fr) Structures de capteur d'image pour détection d'empreintes digitales
US10147757B2 (en) Image sensor structures for fingerprint sensing
US20170091506A1 (en) Optical image sensor for display integration
US10955603B2 (en) Method and system for optical imaging using point source illumination
US10558838B2 (en) Optimized scan sequence for biometric sensor
US11137534B2 (en) Systems and methods for optical imaging based on diffraction gratings
US20210117644A1 (en) Optical sensing systems and devices including apertures supplanting photodiodes for increased light throughput

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17857547

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17857547

Country of ref document: EP

Kind code of ref document: A1