CN117095427A - Optical fingerprint sensor with enhanced security feature - Google Patents


Info

Publication number
CN117095427A
Authority
CN
China
Prior art keywords
pixels
fingerprint
positioning
sensing
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310510520.6A
Other languages
Chinese (zh)
Inventor
谢丰键
郑允玮
李国政
吴振铭
胡维礼
Current Assignee
Taiwan Semiconductor Manufacturing Co TSMC Ltd
Original Assignee
Taiwan Semiconductor Manufacturing Co TSMC Ltd
Priority date
Priority claimed from US 18/187,891 (published as US20240021009A1)
Application filed by Taiwan Semiconductor Manufacturing Co TSMC Ltd
Publication of CN117095427A
Legal status: Pending


Classifications

    • G PHYSICS › G06 COMPUTING; CALCULATING OR COUNTING › G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 Fingerprints or palmprints
    • G06V40/13 Sensors therefor
    • G06V40/1318 Sensors therefor using electro-optical elements or layers, e.g. electroluminescent sensing
    • G06V40/1324 Sensors therefor by using geometrical optics, e.g. using prisms
    • G06V40/1365 Matching; Classification

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Solid State Image Pick-Up Elements (AREA)

Abstract

The present disclosure relates to optical fingerprint sensors with enhanced security features. An image sensing device is disclosed. The image sensing device includes a pixel array and microlenses disposed over the pixel array. The pixel array includes sensing pixels configured to capture minutiae points of a fingerprint and positioning pixels configured to provide a positioning code.

Description

Optical fingerprint sensor with enhanced security feature
Technical Field
The present disclosure relates to optical fingerprint sensors with enhanced security features.
Background
The semiconductor integrated circuit (IC) industry has experienced exponential growth. Technological advances in IC materials and design have produced several generations of ICs, each with smaller and more complex circuitry than the previous generation. Over the course of this evolution, functional density (i.e., the number of interconnected devices per chip area) has generally increased while geometry size (i.e., the smallest component (or line) that can be created using a manufacturing process) has decreased. For example, there is considerable interest in providing fingerprint sensing applications (e.g., optical sensors for fingerprint identification) for consumer and/or portable electronic devices (e.g., smartphones, electronic tablets, wearable devices, etc.) within a limited device housing without compromising the level of security those applications provide.
In some fingerprint sensing applications, a gray-level fingerprint image is sensed by pixels of an optical fingerprint sensor that can sense only gray-level images (i.e., not color images). Further, such gray-level images do not include any specific positioning code or pattern, which makes these fingerprint sensing applications susceptible to counterfeiting. Thus, conventional optical fingerprint sensors are not satisfactory in all respects.
Disclosure of Invention
According to an embodiment of the present disclosure, there is provided an image sensing device including: a pixel array, the pixel array comprising: a plurality of sensing pixels configured to capture minutiae points of a fingerprint; and a plurality of positioning pixels configured to provide a positioning code; and a plurality of microlenses disposed over the pixel array.
According to another embodiment of the present disclosure, there is provided an optical fingerprint sensor including: a filter array arranged in columns and rows; an array of light receiving elements below the filter array, the array configured to convert incident light reflected from a fingerprint into a fingerprint image; and a plurality of opaque films disposed over a portion of the light receiving elements and configured to add dark pixels to the fingerprint image.
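The effect of the opaque films can be sketched in software as follows (a minimal illustration with hypothetical helper names, not from the patent): the covered light receiving elements always read out as dark pixels, so the positioning code appears in every captured frame.

```python
def add_dark_pixels(image, opaque_coords):
    """Return a copy of a grayscale image (list of rows of 0-255 values)
    with the covered light receiving elements forced to 0 (dark), as an
    opaque film over those elements would do in hardware."""
    out = [row[:] for row in image]
    for (r, c) in opaque_coords:
        out[r][c] = 0
    return out
```

Because the dark-pixel locations are fixed by the hardware, they can later serve as the position references described for the positioning code.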
According to another embodiment of the present disclosure, there is provided a method of fingerprint authentication, including: capturing a fingerprint image by an image sensing device comprising a pixel array of a combination of sensing pixels and positioning pixels, the sensing pixels being configured to capture minutiae points in the fingerprint image and the positioning pixels being configured to provide a positioning code; calculating a vector of the minutiae points with reference to the positioning code; and comparing the vector to a reference vector generated from a reference fingerprint image to determine a match between the fingerprint image and the reference fingerprint image.
Drawings
The disclosure may be best understood from the following detailed description when read in connection with the accompanying drawings. It is emphasized that, in accordance with standard practice in the industry, various features are not drawn to scale and are used for illustration purposes only. In fact, the dimensions of the various features may be arbitrarily increased or reduced for clarity of discussion.
Fig. 1 illustrates an electronic device having a fingerprint sensing area on a surface space in accordance with aspects of the present disclosure.
Fig. 2 is a cross-sectional view of an electronic device integrated with an optical fingerprint sensor below a display panel, in accordance with aspects of the present disclosure.
Fig. 3 is a cross-sectional view of an embodiment of the optical fingerprint sensor shown in fig. 2, in accordance with aspects of the present disclosure.
Fig. 4A, 4B, and 4C are top views of a pixel array having positioning pixels overlaying a fingerprint image at different stages of fingerprint identification, according to aspects of the present disclosure.
Fig. 5, 6 and 7 illustrate embodiments of distributions of positioning pixels in a pixel array according to aspects of the present disclosure.
Fig. 8A, 8B, 8C, 8D, 8E, 8F, and 8G illustrate embodiments of the distribution of positioning pixels and color pixels in a pixel array according to aspects of the present disclosure.
Fig. 9 illustrates a flow chart of a method for fingerprint identification in accordance with aspects of the present disclosure.
Detailed Description
The following disclosure provides many different embodiments, or examples, for implementing different features of the disclosure. Specific examples of components and arrangements are described below to simplify the present disclosure. Of course, these are merely examples and are not intended to be limiting. For example, in the description below, forming a first feature over or on a second feature may include embodiments in which the first feature and the second feature are formed in direct contact, and may also include embodiments in which additional features may be formed between the first feature and the second feature such that the first feature and the second feature may not be in direct contact. Further, the present disclosure may repeat reference numerals and/or letters in the various examples. This repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed.
Furthermore, in the present disclosure below, the formation of a feature over, connected to, and/or coupled to another feature may include embodiments in which the features are formed in direct contact, and may also include embodiments in which additional features are interposed between them such that the features are not in direct contact. Furthermore, spatially relative terms, such as "lower," "upper," "horizontal," "vertical," "above," "below," "beneath," "top," "bottom," and the like, as well as derivatives thereof (e.g., "horizontally," "downwardly," "upwardly," etc.), are used herein for ease of describing the relationship of one feature to another feature of the present disclosure. Spatially relative terms are intended to encompass different orientations of the device in which the features are included.
The present disclosure relates generally to designs and methods for fingerprint sensing, e.g., in anti-counterfeiting applications using optical fingerprint sensors (OFPS). More particularly, some embodiments relate to integrating specific patterns of positioning pixels (also referred to as positioning codes) and/or additional color pixels that add flesh tone codes to an OFPS, to enhance the anti-counterfeiting capability of the OFPS.
An OFPS provides one method of biometric sensing that is of considerable interest for providing security features in electronic devices, particularly consumer and/or portable electronic devices (e.g., smart phones, electronic tablets, wearable devices, etc.). OFPS-based fingerprint recognition (or fingerprint sensing) systems rely on characteristic features of the user and need not rely on the user's memory or on other input methods such as password entry. For the same reason, OFPS-based fingerprint recognition systems also offer the advantage of being difficult to hack.
Among the various biometric sensing techniques, fingerprint identification is a reliable and widely used technique for personal identification or authentication. Fingerprint identification systems generally include fingerprint sensing and matching functions, for example, collecting fingerprint images and comparing those images to known fingerprint information. Specifically, one method of fingerprint identification includes scanning a reference fingerprint and storing an acquired reference image. The characteristics of the new fingerprint may be scanned and compared to reference images already stored in the database to determine the correct identification of the individual (e.g., for verification purposes). Fingerprint identification systems may be particularly advantageous for authentication in consumer and/or portable electronic devices. For example, an optical sensor for acquiring a fingerprint image may be carried within a housing of the electronic device.
The effectiveness of a biometric security system depends on the accuracy with which characteristic biometric data can be detected. In the case of a fingerprint recognition system, this means accuracy in comparing an acquired fingerprint image with reference fingerprint images stored in a database. A reference fingerprint image is stored in the database as a collection of minutiae points, typically representing the ridges and valleys of the fingerprint. This collection of minutiae points is also referred to as a minutiae map. If the minutiae map is hacked or compromised, the fingerprint image can be reconstructed from it by reverse engineering, and the security provided by the fingerprint recognition system is defeated. In some embodiments of the present disclosure, positioning pixels are added to the pixel array of the OFPS to add an additional code to the minutiae map. Such a code is called a fingerprint positioning code or a position reference code. The fingerprint positioning code converts the direct record of where the ridge and valley minutiae points are located into vectors representing the relative positions of the minutiae points with respect to the positioning pixels. Thus, even if the minutiae map is hacked or compromised, the fingerprint image cannot be reconstructed from it without knowing the distribution of the positioning pixels and how the positioning pixels are referenced. The addition of the positioning code therefore enhances the anti-counterfeiting capability of the OFPS.
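The vector encoding described above can be sketched as follows. This is a simplified illustration with hypothetical function names, assuming each minutia is referenced to its nearest positioning pixel; the patent does not prescribe a particular referencing scheme.

```python
import math

def minutiae_to_vectors(minutiae, positioning_pixels):
    """Replace each minutia's absolute (x, y) position with its offset
    from the nearest positioning pixel. Without knowing the positioning
    pixel distribution, the offsets alone cannot reconstruct the print."""
    vectors = []
    for (mx, my) in minutiae:
        px, py = min(positioning_pixels,
                     key=lambda p: math.hypot(mx - p[0], my - p[1]))
        vectors.append((mx - px, my - py))
    return vectors
```

During authentication, the vectors computed from a captured image would then be compared, within some tolerance, against vectors computed from the enrolled reference image.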
Further, in some fingerprint sensing applications, the gray-level fingerprint image is sensed by image sensing pixels (abbreviated as pixels) capable of sensing only gray-level images (i.e., incapable of sensing color images). Such pixels are referred to as gray scale pixels or "W" pixels. For example, monochrome image sensors producing grayscale images are typically used as the pixels of a fingerprint recognition application. In some embodiments of the present disclosure, one or more color image sensors are added to the pixel array as color (RGB) pixels to add a fingerprint color code representing skin tone to the reference fingerprint image and the acquired fingerprint image. The fingerprint color code further enhances the anti-counterfeiting capability of the OFPS. In various embodiments, the positioning pixels that generate the fingerprint positioning code and the color pixels that generate the fingerprint color code may be applied to the pixel array of the OFPS either independently or together. For example, the OFPS may include a pixel array in which some of the gray pixels are replaced with positioning pixels, a pixel array in which some of the gray pixels are replaced with color pixels, a pixel array in which some of the gray pixels are replaced with positioning pixels and some with color pixels, or even a pixel array with no gray pixels but a combination of color pixels and positioning pixels.
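The pixel-array combinations listed above can be illustrated with a small layout builder (hypothetical names and representation; the patent does not prescribe this encoding): start from an all-grayscale array of 'W' pixels and substitute positioning ('P') and color ('R'/'G'/'B') pixels at chosen sites.

```python
def build_pixel_array(rows, cols, positioning=(), color=None):
    """Build a pixel-type map for an OFPS: 'W' grayscale pixels
    everywhere, then replace selected sites with positioning pixels
    ('P') and color pixels ('R', 'G', or 'B')."""
    grid = [['W'] * cols for _ in range(rows)]
    for (r, c) in positioning:
        grid[r][c] = 'P'
    for (r, c), ch in (color or {}).items():
        grid[r][c] = ch
    return grid
```

Passing only `positioning`, only `color`, or both reproduces the four example configurations in the paragraph above.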
Fig. 1 illustrates an electronic device 100 having a fingerprint sensing area on a surface space according to some embodiments of the present disclosure. As shown in fig. 1, the electronic device 100 is illustratively a mobile wireless communication device (e.g., a smart phone). In other embodiments, the electronic device 100 may be any other suitable electronic device, such as a laptop computer, an electronic tablet, a portable gaming device, a navigation device, or a wearable device. The electronic device 100 includes a housing 102 and other components within the housing 102, such as processor(s) and memory. The display panel 104 is carried by the housing 102. In the illustrated embodiment, the display panel 104 is an organic light-emitting diode (OLED) display panel. In various embodiments, the display panel 104 may be any other suitable type of display panel understood by those skilled in the art, such as a liquid-crystal display (LCD) panel, a light-emitting diode (LED) display panel, or an active-matrix organic light-emitting diode (AMOLED) display panel.
In the illustrated embodiment, the display panel 104 extends across substantially the entire front surface of the electronic device 100. Some of the space between the display panel 104 and the edge of the housing 102 may be left to the bezel panel 106. The display panel 104 is stacked over image sensing features for fingerprint detection or other suitable biometric sensing features, which are described in further detail later. The display panel 104 serves both as a display and as an input device through which the image sensing features acquire a fingerprint image. In this way, the display panel 104 performs multiple device functions in response to user input. For example, when the electronic device 100 is in a locked state, the display panel 104 may first display a prompt (e.g., a finger icon or instruction text) on the screen, and may also highlight the sensing region 108. When the user's finger 110 is placed within the sensing region 108 (in the near field or in direct contact with the display panel 104), the image sensing features are activated and a fingerprint image is acquired from the user's finger 110. The acquired fingerprint image (biometric data) is sent to the processor(s) for matching and/or spoof detection. If the acquired fingerprint image matches a reference fingerprint image stored in memory, the electronic device 100 may transition to the unlocked state, and the display panel 104 begins displaying desktop icons or responding to various other user inputs. The display panel 104 may also be integrated with a touch sensor array, in which case the display panel 104 is also a touch display panel.
Fig. 2 is a cross-sectional view of a portion of the electronic device 100. This portion of the electronic device 100 carries the fingerprint recognition function and may be considered the fingerprint recognition system 200. The fingerprint recognition system 200 is a stacked configuration including a top display panel 202, a middle light conditioning layer 204, and a bottom OFPS 206. The display panel 202 illuminates the sensing region 108 above it. When light emitted from the display panel 202 is reflected from the user's finger 110, the reflected light passes down through the display panel 202 and the light conditioning layer 204 and finally reaches the OFPS 206. In one embodiment, the OFPS 206 includes an array of optical sensing elements 207, e.g., complementary metal-oxide-semiconductor (CMOS) image sensors and/or charge-coupled device (CCD) sensors. The optical sensing elements 207 are capable of detecting the intensity of incident light. Thus, the OFPS 206 converts incident light into a pixel image that includes the biometric characteristics of the user's finger 110. Each pixel of the pixel image may correspond to the intensity of incident light recorded at the corresponding location of the optical sensing element 207.
In some embodiments, the display panel 202 includes a cover glass 214 (or cover lens) that protects the internal components of the electronic device 100. The sensing region 108 is defined above the cover glass 214. The top surface 216 of the cover glass 214 forms a sensing surface that provides a contact area for the user's finger 110 or other suitable object. Within the sensing region 108, the user's finger 110 may directly touch the top surface 216 or remain a small distance from the top surface 216 during near field sensing. Cover glass 214 may be made of glass, a transparent polymer material, or other suitable material.
The display panel 202 includes an illumination or display layer 220 below the cover glass 214. The display layer 220 includes an array of light emitting pixels 222. The different light emitting pixels 222 may be configured to emit different colors, for example, pixels emitting red light, pixels emitting green light, or pixels emitting blue light. Because of the geometric relationship with the sensing region 108, the light emitting pixels 222 can be categorized into two groups, one directly below the sensing region 108 and the other outside the sensing region 108. The pixels 222 outside the sensing region 108 perform conventional display functions, while the pixels 222 directly below the sensing region 108 perform conventional display functions as well as illumination functions during biometric sensing, depending on the application. In various embodiments, the pixel distance D1 between adjacent light emitting pixels 222 is in the range of about 5 microns to about 30 microns, with other values and intervals within the range being within the scope of the present disclosure. In a specific example, the pixel distance D1 may be in a range of about 10 microns to about 20 microns.
In some embodiments, the display panel 202 further includes a barrier layer 224. The barrier layer 224 is a translucent or opaque layer that may be disposed below the display layer 220. Outside of sensing region 108, barrier 224 is continuous, which shields the components below display layer 220 from light emitted by light emitting pixels 222 and ambient light. Directly below the sensing region 108, the barrier 224 has a plurality of openings 226. Each opening 226 is located between two adjacent light emitting pixels 222. The opening 226 allows light reflected from the sensing region 108 to pass through. In the illustrated embodiment, there is one opening 226 between two adjacent light emitting pixels 222. The opening 226 may have a width (or diameter) D2, the ratio of D2 to the pixel distance D1 being from about 40% to about 90%, other values and intervals within this range are within the scope of the present disclosure. In some other embodiments, there are two or more openings 226 between two adjacent pixels 222. Accordingly, the opening 226 may have a width (or diameter) D2, the ratio of D2 to the pixel distance D1 being from about 20% to about 40%.
In various embodiments, the display layer 220 may be an LCD display (using a backlight with color filters to form RGB pixels), an LED display (e.g., micro LEDs, where the pixel material may be an inorganic material used in the LEDs), an OLED display, or any other suitable display. In the illustrated embodiment, the light emitting pixels 222 are organic light emitting diodes (OLEDs), and the display layer 220 is an OLED display. Examples of OLED displays may include active-matrix OLED (AMOLED), passive-matrix OLED (PMOLED), white OLED (WOLED), and RGB-OLED, and/or other suitable types of OLEDs. OLED displays are typically thinner, lighter, and more flexible than other types of displays (e.g., LCD displays or LED displays). OLED displays do not require a backlight, as light is generated by the organic light emitting material in the OLED, which allows a pixel to be turned off completely. The organic light emitting material may be an organic polymer, for example, poly(phenylene vinylene) or polyfluorene. Because the organic light emitting material generates its own light, an OLED display can also have a wide viewing angle. This contrasts with LCD displays, which operate by blocking light and can therefore suffer from restricted viewing angles.
OLEDs emit light through a process known as electroluminescence, a phenomenon in which an organic light emitting material emits light in response to a passing current. In some examples, an OLED may include a hole injection layer, a hole transport layer, an electron injection layer, an emission layer, and an electron transport layer. The color of the light emitted by the OLED depends on the type of organic light emitting material used in the emissive layer, and different colors can be obtained with various chemical structures of the organic luminescent material. The intensity of the light may depend on the number of photons emitted or on the voltage applied across the OLED. In some embodiments, each light emitting pixel is formed of the same organic light emitting material producing white light, and each further includes a red, green, or blue color filter to filter out colors other than the target color. The color filters may be formed using cholesteric filter materials (e.g., a multi-layer dielectric stack comprising materials having different refractive indices configured to form an optical filter).
As shown in fig. 2, below the sensing region 108, the light conditioning layer 204 is stacked below the display panel 202. The light conditioning layer 204 includes a semiconductor layer 240 and an optical filter film 242. In one embodiment, the semiconductor layer 240 includes a silicon microelectromechanical systems (MEMS) structure. For example, the semiconductor layer 240 includes a collimator 245, the collimator 245 including an array of apertures 246. Each aperture 246 is located directly above one or more optical sensing elements 207 in the OFPS 206. The array of apertures 246 may be formed by any suitable technique (e.g., plasma etching, laser drilling, etc.) and conditions the incident light reflected from the sensing region 108. With the OFPS 206 stacked at the bottom, the display panel 202 (especially the relatively thick cover glass 214) adds vertical distance between the user's finger 110 and the OFPS 206, which allows stray light from the vicinity of the user's finger 110 to reach the optical sensing elements 207 together with light from the small spot directly above. Stray light blurs the image. The array of apertures 246 helps filter out stray light and allows substantially only light from the small spot directly above to be detected, resulting in a sharper image.
A key dimension of the collimator 245 is the aspect ratio of the aperture 246, defined as the height (a) of the aperture 246 divided by its diameter (e). The aspect ratio is made large enough that only light rays incident perpendicularly or nearly perpendicularly to the collimator 245 pass through and reach the optical sensing elements 207. An example of a suitable aspect ratio for the aperture 246 is from about 5:1 to about 50:1, and sometimes from about 10:1 to about 15:1; other values and intervals are within the scope of the present disclosure. In an embodiment, the height (a) of the aperture 246 is in a range from about 30 microns to 300 microns, for example, about 150 microns. In various embodiments, the collimator 245 may be an opaque layer having an array of apertures. In some embodiments, the collimator 245 is a monolithic semiconductor layer, e.g., a silicon layer. Other examples of collimator 245 materials include plastics such as polycarbonate, PET, polyimide, carbon black, inorganic insulating or metallized materials, or SU-8.
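One consequence of the aspect ratio can be sketched under the simple geometric-optics assumption that a ray must cross the aperture without touching a sidewall: the acceptance half-angle is roughly atan(diameter / height). This helper (not from the patent) makes the relationship concrete.

```python
import math

def acceptance_half_angle_deg(height_um, diameter_um):
    """Approximate largest angle from normal (in degrees) at which a
    ray can pass straight through an aperture of the given height and
    diameter; higher aspect ratios admit only near-normal light."""
    return math.degrees(math.atan(diameter_um / height_um))
```

For example, under this assumption a 150-micron-tall aperture with a 15-micron diameter (aspect ratio 10:1) accepts rays only within roughly 5.7 degrees of normal.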
As shown in fig. 2, the light conditioning layer 204 further includes the optical filter film 242 over the semiconductor layer 240. The optical filter film 242 selectively absorbs or reflects certain spectra of incident light, particularly components from ambient light 250, such as infrared light and/or a portion of other visible light (e.g., red light). The optical filter film 242 helps reduce the sensitivity of the optical sensing elements 207 to ambient light 250 and increases their sensitivity to light emitted from the light emitting pixels 222. The optical filter film 242 may extend continuously directly above the collimator 245 and have an opening 260 outside the collimator 245.
In an example, the optical filter film 242 may include a thin metal layer or metal oxide layer that absorbs or reflects light in certain spectra. In another example, the optical filter film 242 may include dye(s) and/or pigment(s) that absorb or reflect certain light components. Alternatively, the optical filter film 242 may include several sub-layers or nano-sized features designed to cause interference with certain wavelengths of incident light. In one embodiment, the optical filter film 242 may comprise one or more materials, such as silicon oxide, titanium oxide, or another metal oxide.
The optical filter film 242 may be deposited on the dielectric layer 241, and the dielectric layer 241 may be a buried oxide layer on the semiconductor layer 240. In one embodiment, the buried oxide layer 241 may include one or more materials, such as thermal oxide, plasma enhanced oxide (plasma enhanced oxide, PEOX), high-density-plasma (HDP) oxide, or the like. In addition, the light conditioning layer 204 also includes a passive oxide layer 255 below the semiconductor layer 240. In one embodiment, the passive oxide layer 255 may include one or more materials, e.g., PEOX, HDP oxide, etc.
The OFPS 206 in this example includes a substrate 268, a plurality of optical sensing elements 207 in the substrate 268, and bond pads 264 in the substrate 268. Each bond pad may be a metal pad comprising a conductive material. As shown in fig. 2, the stack of passive oxide layer 255, semiconductor layer 240, buried oxide layer 241, and optical filter film 242 may also have several openings 260. The openings 260 allow conductive features (e.g., bond wires 262) to interconnect at least one of the bond pads 264 on the top surface of the OFPS 206 to an external circuit (e.g., a processor of the electronic device 100). The bond pads 264 are routed to control signal lines and power/ground lines embedded in the OFPS 206. The OFPS 206 may also include alignment marks for alignment control during manufacturing and assembly. In other embodiments, alignment marks are located on the metal pad/bond pad layer of the passive oxide layer 255 or the OFPS 206 for alignment control during fabrication and assembly.
In one embodiment, semiconductor layer 240 has a thickness (a) of about 50 microns to 200 microns. In one embodiment, the passive oxide layer 255 has a thickness (b) of about 400 nanometers to 2000 nanometers. In one embodiment, buried oxide layer 241 has a thickness (c) of about 1000 nanometers to 2000 nanometers. In one embodiment, the optical filter membrane 242 has a thickness (d) of about 1 micron to 5 microns. In one embodiment, each aperture 246 of the collimator 245 has a diameter of about 5 microns to 30 microns. According to various embodiments, the openings 260 of the passive oxide layer 255, the semiconductor layer 240, and the buried oxide layer 241 have different diameters. For example, the openings of the buried oxide layer 241 have a diameter (f) of about 100 microns to 140 microns; the opening of the semiconductor layer 240 has a diameter (g) of about 80 microns to 120 microns; the openings of the passive oxide layer 255 have a diameter (h) of about 60 microns to 100 microns.
In one embodiment, a method for capturing a fingerprint image from a user's finger illuminated by a display panel integrated with a light conditioning layer is described below. The screen of the electronic device 100 may first be in a locked state. A prompt is displayed, where the prompt may be an icon (e.g., a fingerprint icon or instructional text) highlighting the sensing region 108 on the screen. The prompt is shown by the light emitting pixels 222 below the sensing region 108, which may be OLEDs. The pixels 222 outside the sensing region 108 may be turned off in the locked state or may display a preset screen saver image. The biometric detection mode then begins when the user's finger 110 remains stationary in the sensing region 108 for more than a predetermined time, for example, about 100 milliseconds. Otherwise, the method returns to waiting for new user input.
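The capture flow above can be sketched as a small state machine. The state names and function are illustrative, not from the patent; the ~100 ms dwell threshold follows the example in the text, and the unlock-on-match transition follows the earlier description of the fingerprint recognition system.

```python
LOCKED, DETECTING, UNLOCKED = "locked", "detecting", "unlocked"

def next_state(state, finger_in_region, dwell_ms, match_ok=False,
               dwell_threshold_ms=100):
    """Advance the simplified capture flow one step: a finger held in
    the sensing region past the dwell threshold starts biometric
    detection; a successful match unlocks, otherwise return to locked."""
    if state == LOCKED:
        if finger_in_region and dwell_ms >= dwell_threshold_ms:
            return DETECTING
        return LOCKED
    if state == DETECTING:
        return UNLOCKED if match_ok else LOCKED
    return state
```

A finger that leaves the region, or dwells too briefly, keeps the device locked, matching the "returns to waiting for new user input" behavior above.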
In the biometric detection mode, the prompt shown on the screen is turned off and the pixels 222 under the sensing region 108 begin to illuminate the user's finger 110. Light 270 emitted from the light emitting pixels 222 may pass through the cover glass 214 and reach the user's finger 110. The user's finger 110 may include ridges 272 and valleys 274. Because the ridges 272 are closer to the top surface 216 than the valleys 274, the ridges 272 may reflect more light, while the valleys 274 may reflect less light. The light 270 is then reflected back to the light regulating layer 204.
The optical filter film 242 then filters out certain spectra of the light. In some embodiments, the optical filter film 242 is an infrared cut-off filter that filters (or reduces) the infrared light component of the incident light, for example, by absorption or reflection. Ambient light 250 (e.g., sunlight) is the primary source of infrared light. Infrared light can easily penetrate the user's finger 110. Thus, infrared light does not carry useful information about the biometric characteristics of the finger and can be considered part of the noise. Mixing the infrared light component from the ambient light with the reflected light from the light emitting pixels reduces the sensitivity of the optical sensing elements 207. By filtering the infrared light prior to sensing, the signal-to-noise ratio (SNR) of the incident light is increased. In some other embodiments, the optical filter film 242 may target light in a spectrum other than infrared light, such as red light in the visible spectrum or ultraviolet light. The filter profile of the optical filter film 242 may be tailored to a specific appearance of color, texture, or reflective quality, allowing for optimized filtering performance. In some embodiments, the optical filter film 242 is an infrared cut-off filter and has a separate film stacked below or above it for filtering red light to reduce ghosting.
The collimator 245 then filters out the stray light component in the light 270. Because of the high aspect ratio of the apertures 246, the collimator 245 allows only light reflected from the sensing region 108 that is incident at or near the normal of the collimator 245 to pass through and ultimately reach the OFPS 206. The optical sensing elements 207 may be used to measure the intensity of light and to convert the measured intensity into a pixel image of an input object, e.g., the user's finger 110. Stray light at a larger angle to the normal, on the other hand, impinges on the top surface of the collimator 245 or on a surface within an aperture 246 (e.g., the aperture sidewall) and is blocked from reaching the underlying image sensing layer 206. The aspect ratio of the apertures 246 is sufficiently large (e.g., from about 5:1 to about 50:1) to prevent stray light from passing through the collimator 245.
Then, the OFPS 206 acquires a fingerprint image. The optical sensing element 207 within the image sensing layer 206 may convert incident light into electrical output. The output of each optical sensing element 207 may correspond to a pixel in the fingerprint image. The optical sensing element 207 may include a monochrome image sensor (gray scale pixel) and/or a color image sensor (color pixel). In some embodiments, each of the optical sensing elements 207 may be configured to correspond to a specific wavelength of light, e.g., a sensor element for sensing a red wavelength directly below the red light emitting pixel 222, a sensor element for sensing a green wavelength directly below the green light emitting pixel 222, and a sensor element for sensing a blue wavelength directly below the blue light emitting pixel 222.
The acquired fingerprint image is compared with a real reference image previously stored in a memory (or database). If the fingerprint images match, the screen is unlocked. The pixels 222 below the sensing region 108 will cease to illuminate and combine with other pixels 222 outside the sensing region 108 to begin displaying conventional desktop icons in the unlocked state. If the fingerprint images do not match, the method returns to await a new biometric detection.
Referring to fig. 3, a cross-sectional view of some embodiments of a semiconductor structure of an OFPS 300 is provided. OFPS 300 may be substantially similar to OFPS 206 described above with reference to fig. 2. The OFPS 300 includes a pixel array 336 of image sensing pixels (abbreviated as pixels) 302 arranged in rows and columns. For example, the pixel array may include about three million pixels 302 arranged in 1536 rows and 2048 columns. The semiconductor structure includes a semiconductor substrate 304, and a photodiode 306 corresponding to a pixel 302 is disposed within the semiconductor substrate 304. The photodiodes 306 are arranged in rows and/or columns within the semiconductor substrate 304 and are configured to accumulate charge (e.g., electrons) from photons incident on the photodiodes 306. The semiconductor substrate 304 may be, for example, a bulk semiconductor substrate, such as a bulk silicon substrate or a silicon-on-insulator (SOI) substrate.
The DTI region 308 defines a substrate isolation grid comprised of grid segments (e.g., individual rectangles or squares that are contiguous with each other). Further, DTI region 308 extends into the semiconductor substrate 304 from approximately level with the upper surface of the substrate 304. DTI region 308 is disposed laterally around the photodiodes 306 and between the photodiodes 306 to advantageously provide optical isolation between adjacent photodiodes 306. DTI region 308 may be, for example, a metal such as tungsten, copper, or aluminum copper. Alternatively, DTI region 308 may be, for example, a low n material, i.e., a material having a refractive index less than that of the filter 310 overlying the corresponding pixel 302. The filter 310 may be a color filter for color pixels, a transparent filter for monochrome (gray scale) pixels, or a combination of color and transparent filters. In some embodiments, DTI region 308 has a refractive index of less than about 1.6. Further, in some embodiments, DTI region 308 is a dielectric such as an oxide (e.g., SiO2) or hafnium oxide (e.g., HfO2), or another material having a refractive index less than that of silicon.
An anti-reflective coating (ARC) 316 and/or a first dielectric layer 318 of the semiconductor structure is disposed over the semiconductor substrate 304 along an upper surface of the semiconductor substrate 304. In embodiments where ARC 316 and first dielectric layer 318 are present, first dielectric layer 318 is typically disposed over ARC 316. ARC 316 and/or first dielectric layer 318 separates semiconductor substrate 304 from a composite grid 320 of semiconductor structures overlying substrate 304. The first dielectric layer 318 may be, for example, an oxide, such as silicon dioxide.
A composite grid 320 is disposed laterally around the photodiodes 306 and between the photodiodes 306 to define openings within which the optical filters 310 are disposed. The openings correspond to the pixels 302 and are centered with the photodiodes 306 of the corresponding pixels 302. The composite grid 320 includes one or more of a metal grid 324, a low n grid 326, and a hard mask grid 328, which are stacked in this order over the semiconductor substrate 304. Each of the grids 324, 326, 328 is made up of grid segments (e.g., separate rectangles or squares that abut each other to collectively make up the grids 324, 326, 328 and surround the respective photodiodes 306). Each of the grids 324, 326, 328 also includes openings between the grid segments and overlying the photodiodes 306. The metal grid 324 blocks light from passing between adjacent pixels 302 to help reduce cross-talk. The metal grid 324 may be, for example, tungsten, copper, or aluminum copper. The low n grid 326 is a transparent material having a refractive index less than the refractive index of the optical filter 310. Because of the low refractive index, the low n grid 326 acts as a light guide to guide light to the filter 310 and effectively increases the size of the filter 310. Further, because of the low refractive index, the low n grid 326 serves to provide optical isolation between adjacent pixels 302. Light within the filter 310 striking the boundary with the low n grid 326 typically undergoes total internal reflection. In some embodiments, the low n grid 326 is a dielectric such as an oxide (e.g., SiO2) or hafnium oxide (e.g., HfO2), or another material having a refractive index less than that of silicon. The hard mask grid 328 may be, for example, silicon nitride or silicon oxynitride.
The optical filters 310 are disposed over the ARC 316 and/or the first dielectric layer 318. Further, each optical filter 310 is arranged over the photodiode 306 of the corresponding pixel 302, within an opening of the composite grid 320. The filters 310 have upper surfaces approximately flush with the upper surface of the composite grid 320. Further, for the color filters among the optical filters 310, each filter is assigned a corresponding color or wavelength of light and is configured to filter out light of all colors or wavelengths other than the assigned one. Typically, the color filter assignments alternate between red, green, and blue light, such that the color filters include red, green, and blue color filters. In some embodiments, the color filter assignments alternate between red, green, and blue light according to a Bayer color filter mosaic. A pixel 302 corresponding to a red filter is denoted a red ("R") pixel; a pixel 302 corresponding to a blue filter is denoted a blue ("B") pixel; a pixel 302 corresponding to a green filter is denoted a green ("G") pixel; and a pixel 302 corresponding to a transparent filter is denoted a gray ("W") pixel. These pixels are configured for light sensing and are also denoted sensing pixels 302S. In addition to the sensing pixels 302S serving the light sensing function, there are specific pixels distributed in the pixel array that are not for light sensing but for providing a positioning code, denoted positioning pixels 302P. The bottom surface of each opening of the composite grid 320 corresponding to a positioning pixel 302P is covered with an opaque film 330. In some embodiments, the opaque film 330 has the same material composition as the metal grid 324, forming a continuous metal layer that blocks incident light. In some embodiments, the opaque film 330 is formed of a semiconductor or dielectric material.
Because of the opaque film 330, the photodiode 306 of the corresponding positioning pixel 302P is not able to sense light and the output from the positioning pixel 302P is approximately zero (i.e., a dark pixel in the fingerprint image).
A second dielectric layer 130 lining the composite grid 320 separates the filters 310 from the composite grid 320, and microlenses 332 corresponding to the pixels 302 cover the filters 310. The second dielectric layer 130 may be, for example, an oxide such as silicon dioxide, and may be the same material as the low n grid 326 or a different material. The microlenses 332 are centered on the photodiodes 306 of the corresponding pixels 302 and are generally symmetrical about a vertical axis centered on the photodiode 306. Further, the microlenses 332 typically overhang the composite grid 320 around the openings, so that adjacent edges of the microlenses 332 abut. The depicted embodiment shows a microlens 332 also over the photodiode 306 of the positioning pixel 302P. However, in some embodiments, there may be no microlens 332 over the photodiode 306 of the positioning pixel 302P.
The integrated circuit 338 includes the semiconductor substrate 304 and device regions (partially shown). The device region is disposed along a lower surface of the semiconductor substrate 304 and extends into the semiconductor substrate 304. The device region includes a photodiode 306 corresponding to the pixel 302 and a logic device, e.g., a transistor, for reading out the photodiode 306. The photodiodes 306 are arranged in rows and columns within the semiconductor substrate 304 and are configured to accumulate charge from photons incident on the photodiodes 306. Further, the photodiodes 306 are optically isolated from each other by DTI regions 308 in the semiconductor substrate 304, thereby reducing cross-talk.
A back-end-of-line (BEOL) metallization stack 340 of the integrated circuit 338 is under the semiconductor substrate 304 and includes a plurality of metallization layers 342, 344 stacked within an interlayer dielectric (ILD) layer 346. One or more contacts 348 of the BEOL metallization stack 340 extend from the metallization layer 344 to the device region. Further, one or more first vias 350 of the BEOL metallization stack 340 extend between the metallization layers 342, 344 to interconnect the metallization layers 342, 344. The ILD layer 346 may be, for example, a low-k dielectric (i.e., a dielectric having a dielectric constant less than about 3.9) or an oxide. The metallization layer 342, the metallization layer 344, the contact 348, and the first via 350 may be, for example, a metal such as copper or aluminum.
The carrier substrate 352 is under the integrated circuit 338 and between the integrated circuit 338 and a ball grid array (BGA) 354. The BGA 354 includes a redistribution layer (RDL) 356 disposed along a lower surface of the carrier substrate 352 and electrically coupled to the metallization layers 342, 344 of the BEOL metallization stack 340 by one or more second through-silicon vias 358 extending through the carrier substrate 352. The RDL 356 is covered by a BGA dielectric layer 360, and an under bump metallization (UBM) layer 362 extends through the BGA dielectric layer 360 to electrically couple solder balls 364 under the UBM layer 362 to the RDL 356. The BGA dielectric layer 360 may be, for example, epoxy. The RDL 356, the UBM layer 362, the second vias 358, and the solder balls 364 may be, for example, metals such as copper, aluminum, and tungsten. A bond pad (e.g., bond pad 264 described above with reference to fig. 2) may also be provided on the upper surface of the OFPS 300.
To illustrate the function of the positioning pixels in a pixel array, fig. 4A-4C show top views of a pixel array 400 of an OFPS at different stages of fingerprint recognition. The pixel array 400 may be substantially similar to the pixel array 336 described above with reference to fig. 3. The pixel array 400 includes pixels 402 arranged in rows and columns. The pixels 402 include sensing pixels 402S and positioning pixels 402P. The sensing pixels 402S may all be gray scale pixels, all be color pixels, or be a combination of gray scale and color pixels. Four positioning pixels 402P are shown, including a first positioning pixel 402P-a and a second positioning pixel 402P-b. Any number of positioning pixels may be present in the pixel array 400. In fig. 4A-4C, a fingerprint image acquired by the pixel array 400 is superimposed on the pixel array. The sensing pixels 402S capture the light intensity variations caused by the ridges and valleys of the fingerprint and generate a fingerprint image. In the illustrated embodiment, the sensing pixels 402S are all gray scale pixels and the fingerprint image is a gray scale image. Because the photodiode of each positioning pixel 402P is shielded by the opaque film, no light intensity is sensed at the location of the positioning pixel 402P. On the fingerprint image, a black dot (dark pixel) appears at the location of each positioning pixel 402P.
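Because the positioning pixels always read out as dark pixels, the positioning code can be recovered from an acquired image by thresholding for near-zero intensities. The following is an illustrative Python sketch only; the image representation, function name, and threshold value are assumptions and are not part of the disclosed embodiments.

```python
# Illustrative sketch (not the patented implementation): recover the
# positioning-code locations from a grayscale fingerprint image. The
# photodiodes under positioning pixels are masked by an opaque film,
# so they read out as near-zero (dark) pixels regardless of the finger.

def find_positioning_pixels(image, dark_threshold=5):
    """Return sorted (row, col) coordinates whose intensity is ~zero."""
    return sorted(
        (r, c)
        for r, row in enumerate(image)
        for c, value in enumerate(row)
        if value <= dark_threshold
    )

# Toy 4x4 frame: two masked positioning pixels among normal intensities.
frame = [
    [130, 180, 155, 170],
    [160,   0, 175, 140],
    [150, 165, 145,   2],
    [175, 135, 160, 150],
]
print(find_positioning_pixels(frame))  # [(1, 1), (2, 3)]
```

In practice a real sensor would exhibit read noise and dark current, so the dark threshold would be calibrated rather than fixed.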
Referring to fig. 4A, an initial fingerprint image is acquired as a reference fingerprint image and stored in a memory (or database). The characteristics (minutiae) of the fingerprint are located, in the form of vectors, with reference to the positioning pixels. Fig. 4A shows a first vector Va marking a first minutiae point at a first position on one of the ridge lines, with reference to the first positioning pixel 402P-a, and a second vector Vb marking a second minutiae point at a second position on another of the ridge lines, with reference to the second positioning pixel 402P-b. The reference fingerprint image, together with its vectors referencing the positioning pixels, is recorded.
Referring to fig. 4B, when the identity of the user needs to be verified, a new fingerprint image is acquired. The user's finger may not land in exactly the same position as last time, and the acquired fingerprint image may be shifted with respect to the reference fingerprint image. The characteristics (minutiae) of the acquired fingerprint are again located, in the form of vectors, with reference to the positioning pixels. Fig. 4B shows a third vector Va' and a fourth vector Vb': the third vector Va' marks the same first minutiae point as in fig. 4A but shifted, with reference to the first positioning pixel 402P-a, and the fourth vector Vb' marks the same second minutiae point as in fig. 4A but shifted, with reference to the second positioning pixel 402P-b.
Referring to fig. 4C, the acquired fingerprint image is compared with the reference fingerprint image stored in the memory. Instead of a direct comparison of the fingerprint characteristic sets (minutiae maps), which is more easily falsified, the vectors are compared. For example, the third vector Va' is compared with the first vector Va, and a shift ΔVa in vector form is calculated. The fourth vector Vb' is compared with the second vector Vb, and a shift ΔVb in vector form is calculated. Then, the shift ΔVa and the shift ΔVb are compared. The shift ΔVa should be equal to the shift ΔVb (and to the shifts of many other vectors not repeated herein) to yield a match.
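As a rough illustration of the comparison in fig. 4A-4C, the following Python sketch treats each minutia as a 2-D vector referenced to its positioning pixel and declares a match only when all vectors share the same shift ΔV. The function names, tuple representation, and tolerance parameter are assumptions made for illustration, not the disclosed implementation.

```python
# Illustrative sketch: vector-based fingerprint comparison. A genuine
# re-presentation of the same finger displaces every minutia vector by
# the same shift ΔV relative to the fixed positioning pixels.

def vector_shift(reference, acquired):
    """Shift ΔV between a reference vector and an acquired vector."""
    return (acquired[0] - reference[0], acquired[1] - reference[1])

def fingerprints_match(reference_vectors, acquired_vectors, tolerance=0):
    """Match only if every minutia vector is displaced by the same ΔV."""
    if len(reference_vectors) != len(acquired_vectors) or not reference_vectors:
        return False
    shifts = [vector_shift(r, a)
              for r, a in zip(reference_vectors, acquired_vectors)]
    first = shifts[0]
    return all(
        abs(s[0] - first[0]) <= tolerance and abs(s[1] - first[1]) <= tolerance
        for s in shifts
    )

# Va, Vb recorded at enrollment; Va', Vb' measured at verification.
Va, Vb = (12, 7), (-4, 9)
Va_p, Vb_p = (15, 5), (-1, 7)            # both shifted by ΔV = (3, -2)
print(fingerprints_match([Va, Vb], [Va_p, Vb_p]))       # True
print(fingerprints_match([Va, Vb], [(15, 5), (0, 9)]))  # False: shifts differ
```

A nonzero tolerance would accommodate quantization of the minutiae coordinates; the patent text itself only requires the shifts to be equal.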
Fig. 5-7 illustrate various embodiments of the distribution of the positioning pixels 402P in the pixel array 400. Referring to fig. 5, the pixel array 400 may be constructed by repeating a unit tile 400a in columns and rows. Tile 400a includes sensing pixels 402S and a positioning pixel 402P located at the center of tile 400a. Accordingly, the positioning pixels 402P are repeatedly arranged in the pixel array 400. That is, the positioning pixels 402P have a regular pattern.
Referring to fig. 6, the pixel array 400 can be constructed by repeating unit tiles 400b in columns and rows. Tile 400b includes sensing pixels 402S and a plurality of positioning pixels 402P. The positioning pixels 402P may be classified into different pattern types based on the arrangement of adjacent positioning pixels 402P. In the illustrated embodiment, the type I pattern includes two adjacent positioning pixels 402P arranged diagonally, the type II pattern includes an isolated positioning pixel 402P, and the type III pattern includes three adjacent positioning pixels 402P forming a triangle. Because of the repetition of tile 400b, the different pattern types of positioning pixels 402P are also repeatedly arranged in the pixel array 400. That is, the positioning pixels 402P have a regular pattern. The different pattern types of positioning pixels 402P provide a further enhanced security feature. For example, vector comparisons may be performed separately within the type I, type II, and type III patterns, and the shift should pass the test within each pattern type. The shifts from the type I, type II, and type III patterns are then compared with each other, and these shifts should be the same to ultimately arrive at a match. That is, the vector comparison should derive the same shift ΔV_typeI based on the type I pattern of positioning pixels 402P, the same shift ΔV_typeII based on the type II pattern of positioning pixels 402P, and the same shift ΔV_typeIII based on the type III pattern of positioning pixels 402P; further, the shifts ΔV_typeI, ΔV_typeII, and ΔV_typeIII should also be equal to derive a match.
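The two-level check described above might be sketched as follows. This is illustrative Python only; the group labels, data structures, and sample vectors are assumptions, not the disclosed implementation.

```python
# Illustrative sketch: two-level verification for pattern-typed positioning
# pixels. Lower level: all vector shifts within one pattern type must agree.
# Higher level: the per-type shifts must agree with each other.

def group_shift(pairs):
    """Common shift for one pattern type, or None if internally inconsistent."""
    shifts = {(a[0] - r[0], a[1] - r[1]) for r, a in pairs}
    return shifts.pop() if len(shifts) == 1 else None

def typed_match(groups):
    """groups maps pattern type -> list of (reference, acquired) vector pairs."""
    per_type = [group_shift(pairs) for pairs in groups.values()]
    return None not in per_type and len(set(per_type)) == 1

# Every group below is displaced by the same ΔV = (3, -2).
groups = {
    "type I":   [((2, 3), (5, 1)), ((7, -2), (10, -4))],
    "type II":  [((0, 0), (3, -2))],
    "type III": [((1, 1), (4, -1)), ((6, 8), (9, 6)), ((-3, 4), (0, 2))],
}
print(typed_match(groups))  # True: all three pattern types yield (3, -2)
```

A forged image that reproduced the minutiae but not the sensor's positioning code would fail either the lower-level or the higher-level comparison.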
Referring to fig. 7, the positioning pixels 402P may be randomly distributed in the pixel array 400. That is, the positioning pixels 402P may have a random pattern. In addition, adjacent positioning pixels 402P may form various types of patterns even when randomly distributed throughout the pixel array 400. For example, the dashed circle in fig. 7 highlights two adjacent positioning pixels 402P forming a line pattern, in addition to the other isolated positioning pixels 402P. This combination increases the difficulty of counterfeiting. In various embodiments, for example in fig. 5-7, the percentage of positioning pixels 402P among the total number of pixels in the pixel array 400 may range from about 1% to about 10%. This range is not trivial. If the percentage of positioning pixels is less than about 1%, the security feature may not be sufficiently enhanced; if the percentage of positioning pixels is higher than about 10%, the area of the pixel array may not be fully utilized for fingerprint image acquisition. In other words, a fingerprint image acquired by a pixel array implementing the positioning pixels may have dark pixels occupying about 1% to about 10% of the total area of the fingerprint image.
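For a repeating-tile layout such as fig. 5, the 1%-10% budget reduces to simple arithmetic on one unit tile, since the tile fraction equals the array fraction. The 5x5 tile size below is an assumption chosen for illustration.

```python
# Illustrative sketch: positioning-pixel budget check for a repeating
# unit tile. One positioning pixel per hypothetical 5x5 tile gives 4%,
# which falls inside the disclosed ~1%-10% range.

def positioning_percentage(tile_rows, tile_cols, positioning_count):
    """Percentage of positioning pixels in the array for a repeated tile."""
    return 100.0 * positioning_count / (tile_rows * tile_cols)

pct = positioning_percentage(5, 5, 1)
print(pct)                  # 4.0
print(1.0 <= pct <= 10.0)   # True: within the disclosed budget
```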
In addition to the positioning pixels, color pixels may be added to the gray scale pixel array to add skin tone information of the fingerprint. In addition to comparing minutiae points of fingerprints, the skin tone information adds another layer of security. Fig. 8A-8G illustrate various embodiments of adding color pixels to an array of otherwise all gray scale pixels (denoted "W"). The positioning pixels also help identify the positions of the color pixels, by placing the color pixels beside the positioning pixels. This helps the software algorithm quickly identify the locations of the color pixels in the fingerprint image. The color pixels may also form the same pattern types as the positioning pixels. Referring to fig. 8A, three positioning pixels form a triangle shape, and three color pixels of red, green, and blue (RGB) are arranged in the same shape beside the positioning pixels. Referring to fig. 8B, two positioning pixels form a diagonal shape, and two color pixels (e.g., RB, GG, or other suitable combinations) are arranged in the same shape beside the positioning pixels. Referring to fig. 8C, two positioning pixels form a horizontal line shape, and two color pixels (e.g., RG, GB, or other suitable combinations) are arranged in the same shape beside the positioning pixels. Referring to fig. 8D, two positioning pixels form a vertical line shape, and two color pixels (e.g., GB, RG, or other suitable combinations) are arranged in the same shape beside the positioning pixels, with a column of gray scale pixels between the positioning pixels and the color pixels. Referring to fig. 8E, the tile 400b discussed above with reference to fig. 6 is reproduced with added color pixels. Color pixels are added beside the positioning pixels with the same pattern types.
In the illustrated embodiment, the type I pattern includes two diagonally arranged adjacent positioning pixels and two diagonally arranged adjacent color pixels, the type II pattern includes an isolated positioning pixel and an adjacent isolated color pixel, and the type III pattern includes three adjacent positioning pixels forming a triangle and three adjacent color pixels forming a triangle. Referring to fig. 8F, the number of color pixels may even exceed the number of gray scale pixels, and the number of gray scale pixels may exceed the number of positioning pixels. In the illustrated embodiment, gray scale pixels appear only in every other row and every other column, while color pixels fill the remaining pixels not occupied by the positioning pixels, and the positioning pixels are randomly or repeatedly distributed. Referring to fig. 8G, all of the sensing pixels in the pixel array may be color pixels, with the positioning pixels randomly or repeatedly distributed. As will be appreciated by those skilled in the art, combinations and arrangements of positioning pixels and color pixels other than those shown in fig. 8A-8G are possible.
Fig. 9 illustrates a flow chart of a method 900 for capturing and identifying fingerprint images from a user's finger illuminated by a display panel integrated with an OFPS, according to an example of the present disclosure. The method 900 will be described below with reference to the exemplary electronic device 100 shown in fig. 2.
At block 902, the method 900 begins with displaying a prompt on a screen. The screen of the electronic device 100 may be in a locked state. The prompt may be an icon (e.g., a fingerprint icon) or instructional text. The prompt highlights the sensing region 108 on the screen. The prompt is shown by light emitting pixels 222 below the sensing region 108. The light emitting pixels 222 may be organic light-emitting diodes (OLEDs). The pixels 222 outside the sensing region 108 may be turned off in the locked state or when a preset screen saver image is displayed.
At block 904, the method 900 detects an input object in the sensing region 108, for example, a user's finger 110. This detection may be achieved by sensing incident light variations at the optical sensing elements 207. Alternatively, the display panel 202 may be a touch screen and include touch sensor(s), and the detection may be implemented by the touch sensor(s). In some applications, the user's finger 110 need not physically touch the top surface 216 of the display panel 202. Rather, near field imaging may be used to sense a touch through a user's glove or other barrier (e.g., oil, gel, or moisture). When the user's finger 110 remains stable for more than a predetermined time, for example, about 100 milliseconds, the method 900 enters the biometric detection mode. Otherwise, the method 900 returns to block 902 to await a new user input.
At block 906, the prompt displayed on the screen is turned off and the light emitting pixels 222 below the sensing region 108 begin to illuminate the user's finger 110. Light 270 emitted from the light emitting pixels 222 passes through the cover glass 214 and reaches the user's finger 110. The user's finger 110 may include ridges 272 and valleys 274. Because the ridges 272 are closer to the top surface 216 than the valleys 274, the ridges 272 may reflect more light, while the valleys 274 may reflect less light. The light 270 is then reflected back to the light regulating layer 204.
At block 908, the method 900 filters the stray light component in the light 270 at the collimator 245. Because of the high aspect ratio of the apertures 246, the collimator 245 allows only light reflected from the sensing region 108 that is incident on the collimator 245 at or near the normal to pass through and ultimately reach the image sensing layer 206. The optical sensing elements 207 may be used to measure the intensity of light and convert the measured intensity into a pixel image of the user's finger 110. Stray light at a larger angle to the normal, on the other hand, impinges on the top surface of the collimator 245 or on a surface within an aperture 246 (e.g., the aperture sidewall) and is blocked from reaching the underlying image sensing layer 206. The aspect ratio of the apertures 246 is large enough (e.g., from about 5:1 to about 50:1) to prevent stray light from passing through the collimator 245. As an example, without the collimator 245, a light ray reflected from a valley 274 could propagate at a large angle relative to the normal direction and reach a sensor element directly below a ridge 272. The image produced by that sensor element would thus be blurred by mixing light from the areas of the ridge 272 and the valley 274. Such light is known as stray light. The large aspect ratio of the apertures 246 limits the light receiving cone to smaller angles, thereby improving the optical resolution of the system. In some embodiments, the apertures 246 are cylindrical or conical. The sidewalls of the apertures 246 may also include grooves or other structures to prevent stray light from reflecting off the walls and reaching the underlying OFPS 206.
At block 910, the method 900 acquires a fingerprint image at the OFPS 206. The sensing pixels 207 in the pixel array of the image sensing layer 206 convert incident light into electrical output. The pixel array may include sensing pixels that are monochrome (gray scale) pixels, color pixels, or a combination of monochrome and color pixels. The color pixels add skin tone information to the fingerprint image. The pixel array also includes positioning pixels uniformly or randomly distributed in the pixel array. The output of each sensing pixel 207 may correspond to one pixel in the fingerprint image having a gray level (or an RGB color if color pixels are present). The output of each positioning pixel may correspond to a dark pixel in the fingerprint image. In some embodiments, each sensing pixel may be configured to correspond to a specific wavelength of light, e.g., a sensing pixel for sensing a red wavelength below a red light emitting pixel (222R), a sensing pixel for sensing a green wavelength below a green light emitting pixel (222G), and a sensing pixel for sensing a blue wavelength below a blue light emitting pixel (222B).
At block 912, the method 900 obtains vectors representing characteristics (minutiae) of the fingerprint with reference to the locations of the positioning pixels. The vectors may also be classified into different groups based on the pattern types of the positioning pixels, e.g., a first group of vectors referencing positioning pixels of a first pattern type and a second group of vectors referencing positioning pixels of a second pattern type.
At block 914, the method 900 compares the acquired fingerprint image with a real reference image previously stored in memory (or a database). The comparison includes comparing the vectors of the two images. The comparison of vectors may be added on top of a comparison of the minutiae maps at block 914, with the vectors adding another layer of security to the minutiae comparison. Alternatively, only the vectors may be compared at block 914. The vectors may not be identical, because the fingerprint may be shifted with respect to the positioning pixels, but the shifts of the vectors should be identical to arrive at a match. Further, if the vectors are classified into different groups (pattern types), a two-level comparison may be performed. The lower level compares the vectors within the same group, whose shifts should be the same. The higher level compares the vectors and shifts between the different groups, which should also be the same, to arrive at a match. The skin tone information may optionally serve as another comparison criterion in arriving at a match. If the fingerprint images match, the method 900 proceeds to block 916 to unlock the screen. The pixels 222 below the sensing region 108 cease to illuminate and combine with the other pixels 222 outside the sensing region 108 to begin displaying the conventional desktop icons in the unlocked state. If the fingerprint images do not match, the method 900 returns to block 902 to await a new biometric detection.
While not intended to be limiting, one or more embodiments of the present disclosure provide a number of benefits to fingerprint identification systems (e.g., in consumer or portable electronic devices). For example, some of the sensing pixels in the pixel array are replaced with positioning pixels distributed in certain patterns. The positioning pixels provide reference points for identifying characteristics of the acquired fingerprint and optionally indicate the locations of adjacent color pixels that provide skin tone information. The anti-counterfeiting capability of the fingerprint identification system is thereby further improved.
In one exemplary aspect, the present disclosure relates to a sensing device. In an embodiment, the sensing device comprises: a pixel array, the pixel array comprising: a plurality of sensing pixels configured to capture minutiae points of a fingerprint; and a plurality of positioning pixels configured to provide a positioning code; and a plurality of microlenses disposed over the pixel array.
In another exemplary aspect, the present disclosure is directed to an apparatus. In an embodiment, the apparatus comprises: a filter array arranged in columns and rows; a light receiving element array below the filter array, the light receiving element array configured to convert incident light reflected from a fingerprint into a fingerprint image; and a plurality of opaque films disposed over a portion of the light receiving elements, the opaque films configured to add dark pixels to the fingerprint image.
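As a non-authoritative illustration of how the dark pixels contributed by the opaque films might later be located within a captured frame, a minimal thresholding sketch is shown below; the threshold value, image layout (a list of rows of intensities), and function name are assumptions made for illustration only.

```python
# Illustrative sketch: the opaque films guarantee that certain pixel sites
# read out as dark regardless of the finger pressed on the sensor, so the
# positioning code can be recovered from a frame by simple thresholding.

def dark_pixel_positions(image, threshold=8):
    """Return the (row, col) positions whose intensity falls below the
    threshold, i.e., candidate locations of opaque-film (dark) pixels."""
    return [(r, c)
            for r, row in enumerate(image)
            for c, val in enumerate(row)
            if val < threshold]
```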
In yet another exemplary aspect, the present disclosure is directed to a method. In an embodiment, the method comprises: capturing a fingerprint image by an image sensing device comprising a pixel array of a combination of sensing pixels and positioning pixels, the sensing pixels being configured to capture minutiae points in the fingerprint image and the positioning pixels being configured to provide a positioning code; calculating a vector of the minutiae points with reference to the positioning code; and comparing the vector to a reference vector generated from a reference fingerprint image to determine a match between the fingerprint image and the reference fingerprint image.
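The vector-calculation step of the method above can be sketched as follows, assuming each minutia and each positioning pixel is given as integer (x, y) coordinates and that each minutia is referenced to its nearest positioning pixel; the nearest-neighbor choice and the function name are illustrative assumptions, not the disclosed method.

```python
# Illustrative sketch: describe each minutia by a vector measured from the
# nearest positioning pixel, tying the resulting code to the sensor's
# physical positioning pattern rather than to absolute image coordinates.

def minutiae_vectors(minutiae, positioning_pixels):
    """For each minutia (x, y), find the nearest positioning pixel and
    return the (dx, dy) vector from that pixel to the minutia."""
    vectors = []
    for mx, my in minutiae:
        px, py = min(positioning_pixels,
                     key=lambda p: (p[0] - mx) ** 2 + (p[1] - my) ** 2)
        vectors.append((mx - px, my - py))
    return vectors
```

Because the vectors are relative to the positioning code, a uniformly shifted finger placement changes every vector by the same offset, which is exactly the property the comparison step exploits.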
The foregoing outlines features of several embodiments so that those skilled in the art may better understand the aspects of the disclosure. Those skilled in the art should appreciate that they may readily use the present disclosure as a basis for designing or modifying other processes and structures for carrying out the same purposes and/or achieving the same advantages of the embodiments introduced herein. Those skilled in the art should also realize that such equivalent constructions do not depart from the spirit and scope of the present disclosure, and that they may make various changes, substitutions, and alterations herein without departing from the spirit and scope of the present disclosure.
Example 1 is an image sensing apparatus, comprising: a pixel array, the pixel array comprising: a plurality of sensing pixels configured to capture minutiae points of a fingerprint; and a plurality of positioning pixels configured to provide a positioning code; and a plurality of microlenses disposed over the pixel array.
Example 2 is the image sensing apparatus of example 1, wherein all of the sensing pixels are gray scale pixels.
Example 3 is the image sensing apparatus of example 1, wherein all of the sensing pixels are color pixels.
Example 4 is the image sensing apparatus of example 1, wherein the sensing pixels include a plurality of gray scale pixels and a plurality of color pixels.
Example 5 is the image sensing apparatus of example 4, wherein the color pixel is positioned adjacent to the positioning pixel.
Example 6 is the image sensing apparatus of example 5, wherein the color pixels are arranged in the same pattern as that formed by adjacent positioning pixels.
Example 7 is the image sensing apparatus of example 1, wherein the positioning pixels are arranged in a repeating pattern in the pixel array.
Example 8 is the image sensing apparatus of example 1, wherein the positioning pixels are randomly distributed in the pixel array.
Example 9 is the image sensing apparatus of example 1, further comprising: a collimator over the microlenses; and an illumination layer over the collimator.
Example 10 is the image sensing apparatus of example 1, wherein the microlenses are disposed directly above the sensing pixels but not above the positioning pixels.
Example 11 is an optical fingerprint sensor, comprising: a filter array arranged in columns and rows; a light receiving element array below the filter array, the light receiving element array configured to convert incident light reflected from a fingerprint into a fingerprint image; and a plurality of opaque films disposed over a portion of the light receiving elements, the opaque films configured to add dark pixels to the fingerprint image.
Example 12 is the optical fingerprint sensor of example 11, wherein the opaque film is made of metal.
Example 13 is the optical fingerprint sensor of example 11, wherein the opaque film is disposed between the filter array and the light receiving element array.
Example 14 is the optical fingerprint sensor of example 11, wherein the positions of the portion of the light receiving elements form a regular pattern.
Example 15 is the optical fingerprint sensor of example 14, wherein the portion of the light-receiving elements forms at least two different sub-patterns within the regular pattern.
Example 16 is the optical fingerprint sensor of example 11, wherein the positions of the portion of the light receiving elements are randomly distributed.
Example 17 is the optical fingerprint sensor of example 11, wherein the filter array comprises a combination of color filters and transparent filters.
Example 18 is a method of fingerprint authentication, comprising: capturing a fingerprint image by an image sensing device comprising a pixel array of a combination of sensing pixels and positioning pixels, the sensing pixels being configured to capture minutiae points in the fingerprint image and the positioning pixels being configured to provide a positioning code; calculating a vector of the minutiae points with reference to the positioning code; and comparing the vector to a reference vector generated from a reference fingerprint image to determine a match between the fingerprint image and the reference fingerprint image.
Example 19 is the method of example 18, wherein the vectors include a first set of vectors referencing a first type of the positioning code and a second set of vectors referencing a second type of the positioning code.
Example 20 is the method of example 18, further comprising: determining a shift between the vector and the reference vector; and determining whether the shifts are substantially equal to determine the match.

Claims (10)

1. An image sensing apparatus comprising:
a pixel array, the pixel array comprising:
a plurality of sensing pixels configured to capture minutiae points of a fingerprint; and
a plurality of positioning pixels configured to provide a positioning code; and
a plurality of microlenses disposed over the pixel array.
2. The image sensing device of claim 1, wherein all of the sensing pixels are gray scale pixels.
3. The image sensing device of claim 1, wherein all of the sensing pixels are color pixels.
4. The image sensing device of claim 1, wherein the sensing pixels comprise a plurality of gray scale pixels and a plurality of color pixels.
5. The image sensing device of claim 4, wherein the color pixel is positioned adjacent to the positioning pixel.
6. The image sensing device of claim 5, wherein the color pixels are arranged in the same pattern as that formed by adjacent positioning pixels.
7. The image sensing device of claim 1, wherein the positioning pixels are arranged in a repeating pattern in the pixel array.
8. The image sensing device of claim 1, wherein the positioning pixels are randomly distributed in the pixel array.
9. An optical fingerprint sensor comprising:
a filter array arranged in columns and rows;
a light receiving element array below the filter array, the light receiving element array configured to convert incident light reflected from a fingerprint into a fingerprint image; and
a plurality of opaque films disposed over a portion of the light receiving elements, the opaque films configured to add dark pixels to the fingerprint image.
10. A method of fingerprint authentication, comprising:
capturing a fingerprint image by an image sensing device comprising a pixel array of a combination of sensing pixels and positioning pixels, the sensing pixels being configured to capture minutiae points in the fingerprint image and the positioning pixels being configured to provide a positioning code;
calculating a vector of the minutiae points with reference to the positioning code; and
comparing the vector to a reference vector generated from a reference fingerprint image to determine a match between the fingerprint image and the reference fingerprint image.

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US63/389,292 2022-07-14
US18/187,891 US20240021009A1 (en) 2022-07-14 2023-03-22 Optical Fingerprint Sensor with Enhanced Anti-Counterfeiting Features
US18/187,891 2023-03-22

Publications (1)

Publication Number Publication Date
CN117095427A true CN117095427A (en) 2023-11-21

Family

ID=88781885

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310510520.6A Pending CN117095427A (en) 2022-07-14 2023-05-08 Optical fingerprint sensor with enhanced security feature

Country Status (1)

Country Link
CN (1) CN117095427A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination