CN215869390U - Optical sensing system and electronic device

Info

Publication number: CN215869390U
Application number: CN202120662240.3U
Authority: CN (China)
Prior art keywords: light, display, optical imaging, array, imaging system
Legal status: Active
Other languages: Chinese (zh)
Inventors: 叶博纯, 翟宇佳, 陈远, M·Y·亚兹丹杜斯特, G·戈齐尼, C·H·泰, 张钧杰, 庄景桑
Current assignee: Apple Inc
Original assignee: Apple Inc
Priority claimed from: US 17/003,636 (US 2021/0089741 A1)
Application filed by: Apple Inc
Application granted; published as CN215869390U

Classifications

    • G06V40/1318: Fingerprint or palmprint sensors using electro-optical elements or layers, e.g. electroluminescent sensing
    • G06F3/044: Digitisers, e.g. for touch screens or touch pads, characterised by capacitive transducing means
    • G06V10/147: Details of sensors for image acquisition, e.g. sensor lenses, and optical characteristics of the acquisition or illumination arrangements
    • H01L27/156: Light-emitting semiconductor components formed on a common substrate in two-dimensional repetitive configurations, e.g. LED arrays
    • H10K59/12: Active-matrix OLED [AMOLED] displays
    • H10K59/40: OLEDs integrated with touch screens
    • H10K59/65: OLEDs integrated with inorganic image sensors
    • G06V40/19: Sensors for recognition of eye characteristics, e.g. of the iris
    • H01L27/1462: Imager structures; coatings
    • H01L27/14627: Imager structures; microlenses (optical elements associated with the device)
    • H01L27/14678: Contact-type imagers

Abstract

The utility model relates to an optical sensing system and an electronic device. Systems and methods for imaging through a display are disclosed herein. An optical imaging sensor is positioned at least partially behind a display and is configured to emit visible wavelength light that passes at least partially through the display to illuminate an object, such as a finger or an eye, in contact with or proximate to an outer surface of the display. Surface reflections from the object traverse the display stack, are received, and may be combined into an image of the object.

Description

Optical sensing system and electronic device
The present application is a divisional application of Chinese utility model patent application 202021839969.5, entitled "Optical sensing system and electronic device", filed on August 28, 2020.
Cross Reference to Related Applications
This patent application is a nonprovisional of, and claims the benefit under 35 U.S.C. § 119(e) of, U.S. Provisional Patent Application 62/904,211, filed September 23, 2019, the contents of which are incorporated herein by reference as if fully disclosed herein.
Technical Field
Embodiments described herein relate to optical biometric imaging through an electronic device display, and in particular to an optical fingerprint or retina imaging system with integrated optics configured for use behind a display of an electronic device.
Background
Electronic device displays ("displays") are typically formed from a stack of functional and structural layers (a "display stack") attached to or otherwise disposed beneath a protective cover. In many conventional implementations, the protective cover defines an exterior surface of an electronic device housing that incorporates the display. Conventional display stacks are intentionally designed to be opaque in order to increase contrast.
According to an aspect of the present disclosure, there is provided an optical sensing system, including: a first substrate formed of a transparent material; a light emitting element formed on the first substrate and configured to emit light perpendicular to a first surface of the first substrate; and a second substrate coupled to a second surface of the first substrate opposite the first surface, the second substrate comprising: a photodiode formed on the second substrate and configured to collect light perpendicular to the second surface; a collimator formed on and aligned with the photodiode, the collimator positioned between the photodiode and the second surface; and a microlens formed on and aligned with the collimator and configured to focus light incident to the microlens into the collimator, the microlens positioned between the collimator and the second surface; wherein: at least a part of light emitted from the light emitting element is reflected from a surface of an object adjacent to the first substrate so as to become reflected light; and the photodiode is configured to absorb at least a portion of the reflected light that passes through the first substrate, the microlens, and the collimator.
According to another aspect of the present disclosure, there is provided an electronic device configured to capture an image of an object touching a portion of a surface of the electronic device, the electronic device comprising: a transparent outer cover defining an interface surface operable to receive a touch from the object; a light emitting layer positioned below the transparent outer cover and comprising: a first thin-film transistor layer comprising a transparent substrate; and a pixel array disposed in a pattern on the transparent substrate and configured to emit light through the transparent outer cover to illuminate a contact area defined by a portion of the object in contact with the interface surface during the touch; and an optical imaging sensor coupled to a lower surface of the first thin-film-transistor layer and comprising: a second thin-film transistor layer comprising conductive traces; a photosensitive element coupled to the second thin-film-transistor layer and electrically coupled to the conductive traces; an infrared cut filter coupled to the photosensitive element and configured to reflect and/or absorb infrared light passing through the transparent substrate between pixels in the pixel array; a collimator array formed above the infrared cut filter and configured to narrow a field of view of the photosensitive element; and a microlens array formed over the collimator array, each respective microlens in the microlens array configured to focus light incident on the respective microlens into a respective collimator in the collimator array, the microlens array coupled to the lower surface of the first thin-film-transistor layer by an adhesive; wherein: at least a portion of the light emitted from the light emitting layer is reflected from the contact region, passes through the transparent substrate between pixels in the array of pixels, is focused into a respective one of the collimator array by a respective one of the microlens array, and is absorbed by the photosensitive element.
According to yet another aspect of the present disclosure, there is provided an optical sensing system for capturing light incident to a display of an electronic device, the optical sensing system comprising: a thin film transistor substrate coupled to a back surface of the display opposite a front surface of the display from which the display emits light; an array of photosensitive elements each coupled to the thin film transistor substrate, the photosensitive elements positioned between the thin film transistor substrate and the back surface of the display and oriented to collect light incident to the front surface of the display; a collimator disposed over a photodiode and between the photodiode and the back surface of the display; and a microlens disposed above the collimator and positioned between the collimator and the back surface; wherein: light emitted from the display is reflected from a finger adjacent the front surface of the display; and at least a portion of the reflected light passes through the display, the microlens, and the collimator and is collected by the photodiode.
The electronic device may also include an optical imaging system. Some optical imaging systems, such as a front-facing camera or an ambient light sensor, are typically configured to be attached to or otherwise disposed under the same exterior surface of the housing as the display. Due to this design constraint (and the opacity of conventional display stacks), electronic devices incorporating both displays and "front-facing" optical imaging systems are typically constructed with a protective cover that extends a distance beyond the perimeter of the display to reserve space to accommodate the front-facing optical imaging system. However, this conventional solution (1) undesirably increases the apparent size of the bezel area surrounding the display, and (2) undesirably increases the size and volume of the electronic device housing.
Disclosure of Invention
The embodiments relate to an optical sensing system configured to be positioned behind a light-emitting element disposed on a transparent substrate (a first substrate). More specifically, in these embodiments, the light-emitting element is oriented to emit light perpendicular to a first surface of the transparent substrate. The optical sensing system also includes a second substrate, which may be transparent and which is coupled to a second surface of the first substrate opposite the first surface.
In these embodiments, the second substrate includes a photodiode oriented to collect light perpendicular to the second surface. Such light may be incident on the transparent substrate and may exit the second surface through the transparent substrate along a path toward the photodiode.
In these embodiments, the optical sensing system further comprises a collimator disposed above and aligned with the photodiode, the collimator positioned between the photodiode and the second surface. Further, the optical sensing system includes a convex microlens disposed above and aligned with the collimator. More specifically, the microlens is positioned between the collimator and the second surface of the transparent substrate. In this configuration, the microlens is configured to focus light incident to the microlens into the collimator and toward the photodiode.
Due to this configuration, at least a portion of the light emitted from the light-emitting element may be reflected from a surface of an object (e.g., a finger, a stylus, etc.) adjacent to the transparent substrate. At least a portion of this reflected light may pass through the transparent substrate (e.g., through an area of the transparent substrate adjacent to or otherwise at the perimeter of the light-emitting element), through the microlens and the collimator, and may be absorbed by the photodiode.
Embodiments described herein may include configurations in which the photodiode, collimator, and microlens are formed by a thin film transistor fabrication process.
Embodiments described herein may include an infrared cut filter disposed between the collimator and the photodiode or between the collimator and the microlens or between the microlens and the second surface. In further embodiments, the transparent substrate may include an infrared cut filter. In other cases, an infrared cut filter may not be required.
Embodiments described herein may include configurations in which the light-emitting element is one element of an array of light-emitting elements disposed on a transparent substrate. In these embodiments, the array of light-emitting elements can be pixels of an electronic device display.
Embodiments described herein relate to an electronic device configured to capture an image of an object (such as a finger or a stylus) touching a portion of a surface of the electronic device. In these and related embodiments, the electronic device includes a transparent outer cover (also referred to as a cover glass, a cover, a protective outer cover, a housing surface, etc.) that defines an interface surface. The interface surface is operable to receive a touch from the object.
The electronic device also includes a light emitting layer positioned below the transparent outer cover. The light-emitting layer includes a first thin-film-transistor layer having a transparent substrate and a pixel array disposed in a pattern on the transparent substrate. The pixel array (also referred to as an array of light-emitting elements, light-emitting diodes, organic pixels, etc.) is configured to emit light through the transparent outer cover to illuminate a contact area defined by the portion of the object in contact with (e.g., wetting) the interface surface during the touch.
In these exemplary embodiments, the electronic device further includes an optical imaging sensor coupled to a lower surface of the first thin-film-transistor layer. The optical imaging sensor includes a second thin-film-transistor layer having conductive traces, a photosensitive element (e.g., a photodiode, an organic photodiode, a micro-solar device, a phototransistor, etc.) coupled to the second thin-film-transistor layer and electrically coupled to the conductive traces, an infrared cut-off filter coupled to the photosensitive element (configured to reflect and/or absorb infrared light passing through a transparent substrate positioned between pixels in the pixel array), a collimator array formed over the infrared cut-off filter (configured to reduce a field of view of the photosensitive element), and a microlens array formed over the collimator array. In particular, each respective microlens in the microlens array is configured to focus light incident to the respective microlens into a respective one of the collimator array. In these configurations, the microlens array is coupled to the lower surface of the first thin-film-transistor layer by an adhesive.
Due to this architecture, at least a portion of light emitted from the light emitting layer (e.g., from at least one pixel of the layer) may be reflected from the contact region, pass through the transparent substrate (e.g., located between pixels in the pixel array), be focused by a respective microlens in the microlens array into a respective collimator in the collimator array, and then be absorbed by the photosensitive element.
Embodiments described herein may include configurations in which the object that engages, touches, wets, or otherwise interacts with the interface surface is a finger. In these examples, light reflected from the finger and absorbed by the photosensitive element may be used to construct a fingerprint image (in configurations in which the object is an eye, a retinal image may be constructed instead).
Embodiments described herein may include configurations in which the light-emitting layer is a display such as an organic light-emitting diode display or a micro light-emitting diode display.
In some examples, the collimator array includes an opaque layer (e.g., ink, a reflective backing, a metal layer, a non-conductive layer, etc.) disposed over the photosensitive elements and an array of apertures defined through the opaque layer, each aligned along a common axis. In some cases, the apertures defined through the opaque layer of the collimator may be aligned perpendicular to the photosensitive elements, but this may not be necessary. In other cases, the aperture defined through the opaque layer of the collimator may be defined at an angle relative to the normal to the photosensitive element.
In these and related embodiments, a touch-sensitive layer (or pressure-sensitive layer) may be disposed between the transparent outer cover and the light emitting layer. An exemplary touch-sensitive layer is a capacitive touch sensor.
Further embodiments described herein relate to an optical sensing system for capturing light incident to a display of an electronic device. In these examples, the optical sensing system includes a thin film transistor substrate coupled to the back surface of the display. The back surface of the display is opposite the front surface of the display from which the display emits light. The optical sensing system also includes an array of photosensitive elements, each photosensitive element coupled to the thin film transistor substrate. More specifically, each photosensitive element in the array is positioned between the thin film transistor substrate and the back surface of the display and is oriented to collect light incident on the front surface of the display.
Such embodiments also include a collimator disposed above a photodiode of the array and between the photodiode and the back surface of the display. In addition, the optical sensing system further includes a microlens disposed above the collimator and positioned between the collimator and the back surface.
Due to this configuration, light emitted from the display may be reflected from a finger adjacent to the front surface of the display. In this way, at least a portion of the reflected light may pass through the display, the microlens, and the collimator, and may be collected by the photodiode to image a portion of a fingerprint or retina.
Related embodiments include a flexible circuit capable of communicatively coupling the thin-film-transistor layer to a processor of the electronic device.
Drawings
Reference will now be made to the exemplary embodiments illustrated in the drawings. It should be understood that the following description is not intended to limit the present disclosure to any one included embodiment. On the contrary, the disclosure provided herein is intended to cover alternatives, modifications, and equivalents as may be included within the spirit and scope of the embodiments defined by the appended claims.
FIG. 1A depicts an electronic device that may incorporate a display stack suitable for imaging through a display.
FIG. 1B depicts a simplified block diagram of a portion of the electronic device of FIG. 1A.
FIGS. 2A-2B illustrate exemplary simplified block diagrams of a cross-section of FIG. 1A taken through line A-A, showing an optical imaging system such as described herein.
FIG. 3 illustrates an exemplary simplified cross-section of a display stack such as described herein in connection with an optical imaging system.
FIG. 4A illustrates an exemplary cross-section of a collimator array of an optical imaging system such as described herein.
FIG. 4B illustrates an exemplary cross-section of another exemplary collimator array of an optical imaging system such as described herein.
FIG. 4C illustrates an exemplary cross-section of another exemplary collimator array of an optical imaging system such as described herein.
FIG. 5A illustrates an exemplary arrangement of microlenses of an optical imaging system such as described herein.
FIG. 5B illustrates another exemplary arrangement of microlenses of an optical imaging system such as described herein.
FIG. 6 is a simplified flowchart illustrating exemplary operations of a method for capturing an image of an object touching a display, such as described herein.
FIG. 7 is a simplified flow chart illustrating exemplary operations of a method for fabricating an optical imaging system with integrated optics, such as those described herein.
The use of the same or similar reference symbols in different drawings indicates similar, related, or identical items.
The use of cross-hatching or shading in the drawings is generally provided to clarify the boundaries between adjacent elements and also to facilitate the legibility of the drawings. Thus, the presence or absence of cross-hatching or shading does not convey or indicate any preference or requirement for a particular material, material property, proportion of elements, size of elements, commonality of similarly illustrated elements, or any other characteristic, property, or attribute of any element shown in the figures.
Similarly, some figures include vectors, rays, traces, and/or other visual representations of one or more exemplary paths, which may include reflection, refraction, diffraction, and the like, that may be taken through one or more media by one or more photons originating from one or more light sources shown in, or in some cases omitted from, the figures. It is to be understood that such simplified visual representations of light are provided merely to facilitate an understanding of the various embodiments described herein, and thus may not necessarily be presented or illustrated to scale or with angular precision or accuracy, and are not intended to indicate any preference or requirement for the illustrated embodiments to receive, transmit, reflect, refract, focus, and/or diffract light at any particular illustrated angle, orientation, polarization, color, or direction, to the exclusion of other embodiments described or referenced herein.
Further, it should be understood that the proportions and dimensions (relative or absolute) of the various features and elements (and collections and groupings thereof), and the boundaries, spacings, and positional relationships presented between them, are provided in the drawings solely to facilitate an understanding of the various embodiments described herein, and thus may not necessarily be presented or illustrated to scale and are not intended to indicate any preference or requirement for the illustrated embodiments to the exclusion of embodiments described with reference thereto.
Detailed Description
Embodiments described herein relate to an electronic device that includes a display or other light emitting layer and an optical imaging system configured to capture light incident to a surface of the display through which the display emits light. The optical imaging system may be fabricated using the same or similar thin film transistor fabrication processes as are used to fabricate the display. Thus, the imaging optics of the optical imaging system may be formed directly over (and thus precisely and accurately aligned with) the photosensitive elements of the optical imaging system.
The optical imaging system is configured to operate in a visible wavelength band and is positioned on a back surface of and/or integrated within an active display area of a display of the electronic device. As used herein, the phrase "back surface" of the active display area of the display refers to the surface of the display opposite the surface from which the display emits light (which surface is referred to herein as the "front surface" of the display).
In these configurations, the optical imaging system or an electronic device incorporating the optical imaging system may command or otherwise initiate a process that causes a display of the electronic device to generate light to illuminate an object or a portion of an object in contact with or proximate to a front surface of the display. Thus, at least a portion of the light emitted by the display may be reflected from the outer surface (or in some cases, from the inner surface) of the object and subsequently redirected to be incident on the front surface of the display. In turn, at least a portion of this reflected light may pass through substantially transparent regions of the display, such as regions between adjacent or neighboring pixels disposed on the transparent substrate.
This reflected light that has passed through the display can be collected by an optical imaging system positioned on and/or coupled to the back surface of the display. In particular, for the embodiments described herein, the optical imaging system includes a microlens array positioned on the back surface of the display. Each microlens in the microlens array is oriented and configured to focus light passing through the display into a respective collimator of a collimator array. Each respective collimator is configured to (1) direct light directed generally parallel to a central axis of the collimator (e.g., within a selected acute angle relative to the central axis of the collimator) onto the photosensitive surface of the photodiode, and (2) reflect and/or absorb all other light within the collimator away from the photosensitive surface of the photodiode.
Due to this configuration, light reflected from objects near the front surface of the display, passing through the display, and oriented substantially parallel to the normal of the front surface of the display may be received and absorbed by the photodiode. Thereafter, the electrical signal generated or modified by the photodiode as a result of the absorption of light may be received by a circuit or processor, which may in turn measure or determine one or more characteristics of the light received by the photodiode. Exemplary characteristics include, but are not limited to: brightness; color; spectral content; frequency; wavelength; and so on. These examples are not exhaustive, and in other embodiments other characteristics of the light and/or changes in one or more characteristics of the light over time may be measured, tracked, or otherwise captured by a circuit or processor, such as described herein. For simplicity of description, the following embodiments refer to an optical imaging system configured to detect and measure or determine the brightness of light received by a photodiode. However, it is to be understood that this is merely an example, and in other embodiments, other characteristics or combinations of characteristics may be used.
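As an illustration of the collimation behavior described above, the following Python sketch models a collimator as a simple acceptance cone: a ray contributes to the photodiode's brightness measurement only if its direction deviates from the collimator's central axis by no more than a selected acute half-angle. The 5-degree half-angle, the function names, and the sample rays are illustrative assumptions, not values or interfaces taken from this disclosure.

    import math

    def passes_collimator(ray_angle_deg: float, acceptance_half_angle_deg: float = 5.0) -> bool:
        """Return True if a ray whose direction deviates from the collimator's
        central axis by ray_angle_deg lies within the acceptance cone and so
        reaches the photosensitive surface; otherwise the collimator walls
        reflect and/or absorb it. The 5-degree half-angle is a placeholder."""
        return abs(ray_angle_deg) <= acceptance_half_angle_deg

    def measured_brightness(rays):
        """Sum the intensity of the rays that pass the collimator, standing in
        for the signal the photodiode produces when it absorbs the accepted
        light; rays are (angle_deg, intensity) pairs."""
        return sum(intensity for angle_deg, intensity in rays if passes_collimator(angle_deg))

    # Near-normal reflections from the object survive; oblique stray light does not.
    print(measured_brightness([(0.0, 1.0), (3.0, 0.8), (12.0, 0.9), (40.0, 0.5)]))  # 1.8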
In many embodiments described herein, the optical imaging system includes an array of photodiodes (or more generally, an array of photosensitive elements) arranged in a pattern below the back surface of the display. Each photodiode in the array may be associated with a respective one or more collimators, each collimator in turn being associated with a respective one of the microlenses oriented to face the back surface of the display to capture light passing through the display, such as described above.
In these embodiments, the light absorbed by each photodiode in the array may be measured and combined by circuitry and/or a processor such as described above into a two-dimensional image of the outer surface of the object adjacent to or in contact with the front surface of the display.
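As a simple illustration of how the per-photodiode measurements described above could be assembled into a two-dimensional image, the following Python sketch arranges one brightness reading per photodiode into rows and columns and normalizes the result to 8-bit grayscale. The array dimensions, the normalization, and the sample readings are assumptions made for illustration only.

    def assemble_frame(readings, rows, cols):
        """Arrange per-photodiode brightness readings (listed row by row, one
        value per element of the photodiode array) into a two-dimensional
        image, normalized here to 8-bit grayscale for illustration."""
        if len(readings) != rows * cols:
            raise ValueError("expected one reading per photodiode")
        lo, hi = min(readings), max(readings)
        span = (hi - lo) or 1
        scaled = [int(255 * (value - lo) / span) for value in readings]
        return [scaled[r * cols:(r + 1) * cols] for r in range(rows)]

    # Hypothetical 4x4 tile of readings: larger values where fingerprint ridges
    # reflect more display light back through the inter-pixel regions.
    tile = assemble_frame([12, 40, 38, 11,
                           41, 95, 90, 39,
                           44, 93, 97, 42,
                           13, 37, 43, 10], rows=4, cols=4)
    for row in tile:
        print(row)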
Thus, more generally and broadly, embodiments described herein facilitate imaging, through a display, of objects near a front surface of the display. One or more images captured by an optical imaging system such as described herein may have any suitable resolution, may be grayscale or color, and may be used for any suitable purpose by an electronic device incorporating the optical imaging system. Exemplary purposes include, but are not limited to: imaging a fingerprint of a finger touching the front surface of the display; imaging a retina adjacent to the display; proximity sensing; optical communication; image or video capture; touch input sensing and positioning; touch input gesture sensing; and so on. For simplicity of description, the following embodiments refer to an exemplary implementation in which an optical imaging system is utilized by an electronic device to capture an image of a fingerprint touching an external surface, such as a protective outer layer (also referred to as a "cover glass"), above an active display area of a display of the electronic device. However, it should be understood that this is merely one example, and in other embodiments, the optical imaging system may be utilized by the electronic device to capture images or other optical information in any other suitable manner.
In some embodiments, an electronic device includes a housing that supports and encloses a display having an active display area oriented to emit light through a transparent portion of the housing or a cover glass coupled to a body portion of the housing. An optical imaging system such as described herein may be adhered, attached, formed on, or otherwise coupled to a back surface of the display that opposes at least a portion of an active display area within a housing of an electronic device.
With this configuration, when a user of the electronic device touches the housing over the active display area and over the optical imaging system (e.g., to interact with content shown on the display), the optical imaging system may obtain one or more two-dimensional images of the user's fingerprint and/or determine one or more other characteristics or properties of the user's finger. For example, the optical imaging system may be configured to perform operations including, but not limited to: obtaining an image or a series of successive images of a user's fingerprint; determining a vein pattern of a user; determining blood oxygenation of the user; determining a pulse of the user; determining whether a user is wearing a glove; determining whether a user's finger is wet or dry; and so on.
As described above, an optical imaging system (such as described herein) may be used by an electronic device for any suitable imaging, sensing, or other data aggregation purpose without affecting the apparent size of the bezel area surrounding the active display area of the electronic device display. Exemplary uses include, but are not limited to: ambient light sensing; proximity sensing; depth sensing; receiving structured light; optical communication; position determination; biometric imaging (e.g., fingerprint imaging, iris imaging, face recognition, vein imaging, etc.); determining optical, physical, or biological attributes (e.g., reflectance spectra, absorption spectra, etc.); and so on.
In some implementations, multiple discrete optical imaging systems can be associated with different regions of the same active display area of the same display. For example, a first optical imaging system may be disposed relative to a lower portion of a display and a second optical imaging system may be disposed behind an upper portion of the same display.
For simplicity of description, many of the embodiments below refer to exemplary configurations in which a single optical imaging system is positioned at least partially behind a lower region or portion of an active display area of a display of an electronic device. However, it should be understood that these embodiments and their equivalents described herein may be altered or adjusted to incorporate any suitable number of optical imaging systems positioned in a plurality of locations relative to an active display area or non-display surface of an electronic device and configured for the same or different imaging, sensing, or data aggregation purposes. For example, the optical imaging system may additionally or alternatively be configured to operate with infrared light or ultraviolet light. In such examples, the active display area may include an infrared light emitting element or an ultraviolet light emitting element adjacent to a visible light emitting element of the active display area. In other cases, the optical imaging system may include one or more light-emitting elements configured to emit light of a suitable wavelength through the back surface of the display.
For example, in some embodiments, the optical imaging system extends across the entire active display area such that the optical imaging system can image a touch of any area of the active display area. In another example, a first optical imaging system positioned relative to a first area of an active display area of a display of an electronic device may be configured to obtain a fingerprint image of a finger of a user of the electronic device, while a second optical imaging system positioned relative to a second area of the active display area may be configured to obtain a retina image of the eye of the user.
In many embodiments, thin film transistor fabrication techniques, or more generally semiconductor processing methods, may be used to fabricate optical imaging systems such as those described herein. In these implementations, the optics associated with the photodiodes (e.g., collimators, microlenses, filters, etc.) and the photodiodes themselves may be formed in the same process. Due to this manufacturing technique, alignment between the micro-scale optics and the photosensitive surface of the photodiode can be ensured. In these embodiments, a thin-film-transistor layer including one or more electrical traces may be formed on a rigid or flexible substrate, which may be transparent or opaque. The photodiode array may be formed on the thin-film transistor layer.
One or more collimators may be formed and cured on the photosensitive surface of each photodiode in the photodiode array. Microlenses, which typically take a convex shape, can be formed and cured over each collimator. The stack of layers from the thin-film-transistor layer substrate to the microlenses may then be adhered to the back surface of the display, under an active display area region of the display that exhibits at least partial transparency (e.g., the region between adjacent or neighboring pixels remains transparent).
These foregoing and other embodiments are discussed below with reference to fig. 1A-7. However, those skilled in the art will readily appreciate that the detailed description given herein with respect to these figures is for explanatory purposes only and should not be construed as limiting.
Fig. 1A depicts an electronic device 100 that includes a housing 102 that encloses a stack of layers (referred to as a "display stack") that cooperate to define a digital display configured to present visual content to convey information to a user of the electronic device 100, to request touch or force input from the user, and/or to provide entertainment to the user.
The display stack may comprise, for example and in no particular order: a touch input layer; a force input layer; a tactile output layer; a thin-film transistor layer; an anode layer; a cathode layer; an organic layer; an encapsulation layer; a reflector layer; a reinforcing layer; an injection layer; a transport layer; a polarizing layer; an anti-reflection layer; a liquid crystal layer; a backlight layer; one or more adhesive layers; a compressible layer; an ink layer; a mask layer; and so on.
For simplicity of description, the following embodiments refer to a display stack implemented using organic light emitting diode display technology, which may include, among other layers: a reflective backing layer; a thin film transistor layer; an encapsulation layer; and a light emitting layer. However, it should be understood that this is merely one illustrative example embodiment, and that other displays and display stacks may be implemented using other display technologies or combinations thereof. An example of another display technology that may be used with display stacks and/or displays such as those described herein is a micro light emitting diode display.
The display stack also typically includes input sensors (such as force input sensors and/or touch input sensors) to detect one or more characteristics of a user's physical interaction with an active display area 104 defined by the display stack of the display of the electronic device 100. The active display area 104 is generally characterized by an arrangement of individually controllable, physically separated and addressable pixels or sub-pixels distributed at one or more pixel densities and in the form of one or more pixel or sub-pixel distribution patterns. In more general terms, the active display area 104 is generally characterized by an arrangement of individually addressable discrete light emitting areas or regions that are physically separated from adjacent or other nearby light emitting areas. In many embodiments, the light emitting areas defining the active display area 104 are disposed or formed on a transparent substrate, which may be flexible or rigid. Exemplary materials from which a transparent substrate such as described herein may be formed include polyethylene terephthalate, glass, sapphire, and silicon carbide. In other cases, a partially opaque substrate may be used; in such implementations, at least a portion of the substrate between pixels defined thereon can be partially or fully optically transparent.
Further, exemplary input characteristics that may be detected by input sensors of electronic device 100 that may be disposed above or below the display stack or otherwise integrated with the display stack may include, but are not limited to: a touch position; a force input location; touch gesture path, length, duration, and/or shape; force gesture path, length, duration, and/or shape; the magnitude of the force input; a plurality of simultaneous force inputs; a plurality of simultaneous touch inputs; and so on.
These configurations allow a user 106 of the electronic device 100 to interact with content shown in the active display area 104 of the display by physically touching and/or applying a force, with a finger, to an input surface over any area or a particular area of the active display area 104.
In these embodiments, as with other embodiments described herein, the display stack is additionally configured to facilitate imaging through the display. In particular, the display stack further includes and/or is coupled to an optical imaging system positioned relative to the back surface of the display stack. With this configuration, the optical imaging system may be operated by the electronic device 100 to capture a two-dimensional image of light incident on the front surface area of the display stack. For example, when the user 106 touches the display to interact with content shown in the active display area 104, an optical imaging system of the electronic device 100 may be operated by the electronic device to capture an image of a fingerprint.
More specifically, in one example, the display stack defines an imaging aperture or an array of discrete and separate imaging apertures (not shown) through the backing layer or other opaque layer defining the rear surface of the display stack, allowing light to travel through the display stack from the front surface to the rear surface between two or more organic light-emitting diode pixels (herein, "inter-pixel" regions). In some cases, the imaging aperture is rectangular in shape and is disposed on the lower region 108 of the active display area 104, although this may not be necessary.
In other cases, the imaging aperture is circular or elliptical and is disposed in a central region of the active display area 104. Typically, the imaging aperture is larger than the fingerprint of the user 106, but this may not be necessary, and a smaller aperture may be suitable. For example, in some embodiments, the backing layer may be omitted entirely; the imaging aperture may be the same size and shape as the active display area 104.
In these embodiments, an optical imaging system is positioned at least partially below the imaging aperture to collect and measure or determine light directed through the inter-pixel regions of the display stack, which travels through the display stack in a direction substantially opposite to the direction of travel of light emitted by the display stack. More specifically, the optical imaging system is configured to capture light that is incident on the front surface of the display, passes through the inter-pixel regions of the display stack, and exits the back surface of the display.
In many embodiments, the optical imaging system may be configured to operate with a display such that the display emits light to illuminate an object in contact with a front surface of the display (or an outer protective layer covering the front surface of the display). In these examples, light emitted from one or more light emitting areas (e.g., pixels) of the display may be reflected from a surface of the object and may then travel through the display stack, through the imaging aperture, and may be collected/absorbed by at least one photosensitive area or region (e.g., a photodiode) of the optical imaging system.
In some cases, the display may be configured to emit light in a particular area of the active display area 104 by coordinating with input sensors associated with the display. For example, as described above, the electronic device 100 may include a touch input sensor. In this example, the touch input sensor may be configured to detect a wetted area of the fingerprint of the user 106 (herein, a "contact area"). Once a contact area is detected, the display may be configured to illuminate the contact area with light of a particular wavelength, brightness, or other pattern. For example, in some embodiments, the display may be configured to illuminate the contact area with blue light of a particular brightness. In other cases, the display may be configured to illuminate the contact area with green light of a particular brightness. In still other cases, the display may be configured to display a pattern or other two-dimensional image beneath the contact area. In further examples, the display may be configured to illuminate the contact area in a time-varying pattern or color.
It should be understood that the foregoing examples are not exhaustive; a display such as described herein may coordinate and/or otherwise cooperate with touch input sensors of an electronic device in any suitable manner to illuminate one or more detected contact areas (and/or other regions associated therewith, such as peripheral regions of the detected contact areas) in any suitable manner. For simplicity of description, the phrase "illumination operation" is used herein to describe a function or operation of a display of an electronic device that causes a particular region or sub-region of an active display area to emit light in a particular or selected manner so as to illuminate an object in contact with, or otherwise adjacent to, a front surface of the display or, alternatively, an outer surface of a protective outer layer covering the front surface of the display.
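As a rough illustration of the illumination operation just described, the following Python sketch shows the coordination in order: the touch input sensor reports a contact area, the display illuminates that area with a selected color and brightness, the optical imaging system captures the reflected light, and the display then resumes its normal content. The three object interfaces (touch_sensor, display, imaging_system), their method names, and the default blue color are hypothetical placeholders, not APIs or values from this disclosure.

    from dataclasses import dataclass
    from typing import Optional, Tuple

    @dataclass
    class ContactArea:
        # Bounding box of the wetted contact area, in display coordinates
        # (field names are illustrative).
        x: int
        y: int
        width: int
        height: int

    def illumination_operation(touch_sensor, display, imaging_system,
                               color: Tuple[int, int, int] = (0, 0, 255),
                               brightness: float = 1.0):
        """Coordinate one illumination operation: detect the contact area,
        illuminate it (here, blue light at a chosen brightness), capture the
        reflected light with the optical imaging system, then restore the
        normal display content. Returns the captured frame, or None if no
        contact area was detected."""
        area: Optional[ContactArea] = touch_sensor.detect_contact_area()
        if area is None:
            return None
        display.illuminate(area, color=color, brightness=brightness)
        frame = imaging_system.capture(area)   # read the photodiode array
        display.restore(area)                  # resume normal display content
        return frame

In practice, the illuminated region, color, brightness, and timing would be implementation-specific choices, as the examples in the preceding paragraph suggest.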
As described above, the illumination operation may be commanded by an optical imaging system, such as the optical imaging system described with reference to electronic device 100. Also as described above, the optical imaging system may command or otherwise cause the illumination operation to occur for any suitable imaging or light detection purpose. In some examples, the optical imaging system may be configured to obtain an image of the retina of the user 106. In this example, once the user's eye is within a threshold distance of the front surface of the display, the optical imaging system may command an illumination operation. In other examples, such as those described below, the optical imaging system may be configured to obtain an image of the fingerprint of the user 106. In this example, once the touch input sensor or force input sensor detects a finger of the user 106, the optical imaging system may command an illumination operation. It should be appreciated that these foregoing examples are not exhaustive, and in other embodiments, other configurations of the optical imaging system may be configured for other imaging purposes, and thus, any suitable implementation-specific method of triggering an illumination operation on an imaging subject may be used.
As described above, and for simplicity of description, the following embodiments refer to an optical imaging system 110 configured to image a user's fingerprint. In these configurations, electronic device 100 may obtain an image of a fingerprint of user 106 in response to a touch or force input sensor detecting at least one contact area and correspondingly in response to a display performing an illumination operation. These operations are collectively referred to herein as "fingerprint imaging operations".
In some implementations, the optical imaging system 110 of the electronic device 100 illuminates, or otherwise causes to be illuminated, a finger of the user 106 with light in a visible wavelength band (e.g., green light, blue light, etc.) during a fingerprint imaging operation. Light in the visible wavelength band may be selected to maximize reflection from the outer surface of the finger of the user 106, thereby minimizing or eliminating subsurface reflections (e.g., light that is at least partially reflected and diffused by the subsurface layers of the user's skin) that may otherwise be received as noise by the optical imaging system.
In some embodiments, the optical imaging system 110 instructs a display of the electronic device 100 to illuminate, with visible wavelength light, an area of the display under the finger of the user 106 as detected by an input sensor of the electronic device 100. In other examples, the optical imaging system 110 commands the display to illuminate the periphery of the user's finger with visible wavelength light. In some examples, the optical imaging system 110 of the electronic device 100 instructs the display to illuminate discrete portions of the finger of the user 106, sequentially or in a particular pattern, with visible wavelength light at one or more frequencies or discrete frequency bands.
From the foregoing examples, it should be appreciated that illuminating a finger of user 106 with visible wavelength light during a fingerprint imaging operation may be performed in a number of suitable ways. For example, in some cases, the optical imaging system of electronic device 100 illuminates the user's finger with a pulsed (continuous or discrete) or steady light in the visible wavelength band. In other examples, the optical imaging system of electronic device 100 illuminates the finger of user 106 with visible wavelength light emitted in a particular modulation pattern or frequency.
In further examples, the optical imaging system 110 of the electronic device 100 illuminates the finger of the user 106 by alternating between frequencies or frequency bands of light within a visible wavelength band at a particular frequency, modulation, pulse pattern, waveform, and the like.
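The following Python sketch illustrates one way a time-varying illumination schedule such as those described above could be expressed: display frames alternate between two visible colors at a chosen modulation frequency while the brightness is pulsed. The frame rate, modulation rate, and colors are illustrative assumptions only, not parameters taken from this disclosure.

    import math

    def illumination_schedule(duration_s=0.5, frame_rate_hz=60, modulation_hz=4,
                              colors=((0, 0, 255), (0, 255, 0))):
        """Build a frame-by-frame schedule that alternates between two visible
        colors (here blue and green) at modulation_hz while pulsing brightness
        sinusoidally; returns (time_s, color, brightness) tuples."""
        schedule = []
        for i in range(int(duration_s * frame_rate_hz)):
            t = i / frame_rate_hz
            color = colors[int(t * modulation_hz) % len(colors)]
            brightness = 0.5 * (1.0 + math.sin(2.0 * math.pi * modulation_hz * t))
            schedule.append((round(t, 4), color, round(brightness, 3)))
        return schedule

    # First few display frames of the schedule:
    for entry in illumination_schedule()[:4]:
        print(entry)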
In other examples, optical imaging system 110 commands a display of electronic device 100 to illuminate a finger of user 106, while active display area 104 of the display of electronic device 100 also presents a visible light image. In other words, from the perspective of the user 106, the portion of the display below the fingerprint may not be particularly or differently illuminated from other portions of the display; the display may continue to render any static or animated image or series of images that appear on the display before the user touches the display.
In further examples, the display of electronic device 100 may locally increase or decrease the brightness under the user's finger, may locally increase or decrease the contrast under the user's finger, may locally increase or decrease the saturation under the user's finger, etc., while optical imaging system 110 and the display are performing a fingerprint imaging operation.
In other examples, the optical imaging system 110 of the electronic device 100 need not trigger an illumination operation of the finger of the user 106 with only visible wavelength light. For example, the optical imaging system may also be configured to illuminate the finger of the user 106 with infrared light in order to detect or otherwise determine the user's pulse or blood oxygen content. In some cases, the optical imaging system 110 is configured to perform the fingerprint imaging operation substantially simultaneously with the operation of detecting the pulse of the user 106 to increase a confidence that the fingerprint image obtained by the fingerprint imaging operation corresponds to a live sample.
It will be appreciated that the above description of FIG. 1A, as well as various alternatives and variations thereof, is presented generally for purposes of explanation and to facilitate a thorough understanding of various possible configurations of an electronic device incorporating a display stack adapted for imaging through a display, such as described herein. It will be apparent, however, to one skilled in the art that some of the specific details presented herein may not be required to practice a particular described embodiment or its equivalent.
For simplicity of description and illustration, FIG. 1B is provided. This figure depicts a simplified block diagram of the electronic device of FIG. 1A, showing various operational and structural components that may be included in an electronic device configured for through-display imaging as described herein.
In particular, the electronic device 100 includes an input/display stack 104a that may include, or be positioned under, a protective outer cover, cover glass, or other suitable transparent portion of the housing 102 of FIG. 1A (not shown).
In these examples, the protective outer cover may be positioned over the front surface of the input/display stack 104a, which may include at least a light emitting layer and a touch input layer. The exemplary light emitting layer may be implemented with an organic light emitting diode display technology or a micro light emitting diode display technology.
An exemplary touch input layer includes a flexible or rigid transparent substrate (e.g., glass, plastic, acrylic, polymeric material, organic material, etc.) having an array of capacitive touch input sensors configured to detect at least one contact area defined when a user 106 touches the front surface (or protective outer cover) of the input/display stack 104a.
As noted with respect to other embodiments described herein, the input/display stack 104a can define an array of independently addressable and controllable discrete light-emitting regions or areas (referred to herein as "pixels") disposed on a transparent or partially transparent substrate.
In particular, due to the transparent substrate, the inter-pixel area of the input/display stack 104a can be optically transparent, and thus at least a portion of light incident on the front surface of the input/display stack 104a can traverse the input/display stack 104a from the front surface to the back surface. The pixels of the input/display laminate 104a may be arranged at a constant pitch or a variable pitch to define a single pixel density or one or more pixel densities.
As noted with respect to other embodiments described herein, the active display area 104 of the display of the electronic device 100, defined by the input/display stack 104a, is positioned at least partially over an optical imaging system, identified in the figure as the optical imaging system 110a. Phrased another way, the optical imaging system 110a is adhered or otherwise coupled to an optically transparent portion (e.g., an imaging aperture) of the back surface of the input/display stack 104a that is aligned with at least one inter-pixel region of the input/display stack 104a through which light incident on the front surface of the input/display stack 104a can pass. With this configuration, the optical imaging system 110a may receive light transmitted through the inter-pixel regions of the active display area 104 of the display of the electronic device 100.
The optical imaging system 110a may be formed from a number of functional and/or structural layers. In particular, the optical imaging system 110a may be formed on a rigid or flexible substrate 112 that supports a photodiode array 114. The rigid or flexible substrate 112 may be formed from a variety of suitable materials and may include any suitable number of layers. Exemplary materials that may be used to form a rigid or flexible substrate 112 of an optical imaging system, such as optical imaging system 110a, include, but are not limited to, glass, plastic, acrylic, polyethylene terephthalate, or other polymers, and the like.
The photodiode array 114 may be formed on the rigid or flexible substrate 112 using any suitable process, including operations such as, but not limited to, pick and place operations or thin film masks and additive or subtractive manufacturing operations. In many embodiments, the photodiode array 114 is fabricated using thin film transistor fabrication techniques that may include, but are not limited to, operations such as deposition operations, sputtering operations, photoresist coating and/or curing operations, exposure operations, development and/or etching operations, photoresist removal operations, polyamide or other film coating operations, cleaning operations, adhesive coating or deposition operations, adhesive curing operations, filling operations, cutting or separating operations, and the like.
The photodiode array 114 may be positioned relative to the collimator array 116. As described above, a collimator array (such as the collimator array 116) may be formed of a variety of suitable materials in any suitable manner, and the collimator array is configured to reduce the field of view of at least one photodiode in the photodiode array 114. In other words, a collimator such as described herein is an example of a narrow-field-of-view optical filter that passes light directed substantially parallel to a central axis of the filter and blocks (e.g., reflects or absorbs) light that is not directed substantially parallel to that central axis.
In one embodiment, the collimator array 116 is implemented as an array of columnar holes defined through an optically opaque layer (e.g., an ink layer, a metallic backing layer, a reflective layer, etc.). In these examples, the columnar holes may have any suitable lateral cross-section (e.g., a cross-section perpendicular to a central axis of the respective hole). Exemplary cross-sections of columnar holes such as described herein include circular cross-sections, square cross-sections, polygonal cross-sections, and the like. In some cases, the columnar holes may be filled with an optically transparent material, such as plastic or acrylic. The fill material may then be cured.
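For a columnar hole of a given diameter and height, the cone of light the hole passes can be estimated with simple geometry, as in the following sketch; the dimensions used are arbitrary examples, and refraction within any fill material is ignored.

```python
import math

def acceptance_half_angle_deg(hole_diameter_um: float, hole_height_um: float) -> float:
    """Approximate half-angle (in degrees) of the field of view passed by a
    columnar collimator hole: rays tilted farther than roughly
    atan(diameter / height) from the central axis strike the opaque sidewall.
    This is a simplified geometric model of the filtering described above."""
    return math.degrees(math.atan(hole_diameter_um / hole_height_um))

# Taller (or narrower) holes pass a narrower cone of light:
print(acceptance_half_angle_deg(10, 50))   # ~11.3 degrees
print(acceptance_half_angle_deg(10, 150))  # ~3.8 degrees
```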
As with the photodiode array 114, the collimator array 116 may be formed on the photodiode array 114 using any suitable process, including operations such as, but not limited to, pick and place operations, lamination operations, thin film transistor masking, or additive or subtractive manufacturing operations.
In many embodiments, as with the photodiode array 114, the collimator array 116 is fabricated using thin film transistor fabrication techniques that may include, but are not limited to, operations such as deposition operations, sputtering operations, photoresist coating and/or curing operations, exposure operations, development and/or etching operations, photoresist removal operations, polyamide or other film coating operations, cleaning operations, adhesive coating or deposition operations, adhesive curing operations, filling operations, cutting or separating operations, and the like. In many cases, the operations associated with forming the photodiode array 114 may be performed before the operations associated with forming the collimator array 116 are performed. In this way, the collimator array 116 may be precisely aligned with the photodiode array 114.
In some cases, a single collimator of the collimator array 116 is disposed over and/or formed on a single photodiode of the photodiode array 114. More specifically, the respective collimator may have substantially the same cross-sectional area as the photosensitive region of the respective photodiode. More generally, in some embodiments, the collimator array 116 may be disposed and/or formed in a one-to-one relationship with respect to the photodiode array 114. In other embodiments, multiple collimators may be positioned over a single photodiode. In other words, in some embodiments, the collimator array 116 may be disposed and/or formed in a many-to-one relationship with respect to each photodiode in the photodiode array 114.
The collimator array 116 may be positioned relative to the microlens array 118. As described above, a microlens array, such as the microlens array 118, may be formed of a variety of suitable materials in any suitable manner, and the microlens array is configured to direct and/or otherwise focus light incident thereon into a respective collimator of the collimator array 116. In other words, a microlens such as described herein is an example of an optical adapter configured to direct or focus light in a particular direction or to a particular focal point. For simplicity of description, the embodiments described herein refer to concave microlenses; however, it should be understood that this is merely one example of a microlens shape, and in other embodiments, other lens shapes may be possible or preferred.
In one embodiment, the microlens array 118 is implemented as an array of concave lenses, each microlens in the microlens array being aligned with and disposed over a respective collimator of the collimator array. In many embodiments, the central axis of each microlens disposed and/or formed over a respective collimator is precisely aligned with the central axis of that collimator. In other cases, the central axis of a microlens may be offset relative to the central axis of the respective collimator; in these examples, the microlens may be used to focus light and may additionally be used for beam-steering purposes. These examples are not exhaustive; in other examples, other lens alignments and configurations may be used.
In many cases, each microlens in the microlens array 118 is formed with substantially the same geometry and dimensions. However, this is only one example. In other embodiments, different lenses in the microlens array 118 may take different shapes, alignments, sizes, focal lengths, and the like.
As with collimator array 116, microlens array 118 may be formed on collimator array 116 using any suitable process, including operations such as, but not limited to, pick and place operations, lamination operations, thin film transistor masks, or additive or subtractive manufacturing operations.
In many embodiments, as with the collimator array 116 and the photodiode array 114, the microlens array 118 is fabricated using thin film transistor fabrication techniques that may include, but are not limited to, operations such as deposition operations, sputtering operations, photoresist coating and/or curing operations, exposure operations, development and/or etching operations, photoresist removal operations, polyamide or other film coating operations, cleaning operations, adhesive coating or deposition operations, adhesive curing operations, filling operations, cutting or separating operations, and the like. In many cases, the operations associated with forming the collimator array 116 may be performed before the operations associated with forming the microlens array 118. In other cases, the microlens array 118 may be formed in the same process as the collimator array 116. For example, a fill material used to fill the holes defining the collimators in a collimator array may be used to form the respective microlenses associated with the collimators. More specifically, the fill material can be "overfilled" such that the fill material overflowing from the fill hole can form a meniscus that, once cured, can define a microlens having a suitable or preferred geometry. In this way, the microlens array 118 may be precisely aligned with the collimator array 116.
In some cases, a single microlens in the microlens array 118 is disposed over and/or formed on a single collimator in the collimator array 116. More specifically, the respective microlens may have substantially the same area as the cross-sectional area of the respective collimator. More generally, in some embodiments, the microlens array 118 may be disposed and/or formed in a one-to-one relationship with respect to the collimator array 116. In other cases, more than one microlens may be formed over a single collimator (e.g., a many-to-one relationship). In many embodiments, only a single microlens is formed over a single collimator.
Due to these configurations, light passing through the inter-pixel regions of the input/display stack 104a may be focused by the microlens array 118 into the collimator array 116, which in turn may deliver light oriented/directed substantially or approximately parallel to its central axis onto the photodiode array 114. Due to the stacked structure, the optical imaging system 110a may be configured to collect only light directed substantially perpendicular to the stacked structure. More simply, due to this configuration, the optical imaging system 110a may be configured to capture light reflected from a two-dimensional contact area of a fingerprint of the user 106, where the light absorbed/collected by a single photodiode corresponds to a single pixel of the fingerprint image.
To measure or determine the light collected by each photodiode in the photodiode array 114, the substrate 112 may also include one or more electrical traces and/or circuitry electrically coupled to each photodiode in the photodiode array 114. Such circuitry and/or traces may employ any suitable topology; exemplary circuit topologies may include pre-amplifier stages, combining stages, charge storage stages, multiplexing stages, de-multiplexing stages, addressing stages, and so forth.
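As one hedged illustration of how such addressing and read-out stages might assemble a fingerprint image, the sketch below iterates over rows and columns of the photodiode array; the select_row and read_column_charge callables stand in for the addressing and pre-amplification circuitry and are assumptions, not the described circuit topology.

```python
def read_fingerprint_image(select_row, read_column_charge, n_rows, n_cols):
    """Row-addressed readout sketch: the charge collected by each photodiode
    becomes one pixel of the resulting fingerprint image."""
    image = []
    for r in range(n_rows):
        select_row(r)                                                 # addressing stage
        image.append([read_column_charge(c) for c in range(n_cols)])  # amplify and sample
    return image
```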
The substrate 112 may be electrically coupled to a flex circuit 120 capable of conductively and communicatively coupling circuitry and/or traces defined on the substrate 112 to a general-purpose or special-purpose processor or circuit of the electronic device 100, which is identified as the processor 122. The processor 122 may be any suitable processor or circuitry capable of executing, monitoring, or coordinating one or more processes or operations of the electronic device 100. The processor 122 may be any suitable single-core or multi-core processor capable of executing instructions stored in a memory (not shown) to instantiate one or more classes or objects configured to interact with inputs or outputs of one or more of the optical imaging system 110a and/or the input/display stack 104a. In some examples, the processor 122 may be a dedicated processor associated with one or more of the optical imaging system 110a, the input/display stack 104a, and/or the electronic device 100. In other cases, the processor 122 may be a general-purpose processor.
In other embodiments, electronic device 100 may include one or more optional optical elements. Optional optical elements are generally positioned between the optical imaging system 110a and the input/display stack 104a, and may include, but are not limited to: one or more lenses, filters, mirrors, actuators, apertures, irises, glint elements, narrow field of view filters, collimators, flood lights, infrared cut filters, ultraviolet cut filters, or other accessory optical elements or combinations thereof.
Thus, referring to FIGS. 1A-1B generally and broadly, it should be understood that an electronic device including a display adapted for through-display imaging may be configured in a variety of ways. For example, although the electronic device 100 is depicted as a cellular telephone, it is understood that other electronic devices may incorporate a display stack such as described herein, including but not limited to: a tablet device; a laptop computer device; a desktop computer; a computing accessory; a peripheral input device; a vehicle control device; a mobile entertainment device; an augmented reality device; a virtual reality device; an industrial control device; a digital wallet device; a home security device; a commercial security device; a wearable device; a health device; an implantable device; a garment device; a fashion accessory device; and so on.
It should also be understood that, in addition to the components shown in FIGS. 1A-1B, the electronic device may include one or more processors, memory, power sources and/or batteries, network connections, sensors, input/output ports, acoustic elements, tactile elements, digital and/or analog circuitry for performing, supervising, and/or coordinating one or more tasks of the electronic device 100, and so on. For simplicity of illustration, the electronic device 100 is shown in FIGS. 1A-1B without many of these elements, each of which may be partially and/or fully included within the housing 102 and may be operatively or functionally associated with, or coupled to, the display of the electronic device 100.
Further, although the electronic device 100 includes only a single rectangular display, it should be understood that this example is not exhaustive. In other embodiments, the electronic device may include or be communicatively coupled to multiple displays, one or more of which may be adapted for through-display imaging. Such ancillary or secondary displays include, but may not be limited to: an auxiliary monitor; a function row display or a keyboard key display; a wearable electronic device display; a peripheral input device (e.g., a trackpad, mouse, keyboard, etc.) incorporating a display; a digital wallet screen; and so on. Similarly, a rectangular display may not be required; other embodiments are implemented using displays having other shapes, including three-dimensional shapes (e.g., curved displays).
Similarly, although the display described with reference to the electronic device 100 is the main display of the electronic device, it should be understood that this example is not exhaustive. In some embodiments, the display stack may define a low-resolution auxiliary display, such as a monochrome display or a grayscale display. In other cases, the display stack may define a single image, such as a glyph or an icon. In one particular example, a power button for an electronic device can include a button cap that incorporates a display such as described herein. That display may be configured to selectively present a power icon and/or a limited set of icons or glyphs associated with one or more configurable functions of the button (e.g., a power option, a standby option, a volume option, an authentication option, a digital purchase option, a user authentication option, etc.). In these examples, the limited-purpose, auxiliary, or secondary display may be configured with partial transparency or translucency, such as described herein, to facilitate imaging through the display.
Thus, it is to be understood that the foregoing description of specific embodiments has been presented for purposes of illustration and description. These descriptions are not intended to be exhaustive or to limit the disclosure to the precise forms disclosed herein. On the contrary, many modifications and variations are possible in light of the above teaching, as would be apparent to those of ordinary skill in the art. In particular, it should be understood that a display laminate suitable for imaging through a display can be constructed and/or assembled in many suitable ways. For example, an optical imaging system such as described herein may be formed by assembling or creating layers in a different order and/or with additional layers.
In particular, FIG. 2A illustrates exemplary layers that can cooperate to define an optical imaging system 200a. As with the embodiment described with reference to FIG. 1B, in this exemplary embodiment a substrate 202 supports a photodiode array 204. In this example, an infrared cut filter 206 may be formed over the photodiode array 204. The infrared cut filter 206 may be formed of any suitable material configured to absorb and/or reflect at least infrared light so that infrared light does not interfere with the imaging operation of the photodiode array 204. As with other layers of other embodiments of optical imaging systems such as described herein, the infrared cut filter 206 may be formed, disposed, cured, and/or otherwise fabricated in accordance with thin film transistor fabrication techniques. A collimator array 208, positioned below a microlens array 210, is disposed or otherwise formed above the infrared cut filter 206.
In this embodiment, when the optical imaging system 200a is coupled to the back surface of a display stack and/or positioned under an imaging aperture of the display stack (e.g., via an adhesive having a different index of refraction than the microlens array 210), light incident on the front surface of the display stack and passing through the inter-pixel regions of the display stack may exit the back surface of the display stack and be focused by the microlens array 210 into the collimator array 208. The collimator array may then filter the light based on its orientation and direction (e.g., only light within an acute angle of the central axis of a collimator passes through that collimator; all other light is reflected or absorbed by the collimator), and the filtered light may be directed through the infrared cut filter 206, which passes only the orientation-filtered visible light, such that at least one photodiode in the photodiode array 204 can absorb the light.
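The combined effect of the angular and spectral filtering described above can be summarized with a toy per-ray model, as in the sketch below; the acceptance angle and the wavelength cutoffs are illustrative assumptions rather than values taken from the described embodiment.

```python
def photodiode_accepts(ray_angle_deg: float, wavelength_nm: float,
                       max_angle_deg: float = 5.0) -> bool:
    """Toy model of the filter chain: the collimator rejects rays outside an
    acute acceptance angle, and the infrared cut filter rejects infrared
    wavelengths, so only near-normal visible light reaches the photodiode."""
    within_acceptance = abs(ray_angle_deg) <= max_angle_deg   # collimator filtering
    is_visible = 380.0 <= wavelength_nm <= 700.0              # infrared cut filter
    return within_acceptance and is_visible

assert photodiode_accepts(2.0, 550.0)        # near-normal green light passes
assert not photodiode_accepts(20.0, 550.0)   # oblique light is reflected/absorbed
assert not photodiode_accepts(2.0, 940.0)    # infrared light is blocked by the filter
```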
In other cases, the optical imaging system may be arranged in another manner. Fig. 2B shows another exemplary optical imaging system 200B, in which an infrared cut filter 206 is disposed over a microlens array 210, which in turn is disposed over a collimator array 208, which in turn is disposed over a photodiode array 204, which is disposed on a substrate 202.
The foregoing exemplary configurations of the optical imaging system are not exhaustive. In some embodiments, other optical filters may be used, such as infrared pass filters, color filters, variable color filters (e.g., liquid crystal filters), polarization filters, and the like. In other cases, an optical filter may be positioned elsewhere in the stack. More simply, it should be understood that other exemplary configurations are possible according to the various exemplary embodiments described herein.
For example, FIG. 3 illustrates another exemplary cross-section of an optical imaging system such as described herein coupled to a display stack. In particular, the optical imaging system 300 includes an optical imaging stack 302 positioned below a light emitting layer 304 of a display. The light emitting layer 304 is positioned below a protective outer cover 306, which may encapsulate and protect the light emitting layer 304 and the optical imaging system 300. Further, the protective outer cover 306 may define an input surface to receive a touch of a user 308.
As such, in response to a touch of the input surface by the user (which may be detected by a touch input sensor, such as described above), the optical imaging system 300 may instruct the light emitting layer 304 to initiate an illumination operation to illuminate the user's finger. In particular, the light emitting layer 304 may provide power to at least one pixel, such as the pixel 304a, to cause the pixel 304a to emit light through the front surface of the light emitting layer 304, through the protective outer cover 306, and toward the user 308. The user's skin may reflect at least a portion of the light emitted by the pixel 304a back toward the protective outer cover 306.
At least a portion of this reflected light may pass through the protective outer cover 306 and through the inter-pixel regions of the light emitting layer 304 to exit the back surface of the light emitting layer 304 into an adhesive layer 310. At least a portion of this light may be focused by at least one microlens of the microlens array of the optical imaging stack 302 protruding into the adhesive layer 310 (one of these microlenses is identified as the microlens 310a of the optical imaging stack 302). At least a portion of the focused light may be directed into at least one collimator of a collimator array defined through an opaque layer 312 configured to block light (including ambient light) from passing through the display (e.g., FIG. 3 includes one exemplary light ray u1 blocked by the opaque layer 312). The opaque layer 312 may be included to increase the apparent contrast of the active display area and to additionally provide a structural layer through which the collimator array is formed. An exemplary collimator of the optical imaging stack 302 is identified as the collimator 314. Light passing through at least a portion of the collimator array may be filtered by an infrared cut filter 316 of the optical imaging stack 302. At least a portion of the light passing through the infrared cut filter 316 may be absorbed by a photodiode 318a defined on and/or in a thin film transistor substrate 318.
The photodiode 318a may be conductively coupled to at least one electrical trace defined on the thin film transistor substrate 318, which in turn may be coupled to a flexible substrate 320 capable of communicatively and conductively coupling the photodiode 318a (and other photodiodes 318b additionally disposed on the thin film transistor substrate 318) to the processor 322.
This exemplary cross-section does not exhaust the various configurations and layouts of optical imaging systems such as described herein. For example, FIGS. 4A-4C illustrate different exemplary configurations of imaging optics (such as collimators and microlenses) that may be used with an optical imaging system such as described herein, including the optical imaging system of FIG. 3. For example, the embodiments shown in FIGS. 4A-4C can be viewed along line B-B shown in FIG. 3. In particular, FIG. 4A shows an optical imaging system 400a that includes a substrate 402 supporting a photodiode array 404 below an infrared cut filter 406, which may be optional. An imaging optics layer 408 may be formed over the infrared cut filter 406.
In this example, the imaging optics layer 408 includes an array of collimators formed by initially disposing an optically opaque layer over the infrared cut filter 406. Once the opaque layer is formed, an array of apertures may be formed or otherwise defined through the layer above the photodiode array 404. The apertures may then be filled with a curable, optically transparent material to define imaging optics groups or arrays, each comprising a convex microlens (one of which is identified as the microlens 410a) and a collimator (one of which is identified as the collimator 410b).
The imaging optics defined by the imaging optics layer 408 may take a variety of suitable shapes, cross-sections, and geometric designs. For example, in some embodiments, the spacing separating the microlenses and collimators may be greater than the spacing shown in FIG. 4A. In other examples, a variable spacing may be used between the imaging optics. For example, the collimators and microlenses may be arranged in groups; a first set or array of imaging optics disposed at a first pitch may be separated from a second set or array of imaging optics disposed at the first pitch or at a different, second pitch. In these embodiments, any suitable number of sets of imaging optics may be provided in any suitable pattern or arrangement.
Similarly, the convex surface of the microlenses, or more generally the shape of the lenses, may vary from embodiment to embodiment, and in some embodiments the microlenses may be separated from the associated collimators by a distance. For example, FIG. 4B shows an optical imaging system 400b in which the imaging optics layer 408 includes a reduced-height collimator 410c. Those skilled in the art will appreciate that different heights of the collimator sidewalls may impart different optical and/or filtering characteristics; different embodiments may be implemented in different ways.
In other embodiments, the collimator sidewalls need not be perpendicular to the other layers of the optical imaging system. For example, the optical imaging system 400c shown in FIG. 4C includes trapezoidal collimator sidewalls.
It should be understood that the foregoing examples are not exhaustive of the different configurations of imaging optics that may be formed with an optical imaging system such as described herein. In other embodiments, other variations are possible.
For example, in various embodiments, the imaging optics of the optical imaging system, and in particular the microlenses of the optical imaging system, may be distributed in a number of possible ways. For example, FIG. 5A shows a first microlens arrangement 500a that distributes the microlenses in a grid pattern. In another example, FIG. 5B shows a second microlens arrangement 500b that distributes the microlenses in a circular packing arrangement. In other cases, the microlenses in the same array may be formed to have different sizes or dimensions, may partially overlap, or may be separated by a constant or variable pitch.
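As a hedged illustration, the sketch below generates microlens center coordinates for a regular grid and for an offset-row approximation of a densely packed layout; the pitch values and the offset-row approximation are assumptions and are not taken from the figures.

```python
def grid_centers(n_rows, n_cols, pitch):
    """Microlens centers laid out on a regular grid (cf. the arrangement of FIG. 5A)."""
    return [(c * pitch, r * pitch) for r in range(n_rows) for c in range(n_cols)]

def offset_row_centers(n_rows, n_cols, pitch):
    """Microlens centers with alternate rows offset by half a pitch, one way a
    densely packed 'circular packing' layout might be approximated (cf. FIG. 5B);
    the exact packing shown in the figure is not specified here."""
    centers = []
    for r in range(n_rows):
        x_offset = (pitch / 2) if (r % 2) else 0.0
        y = r * pitch * 0.866  # rows compressed by sqrt(3)/2 for close packing
        centers.extend((c * pitch + x_offset, y) for c in range(n_cols))
    return centers
```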
Generally and broadly, fig. 6 and 7 depict simplified flow diagrams corresponding to various in-order and/or out-of-order operations of the methods described herein. It should be appreciated that these simplified examples may be modified in a number of ways. In some examples, there may be more, alternative, or fewer operations than those depicted and described.
FIG. 6 is a simplified flowchart illustrating exemplary operations of a method for capturing an image of an object touching a display with an optical imaging system disposed behind the display, such as described herein. The method may be performed in whole or in part by a processor or circuitry of an electronic device such as described herein.
The method 600 includes an operation 602 in which a touch to a display of an electronic device is detected. Any suitable sensor or combination of sensors may be used to detect the initial touch, including but not limited to touch sensors and force sensors. Exemplary touch sensors include, but are not limited to: a capacitive touch sensor; an optical touch sensor; a resistive touch sensor; an acoustic touch sensor; and so on. Exemplary force sensors include, but are not limited to: a capacitive force sensor; a resistive force sensor; a piezoelectric force sensor; a strain-based force sensor; an inductive force sensor; and so on.
Once a touch is detected at operation 602, the method 600 continues to operation 604, in which visible-wavelength light illuminates the contact area of the detected touch. As noted with respect to other embodiments described herein, illumination of the contact centroid and/or contact area may be performed in any suitable manner, including but not limited to: a specific/selected modulation of the light; a particular/selected illumination pattern (e.g., a linear scan, a radial expansion, etc.); and the like, or any combination thereof.
The method 600 further includes an operation 606 in which one or more optical properties of the contact region are determined. In one example, a fingerprint image is captured by an optical imaging system of the electronic device. As noted with respect to other embodiments described herein, the operation of capturing a fingerprint image (or, more generally, an image of the object that contacted the display at operation 602) may include one or more filtering operations, such as: spatial filtering (e.g., point-source filtering, beamforming, etc.); thresholding; de-skewing; rotating; and so on.
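A compact sketch of operations 602 through 606 follows, under the assumption of hypothetical touch-detection, illumination, and photodiode-readout helpers and an arbitrary threshold; it illustrates the flow of method 600 rather than the described implementation.

```python
import numpy as np

def capture_contact_image(detect_touch, illuminate, read_photodiodes, threshold=0.15):
    """Sketch of method 600: detect a touch (operation 602), illuminate the
    contact area with visible light (operation 604), then read the photodiode
    array and apply simple post-processing (operation 606)."""
    contact = detect_touch()                              # operation 602
    if contact is None:
        return None
    illuminate(contact)                                   # operation 604
    raw = np.asarray(read_photodiodes(), dtype=float)     # operation 606
    span = raw.max() - raw.min()
    norm = (raw - raw.min()) / max(span, 1e-9)            # normalize readings
    return np.where(norm > threshold, norm, 0.0)          # thresholding step
```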
FIG. 7 is a simplified flow chart illustrating exemplary operations of a method for manufacturing an optical imaging system, such as described herein. In particular, method 700 includes an operation 702 in which a thin film transistor substrate is selected. Method 700 then continues to operation 704 where various functional and/or structural layers of the optical imaging system may be formed. For example, a microlens array, a collimator array, a mask layer (associated with the collimator array), an infrared cut filter, and a photodiode array may all be formed on a thin film transistor substrate. Thereafter, at operation 706, the various layers formed at operation 704 may be cured or otherwise trimmed.
It will be appreciated that while a number of embodiments are disclosed above, the operations and steps provided with respect to the methods and techniques described herein are intended to be exemplary and are therefore not exhaustive. It is further to be understood that alternative orders of steps or fewer or additional operations may be required or desired for particular embodiments.
While the foregoing disclosure has been described in terms of various exemplary embodiments and implementations, it should be understood that the various features, aspects, and functions described in one or more of the individual embodiments are not limited in their application to the particular embodiments in which they are described, but rather they may be applied, alone or in various combinations, to one or more of the several embodiments of the utility model, whether or not such embodiments are described and whether or not such features are presented as being a part of a described embodiment. Thus, the breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments, but should instead be defined by the claims provided herein.
In addition, the present disclosure recognizes that personal information data, including biometric data, may be used in the present technology to benefit a user. For example, biometric authentication data may be used to facilitate access to device features without the use of a password. In other examples, user biometric data is collected to provide feedback to the user regarding their fitness or fitness level. Further, the present disclosure contemplates other uses for which personal information data, including biometric data, is beneficial to the user.
The present disclosure also contemplates that entities responsible for the collection, analysis, disclosure, transmission, storage, or other use of such personal information data will comply with established privacy policies and/or privacy practices. In particular, such entities should implement and consistently apply privacy policies and practices generally recognized as meeting or exceeding industry or governmental requirements for maintaining the privacy and security of personal information data, including the use of data encryption and security methods that meet or exceed industry or governmental standards. For example, personal information from a user should be collected for legitimate and reasonable uses by the entity and not shared or sold outside of those legitimate uses. In addition, such collection should occur only after receiving the informed consent of the user. Additionally, such entities should take any needed steps to safeguard and secure access to such personal information data and to ensure that others with access to the personal information data adhere to their privacy policies and procedures. Further, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices.
Regardless of the foregoing, the present disclosure also contemplates embodiments in which a user selectively prevents use or access to personal information data, including biometric data. That is, the present disclosure contemplates that hardware elements and/or software elements may be provided to prevent or block access to such personal information data. For example, in the case of a biometric authentication method, the present technology may be configured to allow a user to selectively bypass the biometric authentication step by providing security information such as passwords, personal identification numbers, touch gestures, or other authentication methods known to those skilled in the art, either alone or in combination. In another example, a user may choose to remove, disable, or restrict access to certain health-related applications that collect the user's personal health or fitness data.

Claims (18)

1. An optical imaging system comprising:
a first substrate defining a first surface and a second surface;
a light emitting element disposed on the first surface and positioned to emit light from the first surface to a free space at least in a first direction;
a second substrate coupled to the second surface;
a photodiode disposed on the second substrate and positioned to receive light propagating in a direction at least partially opposite the first direction and through at least a portion of the first substrate adjacent the light emitting element; and
an array of collimators positioned between the photodiodes and the second surface and comprising a plurality of collimators, each collimator aligned over a respective portion of the photodiodes; wherein:
at least a portion of the light emitted from the light emitting element is reflected from a surface of an object in the free space as reflected light; and
the photodiode is positioned to receive at least a portion of the reflected light.
2. The optical imaging system of claim 1, wherein the collimator array comprises at least three parallel collimators.
3. The optical imaging system of claim 1, further comprising a microlens disposed over at least one collimator of the collimator array.
4. The optical imaging system of claim 1, wherein each collimator of the collimator array includes a microlens positioned to converge light incident to the microlens into the collimator.
5. The optical imaging system of claim 1, wherein the light-emitting element is configured to emit visible light of a selected color.
6. The optical imaging system of claim 1, wherein the light emitting element is configured to emit infrared light.
7. The optical imaging system of claim 1, wherein the light emitting element is a pixel of a display of an electronic device.
8. The optical imaging system of claim 1, wherein a first surface area of the photodiode is greater than a second surface area of the light-emitting element.
9. The optical imaging system of claim 1, further comprising a thin film transistor stack containing the photodiode and the collimator array.
10. The optical imaging system of claim 1, further comprising an infrared cut filter between the collimator array and the photodiode.
11. The optical imaging system of claim 1, further comprising a processor configured to detect a fingerprint of a user of an electronic device incorporating the optical imaging system based on an output of the photodiode.
12. The optical imaging system of claim 1, wherein the first substrate is at least partially transparent.
13. An electronic device configured to capture an image of an object in free space above a display of the electronic device, the electronic device comprising:
a housing;
a display within the housing and comprising:
a first thin film transistor layer; and
an array of light emitting pixels configured to emit light into free space; and
an optical imaging sensor coupled to a lower surface of the first thin-film transistor layer and comprising:
a second thin-film transistor layer comprising conductive traces; and
a photosensor coupled to the second thin-film-transistor layer, positioned below a subset of light-emitting pixels of the array of light-emitting pixels, and electrically coupled to the conductive traces; wherein:
at least a portion of the light emitted from the display is reflected from the object back toward the display, through the display between at least two light-emitting pixels of the array of light-emitting pixels, and received by the light sensitive element.
14. The electronic device of claim 13, further comprising an infrared cut filter coupled to the light sensitive element and configured to reflect and/or absorb infrared light passing through the display.
15. The electronic device of claim 14, further comprising a collimator array formed above the infrared cut filter and configured to narrow a field of view of the photosensitive element.
16. The electronic device defined in claim 15 further comprising an array of microlenses formed over the array of collimators, each respective microlens of the array of microlenses configured to focus light incident to the respective microlens into a respective collimator of the array of collimators, the array of microlenses coupled to a lower surface of the first thin-film-transistor layer by an adhesive.
17. The electronic device of claim 13, wherein the display is one of an organic light emitting diode display or a micro light emitting diode display.
18. The electronic device of claim 13, wherein the photosensitive element is a photodiode.
CN202120662240.3U 2019-09-23 2020-08-28 Optical sensing system and electronic device Active CN215869390U (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US201962904211P 2019-09-23 2019-09-23
US62/904,211 2019-09-23
US17/003,636 2020-08-26
US17/003,636 US20210089741A1 (en) 2019-09-23 2020-08-26 Thin-Film Transistor Optical Imaging System with Integrated Optics for Through-Display Biometric Imaging
CN202021839969.5U CN212848409U (en) 2019-09-23 2020-08-28 Optical sensing system and electronic device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202021839969.5U Division CN212848409U (en) 2019-09-23 2020-08-28 Optical sensing system and electronic device

Publications (1)

Publication Number Publication Date
CN215869390U true CN215869390U (en) 2022-02-18

Family

ID=72603520

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202120662240.3U Active CN215869390U (en) 2019-09-23 2020-08-28 Optical sensing system and electronic device

Country Status (3)

Country Link
CN (1) CN215869390U (en)
AU (1) AU2020223751B2 (en)
DE (1) DE102020210905A1 (en)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107515435B (en) * 2017-09-11 2020-12-29 京东方科技集团股份有限公司 Display panel and display device
US10809853B2 (en) * 2017-12-11 2020-10-20 Will Semiconductor (Shanghai) Co. Ltd. Optical sensor having apertures
EP3731137B1 (en) * 2019-02-02 2023-01-25 Shenzhen Goodix Technology Co., Ltd. Fingerprint recognition apparatus and electronic device

Also Published As

Publication number Publication date
AU2020223751B2 (en) 2022-03-03
AU2020223751A1 (en) 2021-04-08
DE102020210905A1 (en) 2021-03-25

Similar Documents

Publication Publication Date Title
CN212848409U (en) Optical sensing system and electronic device
US11073712B2 (en) Electronic device display for through-display imaging
EP3731133B1 (en) Under-screen fingerprint recognition apparatus and electronic device
US11422661B2 (en) Sensing system for detection of light incident to a light emitting layer of an electronic device display
CN213182770U (en) Fingerprint identification device and electronic equipment
US10331939B2 (en) Multi-layer optical designs of under-screen optical sensor module having spaced optical collimator array and optical sensor array for on-screen fingerprint sensing
CN109154869B (en) Optical collimator of optical sensor module under screen for on-screen fingerprint sensing
CN107004130B (en) Optical sensor module under screen for sensing fingerprint on screen
CN110991351B (en) Optical sensor module under screen for sensing fingerprint on screen
WO2020056771A1 (en) Fingerprint identification apparatus and electronic device
CN112070018B (en) Fingerprint identification device and electronic equipment
CN210052176U (en) Fingerprint detection device and electronic equipment
US11600103B2 (en) Shortwave infrared optical imaging through an electronic device display
WO2020168495A1 (en) Method and device for fingerprint recognition, and terminal device
CN111967402A (en) Optical fingerprint device, preparation method thereof and electronic equipment
CN111095279A (en) Fingerprint detection device and electronic equipment
CN111108509A (en) Fingerprint detection device and electronic equipment
WO2021174423A1 (en) Fingerprint recognition apparatus, display screen, and electronic device
CN111095273B (en) Device for biometric identification
CN215869390U (en) Optical sensing system and electronic device
CN112860120B (en) Fingerprint identification device, electronic equipment and ambient light detection method
CN112860120A (en) Fingerprint identification device, electronic equipment and method for detecting ambient light

Legal Events

Date Code Title Description
GR01 Patent grant