WO2021035714A1 - Texture image acquisition device, display device, and collimating component - Google Patents

Texture image acquisition device, display device, and collimating component

Info

Publication number
WO2021035714A1
WO2021035714A1 PCT/CN2019/103794 CN2019103794W WO2021035714A1 WO 2021035714 A1 WO2021035714 A1 WO 2021035714A1 CN 2019103794 W CN2019103794 W CN 2019103794W WO 2021035714 A1 WO2021035714 A1 WO 2021035714A1
Authority
WO
WIPO (PCT)
Prior art keywords
layer
aperture layer
display panel
light
acquisition device
Prior art date
Application number
PCT/CN2019/103794
Other languages
English (en)
French (fr)
Inventor
海晓泉
董学
王海生
王雷
王迎姿
孟宪东
Original Assignee
京东方科技集团股份有限公司
Priority date
Filing date
Publication date
Application filed by 京东方科技集团股份有限公司
Priority to US16/767,254 (US11668862B2)
Priority to EP19932219.9A (EP4024269A4)
Priority to PCT/CN2019/103794 (WO2021035714A1)
Priority to CN201980001564.8A (CN112753036A)
Priority to JP2020572406A (JP7425002B2)
Publication of WO2021035714A1
Priority to US18/301,485 (US20230251404A1)

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00Optical elements other than lenses
    • G02B5/005Diaphragms
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0018Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for preventing ghost images
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/09Beam shaping, e.g. changing the cross-sectional area, not otherwise provided for
    • G02B27/0938Using specific optical elements
    • G02B27/095Refractive optical elements
    • G02B27/0955Lenses
    • G02B27/0961Lens arrays
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/09Beam shaping, e.g. changing the cross-sectional area, not otherwise provided for
    • G02B27/0938Using specific optical elements
    • G02B27/0988Diaphragms, spatial filters, masks for removing or filtering a part of the beam
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/30Collimators
    • GPHYSICS
    • G02OPTICS
    • G02FOPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/01Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour 
    • G02F1/13Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour  based on liquid crystals, e.g. single liquid crystal display cells
    • G02F1/133Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements
    • G02F1/1333Constructional arrangements; Manufacturing methods
    • G02F1/1335Structural association of cells with optical devices, e.g. polarisers or reflectors
    • G02F1/133526Lenses, e.g. microlenses or Fresnel lenses
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/13Sensors therefor
    • G06V40/1318Sensors therefor using electro-optical elements or layers, e.g. electroluminescent sensing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/55Optical parts specially adapted for electronic image sensors; Mounting thereof
    • HELECTRICITY
    • H10SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10KORGANIC ELECTRIC SOLID-STATE DEVICES
    • H10K59/00Integrated devices, or assemblies of multiple devices, comprising at least one organic light-emitting element covered by group H10K50/00
    • H10K59/60OLEDs integrated with inorganic light-sensitive elements, e.g. with inorganic solar cells or inorganic photodiodes
    • H10K59/65OLEDs integrated with inorganic image sensors
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B3/00Simple or compound lenses
    • G02B3/0006Arrays
    • G02B3/0037Arrays characterized by the distribution or form of lenses
    • G02B3/0056Arrays characterized by the distribution or form of lenses arranged along two different directions in a plane, e.g. honeycomb arrangement of lenses
    • GPHYSICS
    • G02OPTICS
    • G02FOPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/01Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour 
    • G02F1/13Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour  based on liquid crystals, e.g. single liquid crystal display cells
    • G02F1/133Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements
    • G02F1/13306Circuit arrangements or driving methods for the control of single liquid crystal cells
    • G02F1/13312Circuits comprising photodetectors for purposes other than feedback
    • GPHYSICS
    • G02OPTICS
    • G02FOPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/01Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour 
    • G02F1/13Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour  based on liquid crystals, e.g. single liquid crystal display cells
    • G02F1/133Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements
    • G02F1/1333Constructional arrangements; Manufacturing methods
    • G02F1/13338Input devices, e.g. touch panels
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144Devices controlled by radiation
    • H01L27/146Imager structures
    • H01L27/14678Contact-type imagers

Definitions

  • the embodiments of the present disclosure relate to a texture image acquisition device, a display device, and a collimating component.
  • At least one embodiment of the present disclosure provides a texture image acquisition device, which includes a collimating component and an image sensor.
  • the collimating component is stacked on the image sensor; the collimating component includes a lens array, a first aperture layer, and a second aperture layer stacked in sequence, and the second aperture layer is adjacent to the image sensor.
  • the lens array is configured to converge light rays so that they are incident on the first aperture layer, and the first aperture layer is configured to transmit the light incident on the first aperture layer so that it is incident on the second aperture layer.
  • the lens array includes a plurality of lens units arranged in an array
  • the first aperture layer includes a plurality of first through holes arranged in an array.
  • the plurality of lens units and the plurality of first through holes overlap in a one-to-one correspondence in a direction perpendicular to the first aperture layer.
  • the thickness of the collimating member is less than 100 microns
  • the aperture of at least one lens unit is 1-40 microns
  • the opening diameter of at least one of the first through holes is 1-20 microns.
  • the second aperture layer includes a plurality of second through holes arranged in an array, and the plurality of second through holes and the plurality of first through holes overlap in a one-to-one correspondence in a direction perpendicular to the first aperture layer.
  • the first through hole has a first opening center, and the second through hole has a second opening center.
  • the first opening center of the first through hole overlaps with the second opening center of the second through hole corresponding to the first through hole in a direction perpendicular to the image sensor.
  • the opening diameter of the first through hole is greater than or equal to the opening diameter of the corresponding second through hole.
  • the number of the plurality of lens units, the plurality of first through holes, and the plurality of second through holes are equal.
  • the first through hole has a first opening center, and the first opening center is located at the focal point of the lens unit corresponding to the first through hole.
  • the collimating component further includes a transparent first filling layer, which is located between the lens array and the first aperture layer, and a transparent second filling layer, which is located between the first aperture layer and the second aperture layer.
  • the thickness of the first filling layer is equal to the focal length of the lens unit.
  • the ratio of the thickness of the first filling layer to the thickness of the second filling layer is in the range of 4-6.
  • the texture image acquisition device provided by at least one embodiment of the present disclosure further includes an intermediate medium.
  • the intermediate medium is disposed between the image sensor and the collimating member; the intermediate medium is configured to bond the image sensor and the collimating member and to adjust the spacing between the image sensor and the collimating member in the direction perpendicular to the image sensor.
  • the plurality of lens units are arranged in a hexagonal array.
  • the texture image acquisition device provided by at least one embodiment of the present disclosure further includes a third diaphragm layer.
  • the third diaphragm layer is located between the second diaphragm layer and the image sensor, and is configured to allow light to pass through and be incident on the image sensor and to limit the angle of light that can pass through the third diaphragm layer.
  • At least one embodiment of the present disclosure also provides a display device, which includes a display panel, and the above-mentioned texture image acquisition device stacked on the display panel.
  • the display panel includes a display side and a back side opposite to the display side; the texture image acquisition device is located on the back side of the display panel, and the collimating component is located between the display panel and the image sensor.
  • the texture image acquisition device is configured to detect light that is reflected by an object texture on the display side of the display panel and passes through the display panel.
  • the display panel includes a display area and a peripheral area.
  • the collimating member and the image sensor are arranged in the display area of the display panel and located on the back side of the display panel.
  • the display device provided by at least one embodiment of the present disclosure further includes a plastic frame.
  • the plastic frame is arranged between the display panel and the collimating component, and the plastic frame is arranged in a peripheral area of the display panel and located on the back side of the display panel.
  • the display device provided by at least one embodiment of the present disclosure further includes a filling medium.
  • the filling medium is arranged between the display panel and the collimating member, and is filled in the gap between the back side of the display panel and the lens array of the collimating member.
  • the display panel includes an organic light emitting diode display panel, a quantum dot light emitting diode display panel, or a liquid crystal display panel.
  • At least one embodiment of the present disclosure also provides a collimating component, including a lens array, a first aperture layer, and a second aperture layer that are stacked in sequence.
  • the lens array is configured to converge light rays and be incident on the first diaphragm layer
  • the first diaphragm layer is configured to transmit the light incident on the first diaphragm layer so that it is incident on the second diaphragm layer, and to limit the angle of light that can pass through the first diaphragm layer; the second diaphragm layer is configured to transmit the light incident on the second diaphragm layer, and to limit the angle of light that can pass through the second diaphragm layer.
  • the lens array includes a plurality of lens units arranged in an array
  • the first aperture layer includes a plurality of first through holes arranged in an array
  • the plurality of lens units and the plurality of first through holes overlap in a one-to-one correspondence in a direction perpendicular to the first aperture layer.
  • the second aperture layer includes a plurality of second through holes arranged in an array, and the plurality of second through holes and the plurality of first through holes overlap in a one-to-one correspondence in a direction perpendicular to the first aperture layer.
  • the opening diameter of the first through hole is greater than or equal to the opening diameter of the second through hole corresponding to the first through hole.
  • the collimating component further includes: a transparent first filling layer located between the lens array and the first aperture layer; and a transparent The second filling layer is located between the first aperture layer and the second aperture layer.
  • the thickness of the first filling layer is equal to the focal length of the lens unit.
  • the ratio of the thickness of the first filling layer to the thickness of the second filling layer is in the range of 4-6.
  • FIG. 1A is a schematic block diagram of a texture image acquisition device provided by at least one embodiment of the present disclosure
  • FIG. 1B is a cross-sectional view of a texture image acquisition device provided by at least one embodiment of the present disclosure
  • 2A is a schematic structural diagram of a collimating component provided by at least one embodiment of the present disclosure
  • 2B is a schematic plan view of a lens array in a collimating component provided by at least one embodiment of the present disclosure
  • 2C is another schematic plan view of the lens array in the collimating component provided by at least one embodiment of the present disclosure
  • 2D is another schematic structural diagram of a collimating component provided by at least one embodiment of the present disclosure.
  • 3A is a schematic diagram of a specific structure of a collimating component provided by at least one exemplary embodiment of the present disclosure
  • Fig. 3B is a design simulation diagram of a collimating component provided by at least one exemplary embodiment of the present disclosure
  • 3C and 3D are diagrams of light simulation results of a collimating component including a single-layer diaphragm provided by at least one embodiment of the present disclosure
  • 3E is a light simulation result diagram of a collimating component including a double-layer diaphragm provided by at least one embodiment of the present disclosure
  • FIG. 4A is a schematic diagram of another texture image acquisition device provided by at least one embodiment of the present disclosure.
  • 4B is a cross-sectional view of another texture image acquisition device provided by at least one embodiment of the present disclosure.
  • 4C is a schematic plan view of the image sensor in the texture image acquisition device provided by at least one embodiment of the present disclosure
  • FIG. 5 is a schematic diagram of a display device provided by at least one embodiment of the present disclosure.
  • 6A is a schematic diagram of a display panel in a display device provided by at least one embodiment of the present disclosure
  • 6B is a schematic diagram of a display array layer in a display panel provided by at least one embodiment of the present disclosure
  • 6C is another schematic diagram of the display array layer in the display panel provided by at least one embodiment of the present disclosure.
  • 6D is a schematic plan view of a display panel provided by at least one embodiment of the present disclosure.
  • FIG. 7 is a schematic cross-sectional view of a display device provided by at least one embodiment of the present disclosure.
  • FIG. 8A is a simulation diagram of the texture of the display device provided by at least one embodiment of the present disclosure.
  • FIG. 8B is a light simulation result diagram of the display device provided by at least one embodiment of the disclosure relative to the lines in FIG. 8A;
  • FIG. 9A is a schematic diagram of a way of bonding a display panel and a collimating component in a display device provided by at least one embodiment of the present disclosure.
  • FIG. 9B is a schematic diagram of another bonding method of the display panel and the collimating component in the display device provided by at least one embodiment of the present disclosure.
  • FIG. 10 is a flowchart of a method for manufacturing a texture image acquisition device provided by at least one embodiment of the present disclosure.
  • the light collimation methods include fiber collimation and through-hole collimation, so as to accurately obtain the valley and ridge information of the fingerprint.
  • the optical fiber collimating device in the related art is generally thicker (for example, the thickness is more than 400 micrometers), and the general application scenarios require a thinner collimating structure (for example, the thickness is less than 100 micrometers).
  • the manufacturing process of the through-hole collimation structure is relatively difficult, the cost is relatively high, and the yield rate is relatively low.
  • the embodiments of the present disclosure propose a texture image acquisition device with a collimating function and a lighter and thinner structure, which can improve the accuracy of the recognized texture information during the optical texture recognition process.
  • At least one embodiment of the present disclosure provides a texture image acquisition device, which includes a collimating component and an image sensor.
  • the collimating component is stacked on the image sensor, and the collimating component includes a lens array, a first diaphragm layer and a second diaphragm layer stacked in sequence, and the second diaphragm layer is adjacent to the image sensor.
  • the lens array is configured to converge light rays onto the first aperture layer; the first aperture layer is configured to transmit the light incident on the first aperture layer toward the second aperture layer and to limit the angle of light that can pass through the first aperture layer; the second aperture layer is configured to transmit the light incident on the second aperture layer (for example, so that it is incident on the image sensor after transmission) and to limit the angle of light that can pass through the second aperture layer; the image sensor is configured to sense the light incident on the image sensor for obtaining a texture image.
  • At least one embodiment of the present disclosure also provides a display device, which includes a display panel and the above-mentioned texture image acquisition device stacked on the display panel.
  • the display panel includes a display side and a back side opposite to the display side; the texture image acquisition device is located on the back side of the display panel, and the collimating component is located between the display panel and the image sensor.
  • the texture image acquisition device is configured to detect light reflected by the target texture on the display side of the display panel and passing through the display panel.
  • At least one embodiment of the present disclosure also provides a collimating component, which includes a lens array, a first aperture layer, and a second aperture layer that are sequentially stacked and arranged.
  • the lens array is configured to converge light rays so that they are incident on the first diaphragm layer.
  • the first aperture layer is configured to transmit light incident to the first aperture layer and enter the second aperture layer, and to limit the angle of light that can pass through the first aperture layer.
  • the second aperture layer is configured to transmit light incident to the second aperture layer and to limit the angle of light that can pass through the second aperture layer.
  • the collimating component, the texture image acquisition device, and the display device provided by the embodiments of the present disclosure have a collimation function and can reduce light crosstalk, thereby improving the accuracy of the acquired image information; they also have the advantage of a light and thin structure, thereby satisfying the requirements of the production process and of application scenarios.
  • FIG. 1A is a schematic block diagram of a texture image acquisition device 10 provided by at least one embodiment of the present disclosure
  • FIG. 1B is a cross-sectional view of the texture image acquisition device 10 provided by at least one embodiment of the disclosure.
  • the texture image acquisition device 10 includes a collimating component 110 and an image sensor 120.
  • the collimating component 110 is stacked on the image sensor 120 and is located in the light incident path of the image sensor 120; with respect to the image sensor 120, the collimating component 110 includes a lens array 111, a first diaphragm layer 112, and a second diaphragm layer 113 stacked in sequence from top to bottom, and the second diaphragm layer 113 is adjacent to the image sensor 120.
  • the lens array 111 is configured to converge light rays onto the first aperture layer 112; the first aperture layer 112 is configured to transmit light toward the second aperture layer 113 and to limit the angle of light that can pass through the first aperture layer 112; the second aperture layer 113 is configured to allow light to pass through and enter the image sensor 120, and to limit the angle of light that can pass through the second aperture layer 113; the image sensor 120 is configured to sense the light irradiated on the image sensor 120 for obtaining a texture image.
  • the image sensor 120 may be a suitable type of image sensor, such as a CMOS or CCD type image sensor. For example, it may be a silicon-based image sensor.
  • the preparation method may be to prepare a photosensitive pixel array on a single-crystal silicon wafer and then cut it; alternatively, the image sensor may be a glass-substrate image sensor, which can be prepared, for example, by forming a photosensitive pixel array on a glass substrate and then cutting it.
  • the texture image acquisition device 10 may also include other structures or functional layers.
  • the layers may be in direct contact with each other, and other layers or components may also be provided.
  • the collimating member 110, which includes the lens array 111, the first aperture layer 112, and the second aperture layer 113, can screen out the light incident on the lens array 111 within a specific angle range (i.e., the collection angle, generally a small angle), in a manner similar to collimation, so that the filtered light reaches the image sensor 120 below and the image sensor 120 can detect the intensity of that light.
  • the first aperture layer 112 can limit the angle of light passing through the first aperture layer 112 (for example, the angle is 0 to X, where X ranges, for example, from 5.7° to 8.5°); light satisfying this angle requirement can pass through the first aperture layer 112, while light with an excessively large angle cannot, thereby achieving the collimation function.
  • after passing through the first aperture layer 112, the light reaches the second aperture layer 113; the second aperture layer 113 can not only limit the angle of the light passing through it, but also block crosstalk light coming from adjacent lens units of the lens array 111.
  • the texture image acquisition device 10 not only has a collimation function, but also can effectively prevent light crosstalk, thereby improving the accuracy of the acquired image information (for example, the information of the valleys and ridges of the skin), and can thereby alleviate the problem of blurring of the acquired image due to the scattering of light.
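  • as an illustration of this angle-screening behavior, the following minimal sketch estimates whether an incident bundle at a given angle reaches the sensor, using a simple paraxial focal-spot-offset model; the formula and the dimensions are illustrative assumptions chosen to be consistent with the ranges quoted in this disclosure, not values it specifies.

```python
import math

# Minimal geometric sketch (not the disclosed design procedure): a bundle of parallel
# rays hitting one micro-lens at incidence angle theta is focused to a spot offset
# from the optical axis by roughly f*tan(theta) at the focal plane, where the first
# through hole sits. The pass/block test below checks that offset against the hole
# radii; all dimensions are example values, not values stated in the disclosure.
f_um  = 16.0   # assumed focal length of one lens unit (h1 ~ f in the text)
h2_um = 3.2    # assumed spacing between the two aperture layers
d1_um = 6.0    # example opening diameter of a first through hole
d2_um = 4.0    # example opening diameter of the corresponding second through hole (d2 <= d1)

def passes_collimator(theta_deg: float) -> bool:
    """Return True if an incident bundle at theta_deg is expected to reach the sensor."""
    offset_first = f_um * math.tan(math.radians(theta_deg))
    if offset_first > d1_um / 2:           # blocked by the first aperture layer
        return False
    # Beyond the focal plane the chief ray keeps walking sideways; a crude estimate
    # of its lateral position at the second aperture layer:
    offset_second = offset_first * (1 + h2_um / f_um)
    return offset_second <= d2_um / 2      # otherwise blocked by the second aperture layer

# With these example numbers the cutoff lands near 6 degrees, i.e. the same order of
# magnitude as the 5.7-8.5 degree collection-angle range quoted in the text.
for theta in (0, 3, 5, 7, 9):
    print(f"{theta:>2} deg -> {'pass' if passes_collimator(theta) else 'blocked'}")
```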
  • the collimating component 110 can also be a separate product, and the collimating component 110 can be applied to any scene where light needs to be collimated; the embodiments of the present disclosure do not limit this.
  • FIG. 2A is a schematic structural diagram of a collimating component 210 provided by at least one embodiment of the present disclosure
  • FIG. 2B is a schematic plan view of the lens array 211 in the collimating component 210 provided by at least one embodiment of the present disclosure
  • FIG. 2C is another schematic plan view of the lens array 211 in the collimating component 210 provided by at least one embodiment of the present disclosure
  • FIG. 2D is another schematic structural view of the collimating component 210 provided by at least one embodiment of the present disclosure.
  • the collimating member 210 includes a lens array 211, a first aperture layer 212, and a second aperture layer 213 that are sequentially stacked.
  • the lens array 211 includes a plurality of lens units 2110 arranged in an array, and the first aperture layer 212 includes a plurality of first through holes 2120 arranged in an array.
  • the plurality of lens units 2110 and the plurality of first through holes 2120 overlap in a one-to-one correspondence in a direction perpendicular to the first diaphragm layer 212.
  • that the plurality of lens units 2110 and the plurality of first through holes 2120 overlap in a one-to-one correspondence in a direction perpendicular to the first aperture layer 212 may mean that the orthographic projection of the first through hole 2120 on the first aperture layer 212 overlaps with the orthographic projection of the corresponding lens unit 2110 on the first aperture layer 212, including partial overlap or complete overlap.
  • for example, the orthographic projection of the first through hole 2120 is located inside the orthographic projection of the corresponding lens unit 2110. The embodiments of the present disclosure do not limit this, and it can be adjusted according to actual needs.
  • the thickness of the collimating member 210 (for example, the sum of the thickness of each film layer, or the total thickness of the collimating member 210) may be less than 100 microns, for example, 1-50 microns, further, for example, 15-40 microns.
  • the aperture size of each lens unit 2110 may be 1-40 microns, for example, 15-30 microns; the aperture of each first through hole 2120 may be 1-20 microns, for example, 8-15 microns.
  • the thickness of the optical fiber collimating device in the related art is more than 400 microns; therefore, the collimating component 210 provided by the embodiments of the present disclosure has the advantage of a light and thin structure, meets the market requirements for the thickness of a collimating component, and is suitable for a variety of application scenarios.
  • the lens array 211 may include lens units 2110 arranged in an array.
  • the lens unit 2110 may be, for example, a micro lens, and the diameter of the lens unit 2110 can be on the order of millimeters, micrometers, or nanometers.
  • the micro lens can be realized as a spherical lens, an aspheric lens, or a binary optical lens (for example, a Fresnel lens).
  • the side of the lens unit 2110 close to the first aperture layer 212 is a plane.
  • each lens unit 2110 is a spherical lens or an aspheric lens
  • the side of the lens unit 2110 away from the first diaphragm layer 212 is a curved surface.
  • the side of the lens unit 2110 with a curved surface is the light-incident side
  • the side of the lens unit 2110 with a flat surface is the light-emitting side.
  • the material of the lens array 211 may include acrylic ultraviolet (UV) curable resins, epoxy UV curable resins, thermosetting resins, etc., or may be other suitable materials, and may be prepared by a suitable molding process. For example, compression molding, photolithography, etc., which are not limited in the embodiments of the present disclosure.
  • the second aperture layer 213 in the collimating member 210 includes a plurality of second through holes 2130 arranged in an array, and the plurality of second through holes 2130 overlap with the plurality of first through holes 2120 in a one-to-one correspondence in a direction perpendicular to the first aperture layer 212.
  • that the plurality of second through holes 2130 and the plurality of first through holes 2120 overlap in a one-to-one correspondence in a direction perpendicular to the first aperture layer 212 may mean that the orthographic projection of the first through hole 2120 on the first aperture layer 212 overlaps with the orthographic projection of the corresponding second through hole 2130 on the first aperture layer 212, including partial overlap or complete overlap.
  • for example, the orthographic projection of the second through hole 2130 is located inside the orthographic projection of the corresponding first through hole 2120; this is not limited in the embodiments of the present disclosure and can be adjusted according to actual needs.
  • the opening shapes of the first through hole 2120 and the second through hole 2130 are not limited, and may be regular shapes such as a circle, a square, a hexagon, etc., or may be an arbitrary irregular shape.
  • the cross-sectional shape of the lens unit 2110 on a plane parallel to the lens array 211 may also be a regular shape such as a circle, a square, a hexagon, etc., or an arbitrary irregular shape, which is not limited in the embodiments of the present disclosure.
  • the opening shapes of the first through hole 2120 and the second through hole 2130 are both circular, and the cross-sectional shape of the lens unit 2110 on a plane parallel to the lens array 211 is also circular, so that the uniformity of the intensity of the light converged onto the image sensor 120 is better in all directions, which further improves the accuracy of the acquired image information.
  • a plurality of lens units 2110 may be arranged in an array in an aligned manner (as shown in FIG. 2B), or may be arranged in an array according to a specific shape (for example, a hexagon) (as shown in FIG. 2C); this is not limited in the embodiments of the present disclosure.
  • when the lens array 211 includes the lens units 2110 arranged in a hexagonal array (as shown in FIG. 2C), the light condensed on the image sensor 120 can be made more sufficient, which further improves the accuracy of the acquired image information.
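  • as a purely illustrative comparison of the aligned arrangement of FIG. 2B and the hexagonal arrangement of FIG. 2C, the sketch below generates lens-unit center coordinates for both layouts at the same pitch; the pitch and grid size are arbitrary example numbers, not design values from this disclosure.

```python
import math

# Hypothetical illustration of the two lens-unit layouts mentioned above: an aligned
# (square) grid versus a hexagonal (offset-row) grid at the same center-to-center pitch.
pitch_um = 25.0   # example pitch between adjacent lens units
rows = cols = 4

def aligned_centers():
    """Lens-unit centers for the aligned arrangement (FIG. 2B style)."""
    return [(c * pitch_um, r * pitch_um) for r in range(rows) for c in range(cols)]

def hexagonal_centers():
    """Lens-unit centers for the hexagonal arrangement (FIG. 2C style): every other
    row is shifted by half a pitch and rows are spaced pitch*sqrt(3)/2 apart, which
    packs the lens units more densely than the aligned grid."""
    row_spacing = pitch_um * math.sqrt(3) / 2
    return [(c * pitch_um + (r % 2) * pitch_um / 2, r * row_spacing)
            for r in range(rows) for c in range(cols)]

print("aligned   :", aligned_centers()[:4])
print("hexagonal :", hexagonal_centers()[:4])
```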
  • the first through hole 2120 has a first opening center, and the first opening center is located at the focal point of the lens unit 2110 corresponding to the first through hole 2120.
  • the first opening center of the first through hole 2120 may include its geometric center.
  • when the first through hole 2120 has an irregular shape, the center of the first opening may be the centroid or the area center of the first through hole 2120, which may be determined according to actual requirements; this is not limited in the embodiments of the present disclosure.
  • the center of the first opening of the first through hole 2120 may be located on the main optical axis of the lens unit 2110, which passes through the optical center of the lens unit 2110.
  • the first through hole 2120 has a first opening center, and the second through hole 2130 has a second opening center; the first opening center of the first through hole 2120 overlaps with the second opening center of the second through hole 2130 corresponding to the first through hole 2120 in the direction perpendicular to the first aperture layer 212.
  • that the first opening center of the first through hole 2120 and the second opening center of the corresponding second through hole 2130 overlap in a direction perpendicular to the first aperture layer 212 may mean, for example, that the first opening center of the first through hole 2120 is coaxial with the second opening center of the corresponding second through hole 2130, that is, the center of the orthographic projection of the first through hole 2120 on the first aperture layer 212 coincides with the center of the orthographic projection of the corresponding second through hole 2130 on the first aperture layer 212.
  • the embodiments of the present disclosure do not limit this, and can be adjusted according to actual needs.
  • the opening diameter of the first through hole 2120 is greater than or equal to the opening diameter of the corresponding second through hole 2130.
  • the opening diameter of the first through hole 2120 may be 1-20 micrometers, and the value of the opening diameter of the second through hole may also be 1-20 micrometers, for example, both may be 8-15 micrometers.
  • the embodiments of the present disclosure do not limit this, and can be adjusted according to actual conditions.
  • the plurality of lens units 2110, the plurality of first through holes 2120, and the plurality of second through holes 2130 are in one-to-one correspondence.
  • for example, one lens unit 2110, one first through hole 2120, and one second through hole 2130 that correspond to each other may be regarded as a group of collimating units, which is used to filter out a beam of collimated light from the received light. Therefore, in this embodiment, the numbers of the plurality of lens units 2110, the plurality of first through holes 2120, and the plurality of second through holes 2130 are equal.
  • the material of the first aperture layer 212 and the second aperture layer 213 may be a light-shielding material, for example, a material that absorbs visible light (with a wavelength of approximately 380-780 nm), such as chromium, chromium oxide, black resin, etc., or, for example, a metal thin film with a low scattering rate, etc.; this is not specifically limited in the embodiments of the present disclosure.
  • the collimating member 210 further includes a transparent first filling layer 214 and a transparent second filling layer 215.
  • the first filling layer 214 is located between the lens array 211 and the first aperture layer 212, and the second filling layer 215 is located between the first aperture layer 212 and the second aperture layer 213.
  • the material of the first filling layer 214 and the second filling layer 215 may be, for example, a transparent material with a light transmittance of more than 90%, etc., and may be an inorganic material or an organic material, which is not specifically limited in the embodiment of the present disclosure.
  • the lens array 211, the first aperture layer 212, the second aperture layer 213, the first filling layer 214, and the second filling layer 215 included in the collimating component 210 can be cured directly above the image sensor (not shown in FIG. 2A), thereby realizing the integration of the collimating component 210.
  • the embodiments of the present disclosure do not limit the specific steps of the manufacturing process, and can be adjusted according to actual conditions.
  • the materials of the lens array 211, the first aperture layer 212, the second aperture layer 213, the first filling layer 214, and the second filling layer 215 may be different, but it is necessary to ensure that their refractive indices are basically the same or differ only slightly from each other.
  • the collimating member 210 may include a lens array 211, a first aperture layer 212, a second aperture layer 213, and a third aperture layer 230 that are sequentially stacked.
  • the third diaphragm layer 230 is disposed between the second diaphragm layer 213 and the image sensor 120, and is configured to allow light to pass through and be incident on the image sensor 120 and to limit the angle of light that can pass through the third diaphragm layer 230.
  • the specific structure design of the third aperture layer 230 can refer to the structures of the above-mentioned first aperture layer 212 and the second aperture layer 213, which will not be repeated here.
  • FIG. 3A shows a schematic diagram of a specific structure of the collimating component 310 provided by an exemplary embodiment of the present disclosure
  • FIG. 3B shows a design simulation diagram of the collimating component 310 provided by an exemplary embodiment of the present disclosure
  • FIGS. 3C and 3D show light simulation result diagrams of a collimating component including a single-layer diaphragm provided by at least one embodiment of the present disclosure
  • FIG. 3E shows a light simulation result diagram of a collimating component including a double-layer diaphragm provided by at least one embodiment of the present disclosure.
  • the collimating component 310 includes a lens array 311, a first filling layer 314, a first aperture layer 312, a second filling layer 315, and a second aperture layer 313 stacked in sequence.
  • the lens array 311 includes a plurality of lens units 3110 arranged in an array
  • the first aperture layer 312 includes a plurality of first through holes 3120 arranged in an array
  • the second aperture layer 313 includes a plurality of second through holes 3130 arranged in an array.
  • the lens unit 3110 may be, for example, a micro lens.
  • in FIG. 3B, D represents the aperture size of the micro lens 3110, h represents the arch height of the micro lens 3110, R represents the radius of curvature of the micro lens 3110, f represents the focal length of the micro lens 3110, H represents the thickness of the collimating member 310, h1 represents the thickness of the first filling layer 314, h2 represents the thickness of the second filling layer 315, d1 represents the opening diameter of the first through hole 3120, d2 represents the opening diameter of the second through hole 3130, and the angle θ represents the light collection angle of the first through hole 3120 (for example, the range of θ is approximately between 5.7° and 8.5°).
  • the relationship between the above structural parameters satisfies the following formula: H ≈ h + h1 + h2, where, since the thicknesses of the first aperture layer 312 and the second aperture layer 313 do not exceed 1 micrometer, and may even be much less than 1 micrometer, they can be ignored.
  • although the above formula of the present disclosure shows a specific relationship between the structural parameters of the collimating component 310, in actual operation the above formula need not be strictly followed; for example, the various structural parameters may be fine-tuned based on actual application requirements or according to the results of multiple experiments or simulations to achieve the best results. Therefore, the embodiments of the present disclosure do not strictly limit the specific values of the structural parameters.
  • the thickness of the collimating member 310 may be less than 100 microns, for example, about 1-50 microns; for example, the aperture size D of the lens unit 3110 may be about 1-25 microns; for example, the opening diameter d1 of the first through hole 3120 of the first diaphragm layer 312 may be about 1-15 microns.
  • the thickness h1 of the first filling layer 314 may be approximately equal to the focal length f of the lens unit 3110 (for example, a micro lens).
  • the ratio of the thickness h1 of the first filling layer 314 to the thickness h2 of the second filling layer 315 may range from 4 to 6, for example, 4.2 to 5.3.
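  • to illustrate how the quoted parameter ranges fit together, the sketch below combines two standard lens relations (thin-lens focal length f = R/(n-1) for a plano-convex lens and spherical-cap sag h = R - sqrt(R^2 - (D/2)^2)) with the thickness relation H ≈ h + h1 + h2 given above; the refractive index, radius of curvature, and aperture size are illustrative assumptions, not values stated in this disclosure.

```python
import math

# Illustrative parameter check, not the disclosed design: standard thin-lens and
# spherical-cap relations combined with H ~ h + h1 + h2 from the text. Only the
# H relation and the value ranges come from the disclosure; n, R, and D are assumed.
n = 1.5        # assumed refractive index of the lens and filling materials
R = 8.0        # assumed radius of curvature of the micro lens, microns
D = 12.0       # micro-lens aperture size, microns (within the 1-25 um range above)

f  = R / (n - 1)                          # thin-lens focal length of a plano-convex lens
h  = R - math.sqrt(R**2 - (D / 2)**2)     # arch height (sag) of the spherical cap
h1 = f                                    # first filling layer thickness ~ focal length f
h2 = h1 / 5                               # h1/h2 ratio chosen inside the 4-6 range above
H  = h + h1 + h2                          # aperture-layer thicknesses (<1 um) neglected

print(f"f = {f:.1f} um, h = {h:.1f} um, h1 = {h1:.1f} um, h2 = {h2:.1f} um")
print(f"H = {H:.1f} um  (well below the 100 um target thickness)")
```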
  • the opening diameter d1 of the first through hole 3120 of the first aperture layer 312 may be greater than, or may be equal to, the opening diameter d2 of the corresponding second through hole 3130.
  • the first through hole 3120 of the first aperture layer 312 has a first opening center, and the first opening center is located at the focal point of the lens unit 3110 corresponding to the first through hole 3120. It should be noted that when the first through hole 3120 has a regular shape, the first opening center of the first through hole 3120 may include its geometric center; when the first through hole 3120 has an irregular shape, the first opening center may be the centroid or the area center of the first through hole 3120, etc., which may be determined according to actual requirements and is not limited in the embodiments of the present disclosure.
  • the center of the first opening of the first through hole 3120 may be located on the main optical axis of the lens unit 3110, which passes through the optical center of the lens unit 3110.
  • for example, in an exemplary design, each structural parameter is chosen such that the thickness H of the collimating member 310 is approximately 20 microns.
  • the collimating component 310 provided by the embodiment of the present disclosure not only has a collimating function, but also can effectively prevent crosstalk of light from adjacent lens units.
  • the thickness H of the collimating component 310 provided by at least one embodiment of the present disclosure is only about 19-21 microns, which is much smaller than the general thickness of a common collimating component and meets the market requirements for thinner and lighter collimating structures.
  • although each aperture (for example, the aperture A or the aperture B) of the single-layer diaphragm 331 shown in, for example, FIG. 3C can receive the light of the corresponding lens unit of the lens array 330 and limit the angle of the light rays passing through that aperture, it inevitably allows light from an adjacent lens unit (for example, the light at point C or point D in FIG. 3C) to be transmitted.
  • therefore, a second layer of diaphragm 332 can be introduced into the collimating part; as shown in FIG. 3E, in the case where the collimating part includes a double-layer diaphragm (for example, the first diaphragm layer 331 and the second diaphragm layer 332), the light from the adjacent lens unit is effectively prevented from passing through the second diaphragm layer 332. Therefore, the texture image acquisition device provided by at least one embodiment of the present disclosure can effectively improve the accuracy of the acquired image information.
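  • the crosstalk argument above can also be made concrete with a small geometric sketch: a stray ray coming through a neighbouring lens unit may still slip through a first-layer opening, but because it travels at a steep slant it misses the corresponding second-layer opening; the pitch, spacings, and hole sizes below are illustrative assumptions consistent with the ranges quoted earlier, not disclosed design values.

```python
# Hypothetical crosstalk check: is a ray that enters through the neighbouring lens unit
# (one pitch away) blocked by the second aperture layer even when it slips through a
# first-layer opening? All dimensions are example values only.
pitch = 25.0               # assumed center-to-center spacing of lens units, microns
f, h2 = 16.0, 3.2          # assumed first / second filling layer thicknesses, microns
d1, d2 = 6.0, 4.0          # example first / second through-hole diameters, microns

def reaches_sensor(x_first: float, slope: float) -> bool:
    """x_first: lateral ray position at the first aperture layer, measured from the
    axis of the hole directly below the lens unit of interest; slope: lateral drift
    per micron of depth as the ray continues downward."""
    through_first = any(abs(x_first - k * pitch) <= d1 / 2 for k in (-1, 0, 1))
    x_second = x_first + slope * h2
    through_second = any(abs(x_second - k * pitch) <= d2 / 2 for k in (-1, 0, 1))
    return through_first and through_second

# A well-collimated ray from the lens directly above: small offset, small slope -> passes.
print("own lens unit      :", reaches_sensor(x_first=1.0, slope=0.05))
# A stray ray from the adjacent lens unit: it happens to land near this hole at the
# first layer, but its steep slope (roughly pitch/f) walks it off every second-layer hole.
print("adjacent lens unit :", reaches_sensor(x_first=2.0, slope=pitch / f))
```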
  • 4A is another schematic block diagram of the texture image acquisition device 40 provided by at least one embodiment of the disclosure
  • FIG. 4B is another cross-sectional view of the texture image acquisition device 40 provided by at least one embodiment of the disclosure
  • FIG. 4C is a plan view of the image sensor 420 of the texture image acquisition device 40 provided by at least one embodiment of the present disclosure.
  • the texture image acquisition device 40 provided by at least one embodiment of the present disclosure includes a collimating component 410 and an image sensor 420, and the collimating component 410 is stacked on the image sensor 420.
  • the image sensor 420 may include a plurality of imaging pixels 421 arranged in an array, as shown in FIG. 4C.
  • each imaging pixel 421 can be implemented in an appropriate manner, for example, it can include a photodiode and a switching transistor.
  • the photodiode can convert the light signal irradiated on it into an electrical signal.
  • the switching transistor can be electrically connected to the photodiode to control whether the photodiode is in the state of collecting optical signals and the time of collecting optical signals.
  • the type and setting method of the photodiode can be set according to actual application requirements, which are not specifically limited in the embodiments of the present disclosure.
  • the photodiode may be a PIN junction type photodiode or a phototransistor, etc., so that the response speed of the photodiode can be improved.
  • the imaging pixel 421 is configured to detect the light transmitted by the collimating member 410, thereby acquiring image information.
  • the embodiment of the present disclosure does not limit the specific number of imaging pixels 421 included in the image sensor 420.
  • for example, one imaging pixel 421 may be arranged corresponding to one group of the above-mentioned collimating units, or one imaging pixel 421 may be arranged corresponding to multiple groups of the above-mentioned collimating units; the number of imaging pixels can be set according to actual application requirements.
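  • purely as an illustrative data-layout sketch (no such mapping is specified in this disclosure), the snippet below pairs imaging pixels with groups of collimating units in the two configurations just mentioned: one group per pixel, or several groups per pixel.

```python
# Illustrative only: pairing imaging pixels 421 with collimating-unit groups in the two
# configurations mentioned above. Group counts and pixel counts are arbitrary examples.
def assign_groups(num_pixels: int, groups_per_pixel: int) -> dict:
    """Map each imaging-pixel index to the list of collimating-unit group indices it reads."""
    return {p: list(range(p * groups_per_pixel, (p + 1) * groups_per_pixel))
            for p in range(num_pixels)}

print(assign_groups(num_pixels=3, groups_per_pixel=1))  # one group per imaging pixel
print(assign_groups(num_pixels=3, groups_per_pixel=4))  # several groups per imaging pixel
```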
  • the image sensor 420 includes an imaging pixel 421, and the imaging pixel 421 may include a switching transistor 4201 and a photodiode 4202.
  • the switching transistor 4201 and the photodiode 4202 are both disposed on the base substrate 4211.
  • the switching transistor 4201 includes an active layer 423, a gate 422, a first electrode 4241, and a second electrode 4242.
  • the gate 422 of the switching transistor 4201 is disposed on the base substrate 4211 and is covered by the first insulating layer 4291.
  • the active layer 423 of the switching transistor 4201 is disposed on the first insulating layer 4291 and is covered by the second insulating layer 4292.
  • the first electrode 4241 and the second electrode 4242 of the switching transistor 4201 are symmetrically arranged, and the two can be interchanged, one of which is the source and the other is the drain.
  • the photodiode 4202 includes a PIN junction 426, a first electrode 425, and a second electrode 427.
  • for example, the second electrode 427 is made of a transparent conductive material, such as indium tin oxide (ITO).
  • the second electrode 427 is electrically connected to a bias voltage line (not shown in the figure) via the metal layer 428, and the first electrode 425 is electrically connected to the second electrode 4242 of the switching transistor 4201.
  • the metal layer 428 may cover the active layer 423 of the switching transistor 4201 to prevent the active layer 423 of the switching transistor 4201 from being affected by external light and avoid leakage current.
  • the switching transistor 4201 may be a top-gate thin film transistor or a bottom-gate thin film transistor.
  • FIG. 4B shows a case where it is a bottom-gate thin film transistor.
  • the photodiode 4202 is a PIN junction type, but the embodiments of the present disclosure are not limited to this.
  • the photodiode 4202 may also be a PN junction type, as long as the sensitivity of the photodiode 4202 meets the demand.
  • the material of each film layer in the image sensor 420 can be a conventional material, and the description of each film layer can refer to the conventional design, which will not be described in detail here.
  • the relative positional relationship and the size ratio relationship of the respective film layers are only illustrative and not restrictive.
  • the orthographic projection of the photosensitive surface of the PIN junction 426 of the photodiode 4202 on the first aperture layer 411 can cover a plurality of first through holes 4110, and the orthographic projection of the photosensitive surface of the PIN junction 426 of the photodiode 4202 on the second aperture layer 412 can cover a plurality of second through holes 4120; this can be set according to actual requirements and is not limited in the embodiments of the present disclosure.
  • the bonding method between the collimating component 410 and the image sensor 420 in the texture image acquisition device 40 can be set according to actual application requirements; the embodiments of the present disclosure do not specifically restrict this.
  • for example, the collimating component 410 and the image sensor 420 may be combined by adhesive bonding.
  • the texture image acquisition device 40 may further include an intermediate medium layer 430.
  • the intermediate medium layer 430 is disposed between the collimating member 410 and the image sensor 420.
  • the intermediate medium layer 430 has adhesiveness and can adhere the collimating member 410 and the image sensor 420.
  • for example, the intermediate medium layer 430 may be an optically clear adhesive (OCA).
  • the intermediate medium layer 430 may also have the function of adjusting the distance between the collimating member 410 and the image sensor 420 in a direction perpendicular to the image sensor 420.
  • the distance between the collimating member 410 and the image sensor 420 in the direction perpendicular to the image sensor 420 can be adjusted by adjusting the thickness of the optical glue.
  • for example, the intermediate medium layer 430 may also include a transparent substrate (not shown in the figure), and optical glue may be provided on both sides of the transparent substrate (for example, the transparent substrate may be inserted into the optical glue), thereby improving the distance-adjustment capability of the intermediate medium layer 430.
  • the texture image acquisition device 40 may also include other structures or functional layers.
  • the layers can be in direct contact with each other, and other layers or components can also be provided.
  • FIG. 5 is a schematic diagram of a display device 50 provided by at least one embodiment of the present disclosure.
  • the display device 50 includes a display panel 500 and a texture image acquisition device 501 stacked on the display panel 500.
  • the texture image acquisition device 501 may be the above-mentioned texture image acquisition device 10/40.
  • the display panel 500 includes a display side and a back side opposite to the display side; the texture image acquisition device 501 is located on the back side of the display panel 500, and the collimating component 510 in the texture image acquisition device 501 is located between the display panel 500 and the image sensor 520.
  • the texture image acquisition device 501 is configured to detect the light reflected by the target texture (for example, fingerprints, palm prints, etc.) on the display side of the display panel 500 and passing through the display panel 500.
  • the display panel 500 may be an organic light emitting diode display panel, a quantum dot light emitting diode display panel, or a liquid crystal display panel, which is not specifically limited in the embodiments of the present disclosure.
  • FIG. 6A is a schematic diagram of the display panel 610 in the display device 60 provided by at least one embodiment of the present disclosure
  • FIG. 6B is a schematic diagram of the display array layer 603 in the display panel 610 provided by at least one embodiment of the present disclosure
  • FIG. 6C is another schematic diagram of the display array layer 603 in the display panel 610 provided by at least one embodiment of the present disclosure.
  • FIG. 6D is a schematic plan view of the display panel 610 provided by at least one embodiment of the present disclosure.
  • the display device 60 includes a display panel 610 and a collimating member 620 and an image sensor 630 stacked on the display panel 610.
  • the display panel 610 includes structures such as a top film 601, a thin film package 602, a display array layer 603, a flexible substrate 604, and a bottom film 605.
  • the bottom film 605 provides protection and support for the other structures and functional layers located on it, and is, for example, a relatively strong plastic substrate or a glass substrate.
  • the flexible substrate 604 is used to provide cushioning, and is, for example, a flexible substrate made of materials such as polyimide (PI), polypropylene (PP), polycarbonate (PC), and the like.
  • the display array layer 603 is formed on the flexible substrate 604.
  • the display array layer 603 may include a plurality of pixel units 621 arranged in a predetermined array, as well as signal lines (including gate lines, data lines, detection lines, etc.) for providing electrical signals (including scan signals, data signals, detection signals, etc.); each pixel unit 621 includes a light-emitting device (for example, an OLED device) and a pixel driving circuit for driving the light-emitting device to emit light.
  • the pixel unit 621 emits light 101, which is used for display and, for example, as incident light for fingerprint detection under the screen.
  • the light 101 emitted by the light-emitting device in the pixel unit 621 is reflected by the user's skin (finger or palm) 650 on the display side of the display panel to be further used for obtaining the user's skin texture image.
  • the thin-film encapsulation 602 covers the display array layer 603 to prevent external moisture from entering the display array layer 603 and causing aging or degradation; it may be a multi-layer thin-film encapsulation, for example one including stacked inorganic insulating layers and organic insulating layers.
  • the top film 601 may be a cover plate, such as a substrate or a thick film made of glass or plastic, to provide support and protection, such as for the user to perform touch, press and other operations.
  • the display panel 610 may also include other structures or functional layers as required.
  • the display panel 610 may include a touch structure for realizing a touch function.
  • the touch structure may be built into the display array layer 603, or formed on the top film 601, for example, and may be a capacitive type, a resistive type, or the like.
  • the display array layer 603 may include a plurality of pixel units 621 arranged in an array.
  • each pixel unit 621 may include a light-emitting diode 622, which may be an organic light-emitting diode, a quantum dot light-emitting diode, or an inorganic light-emitting diode; the embodiments of the present disclosure do not specifically limit this.
  • each light emitting diode 622 may include a cathode 625, a light emitting layer 624, and an anode 623.
  • the light-emitting diode 622 may include a reflective electrode, so that the light emitted by the light-emitting diode 622 can be reflected toward the display side (for example, the top film 601), thereby improving the luminous efficiency of the light-emitting diode 622 and reducing its impact on the image sensor 620.
  • for example, for a top-emission organic light-emitting diode 622, the anode 623 may be configured to include a reflective electrode; for instance, the anode 623 may be made of a stack of indium tin oxide (ITO) and a metal layer.
  • the reflective electrode may have an opening 629, thereby allowing light reflected from the display side to pass through.
  • the opening 629 may allow light reflected from the top film 601 to pass through; for another example, the opening 629 may also allow light reflected from the finger skin 650 to pass through.
  • no anode or cathode is provided in the portion corresponding to the opening 629, so that this portion does not emit light, thereby avoiding any effect on the acquisition of image information.
  • the light-emitting diode 622 may further include a reflective layer 628, which can reflect the light emitted by the light-emitting diode 622 to the display side.
  • the cathode 625 and the anode 623 can be made of a transparent conductive material or of a thin metal, so that the cathode 625 and the anode 623 are transparent or semi-transparent (for example, the cathode 625 and the anode 623 may be made of a transparent conductive oxide material and a transparent alloy material, respectively).
  • the reflective layer 628 may be a multilayer dielectric film that can reflect light emitted by the light emitting diode 622, and the multilayer dielectric film can reflect visible light, for example.
  • the reflective layer 628 may have openings (not shown in FIG. 6C), thereby allowing light reflected from the display side to pass through.
  • the reflective layer 628 may have the ability to transmit light of a specific wavelength band (for example, infrared band) or wavelength (for example, 976 nm).
  • when the display panel 610 is a liquid crystal display panel, one or more openings may be provided in the backlight included in the liquid crystal display panel (i.e., on the side close to the collimating member 620), so that light reflected from the display side passes through the openings to reach the collimating member 620, then passes through the collimating member 620 to reach the image sensor 630, thereby realizing light detection.
  • the display panel 610 may include a display area 611 and a peripheral area 612; the peripheral area 612 may be an area located at the edge of the display panel 610, and the display area 611 may be an area surrounded by the peripheral area 612.
  • the image sensor 420 may include a plurality of imaging pixels 421 arranged in an array.
  • the plurality of imaging pixels 421 may be arranged in an array in the display area 611 of the display panel 610.
  • the imaging pixel 421 can detect light reflected from the skin (eg, finger, palm, etc.) 650 and transmitted through the collimating member 620, so that image information can be acquired.
  • the following takes the display device 70 shown in FIG. 7 as an example to describe in detail the texture image acquisition process of the display device 70 provided by at least one embodiment of the present disclosure.
  • a fingerprint 701 of a finger includes a pattern formed by protrusions (i.e., fingerprint ridges) and depressions (i.e., fingerprint valleys); when the finger is in contact with the display-side substrate (for example, a cover plate) 711, the fingerprint ridges contact the display-side substrate 711 while the fingerprint valleys do not; the air between a fingerprint valley and the display-side substrate 711 forms a first interface 712 with the display-side substrate 711, and a fingerprint ridge (i.e., the skin) forms a second interface 713 with the display-side substrate 711.
  • compared with the refractive index of air (approximately 1), the refractive index of skin (for example, 1.55) is much closer to the refractive index of the display-side substrate 711 (for example, 1.5 for a typical glass cover plate), so the intensity of the light emitted by the pixel unit 621 that is reflected at the first interface 712 is greater than the intensity reflected at the second interface 713.
  • at normal incidence, the reflectance of an interface is Ref = ((nr - 1)/(nr + 1))², where nr is the relative refractive index of the interface.
  • with the refractive indices of air, skin, and the display-side substrate 711 taken as 1, 1.55, and 1.5, respectively,
  • the relative refractive index nr of the first interface 712 is 1.5, giving a reflectance of about 4%,
  • while the relative refractive index nr of the second interface 713 is about 0.97 (1.5/1.55), giving a reflectance of about 0.023%; the ratio of the reflected intensities at the two interfaces is therefore roughly (4%)/(0.023%) ≈ 174.
  • compared with the intensity of light that is reflected at the first interface 712, returns through the display panel 710, and passes through the collimating part 720 onto the image sensor 730,
  • the intensity of light that is reflected by the finger skin, returns through the display panel 710, and passes through the collimating part 720 onto the image sensor 730 is weak (for example, even negligible); therefore, in the fingerprint image acquired by the image sensor 730, the image areas corresponding to the fingerprint valleys are brighter and the image areas corresponding to the fingerprint ridges are darker.
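As a numerical illustration of this contrast (a minimal sketch, not part of the original disclosure; it simply plugs the relative refractive indices quoted above into the normal-incidence reflectance formula):

```python
def reflectance(nr: float) -> float:
    """Normal-incidence reflectance Ref = ((nr - 1) / (nr + 1))**2 for a relative refractive index nr."""
    return ((nr - 1.0) / (nr + 1.0)) ** 2

# Relative refractive indices quoted in the disclosure:
nr_valley = 1.5    # first interface 712: cover glass (n ~ 1.5) against the air gap (n ~ 1.0)
nr_ridge = 0.97    # second interface 713: cover glass (n ~ 1.5) against skin (n ~ 1.55)

ref_valley = reflectance(nr_valley)   # ~0.04    (4%)
ref_ridge = reflectance(nr_ridge)     # ~0.00023 (0.023%)
print(ref_valley / ref_ridge)         # ~173; the disclosure quotes (4%)/(0.023%) = 173.9
```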
  • FIG. 8A is a pattern simulation diagram provided by at least one embodiment of the present disclosure
  • FIG. 8B is a light simulation result diagram of the display device using the collimating component 310 in FIG. 3A relative to the pattern in FIG. 8A
  • Fig. 8A shows four vertical stripes simulating skin textures, which are placed on the display side of the display panel and located in the display area of the display side
  • Fig. 8B is a diagram of the luminous flux detected on the image sensor. Comparing FIGS. 8A and 8B, the luminous flux of the image areas corresponding to the skin ridges (i.e., the vertical stripes in FIG. 8A) is weaker, while the luminous flux of the image areas corresponding to the skin valleys (i.e., the blanks between adjacent vertical stripes in FIG. 8A) is stronger; therefore, from the different light intensities detected by the image sensor, accurate information about the valleys and ridges of the skin can be obtained, so as to realize texture image acquisition.
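The disclosure does not spell out how the detected flux map is converted into ridge/valley information; one hypothetical post-processing step, shown purely as an illustration, would be a simple global threshold on the per-pixel flux:

```python
import numpy as np

def ridge_valley_map(flux: np.ndarray) -> np.ndarray:
    """Classify each imaging pixel as ridge (True) or valley (False) from its detected flux.
    A naive global mean threshold is used here only for illustration."""
    return flux < flux.mean()   # ridges return less light to the sensor, so they read darker

# Synthetic example loosely mimicking FIG. 8A: four dark vertical stripes (ridges)
# on a brighter background (valleys).
flux = np.ones((64, 64))
for start in (8, 24, 40, 56):
    flux[:, start:start + 4] = 0.2
print(ridge_valley_map(flux).sum(), "pixels classified as ridge")
```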
  • FIG. 9A is a schematic diagram of a bonding method between the display panel 900 and the collimating member 910 included in the display device 90 provided by at least one embodiment of the present disclosure
  • FIG. 9B is a schematic diagram of another bonding method between the display panel 900 and the collimating member 910 provided by at least one embodiment of the present disclosure.
  • the display panel 900 and the collimating component 910 can be bonded by frame bonding.
  • the display device 90 may include a plastic frame 901, and the plastic frame 901 is disposed between the display panel 900 and the collimating component 910.
  • the plastic frame 901 may be disposed between the back side of the display panel 900 (for example, under the bottom film 605 shown in FIG. 6A) and the first filling layer 914 of the collimating member 910, and is located in the peripheral area of the display panel 900 (612 in FIG. 6D).
  • the plastic frame 901 can adhere the display panel 900 and the collimating member 910, and can provide a supporting function.
  • for the frame-bonding approach shown in FIG. 9A, the medium surrounding the plurality of lens units (for example, microlenses) 9110 included in the lens array 911 may be air, so the refractive index of the medium surrounding the lens units 9110 is 1.
  • in this case, the lens units (for example, microlenses) 9110 may be made of a material with a refractive index of about 1.4-1.6 (for example, 1.48-1.55), such as PMMA (polymethyl methacrylate),
  • in which case the refractive index of the lens units (for example, microlenses) 9110 is about 1.4918.
  • the embodiments of the present disclosure include but are not limited to this.
  • the display panel 900 and the collimating component 910 can also be attached in a surface attaching manner.
  • the display panel 900 may also include a filling medium 902, which is arranged between the display panel 900 and the collimating member 910 and fills the space around the lens units (for example, microlenses) 9110.
  • for the surface-bonding approach, the lens units (for example, microlenses) 9110 may be made of a material with a refractive index of about 1.7-1.9 (for example, 1.75-1.85),
  • and the refractive index of the lens units 9110 may be, for example, about 1.8.
  • the filling medium 902 may be made of a material with a refractive index of about 1.2-1.4 (for example, 1.25-1.35).
  • the refractive index of the filling medium 902 may be, for example, about 1.3.
  • the disclosed embodiments include but are not limited to this.
  • the bonding manner of the collimating component 910 and the display panel 900 can be selected according to actual application requirements, which is not specifically limited in the embodiment of the present disclosure.
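The index pairs quoted for the two bonding schemes can be read as keeping the microlens's focusing power roughly comparable whether the lens sits in air (frame bonding) or in the filling medium 902 (surface bonding). The sketch below illustrates this with the standard thin plano-convex approximation f = n_medium·R/(n_lens − n_medium); this immersed form is an assumption of the sketch, since the disclosure itself only gives f = R/(n − 1) for the in-air case, and the radius value is the example quoted elsewhere in the disclosure:

```python
def focal_length(R_um: float, n_lens: float, n_medium: float = 1.0) -> float:
    """Thin plano-convex lens focal length when immersed in a medium (assumed approximation)."""
    return n_medium * R_um / (n_lens - n_medium)

R = 7.25  # example microlens radius of curvature in micrometers

f_frame = focal_length(R, n_lens=1.4918, n_medium=1.0)   # frame bonding: PMMA lens in air
f_surface = focal_length(R, n_lens=1.8, n_medium=1.3)    # surface bonding: high-index lens in filling medium

print(f"frame bonding   f ~ {f_frame:.1f} um")    # ~14.7 um
print(f"surface bonding f ~ {f_surface:.1f} um")  # ~18.9 um, i.e. the same order of magnitude
```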
  • FIG. 10 is a flowchart of a method for manufacturing a texture image acquisition device provided by at least one embodiment of the present disclosure.
  • the preparation method of the texture image acquisition device includes the following steps:
  • Step S101: providing an image sensor;
  • Step S102: providing a collimating component on the image sensor.
  • providing a collimating component on the image sensor includes: curing, layer by layer, the lens array, the first aperture layer, the second aperture layer, the first filling layer, and the second filling layer included in the collimating component directly above the image sensor, thereby realizing the integration of the collimating component.
  • for example, the second aperture layer may be formed on the image sensor and cured.
  • a second filling layer is then formed on the second aperture layer and cured.
  • a first aperture layer is formed on the second filling layer and cured.
  • a first filling layer is then formed on the first aperture layer and cured, and finally the lens array is formed on the first filling layer and cured.
  • in this way, each film layer can be formed integrally, so that the integration of the collimating component can be realized.
  • the layer curing of the collimating component may include, but is not limited to, coating, sputtering, pre-baking, exposure, development, post-baking and other processes; it should be noted that the embodiments of the present disclosure do not limit the specific steps of the preparation process, which can be adjusted according to the actual situation.
  • a black resin or a metal with relatively low reflectivity (for example, molybdenum) may be used to fabricate the first aperture layer and the second aperture layer by an exposure process.
  • the first filling layer and the second filling layer are made by a stepwise curing method (the final film thickness is, for example, 0.05 micrometers).
  • the arrangement pattern of the lens array is formed by exposure, and then the lens unit is formed by post-baking (for example, the middle is thick and the sides are thin).
  • the materials of the lens array, the first aperture layer, the second aperture layer, the first filling layer, and the second filling layer in the collimating component can be different, but it is necessary to ensure that their refractive indices are substantially the same.
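The description elsewhere in this disclosure ties the layer thicknesses of the collimating component to the lens geometry through the relations R = D²/(8·h) + h/2, h + h1 = f = R/(n − 1), d = 2·(h + h1)·tanθ, and H ≈ h + h1 + h2 (the aperture layers being thin enough to neglect). A minimal sketch of these relations, using values close to the worked example in the disclosure (D = 10 µm, h = 2 µm, PMMA-like n ≈ 1.49, acceptance half-angle of a few degrees, h2 = 3 µm); the disclosure notes that the formulas are a starting point which may be fine-tuned experimentally:

```python
import math

def collimator_parameters(D: float, h: float, n: float, theta_deg: float, h2: float) -> dict:
    """Derive the collimator layer parameters (micrometers) from the design relations above."""
    R = D ** 2 / (8.0 * h) + h / 2.0                   # lens radius of curvature
    f = R / (n - 1.0)                                  # focal length of the lens unit in air
    h1 = f - h                                         # first filling layer puts the first aperture at the focus
    d = 2.0 * f * math.tan(math.radians(theta_deg))    # aperture opening for acceptance half-angle theta
    H = h + h1 + h2                                    # total thickness, aperture layers (<1 um) neglected
    return {"R": R, "f": f, "h1": h1, "d": d, "H": H}

print(collimator_parameters(D=10.0, h=2.0, n=1.49, theta_deg=6.0, h2=3.0))
# -> R ~ 7.25, f ~ 14.8, h1 ~ 12.8, d ~ 3.1, H ~ 17.8 (same ballpark as the ~20 um worked example)
```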

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Nonlinear Science (AREA)
  • Multimedia (AREA)
  • Inorganic Chemistry (AREA)
  • Mathematical Physics (AREA)
  • Crystallography & Structural Chemistry (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Sustainable Development (AREA)
  • Optical Elements Other Than Lenses (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Solid State Image Pick-Up Elements (AREA)
  • Image Input (AREA)

Abstract

A texture image acquisition device, a display device, and a collimating component. The texture image acquisition device (10) includes a collimating component (110) and an image sensor (120). The collimating component (110) includes a lens array (111), a first aperture layer (112), and a second aperture layer (113). The lens array (111) converges light and makes it incident on the first aperture layer (112); the first aperture layer (112) transmits the light incident on it toward the second aperture layer (113) and limits the angle of the light that can pass through the first aperture layer (112); the second aperture layer (113) transmits the light incident on it and limits the angle of the light that can pass through the second aperture layer (113); the image sensor (120) senses the light incident on it in order to acquire a texture image. The texture image acquisition device (10) has a thin and light structure and can improve the accuracy of acquiring texture images.

Description

纹路图像获取装置、显示装置及准直部件 技术领域
本公开的实施例涉及一种纹路图像获取装置、显示装置及准直部件。
背景技术
随着移动终端的日益普及,越来越多的用户使用移动终端进行身份验证、电子支付等操作。由于皮肤纹路图案的唯一性,结合光学成像的指纹识别技术和掌纹识别技术等逐渐被移动电子设备采用以用于身份验证、电子支付等。
发明内容
本公开的至少一个实施例提供了一种纹路图像获取装置,该纹路图像获取装置包括准直部件和图像传感器。所述准直部件堆叠于所述图像传感器之上,所述准直部件包括依次层叠设置的透镜阵列、第一光阑层和第二光阑层,所述第二光阑层与所述图像传感器相邻。所述透镜阵列配置为使光线汇聚并入射到所述第一光阑层,所述第一光阑层配置为使入射到所述第一光阑层的光线透过并入射到所述第二光阑层,以及对可透过所述第一光阑层的光线的角度进行限制,所述第二光阑层配置为使入射到所述第二光阑层的光线透过,以及对可透过所述第二光阑层的光线的角度进行限制,所述图像传感器配置为感测入射到所述图像传感器上的光线以用于获取纹路图像。
例如,在本公开的至少一个实施例提供的纹路图像获取装置中,所述透镜阵列包括阵列排布的多个透镜单元,所述第一光阑层包括阵列排布的多个第一通孔。所述多个透镜单元与所述多个第一通孔在垂直于所述第一光阑层的方向上一一对应重叠。
例如,在本公开的至少一个实施例提供的纹路图像获取装置中,至少一个所述透镜单元的曲率半径R、口径D和拱高h之间满足关系式:R=D 2/(8*h)+h/2。
例如,在本公开的至少一个实施例提供的纹路图像获取装置中,所述准直部件的厚度小于100微米,至少一个所述透镜单元的口径为1-40微米,至 少一个所述第一通孔的开口口径为1-20微米。
例如,在本公开的至少一个实施例提供的纹路图像获取装置中,所述第二光阑层包括阵列排布的多个第二通孔,所述多个第二通孔与所述多个第一通孔在垂直于所述第一光阑层的方向上一一对应重叠。
例如,在本公开的至少一个实施例提供的纹路图像获取装置中,所述第一通孔具有第一开口中心,所述第二通孔具有第二开口中心,所述第一通孔的第一开口中心与所述第一通孔所对应的第二通孔的第二开口中心在垂直于所述图像传感器的方向上重叠。
例如,在本公开的至少一个实施例提供的纹路图像获取装置中,所述第一通孔的开口口径大于或等于所对应的第二通孔的开口口径。
例如,在本公开的至少一个实施例提供的纹路图像获取装置中,所述多个透镜单元、所述多个第一通孔和所述多个第二通孔的数量相等。
例如,在本公开的至少一个实施例提供的纹路图像获取装置中,所述第一通孔具有第一开口中心,且所述第一开口中心位于所述第一通孔所对应的透镜单元的焦点处。
例如,在本公开的至少一个实施例提供的纹路图像获取装置中,所述准直部件还包括透明的第一填充层,位于所述透镜阵列和所述第一光阑层之间;透明的第二填充层,位于所述第一光阑层和所述第二光阑层之间。
例如,在本公开的至少一个实施例提供的纹路图像获取装置中,所述第一填充层的厚度等于所述透镜单元的焦距。
例如,在本公开的至少一个实施例提供的纹路图像获取装置中,所述第一填充层的厚度和所述第二填充层的厚度的比值范围为4~6。
例如,本公开的至少一个实施例提供的纹路图像获取装置还包括中间介质。所述中间介质设置在所述图像传感器和所述准直部件之间,所述中间介质配置为黏合所述图像传感器和所述准直部件,且调节所述图像传感器和所述准直部件在垂直于所述图像传感器的方向上的间距。
例如,在本公开的至少一个实施例提供的纹路图像获取装置中,所述多个透镜单元以六边形阵列排布。
例如,本公开的至少一个实施例提供的纹路图像获取装置,还包括第三光阑层。所述第三光阑层位于所述第二光阑层和所述图像传感器之间,且配 置为使光线透过并入射到所述图像传感器以及对可透过所述第三光阑层的光线的角度进行限制。
本公开的至少一个实施例还提供了一种显示装置,该显示装置包括显示面板,以及与所述显示面板层叠设置的上述纹路图像获取装置。所述显示面板包括显示侧和与所述显示侧相对的背侧,所述纹路图像获取装置位于所述显示面板的背侧,所述准直部件位于所述显示面板与所述图像传感器之间。所述纹路图像获取装置配置为对被所述显示面板的显示侧的对象纹路反射且通过所述显示面板的光线进行检测。
例如,在本公开的至少一个实施例提供的显示装置中,所述显示面板包括显示区域和周边区域。所述准直部件和所述图像传感器设置在所述显示面板的显示区域中且位于所述显示面板的背侧。
例如,本公开的至少一个实施例提供的显示装置还包括胶框。所述胶框设置在所述显示面板与所述准直部件之间,所述胶框设置在所述显示面板的周边区域中且位于所述显示面板的背侧。
例如,本公开的至少一个实施例提供的显示装置还包括填充介质。所述填充介质设置在所述显示面板与所述准直部件之间,且填充在所述显示面板的背侧与所述准直部件的透镜阵列之间的间隙处。
例如,在本公开的至少一个实施例提供的显示装置中,所述显示面板包括有机发光二极管显示面板、量子点发光二极管显示面板或液晶显示面板。
本公开的至少一个实施例还提供了一种准直部件,包括依次层叠设置的透镜阵列、第一光阑层和第二光阑层。所述透镜阵列配置为使光线汇聚并入射到所述第一光阑层,所述第一光阑层配置为使所述入射到所述第一光阑层的光线透过并入射到所述第二光阑层,以及对可透过所述第一光阑层的光线的角度进行限制,所述第二光阑层配置为使所述入射到所述第二光阑层的光线透过,以及对可透过所述第二光阑层的光线的角度进行限制。
例如,在本公开的至少一个实施例提供的准直部件中,所述透镜阵列包括阵列排布的多个透镜单元,所述第一光阑层包括阵列排布的多个第一通孔,所述多个透镜单元与所述多个第一通孔在垂直于所述第一光阑层的方向上一一对应重叠。
例如,在本公开的至少一个实施例提供的准直部件中,至少一个所述透 镜单元的曲率半径R、口径D和拱高h之间满足关系式:R=D 2/(8*h)+h/2。例如,在本公开的至少一个实施例提供的准直部件中,所述第二光阑层包括阵列排布的多个第二通孔,所述多个第二通孔与所述多个第一通孔在垂直于所述第一光阑层的方向上一一对应重叠。
例如,在本公开的至少一个实施例提供的准直部件中,所述第一通孔的开口口径大于或等于所述第一通孔所对应的第二通孔的开口口径。
例如,在本公开的至少一个实施例提供的准直部件中,所述准直部件还包括:透明的第一填充层,位于所述透镜阵列和所述第一光阑层之间;透明的第二填充层,位于所述第一光阑层和所述第二光阑层之间。
例如,在本公开的至少一个实施例提供的准直部件中,所述第一填充层的厚度等于所述透镜单元的焦距。
例如,在本公开的至少一个实施例提供的准直部件中,所述第一填充层的厚度和所述第二填充层的厚度的比值范围为4~6。
附图说明
为了更清楚地说明本公开实施例的技术方案,下面将对实施例的附图作简单地介绍,显而易见地,下面描述的附图仅仅涉及本公开的一些实施例,而非对本公开的限制。
图1A为本公开至少一个实施例提供的纹路图像获取装置的示意框图;
图1B为本公开至少一个实施例提供的纹路图像获取装置的剖面图;
图2A为本公开至少一个实施例提供的准直部件的结构示意图;
图2B为本公开至少一个实施例提供的准直部件中的透镜阵列的平面示意图;
图2C为本公开至少一个实施例提供的准直部件中的透镜阵列的另一平面示意图;
图2D为本公开至少一个实施例提供的准直部件的另一结构示意图;
图3A为本公开至少一个示例性实施例提供的准直部件的具体结构示意图;
图3B为本公开至少一个示例性实施例提供的准直部件的设计仿真图;
图3C和3D为本公开至少一个实施例提供的包括单层光阑的准直部件的光线仿真结果图;
图3E为本公开至少一个实施例提供的包括双层光阑的准直部件的光线仿真结果图;
图4A为本公开至少一个实施例提供的另一纹路图像获取装置的示意图;
图4B为本公开至少一个实施例提供的另一纹路图像获取装置的剖面图;
图4C为本公开至少一个实施例提供的纹路图像获取装置中图像传感器的平面示意图;
图5为本公开至少一个实施例提供的显示装置的示意图;
图6A为本公开至少一个实施例提供的显示装置中的显示面板的示意图;
图6B为本公开至少一个实施例提供的显示面板中的显示阵列层的示意图;
图6C为本公开至少一个实施例提供的显示面板中的显示阵列层的另一示意图;
图6D为本公开至少一个实施例提供的显示面板的平面示意图;
图7为本公开至少一个实施例提供的显示装置的剖面示意图;
图8A为本公开至少一个实施例提供的显示装置的纹路模拟图;
图8B为本公开至少一个实施例提供的显示装置相对于图8A中纹路的光线仿真结果图;
图9A为本公开至少一个实施例提供的显示装置中显示面板和准直部件的一种贴合方式的示意图;
图9B为本公开至少一个实施例提供的显示装置中显示面板和准直部件的另一种贴合方式的示意图;以及
图10为本公开的至少一个实施例提供的一种纹路图像获取装置的制备方法的流程图。
具体实施方式
为使本公开实施例的目的、技术方案和优点更加清楚,下面将结合附图,对本公开实施例的技术方案进行清楚、完整地描述。显然,所描述的实施例是本公开的一部分实施例,而不是全部的实施例。基于所描述的本公开的实施例,本领域普通技术人员在无需创造性劳动的前提下所获得的所有其他实施例,都属于本公开保护的范围。
除非另外定义,本公开使用的技术术语或者科学术语应当为本公开所属 领域内具有一般技能的人士所理解的通常意义。本公开中使用的“第一”、“第二”以及类似的词语并不表示任何顺序、数量或者重要性,而只是用来区分不同的组成部分。同样,“一个”、“一”或者“该”等类似词语也不表示数量限制,而是表示存在至少一个。“包括”或者“包含”等类似的词语意指出现该词前面的元件或者物件涵盖出现在该词后面列举的元件或者物件及其等同,而不排除其他元件或者物件。“连接”或者“相连”等类似的词语并非限定于物理的或者机械的连接,而是可以包括电性的连接,不管是直接的还是间接的。“上”、“下”、“左”、“右”等仅用于表示相对位置关系,当被描述对象的绝对位置改变后,则该相对位置关系也可能相应地改变。
对于纹路识别,例如,在光学指纹识别过程中,当手指与图像传感器的距离较大时,由于经过手指反射后的光线存在散射,会造成图像传感器获取的图像模糊的问题,进而导致根据图像传感器接收到的光线识别出的指纹信息不准确。因此,需要对入射到图像传感器上的光线进行准直。在相关技术中的指纹识别技术方案中,进行光线准直的方式包括光纤准直以及通孔准直,以便精确获取指纹的谷与脊信息。但是,相关技术中的光纤准直器件整体偏厚(例如,厚度为400微米以上),而通常的应用场景需要较薄的准直结构(例如,厚度为100微米以下)。另外,通孔准直结构的制作工艺难度较大,成本较高,且良率较低。
因此,本公开的实施例提出了一种具有准直功能并且结构更轻薄的纹路图像获取装置,可以在光学纹路识别过程中提高所识别的纹路信息的准确性。
本公开的至少一个实施例提供了一种纹路图像获取装置,该纹路图像获取装置包括准直部件和图像传感器。准直部件堆叠于图像传感器之上,准直部件包括依次层叠设置的透镜阵列、第一光阑层和第二光阑层,第二光阑层与图像传感器相邻。透镜阵列配置为使光线汇聚并入射到第一光阑层;第一光阑层配置为使入射到第一光阑层的光线透过并入射到第二光阑层,以及对可透过第一光阑层的光线的角度进行限制;第二光阑层配置为使入射到第二光阑层的光线透过(例如透过后可入射到图像传感器),以及对可透过所述第二光阑层的光线的角度进行限制;该图像传感器配置为感测入射到图像传感器上的光线以用于获取纹路图像。
本公开的至少一个实施例还提供了一种显示装置,该显示装置包括显示 面板和与该显示面板层叠设置的上述纹路图像获取装置。该显示面板包括显示侧和与该显示侧相对的背侧,纹路图像获取装置位于显示面板的背侧,准直部件位于显示面板与图像传感器之间。纹路图像获取装置配置为对被显示面板的显示侧的对象纹路反射且通过显示面板的光线进行检测。
本公开的至少一个实施例还提供一种准直部件,该准直部件包括依次层叠设置的透镜阵列、第一光阑层和第二光阑层。透镜阵列配置为使光线汇聚并入射到第一光阑层。第一光阑层配置为使入射到第一光阑层的光线透过并入射到第二光阑层,以及对可透过第一光阑层的光线的角度进行限制。第二光阑层配置为使入射到第二光阑层的光线透过,以及对可透过第二光阑层的光线的角度进行限制。
本公开的实施例提供的准直部件、纹路图像获取装置和显示装置具有准直功能,并且可以减少光线串扰,从而提高所获取的图像信息的准确性,而且还具有结构轻薄的优势,从而满足制作工艺和应用场景的需求。
下面结合附图对本公开的实施例及其示例进行详细说明。
图1A为本公开至少一个实施例提供的纹路图像获取装置10的示意框图,图1B为本公开至少一个实施例提供的纹路图像获取装置10的剖面图。
例如,本公开的至少一个实施例提供了一种纹路图像获取装置10,如图1A和图1B所示,该纹路图像获取装置10包括准直部件110和图像传感器120。例如,准直部件110堆叠于图像传感器120之上,位于图像传感器120的光入射路径中;相对于图像传感器120,该准直部件110包括由上到下依次层叠设置的透镜阵列111、第一光阑层112和第二光阑层113。该第二光阑层113与图像传感器120相邻。例如,透镜阵列111配置为使光线汇聚并入射到第一光阑层112;第一光阑层112配置为使光线透过并入射到第二光阑层113,以及对可透过该第一光阑层112的光线的角度进行限制;第二光阑层113配置为使光线透过并入射到图像传感器120,以及对可透过该第二光阑层113的光线的角度进行限制;图像传感器120配置为感测照射到该图像传感器120上的光线以用于获取纹路图像。该图像传感器120可以为适当类型的图像传感器,例如CMOS或CCD类型的图像传感器,例如,可以是硅基图像传感器,例如制备方式可以为在单晶硅晶片制备感光像素阵列然后切割得到,或者可以玻璃基板图像传感器,例如制备方式可以为在玻璃基板制 备感光像素阵列然后切割得到。
需要说明的是,根据实际情况,该纹路图像获取装置10还可以包括其他结构或功能层。对于纹路图像获取装置10的上述结构或功能层,层与层之间可以直接接触,也可以设置其他膜层或部件。在上述纹路图像获取装置10中,包括透镜阵列111、第一光阑层112和第二光阑层113的准直部件110可以将入射到透镜阵列111上的特定角度范围(即收光角,一般为小角度)的光线近似于准直化的筛选出,使筛选后的光线到达下方图像传感器120,以便于图像传感器120检测光线的强度。例如,当光线经过透镜阵列111后,光线被汇聚,并到达第一光阑层112。第一光阑层112则可以限制透过该第一光阑层112的光线的角度(例如,该角度为0-X,X的取值范围例如为5.7°至8.5°),满足角度要求的光线可以透过第一光阑层112,而角度过大的光线则无法透过第一光阑层112,从而实现准直功能。经过第一光阑层112之后,光线再到达第二光阑层113,第二光阑层113除了可以限制透过该第二光阑层113的光线的角度以外,还可以遮挡透镜阵列111中相邻的透镜单元所汇聚的光线,以防止光线串扰。因此,本公开的实施例提供的纹路图像获取装置10不仅具有准直功能,还可以有效防止光线串扰,从而提高所获取的图像信息(例如,皮肤的谷与脊的信息)的准确性,进而可以改善由于光线发生散射而造成所获取图像模糊的问题。
需要说明的是,本公开的实施例中,准直部件110也可以为单独的产品,该准直部件110可以应用到任意需要对光线进行准直的场景中,本公开的实施例对此不作限制。
图2A为本公开至少一个实施例提供的准直部件210的结构示意图,图2B为本公开至少一个实施例提供的准直部件210中的透镜阵列210的平面示意图,图2C为本公开至少一个实施例提供的准直部件210中的透镜阵列210的另一平面示意图,以及图2D为本公开至少一个实施例提供的准直部件210的另一结构示意图。如图2A所示,例如,在本公开至少一个实施例中,准直部件210包括依次层叠设置的透镜阵列211、第一光阑层212和第二光阑层213。透镜阵列211包括阵列排布的多个透镜单元2110,第一光阑层212包括阵列排布的多个第一通孔2120,该多个透镜单元2110与多个第一通孔2120在垂直于第一光阑层212的方向上一一对应重叠。
需要说明的是,多个透镜单元2110与多个第一通孔2120在垂直于第一光阑层212的方向上一一对应重叠,可以是指第一通孔2120在第一光阑层212上的正投影与所对应的透镜单元2110在第一光阑层212上的正投影重叠,包括部分重叠或者完全重叠,例如,第一通孔2120的正投影位于所对应的透镜单元2110的正投影内等。本公开的实施例对此不作限制,可根据实际需求调整。
例如,在本公开的至少一个实施例中,准直部件210的厚度(例如,各个膜层的厚度之和,或者该准直部件210的总厚度)可以小于100微米,例如为1~50微米,进一步,例如为15~40微米。每个透镜单元2110的口径大小可以为1-40微米,例如,15~30微米;每个第一通孔2120的开口口径可以为1-20微米,例如,8~15微米。相关技术中的光纤准直器件的厚度在400微米以上,因此,本公开的实施例提供的准直部件210具有结构轻薄的优势,满足市场上对准直部件的厚度的要求,适用于多种应用场景。
例如,在本公开的至少一个实施例中,如图2A-2C所示,透镜阵列211可以包括阵列排布的透镜单元2110,根据实际应用需求,该透镜单元2110可以是微透镜,微透镜的直径可以是毫米、微米或者纳米量级尺寸。例如,该微透镜可以实现为球面透镜、非球面透镜或二元光学透镜(例如菲涅尔透镜)等。例如,如图2A-2B所示,透镜单元2110的靠近第一光阑层212的一侧为平面。例如,在每个透镜单元2110为球面透镜或非球面透镜的情况下,透镜单元2110的远离第一光阑层212的一侧为曲面。例如,透镜单元2110具有曲面的一侧为入光侧,透镜单元2110具有平面的一侧为出光侧。例如,透镜阵列211的材料可以包括丙烯酸类的紫外(UV)固化型树脂、环氧类的UV固化型树脂、热固化型树脂等,也可以为其他适用的材料,可以通过适当的成型工艺制备,例如模压成型、光刻成型等,本公开的实施例对此不作限制。
例如,在本公开的至少一个实施例中,如图2A所示,准直部件210中的第二光阑层213包括阵列排布的多个第二通孔2130,多个第二通孔2130与多个第一通孔2120在垂直于第一光阑层212的方向上一一对应重叠。
需要说明的是,多个第二通孔2130与多个第一通孔2120在垂直于第一光阑层212的方向上一一对应重叠,可以是指第一通孔2120在第一光阑层212上的正投影与所对应的第二通孔2130在第一光阑层212上的正投影重 叠,包括部分重叠或者完全重叠,例如,第二通孔2130的正投影位于所对应的第一通孔2120的正投影内等,本公开的实施例对此不作限制,可根据实际需求调整。
需要说明的是,本公开的实施例中,第一通孔2120和第二通孔2130的开口形状不受限制,可以为例如圆形、正方形、六边形等规则形状,也可以为任意的非规则形状。透镜单元2110在平行于透镜阵列211的平面上的截面形状也可以为例如圆形、正方形、六边形等规则形状,或者为任意的非规则形状,本公开的实施例对此不作限制。例如,在一个示例中,第一通孔2120和第二通孔2130的开口形状均为圆形,透镜单元2110在平行于透镜阵列211的平面上的截面形状也为圆形,从而可以使汇聚到图像传感器120上的光线在各个方向上的光强均一性较好,进一步提高所获取的图像信息的准确性。
例如,如图2B和图2C所示,本公开的实施例中,多个透镜单元2110可以采用行列对齐的方式阵列排布(如图2B所示),也可以根据特定形状(例如六边形)进行阵列排布(如图2C所示),本公开的实施例对此不作限制。例如,在一些实施例中,当透镜阵列211包括以六边形方式阵列排布(如图2C所示)的透镜单元2110时,可以使得汇聚到图像传感器120上的光线更加充足,进一步提高所获取的图像信息的准确性。
例如,在本公开的至少一个实施例中,第一通孔2120具有第一开口中心,且该第一开口中心位于第一通孔2120所对应的透镜单元2110的焦点处。需要说明的是,当第一通孔2120为规则形状时,第一通孔2120的第一开口中心可以包括其几何中心,当第一通孔2120为非规则形状时,第一通孔2120的第一开口中心可以为第一通孔2120的形心或面积中心,这可以根据实际需求而定,本公开的实施例对此不作限制。例如,在一些示例中,当第一通孔2120为非规则形状时,第一通孔2120的第一开口中心可以位于透镜单元2110的主光轴上,该主光轴穿过透镜单元2110的光学中心。
例如,在本公开的至少一个实施例中,第一通孔2120具有第一开口中心,第二通孔2130具有第二开口中心,该第一通孔2120的第一开口中心与该第一通孔2120所对应的第二通孔2130的第二开口中心在垂直于第一光阑层212的方向上重叠。
需要说明的是,第一通孔2120的第一开口中心与所对应的第二通孔2130的第二开口中心在垂直于第一光阑层212的方向上重叠,可以是指,例如, 第一通孔2120的第一开口中心与所对应的第二通孔2130的第二开口中心同轴,也就是说,第一通孔2120在第一光阑层212上的正投影的中心与所对应的第二通孔2130在第一光阑层212上的正投影的中心重叠。本公开的实施例对此不作限制,可根据实际需求进行调整。
例如,在本公开的至少一个实施例中,第一通孔2120的开口口径大于或等于所对应的第二通孔2130的开口口径。例如,第一通孔2120的开口口径可以为1-20微米,则第二通孔的开口口径的取值也可以为1-20微米,有例如二者可以为8~15微米。本公开的实施例对此不作限制,可根据实际情况进行调整。
例如,在本公开的至少一个实施例中,多个透镜单元2110、多个第一通孔2120和多个第二通孔2130一一对应,例如,可以将彼此对应的一个透镜单元2110、一个第一通孔2120和一个第二通孔2130看作一组准直单元,用以从接收到的光线中筛选出一束准直光线。因此,在该实施例中,多个透镜单元2110、多个第一通孔2120和多个第二通孔2130的数量相等。
例如,在本公开的至少一个实施例中,第一光阑层212和第二光阑层213的材料可以为遮光材料,例如,吸收可见光(波长大约在380-780nm)的材料,诸如铬、氧化铬、黑色树脂等;例如,散射率较低的金属薄膜等,本公开的实施例对此不作具体限制。
例如,在本公开的至少一个实施例中,如图2A所示,准直部件210还包括透明的第一填充层214以及透明的第二填充层215。第一填充层214位于透镜阵列211和第一光阑层214之间,第二填充层215位于第一光阑层214和第二光阑层215之间。该第一填充层214和第二填充层215的材料可以为,例如,透光率90%以上的透明材料等,可以为无机材料或有机材料,本公开的实施例对此不作具体限制。
例如,在本公开的至少一个实施例中,根据实际需求,在制作工艺方面,准直部件210中包括的透镜阵列211、第一光阑层212、第二光阑层213、第一填充层214和第二填充层215可以直接在图像传感器(图2A未示出)的上方进行图层固化,从而实现准直部件210的一体化。需要说明的是,本公开的实施例对制作工艺的具体步骤不作限制,可以根据实际情况调整。例如,在本公开的至少一个实施例中,透镜阵列211、第一光阑层212、第二光阑层213、第一填充层214和第二填充层215的材料可以各不相同,但需要保证它 们的折射率能够基本一致或彼此之间的差值较小。
例如,在本公开的至少一个实施例中,如图2D所示,准直部件210可以包括依次层叠设置的透镜阵列211、第一光阑层212、第二光阑层213和第三光阑层230。该第三光阑层230设置在第二光阑层213和图像传感器120之间,且配置为使光线透过并入射到图像传感器120以及对可透过该第三光阑层230的光线的角度进行限制。第三光阑层230的具体结构设计可以参考上述第一光阑层212和第二光阑层213的结构,在此不作赘述。
图3A示出了本公开一个示例性实施例提供的准直部件310的具体结构示意图,图3B示出了本公开一个示例性实施例提供的准直部件310的设计仿真图,图3C和3D示出了本公开至少一个实施例提供的包括单层光阑的准直部件的光线仿真结果图,以及图3E示出了本公开至少一个实施例提供的包括双层光阑的准直部件的光线仿真结果图。
例如,如图3A所示,该准直部件310包括依次层叠设置的透镜阵列311、第一填充层314、第一光阑层312、第二填充层315和第二光阑层313。透镜阵列311包括阵列排布的多个透镜单元3110,第一光阑层312包括阵列排布的多个第一通孔3120,第二光阑层313包括阵列排布的多个第二通孔3130。例如,如图3A所示,透镜单元3110例如可以是微透镜3110,图3A中D表示微透镜3110的口径大小,h表示微透镜3110的拱高,R表示微透镜3110的曲率半径,f表示微透镜3110的焦距,H表示准直部件310的厚度,h1表示第一填充层314的厚度,h2表示第二填充层315的厚度,d1表示第一通孔3120的口径大小,d2表示第二通孔3130的口径大小,角度θ表示第一通孔3120的收光角(例如,θ的范围大约在5.7°至8.5°之间)。例如,上述结构参数之间的关系满足下列公式:
R=D 2/(8*h)+h/2;
d=2*(h+h1)*tanθ;
h+h1=f=R/(n-1),其中n为微透镜3110的折射率;
H=h+h1+h2,其中,由于第一光阑层312和第二光阑层313的厚度不超过1微米,甚至远小于1微米,因此可以忽略不计。
需要说明的是,虽然本公开的上述公式示出了准直部件310中结构参数之间的具体的关系公式,但是在实际操作中,可以不严格遵循上述具体公式,例如,可以根据实际应用需求,或者根据多次实验结果或仿真结果来细微调 整各个结构参数,以达到最佳的效果。因此,本公开的实施例对结构参数的具体数值不作严格限制。
例如,在本公开的至少一个实施例中,根据实际应用需求,例如,准直部件310的厚度可以小于100微米,例如为约1-50微米;例如,透镜单元3110的口径大小D可以为约1-25微米;例如,第一光阑层312的第一通孔3120的开口口径d1可以为约1-15微米。
例如,在本公开的至少一个实施例中,第一填充层314的厚度h1可以大约等于透镜单元3110(例如,微透镜)的焦距f。
例如,在本公开的至少一个实施例中,第一填充层314的厚度h1和第二填充层315的厚度h2的比值范围可以是4~6,例如,4.2~5.3。
例如,在本公开的至少一个实施例中,第一光阑层312的第一通孔3120的开口口径d1可以大于所对应的第二通孔3130的开口口径d2,也可以等于所对应的第二通孔3130的开口口径d2。
例如,在本公开的至少一个实施例中,第一光阑层312的第一通孔3120具有第一开口中心,且该第一开口中心位于该第一通孔3120所对应的透镜单元3110的焦点处。需要说明的是,当第一通孔3120为规则形状时,第一通孔3120的第一开口中心可以包括其几何中心,当第一通孔3120为非规则形状时,第一通孔3120的第一开口中心可以为第一通孔3120的形心或面积中心等,这可以根据实际需求而定,本公开的实施例对此不作限制。例如,在一些示例中,当第一通孔3120为非规则形状时,第一通孔3120的第一开口中心可以位于透镜单元3110的主光轴上,该主光轴穿过透镜单元3110的光学中心。
例如,在本公开的至少一个实施例提供的纹路图像获取装置中,准直部件310的厚度为20微米。
例如,在本公开的示例性实施例提供的准直部件310中,根据上述公式以及基于实验结果多次调整各个结构参数之后,得到一组示例性的结构参数,各个结构参数设计为:
D=10μm,
h=2μm,
R=7.25μm,
d1=d2=3.05μm,
h1=14μm,
h2=3μm。
由于第一光阑层312和第二光阑层313的厚度可以忽略不计,所以得到准直部件310的厚度H=19μm。在实际操作中,考虑到第一光阑层312和第二光阑层313的厚度,该准直部件310的厚度H大约为20微米。相应地,可以得到该准直部件310的设计仿真图,如图3B所示。通过图3B可以看出,该准直部件310中的微透镜3110可以汇聚接收到的光线,第一通孔3120则可以仅允许小角度范围(例如,θ=5.7°~8.5°)的光线(例如,图3B中的实线1)通过,第二通孔3130可以遮挡来自相邻透镜单元3110的光线(例如,图3B中的实线2)。因此本公开的实施例提供的准直部件310不仅具有准直功能,还可以有效防止来自相邻透镜单元的光线串扰。而且,本公开至少一个实施例提供的准直部件310的厚度H只有大约19-21微米,远远小于通常的准直部件的一般厚度,满足了市场对准直结构轻薄化的要求。
为了清楚地说明第二光阑层313的防止光线串扰的功能,可以参考图3C-3E。例如,如图3C和3D所示,在准直部件仅包括单层光阑(例如,第一光阑层331)的情况下,该单层光阑331的每个光阑(例如,图3C中的光阑A或光阑B)虽然能从例如,透镜阵列330接收所对应的透镜单元的光,并且限制透过该光阑的光线的角度,但是不可避免地允许来自相邻透镜单元(例如,图3C中的C点或D点处)的光线透过。为了有效遮挡来自相邻透镜单元的光线,可以在准直部件中引入第二层光阑332,如图3E所示,在准直部件包括双层光阑(例如,第一光阑层331和第二光阑层332)的情况下,有效防止了来自相邻透镜单元的光线透过第二层光阑332。因此,本公开的至少一个实施例提供的纹路图像获取装置可以有效提高所获取的图像信息的准确性。图4A为本公开至少一个实施例提供的纹路图像获取装置40的另一示意框图,图4B为本公开至少一个实施例提供的纹路图像获取装置40的另一剖面图,以及图4C为本公开至少一个实施例提供的纹路图像获取装置40的图像传感器420的平面图。
例如,如图4A所示,本公开至少一个实施例提供的纹路图像获取装置40包括准直部件410和图像传感器420,该准直部件410堆叠于图像传感器420之上。
例如,在本公开至少一个实施例中,图像传感器420可以包括阵列排布 的多个成像像素421,如图4C所示。例如,每个成像像素421可以采用适当的实现方式,例如可以包括光电二极管和开关晶体管,光电二极管可以将照射到其上的光信号转换为电信号,开关晶体管可以与光电二极管电连接,以控制光电二极管是否处于采集光信号的状态以及采集光信号的时间。例如,光电二极管的类型和设置方式可以根据实际应用需求进行设定,本公开的实施例对此不做具体限定。例如,光电二极管可以是PIN结型光敏二极管或光敏晶体管等,由此可以提升光电二极管的响应速度。例如,成像像素421配置为对准直部件410透射的光进行检测,从而获取图像信息。
需要说明的是,本公开的实施例对图像传感器420中包括的成像像素421的具体数量不作限制,例如,一个成像像素421可以与上述一组准直单元对应设置,也可以一个成像像素421与上述多组准直单元对应设置,成像像素的数量可以根据实际应用需求来设定。
例如,在本公开至少一个实施例中,如图4B所示,图像传感器420包括成像像素421,成像像素421可以包括开关晶体管4201和光电二极管4202。开关晶体管4201、光电二极管4202均设置在衬底基板4211上。例如,开关晶体管4201包括有源层423、栅极422、第一电极4241和第二电极4242。开关晶体管4201的栅极422设置在衬底基板4211上,且被第一绝缘层4291覆盖。开关晶体管4201的有源层423设置在第一绝缘层4291上,且被第二绝缘层4292覆盖。例如,开关晶体管4201的第一电极4241和第二电极4242对称设置,两者可以互换,其中之一为源极,另一个为漏极。例如,光电二极管4202包括PIN结426、第一电极425和第二电极427。例如,第二电极427采用透明导电材料制备,例如氧化铟锡(Indium Tin Oxide,ITO)。通过采用合适的结构(例如杯状结构),第二电极427可以进一步复用为聚光层,从而使入射到第二电极427上的光线尽可能多地入射到PIN结426。第二电极427经由金属层428电连接到偏置电压线(图中未示出),第一电极425电连接到开关晶体管4201的第二电极4242。例如,金属层428可以覆盖到开关晶体管4201的有源层423,以防止开关晶体管4201中的有源层423受到外界光照影响,避免产生漏电流。需要说明的是,本公开的实施例中,开关晶体管4201可以为顶栅型薄膜晶体管,也可以为底栅型薄膜晶体管,图4B示出了其为底栅型薄膜晶体管的情形。当上述晶体管为顶栅型薄膜晶体管时,相应调整光电二极管4202与开关晶体管4201的位置,或者调整工艺制 程即可。
需要说明的是,本公开的上述实施例中,光电二极管4202为PIN结型,但是本公开的实施例不限于此,光电二极管4202也可以为PN结型,只要光电二极管4202的感光灵敏度满足使用需求即可。该图像传感器420中的各个膜层的材料可以采用常规材料,各个膜层的说明可参考常规设计,此处不再详述。
需要说明的是,图4B所示的层叠结构中,各个膜层的相对位置关系和尺寸比例关系仅是示意性的,而非限制性的。在实际应用中,光电二极管4202的PIN结426的感光面在第一光阑层411上的正投影可以覆盖多个第一通孔4110,光电二极管4202的PIN结426的感光面在第二光阑层412上的正投影可以覆盖多个第二通孔4120,这可以根据实际需求设置,本公开的实施例对此不作限制。
如图4A和4B所示,本公开至少一个实施例提供的纹路图像获取装置40中的准直部件410和图像传感器420之间的贴合方式可以根据实际应用需求进行设定,本公开的实施例对此不作具体限定。例如,准直部件410和图像传感器420之间可以通过黏合的方式进行贴合。例如,在准直部件410和图像传感器420之间通过黏合的方式进行贴合的情况下,纹路图像获取装置40还可以包括中间介质层430。中间介质层430设置在准直部件410和图像传感器420之间,中间介质层430具有黏性,并可以黏合准直部件410和图像传感器420。例如,中间介质层430可以是光学胶(Optically Clear Adhesive,OCA)。例如,中间介质层430还可以具有调节准直部件410和图像传感器420在垂直于图像传感器420方向上的间距的作用。例如,可以通过调节光学胶的厚度来调节准直部件410和图像传感器420在垂直于图像传感器420方向上的间距。又例如,中间介质层430还可以包括透明基板(图中未示出),透明基板两侧可以均设置光学胶(例如,透明基板可以插入光学胶中),由此可以提升中间介质层430的距离调节能力。
需要说明的是,根据实际情况,该纹路图像获取装置40还可以包括其他结构或功能层。层与层之间可以直接接触,也可以设置其他膜层或部件。
图5为本公开至少一个实施例提供的显示装置50的示意图。例如,如图5所示,本公开的至少一个实施例提供一种显示装置50,该显示装置包括显示面板500和与该显示面板500层叠设置的纹路图像获取装置501。例如, 纹路图像获取装置501可以为上述的纹路图像获取装置10/40。例如,该显示面板500包括显示侧和与该显示侧相对的背侧,该纹路图像获取装置501位于显示面板500的背侧,纹路图像获取装置501中的准直部件510位于该显示面板500与图像传感器520之间。该纹路图像获取装置501配置为对被显示面板500的显示侧的对象纹路(例如,指纹、掌纹等)反射且通过该显示面板500的光线进行检测。
例如,在本公开至少一个实施例中,显示面板500可以是有机发光二极管显示面板或量子点发光二极管显示面板,也可以是液晶显示面板,本公开的实施例对此不作具体限制。
图6A为本公开至少一个实施例提供的显示装置60中的显示面板610的示意图,图6B为本公开至少一个实施例提供的显示面板610中的显示阵列层603的示意图,图6C为本公开至少一个实施例提供的显示面板610中的显示阵列层603的另一示意图,以及图6D为本公开至少一个实施例提供的显示面板610的平面示意图。
例如,在本公开至少一个实施例中,如图6A所示,该显示装置60包括显示面板610和与该显示面板610层叠设置的准直部件620和图像传感器630。例如,如图6A所示,在本公开至少一个实施例中,该显示面板610包括顶层膜601、薄膜封装602、显示阵列层603、柔性衬底604以及底层膜605等结构。
例如,在本公开至少一个实施例中,底层膜605为位于其上的其他结构和功能层提供保护与支撑功能,例如为强度较大的塑料基板或玻璃基板。柔性衬底604用于提供缓冲,例如为聚酰亚胺(PI)、聚丙烯(PP)、聚碳酸酯(PC)等材料制备的柔性衬底。
例如,在本公开至少一个实施例中,显示阵列层603形成在柔性衬底604上,例如,显示阵列层603可以包括多个排列为预定阵列的像素单元621以及用于提供电信号(包括扫描信号、数据信号、检测信号等)的信号线(包括栅线、数据线、检测线等),每个像素单元621包括发光器件(例如,OLED器件)以及用于驱动该发光器件(OLED器件)发光的像素驱动电路等。像素单元621发出光线101,光线101用于显示以及例如作为屏下指纹检测的入射光。例如,像素单元621中的发光器件工作时发出的光线101被显示面板的显示侧的用户皮肤(手指或手掌)650反射,以进一步用于用户皮肤纹 路图像的获取。
例如,在本公开至少一个实施例中,薄膜封装602覆盖在显示阵列层603以防止外界的水汽进入到显示阵列层603之中导致其老化或劣化,其可以为多层薄膜封装,例如包括层叠的无机绝缘层和有机绝缘层等。
例如,在本公开至少一个实施例中,顶层膜601可以为盖板,例如为玻璃或塑料制备的基板或厚膜,用于提供支撑与保护,例如供用户进行触控、按压等操作。
需要说明的是,根据需要,该显示面板610还可以包括其他结构或功能层。例如,显示面板610可以包括触控结构以用于实现触控功能。该触控结构例如可以内置于显示阵列层603之中,或者形成在顶层膜601上等,例如可以为电容式、电阻式等。
例如,在本公开至少一个实施例中,如图6A-6C所示,显示阵列层603可以包括阵列排布的多个像素单元621。例如,每个像素单元621可以包括发光二极管622,发光二极管622可以为有机发光二极管、量子点发光二极管,也可以为无机发光二极管622,本公开的实施例对此不做具体限定。
例如,在本公开至少一个实施例中,如图6B和图6C所示,每个发光二极管622可以包括阴极625、发光层624和阳极623。例如,发光二极管622可以包括反射电极,由此可以将发光二极管622发出的光反射至显示侧(例如,顶层膜601),进而可以提升发光二级管622的发光效率,并可以降低对图像传感器620的影响。例如,对于顶发射型有机发光二极管622,阳极623可以配置包括反射电极,例如阳极623可以由铟锡氧化物(ITO)和金属层的叠层制成。例如,如图6B所示,反射电极可以具有开口629,由此可以使得从显示侧反射的光穿过。例如,开口629可以使得从顶层膜601反射的光穿过;又例如,开口629还可以使得从手指皮肤650反射的光穿过。并且,在开口629对应的部分不设置阳极或阴极,从而该部分不会发光,避免对于获取图像信息造成影响。
例如,在本公开至少一个实施例中,如图6C所示,发光二极管622还可以包括反射层628,该反射层628可以将发光二极管622发出的光反射至显示侧,此时发光二极管622的阴极625和阳极623可以使用透明导电材料制成或者使用厚度较薄的金属制成,以使得阴极625和阳极623具有透明或半透明性质(例如,阴极625和阳极623可以分别由透明导电氧化物材料和 透明合金材料制成)。例如,反射层628可以是多层介质膜,该多层介质膜可以反射发光二极管622出射的光线,该多层介质膜例如可以反射可见光。例如,根据实际应用需求,反射层628可以具有开口(图6C中未示出),由此可以使得从显示侧反射的光穿过。又例如,根据实际应用需求,反射层628可以具有透射特定波段(例如,红外波段)或波长(例如,976nm)的光线的能力。
例如,在本公开至少一个实施例中,当显示面板610为液晶显示面板时,可以通过在液晶显示面板包括的背光源中(即,靠近准直部件620的一侧)设置一个或多个开口,使得从显示侧反射的光通过开口到达准直部件620,再透过准直部件620到达图像传感器630,从而实现光线检测。
例如,在本公开至少一个实施例中,如图6D所示,例如,显示面板610可以包括显示区域611和周边区域612;显示面板100的周边区域612可以是位于显示面板610边缘的区域;显示面板100的显示区域611可以是被周边区域612包围的区域。结合图4C及其相关详细描述,图像传感器420可以包括阵列排布的多个成像像素421,在一些实施例中,例如,多个成像像素421可以在显示面板610的显示区域611中阵列排布。例如,成像像素421可以对从皮肤(例如,手指、手掌等)650反射的且经准直部件620透射的光进行检测,从而可以获取图像信息。
例如,结合图6A-6D的上述详细描述,像素单元621出射的光线在入射到显示面板610的显示侧(例如,顶层膜601)的界面时,一部分的光将被该界面反射;另一部分的光将从显示面板610出射,从显示面板610出射的光线在入射到皮肤(例如,手指,手掌等)上的情况下会发生反射,并且部分被反射的光线会返回显示面板610,这些被反射的光将被用于成像,以用于分析、识别。例如,下面以图7示出的显示装置70为例,对本公开的至少一个实施例提供的显示装置70的纹路图像获取过程做具体说明。
例如,在本公开的至少一个实施例中,如图7所示,手指的指纹701包括由凸起(即指纹脊)与凹陷(指纹谷)等所形成的纹路;在手指与显示侧基板(例如盖板)711接触时,指纹脊与显示侧基板711接触,指纹谷不与显示侧基板711接触;指纹谷与显示侧基板711之间的空气与显示侧基板711形成第一界面712,指纹脊(即,皮肤)与显示侧基板711形成第二界面713。例如,由于相比于空气的折射率(例如,空气的折射率为1),皮肤的折射率 (例如,皮肤的折射率为1.55)更接近于显示侧基板711的折射率(例如,对于通常的玻璃盖板而言,折射率为1.5),因此像素单元621出射的光线在第一界面712的反射光的强度大于在第二界面713反射光的强度。例如,在正入射的情况下,界面的反射率Ref=((nr-1)/(nr+1)) 2,nr为相对折射率。例如,在空气的折射率、皮肤的折射率和显示侧基板711的折射率分别为1、1.55和1.5的情况下,第一界面712的相对折射率nr为1.5,第二界面713的相对折射率nr约为0.97,因此第一界面712的反射光的强度与第二界面713的反射光的强度的比值为(4%)/(0.023%)=173.9。
例如,在本公开的至少一个实施例中,相比于通过界面反射(例如,第一界面712)返回到显示面板710并经过准直部件720入射到图像传感器730上的光线的强度,通过手指皮肤反射返回到显示面板710并经过准直部件720入射到图像传感器730上的光线的强度较弱(例如,甚至可以忽略不计),因此,在图像传感器730上获取的指纹图像中,对应于指纹谷的图像区域的亮度较强,对应于指纹脊的图像区域的亮度较弱。
图8A为本公开至少一个实施例提供的纹路模拟图,和图8B为采用了图3A中的准直部件310的显示装置相对于图8A中纹路的光线仿真结果图。图8A为模拟皮肤纹路的四个竖条纹,将其放在显示面板的显示侧,且位于显示侧的显示区域中;图8B则为图像传感器上检测到的光通量图。对比图8A和图8B可以看出,对应于皮肤脊(即,图8A中竖条纹)的图像区域的光通量较弱,而对应于皮肤谷(即,图8A中相邻竖条纹之间的空白区)的图像区域的光通量较强,从而根据图像传感器检测得到的光强不同,可以精确得到皮肤的谷与脊的精确信息,以用于实现纹路图像获取。
图9A为本公开至少一个实施例提供的一种显示装置90中包括的显示面板900和准直部件910之间的一种贴合方式的示意图;图9B为本公开至少一个实施例提供的一种显示装置90中包括的显示面板900和准直部件910之间的另一种贴合方式的示意图。
例如,在本公开至少一个实施例中,如图9A所示,显示面板900和准直部件910之间可以选用框贴的方式进行贴合。例如,如图9A所示,该显示装置90可以包括胶框901,胶框901设置于显示面板900和准直部件910之间。例如,胶框901可以设置在显示面板900的背侧(例如,图6A中示出的底层膜605的下方)和准直部件910的第一填充层914之间,且位于显 示面板900的周边区域(如图6中612)。例如,胶框901可以黏合显示面板900和准直部件910,并可以提供支撑功能。例如,对于图9A所示的贴合方式(框贴),透镜阵列911中包括的多个透镜单元(例如,微透镜)9110周边的介质可以为空气,因此透镜单元(例如,微透镜)9110周边的介质的折射率为1;此时,透镜单元(例如,微透镜)9110可以选用折射率为约1.4-1.6(例如,1.48-1.55)的材料制成;透镜单元(例如,微透镜)9110例如可以使用PMMA(聚甲基丙烯酸甲酯)材料制成,此时透镜单元(例如,微透镜)9110的折射率为约1.4918,本公开的实施例包括但不限于此。
例如,在本公开至少一个实施例中,如图9B所示,显示面板900和准直部件910之间还可以选用面贴的方式进行贴合。例如,在采用面贴方式的情况下,该显示面板900还可以包括填充介质902,填充介质902设置在显示面板900和准直部件910之间且填充于透镜单元(例如,微透镜)9110的四周。例如,对于图9B所示的贴合方式,透镜单元(例如,微透镜)9110可以选用折射率为约1.7-1.9(例如,1.75-1.85)的材料制成,透镜单元(例如,微透镜)9110的折射率例如可以为约1.8,填充介质902可以选用折射率为约1.2-1.4(例如,1.25-1.35)的材料制成,此时,填充介质902的折射率例如可以为约1.3,本公开的实施例包括但不限于此。
需要说明的是,准直部件910与显示面板900的贴合方式可以根据实际应用需求进行选择,本公开的实施例对此不做具体限定。
图10为本公开的至少一个实施例提供的一种纹路图像获取装置的制备方法的流程图。
如图10所示,该纹路图像获取装置的制备方法包括以下步骤:
步骤S101:提供图像传感器;
步骤S102:在图像传感器上提供准直部件。
例如,在本公开的至少一个实施例中,在图像传感器上提供准直部件包括:可以将准直部件中包括的透镜阵列、第一光阑层、第二光阑层、第一填充层和第二填充层直接在图像传感器的上方进行图层固化,从而实现准直部件的一体化。例如,可以在图像传感器上形成第二光阑层,并使第二光阑层固化。然后在第二光阑层上形成第二填充层,并使第二填充层固化。接着在第二填充层上形成第一光阑层,并使第一光阑层固化。然后在第一光阑层上形成第一填充层,并使第一填充层固化。最后在第一填充层上形成透镜阵列, 并使透镜阵列固化。由此,可以使各个膜层一体形成,从而可以实现准直部件的一体化。
例如,在本公开的至少一个实施例中,将准直部件进行图层固化可以包括但不限于涂布、溅射、前烘、曝光、显影、后烘等工艺,需要说明的是,本公开的实施例对制备工艺的具体步骤不作限制,可以根据实际情况调整。例如,在本公开的至少一个实施例中,使用黑色树脂或者反射率比较低的金属(例如,钼),采用曝光方式制作出第一光阑层和第二光阑层。例如,在本公开的至少一个实施例中,采用逐步固化方式(最终成膜厚度例如为0.05微米)来制作第一填充层和第二填充层。例如,在本公开的至少一个实施例中,采用曝光方式形成透镜阵列的排布图案,然后通过后烘使透镜单元成型(例如,中间厚两边薄)。需要说明的是,准直部件中透镜阵列、第一光阑层、第二光阑层、第一填充层和第二填充层的材料可以各不相同,但需要保证它们的折射率能够基本一致。
对于本公开,还有以下几点需要说明:
(1)本公开实施例附图只涉及到与本公开实施例涉及到的结构,其他结构可参考通常设计。
(2)在不冲突的情况下,本公开的实施例及实施例中的特征可以相互组合以得到新的实施例。
以上所述,仅为本公开的具体实施方式,但本公开的保护范围并不局限于此,本公开的保护范围应以所述权利要求的保护范围为准。

Claims (28)

  1. 一种纹路图像获取装置,包括:准直部件和图像传感器,
    其中,所述准直部件堆叠于所述图像传感器之上,所述准直部件包括依次层叠设置的透镜阵列、第一光阑层和第二光阑层,所述第二光阑层与所述图像传感器相邻,
    所述透镜阵列配置为使光线汇聚并入射到所述第一光阑层,
    所述第一光阑层配置为使入射到所述第一光阑层的光线透过并入射到所述第二光阑层,以及对可透过所述第一光阑层的光线的角度进行限制,
    所述第二光阑层配置为使入射到所述第二光阑层的光线透过,以及对可透过所述第二光阑层的光线的角度进行限制,
    所述图像传感器配置为感测入射到所述图像传感器上的光线以用于获取纹路图像。
  2. 根据权利要求1所述的纹路图像获取装置,其中,
    所述透镜阵列包括阵列排布的多个透镜单元,
    所述第一光阑层包括阵列排布的多个第一通孔,
    所述多个透镜单元与所述多个第一通孔在垂直于所述第一光阑层的方向上一一对应重叠。
  3. 根据权利要求2所述的纹路图像获取装置,其中,至少一个所述透镜单元的曲率半径R、口径D和拱高h之间满足关系式:R=D 2/(8*h)+h/2。
  4. 根据权利要求2或3所述的纹路图像获取装置,其中,
    所述准直部件的厚度小于100微米,
    至少一个所述透镜单元的口径为1-40微米,
    至少一个所述第一通孔的开口口径为1-20微米。
  5. 根据权利要求2-4中任一项所述的纹路图像获取装置,其中,
    所述第二光阑层包括阵列排布的多个第二通孔,
    所述多个第二通孔与所述多个第一通孔在垂直于所述第一光阑层的方向上一一对应重叠。
  6. 根据权利要求5所述的纹路图像获取装置,其中,所述第一通孔具有第一开口中心,所述第二通孔具有第二开口中心,
    所述第一通孔的第一开口中心与所述第一通孔所对应的第二通孔的第二开口中心在垂直于所述图像传感器的方向上重叠。
  7. 根据权利要求5或6所述的纹路图像获取装置,其中,所述第一通孔的开口口径大于或等于所述第一通孔所对应的第二通孔的开口口径。
  8. 根据权利要求5-7中任一项所述的纹路图像获取装置,其中,所述多个透镜单元的数量、所述多个第一通孔的数量和所述多个第二通孔的数量相等。
  9. 根据权利要求2-5中任一项所述的纹路图像获取装置,其中,所述第一通孔具有第一开口中心,且所述第一开口中心位于所述第一通孔所对应的透镜单元的焦点处。
  10. 根据权利要求2-9中任一项所述的纹路图像获取装置,其中,所述准直部件还包括:
    透明的第一填充层,位于所述透镜阵列和所述第一光阑层之间;
    透明的第二填充层,位于所述第一光阑层和所述第二光阑层之间。
  11. 根据权利要求10所述的纹路图像获取装置,其中,所述第一填充层的厚度等于所述透镜单元的焦距。
  12. 根据权利要求10或11所述的纹路图像获取装置,其中,所述第一填充层的厚度和所述第二填充层的厚度的比值范围为4~6。
  13. 根据权利要求1-12中任一项所述的纹路图像获取装置,还包括中间介质;其中,
    所述中间介质设置在所述图像传感器和所述准直部件之间,
    所述中间介质配置为结合所述图像传感器和所述准直部件,且调节所述图像传感器和所述准直部件在垂直于所述图像传感器的方向上的间距。
  14. 根据权利要求2-12中任一项所述的纹路图像获取装置,其中,所述多个透镜单元以六边形阵列排布。
  15. 根据权利要求1-14中任一项所述的纹路图像获取装置,还包括第三光阑层,
    其中,所述第三光阑层位于所述第二光阑层和所述图像传感器之间,且配置为使光线透过并入射到所述图像传感器以及对可透过所述第三光阑层的光线的角度进行限制。
  16. 一种显示装置,包括:
    显示面板,以及
    与所述显示面板层叠设置的如权利要求1-15中任一项所述的纹路图像获取装置;
    其中,所述显示面板包括显示侧和与所述显示侧相对的背侧,所述纹路图像获取装置位于所述显示面板的背侧,所述准直部件位于所述显示面板与所述图像传感器之间,
    所述纹路图像获取装置配置为对被所述显示面板的显示侧的对象纹路反射且通过所述显示面板的光线进行检测。
  17. 根据权利要求16所述的显示装置,其中,所述显示面板包括显示区域和周边区域,
    所述准直部件和所述图像传感器设置在所述显示面板的显示区域中且位于所述显示面板的背侧。
  18. 根据权利要求17所述的显示装置,还包括胶框,
    其中,所述胶框设置在所述显示面板与所述准直部件之间,所述胶框设置在所述显示面板的周边区域中且位于所述显示面板的背侧。
  19. 根据权利要求17所述的显示装置,还包括填充介质,
    其中,所述填充介质设置在所述显示面板与所述准直部件之间,且填充在所述显示面板的背侧与所述准直部件的透镜阵列之间的间隙处。
  20. 根据权利要求16-19中任一项所述的显示装置,其中,所述显示面板包括有机发光二极管显示面板、量子点发光二极管显示面板或液晶显示面板。
  21. 一种准直部件,包括依次层叠设置的透镜阵列、第一光阑层和第二光阑层,
    其中,所述透镜阵列配置为使光线汇聚并入射到所述第一光阑层,
    所述第一光阑层配置为使入射到所述第一光阑层的光线透过并入射到所述第二光阑层,以及对可透过所述第一光阑层的光线的角度进行限制,
    所述第二光阑层配置为使入射到所述第二光阑层的光线透过,以及对可透过所述第二光阑层的光线的角度进行限制。
  22. 根据权利要求21所述的准直部件,其中,
    所述透镜阵列包括阵列排布的多个透镜单元,
    所述第一光阑层包括阵列排布的多个第一通孔,
    所述多个透镜单元与所述多个第一通孔在垂直于所述第一光阑层的方向上一一对应重叠。
  23. 根据权利要求22所述的准直部件,其中,至少一个所述透镜单元的曲率半径R、口径D和拱高h之间满足关系式:R=D 2/(8*h)+h/2。
  24. 根据权利要求22或23所述的准直部件,其中,
    所述第二光阑层包括阵列排布的多个第二通孔,
    所述多个第二通孔与所述多个第一通孔在垂直于所述第一光阑层的方向上一一对应重叠。
  25. 根据权利要求24所述的准直部件,其中,所述第一通孔的开口口径大于或等于所述第一通孔所对应的第二通孔的开口口径。
  26. 根据权利要求22-25中任一项所述的准直部件,其中,所述准直部件还包括:
    透明的第一填充层,位于所述透镜阵列和所述第一光阑层之间;
    透明的第二填充层,位于所述第一光阑层和所述第二光阑层之间。
  27. 根据权利要求26所述的准直部件,其中,所述第一填充层的厚度等于所述透镜单元的焦距。
  28. 根据权利要求26或27所述的准直部件,其中,所述第一填充层的厚度和所述第二填充层的厚度的比值范围为4~6。
PCT/CN2019/103794 2019-08-30 2019-08-30 纹路图像获取装置、显示装置及准直部件 WO2021035714A1 (zh)

Priority Applications (6)

Application Number Priority Date Filing Date Title
US16/767,254 US11668862B2 (en) 2019-08-30 2019-08-30 Texture image acquiring device, display device, and collimator
EP19932219.9A EP4024269A4 (en) 2019-08-30 2019-08-30 TEXTURE IMAGE DETECTION DEVICE, DISPLAY DEVICE AND COLLIMATION COMPONENT
PCT/CN2019/103794 WO2021035714A1 (zh) 2019-08-30 2019-08-30 纹路图像获取装置、显示装置及准直部件
CN201980001564.8A CN112753036A (zh) 2019-08-30 2019-08-30 纹路图像获取装置、显示装置及准直部件
JP2020572406A JP7425002B2 (ja) 2019-08-30 2019-08-30 紋様画像取得装置、表示装置及びコリメート部材
US18/301,485 US20230251404A1 (en) 2019-08-30 2023-04-17 Texture image acquiring device, display device, and collimator

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/103794 WO2021035714A1 (zh) 2019-08-30 2019-08-30 纹路图像获取装置、显示装置及准直部件

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US16/767,254 A-371-Of-International US11668862B2 (en) 2019-08-30 2019-08-30 Texture image acquiring device, display device, and collimator
US18/301,485 Continuation US20230251404A1 (en) 2019-08-30 2023-04-17 Texture image acquiring device, display device, and collimator

Publications (1)

Publication Number Publication Date
WO2021035714A1 true WO2021035714A1 (zh) 2021-03-04

Family

ID=74684476

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/103794 WO2021035714A1 (zh) 2019-08-30 2019-08-30 纹路图像获取装置、显示装置及准直部件

Country Status (5)

Country Link
US (2) US11668862B2 (zh)
EP (1) EP4024269A4 (zh)
JP (1) JP7425002B2 (zh)
CN (1) CN112753036A (zh)
WO (1) WO2021035714A1 (zh)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113486864B (zh) 2018-12-13 2023-09-12 深圳市汇顶科技股份有限公司 指纹识别装置、方法和电子设备
US20220301337A1 (en) * 2019-09-22 2022-09-22 Yu-Kuo Cheng Fingerprint sensing module and electronic device
CN110796097B (zh) * 2019-10-30 2022-08-30 Oppo广东移动通信有限公司 屏下光学指纹模组、显示屏组件和电子设备
CN110875370B (zh) * 2019-11-29 2022-08-26 昆山国显光电有限公司 显示面板及显示装置
KR20210142799A (ko) * 2020-05-18 2021-11-26 삼성디스플레이 주식회사 지문 센서, 그의 제조 방법, 및 그를 포함한 표시 장치
CN112466921A (zh) * 2020-11-25 2021-03-09 京东方科技集团股份有限公司 一种显示装置
JP2022189532A (ja) * 2021-06-11 2022-12-22 株式会社ジャパンディスプレイ 電子機器

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102959470A (zh) * 2010-06-28 2013-03-06 株式会社V技术 曝光装置
WO2017129126A1 (en) * 2016-01-31 2017-08-03 Shenzhen GOODIX Technology Co., Ltd. Under-screen optical sensor module for on-screen fingerprint sensing
CN107515435A (zh) * 2017-09-11 2017-12-26 京东方科技集团股份有限公司 显示面板和显示装置
WO2018024118A1 (zh) * 2016-08-04 2018-02-08 京东方科技集团股份有限公司 表面纹理信息采集器、表面纹理采集方法及显示装置
CN108681703A (zh) * 2018-05-14 2018-10-19 京东方科技集团股份有限公司 用于指纹识别的装置、模组、设备及系统
CN110023956A (zh) * 2019-02-28 2019-07-16 深圳市汇顶科技股份有限公司 光学图像采集单元、光学图像采集系统和电子设备
CN110073264A (zh) * 2016-12-14 2019-07-30 三星电子株式会社 光学透镜组件和使用该光学透镜组件形成图像的方法
CN110088768A (zh) * 2019-03-12 2019-08-02 深圳市汇顶科技股份有限公司 屏下指纹识别装置和电子设备
CN110084090A (zh) * 2019-01-22 2019-08-02 东莞市美光达光学科技有限公司 一种光学式屏下指纹识别模组

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005175417A (ja) 2003-07-28 2005-06-30 Ricoh Co Ltd 発光素子アレイ、光書込ユニットおよび画像形成装置
JP4508619B2 (ja) * 2003-12-03 2010-07-21 キヤノン株式会社 固体撮像装置の製造方法
KR20080036195A (ko) * 2005-07-13 2008-04-25 코닌클리즈케 필립스 일렉트로닉스 엔.브이. 조명 시스템
DE102006004802B4 (de) * 2006-01-23 2008-09-25 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Bilderfassungssystem und Verfahren zur Herstellung mindestens eines Bilderfassungssystems
JP4792077B2 (ja) * 2008-12-17 2011-10-12 株式会社東芝 三次元画像表示装置の製造装置及び三次元画像表示装置の製造方法
JP2015114407A (ja) 2013-12-10 2015-06-22 セイコーエプソン株式会社 電気光学装置、電気光学装置の製造方法、及び電子機器
WO2016051766A1 (ja) 2014-09-30 2016-04-07 株式会社クラレ 拡散板及び拡散板の製造方法
KR102473305B1 (ko) 2015-12-30 2022-12-01 엘지디스플레이 주식회사 백라이트 유닛과 이를 포함하는 모바일 전자 기기
KR102501123B1 (ko) * 2016-01-04 2023-02-17 엘지전자 주식회사 이동 단말기
US10388689B2 (en) 2017-02-13 2019-08-20 Egis Technology Inc. Fingerprint identification system, sensing method and manufacturing method
CN106886767B (zh) * 2017-02-23 2019-07-05 京东方科技集团股份有限公司 一种光学指纹识别装置和显示面板
US10796128B2 (en) * 2017-12-12 2020-10-06 Fingerprint Cards Ab Optical sensor with ambient light filter
CN109791612B (zh) * 2018-12-26 2023-08-08 深圳市汇顶科技股份有限公司 指纹识别装置和电子设备
EP3706036B1 (en) 2019-01-22 2021-12-22 Shenzhen Goodix Technology Co., Ltd. Fingerprint recognition apparatus and electronic device
CN109858465A (zh) * 2019-02-27 2019-06-07 昆山国显光电有限公司 用于指纹识别的显示装置
CN208848221U (zh) 2019-04-10 2019-05-10 深圳市汇顶科技股份有限公司 光学指纹识别装置和电子设备
WO2020243928A1 (zh) * 2019-06-05 2020-12-10 深圳市汇顶科技股份有限公司 光学图像采集单元、光学图像采集系统、显示屏和电子设备
TW202109355A (zh) * 2019-08-30 2021-03-01 美三科技有限公司 具有偏移微透鏡群組的光學感測器及使用其之光學感測系統

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102959470A (zh) * 2010-06-28 2013-03-06 株式会社V技术 曝光装置
WO2017129126A1 (en) * 2016-01-31 2017-08-03 Shenzhen GOODIX Technology Co., Ltd. Under-screen optical sensor module for on-screen fingerprint sensing
WO2018024118A1 (zh) * 2016-08-04 2018-02-08 京东方科技集团股份有限公司 表面纹理信息采集器、表面纹理采集方法及显示装置
CN110073264A (zh) * 2016-12-14 2019-07-30 三星电子株式会社 光学透镜组件和使用该光学透镜组件形成图像的方法
CN107515435A (zh) * 2017-09-11 2017-12-26 京东方科技集团股份有限公司 显示面板和显示装置
CN108681703A (zh) * 2018-05-14 2018-10-19 京东方科技集团股份有限公司 用于指纹识别的装置、模组、设备及系统
CN110084090A (zh) * 2019-01-22 2019-08-02 东莞市美光达光学科技有限公司 一种光学式屏下指纹识别模组
CN110023956A (zh) * 2019-02-28 2019-07-16 深圳市汇顶科技股份有限公司 光学图像采集单元、光学图像采集系统和电子设备
CN110088768A (zh) * 2019-03-12 2019-08-02 深圳市汇顶科技股份有限公司 屏下指纹识别装置和电子设备

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4024269A4 *

Also Published As

Publication number Publication date
EP4024269A4 (en) 2022-08-31
EP4024269A1 (en) 2022-07-06
JP2022552922A (ja) 2022-12-21
US11668862B2 (en) 2023-06-06
CN112753036A (zh) 2021-05-04
US20210409581A1 (en) 2021-12-30
US20230251404A1 (en) 2023-08-10
JP7425002B2 (ja) 2024-01-30

Similar Documents

Publication Publication Date Title
WO2021035714A1 (zh) 纹路图像获取装置、显示装置及准直部件
WO2020206894A1 (zh) 光学指纹识别装置和电子设备
WO2020181493A1 (zh) 屏下指纹识别装置和电子设备
CN107515435B (zh) 显示面板和显示装置
WO2020151159A1 (zh) 指纹识别的装置和电子设备
WO2018006475A1 (zh) 光学指纹传感器模组
WO2020119289A1 (zh) 指纹识别装置和电子设备
WO2017118029A1 (zh) 光学指纹传感器模组
WO2020220305A1 (zh) 屏下指纹识别装置和电子设备
CN209560569U (zh) 感光模组及取像装置
WO2021077265A1 (zh) 指纹识别装置和电子设备
WO2021051737A1 (zh) 指纹识别装置、背光模组、液晶显示屏和电子设备
US11308725B2 (en) Touch display device
WO2020248286A1 (zh) 光学指纹装置和电子设备
WO2021077406A1 (zh) 指纹识别装置和电子设备
WO2020191601A1 (zh) 指纹识别的装置和电子设备
CN112771484B (zh) 纹路识别装置及其制作方法
WO2021077368A1 (zh) 指纹识别装置和电子设备
WO2021213037A1 (zh) 显示面板、显示装置和显示面板的制作方法
CN112084996A (zh) 屏下指纹识别装置及电子设备
WO2019218752A1 (zh) 纹路识别组件及其制备方法、显示装置
WO2022016547A1 (zh) 指纹识别装置和电子设备
WO2022041145A1 (zh) 指纹识别装置及电子设备
CN213659463U (zh) 指纹识别装置和电子设备
CN210983441U (zh) 取像装置

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2020572406

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19932219

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2019932219

Country of ref document: EP

Effective date: 20220330