WO2022038033A1 - Système d'acquisition d'images (Image acquisition system) - Google Patents

Système d'acquisition d'images (Image acquisition system)

Info

Publication number
WO2022038033A1
WO2022038033A1 (PCT/EP2021/072466)
Authority
WO
WIPO (PCT)
Prior art keywords
polarizer
radiation
sensor
pixels
light source
Prior art date
Application number
PCT/EP2021/072466
Other languages
English (en)
French (fr)
Inventor
Noémie BALLOT
Benjamin BOUTHINON
Delphine DESCLOUX
Jérôme MICHALLON
Original Assignee
Isorg
Priority date
Filing date
Publication date
Application filed by Isorg filed Critical Isorg
Publication of WO2022038033A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/32User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • G06V10/12Details of acquisition arrangements; Constructional details thereof
    • G06V10/14Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/141Control of illumination
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • G06V10/12Details of acquisition arrangements; Constructional details thereof
    • G06V10/14Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/143Sensing or illuminating at different wavelengths
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/13Sensors therefor
    • G06V40/1318Sensors therefor using electro-optical elements or layers, e.g. electroluminescent sensing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/1347Preprocessing; Feature extraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/14Vascular patterns

Definitions

  • This description relates generally to image acquisition systems and, more particularly, to fingerprint acquisition systems, for example, on mobile phones.
  • Fingerprint acquisition systems are used in many fields in order, for example, to secure devices, secure buildings, control access or control the identity of individuals.
  • One embodiment overcomes all or part of the drawbacks of known fingerprint acquisition systems.
  • One embodiment provides an image acquisition system comprising: a single sensor with organic photodetectors, at least a first light source adapted to emit only a first radiation in the red and/or near infrared, and a processing unit; and in which: the sensor and the first source are carried by a single frame, the sensor is adapted to capture images, the processing unit is configured to extract from said images information relating to the veins and to the fingerprint of a finger, and the first light source is adapted to emit the first radiation in a direction opposite to the frame.
  • the first radiation comprises only wavelengths comprised in the band from 600 nm to 1100 nm, preferably between 630 nm and 940 nm.
  • the system comprises a first polarizer, the first radiation from the first light source then passes through the first polarizer before reaching the sensor.
  • the first polarizer covers the first light source on the side opposite the frame.
  • the first light source comprises one or more light-emitting diodes.
  • each photodetector delimits a pixel, each pixel being substantially square, the length of the sides of each pixel preferably being of the order of 50 ⁇ m.
  • the system comprises, in order: the frame; the sensor; an optical filter; a second light source; and a protective layer.
  • the second light source is adapted to emit a second radiation in the visible.
  • the optical filter is an angular filter.
  • the system comprises a second polarizer covering the sensor on the side of the sensor opposite the frame.
  • the second polarizer is located: between the optical filter and the second light source; or between the optical filter and the sensor.
  • the second polarizer is formed on a substrate.
  • the first polarizer and the second polarizer are linear polarizers.
  • the first polarizer carries out a polarization in a first direction
  • the second polarizer comprises a first part carrying out a polarization in the first direction and a second part carrying out a polarization in a second direction, perpendicular to the first direction.
  • the pixels of the sensor are surmounted by the first parts or the second parts with an alternation, so that two pixels surmounted by a first part are separated by one to nineteen pixels, preferably nine pixels, surmounted by a second part.
  • the first part of the second polarizer has an area equal to at least the area of four pixels.
  • the repetition pitch of the first parts is equal to the dimension of three to twenty pixels, preferably ten pixels.
  • the sensor is separated from the first light source by one or more walls opaque to the first radiation.
  • the first source is covered with an angular filter.
  • the sensor picks up first rays, from the first radiation, polarized in the first direction by the first polarizer then in the second direction by the second polarizer, and second rays, from the first radiation, polarized in the first direction by the first polarizer then in the first direction by the second polarizer.
  • Figure 1 shows, by a partial and schematic block diagram, an example of an image acquisition system;
  • Figure 2 shows, in a partial and schematic sectional view, an embodiment of an image acquisition device;
  • Figure 3 represents, by a block diagram, an example of implementation of an image acquisition method;
  • Figure 4 shows, in a partial and schematic sectional view, another embodiment of an image acquisition device;
  • Figure 5 shows, in a partial and schematic sectional view, part of the image acquisition device shown in Figure 2;
  • Figure 6 represents, by a partial and schematic top view, an embodiment of the part of the image acquisition device represented in Figure 5;
  • Figure 7 represents, by a partial and schematic top view, another embodiment of the part of the image acquisition device represented in Figure 5;
  • Figure 8 represents, by a block diagram, another example of implementation of an image acquisition method;
  • Figure 9 shows, in a partial and schematic sectional view, a structure with a polarizer;
  • Figure 10 shows, in a partial and schematic sectional view, yet another embodiment of an image acquisition device;
  • Figure 11 shows, in a partial and schematic sectional view, yet another embodiment of an image acquisition device.
  • a layer or a film is said to be opaque to radiation when the transmittance of the radiation through the layer or the film is less than 10%.
  • a layer or a film is said to be transparent to radiation when the transmittance of the radiation through the layer or the film is greater than 10%, preferably greater than 50%.
  • all the elements of the optical system which are opaque to radiation have a transmittance which is less than half, preferably less than a fifth, more preferably less than a tenth, of the lowest transmittance of the elements of the optical system transparent to said radiation (see the sketch after these definitions).
  • the term "useful radiation" is used to refer to the electromagnetic radiation passing through the optical system in operation.
  • optical element of micrometric size is used to refer to an optical element formed on one face of a support whose maximum dimension, measured parallel to said face, is greater than 1 μm and less than 1 mm.
  • each optical element of micrometric size corresponds to a lens of micrometric size, or microlens, composed of two diopters.
  • these embodiments can also be implemented with other types of micrometric-sized optical elements, each micrometric-sized optical element being able to correspond, for example, to a micrometric-sized Fresnel lens, to a micrometric-sized gradient-index lens or to a micrometric-sized diffraction grating.
  • "visible light" refers to electromagnetic radiation whose wavelength is between 400 nm and 700 nm; within this range, red light is electromagnetic radiation whose wavelength is between 600 nm and 700 nm.
  • infrared radiation is electromagnetic radiation with a wavelength between 700 nm and 1 mm. Within the infrared, a distinction is made in particular for near-infrared radiation, whose wavelength is between 700 nm and 1.1 μm.
  • the refractive index of a medium is defined as being the refractive index of the material constituting the medium for the range of wavelengths of the useful radiation picked up by the image sensor.
  • the refractive index is considered to be substantially constant over the range of wavelengths of the useful radiation, for example equal to the average of the refractive index over the range of wavelengths of the useful radiation picked up by the image sensor.
  • the expressions “about”, “approximately”, “substantially”, and “of the order of” mean to within 10%, preferably within 5%.
  • the expressions "all the elements" and "each element" mean between 95% and 100% of the elements. Unless otherwise specified, the expression "it comprises only the elements" means that it comprises at least 90% of the elements, preferably at least 95% of the elements.
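  • As an illustration of the transmittance conventions above, the following minimal sketch (in Python; the layer names and transmittance values are hypothetical) classifies layers as opaque or transparent to a given radiation, and checks the stricter relative condition on the opaque elements:

```python
def classify_layer(transmittance: float) -> str:
    """Classify a layer for a given radiation from its transmittance:
    opaque below 10%, transparent above 10% (preferably above 50%)."""
    if transmittance < 0.10:
        return "opaque"
    return "transparent (preferred)" if transmittance > 0.50 else "transparent"

# Hypothetical transmittance values for the first radiation 19
layers = {"cover glass 25": 0.92, "walls 97": 0.02, "adhesive 47": 0.88}
for name, t in layers.items():
    print(f"{name}: {classify_layer(t)}")

# Stricter condition: every opaque element transmits less than half
# of the weakest transparent element of the optical system.
weakest_transparent = min(t for t in layers.values() if t >= 0.10)
assert all(t < weakest_transparent / 2 for t in layers.values() if t < 0.10)
```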
  • Figure 1 shows a block diagram, partial and schematic, an example of an image acquisition system.
  • the image acquisition system illustrated in FIG. 1 comprises: an image acquisition device 1 (DEVICE); and a processing unit 2 (PU).
  • the processing unit 2 preferably comprises means for processing the signals supplied by the device 1, not shown in Figure 1.
  • the processing unit 2 comprises, for example, a microprocessor.
  • the device 1 and the processing unit 2 are preferably connected by a connection 3.
  • the device 1 and the processing unit 2 are, for example, integrated in the same circuit.
  • Figure 2 shows, in a partial and schematic sectional view, an embodiment of a device 11.
  • FIG. 2 illustrates an embodiment of the device 1 illustrated in FIG. 1.
  • the device 11 can, for example, be coupled to a processing unit and be integrated into the image acquisition system as described in relation to Figure 1.
  • the device 11 is, for example, part of a mobile phone.
  • the device 11 illustrated in FIG. 2 comprises: an optical sensor 13; at least a first light source 17; and a frame 23 (Mid frame).
  • the optical sensor 13 is suitable for capturing images. Information relating to the veins and to the fingerprint of a finger 15 located above the upper face of the device 11 is extracted from the images captured by the optical sensor 13. This extraction is carried out by the processing unit illustrated in Figure 1.
  • the first source 17 and the optical sensor 13 are carried by the frame 23.
  • the lower faces (in the orientation of Figure 2) of the first source 17 and of the optical sensor 13 rest on the frame 23 and are in contact with it.
  • the first light source 17 is adapted to emit only a first radiation 19 in the red and/or the infrared.
  • the first source 17 emits the radiation 19 mainly in a direction opposite to the frame 23.
  • the first source 17 emits the radiation 19 mainly in the direction of a screen 21.
  • the screen 21 is preferably located on the side of the upper face of the optical sensor 13 (in the orientation of FIG. 2) and comprises a lower face substantially parallel to the upper face of the frame 23.
  • the device 11 comprises several first light sources 17 located around the optical sensor 13.
  • Each first source 17 is composed of one or more light-emitting diodes (LEDs).
  • the radiation 19 corresponds to light radiation in the red and/or the infrared, that is to say radiation all of whose wavelengths are between 600 nm and 1700 nm. More preferably, the radiation 19 corresponds to light radiation all of whose wavelengths are between 600 nm and 1100 nm, and even more preferably between 630 nm and 940 nm.
  • the device 11 comprises a succession of superimposed layers of different natures.
  • the device 11 comprises, from top to bottom, in the orientation of the figure: a layer 25, preferably a protective window (Cover glass); the screen 21, consisting of a panel of one or more organic light-emitting diodes (OLED panel) or of a liquid crystal display (LCD); the optical sensor 13, composed of an angular filter 29 and an image sensor 31 (Image sensor); the frame 23; a battery 33 (Battery); and a rear shell 35 (Back cover).
  • the screen 21 constitutes a second light source adapted to emit only a second radiation 27 in the visible.
  • the embodiments of the devices and other stacks of FIGS. 2, 4 to 7 and 10 to 11 are represented in space according to a direct orthogonal XYZ frame, the Y axis of the XYZ frame being orthogonal to the upper face of the image sensor 31.
  • the radiation 27 corresponds to light radiation in the visible, that is to say radiation at least one wavelength of which is between 400 nm and 800 nm.
  • preferably, the radiation 27 corresponds to light radiation at least one wavelength of which is between 400 nm and 650 nm.
  • more preferably, the radiation 27 corresponds to radiation all of whose wavelengths are between 400 nm and 700 nm, and even more preferably between 460 nm and 600 nm; the difference between this band condition and the one on radiation 19 is sketched below.
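  • The band conditions differ for the two radiations: the first radiation 19 must contain only wavelengths inside its band, whereas the visible radiation 27 need only contain at least one wavelength inside its band. A minimal sketch of this distinction (in Python; the LED spectra are hypothetical):

```python
def emits_only_in(band_nm, spectrum_nm):
    """Condition used for the first radiation 19: every wavelength
    of the spectrum lies inside the band."""
    lo, hi = band_nm
    return all(lo <= w <= hi for w in spectrum_nm)

def emits_at_least_one_in(band_nm, spectrum_nm):
    """Condition used for the second radiation 27: at least one
    wavelength of the spectrum lies inside the band."""
    lo, hi = band_nm
    return any(lo <= w <= hi for w in spectrum_nm)

ir_led = [630, 850, 940]   # hypothetical red/near-infrared LED lines, in nm
oled = [460, 530, 600]     # hypothetical OLED emission lines, in nm
assert emits_only_in((600, 1100), ir_led)       # radiation 19, preferred band
assert emits_at_least_one_in((400, 800), oled)  # radiation 27
```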
  • the layer 25 is, for example, tempered glass.
  • Layer 25 preferably has a thickness between 25 ⁇ m and 2 mm, preferably between 600 ⁇ m and 1.5 mm.
  • layer 25 is partially transparent, preferably transparent, to radiation 19 and 27.
  • layer 25 is, in a preferred embodiment, transparent to wavelengths between 400 nm and 1700 nm. More preferably, layer 25 is transparent to wavelengths between 400 nm and 1100 nm.
  • the screen 21 is composed of a single OLED.
  • the OLED panel can, depending on the application, be pixelated or not.
  • the OLED panel preferably rests on a substrate.
  • screen 21 has a thickness of between 200 ⁇ m and 400 ⁇ m.
  • the device 11 does not include a layer 25.
  • the screen 21 is therefore, for example, oriented so that the substrate, on which the OLED panel rests, faces the finger 15. The substrate is thus used as a protective layer.
  • the frame 23 is preferably made of plastic or a metallic material and has, for example, a thickness of between 1 mm and 4 mm.
  • in a telephone application, preferably a smartphone, the electronic components of the telephone are, for example, mounted on the frame 23.
  • the rear shell 35 is preferably made of plastic, glass or a metallic material and has, for example, a thickness of between 200 ⁇ m and 2 mm.
  • the angular filter 29 is adapted to filter the incident radiation according to the incidence of the radiation with respect to the Y axis.
  • the angular filter 29 is more particularly adapted so that the image sensor 31 receives only the rays whose respective incidences relative to the Y axis are less than a maximum incidence of less than 45°, preferably less than 30°, more preferably less than 10°, even more preferably less than 4°.
  • the angular filter 29 is adapted to block the rays of the incident radiation whose respective incidences with respect to the Y axis are greater than the maximum incidence.
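  • The behaviour of the angular filter 29 can be summarized by a small geometric test (a sketch only; the ray direction is given as an XYZ vector, and the default maximum incidence of 4° corresponds to the most preferred embodiment):

```python
import math

def passes_angular_filter(ray_dir, max_incidence_deg=4.0):
    """Return True if a ray reaches the image sensor 31, i.e. if its
    incidence relative to the Y axis is below the maximum incidence
    (less than 45, 30, 10 or 4 degrees depending on the embodiment)."""
    x, y, z = ray_dir
    norm = math.sqrt(x * x + y * y + z * z)
    incidence = math.degrees(math.acos(abs(y) / norm))  # angle to the Y axis
    return incidence < max_incidence_deg

assert passes_angular_filter((0.0, 1.0, 0.0))      # normal ray: passes
assert not passes_angular_filter((0.5, 1.0, 0.0))  # about 26.6 degrees: blocked
```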
  • the image sensor 31 is a sensor with photodetectors or organic photodiodes (OPD, Organic Photodiode).
  • the photodiodes are, for example, integrated on a substrate with CMOS (Complementary Metal Oxide Semiconductor) transistors or on a substrate with thin-film transistors (TFT).
  • the photodiodes of the image sensor 31 comprise, for example, a mixture of organic semiconductor polymers such as poly(3-hexylthiophene) or poly(3-hexylthiophene-2,5-diyl), known under the name P3HT, mixed with methyl [6,6]-phenyl-C61-butanoate (an N-type semiconductor), known under the name PCBM.
  • the photodiodes of the image sensor 31 comprise, for example, small molecules, that is to say molecules having molar masses of less than 500 g/mol, preferably less than 200 g/mol.
  • the photodiodes can be inorganic photodiodes, for example, made from amorphous silicon or crystalline silicon.
  • the photodiodes are composed of quantum dots.
  • the substrate is, for example, silicon, preferably monocrystalline silicon.
  • the channel, source and drain regions of the TFT transistors are, for example, made of amorphous silicon (a-Si), of indium gallium zinc oxide (IGZO) or of low-temperature polycrystalline silicon (LTPS).
  • the image sensor 31 is preferably sensitive to the wavelengths of the radiations 19 and 27.
  • the optical sensor 13 further comprises, from top to bottom, in the orientation of Figure 2: the angular filter 29; the image sensor 31; a shielding layer 37, preferably a copper layer (Copper sheet), having a thickness of between 25 μm and 200 μm; a heat dissipation layer 39 (Heat dissipation sheet), for example made of graphite, having a thickness of between 25 μm and 200 μm; and a shock absorbing layer 41 (Cushion), for example made of a polymeric foam, having a thickness of between 25 μm and 200 μm.
  • the layers 37, 39 and 41 of the optical sensor 13 are optional layers.
  • Adhesive layers making it possible to fix all or part of the aforementioned layers together may be present.
  • the optical sensor 13 comprises: an adhesive layer 43 (Adhesive) between the image sensor 31 and the shielding layer 37; an adhesive layer 45 (Adhesive) between the shock absorbing layer 41 and the frame 23; and an adhesive layer 47, optically partially transparent, preferably optically transparent and non-diffusing (Optically Clear Adhesive, OCA), between the angular filter 29 and the image sensor 31.
  • a layer is considered to be "non-scattering" if the rays of a beam passing through it emerge in a scattering cone having a half angle (at the apex of the cone) of less than approximately 3.5 degrees.
  • the layers 43, 45 and 47 are, for example, made from acrylic components. They preferably each have a thickness of between 12.5 ⁇ m and 50 ⁇ m, for example, of the order of 25 ⁇ m.
  • the device 11 comprises an optional adhesive layer 49, optically partially transparent, for example transparent, and non-diffusing (OCA), located between the layer 25 and the screen 21.
  • the layer 49 is preferably partially transparent, for example transparent, and non-scattering for the wavelengths of radiation 19 and 27.
  • layer 49 is, for example, partially transparent, for example transparent, for the wavelengths between 400 nm and 1700 nm, and preferably between 400 nm and 1100 nm.
  • the screen 21, the layer 25, the layer 49, the frame 23 and the rear shell 35 are called outer layers and all have substantially the same area (in the XZ plane).
  • the optical sensor 13 and the layers which compose it have an area (in the XZ plane) equal to or less than that of the outer layers, that is to say the layer 25, the layer 49, the screen 21, the frame 23 and the rear shell 35.
  • the optical sensor 13 has an area (in the XZ plane) smaller than that of the external layers.
  • the optical sensor 13 is centered on the upper face of the frame 23.
  • the device 11 comprises peripheral stacks 51 arranged on each side of the optical sensor 13.
  • the peripheral stacks 51 extend vertically (along the Y axis) from the underside of the screen 21 towards the frame 23.
  • the peripheral stacks 51 extend horizontally (along the Z axis) from the peripheral end of the device 11 to a limit close to the optical sensor 13, so as to leave a space 61 between the optical sensor 13 and the peripheral stacks 51.
  • the space 61 is preferably extended by a space 63 located between the upper face of the optical sensor 13 and the screen 21.
  • the spaces 61 and 63 are, for example, filled with air, partial vacuum or resin.
  • the space 63 is, for example, filled with a partially transparent material, preferably transparent, to the radiations 19 and 27.
  • the material filling the space 63 has a refractive index lower by at least 0.1, preferably by at least 0.15, than the refractive index of the material of the optical sensor 13.
  • the material of the optical sensor 13 has a refractive index between 1.5 and 1.6.
  • the space 63 is, for example, a layer of air, a layer of resin or a layer of low refractive index adhesive, having a refractive index in the range of 1.34 to 1.45.
  • the term "low" refractive index means a refractive index less than 1.5, and "high" refractive index means a refractive index greater than or equal to 1.5.
  • the low refractive index resin and the low refractive index adhesive are for example colored and thus allow wavelength filtering.
  • the peripheral stacks 51 have, for example, a function of supporting the screen 21 on the frame 23.
  • the peripheral stacks 51 preferably comprise layers similar, by their functions, to the layers making up the optical sensor 13.
  • the peripheral stacks 51 thus comprise: an adhesive layer 59 (Adhesive), similar to the adhesive layer 43, having a thickness of between 25 ⁇ m and 200 ⁇ m; a shielding layer 53 (Copper sheet), optional and similar to shielding layer 37, having a thickness of between 25 ⁇ m and 200 ⁇ m; a heat dissipation sheet 55, optional and similar to layer 39, having a thickness of between 25 ⁇ m and 200 ⁇ m; and an optional shock absorption (Cushion) layer 57 similar to layer 41, having a thickness between 25 ⁇ m and 200 ⁇ m.
  • the adhesive layer 59 is, according to the embodiment illustrated in FIG. 2, on the lower face of the screen 21 and in contact with the lower face of the screen 21.
  • the device 11 comprises housings 65 (Housing) located on the upper face of the frame 23.
  • the housings 65 can be aligned with the stacks 51 and have an area (in the XZ plane) similar to the area of the stacks 51 or an area different from the area of the stacks 51.
  • the housings 65 facilitate, for example, the assembly of the screen 21 on the frame 23 and in particular the positioning of the optical sensor 13.
  • each stack 51 is preferably in contact with the underlying housing 65.
  • the frame 23 comprises, for example, a printed circuit board, not shown in FIG. 2, on the face opposite that on which the image sensor 31 is arranged (underside of the frame 23 in FIG. 2).
  • FIG. 3 represents, by a block diagram, an example of implementation of an image acquisition method.
  • FIG. 3 illustrates, by a block diagram, the implementation of a method for acquiring images of the finger 15 (FIG. 2) for the extraction of information concerning the veins and the fingerprint of the finger 15.
  • a first stream relates to the acquisition of images by the image sensor 31.
  • a second stream relates to the processing carried out on the images taken.
  • the first flow is preceded by a step 66 (Finger on display) of placing the finger 15 on the surface of the protective layer or pane 25 (FIG. 2).
  • the first and the second stream comprise a first phase of image acquisition and processing intended for fingerprint recognition, followed by a second phase of image acquisition and processing intended for vein recognition.
  • the first phase of the first stream begins with a step 67 of switching on the "visible" source 21 (Visible source ON).
  • Step 67 is followed by a step 68 of acquisition (Image 1 acquisition) and storage of a first image.
  • the second phase of the first flow includes the switching on of the "infrared" source 17 (step 71, IR Source ON).
  • Step 71 is followed by: a step 72 of switching off the source 21 (Visible source OFF) and a step 73 of acquiring and storing a second image (Image 2 acquisition); or a step 77 of acquiring and storing a third image (Image 2' acquisition).
  • the first phase of the second stream includes a step of processing the first image, intended for fingerprint recognition.
  • if the second image was acquired in step 73 (source 21 off), the image is processed, during a step 74 (Image 2 processing), in order to extract an image of veins 75 (Veins feature).
  • if the third image was acquired in step 77, the image is processed, during a step 78 (Image 2' processing), in order to extract therefrom an image from which the first image is subtracted to obtain an image of veins 79 (Veins feature).
  • the two phases of the first flow are executed before executing the two phases of the second flow.
  • as a variant, the first phases of the first and second streams are executed before the execution of the second phases of the first and second streams. Furthermore, the order of the phases can be reversed in one stream or in the other. The sequencing of the two streams is sketched below.
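  • As an aid to reading Figure 3, the sequencing of the two streams can be sketched as follows (in Python; the source, sensor and processing objects are hypothetical driver interfaces, not part of the original description):

```python
def acquire_fingerprint_and_veins(visible_source, ir_source, sensor, processing):
    """Sketch of the Figure 3 flow: a first phase under visible light for
    fingerprint recognition, then a second phase under red/infrared light
    for vein recognition."""
    visible_source.on()                        # step 67: visible source ON
    image_1 = sensor.acquire()                 # step 68: acquire and store image 1
    fingerprint = processing.extract_fingerprint(image_1)

    ir_source.on()                             # step 71: IR source ON
    visible_source.off()                       # step 72: visible source OFF
    image_2 = sensor.acquire()                 # step 73: acquire and store image 2
    veins = processing.extract_veins(image_2)  # step 74: extract vein image 75

    # Variant (steps 77-78): keep the source 21 on, acquire an image 2',
    # then subtract image 1 from it to obtain the vein image 79.
    return fingerprint, veins
```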
  • Figure 4 shows, in a partial and schematic sectional view, another embodiment of a device 81.
  • device 81 is similar to device 11 illustrated in FIG. 2, except that it comprises two polarizers.
  • the device 81 comprises: at least one first polarizer 83; and a second polarizer 85.
  • Each first polarizer 83 preferably covers a single first source 17.
  • each source 17 is covered, along the Y axis, by a first polarizer 83.
  • the radiation 19 from the first source 17 thus preferably passes through the first polarizer 83 before reaching the finger 15.
  • the number of first polarizers 83 is preferably equal to the number of first sources 17, so that each first source 17 is associated with a single first polarizer 83 and each first polarizer 83 is associated with a single first source 17.
  • each first polarizer 83 has an area (in the XZ plane) equal to or greater than the area of the source 17 with which it is associated.
  • the second polarizer 85 is located on the upper face of the angular filter 29, in contact with the upper face of the angular filter 29, and has an area (in the XZ plane) similar to that of the angular filter 29.
  • as a variant, the second polarizer 85 is located between the angular filter 29 and the image sensor 31.
  • the first polarizer(s) 83 and the second polarizer 85 are rectilinear, or in other words linear.
  • the first polarizer(s) 83 polarize in a first direction, which will be referred to below as the horizontal direction.
  • the second polarizer 85 is composed of: one or more first parts which polarize in a second direction, perpendicular to the first direction, which will be referred to below as the vertical direction; and one or more second parts which polarize in the horizontal direction.
  • The arrangement of the first and second parts of the second polarizer 85 is illustrated in FIGS. 5 and 6.
  • Figure 5 shows, by a partial and schematic sectional view, part of the image acquisition device 81 shown in Figure 4.
  • FIG. 5 illustrates a stack 87 of the image sensor 31, of the layer 47, of the angular filter 29, of the second polarizer 85 and of the screen 21.
  • the angular filter 29 comprises, from top to bottom: a layer 89 comprising an array of microlenses 91 covered by a filling layer 93; a substrate 95 on which the microlenses 91 are formed; and a matrix comprising walls 97 opaque, at least partially, to radiation 19 and 27 (FIG. 4) and openings 99, the openings 99 being filled by the material of the layer 47.
  • the substrate 95 can be made of a transparent polymer which does not absorb at least the wavelengths considered, here in the visible and infrared range.
  • This polymer can in particular be poly(ethylene terephthalate) (PET), poly(methyl methacrylate) (PMMA), cyclic olefin polymer (COP), polyimide (PI) or polycarbonate (PC).
  • the thickness of the substrate 95 can, for example, vary between 1 ⁇ m and 100 ⁇ m, preferably between 10 ⁇ m and 100 ⁇ m.
  • the substrate 95 can correspond to a colored filter, to a polarizer, to a half-wave plate or to a quarter-wave plate.
  • the lenses 91 can be made of silica, of PMMA, of a positive photosensitive resin, of PET, of poly(ethylene naphthalate) (PEN), of COP, of polydimethylsiloxane (PDMS)/silicone, of epoxy resin or of acrylate resin.
  • the lenses 91 can be formed by flowing blocks of a photoresist.
  • the lenses 91 can further be formed by molding on a layer of PET, PEN, COP, PDMS/silicone, epoxy resin or acrylate resin.
  • the lenses 91 are converging lenses each having a focal distance f of between 1 ⁇ m and 100 ⁇ m, preferably between 1 ⁇ m and 70 ⁇ m. According to one embodiment, all the lenses 91 are substantially identical.
  • the lenses 91 and the substrate 95 are preferably made of transparent or partially transparent materials, that is to say transparent in the part of the spectrum considered for the field concerned, for example imaging, over the range of wavelengths used during exposure.
  • the layer 93 is a layer which matches the shape of the lenses 91.
  • the layer 93 can be obtained from an optically clear adhesive (OCA), in particular a liquid optically clear adhesive, or from a material with a low refractive index, or from an epoxy/acrylate adhesive, or from a film of a gas or of a gaseous mixture, for example air.
  • the openings 99 are, for example, filled with air, with a partial vacuum or with a material that is at least partially transparent in the visible and infrared domains.
  • the optical axes of the lenses 91 are preferably directed along the Y axis.
  • Each aperture 99 is preferably associated with a single lens 91.
  • the optical axes of the lenses 91 are preferably centered with the centers of the apertures 99.
  • the diameter of the lenses 91 is preferably greater than the maximum size of the section (perpendicular to the optical axis of the lenses 91) of the apertures 99.
  • the image sensor 31 comprises photodetectors 103 delimiting pixels 105.
  • the term pixel is used throughout the description to define a part of the image sensor 31 comprising a single photodetector 103.
  • the term pixel can apply at the scale of the image sensor 31 but also at the scale of the stack 87.
  • at the scale of the stack 87, a pixel is the set of layers constituting said stack located vertically in line with the pixel of the image sensor 31.
  • in the following, the term pixel, unless otherwise specified, refers to a pixel at the scale of the stack 87.
  • Each photodetector 103 is preferably associated with at least four apertures 99 (and four lenses 91). Preferably, each photodetector 103 is associated with exactly four apertures 99.
  • a pixel 105 corresponds to each part of the stack 87 comprising, among other things, a photodetector 103 surmounted by four openings 99, themselves surmounted by four lenses 91.
  • Each pixel 105 has, preferably, a substantially square shape seen in a direction perpendicular to the upper face of the image sensor 31.
  • the area of each pixel corresponds to a square whose dimension on one side is between 32 ⁇ m and 100 ⁇ m, preferably between 50.8 ⁇ m and 80 ⁇ m.
  • the area of a pixel is, more preferably, equal to a square of approximately 50.8 ⁇ m by 50.8 ⁇ m.
  • Each pixel 105 can be associated with a number of lenses 91 different from four, depending on the diameter of the lenses 91, their network arrangement and the dimensions of the pixels 105.
  • a pixel 105 comprises a photodetector 103 surmounted by four openings 99.
  • the angular filter 29 comprising the openings 99 can be rolled onto the image sensor 31 without prior alignment of the angular filter 29 on the image sensor 31.
  • some lenses 91 and apertures 99 can then be located, in the orientation of the stack, that is to say along the Y direction, straddling two photodetectors 103.
  • the stack 87 comprises the second polarizer 85 on the upper face of the angular filter 29, more precisely on the upper face of the layer 93.
  • Figure 6 shows, by a partial and schematic top view, an embodiment of the part of the image acquisition device shown in Figure 5.
  • FIG. 6 illustrates an embodiment of the arrangement of the first parts 107 and second parts 109 of the second polarizer 85 (FIG. 5).
  • each first part 107 and each second part 109 of the second polarizer 85 has a substantially square shape in the view of FIG. 6.
  • the area of each first part 107 and each second part 109 of the second polarizer 85 is equal to the area of a pixel 105 (FIG. 5), that is, for example, a square of about 50.8 μm by 50.8 μm.
  • the first parts 107 and second parts 109 of the polarizer 85 are formed on the surface of the angular filter 29 (FIG. 5) so that one pixel 105 out of two is covered by a first part 107 and one pixel 105 out of two, different from the previous ones, is covered by a second part 109.
  • Each part 107 and each part 109 is aligned with a pixel 105 and a photodetector 103.
  • two of the pixels 105 are covered by first parts 107 and two of the pixels 105, different from the preceding pixels 105, are, for example, covered by second parts 109.
  • the repetition pitch of the first parts 107 can be greater than one pixel.
  • the repetition pitch of the first parts 107 can be between two pixels 105 and twenty pixels 105, preferably between five pixels 105 and fifteen pixels 105, and more preferably equal to about ten pixels 105.
  • the number of first parts 107 is then different and, for example, less than the number of second parts 109.
  • the first parts 107 can be arranged so that, for example, within a set of eight pixels (two columns of pixels by four rows of pixels), two first parts 107 are formed on the surface of the angular filter 29 so as to cover two pixels of the same column.
  • the first parts 107 can be arranged so that, for example, within a set of eight pixels (two columns of pixels by four rows of pixels), two first parts 107 are formed on the surface of the angular filter 29 so as to to cover two pixels, not adjacent, located within two different columns. In other words, a square repeat pattern of four pixels is repeated on two consecutive rows offset by one pixel. In these two embodiments, the repetition pitch of the first parts 107 is two pixels, however these embodiments are easily adaptable for a repetition pitch of the first parts 107 greater than two pixels.
  • the second polarizer 85 is, for example, formed by successive deposits of the first parts 107 and the second parts 109 on the surface of the angular filter 29.
  • FIG. 7 represents, by a partial and schematic top view, another embodiment of the part of the image acquisition device represented in FIG. 5.
  • FIG. 7 illustrates another embodiment of the arrangement of the first parts 107 and second parts 109 of the second polarizer 85 (FIG. 5).
  • the first parts 107 and second parts 109 of the second polarizer 85 are formed arbitrarily on the surface of the angular filter 29 (FIG. 5).
  • each first part 107 of the second polarizer 85 has a larger area (in the XZ plane) than the area of each first part 107 of the second polarizer 85 shown in Figure 6.
  • each first part 107 of the second polarizer 85 is formed on the upper face of the angular filter 29 without prior alignment of the latter with the photodetectors 103 or the underlying lenses 91.
  • each first part 107 has a substantially square shape in the view of Figure 7.
  • each first part 107 has an area (in contact with the layer 93, Figure 5 ) making it possible to fully cover, on the upper face of the angular filter 29, at least one pixel 105 (or a photodetector 103) and this regardless of its location on the upper face of the angular filter 29.
  • the area of each first part 107 is at least equal to the area of four pixels 105.
  • the area of each first part 107 is between the area of four pixels 105 and the area of six pixels 105.
  • the area of each first part 107 is equal to the area of four pixels 105.
  • the second polarizer 85 is, for example, formed by successive deposits of the first parts 107 and the second parts 109 on the surface of the angular filter 29.
  • the repetition pitch of the first parts 107 is between a distance corresponding to the dimension of three pixels and a distance corresponding to the dimension of twenty pixels.
  • the repetition pitch is between a distance corresponding to the dimension of eight pixels and a distance corresponding to the dimension of fifteen pixels. More preferably, the repetition pitch of the first parts 107 is equal to a distance corresponding to the dimension of ten pixels.
  • the distribution of the first parts 107 is either aligned, that is to say that the repetition is done in rows and columns, or shifted, that is to say that the distribution is shifted by one or more pixels from one row to the next or from one column to the next; both layouts are sketched below.
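  • The layouts of Figures 6 and 7 can be summarized by a small mask generator (a sketch only; 1 marks a pixel 105 covered by a first part 107, 0 a pixel covered by a second part 109):

```python
def polarizer_mask(rows, cols, pitch=10, shift=0):
    """Mark the first parts 107 with a given repetition pitch (in pixels).
    `shift` offsets each successive block of rows to obtain the 'shifted'
    distribution; shift=0 gives the 'aligned' distribution."""
    mask = [[0] * cols for _ in range(rows)]
    for r in range(0, rows, pitch):
        offset = ((r // pitch) * shift) % pitch
        for c in range(offset, cols, pitch):
            mask[r][c] = 1
    return mask

# Figure 6: one pixel out of two covered by a first part (checkerboard)
checkerboard = [[(r + c) % 2 for c in range(8)] for r in range(8)]
# Figure 7: sparse first parts with the preferred pitch of about ten pixels
aligned = polarizer_mask(rows=40, cols=40, pitch=10, shift=0)
shifted = polarizer_mask(rows=40, cols=40, pitch=10, shift=1)
```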
  • the second polarizer 85 is located between the angular filter 29 and the image sensor 31, more precisely, between the layer 47 and the image sensor 31 (FIG. 5).
  • An advantage of the embodiments and modes of implementation described previously in relation to FIGS. 4 to 7 is that they make it possible to simultaneously take an image under radiation 19 polarized horizontally and then, after reflection on the finger 15, horizontally again (that is to say an image under radiation 19 having passed through two aligned polarizers), and an image under radiation 19 polarized horizontally and then, after reflection on the finger 15, vertically (that is to say an image under radiation 19 having passed through two crossed polarizers).
  • FIG. 8 represents, by a block diagram, another example of implementation of an image acquisition method.
  • FIG. 8 illustrates a method allowing the acquisition of images and the processing thereof in the case of a device comprising the source 17.
  • a first stream relates to the acquisition of images by the image sensor 31.
  • a second stream relates to the processing carried out on the acquired images.
  • the first flow begins with a step 123 of positioning the finger 15 on the upper face of the layer 25 (Finger on display).
  • Step 123 is followed by a step 125 in which the position of the finger 15 is detected (Detecting finger position) and localized on the layer 25.
  • the detection of the position of the finger 15 can be carried out by a detection element included in the image acquisition device or by an element internal to the image sensor 31, for example one of its electrodes.
  • the first flow comprises, in a subsequent step 127, the switching on of the source 17 (IR source ON).
  • Step 127 is followed by a step 129 of acquiring an image, of dividing this image into two distinct images depending on whether the pixels are associated with a first part 107 or a second part 109 of the second polarizer 85, and storage of these images (image acquisition).
  • the first image is the image associated with the pixels 105 (FIG. 5) surmounted by a first part 107 of the second polarizer 85.
  • for this image, the radiation 19 is polarized horizontally (H) by the first polarizer 83 and then, after reflection on the finger 15, polarized vertically (V) by the first part 107 of the second polarizer 85, before reaching the image sensor 31.
  • the second image is the image associated with the pixels 105 (FIG. 5) surmounted by a second part 109 of the second polarizer 85.
  • for this image, the radiation 19 is polarized horizontally (H) by the first polarizer 83 and then, after reflection on the finger 15, polarized horizontally (H) again by the second part 109 of the second polarizer 85, before reaching the image sensor 31.
  • the second stream comprises two phases respectively dedicated to the separate processing of the two images and to the processing of the combination of the two images.
  • the first phase of the second stream comprises the processing of the first acquired image (HV output of block 129) in order to extract therefrom, in a step 131, an image comprising volume information on the finger 15 (Volume information (veins)).
  • "volume information" is the name given to information which required the penetration of light into the volume of the finger 15 in order to be acquired.
  • the information concerning the veins, for example their number, their shape or their arrangement within the finger, is, for example, volume information.
  • the first phase of the second stream further comprises the processing of the second acquired image (HH output of block 129) in order to extract therefrom, in a step 133, an image comprising surface and volume information on the finger 15 (Surface and volume information).
  • the second phase of the second stream comprises a step 135 during which the information coming from the first image and the information coming from the second image are processed together in order to extract only surface information (Surface information (fingerprint)).
  • This may include the determination of a third image corresponding to the difference, possibly weighted, between the second image and the first image. This processing chain is sketched below.
  • "surface information" is the name given to the information which required the reflection of light on the surface of the finger in order to be acquired.
  • the information concerning the fingerprints is, for example, surface information. This is, for example, an image of the grooves and ridges of the fingerprints.
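  • A minimal numerical sketch of this processing chain (Python with NumPy; the frame values, the mask and the weighting are hypothetical, and the mean-fill of missing pixels is a deliberately crude placeholder for real interpolation):

```python
import numpy as np

def split_images(raw, hv_mask):
    """Step 129: split the acquired frame into the HV image (pixels under
    first parts 107, crossed polarizers) and the HH image (pixels under
    second parts 109, aligned polarizers)."""
    hv = np.where(hv_mask, raw, raw[hv_mask].mean())
    hh = np.where(~hv_mask, raw, raw[~hv_mask].mean())
    return hv, hh

def extract_features(hv, hh, weight=1.0):
    """Steps 131-135: HV carries volume information (veins); HH carries
    surface and volume information; their weighted difference isolates
    the surface information (fingerprint)."""
    veins = hv                       # step 131: volume information
    fingerprint = hh - weight * hv   # step 135: possibly weighted difference
    return veins, fingerprint

raw = np.random.rand(4, 4)  # hypothetical 4x4 capture
hv_mask = (np.add.outer(np.arange(4), np.arange(4)) % 2).astype(bool)
veins, fingerprint = extract_features(*split_images(raw, hv_mask))
```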
  • Figure 9 shows, in a partial and schematic sectional view, a structure 111 provided with a polarizer 85.
  • FIG. 9 illustrates an embodiment of a structure 111 in which the second polarizer 85 has been formed on the surface of a support or substrate 113.
  • the second polarizer 85 shown in Figure 9 is identical to the second polarizer 85 shown in Figure 5.
  • the second polarizer 85 in Figure 9 is however formed on the support 113, unlike Figure 5 where the polarizer 85 is formed on the angular filter 29.
  • the support 113 can be made of a transparent polymer which does not absorb, at least, the wavelengths considered, here in the visible and infrared range.
  • This polymer may in particular be poly(ethylene terephthalate) (PET), poly(methyl methacrylate) (PMMA), cyclic olefin polymer (COP), polyimide (PI) or polycarbonate (PC).
  • Substrate 113 is preferably PET.
  • the thickness of the substrate 113 can vary from 1 ⁇ m to 100 ⁇ m, preferably from 10 ⁇ m to 50 ⁇ m.
  • the support 113 can correspond to a colored filter, to a half-wave plate or to a quarter-wave plate.
  • the arrangement of the first parts 107 and the second parts 109 of the second polarizer 85 illustrated in FIG. 9 is similar to the arrangement of the parts 107 and 109 of the second polarizer 85 illustrated in FIG. 7.
  • the structure 111 is mounted in the stack 87 of FIG. 5, in place of the second polarizer 85, above the angular filter 29.
  • as a variant, the structure 111 is mounted in the stack 87 of FIG. 5, replacing the second polarizer 85, between the angular filter 29 and the image sensor 31, more precisely, between the layer 47 and the image sensor 31.
  • the polarizer 85 is formed under the substrate 113.
  • the lower face of the polarizer 85 is then in contact with the upper face of the angular filter 29 or in contact with the upper face of the image sensor 31, depending on whether the structure 111 is positioned on the angular filter 29 or between the angular filter 29 and the image sensor 31.
  • FIG. 10 represents, by a partial and schematic sectional view, yet another embodiment of an image acquisition device.
  • FIG. 10 illustrates a device 115 similar to device 81 illustrated in FIG. 4, except that it comprises one or more walls 117 that are opaque to radiation 19, around optical sensor 13.
  • the wall 117 covers all the side edges of the optical sensor 13 extending from the frame 23 until it is flush with the upper face of the angular filter 29.
  • Each wall 117 extends from the frame 23 to the upper face of the angular filter 29 over a width (according to the direction X in figure 10) equal to or greater than the width (in the direction X in figure 10) of the source 17.
  • the radiation 19 coming from the first source 17 is perceived by the optical sensor 13 only if it is reflected on the finger 15.
  • the optical sensor 13 is therefore not disturbed by radiation 19 which has not been reflected, that is to say by direct radiation.
  • FIG. 11 represents, by a partial and schematic sectional view, yet another embodiment of an image acquisition device.
  • FIG. 11 illustrates an image acquisition device 119 similar to device 81 illustrated in FIG. 4 except that it comprises an angular filter 121 between each source 17 and its associated first polarizer 83.
  • the device 119 comprises a number of angular filters 121 equal to the number of sources 17 and first polarizers 83.
  • Each angular filter 121 is preferably composed of a matrix of walls, opaque to the radiation 19, and of apertures, surmounted by an array of lenses (not shown). Each angular filter has, for example, an area (in the XZ plane) similar to the area of the first polarizer 83 with which it is associated.
  • each filter 121 is positioned on the upper face of the first polarizer 83 with which it is associated.
  • the radiation 19 coming from the first source 17 is, at the output of the angular filter 121, substantially collimated.

PCT/EP2021/072466 2020-08-17 2021-08-12 Système d'acquisition d'images WO2022038033A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR2008533
FR2008533A FR3113430B1 (fr) 2020-08-17 2020-08-17 Système d'acquisition d'images

Publications (1)

Publication Number Publication Date
WO2022038033A1 (fr) 2022-02-24

Family

ID=74045582

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2021/072466 WO2022038033A1 (fr) 2020-08-17 2021-08-12 Systeme d'acquisition d'images

Country Status (3)

Country Link
CN (1) CN217641336U (zh)
FR (1) FR3113430B1 (fr)
WO (1) WO2022038033A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3139236A1 (fr) * 2022-08-30 2024-03-01 Isorg Dispositif imageur

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2008533A1 (zh) 1968-05-15 1970-01-23 Blakeslee Et Co
US20170337413A1 (en) * 2016-05-23 2017-11-23 InSyte Systems Integrated light emitting display and sensors for detecting biologic characteristics
US20180150671A1 (en) * 2016-11-28 2018-05-31 Lg Display Co., Ltd. Electroluminescent display device integrated with fingerprint sensor
FR3063596A1 (fr) * 2017-03-06 2018-09-07 Isorg Systeme d'acquisition d'images
WO2018167456A1 (en) * 2017-03-17 2018-09-20 Cambridge Display Technology Limited Fingerprint and vein imaging apparatus
WO2020136495A1 (ja) * 2018-12-28 2020-07-02 株式会社半導体エネルギー研究所 表示装置

Also Published As

Publication number Publication date
CN217641336U (zh) 2022-10-21
FR3113430B1 (fr) 2024-01-05
FR3113430A1 (fr) 2022-02-18

Similar Documents

Publication Publication Date Title
US20210305440A1 (en) Single photon avalanche diode and manufacturing method, detector array, and image sensor
EP3824326B1 (fr) Systeme optique et son procede de fabrication
EP3376544B1 (fr) Dispositif imageur optique
KR20220054387A (ko) 통합 광학 센서 및 그 제조 방법
EP3949358B1 (fr) Dispositif a capteur optique
WO2022038033A1 (fr) Systeme d'acquisition d'images
WO2022038032A1 (fr) Systeme d'acquisition d'images
EP4196905A1 (fr) Dispositif d'acquisition d'images
EP4264341A1 (fr) Filtre angulaire optique
EP3949359B1 (fr) Dispositif a capteur optique
WO2022128376A1 (fr) Dispositif d'acquisition d'images
WO2023222604A1 (fr) Filtre optique pour photodétecteurs
FR3060811A1 (fr) Dispositif d'acquisition d'empreintes digitales
EP4260103A1 (fr) Filtre angulaire optique
US20180013948A1 (en) Image sensor and image capture device
WO2022128339A1 (fr) Filtre angulaire optique
WO2022128337A1 (fr) Filtre angulaire optique
EP4073427A1 (fr) Filtre optique adapté pour corriger le bruit électronique d'un capteur

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21758382

Country of ref document: EP

Kind code of ref document: A1