EP4196904A1 - System zur erfassung von bildern - Google Patents

System zur erfassung von bildern

Info

Publication number
EP4196904A1
Authority
EP
European Patent Office
Prior art keywords
radiation
source
waveguide layer
layer
image
Prior art date
Legal status
Withdrawn
Application number
EP21758381.4A
Other languages
English (en)
French (fr)
Inventor
Benjamin BOUTHINON
Delphine DESCLOUX
Jérôme MICHALLON
Current Assignee
Isorg SA
Original Assignee
Isorg SA
Priority date
Filing date
Publication date
Application filed by Isorg SA filed Critical Isorg SA
Publication of EP4196904A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 - Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31 - User authentication
    • G06F21/32 - User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B6/00 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
    • G02B6/0001 - Light guides specially adapted for lighting devices or systems
    • G02B6/0011 - Light guides being planar or of plate-like form
    • G02B6/0033 - Means for improving the coupling-out of light from the light guide
    • G02B6/0035 - Means for improving the coupling-out of light from the light guide provided on the surface of the light guide or in the bulk of it
    • G02B6/0036 - 2-D arrangement of prisms, protrusions, indentations or roughened surfaces
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B6/00 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
    • G02B6/0001 - Light guides specially adapted for lighting devices or systems
    • G02B6/0011 - Light guides being planar or of plate-like form
    • G02B6/0066 - Light guides characterised by the light source being coupled to the light guide
    • G02B6/0068 - Arrangements of plural sources, e.g. multi-colour light sources
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/10 - Image acquisition
    • G06V10/12 - Details of acquisition arrangements; Constructional details thereof
    • G06V10/14 - Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/141 - Control of illumination
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/10 - Image acquisition
    • G06V10/12 - Details of acquisition arrangements; Constructional details thereof
    • G06V10/14 - Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/143 - Sensing or illuminating at different wavelengths
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 - Fingerprints or palmprints
    • G06V40/13 - Sensors therefor
    • G06V40/1318 - Sensors therefor using electro-optical elements or layers, e.g. electroluminescent sensing
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 - Fingerprints or palmprints
    • G06V40/1347 - Preprocessing; Feature extraction
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 - Fingerprints or palmprints
    • G06V40/1382 - Detecting the live character of the finger, i.e. distinguishing from a fake or cadaver finger
    • G06V40/1394 - Detecting the live character of the finger using acquisition arrangements
    • H - ELECTRICITY
    • H10 - SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10K - ORGANIC ELECTRIC SOLID-STATE DEVICES
    • H10K39/00 - Integrated devices, or assemblies of multiple devices, comprising at least one organic radiation-sensitive element covered by group H10K30/00
    • H10K39/30 - Devices controlled by radiation
    • H10K39/32 - Organic image sensors
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B6/00 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
    • G02B6/24 - Coupling light guides
    • G02B6/42 - Coupling light guides with opto-electronic elements
    • G02B6/4201 - Packages, e.g. shape, construction, internal or external details
    • G02B6/4202 - Packages for coupling an active element with fibres without intermediate optical elements, e.g. fibres with plane ends, fibres with shaped ends, bundles
    • G02B6/4203 - Optical features
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/14 - Vascular patterns

Definitions

  • This description relates generally to image acquisition systems and, more particularly, to biometric acquisition systems.
  • Biometric acquisition systems, and more particularly fingerprint acquisition systems, are used in many fields, for example to secure devices or buildings, control access, or verify the identity of individuals.
  • One embodiment overcomes all or part of the drawbacks of known systems.
  • One embodiment provides an image acquisition system comprising: a single organic image sensor; a waveguide layer covering the image sensor and illuminated in its plane by: a first source adapted to emit a first radiation having at least one wavelength in the band from 400 nm to 600 nm, and a second source adapted to emit a second radiation whose wavelength or wavelengths are between 600 nm and 1100 nm; and an image processing unit adapted to extract information relating to the fingerprints and to the veins of a hand imaged by the sensor.
  • the first source and the second source face each other.
  • the first source and the second source are positioned: so that the first radiation and the second radiation are mutually perpendicular; or on the same side of the waveguide layer, one behind the other or one next to the other.
  • the first radiation comprises only wavelengths between 470 nm and 600 nm; and the second radiation comprises only wavelengths between 600 nm and 940 nm.
  • the first light source is composed of one or more light-emitting diodes; and the second light source is composed of one or more light emitting diodes.
  • the waveguide layer comprises: a first network of microstructures adapted to deflect the waves of the first radiation out of the waveguide layer on the side of the waveguide layer opposite the image sensor; and a second network of microstructures adapted to deflect the waves of the second radiation out of the waveguide layer on the side of the waveguide layer opposite the image sensor.
  • the first network of microstructures extends over the entire length of the waveguide layer; and the second array of microstructures spans the entire length of the waveguide layer.
  • the second network of microstructures extends from the second light source over a first distance in the waveguide layer; and the first array of microstructures extends from the first light source a second distance into the waveguide layer.
  • the first distance and the second distance are equal; or the first distance and the second distance are different.
  • the information relating to the fingerprints is obtained from at least one image acquired by the image sensor with the second radiation.
  • the information relating to the veins is obtained from at least one image acquired by the image sensor with the first radiation.
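The dual-extraction scheme of the last two bullets (fingerprints from the image taken under the second radiation, veins from the image taken under the first) can be sketched in code. The following Python sketch is purely illustrative: all names (`Frame`, `acquire`, the source labels) are assumptions, not anything defined by the patent; it only shows the routing of one exposure per radiation to the two biometric traits.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    """One exposure captured by the single organic image sensor."""
    pixels: list          # placeholder for the 2-D pixel array
    illumination: str     # which source was on: "first" or "second"

def acquire(source: str) -> Frame:
    # Hypothetical driver call: switch on one source, expose, switch off.
    return Frame(pixels=[[0]], illumination=source)

def capture_biometrics() -> dict:
    """Capture one frame per source and tag each with the trait it serves.

    Per the description above: vein information comes from the image
    acquired with the first radiation, fingerprint information from the
    image acquired with the second radiation.
    """
    first = acquire("first")
    second = acquire("second")
    return {"veins": first, "fingerprints": second}

result = capture_biometrics()
```

The point of the sketch is only the pairing of illumination to trait; the actual extraction algorithms are not specified by the patent.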
  • Figure 1 shows, in a sectional view, partial and schematic, an example of an image acquisition system
  • Figure 2 shows, in a top view, partial and schematic, the image acquisition system illustrated in Figure 1;
  • FIG. 3 shows, in sectional and top views, partial and schematic, an embodiment of a part of the image acquisition system illustrated in FIG. 1;
  • FIG. 4 represents, in a sectional, partial and schematic view, an embodiment of another part of the image acquisition system illustrated in FIG. 1;
  • Figure 5 shows, by two top views, partial and schematic, two embodiments of a color filter
  • FIG. 6 shows, in a partial and schematic sectional view, another example of an image acquisition system
  • Figure 7 shows, in a top view, partial and schematic, an embodiment of the system illustrated in Figure 6;
  • Figure 8 shows, in a top view, partial and schematic, another embodiment of the system illustrated in Figure 6;
  • FIG. 9 represents, by a block diagram, an example of implementation of an image acquisition method.
  • Figure 10 shows, in a sectional view, partial and schematic, a structure comprising a polarizer.
  • the expression “it comprises only the elements” means that it comprises at least 90% of the elements, preferably at least 95% of the elements.
  • a layer or a film is said to be opaque to radiation when the transmittance of the radiation through the layer or the film is less than 10%.
  • a layer or a film is said to be transparent to radiation when the transmittance of the radiation through the layer or the film is greater than 10%, preferably greater than 50%.
  • all the elements of the optical system which are opaque to radiation have a transmittance which is less than half, preferably less than a fifth, more preferably less than a tenth, of the lowest transmittance of the elements of the optical system transparent to said radiation.
  • the term "useful radiation” is used to refer to the electromagnetic radiation passing through the optical system in operation.
  • the expression "optical element of micrometric size" refers to an optical element formed on one face of a support whose maximum dimension, measured parallel to said face, is greater than 1 µm and less than 1 mm.
  • each micrometric-sized optical element corresponds to a micrometric-sized lens, or microlens, composed of two diopters.
  • each optical element of micrometric size may correspond, for example, to a micrometric Fresnel lens, a micrometric gradient-index lens or a micrometric diffraction grating.
  • visible light denotes electromagnetic radiation whose wavelength is between 400 nm and 700 nm; within this range, red light denotes electromagnetic radiation whose wavelength is between 600 nm and 700 nm.
  • infrared radiation denotes electromagnetic radiation with a wavelength between 700 nm and 1 mm; within the infrared, near-infrared radiation denotes wavelengths between 700 nm and 1.1 µm.
  • the refractive index of a medium is defined as the refractive index of the material constituting the medium over the range of wavelengths of the radiation captured by the image sensor. The refractive index is considered to be substantially constant over the range of wavelengths of the useful radiation, for example equal to the average of the refractive index over the range of wavelengths of the radiation picked up by the image sensor.
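As a worked example of this averaging convention, the Python sketch below computes an effective index from sampled values across the band. The dispersion values are invented for illustration (roughly PMMA-like), not taken from the patent.

```python
def effective_index(index_samples):
    """Average the refractive index over the sensor's wavelength band,
    per the convention stated above (index treated as constant)."""
    return sum(index_samples) / len(index_samples)

# Hypothetical dispersion samples of a PMMA-like medium between
# 400 nm and 1100 nm (illustrative values only):
n_samples = [1.497, 1.492, 1.489, 1.486, 1.484]
n_eff = effective_index(n_samples)
```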
  • Figure 1 shows, in a sectional view, partial and schematic, an example of an image acquisition system.
  • Figure 2 shows, by a top view, partial and schematic, the image acquisition system illustrated in Figure 1.
  • the system comprises a device 11 comprising, from bottom to top in the orientation of the figure: a single organic image sensor 13; and a layer 17, called the waveguide, covering the upper face of the image sensor 13.
  • the device 11 further comprises, preferably, an optical filter 15, for example an angular filter, between the image sensor 13 and the waveguide layer 17.
  • Figures 1 to 8 are represented in space according to a direct orthogonal XYZ frame, the Y axis of the XYZ frame being orthogonal to the upper face of the sensor 13.
  • the device 11 is connected to a processing unit 18 preferably comprising means for processing the signals supplied by the device 11, not shown in FIG. 1.
  • the processing unit 18 comprises, for example, a microprocessor.
  • the device 11 and the processing unit 18 are, for example, integrated in the same circuit.
  • the device 11 comprises a first light source 19 adapted to emit a first radiation 21 and a second light source 23 adapted to emit a second radiation 25.
  • the sources 19 and 23 face each other.
  • the sources 19 and 23 are, for example, laterally coupled to the layer 17 and are located outside the vertical projection, along the Y direction, of the stack of the sensor 13, the angular filter 15 and the layer 17.
  • the device 11 captures the image response of an object 27, partially represented, preferably a hand.
  • the image processing unit 18 is adapted to extract information relating to fingerprints and to a network of veins of the hand 27 imaged by the sensor 13.
  • the radiation 21 corresponds to light radiation in the red and/or infrared, that is to say radiation whose component wavelength(s) are between 600 nm and 1700 nm. More preferably, the radiation 21 corresponds to light radiation whose set of wavelengths is between 600 nm and 1100 nm, and even more preferably between 630 nm and 940 nm.
  • the radiation 25 corresponds to light radiation in the visible, that is to say radiation of which at least one of the wavelengths is between 400 nm and 800 nm.
  • the radiation 25 corresponds to light radiation having at least one wavelength between 400 nm and 600 nm.
  • the radiation 25 corresponds to a radiation whose set of wavelengths which compose it is between 400 nm and 600 nm.
  • the radiation 25 corresponds to radiation whose set of wavelengths that compose it is between 470 nm and 600 nm.
  • the radiation 25 corresponds to radiation whose wavelengths are approximately equal to 530 nm (green) or 500 nm (cyan).
  • the structure of the layer 17 is described later in relation to FIG. 3 and the angular filter 15 and the sensor 13 are described later in relation to FIG. 4.
  • the sources 19 and 23 are positioned on the periphery of the layer 17.
  • the source 19 is located to the right of the layer 17, in the orientation of Figures 1 and 2
  • source 23 is located to the left of layer 17, in the orientation of Figures 1 and 2.
  • the sources 19 and 23 can, however, be located in any position relative to each other.
  • the two sources 19 and 23 are positioned, for example, on the same side of the layer 17, one behind the other, one beside the other or so that the radiations 21 and 25 are orthogonal.
  • the sources 19 and 23 are switched on one after the other so as to image the hand 27 successively: first with the radiation 21 only, then with the radiation 25 only, or vice versa.
  • the sources 19 and 23 are switched on simultaneously.
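The two switching modes just described (sequential, then simultaneous) can be sketched as a small Python scheduler. The function name and the tuple encoding of the source states are assumptions for illustration only.

```python
def illumination_schedule(mode="sequential"):
    """Yield (source19_on, source23_on) states for successive exposures.

    "sequential" images the hand under each radiation separately, as in
    the embodiment above; "simultaneous" drives both sources at once.
    """
    if mode == "sequential":
        yield (True, False)   # radiation 21 only
        yield (False, True)   # radiation 25 only
    elif mode == "simultaneous":
        yield (True, True)    # both radiations at once
    else:
        raise ValueError(f"unknown mode: {mode}")

states = list(illumination_schedule())
```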
  • the source 19 is composed of one or more light-emitting diodes (LEDs). Preferably, the source 19 is composed of several LEDs organized in "bars" along the layer 17. According to one embodiment, the source 23 is likewise composed of one or more LEDs, preferably several LEDs organized in "bars" along the layer 17.
  • Figure 3 shows, through four views, partial and schematic, part of the image acquisition system illustrated in Figure 1.
  • FIG. 3 illustrates two embodiments of the waveguide layer 17 with a length L.
  • Figure 3 illustrates a first embodiment of the layer 17 by a view A1 from above and a view A2 in section, the view A2 being taken along the section plane A-A of the view A1.
  • Figure 3 illustrates a second embodiment of the layer 17 by a view B1 from above and a view B2 in section, the view B2 being taken along the section plane B-B of the view B1.
  • the layer 17, called the waveguide layer, comprises a structure of two or three media with different refractive indices.
  • a waveguide layer is structurally adapted to allow the confinement and propagation of electromagnetic waves.
  • the media are, for example, arranged in the form of a stack of three sub-layers: a central layer sandwiched between an upper sheath and a lower sheath, the refractive indices of the materials making up the sheaths being lower than the refractive index of the material making up the central layer, the lower sheath being located on the side of the angular filter 15.
  • microstructures are formed by nanoimprinting between the central layer and the lower sheath.
  • the microstructures preferably have the shapes of isosceles prisms whose apex angle, that is to say the angle at the tip, is equal to 45°, of right isosceles prisms, or of teeth whose tips are directed toward the object to be imaged.
  • the microstructures can have the shapes of hemispheres, cones, pyramids, tetrahedrons etc.
  • Each microstructure may comprise a face, for example flat, slightly inclined in the direction of propagation of the wave so that the propagated wave is deflected and follows the geometry of the microstructures.
  • the inclination of the microstructure face relative to the lower face of the central layer is, for example, between 5° and 80°.
  • the inclination is preferably of the order of 45°.
  • the microstructures are not evenly distributed along the wave path.
  • the microstructures are preferably closer and closer in the direction of the output of the waveguide.
  • the density of the microstructures is preferably higher and higher as one moves away from the source of the radiation deflected by these microstructures.
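One way to realize this increasing density is a geometrically shrinking gap between successive microstructures, so that more light is coupled out where the guided intensity has already decayed. The patent fixes no specific law; the Python sketch below, with invented parameter values, is just one possible model.

```python
def microstructure_positions(length_mm, first_gap_mm=2.0, ratio=0.85):
    """Place microstructures along the guide with a gap that shrinks
    geometrically with distance from the source, so their density rises
    toward the far end (one possible model, not the patent's).
    """
    positions, x, gap = [], 0.0, first_gap_mm
    while x + gap <= length_mm:
        x += gap
        positions.append(round(x, 3))
        gap *= ratio  # each successive gap is smaller than the last
    return positions

pos = microstructure_positions(10.0)  # a 10 mm stretch of guide
```

Each call returns monotonically increasing positions with strictly decreasing spacing, which is the qualitative behaviour described above.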
  • the microstructures are preferably filled with a material of lower optical index than the central layer or with air.
  • the central layer is for example made of poly(methyl methacrylate) (PMMA), polycarbonate (PC), cyclo-olefin polymer (COP) or poly(ethylene terephthalate) (PET).
  • the sheaths are, for example, made of epoxy or acrylate resins having a refractive index lower than the refractive index constituting the central layer.
  • a first network of microstructures 29 is, for example, adapted to guide the first waves of the first radiation 21 emitted by the first source 19 (FIGS. 1 and 2).
  • the first network then comprises microstructures 29 inclined in the direction of the waves emitted by the first source
  • a second network of microstructures 31 is, for example, adapted to guide the second waves of the second radiation 25 emitted by the second source 23 (FIGS. 1 and 2).
  • the second network then comprises microstructures 31 inclined in the direction of the waves emitted by the second source 23.
  • layer 17 has a thickness of between 200 µm and 600 µm, preferably between 300 µm and 500 µm.
  • the central layer has a thickness of between 1 µm and 40 µm, preferably between 1 µm and 20 µm.
  • the microstructures have, for example, a thickness of between 1 µm and 15 µm, preferably between 2 µm and 10 µm.
  • each array of microstructures 31 extends from the side edge of layer 17, adjacent to source 23, over a length L.
  • Each array of microstructures 31 extends, for example, as far as possible, up to the lateral edge, opposite the source 23, of the layer 17.
  • the length L corresponds substantially to the length of the layer 17.
  • the length L can be between 10 mm and 250 mm.
  • each network of microstructures 29 extends from the lateral edge of layer 17, adjacent to source 19, over the same length L.
  • Each network of microstructures 29 extends, for example, as far as possible, up to the side edge of the layer 17 opposite the source 19.
  • each array of microstructures 31 extends from the side edge of layer 17, adjacent to source 23, over a length L1 and each array of microstructures 29 extends from the side edge of layer 17, adjacent to source 19, over a length L2.
  • the length L is preferably greater than or equal to the sum of the lengths L1 and L2.
  • the lengths L1 and L2 can be different or equal.
  • the length L2 is, for example, equal to three times the length L1.
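As a worked example of these length relations, the Python sketch below splits a guide length so that L2 = 3 × L1 and L1 + L2 does not exceed L. The function name and values are illustrative assumptions.

```python
def split_guide_lengths(total_length_mm, ratio=3.0):
    """Split the guide length between the two microstructure arrays so
    that L2 = ratio * L1 and L1 + L2 fills the guide exactly (the
    patent only requires L1 + L2 <= L; this is one choice)."""
    l1 = total_length_mm / (1.0 + ratio)
    l2 = ratio * l1
    assert l1 + l2 <= total_length_mm
    return l1, l2

# A 100 mm guide, within the 10 mm to 250 mm range quoted above:
l1, l2 = split_guide_lengths(100.0)
```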
  • a single array of microstructures is both adapted to guide the second waves of the second radiation 25 emitted by the second source 23 and adapted to guide the first waves of the first radiation 21 emitted by the first source 19.
  • the layer 17 is covered in the stack of the image acquisition device 11 by a protective layer.
  • the protective layer makes it possible, in particular, to prevent the layer 17 from being scratched by the user of the device 11.
  • Figure 4 shows, by a partial and schematic sectional view, another part of the image acquisition system illustrated in Figure 1.
  • FIG. 4 illustrates a structure 33 comprising the angular filter 15 and the sensor 13 of the device 11.
  • the sensor 13 comprises photodetectors 35, preferably arranged in matrix form.
  • the photodetectors 35 preferably all have the same structure and the same properties/characteristics. In other words, all the photodetectors are substantially identical, except for manufacturing differences.
  • the sensor 13 is preferably adapted to pick up the radiations 21 and 25.
  • the photodiodes 35 are, for example, organic photodiodes (OPDs) integrated on a substrate with CMOS (complementary metal-oxide-semiconductor) transistors or on a substrate with thin-film transistors (TFTs).
  • the substrate is, for example, made of silicon, preferably monocrystalline silicon.
  • the channel, source and drain regions of the TFT transistors are, for example, made of amorphous silicon (a-Si), indium gallium zinc oxide (IGZO) or low-temperature polycrystalline silicon (LTPS).
  • the photodiodes 35 of the image sensor 13 comprise, for example, a mixture of organic semiconductor polymers, such as poly(3-hexylthiophene) or poly(3-hexylthiophene-2,5-diyl), known as P3HT, mixed with [6,6]-phenyl-C61-butyric acid methyl ester (an N-type semiconductor), known as PCBM.
  • the photodiodes 35 of the image sensor 13 comprise, for example, small molecules, that is to say molecules having molar masses of less than 500 g/mol, preferably less than 200 g/mol.
  • the photodiodes 35 can be inorganic photodiodes, for example, made from amorphous silicon or crystalline silicon.
  • the photodiodes 35 are composed of quantum dots.
  • the angular filter 15, illustrated in FIG. 4, comprises, from bottom to top in the orientation of the figure: a first layer 39 comprising openings 41, or holes, and walls 43 that are opaque to the radiations 21 and 25, the openings 41 being, for example, filled with a material forming a layer 45 on the underside of layer 39; a substrate or support 47, resting on the upper face of layer 39; and an array of lenses 49 of micrometric size, located on the upper face of the substrate 47, the flat face of the lenses 49 facing the upper face of the substrate 47.
  • the array of lenses 49 is surmounted by a flattening layer 51.
  • the substrate 47 can be made of a transparent polymer which does not absorb, at least, the wavelengths considered, here in the visible and infrared range.
  • This polymer can in particular be poly(ethylene terephthalate) (PET), poly(methyl methacrylate) (PMMA), cyclic olefin polymer (COP), polyimide (PI) or polycarbonate (PC).
  • the thickness of the substrate 47 can, for example, vary between 1 µm and 100 µm, preferably between 10 µm and 100 µm.
  • the substrate can correspond to a colored filter, to a polarizer, to a half-wave plate or to a quarter-wave plate.
  • the lenses 49 can be made of silica, of PMMA, of a positive photosensitive resin, of PET, of poly(ethylene naphthalate) (PEN), of COP, of polydimethylsiloxane (PDMS)/silicone, of epoxy resin or in acrylate resin.
  • the lenses 49 can be formed by reflowing blocks of a photoresist.
  • Lenses 49 may additionally be formed by molding over a layer of PET, PEN, COP, PDMS/silicone, epoxy resin or acrylate resin.
  • the lenses 49 are converging lenses, each having a focal distance f of between 1 µm and 100 µm, preferably between 1 µm and 70 µm. According to one embodiment, all the lenses 49 are substantially identical.
  • the lenses 49 and the substrate 47 are preferably made of transparent or partially transparent materials, that is to say transparent over the part of the spectrum considered for the targeted application, for example imaging, over the range of wavelengths used during exposure.
  • the layer 51 is a layer that matches the shape of the lenses 49.
  • the layer 51 can be obtained from an optically clear adhesive (OCA), in particular a liquid optically clear adhesive, from a material with a low refractive index, from an epoxy/acrylate adhesive, or from a film of a gas or gaseous mixture, for example air.
  • the openings 41 are, for example, filled with air, with a partial vacuum or with a material that is at least partially transparent in the visible and infrared domains.
  • the angular filter 15 is adapted to filter the incident radiation according to the incidence of the radiation with respect to the optical axes of the lenses 49.
  • the angular filter 15 is, more particularly, adapted so that each photodetector 35 of the image sensor 13 receives only the rays whose respective incidences relative to the respective optical axes of the lenses 49 associated with this photodetector 35 are smaller than a maximum incidence, itself less than 45°, preferably less than 30°, more preferably less than 10°, even more preferably less than 4°.
  • the angular filter 15 is adapted to block the rays of the incident radiation whose respective incidences relative to the optical axes of the lenses 49 of the angular filter 15 are greater than the maximum incidence.
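The angular selection can be illustrated numerically. The sketch below uses an assumed paraxial estimate (the half-angle subtended by an aperture at the lens's focal distance), which is not a formula from the patent; it merely shows how a geometry within the ranges quoted above can yield an acceptance below the strictest 4° bound.

```python
import math

def passes_angular_filter(incidence_deg, max_incidence_deg=4.0):
    """True if a ray's incidence relative to a lens's optical axis is
    below the filter's maximum incidence (4 deg in the strictest
    embodiment cited above)."""
    return abs(incidence_deg) < max_incidence_deg

def acceptance_from_geometry(aperture_radius_um, focal_um):
    """Paraxial half-angle estimate (an illustrative assumption):
    the angle subtended by the aperture radius at the focal distance."""
    return math.degrees(math.atan(aperture_radius_um / focal_um))

# Hypothetical values inside the stated ranges: a 2 um aperture radius
# and a 30 um focal distance give an acceptance just under 4 degrees.
theta = acceptance_from_geometry(2.0, 30.0)
```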
  • Each opening 41 is preferably associated with a single lens 49.
  • the optical axes of the lenses 49 are preferably aligned with the centers of the apertures 41 of the layer 39.
  • the diameter of the lenses 49 is preferably greater than the maximum size of the section (perpendicular to the optical axis of the lenses 49) of the apertures 41.
  • Each photodetector 35 is preferably associated with at least four apertures 41 (and four lenses 49). Preferably, each photodetector 35 is associated with exactly four apertures 41.
  • the structure 33 is preferably divided into pixels 37.
  • the term pixel is used throughout the description to denote a part of the image sensor 13 comprising a single photodetector 35; the term may apply at the scale of the image sensor 13 but also at the scale of the structure 33.
  • at the scale of the structure 33, a pixel is the whole stack constituting the structure 33 vertically in line with a pixel 37 of the sensor 13.
  • the term pixel 37, unless otherwise specified, refers to a pixel at the scale of the structure 33.
  • a pixel 37 corresponds to each part of the structure 33 comprising, among other things, a photodetector 35 surmounted by four openings 41, themselves surmounted by four lenses 49.
  • Each pixel 37 is, preferably, of substantially square shape seen in a direction perpendicular to the upper face of the image sensor 13.
  • the surface area of each pixel corresponds to a square whose side dimension is between 32 µm and 100 µm, preferably between 50.8 µm and 80 µm.
  • Each pixel 37 can be associated with a number of lenses 49 different from four, depending on the diameter of the lenses 49 and the dimensions of the pixels 37.
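As a numeric illustration of this dependency, the sketch below counts the lenses tiling one square pixel on a square grid; the pitch value and the tiling assumption are illustrative, not from the patent. With a 50.8 µm pixel and a 25.4 µm lens pitch it reproduces the four lenses per pixel mentioned above.

```python
def lenses_per_pixel(pixel_side_um, lens_pitch_um):
    """Number of lenses (and apertures) covering one square pixel,
    assuming the lenses tile the pixel on a square grid of the given
    pitch (a geometric illustration, not a rule from the patent)."""
    per_side = pixel_side_um // lens_pitch_um
    return int(per_side) ** 2

n = lenses_per_pixel(50.8, 25.4)  # 2 x 2 = 4 lenses, matching the text
```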
  • a pixel 37 comprises a photodetector 35 surmounted by four openings 41.
  • the angular filter 15 comprising the openings 41 can be rolled onto the image sensor 13 without prior alignment of the angular filter 15 on the image sensor 13.
  • Certain lenses 49 and apertures 41 can then be located, in the orientation of the stack, that is to say along the Y direction, straddling two photodetectors 35.
  • Figure 5 shows, by two top views, partial and schematic, two embodiments of a color filter 50.
  • FIG. 5 illustrates a color filter 50, preferably intended to be positioned on the upper face of the angular filter 15 (FIG. 4).
  • the color filter 50 is divided into two parts.
  • One or more first parts 501 of the color filter 50 are adapted to pass, according to an embodiment illustrated by views B1 and B2, all of the visible and infrared radiation, preferably only the visible radiation, more preferably only part of the visible radiation, in particular only the green radiation.
  • the first parts 501 (G) are adapted to pass, according to an embodiment illustrated by views A1 and A2, only at least one wavelength in the band from 400 nm to 600 nm, more preferably in the band from 470 nm to 600 nm.
  • the first parts 501 are adapted to allow only the wavelength equal to 530 nm or 500 nm to pass.
  • One or more second parts 502 (R) of the color filter 50 are adapted to block all the wavelengths outside the 600 nm to 1100 nm band, preferably outside the 630 nm to 940 nm band.
  • each second part 502 of the color filter 50 is formed on the surface of the angular filter 15 so that a pixel 37 is covered by each second part 502.
  • each second part 502 of the color filter 50 has a square shape in the view of FIG. 5.
  • the surface of each second part 502 of the color filter 50 is equal to the size of a pixel, that is to say a square of approximately 50.8 µm by 50.8 µm.
  • the repetition pitch of the second parts 502 of the color filter 50 is between two pixels 37 and twenty pixels 37.
  • the repetition pitch of the second parts 502 is approximately ten pixels 37 along the Z axis and ten pixels 37 along the X axis.
  • nine pixels separate two consecutive pixels along the Z (or X) axis covered by second parts 502.
  • a single pixel is covered by a second part 502.
  • the second parts 502 are arranged so that, for example within a set of eight pixels (two columns of pixels and four rows of pixels), two second parts 502 are formed on the surface of the angular filter 15 so as to cover two pixels of the same column.
  • the second parts 502 are arranged so that, for example within a set of eight pixels (two columns of pixels and four rows of pixels), two second parts 502 are formed on the surface of the angular filter 15 so as to cover two pixels of two different columns.
  • the repetition pitch of the second parts 502 is two pixels, however they are easily adaptable for a repetition pitch of the second parts greater than two pixels.
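The repetition-pitch rule described above (one IR-pass second part 502 every N pixels along each axis, so that N−1 pixels separate two consecutive covered pixels) can be sketched as a boolean mask. This is an illustrative model only; the array size, the pitch value, and the function name `second_part_mask` are assumptions for the example, not elements of the description.

```python
import numpy as np

def second_part_mask(rows, cols, pitch):
    """Boolean mask of pixels 37 covered by an IR-pass second part 502.

    One pixel per pitch x pitch tile is covered; with pitch = 10, nine
    pixels separate two consecutive covered pixels along each axis.
    """
    mask = np.zeros((rows, cols), dtype=bool)
    mask[::pitch, ::pitch] = True
    return mask

# With a pitch of ten pixels on a 40 x 40 pixel array,
# 16 pixels carry a second part 502.
mask = second_part_mask(40, 40, 10)
```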
  • the material constituting the second part 502 is a material transparent only to wavelengths between 600 nm and 1100 nm (near-infrared filter), preferably between 630 nm and 940 nm, for example an organic resin comprising a dye suitable for filtering out all the wavelengths not included in the aforementioned band.
  • the second parts 502 can, for example, be made from interference filters.
  • the other pixels 37 are covered by the first part 501 of the color filter 50.
  • the first part 501 is contiguous between two neighboring pixels 37, that is to say that the first part 501 is not pixelated and that it is formed simultaneously on all the pixels considered of the image sensor 13.
  • the constituent material of the first part 501 is air or a partial vacuum.
  • the material constituting the first part 501 is a material transparent only to wavelengths between 400 nm and 600 nm (visible filter), preferably between 470 nm and 600 nm, for example a resin comprising the colorant known under the trade name "Orgalon Green 520" or a resin from the "COLOR MOSAIC” commercial range from the manufacturer Fujifilm.
  • the first part 501 can, for example, be made from interference filters.
  • the material constituting the first part 501 is a material transparent only at 500 nm (cyan filter) or transparent at only 530 nm (green filter), for example a resin comprising the dye known as commercial name "PC GREEN 123P” or a resin from the "COLOR MOSAIC” commercial range from the manufacturer Fujifilm.
  • the first part 501 can, for example, be made from interference filters.
  • FIG. 6 represents, by a partial and schematic sectional view, another example of an image acquisition device.
  • FIG. 6 illustrates a device 52 similar to device 11 illustrated in FIG. 1 except that it comprises two polarizers.
  • the device 52 comprises: at least one first polarizer 53; and a second polarizer 55.
  • Each first polarizer 53 is located in the device 52 so that the radiation 21 coming from the first source 19 preferably passes through the first polarizer 53 before reaching the optical sensor 13. More specifically, the radiation 21 passes through the first polarizer 53, is then reflected by the hand 27, and crosses the second polarizer 55 before reaching the optical sensor 13.
  • the first polarizer 53 thus laterally covers the source 19 (along the Y axis).
  • the number of first polarizers 53 is equal to the number of first sources 19, such that each first source 19 is associated with a single first polarizer 53 and each first polarizer 53 is associated with a single first source 19.
  • Each first polarizer 53 thus has an area (in the plane XY) equal to or greater than the area of the source 19 with which it is associated.
  • the number of first polarizers 53 is less than the number of first sources 19, the area of each first polarizer then being greater than the area of each first source 19.
  • each first polarizer is associated with, and laterally overlaps, more than one first source 19.
  • device 52 comprises a single polarizer which laterally overlaps all of the sources 19.
  • the second polarizer 55 is located between the angular filter 15 and the image sensor 13 or between the layer 17 and the angular filter 15.
  • the first polarizer(s) 53 and the second polarizer 55 are rectilinear, or in other words linear.
  • the first polarizer(s) 53 polarize in a first direction which will also be called, hereinafter, the horizontal direction.
  • the second polarizer 55 is composed of: one or more first parts which polarize in a second direction, perpendicular to the first direction, which will also be referred to below as the vertical direction ; and one or more second parts which polarize in the horizontal direction.
  • the light source 19 emits radiation 21 of low divergence, that is to say that the rays of the radiation 21 are included in a cone of radiation whose angle is less than 15°, preferably less than 5°.
  • the light source 19 is coupled to an angular filter (not shown), located between the source 19 and the first polarizer 53 or between the first polarizer 53 and the layer 17.
  • the aforementioned angular filter is adapted to block all the rays emitted by the source 19 whose incidence, measured with respect to the axis Z, is greater than 15°, preferably greater than 5°.
  • Figure 7 shows, by a partial and schematic top view, an embodiment of the device illustrated in Figure 6.
  • FIG. 7 illustrates an embodiment of the arrangement of the first parts 57 and second parts 59 of the second polarizer 55.
  • the first parts 57 and each second part 59 of the polarizer 55 are formed on the surface of the layer 17 so that one pixel 37 out of two is covered by a first part 57 and one pixel 37 out of two, different from the previous ones, is covered by a second part 59.
  • two of the pixels 37 are, for example, covered by first parts 57 and two of the pixels 37, different from the previous ones, are, for example, covered by second parts 59.
  • each first part 57 and each second part 59 of the second polarizer 55 has a substantially square shape in the view of FIG. 7.
  • the areas of each first part 57 and each second part 59 of the second polarizer 55 correspond to a square of approximately 50.8 µm by 50.8 µm.
  • the second polarizer 55 is, for example, formed by successive depositions of the first parts 57 and the second parts 59, on the surface of the layer 17.
  • the repetition pitch of the first parts 57 can be greater than two pixels.
  • the repetition pitch of the first parts can be between two pixels 37 and twenty pixels 37, preferably between five pixels 37 and fifteen pixels 37 and more preferably equal to around ten pixels 37.
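If the "one pixel 37 out of two" arrangement described for FIG. 7 is read as a checkerboard (an interpretation assumed here for illustration; the figure itself is not reproduced), the layout of the first parts 57 and second parts 59 can be modeled as follows. The function name `polarizer_layout` and the 4 x 4 array size are assumptions for the example.

```python
import numpy as np

def polarizer_layout(rows, cols):
    """Checkerboard model of the FIG. 7 arrangement: True where a pixel 37
    is covered by a vertically polarizing first part 57, False where it is
    covered by a horizontally polarizing second part 59 (one pixel out of
    two each)."""
    i, j = np.indices((rows, cols))
    return (i + j) % 2 == 0

# Half of the 16 pixels carry a first part 57, the other half a second part 59.
layout = polarizer_layout(4, 4)
```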
  • Figure 8 shows, by a top view, partial and schematic, another embodiment of the device illustrated in Figure 6.
  • FIG. 8 illustrates another embodiment of the arrangement of the first parts 57 and second parts 59 of the second polarizer 55.
  • the first parts 57 and second parts 59 of the second polarizer 55 are formed arbitrarily on the surface of the sensor 13.
  • each first part 57 of the second polarizer 55 has a larger area (in the XY plane) than the area of each first part 57 of the second polarizer 55 shown in Figure 7.
  • each first part 57 of the second polarizer 55 is formed on the layer 17 without prior alignment of the latter with the photodetectors 35 or the underlying lenses 49.
  • each first part 57 has a substantially square shape in the view of FIG. 8 and an area large enough to cover, on the upper face of layer 17, at least one pixel 37 (or a photodetector 35) regardless of its location on the upper face of layer 17.
  • the area of each first part 57 is at least equal to the area of four pixels 37.
  • the area of each first part 57 is between the area of four pixels 37 and the area of six pixels 37.
  • the area of each first part 57 is equal to the area of four pixels 37.
  • the part of the upper face of layer 17 not covered by the first parts 57 is covered by second parts 59.
  • a calibration step can be provided to determine the positions of the pixels covered by the first parts 57, for example by illuminating the image acquisition device with radiation polarized, for example, horizontally, so that only the pixels covered by the second parts 59 pick up radiation.
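The calibration step just described can be sketched as a simple threshold on a calibration frame: under horizontally polarized illumination, pixels under the second parts 59 (aligned polarizers) receive light, while pixels under the first parts 57 (crossed polarizers) stay dark. The function name, the midpoint threshold heuristic, and the synthetic frame are illustrative assumptions, not elements of the description.

```python
import numpy as np

def locate_first_parts(calibration_frame, threshold=None):
    """Classify pixels of a frame acquired under horizontally polarized
    illumination: dark pixels sit under a first part 57 (crossed
    polarizers), bright pixels under a second part 59 (aligned
    polarizers). The midpoint threshold is a simplifying assumption."""
    frame = np.asarray(calibration_frame, dtype=float)
    if threshold is None:
        threshold = (frame.min() + frame.max()) / 2.0
    return frame < threshold  # True where a first part 57 covers the pixel

# Synthetic calibration frame: a dark 2x2 patch (crossed polarizers)
# inside a bright field (aligned polarizers).
frame = np.full((8, 8), 200.0)
frame[2:4, 2:4] = 10.0
first_parts = locate_first_parts(frame)
```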
  • the second polarizer 55 is, for example, formed by successive depositions of the first parts 57 and the second parts 59 on the surface of the layer 17.
  • the repetition pitch of the first parts 57 is between a distance corresponding to the dimension of three pixels and a distance corresponding to the dimension of twenty pixels.
  • the repetition pitch is substantially equal to a distance corresponding to the dimension of ten pixels.
  • the distribution of the first parts 57 is aligned, that is to say that the repetition is done in rows and columns, or offset, that is to say that the distribution is shifted by one or more pixels from one row to the next or from one column to the next.
  • the distribution of the second parts 59 is aligned, that is to say that the repetition is done in rows and columns, or offset, that is to say that the distribution is offset by one or more pixels from one row to the next or from one column to the next.
  • An advantage of the embodiments and modes of implementation described previously in relation to FIGS. 6 to 8 is that they make it possible to simultaneously acquire an image under the radiation 21 polarized horizontally and then, after reflection on the hand 27, horizontally (that is to say an image under the radiation 21 having passed through two aligned polarizers), and an image under the radiation 21 polarized horizontally and then, after reflection on the hand 27, vertically (that is to say an image under the radiation 21 having passed through two crossed polarizers).
  • FIG. 9 represents, by a block diagram, an example of implementation of an image acquisition method.
  • FIG. 9 illustrates a method allowing the acquisition of images and the processing thereof in the case of a device comprising the sources 19 and 23 (FIGS. 1 and 2).
  • a first stream relates to the acquisition of images by the image sensor 13.
  • a second stream relates to the processing carried out on the acquired images.
  • the first flow begins with a step 61 of positioning the hand 27 on the upper face of the layer 17 (finger on display). Step 61 is followed by a step 63 in which the position of the hand 27 is detected (Detecting finger position) and localized on the layer 17.
  • the detection of the position of the hand 27 can be carried out by a detection element included in the image acquisition device or by an element internal to the image sensor 13, for example one of its electrodes.
  • the first stream comprises, in a subsequent step 65, the switching on of the sources 19 and 23 (Visible source and IR source ON).
  • Step 65 is followed by a step 67 of acquiring an image, of dividing this image into two distinct images depending on whether the pixels are associated with a first part 57 or with a second part 59 of the second polarizer 55, and of storing these images (Image acquisition).
  • the first image is the image which is associated with the photodetectors 35 (FIG. 4) surmounted by a first part 57 of the second polarizer 55.
  • the radiation 21 is polarized horizontally (H) by the first polarizer 53 then, after reflection on the hand 27, is polarized vertically (V) by the first part 57 of the second polarizer 55, before reaching the image sensor 13.
  • the second image is the image associated with the photodetectors 35 (FIG. 4) surmounted by a second part 59 of the second polarizer 55.
  • the radiation 21 is polarized horizontally (H) by the first polarizer 53 then, after reflection on the hand 27, is polarized horizontally (H) by the second part 59 of the second polarizer 55, before reaching the image sensor 13.
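The splitting of step 67 can be sketched as follows: the raw frame is separated into the crossed-polarizer (HV) image and the parallel-polarizer (HH) image according to which part of the second polarizer 55 covers each pixel. The function name `split_images`, the use of NaN to mark pixels belonging to the other image, and the 2 x 2 layout are illustrative assumptions.

```python
import numpy as np

def split_images(raw_frame, first_part_mask):
    """Step 67 sketch: separate the raw frame into the crossed-polarizer
    (HV) image and the parallel-polarizer (HH) image; pixels belonging
    to the other image are marked as missing (NaN)."""
    raw = np.asarray(raw_frame, dtype=float)
    hv = np.where(first_part_mask, raw, np.nan)  # pixels under first parts 57
    hh = np.where(first_part_mask, np.nan, raw)  # pixels under second parts 59
    return hv, hh

raw = np.arange(4.0).reshape(2, 2)
mask = np.array([[True, False], [False, True]])  # hypothetical polarizer layout
hv, hh = split_images(raw, mask)
```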
  • the second stream comprises two phases respectively dedicated to the separate processing of the two images and to the processing of the combination of the two images.
  • the first phase of the second stream comprises the processing of the first acquired image (HV output of block 67) in order to extract therefrom, in a step 69, an image comprising volume information on the hand 27 (Volume information (veins)).
  • The term volume information refers to information whose acquisition requires the penetration of light into the volume of the hand.
  • the information concerning the veins, for example their number, their shape or their arrangement within the hand, is, for example, volume information.
  • the first phase of the second stream further comprises the processing of the second acquired image (HH output of block 67) in order to extract therefrom, in a step 71, an image comprising surface and volume information on the hand 27 (Surface and volume information).
  • the second phase of the second stream comprises a step 73 during which the information coming from the first image and the information coming from the second image are processed together in order to extract only surface information (Surface information (fingerprint)).
  • This may include the determination of a third image corresponding to the difference, possibly weighted, between the second image and the first image.
  • The term surface information refers to information whose acquisition requires the reflection of light on the surface of the hand.
  • the information concerning the fingerprints is, for example, surface information. It is, for example, an image of the grooves and ridges of fingerprints.
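The combination of step 73, a possibly weighted difference between the second (HH) image and the first (HV) image, can be sketched as below. The weight value, the clipping of negative values, and the sample arrays are illustrative assumptions; the description only states that the difference may be weighted.

```python
import numpy as np

def surface_image(image_hh, image_hv, weight=1.0):
    """Step 73 sketch: subtract the (possibly weighted) crossed-polarizer
    image, which carries only volume information, from the parallel-
    polarizer image, which carries surface and volume information; what
    remains is mainly surface information such as fingerprint ridges."""
    diff = np.asarray(image_hh, dtype=float) - weight * np.asarray(image_hv, dtype=float)
    return np.clip(diff, 0.0, None)  # negative residues clipped (a modeling choice)

hh = np.array([[10.0, 30.0], [50.0, 20.0]])  # surface + volume information
hv = np.array([[5.0, 25.0], [10.0, 25.0]])   # volume information only
surf = surface_image(hh, hv)
```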
  • Figure 10 shows, in a sectional view, partial and schematic, a structure comprising a polarizer.
  • FIG. 10 illustrates an embodiment of a structure 75 in which the second polarizer 55 has been formed on the surface of a support 77.
  • the second polarizer 55 illustrated in Figure 10 is identical to the second polarizer 55 illustrated in Figure 6.
  • the second polarizer 55 is however formed on the support 77, unlike Figure 6 where the polarizer 55 is formed on the image sensor 13. This advantageously makes it possible to produce the second polarizer 55 separately from the other elements of the image acquisition device 52.
  • the support 77 can be made of a transparent polymer which does not absorb, at least, the wavelengths considered, here in the visible and infrared range.
  • This polymer may in particular be poly(ethylene terephthalate) (PET), poly(methyl methacrylate) (PMMA), cyclic olefin polymer (COP), polyimide (PI) or polycarbonate (PC).
  • Support 77 is preferably PET.
  • the thickness of the support 77 can vary from 1 µm to 100 µm, preferably from 10 µm to 50 µm.
  • the support 77 can correspond to a colored filter, to a half-wave plate or to a quarter-wave plate.
  • the arrangement of the first parts 57 and the second parts 59 of the second polarizer 55 illustrated in FIG. 10 is similar to the arrangement of the parts 57 and 59 of the second polarizer 55 illustrated in FIGS. 7 and 8.
  • the structure 75 is mounted in the image acquisition device 52 of the figure 6, replacing the second polarizer 55, between the angular filter 15 and the layer 17.
  • the structure 75 is mounted in the image acquisition device 52 of FIG. 6, replacing the second polarizer 55, between the angular filter 15 and the image sensor 13.
  • the polarizer 55 is formed under the support 77.
  • the lower face of the polarizer 55 is then in contact with the upper face of the image sensor 13 or in contact with the upper face of the angular filter 15, depending on whether the structure 75 is positioned between the angular filter 15 and the layer 17 or between the angular filter 15 and the image sensor 13.

EP21758381.4A 2020-08-17 2021-08-12 System zur erfassung von bildern Withdrawn EP4196904A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR2008537A FR3113431B1 (fr) 2020-08-17 2020-08-17 Système d'acquisition d'images
PCT/EP2021/072465 WO2022038032A1 (fr) 2020-08-17 2021-08-12 Systeme d'acquisition d'images

Publications (1)

Publication Number Publication Date
EP4196904A1 true EP4196904A1 (de) 2023-06-21

Family

ID=74045584

Family Applications (1)

Application Number Title Priority Date Filing Date
EP21758381.4A Withdrawn EP4196904A1 (de) 2020-08-17 2021-08-12 System zur erfassung von bildern

Country Status (6)

Country Link
US (1) US20240013569A1 (de)
EP (1) EP4196904A1 (de)
JP (1) JP2023538624A (de)
CN (1) CN216817444U (de)
FR (1) FR3113431B1 (de)
WO (1) WO2022038032A1 (de)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3135794A1 (fr) * 2022-05-19 2023-11-24 Isorg Filtre optique pour photodétecteurs

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU426280B2 (en) 1968-05-15 1972-07-19 Touma Door Company Pty. Limited Sliding door
US7623689B2 (en) * 2003-11-18 2009-11-24 Canon Kabushiki Kaisha Image pick-up apparatus including luminance control of irradiation devices arranged in a main scan direction
JP6075069B2 (ja) * 2013-01-15 2017-02-08 富士通株式会社 生体情報撮像装置及び生体認証装置ならびに生体情報撮像装置の製造方法
JP2017196319A (ja) * 2016-04-28 2017-11-02 ソニー株式会社 撮像装置、認証処理装置、撮像方法、認証処理方法およびプログラム
US10713458B2 (en) * 2016-05-23 2020-07-14 InSyte Systems Integrated light emitting display and sensors for detecting biologic characteristics
JP7144814B2 (ja) * 2018-12-28 2022-09-30 株式会社ジャパンディスプレイ 検出装置

Also Published As

Publication number Publication date
CN216817444U (zh) 2022-06-24
FR3113431A1 (fr) 2022-02-18
US20240013569A1 (en) 2024-01-11
FR3113431B1 (fr) 2023-09-15
WO2022038032A1 (fr) 2022-02-24
JP2023538624A (ja) 2023-09-08


Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20230119

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
18W Application withdrawn

Effective date: 20231107