EP4196905A1 - Image acquisition device - Google Patents

Image acquisition device

Info

Publication number
EP4196905A1
EP4196905A1
Authority
EP
European Patent Office
Prior art keywords
parts
color filter
filter
wavelengths
band
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP21758384.8A
Other languages
English (en)
French (fr)
Inventor
Jérôme MICHALLON
Delphine DESCLOUX
Benjamin BOUTHINON
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Isorg SA
Original Assignee
Isorg SA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Isorg SA filed Critical Isorg SA
Publication of EP4196905A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/1382Detecting the live character of the finger, i.e. distinguishing from a fake or cadaver finger
    • G06V40/1388Detecting the live character of the finger, i.e. distinguishing from a fake or cadaver finger using image processing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/1382Detecting the live character of the finger, i.e. distinguishing from a fake or cadaver finger
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00Optical elements other than lenses
    • G02B5/20Filters
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • G06V10/12Details of acquisition arrangements; Constructional details thereof
    • G06V10/14Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/143Sensing or illuminating at different wavelengths
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/13Sensors therefor
    • G06V40/1318Sensors therefor using electro-optical elements or layers, e.g. electroluminescent sensing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/1365Matching; Classification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/40Spoof detection, e.g. liveness detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/40Spoof detection, e.g. liveness detection
    • G06V40/45Detection of the body part being alive
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N23/11Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths for generating image signals from visible and infrared light wavelengths

Definitions

  • This description relates generally to image acquisition devices and more particularly to fingerprint acquisition devices.
  • Fingerprint acquisition devices are used in many fields in order, for example, to secure devices, secure buildings, control access or control the identities of individuals.
  • One embodiment overcomes all or part of the drawbacks of known fingerprint acquisition devices.
  • One embodiment provides an image acquisition device comprising: a single sensor with organic photodetectors; an angular filter; a color filter comprising one or more first parts adapted to pass at least one wavelength in the visible and one or more second parts filtering out the wavelengths outside the red and/or the near infrared, each second part having a surface area approximately equal to the surface area of one photodetector or a surface area greater than or equal to the surface area of four photodetectors; an image processing unit adapted to extract information relating to the fingerprints and veins of a finger imaged by the sensor; and a waveguide layer illuminated in its plane by a first light source and a second light source.
  • the first source emits only in the band of wavelengths from 400 nm to 600 nm; and/or the second source emits only in the band of wavelengths from 600 nm to 1100 nm.
  • the first source emits only in the band from 460 nm to 600 nm; and/or the second source emits only in the band from 680 nm to 940 nm.
  • the first source faces the second source.
  • an image acquisition device comprising a single sensor with organic photodetectors, an angular filter, and a color filter, distinct from the angular filter, comprising: one or more first parts adapted to pass at least one wavelength in the visible, and one or more second parts filtering out the wavelengths outside the red and/or the near infrared, each second part having an area approximately equal to the area of one photodetector or an area greater than or equal to the area of four photodetectors.
  • the first parts of the color filter are adapted to pass at least one wavelength in the band from 400 nm to 600 nm, and/or the second parts of the color filter filter out the wavelengths outside the band from 600 nm to 1100 nm.
  • some of the first parts of the color filter are adapted to pass only the wavelengths in the band from 460 nm to 600 nm, and some of the first parts are adapted to pass only the wavelengths in the band from 500 nm to 580 nm; and/or some of the second parts of the color filter are adapted to filter out the wavelengths outside the band from 600 nm to 700 nm, and some of the second parts are adapted to filter out the wavelengths outside the band from 700 nm to 1100 nm.
  • all the first and second parts of the color filter are adapted to pass at least the wavelengths in the band from 700 nm to 1100 nm; some of the first parts or of the second parts of the color filter are adapted to additionally pass only the wavelengths in the band from 400 nm to 500 nm; some of the first parts or of the second parts are adapted to additionally pass only the wavelengths in the band from 500 nm to 600 nm; some of the first parts or of the second parts of the color filter are adapted to additionally pass only the wavelengths in the band from 600 nm to 700 nm; and some of the first parts or of the second parts are adapted to filter out the wavelengths outside the band from 700 nm to 1100 nm.
  • each photodetector delimits a pixel, each pixel having a substantially square base, the length of the sides of each pixel preferably being of the order of 50 μm.
  • each second part has an area between the area of four photodetectors and the area of six photodetectors, preferably, the area of each second part being approximately equal to the area of four photodetectors.
  • the device comprises a third light source, or a light source, adapted to emit: at least one wavelength between 400 nm and 600 nm, preferably between 460 nm and 600 nm; and at least one wavelength between 600 nm and 1100 nm, preferably between 680 nm and 940 nm.
  • the device comprises in order: the image sensor; the angular filter; the third light source or the light source; and the color filter.
  • the device comprises in order: the image sensor; the angular filter; the color filter; and the third light source or the light source.
  • the device comprises in order: the image sensor; the color filter; the angular filter; and the third light source or the light source.
  • a support is present in contact with the color filter, the support being made of a polymer that is transparent in the visible and the infrared, preferably of PET.
  • Another embodiment provides a method for detecting a real or false finger with the image acquisition device described above, comprising the following steps: a) acquiring an image of the finger at a given wavelength; b) determining a response signal from said image; and c) comparing the determined response signal with a reference value according to a comparison criterion and determining that the finger is a false finger if the comparison criterion is not met, wherein steps a), b) and c) are repeated for at least two different wavelengths.
  • step b) comprises the determination of a distribution of gray levels of the acquired image, the response signal being equal to the central gray level of the distribution.
  • the comparison criterion consists in determining whether the determined response signal corresponds to the reference value to within a threshold.
  • the threshold corresponds to the difference between the reference value and the gray level value at mid-height of a reference distribution.
  • steps a), b) and c) are repeated for at least three different wavelengths.
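  • Steps a) to c) above can be sketched as follows. This is a minimal illustration only, assuming grayscale images held as NumPy arrays; the function names, the use of the most frequent gray level as the response signal, and the per-wavelength thresholds are assumptions, not the exact procedure of the claims.

```python
import numpy as np

def finger_response(image):
    # Step b): take the central gray level of the image's gray-level
    # distribution (here, simply its most frequent value).
    values, counts = np.unique(image, return_counts=True)
    return int(values[np.argmax(counts)])

def is_real_finger(images_by_wavelength, references, thresholds):
    # Steps a) to c), repeated per wavelength: the finger is declared
    # false as soon as one response misses its reference value by more
    # than the allowed threshold.
    for wl, image in images_by_wavelength.items():
        if abs(finger_response(image) - references[wl]) > thresholds[wl]:
            return False
    return True
```

For instance, with images acquired at 530 nm, 680 nm and 850 nm, a spoof whose response matches the reference at only two of the three wavelengths is rejected.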
  • Figure 1 illustrates, in a partial, schematic sectional view, an embodiment of an image acquisition device;
  • Figure 2 shows examples of true- and false-finger responses, illuminated by radiation, determined from images acquired by an image acquisition device;
  • Figure 3 illustrates, in a partial, schematic sectional view, another embodiment of an image acquisition device;
  • Figure 4 shows, in a partial, schematic top view, an embodiment of the device illustrated in Figure 3;
  • Figure 5 illustrates, in a partial, schematic sectional view, another embodiment of an image acquisition device;
  • Figure 6 shows, in a partial, schematic top view, the device illustrated in Figure 5;
  • Figure 7 illustrates, in a partial, schematic sectional view, yet another embodiment of an image acquisition device;
  • Figure 8 shows, in a partial, schematic top view, an embodiment of the device illustrated in Figure 7;
  • Figure 9 illustrates, in a partial, schematic sectional view, an embodiment of a structure provided with a color filter;
  • Figure 10 shows, in a partial, schematic top view, an embodiment of the structure illustrated in Figure 9;
  • Figure 11 illustrates, in a partial, schematic sectional view, yet another embodiment of an image acquisition device;
  • Figure 12 illustrates, in a partial, schematic sectional view, yet another embodiment of an image acquisition device;
  • Figure 13 illustrates, in a partial, schematic sectional view, yet another embodiment of an image acquisition device; and
  • Figure 14 is a representation of an example of the evolution of the area of a real finger illuminated by radiation.

Description of embodiments
  • a layer or a film is said to be opaque to radiation when the transmittance of the radiation through the layer or the film is less than 10%.
  • a layer or a film is said to be transparent to radiation when the transmittance of the radiation through the layer or the film is greater than 10%.
  • all the elements of the optical system that are opaque to a radiation have a transmittance less than half, preferably less than a fifth, more preferably less than a tenth, of the lowest transmittance among the elements of the optical system transparent to said radiation.
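  • The transmittance definitions above amount to simple numeric rules; a minimal sketch (the 10% cut-off and the factor-of-ten rule come from the definitions above; the function names are illustrative):

```python
def is_opaque(transmittance):
    # A layer or film is opaque when less than 10% of the radiation
    # passes through it; above 10% it counts as transparent.
    return transmittance < 0.10

def opacity_rule_holds(opaque_ts, transparent_ts, factor=10):
    # Stricter system-level rule: every opaque element must transmit
    # less than 1/factor of the weakest (lowest-transmittance)
    # transparent element of the optical system.
    return max(opaque_ts) < min(transparent_ts) / factor
```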
  • the term "useful radiation” is used to refer to the electromagnetic radiation passing through the optical system in operation.
  • optical element of micrometric size refers to an optical element formed on one face of a support whose maximum dimension, measured parallel to said face, is greater than 1 μm and less than 1 mm.
  • the wavelength of a radiation, or central or main wavelength of the radiation is the wavelength at which the maximum of the spectrum of the radiation is reached.
  • each micrometric-sized optical element corresponds to a micrometric-sized lens, or microlens, composed of two diopters.
  • each micrometric-sized optical element can also be implemented with other types of micrometric-sized optical elements, each being able to correspond, for example, to a micrometric-sized Fresnel lens, a micrometric-sized gradient-index lens or a micrometric-sized diffraction grating.
  • visible light is electromagnetic radiation whose wavelength is between 400 nm and 700 nm.
  • red light is electromagnetic radiation whose wavelength is between 600 nm and 700 nm.
  • infrared radiation is electromagnetic radiation whose wavelength is between 700 nm and 1 mm.
  • near-infrared radiation is infrared radiation whose wavelength is between 700 nm and 1.1 μm.
  • the expression "it comprises only the elements" means that it comprises at least 90% of the elements, preferably at least 95% of the elements.
  • Figure 1 illustrates, in a partial, schematic sectional view, an embodiment of an image acquisition device 101.
  • the image acquisition device 101 comprises, from bottom to top in the orientation of the figure: a single organic image sensor 11; an optical filter 13, preferably an angular filter; and a so-called waveguide layer 19 located directly above the optical filter 13.
  • the device 101 comprises a first light source 151 emitting at least one wavelength in the visible and a second light source 153 emitting at least one wavelength in the red and/or near infrared.
  • the two sources 151 and 153 preferably face each other. Sources 151 and 153 are laterally coupled to layer 19 and are located outside the stack formed by the sensor 11, the optical filter 13 and the layer 19.
  • the first source 151 is adapted to emit a first radiation, or green radiation, 171 comprising at least one wavelength in the band from 400 nm to 600 nm.
  • the first radiation 171 comprises only first electromagnetic waves whose wavelengths are in the band from 400 nm to 600 nm, preferably in the band from 460 nm to 600 nm. More preferably, the first radiation 171 comprises only first electromagnetic waves whose wavelengths are approximately equal to 530 nm (green) or 500 nm (cyan).
  • the second source 153 is adapted to emit a second radiation, or red radiation, 173 comprising only second electromagnetic waves whose wavelengths are between 600 nm and 1100 nm, preferably between 680 nm and 940 nm.
  • the second electromagnetic waves can be monochromatic or polychromatic.
  • the device 101 captures the image response of an object 21, partially represented, preferably a finger.
  • the device 101 comprises a processing unit 22 comprising, for example, a microprocessor (not shown).
  • the processing unit 22 is, for example, a computer or a portable telephone (smartphone).
  • the sensor 11 comprises photodetectors (not shown), preferably arranged in matrix form.
  • the photodetectors preferably all have the same structure and the same properties/characteristics. In other words, all the photodetectors are substantially identical except for manufacturing differences.
  • Sensor 11 is preferably adapted to pick up radiation 171 and 173.
  • the layer 19 called the waveguide layer comprises a structure of two or three media of different refractive indices (not shown in Figure 1).
  • the layer 19 comprises sub-layers making it possible to increase the hardness of the final layer 19 but which may not be involved in wave propagation.
  • the refractive index of a medium is defined as the refractive index of the material constituting the medium over the range of wavelengths of the radiation captured by the image sensor.
  • the refractive index is considered substantially constant over the range of wavelengths of the useful radiation, for example equal to the average of the refractive index over the range of wavelengths of the radiation picked up by the image sensor.
  • a waveguide layer is structurally adapted to allow the confinement and propagation of electromagnetic waves.
  • the media are, for example, arranged in the form of a stack of three sub-layers: a central layer sandwiched between an upper sheath and a lower sheath, the refractive indices of the materials making up the sheaths being lower than the refractive index of the material making up the central layer, the lower sheath being located on the side of the optical filter 13.
  • microstructures are formed between the central layer and the lower sheath.
  • the microstructures preferably have the shapes of prisms or teeth whose points are oriented in the direction of the object to be imaged and whose bases are flush with the rear face of the central layer.
  • Each microstructure has a face slightly inclined in the direction of propagation of the wave so that the propagated wave is deflected and follows the geometry of the microstructure.
  • the inclination of the face of the microstructure is, for example, between 5° and 80°.
  • the inclination is preferably of the order of 45°.
  • the microstructures are not evenly distributed along the wave path.
  • the density of the microstructures preferably increases with distance from the source of the radiation deflected by these microstructures.
  • a first network of microstructures 181 is, for example, adapted to guide the first waves of the first radiation 171 emitted by the first source 151.
  • the first network then comprises microstructures 181 inclined in the direction of the waves emitted by the first source 151.
  • a second network of microstructures 183 is, for example, adapted to guide the second waves of the second radiation 173 emitted by the second source 153.
  • the second network then comprises microstructures 183 inclined in the direction of the waves emitted by the second source 153.
  • the first network of microstructures 181 extends from the side edge of the layer 19, adjacent to the source 151, up to, at most, the side edge, opposite the source 151, of the layer 19.
  • the second network of microstructures 183 extends from the side edge of the layer 19 adjacent to the source 153, up to, at most, the side edge of the layer 19 opposite the source 153.
  • the layer 19 comprises a single network of microstructures, in which the geometry of the microstructures is adapted to guide the first waves of the first radiation 171 from the first source 151 and the second waves of the second radiation 173 from the second source 153.
  • the device 101 comprises two superimposed waveguide layers, one of which covers the upper face of the optical filter 13, the first waveguide layer receiving the first radiation emitted by the first source 151, and the second waveguide layer receiving the second radiation emitted by the second source 153.
  • the first waveguide layer then comprises the first network of microstructures 181 and the second waveguide layer comprises the second network of microstructures 183.
  • the sources 151 and 153 face each other.
  • sources 151 and 153 are positioned on the periphery of layer 19.
  • source 151 is located to the left of layer 19
  • source 153 is located to the right of layer 19.
  • the sources 151 and 153 are located indifferently relative to each other.
  • the two sources 151 and 153 are positioned, for example, on the same side of the layer 19, one behind the other, one next to the other or so that the rays 171 and 173 are orthogonal.
  • the sources 151 and 153 are switched on one after the other so as to successively image the finger 21 only by the first radiation 171, then the finger 21 only by the second radiation 173.
  • provision is made to distinguish a real finger from a false finger from images of the finger acquired by an image acquisition device, in particular the image acquisition device described above, the finger being illuminated by at least two radiations at different wavelengths.
  • a signal called the finger response is determined from each acquired image; the response is equal to the average, possibly weighted, of the values of the pixels of the acquired image.
  • Response reference values for a real finger can be stored for different wavelengths. The detection of a false finger can be carried out by comparing the response of the tested finger with the reference response for at least two different wavelengths. If the response of the tested finger differs from the reference response for at least one wavelength, the tested finger is considered to be a fake finger.
  • the device 101 comprises more than two light sources, emitting identical or different radiation in visible wavelengths.
  • the device 101 then comprises a number of waveguide layers less than or equal to the number of sources, each of the waveguide layers comprising at least one array of microstructures.
  • Figure 2 shows examples of true and false finger responses, illuminated by radiation, determined from images acquired by an image acquisition device.
  • FIG. 2 comprises a first graph in a view A illustrating the evolution of the finger response (Amplitude) as a function of the wavelength (Wavelength (nm)) of the radiation which illuminates them and of the nature of the fingers.
  • the response of a finger to a given wavelength corresponds to the “average” of the values of the pixels of the acquired image of the finger illuminated by radiation at said wavelength.
  • the inventors have found that the responses for a false finger, under radiation having a given wavelength, are clearly different from the responses for a real finger under the same radiation, regardless of the wavelength of the radiation considered.
  • the responses of a real finger for three wavelengths are indicated by square marks 23
  • the responses of a three-dimensional false finger are indicated by round marks 25
  • the responses of a two-dimensional dummy finger are indicated by triangular marks 27.
  • the response, at a given wavelength, of a finger to be tested is compared with the reference response of a real finger.
  • the finger to be tested is illuminated at a given wavelength, then the response of the finger is determined from the image acquired with the image acquisition device, and this measured response is compared with the reference response at the same wavelength.
  • the mentioned comparison is carried out at at least two different wavelengths, preferably at least three different wavelengths. Still by way of example, the comparison is carried out under radiation having a wavelength between 460 nm and 600 nm, for example of the order of 530 nm; under radiation having a wavelength between 600 nm and 700 nm, for example of the order of 680 nm; and under radiation having a wavelength between 700 nm and 1100 nm, for example of the order of 850 nm.
  • the reference response at a given wavelength can be determined as follows: for an acquired image of a finger illuminated by radiation at the given wavelength, the number of pixels having the same gray level value is counted; the distribution of the gray level values of the pixels gives a Gaussian.
  • the Gaussian is centered on a "mean" or central value which, normalized by a value specific to the device considered, gives the reference response.
  • FIG. 2 comprises a graph in a view B illustrating, by a curve 29, an example of distribution (Occurrence) of the values of the gray levels (Grey scale) of the pixels of an image of a real finger at a wavelength of the order of 630 nm.
  • the central value v corresponds for example to the reference response.
  • the values at mid-height (O1/2), vs and vi, of curve 29 define, by way of example, the comparison threshold, which can be equal to v - vi.
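  • The reference response and the threshold described above can be estimated from a histogram, as in this sketch (an illustration under assumptions: the histogram peak stands in for the center v of the Gaussian, and the lower half-height gray level vi gives the threshold v - vi; the function name is illustrative):

```python
import numpy as np

def reference_and_threshold(image):
    # Count the pixels sharing each gray level; for a real finger this
    # distribution is roughly Gaussian.
    levels, counts = np.unique(np.ravel(image), return_counts=True)
    peak = int(np.argmax(counts))
    v = int(levels[peak])           # central value: the reference response
    half_height = counts[peak] / 2
    # Walk left from the peak to the lower half-height gray level vi.
    i = peak
    while i > 0 and counts[i] >= half_height:
        i -= 1
    vi = int(levels[i])
    return v, v - vi                # reference response and threshold
```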
  • Figures 3 to 13 illustrate image acquisition devices that can be used to detect the use of false fingers as described in relation to Figure 2.
  • FIGS. 3 to 13 further illustrate other embodiments of an image acquisition device comprising a single light source emitting a single radiation, the wavelengths of which are filtered locally.
  • the single radiation preferably comprises green wavelengths as well as red and/or infrared wavelengths.
  • the electromagnetic waves picked up by photodetectors of the device are thus locally similar to the first electromagnetic waves or to the second electromagnetic waves.
  • Figure 3 illustrates, in a partial, schematic sectional view, another embodiment of an image acquisition device 103.
  • the image acquisition device 103 comprises: an image sensor 11 comprising photodetectors or photodiodes 43; and an optical filter 13, preferably an angular filter.
  • the embodiments of the devices of FIGS. 3 to 13 are represented in space according to a direct orthogonal XYZ frame, the Y axis of the XYZ frame being orthogonal to the upper face of the sensor 11.
  • the device 103 captures the image response of an object 21, partially represented, preferably a finger.
  • the device 103 comprises a processing unit 22 comprising, for example, a microprocessor (not shown).
  • the processing unit 22 is, for example, a computer or a portable telephone (smartphone).
  • the photodiodes 43 are, for example, organic photodiodes (OPD, Organic Photodiode) integrated on a substrate with CMOS (Complementary Metal Oxide Semiconductor) transistors or on a substrate with thin-film transistors (TFT, Thin Film Transistor).
  • the substrate is, for example, made of silicon, preferably of monocrystalline silicon.
  • the channel, source and drain regions of the TFT transistors are, for example, made of amorphous silicon (a-Si), indium gallium zinc oxide (IGZO) or low-temperature polycrystalline silicon (LTPS).
  • the photodiodes 43 of the image sensor 11 comprise, for example, a mixture of organic semiconductor polymers, such as poly(3-hexylthiophene) or poly(3-hexylthiophene-2,5-diyl), known as P3HT, mixed with [6,6]-phenyl-C61-butyric acid methyl ester (an N-type semiconductor), known as PCBM.
  • the photodiodes 43 of the image sensor 11 comprise, for example, small molecules, that is to say molecules having molar masses of less than 500 g/mol, preferably less than 200 g/mol.
  • the photodiodes 43 can be inorganic photodiodes, for example, made from amorphous silicon or crystalline silicon.
  • the photodiodes 43 are composed of quantum dots.
  • the photodetectors 43 are preferably arranged in matrix form.
  • the repetition pitch of the photodetectors 43 is, for example, between 10 μm and 100 μm, preferably between 32 μm and 80 μm.
  • the photodetectors 43 preferably all have the same structure and the same properties/characteristics. In other words, all the photodetectors 43 are substantially identical except for manufacturing differences.
  • the sensor 11 is preferably suitable for detecting radiation 17 emitted by a source 15.
  • the source 15 is adapted to emit the radiation 17 comprising: at least one wavelength in the visible, for example, included in the band from 400 nm to 600 nm; and at least one wavelength in the red and/or the near infrared, for example, between 600 nm and 1100 nm.
  • the radiation 17 comprises: wavelengths included in the band from 460 nm to 600 nm, for example, a wavelength equal to about 500 nm (cyan) or 530 nm (green); and wavelengths in the band 680 nm to 940 nm.
  • the light source 15 consists, for example, of one or more light-emitting diodes (LED, Light-Emitting Diode) associated with a waveguide layer.
  • the light source 15 consists, for example, of one or more organic light-emitting diodes (OLED, Organic Light-Emitting Diode).
  • the source 15 is an LED associated with a waveguide layer.
  • the optical filter 13, illustrated in Figure 3, comprises, from bottom to top in the orientation of the figure: a first layer 47 comprising openings 49, or holes, and walls 51 that are opaque to the radiation 17, the openings 49 being, for example, filled with a material forming a layer 53 on the underside of layer 47; a substrate or support 55 resting on the upper face of layer 47; and an array of lenses 57 of micrometric size located on the upper face of the substrate 55, the flat face of the lenses 57 and the upper face of the substrate 55 facing each other.
  • the array of lenses 57 is surmounted by a flattening layer 59.
  • the substrate 55 can be made of a transparent polymer which does not absorb, at least, the wavelengths considered, here in the visible and infrared range.
  • This polymer can in particular be poly(ethylene terephthalate) (PET), poly(methyl methacrylate) (PMMA), cyclic olefin polymer (COP), polyimide (PI) or polycarbonate (PC).
  • the thickness of the substrate 55 can, for example, vary between 1 μm and 100 μm, preferably between 10 μm and 100 μm.
  • the substrate 55 can correspond to a colored filter, to a polarizer, to a half-wave plate or to a quarter-wave plate.
  • the lenses 57 can be made of silica, PMMA, a positive photosensitive resin, PET, poly(ethylene naphthalate) (PEN), COP, polydimethylsiloxane (PDMS)/silicone, epoxy resin or acrylate resin.
  • the lenses 57 can be formed by reflowing blocks of a photoresist.
  • Lenses 57 may additionally be formed by molding over a layer of PET, PEN, COP, PDMS/silicone, epoxy resin or acrylate resin.
  • the lenses 57 are converging lenses, each having a focal length f between 1 μm and 100 μm, preferably between 1 μm and 70 μm. According to one embodiment, all the lenses 57 are substantially identical.
  • the lenses have, for example, a diameter between 10 μm and 30 μm, preferably equal to approximately 20 μm.
  • the repetition pitch of the lenses is preferably between 10 μm and 30 μm.
  • the lenses 57 and the substrate 55 are preferably made of transparent or partially transparent materials, that is to say transparent over part of the spectrum considered for the targeted field, for example imaging, over the range of wavelengths used during exposure.
  • the layer 59 is a layer which matches the shape of the lenses 57.
  • the layer 59 can be obtained from an optically clear adhesive (OCA), in particular a liquid optically clear adhesive (LOCA), a low-refractive-index material (for example an epoxy/acrylate adhesive), or a film of a gas or gas mixture, for example air.
  • the openings 49 are, for example, filled with air, with a partial vacuum or with a material that is at least partially transparent in the visible and infrared domains.
  • optical filter 13 constituting an angular filter.
  • these embodiments can be applied to other types of optical filters, such as a spatial filter.
  • the angular filter 13 is adapted to filter the incident radiation according to the incidence of the radiation with respect to the optical axes of the lenses 57.
  • the angular filter 13 is, more particularly, adapted so that each photodetector 43 of the image sensor 11 receives only the rays whose respective incidences with respect to the respective optical axes of the lenses 57 associated with this photodetector 43 are less than a maximum incidence of less than 45°, preferably less than 30°, more preferably less than 10°, even more preferably less than 4°.
  • the angular filter 13 is adapted to block the rays of the incident radiation whose respective incidences relative to the optical axes of the lenses 57 of the optical filter 13 are greater than the maximum incidence.
  • Each opening 49 is preferably associated with a single lens 57.
  • the optical axes of the lenses 57 are preferably centered with the centers of the openings 49 of the layer 47.
  • the diameter of the lenses 57 is preferably greater than the maximum size of the section (perpendicular to the optical axis of the lenses 57) of the openings 49.
  • Each photodetector 43 is, for example, associated with at least four apertures 49 (and four lenses 57). Preferably, each photodetector 43 is associated with exactly four apertures 49.
  • the device 103 is preferably divided into pixels.
  • the term pixel is used throughout the description to define a part of the image sensor 11 comprising a single photodetector 43.
  • the term pixel can apply at the scale of the image sensor 11 but also at the scale of the device 103 and, unless otherwise specified, refers to a pixel at the scale of the device 103.
  • a pixel 44 corresponds to each part of the device 103 comprising, among other things, a photodetector 43 surmounted by four openings 49, themselves surmounted by four lenses 57.
  • Each pixel 44 is, preferably, of substantially square shape seen in a direction perpendicular to the upper face of the image sensor 11.
  • the area of each pixel is of the order of 50 µm by 50 µm, and is preferably equal to approximately 50.8 µm by 50.8 µm.
  • device 103 comprises a color filter 45, on the front face of optical filter 13, more precisely on the front face of layer 59.
  • the color filter 45 is located between the image sensor 11 and the angular filter 13 or between two constituent layers of the angular filter 13, for example, between the layer 47 and the substrate 55.
  • the color filter 45 is divided into two parts.
  • a first part 451 is adapted to allow all wavelengths to pass.
  • the first part 451 is adapted to let pass at least one wavelength in the band from 400 nm to 600 nm.
  • the first part 451 is adapted to let pass only at least one wavelength in the band from 460 nm to 600 nm.
  • the first part 451 is adapted to allow only the wavelength equal to 530 nm or the wavelength equal to 500 nm to pass.
  • One or more second parts 453 are adapted to block all the wavelengths outside the band from 600 nm to 1100 nm, preferably outside the band from 680 nm to 940 nm.
  • Figure 4 shows, by a top view, partial and schematic, an embodiment of the device 103 illustrated in Figure 3.
  • FIG. 4 is a top view of the device 103 illustrated in FIG. 3.
  • FIG. 3 is a view along section plane AA of FIG. 4.
  • each second part 453 of the color filter 45 is formed on the surface of the optical filter 13 so that each second part 453 covers a single corresponding pixel 44.
  • Each second part 453 is therefore aligned with a photodetector 43.
  • each second part 453 of the color filter 45 has a square shape in the view of Figure 4.
  • the surface of each second part 453 of the color filter 45 is equal to a square of approximately 50.8 µm by 50.8 µm.
  • the repetition pitch of the second parts 453 of the color filter 45 is between two pixels and twenty pixels.
  • the repetition pitch of the second parts 453 is between five pixels and fifteen pixels. More preferably, the repetition pitch of the second parts 453 is approximately ten pixels along the Z axis and ten pixels along the X axis. In other words, nine pixels separate two consecutive pixels covered by second parts 453 along the Z (or X) axis. Still in other words, in a square assembly of one hundred pixels (that is to say a square of ten pixels along the Z axis and ten pixels along the X axis), a single pixel is covered by a second part 453.
  • the distribution of the second parts 453 is aligned, that is to say that the repetition is done in rows and columns, or shifted, that is to say that the distribution is shifted by one or more pixels from one row to the next or from one column to the next.
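The aligned and shifted distributions of the second parts 453 described above can be illustrated with a small numerical sketch (the function name, grid dimensions and shift amount are illustrative assumptions, not part of the original description):

```python
import numpy as np

def second_part_mask(rows, cols, pitch=10, shifted=False, shift=1):
    """Boolean mask of the pixels covered by a second part 453.

    Aligned layout: one covered pixel per pitch x pitch square, repeated
    in rows and columns. Shifted layout: each covered row is offset by
    `shift` columns relative to the previous covered row."""
    mask = np.zeros((rows, cols), dtype=bool)
    for r in range(0, rows, pitch):
        offset = ((r // pitch) * shift) % pitch if shifted else 0
        mask[r, offset::pitch] = True
    return mask

aligned = second_part_mask(20, 20, pitch=10)
# One covered pixel per square of one hundred pixels...
assert aligned[:10, :10].sum() == 1
# ...and nine uncovered pixels between consecutive covered pixels.
assert np.all(np.diff(np.flatnonzero(aligned[0])) == 10)
```

With a pitch of ten pixels, the mask reproduces the layout described above: a single covered pixel per square of one hundred pixels.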
  • the other pixels 44 are covered by the first part 451 of the color filter 45.
  • the first part 451 is contiguous between two neighboring pixels 44, that is to say that the first part 451 is not pixelated and that it is formed simultaneously on all the pixels considered of the image sensor 11.
  • the material constituting the first part 451 is deposited on the whole of the upper face of the optical filter 13, more particularly on the upper face of the layer 59, then locally removed by photolithography so as to form housings intended to receive the second parts 453.
  • the material constituting the second parts 453 is deposited as a blanket layer on the upper face of the structure, more precisely on the upper face of the first part 451 and in the housings.
  • the upper face of the layer of material constituting the second parts 453 then undergoes chemical mechanical planarization (CMP) so as to expose the upper face of the first part 451, or photolithography so as to remove each second part 453 located on the surface of the first part 451.
  • the thickness of each second part 453 is between 200 nm and 10 µm, preferably between 500 nm and 2 µm.
  • the first and second parts are deposited locally on the surface of the angular filter 13 by localized deposition techniques such as screen printing, inkjet printing or spraying.
  • the material constituting the second parts 453 is deposited using one of the techniques described above, preferably by photolithography.
  • the parts 451 then correspond to the spaces located between the parts 453.
  • the constituent material of the first part 451 is air, a partial vacuum or a transparent material in all the wavelengths.
  • the material constituting the first part 451 is a material transparent only to wavelengths between 400 nm and 600 nm (visible filter), preferably between 460 nm and 600 nm, for example a resin comprising the colorant known under the trade name "Orgalon Green 520" or a resin from the "COLOR MOSAIC” commercial range from the manufacturer Fujifilm.
  • the material constituting the first part 451 is a material transparent only at 500 nm (cyan filter) or only at 530 nm (green filter), for example a resin comprising the dye known under the trade name "PC GREEN 123P" or a resin from the "COLOR MOSAIC" commercial range from the manufacturer Fujifilm.
  • Figures 5 and 6 illustrate, by a sectional view and a top view, partial and schematic, another embodiment of an image acquisition device 104.
  • Figures 5 and 6 illustrate a device 104 similar to the device 103 illustrated in Figures 3 and 4, except that the second parts 453 of the filter 45 illustrated in Figures 3 and 4 are replaced in some locations by parts 455 and in others by parts 457.
  • Figure 6 is a top view of the device 104 illustrated in Figure 5, Figure 5 being a sectional view along the section plane A'A' of Figure 6.
  • the parts 451 illustrated in Figures 5 and 6 are identical to the parts 451 illustrated in Figures 3 and 4.
  • the part or parts 455 are, for example, adapted to block all the wavelengths outside the band from 600 nm to 700 nm, preferably outside the band from 680 nm to 700 nm.
  • the part or parts 457 are, for example, adapted to block all the wavelengths outside the band from 700 nm to 1100 nm, preferably outside the band from 680 nm to 940 nm.
  • the parts 455 and 457 illustrated in FIGS. 5 and 6 preferably have dimensions similar to the dimensions of the parts 453 illustrated in FIGS. 3 and 4.
  • the parts 455 and 457 are organized so that the parts 455 and 457 alternate along the Z axis and along the X axis. In other words, two successive parts 455 are separated by a part 457.
  • the filter 45 comprises three parts filtering at different wavelengths.
  • the filter 45 comprises a part identical to part 453 illustrated in Figures 3 and 4, another part adapted to block all wavelengths outside the band from 460 nm to 600 nm, and yet another part adapted to block all wavelengths outside the band from 500 nm to 580 nm.
  • the filter 45 comprises, for example, four parts: one part adapted to block all the wavelengths outside the band from 460 nm to 600 nm, another part adapted to block all the wavelengths outside the band from 500 nm to 580 nm, another part adapted to block all wavelengths outside the band from 600 nm to 700 nm and another part adapted to block all wavelengths outside the band from 700 nm to 1100 nm.
  • all the first and second parts of the color filter 45 are adapted to let through at least the wavelengths comprised in the band from 700 nm to 1100 nm; some of the first parts or of the second parts of the color filter 45 are adapted to additionally let through only the wavelengths comprised in the band from 400 nm to 500 nm (typically blue); some of the first parts or of the second parts of the color filter 45, preferably the majority, are adapted to additionally let through only the wavelengths comprised in the band from 500 nm to 600 nm (typically green); some of the first parts or of the second parts of the color filter 45 are adapted to additionally let through only the wavelengths comprised in the band from 600 nm to 700 nm (typically red); and some of the first parts or of the second parts of the color filter 45 are adapted to block the wavelengths outside the band from 700 nm to 1100 nm (typically infrared).
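The spectral assignment described above can be summarized with an illustrative sketch listing the passbands of each kind of part (the part labels and the helper function are hypothetical names introduced here; only the band edges come from the text):

```python
# Passbands (nm) of the color-filter parts described above; the part
# labels and this table form are introduced here for illustration.
PASSBANDS = {
    "blue":  [(700, 1100), (400, 500)],   # infrared + blue
    "green": [(700, 1100), (500, 600)],   # infrared + green (majority)
    "red":   [(700, 1100), (600, 700)],   # infrared + red
    "ir":    [(700, 1100)],               # infrared only
}

def passes(part, wavelength_nm):
    """True if the given part lets the wavelength through."""
    return any(lo <= wavelength_nm <= hi for lo, hi in PASSBANDS[part])

# Every part passes infrared radiation (e.g. 850 nm)...
assert all(passes(p, 850) for p in PASSBANDS)
# ...but only the "green" parts additionally pass 530 nm.
assert passes("green", 530) and not passes("ir", 530)
```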
  • the parts blocking the wavelengths outside the band from 600 nm to 1100 nm are organized so as to replace the parts 453 of the image acquisition device 103 illustrated in FIGS. 3 and 4 and the other parts are organized so as to replace the parts 451 of the image acquisition device 103 illustrated in FIG. 3.
  • Figures 7 and 8 illustrate, by a sectional view and a top view, partial and schematic, yet another embodiment of an image acquisition device 105.
  • FIGS. 7 and 8 illustrate a device 105 similar to device 103 illustrated in FIGS. 3 and 4, except that the second parts 453 of color filter 45 are formed at arbitrary positions on the surface of optical filter 13 and each has a larger area (in the XZ plane) than the area of each second part 453 of the color filter 45 illustrated in FIGS. 3 and 4.
  • Figure 8 is a top view of the device 105 illustrated in Figure 7, Figure 7 being a sectional view along the section plane BB of Figure 8.
  • each second part 453 of the color filter 45 is formed on the front face of the optical filter 13 without prior alignment with the underlying photodetectors 43 or lenses 57.
  • each second part 453 of the color filter 45 has a substantially square shape in the view of FIG. 8.
  • each second part 453 of the color filter 45 is made so that it completely covers at least one pixel 44 (or a photodetector 43) regardless of its location on the top face of layer 59.
  • the area of each second part 453 (in the XZ plane) of the color filter 45 is at least equal to the area of four pixels 44.
  • the area of each second part 453 is between the area of four pixels 44 and the area of six pixels. More preferably, the area of each second part 453 is exactly equal to the area of four pixels 44.
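The choice of an area of at least four pixels guarantees that a second part 453 fully covers at least one pixel wherever it lands; this can be checked numerically with an illustrative sketch (the function name and the unit grid pitch are assumptions introduced here):

```python
import math
import numpy as np

def fully_covered_pixels(x0, z0, side, pitch=1.0):
    """Number of grid pixels (pitch x pitch squares) entirely covered
    by an axis-aligned square of the given side whose lower corner is
    at (x0, z0)."""
    first_x = math.ceil(x0 / pitch)               # first fully covered column
    last_x = math.floor((x0 + side) / pitch) - 1  # last fully covered column
    first_z = math.ceil(z0 / pitch)
    last_z = math.floor((z0 + side) / pitch) - 1
    return max(0, last_x - first_x + 1) * max(0, last_z - first_z + 1)

# A part whose area equals four pixels (side = two pitches) fully
# covers at least one pixel for any placement on the grid.
rng = np.random.default_rng(0)
for _ in range(1000):
    x0, z0 = rng.uniform(0.0, 1.0, size=2)
    assert fully_covered_pixels(x0, z0, side=2.0) >= 1
```

This is why an area smaller than four pixels would not suffice: a misaligned part could then straddle four pixels without covering any of them completely.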
  • the repetition pitch of the second parts 453 is between a distance corresponding to the dimension of four pixels and a distance corresponding to the dimension of twenty pixels.
  • the repetition pitch is substantially equal to a distance corresponding to the dimension of ten pixels.
  • the distribution of the second parts 453 is aligned, that is to say that the repetition is done in rows and columns, or shifted, that is to say that the distribution is shifted by one or more pixels from one row to the next or from one column to the next.
  • the filter 45 is located between the angular filter 13 and the image sensor 11, more precisely, between the layer 53 and the image sensor 11.
  • Figures 9 and 10 illustrate, by a sectional view and a top view, partial and schematic, an embodiment of a structure 64 provided with the color filter 45.
  • FIG. 9 illustrates an embodiment of a structure 64 comprising the color filter 45 and a support 65.
  • Figure 10 is a top view of the structure 64 illustrated in Figure 9, Figure 9 being a sectional view along the section plane CC of Figure 10.
  • FIG. 10 illustrates the distribution of the second parts 453 in the color filter 45.
  • the color filter 45 illustrated in Figures 9 and 10 is similar to the color filter 45 illustrated in Figures 7 and 8, with the difference that it is formed on the upper face of the support 65 in Figures 9 and 10. This advantageously makes it possible to produce the color filter 45 separately from the other elements of the image acquisition device.
  • the support 65 can be made of a transparent polymer which does not absorb at least the wavelengths considered, here in the visible and infrared range. This polymer can in particular be poly(ethylene terephthalate) (PET), poly(methyl methacrylate) (PMMA), cyclic olefin polymer (COP), polyimide (PI), or polycarbonate (PC).
  • the support 65 is preferably made of PET.
  • the thickness of support 65 can vary from 1 µm to 100 µm, preferably from 10 µm to 50 µm.
  • the support 65 can correspond to a colored filter, to a polarizer, to a half-wave plate or to a quarter-wave plate.
  • Figure 11 illustrates, by a sectional view, partial and schematic, yet another embodiment of an image acquisition device.
  • FIG. 11 illustrates a device 107 similar to device 105, illustrated in FIG. 7, with the difference that it comprises structure 64 instead of color filter 45.
  • the structure 64 is located on the upper face of the source 15 so that the support 65 covers the upper face of the light source 15 and the color filter 45 covers the support 65.
  • the structure 64 is transferred to the upper face of the light source 15 without aligning the color filter 45 with the lenses 57 of the underlying optical filter 13.
  • the structure 64 is attached to the upper face of the light source 15 so that the filter 45 covers the upper face of the light source 15 and the support 65 covers the upper face of the color filter 45.
  • Figure 12 illustrates, by a sectional, partial and schematic view, yet another embodiment of an image acquisition device.
  • FIG. 12 illustrates a device 109 similar to device 107 illustrated in FIG. 11, with the difference that structure 64 is located between light source 15 and optical filter 13.
  • the structure 64 is transferred to the upper face of the layer 59 so that the filter 45 covers the upper face of the layer 59 and the support 65 covers the upper face of the color filter 45.
  • FIG. 13 illustrates, by a sectional view, partial and schematic, yet another embodiment of an image acquisition device. More specifically, FIG. 13 illustrates a device 111 similar to device 107 illustrated in FIG. 11, with the difference that structure 64 is located between sensor 11 and optical filter 13.
  • structure 64 is attached to the upper face of sensor 11 so that filter 45 covers the upper face of sensor 11 and support 65 covers the upper face of color filter 45.
  • FIG. 14 is a representation of an example of the evolution of the area of a real finger illuminated by radiation.
  • FIG. 14 illustrates, for a finger imaged by the photodetectors 43 within the device 103, 104, 105, 107, 109 or 111, the evolution of the area of the finger in contact with the surface of said image acquisition device during a press 77 of the finger on the device and then during a release 79.
  • the curve 75 comprises a series of measurements (points), each extracted from a different image, corresponding to the evolution of the area at the gray level of the center of the finger as a function of said gray level.
  • the device acquires several images. For each image, the gray level (inverted gray value) of the center of the finger is extracted, and the area of the image, in square millimeters (mm²), having this gray level or a close gray level (that is to say differing in brightness by a few percent) is determined. The measurements over the several images are grouped together in curve 75.
  • the curve 75 includes a hysteresis 81, between the pressure 77 and the release 79. This hysteresis 81 is due to the influx of blood into the finger during the release 79.
  • the hysteresis 81 of the curve 75 being characteristic of an influx of blood, it is characteristic of a real finger. It is thus possible to distinguish a case of fraud (false finger: for example a silicone or 2D finger) by extracting the area and gray level described above.
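The per-image measurement underlying curve 75 can be sketched as follows (an illustrative model only: the function name, the choice of the image center as the finger center, the 3% tolerance for "a few percent", and the use of the 50.8 µm pixel pitch for the default pixel area are assumptions introduced here):

```python
import numpy as np

def area_at_center_gray(img, pixel_area_mm2=0.0508**2, tol=0.03):
    """Area (mm^2) of the pixels whose inverted gray level is within
    `tol` (a few percent) of the inverted gray level at the finger
    center, here taken for simplicity as the image center."""
    inverted = 1.0 - np.asarray(img, dtype=float)  # inverted gray levels
    center = inverted[inverted.shape[0] // 2, inverted.shape[1] // 2]
    close = np.abs(inverted - center) <= tol * max(center, 1e-9)
    return close.sum() * pixel_area_mm2

# Uniform image: every pixel matches the center gray level, so the
# measured area is the whole image area.
img = np.full((10, 10), 0.4)
assert abs(area_at_center_gray(img) - 100 * 0.0508**2) < 1e-9
```

Repeating this measurement over the sequence of images acquired during the press 77 and release 79 yields the (gray level, area) points of curve 75, whose hysteresis 81 signals a real finger.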
  • An advantage of the embodiments and modes of implementation described in FIGS. 3 to 13 is that they make it possible to detect the use of false fingers in two dimensions or three dimensions with the acquisition of a single image.
  • Another advantage of the embodiments and modes of implementation described in FIGS. 3 to 13 is that they make it possible to detect the use of false fingers in two dimensions in less than 200 milliseconds.
  • Yet another advantage of the embodiments and modes of implementation described in FIGS. 3 to 13 is that they use only a single sensor, which makes it possible to reduce the thickness of telephones. Indeed, a phone with two sensors, one suitable for capturing visible radiation and the other suitable for capturing infrared radiation, will be thicker than a phone with a single sensor suitable for capturing both visible and infrared radiation.
  • Yet another advantage of the embodiments and modes of implementation described in FIGS. 7 to 11 is that they make it possible to dispense with the alignment of the second parts 453 of the color filter 45 with the photodetectors 43.
  • the area of the second parts 453 of the color filter 45 is, in fact, chosen so that each second part 453 of the color filter 45 completely covers at least one photodetector 43.
  • the embodiments illustrated in FIGS. 5 and 6 can be combined with the embodiments illustrated in FIGS. 7 to 13, and some of the embodiments described in relation to FIG. 11 can be combined with the embodiments described in relation to FIGS. 12 and 13.
EP21758384.8A 2020-08-17 2021-08-12 Bilderfassungsvorrichtung Pending EP4196905A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR2008532A FR3113429A1 (fr) 2020-08-17 2020-08-17 Dispositif d'acquisition d'images
PCT/EP2021/072470 WO2022038034A1 (fr) 2020-08-17 2021-08-12 Dispositif d'acquisition d'images

Publications (1)

Publication Number Publication Date
EP4196905A1 true EP4196905A1 (de) 2023-06-21

Family

ID=74125297


Country Status (5)

Country Link
US (1) US11928888B2 (de)
EP (1) EP4196905A1 (de)
CN (1) CN217035642U (de)
FR (1) FR3113429A1 (de)
WO (1) WO2022038034A1 (de)


Also Published As

Publication number Publication date
FR3113429A1 (fr) 2022-02-18
US11928888B2 (en) 2024-03-12
WO2022038034A1 (fr) 2022-02-24
CN217035642U (zh) 2022-07-22
US20230306779A1 (en) 2023-09-28


Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20230119

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)