WO2007071572A1 - Optical sensor - Google Patents

Optical sensor

Info

Publication number
WO2007071572A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
image
light
optical system
source
Prior art date
Application number
PCT/EP2006/069497
Other languages
French (fr)
Inventor
Jean-François Mainguet
Romain Ramel
Catherine Jury
Original Assignee
Atmel Switzerland
Priority date
Filing date
Publication date
Application filed by Atmel Switzerland filed Critical Atmel Switzerland
Priority to EP06830489A priority Critical patent/EP1969524A1/en
Priority to JP2008546358A priority patent/JP2009520962A/en
Publication of WO2007071572A1 publication Critical patent/WO2007071572A1/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12: Fingerprints or palmprints
    • G06V40/1335: Combining adjacent partial images (e.g. slices) to create a composite input or reference pattern; Tracking a sweeping finger movement
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/10: Image acquisition
    • G06V10/12: Details of acquisition arrangements; Constructional details thereof
    • G06V10/14: Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/145: Illumination specially adapted for pattern recognition, e.g. using gratings

Definitions

  • the sensor device includes a transparent layer, first and second light sources, and an image sensor.
  • the transparent layer defines a utilized portion on the upper surface on which a microrelief, e.g., a fingerprint, can be observed.
  • the first and second light sources are positioned to illuminate an upper inside surface of the transparent layer so that light can be totally reflected onto the image sensor in the absence of a microrelief on the utilized portion.
  • the image sensor is positioned so that part of the light which undergoes total reflection from the utilized portion of the upper surface falls on the surface of the image sensor.
  • the image sensor dimensions can be such that the image sensor receives light totally reflected from a first portion of the utilized portion in response to light from the first light source, and receives light totally reflected from a second portion of the utilized portion in response to light from the second light source.
  • the image sensor can include photosensitive elements, and the first and second light sources can be pointlike light sources.
  • the first and second light sources can be activated sequentially, and image data provided by the image sensor during each activation can be combined to form an image based on the first and second portions.
  • the first portion and the second portion can define an overlapping area within the utilized portion.
  • the first portion and the second portion can define disjointed areas (i.e., mutually exclusive areas) within the utilized portion.
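The sequential-activation scheme summarized above can be sketched as follows; this is an illustrative model only, and the stubs `set_source` and `read_strip` are hypothetical stand-ins for hardware access that the disclosure does not name.

```python
import numpy as np

# Stub "hardware": a strip that sees the left half of an 8-pixel line
# while source 0 is lit, and the right half while source 1 is lit.
line = np.arange(8)              # toy pixel values of the utilized portion
state = {"lit": None}

def set_source(s, on):
    state["lit"] = s if on else None     # only one source lit at a time

def read_strip():
    return line[:4] if state["lit"] == 0 else line[4:]

def capture_frame(num_sources=2):
    """Activate each pointlike source in turn and read the strip, so
    each readout images only the part that source projects onto it."""
    parts = []
    for s in range(num_sources):
        set_source(s, on=True)
        parts.append(read_strip().copy())
        set_source(s, on=False)
    return parts

left, right = capture_frame()
full = np.concatenate([left, right])     # juxtapose the two part images
assert np.array_equal(full, line)
```

Because only one source is lit per readout, each captured part is unambiguous, and combining reduces to juxtaposing (or overlap-merging) the parts.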
  • Figures 1 and 2 represent, respectively in vertical section and as viewed from above, the principle of a system for reading a print using a pointlike light source.
  • Figures 3 and 4 represent, respectively in section and as viewed from above, a modification in which the sensor is placed higher than the source.
  • Figures 5 and 6 represent, respectively in section and as viewed from above, an adaptation to a scanning system in which the finger swipes over the observation surface.
  • Figure 7 represents a top view of an example sensor device.
  • Figure 8 represents another example sensor device having partial overlap of two image parts.
  • Figure 9 represents another example sensor device having disjointed image parts.
  • Figures 10 and 11 represent example sensor devices having three and four light sources respectively.
  • Figures 12 and 13 represent example sensor devices in which the light sources are positioned at two ends of a linear sensor.
  • FIG. 7 represents a top view of an example sensor device.
  • the lateral offset D between the light source and the utilized portion to be observed and the height H of the transparent plate above the light source are illustratively the same as the corresponding dimensions D and H represented in Figures 5 and 6.
  • the upper surface of the image sensor is illustratively at the same height as the surface of emission of the source as in Figure 5. Other dimensions, however, can be used.
  • the sensor of Figure 7 has the dimensions represented in Figure 5, with the printed circuit board 10, the pointlike source 14, the linear sensor 16, the transparent layer or plate 12 with its masking layer 20 which surrounds the utilized portion 22 of plate surface to be observed.
  • the example sensor of Figure 7 includes at least two light sources laterally offset from one another, 14a and 14b, and positioned at a distance D from the utilized portion 22.
  • the light sources 14a and 14b can, for example, operate alternately for collecting data from the image sensor 16.
  • the light sources 14a and 14b are not simultaneously activated and image capture can be performed synchronously with activation of the light sources 14a and 14b so that an image registered by the sensor 16 results only from the illumination by only one of the two sources at a time.
  • the linear image sensor 16 is substantially half the length of the sensor of Figure 6.
  • the light emanating from the source 14a illuminates the utilized portion 22, and the total reflection of this light is projected in part onto the sensor 16 and in part outside of the sensor 16, as the latter is half as long as that of Figure 6 and is situated at the same distance D.
  • a first part 22a of the utilized portion 22 is reflected on a photosensitive strip that is defined by an elongated rectangle 24 inside the image sensor 16.
  • the remainder of the utilized portion 22, outside of the first part 22a, is reflected elsewhere than on the strip.
  • the second pointlike light source 14b illuminates the entirety of the utilized portion 22 but only a second part 22b is actually projected onto the rectangle 24 defining the photosensitive surface of the strip of the sensor 16.
  • the image sensor 16 and the light sources 14a and 14b are symmetrically placed at distance D with respect to the utilized portion 22, resulting in a substantially double magnification, and thus the length of each of the portions 22a and 22b can, for example, be substantially half the length of the sensor.
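The substantially double magnification can be checked with a short calculation; unfolding the total reflection makes the geometry a straight shadow projection, and all values below are illustrative, not dimensions from the disclosure.

```python
# Unfolding the total reflection turns the setup into a straight shadow
# projection: source at height 0, observed surface at height H, sensor
# back at height 0, i.e. an optical path of 2H from source to sensor.
H = 5.0                           # plate height (illustrative units)
magnification = (2 * H) / H       # source-to-sensor path over source-to-surface path
strip_length = 20.0               # hypothetical photosensitive strip length
part_length = strip_length / magnification   # surface length imaged per source
assert magnification == 2.0 and part_length == 10.0
```

This is why each of the portions 22a and 22b spans about half the strip, and why two sources together cover the whole utilized zone.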
  • when the first source 14a is turned on, the sensor 16 receives an image of the first part 22a of the print of the finger placed on the zone to be observed 22.
  • the first source is then turned off and the second turned on, and the sensor 16 receives an image of the second part 22b of the print.
  • the image processing can include combining the two images to generate an image of the entire zone 22. Thereafter, successive combined images of the entire zone 22 can be combined to generate a complete image of the print as the finger moves over the zone 22.
  • images corresponding to the first part 22a are interleaved in combination to form an image of a first side, e.g., a left side
  • images corresponding to the second part 22b are interleaved in combination to form an image of a second side, e.g., a right side, and then the image of the first side is combined with the image of the second side to reconstruct the entire image.
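The interleaved recombination described above can be sketched as follows; the frame shapes and the left/right split are illustrative assumptions, not dimensions from the disclosure.

```python
import numpy as np

# A 9-line "print", 8 pixels wide; each swipe step yields one strip lit
# by source 14a (left part) and one lit by source 14b (right part).
full_print = np.arange(72).reshape(9, 8)
frames = []
for row in range(0, 9, 3):                   # three swipe steps of 3 lines
    frames.append(full_print[row:row + 3, :4])   # frame under source 14a
    frames.append(full_print[row:row + 3, 4:])   # frame under source 14b

left_half = np.vstack(frames[0::2])    # interleaved frames of the first side
right_half = np.vstack(frames[1::2])   # interleaved frames of the second side
whole = np.hstack([left_half, right_half])   # recombine into the full image
assert np.array_equal(whole, full_print)
```

De-interleaving by source before stacking keeps each half self-consistent, so the final join is a single juxtaposition.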
  • the light sources 14a and 14b are placed along the sensor, at distance 2D from the latter, so that the portions 22a and 22b are substantially juxtaposed and that the length of the utilized zone 22 is substantially equal to the sum of the lengths of the portions 22a and 22b.
  • the sources 14a and 14b can be offset substantially by the length L of the zone to be observed and the strip of the sensor, e.g., the rectangle 24, can be substantially equal to the length L.
  • the length of the image sensor 16 can be considerably reduced.
  • the height of the surface above the sensor is reduced by an arrangement analogous to that of Figure 3, and thus the portions 22a and 22b become partially mutually overlapping.
  • This implementation can minimize loss of information at the centre of the finger.
  • An image processing routine to identify and to eliminate the redundant portions of the first and second portions 22a and 22b can also be implemented. For example, a simple point-to-point comparison of the two images can facilitate identification and elimination of identical image data.
  • the image processing routine can, for example, take into account the a priori knowledge that a portion of the right part of the first image is substantially identical to a portion of the left part of the second image, due to this overlap and to the second source being activated substantially immediately after the first source is turned off.
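The point-to-point identification and elimination of the redundant columns can be sketched as follows; it assumes, as above, that the overlapping zone is imaged identically in both exposures, and the helper name `join_with_overlap` is hypothetical.

```python
import numpy as np

def join_with_overlap(left_img, right_img, max_overlap):
    """Join two part images whose adjoining edges may repeat the same
    columns: find the widest overlap where the right edge of `left_img`
    equals the left edge of `right_img` point for point, drop the
    duplicated columns, and juxtapose the remainder."""
    best = 0
    for w in range(max_overlap, 0, -1):
        if np.array_equal(left_img[:, -w:], right_img[:, :w]):
            best = w
            break
    return np.hstack([left_img, right_img[:, best:]])

# Toy example: a 10-column zone split into parts sharing columns 5-6.
full = np.tile(np.arange(10), (4, 1))
a, b = full[:, :7], full[:, 5:]
merged = join_with_overlap(a, b, max_overlap=4)
assert np.array_equal(merged, full)
```

In practice the exact equality test would be replaced by a tolerance or correlation measure, since noise makes two exposures of the same zone only approximately identical.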
  • Figure 8 represents another example sensor device having partial overlap of two image parts.
  • the example sensor device of Figure 8 obtains a partial mutual overlap of the left and right images successively obtained, with the height of the sensor substantially equal to the height of the source.
  • the sensor 16 is of a length that is slightly larger than the length L.
  • the sources 14a and 14b are slightly closer together, their spacing being slightly less than L.
  • Figure 9 represents another example sensor device having substantially disjointed image parts.
  • the portions 22a and 22b are not in juxtaposition, and therefore a blind zone where the print is not detected occurs.
  • the length of the photosensitive strip of the sensor 16 can thus be smaller than the length L of the zone 22 to be observed.
  • the blind zone is hatched between the parts 22a and 22b of the observed utilized zone 22.
  • Figure 10 is another example sensor device utilizing three pointlike light sources 14a, 14b, and 14c.
  • the light sources 14a, 14b and 14c are illuminated successively.
  • the illuminations of light sources 14a, 14b and 14c are not simultaneous.
  • the utilized zone 22 can thus be segmented into three different parts.
  • the relative position of the light sources 14a, 14b and 14c, the utilized zone 22, and the sensor 16 can be configured so that the corresponding image parts 22a, 22b and 22c are substantially juxtaposed.
  • the spacing can be such that the corresponding image parts 22a, 22b and 22c are partially overlapping.
  • the spacing can be such that the corresponding image parts 22a, 22b and 22c define blind zones along the length of the sensor 16.
  • Each of the three parts 22a, 22b and 22c of the utilized observation portion 22 can be successively projected onto the surface of the sensor 16 and the images can thereafter be recombined.
  • Utilization of three light sources further facilitates a reduction in size of the sensor 16, e.g., the image sensor 16 of Figure 10 can be smaller than the image sensor 16 of Figure 7, in which only two light sources are used.
  • the relative position of the light sources 14a, 14b and 14c, the utilized zone 22, and the sensor 16 is such that there is a partial overlap of the three parts observed successively.
  • Figure 11 is another example sensor device utilizing four light sources 14a, 14b, 14c and 14d.
  • the utilization of the four sources 14a, 14b, 14c, 14d facilitates an even further reduction in size of the sensor 16.
  • the four light sources 14a, 14b, 14c and 14d can be illuminated successively, each projecting onto the photosensitive strip of the sensor 16 a magnified image of a respective part of the surface 22 to be observed.
  • the images are gathered successively, in synchronism with the alternating illumination of the sources. Image portions corresponding to the various sources generated in the course of the successive alternations can be recombined so as to constitute an overall image.
  • the first image emanating from the illumination by the source 14a can be gathered, then the second immediately following image emanating from the illumination by the source 14b can be gathered.
  • the gathered images can be juxtaposed to form a complete image.
  • the images are mutually overlapping, e.g., overlapping by a length known a priori.
  • a subtraction of the common parts can be performed by digital processing and the remaining image parts can be juxtaposed.
  • the blind zone can be taken into account, its dimension being known, or the blind zone can be ignored during a comparison with a prerecorded print.
  • an image process can take into account a recombination of left and right images as the image is divided into at least a left part and a right part observed successively. Additionally, the image process can take into account the recombination of partial images in the course of the movement. In one implementation, after activation of the first light source, an image of the right half of a print part is gathered (e.g., several pixel lines on the right print half) and stored.
  • the first light source is turned off, and the second light source is turned on.
  • An image of the left print half is gathered and stored.
  • a juxtaposition processing of the two parts is performed to obtain an image of the two halves at one and the same time over several pixel lines, and the image is stored.
  • the image process repeats to generate a succession of partial images of several pixel lines, each partial image comprising at one and the same time the right half and the left half of the print.
  • the various partial images are recombined in the conventional way provided for in scanning print sensors, by correlation between the partially overlapping successive partial images, shifting of the partial images with respect to one another as a function of the results of the correlation, and grouping of the shifted images to form a complete image of the print.
  • successive partial images of the right half and successive partial images of the left half are gathered separately.
  • a recombination of the right images is generated, and likewise a recombination of all the left images is generated.
  • an image of a complete half of the print and another image of the other complete half are constructed.
  • the principle is the same with three or four light sources, the image portions being respective thirds or quarters of the length of the utilized portion 22 to be observed.
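The correlation, shifting, and grouping of successive partial images described above can be sketched as follows; rows stand for pixel lines, the per-frame advance is recovered by matching overlaps, and all shapes and parameters are illustrative.

```python
import numpy as np

def stitch(partials, max_shift=4):
    """Recombine overlapping partial images from a swipe: estimate how
    many rows the finger advanced between frames by matching each new
    partial's leading rows against the tail of the image built so far,
    then append only the genuinely new rows."""
    image = partials[0]
    for part in partials[1:]:
        h = part.shape[0]
        best_s, best_err = 1, float("inf")
        for s in range(1, min(max_shift, h - 1) + 1):
            overlap = h - s                      # rows shared with the tail
            err = np.mean((image[-overlap:] - part[:overlap]) ** 2)
            if err < best_err:
                best_s, best_err = s, err
        image = np.vstack([image, part[-best_s:]])  # append the s new rows
    return image

# Toy swipe: a 12-line print read 4 lines at a time, advancing 2 lines
# per frame (values chosen so every pixel line is distinct).
print_img = np.arange(72).reshape(12, 6)
partials = [print_img[i:i + 4] for i in range(0, 9, 2)]
assert np.array_equal(stitch(partials), print_img)
```

A real scanning sensor would use a more robust correlation measure and handle variable swipe speed, but the shift-and-group structure is the same.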
  • Figure 12 is a transverse view of another example sensor system in which the sources 14a and 14b are positioned in the vertical axial plane oriented in the direction of elongation of the sensor, i.e., along a longitudinal axis.
  • the length of the linear sensor 16 is slightly less than the length L of the utilized portion 22.
  • the utilized observation portion 22 can be positioned substantially above the sensor 16.
  • the sources 14a and 14b each illuminate a part, respectively left or right, of the observation zone 22, and the total reflection projects this part onto a portion of the surface of the photosensitive strip of the sensor 16.
  • a blind zone Za persists in the middle of the observation zone 22, as the edges of the sensor 16 mask in part the light of the sources 14a and 14b.
  • Figure 13 is a transverse view of another example sensor system in which the sources 14a and 14b are positioned along the longitudinal axis of the sensor 16.
  • the sensor 16 of Figure 13 can be smaller than the sensor of Figure 12 due to the sources 14a and 14b being placed at a greater depth than the sensor 16 relative to the upper surface of the utilized portion over which the finger swipes.
  • the sensor 16 can be further reduced (e.g., of the order of half the width of a finger) but a blind zone Za again persists.
  • the light sources can be offset from the axial plane of the sensor similar to the configuration of Figure 9, e.g., the sources can be offset by a distance greater than the length of the sensor 16 and disposed laterally with respect to the sensor 16.
  • the utilized portion can then be positioned substantially above the sensor 16 and slightly shifted perpendicularly to the length of the sensor 16.
  • Although the light sources 14a-14d described above are illustratively identical, light sources of different types can be used. For example, light sources of different wavelengths can be used.


Abstract

First and second pointlike light sources in a sensor device are positioned to illuminate an upper inside surface of a transparent layer so that light is totally reflected onto an image sensor. The image sensor is positioned so that light reflected from a utilized portion of the upper surface falls on the surface of the image sensor. The image sensor dimensions can be such that the image sensor receives light totally reflected from a first portion of the utilized portion in response to light from the first light source, and receives light totally reflected from a second portion of the utilized portion in response to light from the second light source.

Description

OPTICAL SENSOR
This disclosure relates to optical sensors.
BACKGROUND
Numerous systems for recognizing a stimulus, such as an application of a fingerprint, are known, and can be classed into several categories as a function of the physical phenomenon used for detection: capacitive sensors, having an array of elementary capacitor plates and based on the difference in proximity between an electrode and a fingerprint ridge or valley of the skin of the finger; piezoelectric or pyroelectric sensors, having an array of pressure or temperature sensors and based on the difference in pressure or in escape of heat between the array and the ridges or valleys of the skin; and optical sensors, having an array of photosensitive elements receiving an image of the finger and sensitive to the differences in light present in the image.
Among the systems with optical sensors, there are systems which comprise a light source projecting the light onto the finger, an image sensor, and an optical system with lenses for projecting onto the sensor an image of the finger illuminated by the source. Optical systems that can operate without an optic based on lenses have also been proposed. In patent publication WO2005/006241, for example, there is described a fingerprint reading system which comprises a pointlike light source illuminating a finger portion from the inside of a transparent thin plate or pane, at an oblique incidence such that, in the absence of any finger placed on the pane, the light is totally reflected by the interface between the transparent pane and the outside air. A matrix sensor is disposed in the path of the light thus reflected by the interface. In the presence of a finger placed on the pane, the light continues to be reflected totally just where the skin is not in contact with the pane on account of the valleys of the prints; but just where the ridges of the prints touch the pane, the total reflection is inhibited and the light is absorbed, at least in part, by the finger. It follows from this that the sensor senses an image of the prints, in the form of attenuated light corresponding to the ridges of the print illuminated by the source. This assumes of course that the source is as pointlike as possible, failing which the cast shadow would be all the more blurred the wider the luminous emission surface of the source.
Figure 1 represents, in vertical section, the structure of a device for reading an image operating on this principle: the device, for example, comprises a printed circuit board 10 carrying various components and covered with a transparent layer or plate 12 (glass or a mouldable plastic, for example). The upper surface of the plate can be polished. Components placed on the printed circuit board are: a pointlike light source 14, such as a light-emitting diode, a matrix image sensor 16 defining an array of rows and columns of pixels, control circuits for the source and for the sensor, and circuits for utilizing the gathered signals. The control circuits and the utilization circuits are not represented so as to simplify the figure.
The light source, e.g., the light-emitting diode, emits light on the inside of the plate 12 towards the upper surface of the latter, and it is on a portion of this upper surface that it is possible to place a finger 18 whose print one wants to read and/or record. The surface portion to be observed, or utilized portion 22, can be surrounded by an opaque and absorbing mask 20 which protects the sensor from the ambient light so as to prevent the latter disturbing the observation. The lateral edges of the transparent plate can also be protected by an opaque layer (not shown).
The light source 14 can be placed in such a manner that the utilized portion 22 of the surface of the plate is illuminated at an oblique incidence greater than the angle of total reflection. If the plate is made of glass or of a transparent material with a refractive index of around 1.5, the minimum incidence angle is about 45° to obtain total reflection. The light source 14 is therefore not situated below the surface portion to be observed but can be shifted laterally (to the right in Figure 1). The lateral shift between the utilized portion 22 and the source 14 can be almost equal to the height H of the transparent plate above the source. To make the light source appear as pointlike as possible, the light source can be placed at a distance from the finger so that the emission surface appears almost pointlike. The transparent plate can be placed at a certain height to realize this distance; other means are possible for distancing the source, for example by using multiple reflections between two reflecting walls separated by a smaller height. The dimension of the matrix image sensor 16 is such that it can receive an image of the entirety of the utilized portion to be observed. If the sensor is almost as distant from the finger as the source is, the sensor must have a photosensitive matrix with linear dimensions almost twice as large as the utilized portion to be observed. It follows from this that the photosensitive matrix must have a surface area four times as large as the area of print observed. Figure 2 represents a plan view illustrating the relative dimensions of the observed zone (dashed rectangle 22) and of the photosensitive matrix area necessary for observing this zone (dashed rectangle 24 inside the block 16). The dimension of the observed zone can, for example, be one or two centimetres by one or two centimetres.
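The incidence figure quoted above can be checked against Snell's law; the exact critical angle for a plate of refractive index about 1.5 against air is arcsin(1/1.5), roughly 41.8°, so illuminating at about 45° gives a working margin.

```python
import math

# Critical angle for total internal reflection at the plate/air
# interface: sin(theta_c) = n_air / n_plate (Snell's law).
n_plate = 1.5   # refractive index of glass or a transparent plastic
n_air = 1.0
theta_c = math.degrees(math.asin(n_air / n_plate))
# theta_c is roughly 41.8 degrees; an incidence of about 45 degrees,
# as stated above, therefore guarantees total reflection with margin.
assert 41.0 < theta_c < 42.0
```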
It is possible to reduce the necessary sensor surface area by bringing the surface of the sensor closer to the surface of the transparent plate, as is represented in Figure 3 in section and in Figure 4 as viewed from above. The arrangement is, however, more complex to produce: it is necessary to elevate the sensor. In the example represented, with a plate height above the sensor one third of that above the source, one gains almost 30% in the linear dimensions of the matrix and 50% in the surface area. In Figure 4 are again represented the rectangle 22 of the observed zone and the rectangle 24 representing the matrix of the sensor 16, this matrix being smaller than that of Figure 2.
Another way to reduce the surface area of the sensor is to replace the matrix sensor with a sensor of a few image lines and to operate the system as a scanning sensor. The photosensitive surface includes a linear strip of a few lines and the finger is swiped over the observation surface, which is a narrow band instead of a square or a wide rectangle; partial images are gathered over the few lines of the strip in tandem with the swiping of the finger, and these images are recombined by superposition of the partial images obtained in the course of the swipe, so as to culminate in an overall image of the print. An example system is described in French patent publication FR 2 749 955.
Figures 5 and 6 represent, in lateral section and as viewed from above, the arrangement which results therefrom; the observed zone 22 is a rectangle of length L; the matrix (a strip of several lines) of the image sensor is also in the form of an elongated rectangle 24. To simplify the representation, it has been assumed that the sensor and the source are in the same plane as in Figure 1, and the length of the strip is then 2L. It would also be possible to adopt an arrangement similar to Figure 3 with a sensor closer to the surface of the transparent plate, which would reduce the length of the rectangle 24 in the same proportion as the reduction from Figure 2 to Figure 4 (if the height ratios are the same).
However, in all cases, the sensor is necessarily longer than the length L of the zone to be observed. The added length results from the principle of operation with a pointlike source and reflection of the rays on the observed zone, in the absence of a demagnifying optic to project the observed image onto the sensor.
SUMMARY

Disclosed herein is an optical sensor device that facilitates a reduction in the size of a sensor. The optical sensor can be utilized in a scanning sensor (e.g., a sensor device in which a finger is swiped over an observation surface) or in a static system (e.g., a sensor device in which a finger is placed on an observation surface).
In one implementation, the sensor device includes a transparent layer, first and second light sources, and an image sensor. The transparent layer defines a utilized portion on the upper surface on which a microrelief, e.g., a fingerprint, can be observed. The first and second light sources are positioned to illuminate an upper inside surface of the transparent layer so that light can be totally reflected onto the image sensor in the absence of a microrelief on the utilized portion. The image sensor is positioned so that part of the light which undergoes total reflection from the utilized portion of the upper surface falls on the surface of the image sensor. The image sensor dimensions can be such that the image sensor receives light totally reflected from a first portion of the utilized portion in response to light from the first light source, and receives light totally reflected from a second portion of the utilized portion in response to light from the second light source.
In an implementation, the image sensor can include photosensitive elements, and the first and second light sources can be pointlike light sources. In an implementation, the first and second light sources can be activated sequentially, and image data provided by the image sensor during each activation can be combined to form an image based on the first and second portions.
In an implementation, the first portion and the second portion can define an overlapping area within the utilized portion. In another implementation, the first portion and the second portion can define disjointed areas (i.e., mutually exclusive areas) within the utilized portion.
BRIEF DESCRIPTION OF THE DRAWINGS

Figures 1 and 2 represent, respectively in vertical section and as viewed from above, the principle of a system for reading a print using a pointlike light source.
Figures 3 and 4 represent, respectively in section and as viewed from above, a modification in which the sensor is placed higher than the source.
Figures 5 and 6 represent, respectively in section and as viewed from above, an adaptation to a scanning system in which the finger swipes over the observation surface.
Figure 7 represents a top view of an example sensor device.

Figure 8 represents another example sensor device having partial overlap of two image parts.
Figure 9 represents another example sensor device having disjointed image parts.
Figures 10 and 11 represent example sensor devices having three and four light sources respectively.
Figures 12 and 13 represent example sensor devices in which the light sources are positioned at the two ends of a linear sensor.
DETAILED DESCRIPTION

Figure 7 represents a top view of an example sensor device. To illustrate how the example sensor device of Figure 7 facilitates utilization of a sensor having reduced dimensions, the lateral offset D between the light source and the utilized portion to be observed and the height H of the transparent plate above the light source are illustratively the same as the corresponding dimensions D and H represented in Figures 5 and 6. Additionally, the upper surface of the image sensor is illustratively at the same height as the surface of emission of the source as in Figure 5. Other dimensions, however, can be used.
Thus, when viewed in vertical section, the sensor of Figure 7 has the dimensions represented in Figure 5: the printed circuit board 10, the pointlike source 14, the linear sensor 16, and the transparent layer or plate 12 with its masking layer 20, which surrounds the utilized portion 22 of the plate surface to be observed.
The example sensor of Figure 7, however, includes at least two light sources laterally offset from one another, 14a and 14b, and positioned at a distance D from the utilized portion 22. The light sources 14a and 14b can, for example, operate alternately for collecting data from the image sensor 16. In one implementation, the light sources 14a and 14b are not simultaneously activated, and image capture can be performed synchronously with activation of the light sources 14a and 14b so that an image registered by the sensor 16 results from the illumination by only one of the two sources at a time. In the implementation shown in Figure 7, the linear image sensor 16 is substantially half the length of the sensor of Figure 6.
The light emanating from the source 14a illuminates the utilized portion 22, and the total reflection of this light is projected in part onto the sensor 16 and in part outside of the sensor 16, as the latter is half the length of the sensor of Figure 6 and is situated at the same distance D. Thus, a first part 22a of the utilized portion 22 is reflected onto a photosensitive strip that is defined by an elongated rectangle 24 inside the image sensor 16. The remainder of the utilized portion 22, outside of the first part 22a, is reflected elsewhere than on the strip.
Likewise, the second pointlike light source 14b illuminates the entirety of the utilized portion 22, but only a second part 22b is actually projected onto the rectangle 24 defining the photosensitive surface of the strip of the sensor 16. In one implementation, the image sensor 16 and the light sources 14a and 14b are symmetrically placed at distance D with respect to the utilized portion 22, resulting in a substantially double magnification; thus the length of each of the portions 22a and 22b can, for example, be substantially half the length of the sensor.

When the first source 14a is turned on, the sensor 16 receives an image of the first part 22a of the print of the finger placed on the zone to be observed 22. The first source is then turned off, the second is turned on, and the sensor 16 receives an image of the second part 22b of the print. These two images are stored for image processing. In one implementation, the image processing can include combining the two images to generate an image of the entire zone 22. Thereafter, successive combined images of the entire zone 22 can be combined to generate a complete image of the print as the finger moves over the zone 22. In another implementation, images corresponding to the first part 22a are interleaved in combination to form an image of a first side, e.g., a left side, and images corresponding to the second part 22b are interleaved in combination to form an image of a second side, e.g., a right side; the image of the first side is then combined with the image of the second side to reconstruct the entire image. In Figure 7, the light sources 14a and 14b are placed along the sensor, at distance 2D from the latter, so that the portions 22a and 22b are substantially juxtaposed and the length of the utilized zone 22 is substantially equal to the sum of the lengths of the portions 22a and 22b.
Assuming the same height H of the surface at the observed zone 22 of the plate above the sources 14a and 14b and above the sensor 16, the sources 14a and 14b can be offset substantially by the length L of the zone to be observed, and the length of the strip of the sensor, e.g., the rectangle 24, can be substantially equal to L. Thus the length of the image sensor 16 can be considerably reduced. In another implementation, the height of the surface above the sensor is reduced by an arrangement analogous to that of Figure 3, and the portions 22a and 22b thus become partially mutually overlapping. This implementation can minimize loss of information at the centre of the finger. An image processing routine to identify and eliminate the redundant portions of the first and second portions 22a and 22b can also be implemented. For example, a simple point-to-point comparison of the two images can facilitate identification and elimination of identical image data. The image processing routine can, for example, take into account the a priori knowledge that a portion of the right part of the first image is substantially identical to a portion of the left part of the second image, due to this overlap and due to the second source being activated substantially immediately after the first source is turned off.
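The point-to-point comparison of redundant portions mentioned above can be sketched as a simple exact-match overlap search. This sketch assumes noise-free pixel values (which real sensor images would not satisfy) and uses illustrative function names not taken from the patent.

```python
from typing import List, Sequence

def overlap_width(left: Sequence[int], right: Sequence[int],
                  max_overlap: int) -> int:
    """Largest k <= max_overlap such that the last k pixels of the left
    image equal the first k pixels of the right image."""
    for k in range(min(max_overlap, len(left), len(right)), 0, -1):
        if list(left[-k:]) == list(right[:k]):
            return k
    return 0

def merge(left: List[int], right: List[int], max_overlap: int) -> List[int]:
    """Join the two halves, dropping the redundant (overlapping) pixels."""
    k = overlap_width(left, right, max_overlap)
    return left + right[k:]

a = [1, 2, 3, 4, 5]
b = [4, 5, 6, 7]       # pixels 4 and 5 are seen by both halves
print(merge(a, b, 3))  # [1, 2, 3, 4, 5, 6, 7]
```

With noisy data, the exact comparison would be replaced by a tolerance or a correlation score, but the structure of the search is the same.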
Figure 8 represents another example sensor device having partial overlap of two image parts. The example sensor device of Figure 8 obtains a partial mutual overlap of the left and right images successively obtained, with the sensor at substantially the same height as the source. In this implementation, the sensor 16 is of a length that is slightly larger than the length L. The sources 14a and 14b are slightly closer together, their spacing being slightly less than L.

Figure 9 represents another example sensor device having substantially disjointed image parts. In the example sensor device of Figure 9, the portions 22a and 22b are not in juxtaposition, and therefore a blind zone occurs where the print is not detected. The length of the photosensitive strip of the sensor 16 can thus be smaller than the length L of the zone 22 to be observed. The blind zone is hatched between the parts 22a and 22b of the observed utilized zone 22.
Figure 10 represents another example sensor device utilizing three pointlike light sources 14a, 14b, and 14c. In one implementation, the light sources 14a, 14b and 14c are illuminated successively, their illuminations not being simultaneous. The utilized zone 22 can thus be segmented into three different parts. In one implementation, the relative position of the light sources 14a, 14b and 14c, the utilized zone 22, and the sensor 16 can be configured so that the corresponding image parts 22a, 22b and 22c are substantially juxtaposed. In another implementation, the spacing can be such that the corresponding image parts 22a, 22b and 22c are partially overlapping. In another implementation, the spacing can be such that the corresponding image parts 22a, 22b and 22c define blind zones along the length of the sensor 16. Each of the three parts 22a, 22b and 22c of the utilized observation portion 22 can be successively projected onto the surface of the sensor 16, and the images can thereafter be recombined. Utilization of three light sources further facilitates a reduction in size of the sensor 16, e.g., the image sensor 16 of Figure 10 can be smaller than the image sensor 16 of Figure 7, in which only two light sources are used. In the example sensor device of Figure 10, the relative position of the light sources 14a, 14b and 14c, the utilized zone 22, and the sensor 16 is such that there is a partial overlap of the three parts observed successively.
Figure 11 represents another example sensor device utilizing four light sources 14a, 14b, 14c and 14d. The utilization of the four sources 14a, 14b, 14c and 14d facilitates an even further reduction in size of the sensor 16. In one implementation, the four light sources 14a, 14b, 14c and 14d can be illuminated successively, each projecting onto the photosensitive strip of the sensor 16 a magnified image of a respective part of the surface 22 to be observed. In one implementation, the images are gathered successively, in synchronism with the alternating illumination of the sources. Image portions corresponding to the various sources generated in the course of the successive alternations can be recombined so as to constitute an overall image. For example, if there are two sources, the first image emanating from the illumination by the source 14a can be gathered, then the second, immediately following image emanating from the illumination by the source 14b can be gathered. In an implementation in which the image portions are substantially juxtaposed, the gathered images can be juxtaposed to form a complete image. In an implementation in which the images are mutually overlapping, e.g., overlapping over a length known a priori, a subtraction of the common parts can be performed by digital processing and the remaining image parts can be juxtaposed. In an implementation in which there is no overlap and there exists a blind zone, the blind zone can be taken into account, its dimension being known, or the blind zone can be ignored during a comparison with a prerecorded print.
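The three recombination cases above (exactly juxtaposed parts, parts overlapping by a length known a priori, or parts separated by a blind zone) can be sketched with a single helper. This is an illustrative sketch: a positive `overlap` drops the redundant pixels, zero means exact juxtaposition, and a negative value inserts marker pixels for the blind zone (the `BLIND` marker value is an assumption, not from the patent).

```python
from typing import List, Sequence

BLIND = -1  # placeholder pixel value for the unobserved (blind) zone

def recombine(parts: Sequence[Sequence[int]], overlap: int) -> List[int]:
    """Join successively gathered image parts into one line.

    overlap > 0: each part repeats `overlap` pixels of the previous one
                 (known a priori); the repeats are dropped.
    overlap == 0: parts are exactly juxtaposed.
    overlap < 0: a blind zone of |overlap| pixels lies between parts and
                 is filled with BLIND markers.
    """
    out: List[int] = list(parts[0])
    for p in parts[1:]:
        if overlap >= 0:
            out.extend(p[overlap:])
        else:
            out.extend([BLIND] * (-overlap))
            out.extend(p)
    return out

print(recombine([[1, 2, 3], [3, 4, 5]], 1))  # [1, 2, 3, 4, 5]
print(recombine([[1, 2], [3, 4]], 0))        # [1, 2, 3, 4]
print(recombine([[1, 2], [5, 6]], -2))       # [1, 2, -1, -1, 5, 6]
```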
If a fingerprint is scanned, an overall print image can be reconstructed on the basis of partial images obtained successively in the course of the swiping of the finger. Typically, the successive fingerprint images partially overlap one another. Consequently, to effect this recombination in the example systems of Figures 7-11, an image process can take into account a recombination of left and right images, as the image is divided into at least a left part and a right part observed successively. Additionally, the image process can take into account the recombination of partial images in the course of the movement. In one implementation, after activation of the first light source, an image of the right half of a print part is gathered (e.g., several pixel lines of the right print half) and stored. The first light source is turned off, and the second light source is turned on. An image of the left print half is gathered and stored. A juxtaposition processing of the two parts is performed to obtain an image of the two halves at one and the same time over several pixel lines, and the image is stored. The image process repeats to generate a succession of partial images of several pixel lines, each partial image comprising at one and the same time the right half and the left half of the print. The various partial images are recombined in the conventional way provided for in scanning print sensors: by correlation between the partially overlapping successive partial images, shifting of the partial images with respect to one another as a function of the results of the correlation, and grouping of the shifted images to form a complete image of the print. In another implementation, successive partial images of the right half and successive partial images of the left half are gathered separately. A recombination of all the right images is generated, and likewise a recombination of all the left images is generated.
After all the fingerprint data are collected, an image of a complete half of the print and another image of the other complete half are constructed. These two halves are combined, either by simple juxtaposition if there is no overlap between the two parts 22a and 22b of the observed utilized portion (such as in Figure 7, for example), or by elimination of the redundant parts if there is overlap.
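The correlation-and-shift recombination of successive swipe frames can be sketched as follows. This is a toy 1-D version under stated assumptions: real scanning sensors correlate a few 2-D pixel lines, the frames are assumed noise-free and of equal length, and the sum-of-absolute-differences score is one common choice of correlation measure, not necessarily the one used in practice.

```python
from typing import List, Sequence

def best_shift(prev: Sequence[int], cur: Sequence[int],
               max_shift: int) -> int:
    """Shift (in pixels) of `cur` relative to `prev` that minimizes the
    mean absolute difference over their overlapping region.
    Assumes equal-length frames and max_shift < len(prev)."""
    best, best_err = 0, float("inf")
    for s in range(0, max_shift + 1):
        ov = len(prev) - s
        err = sum(abs(prev[s + i] - cur[i]) for i in range(ov)) / ov
        if err < best_err:
            best, best_err = s, err
    return best

def stitch(frames: Sequence[Sequence[int]], max_shift: int) -> List[int]:
    """Grow a complete image by appending the newly revealed pixels of
    each frame, shifted according to the correlation with the previous
    frame."""
    out = list(frames[0])
    prev = frames[0]
    for cur in frames[1:]:
        s = best_shift(prev, cur, max_shift)
        out.extend(cur[len(cur) - s:])  # only the s new pixels
        prev = cur
    return out

ridge = [0, 9, 1, 8, 2, 7, 3, 6, 4, 5]        # synthetic print profile
frames = [ridge[i:i + 6] for i in (0, 2, 4)]  # finger moves 2 px per frame
print(stitch(frames, 4) == ridge)             # True
```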
The principle is the same with three or four light sources, the image portions being respective thirds or quarters of the length of the utilized portion 22 to be observed.
Figure 12 is a transverse view of another example sensor system in which the sources 14a and 14b are positioned in the vertical axial plane oriented in the direction of elongation of the sensor, i.e., along a longitudinal axis. The length of the linear sensor 16 is slightly less than the length L of the utilized portion 22. The utilized observation portion 22 can be positioned substantially above the sensor 16. The sources 14a and 14b each illuminate a part, respectively left or right, of the observation zone 22, and the total reflection projects this part onto a portion of the surface of the photosensitive strip of the sensor 16. A blind zone Za persists in the middle of the observation zone 22, as the edges of the sensor 16 mask in part the light of the sources 14a and 14b.
Figure 13 is a transverse view of another example sensor system in which the sources 14a and 14b are positioned along the longitudinal axis of the sensor 16. The sensor 16 of Figure 13 can be smaller than the sensor of Figure 12 due to the sources 14a and 14b being placed at a greater depth than the sensor 16 relative to the upper surface of the utilized portion over which the finger swipes. In this implementation, the sensor 16 can be further reduced (e.g., of the order of half the width of a finger), but a blind zone Za again persists. To reduce this blind zone, the light sources can be offset from the axial plane of the sensor in a configuration similar to that of Figure 9, e.g., the sources can be offset by a distance greater than the length of the sensor 16 and disposed laterally with respect to the sensor 16. The utilized portion can then be positioned substantially above the sensor 16 and slightly shifted perpendicularly to the length of the sensor 16.
While the light sources 14a - 14d described above are illustratively identical, light sources of different types can be used. For example, light sources of different wavelengths can be used.
This written description sets forth the best mode of the invention and provides examples to describe the invention and to enable a person of ordinary skill in the art to make and use the invention. This written description does not limit the invention to the precise terms set forth. Thus, while the invention has been described in detail with reference to the examples set forth above, those of ordinary skill in the art may effect alterations, modifications and variations to the examples without departing from the scope of the invention.

Claims

1. An optical system for reading microreliefs, in particular fingerprints, comprising:
- a transparent layer (12) comprising a utilized portion of upper surface in contact with which can come a microrelief portion, in particular a finger print (18), to be observed,
- a pointlike light source (14a) such as a light-emitting diode illuminating the upper surface via an inside part of said transparent layer at an angle of incidence greater than an angle of total reflection of an interface between the upper surface of the transparent layer and air outside this surface,
- and an image sensor (16) with an array of photosensitive pixels, placed laterally with respect to the source so that light emanating from the source and reflected by total reflection by the utilized portion of the upper surface of the transparent layer falls on a surface of the sensor so as to allow a gathering by the sensor of an image of this portion, wherein, in order to reduce a dimension of the sensor for a given surface area of the utilized portion,
- on the one hand the sensor (16) has dimensions making it possible to receive light rays emanating from total reflection of the source (14a) on a first part (22a) of the utilized portion but not the entirety of the utilized portion,
- on the other hand there is provided at least one second light source (14b) offset from the first source and placed in such a manner that the sensor receives the light rays emanating from total reflection of the second source on a second part (22b), not identical with the first part, of the utilized portion, but not the entirety of the utilized portion,
- the light sources being able to be turned on sequentially and the sensor then sequentially providing an image of the first part followed by an image of the second part of the utilized portion.
2. An optical system according to Claim 1, wherein the two parts (22a, 22b) of the utilized portion are juxtaposed with a partial mutual overlap.
3. An optical system according to one of Claims 1 and 2, wherein the image is gathered by scanning a finger over the upper surface, the utilized portion being an elongated narrow rectangle of length approximately equal to the width of a finger, and the image sensor comprising a strip of a few photosensitive pixel lines.
4. An optical system according to Claim 3, wherein the light sources are aligned parallel to the length of the sensor and on one side of the latter.
5. An optical system according to Claim 3, wherein there are three or four light sources turned on one after the other.
6. An optical system according to Claim 3, comprising two light sources, the length of the photosensitive strip being approximately equal to the length of the microrelief portion to be observed.
7. An optical system according to one of Claims 3 to 5, wherein the length of the strip is less than that of the microrelief portion to be observed.
8. An optical system according to Claim 3, wherein the pointlike light sources are situated in the alignment of the length of the linear sensor, at each extremity of the latter.
9. An optical system according to Claim 8, wherein the sources are placed at a greater depth (H) than the sensor below the surface of the utilized portion.
10. An optical system according to Claim 9, wherein the light sources emit at different wavelengths.
PCT/EP2006/069497, "Optical sensor" (priority 2005-12-23, filed 2006-12-08), published as WO2007071572A1.

Priority Applications (2)

- EP06830489A (EP1969524A1), priority 2005-12-23, filed 2006-12-08: Optical sensor
- JP2008546358A (JP2009520962A), priority 2005-12-23, filed 2006-12-08: Light sensor

Applications Claiming Priority (2)

- FR0513218, priority 2005-12-23
- FR0513218A (FR2895546B1), priority 2005-12-23, filed 2005-12-23: Optical system for reading microreliefs, in particular fingerprints

Publications (1)

- WO2007071572A1, published 2007-06-28

Family ID: 36809532

Family Applications (1)

- PCT/EP2006/069497 (WO2007071572A1), priority 2005-12-23, filed 2006-12-08: Optical sensor

Country Status (5)

- EP: EP1969524A1
- JP: JP2009520962A
- FR: FR2895546B1
- TW: TW200739430A
- WO: WO2007071572A1


Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040208347A1 (en) * 2003-04-18 2004-10-21 Izhak Baharav System and method for time-space multiplexing in finger-imaging applications


Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10238085B2 (en) * 2013-11-01 2019-03-26 Children's Medical Center Corporation Devices and methods for analyzing rodent behavior
US11432528B2 (en) 2013-11-01 2022-09-06 President And Fellows Of Harvard College Devices and methods for analyzing rodent behavior
DE102014112746B4 (en) * 2014-05-30 2018-01-25 Cheng Uei Precision Industry Co., Ltd. Fingerprint sensor module
US10650228B2 (en) 2015-09-18 2020-05-12 Children's Medical Center Corporation Devices and methods for analyzing animal behavior
US11553687B2 (en) 2017-05-12 2023-01-17 Children's Medical Center Corporation Devices for analyzing animal behavior
US11678035B2 (en) 2017-10-05 2023-06-13 University Of Utah Research Foundation Translucent imaging systems and related methods
WO2019114276A1 (en) * 2017-12-15 2019-06-20 Boe Technology Group Co., Ltd. Fingerprint recognition device, fingerprint recognition method, and display device
US11288483B2 (en) 2017-12-15 2022-03-29 Boe Technology Group Co., Ltd. Fingerprint recognition device, fingerprint recognition method, and display device
CN111061089A (en) * 2019-12-13 2020-04-24 武汉华星光电技术有限公司 Display device
CN111061089B (en) * 2019-12-13 2021-04-27 武汉华星光电技术有限公司 Display device

Also Published As

- FR2895546B1, published 2008-06-06
- FR2895546A1, published 2007-06-29
- JP2009520962A, published 2009-05-28
- TW200739430A, published 2007-10-16
- EP1969524A1, published 2008-09-17


Legal Events

- 121 (EP): the EPO has been informed by WIPO that EP was designated in this application
- WWE (entry into national phase): document 2008546358, country JP
- WWE (entry into national phase): document 1020087015205, country KR
- NENP (non-entry into the national phase): country DE
- WWE (entry into national phase): document 2006830489, country EP
- WWP (published in national office): document 2006830489, country EP