CN111709330B - Optical detection system and electronic equipment - Google Patents
- Publication number
- CN111709330B CN111709330B CN202010491268.5A CN202010491268A CN111709330B CN 111709330 B CN111709330 B CN 111709330B CN 202010491268 A CN202010491268 A CN 202010491268A CN 111709330 B CN111709330 B CN 111709330B
- Authority
- CN
- China
- Prior art keywords
- image
- lens
- external object
- field
- difference
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/12—Fingerprints or palmprints
- G06V40/13—Sensors therefor
- G06V40/1324—Sensors therefor by using geometrical optics, e.g. using prisms
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/147—Details of sensors, e.g. sensor lenses
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/12—Fingerprints or palmprints
- G06V40/13—Sensors therefor
- G06V40/1318—Sensors therefor using electro-optical elements or layers, e.g. electroluminescent sensing
Abstract
The application discloses an optical detection system comprising a first lens and a second lens whose fields of view at least partially overlap, an image sensor located below the first lens and the second lens, and a processing module. The image sensor includes a plurality of first photosensitive units and a plurality of second photosensitive units. The first photosensitive units receive the imaging light converged by the first lens and convert it into corresponding electrical signals. The second photosensitive units receive the light focused and imaged by the second lens and convert it into corresponding electrical signals. The processing module compares the difference between the images formed, through the first lens and the second lens respectively, of an external object within the overlapping field of view, and judges the external object to be a three-dimensional object when the difference is greater than or equal to a preset threshold. The application also discloses an electronic device including the optical detection system.
Description
Technical Field
The present application relates to the field of optoelectronic technology, and in particular to an optical detection system and an electronic device that detect an external object using optical imaging principles.
Background
At present, the optical fingerprint identification function of electronic products such as mobile phones and tablet computers is usually realized by identifying a planar fingerprint image captured when a user presses the screen surface. It can therefore easily be defeated by attackers using low-cost planar fake-fingerprint props, for example a piece of adhesive tape or a picture printed with a fingerprint pattern stuck onto the fingerprint identification area of the screen surface. The optical fingerprint identification function of existing electronic products thus poses an obvious security risk.
Disclosure of Invention
In view of the above, the present application provides an optical detection system and an electronic device that can mitigate the above-mentioned problems of the prior art.
An aspect of the present application provides an optical detection system for detecting an external object, including:
a first lens having a first field of view for imaging an external object within the first field of view;
a second lens having a second field of view range for imaging an external object within the second field of view range, the first field of view range and the second field of view range at least partially overlapping, the portion where the first field of view range and the second field of view range overlap each other being defined as the overlapping field of view range;
an image sensor positioned below the first lens and the second lens, the image sensor comprising a plurality of first photosensitive units and a plurality of second photosensitive units, wherein the first photosensitive units are used for receiving the imaging light converged by the first lens and converting it into corresponding electrical signals, and the second photosensitive units are used for receiving the imaging light converged by the second lens and converting it into corresponding electrical signals;
a processing module for comparing the difference, within the overlapping field of view range, between a first image of the external object formed through the first lens and a second image formed through the second lens, and for judging that the external object is a three-dimensional object when the difference is greater than or equal to a preset threshold.
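The threshold decision performed by the processing module can be sketched minimally as follows. This is a hypothetical illustration only: the function name and the use of a single scalar `difference` are assumptions; the patent leaves the concrete difference metric to the later embodiments (pitch difference or gray-level difference).

```python
def is_three_dimensional(difference: float, preset_threshold: float) -> bool:
    """Judge an external object three-dimensional when the difference
    between the first image and the second image is greater than or
    equal to the preset threshold, as described in the aspect above."""
    return difference >= preset_threshold
```

For a planar fake (tape or printed picture), the parallax-induced difference would ideally be near zero, so the function returns False and the spoof is rejected.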
In some embodiments, the optical detection system further comprises a lens module located below a display screen, the lens module comprising a plurality of small lenses arranged in an array, the first lens and the second lens each being one of the small lenses.
In some embodiments, the external object includes a first feature point and a second feature point, a distance between corresponding image points of the first feature point and the second feature point on the first image is a first pitch, a distance between corresponding image points of the first feature point and the second feature point on the second image is a second pitch, a difference between the first image and the second image is a difference between the first pitch and the second pitch, and the preset threshold is a threshold of the difference between the first pitch and the second pitch.
In some embodiments, the external object includes a first feature point and a second feature point, a difference between gray values of corresponding image points of the first feature point and the second feature point on the first image is a first gray difference, a difference between gray values of corresponding image points of the first feature point and the second feature point on the second image is a second gray difference, a difference between the first image and the second image is a difference between the first gray difference and the second gray difference, and the preset threshold is a threshold of the difference between the first gray difference and the second gray difference.
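The two difference metrics defined in the embodiments above (the pitch difference and the gray-level difference) could be computed along the following lines. The helper names and the choice of Euclidean distance for "pitch" are assumptions of this sketch, not taken from the patent text:

```python
import math

def pitch(p1, p2):
    """Distance between two image points (x, y) on one image, e.g. the
    image points of the first and second feature points."""
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

def pitch_difference(a1, c2, a3, c4):
    """|first pitch - second pitch|: the first pitch is measured on the
    first image (points a1, c2), the second pitch on the second image
    (points a3, c4)."""
    return abs(pitch(a1, c2) - pitch(a3, c4))

def gray_difference(g_a1, g_c2, g_a3, g_c4):
    """|first gray difference - second gray difference|, where each gray
    difference is taken between the two feature points' image points on
    the same image."""
    return abs(abs(g_a1 - g_c2) - abs(g_a3 - g_c4))
```

Either metric (or both) would then be compared against its preset threshold to decide whether the object is three-dimensional.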
In some embodiments, the external object is a portion of a fingerprint trace of a user's finger within the overlapping field of view, the first feature point is a fingerprint ridge, and the second feature point is a fingerprint valley.
In some embodiments, the first feature point and the second feature point respectively form an image point on the first image and the second image, and the distance between the first feature point and the second feature point ranges from 100 μm to 300 μm.
In certain embodiments, the overlapping field of view range accounts for at least 30% of each of the first field of view range and the second field of view range.
In some embodiments, the display screen is an active light-emitting display screen, and light emitted by the display screen can be used to illuminate the external object to form the first image and/or the second image; alternatively, the display screen is a passive light-emitting display module, and the under-screen optical detection system further comprises an excitation light source for providing the light required for detection.
In some embodiments, the processing module is further configured to compare the first image and/or the second image of the external object with a pre-stored external object template, and identify the identity of the external object according to the comparison result.
An aspect of the present application provides an electronic device comprising an optical detection system as provided by the above embodiments.
A beneficial effect of the optical detection system is that it judges whether the external object is a three-dimensional object by checking whether a parallax-induced difference exists between images of the external object acquired from different viewing angles. This effectively prevents attackers from spoofing the identification function of the optical detection system with a planar imitation printed with an image of the external object, and thus improves the security of the electronic device.
Drawings
FIG. 1 is a schematic diagram of an under-screen optical detection system applied to an electronic device according to an embodiment of the present application;
fig. 2 is a perspective view of an optical detection system according to a first embodiment of the present application;
FIG. 3 is a schematic partial cross-sectional view of the optical detection system shown in FIG. 2 along line III-III;
FIG. 4 is an imaging light path diagram of the first and second lenses of FIG. 3;
FIG. 5 is a functional block diagram of an optical detection system according to a first embodiment of the present application;
FIG. 6 is a schematic partial cross-sectional view of an optical detection system according to a second embodiment of the present application.
DETAILED DESCRIPTION OF EMBODIMENT(S) OF THE INVENTION
In the detailed description of embodiments of the application, it will be understood that when a substrate, sheet, layer, or pattern is referred to as being "on" or "under" another substrate, sheet, layer, or pattern, it can be "directly" or "indirectly" on the other substrate, sheet, layer, or pattern, or one or more intervening layers may also be present. The thickness and size of each layer in the drawings of the specification may be exaggerated, omitted, or schematically represented for clarity. Moreover, the sizes of elements in the drawings do not entirely reflect actual sizes.
The following description of the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The following disclosure provides many different embodiments, or examples, for implementing different structures of the application. In order to simplify the present disclosure, components and arrangements of specific examples are described below. They are, of course, merely examples and are not intended to limit the application. Furthermore, the present application may repeat reference numerals and/or letters in the various examples, which are for the purpose of brevity and clarity, and which do not themselves indicate the relationship between the various embodiments and/or configurations discussed. In addition, the present application provides examples of various specific processes and materials, but one of ordinary skill in the art will recognize the application of other processes and/or the use of other materials.
Further, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the application. It will be appreciated, however, by one skilled in the art that the inventive aspects may be practiced without one or more of the specific details, or with other structures, components, etc. In other instances, well-known structures or operations are not shown or described in detail to avoid obscuring the application.
Referring to fig. 1, fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the application. The electronic device 1 comprises an under-screen optical detection system 10 for detecting an external object 2. The under-screen optical detection system 10 includes a protective layer 12, a display screen 14 located below the protective layer 12, and an optical detection system 16 positioned below the display screen 14. The display screen 14 is used for displaying images. The protective layer 12 transmits the display light emitted from the display screen 14 and protects the display screen 14 from damage. The optical detection system 16 is configured to receive light from the external object 2 through the protective layer 12 and the display screen 14, and to convert the received light into corresponding electrical signals for information sensing. The optical detection system 16 may, for example, sense biometric information such as, but not limited to, fingerprint information, palm print information, or other texture feature information, and/or vital-sign information such as blood oxygen information, heartbeat information, or pulse information. However, the present application is not limited thereto, and the optical detection system 16 may be used for other information sensing, such as depth information sensing or proximity sensing. In the present application, the external object 2 is mainly exemplified as a finger of a user, with the optical detection system 16 performing fingerprint sensing.
The electronic device 1 is for example, but not limited to, a consumer electronic product, a household electronic product, a vehicle-mounted electronic product, a financial terminal product or other suitable type of electronic product. The consumer electronic products are, for example, mobile phones, tablet computers, notebook computers, desktop displays, computer integrated machines, and the like. The household electronic products are, for example, intelligent door locks, televisions, refrigerators and the like. The vehicle-mounted electronic product is, for example, a vehicle-mounted navigator, a vehicle-mounted touch interaction device and the like. The financial terminal products are, for example, ATM machines, terminals for self-service transactions, etc.
It should be noted in advance that, in the present application, the optical detection system 16 has a plurality of different embodiments; for clarity, the optical detection systems 16 in the different embodiments are denoted by the reference numerals 16a and 16b, respectively, to distinguish them. Further, for convenience of description, like reference numerals in different embodiments of the optical detection system 16 refer to like elements, and the description of such elements in one embodiment may be modified, replaced, expanded, or combined in another.
Alternatively, in some embodiments, the display 14 may be an active light emitting display, such as, but not limited to, an organic light emitting diode display (OLED display), or the like. The display 14 may serve as an excitation light source providing light for detection, such as, but not limited to, visible light in the wavelength range between 400 and 780 nanometers (nm). Alternatively, in other embodiments, the display 14 may be a passive light emitting display, such as, but not limited to, a Liquid Crystal Display (LCD) or an electronic paper display. The electronic device 1 employing a passive light emitting display requires an additional excitation light source to provide light for detection, such as, but not limited to, near infrared light having a wavelength in the range of 800 to 1000 nm. Alternatively, an excitation light source may be provided for the electronic device using the active light emitting display to provide the light for detection.
Alternatively, in some embodiments, the protective layer 12 may comprise a transparent material, such as, but not limited to, transparent glass, a transparent polymer, or any other transparent material. The protective layer 12 may have a single-layer or multilayer structure. The protective layer 12 is generally a thin plate having a predetermined length, width, and thickness. It will be appreciated that the protective layer 12 may also comprise a plastic film, a tempered film, or other film layers attached by the user in actual use. The outer surface 120 of the protective layer 12 may be the outermost surface of the electronic device 1. During detection, the external object 2 may directly contact the outer surface 120 of the protective layer 12.
Referring to fig. 2, 3 and 5, fig. 2 is a perspective view of an optical detection system 16a according to a first embodiment of the present application. FIG. 3 is a schematic partial cross-sectional view of the optical detection system 16a shown in FIG. 2 along line III-III. Fig. 5 is a functional block diagram of an optical detection system 16a according to a first embodiment of the present application. The optical detection system 16a includes a lens module 160 and an image sensor 162 disposed below the lens module 160. The lens module 160 is configured to converge light rays onto the image sensor 162 to image the external object 2. The image sensor 162 is configured to convert the received light into a corresponding electrical signal.
The Lens module 160 includes a plurality of small lenses 1600 (Mini-Lens), the plurality of small lenses 1600 are disposed above the image sensor 162, and the plurality of small lenses 1600 are spaced apart from each other. The plurality of lenslets 1600 are used to converge light onto the image sensor 162 to image the external object 2.
Optionally, in this embodiment, the plurality of lenslets 1600 are arranged in a regular array. Further alternatively, the plurality of lenslets 1600 are arranged in, for example, but not limited to, a rectangular array. However, alternatively, in other embodiments, the plurality of lenslets 1600 may be arranged irregularly.
Optionally, in this embodiment, the lens module 160 further includes a first substrate 1603, and the plurality of lenslets 1600 are disposed on the first substrate 1603. The first substrate 1603 is made of a transparent material such as, but not limited to, transparent acrylic, lens glass, UV glue, etc. The first substrate 1603 may be integrally formed with the plurality of lenslets 1600. Alternatively, the plurality of lenslets 1600 and the first substrate 1603 are elements manufactured independently, and the plurality of lenslets 1600 are fixed on the first substrate 1603 by means of glue or the like.
Optionally, a plurality of the lenslets 1600 are convex lenses. Further alternatively, the plurality of lenslets 1600 are spherical or aspherical lenses.
Optionally, a plurality of the lenslets 1600 are made of a transparent material such as, but not limited to, transparent acrylic, lens glass, UV glue, etc.
Optionally, in this embodiment, the optical detection system 16 further includes a second substrate 163 located below the image sensor 162. The second substrate 163 may provide support for the image sensor 162 and electrical connection to external circuitry. The second substrate 163 is, for example, a flexible circuit board or a hard circuit board.
Optionally, in this embodiment, the optical detection system 16 further includes a filter layer 164 disposed on the image sensor 162. The filter layer 164 transmits light within a predetermined wavelength range for detection and filters out other light outside the predetermined wavelength range. Optionally, in other embodiments, the filter layer 164 may be coated on the light entrance surfaces of the plurality of lenslets 1600.
Optionally, in this embodiment, the optical detection system 16a further includes a light shielding layer 165, where the light shielding layer 165 is disposed on the first substrate 1603 and is located in a space region between the plurality of the lenslets 1600. The light shielding layer 165 is used for shielding light to reduce interference of detection caused by stray light not converged by the lenslets 1600 impinging on the image sensor 162.
Alternatively, in other embodiments, the light shielding layer 165 may be disposed at a different location. For example, the light shielding layer 165 is disposed between the filter layer 164 and the first substrate 1603, and the light shielding layer 165 has light holes (not shown) corresponding to the plurality of lenslets 1600. Alternatively, the light shielding layer 165 is disposed between the filter layer 164 and the image sensor 162, and the light shielding layer 165 has light transmission holes corresponding to the plurality of lenslets 1600. The application is not limited in this regard.
Optionally, in other embodiments, the optical detection system 16 may further include a protective film (not shown) disposed over the plurality of lenslets 1600. The protective film may cover the plurality of lenslets 1600 and/or the light shielding layer 165 to provide moisture protection, dust protection, scratch protection, etc. In some embodiments, the protective film may also be omitted.
The image sensor 162 includes a plurality of photosensitive units 1624, together with a readout circuit and other auxiliary circuits electrically connected to the photosensitive units 1624. Each photosensitive unit 1624 is a photodetector such as, but not limited to, a photodiode. The photosensitive units 1624 are configured to receive the imaging light converged by the lens module 160 and convert it into corresponding electrical signals to obtain biometric information of the external object 2. The external object 2 is, for example, but not limited to, a user's finger or palm, and the biometric information is, for example, but not limited to, fingerprint image data of the user's finger. Alternatively, each photosensitive unit 1624 may have a side length ranging from 5 micrometers (μm) to 10 μm.
Alternatively, in the present embodiment, the plurality of photosensitive units 1624 are arranged in an array. However, in some embodiments, the arrangement of the plurality of photosensitive units 1624 may instead form a regular or irregular two-dimensional pattern. Each of the lenslets 1600 has a predetermined field of view (FOV), and each of the lenslets 1600 is capable of imaging an external object 2 within its field of view onto the plurality of photosensitive units 1624 of the image sensor 162 corresponding to that lenslet, where the image is converted into corresponding image data.
For convenience of description, the area where the plurality of photosensitive units 1624 capable of receiving light through a lenslet 1600 are located is defined as the photosensitive area of that lenslet 1600; some or all of the plurality of lenslets 1600 have corresponding photosensitive areas on the image sensor 162. In the present application, each lenslet 1600 is illustrated as having a corresponding photosensitive area on the image sensor 162. Each of the lenslets 1600 is capable of imaging an external object 2 within its respective field of view onto the corresponding photosensitive area. Optionally, in the present embodiment, the photosensitive area of each of the lenslets 1600 includes a plurality of the photosensitive units 1624, such as, but not limited to: an array of 10 × 10 photosensitive units 1624, an array of 100 × 100 photosensitive units 1624, or an array of any size in between, i.e., 100 to 10000 photosensitive units 1624.
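Under the assumption that each lenslet's photosensitive area is a rectangular block of the sensor array, extracting the per-lens sub-image from a full sensor frame could look like the following sketch. The frame layout (a list of rows of pixel readings) and all parameter names are assumptions for illustration:

```python
def photosensitive_region(frame, row0, col0, size):
    """Extract the size x size block of photosensitive-unit readings
    corresponding to one lenslet from the full sensor frame.

    frame: full sensor readout as a list of rows of pixel values.
    row0, col0: top-left corner of this lenslet's photosensitive area.
    size: side length of the (square) area, e.g. 10 to 100 units.
    """
    return [row[col0:col0 + size] for row in frame[row0:row0 + size]]
```

The first and second images compared by the processing module would then simply be the regions extracted for the first and second lenses.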
Optionally, in the present embodiment, each of the lenslets 1600 has the same shape and optical parameters, such as, but not limited to, the same focal length, field of view range, and surface curvature. The photosensitive surfaces of the photosensitive units 1624 of the image sensor 162 are substantially parallel to the outer surface of the protective layer 12, the plurality of lenslets 1600 are disposed on the same horizontal plane, the distances between the optical centers of the plurality of lenslets 1600 and the photosensitive surfaces of the corresponding photosensitive units 1624 are substantially the same, and the distances between the optical centers of the plurality of lenslets 1600 and the outer surface 120 of the protective layer 12 are also substantially the same. Thus, in the present embodiment, each of the lenslets 1600 and the imaging optical path formed with the corresponding photosensitive units 1624 have the same imaging parameters, such as, but not limited to, the same image distance and magnification. However, in other embodiments, the plurality of lenslets 1600 may have different shapes and optical parameters.
Taking the external object 2 as an example of a finger of a user: to detect the fingerprint, the user presses the finger against a preset detection area on the outer surface 120 of the protective layer 12, so that the fingerprint contacts the detection area. It will be appreciated that the detection area lies within the fields of view of the plurality of lenslets 1600, which are capable of imaging the fingerprint in contact with the detection area onto the photosensitive areas of the image sensor 162 corresponding to the respective fields of view, where the images are converted into corresponding fingerprint image data by the photosensitive units 1624 within those areas. The fingerprint of the finger includes ridges and valleys; the ridges are in direct contact with the surface of the protective layer 12, while the valleys are spaced apart from it by a gap. Because the ridges and valleys contact the surface of the protective layer 12 differently, the imaging light rays from the ridges and valleys reach the corresponding photosensitive units 1624 with different light intensities (i.e., different brightness), forming a fingerprint image with alternating bright and dark ridge and valley lines. The fingerprint image contains feature point information of the ridge and valley lines, and the fingerprint can be further detected and identified by comparing and verifying this feature point information against pre-stored fingerprint template data.
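The alternating bright/dark ridge-valley pattern described above can be illustrated with a crude per-pixel thresholding sketch. Which class images darker depends on the optical configuration; this sketch assumes ridges appear darker than valleys, and the function name and threshold parameter are assumptions:

```python
def classify_ridges_valleys(image, threshold):
    """Classify each pixel of a gray-level fingerprint image as 'ridge'
    (assumed darker, below the threshold) or 'valley' (brighter).

    image: list of rows of gray values; threshold: gray-level cutoff.
    """
    return [["ridge" if px < threshold else "valley" for px in row]
            for row in image]
```

A real system would of course extract feature points (minutiae) from such a binarized image for template matching rather than use raw pixel labels.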
One of the plurality of lenslets 1600 is defined as a first lens 1601 having a corresponding first field of view range FOV1. The image sensor 162 has a first photosensitive region 1621 corresponding to the first lens 1601, and a photosensitive unit 1624 within the first photosensitive region 1621 is defined as a first photosensitive unit 1624a. The first lens 1601 is configured to image an external object 2 within the first field of view FOV1, and the first photosensitive unit 1624a receives the light rays focused and imaged by the first lens 1601 and converts the light rays into corresponding image data (e.g., an electrical signal). The first lens 1601 and the first photosensitive unit 1624a constitute a first imaging module 166 capable of imaging the external object 2 within the first field of view FOV1.
Another one of the plurality of lenslets 1600 is defined as a second lens 1602 having a corresponding second field of view range FOV2. The image sensor 162 has a second photosensitive region 1622 corresponding to the second lens 1602, and a photosensitive unit 1624 within the second photosensitive region 1622 is defined as a second photosensitive unit 1624b. The second lens 1602 is configured to image the external object 2 within the second field of view FOV2, and the second light sensing unit 1624b receives the light rays focused by the second lens 1602 and converts the light rays into corresponding image data (e.g., electrical signals). The second lens 1602 and the second light sensing unit 1624b constitute a second imaging module 168 capable of imaging an external object 2 within a second field of view FOV2.
The first field of view range FOV1 and the second field of view range FOV2 at least partly overlap, and the portion of the first field of view range FOV1 and the second field of view range FOV2 that overlap each other is defined as an overlapping field of view range. The first imaging module 166 forms a first image of an external object 2 (e.g., a user's finger) within the overlapping field of view and the second imaging module 168 forms a second image of the external object 2 within the overlapping field of view.
As shown in Fig. 4, which is an imaging light path diagram of the first lens and the second lens of Fig. 3, the external object 2 includes a first feature point A and a second feature point B. The first feature point A has a corresponding first imaging point A1 on the first image and a corresponding third imaging point A3 on the second image; the second feature point B has a corresponding second imaging point B2 on the first image and a corresponding fourth imaging point B4 on the second image.
If the external object 2 is a planar object, such as, but not limited to, an adhesive tape or a picture printed with a fingerprint pattern, the planar object 2 includes a first feature point A and a second feature point C. When the planar object 2 is placed on the detection area within the overlapping field of view, all points on the planar object 2 are in contact with the surface of the detection area, and the first image of the planar object 2 formed by the first imaging module 166 within the overlapping field of view is consistent with the second image of the planar object 2 formed by the second imaging module 168 within the overlapping field of view.
Specifically, the distance between the first imaging point A1 corresponding to the first feature point A on the first image and the second imaging point C2 corresponding to the second feature point C on the first image may be defined as a first pitch A1C2, and the distance between the third imaging point A3 corresponding to the first feature point A on the second image and the fourth imaging point C4 corresponding to the second feature point C on the second image may be defined as a second pitch A3C4. In terms of image size, the first image and the second image being consistent means that the first pitch A1C2 and the second pitch A3C4 are identical, or that the difference between them is smaller than a preset difference threshold. Likewise, the difference in gray value between the first imaging point A1 and the second imaging point C2 on the first image may be defined as a first gray difference, and the difference in gray value between the third imaging point A3 and the fourth imaging point C4 on the second image may be defined as a second gray difference. In terms of image gray scale, the first image and the second image being consistent means that the first gray difference and the second gray difference are identical, or that the difference between them is smaller than a preset difference threshold.
If the external object 2 is a three-dimensional object, such as, but not limited to, a real finger: when the finger contacts the surface of the detection area within the overlapping field of view, the fingerprint ridges of the finger touch the surface while the fingerprint valleys remain at a certain distance from it. As a result, there may be parallax between the first image of the finger formed by the first imaging module 166 within the overlapping field of view and the second image formed by the second imaging module 168. Therefore, whether the external object 2 is three-dimensional can be determined by analyzing whether the images of the same external object 2 acquired from different viewing angles differ, thereby preventing a malicious actor from attacking the recognition function of the optical detection system 16 with a planar imitation printed with an image of the external object 2.
Specifically, taking a fingerprint ridge of the finger as the first feature point A and a fingerprint valley as the second feature point B: the distance between the first imaging point A1 corresponding to the ridge A on the first image and the second imaging point B2 corresponding to the valley B on the first image may be defined as a first pitch A1B2, and the distance between the third imaging point A3 corresponding to the ridge A on the second image and the fourth imaging point B4 corresponding to the valley B on the second image may be defined as a second pitch A3B4. In terms of image size, the difference between the first image and the second image manifests as the difference between the first pitch A1B2 and the second pitch A3B4 being greater than or equal to a preset difference threshold. Similarly, the difference in gray value between the first imaging point A1 and the second imaging point B2 on the first image may be defined as a first gray difference, and the difference in gray value between the third imaging point A3 and the fourth imaging point B4 on the second image may be defined as a second gray difference. In terms of image gray scale, the difference between the two images manifests as the difference between the first gray difference and the second gray difference being greater than or equal to a preset difference threshold.
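The pitch and gray-difference quantities defined above can be computed in a few lines. All coordinates, gray values, and thresholds below are assumed illustrative values, not figures from the patent:

```python
# Sketch of the pitch and gray-level comparison described above.
# Coordinates, gray values, and thresholds are hypothetical inputs.

def pitch(p1, p2):
    """Euclidean distance between two imaging points (x, y) in one image."""
    return ((p1[0] - p2[0]) ** 2 + (p1[1] - p2[1]) ** 2) ** 0.5

# Imaging points of feature points A (ridge) and B (valley) in each image.
A1, B2 = (10.0, 12.0), (14.0, 12.0)   # first image (via first lens)
A3, B4 = (10.0, 12.0), (13.2, 12.0)   # second image (via second lens)

first_pitch = pitch(A1, B2)                        # A1B2
second_pitch = pitch(A3, B4)                       # A3B4
pitch_difference = abs(first_pitch - second_pitch)

# Gray values at the same imaging points (0-255 scale).
gray_A1, gray_B2 = 180, 90
gray_A3, gray_B4 = 175, 110

first_gray_diff = abs(gray_A1 - gray_B2)
second_gray_diff = abs(gray_A3 - gray_B4)
gray_difference = abs(first_gray_diff - second_gray_diff)

PITCH_THRESHOLD = 0.5   # preset difference thresholds (assumed values)
GRAY_THRESHOLD = 10

# Either metric exceeding its threshold indicates a three-dimensional object.
is_stereo = (pitch_difference >= PITCH_THRESHOLD
             or gray_difference >= GRAY_THRESHOLD)
```

With these sample values the pitch difference (0.8) and gray difference (25) both exceed their thresholds, so the object would be judged three-dimensional.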
Optionally, in this embodiment, the first feature point A and the second feature point B of the external object 2 selected during detection by the optical detection system 16 are an adjacent fingerprint ridge and fingerprint valley on the finger, that is, the ridge and valley closest to each other in position. The first pitch A1B2 on the first image and the second pitch A3B4 on the second image may each range from 100 μm to 300 μm, and these values may vary with the optical parameters of the lens module 160 and with the positional relationship among the lens module 160, the image sensor 162, and the protective layer 12.
Alternatively, in other embodiments, the first feature point a and the second feature point B of the external object 2 selected by the optical detection system 16 during detection may be a non-adjacent fingerprint ridge and fingerprint valley on the stereoscopic finger, respectively.
Optionally, in this embodiment, the first lens 1601 and the second lens 1602 are a pair of lenslets 1600 that are adjacent in position among the plurality of lenslets 1600, so that the first field-of-view range FOV1 of the first lens 1601 and the second field-of-view range FOV2 of the second lens 1602 have overlapping field-of-view ranges therebetween. The proportion of the overlapping field of view ranges in the first field of view range FOV1 and the second field of view range FOV2 respectively exceeds a preset proportion value, for example: greater than or equal to 30%.
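A minimal one-dimensional sketch of the overlap-ratio requirement, under the assumption that each lenslet views an interval of fixed width on the detection surface (function name and values are hypothetical):

```python
# 1-D sketch of the overlap requirement: each lenslet views an interval of
# width fov_width on the detection surface, centered above the lens.
# Names and numbers are illustrative assumptions, not taken from the patent.

def overlap_ratio(center1, center2, fov_width):
    """Fraction of one lens's field of view shared with the other (1-D model)."""
    half = fov_width / 2.0
    lo = max(center1 - half, center2 - half)
    hi = min(center1 + half, center2 + half)
    overlap = max(0.0, hi - lo)
    return overlap / fov_width

# Adjacent lenslets 0.5 mm apart, each viewing a 1.0 mm-wide area.
ratio = overlap_ratio(0.0, 0.5, 1.0)
assert ratio >= 0.30  # meets the >= 30% proportion stated above
```

In this toy geometry the two fields of view share half their width, comfortably above the 30% example proportion.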
Optionally, in other embodiments, one or more lenslets 1600 may be spaced between the first lens 1601 and the second lens 1602, so long as the overlapping field of view range can be formed between the first field of view range FOV1 of the first lens 1601 and the second field of view range FOV2 of the second lens 1602, and the ratio of the overlapping field of view ranges satisfies the foregoing requirement.
The optical detection system 16 further comprises a processing module 161. The processing module 161 is configured to compare the difference, within the overlapping field of view, between a first image of the external object 2 formed through the first lens 1601 and a second image formed through the second lens 1602, and to determine from the comparison whether the external object 2 is three-dimensional or planar: if the difference between the first image and the second image is greater than or equal to a preset threshold, the external object 2 is determined to be a three-dimensional object; if the difference is smaller than the preset threshold, the external object 2 is determined to be a planar object.
Specifically, as shown in Fig. 4, the external object 2 includes a first feature point A and a second feature point B. The distance between the first imaging point A1 and the second imaging point B2, which correspond to the first feature point A and the second feature point B on the first image, is a first pitch A1B2; the distance between the third imaging point A3 and the fourth imaging point B4, which correspond to the first feature point A and the second feature point B on the second image, is a second pitch A3B4. Optionally, in some embodiments, the difference between the first image and the second image may be the difference between the first pitch A1B2 and the second pitch A3B4, and the preset threshold is a threshold for that difference.
The difference between the gray values of the first imaging point A1 and the second imaging point B2 corresponding to the first feature point A and the second feature point B on the first image is a first gray difference. The difference between the gray values of the third imaging point A3 and the fourth imaging point B4 corresponding to the first feature point a and the second feature point B on the second image is a second gray difference. Optionally, in some embodiments, the difference between the first image and the second image is a difference between the first gray scale difference and the second gray scale difference, and the preset threshold is a threshold of the difference between the first gray scale difference and the second gray scale difference.
Alternatively, in this embodiment, the external object 2 is a finger of a user, the first feature point a is a fingerprint ridge of the finger, and the second feature point B is a fingerprint valley of the finger. Alternatively, in other embodiments, the external object 2 may be another three-dimensional object, and the first feature point a and the second feature point B may be points on different planes of the external object 2.
It will be appreciated that, before comparing the first image and the second image, the processing module 161 may determine the corresponding image points of the first feature point A and the second feature point B on the first image and the second image, respectively, by feature matching between the two images.
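One way such feature matching could be sketched is normalized cross-correlation: locate the patch around a feature point of one image inside the other image. A real system would use a dedicated fingerprint matcher; everything here is an illustrative assumption:

```python
import numpy as np

# Assumed feature-matching step: find the best normalized-cross-correlation
# (NCC) match of a small template patch inside a larger image.

def match_patch(template, image):
    """Return (row, col) of the best NCC match of `template` in `image`."""
    th, tw = template.shape
    t = template - template.mean()
    best, best_pos = -np.inf, (0, 0)
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            w = image[r:r + th, c:c + tw]
            wc = w - w.mean()
            denom = np.sqrt((t ** 2).sum() * (wc ** 2).sum())
            if denom == 0:
                continue  # flat window: correlation undefined
            score = (t * wc).sum() / denom
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos

rng = np.random.default_rng(0)
second_image = rng.random((32, 32))
# Simulate the two views as identical, so the match must be exact: the patch
# around a feature point of the "first image" is cut from the second image.
template = second_image[10:18, 12:20]
assert match_patch(template, second_image) == (10, 12)
```

In practice the two views differ by parallax, so the matched position shifts slightly between them, and that shift is what the pitch comparison measures.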
Optionally, in the present embodiment, the processing module 161 determines whether the external object 2 is three-dimensional based on the difference between images of the external object 2 acquired from different viewing angles by two imaging modules whose fields of view overlap. It will be appreciated that, in other embodiments, the processing module 161 may also acquire, from different viewing angles, a plurality of images of the external object 2 located within the overlapping field of view using two or more imaging modules with overlapping fields of view, perform statistical analysis on the differences between the acquired images (for example, mean-difference analysis or standard-deviation analysis), and then compare the resulting statistic with a preset difference threshold to determine whether the external object 2 is three-dimensional.
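The statistical aggregation mentioned above (mean-difference or standard-deviation analysis over several image pairs) might look like this sketch; the sample values and threshold are assumptions:

```python
import statistics

# Sketch of the multi-view statistical test: pairwise pitch differences from
# several imaging-module pairs are aggregated before the threshold test.
# The sample values and threshold are illustrative assumptions.

pitch_differences = [0.82, 0.75, 0.91, 0.68]  # one value per image pair (px)

mean_diff = statistics.mean(pitch_differences)
std_diff = statistics.stdev(pitch_differences)  # spread across views

MEAN_THRESHOLD = 0.5
# Averaging over several pairs makes the decision less sensitive to noise
# in any single pair than a one-pair comparison would be.
is_stereo = mean_diff >= MEAN_THRESHOLD
```
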
Optionally, in this embodiment, the processing module 161 may be further configured to compare the first image and/or the second image of the external object 2 with a pre-stored external object 2 template, and identify the identity of the external object 2 according to the comparison result.
It will be appreciated that in some embodiments, the processing module 161 performs the above-mentioned comparison and recognition after determining that the external object 2 is a three-dimensional object. The processing module 161 does not perform subsequent comparison and recognition after determining that the external object 2 is a planar object, thereby saving power consumption of the optical detection system 16.
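The power-saving order described here can be sketched as a simple control flow; the function bodies are placeholders, not the patent's actual implementation:

```python
# Control-flow sketch of the power-saving order described above: the
# parallax (liveness) check runs first, and template matching is skipped
# for planar fakes. Data shapes and thresholds are assumptions.

def is_stereoscopic(first_image, second_image, threshold=0.5):
    """Placeholder parallax test; a real one compares pitch/gray differences."""
    return abs(first_image["pitch"] - second_image["pitch"]) >= threshold

def matches_template(image, template):
    """Placeholder comparison against the pre-stored fingerprint template."""
    return image["features"] == template["features"]

def detect(first_image, second_image, template):
    if not is_stereoscopic(first_image, second_image):
        return "rejected: planar object"  # skip matching, save power
    if matches_template(first_image, template):
        return "accepted"
    return "rejected: no match"
```

Running the liveness check first means the more expensive template comparison is skipped entirely for planar fakes.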
Optionally, in some embodiments, the processing module 161 may be connected to the image sensor 162 to receive the image data that the image sensor 162 obtains by converting the image of the external object 2 formed by the lens module 160. Optionally, in some other embodiments, the optical detection system may further include a memory 18; the image data obtained by the image sensor 162 may be stored in the memory 18, and the processing module 161 reads the image data of the external object 2 from the memory 18.
Optionally, in some embodiments, the processing module 161 may be firmware solidified in the memory 18 or computer software code stored in the memory 18, executed by one or more corresponding processors (not shown) to control the relevant components and implement the corresponding functions, such as, but not limited to, an Application Processor (AP), a Central Processing Unit (CPU), or a Microcontroller Unit (MCU). The memory 18 includes, but is not limited to, flash memory, Electrically Erasable Programmable Read-Only Memory (EEPROM), Programmable Read-Only Memory (PROM), a hard disk, and the like. The memory 18 may be used to store templates of the external object 2 for identity recognition, various preset thresholds, image data of the external object 2 acquired by the image sensor 162, intermediate data generated during comparison and determination, and the like.
Optionally, in some embodiments, the processor 17 and/or the memory 18 may be integrated on the same substrate as the image sensor 162. Optionally, in other embodiments, the processor 17 and/or the memory 18 may instead be provided on the host of the electronic device 1, such as the main circuit board of a mobile phone.
Optionally, in some embodiments, the functions of the processing module 161 may also be implemented in hardware, for example by any one or a combination of the following technologies: discrete logic circuits having logic gates for implementing logic functions on data signals, application-specific integrated circuits having suitable combinational logic gates, Programmable Gate Arrays (PGAs), Field-Programmable Gate Arrays (FPGAs), and the like. It will be appreciated that such hardware may be integrated on the same substrate as the image sensor 162 or may be provided on the host of the electronic device 1.
Compared with the prior art, the optical detection system 16 of the present application determines whether the external object 2 is three-dimensional by checking images of the external object 2 acquired from different viewing angles for parallax-induced differences. This effectively prevents a malicious actor from attacking the recognition function of the optical detection system 16 with a planar imitation printed with an image of the external object 2, thereby improving the security of the electronic device 1.
As shown in Fig. 6, Fig. 6 is a schematic partial cross-sectional view of an optical detection system 16b according to a second embodiment of the present application. The optical detection system 16b differs structurally from the optical detection system 16a of the first embodiment in that its first imaging module 166 and second imaging module 168 are separate optical imaging modules. The first imaging module 166 includes the first lens 1601 and a first image sensor 162a, and the second imaging module 168 includes the second lens 1602 and a second image sensor 162b. The optical detection system 16b does not image the external object 2 with an array of lenslets 1600; instead, the mutually independent first lens 1601 and second lens 1602 image the external object 2 onto the mutually independent first image sensor 162a and second image sensor 162b, respectively, rather than onto different areas of the same image sensor 162.
Optionally, in some embodiments, the optical detection system 16b may also include two or more independent imaging modules with overlapping field of view ranges, and acquire multiple images of the external object 2 within the overlapping field of view ranges from different viewing angles. The processing module 161 (refer to fig. 5) may perform statistical analysis on the differences between the acquired images of the external object 2, for example: average difference analysis, standard deviation analysis, etc., and then comparing with a preset difference threshold value and judging whether the external object 2 is stereoscopic.
It should be understood by those skilled in the art that some or all of the embodiments of the present application, as well as modifications, substitutions, alterations, permutations, combinations, extensions, and the like of those embodiments made without inventive effort, are considered to be covered by the inventive concept of the present application and fall within its scope of protection.
Any reference in this specification to "one embodiment," "an embodiment," "example embodiment," etc., means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the application. Such phrases in various places throughout this specification are not necessarily all referring to the same embodiment. In addition, when a particular feature or structure is described in connection with any embodiment, it is submitted that it is within the purview of one skilled in the art to effect such feature or structure in connection with other ones of the embodiments.
The references to "length", "width", "upper", "lower", "left", "right", "front", "rear", "back", "front", "vertical", "horizontal", "top", "bottom", "interior", "exterior", etc., as may be made in this specification are merely for convenience in describing embodiments of the application and to simplify the description, and do not indicate or imply that the devices or elements referred to must have a particular orientation, be constructed and operated in a particular orientation, and are not to be construed as limiting the application. Like reference numerals and letters designate like items in the drawings, and thus once an item is defined in one drawing, no further definition or explanation thereof is necessary in the subsequent drawings. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only to distinguish the description, and are not to be construed as indicating or implying relative importance. In the description of the present application, the meaning of "plurality" or "plurality" means at least two or two, unless specifically defined otherwise. In the description of the present application, it should also be noted that, unless explicitly specified and limited otherwise, the terms "disposed," "mounted," and "connected" are to be construed broadly, and may be, for example, fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; the connection may be direct or indirect via an intermediate medium, or may be internal communication between two elements. The specific meaning of the above terms in the present application will be understood in specific cases by those of ordinary skill in the art.
The foregoing is merely illustrative of embodiments of the present application, and the present application is not limited thereto, and any person skilled in the art will readily appreciate variations and substitutions within the scope of the present application. The terms used in the following claims should not be construed to limit the application to the specific embodiments disclosed in the specification. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (7)
1. An optical detection system for detecting an external object, comprising:
a first lens having a first field of view for imaging an external object within the first field of view;
a second lens having a second field of view range for imaging an external object within the second field of view range, the first field of view range and the second field of view range at least partially overlapping, defining a portion of the first field of view range and the second field of view range overlapping each other as an overlapping field of view range;
the lens module is positioned below the display screen and comprises a plurality of small lenses which are arranged in an array, and the first lens and the second lens are respectively one of the small lenses;
an image sensor positioned below the first lens and the second lens, the image sensor comprising a plurality of first photosensitive units for receiving the imaging light converged by the first lens and converting it into corresponding electrical signals, and a plurality of second photosensitive units for receiving the imaging light converged by the second lens and converting it into corresponding electrical signals; and
the processing module is used for comparing differences between a first image formed by the external object through the first lens and a second image formed by the second lens in the overlapping view field range, and judging that the external object is a three-dimensional object when the differences are larger than or equal to a preset threshold value;
wherein the difference between the first image and the second image comprises:
the external object comprises a first characteristic point and a second characteristic point, the distance between the first characteristic point and the second characteristic point on the corresponding image points of the first image is a first interval, the distance between the first characteristic point and the second characteristic point on the corresponding image points of the second image is a second interval, the difference between the first image and the second image is the difference value between the first interval and the second interval, and the preset threshold value is the threshold value of the difference value between the first interval and the second interval;
Or the external object comprises a first characteristic point and a second characteristic point, the difference of gray values of corresponding image points of the first characteristic point and the second characteristic point on the first image is a first gray level difference, the difference of gray values of corresponding image points of the first characteristic point and the second characteristic point on the second image is a second gray level difference, the difference between the first image and the second image is the difference of the first gray level difference and the second gray level difference, and the preset threshold is the threshold of the difference of the first gray level difference and the second gray level difference.
2. The optical detection system of claim 1, wherein the external object is a portion of a fingerprint trace of a user's finger within the overlapping field of view, the first feature point is a fingerprint ridge, and the second feature point is a fingerprint valley.
3. The optical detection system according to claim 1, wherein the pitches formed by the first feature point and the second feature point on the first image and on the second image each range from 100 μm to 300 μm.
4. The optical detection system of claim 1, wherein the proportion of the overlapping field of view range in each of the first field of view range and the second field of view range is greater than or equal to 30%.
5. The optical detection system of claim 1, wherein the display screen is an actively illuminated display screen, light emitted by the display screen being usable to illuminate an external object to form the first image and/or the second image; or alternatively
the display screen is a passively illuminated display module, and the optical detection system further comprises an excitation light source for providing the light required for detection.
6. The optical detection system according to claim 1, wherein the processing module is further configured to compare the first image and/or the second image of the external object with a pre-stored external object template, and identify the identity of the external object according to the comparison result.
7. An electronic device comprising an optical detection system according to any one of claims 1-6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010491268.5A CN111709330B (en) | 2020-06-02 | 2020-06-02 | Optical detection system and electronic equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111709330A CN111709330A (en) | 2020-09-25 |
CN111709330B true CN111709330B (en) | 2023-11-28 |
Family
ID=72539059
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114220133A (en) * | 2021-11-06 | 2022-03-22 | 深圳阜时科技有限公司 | Optical detection module, biological characteristic detection device and electronic equipment |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8666127B1 (en) * | 2010-08-06 | 2014-03-04 | Secugen Corporation | Method and apparatus for fake fingerprint detection |
CN104123539A (en) * | 2014-07-10 | 2014-10-29 | 中南大学 | Method and device for increasing recognition accuracy rate of fingerprint recognition device |
CN109496313A (en) * | 2018-10-26 | 2019-03-19 | 深圳市汇顶科技股份有限公司 | Fingerprint identification device and electronic equipment |
CN110457977A (en) * | 2018-05-08 | 2019-11-15 | 上海箩箕技术有限公司 | Fingerprint imaging method and fingerprint imaging system |
CN110828498A (en) * | 2019-11-22 | 2020-02-21 | 深圳阜时科技有限公司 | Optical sensing device and electronic apparatus |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7116805B2 (en) * | 2003-01-07 | 2006-10-03 | Avago Technologies ECBU IP (Singapore) Pte. Ltd. | Fingerprint verification device |
US20070164115A1 (en) * | 2006-01-17 | 2007-07-19 | Symbol Technologies, Inc. | Automatic exposure system for imaging-based bar code reader |
KR102535177B1 (en) * | 2017-12-21 | 2023-05-23 | 엘지디스플레이 주식회사 | Fingerprint recognition device and display device and mobile terminal including the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||