CN111723689A - Under-screen optical detection system and electronic device - Google Patents


Info

Publication number
CN111723689A
Authority
CN
China
Prior art keywords
image
external object
field
difference
feature point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010491744.3A
Other languages
Chinese (zh)
Other versions
CN111723689B (en)
Inventor
徐洪伟
涂强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Fushi Technology Co Ltd
Original Assignee
Shenzhen Fushi Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Fushi Technology Co Ltd
Priority to CN202010491744.3A
Publication of CN111723689A
Application granted
Publication of CN111723689B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 Fingerprints or palmprints
    • G06V40/13 Sensors therefor
    • G06V40/1324 Sensors therefor by using geometrical optics, e.g. using prisms
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31 User authentication
    • G06F21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 Fingerprints or palmprints
    • G06V40/1347 Preprocessing; Feature extraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 Fingerprints or palmprints
    • G06V40/1365 Matching; Classification

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Computer Security & Cryptography (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Input (AREA)

Abstract

The application discloses an under-screen optical detection system, including: a protective layer; a display screen located below the protective layer; a first imaging module located below the display screen, having a first field of view and configured to form a first image of an external object within the first field of view; a second imaging module located below the display screen, having a second field of view and configured to form a second image of an external object within the second field of view, the first field of view and the second field of view at least partially overlapping; and a processing module configured to compare the difference between the first image and the second image formed of an external object within the overlapping field of view by the first imaging module and the second imaging module, respectively, and to determine that the external object is three-dimensional when the difference is greater than or equal to a preset threshold. The application also discloses an electronic device including the under-screen optical detection system.

Description

Under-screen optical detection system and electronic device
Technical Field
The present application relates to the field of optoelectronic technologies, and in particular to an under-screen optical detection system and an electronic device that use an optical imaging principle to detect an external object.
Background
At present, the optical fingerprint recognition function of electronic products such as mobile phones and tablet computers is usually realized by recognizing a planar fingerprint image acquired when a user's finger presses the screen surface. This approach is easily spoofed with low-cost planar fingerprint fakes, for example by sticking adhesive tape or a printed picture carrying a fingerprint pattern onto the fingerprint recognition area on the screen surface. The optical fingerprint recognition function of existing electronic products therefore presents an obvious security risk.
Disclosure of Invention
In view of the above, the present application provides an under-screen optical detection system and an electronic device capable of addressing the problems of the prior art.
One aspect of the present application provides an under-screen optical detection system for detecting an external object, comprising:
a protective layer;
a display screen located below the protective layer and configured to display images;
a first imaging module located below the display screen, having a first field of view and configured to form a first image of an external object within the first field of view;
a second imaging module located below the display screen, having a second field of view and configured to form a second image of an external object within the second field of view, the first field of view and the second field of view at least partially overlapping, the overlapping portion of the first field of view and the second field of view being defined as an overlapping field of view; and
a processing module configured to compare the difference between a first image formed by the first imaging module and a second image formed by the second imaging module of an external object within the overlapping field of view, and to determine that the external object is a three-dimensional object when the difference is greater than or equal to a preset threshold.
In certain embodiments, the first imaging module comprises:
a first lens configured to converge light rays from an external object within the first field of view to image the external object; and
a first photosensitive module comprising a plurality of first photosensitive units, configured to receive the light rays converged by the first lens for imaging and convert them into corresponding electrical signals;
the second imaging module comprises:
a second lens configured to converge light rays from an external object within the second field of view to image the external object; and
a second photosensitive module comprising a plurality of second photosensitive units, configured to receive the light rays converged by the second lens for imaging and convert them into corresponding electrical signals.
In some embodiments, the under-screen optical detection system further comprises a lens module located below the display screen, the lens module comprising a plurality of lenslets arranged in an array, the first lens and the second lens each being one of the lenslets.
In some embodiments, the under-screen optical detection system further includes an image sensor located below the lens module, the image sensor including a plurality of photosensitive units, the first photosensitive module and the second photosensitive module being different photosensitive areas on the image sensor.
In some embodiments, the external object includes a first feature point and a second feature point, a distance between corresponding pixels of the first feature point and the second feature point on the first image is a first pitch, a distance between corresponding pixels of the first feature point and the second feature point on the second image is a second pitch, a difference between the first image and the second image is a difference between the first pitch and the second pitch, and the preset threshold is a threshold of the difference between the first pitch and the second pitch.
In some embodiments, the external object includes a first feature point and a second feature point, a difference between gray values of corresponding image points of the first feature point and the second feature point on the first image is a first gray difference, a difference between gray values of corresponding image points of the first feature point and the second feature point on the second image is a second gray difference, a difference between the first image and the second image is a difference between the first gray difference and the second gray difference, and the preset threshold is a threshold of the difference between the first gray difference and the second gray difference.
In some embodiments, the external object is a portion of the fingerprint of a user's finger located within the overlapping field of view, the first feature point being a fingerprint ridge and the second feature point being a fingerprint valley.
In some embodiments, the first feature point and the second feature point have image points formed on the first image and the second image, respectively, with a pitch ranging from 100 μm to 300 μm.
In certain embodiments, the proportion of the overlapping field of view ranges in the first and second field of view ranges is greater than or equal to 30%.
In some embodiments, the display screen is an active light-emitting display screen, and light emitted by the display screen can be used for illuminating an external object to form the first image and/or the second image; or
The display screen is a passive light-emitting display module, and the optical detection system under the screen further comprises an excitation light source used for providing light required by detection.
In some embodiments, the processing module is further configured to compare the first image and/or the second image of the external object with a pre-stored external object template, and identify the identity of the external object according to a comparison result.
One aspect of the present application provides an electronic device including an underscreen optical detection system as provided in the above embodiments.
The beneficial effect of the optical detection system is that, by comparing whether a parallax-induced difference exists between images of the external object acquired from different viewpoints, it determines whether the external object is three-dimensional. This effectively prevents counterfeiters from attacking the recognition function of the optical detection system with a planar imitation printed with an image of the external object, and improves the security of the electronic device.
Drawings
FIG. 1 is a schematic structural diagram of an under-screen optical detection system applied to an electronic device according to an embodiment of the present application;
FIG. 2 is a perspective view of an optical detection system according to a first embodiment of the present application;
FIG. 3 is a schematic partial cross-sectional view of the optical detection system shown in FIG. 2 taken along line III-III;
FIG. 4 is an imaging optical path diagram of the first and second lenses of FIG. 3;
FIG. 5 is a functional block diagram of an optical detection system according to a first embodiment of the present application;
FIG. 6 is a schematic partial cross-sectional view of an optical detection system according to a second embodiment of the present application.
Detailed Description of Embodiments
In the detailed description of the embodiments herein, it will be understood that when a substrate, a sheet, a layer, or a pattern is referred to as being "on" or "under" another substrate, another sheet, another layer, or another pattern, it can be "directly" or "indirectly" on the other substrate, the other sheet, the other layer, or the other pattern, or one or more intervening layers may also be present. The thickness and size of each layer in the drawings of the specification may be exaggerated, omitted, or schematically represented for clarity. Further, the sizes of the elements in the drawings do not completely reflect actual sizes.
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The following disclosure provides many different embodiments, or examples, for implementing different features of the application. To simplify the disclosure of the present application, the components and settings of a specific example are described below. Of course, they are merely examples and are not intended to limit the present application. Moreover, the present application may repeat reference numerals and/or reference letters in the various examples, which have been repeated for purposes of simplicity and clarity and do not in themselves dictate a relationship between the various embodiments and/or configurations discussed. In addition, examples of various specific processes and materials are provided herein, but one of ordinary skill in the art may recognize applications of other processes and/or use of other materials.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the application. One skilled in the relevant art will recognize, however, that the subject matter can be practiced without one or more of the specific details, or with other structures, components, and so forth. In other instances, well-known structures or operations are not shown or described in detail to avoid obscuring the application.
Referring to fig. 1, fig. 1 is a schematic structural diagram of an embodiment of an electronic device according to the present application. The electronic device 1 comprises an under-screen optical detection system 10 for detecting an external object 2. The under-screen optical detection system 10 includes a protective layer 12, a display screen 14, and an optical detection system 16 located below the display screen 14. The display screen 14 is located below the protective layer 12 and is used for displaying images. The protective layer 12 can transmit the light emitted by the display screen 14 for displaying images and protects the display screen 14 from damage. The optical detection system 16 is used for receiving light from the external object 2 through the protective layer 12 and the display screen 14 and converting the received light into corresponding electrical signals to perform the corresponding information sensing. The optical detection system 16 is used, for example, to sense biometric information, such as, but not limited to, fingerprint information, palm print information and other texture information, and/or blood oxygen information, heartbeat information, pulse information and other vital-sign information. However, the present application is not limited thereto, and the optical detection system 16 may also be used for other information sensing, such as depth information sensing, proximity sensing, and the like. In the present application, the external object 2 is described mainly by taking a user's finger as an example, with the optical detection system 16 performing fingerprint sensing.
The electronic device 1 may be, for example, but not limited to, a consumer electronic product, a home electronic product, a vehicle-mounted electronic product, a financial terminal product, or other suitable type of electronic product. The consumer electronic products include, for example, mobile phones, tablet computers, notebook computers, desktop monitors, all-in-one computers, and the like. Household electronic products are, for example, smart door locks, televisions, refrigerators and the like. The vehicle-mounted electronic product is, for example, a vehicle-mounted navigator, a vehicle-mounted touch interactive device, and the like. The financial terminal products are ATM machines, terminals for self-service business and the like.
It should be noted that in the present application, the optical detection system 16 has a plurality of different embodiments, and for the sake of clarity, the optical detection system 16 in the different embodiments is respectively labeled with different reference numerals 16a and 16b to distinguish them. Further, for convenience of description, the same reference numerals in different embodiments of the optical detection system 16 may refer to the same elements, and may also refer to similar elements that are modified, replaced, expanded, and combined.
Alternatively, in some embodiments, the display 14 may be an active light emitting display, such as but not limited to an organic light emitting diode display (OLED display) or the like. The display 14 may be used as an excitation light source to provide light for detection, such as, but not limited to, visible light having a wavelength in the range of 400 to 780 nanometers (nm). Alternatively, in other embodiments, the display 14 may be a passive light-emitting display, such as but not limited to a Liquid Crystal Display (LCD) or an electronic paper display. The electronic device 1 employing the passive light emitting display screen needs to be additionally provided with an excitation light source to provide light for detection, such as, but not limited to, near infrared light with a wavelength range of 800 to 1000 nm. Alternatively, an excitation light source may be additionally provided to the electronic device using the active light emitting display to provide light for detection.
Optionally, in some embodiments, the protective layer 12 may comprise a transparent material, such as, but not limited to, transparent glass, transparent polymer, any other transparent material, and the like. The protective layer 12 may have a single-layer structure or a multi-layer structure. The protective layer 12 is a substantially thin plate having a predetermined length, width and thickness. It will be appreciated that the protective layer 12 may also comprise a protective plastic film, a tempered film, or another film layer attached by the user in actual use. The outer surface 120 of the protective layer 12 may be the outermost surface of the electronic device 1. Upon detection, the external object 2 may directly contact the outer surface 120 of the protective layer 12.
Referring to fig. 2, fig. 3 and fig. 5, fig. 2 is a perspective structural view of an optical detection system 16a according to a first embodiment of the present application. FIG. 3 is a schematic partial cross-sectional view of the optical detection system 16a shown in FIG. 2 taken along line III-III. Fig. 5 is a functional block diagram of an optical inspection system 16a according to a first embodiment of the present application. The optical detection system 16a includes a lens module 160 and an image sensor 162 disposed below the lens module 160. The lens module 160 is used for converging light to the image sensor 162 to image the external object 2. The image sensor 162 is used for converting the received light into corresponding electrical signals.
The lens module 160 includes a plurality of lenslets 1600 (mini-lenses) disposed above the image sensor 162 and spaced apart from each other. The plurality of lenslets 1600 are configured to converge light onto the image sensor 162 to image the external object 2.
Optionally, in this embodiment, the lenslets 1600 are arranged in a regular array. Further optionally, the plurality of lenslets 1600 are arranged in, for example, but not limited to, a rectangular array. However, the lenslets 1600 may be arranged irregularly in other embodiments.
Optionally, in this embodiment, the lens module 160 further includes a first substrate 1603, and the lenslets 1600 are disposed on the first substrate 1603. The first substrate 1603 is made of a transparent material such as, but not limited to, transparent acrylic, lens glass, UV glue, and the like. The first substrate 1603 can be integrally formed with the plurality of lenslets 1600. Alternatively, the small lenses 1600 and the first substrate 1603 are elements manufactured independently from each other, and the small lenses 1600 are fixed on the first substrate 1603 by means of glue or the like.
Optionally, a plurality of said lenslets 1600 are convex lenses. Further optionally, a plurality of said lenslets 1600 are spherical lenses or aspherical lenses.
Optionally, a plurality of the lenslets 1600 are made of a transparent material, such as, but not limited to, a transparent acrylic, lens glass, UV glue, or the like.
Optionally, in this embodiment, the optical detection system 16 further includes a second substrate 163 located below the image sensor 162. The second substrate 163 may provide support for the image sensor 162 and electrical connection to external circuitry. The second substrate 163 is, for example, a flexible printed circuit board or a rigid printed circuit board.
Optionally, in this embodiment, the optical detection system 16 further includes a filter layer 164 disposed on the image sensor 162. The filter layer 164 transmits the light for detection in the predetermined wavelength range and filters light outside the predetermined wavelength range. Optionally, in some other embodiments, the filter layer 164 may be coated on the light incident surface of the lenslets 1600.
Optionally, in this embodiment, the optical detection system 16a further includes a light shielding layer 165 disposed on the first substrate 1603 in the regions between the lenslets 1600. The light shielding layer 165 blocks light so as to reduce the interference with detection caused by stray light that reaches the image sensor 162 without being converged by the lenslets 1600.
Alternatively, in other embodiments, the light shielding layer 165 may be disposed at a different position. For example, the light shielding layer 165 is disposed between the filter layer 164 and the first substrate 1603, and the light shielding layer 165 has light holes (not shown) corresponding to the plurality of small lenses 1600. Alternatively, the light shielding layer 165 is disposed between the filter layer 164 and the image sensor 162, and the light shielding layer 165 has light transmitting holes corresponding to the plurality of small lenses 1600. This is not limited by the present application.
Optionally, in other embodiments, the optical inspection system 16 can further include a protective film (not shown) disposed on the plurality of lenslets 1600. The protective film may cover the plurality of lenslets 1600 and/or the light shield layer 165 to provide moisture protection, dust protection, scratch protection, and the like. In some embodiments, the protective film may also be omitted.
The image sensor 162 includes a plurality of photosensitive cells 1624, and a readout circuit and other auxiliary circuits electrically connected to the photosensitive cells 1624. The light sensing unit 1624 is a photo detector such as, but not limited to, a photodiode. The light sensing unit 1624 is configured to receive the imaging light converged by the lens module 160 and convert the imaging light into a corresponding electrical signal, so as to obtain the biometric information of the external object 2. The external object 2 is for example, but not limited to, a finger, a palm, etc. of a user, and the biometric information is for example, but not limited to, fingerprint image data of the finger of the user. Alternatively, the area size of the single photosensitive unit 1624 may range from 5 micrometers (μm) by 5 μm to 10 μm by 10 μm, for example.
Optionally, in this embodiment, the plurality of photosensitive units 1624 are arranged in an array. Alternatively, in some embodiments, the arrangement of the plurality of photosensitive units 1624 may also form a regular or irregular two-dimensional pattern. Each of the lenslets 1600 has a preset field of view (FOV), and each of the lenslets 1600 can image an external object 2 within its FOV onto a corresponding plurality of photosensitive units 1624 of the image sensor 162, which then convert the image into corresponding image data.
For convenience of description, the regions of the plurality of photosensitive units 1624 capable of receiving light through the lenslet 1600 are defined as photosensitive regions of the lenslet 1600, and some or all of the lenslets 1600 have corresponding photosensitive regions on the image sensor 162. In the present application, each lenslet 1600 has a corresponding photosensitive region on the image sensor 162 for illustration. Each of the lenslets 1600 is capable of imaging an external object 2 within a respective field of view within a corresponding photosensitive region. Optionally, in this embodiment, the photosensitive region of each lenslet 1600 includes a plurality of said photosensitive units 1624, such as but not limited to: an array of 10 x 10 photosites 1624, or an array of 100 x 100 photosites 1624, or an array of photosites 1624 of any size between 10 x 10 and 100 x 100, or 100 to 10000 photosites 1624.
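Purely as an illustration of how a photosensitive region maps onto the sensor, the following sketch extracts the block of photosensitive units belonging to one lenslet from a full sensor readout. The rectangular grid layout, the region size and the function name are assumptions made for this example, not details taken from the embodiment.

```python
import numpy as np

def lenslet_subimage(frame: np.ndarray, row: int, col: int,
                     region_h: int = 100, region_w: int = 100) -> np.ndarray:
    """Return the block of photosensitive units read out under one lenslet.

    Assumes the lenslets sit on a regular rectangular grid and that each
    photosensitive region is a contiguous region_h x region_w block; real
    sensor layouts may differ.
    """
    top, left = row * region_h, col * region_w
    return frame[top:top + region_h, left:left + region_w]

# Example: sub-images formed by two adjacent lenslets in the same row.
# frame = ...  # full image-sensor readout as a 2-D array
# first_image = lenslet_subimage(frame, 3, 4)
# second_image = lenslet_subimage(frame, 3, 5)
```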
Optionally, in this embodiment, each of the lenslets 1600 has the same shape and optical parameters, such as but not limited to: the same focal length, field range, surface curvature, etc. The photosensitive surfaces of the photosensitive units 1624 on the image sensor 162 are substantially parallel to the outer surface of the protective layer 12, the small lenses 1600 are disposed on the same horizontal plane, the distances between the optical centers of the small lenses 1600 and the photosensitive surfaces of the corresponding photosensitive units 1624 are substantially the same, and the distances between the optical centers of the small lenses 1600 and the outer surface 120 of the protective layer 12 are also substantially the same. Therefore, in this embodiment, each of the lenslets 1600 has the same imaging parameters as the imaging optical path formed by the corresponding light-sensing unit 1624, such as but not limited to: the same image distance and magnification, etc. However, alternatively, in other embodiments, the lenslets 1600 can have different shapes and optical parameters.
Taking the external object 2 as a user's finger as an example, when detecting the fingerprint of the finger, the user presses the finger against a preset detection area on the outer surface 120 of the protective layer 12, so that the fingerprint is in contact with the detection area. It is understood that the detection area is located within the fields of view of the plurality of lenslets 1600, and the plurality of lenslets 1600 can image the fingerprint contacting the detection area onto the photosensitive areas of the image sensor 162 corresponding to their respective fields of view, where it is converted into corresponding fingerprint image data by the photosensitive units 1624 within those photosensitive areas. The fingerprint of the finger includes ridges and valleys: the ridges directly contact the surface of the protective layer 12, while the valleys are spaced from the surface of the protective layer 12 by a small gap. Because the ridges and valleys contact the surface of the protective layer 12 differently, the imaging light from the ridges and from the valleys has different intensities (that is, different brightness) when reaching the corresponding photosensitive units 1624, forming a fingerprint image whose alternating light and dark regions correspond to the ridge and valley lines. The fingerprint image contains feature-point information of a plurality of ridge and valley lines; by further comparing and verifying the fingerprint image against pre-stored fingerprint template data, fingerprint detection and identification can be realized.
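As a rough sketch of how the light-and-dark fingerprint image described above could be turned into ridge and valley labels, the snippet below normalizes a captured patch and thresholds it, treating darker pixels as ridges. The dark-ridge assumption and the global mean threshold are simplifications chosen for this illustration, not steps prescribed by the embodiment.

```python
import numpy as np

def ridge_valley_map(raw_patch: np.ndarray) -> np.ndarray:
    """Label each pixel of a captured fingerprint patch as ridge (1) or valley (0).

    Treats darker pixels as ridges, on the assumption that ridges touching the
    cover layer return less light to the sensor; the global mean threshold is a
    deliberate simplification.
    """
    img = raw_patch.astype(np.float32)
    lo, hi = float(img.min()), float(img.max())
    img = (img - lo) / max(hi - lo, 1e-6)      # normalize to [0, 1]
    return np.where(img < img.mean(), 1, 0)    # darker than average -> ridge
```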
One of the plurality of lenslets 1600 is defined as a first lens 1601 having a corresponding first field of view range FOV 1. The image sensor 162 has a first photosensitive area 1621 corresponding to the first lens 1601, and a photosensitive unit 1624 in the first photosensitive area 1621 is defined as a first photosensitive unit 1624 a. The first lens 1601 is used for imaging the external object 2 in the first field of view FOV1, and the first photosensitive unit 1624a receives the light rays focused and imaged by the first lens 1601 and converts the light rays into corresponding image data (e.g., electrical signals). The first lens 1601 and the first photosensitive unit 1624a constitute a first imaging module 166 capable of imaging the external object 2 within the first field of view range FOV 1.
Another of the plurality of lenslets 1600 is defined as a second lens 1602 having a corresponding second field of view range FOV 2. The image sensor 162 has a second photosensitive area 1622 corresponding to the second lens 1602, and a photosensitive unit 1624 in the second photosensitive area 1622 is defined as a second photosensitive unit 1624 b. The second lens 1602 is used for imaging the external object 2 in the second field of view FOV2, and the second photosensitive unit 1624b receives the light rays focused by the second lens 1602 and converts the light rays into corresponding image data (e.g., electrical signals). The second lens 1602 and the second photosensitive unit 1624b constitute a second imaging module 168 capable of imaging the external object 2 within the second field of view range FOV 2.
The first field of view range FOV1 and the second field of view range FOV2 at least partially overlap, a portion of the first field of view range FOV1 and the second field of view range FOV2 overlapping each other being defined as an overlapping field of view range. The first imaging module 166 forms a first image of the external object 2 (e.g., a user's finger) within the overlapping field of view and the second imaging module 168 forms a second image of the external object 2 within the overlapping field of view.
As shown in fig. 4, fig. 4 is an imaging optical path diagram of the first lens and the second lens in fig. 3. The external object 2 includes a first feature point A, which has a corresponding first image point A1 on the first image and a corresponding third image point A3 on the second image, and a second feature point B, which has a corresponding second image point B2 on the first image and a corresponding fourth image point B4 on the second image.
If the external object 2 is a planar object, such as, but not limited to, a piece of tape or a picture printed with a fingerprint pattern, the planar external object 2 includes a first feature point A and a second feature point C. When the planar external object 2 is placed on the detection area within the overlapping field of view, every point on the planar external object 2 is in contact with the surface of the detection area, and the first image formed of the planar external object 2 within the overlapping field of view by the first imaging module 166 is consistent with the second image formed of it by the second imaging module 168.
Specifically, the distance between the first image point A1 corresponding to the first feature point A on the first image and the second image point C2 corresponding to the second feature point C on the first image may be defined as a first pitch A1C2. The distance between the third image point A3 corresponding to the first feature point A on the second image and the fourth image point C4 corresponding to the second feature point C on the second image may be defined as a second pitch A3C4. That the first image and the second image are consistent means that there is no difference between them, or that the difference is smaller than a preset threshold. In terms of image size, this means that the first pitch A1C2 is the same as the second pitch A3C4, or that the difference between the first pitch A1C2 and the second pitch A3C4 is smaller than a preset difference threshold. The difference in gray value between the first image point A1 corresponding to the first feature point A on the first image and the second image point C2 corresponding to the second feature point C on the first image may be defined as a first gray difference. The difference in gray value between the third image point A3 corresponding to the first feature point A on the second image and the fourth image point C4 corresponding to the second feature point C on the second image may be defined as a second gray difference. In terms of image gray scale, consistency of the first image and the second image means that the first gray difference is the same as the second gray difference, or that the difference between the first gray difference and the second gray difference is smaller than a preset difference threshold.
If the external object 2 is a three-dimensional object, such as, but not limited to, a user's three-dimensional finger, then when the finger is in contact with the surface of the detection area within the overlapping field of view, the fingerprint ridges of the three-dimensional finger contact the surface of the detection area while the fingerprint valleys are spaced from that surface by a certain distance. There will therefore be a parallax-induced difference between the first image of the three-dimensional finger formed within the overlapping field of view by the first imaging module 166 and the second image formed by the second imaging module 168. Accordingly, whether the external object 2 is three-dimensional can be determined by analyzing whether there is a difference between images of the same external object 2 acquired from different viewing angles, thereby preventing a counterfeiter from attacking the recognition function of the optical detection system 16 with a planar replica printed with an image of the external object 2.
Specifically, taking a fingerprint ridge of the three-dimensional finger as the first feature point A and a fingerprint valley as the second feature point B, the distance between the first image point A1 corresponding to the fingerprint ridge A on the first image and the second image point B2 corresponding to the fingerprint valley B on the first image can be defined as a first pitch A1B2. The distance between the third image point A3 corresponding to the fingerprint ridge A on the second image and the fourth image point B4 corresponding to the fingerprint valley B on the second image can be defined as a second pitch A3B4. In terms of image size, the difference between the first image and the second image manifests as the difference between the first pitch A1B2 and the second pitch A3B4 being greater than or equal to a preset difference threshold. The difference in gray value between the first image point A1 corresponding to the fingerprint ridge A on the first image and the second image point B2 corresponding to the fingerprint valley B on the first image can be defined as a first gray difference. The difference in gray value between the third image point A3 corresponding to the fingerprint ridge A on the second image and the fourth image point B4 corresponding to the fingerprint valley B on the second image can be defined as a second gray difference. In terms of image gray scale, the difference between the first image and the second image manifests as the difference between the first gray difference and the second gray difference being greater than or equal to a preset difference threshold.
Optionally, in this embodiment, the first feature point A and the second feature point B of the external object 2 selected when the optical detection system 16 performs detection are a fingerprint ridge and a fingerprint valley adjacent to each other on the three-dimensional finger, that is, the fingerprint ridge and fingerprint valley closest to each other in position. The first pitch A1B2 formed by the fingerprint ridge A and the fingerprint valley B on the first image and the second pitch A3B4 formed on the second image may each range from 100 μm to 300 μm, and may vary according to the optical parameters of the lens module 160 and the positional relationship among the lens module 160, the image sensor 162 and the protective layer 12.
Optionally, in some other embodiments, the first feature point A and the second feature point B of the external object 2 selected by the optical detection system 16 during detection may also be a non-adjacent fingerprint ridge and fingerprint valley on the three-dimensional finger.
Optionally, in this embodiment, the first lens 1601 and the second lens 1602 are respectively a pair of lenslets 1600 adjacent to each other among the lenslets 1600, so that the first field of view FOV1 of the first lens 1601 and the second field of view FOV2 of the second lens 1602 have mutually overlapping field of view ranges. The proportion of the overlapping field of view ranges in the first field of view range FOV1 and the second field of view range FOV2 respectively exceeds a preset proportion value, for example: greater than or equal to 30%.
Optionally, in some other embodiments, one or more lenslets 1600 may be spaced between the first lens 1601 and the second lens 1602, as long as the overlapping field of view range can be formed between the first field of view range FOV1 of the first lens 1601 and the second field of view range FOV2 of the second lens 1602, and the proportion of the overlapping field of view range satisfies the foregoing requirement.
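To make the overlap requirement concrete, the following sketch estimates the shared proportion of two neighbouring fields of view, modelling each FOV footprint on the detection surface as a circle. The circular-footprint model and the example dimensions are assumptions made for this illustration, not values taken from the embodiment.

```python
import math

def fov_overlap_ratio(fov_radius: float, lens_spacing: float) -> float:
    """Fraction of one lenslet's field of view shared with its neighbour.

    Models each field of view at the detection surface as a circle of radius
    fov_radius whose centre is offset from its neighbour's by lens_spacing.
    """
    r, d = fov_radius, lens_spacing
    if d >= 2 * r:
        return 0.0  # footprints no longer overlap
    overlap_area = (2 * r * r * math.acos(d / (2 * r))
                    - 0.5 * d * math.sqrt(4 * r * r - d * d))
    return overlap_area / (math.pi * r * r)

# Example: a 1.5 mm footprint radius and a 0.8 mm lenslet pitch give roughly
# 66% overlap, comfortably above the 30% proportion mentioned above.
# print(fov_overlap_ratio(1.5, 0.8))
```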
The optical detection system 16 also includes a processing module. The processing module is configured to compare the difference between the first image formed of the external object 2 through the first lens 1601 and the second image formed through the second lens 1602 within the overlapping field of view, and to determine whether the external object 2 is a three-dimensional object or a planar object according to the comparison result. If the difference between the first image and the second image of the external object 2 is greater than or equal to a preset threshold, the processing module determines that the external object 2 is a three-dimensional object; if the difference is smaller than the preset threshold, it determines that the external object 2 is a planar object.
Specifically, as shown in fig. 4, the external object 2 includes a first feature point A and a second feature point B. The distance between the first image point A1 and the second image point B2 corresponding to the first feature point A and the second feature point B on the first image is a first pitch A1B2. The distance between the third image point A3 and the fourth image point B4 corresponding to the first feature point A and the second feature point B on the second image is a second pitch A3B4. Optionally, in some embodiments, the difference between the first image and the second image may be the difference between the first pitch A1B2 and the second pitch A3B4, and the preset threshold is a threshold of the difference between the first pitch and the second pitch.
The difference between the gray values of the first image point A1 and the second image point B2 corresponding to the first feature point A and the second feature point B on the first image is a first gray difference. The difference between the gray values of the third image point A3 and the fourth image point B4 corresponding to the first feature point A and the second feature point B on the second image is a second gray difference. Optionally, in some embodiments, the difference between the first image and the second image is the difference between the first gray difference and the second gray difference, and the preset threshold is a threshold of the difference between the first gray difference and the second gray difference.
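A minimal sketch of the decision rule described above, assuming the pitches and gray differences have already been measured and that calibrated thresholds are available; combining the size criterion and the gray-scale criterion with a logical or is one possible policy and is not prescribed by the text.

```python
def is_three_dimensional(pitch_1: float, pitch_2: float,
                         gray_diff_1: float, gray_diff_2: float,
                         pitch_threshold: float, gray_threshold: float) -> bool:
    """Decide whether the imaged object is three-dimensional.

    pitch_1 / pitch_2 are the A1B2 and A3B4 spacings measured on the first and
    second images; gray_diff_1 / gray_diff_2 are the corresponding gray-value
    differences. The object is judged three-dimensional when either disparity
    reaches its threshold.
    """
    pitch_disparity = abs(pitch_1 - pitch_2)
    gray_disparity = abs(gray_diff_1 - gray_diff_2)
    return pitch_disparity >= pitch_threshold or gray_disparity >= gray_threshold
```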
Optionally, in this embodiment, the external object 2 is a user's finger, the first feature point A is a fingerprint ridge of the finger, and the second feature point B is a fingerprint valley of the finger. Optionally, in some other embodiments, the external object 2 may also be another three-dimensional object, and the first feature point A and the second feature point B may be points on the external object 2 that lie on different planes.
It is understood that, before comparing the first image with the second image, the processing module 161 may determine the image points corresponding to the first feature point A and the second feature point B on the first image and the second image, respectively, by performing feature matching between the first image and the second image.
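The embodiment does not specify a particular feature-matching method. As one possible illustration only, the sketch below pairs image points between the two views using ORB keypoints and brute-force matching from OpenCV; any method that pairs A1/B2 with A3/B4 would serve.

```python
import cv2

def matched_points(first_image, second_image, max_matches: int = 50):
    """Pair image points between the two views of the overlapping field of view.

    Uses ORB keypoints with brute-force Hamming matching as a stand-in for the
    unspecified feature-matching step; each returned pair is ((x1, y1), (x2, y2)).
    """
    orb = cv2.ORB_create()
    kp1, des1 = orb.detectAndCompute(first_image, None)
    kp2, des2 = orb.detectAndCompute(second_image, None)
    if des1 is None or des2 is None:
        return []  # no usable features found in one of the views
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    return [(kp1[m.queryIdx].pt, kp2[m.trainIdx].pt) for m in matches[:max_matches]]
```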
Optionally, in this embodiment, the processing module 161 determines whether the external object is three-dimensional according to the difference between images of the external object 2 acquired from different viewing angles by two imaging modules whose fields of view overlap. It is understood that, in some other embodiments, the processing module 161 may also acquire a plurality of images of the external object 2 located within the overlapping field of view from different viewing angles through two or more imaging modules with overlapping fields of view, perform statistical analysis on the differences between the acquired images of the external object 2, such as average-difference analysis or standard-deviation analysis, and then compare the result with a preset difference threshold to determine whether the external object 2 is three-dimensional.
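A sketch of the statistical variant mentioned above, assuming the same feature-point spacing has been measured in each of several overlapping views; the specific aggregation (largest deviation from the mean, population standard deviation) and the thresholds are illustrative choices rather than requirements of the embodiment.

```python
import statistics

def is_three_dimensional_multi(pitches: list[float],
                               deviation_threshold: float,
                               spread_threshold: float) -> bool:
    """Aggregate the feature-point spacing measured in several overlapping views.

    For a flat imitation the measured spacings should all agree; a real finger
    produces a parallax-driven spread. The object is judged three-dimensional
    when the largest deviation from the mean or the standard deviation of the
    spacings reaches its threshold.
    """
    mean_pitch = statistics.mean(pitches)
    max_deviation = max(abs(p - mean_pitch) for p in pitches)
    spread = statistics.pstdev(pitches)
    return max_deviation >= deviation_threshold or spread >= spread_threshold
```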
Optionally, in this embodiment, the processing module 161 may be further configured to compare the first image and/or the second image of the external object 2 with a pre-stored external object 2 template, and identify the identity of the external object 2 according to a comparison result.
It is understood that, in some embodiments, the processing module 161 performs the comparison and identification after determining that the external object 2 is a three-dimensional object. The processing module 161 does not perform subsequent comparison and identification after determining that the external object 2 is a planar object, thereby saving energy consumption of the optical detection system 16.
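The energy-saving behaviour described here amounts to gating the identity match behind the three-dimensionality check. The sketch below shows that ordering with placeholder callables for the comparison steps; the helper names and signatures are assumptions made for this example.

```python
def detect_and_identify(first_image, second_image, templates,
                        measure_difference, match_template, threshold) -> bool:
    """Run the anti-spoofing check first, and only then the identity match.

    measure_difference and match_template are placeholders for the comparison
    steps described above; returning early for planar objects is what saves the
    subsequent comparison work.
    """
    if measure_difference(first_image, second_image) < threshold:
        return False  # planar imitation: reject without template matching
    return any(match_template(first_image, template) for template in templates)
```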
Optionally, in some embodiments, the processing module may be connected to the image sensor 162 to receive image data converted by the image sensor 162 according to the external object 2 image formed by the lens module 160. Optionally, in some other embodiments, the optical detection system may further include a memory 18, the image data converted by the image sensor 162 according to the image of the external object 2 may be stored in the memory 18, and the processing module 161 acquires the image data of the external object 2 from the memory 18.
Alternatively, in some embodiments, the processing module 161 may be firmware solidified within the memory 18 or computer software code stored within the memory 18, executed by one or more corresponding processors (not shown) to control the relevant components to implement the corresponding functions, such as, but not limited to, an application processor (AP), a central processing unit (CPU), or a microcontroller unit (MCU). The memory 18 includes, but is not limited to, flash memory, electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), a hard disk, and the like. The memory 18 may be used to store the external object 2 template used for identifying the identity of the external object 2, the various preset thresholds, the image data of the external object 2 acquired by the image sensor 162, and intermediate data generated during comparison and judgment.
Optionally, in some embodiments, the processor 17 and/or memory 18 may be integrated on the same substrate as the image sensor 162. Optionally, in some other embodiments, the processor 17 and/or the memory 18 may also be disposed on the host of the electronic device 1, such as: the main circuit board of the mobile phone.
Optionally, in some embodiments, the functions of the processing module 161 may also be implemented by hardware, for example, by any one or a combination of the following technologies: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like. It is understood that the above-mentioned hardware for implementing the functions of the processing module 161 may be integrated on the same substrate as the image sensor 162, or may be disposed on a host of the electronic device 1.
Compared with the prior art, the optical detection system 16 of the present application determines whether the external object 2 is a three-dimensional object by comparing whether there is a difference caused by parallax between the images of the external object 2 respectively obtained from different viewing angles, so that lawless persons can be effectively prevented from attacking the identification function of the optical detection system 16 by using a planar replica printed with the image of the external object 2, and the security of the electronic device 1 is improved.
Fig. 6 is a partial cross-sectional view of an optical detection system 16b according to a second embodiment of the present application. The optical detection system 16b differs in structure from the optical detection system 16a of the first embodiment in that the first imaging module 166 and the second imaging module 168 of the optical detection system 16b are each independent optical imaging modules. The first imaging module 166 includes a first lens 1601 and a first image sensor 162a. The second imaging module 168 includes a second lens 1602 and a second image sensor 162b. The optical detection system 16b does not use the lenslet 1600 array structure to image the external object 2; instead, it uses the mutually independent first lens 1601 and second lens 1602 to image the external object 2 separately. The first lens 1601 and the second lens 1602 form images on the mutually independent first image sensor 162a and second image sensor 162b, respectively, rather than on different areas of the same image sensor 162.
Optionally, in some embodiments, the optical detection system 16b may also include two or more independent imaging modules with overlapping fields of view and acquire, from different viewing angles, multiple images of the external object 2 within the overlapping field of view. The processing module 161 (refer to fig. 5) may perform statistical analysis on the differences between the acquired images of the external object 2, such as average-difference analysis or standard-deviation analysis, and then compare the result with a preset difference threshold to determine whether the external object 2 is three-dimensional.
It should be noted that, part or all of the embodiments of the present application, and part or all of the modifications, replacements, alterations, splits, combinations, extensions, etc. of the embodiments are considered to be covered by the inventive idea of the present application and belong to the protection scope of the present application.
Any reference in this specification to "one embodiment," "an embodiment," "example embodiment," etc., means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment. Further, when a particular feature or structure is described in connection with any embodiment, it is submitted that it is within the purview of one skilled in the art to effect such feature or structure in connection with other ones of the embodiments.
The orientations or positional relationships indicated by "length", "width", "upper", "lower", "left", "right", "front", "rear", "back", "front", "vertical", "horizontal", "top", "bottom", "inner", "outer", and the like, which may appear in the specification of the present application, are based on the orientations or positional relationships shown in the drawings, and are only for convenience of describing the embodiments of the present application and simplifying the description, but do not indicate or imply that the device or element referred to must have a specific orientation, be constructed in a specific orientation, and be operated, and thus should not be construed as limiting the present application. Like reference numbers and letters refer to like items in the figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance. In the description of the present application, "plurality" or "a plurality" means at least two or two unless specifically defined otherwise. In the description of the present application, it should also be noted that, unless explicitly stated or limited otherwise, "disposed," "mounted," and "connected" are to be understood in a broad sense, e.g., they may be fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; either directly or indirectly through intervening media, or may be interconnected between two elements. The specific meaning of the above terms in the present application can be understood in a specific case by those of ordinary skill in the art.
The above description is only for the specific embodiment of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. The terms used in the following claims should not be construed to limit the application to the specific embodiments disclosed in the specification. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (12)

1. An under-screen optical detection system for detecting an external object, comprising:
a protective layer;
a display screen located below the protective layer and configured to display images;
a first imaging module located below the display screen, having a first field of view and configured to form a first image of an external object within the first field of view;
a second imaging module located below the display screen, having a second field of view and configured to form a second image of an external object within the second field of view, the first field of view and the second field of view at least partially overlapping, the overlapping portion of the first field of view and the second field of view being defined as an overlapping field of view; and
a processing module configured to compare the difference between a first image formed by the first imaging module and a second image formed by the second imaging module of an external object within the overlapping field of view, and to determine that the external object is a three-dimensional object when the difference is greater than or equal to a preset threshold.
2. The under-screen optical detection system of claim 1, wherein:
the first imaging module comprises:
a first lens configured to converge light rays from an external object within the first field of view to image the external object; and
a first photosensitive module comprising a plurality of first photosensitive units, configured to receive the light rays converged by the first lens for imaging and convert them into corresponding electrical signals;
the second imaging module comprises:
a second lens configured to converge light rays from an external object within the second field of view to image the external object; and
a second photosensitive module comprising a plurality of second photosensitive units, configured to receive the light rays converged by the second lens for imaging and convert them into corresponding electrical signals.
3. The under-screen optical detection system of claim 2, further comprising a lens module positioned below the display screen, the lens module including a plurality of lenslets arranged in an array, the first lens and the second lens each being one of the lenslets.
4. The under-screen optical detection system of claim 3, further comprising an image sensor located below the lens module, the image sensor including a plurality of photosensitive units, the first photosensitive module and the second photosensitive module being different photosensitive areas on the image sensor.
5. The under-screen optical detection system of claim 1, wherein the external object comprises a first feature point and a second feature point, a distance between corresponding image points of the first feature point and the second feature point on the first image is a first pitch, a distance between corresponding image points of the first feature point and the second feature point on the second image is a second pitch, a difference between the first image and the second image is a difference between the first pitch and the second pitch, and the preset threshold is a threshold of the difference between the first pitch and the second pitch.
6. The system according to claim 1, wherein the external object includes a first feature point and a second feature point, a difference between gray values of corresponding image points of the first feature point and the second feature point on the first image is a first gray difference, a difference between gray values of corresponding image points of the first feature point and the second feature point on the second image is a second gray difference, a difference between the first image and the second image is a difference between the first gray difference and the second gray difference, and the preset threshold is a threshold of the difference between the first gray difference and the second gray difference.
7. The under-screen optical detection system of claim 5 or 6, wherein the external object is part of a fingerprint of a user's finger located within the overlapping field of view, the first feature point is a fingerprint ridge, and the second feature point is a fingerprint valley.
8. The under-screen optical detection system of claim 5 or 6, wherein the pitch between the image points formed by the first feature point and the second feature point, on the first image and on the second image respectively, is in the range of 100 μm to 300 μm.
9. The under-screen optical detection system of claim 1, wherein the proportion of the overlapping field of view in the first field of view and the second field of view is greater than or equal to 30%.
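As a rough illustration of the ≥ 30% condition in claim 9, the overlap proportion of two fields of view can be estimated by intersecting their footprints on the display surface. The rectangle model and the function below are assumptions for illustration only; the patent does not prescribe how the fields of view are parameterized.

```python
def overlap_proportion(fov_a, fov_b):
    """Approximate each field of view as an axis-aligned rectangle
    (x_min, y_min, x_max, y_max) on the display surface and return the
    overlap area as a fraction of each rectangle's own area."""
    ax0, ay0, ax1, ay1 = fov_a
    bx0, by0, bx1, by1 = fov_b
    w = max(0.0, min(ax1, bx1) - max(ax0, bx0))
    h = max(0.0, min(ay1, by1) - max(ay0, by0))
    overlap = w * h
    frac_a = overlap / ((ax1 - ax0) * (ay1 - ay0))
    frac_b = overlap / ((bx1 - bx0) * (by1 - by0))
    # One reading of claim 9: both fractions are at least 0.30.
    return frac_a, frac_b
```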
10. The under-screen optical detection system of claim 1, wherein the display screen is an active light-emitting display screen, and the light emitted by the display screen can be used to illuminate an external object to form the first image and/or the second image; or
the display screen is a passive light-emitting display module, and the under-screen optical detection system further comprises an excitation light source configured to provide the light required for detection.
11. The under-screen optical detection system of claim 1, wherein the processing module is further configured to compare the first image and/or the second image of the external object with a pre-stored template of the external object and to identify the identity of the external object according to the comparison result.
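Claim 11 adds identity recognition by comparing the captured image(s) against a pre-stored template of the external object. The comparison method is not specified in the claims; the sketch below uses normalized cross-correlation between equally sized images purely as an illustrative stand-in, and the name match_threshold is an assumption.

```python
import numpy as np

def matches_template(image: np.ndarray,
                     template: np.ndarray,
                     match_threshold: float = 0.8) -> bool:
    """Report whether the captured image is similar enough to the
    pre-stored template (both assumed to have the same shape)."""
    a = image.astype(np.float64).ravel()
    b = template.astype(np.float64).ravel()
    # Zero-mean, unit-variance normalization before correlation.
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    score = float(np.dot(a, b) / a.size)  # normalized cross-correlation
    return score >= match_threshold
```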
12. An electronic device comprising the under-screen optical detection system of any one of claims 1 to 11.
CN202010491744.3A 2020-06-02 2020-06-02 Under-screen optical detection system and electronic equipment Active CN111723689B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010491744.3A CN111723689B (en) 2020-06-02 2020-06-02 Under-screen optical detection system and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010491744.3A CN111723689B (en) 2020-06-02 2020-06-02 Under-screen optical detection system and electronic equipment

Publications (2)

Publication Number Publication Date
CN111723689A true CN111723689A (en) 2020-09-29
CN111723689B CN111723689B (en) 2023-09-12

Family

ID=72565580

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010491744.3A Active CN111723689B (en) 2020-06-02 2020-06-02 Under-screen optical detection system and electronic equipment

Country Status (1)

Country Link
CN (1) CN111723689B (en)

Patent Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040131237A1 (en) * 2003-01-07 2004-07-08 Akihiro Machida Fingerprint verification device
US20090232362A1 (en) * 2008-03-12 2009-09-17 Hitachi Maxell, Ltd. Biometric information acquisition apparatus and biometric authentication apparatus
US20140232697A1 (en) * 2010-06-01 2014-08-21 Cho-Yi Lin Portable optical touch system
US8666127B1 (en) * 2010-08-06 2014-03-04 Secugen Corporation Method and apparatus for fake fingerprint detection
KR20160037305A (en) * 2014-09-26 2016-04-06 창신정보통신(주) Method for User Authentication using Fingerprint Recognition
US20160188950A1 (en) * 2014-12-30 2016-06-30 Quanta Computer Inc. Optical fingerprint recognition device
US20170270342A1 (en) * 2015-06-18 2017-09-21 Shenzhen GOODIX Technology Co., Ltd. Optical collimators for under-screen optical sensor module for on-screen fingerprint sensing
WO2017113744A1 (en) * 2015-12-31 2017-07-06 深圳市汇顶科技股份有限公司 Fingerprint identifying method and fingerprint identification device
US20190197287A1 (en) * 2017-12-21 2019-06-27 Lg Display Co., Ltd. Fingerprint Recognition Device and Display Device and Mobile Terminal Using Fingerprint Recognition Device
US10102415B1 (en) * 2018-03-29 2018-10-16 Secugen Corporation Method and apparatus for simultaneous multiple fingerprint enrollment
WO2020073166A1 (en) * 2018-10-08 2020-04-16 深圳市汇顶科技股份有限公司 Fingerprint recognition method and apparatus, and terminal device
WO2020082380A1 (en) * 2018-10-26 2020-04-30 深圳市汇顶科技股份有限公司 Fingerprint recognition apparatus, and electronic device
WO2020093251A1 (en) * 2018-11-06 2020-05-14 深圳市汇顶科技股份有限公司 Double sensing area-based fingerprint identification method, fingerprint identification system, and electronic device
CN111133446A (en) * 2018-12-13 2020-05-08 深圳市汇顶科技股份有限公司 Fingerprint identification device and electronic equipment
CN109791609A (en) * 2018-12-26 2019-05-21 深圳市汇顶科技股份有限公司 Fingerprint identification module, electronic equipment and chip
CN111066031A (en) * 2019-11-05 2020-04-24 深圳市汇顶科技股份有限公司 Under-screen fingerprint identification device, LCD fingerprint identification system and electronic equipment
CN111199190A (en) * 2019-12-19 2020-05-26 深圳阜时科技有限公司 Optical detection device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
D. LI, H. KUNIEDA, ET AL.: "Online Detection of Spoof Fingers for Smartphone-based Applications", 2015 IEEE 17TH INTERNATIONAL CONFERENCE ON HPCC *
CHEN ZHIQING, CHANG ZHENG, ET AL.: "Parameter selection and implementation of automatic quality assessment of live fingerprint images", 刑事技术 (Forensic Science and Technology) *

Also Published As

Publication number Publication date
CN111723689B (en) 2023-09-12

Similar Documents

Publication Publication Date Title
CN110088768B (en) Fingerprint recognition device and electronic equipment under screen
US10943944B2 (en) Flat panel display having embedded optical imaging sensor located at rear surface of display
CN108292361B (en) Display integrated optical fingerprint sensor with angle limiting reflector
CN110235143B (en) Under-screen fingerprint identification device and electronic equipment
CN108513666B (en) Under-screen biological feature recognition device and electronic equipment
CN211319247U (en) Fingerprint identification device, backlight unit, liquid crystal display and electronic equipment
CN109690567A (en) Fingerprint identification device and electronic equipment
CN110100250B (en) Fingerprint identification device and method and electronic equipment
CN210109828U (en) Fingerprint identification device and electronic equipment
CN112069942B (en) Optical detection system and electronic equipment under screen
CN112528953A (en) Fingerprint identification device, electronic equipment and fingerprint identification method
CN209433415U (en) Fingerprint identification device and electronic equipment
US10726232B2 (en) Flat panel display having optical sensor
CN111709330B (en) Optical detection system and electronic equipment
CN111931681A (en) Optical detection device and electronic equipment
CN210295111U (en) Fingerprint identification device and electronic equipment
WO2021056392A1 (en) Optical fingerprint apparatus, electronic device, and method for measuring distance
CN111339815A (en) Optical detection device and electronic equipment
CN111291731A (en) Optical sensing device and electronic apparatus
CN111723689B (en) Under-screen optical detection system and electronic equipment
CN110785770A (en) Fingerprint identification method and device and electronic equipment
CN210403732U (en) Lens assembly, imaging device and biological characteristic detection system
CN212484400U (en) Optical detection device and electronic equipment
CN211698975U (en) Optical detection device and electronic equipment
CN210401953U (en) Backlight assembly, display device and biological characteristic detection system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant