CN211698986U - Optical sensing device and electronic apparatus - Google Patents
- Publication number: CN211698986U (application CN202020420003.1U)
- Authority: CN (China)
- Legal status: Active
Abstract
The present application discloses an optical sensing device, including: an image sensing chip comprising a plurality of pixel units, the pixel units being configured to receive light beams and convert the received light beams into corresponding electrical signals; and a lens module located above the image sensing chip and comprising a plurality of first lenses arranged at intervals. The image sensing chip has effective sensing areas respectively corresponding to the first lenses, and each effective sensing area comprises a plurality of pixel units. The first lenses are configured to converge light beams onto the corresponding effective sensing areas, and light beams in a target waveband can pass through the first lenses, reach the corresponding effective sensing areas, and be converted into corresponding electrical signals. Some or all of the plurality of first lenses are arranged in a triangular grid. The present application also discloses an electronic apparatus.
Description
Technical Field
The present disclosure relates to the field of optoelectronic technologies, and more particularly, to an ultra-thin optical sensing device and an electronic apparatus.
Background
With technological progress and rising living standards, users demand more functions and a more fashionable appearance from electronic equipment such as mobile phones, tablet computers, and cameras. The current development trend for devices such as mobile phones is toward a higher screen-to-body ratio together with functions such as fingerprint detection. To achieve a full screen, or an effect close to it, while keeping a high screen-to-body ratio, under-screen fingerprint detection technology has been developed. However, because the internal space of electronic equipment such as a mobile phone is limited, an imaging device that realizes optical imaging with a traditional lens occupies considerable space due to its large size and volume.
SUMMARY OF THE UTILITY MODEL
In view of the above, the present application provides an optical sensing device and an electronic apparatus capable of solving or improving the problems of the prior art.
One aspect of the present application provides an optical sensing device comprising:
an image sensing chip comprising a plurality of pixel units, the pixel units being configured to receive light beams and convert the received light beams into corresponding electrical signals; and
a lens module located above the image sensing chip, the lens module including:
a plurality of first lenses arranged at intervals, wherein the image sensing chip has effective sensing areas respectively corresponding to the first lenses, each effective sensing area comprises a plurality of pixel units, the first lenses are configured to converge light beams onto the corresponding effective sensing areas, and light beams in a target waveband can pass through the first lenses, reach the corresponding effective sensing areas, and be converted into corresponding electrical signals; wherein some or all of the plurality of first lenses are arranged in a triangular grid.
In some embodiments, the triangular mesh comprises an equilateral triangular mesh.
In some embodiments, the image sensing chip has a photosensitive surface for receiving the light beam, and the first lenses of the lens module facing the photosensitive surface are arranged in an equilateral triangular grid.
In some embodiments, the center-to-center spacing of two adjacent first lenses is greater than or equal to 1, 2, 3, 4, 5, or 6 times the diameter of the first lens.
In some embodiments, the first lens is a convex lens including a convex surface facing away from the side where the image sensing chip is located; the light beam enters the first lens through the convex surface and, after being converged, exits the lens module toward the image sensing chip.
In some embodiments, the lens module further includes a retaining wall disposed on the spacing regions between the first lenses, wherein: the retaining wall can shield a light beam of a first preset waveband, the first preset waveband including the target waveband; or the lens module further comprises a light shielding layer covering the retaining wall, the light shielding layer being able to shield the light beam of the first preset waveband.
In some embodiments, the optical sensing device further includes a filter layer located between the lens module and the image sensing chip, or formed on the side of the lens module facing away from the image sensing chip. The filter layer is configured to transmit a light beam of the target waveband and filter out a light beam of a second preset waveband, the second preset waveband being different from the target waveband.
When the light beam of the target waveband is near-infrared light, the second preset waveband includes the visible-light waveband; the filter layer filters out visible light and transmits near-infrared light, i.e., it is a visible-light cut-off filter. When the light beam of the target waveband is visible light, the second preset waveband includes the near-infrared waveband; the filter layer filters out near-infrared light and transmits visible light, i.e., it is an infrared cut-off filter.
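The waveband-to-filter rule above can be expressed as a small lookup. This is a hypothetical illustration only; the function name and string labels are not from the source:

```python
def choose_filter(target_band: str) -> str:
    """Pick the cut-off filter that transmits the target waveband and
    blocks the second preset waveband, per the rule above."""
    if target_band == "near-infrared":
        # Target is NIR, so the second preset waveband is visible light.
        return "visible-light cut-off filter"
    if target_band == "visible":
        # Target is visible light, so the second preset waveband is NIR.
        return "infrared cut-off filter"
    raise ValueError(f"unsupported target band: {target_band}")

print(choose_filter("near-infrared"))  # visible-light cut-off filter
print(choose_filter("visible"))        # infrared cut-off filter
```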
In some embodiments, the lens module further includes an optical spacer layer located below the first lenses and the retaining wall and above the image sensing chip. The first lenses and the retaining wall form an integrated structure, the optical spacer layer is the residual layer left when the first lenses and the retaining wall are formed, and the first lenses, the retaining wall, and the optical spacer layer are made of the same material.
In some embodiments, the first lens of the lens module is formed directly on the image sensing chip; or the optical sensing device further comprises a first substrate disposed above the image sensing chip, and the lens module is formed on the first substrate.
In some embodiments, the light beam of the target waveband includes visible light and/or near-infrared light.
One aspect of the present application provides an electronic apparatus including a display screen and an optical sensing device located below the display screen, the optical sensing device being the optical sensing device described above. The optical sensing device can receive, through the display screen, a light beam carrying biometric information of an external object and convert it into an electrical signal, so as to obtain the biometric information of the external object.
The beneficial effect of the present application is that its optical sensing device uses a plurality of first lenses to transmit and converge light beams onto the pixel units of the image sensing chip; the pixel units receive the light beams and convert them into electrical signals, thereby acquiring the biometric information of an external object. Arranging some or all of the first lenses in an equilateral triangular grid yields a larger luminous flux and a better biometric detection effect.
Drawings
FIG. 1 is a schematic view of an electronic device of the present application including an optical sensing device;
FIG. 2 is a schematic diagram of a partial exploded perspective view of one embodiment of the optical sensing device of FIG. 1;
FIG. 3 is a schematic top view of the plurality of first lenses of the optical sensor device of FIG. 2, illustrating an arrangement of the plurality of first lenses;
FIG. 4 is a schematic partial cross-sectional view of the optical sensing device of FIG. 2;
FIGS. 5A and 5B are schematic diagrams of a square grid arrangement and an equilateral triangular grid arrangement, respectively, of the first lens;
FIG. 6 is a schematic top view of a portion of a first lens of the optical sensor device of FIG. 2;
FIG. 7 is a schematic partial cross-sectional view of one embodiment of an electronic device of the present application;
FIG. 8 is an imaging schematic of a large lens of the prior art.
Detailed Description
In the detailed description of the embodiments herein, it will be understood that when a substrate, a sheet, a layer, or a pattern is referred to as being "on" or "under" another substrate, another sheet, another layer, or another pattern, it can be "directly" or "indirectly" on the other substrate, the other sheet, the other layer, or the other pattern, or one or more intervening layers may also be present. The thickness and size of each layer in the drawings of the specification may be exaggerated, omitted, or schematically represented for clarity. Further, the sizes of the elements in the drawings do not completely reflect actual sizes.
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments obtained by a person of ordinary skill in the art without any inventive work based on the embodiments in the present application are within the scope of protection of the present application.
Further, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to provide a thorough understanding of embodiments of the application. One skilled in the relevant art will recognize, however, that the subject technology can be practiced without one or more of the specific details, or with other structures, components, and so forth. In other instances, well-known structures or operations are not shown or described in detail to avoid obscuring the focus of the application.
Referring to fig. 1, fig. 1 is a schematic structural diagram of an embodiment of an electronic device according to the present application. The electronic device 100 comprises an optical sensing apparatus 1 and a display screen 2. The display screen 2 is used for displaying pictures. The optical sensing device 1 is located below the display screen 2, and is configured to receive a light beam returned by an external object through the display screen 2, and convert the received light beam into a corresponding electrical signal, so as to perform corresponding information sensing. The optical sensing device 1 is used for example to perform sensing of biometric information, such as but not limited to, texture information including fingerprint information, palm print information, and the like, and/or living body information including blood oxygen information, heartbeat information, pulse information, and the like. However, the present application is not limited thereto, and the optical sensing apparatus 1 may also be used for performing other information sensing, such as depth information sensing, proximity sensing, and the like. In the present application, the optical sensing device 1 is mainly used to perform the sensing of the biometric information. The display screen 2 is, for example, but not limited to, an OLED display screen or an LCD display screen. The display screen 2 may be used as an excitation light source for providing a light beam for detection, or an excitation light source may be additionally provided in the electronic device 100 for providing a light beam for detection. The surface of the display screen 2 facing away from the optical sensing device 1 is an upper surface (not numbered), and the upper surface includes a detection area VA. 
When the external object 1000 contacts the detection area VA, the light beams reflected and/or transmitted by the external object 1000 can pass through the display screen 2 from the detection area VA, and the optical sensing device 1 can receive the light beams through the display screen 2 to acquire a biometric image of the external object.
The electronic device 100 may be any suitable type of electronic product, such as, but not limited to, consumer electronics, home electronics, vehicle-mounted electronics, financial terminal products, and the like. The consumer electronic products include, for example, mobile phones, tablet computers, notebook computers, desktop monitors, all-in-one computers, and the like. Household electronic products are, for example, smart door locks, televisions, refrigerators and the like. The vehicle-mounted electronic product is, for example, a vehicle-mounted navigator, a vehicle-mounted DVD, or the like. The financial terminal products are ATM machines, terminals for self-service business and the like.
Referring to fig. 2, fig. 2 is a partially exploded schematic view of an optical sensing device 1 according to an embodiment of the present disclosure, in which the optical sensing device 1 includes a lens module 10 and an image sensor chip 20 located below the lens module 10. The lens module 10 is used for converging light beams to the image sensing chip 20. The image sensor chip 20 is used for converting the received light beam into a corresponding electrical signal. The image sensing chip 20 has a photosensitive surface for light sensing, and a light beam reaching the photosensitive surface can be received by the image sensing chip 20 and converted into an electrical signal.
The lens module 10 includes a plurality of first lenses 12 arranged at intervals. The first lens 12 is capable of converging a light beam, and a light beam can be converged onto the image sensing chip 20 through the first lens 12.
Please refer to fig. 3 and fig. 4. Fig. 3 is a schematic top view of the plurality of first lenses 12 shown in fig. 2, in which the plurality of first lenses 12 are arranged in a triangular grid. Fig. 4 is a partial cross-sectional view of the optical sensing device 1 along the line a-a in fig. 3. Optionally, the triangular grid includes an equilateral triangular grid, and the plurality of first lenses 12 may be arranged in an equilateral triangular grid. In the present specification, an equilateral triangular grid is taken as an example, but it should be understood that the plurality of first lenses 12 may instead be arranged in a non-equilateral triangular grid, in a grid of polygons including triangles, or in other suitable arrangements.
The first lenses 12 have optical centers, and the distance between the optical centers of two adjacent first lenses 12 is the center-to-center distance P between the two adjacent first lenses 12. The center-to-center pitch P can be any value from 100 microns to 1000 microns, for example, but not limited to, the center-to-center pitch P can be 100 microns, 200 microns, 300 microns, 350 microns, 400 microns, 450 microns, and the like. Alternatively, the diameter D of the first lens 12 may be any value from 80 microns to 500 microns, for example, but not limited to, the diameter D of the first lens 12 may be 100 microns, 110 microns, 120 microns, 130 microns, 140 microns, 150 microns, 160 microns, 170 microns, 180 microns, 200 microns, and the like.
The first lens 12 may have a rise H1 of any value between 0 and 100 microns, such as 10 microns, 20 microns, 30 microns, 40 microns, 50 microns, 60 microns, 70 microns, 80 microns, or 90 microns. The first lens 12 also includes a bottom surface (not numbered) opposite the convex surface 121, and the rise H1 can be regarded as the maximum perpendicular distance of the convex surface 121 from the bottom surface. In this embodiment, the bottom surface of the first lens 12 is circular with a diameter D, which may be any value between 10 microns and 500 microns, such as 50 microns, 80 microns, 100 microns, 120 microns, or 150 microns. For convenience of description, the diameter D of the bottom surface of the first lens 12 is also referred to as the diameter of the first lens 12. Alternatively, in some embodiments, the bottom surface of the first lens 12 may be circular, rectangular, hexagonal, or another polygon, and the first lens 12 may accordingly be referred to as a circular lens, a rectangular lens, a hexagonal lens, and so on. The present application is not limited in this respect, and details that can be readily understood by those skilled in the art are not repeated.
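For a spherical convex surface (the lenses described later may be spherical or aspherical; the spherical case is assumed here), the rise H1, the bottom-surface diameter D, and the radius of curvature R are related by the standard sag formula H1 = R − sqrt(R² − (D/2)²). The sketch below is illustrative only; the function names and the 150 µm / 30 µm sample values are assumptions chosen from within the ranges quoted above:

```python
import math

def sag(radius_of_curvature: float, diameter: float) -> float:
    """Rise (sag) H1 of a spherical convex surface with bottom-surface
    diameter D and radius of curvature R: H1 = R - sqrt(R^2 - (D/2)^2)."""
    half = diameter / 2.0
    if half > radius_of_curvature:
        raise ValueError("diameter too large for this radius of curvature")
    return radius_of_curvature - math.sqrt(radius_of_curvature**2 - half**2)

def curvature_from_sag(diameter: float, rise: float) -> float:
    """Inverse relation: R = ((D/2)^2 + H1^2) / (2 * H1)."""
    return ((diameter / 2.0) ** 2 + rise**2) / (2.0 * rise)

# Illustrative values (microns): D = 150, H1 = 30 gives R = 108.75.
R = curvature_from_sag(diameter=150.0, rise=30.0)
assert abs(sag(R, 150.0) - 30.0) < 1e-9
```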
The first lens 12 includes a convex surface 121, which is the light incident surface facing away from the image sensing chip 20. A light beam can enter the first lens 12 through the convex surface 121, exit the lens module 10 toward the image sensing chip 20, and then be received by the light detecting unit of the image sensing chip 20 and converted into an electrical signal.
The image sensing chip 20 includes a plurality of pixel units 21. When the light beam 101 carrying the biometric information of the external object 1000 reaches the image sensing chip 20 through the first lens 12, the pixel units 21 can receive the light beam 101 and convert it into corresponding electrical signals to obtain the biometric information of the external object 1000 (see fig. 1), which is, for example, but not limited to, a user's finger, palm, etc. The pixel unit 21 includes, for example, but not limited to, a photodiode. The plurality of pixel units 21 form a light detection array 22 of the image sensing chip 20, i.e., the photosensitive area or photosensitive surface of the image sensing chip 20.
Optionally, in some embodiments, each of the first lenses 12 faces a plurality of the pixel units 21, in which case the first lens 12 may be a small lens (Mini-lens). Compared with the case that each first lens 12 respectively faces only one pixel unit 21, the first lens 12 faces a plurality of pixel units 21, so that the light sensing area corresponding to a single lens is increased, and the sensing accuracy is higher.
However, alternatively, in some embodiments, the plurality of first lenses 12 may face the plurality of pixel units 21 one to one, in which case the first lens 12 may be a micro-lens (Micro-lens).
As shown in fig. 3, the light detection array 22 of the image sensing chip 20 has effective photosensitive areas 25 corresponding to the first lenses 12. An effective photosensitive area 25 can receive the light beam transmitted through its corresponding first lens 12 and convert it into an electrical signal representing the biometric information of the external object 1000. Each effective photosensitive area 25 corresponds to a plurality of pixel units 21.
The effective photosensitive area 25 may face the first lens 12, and the orthographic projection of the first lens 12 on the light detection array 22 of the image sensing chip 20 may completely cover the effective photosensitive area 25. Alternatively, the effective photosensitive area 25 may merely overlap the orthographic projection of the first lens 12. Optionally, the effective photosensitive areas 25 and the first lenses 12 have a one-to-one correspondence, and each first lens 12 directly faces its corresponding effective sensing area EA. For a circular first lens 12, the diameter D of the first lens 12 may be greater than, equal to, or less than the diameter of its corresponding effective sensing area EA; likewise, the area of the first lens 12 may be greater than, equal to, or less than the area of its corresponding effective sensing area EA.
Optionally, in some embodiments, some first lenses 12 have no corresponding effective photosensitive area 25. For example, for first lenses 12 located at the edge of the lens module 10, the orthographic projection on the image sensing chip 20 may fall outside the photosensitive surface; the light beams transmitted through these first lenses 12 are not received by the image sensing chip 20 or converted into electrical signals, so these first lenses 12 have no corresponding effective photosensitive areas 25 on the image sensing chip 20.
To ensure imaging quality and avoid crosstalk, a light beam reflected and/or transmitted from the external object 1000 and passing through one first lens 12 should, as far as possible, illuminate only the effective sensing area EA corresponding to that first lens 12. If the light beam passing through one first lens 12 illuminates the effective sensing area EA corresponding to an adjacent or other first lens 12, it interferes with the normal imaging of that area; this is the crosstalk phenomenon. To avoid crosstalk, in the embodiments of the present application, adjacent first lenses 12 are spaced apart from each other, and the center-to-center pitch P is not less than 1, 2, 3, 4, 5, or 6 times the diameter of the first lens 12.
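The anti-crosstalk spacing rule above can be written as a one-line check. The helper name and the sample pitch and diameter are illustrative assumptions, chosen from the ranges given earlier:

```python
def pitch_ok(pitch_um: float, diameter_um: float, k: float = 2.0) -> bool:
    """Anti-crosstalk layout rule: the center-to-center pitch P of
    adjacent first lenses should be at least k times the lens
    diameter D (k = 1, 2, 3, ... depending on the embodiment)."""
    return pitch_um >= k * diameter_um

# Example: P = 350 um, D = 150 um satisfies k = 2 but not k = 3.
assert pitch_ok(pitch_um=350.0, diameter_um=150.0, k=2.0)
assert not pitch_ok(pitch_um=350.0, diameter_um=150.0, k=3.0)
```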
Alternatively, in some embodiments, the arrangement of the plurality of first lenses 12 may have a two-dimensional periodic grid-like structure, which may include one or more of a square grid, an equilateral triangular grid, an equilateral hexagonal grid, or any other suitably shaped grid. Optionally, in some embodiments, the optical center of the first lens 12 is the vertex of the grid-like structure. It should be understood that the manufacturing process may cause errors in actual products due to differences in processes, materials, etc., for example, there may be a deviation of the optical center of the first lens 12 from the vertex of the grid-like structure, but the plurality of first lenses 12 as a whole have a grid-like arrangement, which also falls within the scope of the present application.
Optionally, in some embodiments, at least a portion of the first lenses 12 of the plurality of first lenses 12 facing the light-sensing surface of the image sensor chip 20 has an arrangement of an equilateral triangular grid.
It can be understood that an equilateral triangular grid packs more closely than a square grid, an equilateral hexagonal grid, and the like. Using the equilateral triangular grid arrangement enables the lens module 10 to accommodate more first lenses 12 within the same area while ensuring that the center-to-center distance of adjacent first lenses 12 is not less than the first predetermined distance. The more first lenses 12 there are, the greater the amount of light transmitted through them; that is, first lenses 12 arranged in an equilateral triangular grid achieve a larger luminous flux over a photosensitive surface of the same size. Therefore, the optical sensing device 1 using a plurality of first lenses 12 arranged in an equilateral triangular grid can avoid the influence of crosstalk on imaging and has a better imaging effect and biometric detection effect.
Referring to fig. 5A and 5B, fig. 5A shows a plurality of first lenses 12 arranged in a square grid within a rectangular region R1, where the side length of the square grid is the center-to-center distance P between two adjacent first lenses 12. Fig. 5B shows a plurality of first lenses 12 arranged in an equilateral triangular grid within the same rectangular region R1, where the side length of the equilateral triangular grid is likewise the center-to-center distance P. It can be seen that within the same rectangular region R1, 36 first lenses 12 fit in the equilateral triangular grid, whereas only 30 fit in the square grid. Assuming each first lens 12 transmits an equal amount of light, the luminous flux of the plurality of first lenses 12 in fig. 5B can be regarded as approximately 20% higher (36/30) than that of the plurality of first lenses 12 in fig. 5A.
Alternatively, in some embodiments, the lens module 10 may be formed directly on the image sensing chip 20. For example, but not limited to, the image sensing chip 20 is a bare chip, and the plurality of first lenses 12 are formed on the image sensing chip 20 through a plating process. Alternatively, in some embodiments, the plurality of first lenses 12 are adhered to the image sensing chip 20 by an adhesive. The adhesive can be DAF (die attach film), solid glue, liquid glue, and the like; the present application is not limited in this respect.
Optionally, in some embodiments, the lens module 10 includes a retaining wall 13 disposed in the spacing regions between the first lenses 12. To prevent crosstalk between light beams transmitted by adjacent first lenses 12, the height of the retaining wall 13 may be greater than that of the first lens 12; of course, it may also be less than or equal to the height of the first lens 12. Here, the height of the retaining wall 13 is the maximum vertical distance of the retaining wall 13 from the plane of the bottom surface of the first lens 12, and the height of the first lens 12 is the maximum vertical distance of its convex surface 121 from the plane of its bottom surface. Alternatively, in some embodiments, the retaining wall 13 may be disposed only between some adjacent first lenses 12, or between every pair of adjacent first lenses 12.
Further, the retaining wall 13 may be higher than the first lens 12 by any value from 5 to 10 micrometers. Because the retaining wall 13 is higher than the first lens 12 and the light shielding layer 14 covers the retaining wall 13, an interfering light beam arriving obliquely at the first lens 12 can be blocked by the light shielding layer 14 and will not, after passing through the first lens 12, be received by the pixel units 21 of the effective sensing area 25 directly facing an adjacent first lens 12; this effectively avoids crosstalk of light beams between adjacent first lenses 12. For example, as shown in fig. 4, the light beam 101 is blocked by the light shielding layer 14 covering the retaining wall 13 and cannot reach, through the first lens 12, the effective sensing area 25 directly facing the adjacent first lens 12. The image sensing chip 20 therefore has better sensing accuracy. In addition, when the lens module 10 is pressed from above, the retaining wall 13 can bear all or most of the pressure, so the first lens 12 is not deformed or damaged, and the image sensing chip 20 can still receive the light beam transmitted through the first lens 12 to acquire the biometric information of the external object 1000 (see fig. 1). The biometric information can be fingerprint information, vein information, face information, iris information, and the like.
Further, the retaining wall 13 may include a top surface 131 and a plurality of side surfaces 132; the side surfaces 132 face the first lenses 12, with each side surface 132 surrounding one first lens 12. The top surface 131 may include a flat surface and/or a curved surface, as may the side surface 132. Relative to the plane of the bottom surface of the first lens 12, the junction of the top surface 131 and the side surface 132 has a height greater than the rise of the first lens 12. Alternatively, the top surface 131 of the retaining wall 13 may be recessed downward so that the cross section of the retaining wall 13 along the line a-a in fig. 3 appears concave. Of course, the retaining wall 13 may have other structural configurations, which the present application does not limit.
The retaining wall 13 may be made of a transparent or non-transparent material, and the lens module 10 further includes a light shielding layer 14 disposed on the surface of the retaining wall 13 and on the spacing regions between the first lenses 12. The light shielding layer 14 may shield the light beam of the first preset waveband, and may cover the spacing regions between the plurality of first lenses 12 as well as the retaining walls 13 on those regions. In this way, the light beam of the first preset waveband can reach the image sensing chip 20 only through the first lenses 12. The first preset waveband at least includes the target waveband. Alternatively, the light shielding layer 14 may cover the side surfaces 132 and the top surface 131 of the retaining wall 13. Optionally, there is a gap between the side surface 132 of the retaining wall 13 and the first lens 12, at which, for example, but not limited to, the optical spacer layer 11 is exposed; the light shielding layer 14 then covers the side surfaces 132, the top surface 131, and the gap.
Alternatively, in some embodiments, the retaining wall 13 may be configured to block the light beam of the first predetermined wavelength band. For example, but not limited to, the retaining wall 13 may be made of a material that is opaque to light, or the retaining wall 13 may be made of a material that can block the light beam of the first predetermined wavelength band. At this time, the light beam of the first predetermined wavelength band is blocked by the retaining wall 13. The first preset waveband at least comprises a target waveband.
The light beam of the target waveband can pass through the first lens 12 and then be received by the image sensing chip 20 and converted into an electrical signal. In this case, the retaining wall 13 need not be covered by the light shielding layer 14.
It should be noted that the light beam of the target wavelength band may be a light beam that is received by the image sensing chip 20 and converted into an electrical signal to acquire image data with biometric information. In the embodiment of the present application, the target wavelength band may be 300 nm to 2000 nm.
Optionally, in some embodiments, the lens module 10 may further include an optical spacer layer 11 located below the first lenses 12 and the retaining wall 13. The first lenses 12 and the retaining wall 13 may be regarded as one integral structure; for example, but not limited to, they may be formed in one step by an imprinting process, from a light-transmitting resin, an optical adhesive, or the like. The optical spacer layer 11 is made of the same material as the first lenses 12 and the retaining wall 13, and may be the residual layer left when they are formed; of course, it may also be formed with a certain thickness at that time. In some embodiments, the first lenses 12, the retaining wall 13, and the optical spacer layer 11 may instead be made with other materials or processes. Optionally, the optical spacer layer 11 has an upper surface 111, and the plane of the bottom surface of the first lens 12 may be regarded as this upper surface 111. The light beam of the target waveband can be received by the image sensing chip 20 and converted into an electrical signal after passing through the first lens 12 and the optical spacer layer 11.
Optionally, in some embodiments, the optical spacer layer 11 has an upper surface 111, and the first lens 12 and the retaining wall 13 are located on the upper surface 111. Relative to the upper surface 111 of the optical spacer layer 11, the height (rise) of the first lens 12 is H1 and the height of the retaining wall 13 is H2, with H2 not less than H1; that is, the retaining wall 13 is at least as high as the first lens 12. By way of example and not limitation, the retaining wall 13 may exceed the first lens 12 in height by any value from 0 to 100 microns.
Optionally, in some embodiments, the plurality of first lenses 12 are convex lenses. Further optionally, the plurality of first lenses 12 are spherical lenses or aspherical lenses; that is, the convex surface 121 is spherical or aspherical.
Optionally, in some embodiments, the plurality of first lenses 12 are made of a transparent material. Such as, but not limited to, transparent acrylic, transparent glass, UV glue material, and the like.
Optionally, in some embodiments, the plurality of first lenses 12 are identical; alternatively, in other embodiments, the plurality of first lenses 12 may differ from one another.
Optionally, in some embodiments, the optical sensing device 1 may further include a first substrate 30, and the plurality of lens modules 10 are disposed on the first substrate 30. The first substrate 30 may be used to carry the lens module 10 and may be made of glass, resin, metal, or the like; the present application is not limited in this respect. The light beam of the target wavelength band may sequentially pass through the first lens 12, the optical spacer layer 11, and the first substrate 30, and then be received by the image sensing chip 20 and converted into an electrical signal. Of course, alternatively, in some embodiments, the lens module 10 may be formed directly on the image sensing chip 20, in which case the image sensing chip 20 serves as the carrier substrate of the lens module 10. The image sensing chip 20 may be a die or a packaged chip. Compared with manufacturing the lens module 10 on the first substrate 30 and then fixing the first substrate 30 carrying the lens module 10 to the image sensing chip 20 by, for example, an adhesive, forming the lens module 10 directly on the image sensing chip 20 can make the overall thickness of the optical sensing device 1 thinner.
Optionally, in some embodiments, the optical sensing device 1 may further include a filter layer 40 disposed between the lens module 10 and the image sensing chip 20. The filter layer 40 is configured to transmit the light beam of the target wavelength band and block a light beam of a second preset wavelength band, where the second preset wavelength band is different from the target wavelength band. The light beam of the target wavelength band may pass through the first lens 12, the optical spacer layer 11, the first substrate 30, and the filter layer 40 in sequence, and then be received by the image sensing chip 20 and converted into an electrical signal. For example, but not limited to, the filter layer 40 may be an infrared cut filter capable of blocking infrared light and transmitting light beams other than infrared light. Visible light can then pass through the filter layer 40 and be received by the image sensing chip 20 and converted into an electrical signal, so that the image sensing chip 20 can acquire corresponding visible-light image data. Optionally, in some embodiments, the filter layer 40 is formed on the photosensitive array 22 of the image sensing chip 20, for example by an evaporation process. The thickness of the filter layer 40 may be, for example, but not limited to, 1 micron to 5 microns.
Alternatively, in some other embodiments, the lens module 10 includes a light shielding layer 14 for filtering out light beams of a first preset wavelength band, and the optical sensing device 1 includes a filter layer 40 for filtering out light beams of a second preset wavelength band. The first preset wavelength band and the second preset wavelength band may be completely different, completely the same, or partially the same. When the first preset wavelength band and the second preset wavelength band are partially the same, the first preset wavelength band may include the second preset wavelength band. For example, the first preset wavelength band includes a visible light band and a near-infrared band, and the second preset wavelength band includes a near-infrared band; the filter layer 40 is then, for example, an infrared cut filter capable of cutting off infrared light and transmitting visible light. Optionally, in some embodiments, the filter layer 40 is disposed on the image sensing chip 20, and/or the filter layer 40 is disposed on the lens module 10; specifically, for example, the filter layer 40 is provided on the plurality of first lenses 12 and the light shielding layer 14.
Optionally, in some embodiments, the optical sensing device 1 further comprises a filter layer 40 disposed above the plurality of pixel units 21. The filter layer 40 is used for transmitting the light beam of the target wavelength band and filtering out light beams outside the target wavelength band, thereby reducing the interference of stray light with the sensing precision. The light beam of the target wavelength band is, for example, visible light. Optionally, in some embodiments, the light beam of the target wavelength band is near-infrared light, and the filter layer 40 may be a visible light cut filter capable of cutting off visible light and transmitting near-infrared light. Alternatively, in some embodiments, the filter layer 40 may be formed on the image sensing chip 20 by coating; the filter layer 40 may include a multilayer optical thin-film structure, or may include a material having the optical property of transmitting light beams of the target wavelength band while cutting off light beams of other wavelength bands, as will be understood by those skilled in the art, and the present application is not limited thereto.
Optionally, in some embodiments, the light beam of the target wavelength band may include visible light and/or near-infrared light. The target wavelength band may lie in the range from 300 nanometers to 2000 nanometers; a light beam with a wavelength of 300 nm to 780 nm can be regarded as visible light, and a light beam with a wavelength of 780 nm to 2000 nm can be regarded as near-infrared light.
Alternatively, in some embodiments, the lens module 10 is formed on a first substrate 30, and the first substrate 30 is connected to the filter layer 40 by an adhesive. The adhesive can be DAF (die attach film), liquid glue, solid glue, optical glue, etc. The embodiments of the present application do not limit this. Alternatively, in some embodiments, the first substrate 30 may be omitted, and the lens module 10 may be directly formed on the filter layer 40, for example, by an imprinting process. It should be noted that the lens module 10 may also be formed by other processes, and the first lens 12 and the retaining wall 13 may be formed at one time or formed multiple times. This is not a limitation of the present application.
Referring to fig. 6, a top view of a portion of the first lenses 12 in fig. 2 arranged in an equilateral triangular grid is shown, wherein the plurality of first lenses 12 include first lenses 12a, 12b, and 12c sequentially arranged along line B-B. Fig. 7 is a partial cross-sectional view of the electronic device 100 taken along the line B-B in fig. 6. The surface of the display screen 2 facing away from the optical sensing device 1 is the upper surface, which can be touched by the external object 1000 so as to acquire the biometric information of the external object 1000. The upper surface of the display screen 2 has sub-detection regions V1, V2, and V3 corresponding to the first lenses 12a, 12b, and 12c, respectively. The effective sensing regions of the first lenses 12a, 12b, and 12c on the photosensitive surface of the image sensing chip 20 are 25a, 25b, and 25c, respectively.
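As an illustrative aside (not part of the claimed embodiment), the equilateral triangular grid of lens centers described above can be sketched numerically. In the following Python snippet the grid size and the 100-micron pitch are assumed values chosen purely for illustration; alternate rows are offset by half a pitch with a row spacing of pitch·√3/2, so each center and its nearest neighbours form equilateral triangles:

```python
import math

def triangular_grid_centers(rows, cols, pitch):
    """Generate lens-center coordinates on an equilateral triangular grid.

    Adjacent centers within a row are `pitch` apart; every other row is
    shifted by half a pitch, and rows are spaced pitch * sqrt(3) / 2 apart,
    so a center and its nearest neighbours form equilateral triangles.
    """
    row_spacing = pitch * math.sqrt(3) / 2
    centers = []
    for r in range(rows):
        x_offset = pitch / 2 if r % 2 else 0.0
        for c in range(cols):
            centers.append((x_offset + c * pitch, r * row_spacing))
    return centers

centers = triangular_grid_centers(rows=3, cols=3, pitch=100.0)
# A center's in-row neighbour and its nearest next-row neighbour are equidistant:
d_row = math.dist(centers[0], centers[1])   # 100.0
d_diag = math.dist(centers[0], centers[3])  # sqrt(50^2 + (50*sqrt(3))^2) = 100.0
```

Because the in-row and diagonal neighbour distances are equal, the three centers concerned are vertices of an equilateral triangle, which is the arrangement fig. 6 depicts.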
A part of the light beams reflected and/or transmitted by the external object 1000 can pass through the display screen 2 from the sub-detection region V1 and, after passing through the first lens 12a, be received by the effective sensing region 25a and converted into electrical signals. Likewise, a part of the light beams reflected and/or transmitted by the external object 1000 can pass through the display screen 2 from the sub-detection region V2 and, after passing through the first lens 12b, be received by the effective sensing region 25b and converted into electrical signals; a part can pass through the display screen 2 from the sub-detection region V3 and, after passing through the first lens 12c, be received by the effective sensing region 25c and converted into electrical signals. The sub-detection regions V1 and V2 have an overlapping portion, as do the sub-detection regions V2 and V3. It can be considered that, in the embodiment shown in figs. 6 and 7, for the plurality of first lenses 12, the sub-detection regions corresponding to any two adjacent first lenses 12 have an overlapping region; in other words, adjacent sub-detection regions overlap.
Taking fingerprint detection as an example, when the external object 1000 is a finger, the finger generally needs to touch the detection area VA on the upper surface of the display screen 2, so that a detection light beam from the display screen 2 or an external excitation light source can reach the finger, and after being transmitted and/or reflected by the finger, the detection light beam passes through the display screen 2 from the detection area, and can be received by the optical sensing device 1 located below the display screen 2 to acquire an optical fingerprint image corresponding to the finger. The reflected and/or transmitted detection light beam of the finger can carry fingerprint characteristic information, and the reflected and/or transmitted detection light beam of different positions of the finger can carry different fingerprint information. For convenience of description, the detection light beams reflected and/or transmitted by the external object 1000 are hereinafter collectively referred to as detection light beams returned by the external object 1000. It is understood that the plurality of first lenses 12 are disposed at intervals, and the plurality of effective sensing regions 25 corresponding to the plurality of first lenses 12 are disposed at intervals. The optical fingerprint image may include a plurality of separated sub-detection images, each sub-detection image corresponds to one first lens 12, and the detection light beam passing through the corresponding first lens 12 generates the sub-detection image after being subjected to photoelectric conversion by the image sensor chip 20.
When the optical fingerprint image is used for fingerprint identification and comparison, the image characteristics of different sub-detection images can be identified through a fingerprint identification algorithm, the image characteristics generated after the light beams in the overlapped area are received by different effective sensing areas are used as alignment (alignment) references (the detection light beams penetrating through the overlapped area can be regarded as carrying the same fingerprint characteristic information), and the different sub-detection images are spliced to obtain the spliced fingerprint characteristic image.
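The stitching idea described above, namely using the image content of the overlapped area as an alignment reference, can be illustrated with a deliberately simplified one-dimensional sketch. The data, the sum-of-squared-differences criterion, and the function names below are illustrative assumptions only; a real fingerprint algorithm operates on 2-D sub-images with more robust feature matching:

```python
def best_overlap(left, right, min_overlap=2):
    """Return the overlap length at which the tail of `left` and the
    head of `right` agree best (smallest sum of squared differences)."""
    best_err, best_len = float("inf"), min_overlap
    for n in range(min_overlap, min(len(left), len(right)) + 1):
        err = sum((a - b) ** 2 for a, b in zip(left[-n:], right[:n]))
        if err < best_err:
            best_err, best_len = err, n
    return best_len

def stitch(left, right):
    """Splice two sub-detection images, keeping the shared overlap once."""
    n = best_overlap(left, right)
    return left + right[n:]

# Two 1-D "sub-detection images" sharing the samples [4, 5]:
merged = stitch([1, 2, 3, 4, 5], [4, 5, 6, 7])  # -> [1, 2, 3, 4, 5, 6, 7]
```

The shared samples play the role of the detection light beams that pass through the overlapped sub-detection region and are received by two different effective sensing regions.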
However, alternatively, in other embodiments, the sub-detection regions corresponding to two adjacent first lenses 12 may not overlap, or may be separated by a gap.
The optical centers of the first lenses 12a, 12b, and 12c are O1, O2, and O3, respectively. The vertical distance between the optical centers O1, O2, O3 and the photosensitive surface of the image sensing chip 20 is the image distance L1, and the vertical distance between the optical centers O1, O2, O3 and the upper surface of the display screen 2 is the object distance L2. It is possible to obtain:

diameter(Vi) / diameter(Ei) = L2 / L1 = k

where diameter(Vi) represents the diameter of the sub-detection region Vi, diameter(Ei) represents the diameter of the effective sensing region Ei, and k is the object-image ratio at which the optical sensing device 1 optically images the biological feature of the external object 1000. Note that here the sub-detection region Vi and the effective sensing region Ei are regarded as circular regions, and the first lens 12a is regarded as a circular lens. It should be understood that the bottom surface of the first lens 12a, the sub-detection region Vi, and the effective sensing region Ei may have other shapes, in which case the above relation may be adjusted according to the actual situation; the present application is not limited thereto.
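Treating the regions as circles, the relation between the object-image ratio and the region diameters reduces to simple arithmetic. A minimal sketch (the function names are chosen here for illustration, and the numbers come from the numerical example given later in this embodiment):

```python
def object_image_ratio(object_distance, image_distance):
    """k = L2 / L1: object-image ratio of the optical sensing device."""
    return object_distance / image_distance

def sub_detection_diameter(effective_diameter, object_distance, image_distance):
    """diameter(Vi) = k * diameter(Ei) for circular regions."""
    k = object_image_ratio(object_distance, image_distance)
    return k * effective_diameter

# L1 = 200 microns, L2 = 1000 microns, effective sensing region of 100 microns:
k = object_image_ratio(object_distance=1000.0, image_distance=200.0)  # 5.0
d_vi = sub_detection_diameter(100.0, object_distance=1000.0,
                              image_distance=200.0)                   # 500.0
```

A 100-micron effective sensing region thus maps to a 500-micron sub-detection region on the display surface, matching the example values quoted below.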
It should be understood that the first lenses 12a, 12b and 12c, the sub detection regions V1, V2, V3, the effective photosensitive regions 25a, 25b and 25c, etc. are exemplary labels for convenience of description and are not intended to limit the embodiments of the present application.
When k is greater than 1, the area of the sub-detection region corresponding to a first lens 12 is larger than the area of its effective sensing region, and the larger the value of k, the larger the ratio between the two areas. For example, when the image distance L1 is kept constant and the object distance L2 is increased, the sub-detection regions V1, V2, and V3 become correspondingly larger while the sizes of the effective sensing regions 25a, 25b, and 25c remain unchanged, so that the number of pixel units 21 is constant (the resolution of the sub-detection image corresponding to each first lens 12 can be considered constant). The object distance L2 can therefore be increased appropriately while the recognition accuracy of the biometric detection algorithm is ensured.
Optionally, in some embodiments, the diameter of the sub-detection region is larger than the center-to-center distance P between two adjacent first lenses 12. At this time, the corresponding two sub-detection regions of the two adjacent first lenses 12 overlap.
Optionally, in some embodiments, the diameter of the sub-detection region is equal to the center-to-center distance P between two adjacent first lenses 12. At this time, the two sub-detection regions corresponding to the two adjacent first lenses 12 just do not overlap, and there is no spacing between them.
Optionally, in some embodiments, the diameter of the sub-detection region is smaller than the center-to-center distance P between two adjacent first lenses 12, and at this time, the two corresponding sub-detection regions of two adjacent first lenses 12 do not overlap and have a spacing.
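The three cases above depend only on how the sub-detection-region diameter compares with the lens pitch P. A small sketch (the pitch values are assumptions chosen for illustration; the 500-micron diameter matches the numerical example quoted below):

```python
def classify_adjacent_regions(sub_diameter, lens_pitch):
    """Relate the sub-detection-region diameter to the center-to-center
    distance P of two adjacent first lenses, per the three cases above."""
    if sub_diameter > lens_pitch:
        return "overlap"   # the two sub-detection regions overlap
    if sub_diameter == lens_pitch:
        return "tangent"   # just do not overlap, with no spacing
    return "gap"           # do not overlap, separated by a spacing

# With a 500-micron sub-detection diameter and three assumed pitch values:
cases = [classify_adjacent_regions(500.0, p) for p in (400.0, 500.0, 600.0)]
# -> ['overlap', 'tangent', 'gap']
```

Whether the design chooses overlap, tangency, or a gap then determines whether the sub-detection images can be stitched using shared features, as described earlier.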
By way of example and not limitation, the image distance L1 may be 200 microns, the object distance L2 may be 1000 microns, the diameter of the effective sensing region 25a may be 100 microns, and the diameter of the sub-detection region V1 may be 500 microns.
For the first lens 12, the convex lens imaging equation is satisfied:

1/f = 1/L1 + 1/L2

where L2 denotes the object distance, L1 denotes the image distance, and f denotes the focal length of the first lens 12. By way of example and not limitation, the image distance L1 may be 200 microns and the object distance L2 may be 1000 microns, giving a focal length of approximately 166 microns for the first lens 12 according to the above equation. In the embodiment of the present application, the first lens 12 may be configured as a short-focus or ultra-short-focus convex lens. It should be noted that the first lens 12 described herein is a convex lens. The focal length of the first lens 12 may be determined based on the viewing angle of the electronic device 100 and the size of the first lens 12; when the angle of view is fixed, the focal length of the first lens 12 may increase in proportion to its size.
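The quoted focal length can be checked directly from the thin-lens relation 1/f = 1/L1 + 1/L2. A quick sketch (the helper name is chosen here for illustration):

```python
def thin_lens_focal_length(object_distance, image_distance):
    """Solve 1/f = 1/L1 + 1/L2 for f (thin convex lens, distances positive)."""
    return 1.0 / (1.0 / image_distance + 1.0 / object_distance)

# L1 = 200 microns, L2 = 1000 microns, as in the example above:
f = thin_lens_focal_length(object_distance=1000.0, image_distance=200.0)
# f ~ 166.7 microns, consistent with the roughly 166-micron figure stated
```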
Optionally, in some embodiments, the thickness of the display screen 2 is greater than the thickness of the optical sensing device 1 (thickness here being the length in the Z-axis direction in fig. 1), and the object-image ratio satisfies k > 1 so that the optical sensing device 1 can acquire a biometric image of the external object 1000. The sub-detection regions of the plurality of first lenses 12 collectively form a larger detection area on the upper surface of the display screen 2; the detection area can be touched by the external object 1000, and the optical sensing device 1 can receive, through the display screen 2, the detection light beam returned by the external object 1000 from the detection area.
Referring to fig. 8, fig. 8 is a schematic diagram of imaging by a large lens 1002 according to the prior art. Taking fingerprint detection as an example, in order to acquire sufficient fingerprint characteristic information, both the large lens 1002 and the first lenses 12 need to converge and image the light beam within the detection area VA. For example, but not limited to, the detection area VA may be a rectangular area of 4 mm to 10 mm, or a circular area of 4 mm to 10 mm in diameter. Of course, the detection area VA may have other shapes and sizes, which is not limited in the embodiments of the present application. In the optical sensing device 1, each first lens 12 collects light from only a part of the detection area VA, whereas the prior-art large lens 1002 must converge and image the light beam 101 transmitted through the entire detection area VA. In actual imaging, the distance between the optical center of a first lens 12 and the detection area VA is smaller than the distance between the optical center of the large lens 1002 and the detection area VA, and the distance between the optical center of a first lens 12 and the photosensitive array 22 of the image sensing chip 20 is likewise smaller than the corresponding distance for the large lens 1002.
Therefore, the distance from the detection area VA to the image sensing chip 20 in the prior art is greater than the corresponding distance when the optical sensing device 1 of the embodiments of the present application is used for fingerprint detection. Compared with the prior art, the optical sensing device 1 of the present application thus has a more compact volume and size, and can be used in electronic devices 100 with stringent requirements on internal space, such as mobile phones, tablet computers, and smart watches. The overall thickness of the optical sensing device 1 can be 0.5 mm or less, for example 0.4 mm, 0.35 mm, or thinner, so the optical sensing device 1 can serve as an ultra-thin camera or be applied under the display screen 2 (see fig. 1) to realize under-screen optical biometric detection.
It should be understood that the specific examples in the embodiments of the present application are for the purpose of promoting a better understanding of the embodiments of the present application and are not intended to limit the scope of the embodiments of the present application.
It is to be understood that the terminology used in the embodiments of the present application and the appended claims is for the purpose of describing particular embodiments only and is not intended to be limiting of the embodiments of the present application. For example, as used in the examples of this application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
Those of ordinary skill in the art will appreciate that the elements of the examples described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or a combination of both. To clearly illustrate the interchangeability of hardware and software, the components and steps of the examples have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the several embodiments provided in the present application, it should be understood that the disclosed system and apparatus may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may also be an electric, mechanical or other form of connection.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiments of the present application.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the part of the technical solution of the present application that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
It should be noted that part or all of the structures, functions, and methods of the embodiments of the present application can be applied to other or modified embodiments, and are not limited to the embodiments described in correspondence thereto; all embodiments obtained thereby belong to the scope of the present application. In addition, in the embodiments of the present application, the light beam may be visible light or invisible light, and the invisible light may be, for example, near-infrared light. The terms "overlap", "overlapped", and "overlapping" as they appear in the description of the present application should be understood to have the same meaning and to be interchangeable.
Any reference in this specification to "one embodiment," "an embodiment," "example embodiment," etc., means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment. Further, when a particular feature or structure is described in connection with any embodiment, it is submitted that it is within the purview of one skilled in the art to effect such feature or structure in connection with other ones of the embodiments.
The orientations or positional relationships indicated by "length", "width", "upper", "lower", "left", "right", "front", "rear", "back", "vertical", "horizontal", "top", "bottom", "inner", "outer", and the like, which may appear in the specification of the present application, are based on the orientations or positional relationships shown in the drawings and are used only for convenience of describing the embodiments of the present application and simplifying the description; they do not indicate or imply that the device or element referred to must have a specific orientation or be constructed and operated in a specific orientation, and thus should not be construed as limiting the present application. Like reference numerals and letters refer to like items in the figures, so once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only to distinguish the description and are not to be construed as indicating or implying relative importance. In the description of the present application, "plurality" or "a plurality" means at least two unless specifically defined otherwise. It should also be noted that, unless explicitly stated or limited otherwise, "disposed", "mounted", and "connected" are to be understood in a broad sense; for example, a connection may be a fixed connection, a detachable connection, or an integral connection; it may be a mechanical or electrical connection; it may be a direct connection, an indirect connection through an intervening medium, or an internal communication between two elements. The specific meanings of the above terms in the present application can be understood by those of ordinary skill in the art according to the specific situation.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. The terms used in the following claims should not be construed to limit the invention to the specific embodiments disclosed in the specification. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (13)
1. An optical sensing device, comprising:
an image sensing chip comprising a plurality of pixel units, wherein the pixel units are used for receiving light beams and converting the received light beams into corresponding electrical signals;
a lens module located above the image sensing chip, the lens module including:
a plurality of first lenses arranged at intervals, wherein the image sensing chip is provided with effective sensing areas respectively corresponding to the first lenses, each effective sensing area comprises a plurality of the pixel units, the first lenses are used for converging light beams to the corresponding effective sensing areas, and a light beam of a target wavelength band can pass through the first lenses to reach the corresponding effective sensing areas and be converted into corresponding electrical signals;
wherein part or all of the plurality of first lenses have an arrangement of a triangular mesh.
2. The optical sensing device of claim 1, wherein the triangular mesh comprises an equilateral triangular mesh.
3. The optical sensing device of claim 1, wherein the image sensing chip has a photosensitive surface for receiving the light beam, and the part of the first lenses of the lens module facing the photosensitive surface has an arrangement of an equilateral triangular grid.
4. The optical sensing device of claim 1, wherein the center-to-center distance between two adjacent first lenses is greater than or equal to 1, 2, 3, 4, 5, or 6 times the diameter of the first lenses.
5. The optical sensor apparatus as claimed in claim 1, wherein the first lens is a convex lens, and includes a convex surface facing away from the image sensor chip, and the light beam enters the first lens from the convex surface and is converged to exit from the lens module toward a side of the image sensor chip.
6. The optical sensing device of claim 1, wherein the lens module further comprises a retaining wall disposed in the spacing regions between the first lenses, wherein:
the retaining wall can shield a light beam of a first preset wave band, and the first preset wave band comprises the target wave band; or
The lens module further includes a light shielding layer covering the retaining wall, and the light shielding layer can shield the light beam of the first preset wavelength band.
7. The optical sensing device as claimed in claim 1, further comprising a filter layer disposed between the lens module and the image sensor chip or formed on a side of the lens module opposite to the image sensor chip, wherein the filter layer is configured to transmit a light beam in a target wavelength band and filter out a light beam in a second predetermined wavelength band, and the second predetermined wavelength band is different from the target wavelength band.
8. The optical sensing device as claimed in claim 7, wherein when the light beam of the target wavelength band is near-infrared light, the second predetermined wavelength band includes a visible light wavelength band, the filter layer is used for filtering visible light and transmitting near-infrared light, and the filter layer is a visible light cut-off filter; when the light beam of the target waveband is visible light, the second preset waveband comprises a near-infrared light waveband, the filter layer is used for filtering near-infrared light and transmitting visible light, and the filter layer is an infrared light cut-off filter.
9. The optical sensor device as claimed in claim 6, wherein the lens module further includes an optical spacer layer located below the first lens and the retaining wall, the optical spacer layer is located above the image sensor chip, the first lens and the retaining wall have an integral structure, the optical spacer layer is a residual layer when the first lens and the retaining wall are formed, and the first lens, the retaining wall and the optical spacer layer are made of the same material.
10. The optical sensing device as claimed in claim 1, wherein the first lens of the lens module is formed directly on the image sensing chip; or
The optical sensing device further comprises a first substrate arranged above the image sensing chip, and the lens module is formed on the first substrate.
11. The optical sensor device as claimed in claim 1, wherein the first lens is a convex lens, and includes a convex surface facing away from a side of the image sensor chip, and the light beam enters the first lens from the convex surface and is converged to exit from the lens module toward the side of the image sensor chip.
12. The optical sensing device of claim 1, wherein the target band of light beams includes visible and/or near infrared light.
13. An electronic device, comprising a display screen, and an optical sensing device under the display screen, wherein the optical sensing device is the optical sensing device according to any one of claims 1 to 12, and the optical sensing device is capable of receiving a light beam with biometric information of an external object through the display screen and converting the light beam into an electrical signal to obtain the biometric information of the external object.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202020420003.1U CN211698986U (en) | 2020-03-27 | 2020-03-27 | Optical sensing device and electronic apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
CN211698986U true CN211698986U (en) | 2020-10-16 |
Family
ID=72781782
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202020420003.1U Active CN211698986U (en) | 2020-03-27 | 2020-03-27 | Optical sensing device and electronic apparatus |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN211698986U (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111291731A (en) * | 2020-03-27 | 2020-06-16 | 深圳阜时科技有限公司 | Optical sensing device and electronic apparatus |
CN111291731B (en) * | 2020-03-27 | 2024-07-19 | 深圳阜时科技有限公司 | Optical sensing device and electronic equipment |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111291731B (en) | Optical sensing device and electronic equipment | |
CN210052176U (en) | Fingerprint detection device and electronic equipment | |
CN111095277B (en) | Optical fingerprint device and electronic equipment | |
CN211349375U (en) | Optical fingerprint device and electronic equipment | |
CN210864756U (en) | Optical fingerprint device and electronic equipment | |
CN111095279A (en) | Fingerprint detection device and electronic equipment | |
CN111586267A (en) | Optical sensing devices and electronic equipment | |
CN110854148A (en) | Optical sensing device and electronic apparatus | |
CN110796123B (en) | Optical fingerprint sensing device and electronic device | |
CN213186221U (en) | Optical sensing device and electronic apparatus | |
CN211698986U (en) | Optical sensing device and electronic apparatus | |
CN211087267U (en) | Fingerprint identification device, backlight unit, display screen and electronic equipment | |
CN211124080U (en) | Optical sensing device and electronic apparatus | |
CN211428168U (en) | Optical sensing device and electronic apparatus | |
CN212161813U (en) | Optical sensing device and electronic apparatus | |
CN211700286U (en) | Optical sensing device and electronic apparatus | |
CN210864759U (en) | Optical fingerprint sensing device and electronic equipment | |
CN211700285U (en) | Optical integrated device | |
CN111095287A (en) | Optical fingerprint device and electronic equipment | |
CN210864760U (en) | Optical biological characteristic sensing device and mobile phone | |
CN210864761U (en) | Optical fingerprint sensing device and electronic equipment | |
CN211427358U (en) | Optical fingerprint sensing device and electronic equipment | |
CN111104864B (en) | Optical fingerprint sensing device and electronic equipment | |
CN211124074U (en) | Optical detection device and electronic apparatus | |
CN211698974U (en) | Optical fingerprint sensing device and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
GR01 | Patent grant | ||