CN216083081U - Light receiving module, depth camera and terminal - Google Patents

Light receiving module, depth camera and terminal Download PDF

Info

Publication number
CN216083081U
CN216083081U (Application CN202121842408.5U)
Authority
CN
China
Prior art keywords
phase
light
lens
image sensor
receiving module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202121842408.5U
Other languages
Chinese (zh)
Inventor
刘海亮 (Liu Hailiang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202121842408.5U priority Critical patent/CN216083081U/en
Application granted granted Critical
Publication of CN216083081U publication Critical patent/CN216083081U/en
Priority to PCT/CN2022/098857 priority patent/WO2023011009A1/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/08 Systems determining position data of a target for measuring distance only
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/894 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B13/00 Optical objectives specially designed for the purposes specified below
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B30/00 Camera modules comprising integrated lens units and imaging units, specially adapted for being embedded in other devices, e.g. mobile phones or vehicles

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Optics & Photonics (AREA)
  • Blocking Light For Cameras (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

The application discloses a light receiving module, a depth camera, and a terminal. The light receiving module includes an image sensor and a phase lens. In the direction from the object side to the image side of the light receiving module, the phase lens and the image sensor are arranged in sequence; the phase lens adjusts the phase of the light emitted from the phase lens to the image sensor, and the image sensor receives that light to acquire depth data. By replacing a conventional refractive lens group with a phase lens capable of adjusting the phase of light, the light receiving module, depth camera, and terminal of this application reduce the volume of the light receiving module and increase the illuminance of the light reaching the image sensor, which helps the image sensor receive light and improves the detection precision of the depth camera.

Description

Light receiving module, depth camera and terminal
Technical Field
The application relates to the technical field of distance measurement, in particular to a light receiving module, a depth camera and a terminal.
Background
Time of flight (ToF) is a technique that calculates the distance between an object and a sensor by measuring the time difference between the transmitted signal and the signal reflected back by the object.
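The time-difference relation above can be sketched numerically. The function name and the 10 ns sample value below are illustrative, not taken from the patent; only the underlying relation (distance equals half the round-trip time multiplied by the speed of light) is standard.

```python
# Hypothetical illustration of the time-of-flight principle:
# distance = (speed of light x round-trip time) / 2.

C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Distance to the object from the measured round-trip time."""
    return C * round_trip_time_s / 2.0

# A pulse returning after 10 ns corresponds to roughly 1.5 m.
print(tof_distance(10e-9))  # ≈ 1.499 m
```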
A typical ToF system includes a transmitting-end module (Tx) and a receiving-end module (Rx). Laser light emitted by the light source is projected onto an object either as a speckle pattern, through a collimating lens and a diffractive optical element (DOE), or as flood illumination, through a collimating lens and a diffuser. The speckle or flood light diffusely reflected by the object is then received by the receiving-end module to complete the collection of the depth signal.
The receiving-end module places a high demand on the signal-to-noise ratio of the received signal: if the image sensor in a ToF receiving-end module performed illumination compensation, both the signal-to-noise ratio and the signal dynamic range would be reduced. The image sensor used in a ToF receiving-end module therefore generally lacks the illumination-compensation gain function of a conventional image sensor, so a relatively high relative illumination (RI) must be achieved by the optical hardware itself. At present, however, the receiving-end module usually adopts a conventional refractive lens, which is limited by its curved-surface refraction characteristics and suffers relative illumination loss across the image plane, reducing the sensing distance and precision.
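The relative-illumination loss of a conventional refractive lens can be illustrated with the classical cos⁴ falloff law; the law and the 30-degree example are textbook optics, not figures taken from the patent.

```python
import math

def relative_illumination(field_angle_deg: float) -> float:
    """Approximate relative illumination at a given field angle for an
    idealised refractive lens, using the classical cos^4 falloff law."""
    theta = math.radians(field_angle_deg)
    return math.cos(theta) ** 4

# At a 30-degree half field of view the image corner receives only about
# 56% of the centre illumination, far below the >= 98% level that the
# phase lens in this application is said to achieve.
print(relative_illumination(30.0))  # 0.5625
```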
SUMMARY OF THE UTILITY MODEL
The embodiment of the application provides a light receiving module, a depth camera and a terminal.
The embodiment of the application provides a light receiving module. The light receiving module comprises an image sensor and a phase type lens. In a direction from an object side to an image side of the light receiving module, the phase lens and the image sensor are sequentially arranged, the phase lens is used for adjusting a phase of light emitted from the phase lens to the image sensor, and the image sensor is used for receiving the light to obtain depth data.
In some embodiments, the relative illumination of the light reaching the image sensor is greater than or equal to 98%.
In some embodiments, the light receiving module further includes a filter, and the phase lens, the filter, and the image sensor are sequentially arranged in a direction from an object side to an image side of the light receiving module, and the filter is configured to filter light outside a predetermined wavelength range.
In some embodiments, the light receiving module further includes a lens barrel and a light-transmissive cover plate. The phase lens is mounted in the lens barrel. The lens barrel includes a first opening and a second opening opposite to each other, the second opening being closer to the image sensor than the first opening. The cover plate is mounted at the first opening of the lens barrel.
In some embodiments, the phase lens includes a substrate and a phase microstructure disposed on the substrate. The phase microstructure is used for adjusting the phase of light rays emitted from the phase type lens to the image sensor.
In some embodiments, the substrate includes first and second opposing faces, the first face being further from the image sensor than the second face, and the phase microstructure is disposed on the first and/or second faces.
In some embodiments, the substrate includes a first side and a second side opposite to each other, the first side is farther away from the image sensor than the second side, one of the first side and the second side is provided with the phase microstructure, and the other side is a refractive lens surface.
In some embodiments, the first face is provided with the phase microstructure, and the second face is a convex lens surface; or the second surface is provided with the phase microstructure, and the first surface is a convex lens surface.
In some embodiments, the phase lens is a planar phase lens, the phase microstructure comprising a nano-microstructure; or, the phase type lens is a Fresnel lens, and the phase microstructure comprises an annular Fresnel microstructure.
The embodiment of the application also provides a depth camera. The depth camera comprises a light emitting module and the light receiving module in any one of the embodiments. The light emitting module is used for emitting light, and the light receiving module is used for receiving at least part of the light reflected by the object and forming an electric signal.
The embodiment of the application also provides a terminal. The terminal comprises a shell and the depth camera in the embodiment. The depth camera is coupled to the housing.
By replacing a conventional refractive lens group with a phase lens capable of adjusting the phase of light, the light receiving module, depth camera, and terminal of this application reduce the volume of the light receiving module and increase the illuminance of the light reaching the image sensor, which helps the image sensor receive light and improves the detection precision of the depth camera.
Additional aspects and advantages of embodiments of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of embodiments of the present application.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 to 10 are schematic structural views of a light receiving module according to some embodiments of the present disclosure;
FIG. 11 is a schematic diagram of a depth camera in some embodiments of the present application;
fig. 12 is a schematic structural diagram of a terminal in some embodiments of the present application.
Detailed Description
Embodiments of the present application will be further described below with reference to the accompanying drawings. The same or similar reference numbers in the drawings identify the same or similar elements or elements having the same or similar functionality throughout.
In addition, the embodiments of the present application described below in conjunction with the accompanying drawings are exemplary and are only for the purpose of explaining the embodiments of the present application, and are not to be construed as limiting the present application.
In this application, unless expressly stated or limited otherwise, a first feature being "on" or "under" a second feature may mean that the first and second features are in direct contact, or that they are in indirect contact through an intermediate medium. Moreover, a first feature being "on," "over," or "above" a second feature may mean that the first feature is directly or obliquely above the second feature, or simply that the first feature is at a higher level than the second feature. A first feature being "under," "below," or "beneath" a second feature may mean that the first feature is directly or obliquely below the second feature, or simply that the first feature is at a lower level than the second feature.
Time of flight (ToF) is a technique that calculates the distance between an object and a sensor by measuring the time difference between the transmitted signal and the signal reflected back by the object. A typical ToF system includes a transmitting-end module (Tx) and a receiving-end module (Rx). Laser light emitted by the light source is projected onto an object either as a speckle pattern, through a collimating lens and a diffractive optical element (DOE), or as flood illumination, through a collimating lens and a diffuser; the speckle or flood light diffusely reflected by the object is received by the receiving-end module to complete the collection of the depth signal. At present, the receiving-end module in time-of-flight systems generally adopts a conventional refractive lens group. A refractive lens, however, is limited by its curved-surface refraction characteristics and suffers relative illumination loss on the image plane; that is, the illuminance of the light reaching the sensor after passing through the refractive lens group is significantly reduced, which affects the measurement precision.
Referring to fig. 1, to solve the above technical problem, an embodiment of the present application provides a light receiving module 10. The light receiving module 10 includes an image sensor 11 and a phase lens 12. In the direction from the object side to the image side of the light receiving module 10, the phase lens 12 and the image sensor 11 are arranged in sequence; the phase lens 12 adjusts the phase of the light emitted from the phase lens 12 to the image sensor 11, and the image sensor 11 receives that light to obtain depth data. The object side and the image side follow one another along the incident direction of the light (the dashed arrow in fig. 1 indicates the incident direction of the light).
By replacing a conventional refractive lens group with a phase lens 12 capable of adjusting the phase of light, the light receiving module 10 of this application reduces its own volume and increases the illuminance of the light reaching the image sensor 11, which helps the image sensor receive light and improves the detection precision of the depth camera 100 (shown in fig. 11).
It should be noted that the number of phase lenses 12 provided in the light receiving module 10 may be one or more, which is not limited herein. In addition, in some embodiments, the relative illumination of the light reaching the image sensor 11 through the phase lens 12 is greater than or equal to 98%. Specifically, referring to fig. 1, the phase lens 12 includes a substrate 121 and a phase microstructure 122 disposed on the substrate 121. The phase microstructure 122 is used to adjust the phase of the light emitted from the phase lens 12 to the image sensor 11.
More specifically, the substrate 121 includes a first surface 1211 and a second surface 1212 that are opposite to each other, and the first surface 1211 is farther from the image sensor 11 than the second surface 1212. Phase microstructure 122 is disposed on first face 1211 and/or second face 1212.
For example, referring to fig. 1 and 6, in some embodiments, the phase microstructure 122 is disposed on the first surface 1211 of the substrate 121. Thus, light can be directly incident on the phase microstructure 122, and the light is modulated by the phase microstructure 122 and then exits to the image sensor 11 through the substrate 121. Since the phase microstructure 122 adjusts the phase of the light emitted from the phase lens 12 to the image sensor 11, compared with the case where the light directly passes through the lens and then is emitted to the sensor, the illumination of the light reaching the image sensor 11 can be improved.
For another example, referring to fig. 2 and 7, in some embodiments the phase microstructure 122 is disposed on the second surface 1212 of the substrate 121, so that light passes through the substrate 121 before reaching the phase microstructure 122; that is, the light is prevented from striking the phase microstructure 122 directly. If strong light were incident directly on the phase microstructure 122, glare could occur and the stray light would be relatively strong, which would hinder the image sensor 11 from receiving light and thus affect the detection accuracy of the depth camera 100 (shown in fig. 11). Therefore, in this embodiment the phase microstructure 122 is disposed on the second surface 1212 of the substrate 121, the surface closer to the image sensor 11. Compared with disposing the phase microstructure 122 on the first surface 1211, which is farther from the image sensor 11, this arrangement increases the illuminance of the light reaching the image sensor 11 while avoiding glare and reducing stray light, which helps the image sensor 11 receive light and improves the detection accuracy of the depth camera 100 (shown in fig. 11).
For another example, referring to fig. 3 and 8, in some embodiments phase microstructures 122 may be disposed on both sides of the substrate 121; that is, phase microstructures 122 are disposed on both the first surface 1211 and the second surface 1212 of the substrate 121. In this way, the phase of the light emitted from the phase lens 12 to the image sensor 11 can be adjusted multiple times without providing multiple phase lenses 12. In addition, compared with a phase lens 12 having the phase microstructure 122 on only one side, a phase lens 12 with phase microstructures 122 on both sides can further increase the illuminance of the light reaching the image sensor 11 and reduce the distortion of the light received by the image sensor 11. It should be noted that, in some embodiments, the shapes, numbers, and arrangements of the phase microstructures 122 on the two opposite surfaces of the substrate 121 may be identical or different, which is not limited herein.
Referring to fig. 4 to 5 and 9 to 10, in some embodiments, when only one side of the substrate 121 of the phase lens 12 is provided with the phase microstructures 122, the other side of the substrate 121 may be a refractive lens surface. That is, one side of the substrate 121 of the phase lens 12 is provided with the phase microstructure 122, and the other side (i.e. the side not provided with the phase microstructure 122) has the same function as the surface of the refractive lens, i.e. the other side has curvature capable of refracting light. Since the phase microstructure 122 is disposed on one surface of the phase lens 12 and the other surface is a refractive lens surface, the illumination of the light reaching the image sensor 11 can be increased, and at the same time, stray light can be reduced and distortion of the light received by the image sensor 11 can be reduced.
For example, referring to fig. 4 and 9, in some embodiments the phase microstructure 122 is disposed on the first surface 1211 of the substrate 121, and the second surface 1212 of the substrate 121 is a convex lens surface; specifically, the second surface 1212 protrudes toward the side close to the image sensor 11. In this way, the second surface 1212 (the convex lens surface) can guide the light modulated by the phase microstructure 122 to the image sensor 11, which helps the image sensor 11 receive light and improves the detection accuracy of the depth camera 100 (shown in fig. 11). Referring to fig. 5 and 10, in some embodiments the phase microstructure 122 is disposed on the second surface 1212 of the substrate 121, and the first surface 1211 of the substrate 121 is a convex lens surface; specifically, the first surface 1211 protrudes toward the side away from the image sensor 11. In this way, the first surface 1211 (the convex lens surface) can converge and guide the light to the phase microstructure 122, which then modulates the light before it exits toward the image sensor 11, helping the image sensor 11 receive light and improving the detection accuracy of the depth camera 100 (shown in fig. 11). Since one surface of the phase lens 12 carries the phase microstructure 122 and the other is a convex lens surface, the illuminance of the light reaching the image sensor 11 can be increased while stray light and the distortion of the light received by the image sensor 11 are reduced.
Referring to fig. 1 to 5, in some embodiments the phase lens 12 is a planar phase lens and the phase microstructure 122 includes nano-microstructures. Because a planar phase lens is easier to manufacture than a fresnel lens, using a planar phase lens for the phase lens 12 in this embodiment reduces the processing difficulty of the phase lens 12, and hence of the light receiving module 10, compared with a fresnel lens. In one example, as shown in fig. 2, the phase microstructure 122 is disposed on the second surface 1212 of the substrate 121, and the density of the phase microstructure 122 gradually decreases from the center of the second surface 1212 toward its edge; that is, the nano-microstructures are arranged more densely near the center of the second surface 1212 and more sparsely near its edge. Of course, in other embodiments the nano-microstructures may be arranged on the substrate 121 in other ways, which is not limited herein.
Referring to fig. 6 to 10, in some embodiments the phase lens 12 is a fresnel lens and the phase microstructure 122 includes an annular fresnel microstructure. Because a fresnel lens is easier to design than a planar phase lens, using a fresnel lens for the phase lens 12 in this embodiment reduces the design difficulty of the phase lens 12, and hence of the light receiving module 10, compared with a planar phase lens.
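As a rough illustration of how an annular fresnel microstructure is laid out, the radii at which the lens phase completes whole numbers of waves can be computed from the standard fresnel-zone relation. The 2 mm focal length and 940 nm design wavelength below are assumed example values, not parameters stated in the patent.

```python
import math

def fresnel_zone_radii(focal_length_m: float, wavelength_m: float, zones: int):
    """Radii at which the path difference to the focus reaches n full
    waves: sqrt((f + n*lam)^2 - f^2) = sqrt(2*n*lam*f + (n*lam)^2)."""
    return [
        math.sqrt(2 * n * wavelength_m * focal_length_m
                  + (n * wavelength_m) ** 2)
        for n in range(1, zones + 1)
    ]

# Example: a 2 mm focal length lens at the 940 nm ToF wavelength.
# The annular zones get progressively narrower toward the edge.
radii = fresnel_zone_radii(2e-3, 940e-9, 5)
print([f"{r * 1e6:.1f} um" for r in radii])
```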
Referring to fig. 1, in some embodiments the light receiving module 10 may further include an optical filter 13 disposed between the phase lens 12 and the image sensor 11; that is, the phase lens 12, the optical filter 13, and the image sensor 11 are arranged in sequence in the direction from the object side to the image side of the light receiving module 10. The optical filter 13 filters out light outside a predetermined wavelength range, so that after the filter 13 receives the light modulated by the phase lens 12, only light within the predetermined wavelength range passes through to the image sensor 11. It should be noted that, in some embodiments, the predetermined wavelength range is 920 nm to 960 nm. In particular, in some embodiments the optical filter 13 transmits light with a wavelength of 940 nm, so that light with a wavelength of 940 nm can reach the image sensor 11.
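The filtering behaviour described above can be modelled as an idealised band-pass check over the stated 920 nm to 960 nm range. The function name is illustrative, and a real filter of course has a continuous transmission curve rather than a hard cutoff.

```python
def passes_filter(wavelength_nm: float,
                  band_nm: tuple = (920.0, 960.0)) -> bool:
    """Idealised model of the band-pass filter: only light inside the
    predetermined wavelength range reaches the image sensor."""
    low, high = band_nm
    return low <= wavelength_nm <= high

print(passes_filter(940.0))  # ToF laser wavelength -> True
print(passes_filter(550.0))  # visible ambient light -> False
```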
Referring to fig. 1, in some embodiments the light receiving module 10 may further include a lens barrel 14 and a light-transmitting cover plate 15. Specifically, the lens barrel 14 includes a first opening 141 and a second opening 142 opposite to each other, and the phase lens 12 is mounted in the lens barrel 14. The second opening 142 is closer to the image sensor 11 than the first opening 141, and the cover plate 15 is mounted at the first opening 141 of the lens barrel 14. This protects the phase lens 12 from damage and prevents dust and liquid from reaching the phase lens 12 and affecting the normal operation of the light receiving module 10. In particular, the cover plate 15 blocks the first opening 141, further preventing dust and liquid from reaching the phase lens 12. It should be noted that, if the light receiving module 10 includes the optical filter 13, the optical filter 13 is located at the second opening 142 of the lens barrel 14.
Referring to fig. 11, the present application further provides a depth camera 100. The depth camera 100 includes a light emitting module 20 and the light receiving module 10 described in any of the above embodiments. The light emitting module 20 emits light, and the light receiving module 10 receives at least part of the light reflected back by the object and forms an electrical signal. The depth camera 100 obtains depth information of the object from the electrical signal formed by the light receiving module 10.
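The patent does not specify how the electrical signal is converted into depth. A common indirect-ToF scheme samples the correlation between emitted and received light at four phase offsets and recovers depth from the resulting phase; the sketch below follows that assumption, and the 60 MHz modulation frequency is an example value.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def itof_depth(q0: float, q90: float, q180: float, q270: float,
               mod_freq_hz: float) -> float:
    """Depth from four phase-stepped samples of the correlation signal
    (a common indirect-ToF scheme; the patent itself does not fix one)."""
    phase = math.atan2(q90 - q270, q0 - q180) % (2 * math.pi)
    return C * phase / (4 * math.pi * mod_freq_hz)

# Synthetic samples for a target at 1.0 m with 60 MHz modulation.
f = 60e6
true_phase = 4 * math.pi * f * 1.0 / C
samples = [math.cos(true_phase - off)
           for off in (0.0, math.pi / 2, math.pi, 3 * math.pi / 2)]
print(itof_depth(*samples, mod_freq_hz=f))  # ≈ 1.0
```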
By replacing the conventional refractive lens group in the light receiving module 10 with a phase lens 12 capable of adjusting the phase of light, the depth camera 100 of this application reduces the volume of the light receiving module 10 and increases the illuminance of the light reaching the image sensor 11, which helps the image sensor 11 receive light and improves the detection precision of the depth camera 100.
Referring to fig. 12, the present application further provides a terminal 1000. The terminal 1000 includes a housing 200 and the depth camera 100 described in any of the above embodiments, with the depth camera 100 combined with the housing 200. It should be noted that the terminal 1000 may be a mobile phone, a computer, a tablet computer, a smart watch, a smart wearable device, and the like, which is not limited herein.
By replacing the conventional refractive lens group in the light receiving module 10 with a phase lens 12 capable of adjusting the phase of light, the terminal 1000 of this application reduces the volume of the light receiving module 10 and increases the illuminance of the light reaching the image sensor 11, which helps the image sensor 11 receive light and improves the detection precision of the depth camera 100.
In the description herein, reference to the description of the terms "certain embodiments," "one embodiment," "some embodiments," "illustrative embodiments," "examples," "specific examples," or "some examples" means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "a plurality" means at least two, e.g., two or three, unless specifically limited otherwise.
Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations of the above embodiments may be made by those of ordinary skill in the art within the scope of the present application, which is defined by the claims and their equivalents.

Claims (11)

1. A light receiving module, comprising:
an image sensor; and
a phase lens, wherein the phase lens and the image sensor are sequentially arranged in a direction from an object side to an image side of the light receiving module, the phase lens is used for adjusting a phase of light emitted from the phase lens to the image sensor, and the image sensor is used for receiving the light to acquire depth data.
2. The light receiving module as claimed in claim 1, wherein the relative illumination of the light reaching the image sensor is greater than or equal to 98%.
3. The light receiving module of claim 1, further comprising a filter, wherein the phase lens, the filter, and the image sensor are arranged in sequence from an object side to an image side of the light receiving module, and the filter is configured to filter light outside a predetermined wavelength range.
4. The light receiving module of claim 1, further comprising:
a lens barrel, wherein the phase lens is mounted in the lens barrel, the lens barrel comprises a first opening and a second opening opposite to each other, and the second opening is closer to the image sensor than the first opening; and
a light-transmitting cover plate mounted at the first opening of the lens barrel.
5. The light receiving module as claimed in any one of claims 1 to 4, wherein the phase lens comprises:
a substrate; and
a phase microstructure disposed on the substrate, the phase microstructure being used for adjusting the phase of light emitted from the phase lens to the image sensor.
6. The light receiving module of claim 5, wherein the substrate comprises a first side and a second side opposite to each other, the first side is further away from the image sensor than the second side, and the phase microstructure is disposed on the first side and/or the second side.
7. The light receiving module as claimed in claim 5, wherein the substrate comprises a first surface and a second surface opposite to each other, the first surface is further away from the image sensor than the second surface, one of the first surface and the second surface is provided with the phase microstructure, and the other surface is a refractive lens surface.
8. The light-receiving module of claim 7, wherein the first surface is provided with the phase microstructure, and the second surface is a convex lens surface; or
The second surface is provided with the phase microstructure, and the first surface is a convex lens surface.
9. The light receiving module of claim 5, wherein the phase lens is a planar phase lens, and the phase microstructure comprises a nano-microstructure; or
The phase type lens is a Fresnel lens, and the phase microstructure comprises an annular Fresnel microstructure.
10. A depth camera, comprising:
a light emitting module for emitting light; and
the light receiving module as claimed in any one of claims 1-9, for receiving at least a portion of the light reflected back from the object and forming an electrical signal.
11. A terminal, comprising:
a housing; and
the depth camera of claim 10, in combination with the housing.
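Claims 5 to 9 describe a phase microstructure (sub-wavelength nano-structures or annular Fresnel zones) that imposes a focusing phase on incoming light. As context only, and not taken from the patent: the target phase of an ideal flat lens of focal length f is commonly written as φ(r) = −(2π/λ)(√(r² + f²) − f), wrapped to one 2π period. A minimal sketch (function name and the 940 nm example wavelength are illustrative assumptions):

```python
import math

def flat_lens_phase(r_mm: float, focal_mm: float, wavelength_nm: float) -> float:
    """Target phase (radians, wrapped to [0, 2*pi)) at radius r for an
    ideal flat phase lens:  phi(r) = -(2*pi/lam) * (sqrt(r^2 + f^2) - f).
    A Fresnel lens realizes the wrapped profile with annular zones; a
    planar phase lens samples it with sub-wavelength structures."""
    lam_mm = wavelength_nm * 1e-6  # nm -> mm
    phi = -(2 * math.pi / lam_mm) * (math.sqrt(r_mm**2 + focal_mm**2) - focal_mm)
    return phi % (2 * math.pi)
```

At the optical axis (r = 0) the wrapped phase is zero, and it varies increasingly quickly toward the lens edge, which is why outer Fresnel zones become progressively narrower.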
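The depth camera of claim 10 emits light and converts the reflected portion into an electrical signal. In indirect time-of-flight sensing — one common scheme for such modules, though the patent does not fix a particular one — depth is recovered from the phase delay of an amplitude-modulated signal, typically estimated from four correlation samples taken at 0°, 90°, 180° and 270° reference shifts. A minimal sketch (function name and sign convention are assumptions; real sensors differ in sample ordering):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def itof_depth(q0: float, q90: float, q180: float, q270: float,
               mod_freq_hz: float) -> float:
    """4-phase indirect time-of-flight depth estimate.

    q0..q270 are correlation samples of the returned signal against
    reference waveforms shifted by 0/90/180/270 degrees; the constant
    ambient offset cancels in the differences below.
    """
    # Phase delay of the round trip, wrapped to [0, 2*pi)
    phase = math.atan2(q90 - q270, q0 - q180) % (2 * math.pi)
    # 2*pi of phase is one modulation period of round-trip travel,
    # i.e. half a modulation wavelength of one-way distance.
    return (C * phase) / (4 * math.pi * mod_freq_hz)
```

For example, samples (2, 3, 2, 1) encode a 90° delay; at a 100 MHz modulation frequency that maps to roughly 0.375 m, with an unambiguous range of c/(2f) ≈ 1.5 m.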
CN202121842408.5U 2021-08-06 2021-08-06 Light receiving module, depth camera and terminal Active CN216083081U (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202121842408.5U CN216083081U (en) 2021-08-06 2021-08-06 Light receiving module, depth camera and terminal
PCT/CN2022/098857 WO2023011009A1 (en) 2021-08-06 2022-06-15 Optical receiving module, depth camera and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202121842408.5U CN216083081U (en) 2021-08-06 2021-08-06 Light receiving module, depth camera and terminal

Publications (1)

Publication Number Publication Date
CN216083081U true CN216083081U (en) 2022-03-18

Family

ID=80668203

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202121842408.5U Active CN216083081U (en) 2021-08-06 2021-08-06 Light receiving module, depth camera and terminal

Country Status (2)

Country Link
CN (1) CN216083081U (en)
WO (1) WO2023011009A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023011009A1 (en) * 2021-08-06 2023-02-09 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Optical receiving module, depth camera and terminal

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130038035A (en) * 2011-10-07 2013-04-17 삼성전자주식회사 Image sensor
US10802117B2 (en) * 2018-01-24 2020-10-13 Facebook Technologies, Llc Systems and methods for optical demodulation in a depth-sensing device
US10735640B2 (en) * 2018-02-08 2020-08-04 Facebook Technologies, Llc Systems and methods for enhanced optical sensor devices
US10805594B2 (en) * 2018-02-08 2020-10-13 Facebook Technologies, Llc Systems and methods for enhanced depth sensor devices
CN111694161A (en) * 2020-06-05 2020-09-22 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Light emitting module, depth camera and electronic equipment
CN112945141B (en) * 2021-01-29 2023-03-14 North University of China Structured light rapid imaging method and system based on micro-lens array
CN216083081U (en) * 2021-08-06 2022-03-18 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Light receiving module, depth camera and terminal

Also Published As

Publication number Publication date
WO2023011009A1 (en) 2023-02-09

Similar Documents

Publication Publication Date Title
KR200492211Y1 (en) Near-infrared imaging lens
KR20120068177A (en) Lens assembly
CN110945527B (en) Fingerprint identification device and electronic equipment
US20180128943A1 (en) Imaging device provided with lens having moth-eye structure
CN216083081U (en) Light receiving module, depth camera and terminal
CN215499250U (en) Camera module and electronic equipment
WO2021213218A1 (en) Periscopic photographing module, multi-camera photographing module, and electronic device
US12041333B2 (en) Camera module including refractive member and electronic device including refractive member
CN111522186A (en) Lens barrel
US20210063674A1 (en) Lens module, optical lens, and electronic device
CN110940282B (en) Dual-wavelength laser receiving optical system and laser ranging receiving device
CN102651065A (en) Optical fingerprint collection device
WO2022068678A1 (en) Photographing apparatus and electronic device
CN114815138B (en) Imaging lens group and optical identification system
CN213957737U (en) Light detector based on telecentric lens
CN221827157U (en) Polarization receiving module and depth camera
CN221507289U (en) Combined shading component of optical prism for periscope type lens
CN220105399U (en) Optical lens and laser radar
CN218350606U (en) Optical sensing system
CN113746549B (en) Optical signal receiving multiplexing system
KR100643464B1 (en) Camera module
CN219916060U (en) Lens system, imaging module and TOF depth camera
CN220153512U (en) Deviation correcting sensor
CN211506525U (en) Optical fingerprint identification device and electronic equipment
US11163090B2 (en) Photoelectric sensor with coaxial emission and receiving optical paths

Legal Events

Date Code Title Description
GR01 Patent grant