WO2023011009A1 - Light receiving module, depth camera and terminal - Google Patents

Light receiving module, depth camera and terminal

Info

Publication number
WO2023011009A1
Authority
WO
WIPO (PCT)
Prior art keywords
phase
lens
image sensor
light
receiving module
Prior art date
Application number
PCT/CN2022/098857
Other languages
English (en)
French (fr)
Inventor
刘海亮
Original Assignee
Oppo广东移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司 filed Critical Oppo广东移动通信有限公司
Publication of WO2023011009A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/08 Systems determining position data of a target for measuring distance only
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/894 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B13/00 Optical objectives specially designed for the purposes specified below
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B30/00 Camera modules comprising integrated lens units and imaging units, specially adapted for being embedded in other devices, e.g. mobile phones or vehicles

Definitions

  • the present application relates to the technical field of distance measurement, and more specifically, to a light receiving module, a depth camera and a terminal.
  • Time of flight (ToF) technology calculates the distance between an object and a sensor by measuring the time difference between an emitted signal and the signal reflected back by the object.
  • a typical TOF structure includes a transmitter module (Tx) and a receiver module (Rx).
  • In the transmitter module, the laser light emitted by the light source passes through a collimating lens and a diffractive optical element (Diffractive Optical Element, DOE), or through a collimating lens and a diffuser, and is projected onto the object in the form of speckle or flood light; the diffusely reflected speckle or flood light is received by the receiver module to complete the collection of the depth signal.
  • the receiver module places high demands on the signal-to-noise ratio of the received signal: if the image sensor used in the TOF receiver module performed illumination compensation, the signal-to-noise ratio and the signal dynamic range would be reduced. Therefore, image sensors currently used in TOF receiver modules generally lack the illumination-compensation gain function of conventional image sensors, so a high relative illumination (Relative Illumination, RI) can only be required of the optical hardware.
  • the collimating lens in the receiver module is usually a conventional refractive lens, which is limited by the refraction characteristics of its curved surfaces; relative illuminance is lost at the image plane, which reduces the sensing distance and accuracy.
  • Embodiments of the present application provide a light receiving module, a depth camera, and a terminal.
  • Embodiments of the present application provide a light receiving module.
  • the light receiving module includes an image sensor and a phase lens.
  • in the direction from the object side to the image side of the light receiving module, the phase lens and the image sensor are arranged in sequence; the phase lens is used to adjust the phase of the light emitted from the phase lens to the image sensor, and the image sensor is used to receive the light to obtain depth data.
  • the relative illuminance of the light reaching the image sensor is greater than or equal to 98%.
  • the light receiving module further includes an optical filter, and in the direction from the object side to the image side of the light receiving module, the phase lens, the optical filter, and the image sensor are arranged in sequence; the optical filter is used to filter out light outside a predetermined wavelength range.
  • the light receiving module further includes a lens barrel and a light-transmitting cover plate.
  • the phase lens is installed in the lens barrel, which includes a first opening and a second opening opposite to each other; the second opening is closer to the image sensor than the first opening.
  • the cover plate is installed at the first opening of the lens barrel.
  • the phase lens includes a substrate and a phase microstructure disposed on the substrate.
  • the phase microstructure is used to adjust the phase of the light emitted from the phase lens to the image sensor.
  • the substrate includes a first surface and a second surface opposite to each other, the first surface is farther away from the image sensor than the second surface, and the phase microstructure is disposed on the first surface and/or the second surface.
  • the substrate includes a first surface and a second surface opposite to each other, and the first surface is farther away from the image sensor than the second surface. One of the first surface and the second surface is provided with the phase microstructure, and the other is a refractive lens surface.
  • the first surface is provided with the phase microstructure, and the second surface is a convex lens surface; or, the second surface is provided with the phase microstructure, and the first surface is a convex lens surface.
  • the phase lens is a planar phase lens, and the phase microstructure includes a nano-microstructure; or, the phase lens is a Fresnel lens, and the phase microstructure includes an annular Fresnel microstructure.
  • the embodiments of the present application also provide a depth camera.
  • the depth camera includes a light emitting module and the light receiving module described in any one of the above embodiments.
  • the light emitting module is used to emit light
  • the light receiving module is used to receive at least part of the light reflected by the object and form an electrical signal.
  • the embodiments of the present application also provide a terminal.
  • the terminal includes a casing and the depth camera described in the above embodiments.
  • the depth camera is combined with the casing.
  • FIGS. 1 to 10 are schematic structural views of light receiving modules in some embodiments of the present application.
  • Fig. 11 is a schematic structural diagram of a depth camera in some embodiments of the present application.
  • Fig. 12 is a schematic structural diagram of a terminal in some embodiments of the present application.
  • a first feature being "on" or "under" a second feature may mean that the first and second features are in direct contact, or that the first and second features are in indirect contact through an intermediary.
  • a first feature being "above", "over" or "on top of" a second feature may mean that the first feature is directly above or obliquely above the second feature, or simply that the first feature is at a higher level than the second feature.
  • a first feature being "below", "under" or "beneath" a second feature may mean that the first feature is directly below or obliquely below the second feature, or simply that the first feature is at a lower level than the second feature.
  • Embodiments of the present application provide a light receiving module.
  • the light receiving module includes an image sensor and a phase lens.
  • the phase lens and the image sensor are arranged in sequence, the phase lens is used to adjust the phase of the light emitted from the phase lens to the image sensor, and the image sensor is used to receive the light and Get depth data.
  • the relative illuminance of light reaching the image sensor is greater than or equal to 98%.
  • the light-receiving module further includes a filter, and in the direction from the object side to the image side of the light-receiving module, the phase lens, the filter, and the image sensor are arranged in sequence; the filter is used to filter out light outside a predetermined wavelength range.
  • the light receiving module further includes a lens barrel and a light-transmitting cover plate.
  • the phase lens is installed in the lens barrel, which includes a first opening and a second opening opposite to each other. The second opening is closer to the image sensor than the first opening.
  • the cover plate is installed at the first opening of the lens barrel.
  • the phase lens includes a substrate and a phase microstructure, the phase microstructure is disposed on the substrate, and the phase microstructure is used to adjust the phase of light emitted from the phase lens to the image sensor.
  • the substrate includes a first surface and a second surface opposite to each other, the first surface is farther away from the image sensor than the second surface, and the phase microstructure is disposed on the first surface and/or the second surface.
  • the substrate includes a first surface and a second surface opposite to each other, the first surface is farther away from the image sensor than the second surface, and one of the first surface and the second surface is provided with a phase microstructure, while the other is a refractive lens surface.
  • the first surface is provided with a phase microstructure, and the second surface is a convex lens surface; or the second surface is provided with a phase microstructure, and the first surface is a convex lens surface.
  • the phase lens is a planar phase lens, and the phase microstructures include nano-microstructures; or the phase lens is a Fresnel lens, and the phase microstructures include annular Fresnel microstructures.
  • Embodiments of the present application provide a depth camera.
  • the depth camera includes a light emitting module and a light receiving module.
  • the light emitting module is used to emit light.
  • the light receiving module is used to receive at least part of the light reflected by the object and form an electrical signal.
  • the light receiving module includes an image sensor and a phase lens. In the direction from the object side to the image side of the light receiving module, the phase lens and the image sensor are arranged in sequence; the phase lens is used to adjust the phase of the light emitted from the phase lens to the image sensor, and the image sensor is used to receive the light to obtain depth data.
  • the relative illuminance of light reaching the image sensor is greater than or equal to 98%.
  • the light-receiving module further includes a filter, and in the direction from the object side to the image side of the light-receiving module, the phase lens, the filter, and the image sensor are arranged in sequence; the filter is used to filter out light outside a predetermined wavelength range.
  • the light receiving module further includes a lens barrel and a light-transmitting cover plate.
  • the phase lens is installed in the lens barrel, which includes a first opening and a second opening opposite to each other. The second opening is closer to the image sensor than the first opening.
  • the cover plate is installed at the first opening of the lens barrel.
  • the phase lens includes a substrate and a phase microstructure, the phase microstructure is disposed on the substrate, and the phase microstructure is used to adjust the phase of light emitted from the phase lens to the image sensor.
  • the substrate includes a first surface and a second surface opposite to each other, the first surface is farther away from the image sensor than the second surface, and the phase microstructure is disposed on the first surface and/or the second surface.
  • the substrate includes a first surface and a second surface opposite to each other, the first surface is farther away from the image sensor than the second surface, and one of the first surface and the second surface is provided with a phase microstructure, while the other is a refractive lens surface.
  • the first surface is provided with a phase microstructure, and the second surface is a convex lens surface; or the second surface is provided with a phase microstructure, and the first surface is a convex lens surface.
  • the phase lens is a planar phase lens, and the phase microstructures include nano-microstructures.
  • the phase lens is a Fresnel lens, and the phase microstructure includes an annular Fresnel microstructure.
  • An embodiment of the present application provides a terminal.
  • the terminal includes a casing and the depth camera described in any one of the above implementation manners.
  • the depth camera is combined with the housing.
  • Time of flight (ToF) technology calculates the distance between an object and a sensor by measuring the time difference between an emitted signal and the signal reflected back by the object.
  • a typical TOF structure includes a transmitter module (Tx) and a receiver module (Rx).
  • in the transmitter module, the laser light emitted by the light source passes through a collimating lens and a diffractive optical element (Diffractive Optical Element, DOE), or through a collimating lens and a diffuser, and is projected onto the object in the form of speckle or flood light; the diffusely reflected speckle or flood light is received by the receiver module to complete the collection of the depth signal.
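The direct time-of-flight relation described above can be sketched in a few lines. This is an illustrative example, not part of the patent: distance is half the round-trip delay multiplied by the speed of light.

```python
# Illustrative sketch of direct time-of-flight ranging (not from the patent):
# the sensor measures the round-trip delay dt between emission and the
# reflected return, and distance follows from d = c * dt / 2.

C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance_m(round_trip_delay_s: float) -> float:
    """Distance to the object given the measured round-trip delay."""
    return C * round_trip_delay_s / 2.0

# A round trip of about 6.67 ns corresponds to roughly 1 m of range.
print(tof_distance_m(6.67e-9))  # ~1.0 m
```

The nanosecond-scale delays involved are why the receiver's signal-to-noise ratio, discussed below, matters so much for ranging accuracy.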
  • at present, the receiver module in time-of-flight systems usually adopts a conventional refractive lens group.
  • however, a refractive lens is limited by the refraction characteristics of its own curved surfaces, and relative illuminance is lost at the image plane; that is, the illuminance of the light reaching the sensor after passing through the refractive lens group suffers a large loss, which affects the measurement accuracy.
  • the light receiving module 10 includes an image sensor 11 and a phase lens 12 .
  • in the direction from the object side to the image side of the light receiving module 10, the phase lens 12 and the image sensor 11 are arranged in sequence; the phase lens 12 is used to adjust the phase of the light emitted from the phase lens 12 to the image sensor 11, and the image sensor 11 is used to receive light to obtain depth data.
  • along the incident direction of the light (the dotted line with the arrow in FIG. 1 indicates the incident direction of the light), the object side and the image side are arranged in sequence.
  • the light-receiving module 10 in this application replaces the traditional refractive lens group with a phase lens 12 capable of adjusting the phase of light, which can reduce the volume of the light-receiving module 10 and increase the illuminance of the light reaching the image sensor 11; this is beneficial for the image sensor to receive light and improves the detection accuracy of the depth camera 100 (shown in FIG. 11).
  • the number of phase lenses 12 provided in the light receiving module 10 may be one or more, which is not limited here.
  • the relative illuminance of the light passing through the phase lens 12 and reaching the image sensor 11 is greater than or equal to 98%.
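For context on the ≥98% figure: relative illumination is the image-plane illuminance at a field point divided by the illuminance at the image center. As an illustrative comparison not taken from the patent, an idealized thin refractive lens roughly follows the cos⁴ falloff law, which shows why a conventional refractive receiver lens struggles to hold relative illumination near 100% at wide field angles.

```python
# Illustrative sketch (assumption, not from the patent): the cos^4 law gives a
# rough relative-illumination estimate for a simple refractive lens, for
# comparison against the >= 98% figure claimed for the phase lens.
import math

def ri_cos4(field_angle_deg: float) -> float:
    """cos^4 relative-illumination estimate at a given field angle."""
    return math.cos(math.radians(field_angle_deg)) ** 4

print(f"{ri_cos4(10):.2f}")  # ~0.94 at 10 degrees
print(f"{ri_cos4(30):.2f}")  # ~0.56 at 30 degrees, far below 98%
```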
  • the phase lens 12 includes a substrate 121 and a phase microstructure 122 disposed on the substrate 121 .
  • the phase microstructure 122 is used to adjust the phase of the light emitted from the phase lens 12 to the image sensor 11 .
  • the substrate 121 includes a first surface 1211 and a second surface 1212 opposite to each other, and the first surface 1211 is farther away from the image sensor 11 than the second surface 1212 .
  • the phase microstructure 122 is disposed on the first surface 1211 and/or the second surface 1212 .
  • the phase microstructure 122 is disposed on the first surface 1211 of the substrate 121 .
  • the light can be directly incident on the phase microstructure 122 , and after being adjusted by the phase microstructure 122 , the light passes through the substrate 121 and exits to the image sensor 11 .
  • because the phase microstructure 122 adjusts the phase of the light emitted from the phase lens 12 to the image sensor 11, the illuminance of the light reaching the image sensor 11 can be improved compared with light that passes directly through a refractive lens and then exits to the sensor.
  • in other embodiments, the phase microstructure 122 is disposed on the second surface 1212 of the substrate 121, so that the light first passes through the substrate 121 and then reaches the phase microstructure 122; that is, the light is prevented from being directly incident on the phase microstructure 122. Strong light directly incident on the phase microstructure 122 may cause glare and relatively strong stray light, which is not conducive to the image sensor 11 receiving light and thus affects the detection accuracy of the depth camera 100 (shown in FIG. 11).
  • when the phase microstructure 122 is arranged on the second surface 1212 of the substrate 121, which is closer to the image sensor 11, compared with arranging the phase microstructure 122 on the first surface 1211, which is farther from the image sensor 11, glare can be avoided and stray light reduced while the illuminance of the light reaching the image sensor 11 is increased; this is beneficial for the image sensor 11 to receive light, so as to improve the detection accuracy of the depth camera 100 (shown in FIG. 11).
  • phase microstructures 122 may be disposed on both surfaces of the substrate 121 . That is to say, the phase microstructure 122 is disposed on the first surface 1211 and the second surface 1212 of the substrate 121 . In this way, the phase of the light emitted from the phase lens 12 to the image sensor 11 can be adjusted multiple times without disposing a plurality of phase lenses 12 .
  • compared with a phase lens 12 having the phase microstructure 122 on only one side, a phase lens 12 with phase microstructures 122 on both sides can further increase the illuminance of light reaching the image sensor 11 and reduce the distortion of the light received by the image sensor 11.
  • the shape, quantity and arrangement of the phase microstructures 122 disposed on opposite sides of the substrate 121 may be completely the same or different, which is not limited here.
  • in other embodiments, one side of the substrate 121 of the phase lens 12 is provided with the phase microstructure 122, and the other side (that is, the side without the phase microstructure 122) is a refractive lens surface; that is, the other side has a curvature and can refract light, functioning like the surface of a refractive lens.
  • since one side of the phase lens 12 is provided with a phase microstructure 122 and the other side is a refractive lens surface, stray light and the distortion of the light received by the image sensor 11 can be reduced while the illuminance of the light reaching the image sensor 11 is further increased.
  • in some embodiments, the first surface 1211 of the substrate 121 is provided with the phase microstructure 122, and the second surface 1212 of the substrate 121 is a convex lens surface.
  • the second surface 1212 of the substrate 121 protrudes toward a side close to the image sensor 11 .
  • the second surface 1212 can guide the light adjusted by the phase microstructure 122 to the image sensor 11, which is beneficial for the image sensor 11 to receive light, so as to improve the detection accuracy of the depth camera 100 (shown in FIG. 11 ).
  • in other embodiments, the second surface 1212 of the substrate 121 is provided with the phase microstructure 122, and the first surface 1211 of the substrate 121 is a convex lens surface.
  • the first surface 1211 of the substrate 121 protrudes toward a side away from the image sensor 11 .
  • the first surface 1211 (a convex lens surface) can converge and guide the light to the phase microstructure 122; the light is then adjusted by the phase microstructure 122 and output to the image sensor 11, which is beneficial for the image sensor 11 to receive light and improves the detection accuracy of the depth camera 100 (shown in FIG. 11). Since one side of the phase lens 12 has the phase microstructure 122 and the other side is a convex lens surface, stray light and the distortion of the light received by the image sensor 11 can be reduced while the illuminance of the light reaching the image sensor 11 is further increased.
  • in some embodiments, the phase lens 12 is a planar phase lens, and the phase microstructures 122 include nano-microstructures. Since a planar phase lens is easier to manufacture than a Fresnel lens, using a planar phase lens for the phase lens 12 in this embodiment reduces the processing difficulty of the phase lens 12 compared with using a Fresnel lens, and thereby reduces the processing difficulty of the light receiving module 10.
  • in some embodiments, the phase lens 12 is a planar phase lens and the phase microstructure 122 includes nano-microstructures. The nano-microstructures are arranged on the second surface 1212 of the substrate 121, and from the center of the second surface 1212 toward the edge of the second surface 1212, the density of the nano-microstructures gradually decreases; that is, the closer to the center of the second surface 1212, the denser the nano-microstructures are arranged, and the closer to the edge of the second surface 1212, the sparser the nano-microstructures are arranged.
  • the nano-microstructures may also have other arrangements on the substrate 121, which is not limited here.
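The patent does not give an explicit phase function for the planar phase lens. As a hedged illustration only, a common target profile for a flat focusing phase lens (metalens) in the literature is the hyperbolic profile; each nano-microstructure would then be chosen to realize the required local phase (modulo 2π) at its radius. The focal length and wavelength below are example values, not figures from the patent.

```python
# Illustrative sketch (assumption, not from the patent): a flat lens focusing
# at distance f imposes the hyperbolic target phase
#     phi(r) = (2*pi/lam) * (f - sqrt(r**2 + f**2))
# at radius r from the optical axis; nano-microstructures sample this profile.
import math

def target_phase(r_m: float, f_m: float, lam_m: float) -> float:
    """Required phase delay (radians) at radius r for focal length f."""
    return (2 * math.pi / lam_m) * (f_m - math.sqrt(r_m**2 + f_m**2))

lam, f = 940e-9, 2e-3  # 940 nm light, 2 mm focal length (example values)
for r in (0.0, 0.2e-3, 0.4e-3):
    print(f"r = {r*1e3:.1f} mm -> phase = {target_phase(r, f, lam):.1f} rad")
```

The phase delay is zero on-axis and grows in magnitude toward the edge, which is what the lateral variation of the microstructures encodes.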
  • in other embodiments, the phase lens 12 is a Fresnel lens, and the phase microstructure 122 includes an annular Fresnel microstructure. Since a Fresnel lens is currently easier to design than a planar phase lens, using a Fresnel lens for the phase lens 12 in this embodiment reduces the design difficulty of the phase lens 12 compared with a planar phase lens, and thereby reduces the design difficulty of the light receiving module 10.
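As a hedged illustration of where the annular zones of such a Fresnel lens could sit (the patent gives no dimensions), the zone boundaries can be placed where the lens phase crosses whole multiples of 2π; solving (2π/λ)(√(r² + f²) − f) = 2πm gives r_m = √(2mλf + (mλ)²). The wavelength and focal length below are example values.

```python
# Illustrative sketch (assumption, not from the patent): radii of the annular
# Fresnel zone boundaries, derived from the standard lens phase profile by
# resetting the phase every 2*pi:  r_m = sqrt(2*m*lam*f + (m*lam)**2).
import math

def fresnel_zone_radius(m: int, f_m: float, lam_m: float) -> float:
    """Radius of the m-th annular zone boundary for focal length f."""
    return math.sqrt(2 * m * lam_m * f_m + (m * lam_m) ** 2)

lam, f = 940e-9, 2e-3  # 940 nm, 2 mm focal length (example values)
for m in (1, 2, 3):
    print(f"zone {m}: r = {fresnel_zone_radius(m, f, lam)*1e6:.1f} um")
```

The zones get progressively narrower toward the edge, which is the annular counterpart of the center-dense nano-microstructure arrangement described for the planar phase lens.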
  • the light receiving module 10 may further include a filter 13 disposed between the phase lens 12 and the image sensor 11 . That is to say, in the direction from the object side to the image side of the light receiving module 10 , the phase lens 12 , the filter 13 and the image sensor 11 are arranged in sequence.
  • the optical filter 13 is used to filter out light outside a predetermined wavelength range; after the optical filter 13 receives the light adjusted by the phase lens 12, it filters out the light outside the predetermined wavelength range, and only light within the predetermined wavelength range can pass through the filter 13 and exit to the image sensor 11.
  • the predetermined wavelength range is 920nm-960nm.
  • the filter 13 can transmit light with a wavelength of 940 nm, that is, the light with a wavelength of 940 nm can exit to the image sensor 11 .
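The filter's role can be modeled as an ideal band-pass window. This is a simplified sketch: only the 920 nm to 960 nm pass band is stated in the text; treating transmission as all-or-nothing inside that window is an assumption for illustration.

```python
# Illustrative sketch (idealized, not from the patent beyond the stated
# 920-960 nm window): an all-or-nothing band-pass model of filter 13, which
# passes the 940 nm ToF laser line and blocks out-of-band ambient light.

PASS_LO_NM, PASS_HI_NM = 920.0, 960.0

def passes_filter(wavelength_nm: float) -> bool:
    """True if the wavelength lies inside the predetermined pass band."""
    return PASS_LO_NM <= wavelength_nm <= PASS_HI_NM

print(passes_filter(940.0))  # True: the ToF laser wavelength is transmitted
print(passes_filter(550.0))  # False: visible ambient light is blocked
```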
  • the light receiving module 10 may further include a lens barrel 14 and a transparent cover 15 .
  • the lens barrel 14 includes opposite first openings 141 and second openings 142 , and the phase lens 12 is installed in the lens barrel 14 .
  • the second opening 142 is closer to the image sensor 11 than the first opening 141
  • the cover plate 15 is mounted on the first opening 141 of the lens barrel 14 .
  • This can protect the phase lens 12 from being easily damaged, and prevent dust and liquid from entering the phase lens 12 and affecting the normal operation of the light receiving module 10 .
  • the cover plate 15 can block the first opening 141 , so as to further prevent dust and liquid from entering the phase lens 12 .
  • the light receiving module 10 includes a filter 13
  • the filter 13 is located at the second opening 142 of the lens barrel 14 .
  • the embodiment of the present application also provides a depth camera 100 .
  • the depth camera 100 includes a light emitting module 20 and the light receiving module 10 described in any one of the above embodiments.
  • the light emitting module 20 is used to emit light
  • the light receiving module 10 is used to receive at least part of the light reflected back by the object and form an electrical signal.
  • the depth camera 100 obtains the depth information of the object according to the electrical signal formed by the light receiving module 10 .
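The patent does not specify how the electrical signal is converted to depth. As a hedged illustration, the common continuous-wave indirect-ToF scheme recovers depth from the phase shift between the emitted and received modulated signals, d = c·Δφ / (4π·f_mod); the 100 MHz modulation frequency below is an example value, not from the patent.

```python
# Illustrative sketch (assumption: the patent does not name a modulation
# scheme; this shows the standard continuous-wave indirect-ToF relation):
# depth follows from the measured phase shift dphi at modulation frequency
# f_mod via  d = c * dphi / (4 * pi * f_mod).
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def itof_depth_m(phase_shift_rad: float, f_mod_hz: float) -> float:
    """Depth from the measured phase shift at modulation frequency f_mod."""
    return C * phase_shift_rad / (4 * math.pi * f_mod_hz)

# At 100 MHz modulation, a pi/2 phase shift corresponds to roughly 0.37 m.
print(itof_depth_m(math.pi / 2, 100e6))
```

Because depth is inferred from signal phase and amplitude, any loss of illuminance at the sensor degrades the phase estimate directly, which is the motivation for the high-relative-illumination phase lens.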
  • the depth camera 100 replaces the traditional refractive lens group by installing, in the light receiving module 10, a phase lens 12 capable of adjusting the phase of light, so that the volume of the light receiving module 10 can be reduced and the illuminance of the light reaching the image sensor 11 can be increased, which is favorable for the image sensor 11 to receive light and improves the detection accuracy of the depth camera 100.
  • the embodiment of the present application further provides a terminal 1000 .
  • the terminal 1000 includes a casing 200 and the depth camera 100 described in any one of the above embodiments, and the depth camera 100 is combined with the casing 200 .
  • the terminal 1000 may be a mobile phone, a computer, a tablet computer, a smart watch, a smart wearable device, etc., which is not limited here.
  • a phase lens 12 capable of adjusting the phase of light is installed in the light receiving module 10 to replace the traditional refractive lens group, so that the volume of the light receiving module 10 can be reduced and the illuminance of the light reaching the image sensor 11 can be increased, which is favorable for the image sensor 11 to receive light and improves the detection accuracy of the depth camera 100.
  • in the description of this specification, reference to the terms "certain embodiments," "one embodiment," "some embodiments," "exemplary embodiments," "examples," "specific examples," or "some examples" means that a specific feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of these terms do not necessarily refer to the same embodiment or example. Furthermore, the described specific features, structures, materials, or characteristics may be combined in any suitable manner in any one or more embodiments or examples.
  • the terms "first" and "second" are used for descriptive purposes only, and cannot be interpreted as indicating or implying relative importance or implicitly specifying the quantity of the indicated technical features.
  • the features defined as “first” and “second” may explicitly or implicitly include at least one of said features.
  • "plurality" means at least two, such as two or three, unless otherwise specifically defined.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optics & Photonics (AREA)
  • Measurement Of Optical Distance (AREA)
  • Blocking Light For Cameras (AREA)

Abstract

A light receiving module (10), a depth camera (100), and a terminal (1000). The light receiving module (10) includes an image sensor (11) and a phase lens (12). In the direction from the object side to the image side of the light receiving module (10), the phase lens (12) and the image sensor (11) are arranged in sequence; the phase lens (12) is used to adjust the phase of the light emitted from the phase lens (12) to the image sensor (11), and the image sensor (11) is used to receive the light to obtain depth data.

Description

Light receiving module, depth camera and terminal
Priority Information
This application claims priority to and the benefit of Chinese patent application No. 202121842408.5, filed with the China National Intellectual Property Administration on August 6, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the technical field of distance measurement, and more specifically, to a light receiving module, a depth camera and a terminal.
Background
Time of flight (ToF) technology calculates the distance between an object and a sensor by measuring the time difference between an emitted signal and the signal reflected back by the object.
A typical TOF structure includes a transmitter module (Tx) and a receiver module (Rx). In the transmitter module, the laser emitted by the light source passes through a collimating lens and a diffractive optical element (Diffractive Optical Element, DOE), or through a collimating lens and a diffuser (Diffuser), and is projected onto the object in the form of speckle or flood light; the diffusely reflected speckle or flood light is received by the receiver module to complete the collection of the depth signal.
The receiver module places high demands on the signal-to-noise ratio of the received signal: if the image sensor used in the TOF receiver module performed illumination compensation, the signal-to-noise ratio and the signal dynamic range would be reduced. Therefore, image sensors currently used in TOF receiver modules generally lack the illumination-compensation gain function of conventional image sensors, so a high relative illumination (Relative Illumination, RI) can only be required of the optical hardware. At present, however, the collimating lens in the receiver module is usually a conventional refractive lens, which is limited by the refraction characteristics of its curved surfaces; relative illuminance is lost at the image plane, which reduces the sensing distance and accuracy.
Summary
Embodiments of the present application provide a light receiving module, a depth camera and a terminal.
Embodiments of the present application provide a light receiving module. The light receiving module includes an image sensor and a phase lens. In the direction from the object side to the image side of the light receiving module, the phase lens and the image sensor are arranged in sequence; the phase lens is used to adjust the phase of the light emitted from the phase lens to the image sensor, and the image sensor is used to receive the light to obtain depth data.
In some embodiments, the relative illuminance of the light reaching the image sensor is greater than or equal to 98%.
In some embodiments, the light receiving module further includes an optical filter. In the direction from the object side to the image side of the light receiving module, the phase lens, the filter and the image sensor are arranged in sequence, and the filter is used to filter out light outside a predetermined wavelength range.
In some embodiments, the light receiving module further includes a lens barrel and a light-transmitting cover plate. The phase lens is installed in the lens barrel, which includes a first opening and a second opening opposite to each other; the second opening is closer to the image sensor than the first opening. The cover plate is installed at the first opening of the lens barrel.
In some embodiments, the phase lens includes a substrate and a phase microstructure disposed on the substrate. The phase microstructure is used to adjust the phase of the light emitted from the phase lens to the image sensor.
In some embodiments, the substrate includes a first surface and a second surface opposite to each other, the first surface being farther from the image sensor than the second surface, and the phase microstructure is disposed on the first surface and/or the second surface.
In some embodiments, the substrate includes a first surface and a second surface opposite to each other, the first surface being farther from the image sensor than the second surface; one of the first surface and the second surface is provided with the phase microstructure, and the other is a refractive lens surface.
In some embodiments, the first surface is provided with the phase microstructure and the second surface is a convex lens surface; or, the second surface is provided with the phase microstructure and the first surface is a convex lens surface.
In some embodiments, the phase lens is a planar phase lens and the phase microstructure includes nano-microstructures; or, the phase lens is a Fresnel lens and the phase microstructure includes an annular Fresnel microstructure.
Embodiments of the present application also provide a depth camera. The depth camera includes a light emitting module and the light receiving module described in any one of the above embodiments. The light emitting module is used to emit light, and the light receiving module is used to receive at least part of the light reflected back by an object and form an electrical signal.
Embodiments of the present application also provide a terminal. The terminal includes a casing and the depth camera described in the above embodiments. The depth camera is combined with the casing.
Additional aspects and advantages of the embodiments of the present application will be given in part in the following description, will become apparent in part from the following description, or will be learned through practice of the embodiments of the present application.
Brief Description of the Drawings
The above and/or additional aspects and advantages of the present application will become apparent and easy to understand from the following description of the embodiments in conjunction with the accompanying drawings, in which:
FIGS. 1 to 10 are schematic structural views of light receiving modules in some embodiments of the present application;
FIG. 11 is a schematic structural view of a depth camera in some embodiments of the present application;
FIG. 12 is a schematic structural view of a terminal in some embodiments of the present application.
Detailed Description
The embodiments of the present application are further described below with reference to the accompanying drawings. Throughout the drawings, the same or similar reference numerals denote the same or similar elements or elements having the same or similar functions.
In addition, the embodiments of the present application described below with reference to the accompanying drawings are exemplary and are only intended to explain the embodiments of the present application; they should not be construed as limiting the present application.
In the present application, unless otherwise expressly specified and defined, a first feature being "on" or "under" a second feature may mean that the first and second features are in direct contact, or that the first and second features are in indirect contact through an intermediary. Moreover, a first feature being "above", "over" or "on top of" a second feature may mean that the first feature is directly above or obliquely above the second feature, or simply that the first feature is at a higher level than the second feature. A first feature being "below", "under" or "beneath" a second feature may mean that the first feature is directly below or obliquely below the second feature, or simply that the first feature is at a lower level than the second feature.
Embodiments of the present application provide a light receiving module. The light receiving module includes an image sensor and a phase-type lens. In a direction from an object side to an image side of the light receiving module, the phase-type lens and the image sensor are arranged in sequence; the phase-type lens is configured to adjust a phase of light exiting the phase-type lens toward the image sensor; and the image sensor is configured to receive the light to obtain depth data.
In some embodiments, the relative illumination of the light reaching the image sensor is greater than or equal to 98%.
In some embodiments, the light receiving module further includes an optical filter. In the direction from the object side to the image side of the light receiving module, the phase-type lens, the optical filter, and the image sensor are arranged in sequence, and the optical filter is configured to filter out light outside a predetermined wavelength range.
In some embodiments, the light receiving module further includes a lens barrel and a light-transmitting cover plate. The phase-type lens is mounted in the lens barrel; the lens barrel includes opposite first and second openings, the second opening being closer to the image sensor than the first opening; and the cover plate is mounted at the first opening of the lens barrel.
In some embodiments, the phase-type lens includes a substrate and a phase microstructure disposed on the substrate, the phase microstructure being configured to adjust the phase of the light exiting the phase-type lens toward the image sensor.
In some embodiments, the substrate includes opposite first and second surfaces, the first surface being farther from the image sensor than the second surface, and the phase microstructure is disposed on the first surface and/or the second surface.
In some embodiments, the substrate includes opposite first and second surfaces, the first surface being farther from the image sensor than the second surface; one of the first and second surfaces is provided with the phase microstructure, and the other is a refractive lens surface.
In some embodiments, the first surface is provided with the phase microstructure and the second surface is a convex lens surface; or the second surface is provided with the phase microstructure and the first surface is a convex lens surface.
In some embodiments, the phase-type lens is a planar phase lens and the phase microstructure includes nanostructures; or the phase-type lens is a Fresnel lens and the phase microstructure includes annular Fresnel microstructures.
Embodiments of the present application provide a depth camera. The depth camera includes a light emitting module and a light receiving module. The light emitting module is configured to emit light. The light receiving module is configured to receive at least part of the light reflected back by an object and form an electrical signal. The light receiving module includes an image sensor and a phase-type lens; in a direction from an object side to an image side of the light receiving module, the phase-type lens and the image sensor are arranged in sequence; the phase-type lens is configured to adjust a phase of light exiting the phase-type lens toward the image sensor; and the image sensor is configured to receive the light to obtain depth data.
In some embodiments, the relative illumination of the light reaching the image sensor is greater than or equal to 98%.
In some embodiments, the light receiving module further includes an optical filter. In the direction from the object side to the image side of the light receiving module, the phase-type lens, the optical filter, and the image sensor are arranged in sequence, and the optical filter is configured to filter out light outside a predetermined wavelength range.
In some embodiments, the light receiving module further includes a lens barrel and a light-transmitting cover plate. The phase-type lens is mounted in the lens barrel; the lens barrel includes opposite first and second openings, the second opening being closer to the image sensor than the first opening; and the cover plate is mounted at the first opening of the lens barrel.
In some embodiments, the phase-type lens includes a substrate and a phase microstructure disposed on the substrate, the phase microstructure being configured to adjust the phase of the light exiting the phase-type lens toward the image sensor.
In some embodiments, the substrate includes opposite first and second surfaces, the first surface being farther from the image sensor than the second surface, and the phase microstructure is disposed on the first surface and/or the second surface.
In some embodiments, the substrate includes opposite first and second surfaces, the first surface being farther from the image sensor than the second surface; one of the first and second surfaces is provided with the phase microstructure, and the other is a refractive lens surface.
In some embodiments, the first surface is provided with the phase microstructure and the second surface is a convex lens surface; or the second surface is provided with the phase microstructure and the first surface is a convex lens surface.
In some embodiments, the phase-type lens is a planar phase lens and the phase microstructure includes nanostructures.
In some embodiments, the phase-type lens is a Fresnel lens and the phase microstructure includes annular Fresnel microstructures.
Embodiments of the present application provide a terminal. The terminal includes a housing and the depth camera of any of the above embodiments, the depth camera being combined with the housing.
Time of flight (ToF) is a ranging technique that measures the time difference between an emitted signal and the signal reflected back by an object, and computes the distance between the object and the sensor from that time difference. A typical TOF arrangement includes a transmitter module (Tx) and a receiver module (Rx). In the transmitter module, laser light from a light source passes through a collimating lens and a diffractive optical element (DOE), or through a collimating lens and a diffuser, and is projected onto the object in speckle or flood form; the diffusely reflected speckle or flood light is received by the receiver module, completing the collection of the depth signal. At present, the receiver module in time-of-flight systems usually employs a conventional refractive lens group. However, constrained by its curved-surface refraction characteristics, a refractive lens suffers relative-illumination loss at the image plane, i.e. the light passing through the refractive lens group reaches the sensor with a considerable loss of illumination, which impairs measurement accuracy.
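The ranging principle described above reduces to distance = speed of light × time difference / 2; a minimal numerical sketch (the function name and sample round-trip time are illustrative, not values from the patent):

```python
# Time-of-flight ranging: the object-to-sensor distance is half the
# round-trip optical path travelled during the measured time difference.
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Distance (m) from the emit-to-receive time difference (s)."""
    return C * round_trip_time_s / 2.0

# A round trip of about 6.67 ns corresponds to a target roughly 1 m away.
print(round(tof_distance(6.67e-9), 3))
```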
Referring to FIG. 1, to address the above technical problem, an embodiment of the present application provides a light receiving module 10. The light receiving module 10 includes an image sensor 11 and a phase-type lens 12. In the direction from the object side to the image side of the light receiving module 10, the phase-type lens 12 and the image sensor 11 are arranged in sequence. The phase-type lens 12 is configured to adjust the phase of light exiting the phase-type lens 12 toward the image sensor 11, and the image sensor 11 is configured to receive the light to obtain depth data. The object side and the image side lie in sequence along the incident direction of the light (indicated by the dashed arrow in FIG. 1).
By replacing the conventional refractive lens group with a phase-type lens 12 capable of adjusting the phase of light, the light receiving module 10 of the present application can be made smaller and can deliver higher illumination at the image sensor 11, which facilitates light reception by the image sensor and improves the detection accuracy of the depth camera 100 (shown in FIG. 11).
It should be noted that the light receiving module 10 may be provided with one or more phase-type lenses 12, which is not limited here. In addition, in some embodiments, the relative illumination of the light reaching the image sensor 11 through the phase-type lens 12 is greater than or equal to 98%. Specifically, referring to FIG. 1, the phase-type lens 12 includes a substrate 121 and a phase microstructure 122 disposed on the substrate 121. The phase microstructure 122 is configured to adjust the phase of the light exiting the phase-type lens 12 toward the image sensor 11.
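For context on the ≥98% figure: an ideal conventional refractive lens roughly obeys the cos⁴ illumination-falloff law, so its relative illumination drops sharply toward the edge of the field. A sketch under that textbook approximation (the 30° field angle is an illustrative value, not a figure from the patent):

```python
import math

def relative_illumination_cos4(field_angle_deg: float) -> float:
    """Approximate relative illumination of an ideal refractive lens (cos^4 law)."""
    return math.cos(math.radians(field_angle_deg)) ** 4

# At a 30-degree field angle the image-plane illumination falls to about
# 56% of the on-axis value, far below the >=98% quoted above for the
# phase-type lens.
print(round(relative_illumination_cos4(30.0), 4))
```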
More specifically, the substrate 121 includes opposite first and second surfaces 1211 and 1212, the first surface 1211 being farther from the image sensor 11 than the second surface 1212. The phase microstructure 122 is disposed on the first surface 1211 and/or the second surface 1212.
For example, referring to FIG. 1 and FIG. 6, in some embodiments the phase microstructure 122 is disposed on the first surface 1211 of the substrate 121. Light is thus incident directly on the phase microstructure 122, is phase-adjusted by it, and then passes through the substrate 121 toward the image sensor 11. Because the phase microstructure 122 adjusts the phase of the light exiting the phase-type lens 12 toward the image sensor 11, the illumination reaching the image sensor 11 is higher than when light simply passes through a lens to the sensor.
As another example, referring to FIG. 2 and FIG. 7, in some embodiments the phase microstructure 122 is disposed on the second surface 1212 of the substrate 121, so that light passes through the substrate 121 before reaching the phase microstructure 122, and direct incidence of light onto the phase microstructure 122 is avoided. Strong light striking the phase microstructure 122 directly may cause glare and considerable stray light, which hinders light reception by the image sensor 11 and degrades the detection accuracy of the depth camera 100 (shown in FIG. 11). Therefore, in this embodiment the phase microstructure 122 is placed on the second surface 1212 of the substrate 121, the surface closer to the image sensor 11. Compared with placing the microstructure 122 on the first surface 1211, farther from the image sensor 11, this arrangement increases the illumination reaching the image sensor 11 while avoiding glare and reducing stray light, facilitating light reception by the image sensor 11 and improving the detection accuracy of the depth camera 100 (shown in FIG. 11).
As yet another example, referring to FIG. 3 and FIG. 8, in some embodiments both surfaces of the substrate 121 may be provided with phase microstructures 122; that is, phase microstructures 122 are disposed on both the first surface 1211 and the second surface 1212 of the substrate 121. The phase of the light exiting the phase-type lens 12 toward the image sensor 11 can then be adjusted multiple times without providing multiple phase-type lenses 12. Moreover, compared with a phase-type lens 12 having a microstructure on only one surface, a phase-type lens 12 with microstructures on both surfaces further increases the illumination reaching the image sensor 11 and reduces distortion of the light received by the image sensor 11. It should be noted that, in some embodiments, the shapes, numbers, and arrangements of the phase microstructures 122 on the two opposite surfaces of the substrate 121 may be identical or different, which is not limited here.
Referring to FIG. 4, FIG. 5, and FIG. 9 to FIG. 10, in some embodiments, when only one surface of the substrate 121 of the phase-type lens 12 is provided with the phase microstructure 122, the other surface may be a refractive lens surface. That is, one surface of the substrate 121 carries the phase microstructure 122, while the other surface (the one without the phase microstructure 122) functions like the surface of a refractive lens: it has curvature and refracts light. With a phase microstructure 122 on one surface and a refractive lens surface on the other, the illumination reaching the image sensor 11 is further increased while stray light and distortion of the light received by the image sensor 11 are reduced.
For example, referring to FIG. 4 and FIG. 9, in some embodiments the first surface 1211 of the substrate 121 is provided with the phase microstructure 122 and the second surface 1212 is a convex lens surface; specifically, the second surface 1212 bulges toward the image sensor 11. The second surface 1212 (convex lens surface) can thus guide the light adjusted by the phase microstructure 122 onto the image sensor 11, facilitating light reception by the image sensor 11 and improving the detection accuracy of the depth camera 100 (shown in FIG. 11). Referring to FIG. 5 and FIG. 10, in some embodiments the second surface 1212 of the substrate 121 is provided with the phase microstructure 122 and the first surface 1211 is a convex lens surface; specifically, the first surface 1211 bulges away from the image sensor 11. The first surface 1211 (convex lens surface) can thus converge and guide the light to the phase microstructure 122, after which the adjusted light exits toward the image sensor 11, facilitating light reception by the image sensor 11 and improving the detection accuracy of the depth camera 100 (shown in FIG. 11). With the phase microstructure 122 on one surface and a convex lens surface on the other, the illumination reaching the image sensor 11 is further increased while stray light and distortion of the light received by the image sensor 11 are reduced.
Referring to FIG. 1 to FIG. 5, in some embodiments the phase-type lens 12 is a planar phase lens, in which case the phase microstructure 122 comprises nanostructures. Since a planar phase lens is easier to manufacture than a Fresnel lens, using a planar phase lens for the phase-type lens 12 reduces the machining difficulty of the phase-type lens 12 and hence of the light receiving module 10. In particular, in one example shown in FIG. 2, the phase-type lens 12 is a planar phase lens and the phase microstructure 122 comprises nanostructures disposed on the second surface 1212 of the substrate 121; from the center of the second surface 1212 toward its edge, the density of the nanostructures gradually decreases, i.e. the nanostructures are denser near the center of the second surface 1212 and sparser near its edge. Of course, in other embodiments the nanostructures may be arranged on the substrate 121 in other ways, which is not limited here.
Referring to FIG. 6 to FIG. 10, in some embodiments the phase-type lens 12 is a Fresnel lens, in which case the phase microstructure 122 comprises annular Fresnel microstructures. Since a Fresnel lens is currently easier to design than a planar phase lens, using a Fresnel lens for the phase-type lens 12 reduces the design difficulty of the phase-type lens 12 and hence of the light receiving module 10.
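Both microstructure variants approximate the same ideal focusing phase profile, wrapped modulo 2π; each 2π wrap corresponds to one annular Fresnel zone. A sketch with illustrative parameters (the 940 nm design wavelength and 2 mm focal length are our assumptions, not values from the patent):

```python
import math

def focusing_phase(r_mm: float, focal_mm: float, wavelength_mm: float) -> float:
    """Ideal focusing phase phi(r) = (2*pi/lambda) * (f - sqrt(r^2 + f^2)),
    wrapped into [0, 2*pi); each wrap boundary marks one Fresnel zone."""
    phi = (2.0 * math.pi / wavelength_mm) * (focal_mm - math.hypot(r_mm, focal_mm))
    return phi % (2.0 * math.pi)

wl_mm = 940e-6  # 940 nm design wavelength, expressed in mm
print(focusing_phase(0.0, 2.0, wl_mm))  # on-axis phase is zero by construction
```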
Referring to FIG. 1, in some embodiments the light receiving module 10 may further include an optical filter 13 disposed between the phase-type lens 12 and the image sensor 11; that is, in the direction from the object side to the image side of the light receiving module 10, the phase-type lens 12, the optical filter 13, and the image sensor 11 are arranged in sequence. The optical filter 13 filters out light outside a predetermined wavelength range: after receiving the light adjusted by the phase-type lens 12, the optical filter 13 blocks light outside the predetermined wavelength range, so that only light within that range passes through to the image sensor 11. It should be noted that in some embodiments the predetermined wavelength range is 920 nm-960 nm. In particular, in some embodiments the optical filter 13 transmits light with a wavelength of 940 nm, i.e. 940 nm light can exit to the image sensor 11.
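The filter's role amounts to a band-pass test around the 940 nm operating wavelength; a trivial sketch (the function name is ours):

```python
def passes_filter(wavelength_nm: float, band=(920.0, 960.0)) -> bool:
    """True if the wavelength lies within the predetermined 920-960 nm pass band."""
    lo, hi = band
    return lo <= wavelength_nm <= hi

# The 940 nm ToF illumination passes; visible ambient light (e.g. 550 nm)
# is blocked before it can reach the image sensor.
print(passes_filter(940.0), passes_filter(550.0))
```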
Referring to FIG. 1, in some embodiments the light receiving module 10 may further include a lens barrel 14 and a light-transmitting cover plate 15. Specifically, the lens barrel 14 includes opposite first and second openings 141 and 142, and the phase-type lens 12 is mounted in the lens barrel 14. The second opening 142 is closer to the image sensor 11 than the first opening 141, and the cover plate 15 is mounted at the first opening 141 of the lens barrel 14. This protects the phase-type lens 12 from damage and keeps dust and liquid away from it, either of which would impair normal operation of the light receiving module 10. In particular, the cover plate 15 can cover the first opening 141, further preventing dust and liquid from reaching the phase-type lens 12. It should be noted that if the light receiving module 10 includes the optical filter 13, the optical filter 13 is located at the second opening of the lens barrel 14.
Referring to FIG. 11, an embodiment of the present application further provides a depth camera 100. The depth camera 100 includes a light emitting module 20 and the light receiving module 10 of any of the above embodiments. The light emitting module 20 is configured to emit light, and the light receiving module 10 is configured to receive at least part of the light reflected back by an object and form an electrical signal. The depth camera 100 obtains the depth information of the object from the electrical signal formed by the light receiving module 10.
In the depth camera 100 of the present application, a phase-type lens 12 capable of adjusting the phase of light replaces the conventional refractive lens group in the light receiving module 10. This reduces the volume of the light receiving module 10 and increases the illumination of the light reaching the image sensor 11, facilitating light reception by the image sensor 11 and improving the detection accuracy of the depth camera 100.
Referring to FIG. 12, an embodiment of the present application further provides a terminal 1000. The terminal 1000 includes a housing 200 and the depth camera 100 of any of the above embodiments, the depth camera 100 being combined with the housing 200. It should be noted that the terminal 1000 may be a mobile phone, a computer, a tablet computer, a smart watch, a smart wearable device, or the like, which is not limited here.
In the terminal 1000 of the present application, a phase-type lens 12 capable of adjusting the phase of light replaces the conventional refractive lens group in the light receiving module 10. This reduces the volume of the light receiving module 10 and increases the illumination of the light reaching the image sensor 11, facilitating light reception by the image sensor 11 and improving the detection accuracy of the depth camera 100.
In the description of this specification, reference to the terms "certain embodiments", "one embodiment", "some embodiments", "an illustrative embodiment", "an example", "a specific example", or "some examples" means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, illustrative references to the above terms do not necessarily refer to the same embodiment or example. Moreover, the particular features, structures, materials, or characteristics described may be combined in a suitable manner in any one or more embodiments or examples.
In addition, the terms "first" and "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of technical features referred to. Thus, a feature defined by "first" or "second" may expressly or implicitly include at least one such feature. In the description of the present application, "a plurality of" means at least two, for example two or three, unless otherwise expressly and specifically defined.
Although embodiments of the present application have been shown and described above, it is to be understood that the above embodiments are exemplary and are not to be construed as limiting the present application. Those of ordinary skill in the art may make changes, modifications, substitutions, and variations to the above embodiments within the scope of the present application, which is defined by the claims and their equivalents.

Claims (20)

  1. A light receiving module, comprising:
    an image sensor; and
    a phase-type lens, wherein in a direction from an object side to an image side of the light receiving module, the phase-type lens and the image sensor are arranged in sequence, the phase-type lens is configured to adjust a phase of light exiting the phase-type lens toward the image sensor, and the image sensor is configured to receive the light to obtain depth data.
  2. The light receiving module according to claim 1, wherein a relative illumination of the light reaching the image sensor is greater than or equal to 98%.
  3. The light receiving module according to claim 1, wherein the light receiving module further comprises an optical filter, in the direction from the object side to the image side of the light receiving module, the phase-type lens, the optical filter, and the image sensor are arranged in sequence, and the optical filter is configured to filter out light outside a predetermined wavelength range.
  4. The light receiving module according to claim 1, wherein the light receiving module further comprises:
    a lens barrel, wherein the phase-type lens is mounted in the lens barrel, the lens barrel comprises opposite first and second openings, and the second opening is closer to the image sensor than the first opening; and
    a light-transmitting cover plate, the cover plate being mounted at the first opening of the lens barrel.
  5. The light receiving module according to any one of claims 1-4, wherein the phase-type lens comprises:
    a substrate; and
    a phase microstructure disposed on the substrate, the phase microstructure being configured to adjust the phase of the light exiting the phase-type lens toward the image sensor.
  6. The light receiving module according to claim 5, wherein the substrate comprises opposite first and second surfaces, the first surface is farther from the image sensor than the second surface, and the phase microstructure is disposed on the first surface and/or the second surface.
  7. The light receiving module according to claim 5, wherein the substrate comprises opposite first and second surfaces, the first surface is farther from the image sensor than the second surface, one of the first surface and the second surface is provided with the phase microstructure, and the other surface is a refractive lens surface.
  8. The light receiving module according to claim 7, wherein the first surface is provided with the phase microstructure and the second surface is a convex lens surface; or
    the second surface is provided with the phase microstructure and the first surface is a convex lens surface.
  9. The light receiving module according to claim 5, wherein the phase-type lens is a planar phase lens and the phase microstructure comprises nanostructures; or
    the phase-type lens is a Fresnel lens and the phase microstructure comprises annular Fresnel microstructures.
  10. A depth camera, comprising:
    a light emitting module configured to emit light; and
    a light receiving module configured to receive at least part of the light reflected back by an object and form an electrical signal, wherein the light receiving module comprises an image sensor and a phase-type lens, in a direction from an object side to an image side of the light receiving module the phase-type lens and the image sensor are arranged in sequence, the phase-type lens is configured to adjust a phase of light exiting the phase-type lens toward the image sensor, and the image sensor is configured to receive the light to obtain depth data.
  11. The depth camera according to claim 10, wherein a relative illumination of the light reaching the image sensor is greater than or equal to 98%.
  12. The depth camera according to claim 10, wherein the light receiving module further comprises an optical filter, in the direction from the object side to the image side of the light receiving module, the phase-type lens, the optical filter, and the image sensor are arranged in sequence, and the optical filter is configured to filter out light outside a predetermined wavelength range.
  13. The depth camera according to claim 10, wherein the light receiving module further comprises:
    a lens barrel, wherein the phase-type lens is mounted in the lens barrel, the lens barrel comprises opposite first and second openings, and the second opening is closer to the image sensor than the first opening; and
    a light-transmitting cover plate, the cover plate being mounted at the first opening of the lens barrel.
  14. The depth camera according to any one of claims 10-13, wherein the phase-type lens comprises:
    a substrate; and
    a phase microstructure disposed on the substrate, the phase microstructure being configured to adjust the phase of the light exiting the phase-type lens toward the image sensor.
  15. The depth camera according to claim 14, wherein the substrate comprises opposite first and second surfaces, the first surface is farther from the image sensor than the second surface, and the phase microstructure is disposed on the first surface and/or the second surface.
  16. The depth camera according to claim 14, wherein the substrate comprises opposite first and second surfaces, the first surface is farther from the image sensor than the second surface, one of the first surface and the second surface is provided with the phase microstructure, and the other surface is a refractive lens surface.
  17. The depth camera according to claim 16, wherein the first surface is provided with the phase microstructure and the second surface is a convex lens surface; or
    the second surface is provided with the phase microstructure and the first surface is a convex lens surface.
  18. The depth camera according to claim 14, wherein the phase-type lens is a planar phase lens and the phase microstructure comprises nanostructures.
  19. The depth camera according to claim 14, wherein the phase-type lens is a Fresnel lens and the phase microstructure comprises annular Fresnel microstructures.
  20. A terminal, comprising:
    a housing; and
    the depth camera according to any one of claims 10-19, the depth camera being combined with the housing.
PCT/CN2022/098857 2021-08-06 2022-06-15 Light receiving module, depth camera and terminal WO2023011009A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202121842408.5 2021-08-06
CN202121842408.5U CN216083081U (zh) 2021-08-06 2021-08-06 Light receiving module, depth camera and terminal

Publications (1)

Publication Number Publication Date
WO2023011009A1 true WO2023011009A1 (zh) 2023-02-09

Family

ID=80668203

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/098857 WO2023011009A1 (zh) 2021-08-06 2022-06-15 光接收模组、深度相机及终端

Country Status (2)

Country Link
CN (1) CN216083081U (zh)
WO (1) WO2023011009A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN216083081U (zh) * 2021-08-06 2022-03-18 Oppo广东移动通信有限公司 Light receiving module, depth camera and terminal

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103037161A (zh) * 2011-10-07 2013-04-10 三星电子株式会社 Imaging device including phase-difference detection pixels
CN110135232A (zh) * 2018-02-08 2019-08-16 脸谱科技有限责任公司 Systems and methods for enhancing depth sensor devices
CN110139004A (zh) * 2018-02-08 2019-08-16 脸谱科技有限责任公司 Systems and methods for enhancing optical sensor devices
CN111694161A (zh) * 2020-06-05 2020-09-22 Oppo广东移动通信有限公司 Light emitting module, depth camera and electronic device
CN111902763A (zh) * 2018-01-24 2020-11-06 脸谱科技有限责任公司 Systems and methods for optical demodulation in depth-sensing devices
CN112945141A (zh) * 2021-01-29 2021-06-11 中北大学 Structured-light fast imaging method and system based on a microlens array
CN216083081U (zh) * 2021-08-06 2022-03-18 Oppo广东移动通信有限公司 Light receiving module, depth camera and terminal


Also Published As

Publication number Publication date
CN216083081U (zh) 2022-03-18


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22851725

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22851725

Country of ref document: EP

Kind code of ref document: A1