WO2020221188A1 - 3D imaging apparatus based on synchronous time-of-flight discrete point cloud and electronic device - Google Patents

3D imaging apparatus based on synchronous time-of-flight discrete point cloud and electronic device

Info

Publication number
WO2020221188A1
Authority
WO
WIPO (PCT)
Prior art keywords
discrete
collimated
target object
light
beams
Prior art date
Application number
PCT/CN2020/087143
Other languages
English (en)
Chinese (zh)
Inventor
吕方璐
程世球
Original Assignee
深圳市光鉴科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市光鉴科技有限公司
Publication of WO2020221188A1

Links

Images

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements
    • G01S7/4814 Constructional features, e.g. arrangements of optical elements of transmitters alone
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements
    • G01S7/4817 Constructional features, e.g. arrangements of optical elements relating to scanning
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/483 Details of pulse systems
    • G01S7/486 Receivers
    • G01S7/4865 Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/491 Details of non-pulse systems
    • G01S7/4912 Receivers
    • G01S7/4915 Time delay measurement, e.g. operational details for pixel components; Phase measurement

Definitions

  • The present disclosure relates to the field of 3D imaging, and in particular to a 3D imaging device and electronic equipment based on synchronous ToF discrete point clouds.
  • ToF (time of flight) technology is a 3D imaging technology in which measurement light is emitted from a projector and reflected back to a receiver by the target object, so that the spatial distance from the object to the sensor can be obtained from the propagation time of the measurement light over this propagation path.
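The relation underlying every ToF variant discussed below is simply: distance = speed of light × round-trip propagation time / 2. The following minimal Python sketch is illustrative only and is not part of the original disclosure:

```python
# Illustrative sketch of the basic time-of-flight relation (not from the patent).
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_time_s: float) -> float:
    """One-way distance to the target from the measured round-trip propagation time."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# Example: a 10 ns round trip corresponds to roughly 1.5 m.
print(tof_distance(10e-9))
```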
  • Commonly used ToF technologies include the single-point scanning projection method and the surface light projection method.
  • The single-point scanning projection ToF method uses a single-point projector to project a single collimated beam, and a scanning device controls the projection direction of that beam so that it can be directed to different target positions. After the single collimated beam is reflected by the target, part of the light is received by a single-point light detector, thereby obtaining the depth measurement data for the current projection direction.
  • This method can concentrate all of the optical power on one target point, thereby achieving a high signal-to-noise ratio at that single target point and, in turn, high-precision depth measurement.
  • However, scanning the entire target object relies on scanning devices such as mechanical motors, MEMS mirrors, and optical phased arrays. Splicing the depth data points obtained by scanning yields the discrete point cloud data required for 3D imaging. This method is conducive to long-distance 3D imaging, but it requires a complex projection scanning system and is relatively costly.
  • The surface light projection ToF method projects a surface beam with a continuous energy distribution.
  • The projected light continuously covers the surface of the target object.
  • The photodetector is an array of photodetectors, each of which can obtain the travel time of the beam.
  • The depth obtained at each detector image point is the depth information of the corresponding object position according to the object-image relationship.
  • The purpose of the present disclosure is to provide a 3D imaging device and electronic equipment based on synchronous ToF discrete point clouds.
  • The present disclosure adopts a projection method of discrete light beams to synchronously acquire point cloud data with higher precision, thereby realizing low-cost, low-power, high-precision 3D imaging.
  • The 3D imaging device based on synchronous ToF discrete point clouds provided by the present disclosure includes a discrete beam projector and a photodetector array imager;
  • the discrete beam projector is configured to project multiple discrete collimated beams to a target object;
  • the photodetector array imager is configured to receive the multiple discrete collimated beams reflected by the target object and to measure the propagation time of the multiple discrete collimated beams, so as to obtain depth data of the surface of the target object.
  • The discrete beam projector includes an edge emitting laser and a beam projector arranged on an optical path;
  • the edge emitting laser is configured to project laser light to the beam projector;
  • the beam projector is configured to project the incident laser light into multiple discrete collimated beams.
  • In another embodiment, the discrete beam projector includes a laser array, a collimating lens, and a beam splitting device arranged on an optical path;
  • the laser array is configured to project laser beams of a first order of magnitude in number to the collimating lens;
  • the collimating lens is configured to collimate the multiple incident laser beams and then emit collimated beams of the first order of magnitude;
  • the beam splitting device is configured to split the incident collimated beams of the first order of magnitude and then emit collimated beams of a second order of magnitude;
  • the second order of magnitude is greater than the first order of magnitude.
  • The photodetector array imager includes an optical imaging lens, a photodetector array, and a driving circuit; the photodetector array includes a plurality of photodetectors distributed in an array;
  • the optical imaging lens is configured such that the direction vectors of the collimated light beams entering the photodetector array through the optical imaging lens are in one-to-one correspondence with the photodetectors;
  • the photodetectors are configured to receive the collimated light beams reflected by the target object;
  • the driving circuit is configured to measure the propagation time of the plurality of discrete collimated beams and then generate depth data of the surface of the target object.
  • The multiple discrete collimated light beams are periodically arranged in a preset shape.
  • The preset shape includes any one of the following shapes, or any multiple shapes that can be switched between: a rectangle, a triangle, and a hexagon.
  • Alternatively, the multiple discrete collimated beams are non-periodically arranged in another preset pattern.
  • The aperiodic arrangement includes any one of the following arrangements, or any multiple arrangements that can be switched between: a spatial coding arrangement, a random arrangement, and a quasi-lattice arrangement.
  • The light detector adopts any one of the following light sensors: a single photon avalanche diode (SPAD), a CMOS photosensor, or a CCD photosensor.
  • The electronic equipment provided by the present disclosure includes the above 3D imaging device based on synchronous ToF discrete point clouds, and also includes a display panel; the discrete beam projector and the photodetector array imager are located on the backlight side of the display panel;
  • the photodetector array imager receives the multiple discrete collimated light beams that penetrate the display panel after being reflected by the target object, and obtains a depth image of the surface of the target object according to the multiple discrete collimated light beams.
  • The present disclosure uses a discrete beam projector to project multiple discrete collimated beams to a target object, so that the photodetector array imager receives part of the collimated beams reflected by the target object; this realizes the acquisition of depth data of the surface of the target object, improves the beam power density, and achieves a balance between signal-to-noise ratio and point cloud density, so that 3D imaging can be performed with low cost, low power consumption, and high precision.
  • FIG. 1 is a schematic structural diagram of a 3D imaging device based on synchronous ToF discrete point clouds in the present disclosure;
  • FIG. 2 is a schematic diagram of one structure of the discrete beam projector in the present disclosure;
  • FIG. 3 is a schematic diagram of another structure of the discrete beam projector in the present disclosure;
  • FIG. 4 is a schematic diagram of the structure of the optical imaging lens in the present disclosure;
  • FIG. 5 is a schematic diagram of the periodic arrangement of multiple discrete collimated beams in the present disclosure;
  • FIG. 6(a), (b), and (c) are schematic diagrams of the aperiodic arrangement of multiple discrete collimated beams in the present disclosure.
  • 1 is a discrete beam projector;
  • 2 is a photodetector array imager;
  • 3 is a target object;
  • 101 is a photodetector array;
  • 102 is an optical imaging lens;
  • 201 is an edge emitting laser;
  • 202 is a beam projector;
  • 203 is a laser array;
  • 204 is a collimating lens;
  • 205 is a beam splitting device.
  • In the description of the present disclosure, the term “connection” may refer to a fixed connection or to an electrical circuit connection.
  • The terms “first” and “second” are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the number of the indicated technical features. Therefore, features defined with “first” and “second” may explicitly or implicitly include one or more of these features. In the description of the embodiments of the present disclosure, “plurality” means two or more, unless otherwise specifically defined.
  • The present disclosure provides a compromise between the single-point scanning projection method and the surface light projection method: a projector is used to project multiple discrete collimated beams at the same time, and it is paired with the ToF photodetector array 101.
  • The number of collimated beams projected at the same time can range from a few to tens of thousands.
  • Under the same total power, the present disclosure can achieve a trade-off and optimization between the beam power density (i.e., the signal-to-noise ratio) and the point cloud density by controlling the number of beams.
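To make this trade-off concrete, the following sketch assumes the total optical power budget is split evenly across the projected beams and that the signal-to-noise ratio at each point grows with the power allocated to that beam; it is an illustration only, not part of the claimed device, and the power figure is a made-up example:

```python
# Illustrative trade-off (not from the patent): with a fixed total optical power,
# the power available per beam falls as the number of beams grows, while the
# number of simultaneously measured depth points (point cloud density) rises.
def per_beam_power(total_power_mw: float, num_beams: int) -> float:
    """Optical power available to each collimated beam, assuming an even split."""
    return total_power_mw / num_beams

for n in (1, 100, 10_000):
    print(f"{n:>6} beams -> {per_beam_power(500.0, n):.3f} mW per beam")
```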
  • In this way, the accuracy and the point cloud density can be optimized according to the specific 3D imaging application scenario.
  • Since all of the 3D point cloud data in the present disclosure are acquired by synchronous measurement, the present disclosure avoids the problem, present in the single-point scanning method, of needing algorithms to correct the point cloud when there is relative motion between the target object 3 and the 3D imaging device.
  • FIG. 1 is a schematic structural diagram of a 3D imaging device based on synchronous ToF discrete point clouds in the present disclosure.
  • The 3D imaging device based on synchronous ToF discrete point clouds provided by the present disclosure includes a discrete beam projector 1 and a photodetector array imager 2;
  • the discrete beam projector 1 is configured to project multiple discrete collimated beams to a target object 3;
  • the photodetector array imager 2 is configured to receive the multiple discrete collimated beams reflected by the target object 3 and measure the propagation time of the multiple discrete collimated beams, so as to obtain depth data of the surface of the target object 3.
  • The present disclosure uses the discrete beam projector 1 to project multiple discrete collimated beams to the target object 3, so that the photodetector array imager 2 receives part of the collimated beams reflected by the target object 3, realizing the acquisition of depth data of the surface of the target object 3; this improves the beam power density and achieves a balance between the signal-to-noise ratio and the point cloud density, so that 3D imaging can be performed with low cost, low power consumption, and high precision.
  • These discrete depth data points construct point cloud data that can reproduce the 3D shape of the object, so as to realize the 3D imaging of the target object 3.
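As a purely illustrative sketch (the array shapes, names, and values below are assumptions, not taken from the disclosure), a point cloud can be assembled by scaling the unit direction vector associated with each photodetector by the range measured on that detector:

```python
# Illustrative sketch (not from the patent): depth readings + per-detector
# direction vectors -> 3D point cloud in the imager's coordinate frame.
import numpy as np

def depths_to_point_cloud(depths: np.ndarray, directions: np.ndarray) -> np.ndarray:
    """depths: (N,) measured range along each beam, in metres.
    directions: (N, 3) unit direction vectors, one per photodetector.
    Returns an (N, 3) array of 3D points."""
    return depths[:, None] * directions

# Example: three detectors looking roughly along +Z.
dirs = np.array([[0.0, 0.0, 1.0],
                 [0.1, 0.0, 1.0],
                 [0.0, 0.1, 1.0]])
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)  # normalise to unit length
print(depths_to_point_cloud(np.array([1.20, 1.30, 1.25]), dirs))
```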
  • The plurality of discrete collimated light beams have a cone shape.
  • The number of the multiple discrete collimated beams is between two beams and tens of thousands of beams, for example 2 beams to 100,000 beams.
  • The 3D imaging device based on synchronous ToF discrete point clouds further includes a driving circuit connected to the discrete beam projector 1 and the photodetector array imager 2.
  • The driving circuit is configured to control the discrete beam projector 1 and the photodetector array imager 2 to turn on or off at the same time.
  • The driving circuit can be an independent dedicated circuit, such as a dedicated SoC chip, an FPGA chip, or an ASIC chip, or it can be a general-purpose processor; for example, when the depth camera is integrated into a smart terminal such as a mobile phone, TV, or computer, the processor in the terminal can serve as at least part of the processing circuit.
  • FIG. 2 is a schematic diagram of one structure of the discrete beam projector in the present disclosure.
  • In this structure, the discrete beam projector 1 includes an edge emitting laser 201 and a beam projector 202 arranged on an optical path;
  • the edge emitting laser 201 is configured to project laser light to the beam projector 202;
  • the beam projector 202 is configured to project the incident laser light into multiple discrete collimated beams.
  • The beam projector 202 is composed of an optical lens and an optical chip whose inner surface is processed with a micro-nano structure.
  • The beam projector 202 can divide the incident light from the edge emitting laser 201 into any required number of collimated beams.
  • The emission direction of the edge emitting laser 201 and the projection direction of the beam projector 202 may be the same, or may differ by 90 degrees or by any angle required by the design of the optical system.
  • FIG. 3 is a schematic diagram of another structure of the discrete beam projector in the present disclosure.
  • In this structure, the discrete beam projector 1 includes a laser array 203, a collimating lens 204, and a beam splitting device 205 arranged on an optical path;
  • the laser array 203 is configured to project laser beams of a first order of magnitude in number to the collimating lens 204;
  • the collimating lens 204 is configured to collimate the multiple incident laser beams and then emit collimated beams of the first order of magnitude;
  • the beam splitting device 205 is configured to split the incident collimated beams of the first order of magnitude and then emit collimated beams of a second order of magnitude;
  • the second order of magnitude is greater than the first order of magnitude.
  • The second order of magnitude may be one to two orders of magnitude greater than the first order of magnitude.
  • The laser array 203 may be composed of multiple vertical cavity surface emitting lasers (VCSELs) or multiple edge emitting lasers (EELs). After passing through the collimating lens 204, the multiple laser beams become highly parallel collimated beams. Depending on the number of discrete beams required in a practical application, the beam splitting device 205 can be used to obtain more collimated beams, as in the sketch below.
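As a simple illustration (the emitter count and splitting factor below are made-up examples, not values from the disclosure), the number of projected beams after the beam splitting device is the number of emitters multiplied by the splitting factor of the DOE or SLM:

```python
# Illustrative sketch (not from the patent): beam multiplication by a splitting device.
def projected_beam_count(num_emitters: int, split_factor: int) -> int:
    """Total discrete collimated beams after an ideal beam splitting device."""
    return num_emitters * split_factor

# e.g. 100 VCSELs passed through a 3x3 splitter yield 900 beams.
print(projected_beam_count(num_emitters=100, split_factor=9))
```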
  • The beam splitting device 205 may use a diffractive optical element (DOE, e.g. a diffraction grating), a spatial light modulator (SLM), or the like.
  • The photodetector array imager 2 includes an optical imaging lens 102, a photodetector array 101, and a driving circuit; the photodetector array 101 includes multiple photodetectors distributed in an array;
  • the optical imaging lens 102 is configured such that the direction vectors of the collimated light beams entering the photodetector array 101 through the optical imaging lens 102 are in one-to-one correspondence with the photodetectors;
  • the photodetectors are configured to receive the collimated light beams reflected by the target object 3;
  • the driving circuit is configured to measure the propagation time of the plurality of discrete collimated beams and then generate depth data of the surface of the target object 3.
  • The optical imaging lens 102 is usually equipped with a narrow-band filter, so that only incident collimated light beams with a preset wavelength can pass to the photodetector array 101.
  • The preset wavelength may be the wavelength of the incident collimated beam, or may lie within a range from 50 nanometers below to 50 nanometers above that wavelength.
  • The photodetectors in the photodetector array 101 may be arranged periodically or non-periodically. Each photodetector cooperates with an auxiliary circuit to realize the time-of-flight measurement of a collimated beam.
  • The photodetector array 101 can be a combination of multiple single-point photodetectors or a sensor chip integrating multiple photodetectors.
  • The irradiation spot of a discrete collimated beam on the target object 3 may correspond to one or more photodetectors.
  • In the latter case, the signals of the corresponding detectors can be connected through a circuit, so that they are combined into a photodetector with a larger detection area.
  • The multiple discrete collimated light beams are lattice lights periodically arranged in a preset shape, that is, they are distributed geometrically.
  • The preset shape includes any one of the following shapes, or any multiple shapes that can be switched between: a rectangle, a triangle, and a hexagon.
  • The shape of the periodic arrangement of the multiple discrete collimated beams is not limited to the above-mentioned shapes, and other arrangement shapes may be used.
  • When the preset shape is a rectangle, the unit arrangement shape of the collimated beams in one period is a rectangle, and it repeats periodically in space.
  • When the preset shape is a triangle, the unit arrangement shape of the collimated beams in one period is a triangle, and it repeats periodically in space.
  • When the preset shape is a hexagon, the unit arrangement shape of the collimated beams in one period is a hexagon, and it repeats periodically in space.
  • Since the implementation is limited by the optical system, the arrangement of the actual collimated beams in the cross-section may be distorted, for example stretched or warped.
  • The energy distribution of each collimated beam in the cross-section can be a circle, a ring, or an ellipse. Such an arrangement, as shown in FIG. 5, is beneficial for simplifying the spatial correspondence between the multiple discrete collimated beams and the photodetector array 101.
  • Alternatively, the multiple discrete collimated light beams are lattice lights that are non-periodically arranged in another preset pattern.
  • The aperiodic arrangement includes any one of the following arrangements, or any multiple arrangements that can be switched between: a spatial coding arrangement, a random arrangement, and a quasi-lattice arrangement.
  • The shape of the aperiodic arrangement of the multiple discrete collimated light beams is not limited to the above-mentioned arrangements, and other arrangement shapes may be used.
  • The spatial coding arrangement means that, starting from a periodic arrangement, a part of the light beams is omitted, so as to encode spatial position information in the arrangement, as the sketch below illustrates.
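A minimal sketch of such a coded pattern, assuming a simple pseudo-random binary mask over a periodic grid (the grid size, keep probability, and seed are arbitrary illustrative choices, not taken from the disclosure):

```python
# Illustrative sketch (not from the patent): a spatially coded dot pattern built by
# starting from a periodic grid and omitting beams according to a fixed binary mask.
import numpy as np

def coded_pattern(rows: int, cols: int, keep_probability: float = 0.75,
                  seed: int = 0) -> np.ndarray:
    """Boolean (rows, cols) array: True where a collimated beam is projected."""
    rng = np.random.default_rng(seed)  # fixed seed -> reproducible code word
    return rng.random((rows, cols)) < keep_probability

pattern = coded_pattern(8, 8)
print(pattern.astype(int))  # 1 = beam present, 0 = beam omitted (the spatial code)
```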
  • The coding scheme that can be used is not limited.
  • In the random arrangement, the collimated beams are randomly distributed, so that the similarity between the arrangements at different positions is small or close to zero, as shown in FIG. 6.
  • In the quasi-lattice arrangement, the collimated light beams are arranged non-periodically at adjacent positions over short distances, and periodically over long distances. Since the implementation is limited by the optical system, the arrangement of the actual collimated beams in the cross-section may be distorted, for example stretched or warped.
  • The energy distribution of each collimated beam in the cross-section can be a circle, a ring, or an ellipse. Such an arrangement, as shown in FIG. 6, is conducive to uniform sampling of undetermined targets and optimizes the quality of the final 3D depth map.
  • The light detector adopts any one of the following light sensors: a single photon avalanche diode (SPAD), a CMOS photosensor, or a CCD photosensor.
  • The model selection of the light detector is not limited to the aforementioned light sensors, and other types of light sensors may also be used.
  • An embodiment of the present disclosure also provides an electronic device, including the 3D imaging device based on synchronous ToF discrete point clouds described in the above embodiments, and further including a display panel; the discrete beam projector 1 and the photodetector array imager 2 are located on the backlight side of the display panel;
  • the photodetector array imager 2 receives the multiple discrete collimated beams that are reflected by the target object 3 and penetrate the display panel, and obtains a depth image of the surface of the target object 3 according to the multiple discrete collimated beams.
  • The photodetector array imager 2 maintains the spatial position correspondence between the projected multiple discrete collimated beams and the photodetector array 101. In this way, each photodetector in the photodetector array 101 can measure the propagation time of light using a ToF mode based on either a beam that is continuously modulated in time or a pulsed beam, and then calculate the propagation distance of the light from the speed of light.
  • The pulse-based ToF method is also called the direct ToF method. Specifically, the photodetector can sensitively detect the waveform of a light pulse and then compare it with the emission time of the light pulse, so as to obtain the travel time of a collimated beam projected between the discrete beam projector 1 and the photodetector array imager 2.
  • The commonly used photodetector for this purpose is the single photon avalanche diode (SPAD).
  • The single photon avalanche diode can count the photons of the light pulse very sensitively and at high speed: the number of photons arriving at different times is counted within the pulse time window, and the overall waveform of the pulse is thereby restored, as in the sketch below.
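A minimal sketch of this direct-ToF principle, assuming photon arrivals have already been binned into a histogram (the bin width and counts are made-up example values, and the peak-picking step is a deliberately simple stand-in for real waveform processing):

```python
# Illustrative sketch (not from the patent): direct ToF with a SPAD-style histogram.
# Photon arrival times are binned; the peak bin gives the round-trip time, from
# which the distance follows.
import numpy as np

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def distance_from_histogram(counts: np.ndarray, bin_width_s: float) -> float:
    """Estimate target distance from a photon-count histogram over time bins."""
    peak_bin = int(np.argmax(counts))            # bin with the most detected photons
    round_trip = (peak_bin + 0.5) * bin_width_s  # time at the centre of that bin
    return SPEED_OF_LIGHT * round_trip / 2.0

# Example: 64 bins of 250 ps; a pulse return concentrated around bin 40 (~10 ns).
counts = np.zeros(64, dtype=int)
counts[39:42] = [20, 90, 25]                     # photon counts near the return
print(distance_from_histogram(counts, 250e-12))  # roughly 1.5 m
```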
  • The pulse-based ToF method has relatively low requirements on the power consumption of the projector and is beneficial for eliminating the interference of multipath beams.
  • The ToF method based on time-continuous modulation of the beam is also called the indirect ToF method.
  • The time-continuous modulation usually adopts a sine wave modulation method.
  • The photodetector can be realized with a CMOS or CCD photosensitive device.
  • The discrete beam projector 1 continuously emits collimated beams to the target object 3 under high-frequency modulation.
  • After being reflected by the target object 3, the collimated beams are received by the photodetector array 101.
  • Each photodetector records the phase change between the emitted collimated beam and the received collimated beam, from which the depth information of the corresponding position on the surface of the target object 3 can be obtained, as the sketch below illustrates.
  • Since the ToF method based on time-continuous modulation of the beam is an energy integration process, it has higher accuracy than pulsed measurement and does not require the light source to emit short, high-intensity pulses; different types of light sources and different modulation methods can be used.
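A minimal sketch of the indirect (continuous-wave) ToF phase calculation, assuming the common four-phase sampling scheme; the sign convention, modulation frequency, and sample values are illustrative assumptions, not taken from the disclosure:

```python
# Illustrative sketch (not from the patent): indirect ToF from four phase-stepped
# correlation samples of a sine-modulated beam. The exact sign convention depends
# on how the samples are defined by the sensor.
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def cw_tof_distance(c0: float, c90: float, c180: float, c270: float,
                    mod_freq_hz: float) -> float:
    """Distance (within the unambiguous range) from four phase-stepped samples."""
    phase = math.atan2(c270 - c90, c0 - c180) % (2 * math.pi)    # phase shift in [0, 2*pi)
    return SPEED_OF_LIGHT * phase / (4 * math.pi * mod_freq_hz)  # phase -> distance

# Example with a 20 MHz modulation frequency.
print(cw_tof_distance(c0=1.0, c90=0.4, c180=0.2, c270=0.8, mod_freq_hz=20e6))
```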

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The present invention relates to a 3D imaging apparatus based on a synchronous time-of-flight discrete point cloud, and to an electronic device. The apparatus comprises a discrete beam projector and a photodetector array imager; the discrete beam projector is configured to project multiple discrete collimated beams toward a target object; and the photodetector array imager is configured to receive the multiple discrete collimated beams reflected by the target object and to measure the propagation time of the multiple discrete collimated beams, so that depth data of the surface of the target object can be obtained. In the present invention, the discrete beam projector projects multiple discrete collimated beams toward the target object, so that the photodetector array imager receives the collimated beams reflected by the target object, thereby obtaining depth data of the surface of the target object; this improves the power density of the beams and achieves a balance between signal-to-noise ratio and point cloud density, so that 3D imaging can be performed at low cost, with low power consumption, and with high precision.
PCT/CN2020/087143 2019-04-30 2020-04-27 3D imaging apparatus based on synchronous time-of-flight discrete point cloud and electronic device WO2020221188A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910362506 2019-04-30
CN201910362506.X 2019-04-30

Publications (1)

Publication Number Publication Date
WO2020221188A1 (fr) 2020-11-05

Family

ID=68511609

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/087143 WO2020221188A1 (fr) 2019-04-30 2020-04-27 3D imaging apparatus based on synchronous time-of-flight discrete point cloud and electronic device

Country Status (2)

Country Link
CN (1) CN110471081A (fr)
WO (1) WO2020221188A1 (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110471081A (zh) * 2019-04-30 2019-11-19 深圳市光鉴科技有限公司 基于同步ToF离散点云的3D成像装置及电子设备
CN112839217A (zh) * 2019-11-22 2021-05-25 深圳市光鉴科技有限公司 4d摄像装置及电子设备
WO2022032516A1 (fr) * 2020-08-12 2022-02-17 深圳市速腾聚创科技有限公司 Radar laser et procédé de détection associé, support de stockage et système de détection
CN113740863A (zh) * 2021-09-26 2021-12-03 海南师范大学 一种高亮度单管实现多线光发射/接收的激光雷达系统

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100284082A1 (en) * 2008-01-21 2010-11-11 Primesense Ltd. Optical pattern projection
CN103309137A (zh) * 2012-03-15 2013-09-18 普莱姆森斯有限公司 结构光的投影机
CN107464280A (zh) * 2017-07-31 2017-12-12 广东欧珀移动通信有限公司 用户3d建模的匹配方法和装置
CN107493428A (zh) * 2017-08-09 2017-12-19 广东欧珀移动通信有限公司 拍摄控制方法和装置
CN108828786A (zh) * 2018-06-21 2018-11-16 深圳市光鉴科技有限公司 一种3d摄像头
US20190018137A1 (en) * 2017-07-14 2019-01-17 Microsoft Technology Licensing, Llc Optical projector having switchable light emission patterns
CN110471081A (zh) * 2019-04-30 2019-11-19 深圳市光鉴科技有限公司 基于同步ToF离散点云的3D成像装置及电子设备

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103472457B (zh) * 2013-09-13 2015-06-10 中国科学院空间科学与应用研究中心 稀疏孔径压缩计算关联飞行时间的三维成像系统及方法
CN105785343A (zh) * 2016-04-29 2016-07-20 中国科学院电子学研究所 空间多光束激光发射装置、多通道接收装置和探测装置
CN106972347B (zh) * 2017-05-04 2019-04-09 深圳奥比中光科技有限公司 用于3d成像的激光阵列
CN106990548A (zh) * 2017-05-09 2017-07-28 深圳奥比中光科技有限公司 阵列激光投影装置及深度相机
CN206833079U (zh) * 2017-05-09 2018-01-02 深圳奥比中光科技有限公司 阵列激光投影装置及深度相机
CN109507688B (zh) * 2017-09-15 2021-03-02 清华大学 一种激光发射装置、激光雷达探测装置及方法
CN113964159A (zh) * 2018-03-09 2022-01-21 Oppo广东移动通信有限公司 光传感器、电子装置及其制造方法
CN108810195B (zh) * 2018-03-16 2021-03-09 Oppo广东移动通信有限公司 电子装置及其制造方法
CN108401098A (zh) * 2018-05-15 2018-08-14 绍兴知威光电科技有限公司 一种tof深度相机系统及其降低外部误差的方法
CN208210162U (zh) * 2018-05-15 2018-12-07 湖北秉正讯腾科技有限公司 集成激光器驱动电路的tof 3d深度图像传感芯片
CN108960061A (zh) * 2018-06-01 2018-12-07 Oppo广东移动通信有限公司 控制方法、控制装置、电子装置、计算机设备和存储介质
CN109299662B (zh) * 2018-08-24 2022-04-12 上海图漾信息科技有限公司 深度数据计算设备与方法及人脸识别设备
CN109510885B (zh) * 2018-11-19 2020-08-07 Oppo广东移动通信有限公司 电子装置
CN109343070A (zh) * 2018-11-21 2019-02-15 深圳奥比中光科技有限公司 时间飞行深度相机

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100284082A1 (en) * 2008-01-21 2010-11-11 Primesense Ltd. Optical pattern projection
CN103309137A (zh) * 2012-03-15 2013-09-18 普莱姆森斯有限公司 结构光的投影机
US20190018137A1 (en) * 2017-07-14 2019-01-17 Microsoft Technology Licensing, Llc Optical projector having switchable light emission patterns
CN107464280A (zh) * 2017-07-31 2017-12-12 广东欧珀移动通信有限公司 用户3d建模的匹配方法和装置
CN107493428A (zh) * 2017-08-09 2017-12-19 广东欧珀移动通信有限公司 拍摄控制方法和装置
CN108828786A (zh) * 2018-06-21 2018-11-16 深圳市光鉴科技有限公司 一种3d摄像头
CN110471081A (zh) * 2019-04-30 2019-11-19 深圳市光鉴科技有限公司 基于同步ToF离散点云的3D成像装置及电子设备

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
TECHBEAT: "Non-official translation: Break Apple's Patent Monopoly, Deptrum's New Technology is Expected to Install Face ID into Middle and Low End Mobile Phone", HTTPS://WWW.SOHU.COM/A/257069376_100269170, 30 September 2018 (2018-09-30), DOI: 20200604134632X *

Also Published As

Publication number Publication date
CN110471081A (zh) 2019-11-19

Similar Documents

Publication Publication Date Title
WO2020221185A1 (fr) Appareil d'imagerie 3d basé sur un nuage de points discrets à temps de vol asynchrone, et dispositif électronique
US11860280B2 (en) Integrated illumination and detection for LIDAR based 3-D imaging
WO2020221188A1 (fr) Appareil d'imagerie 3d à base de nuage de points discrets à temps de vol synchrone et dispositif électronique
US11435446B2 (en) LIDAR signal acquisition
US10330780B2 (en) LIDAR based 3-D imaging with structured light and integrated illumination and detection
CN111142088B (zh) 一种光发射单元、深度测量装置和方法
CN110244318B (zh) 基于异步ToF离散点云的3D成像方法
WO2021072802A1 (fr) Système et procédé de mesure de distance
US20170269198A1 (en) LIDAR Based 3-D Imaging With Varying Illumination Field Density
WO2021244011A1 (fr) Procédé et système de mesure de distance, et support d'informations lisible par ordinateur
CN111796295B (zh) 一种采集器、采集器的制造方法及距离测量系统
CN112066906A (zh) 深度成像装置
CN110658529A (zh) 一种集成分束扫描单元及其制造方法
CN110716190A (zh) 一种发射器及距离测量系统
WO2022083198A1 (fr) Système de mesure de distance à balayage de lignes multiples
CN110716189A (zh) 一种发射器及距离测量系统
CN112066907B (zh) 深度成像装置
CN210128694U (zh) 深度成像装置
CN210835244U (zh) 基于同步ToF离散点云的3D成像装置及电子设备
CN112068144B (zh) 光投射系统及3d成像装置
CN111947565A (zh) 基于同步ToF离散点云的3D成像方法
CN111492264B (zh) Lidar信号获取

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20799308

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20799308

Country of ref document: EP

Kind code of ref document: A1