WO2020221185A1 - 3D imaging apparatus based on an asynchronous time-of-flight discrete point cloud, and electronic device - Google Patents

3D imaging apparatus based on an asynchronous time-of-flight discrete point cloud, and electronic device

Info

Publication number
WO2020221185A1
WO2020221185A1 (PCT/CN2020/087137; CN2020087137W)
Authority
WO
WIPO (PCT)
Prior art keywords
discrete
target object
asynchronous
collimated
laser
Prior art date
Application number
PCT/CN2020/087137
Other languages
English (en)
Chinese (zh)
Inventor
吕方璐
程世球
Original Assignee
深圳市光鉴科技有限公司
Priority date
Filing date
Publication date
Application filed by 深圳市光鉴科技有限公司 filed Critical 深圳市光鉴科技有限公司
Publication of WO2020221185A1

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements

Definitions

  • the present disclosure relates to the field of 3D imaging, and in particular, to a 3D imaging device and electronic equipment based on asynchronous ToF discrete point clouds.
  • ToF (time of flight) is a 3D imaging technology in which measurement light is emitted from a projector and reflected by the target object back to a receiver, so that the distance from the object to the sensor can be obtained from the propagation time of the measurement light over this path.
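The distance relationship described above can be illustrated with a short calculation (a generic sketch, not taken from the patent; the function name is illustrative):

```python
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Distance to the target from the measured round-trip time.

    The measurement light travels to the object and back, so the
    one-way distance is half the total path length.
    """
    return C * round_trip_time_s / 2.0

# A 10 ns round trip corresponds to roughly 1.5 m.
d = tof_distance(10e-9)
```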
  • Commonly used ToF technology includes single point scanning projection method and surface light projection method.
  • the ToF method of single-point scanning projection uses a single-point projector to project a single collimated beam, whose projection direction is steered by a scanning device so that it can be aimed at different target positions. After the single collimated beam is reflected by the target, part of the light is received by a single-point photodetector, yielding the depth measurement for the current projection direction.
  • This method can concentrate all the optical power on a target point, thereby achieving a high signal-to-noise ratio at a single target point, and then achieving high-precision depth measurement.
  • scanning of the entire target object relies on scanning devices such as mechanical motors, MEMS mirrors, and optical phased arrays. Splicing the depth data points obtained by scanning yields the discrete point cloud data required for 3D imaging. This method is conducive to long-distance 3D imaging, but requires a complex projection scanning system and is relatively costly.
  • the ToF method of surface light projection is to project a surface beam with continuous energy distribution.
  • the projected light continuously covers the surface of the target object.
  • the photodetector is an array of photodetectors that can obtain the travel time of the beam.
  • the depth obtained at each detector image point is the depth of the object position that corresponds to it through the object-image relationship.
  • the purpose of the present disclosure is to provide a 3D imaging device and electronic equipment based on asynchronous ToF discrete point clouds.
  • the present disclosure adopts an asynchronous discrete beam projection method, which can maintain a low optical power during each projection, and can meet the limitations of power consumption and laser eye safety.
  • the 3D imaging device based on asynchronous ToF discrete point cloud includes an asynchronous discrete beam projector and a photodetector array imager;
  • the asynchronous discrete beam projector is configured to sequentially project multiple discrete collimated beams to different regions of the target object in a time sequence within a preset time period;
  • the photodetector array imager is configured to sequentially receive the multiple discrete collimated beams reflected by the target object and measure their propagation time, so as to obtain the depth data of multiple regions of the target object; the depth data of the surface of the target object is then generated from the depth data of those regions.
  • the asynchronous discrete beam projector includes an edge emitting laser and a beam projector arranged on an optical path;
  • the edge emitting laser is configured to project laser light to the beam projector
  • the beam projector is configured to project a plurality of discrete collimated beams of the incident laser light sequentially to different regions of the target object in a time sequence within a preset time period.
  • the asynchronous discrete beam projector includes a laser array, a collimating lens and a beam splitting device arranged on an optical path;
  • the laser array includes a plurality of lasers arranged in an array, the plurality of lasers being divided into a plurality of laser emitting groups, each of which projects a first order of magnitude of laser beams to the collimating lens;
  • the collimating lens is configured to collimate the multiple incident laser beams and then emit a collimated beam of a first order of magnitude;
  • the beam splitting device is configured to split the incident collimated beam of the first order of magnitude and then emit the collimated beam of the second order of magnitude;
  • the second order of magnitude is greater than the first order of magnitude.
  • the photodetector array imager includes an optical imaging lens, a photodetector array, and a driving circuit;
  • the photodetector array includes a plurality of photodetectors distributed in an array, the plurality of photodetectors being divided into a plurality of photodetector groups, each photodetector group corresponding to one laser emitting group;
  • the photodetector group is configured to receive the collimated light beam reflected by the target object after being emitted by the corresponding laser emitting group;
  • the optical imaging lens is configured such that the incidence direction of a collimated light beam entering the photodetector array through the optical imaging lens corresponds one-to-one to a photodetector;
  • the driving circuit is configured to measure the propagation time of the multiple discrete collimated light beams and generate depth data for multiple regions on the surface of the target object, and then generate the depth data of the target surface from the depth data of those regions.
  • the multiple discrete collimated light beams are periodically arranged in a predetermined shape.
  • the preset shape includes any of the following shapes or any multiple shapes that can be switched with each other:
  • the multiple discrete collimated beams are non-periodically arranged in another preset shape.
  • the aperiodic arrangement includes any of the following arrangements or any multiple arrangements that can be switched between:
  • the asynchronous discrete beam projector includes multiple groups of edge-emitting lasers and beam projectors arranged on an optical path;
  • the edge-emitting laser is configured to project laser light to the beam projector;
  • the beam projector is configured to project the incident laser light into multiple discrete collimated beams;
  • the electronic equipment provided by the present disclosure includes the 3D imaging device based on asynchronous ToF discrete point clouds and a display panel; the asynchronous discrete beam projector and the photodetector array imager are located on the backlight side of the display panel;
  • the asynchronous discrete beam projector sequentially projects multiple discrete collimated light beams to different areas of the target object in a time sequence within a preset time period; the multiple discrete collimated light beams penetrate the display panel and then shine on the target object;
  • the photodetector array imager receives the multiple discrete collimated light beams that penetrate the display panel after being reflected by the target object, obtains the depth data of multiple regions of the target object according to the propagation time of those beams, and then generates the depth data of the surface of the target object from the depth data of those regions.
  • the present disclosure uses an asynchronous discrete beam projector to project multiple discrete collimated beams to different regions of the target object in a time sequence within a preset time period, so that the photodetector array imager can obtain the depth data of multiple regions of the target object and generate the depth data of the target object's surface from them. This improves the beam power density and achieves a balance between signal-to-noise ratio and point cloud density, optimizing accuracy and point cloud density for specific 3D imaging application scenarios, while maintaining a low optical power during each projection to meet power consumption and laser eye-safety limits.
  • FIG. 1 is a schematic structural diagram of a 3D imaging device based on asynchronous ToF discrete point clouds in an embodiment of the disclosure
  • FIG. 2 is a schematic diagram of a structure of an asynchronous discrete beam projector in an embodiment of the disclosure
  • FIG. 3 is a schematic diagram of another structure of an asynchronous discrete beam projector in an embodiment of the disclosure.
  • FIG. 4 is a schematic structural diagram of an optical imaging lens in an embodiment of the disclosure.
  • FIGS. 6(a), (b), (c) are schematic diagrams of aperiodic arrangement of multiple discrete collimated beams in an embodiment of the disclosure.
  • 1 is a photodetector array imager
  • 102 is an optical imaging lens
  • 201 is an edge emitting laser
  • 202 is a beam projector
  • 203 is a laser array
  • 204 is a collimating lens
  • 205 is a beam splitting device.
  • the term "connection" may denote either a fixed physical connection or a circuit connection.
  • "first" and "second" are used only for descriptive purposes and cannot be understood as indicating or implying relative importance, or as implicitly indicating the number of technical features. Therefore, features qualified with "first" and "second" may explicitly or implicitly include one or more such features. In the description of the embodiments of the present disclosure, "plurality" means two or more, unless otherwise specifically defined.
  • the present disclosure provides a compromise between the single-point scanning projection method and the surface light projection method: a projector projects multiple discrete collimated beams at the same time and is paired with a ToF photodetector array, so that a synchronized 3D point cloud containing depth data for multiple target points can be obtained in a single measurement.
  • the number of collimated beams projected at the same time can range from a few to tens of thousands.
  • the present disclosure can trade off and optimize beam power density (i.e., signal-to-noise ratio) against point cloud density by controlling the number of beams at the same total power.
  • the present disclosure adopts an asynchronous projection method.
  • the distribution of the beams projected each time is relatively sparse, and the distribution differs between projections, so point cloud data for different regions of the target object are obtained. Stitching together the point clouds from multiple projection measurements yields a relatively dense point cloud. In this way, accuracy and point cloud density can be optimized for specific 3D imaging application scenarios while maintaining a low optical power during each projection, meeting power consumption and laser eye-safety limits.
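The stitching step described above can be sketched as follows (an illustrative simplification; the function and frame names are assumptions, not the patent's own terminology):

```python
def stitch_point_clouds(frames):
    """Merge per-projection sparse point clouds into one denser cloud.

    Each frame is a list of (x, y, z) points measured from one
    discrete-beam projection; because successive projections cover
    different regions of the target, the union is denser than any
    single frame.
    """
    cloud = []
    for frame in frames:
        cloud.extend(frame)
    return cloud

frame_a = [(0.0, 0.0, 1.2), (0.1, 0.0, 1.3)]   # beams aimed at one region
frame_b = [(0.0, 0.1, 1.1), (0.1, 0.1, 1.4)]   # beams aimed at another region
dense = stitch_point_clouds([frame_a, frame_b])
```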
  • Figure 1 is a schematic structural diagram of the 3D imaging device based on asynchronous ToF discrete point clouds in the present disclosure.
  • the 3D imaging device based on asynchronous ToF discrete point clouds provided by the present disclosure includes an asynchronous discrete beam projector 2 and a photodetector array imager 1;
  • the asynchronous discrete beam projector 2 is configured to sequentially project multiple discrete collimated beams to different regions of the target object 3 in a time sequence within a preset time period;
  • the photodetector array imager 1 is configured to sequentially receive the multiple discrete collimated beams reflected by the target object 3 and measure their propagation time, so as to obtain the depth data of multiple regions of the target object 3; the depth data of the surface of the target object 3 is then generated from the depth data of those regions.
  • the preset time period can be set to 10 milliseconds; the target object 3 includes at least a first area and a second area, so that within the preset time period one group of lasers in the asynchronous discrete beam projector 2 first projects multiple discrete collimated beams to the first area to obtain its depth data, then another group of lasers in the asynchronous discrete beam projector 2 projects multiple discrete collimated beams to the second area to acquire its depth data, and the depth data of the surface of the target object 3 is generated from the depth data of the first area and the second area.
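The two-group timing within the preset period could be sketched as below (a hypothetical scheduler; equal slots per group are an assumption, since the patent does not fix the slot lengths):

```python
def asynchronous_schedule(groups, period_ms=10.0):
    """Assign each laser emitting group one equal firing slot in the window.

    Returns (group_name, slot_start_ms, slot_end_ms) tuples covering the
    whole preset time period in chronological order.
    """
    slot = period_ms / len(groups)
    return [(name, i * slot, (i + 1) * slot) for i, name in enumerate(groups)]

# Two groups within a 10 ms preset period:
schedule = asynchronous_schedule(["group_1", "group_2"])
```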
  • the first area and the second area may be adjacent areas, or may be connected areas.
  • the first area and the second area may also be intertwined.
  • the first area includes a plurality of randomly distributed first sub-areas
  • the second area includes a plurality of randomly distributed second sub-areas located in the gaps between adjacent first sub-areas.
  • the second sub-area may include multiple regions, such as three regions, four regions, and so on.
  • the asynchronous discrete beam projector 2 sequentially projects multiple discrete collimated beams to different regions of the target object 3 in a time sequence to generate multiple depth data for the surface of the target object 3, from which optimal depth data for the surface of the target object 3 is generated iteratively.
  • the present disclosure uses the asynchronous discrete beam projector 2 to project multiple discrete collimated beams to different areas of the target object 3 in a time sequence within a preset time period, so that the photodetector array imager 1 can obtain the depth data of multiple regions of the target object 3 and generate the depth data of the surface of the target object 3 from them. This improves the beam power density and balances it against point cloud density, optimizing accuracy and point cloud density for specific 3D imaging application scenarios while maintaining a low optical power during each projection, meeting power consumption and laser eye-safety limits.
  • the multiple discrete collimated beams projected by the asynchronous discrete beam projector 2 are reflected by the target object 3, and the partially reflected collimated beams are received by the photodetector array 101.
  • These discrete depth data points construct point cloud data that can reproduce the 3D shape of the object, so as to realize the 3D imaging of the target object 3.
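Constructing such a point cloud from discrete depth readings can be sketched as below (a generic illustration; the function name and coordinate convention are assumptions):

```python
def depth_to_point(direction, depth):
    """Convert one beam's depth reading into a 3D point.

    direction is a unit vector from the imager toward the target along
    the beam; scaling it by the measured depth places the surface point
    in the sensor's coordinate frame.
    """
    dx, dy, dz = direction
    return (dx * depth, dy * depth, dz * depth)

# A beam along the optical axis measuring 1.5 m:
p = depth_to_point((0.0, 0.0, 1.0), 1.5)
```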
  • the plurality of discrete collimated light beams have a cone shape.
  • the number of the multiple discrete collimated beams is between two beams and tens of thousands of beams, such as 2 beams to 100,000 beams.
  • the 3D imaging device based on asynchronous ToF discrete point cloud includes a driving circuit connected with the asynchronous discrete beam projector 2 and the photodetector array imager 1.
  • the driving circuit is configured to control the asynchronous discrete beam projector 2 and the photodetector array imager 1 to be turned on or off at the same time.
  • the driving circuit can be an independent dedicated circuit, such as a dedicated SOC chip, FPGA chip, ASIC chip, etc., or a general-purpose processor, for example, when the depth camera is integrated into smart terminals such as mobile phones, TVs, computers, etc. ,
  • the processor in the terminal can be used as at least part of the processing circuit
  • FIG. 2 is a schematic structural diagram of the asynchronous discrete beam projector in the present disclosure.
  • the asynchronous discrete beam projector 2 includes an edge emitting laser 201 and a beam projector 202 arranged on an optical path;
  • the edge-emitting laser 201 is configured to project laser light to the beam projector 202;
  • the beam projector 202 is configured to project a plurality of discrete collimated beams of the incident laser light to different regions of the target object 3 sequentially in time sequence within a preset time period.
  • the beam projector 202 is provided with multiple groups of beam projection ports; one group at a time is controlled, in chronological order within the preset time period, to project multiple discrete collimated beams to a different region of the target object 3. That is, within a preset time period, each group of beam projection ports is opened once to project multiple discrete collimated beams; when one group of beam projection ports is open, the other groups are closed.
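The one-group-open-at-a-time rule can be sketched as a simple gating function (illustrative only; port indexing is an assumption):

```python
def port_states(num_groups, active_index):
    """Open exactly one group of beam projection ports; close the rest.

    Returns a list of booleans, True for the single open group.
    """
    return [i == active_index for i in range(num_groups)]

# Over one preset period, each of three groups is opened once, in sequence:
sequence = [port_states(3, k) for k in range(3)]
```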
  • the beam projection ports can be opened and closed by an electromagnetic drive or by a micro-motor-driven shutter block.
  • the beam projector 202 can realize the function of dividing the incident light from the edge-emitting laser 201 into any number of collimated beams.
  • the emission direction of the edge-emitting laser 201 and the emission direction of the beam projector 202 may be the same, or may be 90 degrees or any angle required for the design of the optical system.
  • the beam splitting projector is composed of an optical lens whose inner surface is processed with a micro-nano-structured optical chip.
  • the beam splitting projector can realize the function of dividing the incident light from the edge-emitting laser 201 into any number of collimated beams.
  • the emission direction of the edge-emitting laser 201 and the projection direction of the beam splitting projector may be the same, or may be 90 degrees or any angle required for the design of the optical system.
  • the asynchronous discrete beam projector 2 includes multiple sets of edge emitting lasers 201 and beam projectors 202 arranged on an optical path;
  • the edge-emitting laser 201 is configured to project laser light to the beam projector 202; the beam projector 202 is configured to project the incident laser light into multiple discrete collimated beams;
  • one group at a time, consisting of an edge-emitting laser 201 and a beam projector 202 arranged on an optical path, is turned on in chronological order within the preset time period to project multiple discrete collimated beams to a region of the target object 3. That is, within a preset time period, each group of edge-emitting lasers 201 and beam projectors 202 arranged on an optical path is turned on once to project multiple discrete collimated beams.
  • while one group is on, the other groups of edge-emitting lasers 201 and beam projectors 202 arranged on an optical path are turned off.
  • FIG. 3 is a schematic diagram of another structure of the asynchronous discrete beam projector in this disclosure.
  • the asynchronous discrete beam projector 2 includes a laser array 203, a collimating lens 204, and a beam splitting device 205 arranged on an optical path.
  • the laser array 203 includes a plurality of lasers arranged in an array, the plurality of lasers being divided into a plurality of laser emitting groups; within a preset time period, each laser emitting group projects a first order of magnitude of laser beams to the collimating lens 204;
  • the collimating lens 204 is configured to collimate the multiple incident laser beams and then emit a collimated beam of a first order of magnitude;
  • the beam splitting device 205 is configured to split an incident collimated beam of a first order of magnitude and then emit a collimated beam of a second order of magnitude;
  • the second order of magnitude is greater than the first order of magnitude.
  • each of the lasers or each group of the lasers can be individually controlled to emit, so that each group of lasers can be controlled to emit lasers during a preset time period to achieve the target object 3 Projections from different areas.
  • the number of the laser emitting groups is consistent with the number of regions of the target object 3.
  • the laser array 203 may be composed of multiple vertical cavity surface emitting lasers (VCSEL) or multiple edge emitting lasers (Edge Emitting Laser, EEL). After passing through the collimating lens 204, multiple laser beams can become highly parallel collimated beams. According to the requirement of the number of discrete beams in practical applications, the beam splitting device 205 can be used to achieve more collimated beams.
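The relationship between the first and second orders of magnitude can be sketched with a simple beam-count calculation (the specific numbers below are illustrative, not from the patent):

```python
def beam_count(lasers_per_group: int, split_factor: int) -> int:
    """Collimated beams after the beam splitting device.

    Each laser in the active emitting group yields one collimated beam
    (the first order of magnitude); an M-way splitting device such as a
    DOE multiplies that count by M, giving the second, larger order of
    magnitude.
    """
    return lasers_per_group * split_factor

# e.g. 100 VCSELs in a group, split 9 ways -> 900 discrete beams per projection
total = beam_count(100, 9)
```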
  • the beam splitting device 205 may adopt a diffractive optical element (DOE), a spatial light modulator (SLM), or the like.
  • the photodetector array imager 1 includes an optical imaging lens 102, a photodetector array 101, and a driving circuit;
  • the photodetector array 101 includes a plurality of photodetectors distributed in an array, the plurality of photodetectors being divided into a plurality of photodetector groups, each photodetector group corresponding to one laser emitting group
  • the photodetector group is configured to receive the collimated light beam reflected by the target object 3 after being emitted by the corresponding laser emitting group;
  • the optical imaging lens 102 is configured such that the incidence direction of a collimated light beam entering the photodetector array 101 through the optical imaging lens 102 corresponds one-to-one to a photodetector;
  • the driving circuit is configured to measure the propagation time of multiple discrete collimated light beams and then generate depth data of multiple regions on the surface of the target object 3, and then according to the depth data of multiple regions of the target object 3 The depth data of the surface of the target object 3 is generated.
  • the photodetector groups and the laser emitting groups are in one-to-one correspondence; that is, once the spatial relationship between each detector and the projected collimated beams has been accurately established by calibration, in asynchronous mode only the photodetector group corresponding to the laser group whose collimated beams are currently projected is turned on, while the other photodetector groups are turned off, reducing photodetector power consumption and system noise interference.
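This gating of detector groups can be sketched as follows (a hypothetical helper; group names and the dict representation are assumptions for illustration):

```python
def active_detectors(detector_groups, firing_group):
    """Enable only the photodetector group paired with the firing lasers.

    Assumes a calibrated one-to-one pairing between laser emitting groups
    and photodetector groups; all other groups stay off to save power and
    reject stray light from other regions.
    """
    return {name: (name == firing_group) for name in detector_groups}

states = active_detectors(["pd_a", "pd_b", "pd_c"], firing_group="pd_b")
```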
  • the number of the photodetector groups is consistent with the number of regions of the target object 3.
  • a narrow-band filter is usually installed in the optical imaging lens 102, so that only incident collimated light within a preset wavelength band reaches the photodetector array 101.
  • the preset wavelength may be the wavelength of the incident collimated beam, or may lie within 50 nanometers above or below it.
  • the photodetector array 101 may be arranged periodically or non-periodically. Each photodetector cooperates with an auxiliary circuit to measure the time of flight of a collimated beam.
  • the photodetector array 101 can be a combination of multiple single-point photodetectors or a sensor chip integrating multiple photodetectors.
  • the irradiation spot of a discrete collimated beam on the target object 3 may correspond to one or more photodetectors.
  • the signal of each detector can be connected through a circuit, so that it can be combined into a photodetector with a larger detection area.
  • the multiple discrete collimated beams are periodically arranged in a predetermined shape, that is, distributed in a geometric regularity.
  • the preset The shape of includes any of the following shapes or any multiple shapes that can be switched between:
  • the shape of the periodic arrangement of the multiple discrete collimated beams is not limited to the above-mentioned shape, and may be arranged in other shapes.
  • when the preset shape is a rectangle, the unit arrangement of the collimated beams in one period is rectangular and repeats periodically in space.
  • when the preset shape is a triangle, the unit arrangement of the collimated beams in one period is triangular and repeats periodically in space.
  • when the preset shape is a hexagon, the unit arrangement of the collimated beams in one period is hexagonal and repeats periodically in space.
  • because the implementation is limited by the optical system, the cross-section of each collimated beam may be distorted, for example by stretching or warping.
  • the energy distribution of each collimated beam in the cross-section can take other shapes such as a circle, a ring, or an ellipse. An arrangement such as that shown in FIG. 5 helps simplify the spatial correspondence between the multiple discrete collimated beams and the photodetector array 101.
  • the multiple discrete collimated beams are non-periodically arranged in another preset shape.
  • the aperiodic arrangement includes any of the following arrangements or any multiple arrangements that can be switched between:
  • the shape of the aperiodic arrangement of the multiple discrete collimated light beams is not limited to the above-mentioned shape, and may be arranged in other shapes.
  • the spatial coding arrangement means that, starting from a periodic arrangement, part of the beams are omitted, thereby spatially encoding the arrangement positions.
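A coded pattern of this kind could be generated as below (a minimal sketch; the grid representation and the choice of omitted positions are illustrative assumptions):

```python
def coded_pattern(rows, cols, omitted):
    """Periodic beam grid with selected positions omitted (set to 0).

    Omitting part of the beams encodes the arrangement spatially, so a
    local neighbourhood of spots can identify its position in the grid.
    """
    return [[0 if (r, c) in omitted else 1 for c in range(cols)]
            for r in range(rows)]

pattern = coded_pattern(3, 3, omitted={(0, 1), (2, 2)})
```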
  • the coding that can be used is not limited.
  • the random arrangement means that the collimated beams are randomly distributed, so that the similarity between the arrangements at different positions is small or close to zero, as shown in FIG. 6.
  • the quasi-lattice arrangement means that the collimated light beams are arranged non-periodically at short range between adjacent positions, and periodically at long range. Since the implementation of the present disclosure is limited by the optical system, the actual cross-sectional arrangement of the collimated beams may be distorted, for example by stretching or warping.
  • the energy distribution of each collimated beam in the cross-section can be a circle, a ring, or an ellipse. An arrangement such as that shown in FIG. 6 is conducive to uniform sampling of undetermined targets and optimizes the final 3D depth map.
  • the light detector adopts any of the following light sensors:
  • the choice of light detector is not limited to the aforementioned light sensors, and may also include other types of light sensors.
  • the electronic device provided by the present disclosure includes the 3D imaging device based on asynchronous ToF discrete point cloud, and also includes a display panel; the asynchronous discrete beam projector and the photodetector array imager are located at The backlight side of the display panel;
  • the asynchronous discrete beam projector 2 sequentially projects multiple discrete collimated beams to different areas of the target object 3 in a time sequence within a preset period of time; the multiple discrete collimated beams penetrate the display panel and then shine on the target object 3;
  • the photodetector array imager 1 receives the multiple discrete collimated light beams that are reflected by the target object 3 and penetrate the display panel, obtains the depth data of multiple regions of the target object 3 according to the propagation time of those beams, and then generates the depth data of the surface of the target object 3 from the depth data of those regions.
  • the photodetector array imager 1 ensures the spatial correspondence between the projected multiple discrete collimated beams and the photodetector array 101. In this way, each photodetector in the photodetector array 101 can measure the propagation time of light using either a time-continuously modulated beam or a pulsed ToF mode, and then compute the propagation distance using the speed of light.
  • the pulse-based ToF method, also called the direct ToF method, works as follows: the photodetector sensitively detects the waveform of a light pulse and compares it with the pulse's emission time to obtain the travel time of the collimated beam between the asynchronous discrete beam projector 2 and the photodetector array imager 1.
  • the commonly used photodetector is a single photon avalanche diode (SPAD).
  • the single-photon avalanche diode can count the photons of the light pulse sensitively and at high speed: the number of photons arriving at different times is counted within the pulse time window, and the overall waveform of the pulse is reconstructed.
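Recovering the pulse arrival from such photon counts can be sketched as a histogram-peak search (an illustrative simplification of direct ToF; the bin width and counts below are invented for the example):

```python
def pulse_arrival_bin(photon_counts):
    """Recover the light-pulse arrival time bin from SPAD photon counts.

    photon_counts[i] is the number of photons counted in time bin i
    within the pulse window; the histogram peak approximates the pulse
    arrival, from which the round-trip time follows.
    """
    return max(range(len(photon_counts)), key=photon_counts.__getitem__)

C = 299_792_458.0
BIN_WIDTH_S = 1e-9  # 1 ns bins (illustrative)

counts = [2, 3, 2, 15, 40, 18, 4, 2]  # dark counts plus a clear pulse
t_round_trip = pulse_arrival_bin(counts) * BIN_WIDTH_S
distance = C * t_round_trip / 2.0  # peak in bin 4 -> ~0.6 m
```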
  • the pulse-based ToF method has relatively low requirements for the power consumption of the projector, and is beneficial to eliminate the interference of multipath beams.
  • the ToF method based on time-continuous modulation of the beam is also called the indirect ToF method.
  • the time-continuous modulation usually adopts sine-wave modulation, and the photodetector can be implemented with a CMOS or CCD photosensitive element.
  • the asynchronous discrete beam projector 2 continuously emits a high-frequency-modulated collimated beam toward the target object 3; after being reflected by the target object 3, the beam is received by the photodetector array 101.
  • each photodetector records the phase change between the emitted collimated beam and the received collimated beam, from which the depth information of the corresponding surface position of the target object 3 is obtained.
  • because the ToF method based on time-continuous modulation of the beam is an energy-integration process, it achieves higher accuracy than pulsed measurement and does not require the light source to emit short, high-intensity pulses; different types of light sources and different modulation schemes can therefore be used.
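The direct (pulsed) ToF measurement described above can be sketched in a few lines. This is a minimal illustration rather than the patent's implementation: the histogram bin width, the function name, and the peak-picking strategy are assumptions.

```python
C = 299_792_458.0  # speed of light in m/s


def distance_from_histogram(counts, bin_width_s=1e-10):
    """Direct (pulsed) ToF sketch: a SPAD counts photons per time bin
    inside the pulse window; the peak bin approximates the round-trip
    travel time of the reflected pulse."""
    # Index of the bin with the most photon counts (the reconstructed pulse peak).
    peak_bin = max(range(len(counts)), key=lambda i: counts[i])
    round_trip_s = (peak_bin + 0.5) * bin_width_s  # use the bin center
    return C * round_trip_s / 2.0  # halve: light travels out and back
```

With the assumed 100 ps bins, a histogram peaking in bin 99 corresponds to a round trip of roughly 10 ns, i.e. a target about 1.5 m away.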
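The indirect (phase-based) ToF relation can likewise be sketched. The modulation frequency and the function name below are illustrative assumptions; the formula itself is the standard phase-to-distance conversion for sine-wave modulation.

```python
import math

C = 299_792_458.0  # speed of light in m/s


def indirect_tof_distance(phase_shift_rad, mod_freq_hz):
    """Indirect ToF sketch: the delay between emitted and received
    sine-modulated beams is phase / (2*pi*f); distance is c * delay / 2.
    The result is unambiguous only within c / (2*f)."""
    return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)
```

For example, at 100 MHz modulation a phase shift of pi radians corresponds to about 0.75 m, and the unambiguous range is about 1.5 m.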


Abstract

The present invention relates to a 3D imaging apparatus based on an asynchronous time-of-flight discrete point cloud, the apparatus comprising: an asynchronous discrete beam projector (2) sequentially projecting a plurality of discrete collimated beams onto different regions of a target object (3) in a time sequence within a preset period; and a photodetector array imager (1) sequentially receiving the plurality of discrete collimated beams reflected by the target object (3) and measuring their propagation time, thereby obtaining depth data of the plurality of regions of the target object (3), and generating depth data of the surface of the target object (3) from the depth data of those regions. The invention achieves the best accuracy and point-cloud density for a given 3D imaging application scenario, and can keep the optical power relatively low for each projection, which makes it possible to satisfy the constraints on power consumption and on laser safety for the human eye.
PCT/CN2020/087137 2019-04-30 2020-04-27 3D imaging apparatus based on asynchronous time-of-flight discrete point cloud, and electronic device WO2020221185A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910362489.X 2019-04-30
CN201910362489.XA CN110221309B (zh) 3D imaging apparatus and electronic device based on asynchronous ToF discrete point cloud

Publications (1)

Publication Number Publication Date
WO2020221185A1 true WO2020221185A1 (fr) 2020-11-05

Family

ID=67820430

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/087137 WO2020221185A1 (fr) 2020-04-27 3D imaging apparatus based on asynchronous time-of-flight discrete point cloud, and electronic device

Country Status (2)

Country Link
CN (1) CN110221309B (fr)
WO (1) WO2020221185A1 (fr)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110221309B (zh) * 2019-04-30 2021-08-17 深圳市光鉴科技有限公司 3D imaging apparatus and electronic device based on asynchronous ToF discrete point cloud
CN110764101B (zh) * 2019-11-07 2023-05-05 浙江缔科新技术发展有限公司 Optical-quantum laser sight with height-measurement function
CN112824934B (zh) * 2019-11-20 2024-05-07 深圳市光鉴科技有限公司 ToF multipath interference removal method, system, device and medium based on modulated light field
US20230107567A1 (en) * 2020-03-16 2023-04-06 Ningbo ABAX Sensing y Co., Ltd. Device and method for measuring distance by time of flight
WO2022032516A1 (fr) * 2020-08-12 2022-02-17 深圳市速腾聚创科技有限公司 Laser radar and detection method therefor, storage medium and detection system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201707438U (zh) * 2010-05-28 2011-01-12 中国科学院合肥物质科学研究院 Three-dimensional imaging system based on LED-array shared-lens ToF depth measurement
CN103731611A (zh) * 2012-10-12 2014-04-16 三星电子株式会社 Depth sensor, image capturing method, and image processing system
US20170127036A1 (en) * 2015-10-29 2017-05-04 Samsung Electronics Co., Ltd. Apparatus and method for acquiring depth information
CN107765260A (zh) * 2016-08-22 2018-03-06 三星电子株式会社 Method, device, and computer-readable recording medium for acquiring distance information
CN108140666A (zh) * 2015-08-10 2018-06-08 特里纳米克斯股份有限公司 Organic detector for the optical detection of at least one object
CN110221309A (zh) * 2019-04-30 2019-09-10 深圳市光鉴科技有限公司 3D imaging apparatus and electronic device based on asynchronous ToF discrete point cloud

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10698110B2 (en) * 2015-03-05 2020-06-30 Teledyne Digital Imaging, Inc. Laser scanning apparatus and method
CN109313267B (zh) * 2016-06-08 2023-05-02 新唐科技日本株式会社 Ranging system and ranging method
JP7028588B2 (ja) * 2017-09-04 2022-03-02 株式会社日立エルジーデータストレージ Three-dimensional distance measuring device
CN107907055B (zh) * 2017-12-14 2024-01-26 北京驭光科技发展有限公司 Pattern projection module, three-dimensional information acquisition system, processing device, and measurement method
CN108594455B (zh) * 2018-03-23 2019-12-13 深圳奥比中光科技有限公司 Structured-light projection module and depth camera
CN109116332A (zh) * 2018-09-05 2019-01-01 Oppo广东移动通信有限公司 Array light source, ToF ranging method, camera module, and electronic device
CN109343070A (zh) * 2018-11-21 2019-02-15 深圳奥比中光科技有限公司 Time-of-flight depth camera
CN109597211B (zh) * 2018-12-25 2022-01-14 奥比中光科技集团股份有限公司 Projection module, depth camera, and depth image acquisition method


Also Published As

Publication number Publication date
CN110221309B (zh) 2021-08-17
CN110221309A (zh) 2019-09-10

Similar Documents

Publication Publication Date Title
WO2020221185A1 (fr) 3D imaging apparatus based on asynchronous time-of-flight discrete point cloud, and electronic device
US20220026575A1 (en) Integrated illumination and detection for lidar based 3-d imaging
CN110244318B (zh) 3D imaging method based on asynchronous ToF discrete point cloud
US10330780B2 (en) LIDAR based 3-D imaging with structured light and integrated illumination and detection
US11435446B2 (en) LIDAR signal acquisition
US10393877B2 (en) Multiple pixel scanning LIDAR
CN108603937B (zh) 具有远场照射重叠的lidar式3-d成像
WO2020221188A1 (fr) 3D imaging apparatus and electronic device based on synchronous time-of-flight discrete point cloud
CN110824490B (zh) Dynamic distance measurement system and method
WO2022021797A1 (fr) Distance measurement system and distance measurement method
CN111796295B (zh) Collector, collector manufacturing method, and distance measurement system
CN110658529A (zh) Integrated beam-splitting scanning unit and manufacturing method therefor
CN110716190A (zh) Transmitter and distance measurement system
CN112066906A (zh) Depth imaging device
CN110716189A (zh) Transmitter and distance measurement system
CN210835244U (zh) 3D imaging apparatus and electronic device based on synchronous ToF discrete point cloud
CN112066907B (zh) Depth imaging device
CN210128694U (zh) Depth imaging device
CN111947565A (zh) 3D imaging method based on synchronous ToF discrete point cloud
CN211426798U (zh) Integrated beam-splitting scanning unit
CN112068144A (zh) Light projection system and 3D imaging apparatus
EP3665503A1 (fr) LIDAR signal acquisition

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20799579

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20799579

Country of ref document: EP

Kind code of ref document: A1