CN110221309B - 3D imaging device and electronic equipment based on asynchronous ToF discrete point cloud - Google Patents
- Publication number
- CN110221309B (application number CN201910362489.XA)
- Authority
- CN
- China
- Prior art keywords
- discrete
- target object
- collimated light
- asynchronous
- light beams
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/481—Constructional features, e.g. arrangements of optical elements
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Electromagnetism (AREA)
- Optical Radar Systems And Details Thereof (AREA)
Abstract
The invention provides a 3D imaging device and electronic equipment based on an asynchronous ToF discrete point cloud. An asynchronous discrete light beam projector sequentially projects a plurality of discrete collimated light beams onto different areas of a target object in time sequence within a preset time period. A light detector array imager sequentially receives the plurality of discrete collimated light beams reflected by the target object and measures their propagation times, thereby obtaining depth data for the plurality of regions of the target object and generating depth data of the target object's surface from the regional depth data. The invention achieves optimized precision and point cloud density for a specific 3D imaging application scenario while keeping the optical power of each projection low, so that the limits of power consumption and laser eye safety can be met.
Description
Technical Field
The invention relates to the field of 3D imaging, in particular to a 3D imaging device and electronic equipment based on asynchronous ToF discrete point cloud.
Background
ToF (time of flight) is a 3D imaging technique in which measurement light emitted from a projector is reflected by a target object back to a receiver; the spatial distance from the object to the sensor is then obtained from the propagation time of the measurement light along this path. Common ToF techniques include single-point scanning projection and surface (area) light projection.
The single-point scanning ToF method uses a single-point projector to project one collimated beam whose direction is steered by a scanning device so that it can be aimed at different target positions. After the collimated beam is reflected by the target object, part of the light is received by a single-point photodetector, yielding a depth measurement for the current projection direction. Because all of the optical power is concentrated on one target point, a high signal-to-noise ratio is achieved at that point, enabling high-precision depth measurement. Scanning the entire target object relies on devices such as mechanical motors, MEMS mirrors, or optical phased arrays. The depth points obtained by scanning are stitched together into the discrete point cloud data required for 3D imaging. This method is well suited to long-range 3D imaging but requires a complex and costly projection scanning system.
The surface-light-projection ToF method projects a beam with a continuous energy distribution that covers the surface of the target object. The detector is a photodetector array capable of measuring the propagation time of the light. When the optical signal reflected by the target object is imaged onto the detector through an optical imaging system, each detector image point yields the depth of the object position corresponding to it under the object-image relationship. This method needs no complex scanning system. However, because the optical power density of surface light projection is much lower than that of a single collimated beam, the signal-to-noise ratio is greatly reduced compared with single-point scanning, so the method is limited to shorter-range, lower-precision scenarios.
Disclosure of Invention
In view of the defects in the prior art, the present invention provides a 3D imaging device and an electronic device based on an asynchronous ToF discrete point cloud. By adopting an asynchronous discrete beam projection method, the invention can keep the optical power of each projection low and thus meet the limits of power consumption and laser eye safety.
The invention provides a 3D imaging device based on asynchronous ToF discrete point cloud, which comprises an asynchronous discrete light beam projector and a light detector array imager;
the asynchronous discrete light beam projector is used for sequentially projecting a plurality of discrete collimated light beams onto different areas of the target object in time sequence within a preset time period;
the photodetector array imager is configured to sequentially receive the plurality of discrete collimated light beams reflected by the target object and measure propagation time of the plurality of discrete collimated light beams, so as to obtain depth data of a plurality of regions of the target object, and generate depth data of a surface of the target object according to the depth data of the plurality of regions of the target object.
Preferably, the asynchronous discrete beam projector comprises an edge-emitting laser and a beam projector disposed on an optical path;
the edge-emitting laser is used for projecting laser to the beam projector;
the light beam projector is used for sequentially projecting the incident laser onto different areas of the target object in time sequence within a preset time period.
Preferably, the asynchronous discrete beam projector comprises a laser array, a collimating lens and a beam splitting device which are arranged on an optical path;
the laser array comprises a plurality of lasers arranged in an array; the lasers are divided into a plurality of laser emission groups, and within a preset time period each laser emission group in turn projects a first-order-of-magnitude number of laser beams to the collimating lens;
the collimating lens is used for collimating the plurality of incident laser beams and emitting a first-order-of-magnitude number of collimated light beams;
the beam splitting device is used for splitting the incident first-order-of-magnitude collimated light beams and emitting a second-order-of-magnitude number of collimated light beams;
the second order of magnitude is greater than the first order of magnitude.
Preferably, the photodetector array imager comprises an optical imaging lens, a photodetector array and a driving circuit;
the optical detector array comprises a plurality of optical detectors distributed in an array; the optical detectors are divided into a plurality of optical detector groups, and each optical detector group corresponds to one laser emission group;
the optical detector group is used for receiving collimated light beams which are emitted by the corresponding laser emitting group and then reflected by the target object;
the optical imaging lens is used for bringing the direction vectors of the collimated light beams that pass through it and enter the light detector array into one-to-one correspondence with the light detectors;
the driving circuit is used for measuring the propagation time of the plurality of discrete collimated light beams and further generating depth data of the plurality of regions of the surface of the target object, and further generating the depth data of the surface of the target object according to the depth data of the plurality of regions of the target object.
Preferably, the plurality of discrete collimated light beams are arranged periodically in a predetermined shape.
Preferably, the preset shape includes any one of the following shapes or any plurality of shapes that can be switched with each other:
- a straight line;
-a triangle;
-a quadrilateral;
-a rectangle;
-circular;
-a hexagon;
-a pentagon.
Preferably, the plurality of discrete collimated light beams are non-periodically arranged in another predetermined shape.
Preferably, the aperiodic arrangement includes any one of the following arrangements or any plurality of arrangements that can be switched with each other:
-a random arrangement;
-a spatial coding arrangement;
-a quasi-lattice arrangement.
Preferably, the asynchronous discrete beam projector comprises a plurality of sets of edge-emitting lasers and beam projectors arranged on an optical path;
the edge-emitting laser is used for projecting laser to the beam projector; the beam projector is used for projecting the incident laser light into a plurality of discrete collimated light beams;
and (c) projecting a plurality of discrete collimated light beams onto different regions of the target object by sequentially controlling a set of edge-emitting lasers and beam projectors arranged in an optical path to be turned on in a time sequence within the predetermined period of time.
The electronic equipment provided by the invention comprises the 3D imaging device based on the asynchronous ToF discrete point cloud and a display panel; the asynchronous discrete beam projector and the photodetector array imager are located on a backlight side of the display panel;
when the asynchronous discrete light beam projector projects a plurality of discrete collimated light beams to different areas of the target object in sequence in time within a preset time period, the discrete collimated light beams penetrate through the display panel and then irradiate the target object;
the light detector array imager receives a plurality of discrete collimated light beams which penetrate through the display panel after being reflected by the target object, obtains depth data of a plurality of areas of the target object according to the propagation time of the plurality of discrete collimated light beams, and further generates depth data of the surface of the target object according to the depth data of the plurality of areas of the target object.
Compared with the prior art, the invention has the following beneficial effects:
according to the invention, the asynchronous discrete light beam projector projects a plurality of discrete collimated light beams to different regions of the target object in a time sequence within a preset time period, so that the light detector array imager can obtain the depth data of a plurality of regions of the target object, the depth data of the surface of the target object is generated according to the depth data of the plurality of regions of the target object, the power density of the light beams is improved, and the balance between the signal-to-noise ratio and the point cloud density is realized, thereby realizing the optimized precision and the point cloud density in a specific 3D imaging application scene, keeping lower light power during each projection, and meeting the limits of power consumption and laser eye safety.
Drawings
In order to more clearly illustrate the embodiments of the present invention and the technical solutions in the prior art, the drawings used in their description are briefly introduced below. It is obvious that the following drawings depict only embodiments of the present invention; those skilled in the art can obtain other drawings from them without creative effort. Other features, objects, and advantages of the invention will become more apparent upon reading the detailed description of non-limiting embodiments with reference to the following drawings:
FIG. 1 is a schematic structural diagram of a 3D imaging apparatus based on an asynchronous ToF discrete point cloud according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of an asynchronous discrete beam projector according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of another configuration of an asynchronous discrete beam projector in accordance with an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of an optical imaging lens according to an embodiment of the present disclosure;
FIGS. 5(a), (b), and (c) are schematic diagrams of a periodic arrangement of a plurality of discrete collimated light beams according to an embodiment of the present invention; and
fig. 6(a), (b), and (c) are schematic diagrams of non-periodic arrangements of a plurality of discrete collimated light beams in an embodiment of the invention.
In the figure:
1 is a light detector array imager;
2 is an asynchronous discrete beam projector;
3 is a target object;
101 is a photodetector array;
102 is an optical imaging lens;
201 is an edge-emitting laser;
202 is a beam projector;
203 is a laser array;
204 is a collimating lens;
205 is a beam splitting device.
Detailed Description
The present invention will be described in detail with reference to specific examples. The following examples will assist those skilled in the art in further understanding the invention, but are not intended to limit the invention in any way. It should be noted that variations and modifications can be made by persons skilled in the art without departing from the spirit of the invention, all of which fall within the scope of the present invention.
It will be understood that when an element is referred to as being "secured to" or "disposed on" another element, it can be directly on the other element or be indirectly on the other element. When an element is referred to as being "connected to" another element, it can be directly connected to the other element or be indirectly connected to the other element. The connection may be for fixation or for circuit connection.
It is to be understood that the terms "length," "width," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like are used in an orientation or positional relationship indicated in the drawings for convenience in describing the embodiments of the present invention and to simplify the description, and are not intended to indicate or imply that the referenced device or element must have a particular orientation, be constructed in a particular orientation, and be in any way limiting of the present invention.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present invention, "a plurality" means two or more unless specifically limited otherwise.
The invention provides a compromise between the single-point scanning projection method and the surface light projection method: a plurality of discrete collimated light beams are projected simultaneously by one projector and matched with a ToF photodetector array, so that a single measurement yields a 3D point cloud containing the depth data of a plurality of target points. The number of simultaneously projected collimated beams may range from a few to tens of thousands, depending on the requirements of the actual application. By controlling the number of beams, the invention balances and optimizes, at the same total power, the beam power density (i.e. the signal-to-noise ratio) against the point cloud density. When the number of beams is small, each point obtains a higher signal-to-noise ratio and precision, but the point cloud is sparse; when the number of beams is large, the point cloud is denser, while the signal-to-noise ratio and precision are relatively degraded yet still better than those of surface light projection. In order to improve the precision and the density of the point cloud at the same time, the invention adopts an asynchronous projection method: the beams projected at any one time are relatively sparse, and the beam distributions projected at different moments differ, so that point cloud data of different regions of the target object are obtained. Stitching the point clouds obtained from multiple projection measurements yields a relatively dense point cloud. In this way, optimized precision and point cloud density can be achieved for a specific 3D imaging application scenario while keeping the optical power of each projection low, so that the limits of power consumption and laser eye safety can be met.
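The stitching of sparse per-projection point clouds into one denser cloud can be illustrated in a few lines of Python. This is a minimal sketch only; the coordinates, depths, and function name are invented for illustration and are not part of the patent.

```python
# Hypothetical sketch: merge point clouds from successive sparse
# projections into one denser cloud (all data here is illustrative).

def stitch_point_clouds(frames):
    """Concatenate per-projection point lists (x, y, depth_m)
    captured at different moments into one denser cloud."""
    cloud = []
    for frame in frames:
        cloud.extend(frame)
    return cloud

# Two sparse projections covering interleaved columns of the target:
frame_a = [(0, 0, 1.20), (2, 0, 1.22)]
frame_b = [(1, 0, 1.21), (3, 0, 1.23)]
dense = stitch_point_clouds([frame_a, frame_b])
```

Each projection alone is eye-safe and low-power; only the stitched result carries the full point density.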
Fig. 1 is a schematic structural diagram of the 3D imaging device based on an asynchronous ToF discrete point cloud according to the present invention. As shown in Fig. 1, the device comprises an asynchronous discrete light beam projector 2 and a light detector array imager 1;
the asynchronous discrete light beam projector 2 is used for projecting a plurality of discrete collimated light beams to different areas of the target object 3 in sequence in time within a preset time period;
the photodetector array imager 1 is configured to sequentially receive the plurality of discrete collimated light beams reflected by the target object 3 and measure propagation time of the plurality of discrete collimated light beams, so as to obtain depth data of a plurality of regions of the target object 3, and generate depth data of the surface of the target object 3 according to the depth data of the plurality of regions of the target object 3.
In an embodiment of the present invention, the preset time period may be set to 10 milliseconds; the target object 3 at least comprises a first area and a second area, so that a plurality of discrete collimated light beams can be firstly projected to the first area by a group of lasers in the asynchronous discrete light beam projector 2 within a preset time period to obtain depth data of the first area, then a plurality of discrete collimated light beams are projected to the second area by another group of lasers in the asynchronous discrete light beam projector 2 to obtain depth data of the second area, and depth data of the surface of the target object 3 is generated according to the depth data of the first area and the depth data of the second area. The first region and the second region may be adjacent regions or may be connected regions. The first region and the second region may also be interlaced together, for example, the first region includes a plurality of randomly distributed first sub-regions, and the second region includes a plurality of randomly distributed second sub-regions at the gaps of any adjacent first sub-regions. In a variant the target object 3 may comprise a plurality of zones, such as three zones, four zones, etc.
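The time-sequenced firing of the two laser groups in this embodiment can be sketched as a simple schedule. The group names, the equal slot lengths, and the reuse of the 10 millisecond example period are illustrative assumptions, not the patent's specified implementation.

```python
# Minimal sketch: divide one preset period among the laser emission
# groups so that each group fires once, in turn.

PRESET_PERIOD_MS = 10.0  # the embodiment's example preset time period

def make_schedule(groups, period_ms=PRESET_PERIOD_MS):
    """Assign each laser emission group an equal, non-overlapping
    (start_ms, end_ms) slot within the preset period."""
    slot = period_ms / len(groups)
    return [(g, i * slot, (i + 1) * slot) for i, g in enumerate(groups)]

# One group illuminates the first region, then the other group
# illuminates the second region, within the same period:
schedule = make_schedule(["group_A", "group_B"])
```

With more regions, the same helper simply produces more, shorter slots per period.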
In an embodiment of the present invention, during a plurality of the preset time periods, the asynchronous discrete beam projector 2 sequentially projects a plurality of discrete collimated beams to different areas of the target object 3 in time sequence, so as to generate a plurality of depth data of the surface of the target object 3, and then iteratively generates optimal depth data of the surface of the target object 3 according to the plurality of depth data of the surface of the target object 3.
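The patent does not specify how the optimal depth data are iteratively generated from the depth maps of several preset periods. A pixel-wise average is one minimal stand-in, sketched here purely as an assumption.

```python
def refine_depth(frames):
    """Pixel-wise average of depth maps captured over successive
    preset periods (an assumed stand-in for the iterative step)."""
    n = len(frames)
    return [sum(px) / n for px in zip(*frames)]

# Two slightly noisy depth frames of the same two surface points:
refined = refine_depth([[1.50, 2.00], [1.54, 2.04]])
```

A real implementation would more likely weight samples by signal quality or reject outliers; the averaging above only shows where the iteration slots into the pipeline.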
In an embodiment of the present invention, the asynchronous discrete light beam projector 2 projects a plurality of discrete collimated light beams to different regions of the target object 3 in a time sequence within a preset time period, so that the photodetector array imager 1 can obtain depth data of a plurality of regions of the target object 3, and generates depth data of the surface of the target object 3 according to the depth data of the plurality of regions of the target object 3, thereby increasing the light beam power density, and achieving a balance between the signal-to-noise ratio and the point cloud density, so that an optimized precision and point cloud density in a specific 3D imaging application scene can be achieved, and a lower light power can be maintained during each projection, and the limitations of power consumption and laser eye safety can be satisfied.
The above is the core idea of the present invention. The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings, so that the above objects, features, and advantages of the invention can be more clearly understood. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
In an embodiment of the present invention, the plurality of discrete collimated light beams projected by the asynchronous discrete light beam projector 2 in a discrete pattern are reflected by the target object 3, and part of the reflected collimated light is received by the photodetector array 101. Each photodetector can obtain the flight time t of its corresponding beam from emission to reception, so that the flight distance s = c·t is obtained from the speed of light c, and the depth of each surface position of the target object 3 irradiated by the discrete beams can be measured. These discrete depth data points form point cloud data that reproduces the 3D morphology of the object, realizing 3D imaging of the target object 3. Each of the plurality of discrete collimated light beams is cone-shaped.
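The per-detector depth computation described above is direct. In the sketch below, the division by two follows from t being the emission-to-reception (round-trip) time, so the one-way depth is s/2; this is the standard ToF convention, stated here as an added assumption since the text only gives s = c·t.

```python
C = 299_792_458.0  # speed of light in m/s

def depth_from_tof(t_round_trip_s):
    """Flight distance s = c * t; with t the round-trip time,
    the one-way depth to the target is s / 2."""
    return C * t_round_trip_s / 2.0

# A 10 ns round trip corresponds to roughly 1.5 m of depth:
d = depth_from_tof(10e-9)
```

The sub-millimeter precision typical of ToF devices thus demands timing resolution on the order of picoseconds, which is why each photodetector needs a dedicated timing circuit.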
In an embodiment of the invention, the number of discrete collimated light beams is between two and several tens of thousands, for example from 2 to 100,000 beams.
In an embodiment of the invention, the asynchronous ToF discrete point cloud based 3D imaging device provided by the invention comprises a driving circuit connected with the asynchronous discrete light beam projector 2 and the photodetector array imager 1. The drive circuitry is used to control the asynchronous discrete beam projector 2 and the photodetector array imager 1 to be turned on or off simultaneously.
The driving circuit may be a separate dedicated circuit, such as a dedicated SoC chip, an FPGA chip, or an ASIC chip, or it may include a general-purpose processor: for example, when the depth camera is integrated into an intelligent terminal such as a mobile phone, television, or computer, the processor in the terminal may serve as at least part of the driving circuit.
Fig. 2 is a schematic diagram of a structure of an asynchronous discrete beam projector according to the present invention, and as shown in fig. 2, the asynchronous discrete beam projector 2 includes an edge-emitting laser 201 and a beam projector 202 disposed on an optical path;
the edge-emitting laser 201 is used for projecting laser to the beam projector 202;
the beam projector 202 is configured to sequentially project the incident laser light to different regions of the target object 3 in a time sequence within a preset time period.
The beam projector 202 is provided with a plurality of groups of beam projection ports, and projects a plurality of discrete collimated beams onto different areas of the target object 3 by sequentially switching on the groups of projection ports in time sequence within the preset time period. That is, each group of beam projection ports opens once within the preset time period to project a plurality of discrete collimated beams, and while one group is open the other groups are closed. The opening and closing of a beam projection port can be realized by a stop driven electromagnetically or by a micro motor.
In one embodiment of the present invention, the beam projector 202 is capable of splitting incident light from the edge-emitting laser 201 into any number of collimated beams. The emitting direction of the edge-emitting laser 201 and the projecting direction of the beam projector 202 may be the same, or may be at 90 degrees or any angle required for the optical system design.
In an embodiment of the invention, the inner surface of the beam projector carries a micro-nano-structured optical chip matched with an optical lens, so that the beam projector can split the incident light from the edge-emitting laser 201 into any number of collimated beams. The emission direction of the edge-emitting laser 201 and the projection direction of the beam projector may be the same, or may be at 90 degrees or any angle required by the optical system design.
In an embodiment of the present invention, the asynchronous discrete beam projector 2 comprises a plurality of sets of edge-emitting lasers 201 and beam projectors 202 arranged on an optical path;
the edge-emitting laser 201 is used for projecting laser to the beam projector 202; the beam projector 202 for projecting the incident laser light into a plurality of discrete collimated beams;
in an embodiment of the present invention, a plurality of discrete collimated light beams are projected to different regions of the target object 3 by sequentially controlling a set of the edge-emitting laser 201 and the beam projector 202 disposed on an optical path to be turned on in time series for the preset period of time. That is, each set of the edge-emitting laser 201 and the beam projector 202 disposed on one optical path is turned on once for a predetermined period of time to project a plurality of discrete collimated beams, and when one set of the edge-emitting laser 201 and the beam projector 202 disposed on one optical path is turned on, the other set of the edge-emitting laser 201 and the beam projector 202 disposed on one optical path is turned off.
Fig. 3 is a schematic view showing another structure of the asynchronous discrete beam projector of the present invention, and as shown in fig. 3, the asynchronous discrete beam projector 2 includes a laser array 203, a collimator lens 204, and a beam splitting device 205 disposed on an optical path;
the laser array 203 comprises a plurality of lasers arranged in an array, the plurality of lasers are divided into a plurality of laser emission groups, and lasers with a first order of magnitude are projected to the collimating lens 204 sequentially through each laser emission group in time sequence within a preset time period;
the collimating lens 204 is configured to collimate the incident multiple laser beams and emit collimated light beams of a first order of magnitude;
the beam splitting device 205 is configured to split the incident collimated light beam of the first order of magnitude and emit a collimated light beam of a second order of magnitude;
the second order of magnitude is greater than the first order of magnitude.
In an embodiment of the present invention, each laser or each group of lasers may be controlled to emit light individually, so that by controlling each group of lasers to emit light in a preset time period, projection of different areas of the target object 3 is achieved. The number of laser emission groups corresponds to the number of regions of the target object 3.
In an embodiment of the present invention, the laser array 203 may be formed by a plurality of vertical-cavity surface-emitting lasers (VCSELs) or a plurality of edge-emitting lasers (EELs). The multiple laser beams become highly parallel collimated beams after passing through the collimating lens 204. The beam splitting device 205 may be used to obtain more collimated beams, as required by the number of discrete beams in the practical application; it may employ a diffractive optical element (DOE), a spatial light modulator (SLM), or the like.
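The relation between the first and second orders of magnitude is simply multiplicative: the beam splitting device turns each incoming collimated beam into several copies. The counts below are hypothetical examples, not values from the patent.

```python
def output_beam_count(lasers_on, split_factor):
    """Number of discrete collimated beams leaving the beam splitting
    device: each of the lasers_on incoming beams is split into
    split_factor copies (e.g. by a DOE)."""
    return lasers_on * split_factor

# e.g. 100 lasers of one emission group, each beam split 100-fold,
# yields a second-order-of-magnitude count of 10,000 beams:
total_beams = output_beam_count(100, 100)
```

This is why a modest VCSEL array can still produce the tens of thousands of discrete beams mentioned earlier.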
Fig. 4 is a schematic structural diagram of an optical imaging lens in the present invention, and as shown in fig. 4, the photodetector array imager 1 includes an optical imaging lens 102, a photodetector array 101, and a driving circuit;
the optical detector array 101 comprises a plurality of optical detectors distributed in an array; the optical detectors are divided into a plurality of optical detector groups, and each optical detector group corresponds to one laser emission group;
the optical detector group is used for receiving collimated light beams which are emitted by the corresponding laser emitting group and then reflected by the target object 3;
the optical imaging lens 102 is configured to enable a direction vector of the collimated light beam entering the light detector array 101 through the optical imaging lens 102 to have a one-to-one correspondence with the light detectors;
the driving circuit is configured to measure propagation times of the plurality of discrete collimated light beams and further generate depth data of a plurality of regions on the surface of the target object 3, and further generate depth data of the surface of the target object 3 according to the depth data of the plurality of regions of the target object 3.
In an embodiment of the present invention, the optical detector groups are in one-to-one correspondence with the laser emission groups, that is, when the spatial relationship between each detector and the projected collimated light beam is calibrated to correspond accurately, in an asynchronous mode, only the optical detector group corresponding to the laser group of the projected collimated light beam may be turned on, and the other optical detector groups may be turned off, so that the power consumption of the optical detector and the system noise interference may be reduced. The number of the sets of optical detectors corresponds to the number of regions of the target object 3.
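The asynchronous gating described above, in which only the detector group matched to the currently firing laser group is on, can be sketched minimally. The group identifiers are hypothetical.

```python
def gate_detectors(active_laser_group, detector_groups):
    """Enable only the detector group matched to the firing laser
    group; all other groups stay off to reduce power consumption
    and system noise interference."""
    return {g: g == active_laser_group for g in detector_groups}

# While laser group_A fires, only detector group_A listens:
gates = gate_detectors("group_A", ["group_A", "group_B"])
```

In hardware this mapping requires the calibration step the text mentions: each detector must be registered against the beam it is expected to receive.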
In an embodiment of the present invention, in order to filter background noise, a narrow-band filter is usually installed inside the optical imaging lens 102, so that only incident collimated light beams of a preset wavelength reach the photodetector array 101. The preset wavelength may be the wavelength of the incident collimated beam, or any wavelength within 50 nanometers above or below it. The photodetector array 101 may be arranged periodically or aperiodically. Each photodetector, together with an auxiliary circuit, can measure the time of flight of a collimated beam. Depending on the number of discrete collimated light beams, the photodetector array 101 may be a combination of multiple single-point photodetectors or a sensor chip integrating multiple photodetectors. To further optimize detector sensitivity, the illumination spot of one discrete collimated light beam on the target object 3 may correspond to one or more light detectors. When several light detectors correspond to the same illumination spot, their signals can be combined electrically, effectively forming a photodetector with a larger detection area.
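The conversion from a measured per-beam propagation time to depth reduces to the round-trip relation d = c·t/2, since each beam travels projector, target, detector. A minimal sketch (the function name is hypothetical):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s


def depth_from_tof(t_round_trip_s: float) -> float:
    """Convert a measured round-trip propagation time (seconds) to depth
    (metres). The one-way distance is half the round-trip path length."""
    return C * t_round_trip_s / 2.0


# A target 1 m away produces a round trip of about 6.67 ns:
t = 2 * 1.0 / C
print(round(depth_from_tof(t), 6))  # -> 1.0
```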
In an embodiment of the invention, the plurality of discrete collimated light beams are periodically arranged in a predetermined shape, that is, in a geometrically regular distribution.
Fig. 5(a), (b), and (c) are schematic diagrams of the periodic arrangement of a plurality of discrete collimated light beams in the present invention, and as shown in fig. 5, in an embodiment of the present invention, the preset shape includes any one of the following shapes or any plurality of shapes that can be switched with each other:
- a straight line;
-a triangle;
-a quadrilateral;
-a rectangle;
-circular;
-a hexagon;
-a pentagon.
The periodic arrangement of the plurality of discrete collimated light beams is not limited to the above shapes; the beams may also be arranged in other shapes. As shown in fig. 5(a), when the preset shape is a rectangle, the unit arrangement of the collimated light beams in one period is a rectangle that repeats periodically in space. As shown in fig. 5(b), when the preset shape is a triangle, the unit arrangement in one period is a triangle that repeats periodically in space. As shown in fig. 5(c), when the preset shape is a hexagon, the unit arrangement in one period is a hexagon that repeats periodically in space. Because any implementation is constrained by its optical system, the actual cross-sectional arrangement of the collimated beams may exhibit distortion such as stretching or twisting, and the cross-sectional energy distribution of each collimated beam may be circular, annular, elliptical, etc. Arrangements such as those shown in fig. 5 simplify the spatial correspondence between the plurality of discrete collimated light beams and the photodetector array 101.
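The periodic unit-cell arrangements of fig. 5 amount to repeating a small lattice motif across the projection plane. A minimal sketch of two such lattices (function names and pitch parameter are illustrative, not from the patent):

```python
import numpy as np


def rectangular_lattice(nx: int, ny: int, pitch: float = 1.0) -> np.ndarray:
    """Beam centres on a rectangular grid: a rectangular unit cell
    repeated periodically in space, as in fig. 5(a)."""
    xs, ys = np.meshgrid(np.arange(nx) * pitch, np.arange(ny) * pitch)
    return np.column_stack([xs.ravel(), ys.ravel()]).astype(float)


def triangular_lattice(nx: int, ny: int, pitch: float = 1.0) -> np.ndarray:
    """Beam centres on a triangular lattice, as in fig. 5(b): every
    other row is shifted by half a pitch, with equilateral row spacing."""
    pts = rectangular_lattice(nx, ny, pitch)
    rows = np.rint(pts[:, 1] / pitch).astype(int)
    pts[:, 0] += (rows % 2) * pitch / 2.0   # offset odd rows
    pts[:, 1] *= np.sqrt(3) / 2.0           # equilateral row spacing
    return pts


grid = rectangular_lattice(4, 3)
print(len(grid))  # -> 12 beam positions
```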
In an embodiment of the invention, the plurality of discrete collimated light beams are non-periodically arranged in another predetermined shape.
In an embodiment of the present invention, the aperiodic arrangement includes any one of the following arrangements or any plurality of arrangements that can be switched with each other:
-a random arrangement;
-a spatial coding arrangement;
-a quasi-lattice arrangement.
The aperiodic arrangement of the plurality of discrete collimated light beams is not limited to the above arrangements; the beams may also be arranged in other patterns. As shown in fig. 6(a), the spatial coding arrangement deletes part of the beams from a periodic arrangement, thereby encoding position into the pattern; the coding actually adopted is not limited to the example of fig. 6(a). As shown in fig. 6(b), the random arrangement distributes the collimated beams randomly, so that the similarity of the pattern at different positions is small or close to zero. As shown in fig. 6(c), the quasi-lattice arrangement places the collimated beams aperiodically at close range while remaining periodic over long distances. Because any implementation is constrained by its optical system, the actual cross-sectional arrangement of the collimated beams may exhibit distortion such as stretching or twisting, and the cross-sectional energy distribution of each collimated beam may be circular, annular, elliptical, etc. Arrangements such as those shown in fig. 6 facilitate uniform sampling of non-deterministic targets, improving the final 3D depth map.
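The spatial-coding idea of fig. 6(a), deleting beams from a periodic grid so the on/off pattern encodes position, can be sketched as follows. This uses a pseudo-random keep mask as a simple stand-in for whatever code is actually adopted; the function name and parameters are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for a reproducible pattern


def coded_arrangement(nx: int, ny: int, keep_fraction: float = 0.7):
    """Start from a periodic nx-by-ny grid of beam positions and delete
    a pseudo-random subset, so that local windows of the remaining
    pattern carry distinct on/off codes (cf. fig. 6(a))."""
    xs, ys = np.meshgrid(np.arange(nx), np.arange(ny))
    pts = np.column_stack([xs.ravel(), ys.ravel()])
    mask = rng.random(len(pts)) < keep_fraction
    return pts[mask], mask.reshape(ny, nx)


pts, code = coded_arrangement(8, 8)
print(pts.shape[1])  # -> 2 (x, y coordinates per surviving beam)
```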
In an embodiment of the present invention, the light detector employs any one of the following light sensors:
-a CMOS light sensor;
-a CCD light sensor;
SPAD light sensor.
The type of the light detector is not limited to the light sensors listed above and may also include other types of light sensors.
In this embodiment, the electronic device provided by the present invention includes the 3D imaging apparatus based on the asynchronous ToF discrete point cloud, and further includes a display panel; the asynchronous discrete beam projector and the photodetector array imager are located on a backlight side of the display panel;
when the asynchronous discrete light beam projector 2 sequentially projects a plurality of discrete collimated light beams to different areas of the target object 3 in time sequence within a preset time period, the plurality of discrete collimated light beams are irradiated onto the target object 3 after penetrating through the display panel;
the photodetector array imager 1 receives a plurality of discrete collimated light beams that penetrate the display panel after being reflected by the target object 3, obtains depth data of a plurality of regions of the target object 3 according to propagation time of the plurality of discrete collimated light beams, and generates depth data of the surface of the target object 3 according to the depth data of the plurality of regions of the target object 3.
In an embodiment of the present invention, the photodetector array imager 1 maintains the spatial position correspondence between the projected plurality of discrete collimated light beams and the photodetector array 101, so that each photodetector in the photodetector array 101 can measure the propagation time of light using a ToF method based on either a time-continuously modulated beam or a pulse, and then compute the distance traveled from the speed of light.
The pulse-based ToF method, also known as the direct ToF method, works as follows: the light detector sensitively records the waveform of a light pulse, and comparison with the emission time of the pulse yields the time the collimated beam took to travel between the asynchronous discrete light beam projector 2 and the photodetector array imager 1. In this method, a single-photon avalanche diode (SPAD) is a commonly used photodetector. Single-photon avalanche diodes can count the photons of an optical pulse sensitively and at high speed: by counting the number of photons arriving at different times within a pulse time window, the full waveform of the pulse can be recovered. The pulse-based ToF method places low power-consumption demands on the projector and helps eliminate interference from multipath beams.
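The direct ToF principle, histogramming SPAD photon counts over a pulse time window and locating the return peak, can be sketched as follows. The 100 ps bin width is an assumption for illustration, not a value from the patent:

```python
import numpy as np

C = 299_792_458.0   # speed of light, m/s
BIN_S = 100e-12     # assumed TDC histogram bin width: 100 ps


def distance_from_histogram(counts: np.ndarray) -> float:
    """Direct ToF: the photon-arrival histogram peaks at the round-trip
    delay of the return pulse; distance follows from d = c * t / 2."""
    peak_bin = int(np.argmax(counts))
    t_round_trip = peak_bin * BIN_S
    return C * t_round_trip / 2.0


# Synthetic histogram: uniform background counts plus a return pulse
# concentrated in bin 133 (round trip of 13.3 ns).
hist = np.full(512, 2)
hist[133] += 50
print(round(distance_from_histogram(hist), 3))  # -> 1.994 (metres)
```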
The ToF method based on a time-continuously modulated beam is also called the indirect ToF method. It works as follows: the time-continuous modulation is typically sinusoidal, the light detector can be implemented with a CMOS or CCD photosensitive element, and the asynchronous discrete light beam projector 2 continuously emits high-frequency-modulated collimated light beams toward the target object 3, which are received by the light detector array 101 after reflection by the target object 3. Each photodetector records the phase difference between the emitted and received collimated beams, from which the depth of the corresponding surface position of the target object 3 is obtained. Because this method integrates energy over time, it achieves higher accuracy than pulsed measurement; and since the light source need not produce short high-intensity pulses, different types of light sources and different modulation schemes can be used.
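The phase-to-depth recovery in the indirect method is commonly done with four correlation samples taken 90° apart, a standard continuous-wave ToF scheme offered here as an illustrative sketch rather than the patent's specific circuit (function name and 20 MHz modulation frequency are assumptions):

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s


def distance_from_4phase(q0, q90, q180, q270, f_mod: float) -> float:
    """Indirect (continuous-wave) ToF: four samples of the correlation
    between emitted and received sinusoids, 90 degrees apart, give the
    phase shift; distance = c * phase / (4 * pi * f_mod)."""
    phase = np.arctan2(q90 - q270, q0 - q180) % (2 * np.pi)
    return C * phase / (4 * np.pi * f_mod)


# Simulate a target at 1.5 m under 20 MHz sinusoidal modulation:
f = 20e6
true_phase = 4 * np.pi * f * 1.5 / C
samples = [np.cos(true_phase - k * np.pi / 2) for k in range(4)]
print(round(distance_from_4phase(*samples, f), 3))  # -> 1.5 (metres)
```

Note the phase wraps at 2π, so the unambiguous range at 20 MHz is c/(2·f) ≈ 7.5 m; lower modulation frequencies extend the range at the cost of depth resolution.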
The embodiments in the present description are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description of specific embodiments of the present invention has been presented. It is to be understood that the present invention is not limited to the specific embodiments described above, and that various changes and modifications may be made by one skilled in the art within the scope of the appended claims without departing from the spirit of the invention.
Claims (9)
1. A3D imaging device based on asynchronous ToF discrete point cloud is characterized by comprising an asynchronous discrete beam projector and a light detector array imager;
the asynchronous discrete light beam projector is used for projecting a plurality of discrete collimated light beams to different areas of the target object in sequence according to time sequence within a preset time period;
the photodetector array imager is used for sequentially receiving the plurality of discrete collimated light beams reflected by the target object and measuring the propagation time of the plurality of discrete collimated light beams, so that depth data of a plurality of areas of the target object can be obtained, and depth data of the surface of the target object is generated according to the depth data of the plurality of areas of the target object, wherein the plurality of areas are adjacent areas;
the asynchronous discrete beam projector comprises an edge-emitting laser and a beam projector which are arranged on a light path;
the edge-emitting laser is used for projecting laser to the beam projector;
the light beam projector is used for projecting the incident laser to different areas of the target object in sequence according to time in a preset time period.
2. The asynchronous ToF discrete point cloud based 3D imaging device according to claim 1 wherein the asynchronous discrete beam projector comprises a laser array, a collimating lens and a beam splitting device arranged on an optical path;
the laser array comprises a plurality of lasers which are arranged in an array, the lasers are divided into a plurality of laser emission groups, and lasers with a first order of magnitude are projected to the collimating lens through each laser emission group in sequence according to time in a preset time period;
the collimating lens is used for collimating the incident multiple laser beams and then emitting collimated light beams with a first order of magnitude;
the beam splitting device is used for splitting the incident collimated light beam with the first order of magnitude to emit a collimated light beam with a second order of magnitude;
the second order of magnitude is greater than the first order of magnitude.
3. The asynchronous ToF discrete point cloud based 3D imaging device according to claim 2, wherein the photodetector array imager comprises an optical imaging lens, a photodetector array and a driving circuit;
the optical detector array comprises a plurality of optical detectors distributed in an array, the plurality of optical detectors are divided into a plurality of optical detector groups, and each optical detector group corresponds to one laser emission group;
the optical detector group is used for receiving collimated light beams which are emitted by the corresponding laser emitting group and then reflected by the target object;
the optical imaging lens is used for enabling direction vectors of the collimated light beams which penetrate through the optical imaging lens and enter the light detector array to be in one-to-one correspondence with the light detectors;
the driving circuit is used for measuring the propagation time of the plurality of discrete collimated light beams and further generating depth data of the plurality of regions of the surface of the target object, and further generating the depth data of the surface of the target object according to the depth data of the plurality of regions of the target object.
4. The asynchronous ToF discrete point cloud based 3D imaging device according to claim 1, wherein the plurality of discrete collimated light beams are periodically arranged in a predetermined shape.
5. The asynchronous ToF discrete point cloud based 3D imaging device according to claim 4, wherein the preset shape comprises any one of the following shapes or any plurality of shapes that can be switched with each other:
- a straight line;
-a triangle;
-a quadrilateral;
-a rectangle;
-circular;
-a hexagon;
-a pentagon.
6. The asynchronous ToF discrete point cloud based 3D imaging device according to claim 1, wherein the plurality of discrete collimated light beams are non-periodically arranged in another predetermined shape.
7. The asynchronous ToF discrete point cloud based 3D imaging device according to claim 6, wherein the aperiodic arrangement comprises any one of the following arrangements or any plurality of arrangements that can be switched with each other:
-a random arrangement;
-a spatial coding arrangement;
-a quasi-lattice arrangement.
8. The asynchronous ToF discrete point cloud based 3D imaging device according to claim 1 wherein the asynchronous discrete beam projector comprises a plurality of sets of edge emitting lasers and beam projectors arranged in an optical path;
the edge-emitting laser is used for projecting laser to the beam projector; the beam projector is used for projecting the incident laser light into a plurality of discrete collimated light beams;
the plurality of discrete collimated light beams are projected onto different regions of the target object by sequentially turning on, in time sequence within the preset time period, the groups of edge-emitting lasers and beam projectors arranged on the optical path.
9. An electronic device comprising the asynchronous ToF discrete point cloud based 3D imaging apparatus according to any one of claims 1 to 8, further comprising a display panel; the asynchronous discrete beam projector and the photodetector array imager are located on a backlight side of the display panel;
when the asynchronous discrete light beam projector projects a plurality of discrete collimated light beams to different areas of the target object in sequence in time within a preset time period, the discrete collimated light beams penetrate through the display panel and then irradiate the target object;
the light detector array imager receives a plurality of discrete collimated light beams which penetrate through the display panel after being reflected by the target object, obtains depth data of a plurality of areas of the target object according to the propagation time of the plurality of discrete collimated light beams, and further generates depth data of the surface of the target object according to the depth data of the plurality of areas of the target object.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910362489.XA CN110221309B (en) | 2019-04-30 | 2019-04-30 | 3D imaging device and electronic equipment based on asynchronous ToF discrete point cloud |
PCT/CN2020/087137 WO2020221185A1 (en) | 2019-04-30 | 2020-04-27 | Asynchronous tof discrete point cloud-based 3d imaging apparatus, and electronic device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910362489.XA CN110221309B (en) | 2019-04-30 | 2019-04-30 | 3D imaging device and electronic equipment based on asynchronous ToF discrete point cloud |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110221309A CN110221309A (en) | 2019-09-10 |
CN110221309B true CN110221309B (en) | 2021-08-17 |
Family
ID=67820430
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910362489.XA Active CN110221309B (en) | 2019-04-30 | 2019-04-30 | 3D imaging device and electronic equipment based on asynchronous ToF discrete point cloud |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN110221309B (en) |
WO (1) | WO2020221185A1 (en) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110221309B (en) * | 2019-04-30 | 2021-08-17 | 深圳市光鉴科技有限公司 | 3D imaging device and electronic equipment based on asynchronous ToF discrete point cloud |
CN110764101B (en) * | 2019-11-07 | 2023-05-05 | 浙江缔科新技术发展有限公司 | Light quantum laser sighting telescope with height measurement function |
CN112824934B (en) * | 2019-11-20 | 2024-05-07 | 深圳市光鉴科技有限公司 | TOF multipath interference removal method, system, equipment and medium based on modulated light field |
CN113484869A (en) * | 2020-03-16 | 2021-10-08 | 宁波飞芯电子科技有限公司 | Detection device and method |
CN114325749A (en) * | 2020-09-28 | 2022-04-12 | 宁波飞芯电子科技有限公司 | Flight time distance measuring device and method |
WO2021184866A1 (en) * | 2020-03-16 | 2021-09-23 | 宁波飞芯电子科技有限公司 | Device and method for measuring distance by time of flight |
WO2022032516A1 (en) * | 2020-08-12 | 2022-02-17 | 深圳市速腾聚创科技有限公司 | Laser radar and detection method therefor, storage medium, and detection system |
CN112946678A (en) * | 2021-02-02 | 2021-06-11 | 宁波飞芯电子科技有限公司 | Detection device |
CN118330673B (en) * | 2024-06-17 | 2024-10-15 | 欧菲微电子(南昌)有限公司 | Depth imaging module, depth imaging method and electronic device |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107907055A (en) * | 2017-12-14 | 2018-04-13 | 北京驭光科技发展有限公司 | Pattern projection module, three-dimensional information obtain system, processing unit and measuring method |
CN108594455A (en) * | 2018-03-23 | 2018-09-28 | 深圳奥比中光科技有限公司 | A kind of structured light projection module and depth camera |
CN109116332A (en) * | 2018-09-05 | 2019-01-01 | Oppo广东移动通信有限公司 | Array light source, TOF measurement method, camera module and electronic equipment |
CN109313267A (en) * | 2016-06-08 | 2019-02-05 | 松下知识产权经营株式会社 | Range-measurement system and distance measuring method |
CN109343070A (en) * | 2018-11-21 | 2019-02-15 | 深圳奥比中光科技有限公司 | Time flight depth camera |
CN109425864A (en) * | 2017-09-04 | 2019-03-05 | 日立乐金光科技株式会社 | 3 dimension distance-measuring devices |
CN109597211A (en) * | 2018-12-25 | 2019-04-09 | 深圳奥比中光科技有限公司 | A kind of projective module group, depth camera and depth image acquisition method |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN201707438U (en) * | 2010-05-28 | 2011-01-12 | 中国科学院合肥物质科学研究院 | Three-dimensional imaging system based on LED array co-lens TOF (Time of Flight) depth measurement |
KR101977711B1 (en) * | 2012-10-12 | 2019-05-13 | 삼성전자주식회사 | Depth sensor, image capturing method thereof and image processing system having the depth sensor |
US10698110B2 (en) * | 2015-03-05 | 2020-06-30 | Teledyne Digital Imaging, Inc. | Laser scanning apparatus and method |
WO2017025567A1 (en) * | 2015-08-10 | 2017-02-16 | Trinamix Gmbh | Organic detector for an optical detection of at least one object |
KR102486385B1 (en) * | 2015-10-29 | 2023-01-09 | 삼성전자주식회사 | Apparatus and method of sensing depth information |
KR20180021509A (en) * | 2016-08-22 | 2018-03-05 | 삼성전자주식회사 | Method and device for acquiring distance information |
CN110221309B (en) * | 2019-04-30 | 2021-08-17 | 深圳市光鉴科技有限公司 | 3D imaging device and electronic equipment based on asynchronous ToF discrete point cloud |
- 2019-04-30: CN application CN201910362489.XA (patent CN110221309B, status Active)
- 2020-04-27: WO application PCT/CN2020/087137 (WO2020221185A1, Application Filing)
Also Published As
Publication number | Publication date |
---|---|
CN110221309A (en) | 2019-09-10 |
WO2020221185A1 (en) | 2020-11-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110221309B (en) | 3D imaging device and electronic equipment based on asynchronous ToF discrete point cloud | |
CN110244318B (en) | 3D imaging method based on asynchronous ToF discrete point cloud | |
US11435446B2 (en) | LIDAR signal acquisition | |
CA3012691C (en) | Lidar based 3-d imaging with far-field illumination overlap | |
CA3017735C (en) | Integrated illumination and detection for lidar based 3-d imaging | |
CN111722241B (en) | Multi-line scanning distance measuring system, method and electronic equipment | |
WO2020221188A1 (en) | Synchronous tof discrete point cloud-based 3d imaging apparatus, and electronic device | |
CN115575928A (en) | LIDAR data acquisition and control | |
CN109557522A (en) | Multi-beam laser scanner | |
CN112066906A (en) | Depth imaging device | |
CN110824490A (en) | Dynamic distance measuring system and method | |
CN112066907B (en) | Depth imaging device | |
CN110658529A (en) | Integrated beam splitting scanning unit and manufacturing method thereof | |
CN112596068A (en) | Collector, distance measurement system and electronic equipment | |
CN110716190A (en) | Transmitter and distance measurement system | |
CN210128694U (en) | Depth imaging device | |
CN110716189A (en) | Transmitter and distance measurement system | |
CN210835244U (en) | 3D imaging device and electronic equipment based on synchronous ToF discrete point cloud | |
US11971505B2 (en) | Methods and devices for peak signal detection | |
CN117629403A (en) | Active single photon detection array non-field imaging system | |
CN214122466U (en) | Collector, distance measurement system and electronic equipment | |
CN112068144B (en) | Light projection system and 3D imaging device | |
CN211426798U (en) | Integrated beam splitting scanning unit | |
CN111947565A (en) | 3D imaging method based on synchronous ToF discrete point cloud | |
KR20170127865A (en) | Range Image Sensor comprised of Combined Pixel |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||