CN212135134U - 3D imaging device based on time flight - Google Patents

3D imaging device based on time flight Download PDF

Info

Publication number
CN212135134U
CN212135134U CN202020294116.1U
Authority
CN
China
Prior art keywords
time
light
speckle
imaging device
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202020294116.1U
Other languages
Chinese (zh)
Inventor
陈驰
李安
鲁亚东
黄若普
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Angstrong Technology Co ltd
Original Assignee
Shenzhen Angstrong Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Angstrong Technology Co ltd filed Critical Shenzhen Angstrong Technology Co ltd
Priority to CN202020294116.1U priority Critical patent/CN212135134U/en
Application granted granted Critical
Publication of CN212135134U publication Critical patent/CN212135134U/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Landscapes

  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The utility model discloses a time-of-flight based 3D imaging device, comprising: a speckle projection module, including a light source for emitting a plurality of light beams and forming speckles distributed at intervals on a target surface; an array receiving module, including pixel units equal in number to the speckles, the pixel units being used for receiving the speckle light beams reflected from the target surface and recording the light reception time; and a control processing module for recording the emission time of the light beams emitted by the speckle projection module, controlling the speckles emitted by the speckle projection module to scan across the target surface, obtaining the distance information of each reflecting position point according to the emission time and the reception time corresponding to each scan, and generating a distance depth map. By scanning the projection light path, the utility model greatly improves the imaging resolution, and features ultra-long imaging distance, strong interference resistance, super-resolution imaging, high precision and low power consumption.

Description

3D imaging device based on time flight
Technical Field
The utility model belongs to the technical field of photoelectric imaging, and specifically relates to a time-of-flight based 3D imaging device.
Background
3D imaging devices have already begun to appear in consumer electronics on the market, such as motion recognition in motion-sensing games and structured-light 3D face recognition in recent iPhones. A 3D imaging device can greatly enrich the user experience and improve product competitiveness. In particular, compared with 2D face recognition, 3D face recognition adds a further dimension of information and is incomparably better in experience, security and other aspects; compared with traditional biometric identification such as fingerprint recognition, its reliability and security are a further step up.
Different from a traditional 2D imaging device such as a camera, which can only acquire planar 2D information of an object, a 3D imaging device can also acquire the depth information of the object and construct a three-dimensional 3D model. It is therefore widely applied in industrial measurement, part modeling, medical diagnosis, security monitoring, machine vision, biometric recognition, augmented reality (AR), virtual reality (VR) and other fields, and has great application value.
3D imaging technology is divided into active and passive types: structured light and time of flight are the mainstream active approaches, and binocular vision is the mainstream passive approach. Because passive binocular vision is affected by objective factors such as the external environment and the surface texture of the photographed object, and its automatic feature-point matching algorithm is comparatively complex, it has not yet been popularized in the field of 3D imaging consumer electronics; the market is currently dominated by the active applications of structured light and time of flight (TOF). A structured-light solution typically comprises a speckle projector and an IR imaging module; its precision within 1 m reaches the sub-millimeter level, meeting the highest 3D face recognition accuracy requirements of the financial payment industry. Its drawbacks are that the imaging precision drops sharply at long range and that the depth algorithm consumes considerable computing resources, which greatly limits its range of application scenarios. A TOF solution typically comprises an illumination emitting module and a TOF sensor receiving module; it offers a long imaging distance and many application scenarios, such as rear-camera modeling and 3D perception on smartphones, 3D sensing of the external environment by AR/VR devices, and laser 3D ranging systems for automobiles. Its drawback is that the precision over the whole imaging range is not as high as that of structured light at short range, but many of its application scenarios do not require such precision, so TOF has the broader application prospects.
TOF stands for Time of Flight: the time interval from the emission moment until the emitted light, reflected by an object, reaches the receiving end is measured, and since the speed of light is constant, distance measurement follows directly. The technology commonly used on the market today is I-TOF, i.e. indirect Time of Flight. An I-TOF system emits a temporally periodically modulated laser beam onto the object surface through a laser emitting device; the returned light exhibits a time delay relative to the incident light, which manifests as a phase delay. The magnitude of the phase delay has a definite computational relationship with the flight time of the light, so I-TOF obtains the flight time indirectly by measuring the phase delay and thereby realizes distance measurement. The time-of-flight depth camera provided in patent publication No. CN209894976U is such a scheme: the flight time is obtained indirectly by measuring phase, and the phase measurement is in turn realized indirectly through the intensity of the received light energy. This makes the phase measurement inaccurate and susceptible to ambient-light interference; once the distance increases further, for example beyond 10 m, the reflected light attenuates sharply and may even be submerged by ambient light, and the measurement accuracy drops sharply. Moreover, I-TOF essentially obtains the distance value of each pixel on the image sensor by averaging and integrating light energy, which inevitably produces the multipath-interference problem: when several objects are in front of the sensor, the resulting distance depth map exhibits 'flying pixels' between them, and the object contours blur together and cannot be distinguished, giving a poor result.
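To make the I-TOF relationship above concrete, here is a minimal sketch (in Python) of the phase-to-distance conversion that paragraph describes; the modulation frequency and function names are illustrative assumptions, not taken from the patent or from CN209894976U:

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def itof_distance(phase_delay_rad: float, f_mod_hz: float) -> float:
    """Indirect TOF: a periodically modulated beam returns with a phase
    delay phi proportional to the round-trip flight time, so
    d = c * phi / (4 * pi * f_mod). The factor 4*pi (rather than 2*pi)
    folds the round trip down to the one-way distance."""
    return C * phase_delay_rad / (4 * math.pi * f_mod_hz)

# Assumed example: a 20 MHz modulation returning with a pi/2 phase delay
# maps to about 1.87 m; the unambiguous range at 20 MHz is c/(2f) = 7.5 m.
print(itof_distance(math.pi / 2, 20e6))
```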
SUMMARY OF THE UTILITY MODEL
To solve the problems existing in the prior art, the utility model provides a time-of-flight based 3D imaging device that directly records the two moments of light emission and light reception. The device features ultra-long imaging distance, strong interference resistance, super-resolution imaging, high precision and low power consumption.
To realize the above purpose, the utility model adopts the following technical scheme:
a time-of-flight based 3D imaging device, comprising:
a speckle projection module, including a light source for emitting a plurality of light beams and forming speckles distributed at intervals on a target surface;
an array receiving module, including pixel units equal in number to the speckles, the pixel units being used for receiving the speckle light beams reflected from the target surface and recording the light reception time;
and a control processing module for recording the emission time of the light beams emitted by the speckle projection module, controlling the speckles emitted by the speckle projection module to scan across the target surface, obtaining the distance information of each reflecting position point according to the emission time and the reception time corresponding to each scan, and generating a distance depth map.
Preferably, a scanning device for controlling changes of the optical path is arranged in the speckle projection module.
Preferably, the scanning device is a two-dimensional scanning galvanometer arranged at the rear of the optical path in the speckle projection module.
Preferably, the speckle projection module comprises a laser emitter, a collimating lens and a diffractive optical element.
Preferably, the scanning device is an actuating mechanism for driving the laser emitter or the collimating lens to change the optical path.
Preferably, each pixel unit includes a timing circuit therein for recording the light receiving time.
Preferably, the array receiving module comprises a SPAD array consisting of a plurality of single SPAD pixel units, a narrow-band filter and an imaging receiving lens.
Preferably, the light source is a VCSEL laser emitter that works in partitioned zones, and the speckles projected onto the target surface are emitted by each zone as it is lit in sequence.
The utility model has the following beneficial effects: the scanning of the projection light path is realized in various ways, greatly improving the imaging resolution; the device features ultra-long imaging distance, strong interference resistance, super-resolution imaging, high precision and low power consumption.
Drawings
FIG. 1 is a system diagram of a 3D imaging apparatus according to the present embodiment;
FIG. 2 is a schematic diagram of scanning imaging of the 3D imaging apparatus in the present embodiment;
FIG. 3 is a detailed schematic diagram of a speckle projection module;
FIG. 4 is a schematic diagram of a scanning function achieved by adding an MEMS galvanometer in a projection light path;
FIG. 5 is a schematic diagram of a speckle projection module implementing a mechanical motion scanning function therein;
FIG. 6 is a schematic view of a VCSEL laser emitter partitioned to implement the scanning function, wherein (a) and (b) show two VCSEL laser emitters with different partition layouts;
FIG. 7 is a schematic structural diagram of an array receiving module;
FIG. 8 is a schematic diagram of the transmittance curve of a narrow band filter;
FIG. 9 is a schematic diagram of a SPAD array;
FIG. 10 is a cross-sectional view of a single SPAD pixel cell;
fig. 11 is a schematic diagram of the imaging of a scanned speckle beam on a SPAD array sensor.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention, however, the present invention may be practiced in other ways than those specifically described herein, and the present invention is not limited to the specific embodiments disclosed below. The terms "upper", "lower", "left" and "right" used herein are defined with reference to the accompanying drawings, and it is to be understood that the above-described terms do not limit the scope of the present invention.
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary and intended to be used for explaining the present invention, and should not be construed as limiting the present invention.
In the present invention, unless otherwise expressly specified or limited, the terms "mounted," "connected," and "fixed" are to be construed broadly and may, for example, be fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements. The specific meaning of the above terms in the present invention can be understood according to specific situations by those skilled in the art.
A time-of-flight based 3D imaging apparatus as shown in fig. 1, comprising: a speckle projection module 11, including a light source for emitting a plurality of light beams and finally forming speckles 121 distributed at intervals on a target surface (e.g., a receiving screen 12);
the array receiving module 13 includes pixel units equal in number to the speckles 121, the pixel units being used for receiving the speckle light beams reflected from the target surface and recording the light reception time;
and the control processing module 10, for recording the emission time of the light beams emitted by the speckle projection module, controlling the speckles emitted by the speckle projection module to scan across the target surface, obtaining the distance information of each reflecting position point according to the emission time and the reception time corresponding to each scan, and generating a distance depth map.
The control processing module 10 is an integrated system module including a high-speed synchronization signal circuit, a high-speed laser driving circuit and a boost power management circuit, and is mainly used for overall coordinated control and data processing of the speckle projection module 11 and the array receiving module 13. Under the control of the control processing module 10, the speckle projection module 11 pulse-modulates the light emitted by its internal light source and projects a plurality of laser beams at precisely controlled angles onto an object in front, such as the receiving screen 12, forming speckles 121 on it; the array receiving module 13 is synchronously activated, and the initial moment of light emission is recorded. The light beam reflected by a speckle 121 is received by the array receiving module 13, triggering the internal timing circuitry and thus yielding the moment at which the light is received. The difference between the light reception moment and the light emission moment is the light flight time Δt to be obtained, from which the measured distance d follows from the basic formula:
d = Δt × c / 2
where c represents the speed of light, approximately equal to the constant 3 × 10⁸ m/s in vacuum. In practical applications, various algorithm models are used to improve environmental adaptability and calculation accuracy, but none of them departs from this core light-speed ranging formula.
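The ranging formula translates directly into code; a minimal sketch, with variable names of our own choosing:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def dtof_distance(t_emit_s: float, t_receive_s: float) -> float:
    """Direct TOF: d = delta_t * c / 2, where delta_t is the interval
    between the recorded emission and reception moments and the
    division by 2 converts the round trip to a one-way distance."""
    delta_t = t_receive_s - t_emit_s
    return delta_t * C / 2.0

# A round-trip flight time of about 66.7 ns corresponds to roughly 10 m.
print(dtof_distance(0.0, 66.7e-9))
```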
Each speckle 121 corresponds precisely to a single pixel (pixel unit) in the array receiving module 13, so the light flight time Δt corresponding to each pixel of the array receiving module 13 can be obtained, and in turn a complete distance depth map of the scene in front, achieving the purpose of 3D imaging.
Referring to the scanning process of fig. 2: at time 1, the speckle projection module 11 projects a light beam that forms a speckle 121 on the receiving screen 12; then, through the X-direction scanning action inside the speckle projection module 11, a speckle 122 is projected at time 2, the positions of speckle 121 and speckle 122 differing by one step. The pixel units of the array receiving module 13 receive the reflected beams of speckles 121 and 122 and record the reception times of the two emitted light pulses. Since the optical system of the speckle projection module 11 is known, the spatial positions of the speckle beams projected at time 1 and time 2 are known; combining these with the TOF values of the corresponding pixels on the array receiving module 13 yields the position information (xy directions) and distance information (z direction) of each reflected speckle beam in the space ahead, so a complete distance depth map can be constructed, at 2 times the original resolution.
In another embodiment, as shown in fig. 3, a laser emitter 111 is used as the light source. The speckle projection module 11 mainly includes the laser emitter 111, a collimating lens 112 and a diffractive optical element 113; the direction of light travel is indicated by arrow 114. The laser emitter 111 may be an area-array laser, such as a vertical-cavity surface-emitting laser (VCSEL), or a single-point laser, such as an edge-emitting laser, and the emission wavelength may be selected according to the application; the preferred embodiment adopts a VCSEL with a regular lattice of emitters and an emission wavelength of 940 nm. The collimating lens 112 is shown as a single lens but may consist of any number of lenses; it collimates the beams emitted from all light-emitting points 1110 of the laser emitter 111 so that they enter the subsequent diffractive optical element 113 in approximately parallel form. The diffractive optical element (DOE) 113 is fabricated by a micro-nano process, such as nanoimprinting, precision injection molding or semiconductor lithography; it spatially modulates each incident beam, preferably splitting and replicating each beam to generate the required larger number of laser emission beams 115.
In another embodiment, a scanning device is provided in the speckle projection module 11; the scanning device changes the optical path and controls the speckles projected by the beams onto the receiving screen 12 to scan in steps.
In another embodiment, as shown in fig. 4, speckle beam scanning in the X and Y directions is realized by adding a two-dimensional scanning galvanometer to the optical path inside the speckle projection module 11. The two-dimensional scanning galvanometer 116 is disposed in the optical path after the diffractive optical element 113; by controlling its movement in the X and Y directions, the speckle beams can be scanned in both directions, achieving the super-resolution effect of the 3D imaging device according to the principle described above.
In another embodiment, the scanning device is a motion mechanism that drives the laser emitter or the collimating lens to change the optical path. As shown in fig. 5, in this embodiment a mechanically controlled motion device, such as a two-dimensional scanning motion stage, is integrated on one optical element inside the speckle projection module 11 to change the optical path and thereby scan the speckle beams. A precisely controllable Y movement axis 1120 and X movement axis 1121 are applied to the collimating lens 112; for example, when the Y movement axis 1120 scans one step, the laser emission beam 115 becomes a laser emission beam 116 slightly shifted in the Y direction. Similarly, a mechanical motion control device may be integrated with the laser emitter 111 to achieve the same effect.
In another embodiment, as shown in fig. 6, the light source is a VCSEL laser emitter working in partitioned zones; lighting different zones of the VCSEL laser emitter 111 at different times achieves the effect of the speckle projection module 11 scanning the speckle beams in the X and Y directions. Fig. 6(a) schematically shows a VCSEL laser emitter 111 with four zones: lighting zone 1110, zone 1111, zone 1112 and zone 1113 in sequence realizes a 2 × 2 scan in the X and Y directions, and the resolution of the resulting distance depth map is 4 times the original resolution of the SPAD array. Fig. 6(b) schematically shows four strip-shaped zones: lighting zone 1114, zone 1115, zone 1116 and zone 1117 in sequence realizes a 4 × 1 scan along one direction, and the resolution of the resulting distance depth map is again 4 times the original resolution of the SPAD array. In summary, the partitioning can take many forms according to the needs and the overall design of the projection optical system, all within the effective scope of this embodiment.
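As a sketch of the zone-sequencing control described above (the zone labels come from fig. 6; the sub-exposure loop and its timing are assumptions of ours, not the patent's control scheme):

```python
from itertools import cycle

# Fig. 6(a): four quadrant zones -> 2 x 2 scan in X and Y.
# Fig. 6(b): four strip zones   -> 4 x 1 scan along one direction.
QUAD_ZONES = ["1110", "1111", "1112", "1113"]
STRIP_ZONES = ["1114", "1115", "1116", "1117"]

def zone_sequence(zones, n_sub_exposures):
    """Light one VCSEL zone per sub-exposure, cycling through all zones;
    n zones multiply the depth-map resolution n-fold relative to the
    raw SPAD array, as described for fig. 6."""
    src = cycle(zones)
    for i in range(n_sub_exposures):
        yield i, next(src)

for i, zone in zone_sequence(QUAD_ZONES, 8):
    print(f"sub-exposure {i}: light VCSEL zone {zone}")
```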
As shown in fig. 7, the array receiving module 13 mainly includes a SPAD array 131, a narrow-band filter 132 and an imaging receiving lens 133; the direction of light travel is indicated by arrow 134. The object-reflected beam 1151 in the application scene originates from the laser emission beam 115 of the speckle projection module 11, passes through the imaging receiving lens 133 and the narrow-band filter 132, and finally reaches the SPAD array 131. The imaging receiving lens 133 is shown as a single lens but may consist of any number of lenses; it receives all the speckle-reflected beams from the receiving screen 12 and precisely directs each beam onto the corresponding single SPAD pixel unit on the SPAD array 131. The narrow-band filter 132 passes only light at the wavelength emitted by the speckle projection module 11 and blocks light of other wavelengths, thereby filtering the light and reducing ambient-light interference; the preferred transmittance curve is shown in fig. 8.
SPAD is an abbreviation of Single Photon Avalanche Diode: an APD (avalanche photodiode) operated in Geiger mode, with the extreme photosensitivity needed to detect a single photon, usually integrated with a quenching circuit that acts after each single-photon avalanche. Exploiting the avalanche multiplication effect, such a detector offers very high sensitivity and gain, and has great advantages in fields such as weak-light detection and high-speed imaging.
As shown in fig. 9, the SPAD array 131 is composed of a plurality of single SPAD pixel units 1311; gating of each single SPAD pixel is realized by the X-direction addressing control 1312 and the Y-direction addressing control 1313, usually by applying a large reverse bias to the selected pixel to put it in the critical avalanche state, equivalent to entering the 'exposure' state. The single pixel units of an actual SPAD array are not as densely packed as in fig. 9: owing to the complex internal integrated circuits and the manufacturing process, gaps exist between the single SPAD pixel units 1311. The ratio of the effective photosensitive area to the total array area gives the fill factor, which is generally much higher for a back-illuminated SPAD array sensor than for a front-illuminated one, giving higher photon detection efficiency; the preferred embodiment therefore adopts a back-illuminated SPAD array.
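The fill-factor relation above is simple arithmetic; a sketch with hypothetical dimensions (the 10 μm pitch echoes the pixel size mentioned later in the text, while the aperture value is invented for illustration):

```python
def fill_factor(active_area_um2: float, pixel_area_um2: float) -> float:
    """Fill factor = effective photosensitive area / total pixel area.
    Back-illuminated SPAD arrays move the readout circuitry out of the
    optical path, so their fill factor, and hence photon detection
    efficiency, is much higher than in front-illuminated designs."""
    return active_area_um2 / pixel_area_um2

# Hypothetical 10 um pixel pitch with a 7 um square active aperture:
pitch_um, aperture_um = 10.0, 7.0
print(fill_factor(aperture_um**2, pitch_um**2))  # 0.49
```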
It should be emphasized that the projection optical system of the speckle projection module 11 is closely matched to the receiving optical system of the array receiving module 13; for example, the laser emission beam 115 of fig. 3 corresponds, via the reflected beam 1151 of fig. 7, to an imaging spot on a single SPAD pixel unit 1311 of the SPAD array 131.
Fig. 10 is a cross-sectional view of a single SPAD pixel unit 1311, which mainly includes a photon absorption layer 1314, a charge control layer 1315, a multiplication layer 1316, and in-pixel circuits including a controller, a quenching circuit, an analog front end, a TDC (time-to-digital converter), a histogram memory and other logic. Under the action of the controller, the pixel unit enters the 'exposure' state and can absorb and convert incoming light: an incident photon triggers a multiplication avalanche, generating a photocurrent that produces a voltage signal across an internal series resistor. After processing by the analog front-end circuit, the signal reaches the high-precision time-measuring circuit (TDC), which records the trigger time, i.e. the flight time Δt of the light received at the single SPAD pixel 1311. At the same time, the quenching circuit is triggered to stop the avalanche in the multiplication layer promptly and restore the initial reverse-bias state, ready for the next photon-triggered avalanche.
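A minimal simulation of the per-pixel measurement chain just described: many pulses, single-photon TDC timestamps accumulated into the histogram memory, and the peak bin taken as Δt. The TDC bin width, pulse count and photon probabilities below are assumed values, not figures from the patent:

```python
import random

C = 299_792_458.0
TDC_BIN_S = 250e-12   # assumed TDC resolution: 250 ps per bin
N_PULSES = 1000       # assumed pulses accumulated per depth frame
N_BINS = 4000         # assumed histogram depth (~150 m of range)

def measure_pixel(true_distance_m: float) -> float:
    """Bin single-photon TDC timestamps into a histogram; signal photons
    pile up in one bin while ambient photons spread uniformly, so the
    peak bin gives the flight time despite background light."""
    true_bin = round((2 * true_distance_m / C) / TDC_BIN_S)
    hist = [0] * N_BINS
    for _ in range(N_PULSES):
        if random.random() < 0.5:                    # signal photon detected
            b = true_bin + random.randint(-1, 1)     # TDC jitter
        else:                                        # ambient-light photon
            b = random.randrange(N_BINS)
        hist[b] += 1
    peak_bin = hist.index(max(hist))
    return peak_bin * TDC_BIN_S * C / 2              # back to one-way metres

print(measure_pixel(10.0))  # ~10.0 m
```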
The SPAD can sense a weak single photon and, through the avalanche multiplication effect, offers very high sensitivity and gain, which greatly improves the imaging distance, range precision and ambient-light immunity at the same emission energy. For example, the farthest imaging distance of the 3D imaging device can reach 50 m; it still performs well at 15 m under strong outdoor sunlight interference, and the range precision can reach better than 0.5%. Moreover, the 3D imaging device triggers the TDC timing system with a single photon received at each photosensitive pixel to obtain each pixel unit's TOF value, rather than averaging and integrating light energy; the multipath-interference problem is therefore avoided, the contours between multiple objects in the resulting depth map are clear, and the quality is far higher than that of the I-TOF technical scheme.
Owing to factors such as the complex internal circuitry and fill factor of the SPAD array, each single SPAD pixel unit is much larger than a conventional CMOS imaging pixel, exceeding 10 μm, so the overall resolution of the array cannot be made high; a resolution of 180 × 140, for example, represents the leading production level in the industry at present. If the native resolution of the SPAD array were used directly, the resolution of the resulting distance depth map would be very low, unable to satisfy application scenarios with high resolution requirements. In this embodiment, the speckle projection module implements a dynamic speckle scanning method, greatly improving the resolution of the 3D imaging device based on the D-TOF (direct time-of-flight) technology of the SPAD array.
In another embodiment, the control processing module 10 may be a separate dedicated circuit, such as a dedicated SOC chip, an FPGA chip or an ASIC chip, or may be a sub-functional module of a general-purpose processor, for example when the device is integrated into a smart terminal such as a mobile phone, television or computer. Within the control processing module 10, the synchronization signal circuit may be configured to record the light emission moment when the speckle projection module 11 is started and to synchronize the operation of the array receiving module 13, while the laser driving circuit controls the pulse modulation of the light source in the speckle projection module 11.
The principle of the 3D imaging apparatus of the above embodiments is as follows:
as shown in fig. 2, at time 1, the projected light beam forms a solid round speckle 121 on the receiving screen 12, and then the X-direction scanning inside the speckle projection module 11 acts on the speckle 122 projecting a dashed circle at time 2, so that a total projected speckle number equivalent to 2 times is obtained by scanning once in the X-direction, and the number of projected speckles in this diagram is only schematic and does not represent an actual number. Meanwhile, as shown in fig. 11, at the time 1, the imaging spot 1312 on the SPAD array 131 corresponds to the speckle 121 on the receiving screen 12, and at the time 2, the imaging spot 1313 corresponds to the speckle 122 on the receiving screen 12, that is, two times of multiplexing of the single SPAD pixel cell 1311 are realized, and it is noted that the circuits inside the pixel are enough to complete one quenching and recovery in the time interval between the time 1 and the time 2. Fig. 10 illustrates the inside of a single SPAD pixel cell 1311, and the other pixel cells are similar. Because the optical system condition of the speckle projection module 11 is known, the spatial position information of the speckle beams projected at the time 1 and the time 2 can be known, and the object position information (xy direction) and the distance information (z direction) of the corresponding reflected speckle beams in the front space can be obtained by combining the TOF values of the corresponding pixels on the array 131, so that a complete distance depth map can be constructed, and the distance depth map which is 2 times of the original resolution of the SPAD array can be obtained.
The above is the result of the speckle projection module 11 performing one scan step in the X direction, i.e. a distance depth map at 2 times the original resolution of the SPAD array. Similarly, more steps can be scanned in the X and Y directions, for example 10 × 10, which multiplies the resolution of the final distance depth map by 10 in each of the X and Y directions, enough for the device to cover far more application scenarios.
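The interleaving that turns n_x × n_y scan steps into a higher-resolution map can be sketched as follows (the array layout and scan ordering are assumptions of ours; only the 180 × 140 native resolution and the 10 × 10 example come from the text):

```python
import numpy as np

def interleave_depth(frames: np.ndarray, n_x: int, n_y: int) -> np.ndarray:
    """frames: (n_x * n_y, H, W) depth maps, one per scan step, ordered
    row-major over the scan grid. Each SPAD pixel contributes one
    sub-pixel sample per step, so the output is (H * n_y, W * n_x)."""
    steps, h, w = frames.shape
    assert steps == n_x * n_y
    out = np.empty((h * n_y, w * n_x), dtype=frames.dtype)
    for k in range(steps):
        dy, dx = divmod(k, n_x)          # scan-grid offset of step k
        out[dy::n_y, dx::n_x] = frames[k]
    return out

# A 10 x 10 scan over a 180 x 140 SPAD array yields an 1800 x 1400 map,
# the 10-fold per-axis gain described above.
frames = np.zeros((100, 140, 180), dtype=np.float32)
print(interleave_depth(frames, 10, 10).shape)  # (1400, 1800)
```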
Preferably, the speckle formed by each beam corresponds to one pixel unit and is scanned within the receiving area of that pixel unit. That is, during the scanning of any one speckle, its imaging position on the SPAD array 131 should not move beyond the area of its corresponding single SPAD pixel unit. The actual total scanning stroke is therefore very short: the corresponding maximum scanning angle range is within 5 degrees, and with a suitable number of base projected speckles it can be kept to about 3 degrees. Achieving the system's super-resolution effect within such a small scanning range makes the manufacture and volume production of the actual scanning system very convenient and greatly improves product stability and reliability.
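The small-angle constraint above follows from simple geometry: one scan step must shift the imaged speckle by no more than one pixel pitch on the SPAD array. A sketch with hypothetical optics (the 10 μm pitch is from the text; the focal length is invented for illustration):

```python
import math

def max_step_deg(pixel_pitch_um: float, focal_length_mm: float) -> float:
    """Small-angle bound: a beam deflection of theta shifts the image by
    roughly f * theta, so keeping the speckle inside one pixel requires
    theta <= pitch / focal_length."""
    return math.degrees(pixel_pitch_um * 1e-6 / (focal_length_mm * 1e-3))

# Hypothetical 10 um SPAD pitch behind a 4 mm receiving lens:
step = max_step_deg(10.0, 4.0)
print(step)        # ~0.14 degrees per step
print(step * 10)   # a 10-step scan stays well inside the 5-degree range
```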
In summary, within what the current state of the art allows, scanning of the projection light path is realized in various ways, greatly improving the resolution of the 3D imaging device based on the D-TOF technology of the SPAD array and achieving an effect exceeding the native resolution of the SPAD array.
The above description is only exemplary of the preferred embodiments of the present invention, and should not be construed as limiting the scope of the present invention, and any modifications, equivalent substitutions, improvements, etc. made within the spirit and principles of the present invention should be included in the present invention.

Claims (8)

1. A time-of-flight based 3D imaging device, comprising:
a speckle projection module, including a light source for emitting a plurality of light beams and forming speckles distributed at intervals on a target surface;
an array receiving module, including pixel units equal in number to the speckles, the pixel units being used for receiving the speckle light beams reflected from the target surface and recording the light reception time;
and a control processing module for recording the emission time of the light beams emitted by the speckle projection module, controlling the speckles emitted by the speckle projection module to scan across the target surface, obtaining the distance information of each reflecting position point according to the emission time and the reception time corresponding to each scan, and generating a distance depth map.
2. The time-of-flight based 3D imaging device of claim 1, wherein the speckle projection module is configured with a scanning device to control optical path changes.
3. The time-of-flight based 3D imaging device of claim 2, wherein the scanning device is a two-dimensional scanning galvanometer disposed at the rear of the optical path within the speckle projection module.
4. The time-of-flight based 3D imaging device of claim 2, wherein the speckle projection module comprises a laser emitter, a collimating lens, and a diffractive optical element.
5. The time-of-flight based 3D imaging device of claim 4, wherein the scanning device is a motion mechanism that drives the laser emitter or the collimating lens to effect the change of the optical path.
6. The time-of-flight based 3D imaging device of claim 1, wherein each pixel cell includes a timing circuit for recording the light reception time.
7. The time-of-flight based 3D imaging device of claim 6, wherein the array receiving module comprises a SPAD array consisting of a plurality of single SPAD pixel cells, a narrow-band filter, and an imaging receiving lens.
8. The time-of-flight based 3D imaging device of claim 1, wherein the light source is a partitioned VCSEL laser emitter, each sequentially illuminated zone emitting the speckles projected onto the target surface.
CN202020294116.1U 2020-03-11 2020-03-11 3D imaging device based on time flight Active CN212135134U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202020294116.1U CN212135134U (en) 2020-03-11 2020-03-11 3D imaging device based on time flight

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202020294116.1U CN212135134U (en) 2020-03-11 2020-03-11 3D imaging device based on time flight

Publications (1)

Publication Number Publication Date
CN212135134U true CN212135134U (en) 2020-12-11

Family

ID=73673172

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202020294116.1U Active CN212135134U (en) 2020-03-11 2020-03-11 3D imaging device based on time flight

Country Status (1)

Country Link
CN (1) CN212135134U (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112255639A (en) * 2020-12-23 2021-01-22 杭州蓝芯科技有限公司 Depth perception sensor and depth perception sensing module for region of interest
CN112255639B (en) * 2020-12-23 2021-09-03 杭州蓝芯科技有限公司 Depth perception sensor and depth perception sensing module for region of interest
WO2022183658A1 (en) * 2021-03-01 2022-09-09 奥比中光科技集团股份有限公司 Adaptive search method for light spot positions, time of flight distance measurement system, and distance measurement method
WO2022241781A1 (en) * 2021-05-21 2022-11-24 深圳市汇顶科技股份有限公司 Emitting apparatus for time-of-flight depth detection and electronic device

Similar Documents

Publication Publication Date Title
CN110596722B (en) System and method for measuring flight time distance with adjustable histogram
CN110596721B (en) Flight time distance measuring system and method of double-shared TDC circuit
CN110596725B (en) Time-of-flight measurement method and system based on interpolation
CN111025317B (en) Adjustable depth measuring device and measuring method
CN111427230A (en) Imaging method based on time flight and 3D imaging device
CN110596723B (en) Dynamic histogram drawing flight time distance measuring method and measuring system
CN110596724B (en) Method and system for measuring flight time distance during dynamic histogram drawing
CN212135134U (en) 3D imaging device based on time flight
WO2021072802A1 (en) Distance measurement system and method
CN110824490B (en) Dynamic distance measuring system and method
CN111722241B (en) Multi-line scanning distance measuring system, method and electronic equipment
CN112731425B (en) Histogram processing method, distance measurement system and distance measurement equipment
WO2022021797A1 (en) Distance measurement system and distance measurement method
CN110780312B (en) Adjustable distance measuring system and method
CN110716190A (en) Transmitter and distance measurement system
CN111965658B (en) Distance measurement system, method and computer readable storage medium
CN110658529A (en) Integrated beam splitting scanning unit and manufacturing method thereof
CN209676383U (en) Depth camera mould group, depth camera, mobile terminal and imaging device
CN110716189A (en) Transmitter and distance measurement system
CN212135135U (en) 3D imaging device
CN111965659B (en) Distance measurement system, method and computer readable storage medium
CN211148917U (en) Distance measuring system
CN211148903U (en) Transmitter and distance measurement system
CN211148902U (en) Transmitter and distance measurement system
CN211426798U (en) Integrated beam splitting scanning unit

Legal Events

Date Code Title Description
GR01 Patent grant