CN112379389B - Depth information acquisition device and method combining structured light camera and TOF depth camera - Google Patents
Info
- Publication number
- CN112379389B (application CN202011264676.3A)
- Authority
- CN
- China
- Prior art keywords
- depth information
- tof
- sensor
- light
- structured light
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS; G01—MEASURING, TESTING; G01S—Determining distance or velocity by use of radio or other waves; lidar systems (G01S17/00):
  - G01S17/86 — Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
  - G01S17/08 — Systems determining position data of a target, for measuring distance only (under G01S17/02, G01S17/06)
  - G01S17/894 — 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar (under G01S17/88, G01S17/89)
- H—ELECTRICITY; H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION:
  - H04N23/80 — Camera processing pipelines; components thereof (under H04N23/00)
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Electromagnetism (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Measurement Of Optical Distance (AREA)
Abstract
The invention discloses a depth information acquisition device combining structured light and TOF (time-of-flight) technology. The device comprises an active light source emitter, a TOF sensor, a structured light sensor, a controller and a processor; the active light source emitter and the processor are both connected to the controller, and the TOF sensor and the structured light sensor are both connected to the processor. The method comprises the following steps: the controller drives the active light source emitter to project an illumination pattern with a spatial light-dark distribution onto the target measurement area; the TOF sensor receives the echo reflected from the target area and computes the time of flight of the light to obtain the depth of each point, while the structured light sensor captures the structured light image under this illumination to obtain intensity information. The depth information from multiple TOF frames is then combined to cover the full field of view and mapped into the structured light image coordinates; the structured light sensor computes the subdivision code from the TOF depth and the optical triangulation principle, thereby obtaining more accurate depth information.
Description
Technical Field
The invention relates to the technical fields of depth sensors, machine vision, three-dimensional reconstruction, TOF technology and structured light, and in particular to a depth information acquisition device and method combining a structured light camera and a TOF depth camera.
Background
In recent years, commonly used depth information acquisition methods have included 3D structured light, TOF (time of flight) and binocular stereo vision. Depth acquisition is indispensable for image recognition and processing, scene understanding, VR, AR, robotics and other applications, but each modality has its own limitations:
- Structured light computes depth from the size and shape of a pattern projected onto the observed object. It needs little computation, has low power consumption and high accuracy at close range, and suits static scenes; but it is disturbed by strong natural light and by reflections, is unsuitable outdoors, and its hardware is relatively complex, costly and slow.
- TOF computes depth by emitting light pulses and receiving the reflected pulses. It favours device miniaturisation, offers a high frame rate, a simple algorithm and high speed, suits dynamic scenes and long measurement distances, and is fairly robust to strong outdoor light; but it is affected by multiple reflections, and current sensors have low resolution and high power consumption.
- Binocular stereo involves complex computation, is strongly affected by the environment and has poor reliability.
We therefore propose a depth information acquisition device and method combining a structured light camera and a TOF depth camera that retains the high short-range accuracy of the structured light camera, improves measurement efficiency, combines depth information according to the prevailing conditions, and is applicable to a wider range of scenes.
Disclosure of Invention
The embodiments of the invention aim to provide a depth information acquisition device and method combining a structured light camera and a TOF depth camera, in order to solve the problems of long measurement time and low measurement efficiency in the prior art and to improve the accuracy and reliability of depth acquisition in dynamic scenes. In addition, the spatially coded illumination helps reduce TOF multipath scattering and improves the accuracy of TOF ranging.
The technical scheme adopted for solving the technical problems is as follows:
In a first aspect, an embodiment of the present invention provides a depth information acquisition device combining structured light technology and TOF technology. The device includes an active light source emitter, a TOF sensor, a structured light sensor, a controller and a processor; the active light source emitter and the processor are both connected to the controller, and the TOF sensor and the structured light sensor are both connected to the processor.
Further, the active light source emitter comprises an active light source, a spatial modulator and an imaging system. The controller drives the active light source to emit time-modulated light and also controls the spatial modulator; after passing through the spatial modulator, the light is formed by the imaging system into an illumination scene with a spatial light-dark distribution.
Further, the structured light sensor and the TOF sensor are arranged so that their fields of view coincide.
Further, when the TOF sensor and the structured light sensor have separate lenses, their extrinsic parameters must first be calibrated; the pixel-to-pixel mapping then relies on the depth information obtained by TOF.
Further, the structured light sensor and the TOF sensor may share a beam splitter, acquiring the image and the depth information under identical conditions and maximising the common field of view.
Further, when the beam-splitting arrangement is used, the relative rotation between the structured light sensor and the TOF sensor must also be calibrated beforehand and a mapping between pixel coordinates established.
Further, the time modulation of the active light source emitter is a sine wave, a fast square wave sequence or a pulse sequence.
In a second aspect, an embodiment of the present invention further provides a depth information acquisition method using the device of the first aspect, comprising:
Step (1): the controller drives the active light source emitter to project an illumination pattern with a spatial light-dark distribution onto the target measurement area; the TOF sensor receives the echo reflected from the target area and computes the time of flight of the light to obtain the depth of each point, while the structured light sensor captures the structured light image under this illumination to obtain intensity information;
Step (2): the depth information from multiple TOF frames is combined to cover the full field of view and mapped into the structured light image coordinates; the structured light sensor then computes the subdivision code from the TOF depth and the optical triangulation principle, thereby obtaining more accurate depth information.
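For a continuous-wave TOF sensor like the one in the embodiment (12 MHz sine modulation), the per-point depth in step (1) comes from the phase of the returned wave. The sketch below assumes the common four-tap sampling scheme (correlation samples at 0°, 90°, 180° and 270° of the modulation period); the tap ordering and the absence of per-pixel calibration are assumptions of this illustration, not details from the patent.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_depth(a0, a1, a2, a3, f_mod=12e6):
    """Depth from four correlation samples of a CW-TOF pixel, taken at
    0, 90, 180 and 270 degrees of the modulation period (assumed tap
    convention: a_k ~ cos(phi - k*pi/2)).

    The echo phase is recovered with atan2 and converted to distance;
    the result is ambiguous beyond c / (2 * f_mod)."""
    phase = math.atan2(a1 - a3, a0 - a2)  # wrapped phase in (-pi, pi]
    phase %= 2 * math.pi                  # fold into [0, 2*pi)
    return C * phase / (4 * math.pi * f_mod)
```

At 12 MHz the unambiguous range is c / (2 · 12 MHz) ≈ 12.5 m, which is consistent with the method's use of structured light refinement at closer range.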
According to the technical scheme above, the invention has the following beneficial effects:
The invention achieves fast, high-resolution depth acquisition in indoor and outdoor environments at low cost, for both short and long ranges. It combines the fast measurement of TOF with the short-range accuracy of structured light, reduces the risk of failed or incorrect decoding, and offers better stability and reliability.
A low-resolution depth image and a high-resolution structured light image are acquired simultaneously; the depth image is mapped into the structured light coordinate system, and the coarse and subdivision codes are combined so that high-resolution depth is obtained by optical triangulation with greatly reduced algorithmic complexity and computation time. The device contains a single active light source emitter that can be modulated in both space and time: spatial modulation projects a pattern with a given spatial distribution onto the target, while time modulation serves TOF depth acquisition. The TOF sensor and the structured light sensor respond only to the active light source, and band-pass filters in the source band are added to reduce the influence of ambient light, so that identifiable images are obtained at short or long range and under strong or weak ambient light.
Drawings
For a better understanding, the invention will be explained in more detail in the following description with reference to the drawings. It is to be understood that the invention is not limited to this exemplary embodiment, and that specified features may also be conveniently combined and/or modified without departing from the scope of the invention as defined by the claims. In the drawings:
fig. 1 is a block diagram of a depth information acquiring apparatus combining structured light technology and TOF technology according to an embodiment of the present invention;
fig. 2 is a flowchart of a depth information acquiring method of a depth information acquiring apparatus combining structured light technology and TOF technology according to an embodiment of the present invention;
In the figures: 1, projection area on the measured object; 2, band-pass filter in the source band; 3, spatial modulator and imaging system; 4, image (structured light) sensor; 5, TOF sensor; 6, processor; 7, controller; 8, active light source.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the embodiments. One skilled in the relevant art will recognize, however, that the technology described herein can be practiced without one or more of the specific details, or with other methods, components, materials, etc.
As shown in fig. 1, an embodiment of the present invention provides a depth information acquisition device combining structured light technology and TOF technology. It includes an active light source emitter, a TOF sensor 5, a structured light sensor 4, a controller 7 and a processor 6; the active light source emitter and the processor 6 are both connected to the controller 7, and the TOF sensor 5 and the structured light sensor 4 are both connected to the processor 6.
In an embodiment of the present application, the active light source emitter includes an active light source 8, a spatial modulator and an imaging system 3. The controller 7 drives the active light source 8 to emit time-modulated light and also controls the spatial modulator; after modulation by the spatial modulator, the imaging system forms an illumination scene with a spatial light-dark distribution. In this embodiment the active light source 8 is an 850 nm laser module modulated as a sine wave at 12 MHz, the spatial modulator is a liquid crystal spatial modulator producing sinusoidal fringes with phase shifts of 2π/3 and 4π/3, and the imaging system is a microlens array.
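As an illustration of the fringe patterns this embodiment projects, the following sketch generates the three phase-shifted sinusoidal fringe images (shifts 0, 2π/3 and 4π/3). The image size, fringe period and horizontal fringe direction are assumed values for illustration, not taken from the patent.

```python
import numpy as np

def fringe_patterns(width=640, height=480, period_px=32):
    """Three sinusoidal fringe images phase-shifted by 0, 2*pi/3 and
    4*pi/3, as the liquid crystal modulator produces in the embodiment.
    Intensities lie in [0, 1] and the fringes vary along x; the
    resolution and period are assumed, not from the patent."""
    x = np.arange(width)
    shifts = [0.0, 2 * np.pi / 3, 4 * np.pi / 3]
    return [np.tile(0.5 + 0.5 * np.cos(2 * np.pi * x / period_px - s),
                    (height, 1))
            for s in shifts]
```

A useful property of equally spaced three-step patterns is that they sum to a constant, so the average of the three captured images recovers the unmodulated scene intensity.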
In an embodiment of the application, the time modulation of the active light source emitter is a sine wave, a fast square wave sequence or a pulse sequence. Active light source emitters can also be classified as pulsed laser projectors or coded structured light projectors.
In an embodiment of the present application, the structured light sensor 4 and the TOF sensor 5 are arranged with coincident fields of view. When the TOF sensor 5 and the structured light sensor 4 have separate lenses, their extrinsic parameters must first be calibrated, and the pixel mapping relies on the depth information obtained by TOF. Alternatively, the structured light sensor 4 and the TOF sensor 5 can share a beam splitter, acquiring images and depth information under identical conditions and maximising the common field of view.
In use, the TOF sensor and the structured light sensor are preferably placed in parallel, with a band-pass filter 2 in the source band mounted in front of each. The TOF sensor 5 receives the echo that the active light source 8 casts onto the scene target and that is reflected back, computes the time of flight from the time modulation of the signal, and thereby obtains the scene depth. The structured light sensor 4 captures the structured light image under illumination from the active light source 8.
When the beam-splitting arrangement is used, the relative rotation between the structured light sensor 4 and the TOF sensor 5 must also be calibrated beforehand and a mapping between pixel coordinates established. The depth information obtained by the TOF sensor 5 assists the structured light sensor 4 in computing depth, improving efficiency; conversely, the depth computed by the structured light sensor 4 can be fed back to the TOF sensor 5, improving its short-range accuracy and reducing noise. The structured light sensor 4 and the TOF sensor 5 may also be realised as a single sensor.
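The pixel-coordinate mapping described above can be sketched as the standard back-project/transform/reproject chain. The intrinsic matrices and extrinsics passed to this function stand in for the pre-calibrated values; the numbers used with it below are placeholders, not real calibration data.

```python
import numpy as np

def map_tof_to_sl(u, v, depth, K_tof, K_sl, R, t):
    """Map a TOF pixel (u, v) with measured depth (metres) into the
    structured light camera's image, given pre-calibrated intrinsics
    K_tof, K_sl and extrinsics (R, t) taking points from the TOF
    frame to the structured light frame."""
    # Back-project the pixel to a 3-D point in the TOF camera frame.
    p_tof = depth * np.linalg.inv(K_tof) @ np.array([u, v, 1.0])
    # Rigid transform into the structured light camera frame.
    p_sl = R @ p_tof + t
    # Re-project with the structured light camera intrinsics.
    uvw = K_sl @ p_sl
    return uvw[0] / uvw[2], uvw[1] / uvw[2]
```

With identical intrinsics, identity rotation and zero translation (the degenerate single-sensor case mentioned above), a pixel maps to itself, which gives a quick sanity check on a calibration pipeline.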
An embodiment of the invention further provides a depth information acquisition method using the device described above, comprising:
Step (1): the controller 7 drives the active light source emitter to project an illumination pattern with a spatial light-dark distribution onto the target measurement area; the TOF sensor 5 receives the echo reflected from the target area and computes the time of flight of the light to obtain the depth of each point, while the structured light sensor 4 captures the structured light image under this illumination to obtain intensity information;
Step (2): the depth information from multiple TOF sensor 5 frames is combined to cover the full field of view and mapped into the structured light image coordinates; the structured light sensor 4 then computes the subdivision code from the TOF depth and the optical triangulation principle, thereby obtaining more accurate depth information.
The invention considers how to combine the respective strengths of structured light and TOF to improve the accuracy and efficiency of depth computation within a given range, making the approach suitable for more scenes. It retains the high short-range resolution of structured light while greatly reducing the algorithmic load, increasing measurement speed and reducing the influence of ambient light, and it works indoors and outdoors at both short and long range. Compared with a naive fusion of multiple sensors, it offers lower cost, better integration and more stable performance.
Examples:
As shown in fig. 2, an embodiment of the present invention further provides a depth information acquiring method of a depth information acquiring apparatus using the above-mentioned combined structured light technology and TOF technology, including:
1. The controller 7 drives the active light source 8 to emit a sine-modulated optical signal at 850 nm wavelength and 12 MHz modulation frequency; the liquid crystal spatial modulator shapes it into sinusoidal fringes that illuminate the projection area 1 on the measured object, giving illumination that is temporally a high-frequency sine wave and spatially a sinusoidal fringe pattern.
2. The TOF sensor 5 receives the echo reflected by the target and computes the light's time of flight to obtain the depth of each point at a resolution of 320 × 240; the image sensor 4 synchronously captures the intensity image of the target modulated by the sinusoidal fringes, at a resolution of 640 × 480 and a frame rate of 100 Hz.
3. The controller 7 drives the liquid crystal spatial modulator to produce sinusoidal fringes with phase shifts of 2π/3 and 4π/3.
4. The TOF sensor 5 and the image sensor 4 record depth images and intensity images simultaneously for each fringe phase.
5. The internal and external parameters of the TOF sensor 5 and the structured light sensor 4 are calibrated in advance.
6. The three depth images received by the TOF sensor 5 are merged, each image's weighting coefficient being proportional to its TOF signal strength. HDR can be achieved during merging: depth data computed from different exposure times are synthesised into the final TOF depth over the optimal range of each. The depth data are then aligned to the image sensor 4 using the pre-calibrated extrinsic parameters of the TOF sensor 5 and the image sensor 4.
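Step 6's amplitude-weighted merge of the three TOF depth frames might look like the following sketch. Treating zero total amplitude as "no data" and using a simple linear weighting are assumptions of this illustration; the patent only specifies that weights are proportional to signal strength.

```python
import numpy as np

def merge_tof_frames(depths, amplitudes):
    """Merge several TOF depth frames pixel-wise, weighting each
    frame's depth by its TOF signal amplitude (step 6).  Pixels whose
    total amplitude is zero are returned as 0.0, an assumed 'invalid'
    convention."""
    d = np.stack(depths)       # shape (n_frames, H, W)
    w = np.stack(amplitudes)   # same shape, amplitudes >= 0
    total = w.sum(axis=0)
    # Guard the division where no frame saw any signal.
    merged = (d * w).sum(axis=0) / np.where(total > 0, total, 1.0)
    return np.where(total > 0, merged, 0.0)
```

The same routine covers the HDR case: frames taken at different exposure times simply enter with the amplitudes they achieved, so each pixel is dominated by the exposure that measured it best.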
7. From the merged depth data, the TOF sensor 5 yields a coarse code for each pixel: the index of the sinusoidal fringe period that contains it.
8. The image sensor 4 computes the subdivision code c of each pixel within its fringe period from the three acquired intensities I0, I1 and I2, using the three-step phase-shift formula c = atan(√3 · (I1 − I2) / (2·I0 − I1 − I2)), where √3 ≈ 1.732.
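Step 8's formula is the standard three-step phase-shifting relation, with 1.732 standing for √3. A sketch of it, using atan2 rather than atan so the full fringe period is covered without quadrant ambiguity:

```python
import math

def subdivision_code(i0, i1, i2):
    """Wrapped fringe phase from three intensity samples shifted by
    0, 2*pi/3 and 4*pi/3 — the formula of step 8, with atan replaced
    by atan2 for full-circle coverage.  Returns the fractional
    position within one fringe period, in [0, 1)."""
    phase = math.atan2(math.sqrt(3.0) * (i1 - i2), 2.0 * i0 - i1 - i2)
    return (phase % (2.0 * math.pi)) / (2.0 * math.pi)
```

For intensities of the form I_k = A + B·cos(φ − 2πk/3), the numerator evaluates to 3B·sin φ and the denominator to 3B·cos φ, so the background A and modulation B cancel and the wrapped phase φ is recovered exactly.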
9. The coarse code and the subdivision code are combined into the complete structured light code, and depth is computed by the triangulation principle. The TOF depth data are then fused with the structured light depth data, the structured light weight being inversely proportional to the square of the distance. The result is high-precision depth imaging at 640 × 480 resolution and 33 Hz.
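Step 9's combination of coarse and subdivision codes is effectively phase unwrapping: the TOF depth selects the fringe period, and the wrapped subdivision code refines the position within it. The sketch below assumes a fixed depth-per-fringe-period constant and a constant TOF weight of 1 in the fusion; both are illustrative choices, since the patent only states that the structured light weight is inversely proportional to the squared distance.

```python
def unwrap_with_tof(coarse_depth, fringe_frac, period_depth):
    """Combine the TOF coarse code with the structured light
    subdivision code: the TOF depth picks the integer fringe-period
    index k, and the wrapped fraction refines it.  period_depth is
    the depth change spanned by one fringe period, an assumed
    constant set by the triangulation geometry."""
    k = round(coarse_depth / period_depth - fringe_frac)
    return (k + fringe_frac) * period_depth

def fuse_depths(d_tof, d_sl, distance):
    """Weighted fusion of TOF and structured light depth; the SL
    weight falls off as 1/distance**2 as the text specifies, while
    the constant TOF weight of 1 is an assumption of this sketch."""
    w_sl = 1.0 / distance ** 2
    return (w_sl * d_sl + d_tof) / (w_sl + 1.0)
```

Note that unwrapping tolerates TOF error up to half a fringe period: a coarse reading of 2.51 m with a 0.1 m period and a fringe fraction of 0.37 still snaps to the correct refined depth of 2.537 m.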
The above describes only preferred embodiments of the present invention and is not intended to limit it; those skilled in the art may make various modifications and variations. Any modification, equivalent replacement or improvement made within the spirit and principles of the present invention shall fall within its scope of protection.
Claims (4)
1. A depth information acquisition device combining structured light technology and TOF technology, characterised by comprising an active light source emitter, a TOF sensor, a structured light sensor, a controller and a processor, wherein the active light source emitter and the processor are connected to the controller, and the TOF sensor and the structured light sensor are connected to the processor;
the controller controls the spatial modulator, and the time-modulated light, after modulation by the spatial modulator, is formed by the imaging system into an illumination scene with a spatial light-dark distribution;
the structured light sensor and the TOF sensor are arranged with coincident fields of view;
the structured light sensor and the TOF sensor acquire images and depth information under identical conditions through a beam splitter, maximising the common field of view;
when the beam-splitting arrangement is used, the relative rotation between the structured light sensor and the TOF sensor must also be calibrated beforehand and a mapping between pixel coordinates established.
2. The depth information acquisition device of claim 1, wherein the TOF sensor and the structured light sensor have separate lenses whose extrinsic parameters require calibration, the pixel mapping relying on the depth information obtained by TOF.
3. The depth information acquisition device of claim 1, wherein the time modulation of the active light source emitter is a sine wave, a fast square wave sequence or a pulse sequence.
4. A depth information acquisition method using the depth information acquisition device of claim 1, comprising:
Step (1): the controller drives the active light source emitter to project an illumination pattern with a spatial light-dark distribution onto the target measurement area; the TOF sensor receives the echo reflected from the target area and computes the time of flight of the light to obtain the depth of each point, while the structured light sensor captures the structured light image under this illumination to obtain intensity information;
Step (2): the depth information from multiple TOF frames is combined to cover the full field of view and mapped into the structured light image coordinates; the structured light sensor then computes the subdivision code from the TOF depth and the optical triangulation principle, thereby obtaining more accurate depth information.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202011264676.3A (CN112379389B) | 2020-11-11 | 2020-11-11 | Depth information acquisition device and method combining structured light camera and TOF depth camera |
Publications (2)

| Publication Number | Publication Date |
|---|---|
| CN112379389A | 2021-02-19 |
| CN112379389B | 2024-04-26 |
Family
ID=74583508
Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202011264676.3A (CN112379389B, active) | Depth information acquisition device and method combining structured light camera and TOF depth camera | 2020-11-11 | 2020-11-11 |

Country Status (1)

| Country | Link |
|---|---|
| CN | CN112379389B |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113687369A (en) * | 2021-07-14 | 2021-11-23 | 南京大学 | Synchronous acquisition system and method for spectral information and depth information |
CN114067575A (en) * | 2021-11-23 | 2022-02-18 | 安徽富煌科技股份有限公司 | Traffic hub region safety analysis device based on 3D structured light detection |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104903677A (en) * | 2012-12-17 | 2015-09-09 | Lsi公司 | Methods and apparatus for merging depth images generated using distinct depth imaging techniques |
CN105705962A (en) * | 2013-06-06 | 2016-06-22 | 新加坡恒立私人有限公司 | Sensor system with active illimination |
CN106527761A (en) * | 2015-09-10 | 2017-03-22 | 义明科技股份有限公司 | Non-contact optical sensing device and three-dimensional object depth position sensing method |
CN106772431A (en) * | 2017-01-23 | 2017-05-31 | 杭州蓝芯科技有限公司 | A kind of Depth Information Acquistion devices and methods therefor of combination TOF technologies and binocular vision |
CN107807490A (en) * | 2017-09-26 | 2018-03-16 | 中国科学院长春光学精密机械与物理研究所 | Method and system based on double camera spectroscopic imaging increase visual field |
CN110488240A (en) * | 2019-07-12 | 2019-11-22 | 深圳奥比中光科技有限公司 | Depth calculation chip architecture |
CN111031278A (en) * | 2019-11-25 | 2020-04-17 | 广州恒龙信息技术有限公司 | Monitoring method and system based on structured light and TOF |
CN111263045A (en) * | 2020-02-26 | 2020-06-09 | 深圳奥比中光科技有限公司 | Electronic equipment |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106772431B (en) | A kind of Depth Information Acquistion devices and methods therefor of combination TOF technology and binocular vision | |
CN109798838B (en) | ToF depth sensor based on laser speckle projection and ranging method thereof | |
CN111239729B (en) | Speckle and floodlight projection fused ToF depth sensor and distance measuring method thereof | |
US9501833B2 (en) | Method and system for providing three-dimensional and range inter-planar estimation | |
CN111123289B (en) | Depth measuring device and measuring method | |
CN112379389B (en) | Depth information acquisition device and method combining structured light camera and TOF depth camera | |
CN113538591A (en) | Calibration method and device for distance measuring device and camera fusion system | |
US10712432B2 (en) | Time-of-light-based systems using reduced illumination duty cycles | |
JP2022504010A (en) | Depth measurement assembly with structured light source and flight time camera | |
CN108362228B (en) | Double-optical-machine-based optical knife grating hybrid three-dimensional measurement device and measurement method | |
CN111045029A (en) | Fused depth measuring device and measuring method | |
WO2014101408A1 (en) | Three-dimensional imaging radar system and method based on a plurality of times of integral | |
CN112255639B (en) | Depth perception sensor and depth perception sensing module for region of interest | |
CN105004324A (en) | Monocular vision sensor with triangulation ranging function | |
CN110618537B (en) | Coated lens device and three-dimensional reconstruction imaging system applying same | |
CN112799080A (en) | Depth sensing device and method | |
CN111366943A (en) | Flight time ranging system and ranging method thereof | |
CN112987021B (en) | Structured light three-dimensional imaging system and method integrating time-of-flight method and structured light method | |
CN110595390A (en) | Stripe projection device based on rectangular pyramid reflector and three-dimensional reconstruction imaging system | |
CN210803719U (en) | Depth image imaging device, system and terminal | |
CN116718133A (en) | Short-distance single-point structured light three-dimensional measurement method | |
CN111679099A (en) | Accelerometer calibration method and device based on coherent light vision optical flow detection | |
CN115824170A (en) | Method for measuring ocean waves by combining photogrammetry and laser radar | |
CN103697825A (en) | System and method of utilizing super-resolution 3D (three-dimensional) laser to measure | |
CN113311451B (en) | Laser speckle projection TOF depth perception method and device |
Legal Events

| Code | Title |
|---|---|
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |
| GR01 | Patent grant |