CN112379389A - Depth information acquisition device and method combining structured light camera and TOF depth camera - Google Patents

Depth information acquisition device and method combining structured light camera and TOF depth camera

Info

Publication number
CN112379389A
CN112379389A (application number CN202011264676.3A)
Authority
CN
China
Prior art keywords
depth information
tof
structured light
sensor
light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011264676.3A
Other languages
Chinese (zh)
Other versions
CN112379389B (en)
Inventor
孙乐韵
徐永奎
杨静
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Lanxin Technology Co ltd
Original Assignee
Hangzhou Lanxin Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Lanxin Technology Co ltd filed Critical Hangzhou Lanxin Technology Co ltd
Priority to CN202011264676.3A priority Critical patent/CN112379389B/en
Publication of CN112379389A publication Critical patent/CN112379389A/en
Application granted granted Critical
Publication of CN112379389B publication Critical patent/CN112379389B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/8943D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Measurement Of Optical Distance (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a depth information acquisition device combining structured light technology and TOF technology, comprising an active light source emitting device, a TOF sensor, a structured light sensor, a controller and a processor; the active light source emitting device and the processor are connected to the controller, and the TOF sensor and the structured light sensor are connected to the processor. The method comprises the following steps: the controller controls the active light source emitting device to form an illumination scene with a spatial bright-dark distribution and irradiate a target measurement area; the TOF sensor receives the echo signal reflected by the target measurement area and the light-wave time of flight is calculated to obtain depth information for each point, while the structured light sensor collects a structured light image under this illumination to obtain intensity information. Depth information obtained from multiple frames of the TOF sensor is then combined to realize depth measurement over the complete field of view and mapped into the structured light image coordinates, and the structured light sensor calculates the subdivision code according to the TOF depth information and the optical triangulation principle, thereby obtaining more accurate depth information.

Description

Depth information acquisition device and method combining structured light camera and TOF depth camera
Technical Field
The invention relates to the technical field of depth sensors, machine vision, three-dimensional reconstruction, TOF (time of flight) technology and structured light, in particular to a depth information acquisition device and method combining a structured light camera and a TOF depth camera.
Background
In recent years, common depth information acquisition methods include 3D structured light, TOF (Time of Flight) and binocular stereo. Depth information acquisition is indispensable for applications such as image recognition and processing, scene understanding, VR, AR and robotics, but each acquisition method has its own limitations. Structured light calculates depth from the size and shape of a pattern projected onto the observed object; it requires relatively little computation, has low power consumption and high accuracy at close range and suits static scenes, but it is affected by strong natural light and by reflections, is unsuitable for outdoor scenes, and the process is relatively complex, costly and slow. TOF calculates depth from the emitted and reflected light pulses; it favours device miniaturisation, offers a high frame rate, a simple algorithm and high speed, suits dynamic scenes, has a larger measurement range suitable for long distances, does not lose accuracy as the measurement distance grows and has some robustness to strong outdoor light, but it is affected by multiple reflections and currently has low resolution and high power consumption. Binocular stereo involves complex computation, is strongly affected by the environment and has poor reliability.
Therefore, a device and method that acquire depth information by combining a structured light camera with a TOF depth camera can retain the high close-range precision of structured light, improve measurement efficiency, combine depth information according to the conditions at hand, and be applied to a wider range of scenes.
Disclosure of Invention
The embodiment of the invention aims to provide a depth information acquisition device and method combining a structured light camera and a TOF depth camera, so as to solve the problems of long measurement time and low measurement efficiency in the prior art and to enhance the accuracy and reliability of depth information acquisition in dynamic scenes. At the same time, the spatially coded structured light illumination helps to reduce the TOF multipath scattering problem and enhances the accuracy of TOF ranging.
The technical scheme adopted by the invention for solving the technical problems is as follows:
in a first aspect, an embodiment of the present invention provides a depth information acquiring apparatus combining a structured light technology and a TOF technology, including an active light source emitting device, a TOF sensor, a structured light sensor, a controller, and a processor, where the active light source emitting device and the processor are both connected to the controller, and the TOF sensor and the structured light sensor are both connected to the processor.
Furthermore, the active light source emitting device comprises an active light source, a spatial modulator and an imaging system. The controller controls the active light source to emit temporally modulated light and also controls the spatial modulator; after the temporally modulated light is modulated by the spatial modulator, the imaging system forms an illumination scene with a spatial bright-dark distribution.
Further, the structured light sensor and the TOF sensor are arranged with a coincident field of view.
Furthermore, when the TOF sensor and the structured light sensor have respective lenses, extrinsic calibration needs to be performed first, and the pixel mapping relationship between them depends on the depth information obtained by the TOF sensor.
Furthermore, the structured light sensor and the TOF sensor may use a beam-splitting method to collect the image and the depth information under identical conditions, so that the maximum field of view is realized.
Further, when the structured light sensor and the TOF sensor use the beam-splitting method to acquire the image and the depth information under identical conditions, the relative rotation between the structured light sensor and the TOF sensor still needs to be calibrated before implementation, and a mapping relationship between their pixel coordinates established.
Further, the time modulation of the active light source emitting device is a sine wave, a fast square wave sequence or a pulse sequence.
In a second aspect, an embodiment of the present invention further provides a depth information acquiring method using the depth information acquiring apparatus combining the structured light technology and the TOF technology, where the method includes:
step (1): the controller controls the active light source emitting device to form an illumination scene with a spatial bright-dark distribution and irradiate the target measurement area; the TOF sensor receives the echo signals reflected by the target measurement area, and the light-wave time of flight is calculated to obtain depth information for each point; meanwhile, the structured light sensor collects a structured light image under this illumination to obtain intensity information;
and step (2): depth information obtained from multiple frames of the TOF sensor is combined to realize depth measurement over the complete field of view and is mapped into the structured light image coordinates, and the structured light sensor calculates the subdivision code according to the TOF depth information and the optical triangulation principle, thereby obtaining more accurate depth information.
According to the technical scheme, the invention has the following beneficial effects:
the invention realizes the high-resolution rapid depth information acquisition in indoor and outdoor environments with lower cost and is suitable for short distance and long distance. Have advantages such as the short-range high accuracy of rapid survey and the structured light of TOF technique concurrently, reduced the risk that can't discern and misidentification, stability and reliability are better.
A low-resolution depth image and a high-resolution structured light image are acquired simultaneously; the depth image is mapped into the structured light coordinate system, the subdivision code is combined with the coarse code, and high-resolution depth information is obtained according to the optical triangulation principle, which greatly reduces algorithm complexity and computation time. The method and the device use a single active light source emitting device capable of both spatial and temporal modulation: the spatial modulation projects a pattern with a certain spatial distribution onto the target, while the temporal modulation is used by the TOF sensor to acquire the depth image. Both the TOF sensor and the structured light sensor respond only to the active light source, and same-waveband optical filters are added to reduce the influence of ambient light, so recognizable images can be obtained at short or long range under both strong and weak ambient light.
Drawings
For a better understanding, the invention will be explained in more detail in the following description with reference to the drawings. It is to be understood that the invention is not limited to this exemplary embodiment, and that the specified features may also be combined and/or modified as appropriate without departing from the scope of the invention as defined by the claims. In the drawings:
fig. 1 is a structural diagram of a depth information acquiring apparatus combining a structured light technology and a TOF technology according to an embodiment of the present invention;
fig. 2 is a flowchart of a depth information acquiring method of a depth information acquiring apparatus combining a structured light technique and a TOF technique according to an embodiment of the present invention;
In the figures: 1. projection area of the measured object; 2. same-waveband optical filter; 3. spatial modulator and imaging system; 4. image sensor (structured light sensor); 5. TOF sensor; 6. processor; 7. controller; 8. active light source.
Detailed Description
In the following description, numerous specific details are set forth to provide a thorough understanding of embodiments. One skilled in the relevant art will recognize, however, that the techniques described herein can be practiced without one or more of the specific details, or with other methods, components, materials, and so forth.
As shown in fig. 1, an embodiment of the present invention provides a depth information acquiring apparatus combining a structured light technology and a TOF technology, including an active light source emitting apparatus, a TOF sensor 5, a structured light sensor 4, a controller 7, and a processor 6, where the active light source emitting apparatus and the processor 6 are both connected to the controller 7, and the TOF sensor 5 and the structured light sensor 4 are both connected to the processor 6.
In an embodiment of the present application, the active light source emitting device includes the active light source 8 and the spatial modulator and imaging system 3. The controller 7 controls the active light source 8 to emit temporally modulated light and also controls the spatial modulator; after the temporally modulated light is modulated by the spatial modulator, the imaging system forms an illumination scene with a spatial bright-dark distribution. In this embodiment, an 850 nm laser module is adopted as the active light source 8, the temporal modulation of the active light source 8 is a sine wave with a modulation frequency of 12 MHz, the spatial modulator is a liquid crystal spatial modulator that generates sinusoidal fringes with phase shifts of 2π/3 and 4π/3, and the imaging system is a micro-lens array.
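As a toy illustration of the spatial modulation described above, the sketch below generates a set of phase-shifted sinusoidal fringe patterns such as the liquid crystal spatial modulator could display. The phase shifts (0, 2π/3, 4π/3) follow the embodiment; the resolution, the fringe period in pixels and the function name are assumptions made only for illustration.

```python
import numpy as np

def fringe_patterns(width=640, height=480, period_px=32,
                    phase_shifts=(0.0, 2 * np.pi / 3, 4 * np.pi / 3)):
    """Generate phase-shifted sinusoidal fringe patterns for the spatial modulator.

    Returns a list of H x W arrays with values in [0, 1], one per phase shift.
    """
    x = np.arange(width)
    patterns = []
    for delta in phase_shifts:
        row = 0.5 + 0.5 * np.cos(2 * np.pi * x / period_px + delta)  # one horizontal intensity profile
        patterns.append(np.tile(row, (height, 1)))                   # vertical stripes across the frame
    return patterns
```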
In an embodiment of the present application, the temporal modulation of the active light source emitting device is a sine wave, a fast square wave sequence or a pulse sequence. The active light source emitting device may also be split into two parts: a pulsed laser projector and a coded structured light projector.
In an embodiment of the present application, the structured light sensor 4 and the TOF sensor 5 are arranged with respective lenses; extrinsic calibration is performed first, and the pixel mapping relationship between them depends on the depth information obtained by the TOF sensor. Alternatively, the structured light sensor 4 and the TOF sensor 5 can use a beam-splitting method to collect the image and the depth information under identical conditions, so that the maximum field of view is realized.
In use, the TOF sensor 5 and the structured light sensor 4 are preferably placed in parallel, and a same-waveband optical filter 2 is added in front of each of them. The TOF sensor 5 receives the echo signal that the active light source 8 emits towards the scene target and that is reflected back, and calculates the time of flight according to the temporal modulation of the signal to obtain the scene depth information. The structured light sensor 4 captures a structured light image under illumination by the active light source 8.
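As an illustration of how the flight time can be recovered from the temporal modulation, the following sketch applies the standard four-bucket demodulation of a continuous-wave signal at the embodiment's 12 MHz modulation frequency. The four-sample scheme, the sign convention and the amplitude-based confidence value are assumptions for illustration; the patent does not specify the demodulation used inside the TOF sensor 5.

```python
import numpy as np

C = 299_792_458.0   # speed of light, m/s
F_MOD = 12e6        # modulation frequency from the embodiment, Hz

def tof_depth_from_four_phases(q0, q1, q2, q3):
    """Recover depth from four correlation samples taken at 0, 90, 180 and 270 degrees.

    q0..q3 are per-pixel arrays of accumulated charge.
    Returns depth in metres (unambiguous up to c / (2 * F_MOD), about 12.5 m at 12 MHz)
    and an amplitude map usable later as a confidence weight.
    """
    phase = np.arctan2(q3 - q1, q0 - q2)          # phase delay of the echo, (-pi, pi]
    phase = np.mod(phase, 2 * np.pi)              # wrap to [0, 2*pi)
    depth = C * phase / (4 * np.pi * F_MOD)       # d = c * phi / (4 * pi * f)
    amplitude = 0.5 * np.hypot(q3 - q1, q0 - q2)  # modulation amplitude of the echo
    return depth, amplitude
```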
When the structured light sensor 4 and the TOF sensor 5 acquire the image and the depth information under identical conditions through a beam-splitting method, the relative rotation between the two sensors still needs to be calibrated beforehand and a mapping relationship between their pixel coordinates established. The depth information obtained by the TOF sensor 5 assists the structured light sensor 4 in calculating its depth information, which improves efficiency; in return, the depth information obtained by the structured light sensor 4 can be fed back to the TOF sensor 5, improving the TOF sensor's measurement accuracy at close range and reducing noise. The structured light sensor 4 and the TOF sensor 5 may also be implemented as a single shared sensor.
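The pixel mapping relied on above can be sketched as follows, assuming pinhole intrinsics for both sensors and a calibrated rotation and translation between them. All names (K_tof, K_sl, R, t) and the nearest-pixel projection are illustrative assumptions rather than the patent's own procedure.

```python
import numpy as np

def map_tof_depth_to_structured_light(depth_tof, K_tof, K_sl, R, t, sl_shape):
    """Project a TOF depth map (H_t x W_t) into the structured light sensor frame.

    K_tof, K_sl : 3x3 pinhole intrinsic matrices (from prior calibration).
    R, t        : rotation (3x3) and translation (3,) from the TOF frame to the SL frame.
    Returns a sparse depth map expressed in structured light pixel coordinates.
    """
    h, w = depth_tof.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_tof.ravel()
    valid = z > 0

    # back-project TOF pixels to 3D points in the TOF camera frame
    rays = np.linalg.inv(K_tof) @ np.vstack([u.ravel(), v.ravel(), np.ones(h * w)])
    pts_tof = rays[:, valid] * z[valid]

    # transform into the structured light camera frame and re-project
    pts_sl = R @ pts_tof + t[:, None]
    proj = K_sl @ pts_sl
    u_sl = np.round(proj[0] / proj[2]).astype(int)
    v_sl = np.round(proj[1] / proj[2]).astype(int)

    depth_sl = np.zeros(sl_shape)
    inside = (u_sl >= 0) & (u_sl < sl_shape[1]) & (v_sl >= 0) & (v_sl < sl_shape[0])
    depth_sl[v_sl[inside], u_sl[inside]] = pts_sl[2, inside]
    return depth_sl
```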
The embodiment of the present invention further provides a depth information acquiring method using the depth information acquiring apparatus combining the structured light technology and the TOF technology, including:
step (1): the controller 7 controls the active light source emitting device to form an illumination scene with a spatial bright-dark distribution and irradiate the target measurement area; the TOF sensor 5 receives the echo signal reflected by the target measurement area, and the light-wave time of flight is calculated to obtain depth information for each point; meanwhile, the structured light sensor 4 collects a structured light image under this illumination to obtain intensity information;
and step (2): depth information obtained from multiple frames of the TOF sensor 5 is combined to realize depth measurement over the complete field of view and is mapped into the structured light image coordinates, and the structured light sensor 4 calculates the subdivision code according to the TOF depth information and the optical triangulation principle, thereby obtaining more accurate depth information.
The invention considers how to combine the respective strengths of structured light and TOF technology, improving the precision and efficiency of depth calculation within a given range so that the method suits more scenes. It retains the high close-range resolution of structured light while greatly reducing algorithmic requirements, increasing measurement speed and reducing the influence of ambient light, making it suitable for indoor and outdoor use at both near and far distances. Compared with a simple fusion of multiple sensors, the method offers low cost, a high level of integration and stable performance.
Example:
as shown in fig. 2, an embodiment of the present invention further provides a depth information acquiring method using the depth information acquiring apparatus combining the structured light technology and the TOF technology, including:
1. The controller 7 controls the active light source 8 to emit a sine-wave light signal with a wavelength of 850 nm and a modulation frequency of 12 MHz; the signal is modulated by the liquid crystal spatial modulator to form sinusoidal fringes that irradiate the projection area 1 of the measured object, realizing illumination that is temporally modulated by a high-frequency sine wave and spatially distributed as sinusoidal fringes.
2. The TOF sensor 5 receives the echo signals reflected by the target and calculates the time of flight of the light waves to obtain depth information for each point at a resolution of 320 × 240; the image sensor 4 synchronously collects target intensity images modulated by the sinusoidal fringes at a resolution of 640 × 480 and a frame rate of 100 Hz.
3. The controller 7 controls the liquid crystal spatial modulator to generate sinusoidal fringes with phase shifts of 2π/3 and 4π/3.
4. The TOF sensor 5 and the image sensor 4 synchronously receive the two further depth images and the corresponding intensity images.
5. The intrinsic and extrinsic parameters of the TOF sensor 5 and the structured light sensor 4 are calibrated in advance.
6. The three depth images received by the TOF sensor 5 are combined, the weighting coefficient of each image being proportional to the TOF signal intensity. HDR can be achieved when the depth images are combined, that is, depth data are calculated for different exposure times and the final TOF depth data are synthesized from the optimal range of each. The depth data are then aligned to the image sensor 4 using the pre-calibrated parameters of the TOF sensor 5 and the image sensor 4 (an illustrative sketch of this weighted merge is given after this example).
7. The TOF sensor 5 calculates a coarse code for each pixel from the depth data, with a coding accuracy of one sinusoidal fringe period.
8. The image sensor 4 calculates the subdivision code of each pixel from the three collected intensity images i0, i1, i2 according to the formula C = atan(1.732 × (i1 − i2) / (2·i0 − i1 − i2)) × N, where N is the period of the sinusoidal fringes (a sketch of this phase-shifting calculation follows the example).
9. The coarse code and the subdivision code are combined to obtain the complete structured light code, and the depth information is calculated according to the triangulation principle. The TOF depth data are then combined with the structured light depth data, the weight of the structured light depth data being inversely proportional to the square of the distance. Finally, high-precision depth image acquisition at 640 × 480 resolution and 33 Hz is realized (the code combination and depth fusion are sketched after this example).
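A minimal sketch of the weighted, HDR-style merge described in step 6, assuming each TOF depth frame comes with a per-pixel signal amplitude map (such as the one returned by the demodulation sketch earlier). The amplitude thresholds used to reject noisy or saturated pixels are illustrative assumptions, not values from the patent.

```python
import numpy as np

def merge_tof_frames(depths, amplitudes, amp_min=20.0, amp_max=4000.0):
    """Merge several TOF depth frames, weighting each pixel by its signal amplitude.

    depths, amplitudes : lists of H x W arrays from consecutive frames or exposures.
    Pixels whose amplitude is too low (noisy) or too high (saturated) get zero weight,
    which is one simple way to realise an HDR-style combination.
    """
    depths = np.stack(depths)
    amps = np.stack(amplitudes)

    weights = np.where((amps > amp_min) & (amps < amp_max), amps, 0.0)
    total = weights.sum(axis=0)

    return np.where(total > 0,
                    (weights * depths).sum(axis=0) / np.maximum(total, 1e-9),
                    0.0)
```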
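The formula of step 8 can be written out as the following sketch of three-step phase shifting (1.732 ≈ √3). Using atan2 to recover the full fringe phase and normalising the result to a position within one period are assumptions about how the code is used downstream.

```python
import numpy as np

def subdivision_code(i0, i1, i2, period_n):
    """Fine (subdivision) code from three fringe images phase-shifted by 0, 2*pi/3 and 4*pi/3.

    Implements C = atan(1.732 * (i1 - i2) / (2*i0 - i1 - i2)) * N from the embodiment,
    with atan2 so the whole 2*pi fringe phase is recovered rather than half of it.
    """
    phase = np.arctan2(np.sqrt(3.0) * (i1 - i2), 2.0 * i0 - i1 - i2)  # wrapped fringe phase
    frac = np.mod(phase, 2.0 * np.pi) / (2.0 * np.pi)                 # fraction of one fringe period
    return frac * period_n                                            # position within the period, scaled by N
```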
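Step 9's combination and fusion might look like the following sketch: the complete code is taken as whole fringe periods from the coarse (TOF-derived) code plus the fine position within a period, and the TOF and structured light depths are blended with the structured light weight falling off as the square of the distance. The constant TOF weight and the bookkeeping are illustrative assumptions.

```python
import numpy as np

def complete_code(coarse_period_index, fine_code, period_n):
    """Complete structured light code: whole fringe periods from the coarse code
    plus the position within one period from the fine code."""
    return coarse_period_index * period_n + np.mod(fine_code, period_n)

def fuse_depths(depth_tof, depth_sl):
    """Blend TOF and structured light depth maps; the structured light weight is
    inversely proportional to the square of the distance, as described in step 9."""
    w_sl = np.where(depth_sl > 0, 1.0 / np.maximum(depth_sl, 1e-6) ** 2, 0.0)
    w_tof = np.where(depth_tof > 0, 1.0, 0.0)   # constant TOF weight (assumption)
    total = w_sl + w_tof
    return np.where(total > 0,
                    (w_sl * depth_sl + w_tof * depth_tof) / np.maximum(total, 1e-9),
                    0.0)
```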
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A depth information acquisition device combining structured light technology and TOF technology, characterized by comprising an active light source emitting device, a TOF sensor, a structured light sensor, a controller and a processor, wherein the active light source emitting device and the processor are both connected to the controller, and the TOF sensor and the structured light sensor are both connected to the processor.
2. The depth information acquiring apparatus combining the structured light technology and the TOF technology according to claim 1, wherein the active light source emitting device comprises an active light source, a spatial modulator and an imaging system; the controller controls the active light source to emit temporally modulated light and controls the spatial modulator, and after the temporally modulated light is modulated by the spatial modulator, the imaging system forms an illumination scene with a spatial bright-dark distribution.
3. The combined structured light and TOF technique depth information acquisition apparatus according to claim 1 wherein the structured light sensor and TOF sensor are arranged with a coincident field of view.
4. The depth information acquiring apparatus combining the structured light technology and the TOF technology according to claim 3, wherein the TOF sensor and the structured light sensor have respective lenses, extrinsic calibration is performed first, and the pixel mapping relationship between them depends on the depth information acquired by the TOF sensor.
5. The depth information acquiring device combining the structured light technology and the TOF technology according to claim 3, wherein the structured light sensor and the TOF sensor use a beam-splitting method to acquire the image and the depth information under identical conditions, so as to realize the maximum field of view.
6. The apparatus according to claim 5, wherein, when the structured light sensor and the TOF sensor use the beam-splitting method to collect the image and the depth information under identical conditions, the relative rotation between the structured light sensor and the TOF sensor also needs to be calibrated before implementation, so as to establish the mapping relationship between their pixel coordinates.
7. The combined structured light and TOF technique depth information acquisition apparatus according to claim 1 wherein the time modulation of the active light source emitting device is a sine wave, a fast square wave sequence or a pulse sequence.
8. A depth information acquisition method using the depth information acquisition apparatus combining the structured light technique and the TOF technique according to claim 1, comprising:
step (1): the controller controls the active light source emitting device to form an illumination scene with a spatial bright-dark distribution and irradiate the target measurement area; the TOF sensor receives the echo signals reflected by the target measurement area, and the light-wave time of flight is calculated to obtain depth information for each point; meanwhile, the structured light sensor collects a structured light image under this illumination to obtain intensity information;
and step (2): depth information obtained from multiple frames of the TOF sensor is combined to realize depth measurement over the complete field of view and is mapped into the structured light image coordinates, and the structured light sensor calculates the subdivision code according to the TOF depth information and the optical triangulation principle, thereby obtaining more accurate depth information.
9. The depth information acquiring method according to claim 8, wherein the active light source emitting device includes an active light source, a spatial modulator and an imaging system; the controller controls the active light source to emit temporally modulated light and controls the spatial modulator, and after the temporally modulated light is modulated by the spatial modulator, the imaging system forms an illumination scene with a spatial bright-dark distribution.
10. The depth information acquisition method according to claim 8, wherein the structured light sensor and the TOF sensor are arranged with a coincident field of view.
CN202011264676.3A 2020-11-11 2020-11-11 Depth information acquisition device and method combining structured light camera and TOF depth camera Active CN112379389B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011264676.3A CN112379389B (en) 2020-11-11 2020-11-11 Depth information acquisition device and method combining structured light camera and TOF depth camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011264676.3A CN112379389B (en) 2020-11-11 2020-11-11 Depth information acquisition device and method combining structured light camera and TOF depth camera

Publications (2)

Publication Number Publication Date
CN112379389A true CN112379389A (en) 2021-02-19
CN112379389B CN112379389B (en) 2024-04-26

Family

ID=74583508

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011264676.3A Active CN112379389B (en) 2020-11-11 2020-11-11 Depth information acquisition device and method combining structured light camera and TOF depth camera

Country Status (1)

Country Link
CN (1) CN112379389B (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104903677A (en) * 2012-12-17 2015-09-09 Lsi公司 Methods and apparatus for merging depth images generated using distinct depth imaging techniques
CN105705962A (en) * 2013-06-06 2016-06-22 新加坡恒立私人有限公司 Sensor system with active illimination
CN106527761A (en) * 2015-09-10 2017-03-22 义明科技股份有限公司 Non-contact optical sensing device and three-dimensional object depth position sensing method
CN106772431A (en) * 2017-01-23 2017-05-31 杭州蓝芯科技有限公司 A kind of Depth Information Acquistion devices and methods therefor of combination TOF technologies and binocular vision
CN107807490A (en) * 2017-09-26 2018-03-16 中国科学院长春光学精密机械与物理研究所 Method and system based on double camera spectroscopic imaging increase visual field
CN110488240A (en) * 2019-07-12 2019-11-22 深圳奥比中光科技有限公司 Depth calculation chip architecture
CN111031278A (en) * 2019-11-25 2020-04-17 广州恒龙信息技术有限公司 Monitoring method and system based on structured light and TOF
CN111263045A (en) * 2020-02-26 2020-06-09 深圳奥比中光科技有限公司 Electronic equipment

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113687369A (en) * 2021-07-14 2021-11-23 南京大学 Synchronous acquisition system and method for spectral information and depth information
CN114067575A (en) * 2021-11-23 2022-02-18 安徽富煌科技股份有限公司 Traffic hub region safety analysis device based on 3D structured light detection

Also Published As

Publication number Publication date
CN112379389B (en) 2024-04-26

Similar Documents

Publication Publication Date Title
CN106772431B (en) A kind of Depth Information Acquistion devices and methods therefor of combination TOF technology and binocular vision
CN111025317B (en) Adjustable depth measuring device and measuring method
CN113016177B (en) Depth measurement assembly with structured light source and time-of-flight camera
CN111045029B (en) Fused depth measuring device and measuring method
CN111123289B (en) Depth measuring device and measuring method
CN113538591A (en) Calibration method and device for distance measuring device and camera fusion system
CN107783353B (en) Device and system for capturing three-dimensional image
CN111538024B (en) Filtering ToF depth measurement method and device
US20160299218A1 (en) Time-of-light-based systems using reduced illumination duty cycles
CN113538592A (en) Calibration method and device for distance measuring device and camera fusion system
CN112379389B (en) Depth information acquisition device and method combining structured light camera and TOF depth camera
CN110007289B (en) Motion artifact reduction method based on time-of-flight depth camera
CN110609299A (en) Three-dimensional imaging system based on TOF
CN108362228A (en) A kind of hybrid three-dimensional measuring apparatus of finishing tool grating and measurement method based on double ray machines
CN209676383U (en) Depth camera mould group, depth camera, mobile terminal and imaging device
CN112987021B (en) Structured light three-dimensional imaging system and method integrating time-of-flight method and structured light method
CN111366943A (en) Flight time ranging system and ranging method thereof
CN210803719U (en) Depth image imaging device, system and terminal
CN116718133A (en) Short-distance single-point structured light three-dimensional measurement method
CN109100740B (en) Three-dimensional image imaging device, imaging method and system
CN115824170A (en) Method for measuring ocean waves by combining photogrammetry and laser radar
CN103697825A (en) System and method of utilizing super-resolution 3D (three-dimensional) laser to measure
US20220244392A1 (en) High resolution lidar scanning
CN216133412U (en) Distance measuring device and camera fusion system
TWI630431B (en) Device and system for capturing 3-d images

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant