CN106772431B - Depth information acquisition device combining TOF technology and binocular vision, and method therefor - Google Patents

Depth information acquisition device combining TOF technology and binocular vision, and method therefor

Info

Publication number
CN106772431B
CN106772431B (application CN201710050801.2A)
Authority
CN
China
Prior art keywords
depth information
depth
image
tof
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710050801.2A
Other languages
Chinese (zh)
Other versions
CN106772431A (en)
Inventor
杨静
时岭
高勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Blue Core Technology Co Ltd
Original Assignee
Hangzhou Blue Core Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Blue Core Technology Co Ltd filed Critical Hangzhou Blue Core Technology Co Ltd
Priority to CN201710050801.2A
Publication of CN106772431A
Application granted
Publication of CN106772431B
Active legal status: Current
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751 Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

The invention discloses a depth information acquisition device combining TOF technology and binocular vision, and a method therefor. The device includes an active light source emitter, a TOF sensor, left and right image sensors, a controller, and a processor. The active light source emitter includes a light source, a spatial modulator, and an imaging system arranged in sequence. The active light source emitter and the TOF sensor are connected to the controller; the TOF sensor and the left and right image sensors are connected to the processor. The active light source emitter illuminates the scene with temporally and spatially modulated light: the TOF sensor samples the time-modulated signal to obtain scene depth information, while the left and right image sensors capture the spatially modulated signal to obtain scene image information. The depth information is then used as an additional constraint for feature matching on the scene images, yielding a high-resolution depth image. The invention realizes fast, high-resolution depth information acquisition in both indoor and outdoor environments at low cost, and is suitable for both short and relatively long ranges.

Description

Depth information acquisition device combining TOF technology and binocular vision, and method therefor
Technical field
The present invention relates to the technical fields of depth sensors, machine vision, three-dimensional reconstruction, binocular stereo vision, and TOF, and in particular to a depth information acquisition device combining TOF technology and binocular vision and a method therefor.
Background technique
In recent years, depth information has found more and more applications in sensors. The main technologies for obtaining depth information are binocular stereo matching, TOF (Time of Flight), and monocular structured light. These technologies add depth information to a sensor and are widely used in fields such as image recognition and processing, scene understanding, VR, AR, and robotics. However, the main products currently on the market still have their own application ranges and limitations. For example, binocular stereo matching relies on complex algorithms, so its hardware requirements are high, its computation time is long, and its recognition of targets with little texture is poor. Pulsed TOF depth measurement is very expensive and, because of the limitations of scanning methods, high resolution conflicts with a high refresh rate. Phase-method TOF offers limited range resolution and spatial resolution and has poor anti-interference capability. Structured light techniques, depending on the coding method, also suffer from defects such as low resolution, long measurement time, and poor reliability; they impose requirements on the properties of the measured target, have poor anti-interference capability, and are only suitable for indoor use. To meet application demands, multiple sensors often have to be combined, but simply stacking sensors does not improve overall performance well and increases system complexity and cost.
Summary of the invention
In view of the above deficiencies, the present invention provides a depth information acquisition device combining TOF technology and binocular vision, and a method therefor, which retain the high resolution of binocular stereo vision while greatly reducing the algorithmic requirements and improving measurement speed; targets without texture can also be recognized well, and the device is suitable for a wide variety of targets in both indoor and outdoor environments. Compared with a simple fusion of multiple sensors, the cost is lower, the integration is better, and the performance is more stable.
The technical solution adopted by the present invention to solve the above technical problems is as follows: a depth information acquisition device combining TOF technology and binocular vision includes an active light source emitter, a TOF sensor, a left image sensor, a right image sensor, a controller, and a processor. The active light source emitter includes a light source, a spatial modulator, and an imaging system arranged in sequence. The active light source emitter and the TOF sensor are connected to the controller; the TOF sensor, the left image sensor, and the right image sensor are connected to the processor.
The left image sensor and the right image sensor are placed symmetrically on the two sides of the TOF sensor. As one technical scheme, the left and right image sensors and the TOF sensor use light splitting to acquire images and depth information under the same conditions. The reflectance information obtained by the left and right image sensors can be fed back to the TOF sensor to improve its measurement accuracy and reduce noise.
The controller controls the light source to emit time-modulated light which, after being modulated by the spatial modulator, is formed by the imaging system into a pattern with a defined spatial distribution that illuminates the scene. Preferably, the light source is an LED or laser module, the time modulation of the active light source emitter is a sine wave or a fast square-wave sequence, the spatial modulator is frosted glass or a grating, and the imaging system is a microlens array.
The TOF sensor receives the echo signal reflected by the scene targets, computes the time of flight from the time modulation of the signal, and thereby obtains the scene depth information. The image sensors receive the scene image under ambient light and the active light source illumination; the spatial modulation of the active light source adds texture to the scene, which helps the left and right image sensors perform feature matching on texture-free targets. Preferably, the images acquired by the left image sensor and the right image sensor are grayscale images whose spectral response is mainly to the narrow band of the active light source, with some response in the visible band; alternatively, the left and right image sensors may acquire RGB color images, providing more target information for feature matching.
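By way of illustration only, and not as part of the claimed subject matter, the following minimal Python sketch shows how a phase-method TOF depth value could be computed from four phase-shifted correlation samples of a sinusoidally modulated echo, which is one common readout scheme consistent with the sine-wave time modulation described above; the four-bucket sampling, the function name, and the 20 MHz modulation frequency are illustrative assumptions, not values specified by the patent.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def tof_depth_from_phase(q0, q90, q180, q270, f_mod=20e6):
    """Estimate per-pixel depth from four phase-shifted correlation samples
    (0/90/180/270 degrees) of a sine-modulated echo; assumed readout scheme."""
    q0, q90, q180, q270 = [np.asarray(q, dtype=np.float64) for q in (q0, q90, q180, q270)]
    phase = np.arctan2(q90 - q270, q0 - q180)   # phase shift of the returned wave
    phase = np.mod(phase, 2.0 * np.pi)          # wrap into [0, 2*pi)
    return C * phase / (4.0 * np.pi * f_mod)    # depth in meters (round trip halved)

if __name__ == "__main__":
    # Synthetic check: a flat scene at 1.5 m reproduces the expected phase shift.
    h, w = 240, 360                             # low-resolution TOF array (assumed size)
    true_phase = 4.0 * np.pi * 20e6 * 1.5 / C
    q = [1.0 + np.cos(true_phase - k * np.pi / 2) * np.ones((h, w)) for k in range(4)]
    print(tof_depth_from_phase(*q).mean())      # prints approximately 1.5
```

Note that a depth computed this way is unambiguous only up to C / (2 * f_mod), which is one reason the low-resolution TOF result is used as a prior rather than as the final depth map.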
Another object of the present invention is to provide a depth information acquisition method combining TOF technology and binocular vision, which specifically includes the following steps:
(1) The controller controls the light source to emit a time-modulated light-wave signal which, after modulation by the spatial modulator, is formed by the imaging system into a pattern that irradiates the target measurement region; the TOF sensor receives the echo signal reflected from the target measurement region and computes the light-wave time of flight to obtain the depth information of each point; the left image sensor and the right image sensor capture the scene under ambient light and the active light source illumination, yielding left and right images from two viewpoints;
(2) The TOF sensor, the left image sensor, and the right image sensor are calibrated to obtain the correspondence of the three sensors in the world coordinate system;
(3) According to the calibration parameters, the depth information is mapped into the coordinates of the two images, and stereo matching is performed on the two images with the depth information as an additional constraint (a minimal sketch of this depth-constrained matching follows this step list), including the following methods:
(3.1) In the depth image, for regions whose depth is greater than D, one or two corresponding feature points are found in the left and right images and their depth is computed as the background depth; further, such regions can be divided into multiple levels: one depth is computed for the region whose distance lies between D1 and D2, another for the region between D2 and D3, and so on;
(3.2) In the depth image, for the regions outside step (3.1), the gradient is computed; for regions whose gradient is greater than G, high-precision stereo matching is performed in the left and right images to compute accurate depth information;
(3.3) In the depth image, for the regions outside steps (3.1) and (3.2), the second-order gradient is computed; for regions whose second-order gradient is less than G2, stereo matching is performed only on the boundaries in the left and right images to compute depth information, and the remaining parts are filled by linear interpolation;
(3.4) In the depth image, for the regions outside steps (3.1), (3.2), and (3.3), low-precision stereo matching is performed in the left and right images to compute depth information;
(3.5) High-pass filtering is applied to the left and right images to extract their high-frequency components, small targets are identified, and their depth information is computed.
(4) The blind areas in the two images caused by the difference in viewing angle are filled with the background depth;
(5) The regions without parallax outside the overlap of the two images are filled with the TOF depth information, yielding the depth image of the scene.
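As a concrete illustration of step (3), and not as part of the claimed subject matter, the sketch below uses a TOF depth prior to narrow the disparity search range of a simple block-matching step on a rectified image pair; the SAD cost, window size, search margin, and the focal length and baseline values are illustrative assumptions.

```python
import numpy as np

def disparity_from_depth(depth_m, focal_px, baseline_m):
    """Convert a depth prior (meters) into an expected disparity (pixels)."""
    return focal_px * baseline_m / max(depth_m, 1e-6)

def match_with_depth_prior(left, right, y, x, depth_prior_m,
                           focal_px=1000.0, baseline_m=0.06,
                           win=5, margin=8):
    """Block matching for one left-image pixel on a rectified pair, searching
    only near the disparity predicted by the TOF depth prior (SAD cost,
    integer disparity). Assumes (y, x) lies at least win//2 pixels from the
    image border."""
    d0 = int(round(disparity_from_depth(depth_prior_m, focal_px, baseline_m)))
    half = win // 2
    patch = left[y - half:y + half + 1, x - half:x + half + 1].astype(np.float32)
    best_d, best_cost = d0, np.inf
    for d in range(max(0, d0 - margin), d0 + margin + 1):
        xr = x - d                               # same row in the right image
        if xr - half < 0 or xr + half + 1 > right.shape[1]:
            continue
        cand = right[y - half:y + half + 1, xr - half:xr + half + 1].astype(np.float32)
        cost = np.abs(patch - cand).sum()        # sum of absolute differences
        if cost < best_cost:
            best_cost, best_d = cost, d
    depth_m = focal_px * baseline_m / max(best_d, 1)   # triangulation
    return best_d, depth_m
```

Restricting the search to a small margin around the predicted disparity is what reduces the matching cost relative to a full disparity sweep, which is the effect the constraint conditions above are intended to achieve.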
The beneficial effects of the present invention are as follows: fast, high-resolution depth information acquisition is realized in both indoor and outdoor environments at low cost, and the device is suitable for both short and relatively long ranges. It combines the advantages of the rapid measurement of TOF technology with the high resolution and anti-interference capability of binocular vision, reducing the risk of failed or erroneous recognition and offering better stability and reliability.
A low-resolution depth image and two high-resolution images are acquired simultaneously; the depth image is used as prior information for feature recognition and matching on the images, and high-resolution depth information is then obtained according to the principle of optical triangulation, significantly reducing algorithm complexity and computation time. The method or device of the present invention includes an active light source emitter capable of both spatial and temporal modulation: the spatial modulation adds texture to the target and assists the two images in feature recognition, while the temporal modulation is used by the TOF sensor to obtain the depth image. The depth image sensor responds only to the active light source. The left and right image sensors respond mainly to the active light source but also partially to visible light, so that recognizable images can be obtained at short or longer ranges and under both strong and weak ambient light.
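For reference only, and not as part of the claimed subject matter, the optical triangulation principle mentioned above reduces, for a rectified stereo pair, to the relation below; the numerical values in the worked example are illustrative assumptions.

```latex
% Rectified stereo triangulation: depth from disparity
% f: focal length in pixels, B: baseline in meters, d = x_left - x_right in pixels
Z = \frac{f\,B}{d}
\qquad\text{e.g. } f = 1000\ \text{px},\; B = 0.06\ \text{m},\; d = 40\ \text{px}
\;\Rightarrow\; Z = \frac{1000 \times 0.06}{40} = 1.5\ \text{m}
```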
Brief description of the drawings
Fig. 1 is a structural diagram of the device of the present invention;
Fig. 2 is a structural schematic diagram of the active light source emitter in the device of the present invention;
Fig. 3 is a flow chart of the method of the present invention;
Fig. 4 is a schematic diagram of the signals acquired by the sensors in the present invention;
In the figures: active light source emitter 1, TOF sensor 2, left image sensor 3, right image sensor 4, controller 5, processor 6, light source 7, spatial modulator 8, imaging system 9.
Specific embodiment
The present invention is further described below with reference to the accompanying drawings and an embodiment.
Embodiment:
As shown in Fig. 1, a depth information acquisition device combining TOF technology and binocular vision includes an active light source emitter 1, a TOF sensor 2, a left image sensor 3, a right image sensor 4, a controller 5, and a processor 6. As shown in Fig. 2, the active light source emitter 1 includes a light source 7, a spatial modulator 8, and an imaging system 9 arranged in sequence. The active light source emitter 1 and its light source 7 are connected to the controller 5; the TOF sensor 2, the left image sensor 3, and the right image sensor 4 are connected to the processor 6.
In this embodiment, the left image sensor 3 and the right image sensor 4 are arranged symmetrically on the two sides of the TOF sensor 2, and the left and right image sensors and the TOF sensor 2 use light splitting to acquire images and depth information under the same conditions. The reflectance information obtained by the left and right image sensors can be fed back to the TOF sensor 2 to improve its measurement accuracy and reduce noise.
The TOF sensor 2 receives the echo signal reflected by the scene targets, computes the time of flight from the time modulation of the signal, and thereby obtains the scene depth information. The left and right image sensors receive the scene image under ambient light and the active light source illumination; the spatial modulation of the active light source adds texture to the scene, which helps the left and right image sensors perform feature matching on texture-free targets. In this embodiment, the left image sensor 3 and the right image sensor 4 acquire RGB color images whose spectral response is mainly to the narrow band of the active light source, with some response in the visible band, so that more target information is obtained for feature matching.
The controller 5 controls the light source 7 to emit time-modulated light which, after modulation by the spatial modulator 8, is formed by the imaging system 9 into a pattern with a defined spatial distribution that illuminates the scene. In this embodiment, the light source 7 is an 850 nm laser module, the time modulation of the active light source emitter 1 is a sine wave, the spatial modulator 8 is frosted glass, and the imaging system 9 is a microlens array.
As shown in Fig. 3, a depth information acquisition method combining TOF technology and binocular vision specifically includes the following steps:
(1) The controller 5 controls the light source 7 to emit a time-modulated light-wave signal which, after modulation by the spatial modulator 8, is formed by the imaging system 9 into a pattern that irradiates the target measurement region, realizing illumination that is sinusoidally modulated in time and carries a defined spot distribution in space, as shown in Fig. 4. The TOF sensor 2 receives the echo signal reflected from the target measurement region and computes the light-wave time of flight to obtain the depth information of each point, at a resolution of 360 × 240. The left image sensor 3 and the right image sensor 4 capture the scene under ambient light and the illumination of the active light source emitter 1, yielding left and right images from two viewpoints at a resolution of 1280 × 1024;
(2) The TOF sensor 2, the left image sensor 3, and the right image sensor 4 are calibrated to obtain the correspondence of the three sensors in the world coordinate system.
The TOF sensor 2 responds to the narrow spectral band of the active light source so as to obtain a good signal-to-noise ratio and improve anti-interference capability; because its spatial resolution is low, it receives the averaged signal over a region, so the spatial distribution of the exposure pattern has no influence on it. The grayscale image sensors respond mainly to the narrow spectral band of the active light source and also partially to the visible spectrum.
(3) According to the calibration parameters, the depth information is mapped into the coordinates of the two images (a sketch of this mapping follows this step list), and stereo matching is performed on the two images with the depth information as an additional constraint;
(4) The blind areas in the two images caused by the difference in viewing angle are filled with the background depth;
(5) The regions without parallax outside the overlap of the two images are filled with the TOF depth information, yielding the depth image of the scene.
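To illustrate the mapping in step (3), and not as part of the claimed subject matter, the sketch below back-projects the low-resolution TOF depth map to 3D points using the TOF intrinsics and reprojects them into one of the calibrated cameras to form a sparse depth prior at the image resolution; the parameter names and the assumption of an undistorted, rectified camera are illustrative, not taken from the patent.

```python
import numpy as np

def project_tof_depth_to_camera(tof_depth, K_tof, K_cam, R, t, cam_shape):
    """Back-project a low-resolution TOF depth map to 3D points in the TOF
    frame, transform them with calibrated extrinsics (R, t), and reproject
    them into one rectified camera as a sparse depth prior.

    tof_depth: (H, W) array of depths in meters (0 = invalid)
    K_tof, K_cam: 3x3 intrinsic matrices; R, t: TOF-to-camera rotation/translation
    cam_shape: (rows, cols) of the target camera image
    """
    H, W = tof_depth.shape
    us, vs = np.meshgrid(np.arange(W), np.arange(H))      # pixel grids
    valid = tof_depth > 0
    z = tof_depth[valid]
    # Back-project TOF pixels to 3D in the TOF coordinate frame.
    x = (us[valid] - K_tof[0, 2]) / K_tof[0, 0] * z
    y = (vs[valid] - K_tof[1, 2]) / K_tof[1, 1] * z
    pts_tof = np.stack([x, y, z], axis=0)                  # 3 x N
    pts_cam = R @ pts_tof + t.reshape(3, 1)                # into the camera frame
    # Reproject into the camera image; keep only points in front of the camera.
    zc = pts_cam[2]
    in_front = zc > 0
    u = (K_cam[0, 0] * pts_cam[0] / zc + K_cam[0, 2])[in_front]
    v = (K_cam[1, 1] * pts_cam[1] / zc + K_cam[1, 2])[in_front]
    prior = np.zeros(cam_shape, dtype=np.float32)
    ui, vi = np.round(u).astype(int), np.round(v).astype(int)
    inside = (ui >= 0) & (ui < cam_shape[1]) & (vi >= 0) & (vi < cam_shape[0])
    prior[vi[inside], ui[inside]] = zc[in_front][inside]   # sparse depth prior
    return prior
```

In this setting a 360 × 240 TOF map projected into a 1280 × 1024 image necessarily yields a sparse prior, which is why the subsequent matching steps still refine the depth rather than simply upsampling it.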
In the stereo matching method, the added constraint conditions are specifically as follows:
(3.1) In the depth image, for regions whose depth is greater than D, one or two corresponding feature points are found in the left and right images and their depth is computed as the background depth; further, such regions can be divided into multiple levels: one depth is computed for the region whose distance lies between D1 and D2, another for the region between D2 and D3, and so on;
(3.2) In the depth image, for the regions outside step (3.1), the gradient is computed; for regions whose gradient is greater than G, high-precision stereo matching is performed in the left and right images to compute accurate depth information;
(3.3) In the depth image, for the regions outside steps (3.1) and (3.2), the second-order gradient is computed; for regions whose second-order gradient is less than G2, stereo matching is performed only on the boundaries in the left and right images to compute depth information, and the remaining parts are filled by linear interpolation;
(3.4) In the depth image, for the regions outside steps (3.1), (3.2), and (3.3), low-precision stereo matching is performed in the left and right images to compute depth information;
(3.5) High-pass filtering is applied to the left and right images to extract their high-frequency components, small targets are identified, and their depth information is computed.
The constraint conditions are not limited to the methods above, and not all of the methods mentioned here need to be used (one possible region classification is sketched below).
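As an illustration only, and not as part of the claimed subject matter, the sketch below shows one way the thresholds in (3.1), (3.2), and (3.3) could partition the depth image into processing classes that select the matching strategy; the concrete threshold values D, G, and G2 and the use of numpy finite-difference gradients are illustrative assumptions.

```python
import numpy as np

def classify_regions(depth, D=5.0, G=0.15, G2=0.05):
    """Partition a TOF depth image into processing classes following the
    constraint scheme: 0 = background (depth > D), 1 = high-precision matching
    (strong first-order gradient), 2 = boundary matching plus interpolation
    (small second-order gradient), 3 = low-precision matching (everything else)."""
    gy, gx = np.gradient(depth)
    grad = np.hypot(gx, gy)                    # first-order gradient magnitude
    gyy, _ = np.gradient(gy)
    _, gxx = np.gradient(gx)
    grad2 = np.hypot(gxx, gyy)                 # second-order gradient magnitude
    labels = np.full(depth.shape, 3, dtype=np.uint8)
    labels[grad2 < G2] = 2                     # flat interiors: match boundaries only
    labels[grad > G] = 1                       # strong edges: high-precision matching
    labels[depth > D] = 0                      # distant background takes precedence
    return labels
```

The assignments are applied in reverse order of precedence, so that the background rule of (3.1) overrides the gradient rules, matching the order in which the steps are described.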

Claims (7)

1. A depth information acquisition method combining TOF technology and binocular vision, characterized in that the method is implemented in a depth information acquisition device combining TOF technology and binocular vision, the depth information acquisition device combining TOF technology and binocular vision comprising an active light source emitter, a TOF sensor, a left image sensor, a right image sensor, a controller, and a processor; the active light source emitter comprises a light source, a spatial modulator, and an imaging system arranged in sequence, and the time modulation of the active light source emitter is a sine wave or a fast square-wave sequence; the active light source emitter and the TOF sensor are connected to the controller, and the TOF sensor, the left image sensor, and the right image sensor are connected to the processor;
the images acquired by the left image sensor and the right image sensor are grayscale images whose spectral response is mainly to the narrow band of the active light source, with some response in the visible band;
the method comprises:
(1) the controller controls the light source to emit a time-modulated light signal which, after modulation by the spatial modulator, is formed by the imaging system into a pattern that irradiates the target measurement region; the TOF sensor receives the echo signal reflected from the target measurement region and computes the light-wave time of flight to obtain the depth information of each point; the left image sensor and the right image sensor capture the scene under ambient light and the illumination of the active light source emitter, yielding left and right images from two viewpoints;
(2) the TOF sensor, the left image sensor, and the right image sensor are calibrated to obtain the correspondence of the three sensors in the world coordinate system;
(3) according to the calibration parameters, the depth information is mapped into the coordinates of the two images, and stereo matching is performed on the two images with the depth information as an additional constraint, comprising the following steps:
(3.1) in the depth image, for regions whose depth is greater than D, one or two corresponding feature points are found in the left and right images and their depth is computed as the background depth; wherein the regions whose depth is greater than D are divided into multiple levels: one depth is computed for the region whose distance lies between D1 and D2, another for the region between D2 and D3, and so on;
(3.2) in the depth image, for the regions outside step (3.1), the gradient is computed; for regions whose gradient is greater than G, high-precision stereo matching is performed in the left and right images to compute accurate depth information;
(3.3) in the depth image, for the regions outside steps (3.1) and (3.2), the second-order gradient is computed; for regions whose second-order gradient is less than G2, stereo matching is performed only on the boundaries in the left and right images to compute depth information, and the remaining parts are filled by linear interpolation;
(3.4) in the depth image, for the regions outside steps (3.1), (3.2), and (3.3), low-precision stereo matching is performed in the left and right images to compute depth information;
(3.5) high-pass filtering is applied to the left and right images to extract their high-frequency components, small targets are identified, and their depth information is computed;
(4) the blind areas in the two images caused by the difference in viewing angle are filled with the background depth;
(5) the regions without parallax outside the overlap of the two images are filled with the TOF depth information, yielding the depth image of the scene.
2. The depth information acquisition method combining TOF technology and binocular vision according to claim 1, characterized in that the images acquired by the left image sensor and the right image sensor are color images.
3. The depth information acquisition method combining TOF technology and binocular vision according to claim 1, characterized in that the light source is an LED or laser module.
4. The depth information acquisition method combining TOF technology and binocular vision according to claim 1, characterized in that the spatial modulator is frosted glass or a grating.
5. The depth information acquisition method combining TOF technology and binocular vision according to claim 1, characterized in that the imaging system is a microlens array.
6. The depth information acquisition method combining TOF technology and binocular vision according to claim 1, characterized in that the left image sensor and the right image sensor are arranged symmetrically on the two sides of the TOF sensor.
7. The depth information acquisition method combining TOF technology and binocular vision according to claim 1, characterized in that the left image sensor and the TOF sensor use light splitting to acquire images and depth information under the same conditions.
CN201710050801.2A 2017-01-23 2017-01-23 Depth information acquisition device combining TOF technology and binocular vision, and method therefor Active CN106772431B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710050801.2A CN106772431B (en) 2017-01-23 2017-01-23 Depth information acquisition device combining TOF technology and binocular vision, and method therefor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710050801.2A CN106772431B (en) 2017-01-23 2017-01-23 Depth information acquisition device combining TOF technology and binocular vision, and method therefor

Publications (2)

Publication Number Publication Date
CN106772431A CN106772431A (en) 2017-05-31
CN106772431B true CN106772431B (en) 2019-09-20

Family

ID=58942287

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710050801.2A Active CN106772431B (en) 2017-01-23 2017-01-23 Depth information acquisition device combining TOF technology and binocular vision, and method therefor

Country Status (1)

Country Link
CN (1) CN106772431B (en)

Families Citing this family (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107635129B (en) * 2017-09-29 2020-06-16 上海安威士科技股份有限公司 Three-dimensional trinocular camera device and depth fusion method
CN110136203B (en) * 2018-02-08 2022-02-18 浙江舜宇智能光学技术有限公司 Calibration method and calibration system of TOF equipment
CN108495113B (en) * 2018-03-27 2020-10-27 百度在线网络技术(北京)有限公司 Control method and device for binocular vision system
CN108921888B (en) * 2018-07-03 2022-07-19 京东方科技集团股份有限公司 Depth image detection method, system and computer readable storage medium
US11609313B2 (en) * 2018-07-31 2023-03-21 Waymo Llc Hybrid time-of-flight and imager module
CN108989783A (en) 2018-08-22 2018-12-11 Oppo广东移动通信有限公司 The control method of electronic device and electronic device
CN109299662B (en) * 2018-08-24 2022-04-12 上海图漾信息科技有限公司 Depth data calculation device and method, and face recognition device
CN109389632A (en) * 2018-09-05 2019-02-26 深圳奥比中光科技有限公司 Depth calculation system and method
CN109615652B (en) * 2018-10-23 2020-10-27 西安交通大学 Depth information acquisition method and device
CN109343565A (en) * 2018-10-29 2019-02-15 中国航空无线电电子研究所 UAV intelligent ground control method based on gesture perception and recognition
CN109435845A (en) * 2018-12-26 2019-03-08 中国地质大学(武汉) Truck blind area detection and warning device and detection and warning method based on TOF technology
CN109579852A (en) * 2019-01-22 2019-04-05 杭州蓝芯科技有限公司 Robot autonomous localization method and device based on depth camera
CN109788622B (en) * 2019-03-18 2024-02-06 上海炬佑智能科技有限公司 Light source control device, light source control method and time-of-flight sensor
EP3930321A4 (en) * 2019-03-25 2022-04-20 Huawei Technologies Co., Ltd. Large aperture blurring method based on dual camera + tof
CN111741283A (en) * 2019-03-25 2020-10-02 华为技术有限公司 Image processing apparatus and method
CN110035269A (en) * 2019-04-12 2019-07-19 信利光电股份有限公司 Dual-mode depth camera
CN109917419B (en) * 2019-04-12 2021-04-13 中山大学 Depth filling dense system and method based on laser radar and image
CN110456379A (en) * 2019-07-12 2019-11-15 深圳奥比中光科技有限公司 Fusion-based depth measurement device and distance measurement method
CN110456380B (en) * 2019-07-31 2021-12-28 炬佑智能科技(苏州)有限公司 Time-of-flight sensing camera and depth detection method thereof
CN111175786B (en) * 2019-10-14 2022-05-03 岭纬科技(厦门)有限公司 Multi-path crosstalk-eliminating wide-view-field high-resolution solid-state laser radar
CN112802114A (en) * 2019-11-13 2021-05-14 浙江舜宇智能光学技术有限公司 Multi-vision sensor fusion device and method and electronic equipment
CN111045030B (en) * 2019-12-18 2022-09-13 奥比中光科技集团股份有限公司 Depth measuring device and method
CN111708039B (en) * 2020-05-24 2023-09-05 奥比中光科技集团股份有限公司 Depth measurement device and method and electronic equipment
CN112230244B (en) * 2020-09-08 2022-09-16 奥比中光科技集团股份有限公司 Fused depth measurement method and measurement device
CN112379389B (en) * 2020-11-11 2024-04-26 杭州蓝芯科技有限公司 Depth information acquisition device and method combining structured light camera and TOF depth camera
CN112415533B (en) * 2021-01-21 2021-04-16 杭州蓝芯科技有限公司 Depth sensing method and device based on chirped pulse and sensor
CN112802127B (en) * 2021-03-31 2021-07-20 深圳中科飞测科技股份有限公司 Calibration method and device, calibration equipment and storage medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101390404A (en) * 2006-02-22 2009-03-18 皇家飞利浦电子股份有限公司 Method of colour image projection using spatial light modulation and light source modulation
CN102760234A (en) * 2011-04-14 2012-10-31 财团法人工业技术研究院 Depth image acquisition device, system and method
CN104583804A (en) * 2012-08-14 2015-04-29 微软公司 Illumination light projection for depth camera
CN104634276A (en) * 2015-02-12 2015-05-20 北京唯创视界科技有限公司 Three-dimensional measuring system, photographing device, photographing method, depth calculation method and depth calculation device
CN105115445A (en) * 2015-09-14 2015-12-02 杭州光珀智能科技有限公司 Three-dimensional imaging system and imaging method based on combination of depth camera and binocular vision

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
High-resolution depth acquisition based on the fusion of TOF and stereo matching (基于TOF与立体匹配相融合的高分辨率深度获取); 刘娇丽 et al.; Information Technology (《信息技术》); 2016-12-15; pp. 191-192 *

Also Published As

Publication number Publication date
CN106772431A (en) 2017-05-31

Similar Documents

Publication Publication Date Title
CN106772431B (en) Depth information acquisition device combining TOF technology and binocular vision, and method therefor
JP5337243B2 (en) Adaptive 3D scanning system for surface features
US9392262B2 (en) System and method for 3D reconstruction using multiple multi-channel cameras
JP6480441B2 (en) Time-of-flight camera system
CN104634276B (en) Three-dimensional measurement system, photographing device and method, and depth calculation method and device
US8836756B2 (en) Apparatus and method for acquiring 3D depth information
CN108474658B (en) Ground form detection method and system, unmanned aerial vehicle landing method and unmanned aerial vehicle
CN105115445A (en) Three-dimensional imaging system and imaging method based on combination of depth camera and binocular vision
KR20120058828A (en) System for extracting 3-dimensional coordinate and method thereof
CN105004324B (en) Monocular vision sensor with triangulation ranging function
CN102438111A (en) Three-dimensional measurement chip and system based on double-array image sensor
CN106651925B (en) Color depth image acquisition method and acquisition device
CN110619617B (en) Three-dimensional imaging method, device, equipment and computer readable storage medium
CN109410234A (en) Control method and control system based on binocular vision obstacle avoidance
KR20160090464A (en) Method for generating depth map in TOF camera
CN109085603A (en) Optical three-dimensional imaging system and color three-dimensional image imaging method
CN207751449U (en) Monocular depth camera based on field-of-view matching
CN112255639A (en) Depth perception sensor and depth perception sensing module for region of interest
CN112379389B (en) Depth information acquisition device and method combining structured light camera and TOF depth camera
CN108693538A (en) Accurate-confidence depth camera ranging device and method based on binocular structured light
Zheng et al. Underwater 3D target positioning by inhomogeneous illumination based on binocular stereo vision
CN215219710U (en) 3D recognition device and 3D recognition system
CN103697825A (en) System and method for measurement using super-resolution three-dimensional (3D) laser
CN209181735U (en) Amphibious 3D vision detection device based on laser
CN113888702A (en) Indoor high-precision real-time modeling and space positioning device and method based on multi-TOF laser radar and RGB camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20170531

Assignee: Hangzhou Jintou Finance Leasing Co.,Ltd.

Assignor: HANGZHOU LANXIN TECHNOLOGY CO.,LTD.

Contract record no.: X2023980031743

Denomination of invention: A depth information acquisition device and method combining TOF technology and binocular vision

Granted publication date: 20190920

License type: Exclusive License

Record date: 20230202

PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: A depth information acquisition device and method combining TOF technology and binocular vision

Effective date of registration: 20230207

Granted publication date: 20190920

Pledgee: Hangzhou Jintou Finance Leasing Co.,Ltd.

Pledgor: HANGZHOU LANXIN TECHNOLOGY CO.,LTD.

Registration number: Y2023110000056

EC01 Cancellation of recordation of patent licensing contract

Assignee: Hangzhou Jintou Finance Leasing Co.,Ltd.

Assignor: HANGZHOU LANXIN TECHNOLOGY CO.,LTD.

Contract record no.: X2023980031743

Date of cancellation: 20240402

PC01 Cancellation of the registration of the contract for pledge of patent right

Granted publication date: 20190920

Pledgee: Hangzhou Jintou Finance Leasing Co.,Ltd.

Pledgor: HANGZHOU LANXIN TECHNOLOGY CO.,LTD.

Registration number: Y2023110000056