CN111045030B - Depth measuring device and method - Google Patents

Depth measuring device and method

Info

Publication number
CN111045030B
CN111045030B (application CN201911312430.6A)
Authority
CN
China
Prior art keywords
depth
depth map
pixels
target object
processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911312430.6A
Other languages
Chinese (zh)
Other versions
CN111045030A (en)
Inventor
曾海
王兆民
Current Assignee
Orbbec Inc
Original Assignee
Orbbec Inc
Priority date
Filing date
Publication date
Application filed by Orbbec Inc filed Critical Orbbec Inc
Priority to CN201911312430.6A priority Critical patent/CN111045030B/en
Publication of CN111045030A publication Critical patent/CN111045030A/en
Application granted granted Critical
Publication of CN111045030B publication Critical patent/CN111045030B/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

The invention provides a depth measuring device and method. The depth measuring device comprises an emission module, an acquisition module, and a control and processor. The emission module projects a spot pattern composed of a plurality of output light beams onto a target object. The acquisition module comprises an image sensor with a pixel array, and collects optical signals of the light reflected by the target object through part of the pixels in the pixel array, converting them into electrical signals. The control and processor, connected to both the emission module and the acquisition module, receives the electrical signals and calculates a phase difference to obtain a first depth map, then processes the other part of the pixels, which did not collect reflected light, in the first depth map to obtain depth values and outputs a second depth map. The technical scheme of the invention can effectively improve the measurement accuracy of the depth measuring device.

Description

Depth measuring device and method
Technical Field
The invention belongs to the technical field of depth measurement, and particularly relates to a depth measurement device and method.
Background
Currently, a depth measuring device based on the ToF (Time of Flight) principle includes a light source configured to emit an output light beam, or a pattern composed of output light beams, toward a target object, and a photoreceptor that receives the light reflected by the target object, so that the object can be identified and mapped from the reflected light.
However, in actual measurement the accuracy and range of a ToF device are limited by the intensity of the light source. In existing ToF depth measuring devices, the light source mostly adopts a single-form flood illumination mode that distributes the emitted energy uniformly, so the required power consumption is large, the measurement range is short, and the measurement accuracy is low. In addition, because the ToF calculation only yields depth values for the pixels that the reflected light enters, rather than for all pixels, the resolution of the resulting depth image is low, which further limits the measurement accuracy.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a depth measuring device, which aims to solve the problem of low measuring precision of the conventional depth measuring device.
The invention provides a depth measuring device, comprising:
the emission module is used for projecting a spot pattern consisting of a plurality of output light beams to a target object;
the acquisition module comprises an image sensor, the image sensor comprises a pixel array, and the acquisition module is used for collecting optical signals of the reflected light reflected by the target object through part of the pixels in the pixel array and converting the optical signals into electrical signals; and
the control and processor, connected to the emission module and the acquisition module respectively, is used for receiving the electrical signals and calculating a phase difference to obtain a first depth map, and for performing interpolation processing on the other part of the pixels, which did not collect the reflected light, in the first depth map to obtain depth values and output a second depth map.
Optionally, each of the pixels comprises a plurality of taps for respectively collecting the reflected light and forming the electrical signal within a single frame period.
Optionally, the acquisition module further includes an analog-to-digital converter for converting the electrical signal into a digital signal, and the control and processor calculates a phase difference based on the digital signal to obtain the first depth map.
Optionally, the depth measuring device further comprises an RGB camera connected to the control and processor for acquiring a color image of the target object; the control and processor directs the interpolation process with the color image to improve interpolation accuracy.
Optionally, the emission module includes a light source, and the light source is a vertical-cavity surface-emitting laser (VCSEL).
The invention also provides a depth measuring method, which is realized by using the depth measuring device, and comprises the following steps:
s10, projecting a speckle pattern to the target object by the emission module;
s20, collecting optical signals of reflected light reflected by the target object by partial pixels in a pixel array of the collection module, and converting the optical signals into electric signals;
s30, the control and processor calculates a phase difference based on the electric signals to acquire a first depth map of the target object; the control and processor also processes another part of the pixels of the first depth map, which are not collected with the reflected light, so as to obtain depth values and output a second depth map.
Optionally, each of the pixels includes a plurality of taps, and the step S20 includes:
and sequentially switching the plurality of taps in a preset sequence in a single frame period, and converting the optical signal of the reflected light collected by each tap into the electric signal.
Optionally, the depth measurement method further includes: a color image of the target object is captured using an RGB camera.
Optionally, the depth measurement method further includes: and the control and processor acquires a gray-scale image of the target object according to the intensity information of the reflected light.
Optionally, the step S30 includes:
and taking the color image or the grayscale map as a guide image, and performing interpolation restoration on the first depth map to obtain the second depth map.
Based on the above structure and method, the technical scheme of the invention works on two fronts. On one hand, because the emission module of the depth measuring device is configured to provide the output light beams as a spot pattern with a plurality of spots, the energy of the output light beams is more concentrated than in the single-form flood illumination mode commonly used today, so less power is needed to emit the light, the measurement range is increased, and the measurement accuracy is further improved. On the other hand, by processing the other part of the pixels in the first depth map, which did not collect reflected light, the control and processor obtains additional depth values and outputs a second depth map; the second depth map includes more depth values than the first depth map, and its precision is accordingly higher. Acting together, these two aspects enable the depth measuring device to effectively improve its measurement accuracy.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a schematic view of an optical path structure of a depth measuring device provided in an embodiment of the present invention;
FIG. 2 is a schematic diagram of a depth measuring device provided by an embodiment of the present invention;
FIG. 3 is an enlarged schematic view at A in FIG. 2;
fig. 4 is a flowchart of a depth measurement method according to an embodiment of the present invention.
The reference numbers illustrate:
10: depth measuring device; 11: emission module; 12: acquisition module; 121: image sensor; 13: control and processor; 20: target object; 30: output light beam; 40: reflected light; 201: speckle pattern beam; 202: light spot.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
It will be understood that when an element is referred to as being "secured to" or "disposed on" another element, it can be directly on the other element or be indirectly on the other element. When an element is referred to as being "connected to" another element, it can be directly connected to the other element or be indirectly connected to the other element.
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative and intended to explain the present invention and should not be construed as limiting the present invention.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or as implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
In the present invention, unless otherwise expressly stated or limited, the terms "mounted," "connected," "secured," and the like are to be construed broadly and can, for example, be fixedly connected, detachably connected, or integrally formed; can be mechanically or electrically connected; either directly or indirectly through intervening media, either internally or in any other relationship. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to specific situations.
The embodiment of the invention provides a depth measuring device.
Referring to fig. 1 to 3, in an embodiment, the depth measuring device 10 includes a transmitting module 11, an acquiring module 12, and a control and processor 13; the emission module 11 is used for projecting a spot pattern composed of a plurality of output light beams 30 to the target object 20; the collecting module 12 includes an image sensor 121, the image sensor 121 includes a pixel array, and the collecting module 12 is configured to collect optical signals of the reflected light 40 reflected by the target object 20 through a part of pixels in the pixel array and convert the optical signals into electrical signals; the control and processor 13 is connected to the emission module 11 and the collection module 12, respectively, and the control and processor 13 is configured to receive the electrical signal and calculate a phase difference to obtain a first depth map, and process another part of pixels in the first depth map, where the reflected light 40 is not collected, to obtain a depth value and output a second depth map.
Based on this structural design, the present embodiment works on two fronts. On one hand, since the emission module 11 of the depth measuring device 10 is configured to provide the output light beams 30 as a speckle pattern with a plurality of speckles, the energy of the output light beams 30 is more concentrated than in the single-form flood illumination mode commonly used today, so the power consumption required for emitting the light is smaller, the measurement range is increased, and the measurement accuracy is further improved. Note that, unlike flood illumination, only the pixels irradiated by the speckles carry depth data in the resulting first depth map, while the unilluminated pixels carry none. In this regard, on the other hand, by having the control and processor 13 perform interpolation processing on the other part of the pixels of the first depth map, in which the reflected light 40 was not collected, depth values can be obtained and a second depth map output that includes more depth values than the first depth map. Together, these two aspects give the depth measuring device 10 both higher resolution and higher measurement accuracy. The measurement method, including the interpolation step, is illustrated in fig. 4.
It should be noted that the connection between the control and processor 13 and the transmitting module 11 and the collecting module 12 may be a wired electrical connection through a wire, or a wireless communication connection, and if the connection is a wireless communication connection, the depth measuring device 10 should further include a transceiver module for wireless signals.
Referring to fig. 1, in the present embodiment the emission module 11 includes a light source and an optical element. The light source may be, but is not limited to, a light-emitting diode (LED), an edge-emitting laser (EEL), a vertical-cavity surface-emitting laser (VCSEL), or a light source array composed of a plurality of such sources. Since a VCSEL has a small size, a small emission angle, good stability, and similar advantages, it is preferably used as the light source of the present depth measuring device 10. The optical element may be, but is not limited to, one of, or a combination of, a lens, a diffractive optical element, a microlens array, and the like. The optical element modulates the light beam by diffraction, transmission, and so on, and the modulated output light beams 30 are projected onto the target object 20 to form a speckle pattern with a plurality of speckles. In one embodiment, as shown in figs. 2 and 3, the amplitude of each spot 202 in the spot pattern beam may be square-wave or pulse modulated in time; in fig. 2, the output light beam 30 is the modulated beam, and the reflected light 40 is correspondingly modulated. Of course, in other embodiments the light beam may also be modulated by, for example but not limited to, a sine wave.
Referring to figs. 1 and 2, in the present embodiment the image sensor 121 in the acquisition module 12 may be, but is not limited to, an image sensor composed of charge-coupled devices (CCD), complementary metal-oxide-semiconductor (CMOS) elements, avalanche photodiodes (APD), single-photon avalanche diodes (SPAD), and the like. The image sensor 121 includes at least one pixel, and each pixel includes a plurality of taps for respectively collecting the reflected light 40 and forming electrical signals within a single frame period. The taps are mainly used to store and read out, or discharge, the electrical signal generated by incident photons under the control of the corresponding electrodes; in particular, when each pixel includes a plurality of taps, the corresponding optical signals can be collected and converted into electrical signals by sequentially switching the different taps in a preset order within a single frame period or a single exposure time.
Further, in this embodiment, the acquisition module 12 further includes an analog-to-digital converter (ADC) for converting the electrical signal into a digital signal, and the control and processor 13 calculates a phase difference based on the digital signal to obtain the first depth map.
Further, in the present embodiment, the depth measuring device 10 further includes an RGB camera (not shown) connected to the control and processor 13 for capturing a color image of the target object 20; that is, the control and processor 13 can control the RGB camera to obtain an RGB image of the target object 20. The RGB camera includes an RGB image sensor that is aligned and calibrated with high precision against the image sensor 121 in the acquisition module 12, and the aligned and calibrated RGB image can be used to guide the interpolation process, further improving the precision of the interpolated data in the second depth map and thus the measurement precision of the depth measuring device 10.
The present invention further provides a depth measuring method, which is implemented by using the depth measuring device 10 as described above, and the principle and technical effect of the depth measuring method are consistent with those of the depth measuring device 10, and are not described herein again.
In one embodiment, as shown in fig. 4, the depth measurement method includes the steps of:
s10, projecting the speckle pattern to the target object 20 by the emission module 11;
s20, collecting optical signals of the reflected light 40 reflected by the target object 20 by a part of pixels in the pixel array of the collection module 12, and converting the optical signals into electrical signals;
s30, the control and processor 13 calculates the phase difference based on the electric signals to acquire a first depth map of the target object 20; the control and processor 13 also processes another portion of the pixels of the first depth map that do not capture the reflected light 40 to obtain depth values and output a second depth map.
Further, in the present embodiment, each pixel includes a plurality of taps, and the step S20 includes: the plurality of taps are sequentially switched in a preset order within a single frame period, and the optical signal of the reflected light 40 collected by each tap is converted into an electrical signal.
Specifically, the control and processor 13 supplies demodulation signals to the respective taps in each pixel of the image sensor 121; each tap then collects the optical signal of the reflected light 40 reflected back by the target object 20 under the control of its demodulation signal and converts it into a corresponding electrical signal. The control and processor 13 then calculates a phase difference based on the electrical signals and computes a first depth map of the target object 20 from that phase difference; finally, it processes the pixels in the first depth map where the reflected light 40 was not collected to obtain depth values, and outputs a second depth map.
referring to fig. 2 together, in the present embodiment, each pixel of the image sensor 121 in the capture module 12 includes four taps, and the four taps are respectively used for capturing four optical signals and converting the optical signals into electrical signals C1, C2, C3 and C4 in a single frame period, and the time and the interval of the four captures are the same. Of course, in other embodiments, each pixel may include two taps or three taps, and the number of taps is not limited herein.
In general, the control and processor 13 obtains raw phase data from at least part of the reflected light 40 that the acquisition module 12 collects from the target object 20, and outputs a grayscale map and a first depth map from the raw phase data. Specifically, the control and processor 13 receives the electrical signals C1, C2, C3 and C4, calculates the intensity information of the speckle pattern beam 201, and generates a grayscale map based on the intensity information. In one embodiment, the intensity information B may be calculated according to the following equation (1):
B = (C1 + C2 + C3 + C4) / 4    (1)
however, if the light intensity calculation method is the same as the conventional method, the ambient light signal is difficult to be eliminated, and the signal-to-noise ratio of the final gray pattern is low. Thus, in another embodiment, the intensity information B should be calculated according to the following equation (2):
B = sqrt((C1 - C3)^2 + (C2 - C4)^2) / 2    (2)
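A minimal sketch of the two intensity estimates, equations (1) and (2), showing why the second rejects ambient light (function names are illustrative, not from the patent):

```python
import numpy as np

def intensity_mean(c1, c2, c3, c4):
    """Equation (1): the plain mean of the four taps. A constant
    ambient-light offset adds directly to this estimate."""
    return (c1 + c2 + c3 + c4) / 4.0

def intensity_amplitude(c1, c2, c3, c4):
    """Equation (2): the modulation amplitude recovered from the
    differential tap pairs; a constant ambient offset cancels in
    (c1 - c3) and (c2 - c4)."""
    return np.sqrt((c1 - c3) ** 2 + (c2 - c4) ** 2) / 2.0

taps = np.array([50.0, 30.0, 10.0, 30.0])  # C1..C4 without ambient light
shifted = taps + 100.0                     # same signal plus ambient light
print(intensity_mean(*taps), intensity_mean(*shifted))            # 30.0 130.0
print(intensity_amplitude(*taps), intensity_amplitude(*shifted))  # 20.0 20.0
```

Shifting every tap by the same ambient level changes the mean but leaves the amplitude estimate untouched, which is the higher signal-to-noise behaviour the text describes.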
While the grayscale map is being generated, the control and processor 13 also receives the electrical signals output by the image sensor 121, calculates the phase difference of the reflected light 40, and from the phase difference calculates the time of flight of the light from the emission module 11 to the acquisition module 12, so that the first depth map of the target object 20 can then be computed from the time of flight.
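The phase-to-depth step can be sketched as follows. The 4-phase arctangent formula and the tap ordering are assumptions (the patent does not spell out the arithmetic), and the 20 MHz modulation frequency is an illustrative parameter:

```python
import math

C_LIGHT = 299_792_458.0  # speed of light in m/s

def depth_from_taps(c1, c2, c3, c4, mod_freq_hz):
    """Recover one depth value from four tap measurements, assuming the taps
    sample at 0/90/180/270-degree demodulation offsets."""
    # The differential pairs cancel any constant ambient offset.
    phi = math.atan2(c2 - c4, c1 - c3) % (2 * math.pi)
    # Phase maps to the round-trip time of flight at the modulation frequency.
    tof = phi / (2 * math.pi * mod_freq_hz)
    return C_LIGHT * tof / 2.0  # halve for the one-way distance

# Taps with c1 - c3 < 0 and c2 == c4 give phi = pi, i.e. half the
# unambiguous range (about 3.75 m at 20 MHz):
print(round(depth_from_taps(10.0, 30.0, 50.0, 30.0, 20e6), 3))  # 3.747
```

Note the phase wraps every 2*pi, so the unambiguous range at 20 MHz is about 7.5 m; real devices disambiguate with multiple modulation frequencies.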
Further, in the present embodiment, the depth measurement method further includes capturing a color image of the target object 20 using the RGB camera, and the control and processor 13 acquires a grayscale map of the target object 20 from the intensity information of the reflected light 40. Step S30 then includes: taking the color image or the grayscale map as a guide image, and performing interpolation restoration on the first depth map to obtain the second depth map.
The following takes the RGB image as an example. First, the control and processor 13 collects the reflected light 40 reflected by the target object 20 through part of the pixels in the pixel array to generate electrical signals, and calculates a phase difference from these signals to obtain the depth values of the target object 20, thereby outputting the first depth map. Meanwhile, the control and processor 13 controls the RGB camera to capture a color image of the target object, performs edge extraction on the color image to obtain its color edges, and maps the color edges into the first depth map through the alignment calibration result so as to restore the edges there. The control and processor 13 then performs interpolation restoration on the other part of the pixels in the first depth map, where the reflected light 40 was not collected, to obtain depth values and output the second depth map; the interpolation restoration may adopt, for example but not limited to, a region growing algorithm, deep learning, or guided filtering.
In one embodiment, the pixels of the first depth map that did not collect the reflected light 40 may be processed with a region growing algorithm to obtain depth values. Specifically, any pixel with an unknown depth value is randomly selected as a seed point; the pixels in its neighborhood are traversed with the seed point as center, and the non-edge pixels with known depth values are selected to compute a mean depth value, which becomes the depth value of the seed point. The traversal then continues from the seed point, and the nearest non-edge pixel with an unknown depth value is selected as the new seed point; this process repeats until the depth map contains no non-edge pixels with unknown depth values, at which point region growing stops. For the edge pixels, the edge points with unknown depth values are restored by median filtering to obtain a complete second depth map. The second depth map therefore includes more depth values than the first depth map, and its precision is higher than that of the first depth map.
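A toy version of the region-growing fill described above, under simplifying assumptions: NaN marks unknown depth, the neighborhood is 8-connected, and repeated sweeps stand in for the text's explicit seed-point traversal. The function name and the edge-mask input are illustrative, not from the patent:

```python
import numpy as np

def fill_depth(depth, edge_mask):
    """Fill unknown (NaN) depth values in a toy first depth map.
    Non-edge unknowns take the mean of their known non-edge 8-neighbours,
    swept repeatedly until nothing changes; edge unknowns are then
    restored with a median over all known 8-neighbours."""
    d = depth.copy()
    h, w = d.shape

    def known_neighbours(y, x, skip_edges):
        vals = []
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                if dy == 0 and dx == 0:
                    continue
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w and not np.isnan(d[ny, nx]):
                    if skip_edges and edge_mask[ny, nx]:
                        continue
                    vals.append(d[ny, nx])
        return vals

    changed = True
    while changed:  # grow regions over the non-edge unknown pixels
        changed = False
        for y in range(h):
            for x in range(w):
                if np.isnan(d[y, x]) and not edge_mask[y, x]:
                    vals = known_neighbours(y, x, skip_edges=True)
                    if vals:
                        d[y, x] = float(np.mean(vals))
                        changed = True
    for y in range(h):  # restore edge pixels with a median of known neighbours
        for x in range(w):
            if np.isnan(d[y, x]):
                vals = known_neighbours(y, x, skip_edges=False)
                if vals:
                    d[y, x] = float(np.median(vals))
    return d

depth = np.array([[1.0, np.nan], [np.nan, 3.0]])
edges = np.zeros((2, 2), dtype=bool)
print(fill_depth(depth, edges))  # both NaNs filled with 2.0
```

Filling in-place during the sweep means later pixels can use just-filled values, which is order-dependent; the text's seed-point queue has the same character.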
However, the design is not limited to this; in other embodiments, the grayscale map may be used as the guide image for interpolation restoration of the first depth map into the second depth map, and the method of obtaining the second depth map is not limited here.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents or improvements made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (7)

1. A depth measurement device, comprising:
the emission module is used for projecting a spot pattern consisting of a plurality of output light beams to a target object;
the acquisition module comprises an image sensor, the image sensor comprises a pixel array, and the acquisition module is used for acquiring optical signals of reflected light reflected by the target object through part of pixels in the pixel array and converting the optical signals into electric signals; and
the control and processor is respectively connected with the transmitting module and the collecting module, and is used for receiving the electric signals and calculating a phase difference to obtain a first depth map, and performing interpolation processing on the other part of the pixels, which do not collect the reflected light, in the first depth map to obtain a depth value and output a second depth map; the pixels without light beam illumination are used as the other part of pixels which do not collect reflected light in the first depth map;
the depth measuring device also comprises an RGB camera, and the RGB camera is connected with the control and processor and is used for acquiring a color image of the target object; the control and processor directing the interpolation process with the color image to improve interpolation accuracy, comprising: the control and processor controls the RGB camera to collect a color image of the target object, performs edge extraction on the color image to acquire a color edge of the color image, and maps the color edge into the first depth map through an alignment calibration result to perform recovery processing on the color edge.
2. The depth measurement device of claim 1, wherein each of the pixels comprises a plurality of taps for separately collecting the reflected light and forming the electrical signal within a single frame period.
3. The depth measurement device of claim 1, wherein the acquisition module further comprises an analog-to-digital converter for converting the electrical signal into a digital signal, the control and processor calculating the phase difference based on the digital signal to obtain the first depth map.
4. The depth measurement device of claim 1, wherein the emission module comprises a light source, the light source being a vertical cavity surface emitting laser.
5. A depth measuring method implemented using the depth measuring apparatus according to any one of claims 1 to 4, the depth measuring method comprising the steps of:
s10, projecting a speckle pattern to the target object by the emission module;
s20, collecting optical signals of reflected light reflected by the target object by partial pixels in a pixel array of the collection module, and converting the optical signals into electric signals;
s30, the control and processor calculates a phase difference based on the electric signals to acquire a first depth map of the target object; acquiring a color image of the target object using an RGB camera; the control and processor further processes another part of the pixels of the first depth map, which are not acquired by the reflected light, and interpolates and restores the first depth map to obtain the second depth map by using the color image or the grayscale map as a guide image, including: the control and processor controls the RGB camera to collect a color image of the target object, performs edge extraction on the color image to acquire a color edge of the color image, and maps the color edge into the first depth map through an alignment calibration result to perform recovery processing on the color edge; the pixels irradiated by the spots in the first depth map have depth data, the pixels without light beam illumination do not have depth data, and the pixels without light beam illumination are used as the other part of pixels which do not collect reflected light in the first depth map.
6. The depth measurement method of claim 5, wherein each of the pixels includes a plurality of taps, the step S20 includes:
and sequentially switching a plurality of taps in a preset sequence in a single frame period, and converting the optical signal of the reflected light collected by each tap into the electric signal.
7. The depth measurement method of claim 5, further comprising: and the control and processor acquires a gray-scale image of the target object according to the intensity information of the reflected light.
CN201911312430.6A 2019-12-18 2019-12-18 Depth measuring device and method Active CN111045030B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911312430.6A CN111045030B (en) 2019-12-18 2019-12-18 Depth measuring device and method


Publications (2)

Publication Number Publication Date
CN111045030A CN111045030A (en) 2020-04-21
CN111045030B true CN111045030B (en) 2022-09-13

Family

ID=70237723

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911312430.6A Active CN111045030B (en) 2019-12-18 2019-12-18 Depth measuring device and method

Country Status (1)

Country Link
CN (1) CN111045030B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111708039B (en) * 2020-05-24 2023-09-05 奥比中光科技集团股份有限公司 Depth measurement device and method and electronic equipment
CN112987022A (en) * 2021-03-11 2021-06-18 Oppo广东移动通信有限公司 Distance measurement method and device, computer readable medium and electronic equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106772431A * 2017-01-23 2017-05-31 杭州蓝芯科技有限公司 Depth information acquisition device combining TOF technology and binocular vision, and method therefor
CN109598736A * 2018-11-30 2019-04-09 深圳奥比中光科技有限公司 Registration method and device for a depth image and a color image
CN109613558A * 2018-12-12 2019-04-12 北京华科博创科技有限公司 Parallel data fusion processing method and system for an all-solid-state lidar system
CN110221274A * 2019-05-09 2019-09-10 深圳奥比中光科技有限公司 Time-of-flight depth camera and multi-frequency modulation/demodulation distance measurement method
CN110456379A * 2019-07-12 2019-11-15 深圳奥比中光科技有限公司 Fused depth measurement device and distance measurement method
CN209676383U * 2019-04-12 2019-11-22 深圳市光微科技有限公司 Depth camera module, depth camera, mobile terminal and imaging device

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2115859C (en) * 1994-02-23 1995-12-26 Brian Dewan Method and apparatus for optimizing sub-pixel resolution in a triangulation based distance measuring device
EP2538242B1 (en) * 2011-06-24 2014-07-02 Softkinetic Software Depth measurement quality enhancement.
US10062180B2 (en) * 2014-04-22 2018-08-28 Microsoft Technology Licensing, Llc Depth sensor calibration and per-pixel correction
EP2955544B1 (en) * 2014-06-11 2020-06-17 Sony Depthsensing Solutions N.V. A TOF camera system and a method for measuring a distance with the system
EP3318173A4 (en) * 2015-06-30 2019-04-17 Olympus Corporation Image processing device, ranging system, and endoscope system
US10091435B2 (en) * 2016-06-07 2018-10-02 Disney Enterprises, Inc. Video segmentation from an uncalibrated camera array
CN106548489B * 2016-09-20 2019-05-10 深圳奥比中光科技有限公司 Registration method for a depth image and a color image, and three-dimensional image acquisition apparatus
CN106572339B * 2016-10-27 2018-11-30 深圳奥比中光科技有限公司 Image acquisition device and image acquisition system
US11662433B2 * 2017-12-22 2023-05-30 Denso Corporation Distance measuring apparatus, recognizing apparatus, and distance measuring method
CN109831255B * 2019-03-26 2021-03-23 Oppo广东移动通信有限公司 Control system and terminal for a time-of-flight assembly
CN110471080A * 2019-07-12 2019-11-19 深圳奥比中光科技有限公司 Depth measurement device based on a TOF image sensor


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Xie Changjiang (谢长江) et al. "Color and near-infrared images based on deep learning" [title as truncated in the record]. Journal of Computer Applications (《计算机应用》), 2019, Vol. 39, No. 10. *

Also Published As

Publication number Publication date
CN111045030A (en) 2020-04-21

Similar Documents

Publication Publication Date Title
CN110596722B Time-of-flight distance measurement system and method with adjustable histogram
CN110596721B Time-of-flight distance measurement system and method with dual shared TDC circuits
CN111142088B (en) Light emitting unit, depth measuring device and method
CN103477186B (en) Stereo photographic device
CN111025317A (en) Adjustable depth measuring device and measuring method
US10041787B2 (en) Object detection device
CN110596723B Time-of-flight distance measurement method and system with dynamic histogram plotting
JP6020547B2 (en) Image acquisition apparatus and method
CN110687541A (en) Distance measuring system and method
CN111123289B (en) Depth measuring device and measuring method
CN111045030B (en) Depth measuring device and method
CN110221272B Time-of-flight depth camera and anti-interference distance measurement method
CN111538024B (en) Filtering ToF depth measurement method and device
CN111025321B (en) Variable-focus depth measuring device and measuring method
WO2019231496A1 (en) Light detection system having multiple lens-receiver units
CN111707413B (en) Centroid detection method based on single-pixel detector
CN110780312B (en) Adjustable distance measuring system and method
CN114930192B (en) Infrared imaging assembly
KR101145132B1 (en) The three-dimensional imaging pulsed laser radar system using geiger-mode avalanche photo-diode focal plane array and auto-focusing method for the same
CN106772426B (en) System for realizing remote laser high-sensitivity single photon imaging
CN111458717A (en) TOF depth measuring device and method and electronic equipment
CN110268282A (en) The dynamic visual field for receiving light is provided from dynamic position
CN211148917U (en) Distance measuring system
CN113366383B (en) Camera device and automatic focusing method thereof
CN113325439B (en) Depth camera and depth calculation method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 11-13 / F, joint headquarters building, high tech Zone, 63 Xuefu Road, Yuehai street, Nanshan District, Shenzhen, Guangdong 518000

Applicant after: Obi Zhongguang Technology Group Co.,Ltd.

Address before: 12 / F, joint headquarters building, high tech Zone, 63 Xuefu Road, Yuehai street, Nanshan District, Shenzhen, Guangdong 518000

Applicant before: SHENZHEN ORBBEC Co.,Ltd.

GR01 Patent grant