WO2021120402A1 - Fused depth measurement apparatus and measurement method - Google Patents

Fused depth measurement apparatus and measurement method

Info

Publication number
WO2021120402A1
WO2021120402A1 · PCT/CN2020/077862 · CN2020077862W
Authority
WO
WIPO (PCT)
Prior art keywords
depth
structured light
tof
depth map
target object
Prior art date
Application number
PCT/CN2020/077862
Other languages
French (fr)
Chinese (zh)
Inventor
许星
Original Assignee
深圳奥比中光科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳奥比中光科技有限公司 filed Critical 深圳奥比中光科技有限公司
Publication of WO2021120402A1 publication Critical patent/WO2021120402A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 Lidar systems specially adapted for specific applications
    • G01S 17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S 17/06 Systems determining position data of a target
    • G01S 17/08 Systems determining position data of a target for measuring distance only
    • G01S 17/10 Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves

Definitions

  • This application relates to the field of optical measurement technology, and in particular to a fusion depth measurement device and measurement method.
  • the depth measurement device can be used to obtain the depth image of the object, and can further perform 3D modeling, skeleton extraction, face recognition, etc., and has a very wide range of applications in the fields of 3D measurement and human-computer interaction.
  • the current depth measurement technology mainly includes TOF ranging technology and structured light ranging technology.
  • TOF ranging technology achieves precise ranging by measuring the round-trip flight time of light pulses between the transmitting/receiving device and the target object, and is divided into direct ranging technology and indirect ranging technology. Indirect ranging measures the phase delay of the reflected light signal relative to the emitted light signal and then calculates the flight time from the phase delay; depending on the type of modulation and demodulation, it can be divided into continuous-wave (CW) modulation/demodulation and pulse-modulated (PM) modulation/demodulation. TOF ranging does not require complex image processing calculations, and the detection distance is relatively long while maintaining high accuracy.
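  • For reference, the relation underlying indirect (phase-based) TOF ranging can be written as follows; this is the generic textbook formulation rather than an equation reproduced from this publication.

```latex
% Continuous-wave indirect TOF: amplitude modulated at frequency f_mod;
% the reflected signal returns with phase delay \Delta\varphi relative to the emitted signal.
\Delta\varphi = 2\pi f_{\mathrm{mod}}\, t_{\mathrm{flight}}, \qquad
d = \frac{c\, t_{\mathrm{flight}}}{2} = \frac{c\, \Delta\varphi}{4\pi f_{\mathrm{mod}}}
```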
  • the structured light ranging technology emits a structured light beam to a space object, then collects the structured light pattern formed by the structured light beam modulated and reflected by the object, and finally uses the triangulation method to perform depth calculation to obtain the depth data of the object.
  • Commonly used structured light patterns include irregular spot patterns, stripe patterns, phase shift patterns and so on.
  • Structured light technology has very high accuracy in short-distance measurement and performs well in low-light environments, but it is easily affected in strong-light environments; by comparison, TOF technology resists interference in strong-light environments better than structured light technology.
  • The purpose of this application is to provide a fused depth measurement device and measurement method to solve at least one of the problems described in the background above.
  • A fused depth measurement device, including a transmitting module, a receiving module, and a control and processing circuit respectively connected with the transmitting module and the receiving module; wherein the transmitting module includes a light source array and an optical element, the light source array is used to emit a light beam whose amplitude is modulated in time sequence, and the optical element receives the light beam and then emits a spot pattern beam toward a target object; the receiving module includes a TOF image sensor, the TOF image sensor includes a pixel array, and the pixel array receives the spot pattern beam reflected by the target object and forms an electrical signal; the control and processing circuit receives the electrical signal and calculates a phase difference, and uses the phase difference to calculate a TOF depth map of the target object; receives the electrical signal and calculates a structured light pattern, and uses the structured light pattern to calculate a structured light depth map of the target object; and takes the depth values in the TOF depth map as reliable points, assigns them to the corresponding pixel positions in the structured light depth map, and corrects the structured light depth map using the reliable points to finally obtain a depth image of the target object.
  • In some embodiments, the TOF image sensor includes at least one pixel, and each pixel includes two or more taps.
  • In some embodiments, the control and processing circuit provides a demodulation signal for each tap in each pixel of the TOF image sensor, and under the control of the demodulation signal each tap collects the electrical signal generated by the reflected beam returning from the target object.
  • In some embodiments, the control and processing circuit includes a phase calculation module and an intensity calculation module; the electrical signal generated by the TOF image sensor is transmitted to the phase calculation module and the intensity calculation module at the same time, the phase information and intensity information corresponding to each pixel are obtained through processing and calculation, and the TOF depth map and the structured light depth map corresponding to the pixel are further obtained from the phase information and the intensity information.
  • In some embodiments, the control and processing circuit further includes a calibration module, a matching module, and a correction module; the TOF depth map obtained by the calibration module and the structured light depth map obtained by the matching module are input to the correction module, which establishes a mapping between the TOF depth map and the structured light depth map so that the TOF depth value calculated from the electrical signal generated by each pixel corresponds to the calculated structured light depth value, assigns the TOF depth value to the corresponding pixel coordinates in the structured light depth map, takes the assigned points as reliable points, and performs correction processing on the unassigned points in the structured light depth map.
  • a fusion depth measurement method includes the following steps:
  • In step S1, the light source array emits the spot pattern beam toward the target area, and the control and processing circuit controls the amplitude of the beam corresponding to each spot in the spot pattern beam to be modulated in time sequence in at least one of a continuous wave, square wave, or pulse manner.
  • In step S2, the TOF image sensor includes at least one pixel, and each pixel includes two or more taps; within a single frame period, the taps are switched sequentially in a certain order to collect the corresponding photons, receive the optical signal, and convert it into an electrical signal.
  • In step S3, the control and processing circuit receives the electrical signal input by the pixel array and performs phase calculation and intensity calculation respectively to obtain the TOF depth map and the structured light depth map of the target object, assigns the TOF depth values to the corresponding pixels in the structured light depth map, thereby distinguishing the points on the structured light depth map into reliable points and unreliable points, and corrects the unreliable points with the reliable points in combination with a correction algorithm to obtain the depth image of the target object.
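  • Taken together, the steps above can be pictured with the following rough sketch, written only to illustrate the data flow of the method; the function parameters stand in for the phase calculation, intensity calculation, matching, and correction stages and are hypothetical names, not names used in this publication.

```python
def fused_depth_frame(taps, phase_depth_fn, intensity_fn, match_fn, correct_fn):
    """One-frame orchestration of the fused method. `taps` holds the per-pixel tap
    charges collected while the amplitude-modulated spot pattern is projected (steps
    S1/S2); the four callables are hypothetical stand-ins for the phase calculation,
    intensity calculation, matching, and correction stages of step S3."""
    tof_depth = phase_depth_fn(taps)           # phase calculation -> TOF depth map
    pattern = intensity_fn(taps)               # intensity calculation -> structured light pattern
    sl_depth = match_fn(pattern)               # matching + triangulation -> structured light depth map
    reliable = tof_depth > 0                   # pixels with a valid TOF depth, i.e. hit by a spot
    sl_depth[reliable] = tof_depth[reliable]   # TOF depths assigned as reliable points
    return correct_fn(sl_depth, reliable)      # e.g. region growing from the reliable points
```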
  • An electronic device, including a housing, a screen, and a fused depth measurement device;
  • the fused depth measurement device includes a transmitting module, a receiving module, and a control and processing circuit respectively connected to the transmitting module and the receiving module;
  • the transmitting module includes a light source array and an optical element, the light source array is used to emit a light beam whose amplitude is modulated in time sequence, and the optical element receives the light beam and then emits a spot pattern beam toward a target object;
  • the receiving module includes a TOF image sensor, the TOF image sensor includes a pixel array, and the pixel array receives the spot pattern beam reflected by the target object and forms an electrical signal;
  • the control and processing circuit receives the electrical signal and calculates a phase difference, and uses the phase difference to calculate a TOF depth map of the target object; receives the electrical signal and calculates a structured light pattern, and uses the structured light pattern to calculate a structured light depth map of the target object; and takes the depth values in the TOF depth map as reliable points, assigns them to the corresponding pixel positions in the structured light depth map, and corrects the structured light depth map using the reliable points to finally obtain a depth image of the target object.
  • An embodiment of the present application provides a fused depth measurement device, including a transmitting module for emitting toward a target object a spot pattern beam whose amplitude is modulated in time sequence; a receiving module for receiving the spot pattern beam reflected by the target object and forming an electrical signal; and a control and processing circuit that receives the electrical signal, calculates a TOF depth map and a structured light pattern, takes the depth values in the TOF depth map as reliable points, assigns them to the corresponding pixel positions in the structured light depth map, and corrects the structured light depth map using the reliable points to finally obtain a depth image of the target object.
  • Based on the TOF depth values, this application achieves relatively high measurement accuracy; the TOF depth values are used to correct the errors introduced into the structured light depth map by the matching calculation, yielding a high-accuracy depth map. Combined with the high-resolution advantage of structured light measurement, a high-accuracy, high-resolution, low-power, miniaturized depth measurement device is realized.
  • Fig. 1 is a schematic diagram of a fusion depth measuring device according to an embodiment of the present application.
  • Fig. 2 is a schematic diagram of the principle of a fusion depth measuring device according to an embodiment of the present application.
  • Fig. 3 is a schematic diagram of the control and processing circuit architecture of a fused depth measurement device according to an embodiment of the present application.
  • Fig. 4 is a flowchart of a fusion depth measurement method according to an embodiment of the present application.
  • Fig. 5 is a schematic diagram of an electronic device integrated with the depth measurement device fused in Fig. 1 according to an embodiment of the present application.
  • A connection may serve either a fixing function or an electrical-circuit connection.
  • first and second are only used for descriptive purposes, and cannot be understood as indicating or implying relative importance or implicitly indicating the number of indicated technical features. Therefore, the features defined with “first” and “second” may explicitly or implicitly include one or more of these features.
  • “plurality” means two or more, unless otherwise specifically defined.
  • FIG. 1 is a schematic diagram of a fusion depth measurement device according to an embodiment of the application.
  • The fused depth measurement device 10 includes a transmitting module 11, a receiving module 12, and a control and processing circuit 13 respectively connected to the transmitting module 11 and the receiving module 12; the transmitting module 11 is used to emit a light beam 30 toward a target object 20, and the emitted beam 30 is a spot pattern beam whose amplitude is modulated in time sequence.
  • The spot pattern beam is projected into the target space to illuminate the target object 20 in the space; at least part of the emitted beam 30 is reflected by the target object 20 to form a reflected beam 40.
  • At least part of the reflected beam 40 is received by the receiving module 12; the control and processing circuit 13 is connected to the transmitting module 11 and the receiving module 12 respectively to control the emission and reception of the beams, receives the information generated by the receiving module 12 upon receiving the reflected beam, and performs calculations on this information to obtain the depth information of the target object.
  • the emission module 11 includes a light source array 111, an optical element 112, a light source driver (not shown in the figure), and the like.
  • The light source array 111 may be composed of multiple light sources such as light-emitting diodes (LED), edge-emitting lasers (EEL), or vertical-cavity surface-emitting lasers (VCSEL).
  • The beams emitted by the light sources may be visible light, infrared light, ultraviolet light, etc.
  • Preferably, the light source array 111 is an irregularly arranged VCSEL array for emitting an irregular spot pattern beam.
  • The light source array 111 is controlled by the light source driver (which may in turn be controlled by the control and processing circuit 13) to emit beams whose amplitude is modulated with a certain timing.
  • For example, in one embodiment the light source array 111, under the control of the light source driver, emits pulse-modulated beams, square-wave-modulated beams, sine-wave-modulated beams, or other beams at a certain frequency.
  • In one embodiment of this application, the amplitude of the beam corresponding to each spot in the irregular spot pattern beam is modulated in time sequence in a continuous wave, square wave, or pulse manner. It is understandable that a part of the control and processing circuit 13, or a sub-circuit independent of the control and processing circuit 13 such as a pulse signal generator, can be used to control the light source array 111 to emit the related beams.
  • The optical element 112 receives the beam from the light source array 111 and emits the spot pattern beam outward. In some embodiments, the optical element 112 is also used to expand the received beam so as to enlarge the field of view of the measurement device. It is understandable that the amplitude of the beam after passing through the optical element 112 is still modulated with the same timing, that is, an incident sine-wave-modulated beam exits as a sine-wave-modulated beam.
  • the optical element 112 may be one or a combination of a lens, a diffractive optical element (DOE), a microlens array, and a liquid crystal.
  • the receiving module 12 includes a TOF image sensor 121, a filter unit 122, and a lens unit 123.
  • The lens unit 123 receives at least part of the spot pattern beam reflected by the target object and images it onto at least part of the TOF image sensor 121; the filter unit 122 is a narrowband filter matched to the wavelength of the light source, used to suppress background light noise in other wavelength bands.
  • The TOF image sensor 121 may be an image sensor array composed of charge-coupled devices (CCD), complementary metal-oxide semiconductors (CMOS), avalanche diodes (AD), single-photon avalanche diodes (SPAD), etc.
  • The size of the array determines the resolution of the depth camera, for example 320×240.
  • A readout circuit (not shown in the figure) composed of one or more of a signal amplifier, a time-to-digital converter (TDC), an analog-to-digital converter (ADC), and other devices is also connected to the image sensor 121.
  • The TOF image sensor 121 includes at least one pixel. Compared with a traditional image sensor used only for taking pictures, each pixel of the TOF image sensor 121 here includes two or more taps (a tap is used to store and read out, or discharge, under the control of the corresponding electrode, the charge signal generated by incident photons), for example 2 taps; within a single frame period (or single exposure time) the taps are switched in a certain order to collect the corresponding photons, receive the optical signal, and convert it into an electrical signal.
  • The control and processing circuit 13 may be an independent dedicated circuit, such as a dedicated SOC chip, FPGA chip, or ASIC chip composed of a CPU, memory, bus, etc., or it may be a general-purpose processing circuit. For example, when the depth measurement device is integrated into a smart terminal such as a mobile phone, TV, or computer, the processing circuit in the smart terminal can serve as at least a part of the control and processing circuit 13.
  • the control and processing circuit 13 is used to provide a modulation signal (transmission signal) required when the light source array 111 emits laser light, and the light source emits a light beam to a target object under the control of the modulation signal.
  • In one embodiment, the modulation signal is a square wave signal or a pulse signal; under this modulation signal, the amplitude of the light source is modulated in time sequence so that a square-wave or pulse beam is emitted outward.
  • The control and processing circuit 13 also provides a demodulation signal (collection signal) for each tap in each pixel of the TOF image sensor 121, and under the control of the demodulation signal each tap collects the electrical signal generated by the reflected beam returning from the target object.
  • The control and processing circuit 13 processes the electrical signal and calculates the intensity information reflecting the intensity of the reflected beam to form a structured light pattern; finally, based on the structured light pattern, it performs matching calculation, triangulation calculation, etc. to obtain the structured light depth image of the target object under test.
  • The control and processing circuit 13 also processes the electrical signal to calculate the phase difference of the beam from emission to reception, calculates the flight time of the beam from the phase difference, and further obtains the TOF depth image of the target object. Further, the control and processing circuit 13 can also correct the structured light depth image according to the TOF depth image; for example, the depth values in the TOF depth map can be taken as reliable points and assigned to the corresponding pixel positions in the structured light depth map, and the structured light depth map is corrected using the reliable points. A more specific correction method is described later.
  • The depth measurement device 10 may also include a drive circuit, a power supply, a color camera, an infrared camera, an IMU, and other components not shown in the figure.
  • Combining these components enables richer functions, such as 3D texture modeling, infrared face recognition, SLAM, and so on.
  • the depth measuring device 10 may be embedded in electronic products such as mobile phones, tablet computers, and computers.
  • Fig. 2 is a schematic diagram of the principle of a depth measuring device according to an embodiment of the present application.
  • the control and processing circuit 13 controls the light source array 111 to emit a spot pattern beam 301 whose amplitude is modulated by a square wave or pulse toward the target object, and the amplitude of each spot 302 is modulated by a square wave or pulse in time sequence. It is understandable that the light sources in the light source array 111 are modulated in the same manner. In some other embodiments, a sine wave may also be used to modulate the amplitude of the emitted light beam.
  • Each pixel of the TOF image sensor in the receiving module 12 includes 4 taps, which are used to collect the optical signal 4 times within a single frame period and convert it into electrical signals C1, C2, C3 and C4; the duration and interval of the 4 acquisitions are the same.
  • The control and processing circuit 13 receives the electrical signals C1, C2, C3 and C4 and calculates the intensity information of the spot pattern beam.
  • The intensity information is calculated according to the following formula:
  • After the intensity information of all pixels is obtained, the structured light pattern can be formed; the structured light pattern is then used for matching calculation to obtain the parallax, and the structured light depth image is calculated from the parallax.
  • Alternatively, the intensity information may be calculated according to the following formula:
  • The structured light pattern is generated from the calculated intensity information of the spot pattern beam, matching calculation is then performed on the structured light pattern to obtain the parallax, and the structured light depth image is calculated from the parallax.
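  • Because the intensity formulas themselves are not reproduced in this text, the following sketch only shows a commonly assumed way of recovering the spot-pattern intensity image from the four tap charges; the exact expression used in the publication may differ.

```python
import numpy as np

def structured_light_pattern(taps):
    """taps: array of shape (H, W, 4) holding the charges C1..C4 of every pixel.
    Returns an intensity image in which the projected spots appear as bright dots;
    this image plays the role of the structured light pattern used for matching.
    Assumed formula: total charge collected by the four taps within one frame."""
    c1, c2, c3, c4 = np.moveaxis(taps, -1, 0)
    return c1 + c2 + c3 + c4
```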
  • The above structured light acquisition scheme based on a 4-tap TOF image sensor and a square-wave or pulse-modulated emission signal is also applicable to TOF image sensors with other numbers of taps and to depth measurement devices with other types of modulated emission signals. It is understandable that, compared with traditional structured light depth measurement, the present application emits a time-modulated speckle projection beam at the transmitting end and adopts multi-tap pixel acquisition at the receiving end, so this method can provide more functions than the traditional solution; for example, it can achieve depth measurement that resists environmental interference, which is difficult for traditional solutions.
  • The control and processing circuit 13 also receives the output of the TOF image sensor: each tap collects the electrical signal generated by the reflected beam returning from the target object, the phase difference of the reflected beam is calculated from these signals, the flight time of the beam from the transmitting end to the receiving end is calculated from the phase difference, and the TOF depth image of the target object is further calculated from the flight time.
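  • A minimal sketch of this phase branch, using the standard 4-phase demodulation equations (an assumption; the publication's own equations are not reproduced here):

```python
import numpy as np

def tof_depth_map(taps, f_mod, c=3e8):
    """taps: (H, W, 4) charges C1..C4, assumed to be sampled at 0°, 90°, 180° and 270°
    of the modulation period. Returns the TOF depth map in metres."""
    c1, c2, c3, c4 = np.moveaxis(taps, -1, 0)
    phase = np.arctan2(c2 - c4, c1 - c3) % (2 * np.pi)  # phase delay (sign convention depends on tap order)
    t_flight = phase / (2 * np.pi * f_mod)              # round-trip flight time
    return c * t_flight / 2                             # one-way distance
```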
  • Fig. 3 is a schematic diagram of a control and processing circuit architecture according to an embodiment of the present application.
  • The control and processing circuit 13 includes a phase calculation module 131 and an intensity calculation module 133; the output of the phase calculation module 131 is connected to a calibration module 132, the output of the intensity calculation module 133 is connected to a preprocessing module 134, and the output of the preprocessing module 134 is connected to a matching module 135.
  • the input of the calibration module 132 and the matching module 135 is also connected to the memory 137, and the output is connected to the correction module 136.
  • the control and processing circuit 13 receives the electrical signal from the TOF image sensor.
  • The transmitting module 11 emits an amplitude-modulated irregular spot pattern beam toward the target area, and the receiving module 12 receives the spot pattern beam reflected by the target object.
  • The electrical signal generated by the TOF image sensor is transmitted to the phase calculation module 131 and the intensity calculation module 133 at the same time, and the phase information and intensity information corresponding to each pixel are obtained through processing and calculation.
  • From the phase information and the intensity information, the TOF depth map and the structured light depth map corresponding to the pixels are further obtained; the two maps have a corresponding positional relationship.
  • The depth values in the TOF depth map are taken as reliable points, assigned to the corresponding pixel positions in the structured light depth map, and the structured light depth map is corrected using the reliable points. Specifically, this proceeds as follows.
  • The phase calculation module 131 processes the electrical signal to obtain the phase difference; based on the phase difference, the flight time of the beam from emission to reception can be calculated, and the TOF depth image of the target object can be directly calculated by the phase calculation module 131.
  • The TOF depth map is then sent to the calibration module 132 for calibration. Because TOF measurement is often disturbed by noise, there is a certain error between the measured value and the actual value; therefore a calibration step is adopted before actual use. For example, calibration boards are placed at intervals within a certain measurement range, the actual depth values of the calibration boards are known, and the boards at different distances are measured one after another to obtain the measured value corresponding to each distance, from which the relationship between the measured values and the actual values can be established.
  • During calibration, the calibration module calls the pre-calibration parameters from the memory 137 to calibrate the current measurement value.
  • The pre-calibration parameter here can be a comparison table (index) between actual values and measured values, in which case the calibration process of the calibration module 132 is effectively a table look-up process; it can also be a mathematical model of the error whose unknown parameters are calculated from multiple measurements performed in advance, in which case the calibration process of the calibration module 132 is a process of calculating actual values from the model and the measured values. After calibration, an accurate TOF depth map is obtained.
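  • A minimal sketch of the look-up-table style calibration described above; the table values and the interpolation between entries are illustrative assumptions, not data from the publication.

```python
import numpy as np

class TofCalibration:
    """Pre-calibration: a calibration board is measured at known distances, the
    (measured, actual) pairs are stored, and later measurements are mapped through them."""
    def __init__(self, measured_mm, actual_mm):
        order = np.argsort(measured_mm)
        self.measured = np.asarray(measured_mm, dtype=float)[order]
        self.actual = np.asarray(actual_mm, dtype=float)[order]

    def correct(self, depth_map_mm):
        # Table look-up with linear interpolation; values outside the calibrated
        # range are clamped to the nearest table entry.
        return np.interp(depth_map_mm, self.measured, self.actual)

# Usage sketch: boards placed at 0.5 m steps, measured values carrying a small bias.
cal = TofCalibration(measured_mm=[520, 1015, 1530, 2040], actual_mm=[500, 1000, 1500, 2000])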
  • After the electrical signal generated by the TOF image sensor 121 is transmitted to the intensity calculation module 133, the intensity calculation module 133 performs intensity calculation on the electrical signal to obtain intensity information reflecting the intensity of the light beam and form a structured light image. The structured light image is then sent to the preprocessing module 134 for processing such as denoising and contrast enhancement; preprocessing tasks such as image distortion correction can also be performed on the structured light image.
  • The preprocessed image then enters the matching module 135 for matching calculation. During the matching calculation, the matching module 135 calls the pre-stored reference image from the memory 137; in one embodiment, the matching module 135 uses a zero-mean normalized least-squares distance function to estimate the pixel deviation between the structured light image and the reference image.
  • From the pixel deviation, the matching module 135 can directly calculate the depth values to obtain the structured light depth image of the target under test.
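  • The matching and triangulation step can be pictured with the following sketch; the zero-mean cost and the depth-from-disparity relation below are standard structured light / stereo formulas that stand in for whatever exact cost and calibration the matching module 135 uses.

```python
import numpy as np

def zero_mean_ssd(patch, ref_patch):
    """Zero-mean sum-of-squared-differences between a captured patch and a reference
    patch; the disparity candidate with the smallest cost is taken as the match."""
    a = patch - patch.mean()
    b = ref_patch - ref_patch.mean()
    return float(np.sum((a - b) ** 2))

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Triangulation: depth is inversely proportional to the pixel deviation between
    the captured structured light image and the stored reference image."""
    return focal_px * baseline_m / np.maximum(disparity_px, 1e-6)
```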
  • The TOF depth map is obtained directly from the phase difference calculation and its measurement accuracy is relatively high; however, in the TOF calculation mode only the depth values of the pixels that the reflected beam actually enters can be obtained, not the depth values of all pixels. Therefore, in the embodiment of the present application, the depth values extracted from the TOF depth map are used as reliable points to correct the structured light depth map.
  • The TOF depth map obtained by the calibration module 132 and the structured light depth map obtained by the matching module 135 are input to the correction module 136, which establishes a mapping between the TOF depth map and the structured light depth map so that the TOF depth value calculated from the electrical signal generated by each pixel corresponds to the calculated structured light depth value.
  • The TOF depth value is assigned to the corresponding pixel coordinates in the structured light depth map; the assigned points are called reliable points and the unassigned points are called unreliable points. Based on the reliable points, the unreliable points in the structured light depth map are corrected to improve the accuracy of the structured light depth map.
  • the correction method may include the region growing method, the confidence weighting method, and so on.
  • In one embodiment, the region growing method is used to correct the structured light depth map.
  • Region growing is based on the assumption that the object is continuous, so the depth values of two nearby points on the object are approximately equal.
  • Multiple reliable points are selected as seed points; by setting an appropriate growth criterion, 4-connectivity (up, down, left, right) is used to merge unreliable points that satisfy the correlation with reliable points into the same region, forming a new region.
  • The reliable points assigned with TOF depth values are called seed points, an appropriate growth criterion is set, and a first-in first-out region q is set; the depth value of all pixels in the region is made equal to the depth value of the seed point.
  • Assume the depth value of the seed point p at (x0, y0) is d. The correlation between the depth values of the four pixels adjacent to p, namely (x0, y0−1), (x0, y0+1), (x0−1, y0), and (x0+1, y0), and the seed point p is evaluated; for example, the growth criterion may be that the absolute value of the difference between the depth values of two adjacent pixels is less than a custom threshold T.
  • An unreliable point that meets the condition is merged into the region q where the seed point is located and becomes a reliable point, and the newly obtained reliable point is used as a seed point to continue the prediction; if the correlation between the depth value of point p and those of its four adjacent pixels does not meet the set growth criterion, the growth of the region terminates. After this correction processing, the measurement accuracy of the depth map is effectively improved.
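  • A compact sketch of the 4-connected region growing correction just described, with a first-in first-out queue of seed points and the |depth difference| < T growth criterion; the threshold value and array layout are illustrative assumptions.

```python
from collections import deque
import numpy as np

def region_grow_correct(sl_depth, reliable_mask, T=20.0):
    """sl_depth: structured light depth map already overwritten with TOF depths at the
    reliable pixels; reliable_mask marks those pixels. An unreliable 4-neighbour whose
    depth differs from the seed by less than T is merged, takes the seed's depth, and
    becomes a new seed; growth stops where no neighbour satisfies the criterion."""
    depth = sl_depth.astype(float).copy()
    reliable = reliable_mask.copy()
    q = deque(zip(*np.nonzero(reliable)))                               # FIFO queue of seed points
    h, w = depth.shape
    while q:
        y, x = q.popleft()
        d = depth[y, x]
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):  # 4-connectivity
            if 0 <= ny < h and 0 <= nx < w and not reliable[ny, nx]:
                if abs(depth[ny, nx] - d) < T:                           # growth criterion
                    depth[ny, nx] = d                                    # merged pixel takes the seed depth
                    reliable[ny, nx] = True                              # ...and becomes a new seed
                    q.append((ny, nx))
    return depth, reliable
```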
  • this application also proposes a fused depth measurement method.
  • Fig. 4 is a flowchart of a fusion depth measurement method according to another embodiment of the application, which includes the following steps:
  • the light source array emits the spot pattern beam toward the target area
  • the control circuit controls the amplitude of the beam corresponding to each spot in the spot pattern beam to be modulated by at least one of continuous wave, square wave, or pulse in time sequence.
  • the TOF image sensor includes at least one pixel, and each pixel includes two or more taps.
  • In one embodiment, each pixel includes 4 taps; within a single frame period (or single exposure time) the taps are switched sequentially in a certain order to collect the corresponding photons, receive the optical signal, and convert it into an electrical signal.
  • The control and processing circuit receives the electrical signal input from the pixel array and performs phase calculation and intensity calculation respectively, wherein the TOF depth map of the target area to be measured is obtained through phase calculation and the structured light depth map of the target area to be measured is obtained through intensity calculation. The TOF depth values are assigned to the corresponding pixels in the structured light depth map, thereby distinguishing the points on the structured light depth map into reliable points (TOF depth values) and unreliable points, and the reliable points are used in combination with a correction algorithm to correct the unreliable points.
  • reliable points are used as seed points, and unreliable points are optimized in combination with a region growing algorithm, so as to correct the structured light depth map to obtain a depth image with higher accuracy.
  • This application proposes a fusion depth measurement device and measurement method composed of a structured light emitting end and a TOF image sensor based on amplitude timing modulation.
  • TOF depth value calculation and structured light depth value calculation are performed respectively; based on the TOF depth values, the device achieves high measurement accuracy.
  • The errors introduced by the matching calculation of the structured light depth map are corrected on this basis to obtain a high-precision depth map; combined with the high-resolution advantage of structured light measurement, a high-precision, high-resolution, low-power, miniaturized depth measurement solution is realized.
  • an electronic device is also provided.
  • the electronic device may be a mobile phone, a tablet, a computer, a TV, a smart helmet, smart glasses, a robot, and so on.
  • The electronic device 500 includes a housing 51, a screen 52, and the fused depth measurement device described in the foregoing embodiments; the screen 52 is used for information display, and the housing 51 can provide protection functions such as dustproofing, waterproofing, and drop resistance for the electronic device.
  • The transmitting module 11 and the receiving module 12 of the fused depth measurement device are arranged on a first plane of the electronic device 500, and are used to emit toward the target object the spot pattern beam whose amplitude is modulated in time sequence and to receive the spot pattern beam reflected by the target object; the screen 52 is mounted on a second plane of the electronic device to display information such as images or text; the first plane and the second plane are the same plane, or the first plane and the second plane are opposite planes.
  • In some embodiments, the control and processing circuit of the fused depth measurement device can be shared with the electronic device; and in some embodiments, when the electronic device itself is provided with a TOF image sensor, the receiving module of the fused depth measurement device can also share that TOF image sensor.
  • With the fused depth measurement device, the functions of electronic devices are continuously expanded and their applications become ever wider; for example, depth measurement of the target object can be realized to obtain a higher-precision depth image containing the target, and based on the high-precision depth image, functions such as three-dimensional reconstruction, face recognition, and human-computer interaction can further be realized.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

Disclosed is a fused depth measurement apparatus, comprising a transmitting module which is used for transmitting, to a target object, a spot pattern light beam with the amplitude being modulated in time sequence; a receiving module for receiving the spot pattern light beam reflected by the target object and forming an electrical signal; and a control and processing circuit for receiving the electrical signal and performing calculation to obtain a TOF depth image and a structured light pattern, taking a depth value in the TOF depth image as a reliable point and assigning same to a corresponding pixel position in a structured light depth image, and using the reliable point to correct the structured light depth image to finally obtain a depth image of the target object. The present application has relatively high measurement accuracy on the basis of a TOF depth value; an error of a structured light depth image caused by matching calculation is corrected on the basis of the TOF depth value to obtain a high-accuracy depth image; and in conjunction with the advantage of high resolution of structured light measurement, a high-accuracy, high-resolution, low-power-consumption and miniaturized depth measurement apparatus is achieved.

Description

Fused depth measurement device and measurement method
This application claims priority to the Chinese patent application filed with the Chinese Patent Office on December 18, 2019, with application number 201911306106.3 and entitled "Fused depth measurement device and measurement method", the entire content of which is incorporated herein by reference.
Technical field
This application relates to the field of optical measurement technology, and in particular to a fused depth measurement device and measurement method.
Background
A depth measurement device can be used to obtain a depth image of an object, on which 3D modeling, skeleton extraction, face recognition and the like can further be performed; it therefore has very wide applications in fields such as 3D measurement and human-computer interaction. Current depth measurement technologies mainly include TOF ranging and structured light ranging.
TOF stands for Time-of-Flight. TOF ranging achieves precise ranging by measuring the round-trip flight time of light pulses between the transmitting/receiving device and the target object, and is divided into direct ranging and indirect ranging. Indirect ranging measures the phase delay of the reflected light signal relative to the emitted light signal and then calculates the flight time from the phase delay; depending on the type of modulation and demodulation, it can be divided into continuous-wave (CW) modulation/demodulation and pulse-modulated (PM) modulation/demodulation. TOF ranging does not require complex image processing calculations, and the detection distance is relatively long while maintaining high accuracy.
Structured light ranging emits a structured light beam toward an object in space, collects the structured light pattern formed by the structured light beam after being modulated and reflected by the object, and finally performs depth calculation by triangulation to obtain depth data of the object. Commonly used structured light patterns include irregular spot patterns, stripe patterns, phase-shift patterns, and so on.
Structured light technology has very high accuracy in short-range measurement and performs well in low-light environments, but it is easily affected in strong-light environments; by comparison, TOF technology resists interference in strong-light environments better than structured light technology.
Therefore, a solution that fuses structured light technology with TOF technology and takes full advantage of both for depth measurement would greatly improve measurement performance.
Summary of the invention
The purpose of this application is to provide a fused depth measurement device and measurement method to solve at least one of the problems described in the background above.
To achieve the above purpose, the technical solutions of the embodiments of this application are implemented as follows:
A fused depth measurement device, including a transmitting module, a receiving module, and a control and processing circuit respectively connected with the transmitting module and the receiving module; wherein the transmitting module includes a light source array and an optical element, the light source array is used to emit a light beam whose amplitude is modulated in time sequence, and the optical element receives the light beam and then emits a spot pattern beam toward a target object; the receiving module includes a TOF image sensor, the TOF image sensor includes a pixel array, and the pixel array receives the spot pattern beam reflected by the target object and forms an electrical signal; the control and processing circuit receives the electrical signal and calculates a phase difference, and uses the phase difference to calculate a TOF depth map of the target object; receives the electrical signal and calculates a structured light pattern, and uses the structured light pattern to calculate a structured light depth map of the target object; and takes the depth values in the TOF depth map as reliable points, assigns them to the corresponding pixel positions in the structured light depth map, and corrects the structured light depth map using the reliable points to finally obtain a depth image of the target object.
In some embodiments, the TOF image sensor includes at least one pixel, and each pixel includes two or more taps.
In some embodiments, the control and processing circuit provides a demodulation signal for each tap in each pixel of the TOF image sensor, and under the control of the demodulation signal each tap collects the electrical signal generated by the reflected beam returning from the target object.
In some embodiments, the control and processing circuit includes a phase calculation module and an intensity calculation module; the electrical signal generated by the TOF image sensor is transmitted to the phase calculation module and the intensity calculation module at the same time, the phase information and intensity information corresponding to each pixel are obtained through processing and calculation, and the TOF depth map and the structured light depth map corresponding to the pixel are further obtained from the phase information and the intensity information.
In some embodiments, the control and processing circuit further includes a calibration module, a matching module, and a correction module; the TOF depth map obtained by the calibration module and the structured light depth map obtained by the matching module are input to the correction module, which establishes a mapping between the TOF depth map and the structured light depth map so that the TOF depth value calculated from the electrical signal generated by each pixel corresponds to the calculated structured light depth value, assigns the TOF depth value to the corresponding pixel coordinates in the structured light depth map, takes the assigned points as reliable points, and performs correction processing on the unassigned points in the structured light depth map.
Another technical solution of this application is:
A fused depth measurement method, including the following steps:
S1. Controlling a light source array to emit a light beam whose amplitude is modulated in time sequence, an optical element receiving the light beam and then emitting a spot pattern beam toward a target object;
S2. Receiving, through a pixel array in a TOF image sensor, the spot pattern beam reflected by the target object and forming an electrical signal;
S3. Receiving the electrical signal and calculating a phase difference, and calculating a TOF depth map of the target object based on the phase difference; receiving the electrical signal to form a structured light pattern, and calculating a structured light depth map of the target object using the structured light pattern; and taking the depth values in the TOF depth map as reliable points, assigning them to the corresponding pixel positions in the structured light depth map, and correcting the structured light depth map using the reliable points to finally obtain a depth image of the target object.
In some embodiments, in step S1, the light source array emits the spot pattern beam toward the target area, and the control and processing circuit controls the amplitude of the beam corresponding to each spot in the spot pattern beam to be modulated in time sequence in at least one of a continuous wave, square wave, or pulse manner.
In some embodiments, in step S2, the TOF image sensor includes at least one pixel, and each pixel includes two or more taps; within a single frame period, the taps are switched sequentially in a certain order to collect the corresponding photons, receive the optical signal, and convert it into an electrical signal.
In some embodiments, in step S3, the control and processing circuit receives the electrical signal input by the pixel array and performs phase calculation and intensity calculation respectively to obtain the TOF depth map and the structured light depth map of the target object, assigns the TOF depth values to the corresponding pixels in the structured light depth map, thereby distinguishing the points on the structured light depth map into reliable points and unreliable points, and corrects the unreliable points with the reliable points in combination with a correction algorithm to obtain the depth image of the target object.
Yet another technical solution of this application is:
An electronic device, including a housing, a screen, and a fused depth measurement device; the fused depth measurement device includes a transmitting module, a receiving module, and a control and processing circuit respectively connected to the transmitting module and the receiving module; the transmitting module includes a light source array and an optical element, the light source array is used to emit a light beam whose amplitude is modulated in time sequence, and the optical element receives the light beam and then emits a spot pattern beam toward a target object; the receiving module includes a TOF image sensor, the TOF image sensor includes a pixel array, and the pixel array receives the spot pattern beam reflected by the target object and forms an electrical signal; the control and processing circuit receives the electrical signal and calculates a phase difference, and uses the phase difference to calculate a TOF depth map of the target object; receives the electrical signal and calculates a structured light pattern, and uses the structured light pattern to calculate a structured light depth map of the target object; and takes the depth values in the TOF depth map as reliable points, assigns them to the corresponding pixel positions in the structured light depth map, and corrects the structured light depth map using the reliable points to finally obtain a depth image of the target object; wherein the transmitting module and the receiving module of the fused depth measurement device are arranged on a first plane of the electronic device to emit toward the target object the spot pattern beam whose amplitude is modulated in time sequence and to receive the spot pattern beam reflected by the target object; the screen is mounted on a second plane of the electronic device to display information such as images or text; and the first plane and the second plane are the same plane, or the first plane and the second plane are opposite planes.
An embodiment of this application provides a fused depth measurement device, including a transmitting module for emitting toward a target object a spot pattern beam whose amplitude is modulated in time sequence; a receiving module for receiving the spot pattern beam reflected by the target object and forming an electrical signal; and a control and processing circuit for receiving the electrical signal, calculating a TOF depth map and a structured light pattern, taking the depth values in the TOF depth map as reliable points, assigning them to the corresponding pixel positions in the structured light depth map, and correcting the structured light depth map using the reliable points to finally obtain a depth image of the target object. Based on the TOF depth values, this application achieves relatively high measurement accuracy; the TOF depth values are used to correct the errors introduced into the structured light depth map by the matching calculation, yielding a high-accuracy depth map; combined with the high-resolution advantage of structured light measurement, a high-accuracy, high-resolution, low-power, miniaturized depth measurement device is realized.
Description of the drawings
In order to explain the technical solutions in the embodiments of this application or in the prior art more clearly, the drawings required in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of this application; for a person of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a schematic diagram of a fused depth measurement device according to an embodiment of this application.
Fig. 2 is a schematic diagram of the principle of a fused depth measurement device according to an embodiment of this application.
Fig. 3 is a schematic diagram of the control and processing circuit architecture of a fused depth measurement device according to an embodiment of this application.
Fig. 4 is a flowchart of a fused depth measurement method according to an embodiment of this application.
Fig. 5 is a schematic diagram of an electronic device integrating the fused depth measurement device of Fig. 1 according to an embodiment of this application.
Detailed description
In order to make the technical problems to be solved, the technical solutions, and the beneficial effects of the embodiments of this application clearer, this application is further described in detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain this application and are not intended to limit it.
It should be noted that when an element is described as being "fixed to" or "disposed on" another element, it may be directly on the other element or indirectly on the other element. When an element is described as being "connected to" another element, it may be directly connected or indirectly connected to the other element. In addition, a connection may serve either a fixing function or an electrical-circuit connection.
It should be understood that orientation or positional terms such as "length", "width", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", and "outer" are based on the orientations or positional relationships shown in the drawings, are only intended to facilitate and simplify the description of the embodiments of this application, and do not indicate or imply that the referenced device or element must have a specific orientation or be constructed and operated in a specific orientation; they therefore cannot be understood as limiting this application.
In addition, the terms "first" and "second" are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the number of the indicated technical features. Thus, a feature defined with "first" or "second" may explicitly or implicitly include one or more of such features. In the description of the embodiments of this application, "plurality" means two or more, unless otherwise specifically defined.
Referring to Fig. 1, Fig. 1 is a schematic diagram of a fused depth measurement apparatus according to an embodiment of the present application. The fused depth measurement apparatus 10 includes a transmitting module 11, a receiving module 12, and a control and processing circuit 13 connected to the transmitting module 11 and the receiving module 12, respectively. The transmitting module 11 is configured to emit a light beam 30 toward a target object 20; the emitted beam 30 is a spot-pattern beam whose amplitude is modulated over time, and it is projected into the target space to illuminate the target object 20. At least part of the emitted beam 30 is reflected by the target object 20 to form a reflected beam 40, and at least part of the reflected beam 40 is received by the receiving module 12. The control and processing circuit 13 controls the emission and reception of the beams, receives the information generated by the receiving module 12 when it receives the reflected beam, and processes this information to obtain the depth information of the target object.
The transmitting module 11 includes a light source array 111, an optical element 112, and a light source driver (not shown in the figure). The light source array 111 may be composed of multiple light sources such as light-emitting diodes (LEDs), edge-emitting lasers (EELs), or vertical-cavity surface-emitting lasers (VCSELs), and the emitted beams may be visible, infrared, or ultraviolet light. Preferably, the light source array 111 is an irregularly arranged VCSEL array used to emit an irregular spot-pattern beam. Under the control of the light source driver (which may in turn be controlled by the control and processing circuit 13), the amplitude of the light source array 111 is modulated with a certain timing before the beam is emitted; for example, in one embodiment the light source array 111 emits pulse-modulated, square-wave-modulated, or sine-wave-modulated beams at a certain frequency under the control of the light source driver. In one embodiment of the present application, the amplitude of the beam corresponding to each spot in the irregular spot-pattern beam is modulated over time as a continuous wave, a square wave, or pulses. It can be understood that a part of the control and processing circuit 13, or a sub-circuit independent of it such as a pulse signal generator, may be used to control the light source array 111 to emit the corresponding beams.
The optical element 112 receives the beam from the light source array 111 and emits the spot-pattern beam outward. In some embodiments, the optical element 112 also expands the received beam to enlarge the field of view of the measurement apparatus. It can be understood that the amplitude of the beam shaped by the optical element 112 is still modulated with the same timing; for example, an incident sine-wave-modulated beam leaves the optical element as a sine-wave-modulated beam. The optical element 112 may be one of, or a combination of, a lens, a diffractive optical element (DOE), a microlens array, and a liquid crystal element.
The receiving module 12 includes a TOF image sensor 121, a filter unit 122, and a lens unit 123. The lens unit 123 receives at least part of the spot-pattern beam reflected by the target object and images it onto at least part of the TOF image sensor 121. The filter unit 122 is a narrow-band filter matched to the wavelength of the light source and is used to suppress background light noise in the remaining bands. The TOF image sensor 121 may be an image sensor array composed of charge-coupled devices (CCDs), complementary metal-oxide semiconductors (CMOS), avalanche diodes (ADs), single-photon avalanche diodes (SPADs), or the like; the array size determines the resolution of the depth camera, for example 320×240. Generally, the image sensor 121 is also connected to a readout circuit (not shown in the figure) composed of one or more of a signal amplifier, a time-to-digital converter (TDC), an analog-to-digital converter (ADC), and other devices.
Generally, the TOF image sensor 121 includes at least one pixel. Compared with a conventional image sensor used only for photography, each pixel of the TOF image sensor 121 here contains two or more taps (a tap stores and reads out, or discharges, the charge signal generated by incident photons under the control of the corresponding electrode), for example two taps. Within a single frame period (or a single exposure time), the taps are switched in a certain order to collect the corresponding photons, receiving the optical signal and converting it into an electrical signal.
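As a rough illustration of this multi-tap acquisition, the sketch below simulates the tap accumulations of a single indirect-TOF pixel. It is a minimal model written for this description, not code from the publication: the 20 MHz modulation frequency, the sinusoidal return-signal shape, and the four equal quarter-period tap windows are all assumptions.

```python
import numpy as np

def simulate_four_tap_pixel(distance_m, mod_freq_hz=20e6, amplitude=1.0,
                            ambient=0.2, n_samples=4096):
    """Simulate the tap accumulations C1..C4 of one indirect-TOF pixel.

    The return signal is modelled as a sinusoid delayed by the round-trip
    flight time; each tap integrates the signal over one quarter of the
    modulation period (0, 90, 180 and 270 degree windows).
    """
    c = 299_792_458.0                       # speed of light, m/s
    phase_delay = 2.0 * np.pi * mod_freq_hz * (2.0 * distance_m / c)

    t = np.linspace(0.0, 1.0 / mod_freq_hz, n_samples, endpoint=False)
    signal = ambient + amplitude * (0.5 + 0.5 * np.cos(
        2.0 * np.pi * mod_freq_hz * t - phase_delay))

    taps = []
    for k in range(4):                      # switch through the four tap windows in order
        window = ((t * mod_freq_hz * 4.0).astype(int) % 4) == k
        taps.append(float(signal[window].sum()))
    return taps                             # [C1, C2, C3, C4]
```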
The control and processing circuit 13 may be an independent dedicated circuit, such as a dedicated SoC chip, an FPGA chip, or an ASIC chip containing a CPU, memory, a bus, and the like, or it may include a general-purpose processing circuit; for example, when the depth measurement apparatus is integrated into a smart terminal such as a mobile phone, a television, or a computer, the processing circuit in the smart terminal can serve as at least part of the control and processing circuit 13.
The control and processing circuit 13 provides the modulation signal (emission signal) required when the light source array 111 emits laser light, and the light sources emit beams toward the target object under the control of this modulation signal. For example, in one embodiment the modulation signal is a square-wave or pulse signal; under this modulation the amplitude of the light sources is modulated over time so that a square-wave or pulse beam is emitted.
The control and processing circuit 13 also provides the demodulation signal (acquisition signal) for each tap in each pixel of the TOF image sensor 121; under the control of the demodulation signal, the taps collect the electrical signal generated by the reflected beam returned from the target object. On the one hand, the control and processing circuit 13 processes this electrical signal and computes intensity information reflecting the strength of the reflected beam to form a structured light pattern, and then performs matching calculation and triangulation based on that pattern to obtain the structured light depth image of the target object. At the same time, the control and processing circuit 13 processes the electrical signal to compute the phase difference between the emitted and received beams, calculates the time of flight of the beam from the phase difference, and thereby obtains the TOF depth image of the target object. Further, the control and processing circuit 13 can correct the structured light depth image according to the TOF depth image: for example, the depth values in the TOF depth map can be taken as reliable points and assigned to the corresponding pixel positions in the structured light depth map, and the structured light depth map is corrected using these reliable points. A more specific correction method is described later.
In some embodiments, the depth measurement apparatus 10 may also include a drive circuit, a power supply, a color camera, an infrared camera, an IMU, and other devices that are not shown in the figure; combining these devices enables richer functions such as 3D texture modeling, infrared face recognition, and SLAM. The depth measurement apparatus 10 can be embedded in electronic products such as mobile phones, tablet computers, and computers.
Fig. 2 is a schematic diagram of the operating principle of the depth measurement apparatus according to an embodiment of the present application. The control and processing circuit 13 controls the light source array 111 to emit, toward the target object, a spot-pattern beam 301 whose amplitude is modulated by a square wave or pulses; the amplitude of each spot 302 is modulated over time by the square wave or pulses. It can be understood that the light sources in the light source array 111 are modulated in the same manner; in some other embodiments, a sine wave may also be used to modulate the amplitude of the emitted beam.
In one embodiment, each pixel of the TOF image sensor in the receiving module 12 includes four taps, which collect the optical signal four times within a single frame period and convert it into the electrical signals C1, C2, C3, and C4; the duration of and interval between the four acquisitions are the same.
The control and processing circuit 13 receives the electrical signals C1, C2, C3, and C4 and computes the intensity information of the spot-pattern beam. In one embodiment, the intensity information is calculated according to the following formula:
(Intensity formula; published as image PCTCN2020077862-appb-000001 in the original document.)
After the intensity information of all pixels has been obtained, the structured light pattern can be formed; matching calculation is then performed on the structured light pattern to obtain the disparity, and the structured light depth image is calculated from the disparity.
When an ambient light signal is present, this way of computing the beam intensity, like the conventional approach, cannot remove it, so the resulting grayscale pattern has a low signal-to-noise ratio. Therefore, in one embodiment, the intensity information is calculated according to the following formula:
(Ambient-suppressing intensity formula; published as image PCTCN2020077862-appb-000002 in the original document.)
The structured light pattern is generated from the computed intensity information of the spot-pattern beam; matching calculation is then performed on the structured light pattern to obtain the disparity, and the structured light depth image is calculated from the disparity.
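The two intensity expressions above are published as images in the original document, so their exact form is not reproducible here. The sketch below instead uses the standard four-phase relations as a stand-in: a plain sum of the four tap values, and a differential amplitude in which a constant ambient contribution cancels. Treat both expressions as illustrative assumptions rather than the patent's own formulas.

```python
import numpy as np

def intensity_simple(c1, c2, c3, c4):
    """Plain intensity: the sum of the four tap accumulations (keeps ambient light)."""
    return c1 + c2 + c3 + c4

def intensity_ambient_suppressed(c1, c2, c3, c4):
    """Differential amplitude: a constant ambient offset cancels in the tap differences."""
    return 0.5 * np.sqrt((c1 - c3) ** 2 + (c2 - c4) ** 2)

def structured_light_pattern(tap_frames):
    """Build the speckle intensity image used for structured-light matching.

    tap_frames: array of shape (4, H, W) with the C1..C4 accumulations of
    every pixel for one exposure.
    """
    c1, c2, c3, c4 = tap_frames
    return intensity_ambient_suppressed(c1, c2, c3, c4)
```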
The above structured light acquisition scheme, based on a 4-tap TOF image sensor and a square-wave or pulse-modulated emission signal, also applies to TOF image sensors with other tap counts and to depth measurement apparatuses with other types of modulated emission signals. It can be understood that, compared with conventional structured light depth measurement, the present application has the transmitting end emit a time-modulated speckle projection beam and the receiving end use multi-tap pixel acquisition, so the method offers more capabilities than the conventional scheme; for example, it enables a depth measurement method that resists ambient interference, which is difficult for conventional schemes.
At the same time, the control and processing circuit 13 also receives the electrical signals output by the TOF image sensor, generated as each tap, under the control of the demodulation signal, collects the beam reflected back by the target object; it computes the phase difference of the reflected beam, calculates from this phase difference the time of flight of the beam from the transmitting end to the receiving end, and further computes the TOF depth image of the target object from that time of flight.
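A minimal sketch of this phase-based branch, again assuming the standard four-phase demodulation and an illustrative 20 MHz modulation frequency; the publication does not specify either.

```python
import numpy as np

def tof_depth_from_taps(tap_frames, mod_freq_hz=20e6):
    """Per-pixel TOF depth from the four tap frames (array of shape (4, H, W))."""
    c = 299_792_458.0                                           # speed of light, m/s
    c1, c2, c3, c4 = tap_frames
    phase = np.mod(np.arctan2(c2 - c4, c1 - c3), 2.0 * np.pi)   # phase delay of the return
    t_flight = phase / (2.0 * np.pi * mod_freq_hz)              # round-trip flight time
    return 0.5 * c * t_flight                                   # one-way distance = depth
```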
Fig. 3 is a schematic diagram of the control and processing circuit architecture according to an embodiment of the present application. The control and processing circuit 13 includes a phase calculation module 131 and an intensity calculation module 133. The output of the phase calculation module 131 is connected to a calibration module 132; the output of the intensity calculation module 133 is connected to a preprocessing module 134, whose output is connected to a matching module 135. The inputs of the calibration module 132 and the matching module 135 are also connected to a memory 137, and their outputs are connected to a correction module 136.
The control and processing circuit 13 receives the electrical signals from the TOF image sensor. In the embodiment of the present application, the transmitting module 11 emits an amplitude-modulated irregular spot-pattern beam toward the target area, and the receiving module 12 receives the spot-pattern beam reflected by the target object. For each pixel of the TOF image sensor that receives the reflected beam, the electrical signal it generates is transmitted simultaneously to the phase calculation module 131 and the intensity calculation module 133; the phase information and intensity information corresponding to that pixel are obtained by processing and calculation, and from them the TOF depth map and the structured light depth map corresponding to the pixel, which have a corresponding positional relationship, are further obtained. The depth values in the TOF depth map are then taken as reliable points, assigned to the corresponding pixel positions in the structured light depth map, and the structured light depth map is corrected using these reliable points. Specifically, this includes:
(1) Calculating the TOF depth map
After the electrical signal generated by the TOF image sensor 121 is transmitted to the phase calculation module 131, the phase calculation module 131 processes the electrical signal to obtain the phase difference; the time of flight of the beam from emission to reception can be calculated from the phase difference, and the TOF depth image of the target object is then obtained. Because there is a linear relationship between the phase difference and the depth value, in some embodiments the phase calculation module 131 can compute the TOF depth map directly. The depth map is then sent to the calibration module 132 for calibration. Since TOF measurements are often disturbed by noise, there is a certain error between the measured value and the actual value, so a calibration step is carried out before actual use. For example, calibration boards are placed at intervals within a certain measurement range, with the actual depth of each board known; the boards at the different distances are then measured one by one to obtain the measured value corresponding to each distance. The relationship between measured and actual values is stored in the memory 137 as pre-calibration parameters, and during calibration the calibration module retrieves these pre-calibration parameters from the memory 137 to calibrate the current measurement. The pre-calibration parameters may be a lookup table (index) mapping measured values to actual values, in which case the calibration performed by the calibration module 132 is effectively a table lookup; alternatively, the error may be modeled mathematically and the unknown parameters of the model computed from multiple measurements made in advance, in which case the calibration performed by the calibration module 132 is the process of computing the actual value from the model and the measured value. After calibration, an accurate TOF depth map is obtained.
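A minimal sketch of the lookup-table variant of this calibration step; the board distances, the measured values, and the use of linear interpolation between table entries are assumptions made for illustration (the publication also allows a fitted parametric error model instead).

```python
import numpy as np

def calibrate_tof(raw_depth, lut_measured, lut_actual):
    """Correct a raw TOF depth map with pre-calibration lookup-table parameters.

    lut_measured / lut_actual: measured and ground-truth depths of a calibration
    board placed at several known distances (1-D arrays, ascending); values
    between table entries are linearly interpolated.
    """
    return np.interp(raw_depth, lut_measured, lut_actual)

# Hypothetical usage: boards at 0.5 m steps, measured values read back from memory 137
lut_actual = np.array([0.5, 1.0, 1.5, 2.0, 2.5])
lut_measured = np.array([0.52, 1.03, 1.55, 2.08, 2.61])
raw = np.array([[1.04, 2.07], [0.53, 1.56]])
print(calibrate_tof(raw, lut_measured, lut_actual))
```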
(2) Calculating the structured light depth map
After the electrical signal generated by the TOF image sensor 121 is transmitted to the intensity calculation module 133, the intensity calculation module 133 computes, from the electrical signal, the intensity information reflecting the strength of the beam, forming a structured light image. The structured light image is then sent to the preprocessing module 134 for denoising, contrast enhancement, and other processing; preprocessing tasks such as image distortion correction can also be performed on the structured light image. The preprocessed image then enters the matching module 135 for matching calculation, during which the matching module 135 retrieves a pre-stored reference image from the memory 137. In one embodiment, the matching module 135 uses a zero-mean normalized least-squares distance function to estimate the pixel offset between the structured light image and the reference image. According to the structured light triangulation principle, there is a defined relationship between the pixel offset and the depth of the target; therefore, the matching module 135 can directly compute the depth values and obtain the structured light depth image of the target under measurement. In some embodiments, the depth image may also be post-processed, for example with image enhancement or interpolation, for optimization such as hole filling and edge refinement.
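The sketch below illustrates this matching step with a brute-force zero-mean normalized window search along the row direction followed by a reference-plane triangulation. The window size, search range, focal length, baseline, and reference distance are illustrative assumptions, and the exact cost function and triangulation model used in the publication may differ.

```python
import numpy as np

def znssd(p, q):
    """Zero-mean normalized sum of squared differences (lower is better)."""
    a = p - p.mean()
    b = q - q.mean()
    a = a / (np.linalg.norm(a) + 1e-12)
    b = b / (np.linalg.norm(b) + 1e-12)
    return float(((a - b) ** 2).sum())

def match_disparity(sl_image, ref_image, y, x, win=7, max_disp=48):
    """Find the horizontal offset of the reference window that best matches
    the window around (y, x) in the live structured-light image."""
    h = win // 2
    patch = sl_image[y - h:y + h + 1, x - h:x + h + 1]
    best_d, best_cost = 0, np.inf
    for d in range(-max_disp, max_disp + 1):
        xr = x + d
        if xr - h < 0 or xr + h + 1 > ref_image.shape[1]:
            continue
        cost = znssd(patch, ref_image[y - h:y + h + 1, xr - h:xr + h + 1])
        if cost < best_cost:
            best_cost, best_d = cost, d
    return best_d

def depth_from_disparity(d_pixels, f_pixels=580.0, baseline_m=0.05, z_ref_m=1.0):
    """Triangulate depth from the pixel offset against a reference plane at z_ref."""
    return (f_pixels * baseline_m * z_ref_m) / (f_pixels * baseline_m + d_pixels * z_ref_m)
```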
(3) Correcting the structured light depth map
Because structured light depth calculation requires matching calculation, which consumes considerable resources in the overall depth pipeline and strongly affects accuracy, the resulting structured light depth map contains errors. The TOF depth map, in contrast, is obtained directly from the phase difference, so its measurement accuracy is relatively high; however, in the TOF calculation mode only the depth values of the pixels onto which the reflected beam falls can be obtained, not the depth values of all pixels. Therefore, in the embodiment of the present application, the depth values in the TOF depth map are extracted and used as reliable points to correct the structured light depth map.
Specifically, the TOF depth map obtained by the calibration module 132 and the structured light depth map obtained by the matching module 135 are input to the correction module 136, and a mapping between the TOF depth map and the structured light depth map is established so that the TOF depth value computed from the electrical signal of each pixel corresponds to the structured light depth value computed for that pixel. The TOF depth values are assigned to the corresponding pixel coordinates in the structured light depth map; the points that receive such assignments are called reliable points, and the points without assignments are called unreliable points. The unreliable points in the structured light depth map are then corrected on the basis of the reliable points to improve the accuracy of the structured light depth map. The correction method may include a region growing method, a confidence-weighting method, and the like.
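A compact sketch of this assignment step: since both maps come from the same pixel array, the mapping reduces to a per-pixel overwrite wherever the TOF value is valid. The validity mask is an assumption standing in for "pixels onto which the reflected speckle actually fell".

```python
import numpy as np

def seed_reliable_points(sl_depth, tof_depth, tof_valid_mask):
    """Assign TOF depths into the structured-light depth map as reliable points."""
    fused = sl_depth.copy()
    fused[tof_valid_mask] = tof_depth[tof_valid_mask]
    reliable = tof_valid_mask.copy()        # True = reliable (TOF-assigned) point
    return fused, reliable
```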
In one embodiment, the region growing method is used to correct the structured light depth map. Region growing is based on the assumption that, because objects are continuous, the depth values of two nearby points on an object are approximately equal. Multiple reliable points are selected as seed points within the region to be predicted; by setting a suitable growth criterion and using four-connectivity (up, down, left, right), unreliable points that satisfy the association are connected with the reliable points into the same region, forming a new region.
In the embodiment of the present application, the reliable points assigned with TOF depth values are called seed points, a suitable growth criterion is set, and a first-in, first-out region q is defined in which the depth values of all pixels equal the depth value of the seed point. For example, a seed point p with pixel coordinates (x0, y0) and depth value d is selected, and the association between the depth values at the four neighboring pixel coordinates (x0, y0-1), (x0, y0+1), (x0-1, y0), (x0+1, y0) and the seed point p is predicted. Suppose the growth criterion requires the absolute difference between the depth values of two adjacent pixels to be less than a user-defined threshold T. If the association between one or more of the unreliable points and the seed point p satisfies the growth criterion, the unreliable points that satisfy the condition are merged with the seed point into the region q containing the seed point and become reliable points, and the newly obtained reliable points serve as seed points for continued prediction. If the association between point p and the depth values at the four neighboring pixel coordinates does not satisfy the growth criterion, region growing terminates. After this correction, the measurement accuracy of the depth map is effectively improved.
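A minimal region-growing sketch following the description above: reliable points seed a FIFO queue, and a 4-connected neighbour inherits the seed's depth and becomes a new seed when the depth difference is below the threshold T. The 2 cm threshold is an illustrative value, not one taken from the publication.

```python
import numpy as np
from collections import deque

def region_grow_correct(depth, reliable, threshold=0.02):
    """Grow TOF-seeded reliable regions into 4-connected unreliable neighbours."""
    h, w = depth.shape
    corrected = depth.copy()
    reliable = reliable.copy()
    queue = deque(zip(*np.nonzero(reliable)))              # FIFO of seed pixels
    while queue:
        y, x = queue.popleft()
        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):  # up, down, left, right
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and not reliable[ny, nx]:
                if abs(corrected[ny, nx] - corrected[y, x]) < threshold:
                    corrected[ny, nx] = corrected[y, x]    # inherit the seed depth
                    reliable[ny, nx] = True                # merged point becomes a new seed
                    queue.append((ny, nx))
    return corrected
```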
It can be understood that the above embodiment is only one specific embodiment and does not specifically limit the correction method of the present application; any correction method that falls within the design concept of the present application falls within the scope of this disclosure.
Based on the fused depth measurement apparatus of the above embodiments, the present application also proposes a fused depth measurement method.
Referring to Fig. 4, Fig. 4 is a flowchart of a fused depth measurement method according to another embodiment of the present application, which includes the following steps:
S1. Control the light source array to emit a beam whose amplitude is modulated over time; after receiving the beam, the optical element emits a spot-pattern beam toward the target object.
S2. Receive, through the pixel array of the TOF image sensor, the spot-pattern beam reflected by the target object and form an electrical signal.
S3. Receive the electrical signal and compute the phase difference, and calculate the TOF depth map of the target object from the phase difference; receive the electrical signal to form a structured light pattern, and use the structured light pattern to calculate the structured light depth map of the target object; and take the depth values in the TOF depth map as reliable points, assign them to the corresponding pixel positions in the structured light depth map, and correct the structured light depth map using the reliable points to finally obtain the depth image of the target object.
Specifically, the light source array emits the spot-pattern beam toward the target area, and the control circuit modulates the amplitude of the beam corresponding to each spot in the spot-pattern beam over time in at least one of the continuous-wave, square-wave, or pulse modes.
Specifically, the TOF image sensor includes at least one pixel, and each pixel includes two or more taps; preferably, each pixel includes four taps. Within a single frame period (or a single exposure time), the taps are switched in a certain order to collect the corresponding photons, receiving the optical signal and converting it into an electrical signal.
Specifically, the control circuit receives the electrical signals input by the pixel array and performs phase calculation and intensity calculation separately; the TOF depth map of the target area under measurement is obtained through the phase calculation, and the structured light depth map of the target area is obtained through the intensity calculation. The TOF depth values are assigned to the corresponding pixels in the structured light depth map, thereby dividing the points on the structured light depth map into reliable points (carrying TOF depth values) and unreliable points, and the unreliable points are corrected from the reliable points in combination with a correction algorithm.
In one embodiment, the reliable points serve as seed points, and the unreliable points are optimized in combination with a region growing algorithm, so that the structured light depth map is corrected and a depth image of higher accuracy is obtained.
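Putting the pieces together, the sketch below strings the helper functions from the earlier sketches into one frame of the fused pipeline. It is only a composition of those illustrative pieces, with the same assumed parameters, not the publication's implementation, and the per-pixel matching loop is written for clarity rather than speed.

```python
import numpy as np

def fused_depth_frame(tap_frames, ref_image, tof_valid_mask,
                      lut_measured=None, lut_actual=None):
    """One frame of the fused pipeline, built from the earlier helper sketches."""
    # TOF branch: phase -> flight time -> depth, then optional LUT calibration
    tof_depth = tof_depth_from_taps(tap_frames)
    if lut_measured is not None and lut_actual is not None:
        tof_depth = calibrate_tof(tof_depth, lut_measured, lut_actual)

    # Structured-light branch: intensity image -> window matching -> triangulation
    sl_image = structured_light_pattern(tap_frames)
    h, w = sl_image.shape
    sl_depth = np.zeros((h, w))
    for y in range(8, h - 8):
        for x in range(8, w - 8):
            d = match_disparity(sl_image, ref_image, y, x)
            sl_depth[y, x] = depth_from_disparity(d)

    # Fusion: TOF values become reliable seeds, region growing corrects the rest
    fused, reliable = seed_reliable_points(sl_depth, tof_depth, tof_valid_mask)
    return region_grow_correct(fused, reliable)
```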
The present application proposes a fused depth measurement apparatus and measurement method composed of a structured light transmitting end with amplitude timing modulation and a TOF image sensor. By projecting a spot-patterned beam toward the target area, the TOF depth values and the structured light depth values are computed separately. Since the TOF depth values have relatively high measurement accuracy, they are used to correct the errors introduced into the structured light depth map by the matching calculation, yielding a high-precision depth map; combined with the high-resolution advantage of structured light measurement, this realizes a high-precision, high-resolution, low-power, miniaturized depth measurement solution.
As yet another embodiment of the present application, an electronic device is also provided. The electronic device may be a mobile phone, a tablet, a computer, a television, a smart helmet, smart glasses, a robot, or the like. Referring to Fig. 5 and taking a mobile phone as an example, the electronic device 500 includes a housing 51, a screen 52, and the fused depth measurement apparatus described in the foregoing embodiments; the screen 52 is used for displaying information, and the housing 51 provides the electronic device with protection such as dustproofing, waterproofing, and drop protection.
Specifically, the transmitting module 11 and the receiving module 12 of the fused depth measurement apparatus are arranged on a first plane of the electronic device 500 and are used to emit toward the target object a spot-pattern beam whose amplitude is modulated over time and to receive the spot-pattern beam reflected by the target object. The screen 52 is mounted on a second plane of the electronic device and is used to display information such as images or text; the first plane and the second plane are the same plane, or the first plane and the second plane are opposite planes. In some embodiments, the control and processing circuit of the fused depth measurement apparatus can be shared with the electronic device; and in some embodiments, when the electronic device itself is provided with a TOF image sensor, the receiving module of the fused depth measurement apparatus can also share that TOF image sensor.
By integrating the fused depth measurement apparatus into electronic devices such as mobile phones, tablets, computers, televisions, smart helmets, smart glasses, and robots, the functions of these devices are continually expanded and their applications become increasingly wide. For example, depth measurement of a target object can be performed to obtain a higher-precision depth image containing the target, and functions such as three-dimensional reconstruction, face recognition, and human-computer interaction can then be realized on the basis of this high-precision depth image.
It can be understood that the above content is a further detailed description of the present application in combination with specific/preferred implementations, and the specific implementation of the present application cannot be considered limited to these descriptions. Those of ordinary skill in the art to which this application belongs may make several substitutions or modifications to the described implementations without departing from the concept of this application, and such substitutions or modifications shall all be regarded as falling within the protection scope of this application. In the description of this specification, references to the terms "one embodiment", "some embodiments", "preferred embodiment", "example", "specific example", or "some examples" mean that specific features, structures, materials, or characteristics described in connection with that embodiment or example are included in at least one embodiment or example of the present application.
In this specification, schematic references to the above terms do not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials, or characteristics described may be combined in a suitable manner in any one or more embodiments or examples. In addition, those skilled in the art may combine the different embodiments or examples described in this specification, and the features of different embodiments or examples, provided they do not contradict each other. Although the embodiments of the present application and their advantages have been described in detail, it should be understood that various changes, substitutions, and alterations can be made herein without departing from the scope defined by the appended claims.
Furthermore, the scope of the present application is not intended to be limited to the specific embodiments of the processes, machines, manufacture, compositions of matter, means, methods, and steps described in the specification. A person of ordinary skill in the art will readily appreciate that presently existing or later-developed processes, machines, manufacture, compositions of matter, means, methods, or steps that perform substantially the same function or achieve substantially the same result as the corresponding embodiments described herein may be used in accordance with the present disclosure. Accordingly, the appended claims are intended to include such processes, machines, manufacture, compositions of matter, means, methods, or steps within their scope.

Claims (10)

  1. A fused depth measurement apparatus, characterized in that it comprises a transmitting module, a receiving module, and a control and processing circuit connected to the transmitting module and the receiving module, respectively; wherein,
    the transmitting module includes a light source array and an optical element, the light source array is configured to emit a beam whose amplitude is modulated over time, and the optical element, after receiving the beam, emits a spot-pattern beam toward a target object;
    the receiving module includes a TOF image sensor, the TOF image sensor includes a pixel array, and the pixel array receives the spot-pattern beam reflected by the target object and forms an electrical signal;
    the control and processing circuit receives the electrical signal and computes a phase difference, and uses the phase difference to calculate a TOF depth map of the target object; receives the electrical signal and computes a structured light pattern, and uses the structured light pattern to calculate a structured light depth map of the target object; and takes the depth values in the TOF depth map as reliable points, assigns them to the corresponding pixel positions in the structured light depth map, and corrects the structured light depth map using the reliable points to finally obtain a depth image of the target object.
  2. The fused depth measurement apparatus of claim 1, characterized in that the TOF image sensor includes at least one pixel, wherein each pixel includes two or more taps.
  3. The fused depth measurement apparatus of claim 2, characterized in that the control and processing circuit provides a demodulation signal for each tap in each pixel of the TOF image sensor, and under the control of the demodulation signal the taps collect the electrical signal generated by the reflected beam returned from the target object.
  4. The fused depth measurement apparatus of claim 1, characterized in that the control and processing circuit includes a phase calculation module and an intensity calculation module; the electrical signal generated by the TOF image sensor is transmitted simultaneously to the phase calculation module and the intensity calculation module, the phase information and intensity information corresponding to the pixel are obtained through processing and calculation, and the TOF depth map and the structured light depth map corresponding to the pixel are further obtained from the phase information and the intensity information.
  5. The fused depth measurement apparatus of claim 4, characterized in that the control and processing circuit further includes a calibration module, a matching module, and a correction module; wherein the TOF depth map obtained by the calibration module and the structured light depth map obtained by the matching module are input to the correction module, and a mapping between the TOF depth map and the structured light depth map is established so that the TOF depth value calculated from the electrical signal generated by each pixel corresponds to the calculated structured light depth value; the TOF depth values are assigned to the corresponding pixel coordinates in the structured light depth map, the assigned points serve as reliable points, and the unassigned points in the structured light depth map are corrected.
  6. A fused depth measurement method, characterized in that it comprises the following steps:
    S1. controlling a light source array to emit a beam whose amplitude is modulated over time, wherein an optical element, after receiving the beam, emits a spot-pattern beam toward a target object;
    S2. receiving, through a pixel array in a TOF image sensor, the spot-pattern beam reflected by the target object and forming an electrical signal;
    S3. receiving the electrical signal and computing a phase difference, and calculating a TOF depth map of the target object based on the phase difference; receiving the electrical signal to form a structured light pattern, and using the structured light pattern to calculate a structured light depth map of the target object; and taking the depth values in the TOF depth map as reliable points, assigning them to the corresponding pixel positions in the structured light depth map, and correcting the structured light depth map using the reliable points to finally obtain a depth image of the target object.
  7. The fused depth measurement method of claim 6, characterized in that, in step S1, the light source array emits the spot-pattern beam toward the target area, and the control and processing circuit modulates the amplitude of the beam corresponding to each spot in the spot-pattern beam over time in at least one of the continuous-wave, square-wave, or pulse modes.
  8. The fused depth measurement method of claim 6, characterized in that, in step S2, the TOF image sensor includes at least one pixel, each pixel includes two or more taps, and within a single frame period the taps are switched in a certain order to collect the corresponding photons, receiving the optical signal and converting it into an electrical signal.
  9. The fused depth measurement method of claim 6, characterized in that, in step S3, the control and processing circuit receives the electrical signals input by the pixel array and performs phase calculation and intensity calculation separately to obtain the TOF depth map and the structured light depth map of the target object; the TOF depth values are assigned to the corresponding pixels in the structured light depth map, thereby dividing the points on the structured light depth map into reliable points and unreliable points; and the unreliable points are corrected from the reliable points in combination with a correction algorithm to obtain the depth image of the target object.
  10. An electronic device, comprising: a housing, a screen, and a fused depth measurement apparatus; the fused depth measurement apparatus includes a transmitting module, a receiving module, and a control and processing circuit connected to the transmitting module and the receiving module, respectively; the transmitting module includes a light source array and an optical element, the light source array is configured to emit a beam whose amplitude is modulated over time, and the optical element, after receiving the beam, emits a spot-pattern beam toward a target object; the receiving module includes a TOF image sensor, the TOF image sensor includes a pixel array, and the pixel array receives the spot-pattern beam reflected by the target object and forms an electrical signal; the control and processing circuit receives the electrical signal and computes a phase difference, and uses the phase difference to calculate a TOF depth map of the target object; receives the electrical signal and computes a structured light pattern, and uses the structured light pattern to calculate a structured light depth map of the target object; and takes the depth values in the TOF depth map as reliable points, assigns them to the corresponding pixel positions in the structured light depth map, and corrects the structured light depth map using the reliable points to finally obtain a depth image of the target object; wherein the transmitting module and the receiving module of the fused depth measurement apparatus are arranged on a first plane of the electronic device, for emitting toward the target object a spot-pattern beam whose amplitude is modulated over time and receiving the spot-pattern beam reflected by the target object; the screen is mounted on a second plane of the electronic device and is used to display information such as images or text; and the first plane and the second plane are the same plane, or the first plane and the second plane are opposite planes.
PCT/CN2020/077862 2019-12-18 2020-03-04 Fused depth measurement apparatus and measurement method WO2021120402A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911306106.3 2019-12-18
CN201911306106.3A CN111045029B (en) 2019-12-18 2019-12-18 Fused depth measuring device and measuring method

Publications (1)

Publication Number Publication Date
WO2021120402A1 true WO2021120402A1 (en) 2021-06-24

Family

ID=70237114

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/077862 WO2021120402A1 (en) 2019-12-18 2020-03-04 Fused depth measurement apparatus and measurement method

Country Status (2)

Country Link
CN (1) CN111045029B (en)
WO (1) WO2021120402A1 (en)


Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111664798B (en) * 2020-04-29 2022-08-02 奥比中光科技集团股份有限公司 Depth imaging method and device and computer readable storage medium
CN111678457B (en) * 2020-05-08 2021-10-01 西安交通大学 ToF device under OLED transparent screen and distance measuring method
CN112255639B (en) * 2020-12-23 2021-09-03 杭州蓝芯科技有限公司 Depth perception sensor and depth perception sensing module for region of interest
CN112969019B (en) * 2021-02-26 2023-08-08 深圳荆虹科技有限公司 TOF module and electronic device
CN113466884B (en) * 2021-06-30 2022-11-01 深圳市汇顶科技股份有限公司 Time-of-flight depth measurement transmitting device and electronic equipment
CN113534596B (en) * 2021-07-13 2022-09-27 盛景智能科技(嘉兴)有限公司 RGBD stereo camera and imaging method
CN114937071B (en) * 2022-07-26 2022-10-21 武汉市聚芯微电子有限责任公司 Depth measurement method, device, equipment and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150062558A1 (en) * 2013-09-05 2015-03-05 Texas Instruments Incorporated Time-of-Flight (TOF) Assisted Structured Light Imaging
CN105096259A (en) * 2014-05-09 2015-11-25 株式会社理光 Depth value restoration method and system for depth image
CN109889809A (en) * 2019-04-12 2019-06-14 深圳市光微科技有限公司 Depth camera mould group, depth camera, depth picture capturing method and depth camera mould group forming method
CN110333501A (en) * 2019-07-12 2019-10-15 深圳奥比中光科技有限公司 Depth measurement device and distance measurement method
CN110456379A (en) * 2019-07-12 2019-11-15 深圳奥比中光科技有限公司 The depth measurement device and distance measurement method of fusion
CN110471080A (en) * 2019-07-12 2019-11-19 深圳奥比中光科技有限公司 Depth measurement device based on TOF imaging sensor
CN110488240A (en) * 2019-07-12 2019-11-22 深圳奥比中光科技有限公司 Depth calculation chip architecture

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108230403A (en) * 2018-01-23 2018-06-29 北京易智能科技有限公司 A kind of obstacle detection method based on space segmentation


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115184956A (en) * 2022-09-09 2022-10-14 荣耀终端有限公司 TOF sensor system and electronic device
CN115184956B (en) * 2022-09-09 2023-01-13 荣耀终端有限公司 TOF sensor system and electronic device

Also Published As

Publication number Publication date
CN111045029A (en) 2020-04-21
CN111045029B (en) 2022-06-28


Legal Events

121: Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20903015; Country of ref document: EP; Kind code of ref document: A1)
NENP: Non-entry into the national phase (Ref country code: DE)
122: Ep: pct application non-entry in european phase (Ref document number: 20903015; Country of ref document: EP; Kind code of ref document: A1)