WO2021120402A1 - Fused depth measurement apparatus and measurement method - Google Patents

Fused depth measurement apparatus and measurement method

Info

Publication number
WO2021120402A1
WO2021120402A1 PCT/CN2020/077862 CN2020077862W
Authority
WO
WIPO (PCT)
Prior art keywords
depth
structured light
tof
depth map
target object
Prior art date
Application number
PCT/CN2020/077862
Other languages
English (en)
Chinese (zh)
Inventor
许星
Original Assignee
深圳奥比中光科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳奥比中光科技有限公司
Publication of WO2021120402A1

Links

Images

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/08 Systems determining position data of a target for measuring distance only
    • G01S17/10 Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves

Definitions

  • This application relates to the field of optical measurement technology, and in particular to a fusion depth measurement device and measurement method.
  • A depth measurement device can be used to obtain a depth image of an object, on which 3D modeling, skeleton extraction, face recognition, and similar tasks can then be performed; such devices are very widely used in 3D measurement and human-computer interaction.
  • Current depth measurement technology mainly comprises TOF (time-of-flight) ranging and structured light ranging.
  • TOF ranging achieves precise ranging by measuring the round-trip flight time of light pulses between the transmitting/receiving device and the target object, and is divided into direct ranging and indirect ranging. Indirect ranging measures the phase delay of the reflected light signal relative to the emitted light signal and derives the flight time from that phase delay. Depending on the modulation and demodulation scheme, it can be divided into continuous-wave (CW) modulation/demodulation and pulse-modulated (PM) modulation/demodulation. TOF ranging does not require complex image-processing calculations, and it maintains high accuracy over relatively long detection distances.
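  • For indirect (phase-based) TOF, the distance follows directly from the measured phase delay and the modulation frequency. The sketch below is a minimal illustration of that relationship, assuming continuous-wave modulation at frequency f_mod and an already-unwrapped phase delay in radians (the names are illustrative, not taken from the patent):

```python
import math

C_LIGHT = 299_792_458.0  # speed of light, m/s

def itof_distance(phase_delay_rad: float, f_mod_hz: float) -> float:
    """Distance from the phase delay of an indirect-TOF measurement.

    The light covers the round trip 2*d, so d = c * phi / (4 * pi * f_mod).
    The result is only unambiguous within c / (2 * f_mod).
    """
    return C_LIGHT * phase_delay_rad / (4.0 * math.pi * f_mod_hz)

# Example: a phase delay of pi/2 at 100 MHz corresponds to about 0.375 m.
print(itof_distance(math.pi / 2, 100e6))
```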
  • Structured light ranging projects a structured light beam onto objects in a scene, collects the structured light pattern formed after the beam is modulated and reflected by the objects, and finally computes depth by triangulation to obtain the depth data of the objects (a minimal disparity-to-depth sketch appears after this overview).
  • Commonly used structured light patterns include irregular speckle patterns, stripe patterns, phase-shift patterns, and so on.
  • Structured light technology achieves very high accuracy at short range and performs well in low-light environments, but it is easily disturbed by strong ambient light; by comparison, TOF technology resists interference from strong light better than structured light does.
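  • For a calibrated, rectified camera-projector pair, the triangulation mentioned above reduces to converting the measured pixel disparity into depth. A minimal sketch, assuming a focal length expressed in pixels and a known camera-projector baseline (both values here are illustrative):

```python
def structured_light_depth(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Depth from disparity in a rectified structured-light setup: Z = f * b / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Example: f = 600 px and b = 5 cm give Z = 3 m for a 10 px disparity.
print(structured_light_depth(10.0, 600.0, 0.05))
```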
  • The purpose of this application is to provide a fused depth measurement device and measurement method that address at least one of the problems described above.
  • According to one aspect, a fused depth measurement device includes a transmitting module, a receiving module, and a control and processing circuit connected to both the transmitting module and the receiving module. The transmitting module includes a light source array and an optical element, and the light source array emits a light beam whose amplitude is modulated in time.
  • The optical element receives the light beam and projects a spot-pattern beam onto a target object.
  • The receiving module includes a TOF image sensor. The TOF image sensor includes a pixel array, and the pixel array receives the spot-pattern beam reflected by the target object and forms an electrical signal.
  • The control and processing circuit receives the electrical signal and calculates the phase difference, using the phase difference to calculate the TOF depth map of the target object; it also receives the electrical signal and calculates the structured light pattern, using the structured light pattern to calculate the structured light depth map of the target object; and it takes the depth values in the TOF depth map as reliable points, assigns them to the corresponding pixel positions in the structured light depth map, and uses the reliable points to correct the structured light depth map, finally obtaining the depth image of the target object (a minimal sketch of this assignment step follows below).
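  • The assignment step can be pictured as overwriting the dense but noisier structured light depth map with the sparse but accurate TOF values wherever they exist. A minimal sketch, assuming both maps are already registered on the same pixel grid (as they are with the single shared TOF sensor described here) and that 0 marks pixels without a TOF value (both conventions are assumptions for illustration):

```python
import numpy as np

def fuse_reliable_points(sl_depth: np.ndarray, tof_depth: np.ndarray):
    """Overwrite the structured-light depth with TOF depth where a TOF value exists.

    Returns the fused map and a boolean mask of the 'reliable' pixels, which a
    later correction step (e.g. region growing) can use as seeds.
    """
    reliable = tof_depth > 0
    fused = sl_depth.copy()
    fused[reliable] = tof_depth[reliable]
    return fused, reliable
```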
  • In some embodiments, the TOF image sensor includes at least one pixel, and each pixel includes two or more taps.
  • The control and processing circuit provides a demodulation signal to each tap in each pixel of the TOF image sensor; under the control of the demodulation signal, the taps collect the electrical signal generated by the light reflected from the target object.
  • The control and processing circuit includes a phase calculation module and an intensity calculation module. The electrical signal generated by the TOF image sensor is sent to the phase calculation module and the intensity calculation module simultaneously and is processed to obtain the phase information and intensity information corresponding to each pixel; from the phase information and the intensity information, the TOF depth map and the structured light depth map corresponding to the pixels are then obtained.
  • The control and processing circuit further includes a calibration module, a matching module, and a correction module. The TOF depth map produced by the calibration module and the structured light depth map produced by the matching module are input to the correction module, which establishes a mapping between the TOF depth map and the structured light depth map so that the TOF depth value calculated from each pixel's electrical signal corresponds to the structured light depth value calculated for that pixel. The TOF depth values are assigned to the corresponding pixel coordinates in the structured light depth map, the assigned points are treated as reliable points, and correction processing is performed on the unassigned points in the structured light depth map.
  • According to another aspect, a fused depth measurement method includes the following steps:
  • Step S1: the light source array emits the spot-pattern beam toward the target area, and the control and processing circuit modulates the amplitude of the beam corresponding to each spot in time using at least one of a continuous-wave, square-wave, or pulse scheme.
  • Step S2: the TOF image sensor includes at least one pixel, and each pixel includes two or more taps; within a single frame period, the taps are switched in a defined order to collect the reflected optical signal and convert it into electrical signals.
  • Step S3: the control and processing circuit receives the electrical signals from the pixel array and performs phase calculation and intensity calculation, respectively, to obtain the TOF depth map and the structured light depth map of the target object; it assigns the TOF depth values to the corresponding pixels in the structured light depth map, thereby dividing the points of the structured light depth map into reliable points and unreliable points, and uses the reliable points together with a correction algorithm to correct the unreliable points, obtaining the depth image of the target object.
  • An electronic device includes a housing, a screen, and a fused depth measurement device.
  • The fused depth measurement device includes a transmitting module, a receiving module, and a control and processing circuit connected to both the transmitting module and the receiving module.
  • The transmitting module includes a light source array and an optical element; the light source array emits a light beam whose amplitude is modulated in time, and the optical element, after receiving the light beam, projects a spot-pattern beam onto a target object.
  • The receiving module includes a TOF image sensor; the TOF image sensor includes a pixel array, and the pixel array receives the spot-pattern beam reflected by the target object and forms an electrical signal.
  • The control and processing circuit receives the electrical signal and calculates the phase difference, using the phase difference to calculate the TOF depth map of the target object; it also receives the electrical signal and calculates the structured light pattern, using the structured light pattern to calculate the structured light depth map of the target object; and it takes the TOF depth values as reliable points, assigns them to the corresponding pixel positions in the structured light depth map, and uses them to correct the structured light depth map.
  • An embodiment of the present application provides a fused depth measurement device, including: a transmitting module for emitting toward a target object a spot-pattern beam whose amplitude is modulated in time; a receiving module for receiving the spot-pattern beam reflected by the target object and forming an electrical signal; and a control and processing circuit that receives the electrical signal, calculates the TOF depth map and the structured light pattern, takes the depth values in the TOF depth map as reliable points, assigns them to the corresponding pixel positions in the structured light depth map, and uses the reliable points to correct the structured light depth map, finally obtaining the depth image of the target object.
  • Because it is anchored to the TOF depth values, this application achieves high measurement accuracy.
  • The TOF depth values are used to correct the errors introduced by the matching calculation of the structured light depth map, yielding a high-precision depth map; combined with the high resolution of structured light measurement, this realizes a depth measurement device that is high-precision, high-resolution, low-power, and miniaturized.
  • Fig. 1 is a schematic diagram of a fusion depth measuring device according to an embodiment of the present application.
  • Fig. 2 is a schematic diagram of the principle of a fusion depth measuring device according to an embodiment of the present application.
  • Fig. 3 is a schematic diagram of the control and processing circuit architecture of a fused depth measurement device according to an embodiment of the present application.
  • Fig. 4 is a flowchart of a fusion depth measurement method according to an embodiment of the present application.
  • Fig. 5 is a schematic diagram of an electronic device integrating the fused depth measurement device of Fig. 1 according to an embodiment of the present application.
  • The term "connection" may refer to a mechanical fixing or to a circuit connection.
  • The terms "first" and "second" are used for descriptive purposes only and should not be understood as indicating or implying relative importance or implicitly indicating the number of technical features. Therefore, a feature qualified by "first" or "second" may explicitly or implicitly include one or more such features.
  • “plurality” means two or more, unless otherwise specifically defined.
  • FIG. 1 is a schematic diagram of a fusion depth measurement device according to an embodiment of the application.
  • As shown in Fig. 1, the fused depth measurement device 10 includes a transmitting module 11, a receiving module 12, and a control and processing circuit 13 connected to both the transmitting module 11 and the receiving module 12. The transmitting module 11 is used to transmit a light beam 30 toward a target object 20.
  • The light beam 30 is a spot-pattern beam whose amplitude is modulated in time.
  • The spot-pattern beam is emitted into the target space to illuminate the target object 20; at least part of the emitted light beam 30 is reflected by the target object 20 to form a reflected light beam 40.
  • At least part of the reflected light beam 40 is received by the receiving module 12. The control and processing circuit 13, connected to the transmitting module 11 and the receiving module 12, controls the transmission and reception of the beam, receives the signals generated when the receiving module 12 captures the reflected beam, and processes them to obtain the depth information of the target object.
  • The transmitting module 11 includes a light source array 111, an optical element 112, a light source driver (not shown in the figure), and the like.
  • The light source array 111 may be composed of multiple light sources such as light-emitting diodes (LED), edge-emitting lasers (EEL), or vertical-cavity surface-emitting lasers (VCSEL).
  • The light beams emitted by the light sources may be visible light, infrared light, ultraviolet light, and so on.
  • In one embodiment, the light source array 111 is an irregularly arranged VCSEL array used to emit an irregular spot-pattern beam.
  • The light source array 111 is controlled by the light source driver (which may in turn be controlled by the control and processing circuit 13) to emit light beams whose amplitude is modulated with a certain timing.
  • Under the control of the light source driver, the light source array 111 emits pulse-modulated beams, square-wave-modulated beams, sine-wave-modulated beams, or other beams at a certain frequency.
  • The amplitude of the beam corresponding to each spot in the irregular spot-pattern beam is modulated in time in a continuous-wave, square-wave, or pulse manner. It is understood that a part of the control and processing circuit 13, or a sub-circuit independent of it such as a pulse signal generator, can be used to control the light source array 111 to emit these beams.
  • The optical element 112 receives the light beam from the light source array 111 and emits the spot-pattern beam outward. In some embodiments, the optical element 112 also expands the received beam to enlarge the field of view of the measuring device. It is understood that the amplitude of the beam passing through the optical element 112 remains modulated with the same timing; that is, an incident sine-wave-modulated beam still exits as a sine-wave-modulated beam.
  • the optical element 112 may be one or a combination of a lens, a diffractive optical element (DOE), a microlens array, and a liquid crystal.
  • the receiving module 12 includes a TOF image sensor 121, a filter unit 122, and a lens unit 123.
  • The lens unit 123 receives at least part of the spot-pattern beam reflected by the target object and images it onto at least part of the TOF image sensor 121; the filter unit 122 is a narrowband filter matched to the wavelength of the light source and is used to suppress background light noise in other bands.
  • The TOF image sensor 121 can be an image sensor array composed of charge-coupled devices (CCD), complementary metal-oxide-semiconductor devices (CMOS), avalanche diodes (AD), single-photon avalanche diodes (SPAD), or the like.
  • The size of the array determines the resolution of the depth camera, for example 320 × 240.
  • A readout circuit (not shown in the figure) connected to the image sensor 121 includes one or more of a signal amplifier, a time-to-digital converter (TDC), an analog-to-digital converter (ADC), and other devices.
  • The TOF image sensor 121 includes at least one pixel. Unlike a traditional image sensor used only for photography, each pixel here includes two or more taps (electrodes that, under control, store and read out, or discharge, the charge signal generated by incident photons); for example, a pixel may include 2 taps, which are switched in a defined order within a single frame period (or single exposure time) to collect the reflected optical signal and convert it into electrical signals.
  • The control and processing circuit 13 can be an independent dedicated circuit, such as a dedicated SoC, FPGA, or ASIC built from a CPU, memory, a bus, and so on, or it can be a general-purpose processing circuit: for example, when the depth measurement device is integrated into a smart terminal such as a mobile phone, TV, or computer, the processing circuit in the smart terminal can serve as at least part of the control and processing circuit 13.
  • The control and processing circuit 13 provides the modulation signal (emission signal) required when the light source array 111 emits laser light; under the control of the modulation signal, the light sources emit a beam toward the target object.
  • In one embodiment, the modulation signal is a square-wave or pulse signal, and under this modulation the amplitude of the light sources is modulated in time so that a square-wave or pulse beam is emitted outward.
  • The control and processing circuit 13 also provides the demodulation signal (collection signal) for each tap in each pixel of the TOF image sensor 121; under the control of the demodulation signal, the taps collect the electrical signal generated by the beam reflected from the target object.
  • On one hand, the control and processing circuit 13 processes the electrical signal to calculate intensity information reflecting the intensity of the reflected beam and thereby forms a structured light pattern; based on the structured light pattern, it then performs matching calculation, triangulation, and so on to obtain the structured light depth image of the target object under test.
  • On the other hand, the control and processing circuit 13 processes the electrical signal to calculate the phase difference of the beam between emission and reception, calculates the flight time of the beam from the phase difference, and from that obtains the TOF depth image of the target object. Further, the control and processing circuit 13 can correct the structured light depth image according to the TOF depth image: for example, the depth values in the TOF depth map can be taken as reliable points and assigned to the corresponding pixel positions in the structured light depth map, and the structured light depth map can then be corrected using those reliable points; a more specific correction method is described later.
  • the depth measurement device 10 may also include a drive circuit, a power supply, a color camera, an infrared camera, an IMU, and other devices, which are not shown in the figure.
  • Combining these devices enables richer functions, such as 3D texture modeling, infrared face recognition, and SLAM.
  • the depth measuring device 10 may be embedded in electronic products such as mobile phones, tablet computers, and computers.
  • Fig. 2 is a schematic diagram of the principle of a depth measuring device according to an embodiment of the present application.
  • The control and processing circuit 13 controls the light source array 111 to emit toward the target object a spot-pattern beam 301 whose amplitude is square-wave or pulse modulated, so that the amplitude of each spot 302 is modulated in time by a square wave or pulse. It is understood that all light sources in the light source array 111 are modulated in the same manner. In some other embodiments, a sine wave may also be used to modulate the amplitude of the emitted beam.
  • Each pixel of the TOF image sensor in the receiving module 12 includes 4 taps, which collect the light signal 4 times within a single frame period and convert it into electrical signals C1, C2, C3, and C4; the duration and the interval of the four acquisitions are the same.
  • The control and processing circuit 13 receives the electrical signals C1, C2, C3, and C4 and calculates the intensity information of the spot-pattern beam.
  • In one embodiment, the intensity information is calculated from the tap signals C1, C2, C3, and C4 (a conventional formulation is sketched below).
  • After the intensity information of all pixels is obtained, the structured light pattern can be formed; the structured light pattern is then used for matching calculation to obtain the disparity, and the structured light depth image is computed from the disparity.
  • In another embodiment, the intensity information is calculated according to a different formula.
  • The structured light pattern is generated from the calculated intensity information of the spot-pattern beam, a matching calculation is performed on the structured light pattern to obtain the disparity, and the structured light depth image is computed from the disparity.
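  • As a point of reference only, a common 4-tap formulation (an assumption, not necessarily the formula used in this patent) takes the per-pixel intensity either as the plain sum of the four tap charges or as the demodulated modulation amplitude, the latter being less sensitive to constant background light:

```python
import numpy as np

def four_tap_intensity(c1, c2, c3, c4, use_amplitude: bool = False):
    """Per-pixel intensity for the structured light pattern from four tap charges.

    Plain sum:  I = c1 + c2 + c3 + c4
    Amplitude:  A = sqrt((c1 - c3)**2 + (c2 - c4)**2) / 2   (assumes the taps
                sample at 0, 90, 180, 270 degrees of the modulation period)
    Which variant the patent itself uses is not stated in this text.
    """
    c1, c2, c3, c4 = (np.asarray(c, dtype=float) for c in (c1, c2, c3, c4))
    if use_amplitude:
        return np.sqrt((c1 - c3) ** 2 + (c2 - c4) ** 2) / 2.0
    return c1 + c2 + c3 + c4
```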
  • The structured light acquisition scheme described above, based on a 4-tap TOF image sensor and a square-wave or pulse-modulated emission signal, is also applicable to TOF image sensors with other tap counts and to other types of depth measurement devices with modulated emission signals. It is understood that, compared with traditional structured light depth measurement, the present application emits a time-modulated speckle projection beam at the transmitting end and uses multi-tap pixel collection at the receiving end, so this method offers more capability than a traditional solution, for example resistance to environmental interference that is difficult to achieve in traditional solutions.
  • The control and processing circuit 13 also receives the output of the TOF image sensor: from the electrical signals that the taps collect from the beam reflected by the target object, it calculates the phase difference of the reflected beam, from the phase difference it calculates the flight time of the beam from the transmitting end to the receiving end, and from the flight time it calculates the TOF depth image of the target object.
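  • Again as a conventional reference (an assumption, since the patent's own expressions are not reproduced in this text), the phase difference and depth for a 4-tap acquisition are often computed as follows, assuming the taps sample at 0, 90, 180, and 270 degrees of the modulation period, in that order:

```python
import numpy as np

C_LIGHT = 299_792_458.0  # speed of light, m/s

def four_tap_phase_depth(c1, c2, c3, c4, f_mod_hz: float):
    """Conventional 4-phase continuous-wave demodulation.

    phase = atan2(c2 - c4, c1 - c3), wrapped to [0, 2*pi)
    depth = c * phase / (4 * pi * f_mod)
    """
    c1, c2, c3, c4 = (np.asarray(c, dtype=float) for c in (c1, c2, c3, c4))
    phase = np.mod(np.arctan2(c2 - c4, c1 - c3), 2 * np.pi)
    depth = C_LIGHT * phase / (4 * np.pi * f_mod_hz)
    return phase, depth
```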
  • Fig. 3 is a schematic diagram of a control and processing circuit architecture according to an embodiment of the present application.
  • As shown in Fig. 3, the control and processing circuit 13 includes a phase calculation module 131 and an intensity calculation module 133. The output of the phase calculation module 131 is connected to a calibration module 132; the output of the intensity calculation module 133 is connected to a preprocessing module 134, and the output of the preprocessing module 134 is connected to a matching module 135.
  • The inputs of the calibration module 132 and the matching module 135 are also connected to a memory 137, and their outputs are connected to a correction module 136.
  • the control and processing circuit 13 receives the electrical signal from the TOF image sensor.
  • The transmitting module 11 emits an irregular spot-pattern beam with modulated amplitude toward the target area, and the receiving module 12 receives the spot-pattern beam reflected by the target object.
  • The electrical signal generated by the TOF image sensor is transmitted to the phase calculation module 131 and the intensity calculation module 133 at the same time, and the phase information and intensity information corresponding to each pixel are obtained through processing and calculation.
  • From the phase information and the intensity information, the TOF depth map and the structured light depth map corresponding to the pixels are further obtained; the two maps have a pixel-wise positional correspondence.
  • The depth values in the TOF depth map are used as reliable points, assigned to the corresponding pixel positions in the structured light depth map, and the structured light depth map is corrected using the reliable points. Specifically, this includes the following:
  • The phase calculation module 131 processes the electrical signal to obtain the phase difference; from the phase difference, the flight time of the beam from emission to reception can be calculated, and from the flight time the TOF depth image of the target object is obtained directly by the phase calculation module 131.
  • The depth map is then sent to the calibration module 132 for calibration. Because TOF measurement is often disturbed by noise, there is a certain error between the measured value and the actual value, so a calibration step is performed before actual use. For example, calibration boards whose actual depths are known are placed at intervals within a measurement range, the boards at the different distances are measured in turn to obtain the measured value corresponding to each distance, and the relationship between the measured values and the actual values is recorded.
  • During calibration, the calibration module retrieves pre-calibration parameters from the memory 137 to calibrate the current measured values.
  • The pre-calibration parameters can be a look-up table (index) relating measured values to actual values, in which case the calibration performed by the calibration module 132 is effectively a table look-up; alternatively, the error can be modeled by some mathematical method, with the unknown parameters of the model determined from multiple measurements made in advance, in which case calibration is the process of computing actual values from the model and the measured values. After calibration, an accurate TOF depth map is obtained (a sketch of the look-up-table variant follows below).
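  • A minimal sketch of the look-up-table style of calibration described above, assuming the stored table maps measured depth to ground-truth depth and that values between table entries are linearly interpolated (the table values and the interpolation choice are illustrative assumptions):

```python
import numpy as np

# Hypothetical pre-calibration table: measured depth -> actual depth (metres).
MEASURED = np.array([0.50, 1.00, 2.00, 4.00, 8.00])
ACTUAL = np.array([0.48, 0.97, 1.98, 3.96, 7.93])

def calibrate_tof_depth(tof_depth: np.ndarray) -> np.ndarray:
    """Replace each measured TOF depth with a calibrated value by
    interpolating in the pre-calibration table (a simple 1-D look-up)."""
    return np.interp(tof_depth, MEASURED, ACTUAL)

# Example: a raw measurement of 3.0 m is corrected using the table.
print(calibrate_tof_depth(np.array([3.0])))
```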
  • After the electrical signal generated by the TOF image sensor 121 is transmitted to the intensity calculation module 133, the intensity calculation module 133 processes it to obtain intensity information reflecting the intensity of the beam, forming a structured light image. The structured light image is then sent to the preprocessing module 134 for processing such as denoising and contrast enhancement; preprocessing such as image distortion correction can also be applied to the structured light image.
  • The preprocessed image then enters the matching module 135 for matching calculation. During the matching calculation, the matching module 135 retrieves the pre-stored reference image from the memory 137. In one embodiment, the matching module 135 uses a zero-mean normalized least-squares distance function to estimate the pixel offset (disparity) between the structured light image and the reference image.
  • From this offset, the matching module 135 can directly calculate the depth values to obtain the structured light depth image of the target under test (a sketch of such a matching step follows below).
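  • A minimal sketch of window matching against the reference image with a zero-mean, normalized cost, in the spirit of the "zero-mean normalized least-squares distance" mentioned above (the window size, search range, and exact cost are illustrative assumptions):

```python
import numpy as np

def znssd(a: np.ndarray, b: np.ndarray) -> float:
    """Zero-mean, normalized sum of squared differences between two patches."""
    a = a - a.mean()
    b = b - b.mean()
    a = a / (np.linalg.norm(a) + 1e-12)
    b = b / (np.linalg.norm(b) + 1e-12)
    return float(np.sum((a - b) ** 2))

def match_disparity(img: np.ndarray, ref: np.ndarray, y: int, x: int,
                    win: int = 5, max_disp: int = 32) -> int:
    """Horizontal pixel offset of the window centred at (y, x) in the captured
    structured-light image relative to the reference image (best-cost search)."""
    h = win // 2
    patch = img[y - h:y + h + 1, x - h:x + h + 1]
    costs = []
    for d in range(max_disp):
        xr = x - d
        if xr - h < 0:
            break
        costs.append(znssd(patch, ref[y - h:y + h + 1, xr - h:xr + h + 1]))
    return int(np.argmin(costs)) if costs else 0
```

  • The disparity found this way can then be converted to depth by the triangulation relation sketched earlier (Z = f * b / d).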
  • Since the TOF depth map is obtained directly from the phase-difference calculation, its measurement accuracy is relatively high; however, in the TOF calculation mode only the pixels that the reflected spot beam actually reaches yield depth values, so depth values cannot be obtained for all pixels. Therefore, in the embodiment of the present application, the depth values extracted from the TOF depth map are used as reliable points to correct the structured light depth map.
  • The TOF depth map produced by the calibration module 132 and the structured light depth map produced by the matching module 135 are input to the correction module 136, which establishes a mapping between the TOF depth map and the structured light depth map so that the TOF depth value calculated from the electrical signal of each pixel corresponds to the structured light depth value calculated for that pixel.
  • The TOF depth values are assigned to the corresponding pixel coordinates in the structured light depth map. The assigned points are called reliable points and the unassigned points are called unreliable points; based on the reliable points, the unreliable points in the structured light depth map are corrected to improve the accuracy of the structured light depth map.
  • The correction method may be, for example, a region-growing method, a confidence-weighting method, and so on.
  • In one embodiment, the region-growing method is used to correct the structured light depth map.
  • Region growing is based on the assumption that the object surface is continuous, so that the depth values of two nearby points on the object are approximately equal.
  • Multiple reliable points are selected as seed points; by setting an appropriate growth criterion, unreliable points that satisfy the correlation with a reliable point are connected to it through 4-connectivity (up, down, left, right) into the same region, forming a new region.
  • Specifically, a reliable point assigned a TOF depth value is taken as a seed point, an appropriate growth criterion is set, and a first-in, first-out queue q is maintained; within a grown region, the depth values of all pixels are taken to be equal to the depth value of the seed point.
  • Suppose the depth value of a seed point p at coordinates (x0, y0) is d. The depth values at the four pixel coordinates adjacent to p, namely (x0, y0-1), (x0, y0+1), (x0-1, y0), and (x0+1, y0), are examined for correlation with the seed point p; assume the growth criterion is that the absolute value of the difference between the depth values of two adjacent pixels is less than a custom threshold T.
  • An unreliable point that meets this condition is merged into the region q containing the seed point and becomes a reliable point, and each newly obtained reliable point is in turn used as a seed point to continue the process; if none of the four pixels adjacent to point p satisfies the growth criterion, the growth of that region terminates. After this correction processing, the measurement accuracy of the depth map is effectively improved (a sketch of the procedure follows below).
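  • A minimal sketch of the 4-connected region-growing correction described above, assuming the depth map is a 2-D array with reliable pixels marked by a boolean mask, a FIFO queue of seeds, and an absolute-difference threshold T; the rule of copying the seed's depth into an accepted neighbour follows the assumption above that pixels in a region share the seed's depth (all names are illustrative):

```python
from collections import deque
import numpy as np

def region_grow_correct(depth: np.ndarray, reliable: np.ndarray, T: float = 0.05) -> np.ndarray:
    """Correct unreliable pixels by growing 4-connected regions from reliable seeds.

    A neighbour is absorbed when |depth[neighbour] - depth[seed]| < T; it then
    inherits the seed's depth, is marked reliable, and is pushed back as a new seed.
    """
    depth = depth.copy()
    reliable = reliable.copy()
    h, w = depth.shape
    q = deque(zip(*np.nonzero(reliable)))  # FIFO queue of seed coordinates (y, x)
    while q:
        y, x = q.popleft()
        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):  # up, down, left, right
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and not reliable[ny, nx]:
                if abs(depth[ny, nx] - depth[y, x]) < T:
                    depth[ny, nx] = depth[y, x]   # inherit the seed's depth value
                    reliable[ny, nx] = True
                    q.append((ny, nx))
    return depth
```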
  • this application also proposes a fused depth measurement method.
  • Fig. 4 is a flowchart of a fusion depth measurement method according to another embodiment of the application, which includes the following steps:
  • Step S1: the light source array emits the spot-pattern beam toward the target area, and the control circuit modulates the amplitude of the beam corresponding to each spot in time using at least one of a continuous-wave, square-wave, or pulse scheme.
  • Step S2: the TOF image sensor includes at least one pixel, and each pixel includes two or more taps. In one embodiment, each pixel includes 4 taps; within a single frame period (or single exposure time), the taps are switched in sequence to collect the reflected optical signal and convert it into electrical signals.
  • Step S3: the control circuit receives the electrical signals from the pixel array and performs phase calculation and intensity calculation, respectively: the phase calculation yields the TOF depth map of the target area under test, and the intensity calculation yields the structured light depth map of the target area under test. The TOF depth values are then assigned to the corresponding pixels in the structured light depth map, thereby dividing the points of the structured light depth map into reliable points (those carrying TOF depth values) and unreliable points, and the reliable points are used together with a correction algorithm to correct the unreliable points.
  • In one embodiment, the reliable points are used as seed points and the unreliable points are refined with a region-growing algorithm, thereby correcting the structured light depth map and obtaining a depth image with higher accuracy.
  • This application proposes a fused depth measurement device and measurement method built from a structured-light transmitting end whose amplitude is modulated in time and a TOF image sensor.
  • TOF depth values and structured light depth values are calculated separately, and the TOF depth values provide high measurement accuracy.
  • On that basis, the errors introduced by the matching calculation of the structured light depth map are corrected to obtain a high-precision depth map; combined with the high resolution of structured light measurement, this realizes a high-precision, high-resolution, low-power, and miniaturized depth measurement solution.
  • an electronic device is also provided.
  • the electronic device may be a mobile phone, a tablet, a computer, a TV, a smart helmet, smart glasses, a robot, and so on.
  • As shown in Fig. 5, the electronic device 500 includes a housing 51, a screen 52, and the fused depth measurement device described in the foregoing embodiments; the screen 52 is used for information display, and the housing 51 provides protection such as dust-proofing, waterproofing, and drop protection for the electronic device.
  • The transmitting module 11 and the receiving module 12 of the fused depth measurement device are arranged on a first plane of the electronic device 500 and are used, respectively, to transmit toward the target object the spot-pattern beam whose amplitude is modulated in time and to receive the spot-pattern beam reflected by the target object.
  • The screen 52 is installed on a second plane of the electronic device to display information such as images or text; the first plane and the second plane are either the same plane or opposite planes.
  • In some embodiments, the control and processing circuit of the fused depth measurement device can be shared with the electronic device; and when the electronic device itself is provided with a TOF image sensor, the receiving module of the fused depth measurement device can also share that TOF image sensor.
  • With the fused depth measurement device, the functions of electronic devices are continuously expanded and their applications broadened: for example, depth measurement of a target object can be realized to obtain a higher-precision depth image containing the target, and on the basis of that high-precision depth image, three-dimensional reconstruction, face recognition, human-computer interaction, and other functions can be realized.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

The invention relates to a fused depth measurement apparatus, comprising: a transmitting module used to transmit, toward a target object, a spot-pattern light beam whose amplitude is modulated in time; a receiving module for receiving the spot-pattern light beam reflected by the target object and forming an electrical signal; and a control and processing circuit for receiving the electrical signal and performing calculations to obtain a time-of-flight depth image and a structured light pattern, taking a depth value in the time-of-flight depth image as a reliable point, assigning it to a corresponding pixel position in a structured light depth image, and using the reliable point to correct the structured light depth image to finally obtain a depth image of the target object. The present application achieves relatively high measurement accuracy on the basis of the time-of-flight depth values; errors in the structured light depth image caused by the matching calculation are corrected on the basis of the time-of-flight depth values to obtain a high-precision depth image; and, combined with the advantage of the high resolution of structured light measurement, a miniaturized, high-precision, high-resolution, and low-power depth measurement apparatus is obtained.
PCT/CN2020/077862 2019-12-18 2020-03-04 Fused depth measurement apparatus and measurement method WO2021120402A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911306106.3 2019-12-18
CN201911306106.3A CN111045029B (zh) 2019-12-18 2019-12-18 一种融合的深度测量装置及测量方法

Publications (1)

Publication Number Publication Date
WO2021120402A1 true WO2021120402A1 (fr) 2021-06-24

Family

ID=70237114

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/077862 WO2021120402A1 (fr) 2019-12-18 2020-03-04 Fused depth measurement apparatus and measurement method

Country Status (2)

Country Link
CN (1) CN111045029B (fr)
WO (1) WO2021120402A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114998450A (zh) * 2022-06-21 2022-09-02 豪威科技(武汉)有限公司 TOF camera calibration method and system
CN115184956A (zh) * 2022-09-09 2022-10-14 荣耀终端有限公司 TOF sensor system and electronic device

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111664798B (zh) * 2020-04-29 2022-08-02 奥比中光科技集团股份有限公司 Depth imaging method and apparatus, and computer-readable storage medium
CN111678457B (zh) * 2020-05-08 2021-10-01 西安交通大学 ToF device under an OLED transparent screen and ranging method
CN112184790B (zh) * 2020-09-02 2024-05-17 福建(泉州)哈工大工程技术研究院 High-precision object size measurement method based on a depth camera
CN112255639B (zh) * 2020-12-23 2021-09-03 杭州蓝芯科技有限公司 Region-of-interest depth perception sensor and depth perception sensing module
CN112969019B (zh) * 2021-02-26 2023-08-08 深圳荆虹科技有限公司 TOF module and electronic device
CN113466884B (zh) * 2021-06-30 2022-11-01 深圳市汇顶科技股份有限公司 Time-of-flight depth measurement transmitting device and electronic apparatus
CN113534596B (zh) * 2021-07-13 2022-09-27 盛景智能科技(嘉兴)有限公司 RGBD stereo camera and imaging method
CN114937071B (zh) * 2022-07-26 2022-10-21 武汉市聚芯微电子有限责任公司 Depth measurement method, apparatus, device and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150062558A1 (en) * 2013-09-05 2015-03-05 Texas Instruments Incorporated Time-of-Flight (TOF) Assisted Structured Light Imaging
CN105096259A (zh) * 2014-05-09 2015-11-25 株式会社理光 Depth value recovery method and system for depth images
CN109889809A (zh) * 2019-04-12 2019-06-14 深圳市光微科技有限公司 Depth camera module, depth camera, depth map acquisition method, and depth camera module forming method
CN110333501A (zh) * 2019-07-12 2019-10-15 深圳奥比中光科技有限公司 Depth measurement device and distance measurement method
CN110456379A (zh) * 2019-07-12 2019-11-15 深圳奥比中光科技有限公司 Fused depth measurement device and distance measurement method
CN110471080A (zh) * 2019-07-12 2019-11-19 深圳奥比中光科技有限公司 Depth measurement device based on a TOF image sensor
CN110488240A (zh) * 2019-07-12 2019-11-22 深圳奥比中光科技有限公司 Depth computation chip architecture

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108230403A (zh) * 2018-01-23 2018-06-29 北京易智能科技有限公司 Obstacle detection method based on spatial segmentation

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150062558A1 (en) * 2013-09-05 2015-03-05 Texas Instruments Incorporated Time-of-Flight (TOF) Assisted Structured Light Imaging
CN105096259A (zh) * 2014-05-09 2015-11-25 株式会社理光 Depth value recovery method and system for depth images
CN109889809A (zh) * 2019-04-12 2019-06-14 深圳市光微科技有限公司 Depth camera module, depth camera, depth map acquisition method, and depth camera module forming method
CN110333501A (zh) * 2019-07-12 2019-10-15 深圳奥比中光科技有限公司 Depth measurement device and distance measurement method
CN110456379A (zh) * 2019-07-12 2019-11-15 深圳奥比中光科技有限公司 Fused depth measurement device and distance measurement method
CN110471080A (zh) * 2019-07-12 2019-11-19 深圳奥比中光科技有限公司 Depth measurement device based on a TOF image sensor
CN110488240A (zh) * 2019-07-12 2019-11-22 深圳奥比中光科技有限公司 Depth computation chip architecture

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114998450A (zh) * 2022-06-21 2022-09-02 豪威科技(武汉)有限公司 TOF camera calibration method and system
CN115184956A (zh) * 2022-09-09 2022-10-14 荣耀终端有限公司 TOF sensor system and electronic device
CN115184956B (zh) * 2022-09-09 2023-01-13 荣耀终端有限公司 TOF sensor system and electronic device

Also Published As

Publication number Publication date
CN111045029A (zh) 2020-04-21
CN111045029B (zh) 2022-06-28

Similar Documents

Publication Publication Date Title
WO2021120402A1 (fr) Fused depth measurement apparatus and measurement method
WO2021008209A1 (fr) Depth measurement apparatus and distance measurement method
WO2021120403A1 (fr) Depth measurement device and method
WO2021128587A1 (fr) Adjustable depth measurement device and associated measurement method
WO2021051477A1 (fr) Time-of-flight distance measurement system and method with adjustable histogram
CN110596721B (zh) Time-of-flight distance measurement system with dual shared TDC circuits and measurement method
EP3185037B1 (fr) Depth imaging system
WO2021051479A1 (fr) Interpolation-based time-of-flight measurement method and system
WO2021051481A1 (fr) Time-of-flight distance measurement method based on dynamic histogram drawing and associated measurement system
WO2021051480A1 (fr) Time-of-flight distance measurement method based on dynamic histogram drawing and measurement system
WO2021238212A1 (fr) Depth measurement apparatus and method, and electronic device
CN105115445A (zh) Three-dimensional imaging system and imaging method combining a depth camera with binocular vision
WO2021238213A1 (fr) Time-of-flight-based depth measurement apparatus and method, and electronic device
CN110221274A (zh) Time-of-flight depth camera and multi-frequency modulation/demodulation distance measurement method
WO2021169531A1 (fr) TOF depth measurement apparatus, method for controlling a TOF depth measurement apparatus, and electronic device
CN110320528A (zh) Time-of-flight depth camera and noise-reducing distance measurement method with multi-frequency modulation/demodulation
US11494925B2 (en) Method for depth image acquisition, electronic device, and storage medium
WO2023015880A1 (fr) Method for acquiring a training sample set, model training method, and related apparatus
CN110221273A (zh) Time-of-flight depth camera and single-frequency modulation/demodulation distance measurement method
WO2020223981A1 (fr) Time-of-flight depth camera and multi-frequency modulation/demodulation distance measurement method
CN110361751A (zh) Time-of-flight depth camera and noise-reducing distance measurement method with single-frequency modulation/demodulation
US11709271B2 (en) Time of flight sensing system and image sensor used therein
WO2022241942A1 (fr) Depth camera and depth calculation method
US11624834B2 (en) Time of flight sensing system and image sensor used therein
CN111654626A (zh) High-resolution camera containing depth information

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20903015

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20903015

Country of ref document: EP

Kind code of ref document: A1