CN111239729B - Speckle and floodlight projection fused ToF depth sensor and distance measuring method thereof - Google Patents

Speckle and floodlight projection fused ToF depth sensor and distance measuring method thereof

Info

Publication number
CN111239729B
CN111239729B
Authority
CN
China
Prior art keywords
speckle
depth
projector
tof
phase shift
Prior art date
Legal status
Active
Application number
CN202010057595.XA
Other languages
Chinese (zh)
Other versions
CN111239729A (en)
Inventor
葛晨阳
乔欣
邓鹏超
卫莉丽
李彤
周艳辉
Current Assignee
Xian Jiaotong University
Original Assignee
Xian Jiaotong University
Priority date
Filing date
Publication date
Application filed by Xian Jiaotong University filed Critical Xian Jiaotong University
Priority to CN202010057595.XA priority Critical patent/CN111239729B/en
Publication of CN111239729A publication Critical patent/CN111239729A/en
Application granted granted Critical
Publication of CN111239729B publication Critical patent/CN111239729B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/89 Radar or analogous systems specially adapted for mapping or imaging
    • G01S 13/08 Systems for measuring distance only
    • G01S 7/4865 Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak
    • G01S 7/4915 Time delay measurement, e.g. operational details for pixel components; phase measurement

Abstract

A ToF depth sensor fusing speckle and floodlight projection, and a distance measuring method thereof, are provided. The ToF depth sensor comprises a floodlight projector, a laser speckle projector, a ToF receiving camera and a depth decoding module. The floodlight projector generates a uniformly illuminating light source; the laser speckle projector forms a laser speckle pattern; the ToF receiving camera synchronously receives the phase-shift images reflected back after each projector illuminates the object surface; and the depth decoding module depth-decodes the RAW phase-shift data by a phase-shift method and fuses the depth information computed under floodlight and speckle illumination. The distance measuring method obtains depth information that combines rich close-range detail with stable, reliable long-range measurement, helps address difficult problems such as resistance to strong ambient light and multipath reflection in long-range outdoor ToF ranging, and has broad application prospects in smart devices, unmanned vehicles, robots and other fields.

Description

Speckle and floodlight projection fused ToF depth sensor and distance measuring method thereof
Technical Field
The disclosure belongs to the technical fields of depth sensors, machine vision, image processing, laser speckle and ToF, and particularly relates to a ToF depth sensor fusing speckle and floodlight projection and a distance measuring method thereof.
Background
In recent years, three-dimensional depth-perception devices have begun to attract wide attention. As a new medium for acquiring information about the outside world, high-precision depth sensors promote the development of machine vision, enable robots to understand the external world, and advance human-computer interaction. Depth-perception techniques can be broadly divided into passive and active. Traditional binocular stereo-vision ranging is a passive method; it is strongly affected by ambient light and its stereo-matching process is complex. Active ranging methods mainly comprise structured-light coded ranging and ToF (time-of-flight) ranging. Structured-light coded ranging is essentially laser triangulation, and its accuracy drops sharply as distance increases. A ToF camera obtains the depth of each pixel by calculating the flight time of the emitted laser; although the resolution of current ToF depth images is relatively low, ToF cameras respond quickly, cost little and are compact. As ToF modules shrink in size, they are gradually being applied and popularized in embedded devices, especially smartphones and information appliances.
The active light-source projector of current ToF modules generally adopts floodlight illumination. This uniform illumination lets ToF ranging recover depth point clouds rich in detail at close range; however, as the illumination distance grows, the light energy falls off rapidly and is easily swamped by ambient light, so the depth of distant objects cannot be detected. If a laser speckle projector is used as the ToF active light source instead, the speckle-spot illumination has higher energy density and can be projected farther, so ToF ranging can recover point-cloud information of distant objects; but because the number of speckle points is limited, the resulting point cloud is sparse and lacks detail on the target object.
Disclosure of Invention
In view of this, the present disclosure provides a ToF depth sensor fusing speckle and floodlight projection, comprising a floodlight projector, a laser speckle projector, a ToF receiving camera and a depth decoding module, wherein:
the floodlight projector comprises a laser light source and a diffusion sheet, and is used for generating a uniformly illuminating light source;
the laser speckle projector comprises a laser light source, a collimating mirror and a diffractive optical element DoE, and is used for generating a certain number of uniformly replicated laser speckle spots that form a laser speckle pattern;
the ToF receiving camera comprises a ToF image sensor, an optical filter and an optical lens, and is used for: generating the phase-shift-method modulation driving signals required by the floodlight projector and the laser speckle projector, and synchronously receiving the phase-shift images reflected back after the floodlight projector and the laser speckle projector illuminate the object surface;
the depth decoding module collects the original RAW data of the phase-shift images output by the ToF receiving camera, depth-decodes the RAW data obtained under floodlight illumination and the RAW data obtained under laser speckle illumination separately using a phase-shift method to obtain a floodlight depth map and a speckle depth map of the same scene, and then fuses them to obtain a fused depth map.
The present disclosure also provides a distance measuring method of a ToF depth sensor, the method comprising the steps of:
S100: the floodlight projector emits uniform light, illuminating the target object or space under test with a periodic signal carrying phase-modulation information;
S200: the ToF receiving camera synchronously receives the phase-shift images reflected from the target object or space after floodlight illumination, acquiring several phase-shift images of different phases according to the different phase modulations of the phase-shift method;
S300: the depth decoding module collects the RAW data of the phase-shift images output by the ToF receiving camera, calculates the phase difference corresponding to each pixel in the images, filters out depth information generated by unreliable pixels, and obtains a floodlight depth map according to the phase-shift depth calculation formula;
S400: the laser speckle projector emits a certain number of uniformly replicated laser speckle spots forming a laser speckle pattern, illuminating the same scene as in step S100 with a periodic signal carrying phase-modulation information;
S500: the ToF receiving camera synchronously receives the speckle phase-shift images reflected from the target object or space after speckle illumination, acquiring several speckle phase-shift images of different phases according to the different phase modulations of the phase-shift method;
S600: the depth decoding module collects the RAW data of the speckle phase-shift images output by the ToF receiving camera, extracts the speckle spots, calculates the phase difference corresponding to each pixel where a speckle point is located, filters out depth information generated by unreliable pixels, and obtains a speckle depth map for the speckle-point pixels according to the phase-shift depth calculation formula;
S700: steps S100-S300 and steps S400-S600 may be performed in either order; the depth decoding module then fuses the floodlight depth map and the speckle depth map obtained in steps S300 and S600 to produce the final fused depth-map information.
Through the above technical scheme, on the basis of the existing ToF depth sensor, the floodlight projector and the speckle projector illuminate in turn to obtain a floodlight depth map and a speckle depth map respectively; fusing the two depth maps yields point-cloud information that is rich in close-range detail and stable and reliable at long range, helping to solve problems such as resistance to strong outdoor light and multipath reflection in long-range ToF ranging.
Drawings
FIG. 1 is a block diagram of a combined speckle and flood projecting ToF depth sensor provided in one embodiment of the present disclosure;
FIG. 2 is a schematic diagram of an arrangement of a VCSEL light emitting lattice diffractively replicated (3x3, 3x5) by DoE in a laser speckle projector according to one embodiment of the present disclosure;
FIG. 3 is a flow chart of a ranging method using a ToF depth sensor that combines speckle and flood projection in one embodiment of the present disclosure.
Detailed Description
The present invention will be described in further detail with reference to fig. 1 to 3.
In one embodiment, referring to fig. 1, a ToF depth sensor fusing speckle and floodlight projection is disclosed, comprising a floodlight projector, a laser speckle projector, a ToF receiving camera and a depth decoding module, wherein:
the floodlight projector comprises a laser light source and a diffusion sheet, and is used for generating a uniformly illuminating light source;
the laser speckle projector comprises a laser light source, a collimating mirror and a diffractive optical element DoE, and is used for generating a certain number of uniformly replicated laser speckle spots that form a laser speckle pattern;
the ToF receiving camera comprises a ToF image sensor, an optical filter and an optical lens, and is used for: generating the phase-shift-method modulation driving signals required by the floodlight projector and the laser speckle projector, and synchronously receiving the phase-shift images reflected back after the floodlight projector and the laser speckle projector illuminate the object surface;
the depth decoding module collects the original RAW data of the phase-shift images output by the ToF receiving camera, depth-decodes the RAW data obtained under floodlight illumination and the RAW data obtained under laser speckle illumination separately using a phase-shift method to obtain a floodlight depth map and a speckle depth map of the same scene, and then fuses them to obtain a fused depth map.
In this embodiment, depth-map fusion yields point-cloud information that is rich in close-range detail and stable and reliable at long range, alleviating difficult problems such as strong-light interference and multipath reflection in long-range outdoor ToF ranging, with broad application prospects in smart devices, unmanned vehicles, robots and other fields.
The field angles (FoV) of the floodlight projector and the laser speckle projector are generally larger than that of the ToF receiving camera, and their field directions are kept consistent; in smartphone applications the field direction is generally vertical.
The phase-shift images received by the ToF receiving camera are optical signals; the camera converts them into electrical-signal phase-shift images and outputs these to the depth decoding module.
In another embodiment, the laser speckle pattern is either composed of regularly arranged speckle points or is a coded pattern of randomly distributed speckle spots.
In another embodiment, the laser light source is a vertical cavity surface emitting laser VCSEL or a semiconductor laser LD.
For this embodiment, "laser light source" refers both to the light source of the floodlight projector and to that of the laser speckle projector; whether speckle or floodlight is projected is determined by the optics placed in front of the source (a diffusion sheet for floodlight, a collimating mirror plus DoE for speckle).
In another embodiment, the ToF receiving camera generates the phase-shift-method modulation driving signals required by the floodlight projector and the laser speckle projector alternately; alternate generation means that the floodlight projector and the laser speckle projector illuminate in turn.
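A minimal sketch of such alternating drive-signal scheduling, assuming one four-phase burst per projector per depth frame; the (projector, phase) tuple format is an illustrative convention, not a real driver interface from the patent:

```python
# Alternating illumination schedule: a flood burst of four phase windows,
# then a speckle burst, repeating indefinitely.
from itertools import cycle
from typing import Iterator, Tuple

def drive_schedule() -> Iterator[Tuple[str, int]]:
    """Yield (projector, phase_deg) exposures: flood burst, speckle burst, repeat."""
    for projector in cycle(("flood", "speckle")):
        for phase in (0, 90, 180, 270):
            yield projector, phase

# First eight exposures: four flood phases, then four speckle phases.
s = drive_schedule()
print([next(s) for _ in range(8)])
```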
In another embodiment, the laser light source has a wavelength of 940nm or 850 nm.
For this embodiment, solar irradiance is comparatively weak at these two wavelengths, so they resist sunlight interference well and are a common choice.
In another embodiment, the diffractive optical element DoE replicates and diffracts the light of the semiconductor laser LD or the vertical-cavity surface-emitting laser VCSEL into a certain number of speckle points, forming the projected speckle pattern.
For this embodiment, referring to fig. 2: taking the VCSEL light-emitting array as the base primitive, the DoE replicates it M × N times (M and N are positive integers, e.g., 3 × 3 or 3 × 5), while ensuring that the brightness and contrast of the speckle spots remain uniform across the different replica blocks.
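A minimal sketch of this M × N replication, assuming the VCSEL dots of the base primitive are given as normalized (x, y) coordinates in [0, 1); all names are illustrative, not from the patent:

```python
import numpy as np

def replicate_pattern(base_dots: np.ndarray, m: int, n: int) -> np.ndarray:
    """Tile a (K, 2) primitive dot pattern into an m x n grid of replica blocks."""
    replicas = []
    for i in range(m):          # replica rows
        for j in range(n):      # replica columns
            # Each diffraction order shifts the whole primitive by one cell;
            # uniform brightness/contrast across replicas is assumed ideal here.
            replicas.append(base_dots + np.array([j, i]))
    # Normalize back to a unit field of view.
    return np.vstack(replicas) / np.array([n, m])

# Example: 40 pseudo-random VCSEL dots replicated 3 x 5 -> 600 projected spots.
rng = np.random.default_rng(0)
print(replicate_pattern(rng.random((40, 2)), m=3, n=5).shape)  # (600, 2)
```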
in another embodiment, the phase shifting method comprises a four-phase step method, a three-phase step method, or a five-phase step method.
For this embodiment, the four-phase-step method measures with four sampling computation windows, each successive window phase-delayed by 90° (0°, 90°, 180°, 270°); the RAW data collected by the ToF receiving camera are Q0, Q1, Q2 and Q3, respectively.
In another embodiment, the fusing specifically comprises: for the floodlight depth map and the speckle depth map obtained separately in the same scene, filtering out depth information generated by unreliable pixels, and obtaining the fused depth-map information through depth-map processing and fusion.
In any embodiment of the present disclosure, depth information generated by unreliable pixels is filtered out in combination with confidence discrimination.
In another embodiment, referring to fig. 3, a ranging method using the ToF depth sensor includes the steps of:
S100: the floodlight projector emits uniform light, illuminating the target object or space under test with a periodic signal carrying phase-modulation information;
S200: the ToF receiving camera synchronously receives the phase-shift images reflected from the target object or space after floodlight illumination, acquiring several phase-shift images of different phases according to the different phase modulations of the phase-shift method;
S300: the depth decoding module collects the RAW data of the phase-shift images output by the ToF receiving camera, calculates the phase difference corresponding to each pixel in the images, filters out depth information generated by unreliable pixels, and obtains a floodlight depth map according to the phase-shift depth calculation formula;
S400: the laser speckle projector emits a certain number of uniformly replicated laser speckle spots forming a laser speckle pattern, illuminating the same scene as in step S100 with a periodic signal carrying phase-modulation information;
S500: the ToF receiving camera synchronously receives the speckle phase-shift images reflected from the target object or space after speckle illumination, acquiring several speckle phase-shift images of different phases according to the different phase modulations of the phase-shift method;
S600: the depth decoding module collects the RAW data of the speckle phase-shift images output by the ToF receiving camera, extracts the speckle spots, calculates the phase difference corresponding to each pixel where a speckle point is located, filters out depth information generated by unreliable pixels, and obtains a speckle depth map for the speckle-point pixels according to the phase-shift depth calculation formula;
S700: steps S100-S300 and steps S400-S600 may be performed in either order; the depth decoding module then fuses the floodlight depth map and the speckle depth map obtained in steps S300 and S600 to produce the final fused depth-map information.
In further embodiments:
In step S200, a four-phase-step method may be adopted: measurement uses four sampling computation windows, each delayed by 90° (0°, 90°, 180°, 270°), and the RAW data of the four phase-shift images acquired by the ToF receiving camera are Q0, Q1, Q2 and Q3, respectively.
In case of confidence discrimination:
For step S300, the unwrapping of the four-phase-step method (applied to the RAW data Q0, Q1, Q2 and Q3 obtained above) is as follows: the phase difference between the emitted light and the received light corresponding to each pixel in the phase-shift images is calculated according to formula (1), and the floodlight depth information is obtained from the phase-to-depth conversion of formula (2):

$\varphi_1 = \arctan\dfrac{Q_3 - Q_1}{Q_0 - Q_2}$ (1)

$d_1 = \dfrac{c}{4\pi f_m}\,\varphi_1$ (2)

where $d_1$ is the depth information of the measured target under floodlight illumination, $c$ is the speed of light, $f_m$ is the laser modulation frequency, and $\varphi_1$ is the phase difference between the emitted and received light signals.
The confidence corresponding to each pixel in the phase-shift images is obtained according to formula (3):

$\mathrm{Confidence} = |Q_3 - Q_1| + |Q_0 - Q_2|$ (3)
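A minimal sketch of this four-phase decode, following formulas (1)-(3) above; the variable names and the 100 MHz default modulation frequency are illustrative assumptions, and arctan2 is used to resolve the quadrant that a plain arctangent leaves ambiguous:

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def decode_four_phase(q0, q1, q2, q3, f_m=100e6):
    """Return (depth_m, confidence) per pixel from four phase-shift RAW frames."""
    # Formula (1): phase difference, wrapped into [0, 2*pi).
    phi = np.mod(np.arctan2(q3 - q1, q0 - q2), 2 * np.pi)
    # Formula (2): phase difference -> one-way distance (half the round trip).
    depth = C / (4 * np.pi * f_m) * phi
    # Formula (3): confidence used later to reject unreliable pixels.
    confidence = np.abs(q3 - q1) + np.abs(q0 - q2)
    return depth, confidence
```

With the phase wrapped to [0, 2π), the unambiguous range of this decode is c/(2 f_m), about 1.5 m at the assumed 100 MHz; the multi-frequency illumination option mentioned for step S700 is the usual way to extend it.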
In another embodiment, step S300 specifically includes: setting a fixed confidence threshold or a floating confidence threshold, where the floating scheme sets a different threshold Ti for each ranging-distance interval; a pixel whose confidence is below the corresponding threshold is regarded as unreliable. In this way the depth information generated by unreliable pixels can be filtered out, as sketched below.
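A minimal sketch of such floating-threshold filtering; the range-bin edges and the thresholds Ti are invented example values, and a coarse depth estimate from the unfiltered decode is assumed to be available for selecting the bin:

```python
import numpy as np

def filter_unreliable(depth, confidence):
    """Set pixels whose confidence is below the distance-dependent Ti to NaN."""
    bin_edges = np.array([1.0, 3.0, 6.0])        # range bin edges in meters (assumed)
    ti = np.array([80.0, 50.0, 30.0, 15.0])      # threshold Ti per bin (assumed)
    per_pixel_t = ti[np.digitize(depth, bin_edges)]
    out = np.asarray(depth, dtype=float).copy()
    out[confidence < per_pixel_t] = np.nan       # unreliable pixel: depth dropped
    return out
```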
In step S500, a four-phase-step method is likewise adopted: measurement uses four sampling computation windows, each delayed by 90° (0°, 90°, 180°, 270°); the ToF receiving camera acquires four speckle phase-shift images whose original RAW data are SQ0, SQ1, SQ2 and SQ3, respectively.
Similarly, in the case of confidence discrimination:
In step S600, the speckle points are extracted from the speckle phase-shift images as follows: the confidence SConfidence corresponding to each pixel is obtained according to formula (4); an m × n search window is set (m and n are integers); the mean Mean_SConfidence of the confidences within the window is calculated and compared with the confidence of the window's central pixel; if the central pixel's confidence is greater than the mean, that pixel is judged to be a pixel where a speckle point is located.

$\mathrm{SConfidence} = |SQ_3 - SQ_1| + |SQ_0 - SQ_2|$ (4)
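A minimal sketch of this speckle-pixel extraction: keep a pixel when its SConfidence exceeds the mean SConfidence of the m × n search window centred on it. SciPy's uniform_filter is one way to get the local mean (an assumption; any box filter would do):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def extract_speckle_pixels(sq0, sq1, sq2, sq3, m=7, n=7):
    """Return a boolean mask of pixels judged to contain a speckle point."""
    s_conf = np.abs(sq3 - sq1) + np.abs(sq0 - sq2)     # formula (4), per pixel
    mean_s_conf = uniform_filter(s_conf, size=(m, n))  # Mean_SConfidence per window
    return s_conf > mean_s_conf                        # centre brighter than mean
```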
The four-phase-step unwrapping is then applied to the speckle phase-shift RAW data SQ0, SQ1, SQ2 and SQ3 obtained above: the phase difference between the emitted and received light corresponding to each speckle-point pixel is calculated according to formula (5), and the speckle depth information is obtained from the phase-to-depth conversion of formula (6):

$\varphi_2 = \arctan\dfrac{SQ_3 - SQ_1}{SQ_0 - SQ_2}$ (5)

$d_2 = \dfrac{c}{4\pi f_m}\,\varphi_2$ (6)

where $d_2$ is the depth information of the measured target under speckle illumination, $c$ is the speed of light, $f_m$ is the laser modulation frequency, and $\varphi_2$ is the phase difference between the emitted and received light corresponding to the pixel where the speckle point is located.
Step S700 specifically comprises: the floodlight projector and the laser speckle projector illuminate separately, each yielding its own depth information. Floodlight illumination may precede speckle illumination or vice versa; alternatively, floodlight illumination at several different modulation frequencies may be followed by speckle illumination at several different frequencies (or the reverse), or floodlight and speckle may illuminate in turn at each of several frequencies.
The fused depth map is output according to the decision criterion of formula (7), which assigns each pixel the depth value $d_{out}$:

$d_{out} = \begin{cases} \omega d_1 + (1-\omega)\, d_2, & \mathrm{Confidence} \ge T_1 \ \text{and} \ \mathrm{SConfidence} \ge T_2 \\ d_1, & \mathrm{Confidence} \ge T_1 \ \text{and} \ \mathrm{SConfidence} < T_2 \\ d_2, & \mathrm{Confidence} < T_1 \ \text{and} \ \mathrm{SConfidence} \ge T_2 \\ \text{invalid}, & \text{otherwise} \end{cases}$ (7)

where Confidence is calculated from the phase-shift images under floodlight illumination, SConfidence is calculated from the speckle phase-shift images under speckle illumination, and T1 and T2 are the corresponding confidence thresholds.
First, the measurement precision errors $\sigma_1$ and $\sigma_2$ corresponding to each pixel in the floodlight depth map and the speckle depth map are calculated according to formula (8). The fused depth map can then be obtained as the weighted average of the depth values of co-located pixels in the two maps, as in formula (9), where the weight $\omega$ is computed from $\sigma_1$ and $\sigma_2$ by formula (10); the amplitude $A_1$ and intensity $B_1$ are obtained from formula (11), and the amplitude $A_2$ and intensity $B_2$ from formula (12).

$\sigma_i = \dfrac{c}{4\sqrt{2}\,\pi f_m} \cdot \dfrac{\sqrt{A_i + B_i}}{A_i}, \quad i = 1, 2$ (8)

$d_{out} = \omega\, d_1 + (1-\omega)\, d_2$ (9)

$\omega = \dfrac{\sigma_2^2}{\sigma_1^2 + \sigma_2^2}$ (10)

where $A_1$ is the flood phase-shift image amplitude and $B_1$ the flood phase-shift image intensity, calculated as:

$A_1 = \dfrac{\sqrt{(Q_3 - Q_1)^2 + (Q_0 - Q_2)^2}}{2}, \qquad B_1 = \dfrac{Q_0 + Q_1 + Q_2 + Q_3}{4}$ (11)

and $A_2$ is the speckle phase-shift image amplitude and $B_2$ the speckle phase-shift image intensity, calculated as:

$A_2 = \dfrac{\sqrt{(SQ_3 - SQ_1)^2 + (SQ_0 - SQ_2)^2}}{2}, \qquad B_2 = \dfrac{SQ_0 + SQ_1 + SQ_2 + SQ_3}{4}$ (12)
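A minimal sketch of this fusion stage, under the reconstruction of formulas (7)-(12) above; d1/conf1/a1/b1 are assumed to come from the flood decode and d2/conf2/a2/b2 from the speckle decode (d2 is meaningful only at speckle pixels, so conf2 is low elsewhere), and f_m, T1, T2 are illustrative values. The inverse-variance weight of formula (10) means the map with the smaller precision error dominates the average:

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def precision_error(a, b, f_m):
    # Formula (8): per-pixel depth precision error from amplitude A and intensity B.
    return C / (4 * np.sqrt(2) * np.pi * f_m) * np.sqrt(a + b) / np.maximum(a, 1e-9)

def fuse_depth(d1, conf1, a1, b1, d2, conf2, a2, b2, f_m=100e6, t1=30.0, t2=30.0):
    """Fused depth d_out per the formula (7) criterion with (9)/(10) weighting."""
    s1 = precision_error(a1, b1, f_m)
    s2 = precision_error(a2, b2, f_m)
    omega = s2**2 / (s1**2 + s2**2)           # formula (10): inverse-variance weight
    weighted = omega * d1 + (1 - omega) * d2  # formula (9): weighted average
    both = (conf1 >= t1) & (conf2 >= t2)
    flood_only = (conf1 >= t1) & ~both
    speckle_only = (conf2 >= t2) & ~both
    d_out = np.full_like(d1, np.nan, dtype=float)  # neither reliable: invalid
    d_out[both] = weighted[both]
    d_out[flood_only] = d1[flood_only]
    d_out[speckle_only] = d2[speckle_only]
    return d_out
```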
although the embodiments of the present invention have been described above with reference to the accompanying drawings, the present invention is not limited to the above-described embodiments and application fields, and the above-described embodiments are illustrative, instructive, and not restrictive. Those skilled in the art, having the benefit of this disclosure, may effect numerous modifications thereto without departing from the scope of the invention as defined by the appended claims.

Claims (7)

1. A ToF depth sensor fusing speckle and floodlight projection, comprising: a floodlight projector, a laser speckle projector, a ToF receiving camera and a depth decoding module; wherein:
the floodlight projector comprises a laser light source and a diffusion sheet, and is used for generating a uniformly illuminating light source;
the laser speckle projector comprises a laser light source, a collimating mirror and a diffractive optical element DoE, and is used for generating a certain number of uniformly replicated laser speckle spots that form a laser speckle pattern;
the ToF receiving camera comprises a ToF image sensor, an optical filter and an optical lens, and is used for: generating the phase-shift-method modulation driving signals required by the floodlight projector and the laser speckle projector, and synchronously receiving the phase-shift images reflected back after the floodlight projector and the laser speckle projector illuminate the object surface;
the depth decoding module is used for collecting the original RAW data of the phase-shift images output by the ToF receiving camera, depth-decoding the RAW data obtained under floodlight illumination and the RAW data obtained under laser speckle illumination separately using a phase-shift method to obtain a floodlight depth map and a speckle depth map of the same scene, and then fusing them to obtain a fused depth map;
wherein the fusing specifically comprises: for the floodlight depth map and the speckle depth map obtained separately in the same scene, filtering out, in combination with confidence discrimination, the depth information generated by unreliable pixels, and obtaining the fused depth-map information through depth-map processing and fusion.
2. The ToF depth sensor of claim 1, wherein the laser speckle pattern is either composed of regularly arranged speckle points or is a coded pattern of randomly distributed speckle points.
3. The ToF depth sensor of claim 1, wherein the laser light source is a Vertical Cavity Surface Emitting Laser (VCSEL) or a semiconductor Laser (LD).
4. The ToF depth sensor of claim 1, wherein the laser light source has a wavelength of 940nm or 850 nm.
5. The ToF depth sensor of claim 1, wherein the diffractive optical element DoE is configured to replicate and diffract the light of the semiconductor laser LD or the vertical cavity surface emitting laser VCSEL into a number of speckle points, forming the projected speckle pattern.
6. The ToF depth sensor of claim 1, wherein the phase shift method comprises a four-phase-step method, a three-phase-step method, or a five-phase-step method.
7. A ranging method using the ToF depth sensor of claim 1, the method comprising the steps of:
S100: the floodlight projector emits uniform light, illuminating the target object or space under test with a periodic signal carrying phase-modulation information;
S200: the ToF receiving camera synchronously receives the phase-shift images reflected from the target object or space after floodlight illumination, acquiring several phase-shift images of different phases according to the different phase modulations of the phase-shift method;
S300: the depth decoding module collects the RAW data of the phase-shift images output by the ToF receiving camera, calculates the phase difference corresponding to each pixel in the images, filters out depth information generated by unreliable pixels, and obtains a floodlight depth map according to the phase-shift depth calculation formula;
S400: the laser speckle projector emits a certain number of uniformly replicated laser speckle spots forming a laser speckle pattern, illuminating the same scene as in step S100 with a periodic signal carrying phase-modulation information;
S500: the ToF receiving camera synchronously receives the speckle phase-shift images reflected from the target object or space after speckle illumination, acquiring several speckle phase-shift images of different phases according to the different phase modulations of the phase-shift method;
S600: the depth decoding module collects the RAW data of the speckle phase-shift images output by the ToF receiving camera, extracts the speckle spots, calculates the phase difference corresponding to each pixel where a speckle point is located, filters out depth information generated by unreliable pixels, and obtains a speckle depth map for the speckle-point pixels according to the phase-shift depth calculation formula;
S700: steps S100-S300 and steps S400-S600 may be performed in either order; the depth decoding module then fuses the floodlight depth map and the speckle depth map obtained in steps S300 and S600 to produce the final fused depth-map information.
CN202010057595.XA 2020-01-17 2020-01-17 Speckle and floodlight projection fused ToF depth sensor and distance measuring method thereof Active CN111239729B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010057595.XA CN111239729B (en) 2020-01-17 2020-01-17 Speckle and floodlight projection fused ToF depth sensor and distance measuring method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010057595.XA CN111239729B (en) 2020-01-17 2020-01-17 Speckle and floodlight projection fused ToF depth sensor and distance measuring method thereof

Publications (2)

Publication Number Publication Date
CN111239729A CN111239729A (en) 2020-06-05
CN111239729B (en) 2022-04-05

Family

ID=70880932

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010057595.XA Active CN111239729B (en) 2020-01-17 2020-01-17 Speckle and floodlight projection fused ToF depth sensor and distance measuring method thereof

Country Status (1)

Country Link
CN (1) CN111239729B (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021253321A1 (en) * 2020-06-18 2021-12-23 深圳市汇顶科技股份有限公司 Time-of-flight ranging method and related system
CN111693149A (en) * 2020-06-23 2020-09-22 广东小天才科技有限公司 Temperature measurement method, device, wearable equipment and medium
CN111815695B (en) * 2020-07-09 2024-03-15 Oppo广东移动通信有限公司 Depth image acquisition method and device, mobile terminal and storage medium
CN115218820A (en) * 2021-04-20 2022-10-21 上海图漾信息科技有限公司 Structured light projection device, depth data measuring head, computing device and measuring method
CN112312113B (en) * 2020-10-29 2022-07-15 贝壳技术有限公司 Method, device and system for generating three-dimensional model
CN114543696B (en) * 2020-11-24 2024-01-23 瑞芯微电子股份有限公司 Structured light imaging device, structured light imaging method, structured light imaging medium and electronic equipment
CN112950694A (en) * 2021-02-08 2021-06-11 Oppo广东移动通信有限公司 Image fusion method, single camera module, shooting device and storage medium
EP4308961A1 (en) * 2021-03-15 2024-01-24 Sony Semiconductor Solutions Corporation Illumination circuitry, illumination method, time-of-flight module
CN113093213B (en) * 2021-04-08 2023-05-09 上海炬佑智能科技有限公司 ToF sensing device and distance detection method thereof
CN113311451B (en) * 2021-05-07 2024-01-16 西安交通大学 Laser speckle projection TOF depth perception method and device
CN113542534A (en) * 2021-09-17 2021-10-22 珠海视熙科技有限公司 TOF camera control method and device and storage medium
CN113945951B (en) * 2021-10-21 2022-07-08 浙江大学 Multipath interference suppression method in TOF (time of flight) depth calculation, TOF depth calculation method and device
CN117607837B (en) * 2024-01-09 2024-04-16 苏州识光芯科技术有限公司 Sensor array, distance measuring device and method

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009288108A (en) * 2008-05-29 2009-12-10 Mitsutoyo Corp Image correlation displacement gage
CN108668078A (en) * 2018-04-28 2018-10-16 Oppo广东移动通信有限公司 Image processing method, device, computer readable storage medium and electronic equipment
CN109798838A (en) * 2018-12-19 2019-05-24 西安交通大学 A kind of ToF depth transducer and its distance measuring method based on laser speckle projection
CN109886197A (en) * 2019-02-21 2019-06-14 北京超维度计算科技有限公司 A kind of recognition of face binocular three-dimensional camera
CN109901300A (en) * 2017-12-08 2019-06-18 宁波盈芯信息科技有限公司 A kind of laser speckle projector based on vertical cavity surface emitting laser rule dot matrix
CN110049305A (en) * 2017-12-18 2019-07-23 西安交通大学 A kind of the structure light depth camera automatic correcting method and device of smart phone
CN110333501A (en) * 2019-07-12 2019-10-15 深圳奥比中光科技有限公司 Depth measurement device and distance measurement method
CN110456379A (en) * 2019-07-12 2019-11-15 深圳奥比中光科技有限公司 The depth measurement device and distance measurement method of fusion

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003017536A (en) * 2001-07-04 2003-01-17 Nec Corp Pattern inspection method and inspection apparatus
US10061028B2 (en) * 2013-09-05 2018-08-28 Texas Instruments Incorporated Time-of-flight (TOF) assisted structured light imaging
US9958758B2 (en) * 2015-01-21 2018-05-01 Microsoft Technology Licensing, Llc Multiple exposure structured light pattern
CN109889799B (en) * 2017-12-06 2020-08-25 西安交通大学 Monocular structure light depth perception method and device based on RGBIR camera
EP3672223B1 (en) * 2018-04-28 2022-12-28 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Data processing method, electronic device, and computer-readable storage medium
CN113189826A (en) * 2019-01-09 2021-07-30 深圳市光鉴科技有限公司 Structured light projector and 3D camera
CN110335211B (en) * 2019-06-24 2021-07-30 Oppo广东移动通信有限公司 Method for correcting depth image, terminal device and computer storage medium
CN110471080A (en) * 2019-07-12 2019-11-19 深圳奥比中光科技有限公司 Depth measurement device based on TOF imaging sensor


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
System dependent sources of error in time-of-flight shear wave speed measurements;Yufeng Deng 等;《2015 IEEE International Ultrasonics Symposium (IUS)》;20151116;第1-4页 *
Research and Application of Virtual-Real Fusion and Human-Computer Interaction Technologies in Augmented Reality; Huang Zhenyu; China Excellent Master's Theses Full-text Database (Master), Information Science and Technology; 20130715; pp. 1-78 *

Also Published As

Publication number Publication date
CN111239729A (en) 2020-06-05

Similar Documents

Publication Publication Date Title
CN111239729B (en) Speckle and floodlight projection fused ToF depth sensor and distance measuring method thereof
CN109798838B (en) ToF depth sensor based on laser speckle projection and ranging method thereof
US9501833B2 (en) Method and system for providing three-dimensional and range inter-planar estimation
Bodenmann et al. Generation of high‐resolution three‐dimensional reconstructions of the seafloor in color using a single camera and structured light
CN111025317B (en) Adjustable depth measuring device and measuring method
CN111492265A (en) Multi-resolution, simultaneous localization and mapping based on 3D lidar measurements
CN109889809A (en) Depth camera mould group, depth camera, depth picture capturing method and depth camera mould group forming method
Bergman et al. Deep adaptive lidar: End-to-end optimization of sampling and depth completion at low sampling rates
CN104541127A (en) Image processing system, and image processing method
CN107749070A (en) The acquisition methods and acquisition device of depth information, gesture identification equipment
CN108924408B (en) Depth imaging method and system
CN111427230A (en) Imaging method based on time flight and 3D imaging device
CN209676383U (en) Depth camera mould group, depth camera, mobile terminal and imaging device
US11803982B2 (en) Image processing device and three-dimensional measuring system
CN111678457A (en) ToF device under OLED transparent screen and distance measuring method
CN112230244A (en) Fused depth measurement method and measurement device
CN113311451B (en) Laser speckle projection TOF depth perception method and device
CN115542537A (en) Super-surface design method, super-surface, projection device and sweeping robot
JP6868167B1 (en) Imaging device and imaging processing method
Li et al. Fisher information guidance for learned time-of-flight imaging
CN112379389B (en) Depth information acquisition device and method combining structured light camera and TOF depth camera
CN111373222A (en) Light projection system
WO2021253308A1 (en) Image acquisition apparatus
Quero et al. Evaluation of a 3D imaging vision system based on a single-pixel InGaAs detector and the time-of-flight principle for drones
CN116482708A (en) TOF and structured light combined depth camera, depth detection method thereof and sweeper

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant