CN113514851A - Depth camera - Google Patents

Depth camera

Info

Publication number
CN113514851A
Authority
CN
China
Prior art keywords: measurement, light, depth, receiving, optical signal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010216873.1A
Other languages
Chinese (zh)
Inventor
苏公喆
杨心杰
朱力
吕方璐
汪博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Guangjian Technology Co Ltd
Original Assignee
Shenzhen Guangjian Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Guangjian Technology Co Ltd filed Critical Shenzhen Guangjian Technology Co Ltd
Priority to CN202010216873.1A
Publication of CN113514851A
Legal status: Pending

Classifications

    • G01S 17/89: Lidar systems specially adapted for mapping or imaging
    • G01S 17/10: Systems determining position data of a target, for measuring distance only, using transmission of interrupted, pulse-modulated waves
    • G01S 7/4802: Analysis of the echo signal for target characterisation; target signature; target cross-section
    • G01S 7/4863: Receivers; detector arrays, e.g. charge-transfer gates
    • G01S 7/4865: Receivers; time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak

Abstract

The invention provides a depth camera comprising the following modules: a light projection module for projecting a first light signal toward a target object in a scene; a light receiving module that receives, through at least three receiving windows, a second light signal formed by reflection of the first light signal off the target object, and thereby generates at least three collected light signals whose phase delays relative to the second light signal differ from one another; and a control module for dividing a measurement interval into a plurality of measurement segments according to the pulse waveforms of the second light signals continuously received by the at least three receiving windows, determining the corresponding measurement segment according to the radiant energy of the at least three collected light signals, and thereby determining the depth information of the target object, the measurement interval being determined according to the pulse width of the first light signal. By splitting the depth calculation over the whole measurement interval into depth measurements within individual measurement segments, the depth is calculated accurately.

Description

Depth camera
Technical Field
The present invention relates to time-of-flight cameras, and in particular to a depth camera.
Background
In recent years, time-of-flight (ToF) cameras have become a popular 3D imaging technique, gaining popularity in some scientific and consumer applications such as robotic navigation, motion capture, human-machine interface, and 3D mapping.
A ToF camera images much like an ordinary camera and mainly comprises a light source, a photosensitive chip, a lens, a sensor, a drive and control circuit, and a processing circuit. Its two core modules are an emitting (illumination) module and a receiving (light-sensing) module, and depth information is generated from the correlation between them. Photosensitive chips of ToF cameras are divided into single-point and area-array types according to the number of pixel units. To measure the depth of every point on a three-dimensional object surface, a single-point ToF camera can acquire the three-dimensional geometry of the measured object by point-by-point scanning, whereas an area-array ToF camera can acquire the surface geometry of an entire scene in real time from a single exposure; the area-array ToF camera is therefore more readily adopted in consumer electronic systems.
As shown in fig. 1, T: x(t) is the light signal emitted by the light source and R: y(t) is the received light signal. The reflected energy is received through two staggered receiving windows, a first receiving window RX1 and a second receiving window RX2, with radiation intensities B1 and B2 respectively; the ambient light energy is received through a third receiving window RX3 with radiation intensity B3. The phase difference is calculated from the ratio of the effective energies received through the two staggered receiving windows, and the depth Distance of the target object is then obtained. The calculation is specifically:
Figure BDA0002424772930000011
where γ_range is the depth measurement interval,
γ_range = cT/2,
T is the pulse width of the light signal emitted by the light source, and c is the speed of light.
However, the accuracy of a prior-art ToF camera decreases as the measurement interval increases, so it cannot be applied to scenarios that demand high measurement accuracy.
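For concreteness, the conventional two-window calculation above can be sketched in a few lines of Python. The exact expression in the original is given only as an equation image, so the ambient-subtracted ratio used below is an assumption based on the surrounding description; the function name and the example values are illustrative only.

```python
# Minimal sketch of the conventional two-window pulsed ToF depth calculation.
# B1 and B2 are the energies integrated in the two staggered receiving windows,
# B3 is the ambient-light energy. The ambient-subtracted ratio below is an
# assumption based on the surrounding description, not the patent's own formula.

C = 299_792_458.0  # speed of light, m/s


def prior_art_depth(b1: float, b2: float, b3: float, pulse_width_s: float) -> float:
    """Return the estimated distance in metres for one pixel."""
    d_range = C * pulse_width_s / 2.0       # unambiguous measurement interval
    effective = (b1 - b3) + (b2 - b3)       # total reflected energy, ambient removed
    if effective <= 0:
        raise ValueError("no usable reflected signal")
    return d_range * (b2 - b3) / effective  # fraction of energy in the delayed window


# Example: 30 ns pulse, window energies in arbitrary ADC counts
print(prior_art_depth(b1=820.0, b2=410.0, b3=55.0, pulse_width_s=30e-9))
```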
Disclosure of Invention
In view of the deficiencies in the prior art, it is an object of the present invention to provide a depth camera, a system, a device and a storage medium.
The depth camera provided by the invention comprises the following modules:
a light projection module for projecting a first light signal toward a target object in a scene;
the light receiving module receives, through at least three receiving windows, a second light signal formed by reflection of the first light signal off the target object, and thereby generates at least three collected light signals, wherein the phase delays of the at least three collected light signals relative to the second light signal are different from one another;
the control module is used for dividing a measurement interval into a plurality of measurement segments according to the pulse waveforms of the second optical signals continuously received by the at least three receiving windows, determining the corresponding measurement segment according to the radiant energy of the at least three collected optical signals, and thereby determining the depth information of the target object, wherein the measurement interval is determined according to the pulse width of the first optical signal.
Preferably, the light projection module comprises a light source, a light source driver and a light modulator;
the light source driver is connected with the light source and used for driving the light source to emit light;
the optical modulator is connected with the light source and is used for modulating the light projected by the light source and then projecting the first optical signal toward the object to be measured.
Preferably, the light receiving module includes a lens, an optical filter and an image sensor arranged along an optical path, and the image sensor is provided with at least three receiving windows; the pulse width of the receiving window is larger or smaller than the pulse width of the first optical signal;
the image sensor is used for receiving at least three second optical signals through at least three receiving windows; the at least three receiving windows are sequentially arranged in time sequence, and each collected light signal is generated according to the second light signal from each receiving window.
Preferably, when the control module divides the measurement interval into a plurality of measurement segments according to the phase delay of the pulse waveform of the second optical signal received continuously by at least three receiving windows, the measurement segments are sequentially marked in a preset order.
Preferably, the control module determines the depth information of the target object according to a target measurement segment, among the plurality of measurement segments, within which the radiant energy of the at least three collected optical signals falls; specifically:
determining a ratio of the radiation intensities of the at least three second optical signals within the target measurement segment according to a preset ratio relation, determining the depth within the target measurement segment from that ratio, and accumulating the measurement segments preceding the target measurement segment to determine the depth information of the target object.
Preferably, when the at least three receiving windows are a first receiving window, a second receiving window, a third receiving window, and a fourth receiving window, the determining the depth information of the target object specifically includes:
let A1 be the radiation energy curve of the second optical signal received by the first receiving window, A2 the radiation energy curve of the second optical signal received by the second receiving window, A3 the radiation energy curve of the second optical signal received by the third receiving window, and A4 the radiation energy curve of the second optical signal received by the fourth receiving window; the radiation energy curves A1, A2, A3 and A4 differ in phase by a quarter period;
is provided with
Figure BDA0002424772930000031
The rising edge of the radiation energy curve outside the maximum and minimum values is set as A_rise, i_rise, and the falling edge is set as A_fall, i_fall.
According to the values of i_max and i_min, the measurement interval can be divided into four measurement segments, wherein each measurement segment has a rising edge and a falling edge, and the depth d within one measurement segment is expressed as:
Figure BDA0002424772930000032
where d_max is the maximum measurement depth and d_int(i_max, i_min) is the depth offset of the measurement segment determined by the values of i_max and i_min, which may be set as
Figure BDA0002424772930000033
where n is the order of the measurement segment.
Preferably, when the at least three receiving windows are a first receiving window, a second receiving window and a third receiving window, the determining the depth information of the target object specifically includes:
let A1 be the radiation energy curve of the second optical signal received by the first receiving window, A2 the radiation energy curve of the second optical signal received by the second receiving window, and A3 the radiation energy curve of the second optical signal received by the third receiving window; the radiation energy curves A1, A2 and A3 differ in phase by a quarter period.
Is provided with
Figure BDA0002424772930000034
The radiation energy curve on a rising or falling edge, other than the maximum and the minimum, is set as A_mid, i_mid.
According to the values of i_max and i_min, the measurement interval can be divided into six measurement segments, wherein three measurement segments correspond to falling edges and three to rising edges;
the depth d within the corresponding measurement segment for the falling edge is expressed as:
Figure BDA0002424772930000041
the depth d within the corresponding measurement segment for the rising edge is expressed as:
Figure BDA0002424772930000042
where d_max is the maximum measurement depth and d_int(i_max, i_min) is the depth offset of the measurement segment determined by the values of i_max and i_min, which may be set as
Figure BDA0002424772930000043
where n = 0, 1, 2, 3, 4, 5 is the order of the measurement segment.
Preferably, the pulse waveform of the first optical signal is a plurality of rectangular pulses arranged in sequence.
Preferably, the radiation energy curve of the second optical signal received by each receiving window is trapezoidal, and the radiation energy curves of the second optical signals received by a plurality of receiving windows have phase differences.
Preferably, the measurement interval is cT/2, where T is the pulse width of the light signal emitted by the light source, and c is the speed of light.
Compared with the prior art, the invention has the following beneficial effects:
in the invention, the light receiving module divides the measurement interval into a plurality of measurement segments by means of the pulse waveforms of the second light signals continuously received by the at least three receiving windows, so that the control module can determine the corresponding measurement segment according to the radiant energy of the at least three collected light signals, determine the depth within that segment, and then accumulate the measurement segments preceding the target measurement segment to determine the depth information of the target object. Splitting the depth measurement over the whole measurement interval into depth measurements within individual measurement segments allows the depth to be calculated accurately.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description show only embodiments of the present invention, and that those skilled in the art can obtain other drawings from them without creative effort. Other features, objects and advantages of the invention will become more apparent upon reading the detailed description of non-limiting embodiments with reference to the following drawings:
FIG. 1 is a schematic diagram of a prior art time-of-flight camera;
FIG. 2 is a diagram illustrating a depth calculation by time of flight in the prior art;
FIG. 3 is a block diagram of a depth camera in an embodiment of the invention;
FIG. 4 is a block diagram of a light projection module according to an embodiment of the present invention;
FIG. 5 is a block diagram of a light receiving module according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a pulse shape of a first optical signal according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of phase delay of collected light signals according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of a pulse waveform of a first optical signal according to a variation of the present invention;
fig. 9 is a schematic diagram of phase delay of the collected optical signal according to the variation of the present invention;
FIG. 10 is a schematic diagram of a pulse waveform of a second optical signal received by four receiving windows in succession according to an embodiment of the present invention;
FIG. 11 is a schematic diagram illustrating four measurement segments divided by pulse waveforms of four second optical signals according to an embodiment of the present invention;
fig. 12 is a schematic diagram of a measurement interval divided into 6 measurement segments by pulse waveforms of three second optical signals according to an embodiment of the present invention.
Detailed Description
The present invention will be described in detail with reference to specific examples. The following examples will assist those skilled in the art in further understanding the invention, but are not intended to limit the invention in any way. It should be noted that variations and modifications can be made by persons skilled in the art without departing from the spirit of the invention, all of which fall within the scope of the present invention.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The technical solution of the present invention will be described in detail below with specific examples. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments.
The invention provides a depth camera, and aims to solve the problems in the prior art.
The following describes the technical solutions of the present invention and how to solve the above technical problems with specific embodiments. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments. Embodiments of the present invention will be described below with reference to the accompanying drawings.
Fig. 3 is a schematic block diagram of a depth camera according to an embodiment of the present invention, and as shown in fig. 3, the depth camera provided by the present invention includes the following modules:
a light projection module for projecting a first light signal toward a target object in a scene;
the light receiving module receives, through at least three receiving windows, a second light signal formed by reflection of the first light signal off the target object, and thereby generates at least three collected light signals, wherein the phase delays of the at least three collected light signals relative to the second light signal are different from one another;
the control module is used for dividing a measurement interval into a plurality of measurement segments according to the pulse waveforms of the second optical signals continuously received by the at least three receiving windows, determining the corresponding measurement segment according to the radiant energy of the at least three collected optical signals, and thereby determining the depth information of the target object, wherein the measurement interval is determined according to the pulse width of the first optical signal.
Fig. 4 is a schematic block diagram of a light projection module according to an embodiment of the present invention, as shown in fig. 4, the light projection module includes a light source, a light source driver, and a light modulator;
the light source driver is connected with the light source and used for driving the light source to emit light;
the optical modulator is connected with the light source and is used for modulating the light projected by the light source and then projecting the first optical signal toward the object to be measured.
In the embodiment of the present invention, the modulated first optical signal is a plurality of rectangular pulses arranged in sequence. Fig. 6 is a schematic diagram of the pulse waveform of the first optical signal in the embodiment of the present invention; as shown in fig. 6, the pulse waveform of the first optical signal is rectangular.
Fig. 5 is a schematic block diagram of a light receiving module according to an embodiment of the present invention, where as shown in fig. 5, the light receiving module includes a lens, an optical filter, and an image sensor disposed along a light path, and the image sensor is provided with at least three receiving windows; the pulse width of the receiving window is larger or smaller than the pulse width of the first optical signal;
the image sensor is used for receiving at least three second optical signals through at least three receiving windows; the at least three receiving windows are sequentially arranged in time sequence; and generating each collected light signal according to the second light signal from each receiving window.
Fig. 7 is a schematic diagram of phase delay of the collected light signals according to the embodiment of the present invention, and as shown in fig. 7, the four collected light signals have different phase delays relative to the second light signal, and are sequentially arranged in time sequence.
In the embodiment of the present invention, the pulse width of the receiving window is greater than the pulse width of the first optical signal; specifically, it is twice the pulse width of the first optical signal.
Fig. 8 is a schematic diagram of the pulse waveform of the first optical signal according to a modification of the present invention, and fig. 9 is a schematic diagram of the phase delay of the collected optical signals according to the modification. As shown in figs. 8 and 9, in this modification the pulse width of the receiving window is smaller than the pulse width of the first optical signal; specifically, it is one half of the pulse width of the first optical signal.
When the control module divides the measuring interval into a plurality of measuring sections according to the phase delay of the pulse waveform of the second optical signal continuously received by at least three receiving windows, the measuring sections are marked sequentially according to a preset sequence.
In the embodiment of the invention, the measurement interval is cT/2, where T is the pulse width of the light signal emitted by the light source, and c is the speed of light.
Fig. 10 is a schematic diagram of pulse waveforms of second optical signals continuously received by four receiving windows in an embodiment of the present invention, as shown in fig. 10, a pulse width of the receiving window is greater than or less than a pulse width of the first optical signal, the pulse waveform of the first optical signal is a plurality of rectangles arranged in sequence, a radiation energy curve of the second optical signal received by each receiving window is trapezoidal, and a phase difference exists between radiation energy curves of the second optical signals received by the plurality of receiving windows.
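The trapezoidal shape of these curves follows from gating a rectangular return pulse with a rectangular receiving window: the collected energy as a function of round-trip delay is the overlap of two rectangles. The sketch below checks this numerically; the pulse width, window width and window offsets are illustrative values chosen here, not parameters taken from the embodiment.

```python
import numpy as np

# Numerical check that gating a rectangular return pulse with a rectangular
# receiving window yields a trapezoidal energy-vs-delay curve. Pulse width,
# window width and window offsets are illustrative values, not the embodiment's.

T = 30e-9                                   # emitted pulse width (s)
WINDOW = 2 * T                              # receiving-window width (twice the pulse width)
OFFSETS = [0.0, 0.5 * T, 1.0 * T, 1.5 * T]  # four staggered windows (assumed spacing)

t = np.linspace(0.0, 6 * T, 6000)
dt = t[1] - t[0]


def window_energy(delay: float, offset: float) -> float:
    """Energy collected by one window when the return pulse is delayed by `delay`."""
    pulse = ((t >= delay) & (t < delay + T)).astype(float)
    gate = ((t >= offset) & (t < offset + WINDOW)).astype(float)
    return float(np.sum(pulse * gate) * dt)


delays = np.linspace(0.0, 3 * T, 200)
curves = np.array([[window_energy(d, o) for d in delays] for o in OFFSETS])
# Each row rises linearly, plateaus, then falls linearly: a trapezoid.
print(curves.shape, float(curves.max()))
```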
When the control module determines the depth information of the target object according to the target measurement segment, among the plurality of measurement segments, within which the radiation energy of the at least three collected light signals falls, it does so specifically by:
determining a ratio of the radiation intensities of the at least three second optical signals within the target measurement segment according to a preset ratio relation, determining the depth within the target measurement segment from that ratio, and accumulating the measurement segments preceding the target measurement segment to determine the depth information of the target object.
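In outline, this amounts to adding the widths of all measurement segments that precede the target segment to a within-segment depth obtained from the intensity ratio. The sketch below assumes equal-width segments and a linear ratio-to-depth mapping; both are simplifying assumptions for illustration, since the preset ratio relation itself is not spelled out in the text.

```python
def segment_depth(n_target: int, num_segments: int, d_max: float, ratio: float) -> float:
    """Depth from a target segment index and a within-segment intensity ratio.

    n_target      index of the target measurement segment (0-based)
    num_segments  number of measurement segments the interval is divided into
    d_max         maximum measurement depth (the full measurement interval)
    ratio         within-segment ratio in [0, 1] derived from the preset ratio
                  relation of the collected-signal intensities (assumed linear here)
    """
    seg_width = d_max / num_segments
    return n_target * seg_width + ratio * seg_width  # accumulated offset + in-segment depth


# Example: third of four segments, 4.5 m interval, ratio 0.25 -> about 2.53 m
print(segment_depth(n_target=2, num_segments=4, d_max=4.5, ratio=0.25))
```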
Fig. 11 is a schematic diagram in which the pulse waveforms of the four second optical signals divide the measurement interval into 8 measurement segments according to an embodiment of the present invention. As shown in fig. 11, A1 is the radiation energy curve of the second optical signal received by the first receiving window; A2 is the radiation energy curve of the second optical signal received by the second receiving window; A3 is the radiation energy curve of the second optical signal received by the third receiving window; A4 is the radiation energy curve of the second optical signal received by the fourth receiving window; the radiation energy curves A1, A2, A3 and A4 differ in phase by a quarter period.
Is provided with
Figure BDA0002424772930000072
The rising edge of the radiation energy curve outside the maximum and minimum values is set as A_rise, i_rise, and the falling edge is set as A_fall, i_fall.
According to the values of i_max and i_min, the measurement interval can be divided into four measurement segments, wherein each measurement segment has a rising edge and a falling edge, and the depth d within one measurement segment is expressed as:
Figure BDA0002424772930000081
where d_max is the maximum measurement depth and d_int(i_max, i_min) is the depth offset of the measurement segment determined by the values of i_max and i_min, which may be set as
Figure BDA0002424772930000082
where n = 0, 1, 2, 3 is the order of the measurement segment.
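As a concrete illustration of the four-window case, the sketch below finds the maximum and minimum curves (i_max, i_min) from the four sampled energies and treats the two remaining values as A_rise and A_fall. The cyclic rule used to pick the rising and falling curves, the segment order n, and the linear A_rise/(A_rise + A_fall) interpolation are all assumptions standing in for the equation images of the original, under an assumed window ordering.

```python
def four_window_depth(a: list[float], d_max: float) -> float:
    """Depth for one pixel from four quarter-phase-shifted window energies.

    `a` holds the energies A1..A4 sampled by the four receiving windows.
    Assumptions (the patent gives these steps only as equation images):
      * windows are ordered so that the rising-edge curve is the one whose
        index cyclically follows the maximum curve,
      * the segment offset is n * d_max / 4 with n the segment order,
      * the within-segment depth is linear in A_rise / (A_rise + A_fall).
    """
    i_max = max(range(4), key=lambda i: a[i])
    i_min = min(range(4), key=lambda i: a[i])
    i_rise = (i_max + 1) % 4          # assumed phase ordering
    i_fall = (i_min + 1) % 4
    n = i_rise                        # segment order implied by (i_max, i_min) here
    a_rise, a_fall = a[i_rise], a[i_fall]
    seg = d_max / 4.0
    return n * seg + seg * a_rise / (a_rise + a_fall)


# Example: curve A4 at its plateau, curve A2 near zero -> a depth in the first segment
print(four_window_depth([0.3, 0.02, 0.7, 1.0], d_max=4.5))
```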
Fig. 12 is a schematic diagram in which the pulse waveforms of the three second optical signals divide the measurement interval into 6 measurement segments according to an embodiment of the present invention. As shown in fig. 12, A1 is the radiation energy curve of the second optical signal received by the first receiving window; A2 is the radiation energy curve of the second optical signal received by the second receiving window; A3 is the radiation energy curve of the second optical signal received by the third receiving window; the radiation energy curves A1, A2 and A3 differ in phase by a quarter period.
Is provided with
Figure BDA0002424772930000083
The radiation energy curve on a rising or falling edge, other than the maximum and the minimum, is set as A_mid, i_mid.
According to the values of i_max and i_min, the measurement interval can be divided into six measurement segments, wherein three measurement segments correspond to falling edges and three to rising edges:
for the depth d within the measurement segment corresponding to the falling edge:
Figure BDA0002424772930000084
for the depth d within the measurement segment corresponding to the rising edge:
Figure BDA0002424772930000085
where d_max is the maximum measurement depth and d_int(i_max, i_min) is the depth offset of the measurement segment determined by the values of i_max and i_min, which may be set as
Figure BDA0002424772930000086
where n = 0, 1, 2, 3, 4, 5 is the order of the measurement segment.
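A similar sketch for the three-window case shows the structure of the six-segment classification: the pair (i_max, i_min) selects the segment order and whether the middle curve A_mid lies on a rising or a falling edge. The segment table and the linear interpolation below are placeholders for illustration only; the actual assignment and formulas are those given by the equation images and Fig. 12 of the original.

```python
# Hypothetical segment table: (i_max, i_min) -> (segment order n, mid curve is rising).
# The real assignment must be read off the waveforms of Fig. 12; this mapping only
# shows the structure of the lookup, not the patent's own values.
SEGMENTS = {
    (0, 1): (0, True), (0, 2): (1, False), (1, 2): (2, True),
    (1, 0): (3, False), (2, 0): (4, True), (2, 1): (5, False),
}


def three_window_depth(a: list[float], d_max: float) -> float:
    """Depth from three window energies, distinguishing rising/falling A_mid segments."""
    i_max = max(range(3), key=lambda i: a[i])
    i_min = min(range(3), key=lambda i: a[i])
    if i_max == i_min:
        raise ValueError("degenerate sample: all window energies equal")
    i_mid = 3 - i_max - i_min
    n, rising = SEGMENTS[(i_max, i_min)]
    frac = (a[i_mid] - a[i_min]) / (a[i_max] - a[i_min])  # assumed linear interpolation
    seg = d_max / 6.0
    return n * seg + (frac if rising else 1.0 - frac) * seg


print(three_window_depth([1.0, 0.4, 0.1], d_max=4.5))
```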
In the embodiment of the present invention, the maximum value max(B) of the radiation energy curve of the second optical signal is 2^N - 1, where N is the number of bits of the analog-to-digital converter.
In the embodiment of the invention, the light receiving module divides the measurement interval into a plurality of measurement segments by means of the pulse waveforms of the second light signals continuously received by the at least three receiving windows, so that the control module can determine the corresponding measurement segment according to the radiant energy of the at least three collected light signals, determine the depth within that segment, and then accumulate the measurement segments preceding the target measurement segment to determine the depth information of the target object. Splitting the depth measurement over the whole measurement interval into depth measurements within individual measurement segments allows the depth to be calculated accurately.
The embodiments in the present description are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description of specific embodiments of the present invention has been presented. It is to be understood that the present invention is not limited to the specific embodiments described above, and that various changes and modifications may be made by one skilled in the art within the scope of the appended claims without departing from the spirit of the invention.

Claims (10)

1. A depth camera, comprising the following modules:
a light projection module for projecting a first light signal toward a target object in a scene;
the optical receiving module is used for receiving, through at least three receiving windows, a second optical signal formed by reflection of the first optical signal off the target object, so as to generate at least three collected optical signals, wherein the phase delays of the at least three collected optical signals relative to the second optical signal are different from one another;
the control module is used for dividing a measurement interval into a plurality of measurement segments according to the pulse waveforms of the second optical signals continuously received by the at least three receiving windows, determining the corresponding measurement segment according to the radiant energy of the at least three collected optical signals, and thereby determining the depth information of the target object, wherein the measurement interval is determined according to the pulse width of the first optical signal.
2. The depth camera of claim 1, wherein the light projection module comprises a light source, a light source driver, and a light modulator;
the light source driver is connected with the light source and used for driving the light source to emit light;
the optical modulator is connected with the light source and is used for modulating the light projected by the light source and then projecting the first optical signal toward the object to be measured.
3. The depth camera according to claim 1, wherein the light receiving module includes a lens, a filter, and an image sensor disposed along an optical path, the image sensor being provided with at least three of the receiving windows; the pulse width of the receiving window is larger or smaller than the pulse width of the first optical signal;
the image sensor is used for receiving at least three second optical signals through at least three receiving windows; the at least three receiving windows are sequentially arranged in time sequence, and each collected light signal is generated according to the second light signal from each receiving window.
4. The depth camera of claim 3, wherein the control module divides the measurement interval into a plurality of measurement segments according to phase delays of pulse waveforms of the second optical signal received continuously by at least three receiving windows, and sequentially marks the measurement segments in a preset order.
5. The depth camera of claim 1, wherein the control module determines the depth information of the target object according to a target measurement segment, among the plurality of measurement segments, within which the radiant energy of the at least three collected light signals falls, specifically by:
determining a ratio of the radiation intensities of the at least three second optical signals within the target measurement segment according to a preset ratio relation, determining the depth within the target measurement segment from that ratio, and accumulating the measurement segments preceding the target measurement segment to determine the depth information of the target object.
6. The depth camera according to claim 1, wherein when the at least three receiving windows are a first receiving window, a second receiving window, a third receiving window, and a fourth receiving window, the determining the depth information of the target object specifically comprises:
let A1 be the radiation energy curve of the second optical signal received by the first receiving window; A2 be the radiation energy curve of the second optical signal received by the second receiving window; A3 be the radiation energy curve of the second optical signal received by the third receiving window; and A4 be the radiation energy curve of the second optical signal received by the fourth receiving window; the radiation energy curves A1, A2, A3 and A4 differing in phase by a quarter period;
is provided with
Figure FDA0002424772920000021
the rising edge of the radiation energy curve outside the maximum and minimum values is set as A_rise, i_rise, and the falling edge is set as A_fall, i_fall;
according to the values of i_max and i_min, the measurement interval can be divided into four measurement segments, wherein each measurement segment has a rising edge and a falling edge, and the depth d within one measurement segment is expressed as:
Figure FDA0002424772920000022
wherein d_max is the maximum measurement depth and d_int(i_max, i_min) is the depth offset of the measurement segment determined by the values of i_max and i_min, which may be set as
Figure FDA0002424772920000023
n being the order of the measurement segment.
7. The depth camera according to claim 1, wherein when the at least three receiving windows are a first receiving window, a second receiving window, and a third receiving window, the determining the depth information of the target object specifically comprises:
let A1 be the radiation energy curve of the second optical signal received by the first receiving window; A2 be the radiation energy curve of the second optical signal received by the second receiving window; and A3 be the radiation energy curve of the second optical signal received by the third receiving window; the radiation energy curves A1, A2 and A3 differing in phase by a quarter period;
is provided with
Figure FDA0002424772920000031
the rising and falling edges of the radiation energy curve that are also outside the maximum and minimum values are set as A_mid, i_mid;
according to the values of i_max and i_min, the measurement interval can be divided into six measurement segments, wherein three measurement segments correspond to falling edges and three to rising edges;
the depth d within the corresponding measurement segment for the falling edge is expressed as:
Figure FDA0002424772920000032
the depth d within the corresponding measurement segment for the rising edge is expressed as:
Figure FDA0002424772920000033
wherein d_max is the maximum measurement depth and d_int(i_max, i_min) is the depth offset of the measurement segment determined by the values of i_max and i_min, which may be set as
Figure FDA0002424772920000034
n being the order of the measurement segment.
8. The depth camera of claim 1, wherein the pulse waveform of the first light signal is in the form of a plurality of rectangles arranged in sequence.
9. The depth camera of claim 1, wherein the radiation energy profile of the second optical signal received by each receiving window is trapezoidal and the radiation energy profiles of the second optical signals received by a plurality of receiving windows are out of phase.
10. The depth camera of claim 1, wherein the measurement interval is cT/2, where T is the pulse width of the light signal emitted by the light source, and c is the speed of light.
CN202010216873.1A 2020-03-25 2020-03-25 Depth camera Pending CN113514851A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010216873.1A CN113514851A (en) 2020-03-25 2020-03-25 Depth camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010216873.1A CN113514851A (en) 2020-03-25 2020-03-25 Depth camera

Publications (1)

Publication Number Publication Date
CN113514851A true CN113514851A (en) 2021-10-19

Family

ID=78060130

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010216873.1A Pending CN113514851A (en) 2020-03-25 2020-03-25 Depth camera

Country Status (1)

Country Link
CN (1) CN113514851A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101114021A (en) * 2006-06-14 2008-01-30 埃里斯红外线高智力传感器有限责任公司 Device and method for determining distance
CN108474849A (en) * 2016-02-17 2018-08-31 松下知识产权经营株式会社 Distance-measuring device
CN109870704A (en) * 2019-01-23 2019-06-11 深圳奥比中光科技有限公司 TOF camera and its measurement method
CN109917412A (en) * 2019-02-01 2019-06-21 深圳奥比中光科技有限公司 A kind of distance measurement method and depth camera
CN109991584A (en) * 2019-03-14 2019-07-09 深圳奥比中光科技有限公司 A kind of jamproof distance measurement method and depth camera
CN109991583A (en) * 2019-03-14 2019-07-09 深圳奥比中光科技有限公司 A kind of jamproof distance measurement method and depth camera
CN110187355A (en) * 2019-05-21 2019-08-30 深圳奥比中光科技有限公司 A kind of distance measurement method and depth camera
CN110389351A (en) * 2018-04-16 2019-10-29 宁波飞芯电子科技有限公司 TOF range sensor, sensor array and the distance measuring method based on TOF range sensor
CN110412599A (en) * 2018-04-27 2019-11-05 索尼半导体解决方案公司 Range measurement processing unit, distance-measurement module and range measurement processing method

Similar Documents

Publication Publication Date Title
US20230273320A1 (en) Processing system for lidar measurements
US20230213656A1 (en) Processing Of Lidar Images
WO2021051477A1 (en) Time of flight distance measurement system and method with adjustable histogram
US8369575B2 (en) 3D image processing method and apparatus for improving accuracy of depth measurement of an object in a region of interest
US20200341144A1 (en) Independent per-pixel integration registers for lidar measurements
US9366759B2 (en) Apparatus and method for generating depth image
US10545237B2 (en) Method and device for acquiring distance information
US11287517B2 (en) Single frame distance disambiguation
US11294058B2 (en) Motion correction based on phase vector components
KR20120071970A (en) 3d image acquisition apparatus and method of extractig depth information in the 3d image acquisition apparatus
CN111123289B (en) Depth measuring device and measuring method
KR102650443B1 (en) Fully waveform multi-pulse optical rangefinder instrument
JP2010190675A (en) Distance image sensor system and method of generating distance image
US10948596B2 (en) Time-of-flight image sensor with distance determination
CN111896971B (en) TOF sensing device and distance detection method thereof
CN110361751A (en) The distance measurement method of time flight depth camera and the reduction noise of single-frequency modulation /demodulation
KR20210033545A (en) Method and system for increasing the range of a time-of-flight system by clear range toggling
CN111427230A (en) Imaging method based on time flight and 3D imaging device
CN111366943B (en) Flight time ranging system and ranging method thereof
CN111308482A (en) Filtered continuous wave time-of-flight measurements based on coded modulation images
US20220252730A1 (en) Time-of-flight imaging apparatus and time-of-flight imaging method
WO2021113001A1 (en) Configurable array of single-photon detectors
CN110062894B (en) Apparatus and method
CN113514851A (en) Depth camera
CN112513670A (en) Range finder, range finding system, range finding method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination