CN113447954A - Scene depth measuring method, system, device and storage medium - Google Patents

Scene depth measuring method, system, device and storage medium

Info

Publication number: CN113447954A
Application number: CN202010216887.3A
Authority: CN (China)
Prior art keywords: measurement, depth, optical signal, receiving, light
Legal status: Pending (the legal status is an assumption and is not a legal conclusion)
Other languages: Chinese (zh)
Inventors: 苏公喆, 杨心杰, 朱力
Current assignee: Shenzhen Guangjian Technology Co Ltd
Original assignee: Shenzhen Guangjian Technology Co Ltd
Application filed by Shenzhen Guangjian Technology Co Ltd
Priority to: CN202010216887.3A

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G01S17/89: Lidar systems specially adapted for specific applications for mapping or imaging
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02: Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06: Systems determining position data of a target
    • G01S17/08: Systems determining position data of a target for measuring distance only
    • G01S17/32: Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/483: Details of pulse systems
    • G01S7/486: Receivers
    • G01S7/4865: Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak

Abstract

The invention provides a scene depth measurement method, system, device and storage medium. The method comprises the following steps: controlling a light source to project a first optical signal onto a target object in a scene, the first optical signal being reflected by the target object to form a second optical signal; receiving the second optical signal in at least three receiving windows of an image sensor to generate at least three collected optical signals, the at least three collected optical signals differing in phase delay relative to the second optical signal; dividing a measurement interval into a plurality of measurement segments according to the pulse waveforms of the second optical signal successively received by the at least three receiving windows, the measurement interval being determined by the pulse width of the first optical signal; and determining the corresponding measurement segment from the radiant energies of the at least three collected optical signals, and thereby determining the depth information of the target object. By splitting the depth calculation over the whole measurement interval into depth measurements over a plurality of measurement segments, the depth is calculated accurately.

Description

Scene depth measuring method, system, device and storage medium
Technical Field
The present invention relates to time-of-flight cameras, and in particular, to a method, system, device, and storage medium for measuring scene depth.
Background
In recent years, time-of-flight (ToF) cameras have become a popular 3D imaging technique, gaining popularity in some scientific and consumer applications such as robotic navigation, motion capture, human-machine interface, and 3D mapping.
The imaging process of a ToF camera is similar to that of an ordinary camera; the camera mainly comprises a light source, a photosensitive chip, a lens, a sensor, a driving control circuit, a processing circuit and other key units. A ToF camera contains two core modules, an emitting illumination module and a receiving photosensitive module, and generates depth information from the correlation between these two core modules. According to the number of pixel units, the photosensitive chip of a ToF camera is divided into single-point and area-array types. To measure the depth of every position on a three-dimensional object surface, a single-point ToF camera can acquire the three-dimensional geometry of the measured object by scanning it point by point, whereas an area-array ToF camera can acquire the surface geometry of an entire scene in real time from a single shot, which makes area-array ToF cameras more favored in consumer electronic systems.
As shown in fig. 1, T: x(t) is the optical signal emitted by the light source and R: y(t) is the reflected optical signal received by the sensor. The reflected energy is received through two staggered receiving windows, a first receiving window RX1 and a second receiving window RX2, with radiation intensities B1 and B2 respectively; the ambient light energy is received through a third receiving window RX3 with radiation intensity B3. The phase difference is calculated from the ratio of the effective energies received through the two staggered windows, and from it the depth Distance of the target object is obtained. The calculation (the equation survives only as an image in the original; the standard pulsed-ToF form is reconstructed here) is specifically:

Distance = d_range * (B2 - B3) / ((B1 - B3) + (B2 - B3))

where d_range is the depth measurement interval,

d_range = c * T / 2

and T is the pulse width of the optical signal emitted by the light source and c is the speed of light.
However, the accuracy of such prior-art ToF cameras decreases as the measurement interval increases, so they cannot be applied in scenarios that demand high measurement accuracy.
Disclosure of Invention
In view of the defects in the prior art, the present invention aims to provide a method, a system, a device and a storage medium for measuring scene depth.
The scene depth measuring method provided by the invention comprises the following steps:
step S1: controlling a light source to project a first light signal to a target object in a scene, wherein the first light signal is reflected by the target object to form a second light signal;
step S2: receiving the second optical signal at least three receiving windows by an image sensor to generate at least three collected optical signals, the at least three collected optical signals differing in phase delay relative to the second optical signal;
step S3: dividing a measurement interval into a plurality of measurement sections according to pulse waveforms of second optical signals continuously received by at least three receiving windows, wherein the measurement interval is determined according to the pulse width of the first optical signal;
step S4: and determining a corresponding measuring section according to the radiation energy of the at least three collected light signals, and further determining the depth information of the target object.
Preferably, the step S1 includes the steps of:
step S101: acquiring a preset modulation function, and generating a first modulation signal according to the modulation function;
step S102: adjusting the light beam emitted by the light source according to the first modulation signal, so that the light source generates the first light signal;
step S103: projecting the first optical signal to the target object, the first optical signal being reflected by the target object to form a second optical signal.
Preferably, the step S2 includes the steps of:
step S201: acquiring preset receiving windows, wherein the pulse width of each receiving window is greater than or less than the pulse width of the first optical signal, and the pulse waveform of the first optical signal consists of a plurality of rectangles arranged in sequence;
step S202: receiving the second optical signal through at least three receiving windows, wherein the at least three receiving windows are arranged sequentially in time;
step S203: generating a collected optical signal from the second optical signal received in each receiving window, wherein the phase delays of the at least three collected optical signals relative to the second optical signal are different.
Preferably, the step S3 includes the steps of:
step S301: determining a measurement interval according to the pulse width of the first optical signal;
step S302: dividing the measurement interval into a plurality of measurement sections according to the phase delay of the pulse waveform of the second optical signal continuously received by at least three receiving windows;
step S303: the measurement segments are marked in sequence in a preset order.
Preferably, the step S4 includes the steps of:
step S401: determining, according to the radiant energies of the at least three collected optical signals, a target measurement segment among the plurality of measurement segments in which the depth information of the target object is located;
step S402: determining the ratio of the radiation intensities of the at least three second optical signals within the target measurement segment according to a preset proportional relation;
step S403: determining, from the ratio, the depth corresponding to the radiation intensities of the at least three second optical signals within the target measurement segment, and accumulating the measurement segments preceding the target measurement segment to determine the depth information of the target object.
Preferably, when the at least three receiving windows are a first receiving window, a second receiving window, a third receiving window and a fourth receiving window, step S4 specifically comprises:
let A1 be the radiant energy curve of the second optical signal received by the first receiving window, A2 the radiant energy curve of the second optical signal received by the second receiving window, A3 the radiant energy curve of the second optical signal received by the third receiving window, and A4 the radiant energy curve of the second optical signal received by the fourth receiving window; the radiant energy curves A1, A2, A3 and A4 are one quarter of a period apart in phase;
let (A_max, i_max) and (A_min, i_min) denote the maximum and minimum values taken among the radiant energy curves and the indices of the curves on which they occur (the defining equation is rendered only as an image in the original);
of the remaining curves, the one on a rising edge is denoted A_rise, i_rise, and the one on a falling edge A_fall, i_fall;
according to the values of i_max and i_min, the measurement interval can be divided into four measurement segments, each having a rising edge and a falling edge, and the depth d within a measurement segment is expressed by an equation rendered only as an image in the original,
where d_max is the maximum measurement depth and d_int(i_max, i_min) is the depth offset of the segment determined by the values of i_max and i_min, which may take a value given by an equation rendered only as an image in the original, with n the order of the measurement segments.
Preferably, when the at least three receiving windows are a first receiving window, a second receiving window and a third receiving window, step S4 specifically comprises:
let A1 be the radiant energy curve of the second optical signal received by the first receiving window, A2 the radiant energy curve of the second optical signal received by the second receiving window, and A3 the radiant energy curve of the second optical signal received by the third receiving window; the radiant energy curves A1, A2 and A3 are one quarter of a period apart in phase;
let (A_max, i_max) and (A_min, i_min) denote the maximum and minimum values taken among the radiant energy curves and the indices of the curves on which they occur; the curve that is neither the maximum nor the minimum, lying on a rising or falling edge, is denoted A_mid, i_mid (the defining equations are rendered only as images in the original);
according to the values of i_max and i_min, the measurement interval can be divided into six measurement segments, three on falling edges and three on rising edges;
the depth d within a measurement segment on a falling edge, and the depth d within a measurement segment on a rising edge, are each expressed by an equation rendered only as an image in the original,
where d_max is the maximum measurement depth and d_int(i_max, i_min) is the depth offset of the segment determined by the values of i_max and i_min, which may take a value given by an equation rendered only as an image in the original, with n the order of the measurement segments.
The scene depth measuring system provided by the invention comprises the following modules:
the system comprises a light projection module, a light source and a control module, wherein the light projection module is used for controlling the light source to project a first light signal to a target object in a scene, and the first light signal is reflected by the target object to form a second light signal;
the light receiving module is used for receiving second light signals at least three receiving windows through an image sensor to generate at least three collected light signals, and the phase delays of the at least three collected light signals relative to the second light signals are different;
the measurement section generation module is used for dividing a measurement section into a plurality of measurement sections according to pulse waveforms of second optical signals continuously received by at least three receiving windows, wherein the measurement section is determined according to the pulse width of the first optical signal;
a depth calculation module for determining depth information of the target object from the plurality of measurement segments and the radiant energy of the at least three collected light signals.
According to the present invention, there is provided a scene depth measuring apparatus comprising:
a processor;
a memory having stored therein executable instructions of the processor;
wherein the processor is configured to perform the steps of the scene depth measurement method of any of claims 1 to 7 via execution of the executable instructions.
According to the present invention, there is provided a computer-readable storage medium for storing a program which, when executed, implements the steps of the scene depth measurement method.
Compared with the prior art, the invention has the following beneficial effects:
in the invention, the measuring interval is divided into a plurality of measuring sections through the pulse waveforms of the second optical signals continuously received by the at least three receiving windows, so that the corresponding measuring sections can be determined according to the radiation energy of the at least three collected optical signals, the depth is determined in the measuring sections, the measuring sections in front of the target measuring section are accumulated to determine the depth information of the target object, the depth measurement of the whole measuring interval is divided into the depth measurement of the plurality of measuring sections, and the accurate calculation of the depth is realized.
Drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly described below. The drawings in the following description show only embodiments of the present invention; those skilled in the art can derive other drawings from them without creative effort. Other features, objects and advantages of the invention will become more apparent upon reading the detailed description of non-limiting embodiments with reference to the following drawings:
FIG. 1 is a schematic diagram of a prior art time-of-flight camera;
FIG. 2 is a diagram illustrating a depth calculation by time of flight in the prior art;
FIG. 3 is a flowchart illustrating steps of a method for measuring scene depth according to an embodiment of the present invention;
FIG. 4 is a flowchart illustrating the steps of forming a second optical signal according to an embodiment of the present invention;
FIG. 5 is a flowchart illustrating steps for collecting a light signal according to an embodiment of the present invention;
FIG. 6 is a flowchart illustrating steps for measurement segment generation according to an embodiment of the present invention;
FIG. 7 is a flowchart illustrating the steps of generating depth information for a target object according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of a pulse shape of a first optical signal according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of phase delay of collected light signals according to an embodiment of the present invention;
FIG. 10 is a schematic diagram of a pulse waveform of a first optical signal according to a variation of the present invention;
fig. 11 is a schematic diagram of phase delay of an acquired optical signal according to a variation of the present invention;
FIG. 12 is a schematic diagram of a pulse waveform of a second optical signal received by four receiving windows in succession according to an embodiment of the present invention;
FIG. 13 is a schematic diagram illustrating four measurement segments divided by pulse waveforms of four second optical signals according to an embodiment of the present invention;
FIG. 14 is a schematic diagram of three pulse waveforms of the second optical signal dividing the measurement interval into 6 measurement segments according to an embodiment of the present invention;
FIG. 15 is a block diagram of a scene depth measurement system according to an embodiment of the present invention;
FIG. 16 is a schematic structural diagram of a scene depth measuring device according to an embodiment of the present invention; and
fig. 17 is a schematic structural diagram of a computer-readable storage medium according to an embodiment of the present invention.
Detailed Description
The present invention will be described in detail with reference to specific examples. The following examples will assist those skilled in the art in further understanding the invention, but are not intended to limit the invention in any way. It should be noted that variations and modifications can be made by persons skilled in the art without departing from the spirit of the invention; all such variations and modifications fall within the scope of the present invention.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The technical solution of the present invention will be described in detail below with specific examples. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments.
The invention provides a scene depth measuring method, and aims to solve the problems in the prior art.
The following describes the technical solutions of the present invention and how to solve the above technical problems with specific embodiments. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments. Embodiments of the present invention will be described below with reference to the accompanying drawings.
Fig. 3 is a flowchart illustrating steps of a scene depth measurement method according to an embodiment of the present invention, and as shown in fig. 3, the scene depth measurement method provided by the present invention includes the following steps:
step S1: controlling a light source to project a first light signal to a target object in a scene, wherein the first light signal is reflected by the target object to form a second light signal;
fig. 4 is a flowchart of steps of forming a second optical signal according to an embodiment of the present invention, and as shown in fig. 4, the step S1 includes the following steps:
step S101: acquiring a preset modulation function, and generating a first modulation signal according to the modulation function;
step S102: adjusting the light beam emitted by the light source according to the first modulation signal, so that the light source generates the first light signal;
step S103: projecting the first optical signal to the target object, the first optical signal being reflected by the target object to form a second optical signal.
In the embodiment of the present invention, the pulse waveform of the modulated first optical signal consists of a plurality of rectangles arranged in sequence. Fig. 8 is a schematic diagram of the pulse waveform of the first optical signal in the embodiment of the present invention; as shown in fig. 8, the pulses of the first optical signal are rectangular.
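As an illustrative sketch (the function and its parameters are assumptions for illustration, not taken from the patent), a first optical signal whose pulse waveform is a sequence of rectangles can be modeled as a periodic on/off function:

```python
def rectangular_pulse(t, pulse_width, period):
    """Value of a periodic rectangular pulse train at time t:
    1.0 during the first pulse_width of each period, 0.0 otherwise."""
    return 1.0 if (t % period) < pulse_width else 0.0

# Sample one period: on for the first quarter of the period, off for the rest.
samples = [rectangular_pulse(0.25 * k, 1.0, 4.0) for k in range(16)]
```

Driving a light source with such a modulation signal yields the train of rectangular pulses that fig. 8 depicts.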
Step S2: receiving, by an image sensor, a second optical signal at least three receiving windows to generate at least three collected optical signals, the at least three collected optical signals differing in phase delay relative to the second optical signal;
fig. 5 is a flowchart illustrating steps of collecting a light signal according to an embodiment of the present invention, and as shown in fig. 5, the step S2 includes the following steps:
step S201: acquiring preset receiving windows, wherein the pulse width of each receiving window is greater than or less than the pulse width of the first optical signal, and the pulse waveform of the first optical signal consists of a plurality of rectangles arranged in sequence;
step S202: receiving the second optical signal through at least three receiving windows, wherein the at least three receiving windows are arranged sequentially in time;
step S203: generating a collected optical signal from the second optical signal received in each receiving window, wherein the phase delays of the at least three collected optical signals relative to the second optical signal are different.
Fig. 9 is a schematic diagram of phase delay of the collected light signals according to the embodiment of the present invention, and as shown in fig. 9, the four collected light signals have different phase delays relative to the second light signal, and are sequentially arranged in time sequence.
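The window-wise collection of step S2 can be sketched as integrating the reflected signal over time-shifted receiving windows. The simulation below is illustrative only, under assumed sample spacing and window offsets; it is not the sensor's actual readout:

```python
def window_energy(received, t_start, t_width, dt):
    """Sum the received-signal samples falling inside the
    receiving window [t_start, t_start + t_width)."""
    total = 0.0
    for i, sample in enumerate(received):
        t = i * dt
        if t_start <= t < t_start + t_width:
            total += sample * dt
    return total

dt = 0.1
# A rectangular echo (second optical signal) delayed by 0.5 time units, lasting 1.0.
echo = [1.0 if 0.5 <= i * dt < 1.5 else 0.0 for i in range(40)]
# Four receiving windows of width 2.0, each shifted by 0.5: four collected
# optical signals with different phase delays relative to the echo.
energies = [window_energy(echo, 0.5 * k, 2.0, dt) for k in range(4)]
```

The collected energies fall off as the window slides past the echo; the pattern of which windows capture more energy is what encodes the delay, and hence the depth.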
Step S3: dividing a measurement interval into a plurality of measurement sections according to pulse waveforms of second optical signals continuously received by at least three receiving windows, wherein the measurement interval is determined according to the pulse width of the first optical signal;
in the embodiment of the present invention, the pulse width of the receiving window is greater than the pulse width of the first optical signal and twice as large as the pulse width of the first optical signal.
Fig. 10 is a schematic diagram of a pulse waveform of a first optical signal in a modification of the present invention, and fig. 11 is a schematic diagram of a phase delay of a collected optical signal in the modification of the present invention, as shown in fig. 10 and fig. 11, in the modification of the present invention, a pulse width of the receiving window is smaller than a pulse width of the first optical signal and is one-half of the pulse width of the first optical signal.
Fig. 6 is a flowchart of steps of generating a measurement segment in the embodiment of the present invention, and as shown in fig. 6, the step S3 includes the following steps:
step S301: determining a measurement interval according to the pulse width of the first optical signal;
step S302: dividing the measurement interval into a plurality of measurement sections according to the phase delay of the pulse waveform of the second optical signal continuously received by at least three receiving windows;
step S303: the measurement segments are marked in sequence in a preset order.
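Steps S301 to S303 can be illustrated with a simple sketch that splits the measurement interval into equally wide segments and marks them in a preset order. The equal-width split and all names here are assumptions for illustration; the patent divides the interval according to the phase delays of the received pulse waveforms:

```python
def mark_segments(d_range, n_segments):
    """Split the measurement interval [0, d_range) into n_segments
    equal measurement segments, each marked with its order 1..n."""
    width = d_range / n_segments
    return [(n + 1, n * width, (n + 1) * width) for n in range(n_segments)]

# e.g. four receiving windows dividing the interval into four segments
segments = mark_segments(12.0, 4)
```

Once the segments are marked, locating the target segment reduces the depth calculation to a short interpolation inside one segment plus the accumulated width of the preceding segments.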
Step S4: determining depth information of the target object from the plurality of measurement segments and the radiant energy of the at least three collected light signals.
In the embodiment of the invention, the measurement interval
Figure BDA0002424771980000081
Wherein T is a first optical signal emitted by the light sourcePulse width, c is the speed of light.
Fig. 12 is a schematic diagram of the pulse waveforms of the second optical signal successively received by four receiving windows in an embodiment of the present invention. As shown in fig. 12, the pulse width of each receiving window is greater than or less than the pulse width of the first optical signal, the pulse waveform of the first optical signal consists of a plurality of rectangles arranged in sequence, the radiant energy curve of the second optical signal received by each receiving window is trapezoidal, and there is a phase difference between the radiant energy curves of the second optical signal received by the different receiving windows.
Fig. 13 is a schematic diagram of the pulse waveforms of the four second optical signals dividing the measurement interval into four measurement segments according to an embodiment of the present invention. As shown in fig. 13, A1 is the radiant energy curve of the second optical signal received by the first receiving window; A2 is the radiant energy curve of the second optical signal received by the second receiving window; A3 is the radiant energy curve of the second optical signal received by the third receiving window; A4 is the radiant energy curve of the second optical signal received by the fourth receiving window; the radiant energy curves A1, A2, A3 and A4 are one quarter of a period apart in phase.
Let (A_max, i_max) and (A_min, i_min) denote the maximum and minimum values taken among the radiant energy curves and the indices of the curves on which they occur (the defining equation is rendered only as an image in the original).
Of the remaining curves, the one on a rising edge is denoted A_rise, i_rise, and the one on a falling edge A_fall, i_fall.
According to the values of i_max and i_min, the measurement interval can be divided into four measurement segments, each having a rising edge and a falling edge, and the depth d within a measurement segment is expressed by an equation rendered only as an image in the original,
where d_max is the maximum measurement depth and d_int(i_max, i_min) is the depth offset of the segment determined by the values of i_max and i_min, which may take a value given by an equation rendered only as an image in the original, with n the order of the measurement segments.
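The within-segment depth expressions of this embodiment survive in the original only as images, so the sketch below is hypothetical. It implements only the segment-selection logic the text does describe (finding which curves carry the maximum and minimum values, i_max and i_min, and treating a remaining curve as the interpolating edge), with a simple linear interpolation standing in for the patent's exact formula:

```python
def locate_extremes(energies):
    """Return (i_max, i_min): indices of the radiant energy curves
    carrying the maximum and minimum values (A1..A4 map to 0..3)."""
    i_max = max(range(len(energies)), key=lambda i: energies[i])
    i_min = min(range(len(energies)), key=lambda i: energies[i])
    return i_max, i_min

def depth_in_segment(energies, d_max, segment_order):
    """Hypothetical within-segment depth for the four-window case:
    linear interpolation of one remaining edge curve between the
    minimum and maximum, plus the offset of the preceding segments."""
    i_max, i_min = locate_extremes(energies)
    a_max, a_min = energies[i_max], energies[i_min]
    # First curve that is neither the maximum nor the minimum (an edge curve).
    edge = next(e for i, e in enumerate(energies) if i not in (i_max, i_min))
    frac = (edge - a_min) / (a_max - a_min)
    seg_width = d_max / 4                    # four measurement segments
    d_int = (segment_order - 1) * seg_width  # accumulated preceding segments
    return d_int + frac * seg_width
```

This only illustrates the accumulate-then-interpolate structure; the actual coefficients and edge choice are those of the patent's (image-only) equations.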
Fig. 14 is a schematic diagram of the pulse waveforms of the three second optical signals dividing the measurement interval into six measurement segments according to an embodiment of the present invention. As shown in fig. 14, A1 is the radiant energy curve of the second optical signal received by the first receiving window; A2 is the radiant energy curve of the second optical signal received by the second receiving window; A3 is the radiant energy curve of the second optical signal received by the third receiving window; the radiant energy curves A1, A2 and A3 are one quarter of a period apart in phase.
Let (A_max, i_max) and (A_min, i_min) denote the maximum and minimum values taken among the radiant energy curves and the indices of the curves on which they occur; the curve that is neither the maximum nor the minimum, lying on a rising or falling edge, is denoted A_mid, i_mid (the defining equations are rendered only as images in the original).
According to the values of i_max and i_min, the measurement interval can be divided into six measurement segments, three on falling edges and three on rising edges.
The depth d within a measurement segment on a falling edge, and the depth d within a measurement segment on a rising edge, are each expressed by an equation rendered only as an image in the original,
where d_max is the maximum measurement depth and d_int(i_max, i_min) is the depth offset of the segment determined by the values of i_max and i_min, which may take a value given by an equation rendered only as an image in the original, with n the order of the measurement segments.
In the embodiment of the present invention, the maximum value max(B) of the radiant energy curve of the second optical signal is 2^N - 1, where N is the number of bits of the analog-to-digital converter.
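The full-scale value 2^N - 1 of an N-bit analog-to-digital converter can be checked directly; the helper name here is just for illustration:

```python
def adc_max_code(n_bits: int) -> int:
    """Largest digital code of an N-bit ADC: 2**N - 1."""
    return (1 << n_bits) - 1

# e.g. a 10-bit converter saturates the radiant energy curve at 1023.
```

Any collected energy at or above this value indicates the curve is clipped at the converter's full scale.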
Fig. 7 is a flowchart of the steps of generating the depth information of the target object in the embodiment of the present invention, and as shown in fig. 7, the step S4 includes the following steps:
step S401: determining, according to the radiant energies of the at least three collected optical signals, the target measurement segment among the plurality of measurement segments in which the depth information of the target object is located;
step S402: determining the ratio of the radiation intensities of the at least three second optical signals within the target measurement segment according to a preset proportional relation;
step S403: determining, from the ratio, the depth corresponding to the radiation intensities of the at least three second optical signals within the target measurement segment, and accumulating the measurement segments preceding the target measurement segment to determine the depth information of the target object.
Fig. 15 is a schematic block diagram of a scene depth measurement system according to an embodiment of the present invention, and as shown in fig. 15, the scene depth measurement system provided by the present invention includes the following modules:
a light projection module, configured to control a light source to project a first light signal to a target object in a scene, the first light signal being reflected by the target object to form a second light signal;
a light receiving module, configured to receive the second light signal in at least three receiving windows through an image sensor to generate at least three collected light signals, the phase delays of the at least three collected light signals relative to the second light signal being different from one another;
a measurement section generation module, configured to divide a measurement interval into a plurality of measurement sections according to the pulse waveform of the second optical signal continuously received through the at least three receiving windows, wherein the measurement interval is determined according to the pulse width of the first optical signal;
a depth calculation module for determining depth information of the target object from the plurality of measurement segments and the radiant energy of the at least three collected light signals.
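The four modules can be pictured as a minimal pipeline skeleton; all class and method names below are illustrative (none appear in the patent), and the equal split of the interval is an assumption:

```python
class SceneDepthSystem:
    """Illustrative skeleton of the four-module pipeline."""

    def project(self) -> None:
        """Light projection module: drive the light source to emit
        the first optical signal (hardware access omitted)."""

    def receive(self) -> tuple:
        """Light receiving module: sample the reflected (second)
        signal in three or more phase-shifted windows."""
        return (0.0, 0.0, 0.0)  # placeholder collected energies

    def segments(self, interval: float, num_segments: int = 6) -> list:
        """Measurement section generation module: split the interval
        (set by the pulse width) into sections; an equal split is
        assumed here. Returns the section boundaries."""
        return [interval * i / num_segments for i in range(num_segments + 1)]

    def depth(self, energies: tuple, boundaries: list) -> float:
        """Depth calculation module: combine the section lookup with
        the collected energies (left unimplemented in this sketch)."""
        raise NotImplementedError
```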
The embodiment of the invention also provides a scene depth measurement device, which comprises a processor and a memory storing executable instructions of the processor, wherein the processor is configured to perform the steps of the scene depth measurement method by executing the executable instructions.
As described above, this embodiment divides the measurement interval into a plurality of measurement sections according to the pulse waveforms of the second optical signals continuously received through at least three receiving windows. The corresponding measurement section can then be determined from the radiant energies of the at least three collected optical signals, the depth determined within that section, and the measurement sections preceding the target measurement section accumulated to determine the depth information of the target object. Dividing the depth measurement of the whole measurement interval into depth measurements over a plurality of sections enables accurate calculation of the depth.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or program product. Thus, various aspects of the invention may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, which may all generally be referred to herein as a "circuit," "module," or "platform."
Fig. 16 is a schematic structural diagram of a scene depth measuring apparatus in the embodiment of the present invention. An electronic device 600 according to this embodiment of the invention is described below with reference to fig. 16. The electronic device 600 shown in fig. 16 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 16, the electronic device 600 is embodied in the form of a general purpose computing device. The components of the electronic device 600 may include, but are not limited to: at least one processing unit 610, at least one memory unit 620, a bus 630 connecting the different platform components (including the memory unit 620 and the processing unit 610), a display unit 640, etc.
Wherein the storage unit stores program code which is executable by the processing unit 610 such that the processing unit 610 performs the steps according to various exemplary embodiments of the present invention as described in the above-mentioned scene depth measurement method section of the present specification. For example, processing unit 610 may perform the steps as shown in fig. 1.
The storage unit 620 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM)6201 and/or a cache memory unit 6202, and may further include a read-only memory unit (ROM) 6203.
The memory unit 620 may also include a program/utility 6204 having a set (at least one) of program modules 6205, such program modules 6205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 630 may be one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 600 may also communicate with one or more external devices 700 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 600, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 600 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 650. Also, the electronic device 600 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via the network adapter 660. The network adapter 660 may communicate with other modules of the electronic device 600 via the bus 630. It should be appreciated that although not shown in FIG. 16, other hardware and/or software modules may be used in conjunction with electronic device 600, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage platforms, to name a few.
The embodiment of the invention also provides a computer readable storage medium for storing a program, and the program realizes the steps of the scene depth measuring method when being executed. In some possible embodiments, the various aspects of the invention may also be implemented in the form of a program product comprising program code means for causing a terminal device to carry out the steps according to various exemplary embodiments of the invention described in the above-mentioned scene depth measurement method section of the present description, when the program product is run on the terminal device.
As described above, when the program stored on the computer-readable storage medium of this embodiment is executed, it divides the measurement interval into a plurality of measurement segments according to the pulse waveform of the second optical signal received continuously through at least three receiving windows. The corresponding measurement segment can then be determined from the radiant energies of the at least three collected optical signals, the depth determined within that segment, and the measurement segments preceding the target measurement segment accumulated to determine the depth information of the target object. Dividing the depth measurement of the entire measurement interval into depth measurements over a plurality of segments enables accurate depth calculation.
Fig. 17 is a schematic structural diagram of a computer-readable storage medium in an embodiment of the present invention. Referring to fig. 17, a program product 800 for implementing the above method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited in this regard and, in the present document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electromagnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (e.g., through the Internet using an Internet service provider).
In the embodiment of the invention, the measurement interval is divided into a plurality of measurement sections according to the pulse waveforms of the second optical signals continuously received through at least three receiving windows. The corresponding measurement section can therefore be determined from the radiant energies of the at least three collected optical signals, the depth determined within that section, and the measurement sections preceding the target measurement section accumulated to determine the depth information of the target object. Splitting the depth calculation over the whole measurement interval into depth measurements over a plurality of sections enables accurate calculation of the depth.
The embodiments in the present description are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for identical or similar parts among the embodiments, reference may be made to one another. The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description of specific embodiments of the present invention has been presented. It is to be understood that the present invention is not limited to the specific embodiments described above, and that various changes and modifications may be made by one skilled in the art within the scope of the appended claims without departing from the spirit of the invention.

Claims (10)

1. A scene depth measurement method is characterized by comprising the following steps:
step S1: controlling a light source to project a first light signal to a target object in a scene, wherein the first light signal is reflected by the target object to form a second light signal;
step S2: receiving the second optical signal in at least three receiving windows by an image sensor to generate at least three collected optical signals, the at least three collected optical signals differing in phase delay relative to the second optical signal;
step S3: dividing a measurement interval into a plurality of measurement sections according to pulse waveforms of second optical signals continuously received by at least three receiving windows, wherein the measurement interval is determined according to the pulse width of the first optical signal;
step S4: and determining a corresponding measuring section according to the radiation energy of the at least three collected light signals, and further determining the depth information of the target object.
2. The scene depth measurement method according to claim 1, wherein the step S1 includes the steps of:
step S101: acquiring a preset modulation function, and generating a first modulation signal according to the modulation function;
step S102: adjusting the light beam emitted by the light source according to the first modulation signal, so that the light source generates the first light signal;
step S103: projecting the first optical signal to the target object, the first optical signal being reflected by the target object to form a second optical signal.
3. The scene depth measurement method according to claim 1, wherein the step S2 includes the steps of:
step S201: acquiring a preset receiving window, wherein the pulse width of the receiving window is greater than or less than the pulse width of the first optical signal, and the pulse waveform of the first optical signal is a plurality of sequentially arranged rectangles;
step S202: receiving three second optical signals through at least three receiving windows, wherein the at least three receiving windows are sequentially arranged in time sequence;
step S203: and generating each of the collected light signals according to the second light signal from each of the receiving windows, wherein the phase delays of the at least three collected light signals relative to the second light signal are different.
4. The scene depth measurement method according to claim 3, wherein the step S3 includes the steps of:
step S301: determining a measurement interval according to the pulse width of the first optical signal;
step S302: dividing the measurement interval into a plurality of measurement sections according to the phase delay of the pulse waveform of the second optical signal continuously received by at least three receiving windows;
step S303: the measurement segments are marked in sequence in a preset order.
5. The scene depth measurement method according to claim 1, wherein the step S4 includes the steps of:
step S401: determining, according to the radiant energies of the at least three collected light signals, the target measurement section among the plurality of measurement sections in which the depth information of the target object is located;
step S402: determining the ratio values of the radiation intensities of the at least three second optical signals in the target measurement section according to a preset ratio relation;
step S403: and determining the depths of the radiation intensities of the at least three second optical signals corresponding to the target measuring section according to the proportion values, and further accumulating the measuring sections in the sequence before the target measuring section to determine the depth information of the target object.
6. The method of claim 1, wherein when the at least three receiving windows are a first receiving window, a second receiving window, a third receiving window, and a fourth receiving window, the step S4 specifically includes:
let A1 be the radiant energy curve of the second optical signal received by the first receiving window; A2 the radiant energy curve of the second optical signal received by the second receiving window; A3 the radiant energy curve of the second optical signal received by the third receiving window; and A4 the radiant energy curve of the second optical signal received by the fourth receiving window; successive ones of the radiation energy curves A1, A2, A3 and A4 have a quarter-period phase difference;
let

[Formula: image FDA0002424771970000021 in the original publication]
let A_rise, i_rise denote the rising edge of the radiation energy curves other than the maximum and minimum values, and A_fall, i_fall the falling edge;
according to the values of i_max and i_min, the measurement interval can be divided into four measurement segments, wherein each measurement segment has a rising edge and a falling edge, and the depth d within a measurement segment is expressed as:

[Formula: image FDA0002424771970000022 in the original publication]

where d_max is the maximum measurement depth and d_int(i_max, i_min) is the depth offset corresponding to the interval determined by the values of i_max and i_min, which may be set to

[Formula: image FDA0002424771970000031 in the original publication]
n is the order of the measurement segments.
7. The method of claim 1, wherein when the at least three receiving windows are a first receiving window, a second receiving window and a third receiving window, the step S4 specifically includes:
let A1 be the radiant energy curve of the second optical signal received by the first receiving window; A2 the radiant energy curve of the second optical signal received by the second receiving window; and A3 the radiant energy curve of the second optical signal received by the third receiving window; successive ones of the radiation energy curves A1, A2 and A3 have a quarter-period phase difference;
let

[Formula: image FDA0002424771970000032 in the original publication]
let A_mid, i_mid be set for the rising edge and the falling edge of the maximum value and the minimum value in the radiation energy curves;
according to the values of i_max and i_min, the measurement interval can be divided into six measurement sections, of which three correspond to falling edges and three to rising edges;
the depth d within the measurement section corresponding to a falling edge is expressed as:

[Formula: image FDA0002424771970000033 in the original publication]

the depth d within the measurement section corresponding to a rising edge is expressed as:

[Formula: image FDA0002424771970000034 in the original publication]

where d_max is the maximum measurement depth and d_int(i_max, i_min) is the depth offset corresponding to the interval determined by the values of i_max and i_min, which may be set to

[Formula: image FDA0002424771970000035 in the original publication]
n is the order of the measurement segments.
8. A scene depth measurement system, comprising:
a light projection module, configured to control a light source to project a first light signal to a target object in a scene, the first light signal being reflected by the target object to form a second light signal;
a light receiving module, configured to receive the second light signal in at least three receiving windows through an image sensor to generate at least three collected light signals, the phase delays of the at least three collected light signals relative to the second light signal being different from one another;
a measurement section generation module, configured to divide a measurement interval into a plurality of measurement sections according to the pulse waveform of the second optical signal continuously received through the at least three receiving windows, wherein the measurement interval is determined according to the pulse width of the first optical signal;
a depth calculation module for determining depth information of the target object from the plurality of measurement segments and the radiant energy of the at least three collected light signals.
9. A scene depth measurement device, comprising:
a processor;
a memory having stored therein executable instructions of the processor;
wherein the processor is configured to perform the steps of the scene depth measurement method of any of claims 1 to 7 via execution of the executable instructions.
10. A computer-readable storage medium storing a program, wherein the program is configured to implement the steps of the scene depth measurement method of any one of claims 1 to 7 when executed.
CN202010216887.3A 2020-03-25 2020-03-25 Scene depth measuring method, system, device and storage medium Pending CN113447954A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010216887.3A CN113447954A (en) 2020-03-25 2020-03-25 Scene depth measuring method, system, device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010216887.3A CN113447954A (en) 2020-03-25 2020-03-25 Scene depth measuring method, system, device and storage medium

Publications (1)

Publication Number Publication Date
CN113447954A true CN113447954A (en) 2021-09-28

Family

ID=77806848

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010216887.3A Pending CN113447954A (en) 2020-03-25 2020-03-25 Scene depth measuring method, system, device and storage medium

Country Status (1)

Country Link
CN (1) CN113447954A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015195318A1 (en) * 2014-06-20 2015-12-23 Qualcomm Incorporated Automatic multiple depth cameras synchronization using time sharing
CN109917412A (en) * 2019-02-01 2019-06-21 深圳奥比中光科技有限公司 A kind of distance measurement method and depth camera
CN109991584A (en) * 2019-03-14 2019-07-09 深圳奥比中光科技有限公司 A kind of jamproof distance measurement method and depth camera
CN109991583A (en) * 2019-03-14 2019-07-09 深圳奥比中光科技有限公司 A kind of jamproof distance measurement method and depth camera


Similar Documents

Publication Publication Date Title
US20210383560A1 (en) Depth measurement assembly with a structured light source and a time of flight camera
US20230273320A1 (en) Processing system for lidar measurements
EP2936204B1 (en) Multiple frequency time of flight dealiasing
US10430956B2 (en) Time-of-flight (TOF) capturing apparatus and image processing method of reducing distortion of depth caused by multiple reflection
EP2976878B1 (en) Method and apparatus for superpixel modulation
US8902411B2 (en) 3-dimensional image acquisition apparatus and method of extracting depth information in the 3D image acquisition apparatus
US10545237B2 (en) Method and device for acquiring distance information
US20200341144A1 (en) Independent per-pixel integration registers for lidar measurements
EP3971685A1 (en) Interactive control method and apparatus, electronic device and storage medium
US20120105587A1 (en) Method and apparatus of measuring depth information for 3d camera
CN105190426A (en) Time of flight sensor binning
US9628774B2 (en) Image processing method and apparatus
EP3092509A1 (en) Fast general multipath correction in time-of-flight imaging
CN112824935B (en) Depth imaging system, method, device and medium based on modulated light field
CN113447954A (en) Scene depth measuring method, system, device and storage medium
CN110988840B (en) Method and device for acquiring flight time and electronic equipment
CN114697521A (en) TOF camera motion blur detection method, system, equipment and storage medium
CN113009498A (en) Distance measuring method, device and system
US20230260143A1 (en) Using energy model to enhance depth estimation with brightness image
CN113514851A (en) Depth camera
CN115250316A (en) TOF mirror surface multipath removal method, system, equipment and medium based on modulated light field
WO2023156561A1 (en) Using energy model to enhance depth estimation with brightness image
CN116203538A (en) Time-of-flight ranging method, device, electronic equipment and readable storage medium
TWI773791B (en) Digital pixel with extended dynamic range
JP2002221408A (en) Optical measuring device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination