CN113447954B - Scene depth measurement method, system, equipment and storage medium - Google Patents

Scene depth measurement method, system, equipment and storage medium

Info

Publication number
CN113447954B
CN113447954B (application CN202010216887.3A)
Authority
CN
China
Prior art keywords
optical signal
depth
receiving
measuring
receiving window
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010216887.3A
Other languages
Chinese (zh)
Other versions
CN113447954A (en)
Inventor
苏公喆
杨心杰
朱力
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Guangjian Technology Co Ltd
Original Assignee
Shenzhen Guangjian Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Guangjian Technology Co Ltd filed Critical Shenzhen Guangjian Technology Co Ltd
Priority to CN202010216887.3A priority Critical patent/CN113447954B/en
Publication of CN113447954A publication Critical patent/CN113447954A/en
Application granted Critical
Publication of CN113447954B publication Critical patent/CN113447954B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • G01S17/32Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/483Details of pulse systems
    • G01S7/486Receivers
    • G01S7/4865Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

The invention provides a scene depth measurement method, system, device, and storage medium, comprising the following steps: controlling a light source to project a first optical signal onto a target object in a scene, the first optical signal being reflected by the target object to form a second optical signal; receiving the second optical signal in at least three receiving windows via an image sensor to generate at least three collected optical signals, the at least three collected optical signals having different phase delays relative to the second optical signal; dividing a measurement interval, determined according to the pulse width of the first optical signal, into a plurality of measurement segments according to the pulse waveforms of the second optical signal successively received in the at least three receiving windows; and determining the corresponding measurement segment according to the radiant energy of the at least three collected optical signals, and from it the depth information of the target object. By splitting the depth calculation over the whole measurement interval into depth measurements within individual segments, the invention achieves accurate calculation of depth.

Description

Scene depth measurement method, system, equipment and storage medium
Technical Field
The present invention relates to a time-of-flight camera, and in particular, to a scene depth measurement method, system, apparatus, and storage medium.
Background
In recent years, time-of-flight (ToF) cameras have emerged as a popular 3D imaging technology, finding use in many scientific and consumer applications such as robotic navigation, motion capture, human-machine interfaces, and 3D mapping.
A ToF camera images much like an ordinary camera and mainly comprises a light source, a photosensitive chip, a lens, a sensor, drive-control circuitry, processing circuitry, and other key units. Its two core modules are the emitting (illumination) module and the photosensitive receiving module; depth information is generated from the correlation between them. ToF photosensitive chips are further divided by pixel count into single-point and area-array chips. To measure depth over the entire surface of a three-dimensional object, a single-point ToF camera can recover the object's three-dimensional geometry by point-by-point scanning, whereas an area-array ToF camera captures the surface geometry of the whole scene in real time from a single frame, which makes it the preferred building block for consumer electronic systems.
As shown in fig. 1, T: x(t) is the optical signal emitted by the light source and R: y(t) is the reflected optical signal received back by the sensor. The reflected energy is received through two staggered receiving windows RX1 and RX2, with radiant intensities B1 and B2 respectively; ambient light energy is received through a third receiving window RX3, with radiant intensity B3. The phase difference is calculated from the proportion of the effective energy received by the two staggered receiving windows, which yields the depth Distance of the target object. The calculation is specifically: Distance = Γ_range · (B2 − B3) / ((B1 − B3) + (B2 − B3)), where Γ_range = cT/2 is the depth measurement interval, T is the pulse width of the optical signal emitted by the light source, and c is the speed of light.
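The two-window calculation above can be sketched as follows. The ambient-subtracted ratio form is an illustrative reading of the description (the formula images of figs. 1 and 2 are not reproduced in this text), so treat it as a sketch rather than the patent's verbatim formula:

```python
# Illustrative reading of the prior-art two-window depth calculation.
# B1, B2: radiant intensities from staggered windows RX1, RX2;
# B3: ambient intensity from window RX3.
C = 3.0e8  # speed of light, m/s

def two_window_depth(b1: float, b2: float, b3: float, pulse_width: float) -> float:
    """Depth from the ratio of effective (ambient-subtracted) window energies."""
    gamma_range = C * pulse_width / 2.0         # depth measurement interval cT/2
    effective = (b1 - b3) + (b2 - b3)           # total reflected (effective) energy
    return gamma_range * (b2 - b3) / effective  # later-window share maps to distance

# A 30 ns pulse gives a 4.5 m interval; equal window energies place the
# target at mid-range, 2.25 m.
print(two_window_depth(100.0, 100.0, 20.0, 30e-9))
```

As the background notes, the absolute error of this single ratio scales with Γ_range, which is exactly the limitation the segmented method below addresses.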
However, the precision of prior-art ToF cameras decreases as the measurement interval grows, so they cannot serve application scenarios that require both a large measurement range and high measurement precision.
Disclosure of Invention
In view of the defects in the prior art, the invention aims to provide a scene depth measurement method, system, device, and storage medium.
The scene depth measuring method provided by the invention comprises the following steps:
Step S1: controlling a light source to project a first light signal to a target object in a scene, wherein the first light signal is reflected by the target object to form a second light signal;
Step S2: receiving the second optical signal at least three receiving windows by an image sensor to generate at least three collected optical signals, the at least three collected optical signals having different phase delays relative to the second optical signal;
step S3: dividing a measurement interval into a plurality of measurement sections according to pulse waveforms of the second optical signals continuously received by at least three receiving windows, wherein the measurement interval is determined according to the pulse width of the first optical signals;
Step S4: and determining a corresponding measuring section according to the radiation energy of the at least three collected light signals, and further determining the depth information of the target object.
Preferably, the step S1 includes the steps of:
step S101: acquiring a preset modulation function, and generating a first modulation signal according to the modulation function;
Step S102: adjusting the light beam emitted by the light source according to the first modulation signal so that the light source generates the first light signal;
step S103: and projecting the first optical signal to the target object, wherein the first optical signal is reflected by the target object to form a second optical signal.
Preferably, the step S2 includes the steps of:
Step S201: acquiring a preset receiving window, wherein the pulse width of the receiving window is larger than or smaller than that of the first optical signal, and the pulse waveform of the first optical signal is a plurality of rectangles which are sequentially arranged;
step S202: receiving the second optical signal through at least three receiving windows, the at least three receiving windows being arranged sequentially in time;
Step S203: generating each collected optical signal according to the second optical signal received by each receiving window, wherein the phase delay of the at least three collected optical signals relative to the second optical signal is different.
Preferably, the step S3 includes the steps of:
Step S301: determining a measurement interval according to the pulse width of the first optical signal;
Step S302: dividing the measuring interval into a plurality of measuring sections according to the phase delay of the pulse waveform of the second optical signal continuously received by at least three receiving windows;
Step S303: the measurement segments are marked in sequence in a preset order.
Preferably, the step S4 includes the steps of:
Step S401: determining, according to the radiant energy of the at least three collected optical signals, the target measurement segment among the plurality of measurement segments in which the depth information of the target object actually lies;
Step S402: determining the proportion value of the radiation intensity of the at least three second optical signals in the target measuring section according to a preset proportion relation;
Step S403: determining, according to the proportion value, the depth corresponding to the radiant intensities of the at least three second optical signals within the target measurement segment, and then determining the depth information of the target object by accumulating the measurement segments preceding the target measurement segment.
Preferably, when the at least three receiving windows are a first receiving window, a second receiving window, a third receiving window, and a fourth receiving window, the step S4 specifically includes:
Setting A1 as a radiation energy curve of the second optical signal received by the first receiving window; a2 is a radiation energy curve of the second optical signal received by the second receiving window; a3 is a radiation energy curve of the second optical signal received by a third receiving window; a4 is a radiation energy curve of the second optical signal received by the fourth receiving window; the radiant energy curves A1, A2, A3 and A4 differ by a quarter phase;
Let i_max denote the index of the radiant energy curve taking the maximum value and i_min the index of the curve taking the minimum value;
Among the remaining curves (those attaining neither the maximum nor the minimum), denote the curve on a rising edge by A_rise and the curve on a falling edge by A_fall;
According to the values of i_max and i_min, the measurement interval can be divided into four measurement segments, each containing one rising edge and one falling edge, and the depth d within a segment is expressed as:
d = d_int(i_max, i_min) + (d_max/4) · A_rise/(A_rise + A_fall),
where d_max is the maximum measured depth and d_int(i_max, i_min) is the depth offset of the segment determined by i_max and i_min, namely d_int(i_max, i_min) = n·d_max/4, n = 0, 1, 2, 3, with n the order of the measurement segment.
Preferably, when the at least three receiving windows are a first receiving window, a second receiving window, and a third receiving window, the step S4 specifically includes:
Setting A1 as a radiation energy curve of the second optical signal received by the first receiving window; a2 is a radiation energy curve of the second optical signal received by the second receiving window; a3 is a radiation energy curve of the second optical signal received by a third receiving window; the radiant energy curve A1, the radiant energy curve A2 and the radiant energy curve A3 differ by one third of the phase;
Let i_max denote the index of the radiant energy curve taking the maximum value and i_min the index of the curve taking the minimum value;
Let A_mid denote the remaining curve, which lies on a rising or falling edge outside the maximum and minimum;
According to the values of i_max and i_min, the measurement interval can be divided into six measurement segments, of which three lie on falling edges and three on rising edges;
For a falling-edge segment the depth d is expressed as:
d = d_int(i_max, i_min) + (d_max/6) · (A_imax − A_mid)/(A_imax − A_imin),
and for a rising-edge segment the depth d is expressed as:
d = d_int(i_max, i_min) + (d_max/6) · (A_mid − A_imin)/(A_imax − A_imin),
where A_imax and A_imin are the curves indexed by i_max and i_min, d_max is the maximum measured depth, and d_int(i_max, i_min) is the depth offset of the segment determined by i_max and i_min, namely d_int(i_max, i_min) = n·d_max/6, n = 0, 1, 2, 3, 4, 5, with n the order of the measurement segment.
The scene depth measurement system provided by the invention comprises the following modules:
The light projection module is used for controlling the light source to project a first light signal to a target object in a scene, and the first light signal is reflected by the target object to form a second light signal;
The optical receiving module is used for receiving the second optical signal at least three receiving windows through the image sensor to generate at least three collected optical signals, and the phase delays of the at least three collected optical signals relative to the second optical signal are different;
the measurement segment generation module is used for dividing a measurement interval into a plurality of measurement segments according to the pulse waveforms of the second optical signal successively received by at least three receiving windows, the measurement interval being determined according to the pulse width of the first optical signal;
and the depth calculation module is used for determining the depth information of the target object according to the plurality of measurement segments and the radiant energy of the at least three collected light signals.
The scene depth measuring device provided by the invention comprises:
a processor;
A memory having stored therein executable instructions of the processor;
wherein the processor is configured to perform the steps of the scene depth measurement method of any of claims 1 to 7 via execution of the executable instructions.
According to the present invention, there is provided a computer-readable storage medium storing a program which, when executed, implements the steps of the scene depth measurement method.
Compared with the prior art, the invention has the following beneficial effects:
In the invention, the pulse waveforms of the second optical signal successively received in at least three receiving windows divide the measurement interval into a plurality of measurement segments. The corresponding measurement segment can therefore be determined from the radiant energy of the at least three collected optical signals, the depth determined within that segment, and the depth information of the target object obtained by accumulating the measurement segments preceding the target segment. Splitting the depth measurement over the whole measurement interval into depth measurements within individual segments achieves accurate calculation of depth.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are required to be used in the embodiments or the description of the prior art will be briefly described below, and it is obvious that the drawings in the following description are only embodiments of the present invention, and that other drawings can be obtained according to the provided drawings without inventive effort for a person skilled in the art. Other features, objects and advantages of the present invention will become more apparent upon reading of the detailed description of non-limiting embodiments, given with reference to the accompanying drawings in which:
FIG. 1 is a schematic diagram of a prior art time-of-flight camera;
FIG. 2 is a schematic diagram of depth calculation by time of flight in the prior art;
FIG. 3 is a flowchart illustrating steps of a scene depth measurement method according to an embodiment of the present invention;
FIG. 4 is a flowchart illustrating steps for forming a second optical signal according to an embodiment of the present invention;
FIG. 5 is a flowchart illustrating steps for collecting optical signals according to an embodiment of the present invention;
FIG. 6 is a flowchart illustrating steps for measurement segment generation in accordance with an embodiment of the present invention;
FIG. 7 is a flowchart illustrating steps for generating depth information of a target object according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of pulse waveforms of a first optical signal according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of phase delay of an optical signal collected in an embodiment of the invention;
FIG. 10 is a schematic diagram of a pulse waveform of a first optical signal according to a modification of the present invention;
FIG. 11 is a schematic diagram showing a phase delay of an optical signal collected in a modification of the present invention;
fig. 12 is a schematic diagram of pulse waveforms of a second optical signal continuously received by four receiving windows according to an embodiment of the present invention;
FIG. 13 is a schematic diagram showing the pulse waveforms of four second optical signals dividing a measurement zone into four measurement segments according to an embodiment of the present invention;
FIG. 14 is a schematic diagram showing the pulse waveforms of three second optical signals dividing a measurement zone into 6 measurement segments according to an embodiment of the present invention;
FIG. 15 is a block diagram of a scene depth measurement system according to an embodiment of the present invention;
FIG. 16 is a schematic diagram of a scene depth measuring device according to an embodiment of the present invention; and
Fig. 17 is a schematic diagram of a computer-readable storage medium according to an embodiment of the present invention.
Detailed Description
The present invention will be described in detail with reference to specific examples. The following examples will assist those skilled in the art in further understanding the present invention, but are not intended to limit the invention in any way. It should be noted that variations and modifications could be made by those skilled in the art without departing from the inventive concept. These are all within the scope of the present invention.
The terms "first," "second," "third," "fourth" and the like in the description and in the claims and in the above drawings, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented, for example, in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The technical scheme of the invention is described in detail below by specific examples. The following embodiments may be combined with each other, and some embodiments may not be repeated for the same or similar concepts or processes.
The invention provides a scene depth measuring method, which aims to solve the problems in the prior art.
The following specific embodiments describe the technical scheme of the present application and how it solves the above technical problems in detail. Embodiments of the present application will be described below with reference to the accompanying drawings.
Fig. 3 is a flowchart illustrating steps of a scene depth measurement method according to an embodiment of the present invention, where, as shown in fig. 3, the scene depth measurement method provided by the present invention includes the following steps:
Step S1: controlling a light source to project a first light signal to a target object in a scene, wherein the first light signal is reflected by the target object to form a second light signal;
Fig. 4 is a flowchart of steps for forming a second optical signal in the embodiment of the present invention, as shown in fig. 4, the step S1 includes the following steps:
step S101: acquiring a preset modulation function, and generating a first modulation signal according to the modulation function;
Step S102: adjusting the light beam emitted by the light source according to the first modulation signal so that the light source generates the first light signal;
step S103: and projecting the first optical signal to the target object, wherein the first optical signal is reflected by the target object to form a second optical signal.
In the embodiment of the present invention, the modulated first optical signal is a continuous sequence of rectangular pulses. Fig. 8 is a schematic diagram of the pulse waveform of the first optical signal in the embodiment of the present invention; as shown in fig. 8, each pulse of the first optical signal is rectangular.
Step S2: receiving a second optical signal at least three receiving windows by an image sensor to generate at least three collected optical signals, the at least three collected optical signals having different phase delays relative to the second optical signal;
Fig. 5 is a flowchart of the step of collecting an optical signal in the embodiment of the present invention, as shown in fig. 5, the step S2 includes the following steps:
Step S201: acquiring a preset receiving window, wherein the pulse width of the receiving window is larger than or smaller than that of the first optical signal, and the pulse waveform of the first optical signal is a plurality of rectangles which are sequentially arranged;
step S202: receiving the second optical signal through at least three receiving windows, the at least three receiving windows being arranged sequentially in time;
Step S203: generating each collected optical signal according to the second optical signal received by each receiving window, wherein the phase delay of the at least three collected optical signals relative to the second optical signal is different.
Fig. 9 is a schematic diagram of phase delay of the collected optical signals in the embodiment of the invention, as shown in fig. 9, the phase delays of the four collected optical signals relative to the second optical signal are different, and the four collected optical signals are sequentially arranged in time sequence.
Step S3: dividing a measurement interval into a plurality of measurement sections according to pulse waveforms of the second optical signals continuously received by at least three receiving windows, wherein the measurement interval is determined according to the pulse width of the first optical signals;
In an embodiment of the present invention, the pulse width of the receiving window is greater than the pulse width of the first optical signal; specifically, it is twice that pulse width.
Fig. 10 is a schematic diagram of a pulse waveform of a first optical signal in a modification of the present invention, and fig. 11 is a schematic diagram of a phase delay of an optical signal collected in a modification of the present invention, as shown in fig. 10 and 11, in a modification of the present invention, a pulse width of the receiving window is smaller than a pulse width of the first optical signal and is a half of the pulse width of the first optical signal.
Fig. 6 is a flowchart of the step of generating a measurement segment in the embodiment of the present invention, as shown in fig. 6, the step S3 includes the following steps:
Step S301: determining a measurement interval according to the pulse width of the first optical signal;
Step S302: dividing the measuring interval into a plurality of measuring sections according to the phase delay of the pulse waveform of the second optical signal continuously received by at least three receiving windows;
Step S303: the measurement segments are marked in sequence in a preset order.
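Steps S301 to S303 can be sketched as below. The interval formula cT/2 and the equal division of the interval into segments are simplifying assumptions made for illustration; the patent only states that the interval is determined by the pulse width:

```python
def build_segments(pulse_width: float, num_segments: int):
    """Steps S301-S303 sketched: derive the measurement interval from the
    pulse width and split it into equal, sequentially marked segments.
    The c*T/2 interval and the equal split are illustrative assumptions."""
    c = 3.0e8                                  # speed of light, m/s
    gamma_range = c * pulse_width / 2.0        # measurement interval (assumed cT/2)
    seg_len = gamma_range / num_segments
    # Each entry: (segment order n, lower depth bound, upper depth bound)
    return [(n, n * seg_len, (n + 1) * seg_len) for n in range(num_segments)]

# Four segments over a 30 ns pulse: [0, 1.125), [1.125, 2.25), ...
segments = build_segments(30e-9, 4)
```

The segment order n returned here plays the role of the preset marking of step S303; the later depth formulas add n times the segment width as the depth offset.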
Step S4: depth information of the target object is determined from the plurality of measurement segments and the radiant energy of the at least three collected light signals.
In the embodiment of the invention, the measurement interval is Γ_range = cT/2, where T is the pulse width of the first optical signal emitted by the light source and c is the speed of light.
Fig. 12 is a schematic diagram of the pulse waveforms of the second optical signal successively received by four receiving windows in an embodiment of the present invention. As shown in fig. 12, the pulse width of each receiving window differs from that of the first optical signal, the pulse waveform of the first optical signal is a sequence of rectangular pulses, the radiant energy curve of the second optical signal received by each receiving window is trapezoidal, and the curves of the different receiving windows are offset in phase.
Fig. 13 is a schematic diagram of dividing a measurement interval into 4 measurement segments by pulse waveforms of four second optical signals in an embodiment of the present invention, as shown in fig. 13, A1 is a radiation energy curve of the second optical signals received by a first receiving window; a2 is a radiation energy curve of the second optical signal received by the second receiving window; a3 is a radiation energy curve of the second optical signal received by a third receiving window; a4 is a radiation energy curve of the second optical signal received by the fourth receiving window; the radiant energy curves A1, A2, A3, A4 differ by a quarter phase.
Let i_max denote the index of the radiant energy curve taking the maximum value and i_min the index of the curve taking the minimum value;
Among the remaining curves (those attaining neither the maximum nor the minimum), denote the curve on a rising edge by A_rise and the curve on a falling edge by A_fall;
According to the values of i_max and i_min, the measurement interval can be divided into four measurement segments, each containing one rising edge and one falling edge, and the depth d within a segment is expressed as:
d = d_int(i_max, i_min) + (d_max/4) · A_rise/(A_rise + A_fall),
where d_max is the maximum measured depth and d_int(i_max, i_min) is the depth offset of the segment determined by i_max and i_min, namely d_int(i_max, i_min) = n·d_max/4, n = 0, 1, 2, 3, with n the order of the measurement segment.
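A minimal sketch of the four-window case follows. The cyclic relation that locates the rising and falling curves and the segment order relative to i_max is a modelling assumption (it holds for ideal trapezoid curves offset by a quarter phase); the patent fixes the segment from (i_max, i_min) without stating this particular mapping:

```python
def four_window_depth(a, d_max):
    """Depth from four radiant-energy samples A1..A4 offset by a quarter phase.

    Modelling assumption: the curves are ideal trapezoids, so the rising
    curve cyclically follows the maximal one, the falling curve precedes
    it, and (i_max + 1) % 4 gives the segment order n.
    """
    i_max = max(range(4), key=lambda i: a[i])   # curve at its maximum plateau
    i_min = min(range(4), key=lambda i: a[i])   # curve at its minimum plateau
    a_rise = a[(i_max + 1) % 4]                 # curve on its rising edge
    a_fall = a[(i_max - 1) % 4]                 # curve on its falling edge
    n = (i_max + 1) % 4                         # segment order, n = 0..3
    d_int = n * d_max / 4.0                     # offset of the preceding segments
    return d_int + (d_max / 4.0) * a_rise / (a_rise + a_fall)

# Curves sampled at depth 1.3 with d_max = 4: A1 at maximum, A3 at minimum,
# A2 rising (0.3), A4 falling (0.7) -> segment n = 1, depth 1.0 + 0.3 = 1.3.
print(four_window_depth([1.0, 0.3, 0.0, 0.7], 4.0))
```

Because the within-segment ratio A_rise/(A_rise + A_fall) spans only a quarter of the interval, the same relative energy error produces a quarter of the depth error of the single-ratio prior art.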
Fig. 14 is a schematic diagram of dividing a measurement interval into 6 measurement segments by pulse waveforms of three second optical signals according to an embodiment of the present invention, as shown in fig. 14, A1 is a radiation energy curve of the second optical signals received by a first receiving window; a2 is a radiation energy curve of the second optical signal received by the second receiving window; a3 is a radiation energy curve of the second optical signal received by a third receiving window; the radiant energy curves A1, A2, A3 differ by one third of the phase.
Let i_max denote the index of the radiant energy curve taking the maximum value and i_min the index of the curve taking the minimum value;
Let A_mid denote the remaining curve, which lies on a rising or falling edge outside the maximum and minimum;
According to the values of i_max and i_min, the measurement interval can be divided into six measurement segments, of which three lie on falling edges and three on rising edges:
for a falling-edge segment the depth d is expressed as:
d = d_int(i_max, i_min) + (d_max/6) · (A_imax − A_mid)/(A_imax − A_imin),
and for a rising-edge segment the depth d is expressed as:
d = d_int(i_max, i_min) + (d_max/6) · (A_mid − A_imin)/(A_imax − A_imin),
where A_imax and A_imin are the curves indexed by i_max and i_min, d_max is the maximum measured depth, and d_int(i_max, i_min) is the depth offset of the segment determined by i_max and i_min, namely d_int(i_max, i_min) = n·d_max/6, n = 0, 1, 2, 3, 4, 5, with n the order of the measurement segment.
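The six-segment, three-window case can be sketched similarly. The (i_max, i_min) to segment lookup and the rising/falling parity follow from modelling the three radiant energy curves as ideal trapezoids offset by one third of a phase, which is an assumption made for illustration and is not spelled out in the patent text:

```python
def three_window_depth(a, d_max):
    """Depth from three radiant-energy samples A1..A3 offset by a third phase.

    Modelling assumption: ideal trapezoid curves (rise 1/6, high plateau 1/3,
    fall 1/6, low plateau 1/3 of the period), so each (i_max, i_min) pair
    identifies one of six segments; even segments have A_mid rising,
    odd segments have A_mid falling.
    """
    i_max = max(range(3), key=lambda i: a[i])
    i_min = min(range(3), key=lambda i: a[i])
    i_mid = 3 - i_max - i_min                   # the remaining (middle) curve
    segment = {(2, 1): 0, (0, 1): 1, (0, 2): 2, (1, 2): 3, (1, 0): 4, (2, 0): 5}
    n = segment[(i_max, i_min)]                 # segment order, n = 0..5
    d_int = n * d_max / 6.0                     # offset of the preceding segments
    span = a[i_max] - a[i_min]
    if n % 2 == 0:                              # A_mid on a rising edge
        frac = (a[i_mid] - a[i_min]) / span
    else:                                       # A_mid on a falling edge
        frac = (a[i_max] - a[i_mid]) / span
    return d_int + (d_max / 6.0) * frac

# A1 at maximum, A3 at minimum, A2 rising halfway -> segment 2, depth 2.5
# when d_max = 6.
print(three_window_depth([1.0, 0.5, 0.0], 6.0))
```

Three windows give six narrower segments than the four-window variant's four, at the cost of normalizing by the max-min span instead of a rise/fall pair.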
In the embodiment of the present invention, the maximum value max(B) of the radiant energy curve of the second optical signal is 2^N − 1, where N is the number of bits of the analog-to-digital converter.
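The full-scale value 2^N − 1 suggests normalizing raw converter counts before forming the ratios above; a small sketch (the 10-bit default is an arbitrary illustrative choice, not stated by the patent):

```python
def normalize_adc(counts, n_bits=10):
    """Scale raw ADC counts so that the stated curve maximum
    max(B) = 2**N - 1 maps to 1.0."""
    full_scale = 2 ** n_bits - 1
    return [c / full_scale for c in counts]

# With a 10-bit converter, full scale is 1023 counts.
print(normalize_adc([0, 511, 1023]))
```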
Fig. 7 is a flowchart of steps for generating depth information of a target object according to an embodiment of the present invention, as shown in fig. 7, the step S4 includes the following steps:
Step S401: arranging a target measuring section in a plurality of measuring sections where depth information of the target object is located according to the radiation energy of the three collected light signals;
Step S402: determining the proportion value of the radiation intensities of the three second optical signals in the target measurement section according to a preset proportion relation;
Step S403: and determining the depth of the radiation intensities of the three second optical signals corresponding to the target measuring section according to the proportion value, and further determining the depth information of the target object by accumulating measuring sections in front of the target measuring section in sequence.
Fig. 15 is a schematic block diagram of a scene depth measurement system according to an embodiment of the present invention, and as shown in fig. 15, the scene depth measurement system provided by the present invention includes the following modules:
The light projection module is used for controlling the light source to project a first light signal to a target object in a scene, and the first light signal is reflected by the target object to form a second light signal;
The optical receiving module is used for receiving the second optical signal at least three receiving windows through the image sensor to generate at least three collected optical signals, and the phase delays of the at least three collected optical signals relative to the second optical signal are different;
the measurement segment generating module is used for dividing a measurement interval into a plurality of measurement segments according to the pulse waveform of the second optical signal continuously received through at least three receiving windows, the measurement interval being determined according to the pulse width of the first optical signal;
and the depth calculation module is used for determining the depth information of the target object according to the plurality of measurement segments and the radiant energy of the at least three collected light signals.
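A minimal structural sketch of how the four modules might be wired together (all class and method names are assumptions, not taken from the patent):

```python
from dataclasses import dataclass
from typing import Callable, List, Sequence, Tuple

Segment = Tuple[float, float]  # (start depth, end depth) of one segment

@dataclass
class SceneDepthSystem:
    project: Callable[[], None]                 # light projection module
    collect: Callable[[], Sequence[float]]      # optical receiving module
    segment: Callable[[], List[Segment]]        # measurement segment module
    depth: Callable[[Sequence[float], List[Segment]], float]  # depth module

    def measure(self) -> float:
        self.project()              # project the first optical signal
        energies = self.collect()   # integrate the reflected second signal
        segments = self.segment()   # divide the measurement interval
        return self.depth(energies, segments)
```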
The embodiment of the invention also provides a scene depth measurement device comprising a processor and a memory storing instructions executable by the processor, wherein the processor is configured to perform the steps of the scene depth measurement method via execution of the executable instructions.
As described above, in this embodiment the measurement interval can be divided into a plurality of measurement segments according to the pulse waveform of the second optical signal continuously received through at least three receiving windows. The corresponding target segment can then be determined from the radiant energy of the at least three collected optical signals, the depth located within that segment, and the depth information of the target object obtained by accumulating, in sequence, the measurement segments preceding the target segment. Dividing the depth measurement of the whole interval into depth measurements over the individual segments thereby enables accurate calculation of the depth.
Those skilled in the art will appreciate that the various aspects of the invention may be implemented as a system, method, or program product. Accordingly, aspects of the invention may be embodied in the following forms: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining hardware and software aspects, which may be referred to herein as a "circuit," "module," or "platform."
Fig. 16 is a schematic structural view of a scene depth measurement device in an embodiment of the invention. An electronic device 600 according to this embodiment of the invention is described below with reference to fig. 16. The electronic device 600 shown in fig. 16 is merely an example, and should not be construed as limiting the functionality and scope of use of embodiments of the present invention.
As shown in fig. 16, the electronic device 600 is in the form of a general purpose computing device. Components of electronic device 600 may include, but are not limited to: at least one processing unit 610, at least one memory unit 620, a bus 630 connecting the different platform components (including memory unit 620 and processing unit 610), a display unit 640, etc.
Wherein the storage unit stores program code that is executable by the processing unit 610 such that the processing unit 610 performs the steps according to various exemplary embodiments of the present invention described in the above-mentioned scene depth measurement method section of the present specification. For example, the processing unit 610 may perform the steps as shown in fig. 1.
The storage unit 620 may include readable media in the form of volatile storage units, such as Random Access Memory (RAM) 6201 and/or cache memory unit 6202, and may further include Read Only Memory (ROM) 6203.
The storage unit 620 may also include a program/utility 6204 having a set (at least one) of program modules 6205, such program modules 6205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment.
Bus 630 may represent one or more of several types of bus structures, including a storage unit bus or storage unit controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures.
The electronic device 600 may also communicate with one or more external devices 700 (e.g., keyboard, pointing device, bluetooth device, etc.), one or more devices that enable a user to interact with the electronic device 600, and/or any device (e.g., router, modem, etc.) that enables the electronic device 600 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 650. Also, electronic device 600 may communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet, through network adapter 660. The network adapter 660 may communicate with other modules of the electronic device 600 over the bus 630. It should be appreciated that although not shown in fig. 16, other hardware and/or software modules may be used in connection with electronic device 600, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage platforms, and the like.
The embodiment of the invention also provides a computer-readable storage medium storing a program which, when executed, implements the steps of the scene depth measurement method. In some possible embodiments, the aspects of the invention may also be implemented in the form of a program product comprising program code which, when the program product is run on a terminal device, causes the terminal device to carry out the steps according to the various exemplary embodiments of the invention described in the scene depth measurement method section above.
As described above, the program of the computer-readable storage medium of this embodiment, when executed, divides the measurement interval into a plurality of measurement segments according to the pulse waveform of the second optical signal continuously received through at least three receiving windows. The corresponding target segment can then be determined from the radiant energy of the at least three collected optical signals, the depth located within that segment, and the depth information of the target object obtained by accumulating, in sequence, the measurement segments preceding the target segment. Dividing the depth measurement of the whole interval into depth measurements over the individual segments thereby enables accurate calculation of the depth.
Fig. 17 is a schematic diagram of a structure of a computer-readable storage medium in an embodiment of the present invention. Referring to fig. 17, a program product 800 for implementing the above-described method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A readable signal medium may also be any readable medium, other than a readable storage medium, that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of remote computing devices, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., connected via the Internet using an Internet service provider).
In the embodiment of the invention, the measurement interval is divided into a plurality of measurement segments according to the pulse waveform of the second optical signal continuously received through at least three receiving windows, so that the corresponding target segment can be determined from the radiant energy of the at least three collected optical signals, the depth located within that segment, and the depth information of the target object obtained by accumulating, in sequence, the measurement segments preceding the target segment. The depth calculation over the whole measurement interval is thus divided into depth measurements over the individual segments, enabling accurate calculation of the depth.
In the present specification, the embodiments are described in a progressive manner; each embodiment focuses on its differences from the others, and identical or similar parts among the embodiments may be referred to one another. The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing describes specific embodiments of the present invention. It is to be understood that the invention is not limited to the particular embodiments described above, and that various changes and modifications may be made by one skilled in the art within the scope of the claims without affecting the spirit of the invention.

Claims (11)

1. A scene depth measurement method, comprising the steps of:
Step S1: controlling a light source to project a first light signal to a target object in a scene, wherein the first light signal is reflected by the target object to form a second light signal;
Step S2: receiving the second optical signal in at least three receiving windows by an image sensor to generate at least three collected optical signals, the at least three collected optical signals having different phase delays relative to the second optical signal;
step S3: dividing a measurement interval into a plurality of measurement sections according to pulse waveforms of the second optical signals continuously received by at least three receiving windows, wherein the measurement interval is determined according to the pulse width of the first optical signals;
step S4: determining the corresponding measurement segment according to the radiation energy of the at least three collected light signals, and further determining depth information of the target object; the modulated first optical signal being a plurality of sequentially arranged, continuous rectangular pulses;
The step S4 includes the steps of:
step S401: determining, according to the radiation energy of the at least three collected light signals, the target measurement segment, among the plurality of measurement segments, in which the depth information of the target object lies;
step S402: determining the proportional value of the radiation intensities of the at least three second optical signals within the target measurement segment according to a preset proportional relation;
step S403: determining, from the proportional value, the depth within the target measurement segment that corresponds to the radiation intensities of the at least three second optical signals, and further determining the depth information of the target object by accumulating, in sequence, the measurement segments preceding the target measurement segment;
When the at least three receiving windows are a first receiving window, a second receiving window, a third receiving window, and a fourth receiving window, the step S4 specifically includes:
Setting A1 as a radiation energy curve of the second optical signal received by the first receiving window; a2 is a radiation energy curve of the second optical signal received by the second receiving window; a3 is a radiation energy curve of the second optical signal received by a third receiving window; a4 is a radiation energy curve of the second optical signal received by the fourth receiving window; the radiant energy curves A1, A2, A3 and A4 differ by a quarter phase;
Let
Among the radiant energy curves other than those taking the maximum and minimum values, the curve on a rising edge is set as the rise and the curve on a falling edge as the fall;
According to the values of i_max and i_min, the measurement interval can be divided into four measurement segments, wherein each measurement segment has a rising edge and a falling edge, and the depth d within one measurement segment is expressed as:
where d_max is the maximum measured depth, d_int(i_max, i_min) is the depth offset corresponding to the interval determined by (i_max, i_min), and n is the order of the measurement segments.
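The granted formulas are rendered as images in the source and do not survive extraction. One plausible reconstruction, offered purely as an assumption consistent with the quarter-phase, rise/fall interpolation described in the claim, is:

```latex
% Assumed form, not the granted equation:
% depth = segment offset + linear interpolation along the rise/fall pair
d = d_{\mathrm{int}}(i_{\max}, i_{\min})
  + \frac{d_{\max}}{4}\cdot\frac{A_{\mathrm{rise}}}{A_{\mathrm{rise}} + A_{\mathrm{fall}}},
\qquad d_{\mathrm{int}}(i_{\max}, i_{\min}) = \frac{n\, d_{\max}}{4},\; n \in \{0,1,2,3\}.
```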
2. The scene depth measurement method according to claim 1, wherein the step S1 comprises the steps of:
step S101: acquiring a preset modulation function, and generating a first modulation signal according to the modulation function;
Step S102: adjusting the light beam emitted by the light source according to the first modulation signal so that the light source generates the first light signal;
step S103: and projecting the first optical signal to the target object, wherein the first optical signal is reflected by the target object to form a second optical signal.
3. The scene depth measurement method according to claim 1, wherein the step S2 comprises the steps of:
Step S201: acquiring a preset receiving window, wherein the pulse width of the receiving window is larger than or smaller than that of the first optical signal, and the pulse waveform of the first optical signal is a plurality of rectangles which are sequentially arranged;
step S202: receiving the second optical signal through at least three receiving windows, the at least three receiving windows being arranged sequentially in time;
Step S203: generating each collected optical signal according to the second optical signal received by each receiving window, wherein the phase delay of the at least three collected optical signals relative to the second optical signal is different.
4. A scene depth measurement method according to claim 3, characterized in that said step S3 comprises the steps of:
Step S301: determining a measurement interval according to the pulse width of the first optical signal;
Step S302: dividing the measuring interval into a plurality of measuring sections according to the phase delay of the pulse waveform of the second optical signal continuously received by at least three receiving windows;
Step S303: the measurement segments are marked in sequence in a preset order.
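Steps S301–S303 above can be sketched as follows. The model is illustrative only: it assumes the measurement interval equals the number of windows times the depth spanned by one pulse width, and all names are hypothetical.

```python
# Hedged sketch of steps S301-S303: derive the measurement interval from
# the pulse width, split it into equal segments, and label them in order.
C = 299_792_458.0  # speed of light, m/s

def measurement_segments(pulse_width_s: float, n_windows: int):
    seg_depth = C * pulse_width_s / 2       # depth spanned by one pulse width
    # Step S301: measurement interval determined by the pulse width.
    # Step S302: one segment per receiving-window phase delay.
    # Step S303: mark segments in a preset (increasing-depth) order.
    return [(k, k * seg_depth, (k + 1) * seg_depth) for k in range(n_windows)]

segs = measurement_segments(10e-9, 4)
print(len(segs), round(segs[-1][2], 3))  # 4 segments; interval end ~5.996 m
```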
5. A scene depth measurement method, comprising the steps of:
Step S1: controlling a light source to project a first light signal to a target object in a scene, wherein the first light signal is reflected by the target object to form a second light signal;
Step S2: receiving the second optical signal in at least three receiving windows by an image sensor to generate at least three collected optical signals, the at least three collected optical signals having different phase delays relative to the second optical signal;
step S3: dividing a measurement interval into a plurality of measurement sections according to pulse waveforms of the second optical signals continuously received by at least three receiving windows, wherein the measurement interval is determined according to the pulse width of the first optical signals;
step S4: determining the corresponding measurement segment according to the radiation energy of the at least three collected light signals, and further determining depth information of the target object; the modulated first optical signal being a plurality of sequentially arranged, continuous rectangular pulses;
The step S4 includes the steps of:
step S401: determining, according to the radiation energy of the at least three collected light signals, the target measurement segment, among the plurality of measurement segments, in which the depth information of the target object lies;
step S402: determining the proportional value of the radiation intensities of the at least three second optical signals within the target measurement segment according to a preset proportional relation;
step S403: determining, from the proportional value, the depth within the target measurement segment that corresponds to the radiation intensities of the at least three second optical signals, and further determining the depth information of the target object by accumulating, in sequence, the measurement segments preceding the target measurement segment;
When the at least three receiving windows are a first receiving window, a second receiving window and a third receiving window, the step S4 specifically includes:
Setting A1 as a radiation energy curve of the second optical signal received by the first receiving window; a2 is a radiation energy curve of the second optical signal received by the second receiving window; a3 is a radiation energy curve of the second optical signal received by a third receiving window; the radiant energy curve A1, the radiant energy curve A2 and the radiant energy curve A3 differ by one third of the phase;
Let
The radiant energy curve other than those taking the maximum and minimum values, whether on a rising or a falling edge, is set as A_mid;
According to the values of i_max and i_min, the measurement interval can be divided into six measurement segments, of which three lie on falling edges and three on rising edges;
The depth d within the corresponding measurement segment for the falling edge is denoted as:
The depth d within the corresponding measurement segment for the rising edge is denoted as:
where d_max is the maximum measured depth, d_int(i_max, i_min) is the depth offset corresponding to the interval determined by (i_max, i_min), and n is the order of the measurement segments.
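As with claim 1, the formulas here are images in the source and are not reproduced. A plausible piecewise form, stated strictly as an assumption for the six-segment, one-third-phase scheme, would interpolate with the middle curve A_mid between the extremes:

```latex
% Assumed forms, not the granted equations. With
% r = \frac{A_{\mathrm{mid}} - A(i_{\min})}{A(i_{\max}) - A(i_{\min})}:
d_{\mathrm{fall}} = d_{\mathrm{int}}(i_{\max}, i_{\min}) + \frac{d_{\max}}{6}\,(1 - r),
\qquad
d_{\mathrm{rise}} = d_{\mathrm{int}}(i_{\max}, i_{\min}) + \frac{d_{\max}}{6}\, r .
```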
6. The scene depth measurement method according to claim 5, wherein the step S1 comprises the steps of:
step S101: acquiring a preset modulation function, and generating a first modulation signal according to the modulation function;
Step S102: adjusting the light beam emitted by the light source according to the first modulation signal so that the light source generates the first light signal;
step S103: and projecting the first optical signal to the target object, wherein the first optical signal is reflected by the target object to form a second optical signal.
7. The scene depth measurement method according to claim 5, wherein the step S2 comprises the steps of:
Step S201: acquiring a preset receiving window, wherein the pulse width of the receiving window is larger than or smaller than that of the first optical signal, and the pulse waveform of the first optical signal is a plurality of rectangles which are sequentially arranged;
step S202: receiving the second optical signal through at least three receiving windows, the at least three receiving windows being arranged sequentially in time;
Step S203: generating each collected optical signal according to the second optical signal received by each receiving window, wherein the phase delay of the at least three collected optical signals relative to the second optical signal is different.
8. The scene depth measurement method according to claim 7, wherein the step S3 comprises the steps of:
Step S301: determining a measurement interval according to the pulse width of the first optical signal;
Step S302: dividing the measuring interval into a plurality of measuring sections according to the phase delay of the pulse waveform of the second optical signal continuously received by at least three receiving windows;
Step S303: the measurement segments are marked in sequence in a preset order.
9. A scene depth measurement system, comprising the following modules:
The light projection module is used for controlling the light source to project a first light signal to a target object in a scene, and the first light signal is reflected by the target object to form a second light signal;
The optical receiving module is used for receiving the second optical signal in at least three receiving windows through the image sensor to generate at least three collected optical signals, the at least three collected optical signals having different phase delays relative to the second optical signal;
the measurement segment generating module is used for dividing a measurement interval into a plurality of measurement segments according to the pulse waveform of the second optical signal continuously received through at least three receiving windows, the measurement interval being determined according to the pulse width of the first optical signal;
a depth calculation module for determining depth information of the target object according to the plurality of measurement segments and the radiation energy of the at least three collected light signals; the modulated first optical signal being a plurality of sequentially arranged, continuous rectangular pulses;
When the at least three receiving windows are a first receiving window, a second receiving window, a third receiving window and a fourth receiving window, the method specifically comprises the following steps:
Setting A1 as a radiation energy curve of the second optical signal received by the first receiving window; a2 is a radiation energy curve of the second optical signal received by the second receiving window; a3 is a radiation energy curve of the second optical signal received by a third receiving window; a4 is a radiation energy curve of the second optical signal received by the fourth receiving window; the radiant energy curves A1, A2, A3 and A4 differ by a quarter phase;
Let
Among the radiant energy curves other than those taking the maximum and minimum values, the curve on a rising edge is set as the rise and the curve on a falling edge as the fall;
According to the values of i_max and i_min, the measurement interval can be divided into four measurement segments, wherein each measurement segment has a rising edge and a falling edge, and the depth d within one measurement segment is expressed as:
where d_max is the maximum measured depth, d_int(i_max, i_min) is the depth offset corresponding to the interval determined by (i_max, i_min), and n is the order of the measurement segments.
10. A scene depth measurement device, comprising:
a processor;
A memory having stored therein executable instructions of the processor;
wherein the processor is configured to perform the steps of the scene depth measurement method of any of claims 1 to 8 via execution of the executable instructions.
11. A computer-readable storage medium storing a program, characterized in that the program when executed implements the steps of the scene depth measurement method according to any one of claims 1 to 8.
CN202010216887.3A 2020-03-25 2020-03-25 Scene depth measurement method, system, equipment and storage medium Active CN113447954B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010216887.3A CN113447954B (en) 2020-03-25 2020-03-25 Scene depth measurement method, system, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010216887.3A CN113447954B (en) 2020-03-25 2020-03-25 Scene depth measurement method, system, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113447954A CN113447954A (en) 2021-09-28
CN113447954B true CN113447954B (en) 2024-06-04

Family

ID=77806848

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010216887.3A Active CN113447954B (en) 2020-03-25 2020-03-25 Scene depth measurement method, system, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113447954B (en)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6057909A (en) * 1995-06-22 2000-05-02 3Dv Systems Ltd. Optical ranging camera
CA2838857A1 (en) * 2013-01-15 2014-07-15 Cgg Services Sa Seismic data processing including true-azimuth three-dimensional internal multiple attenuation without subsurface information
WO2015195318A1 (en) * 2014-06-20 2015-12-23 Qualcomm Incorporated Automatic multiple depth cameras synchronization using time sharing
CN106464858A (en) * 2014-04-26 2017-02-22 泰特拉维公司 Method and system for robust and extended illumination waveforms for depth sensing in 3D imaging
WO2019091116A1 (en) * 2017-11-10 2019-05-16 Guangdong Kang Yun Technologies Limited Systems and methods for 3d scanning of objects by providing real-time visual feedback
CN109917412A (en) * 2019-02-01 2019-06-21 深圳奥比中光科技有限公司 A kind of distance measurement method and depth camera
WO2019129546A1 (en) * 2017-12-26 2019-07-04 Robert Bosch Gmbh Single-chip rgb-d camera
CN109991583A (en) * 2019-03-14 2019-07-09 深圳奥比中光科技有限公司 A kind of jamproof distance measurement method and depth camera
CN109991584A (en) * 2019-03-14 2019-07-09 深圳奥比中光科技有限公司 A kind of jamproof distance measurement method and depth camera
CN110168398A (en) * 2018-07-18 2019-08-23 深圳市汇顶科技股份有限公司 Range-measurement system and bearing calibration when flying
CN110221272A (en) * 2019-05-09 2019-09-10 深圳奥比中光科技有限公司 Time flight depth camera and jamproof distance measurement method
CN110412599A (en) * 2018-04-27 2019-11-05 索尼半导体解决方案公司 Range measurement processing unit, distance-measurement module and range measurement processing method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9921300B2 (en) * 2014-05-19 2018-03-20 Rockwell Automation Technologies, Inc. Waveform reconstruction in a time-of-flight sensor

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6057909A (en) * 1995-06-22 2000-05-02 3Dv Systems Ltd. Optical ranging camera
CA2838857A1 (en) * 2013-01-15 2014-07-15 Cgg Services Sa Seismic data processing including true-azimuth three-dimensional internal multiple attenuation without subsurface information
CN106464858A (en) * 2014-04-26 2017-02-22 泰特拉维公司 Method and system for robust and extended illumination waveforms for depth sensing in 3D imaging
WO2015195318A1 (en) * 2014-06-20 2015-12-23 Qualcomm Incorporated Automatic multiple depth cameras synchronization using time sharing
WO2019091116A1 (en) * 2017-11-10 2019-05-16 Guangdong Kang Yun Technologies Limited Systems and methods for 3d scanning of objects by providing real-time visual feedback
WO2019129546A1 (en) * 2017-12-26 2019-07-04 Robert Bosch Gmbh Single-chip rgb-d camera
CN110412599A (en) * 2018-04-27 2019-11-05 索尼半导体解决方案公司 Range measurement processing unit, distance-measurement module and range measurement processing method
CN110168398A (en) * 2018-07-18 2019-08-23 深圳市汇顶科技股份有限公司 Range-measurement system and bearing calibration when flying
WO2020014902A1 (en) * 2018-07-18 2020-01-23 深圳市汇顶科技股份有限公司 Time-of-flight system and calibration method
CN109917412A (en) * 2019-02-01 2019-06-21 深圳奥比中光科技有限公司 A kind of distance measurement method and depth camera
CN109991583A (en) * 2019-03-14 2019-07-09 深圳奥比中光科技有限公司 A kind of jamproof distance measurement method and depth camera
CN109991584A (en) * 2019-03-14 2019-07-09 深圳奥比中光科技有限公司 A kind of jamproof distance measurement method and depth camera
CN110221272A (en) * 2019-05-09 2019-09-10 深圳奥比中光科技有限公司 Time flight depth camera and jamproof distance measurement method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Three-dimensional point cloud reconstruction and body size measurement of pigs using multi-view depth cameras; Yin Ling et al.; Transactions of the Chinese Society of Agricultural Engineering; 2019-12-08; pp. 201-208 *

Also Published As

Publication number Publication date
CN113447954A (en) 2021-09-28

Similar Documents

Publication Publication Date Title
US11625845B2 (en) Depth measurement assembly with a structured light source and a time of flight camera
US20230176223A1 (en) Processing system for lidar measurements
US20200081095A1 (en) Method and apparatus for generating object detection box, device, storage medium, and vehicle
US10545237B2 (en) Method and device for acquiring distance information
EP2936204B1 (en) Multiple frequency time of flight dealiasing
EP3971685A1 (en) Interactive control method and apparatus, electronic device and storage medium
US9628774B2 (en) Image processing method and apparatus
CN105190426A (en) Time of flight sensor binning
US12000963B2 (en) LiDAR device and method of operating the same
US20210256740A1 (en) Method for increasing point cloud sampling density, point cloud processing system, and readable storage medium
CN112824934B (en) TOF multipath interference removal method, system, equipment and medium based on modulated light field
CN110687516A (en) Control method, device and system for light beam scanning and corresponding medium
CN108646917B (en) Intelligent device control method and device, electronic device and medium
EP4206723A1 (en) Ranging method and device, storage medium, and lidar
Hussmann et al. Modulation method including noise model for minimizing the wiggling error of TOF cameras
KR102610830B1 (en) Method and device for acquiring distance information
CN113256539A (en) Depth image de-aliasing method, device, equipment and computer storage medium
CN113447954B (en) Scene depth measurement method, system, equipment and storage medium
KR20210036200A (en) LiDAR device and operating method of the same
CN112824935A (en) Depth imaging system, method, device and medium based on modulated light field
CN112513670B (en) Distance meter, distance measuring system, distance measuring method and program
CN109618085B (en) Electronic equipment and mobile platform
CN113514851B (en) Depth camera
CN113009498A (en) Distance measuring method, device and system
CN112987022A (en) Distance measurement method and device, computer readable medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant