CN112487655A - Phase folding optimization method, device, medium and equipment for TOF camera - Google Patents

Info

Publication number
CN112487655A
Authority
CN
China
Prior art keywords: double, value, depth, amplitude, map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011445773.2A
Other languages
Chinese (zh)
Inventor
王俊
应忍冬
刘佩林
葛昊
邹耀
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Data Miracle Intelligent Technology Co ltd
Original Assignee
Shanghai Data Miracle Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Data Miracle Intelligent Technology Co., Ltd.
Priority to CN202011445773.2A
Publication of CN112487655A
Legal status: Pending (current)

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00 - Computer-aided design [CAD]
    • G06F30/20 - Design optimisation, verification or simulation
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10 - Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106 - Processing image signals
    • H04N13/122 - Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10 - Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106 - Processing image signals
    • H04N13/128 - Adjusting depth or disparity
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F2111/00 - Details relating to CAD techniques
    • G06F2111/04 - Constraint-based CAD

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Evolutionary Computation (AREA)
  • Geometry (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)

Abstract

The embodiment of the invention discloses a phase folding optimization method for a TOF camera. A double-depth map and a double-amplitude map are acquired by setting double initial modulation frequencies; an optimization model is constructed according to the double-depth map and the double-amplitude map, and an ambiguity combination solution and a model loss value are determined; and when the model loss value is smaller than a preset threshold, the depth image is restored according to the ambiguity combination solution, effectively reducing the influence of noise and improving the quality of the depth image.

Description

Phase folding optimization method, device, medium and equipment for TOF camera
Technical Field
The embodiment of the invention relates to the technical field of 3D depth cameras, and in particular to a phase folding optimization method, device, medium and equipment for a TOF camera.
Background
The 3D depth camera is a technology that has emerged in recent years. Compared with a conventional camera, the depth camera adds depth measurement, so that the surrounding environment and its changes can be sensed more conveniently and accurately.
The 3D depth camera has a wide range of applications, such as gesture recognition, robot platforms, and consumer electronics. Currently, common depth cameras include binocular depth cameras, structured light cameras, and Time-of-Flight (ToF) depth cameras. The ToF depth camera obtains object distance information by directly measuring the flight time of light, requires little computation, provides stable 3D imaging, and stands out among the three depth imaging technologies. However, ToF cameras still suffer from various kinds of noise and from problems such as motion blur, multipath interference, and distance folding. In the prior art, algorithms that handle distance folding in ToF cameras still show poor robustness.
Disclosure of Invention
The embodiment of the invention provides a phase folding optimization method, device, medium and equipment for a TOF camera, which can effectively reduce the influence of noise and improve the quality of the depth image.
In a first aspect, an embodiment of the present invention provides a method for optimizing phase folding in a TOF camera, where the method includes:
setting double initial modulation frequencies to obtain a double-depth map and a double-amplitude map;
constructing an optimization model according to the double-depth map and the double-amplitude map, and determining an ambiguity combination solution and a model loss value;
and when the model loss value is smaller than a preset threshold, restoring the depth image according to the ambiguity combination solution.
Optionally, the method further includes:
when the model loss value is not smaller than the preset threshold, determining an initial-solution depth value according to the currently acquired depth map and amplitude map;
and, according to the initial-solution depth value, continuously updating the modulation frequency and the integration time to obtain an augmented-term ambiguity value and an updated model loss value, until the updated model loss value is smaller than the preset threshold, and then restoring the depth image according to the ambiguity combination solution and the augmented-term ambiguity value.
Optionally, the constructing an optimization model according to the double-depth map and the double-amplitude map, and determining an ambiguity combination solution and a model loss value includes:
according to the double-depth map d1, d2, constructing a data constraint term loss(d1, d2; k1, k2) for each pixel, where k1, k2 denote the ambiguity combination solution;
according to the double-depth map d1, d2 and the double-amplitude map a1, a2, constructing an amplitude constraint term loss(a1, a2, d1, d2; k1, k2) for each pixel;
constructing a neighborhood constraint term loss(i, Ni; ki) for each pixel according to the neighborhood continuity principle, where i denotes the target pixel and Ni denotes the neighborhood set of pixel i;
min over (k1, k2) of Σ_i [ loss(d1, d2; k1, k2) + λ1·loss(a1, a2, d1, d2; k1, k2) + λ2·loss(i, Ni; ki) ]
and determining the optimal ambiguity combination solution and the corresponding loss value loss according to the above formula.
Optionally, the setting of the dual initial modulation frequency to obtain the dual depth map and the dual amplitude map includes:
setting double initial modulation frequencies and initial integration time through a sensor;
and acquiring the double-depth map and the double-amplitude map based on an I-ToF imaging principle.
Optionally, the obtaining the dual depth map and the dual amplitude map based on the I-ToF imaging principle includes:
d = c·φ/(4π·f)
a = sqrt((Q1 - Q3)² + (Q4 - Q2)²)/2
where c denotes the speed of light, f the amplitude modulation frequency of the optical signal, d the distance from the object to the camera, φ the phase difference between the transmitted and received signals, and a the received light amplitude;
where the phase difference φ between the transmitted and received signals is calculated as:
φ = arctan((Q4 - Q2)/(Q1 - Q3))
where {Q1, Q2, Q3, Q4} are the correlation integration values measured by the ToF device.
Optionally, the restoring the depth image according to the ambiguity combination solution and the augmented-term ambiguity value includes:
d_recover = (1/K) · Σ_{j=1..K} ( dj + kj·c/(2·fj) )
where d_recover is the recovered depth value, dj is the depth value obtained at the j-th modulation frequency fj, kj is the ambiguity solution corresponding to the j-th modulation frequency fj, and K is an integer greater than 1 with j = 1, …, K.
In a second aspect, an embodiment of the present invention provides an apparatus for optimizing phase folding in a TOF camera, the apparatus including:
the double-image acquisition module is used for setting double initial modulation frequencies to acquire a double-depth image and a double-amplitude image;
the model solution module is used for constructing an optimization model according to the double-depth map and the double-amplitude map and determining an ambiguity combination solution and a model loss value;
and the restoring module is used for restoring the depth image according to the ambiguity combination solution when the model loss value is smaller than a preset threshold.
Optionally, the recovery module is further configured to:
when the model loss value is not smaller than the preset threshold, determining an initial-solution depth value according to the currently acquired depth map and amplitude map; and, according to the initial-solution depth value, continuously updating the modulation frequency and the integration time to obtain an augmented-term ambiguity value and an updated model loss value, until the updated model loss value is smaller than the preset threshold, and then restoring the depth image according to the ambiguity combination solution and the augmented-term ambiguity value.
Optionally, the solution model module is specifically configured to:
according to the double-depth map d1, d2, constructing a data constraint term loss(d1, d2; k1, k2) for each pixel, where k1, k2 denote the ambiguity combination solution;
according to the double-depth map d1, d2 and the double-amplitude map a1, a2, constructing an amplitude constraint term loss(a1, a2, d1, d2; k1, k2) for each pixel;
constructing a neighborhood constraint term loss(i, Ni; ki) for each pixel according to the neighborhood continuity principle, where i denotes the target pixel and Ni denotes the neighborhood set of pixel i;
min over (k1, k2) of Σ_i [ loss(d1, d2; k1, k2) + λ1·loss(a1, a2, d1, d2; k1, k2) + λ2·loss(i, Ni; ki) ]
and determining the optimal ambiguity combination solution and the corresponding loss value loss according to the above formula.
Optionally, the dual map acquisition module is specifically configured to:
setting double initial modulation frequencies and initial integration time through a sensor;
and acquiring the double-depth map and the double-amplitude map based on an I-ToF imaging principle.
Optionally, the obtaining of the dual depth map and the dual amplitude map based on the I-ToF imaging principle in the dual map obtaining module specifically includes:
d = c·φ/(4π·f)
a = sqrt((Q1 - Q3)² + (Q4 - Q2)²)/2
where c denotes the speed of light, f the amplitude modulation frequency of the optical signal, d the distance from the object to the camera, φ the phase difference between the transmitted and received signals, and a the received light amplitude;
where the phase difference φ between the transmitted and received signals is calculated as:
φ = arctan((Q4 - Q2)/(Q1 - Q3))
where {Q1, Q2, Q3, Q4} are the correlation integration values measured by the ToF device.
Optionally, the recovery module is specifically configured to:
d_recover = (1/K) · Σ_{j=1..K} ( dj + kj·c/(2·fj) )
where d_recover is the recovered depth value, dj is the depth value obtained at the j-th modulation frequency fj, kj is the ambiguity solution corresponding to the j-th modulation frequency fj, and K is an integer greater than 1 with j = 1, …, K.
In a third aspect, embodiments of the present invention provide a computer-readable storage medium on which is stored a computer program which, when executed by a processor, implements a method as described above for phase folding optimization in a TOF camera.
In a fourth aspect, embodiments of the present invention provide an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the method for phase folding optimization in a TOF camera as described above when executing the computer program.
In the embodiment of the invention, a double-depth map and a double-amplitude map are obtained by setting double initial modulation frequencies; an optimization model is constructed according to the double-depth map and the double-amplitude map, and an ambiguity combination solution and a model loss value are determined; and when the model loss value is smaller than a preset threshold, the depth image is restored according to the ambiguity combination solution, effectively reducing the influence of noise and improving the quality of the depth image.
Drawings
FIG. 1A is a schematic of the internal correlation integral of I-ToF;
FIG. 1B is a flowchart of a method for optimizing phase folding in a TOF camera according to an embodiment of the present invention;
FIG. 2 is a flowchart of an exemplary method for optimizing phase folding in a TOF camera according to a second embodiment of the present invention;
fig. 3 is a schematic structural diagram of an optimization apparatus for phase folding in a TOF camera according to a third embodiment of the present invention;
fig. 4 is a schematic structural diagram of an electronic device according to a fifth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Before discussing exemplary embodiments in more detail, it should be noted that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart may describe the steps as a sequential process, many of the steps can be performed in parallel, concurrently or simultaneously. In addition, the order of the steps may be rearranged. The process may be terminated when its operations are completed, but may have additional steps not included in the figure. The processes may correspond to methods, functions, procedures, subroutines, and the like.
The ToF camera measures depth using (near-)infrared signals and, according to the depth calculation principle, can be classified into direct Time-of-Flight (D-ToF) cameras and indirect Time-of-Flight (I-ToF) cameras. Because the imaging principles differ, distance folding does not take the same form in the two types. The algorithm discussed in the embodiments of the present invention is directed to the I-ToF camera among ToF depth cameras. The I-ToF camera converts the phase difference between the reflected signal and the incident signal into a time-of-flight difference, and the distance folding phenomenon is therefore also commonly referred to as phase folding.
When the I-ToF camera works, the sensor first emits modulated infrared light, receives the light reflected from an object, and obtains the distance of the target object by calculating the phase difference between the incident and reflected light, as shown in FIG. 1A, which is a schematic diagram of the internal correlation integration of I-ToF.
Specifically, the I-ToF camera calculates, by circuitry, the correlation between the received modulation signal and specific reference signals, generally using 4-phase reference signals, i.e., performing correlation integration between the received signal and reference signals whose phases differ by 0°, 90°, 180°, and 270°, respectively. As shown in FIG. 1A, the 4 integration values are denoted {Q1, Q2, Q3, Q4}, from which the phase difference can be calculated:
φ = arctan((Q4 - Q2)/(Q1 - Q3))
In addition, the received light amplitude a can also be obtained from the 4 correlation integration values:
a = sqrt((Q1 - Q3)² + (Q4 - Q2)²)/2
the relationship between the modulation frequency of the optical signal, the phase difference of the transmission and reception signals, and the distance is:
d = c·φ/(4π·f)
where c represents the speed of light, f represents the amplitude modulation frequency of the light signal, phi represents the phase difference between the transmitted and received signals, and d represents the distance of the object from the camera.
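The correlation-sample arithmetic above can be collected into a small helper. The following NumPy sketch is illustrative only: the function and argument names are not from the application, and it simply implements the relations as reconstructed above, namely φ = arctan((Q4 - Q2)/(Q1 - Q3)), a = sqrt((Q1 - Q3)² + (Q4 - Q2)²)/2 and d = c·φ/(4π·f).

```python
import numpy as np

def itof_measurement(q1, q2, q3, q4, f_mod, c=3.0e8):
    """Per-pixel phase, amplitude and (wrapped) depth from the four
    correlation samples Q1..Q4; argument names are illustrative."""
    # Phase difference of the received signal, wrapped to [0, 2*pi)
    phi = np.mod(np.arctan2(q4 - q2, q1 - q3), 2.0 * np.pi)
    # Received light amplitude
    amp = 0.5 * np.sqrt((q1 - q3) ** 2 + (q4 - q2) ** 2)
    # Wrapped depth: d = c * phi / (4 * pi * f)
    depth = c * phi / (4.0 * np.pi * f_mod)
    return phi, amp, depth
```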
The phase difference of the I-ToF camera is generally obtained using an arctangent function. Due to the periodicity of the arctangent function, when the actual phase difference between the incident and reflected light exceeds 2π, a phase folding phenomenon occurs, resulting in an error in the depth measurement. The number of whole cycles of phase difference between the actual phase (depth) and the measured phase (depth) is also called the ambiguity.
φ_gt = φ_mea + k·2π
d_gt = d_mea + k·c/(2f)
where d_mea is the depth value output by the sensor, d_gt is the actual depth value, φ_gt is the theoretical phase value, φ_mea is the measured phase value output by the sensor, k is the ambiguity, c is the speed of light, and f is the modulation frequency of the camera.
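As an illustration with made-up numbers: at f = 80 MHz the single-frequency unambiguous range is c/(2f) ≈ 1.875 m, so an object actually located 2.5 m from the camera is reported at about 0.625 m, with ambiguity k = 1.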
However, most existing phase unwrapping algorithms are based on single-frequency measurement and construct an optimization model from amplitude-depth constraints and neighborhood pixel depth constraints. Such methods oversimplify the amplitude model, and their robustness is poor in scenes with high ambiguity. Methods based on multi-frequency measurement usually fix the frequencies, construct an optimization model from the constraint relations among the multi-frequency measurements, and solve for the optimal ambiguity combination. However, this class of methods neglects the influence of the choice of modulation frequencies on the complexity and accuracy of solving the optimization model. For any given set of modulation frequencies, the solution over certain distance ranges may fall into local optima due to noise.
Example one
Fig. 1B is a flowchart of a method for optimizing phase folding in a TOF camera according to an embodiment of the present invention, where the method may be performed by an apparatus for optimizing phase folding in a TOF camera according to an embodiment of the present invention, and the apparatus may be implemented in software and/or hardware. The method specifically comprises the following steps:
and S110, setting double initial modulation frequencies to obtain a double-depth map and a double-amplitude map.
Modulation frequency refers to the frequency of the carrier wave used to express information through changes of the instantaneous carrier frequency; different information is expressed by different carrier frequencies. The initial modulation frequency refers to the modulation frequency set at the beginning in the embodiment of the present invention. The depth map and the amplitude map refer to the depth and amplitude images generated for a given input modulation frequency.
Specifically, in the embodiment of the present invention, double initial modulation frequencies, that is, two initial modulation frequencies, are set; the values may be chosen based on the operator's prior experience. After the double initial modulation frequencies are set, the two depth maps and two amplitude maps corresponding to the two frequencies are obtained. For example, this embodiment may set the initial modulation frequencies to f1 = 80 MHz and f2 = 100 MHz. In addition, this embodiment sets the integration time to 750 microseconds.
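It may be noted that, with f1 = 80 MHz and f2 = 100 MHz, the single-frequency unambiguous ranges are about 1.875 m and 1.5 m respectively, while combining the two measurements is, in principle, unambiguous up to roughly c/(2·gcd(f1, f2)) = c/(2·20 MHz) = 7.5 m, which is one motivation for the dual-frequency setup.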
Optionally, the setting of the dual initial modulation frequency to obtain the dual depth map and the dual amplitude map includes: setting double initial modulation frequencies and initial integration time through a sensor; and acquiring the double-depth map and the double-amplitude map based on an I-ToF imaging principle.
Optionally, the obtaining the dual depth map and the dual amplitude map based on the I-ToF imaging principle includes:
d = c·φ/(4π·f)
a = sqrt((Q1 - Q3)² + (Q4 - Q2)²)/2
where c denotes the speed of light, f the amplitude modulation frequency of the optical signal, d the distance from the object to the camera, φ the phase difference between the transmitted and received signals, and a the received light amplitude;
where the phase difference φ between the transmitted and received signals is calculated as:
φ = arctan((Q4 - Q2)/(Q1 - Q3))
where {Q1, Q2, Q3, Q4} are the correlation integration values measured by the ToF device.
And S120, constructing an optimization model according to the double-depth map and the double-amplitude map, and determining an ambiguity combination solution and a model loss value.
Here, the ambiguity refers to the number of whole phase cycles between the actual phase (depth) and the measured phase (depth); the ambiguity combination solution refers to the ambiguity values obtained by solving the constructed optimization model; and the model loss value refers to the loss produced by the model when solving for the ambiguity.
Optionally, the constructing an optimization model according to the double-depth map and the double-amplitude map, and determining an ambiguity combination solution and a model loss value includes:
according to the double-depth map d1, d2, constructing a data constraint term loss(d1, d2; k1, k2) for each pixel, where k1, k2 denote the ambiguity combination solution;
according to the double-depth map d1, d2 and the double-amplitude map a1, a2, constructing an amplitude constraint term loss(a1, a2, d1, d2; k1, k2) for each pixel;
constructing a neighborhood constraint term loss(i, Ni; ki) for each pixel according to the neighborhood continuity principle, where i denotes the target pixel and Ni denotes the neighborhood set of pixel i. For example, Ni may be taken as the 8-neighborhood of pixel i, that is, the 8 pixels above, below, to the left, to the right, and at the four diagonal positions of pixel i.
min over (k1, k2) of Σ_i [ loss(d1, d2; k1, k2) + λ1·loss(a1, a2, d1, d2; k1, k2) + λ2·loss(i, Ni; ki) ]
Here, λ denotes the weight coefficient that sets the proportion of the different loss terms; λ1 may be set to 0.8 and λ2 may be set to 0.4.
And determining an optimal ambiguity combined solution and a corresponding loss value loss according to the formula.
And S130, when the model loss value is smaller than a preset threshold, restoring the depth image according to the ambiguity combination solution.
The preset threshold may be set to any value, for example, the preset threshold may be set to 0.05.
Optionally, the embodiment of the present invention further includes: when the model loss value is not smaller than the preset threshold, determining an initial-solution depth value according to the currently acquired depth map and amplitude map; and, according to the initial-solution depth value, continuously updating the modulation frequency and the integration time to obtain an augmented-term ambiguity value and an updated model loss value, until the updated model loss value is smaller than the preset threshold, and then restoring the depth image according to the ambiguity combination solution and the augmented-term ambiguity value.
The initial-solution depth value is computed from the ambiguity combination solution obtained by solving the optimization model constructed after the two initial modulation frequencies are set; for example, the solution formula is:
d_recover1 = (1/2) · Σ_{j=1,2} ( dj + kj·c/(2·fj) )
where d_recover1 denotes the initial-solution depth value.
The augmented-term ambiguity value refers to the ambiguity value obtained when an additional modulation frequency is introduced, on top of the double initial modulation frequencies, and the optimization model is constructed again.
Specifically, when the model loss value is smaller than the preset threshold, the depth image is restored according to the ambiguity combination solution; when the model loss value is not smaller than the preset threshold, the initial-solution depth value is determined from the currently acquired depth map and amplitude map. The initial modulation frequency and the integration time are then adjusted according to the initial-solution depth value; for example, when the initial depth value is small, the integration time is decreased, and when the initial depth value is large, the integration time is increased. On the basis of the double initial modulation frequencies, the embodiment of the invention adds a new modulation frequency and constructs the optimization model again, so as to solve the ambiguity combination and the augmented-term ambiguity value more accurately.
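The text only indicates the direction of this adjustment (shorter integration time for near scenes, longer for far scenes, and a new frequency chosen according to the initial-solution depth) without giving a concrete rule, so the helper below is purely an assumed placeholder; it is reused in the loop sketch in embodiment two.

```python
import numpy as np

def choose_params(d_init, base_freq=100e6, base_tint=750e-6):
    """Assumed update rule: derive the next modulation frequency and
    integration time from the overall depth of the initial solution.
    Far scenes get a lower frequency (longer unambiguous range) and a
    longer integration time; near scenes the opposite. Heuristic only."""
    d_med = float(np.median(d_init))
    f_new = base_freq / (1.0 + d_med / 5.0)   # illustrative scaling with depth
    t_int = base_tint * (1.0 + d_med / 5.0)   # integration time grows with depth
    return f_new, t_int
```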
Optionally, the restoring the depth image according to the ambiguity combination solution and the augmented-term ambiguity value includes:
d_recover = (1/K) · Σ_{j=1..K} ( dj + kj·c/(2·fj) )
where d_recover is the recovered depth value, dj is the depth value obtained at the j-th modulation frequency fj, kj is the ambiguity solution corresponding to the j-th modulation frequency fj, and K is an integer greater than 1 with j = 1, …, K.
Specifically, K is related to the number of modulation frequencies that are set. For example, if the model loss value after setting the double initial modulation frequencies is already smaller than the preset threshold, then K = 2, and the initial-solution depth value d_recover1 is the finally restored depth image.
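A minimal sketch of this recovery step, assuming the reconstructed formula d_recover = (1/K)·Σ_j (dj + kj·c/(2·fj)); the function and variable names are illustrative.

```python
import numpy as np

C = 3.0e8  # speed of light, m/s

def recover_depth(depth_maps, ambiguity_maps, freqs):
    """Average the per-frequency unwrapped depths:
    d_recover = (1/K) * sum_j ( d_j + k_j * c / (2 * f_j) )."""
    unwrapped = [d + k * C / (2.0 * f)
                 for d, k, f in zip(depth_maps, ambiguity_maps, freqs)]
    return np.mean(unwrapped, axis=0)

# Dual-frequency case (K = 2) of this embodiment:
# d_init = recover_depth([d1, d2], [k1, k2], [80e6, 100e6])
```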
In the embodiment of the invention, a double-depth map and a double-amplitude map are obtained by setting double initial modulation frequencies; an optimization model is constructed according to the double-depth map and the double-amplitude map, and an ambiguity combination solution and a model loss value are determined; and when the model loss value is smaller than a preset threshold, the depth image is restored according to the ambiguity combination solution, effectively reducing the influence of noise and improving the quality of the depth image.
Example two
Fig. 2 is a flowchart of an exemplary method for optimizing phase folding in a TOF camera according to a second embodiment of the present invention.
The method sets two initial modulation frequencies, constructs an optimization model from the dual-frequency depth data to obtain an initial solution of the true depth, determines new modulation frequency and integration time parameters according to the initial solution and the corresponding loss, solves for the true depth and the model loss by incorporating the new depth measurements, and iterates continuously until the termination condition is met, thereby realizing phase folding repair based on multi-frequency measurement and sensor-in-the-loop control.
The method comprises the following steps:
and S1, initializing the double modulation frequency, and acquiring a corresponding double-frequency depth map and an amplitude map.
S2, constructing an optimization model based on the dual-frequency measurement data, and solving for the ambiguities k1, k2 and the model loss loss.
S3, if the loss satisfies loss &lt; loss_th, executing step S6; otherwise, determining new modulation frequency and integration time parameters, and acquiring a new depth map and amplitude map.
And S4, incorporating the new depth and amplitude data, solving the optimization model again, and obtaining the ambiguity and the model loss.
S5, repeating the steps (S3-S4).
And S6, generating the highest-quality true depth map based on the multi-frequency measurement data and the solved model.
In the above step S1, the embodiment of the present invention includes the following steps:
step S11, setting the initial modulation frequencies to f1 = 80 MHz and f2 = 100 MHz;
step S12, based on the I-ToF imaging principle, obtaining the two depth maps d1, d2 and the two amplitude maps a1, a2 corresponding to these frequencies.
In the above step S2, the embodiment of the present invention includes the following steps:
step S21, according to the dual-frequency depth data, constructing a data constraint term loss(d1, d2; k1, k2) for each pixel;
step S22, according to the amplitude and depth data, constructing an amplitude constraint term loss(a1, a2, d1, d2; k1, k2) for each pixel;
step S23, according to the neighborhood continuity principle, constructing a neighborhood constraint term loss(i, Ni; ki) for each pixel, where i denotes the target pixel and Ni denotes the neighborhood set of pixel i;
step S24, solving the following formula to find the optimal ambiguity combination and the corresponding loss;
min over (k1, k2) of Σ_i [ loss(d1, d2; k1, k2) + λ1·loss(a1, a2, d1, d2; k1, k2) + λ2·loss(i, Ni; ki) ]
in the above step S3, the embodiment of the present invention includes the following steps:
step S31, judging whether the loss obtained in step S24 meets the requirement, where depth image restoration follows the expression below;
d_recover = (1/K) · Σ_{j=1..K} ( dj + kj·c/(2·fj) )
step S32, if the loss satisfies loss &lt; loss_th, exiting this step and going to step S6; otherwise, going to step S33;
step S33, determining a new modulation frequency fj and a new integration time based on the overall depth of the image;
step S34, obtaining new measurement data: a depth map dj and an amplitude map aj.
In the above step S4, the embodiment of the present invention includes the following steps:
step S41, updating the loss function based on the latest depth map and amplitude map;
step S42, updating the ambiguity combination and the corresponding loss based on the solving method of step S24;
in the above step S5, the embodiment of the present invention includes the following steps:
step S51, repeating steps S3-S4;
in the above step S6, the embodiment of the present invention includes the following steps:
step S61, restoring the depth image based on the ambiguity obtained in step S5;
d_recover = (1/K) · Σ_{j=1..K} ( dj + kj·c/(2·fj) )
The invention provides a sensor-in-the-loop method for resolving phase folding: an optimization model is constructed using multi-frequency depth measurement data, and the unfolded phase is solved. According to the optimization result of the model, camera parameters such as the modulation frequency and the integration time are changed in real time, and new depth and amplitude measurements are acquired and added to the optimization model, so that the influence of noise can be reduced and the quality of the recovered depth image improved. Meanwhile, the sensor-in-the-loop control mode depends less on scene priors, and compared with methods that fix the modulation frequency, the algorithm has wider applicability.
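Collecting steps S1-S6, the sensor-in-the-loop flow can be sketched as below. The camera interface and the solve_model function are assumed stand-ins for the acquisition and per-pixel optimization described above, choose_params and recover_depth refer to the earlier sketches, and none of these names come from the application.

```python
def resolve_phase_folding(camera, solve_model, loss_th=0.05,
                          init_freqs=(80e6, 100e6), init_tint=750e-6,
                          max_iters=5):
    """Sensor-in-the-loop phase unfolding (steps S1-S6).
    camera.capture(f, t_int) is assumed to return a (depth map, amplitude map)
    pair; solve_model(depths, amps, freqs) is assumed to return per-pixel
    ambiguity maps and a scalar model loss."""
    freqs, t_int = list(init_freqs), init_tint
    depths, amps = [], []
    for f in freqs:                                   # S1: dual-frequency acquisition
        d, a = camera.capture(f, t_int)
        depths.append(d)
        amps.append(a)
    ks, loss = solve_model(depths, amps, freqs)       # S2: build and solve the model
    iteration = 0
    while loss >= loss_th and iteration < max_iters:  # S3: threshold check
        d_init = recover_depth(depths, ks, freqs)     # initial-solution depth
        f_new, t_int = choose_params(d_init)          # new frequency / integration time
        d, a = camera.capture(f_new, t_int)
        freqs.append(f_new)
        depths.append(d)
        amps.append(a)
        ks, loss = solve_model(depths, amps, freqs)   # S4: re-solve with new data
        iteration += 1                                # S5: iterate
    return recover_depth(depths, ks, freqs)           # S6: final depth map
```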
EXAMPLE III
Fig. 3 is a schematic structural diagram of an apparatus for optimizing phase folding in a TOF camera according to an embodiment of the present invention, where the apparatus specifically includes:
a dual-map obtaining module 310, configured to set dual initial modulation frequencies to obtain a dual-depth map and a dual-amplitude map;
a solution model module 320, configured to construct an optimization model according to the dual depth map and the dual amplitude map, and determine an ambiguity combination solution and a model loss value;
and the restoring module 330 is configured to restore the depth image according to the ambiguity combination solution when the model loss value is smaller than a preset threshold.
Optionally, the recovery module 330 is further configured to:
when the model loss value is not smaller than the preset threshold, determining an initial-solution depth value according to the currently acquired depth map and amplitude map; and, according to the initial-solution depth value, continuously updating the modulation frequency and the integration time to obtain an augmented-term ambiguity value and an updated model loss value, until the updated model loss value is smaller than the preset threshold, and then restoring the depth image according to the ambiguity combination solution and the augmented-term ambiguity value.
Optionally, the solution model module 320 is specifically configured to:
according to the double-depth map d1, d2, constructing a data constraint term loss(d1, d2; k1, k2) for each pixel, where k1, k2 denote the ambiguity combination solution;
according to the double-depth map d1, d2 and the double-amplitude map a1, a2, constructing an amplitude constraint term loss(a1, a2, d1, d2; k1, k2) for each pixel;
constructing a neighborhood constraint term loss(i, Ni; ki) for each pixel according to the neighborhood continuity principle, where i denotes the target pixel and Ni denotes the neighborhood set of pixel i;
min over (k1, k2) of Σ_i [ loss(d1, d2; k1, k2) + λ1·loss(a1, a2, d1, d2; k1, k2) + λ2·loss(i, Ni; ki) ]
and determining the optimal ambiguity combination solution and the corresponding loss value loss according to the above formula.
Optionally, the dual graph obtaining module 310 is specifically configured to:
setting double initial modulation frequencies and initial integration time through a sensor;
and acquiring the double-depth map and the double-amplitude map based on an I-ToF imaging principle.
Optionally, the dual-depth map and the dual-amplitude map are acquired by the dual-map acquiring module 310 based on an I-ToF imaging principle, specifically:
d = c·φ/(4π·f)
a = sqrt((Q1 - Q3)² + (Q4 - Q2)²)/2
where c denotes the speed of light, f the amplitude modulation frequency of the optical signal, d the distance from the object to the camera, φ the phase difference between the transmitted and received signals, and a the received light amplitude;
where the phase difference φ between the transmitted and received signals is calculated as:
φ = arctan((Q4 - Q2)/(Q1 - Q3))
where {Q1, Q2, Q3, Q4} are the measured integration values.
Optionally, the recovery module 330 is specifically configured to:
d_recover = (1/K) · Σ_{j=1..K} ( dj + kj·c/(2·fj) )
where d_recover is the recovered depth value, dj is the depth value obtained at the j-th modulation frequency fj, kj is the ambiguity solution corresponding to the j-th modulation frequency fj, and K is an integer greater than 1 with j = 1, …, K.
In the embodiment of the invention, a double-depth map and a double-amplitude map are obtained by setting double initial modulation frequencies; an optimization model is constructed according to the double-depth map and the double-amplitude map, and an ambiguity combination solution and a model loss value are determined; and when the model loss value is smaller than a preset threshold, the depth image is restored according to the ambiguity combination solution, effectively reducing the influence of noise and improving the quality of the depth image.
Example four
Embodiments of the present application also provide a storage medium containing computer-executable instructions, which when executed by a computer processor, are configured to perform:
setting double initial modulation frequencies to obtain a double-depth map and a double-amplitude map;
constructing an optimization model according to the double-depth map and the double-amplitude map, and determining an ambiguity combination solution and a model loss value;
and when the model loss value is smaller than a preset threshold, restoring the depth image according to the ambiguity combination solution.
Storage medium: any of various types of memory devices or storage devices. The term "storage medium" is intended to include: mounting media such as CD-ROM, floppy disk, or tape devices; computer system memory or random access memory such as DRAM, DDR RAM, SRAM, EDO RAM, Rambus RAM, etc.; non-volatile memory such as flash memory, magnetic media (e.g., hard disk or optical storage); registers or other similar types of memory elements, etc. The storage medium may also include other types of memory or combinations thereof. In addition, the storage medium may be located in the computer system in which the program is executed, or may be located in a different second computer system connected to the computer system through a network (such as the internet). The second computer system may provide the program instructions to the computer for execution. The term "storage medium" may include two or more storage media that may reside in different locations, such as in different computer systems that are connected by a network. The storage medium may store program instructions (e.g., embodied as a computer program) that are executable by one or more processors.
Of course, the storage medium provided in the embodiments of the present application contains computer-executable instructions, and the computer-executable instructions are not limited to the optimization operations for phase folding in the TOF camera as described above, and may also perform related operations in the optimization method for phase folding in the TOF camera as provided in any embodiment of the present application.
EXAMPLE five
The embodiment of the application provides an electronic device, and the apparatus for phase folding optimization in a TOF camera provided by the embodiment of the application can be integrated into the electronic device. Fig. 4 is a schematic structural diagram of an electronic device according to a fifth embodiment of the present application. As shown in fig. 4, the present embodiment provides an electronic device 400, which includes: one or more processors 420; and a storage device 410 configured to store one or more programs that, when executed by the one or more processors 420, cause the one or more processors 420 to implement:
setting double initial modulation frequencies to obtain a double-depth map and a double-amplitude map;
constructing an optimization model according to the double-depth map and the double-amplitude map, and determining an ambiguity combination solution and a model loss value;
and when the model loss value is smaller than a preset threshold, restoring the depth image according to the ambiguity combination solution.
As shown in fig. 4, the electronic device 400 includes a processor 420, a storage device 410, an input device 430, and an output device 440; the number of the processors 420 in the electronic device may be one or more, and one processor 420 is taken as an example in fig. 4; the processor 420, the storage device 410, the input device 430, and the output device 440 in the electronic apparatus may be connected by a bus or other means, and are exemplified by a bus 450 in fig. 4.
The storage device 410 is a computer readable storage medium, and can be used to store software programs, computer executable programs, and module units, such as program instructions corresponding to the phase folding optimization method in the TOF camera in the embodiments of the present application.
The storage device 410 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the terminal, and the like. Further, the storage 410 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some examples, storage 410 may further include memory located remotely from processor 420, which may be connected via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input means 430 may be used to receive input numbers, character information, or voice information, and to generate key signal inputs related to user settings and function control of the electronic device. The output device 440 may include a display screen, speakers, etc.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. A method for phase folding optimization in a TOF camera, comprising:
setting double initial modulation frequencies to obtain a double-depth map and a double-amplitude map;
constructing an optimization model according to the double-depth map and the double-amplitude map, and determining an ambiguity combination solution and a model loss value;
and when the model loss value is smaller than a preset threshold, restoring the depth image according to the ambiguity combination solution.
2. The method of claim 1, further comprising:
when the model loss value is not smaller than the preset threshold, determining an initial-solution depth value according to the currently acquired depth map and amplitude map;
and, according to the initial-solution depth value, continuously updating the modulation frequency and the integration time to obtain an augmented-term ambiguity value and an updated model loss value, until the updated model loss value is smaller than the preset threshold, and then restoring the depth image according to the ambiguity combination solution and the augmented-term ambiguity value.
3. The method according to claim 1 or 2, wherein the constructing an optimization model according to the double-depth map and the double-amplitude map, and determining the ambiguity combination solution and the model loss value comprises:
according to the double-depth map d1, d2, constructing a data constraint term loss(d1, d2; k1, k2) for each pixel, where k1, k2 denote the ambiguity combination solution;
according to the double-depth map d1, d2 and the double-amplitude map a1, a2, constructing an amplitude constraint term loss(a1, a2, d1, d2; k1, k2) for each pixel;
constructing a neighborhood constraint term loss(i, Ni; ki) for each pixel according to the neighborhood continuity principle, where i denotes the target pixel and Ni denotes the neighborhood set of pixel i;
min over (k1, k2) of Σ_i [ loss(d1, d2; k1, k2) + λ1·loss(a1, a2, d1, d2; k1, k2) + λ2·loss(i, Ni; ki) ]
and determining the optimal ambiguity combination solution and the corresponding loss value loss according to the above formula.
4. The method of claim 3, wherein setting the dual initial modulation frequencies to obtain the dual depth map and the dual amplitude map comprises:
setting double initial modulation frequencies and initial integration time through a sensor;
and acquiring the double-depth map and the double-amplitude map based on an I-ToF imaging principle.
5. The method of claim 4, wherein the obtaining the dual depth map and the dual amplitude map based on the I-ToF imaging principle comprises:
d = c·φ/(4π·f)
a = sqrt((Q1 - Q3)² + (Q4 - Q2)²)/2
where c denotes the speed of light, f the amplitude modulation frequency of the optical signal, d the distance from the object to the camera, φ the phase difference between the transmitted and received signals, and a the received light amplitude;
where the phase difference φ between the transmitted and received signals is calculated as:
φ = arctan((Q4 - Q2)/(Q1 - Q3))
where {Q1, Q2, Q3, Q4} are the correlation integration values measured by the ToF device.
6. The method of claim 2, wherein the restoring the depth image according to the ambiguity combination solution and the augmented-term ambiguity value comprises:
d_recover = (1/K) · Σ_{j=1..K} ( dj + kj·c/(2·fj) )
where d_recover is the recovered depth value, dj is the depth value obtained at the j-th modulation frequency fj, kj is the ambiguity solution corresponding to the j-th modulation frequency fj, and K is an integer greater than 1 with j = 1, …, K.
7. An apparatus optimized for phase folding in a TOF camera, comprising:
the double-image acquisition module is used for setting double initial modulation frequencies to acquire a double-depth image and a double-amplitude image;
the model solution module is used for constructing an optimization model according to the double-depth map and the double-amplitude map and determining an ambiguity combination solution and a model loss value;
and the restoring module is used for restoring the depth image according to the ambiguity combination solution when the model loss value is smaller than a preset threshold.
8. The apparatus of claim 7, wherein the recovery module is further configured to:
when the model loss value is not smaller than the preset threshold, determine an initial-solution depth value according to the currently acquired depth map and amplitude map; and, according to the initial-solution depth value, continuously update the modulation frequency and the integration time to obtain an augmented-term ambiguity value and an updated model loss value, until the updated model loss value is smaller than the preset threshold, and then restore the depth image according to the ambiguity combination solution and the augmented-term ambiguity value.
9. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method for phase folding optimization in a TOF camera according to any one of claims 1 to 6.
10. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method for phase folding optimization in a TOF camera according to any one of claims 1 to 6 when executing the computer program.
CN202011445773.2A 2020-12-09 2020-12-09 Phase folding optimization method, device, medium and equipment for TOF camera Pending CN112487655A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011445773.2A CN112487655A (en) 2020-12-09 2020-12-09 Phase folding optimization method, device, medium and equipment for TOF camera

Publications (1)

Publication Number Publication Date
CN112487655A true CN112487655A (en) 2021-03-12

Family

ID=74941348

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011445773.2A Pending CN112487655A (en) 2020-12-09 2020-12-09 Phase folding optimization method, device, medium and equipment for TOF camera

Country Status (1)

Country Link
CN (1) CN112487655A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111445404A (en) * 2020-03-23 2020-07-24 上海数迹智能科技有限公司 Phase deblurring method based on dual-frequency sum probability model
CN111708039A (en) * 2020-05-24 2020-09-25 深圳奥比中光科技有限公司 Depth measuring device and method and electronic equipment
US20200349728A1 (en) * 2019-05-02 2020-11-05 Samsung Electronics Co., Ltd. Time-of-flight depth measurement using modulation frequency adjustment
CN112037295A (en) * 2020-09-04 2020-12-04 上海数迹智能科技有限公司 Event type ToF camera encoding and decoding method, device, medium and equipment



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination