CN117689705A - Fringe structured light depth reconstruction method and device based on deep learning

Info

Publication number
CN117689705A
Authority
CN
China
Prior art keywords
phase
image
value
wrapped
wrapping
Prior art date
2024-01-31
Legal status
Granted
Application number
CN202410136911.0A
Other languages
Chinese (zh)
Other versions
CN117689705B
Inventor
毛凤辉
郭振民
周建国
Current Assignee
Nanchang Virtual Reality Institute Co Ltd
Original Assignee
Nanchang Virtual Reality Institute Co Ltd
Priority date
2024-01-31
Filing date
2024-01-31
Publication date
2024-03-12
Application filed by Nanchang Virtual Reality Institute Co Ltd
Priority to CN202410136911.0A
Publication of CN117689705A (2024-03-12)
Application granted
Publication of CN117689705B (2024-05-28)
Legal status: Active

Landscapes

  • Image Analysis (AREA)

Abstract

The embodiments of the present application provide a fringe structured light depth reconstruction method and device based on deep learning, wherein the method comprises the following steps: collecting sample images, each group comprising fringe images at a plurality of frequencies; calculating the wrapped phase sine value, wrapped phase cosine value, wrapped phase value and absolute phase value of the fringe image at each frequency; designing a wrapped phase neural network model and a wrapped phase absolute neural network model; acquiring an image to be measured, and inferring a fitted absolute phase value through the wrapped phase neural network model and the wrapped phase absolute neural network model; processing the fitted absolute phase value to obtain the fringe order of the image, and then recalculating the final absolute phase from the fringe order and the wrapped phase value; and performing depth information fitting on the final absolute phase to obtain a depth value. Through the embodiments of the present application, the acquisition time of fringe images can be effectively shortened, and the absolute phase accuracy is guaranteed through post-processing.

Description

Fringe structured light depth reconstruction method and device based on deep learning
Technical Field
The embodiments of the present application belong to the technical field of fringe image processing, and in particular relate to a fringe structured light depth reconstruction method and device based on deep learning.
Background
With the development of technology, two-dimensional imaging has long fallen short of consumer demands, so three-dimensional imaging has emerged, and three-dimensional cameras of various versions have sprung up in the market like bamboo shoots after rain. Among them, fringe structured light offers high precision, non-contact measurement, wide applicability and other advantages, and is a commonly used three-dimensional imaging technique. At present, however, fringe structured light generally relies on the traditional multi-frequency heterodyne algorithm, which requires acquiring multiple frames of images with different phases at different frequencies; for example, three-frequency four-phase and three-frequency twelve-phase acquisitions require 12 and 36 fringe images respectively to jointly solve one frame of depth data. Most of the camera's time is therefore spent acquiring images of different phases at different frequencies, and such long fringe image acquisition in the prior-art schemes cannot guarantee the processing precision of the fringe images.
Disclosure of Invention
In order to solve or alleviate the above problems in the prior art, the embodiments of the present application provide the following technical solutions.
In a first aspect, an embodiment of the present application provides a fringe structured light depth reconstruction method based on deep learning, comprising:
collecting a plurality of groups of sample images, wherein each group of sample images comprises a plurality of fringe images with different frequencies;
calculating the wrapped phase sine value and wrapped phase cosine value of each frequency fringe image, calculating the wrapped phase value of each frequency fringe image from the wrapped phase sine and cosine values, and calculating the wrapped absolute phase value of each group of sample images from the wrapped phase values of each frequency fringe image;
taking the wrapped phase sine value and wrapped phase cosine value of each frequency fringe image as the supervision labels of a wrapped phase neural network model, and designing a first loss function from them to train the wrapped phase neural network model until convergence;
taking the wrapped absolute phase value of each group of sample images as the supervision label, and designing a second loss function from it to train a wrapped phase absolute neural network model until convergence;
collecting an image to be measured, inferring through the trained wrapped phase neural network model the wrapped phase sine value and wrapped phase cosine value of each frequency fringe image of the image to be measured, and calculating the wrapped phase value of each frequency fringe image of the image to be measured from these sine and cosine values;
inputting the wrapped phase value of each frequency fringe image of the image to be measured into the trained wrapped phase absolute neural network model, and inferring the fitted absolute phase value of each frequency fringe image of the image to be measured;
obtaining the fringe order of each frequency fringe image of the image to be measured from the fitted absolute phase value of each frequency fringe image of the image to be measured, and calculating the final wrapped absolute phase value of the image to be measured using the fringe order and the wrapped phase value of each frequency fringe image of the image to be measured;
and performing depth information fitting on the final absolute phase value of the image to be measured to obtain the depth value of the image to be measured.
Compared with the prior art, the fringe structured light depth reconstruction method based on deep learning provided by the embodiments of the present application can effectively shorten the acquisition time of fringe images, while post-processing guarantees the accuracy of the wrapped absolute phase values.
In a second aspect, an embodiment of the present application further provides a fringe structured light depth reconstruction device based on deep learning, comprising:
an acquisition module, configured to collect a plurality of groups of sample images, each group of sample images comprising a plurality of fringe images with different frequencies;
a calculation module, configured to calculate the wrapped phase sine value and wrapped phase cosine value of each frequency fringe image, calculate the wrapped phase value of each frequency fringe image from the wrapped phase sine and cosine values, and calculate the wrapped absolute phase value of each group of sample images from the wrapped phase values of each frequency fringe image;
a training module, configured to take the wrapped phase sine value and wrapped phase cosine value of each frequency fringe image as the supervision labels of a wrapped phase neural network model, and to design a first loss function from them to train the wrapped phase neural network model until convergence;
the training module being further configured to take the wrapped absolute phase value of each group of sample images as the supervision label, and to design a second loss function from it to train a wrapped phase absolute neural network model until convergence;
an inference module, configured to collect the image to be measured, infer through the trained wrapped phase neural network model the wrapped phase sine value and wrapped phase cosine value of each frequency fringe image of the image to be measured, and calculate the wrapped phase value of each frequency fringe image of the image to be measured from these values;
the inference module being further configured to input the wrapped phase value of each frequency fringe image of the image to be measured into the trained wrapped phase absolute neural network model and infer the fitted absolute phase value of each frequency fringe image of the image to be measured;
the calculation module being further configured to obtain the fringe order of each frequency fringe image of the image to be measured from the fitted absolute phase value, and then calculate the final wrapped absolute phase value of the image to be measured using the fringe order and the wrapped phase value of each frequency fringe image of the image to be measured;
and a fitting module, configured to perform depth information fitting on the final absolute phase value of the image to be measured to obtain the depth value of the image to be measured.
Compared with the prior art, the beneficial effects of the fringe structured light depth reconstruction device based on deep learning provided by the embodiments of the present application are the same as those of the technical solution provided in the first aspect, and are not repeated here.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute an undue limitation to the application. Some specific embodiments of the present application will be described in detail hereinafter by way of example and not by way of limitation with reference to the accompanying drawings. The same reference numbers in the drawings denote the same or similar parts or portions, and it will be understood by those skilled in the art that the drawings are not necessarily drawn to scale, in which:
fig. 1 is a schematic flow chart of a fringe structured light depth reconstruction method based on deep learning according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of a fringe structured light depth reconstruction device based on deep learning according to an embodiment of the present application.
Detailed Description
In order to enable those skilled in the art to better understand the present application, the technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. It is apparent that the described embodiments are merely some, but not all, of the embodiments of the present application. All other embodiments obtained by those of ordinary skill in the art based on the embodiments herein without inventive effort shall fall within the protection scope of the present application.
In a first aspect, as shown in fig. 1, an embodiment of the present application provides a fringe structured light depth reconstruction method based on deep learning, comprising:
step S01, collecting a plurality of groups of sample fringe images, wherein each group of sample images comprises a plurality of fringe images with different frequencies;
it should be noted that, using a digital light processing Device (DLP) and a camera-built environment, multiple sets of sample images are acquired, each set of sample images includes 36 frames of fringe images (i.e., 3 different frequencies, each frequency includes 12 frames of fringe images, and each frame of fringe images has different phase values), and multiple sets of devices are used to acquire data of different objects within the reconstruction range.
Each time the acquisition device clicks to acquire 36 frames of stripe images, the 36 frames of stripe images are three-frequency ten-two-phase stripe images, and the data acquisition of different objects in the reconstruction range by using multiple sets of devices is to ensure the diversity of training samples, so that the trained model has stronger generalization.
Step S02, calculating the wrapped phase sine value and wrapped phase cosine value of each frequency fringe image, calculating the wrapped phase value of each frequency fringe image from the wrapped phase sine and cosine values, and calculating the wrapped absolute phase value of each group of sample images from the wrapped phase values of each frequency fringe image;
the wrapping phase value, the wrapping phase sine value and the wrapping phase cosine value of each frequency fringe image are calculated by the following modes:wherein (1)>Representing a wrapped phase value, imgsin (x, y), imgcos (x, y) representing a wrapped phase sine value and a wrapped phase cosine value at (x, y), respectively, for each frequency fringe image, 2 kpi/N representing a phase shift value, where n=12, k=0, 1, …, N-1, (x, y) representing pixel coordinates, I k (x, y) represents the pixel value at (x, y) in each frequency kth Zhang Tiaowen image.
The wrapped absolute phase value of each group of sample images is calculated from the wrapped phase value of each frequency fringe image as follows:
determining the modulation degree of the fringe images corresponding to each frequency in each group of sample images;
judging whether the modulation degree of each frequency is not smaller than a preset modulation degree;
if so, unwrapping the wrapped phase values of the fringe images of the different frequencies by the multi-frequency heterodyne method to obtain the wrapped absolute phase value of each group of sample images.
Specifically, the modulation degree B(x, y) of each frequency is calculated and a preset threshold M is set; if B(x, y) is smaller than M, the corresponding pixel is set to zero (j(x, y) = 0), filtering out the regions with low modulation; the wrapped absolute phase is then calculated by the traditional multi-frequency heterodyne method, which is well established in the prior art and is not repeated here.
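A minimal sketch of this filtering step follows; it assumes the standard N-step modulation formula $B(x,y)=\frac{2}{N}\sqrt{\mathrm{Imgsin}(x,y)^2+\mathrm{Imgcos}(x,y)^2}$, which the patent does not spell out, and an illustrative threshold value:

    import numpy as np

    def modulation_mask(imgsin, imgcos, n_steps=12, m=5.0):
        # Modulation degree B(x, y) of an N-step phase-shifted sequence;
        # m stands in for the patent's preset threshold M (value illustrative).
        b = (2.0 / n_steps) * np.sqrt(imgsin ** 2 + imgcos ** 2)
        valid = b >= m
        return b, valid  # pixels with valid == False are zeroed out (j(x, y) = 0)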
After the above steps are completed, each group of sample fringe images (36 fringe images) yields one sine image, one cosine image and one wrapped phase map per frequency (three wrapped phase maps in total, since one wrapped phase map per frequency is used in the subsequent calculation of the wrapped absolute phase), together with one wrapped absolute phase map; all of these are floating-point images. Wrapped absolute phase calculation based on the multi-frequency heterodyne method is a very mature technique and is not repeated here.
Step S03, taking the wrapped phase sine value and wrapped phase cosine value of each frequency fringe image as the supervision labels of the wrapped phase neural network model, and designing a first loss function from them to train the wrapped phase neural network model until convergence;
Step S04, taking the wrapped absolute phase value of each group of sample images as the supervision label, and designing a second loss function from it to train the wrapped phase absolute neural network model until convergence;
specifically, a wrapped phase neural network model is designed:
the wrapped phase neural network model is input as a minimum frequency T in a sample fringe image 1 A fringe image with a phase shift value of 2 kpi/N, wherein N=12, k=0, 3,6,9 (namely, the 0 th, 3 rd, 6 th and 9 th four-frame images), the wrapped phase sine value imgsin and the wrapped phase cosine value imgcosN of each frequency fringe image are taken as the output of the wrapped phase neural network model, and the resolution of the input sample fringe image pixel and the output sample fringe image pixel are the same;
The first loss function of the wrapped phase neural network model is composed of the wrapped phase sine fitting loss loss1 and the wrapped phase cosine fitting loss loss2, both of which are cross-entropy losses; optimizing this function makes the wrapped phase neural network model converge, yielding the final wrapped phase neural network model:

$$\mathrm{lossNet}_1 = w_1\cdot\mathrm{loss1} + w_2\cdot\mathrm{loss2}$$

where $\mathrm{lossNet}_1$ is the loss function of the wrapped phase neural network model $Net_1$; loss1 is the wrapped phase sine fitting loss of the sample fringe images; loss2 is the wrapped phase cosine fitting loss of the sample fringe images; $w_1$ and $w_2$ are the weights of loss1 and loss2; $h$ and $w$ are the height and width of the sample fringe images, over whose $h\times w$ pixels loss1 and loss2 are accumulated against the label maps $\mathrm{Imgsin}(x,y)$ and $\mathrm{Imgcos}(x,y)$, the wrapped phase sine and cosine values of the sample fringe image at $(x,y)$, with $N=12$, $k=0,1,\dots,N-1$, and $(x,y)$ the pixel coordinates.
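A sketch of the first loss under these definitions follows. The patent calls loss1 and loss2 cross-entropy losses, but since the targets are continuous sine and cosine maps this sketch substitutes a per-pixel L2 fitting loss; that substitution, like the function names, is an assumption:

    import torch.nn.functional as F

    def loss_net1(pred_sin, pred_cos, imgsin_gt, imgcos_gt, w1=1.0, w2=1.0):
        # lossNet1 = w1 * loss1 + w2 * loss2, averaged over the h x w pixels.
        loss1 = F.mse_loss(pred_sin, imgsin_gt)   # wrapped phase sine fitting loss
        loss2 = F.mse_loss(pred_cos, imgcos_gt)   # wrapped phase cosine fitting loss
        return w1 * loss1 + w2 * loss2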
Designing a wrapped phase absolute neural network model:
taking the wrapped phase values calculated above as the input and the fitted absolute phase $\hat{\Phi}(x,y)$ as the output, with the wrapped absolute phase $\Phi(x,y)$ calculated by the multi-frequency heterodyne method serving as the supervision label; the resolution of the input and the output is the same as that of the sample fringe image pixels;
The second loss function of the wrapped phase absolute neural network model $Net_2$ has only one loss term, namely the loss between the fitted absolute phase $\hat{\Phi}(x,y)$ and the wrapped absolute phase $\Phi(x,y)$ calculated from the three-frequency twelve-phase images:

$$\mathrm{lossNet}_2=\sum_{x=1}^{h}\sum_{y=1}^{w}\big\|\hat{\Phi}(x,y)-\Phi(x,y)\big\|$$

where $\mathrm{lossNet}_2$ is the loss function of the wrapped phase absolute neural network model $Net_2$; $h$ and $w$ are the height and width of the sample fringe image; $\Phi(x,y)$ is the wrapped absolute phase value of the sample fringe image at $(x,y)$; $\hat{\Phi}(x,y)$ is the wrapped absolute phase value fitted by $Net_2$ during training at $(x,y)$; and $(x,y)$ are pixel coordinates.

By optimizing the loss function $\mathrm{lossNet}_2$ until the initial wrapped phase absolute neural network model converges, the final wrapped phase absolute neural network model is obtained.
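A minimal training step for the wrapped phase absolute neural network model, under the same caveats (the L2 form of lossNet2 and the optimizer usage are assumptions; only the single-term loss structure comes from the patent):

    import torch
    import torch.nn.functional as F

    def train_step_net2(net2, optimizer, phi_wrapped, phi_abs_label):
        # phi_wrapped: (B, C, H, W) wrapped phase input; phi_abs_label: the
        # wrapped absolute phase from the three-frequency twelve-phase
        # heterodyne computation, used as the supervision label.
        optimizer.zero_grad()
        phi_fit = net2(phi_wrapped)                # fitted absolute phase
        loss = F.mse_loss(phi_fit, phi_abs_label)  # lossNet2 (L2 stand-in)
        loss.backward()
        optimizer.step()
        return loss.item()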
Step S05, collecting an image to be measured, inferring through the trained wrapped phase neural network model the wrapped phase sine value and wrapped phase cosine value of each frequency fringe image of the image to be measured, and calculating the wrapped phase value of each frequency fringe image of the image to be measured from these sine and cosine values;
Step S06, inputting the wrapped phase value of each frequency fringe image of the image to be measured into the trained wrapped phase absolute neural network model, and inferring the fitted absolute phase value of each frequency fringe image of the image to be measured;
The wrapped phase value of each frequency fringe image of the image to be measured is calculated as follows:
acquiring the four fringe images corresponding to the minimum frequency among the fringe images of the image to be measured;
inputting the four minimum-frequency fringe images into the wrapped phase neural network model, which outputs the wrapped phase sine value and wrapped phase cosine value of the image to be measured;
calculating the wrapped phase value of each frequency fringe image of the image to be measured by the following formula:

$$\varphi_R(x,y)=\arctan\frac{\mathrm{ImgsinR}(x,y)}{\mathrm{ImgcosR}(x,y)}$$

where $\varphi_R(x,y)$ is the wrapped phase value at $(x,y)$ of each frequency fringe image of the image to be measured inferred by the wrapped phase neural network model $Net_1$ in actual use; $\mathrm{ImgsinR}(x,y)$ and $\mathrm{ImgcosR}(x,y)$ are respectively the wrapped phase sine value and wrapped phase cosine value at $(x,y)$ of each frequency fringe image of the image to be measured output by $Net_1$ in actual inference; and $(x,y)$ are pixel coordinates.
The four minimum-frequency fringe images of the image to be measured are obtained as follows:
the template images of the DLP projector are changed so that only the 0th, 3rd, 6th and 9th frame templates of the minimum frequency $T_1$ are burned into the DLP (that is, each acquisition captures only four minimum-frequency fringe images, instead of the 36 frames used when acquiring training sample fringe images).
Step S07, obtaining the fringe order of each frequency fringe image of the image to be measured from the fitted absolute phase value of each frequency fringe image of the image to be measured, and calculating the final wrapped absolute phase value of the image to be measured using the fringe order and the wrapped phase value of each frequency fringe image of the image to be measured;
Calculating the final wrapped absolute phase value of the image to be measured comprises:

$$\mathrm{Imglevel}(x,y)=\mathrm{round}\!\left(\frac{\hat{\Phi}_R(x,y)-\varphi_R(x,y)}{2\pi}\right),\qquad \mathrm{absolutePhase}(x,y)=\varphi_R(x,y)+2\pi\cdot\mathrm{Imglevel}(x,y)$$

where $\mathrm{Imglevel}(x,y)$ is the fringe order at $(x,y)$ of each frequency fringe image of the image to be measured; $\mathrm{absolutePhase}(x,y)$ is the final wrapped absolute phase value of the image to be measured at $(x,y)$; round is the rounding library function in the C/C++ programming languages; $(x,y)$ are pixel coordinates; $\hat{\Phi}_R(x,y)$ is the wrapped absolute phase value at $(x,y)$ of each frequency fringe image of the image to be measured inferred by the wrapped phase absolute neural network model $Net_2$ in actual use; and $\varphi_R(x,y)$ is the wrapped phase value at $(x,y)$ of each frequency fringe image of the image to be measured obtained from the wrapped phase neural network model $Net_1$ in actual inference.
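These two formulas are the standard temporal phase-unwrapping post-processing: the fringe order is the rounded gap, in whole periods, between the coarse fitted absolute phase and the precise wrapped phase. A direct numpy transcription (names illustrative):

    import numpy as np

    def final_absolute_phase(phi_fit, phi_wrapped):
        # phi_fit: fitted absolute phase inferred by Net2; phi_wrapped: wrapped
        # phase derived from Net1's sine/cosine outputs. The fringe order is the
        # rounded gap in whole 2*pi periods; the final phase keeps the more
        # precise wrapped phase plus that integer number of periods.
        imglevel = np.round((phi_fit - phi_wrapped) / (2 * np.pi))
        absolute_phase = phi_wrapped + 2 * np.pi * imglevel
        return imglevel, absolute_phase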
Step S08, performing depth information fitting on the final absolute phase value of the image to be measured to obtain the depth value of the image to be measured.
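The patent does not detail how the depth value is fitted from the final absolute phase. One common choice in fringe projection systems, shown here purely as an assumption, is a per-pixel polynomial phase-to-depth model whose coefficients come from a prior system calibration:

    import numpy as np

    def phase_to_depth(absolute_phase, coeffs):
        # Evaluate z(x, y) = sum_i a_i(x, y) * phi(x, y)**i, with coeffs of
        # shape (D + 1, H, W) holding hypothetical calibration coefficients
        # a_0 ... a_D obtained from a separate calibration step.
        z = np.zeros_like(absolute_phase, dtype=float)
        for i, a in enumerate(coeffs):
            z += a * absolute_phase ** i
        return z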
Through this specific embodiment of the application, fringe images at three frequencies, with 12 images per frequency, are used as the label data for the supervised training of the wrapped phase neural network model and the wrapped phase absolute neural network model, which somewhat reduces the noise caused by nonlinear interference of the system.
In actual use, only the 4 frames of the minimum frequency $T_1$ (out of its 12 phase shifts) are acquired as the input of the wrapped phase neural network model, so the acquisition time of fringe images is effectively shortened, while post-processing keeps the absolute phase accuracy comparable to that of the traditional algorithm, improving efficiency while guaranteeing accuracy.
In a second aspect, as shown in fig. 2, an embodiment of the present application further provides a fringe structured light depth reconstruction device based on deep learning, comprising:
an acquisition module 21, configured to collect a plurality of groups of sample images, each group of sample images comprising a plurality of fringe images with different frequencies;
a calculation module 22, configured to calculate the wrapped phase sine value and wrapped phase cosine value of each frequency fringe image, calculate the wrapped phase value of each frequency fringe image from the wrapped phase sine and cosine values, and calculate the wrapped absolute phase value of each group of sample images from the wrapped phase values of each frequency fringe image;
a training module 23, configured to take the wrapped phase sine value and wrapped phase cosine value of each frequency fringe image as the supervision labels of the wrapped phase neural network model, and to design a first loss function from them to train the wrapped phase neural network model until convergence;
the training module 23 being further configured to take the wrapped absolute phase value of each group of sample images as the supervision label, and to design a second loss function from it to train the wrapped phase absolute neural network model until convergence;
an inference module 24, configured to collect the image to be measured, infer through the trained wrapped phase neural network model the wrapped phase sine value and wrapped phase cosine value of each frequency fringe image of the image to be measured, and calculate the wrapped phase value of each frequency fringe image of the image to be measured from these values;
the inference module 24 being further configured to input the wrapped phase value of each frequency fringe image of the image to be measured into the trained wrapped phase absolute neural network model and infer the fitted absolute phase value of each frequency fringe image of the image to be measured;
the calculation module 22 being further configured to obtain the fringe order of each frequency fringe image of the image to be measured from the fitted absolute phase value, and then calculate the final wrapped absolute phase value of the image to be measured using the fringe order and the wrapped phase value of each frequency fringe image of the image to be measured;
and a fitting module 25, configured to perform depth information fitting on the final absolute phase value of the image to be measured to obtain the depth value of the image to be measured.
Compared with the prior art, the beneficial effects of the fringe structured light depth reconstruction device based on deep learning provided by the embodiments of the present application are the same as those of the technical solution provided in the first aspect, and are not repeated here.
Finally, it should be noted that the above embodiments are only intended to illustrate, not to limit, the technical solutions of the present application. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be replaced by equivalents; such modifications and substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (10)

1. A fringe structured light depth reconstruction method based on deep learning, characterized by comprising the following steps:
collecting a plurality of groups of sample images, wherein each group of sample images comprises a plurality of fringe images with different frequencies;
calculating the wrapped phase sine value and wrapped phase cosine value of each frequency fringe image, calculating the wrapped phase value of each frequency fringe image from the wrapped phase sine and cosine values, and calculating the wrapped absolute phase value of each group of sample images from the wrapped phase values of each frequency fringe image;
taking the wrapped phase sine value and wrapped phase cosine value of each frequency fringe image as the supervision labels of a wrapped phase neural network model, and designing a first loss function from them to train the wrapped phase neural network model until convergence;
taking the wrapped absolute phase value of each group of sample images as the supervision label, and designing a second loss function from it to train a wrapped phase absolute neural network model until convergence;
collecting an image to be measured, inferring through the trained wrapped phase neural network model the wrapped phase sine value and wrapped phase cosine value of each frequency fringe image of the image to be measured, and calculating the wrapped phase value of each frequency fringe image of the image to be measured from these sine and cosine values;
inputting the wrapped phase value of each frequency fringe image of the image to be measured into the trained wrapped phase absolute neural network model, and inferring the fitted absolute phase value of each frequency fringe image of the image to be measured;
obtaining the fringe order of each frequency fringe image of the image to be measured from the fitted absolute phase value of each frequency fringe image of the image to be measured, and calculating the final wrapped absolute phase value of the image to be measured using the fringe order and the wrapped phase value of each frequency fringe image of the image to be measured;
and performing depth information fitting on the final absolute phase value of the image to be measured to obtain the depth value of the image to be measured.
2. The fringe structured light depth reconstruction method based on deep learning according to claim 1, wherein the wrapped phase value, wrapped phase sine value and wrapped phase cosine value of each frequency fringe image are calculated as follows:

$$\mathrm{Imgsin}(x,y)=\sum_{k=0}^{N-1} I_k(x,y)\sin\!\left(\frac{2k\pi}{N}\right),\qquad \mathrm{Imgcos}(x,y)=\sum_{k=0}^{N-1} I_k(x,y)\cos\!\left(\frac{2k\pi}{N}\right),\qquad \varphi(x,y)=\arctan\frac{\mathrm{Imgsin}(x,y)}{\mathrm{Imgcos}(x,y)}$$

where $\varphi(x,y)$ is the wrapped phase value; $\mathrm{Imgsin}(x,y)$ and $\mathrm{Imgcos}(x,y)$ are respectively the wrapped phase sine value and wrapped phase cosine value of each frequency fringe image at $(x,y)$; $2k\pi/N$ is the phase shift value, with $N=12$ and $k=0,1,\dots,N-1$; $(x,y)$ are pixel coordinates; and $I_k(x,y)$ is the pixel value at $(x,y)$ in the $k$-th fringe image of each frequency.
3. The fringe structured light depth reconstruction method based on deep learning according to claim 1, wherein the wrapped absolute phase value of each group of sample images is calculated from the wrapped phase value of each frequency fringe image as follows:
determining the modulation degree of the fringe images corresponding to each frequency in each group of sample images;
judging whether the modulation degree of each frequency is not smaller than a preset modulation degree;
if so, unwrapping the wrapped phase values of the fringe images of the different frequencies by the multi-frequency heterodyne method to obtain the wrapped absolute phase value of each group of sample images.
4. The fringe structured light depth reconstruction method based on deep learning according to claim 1, wherein the input of the wrapped phase neural network model is the fringe images of the minimum frequency among all the fringe images whose phase shift values are $2k\pi/N$, where $N=12$ and $k=0,3,6,9$.
5. The fringe structured light depth reconstruction method based on deep learning according to claim 1, wherein the first loss function is:

$$\mathrm{lossNet}_1 = w_1\cdot\mathrm{loss1} + w_2\cdot\mathrm{loss2}$$

where $\mathrm{lossNet}_1$ is the loss function of the wrapped phase neural network model $Net_1$; loss1 is the wrapped phase sine fitting loss of the sample fringe images; loss2 is the wrapped phase cosine fitting loss of the sample fringe images; $w_1$ and $w_2$ are the weights of loss1 and loss2; $h$ and $w$ are the height and width of the sample fringe images, over whose $h\times w$ pixels loss1 and loss2 are accumulated against the label maps $\mathrm{Imgsin}(x,y)$ and $\mathrm{Imgcos}(x,y)$, the wrapped phase sine and cosine values of the sample fringe image at $(x,y)$, with $N=12$, $k=0,1,\dots,N-1$, and $(x,y)$ the pixel coordinates.
6. The fringe structured light depth reconstruction method based on deep learning according to claim 1, wherein the input of the wrapped phase absolute neural network model is the wrapped phase values of each group of sample images, and the output is the fitted absolute phase values of each group of sample images.
7. The fringe structured light depth reconstruction method based on deep learning according to claim 1, wherein the second loss function is:

$$\mathrm{lossNet}_2=\sum_{x=1}^{h}\sum_{y=1}^{w}\big\|\hat{\Phi}(x,y)-\Phi(x,y)\big\|$$

where $\mathrm{lossNet}_2$ is the loss function of the wrapped phase absolute neural network model $Net_2$; $h$ and $w$ are the height and width of the sample fringe image; $\Phi(x,y)$ is the wrapped absolute phase value of the sample fringe image at $(x,y)$; $\hat{\Phi}(x,y)$ is the wrapped absolute phase value fitted by $Net_2$ during training at $(x,y)$; and $(x,y)$ are pixel coordinates.
8. The fringe structured light depth reconstruction method based on deep learning according to claim 1, wherein the wrapped phase value of each frequency fringe image of the image to be measured is calculated as follows:
acquiring the four fringe images corresponding to the minimum frequency among the fringe images of the image to be measured;
inputting the four minimum-frequency fringe images into the wrapped phase neural network model, which outputs the wrapped phase sine value and wrapped phase cosine value of the image to be measured;
calculating the wrapped phase value of each frequency fringe image of the image to be measured by the following formula:

$$\varphi_R(x,y)=\arctan\frac{\mathrm{ImgsinR}(x,y)}{\mathrm{ImgcosR}(x,y)}$$

where $\varphi_R(x,y)$ is the wrapped phase value at $(x,y)$ of each frequency fringe image of the image to be measured inferred by the wrapped phase neural network model $Net_1$ in actual use; $\mathrm{ImgsinR}(x,y)$ and $\mathrm{ImgcosR}(x,y)$ are respectively the wrapped phase sine value and wrapped phase cosine value at $(x,y)$ of each frequency fringe image of the image to be measured output by $Net_1$ in actual inference; and $(x,y)$ are pixel coordinates.
9. The fringe structured light depth reconstruction method based on deep learning according to claim 1, wherein the final wrapped absolute phase value of the image to be measured is calculated by:

$$\mathrm{Imglevel}(x,y)=\mathrm{round}\!\left(\frac{\hat{\Phi}_R(x,y)-\varphi_R(x,y)}{2\pi}\right),\qquad \mathrm{absolutePhase}(x,y)=\varphi_R(x,y)+2\pi\cdot\mathrm{Imglevel}(x,y)$$

where $\mathrm{Imglevel}(x,y)$ is the fringe order at $(x,y)$ of each frequency fringe image of the image to be measured; $\mathrm{absolutePhase}(x,y)$ is the final wrapped absolute phase value of the image to be measured at $(x,y)$; round is the rounding library function in the C/C++ programming languages; $(x,y)$ are pixel coordinates; $\hat{\Phi}_R(x,y)$ is the wrapped absolute phase value at $(x,y)$ of each frequency fringe image of the image to be measured inferred by the wrapped phase absolute neural network model $Net_2$ in actual use; and $\varphi_R(x,y)$ is the wrapped phase value at $(x,y)$ of each frequency fringe image of the image to be measured obtained from the wrapped phase neural network model $Net_1$ in actual inference.
10. A fringe structured light depth reconstruction device based on deep learning, characterized by comprising:
an acquisition module, configured to collect a plurality of groups of sample images, each group of sample images comprising a plurality of fringe images with different frequencies;
a calculation module, configured to calculate the wrapped phase sine value and wrapped phase cosine value of each frequency fringe image, calculate the wrapped phase value of each frequency fringe image from the wrapped phase sine and cosine values, and calculate the wrapped absolute phase value of each group of sample images from the wrapped phase values of each frequency fringe image;
a training module, configured to take the wrapped phase sine value and wrapped phase cosine value of each frequency fringe image as the supervision labels of a wrapped phase neural network model, and to design a first loss function from them to train the wrapped phase neural network model until convergence;
the training module being further configured to take the wrapped absolute phase value of each group of sample images as the supervision label, and to design a second loss function from it to train a wrapped phase absolute neural network model until convergence;
an inference module, configured to collect the image to be measured, infer through the trained wrapped phase neural network model the wrapped phase sine value and wrapped phase cosine value of each frequency fringe image of the image to be measured, and calculate the wrapped phase value of each frequency fringe image of the image to be measured from these values;
the inference module being further configured to input the wrapped phase value of each frequency fringe image of the image to be measured into the trained wrapped phase absolute neural network model and infer the fitted absolute phase value of each frequency fringe image of the image to be measured;
the calculation module being further configured to obtain the fringe order of each frequency fringe image of the image to be measured from the fitted absolute phase value, and then calculate the final wrapped absolute phase value of the image to be measured using the fringe order and the wrapped phase value of each frequency fringe image of the image to be measured;
and a fitting module, configured to perform depth information fitting on the final absolute phase value of the image to be measured to obtain the depth value of the image to be measured.
CN202410136911.0A, filed 2024-01-31: Fringe structured light depth reconstruction method and device based on deep learning. Status: Active. Granted publication: CN117689705B.

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202410136911.0A | 2024-01-31 | 2024-01-31 | Fringe structured light depth reconstruction method and device based on deep learning (granted as CN117689705B)


Publications (2)

Publication Number | Publication Date
CN117689705A (application) | 2024-03-12
CN117689705B (granted) | 2024-05-28


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111879258A (en) * 2020-09-28 2020-11-03 南京理工大学 Dynamic high-precision three-dimensional measurement method based on fringe image conversion network FPTNet
CN112833818A (en) * 2021-01-07 2021-05-25 南京理工大学智能计算成像研究院有限公司 Single-frame fringe projection three-dimensional surface type measuring method
US20210356258A1 (en) * 2018-09-29 2021-11-18 Nanjing University Of Science And Technology A deep learning-based temporal phase unwrapping method for fringe projection profilometry
CN113884027A (en) * 2021-12-02 2022-01-04 南京理工大学 Geometric constraint phase unwrapping method based on self-supervision deep learning
CN114777677A (en) * 2022-03-09 2022-07-22 南京理工大学 Single-frame dual-frequency multiplexing fringe projection three-dimensional surface type measuring method based on deep learning
CN115063466A (en) * 2022-06-24 2022-09-16 复旦大学 Single-frame three-dimensional measurement method based on structured light and deep learning
CN116310080A (en) * 2022-12-30 2023-06-23 重庆大学 Single-frame structured optical gear fault three-dimensional measurement method and system based on deep learning
CN117156113A (en) * 2023-10-30 2023-12-01 南昌虚拟现实研究院股份有限公司 Deep learning speckle camera-based image correction method and device


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LU, J. et al.: "End-to-end deep learning method for absolute phase retrieval", Fourteenth International Conference on Information Optics and Photonics, 6 January 2024.
张钊; 韩博文; 于浩天; 张毅; 郑东亮; 韩静: "Multi-stage deep learning single-frame fringe projection three-dimensional measurement method" (多阶段深度学习单帧条纹投影三维测量方法), Infrared and Laser Engineering (红外与激光工程), no. 06, 25 June 2020.


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant