CN113008163B - Encoding and decoding method based on frequency shift stripes in structured light three-dimensional reconstruction system - Google Patents

Encoding and decoding method based on frequency shift stripes in structured light three-dimensional reconstruction system Download PDF

Info

Publication number
CN113008163B
CN113008163B CN202110225563.0A CN202110225563A CN113008163B CN 113008163 B CN113008163 B CN 113008163B CN 202110225563 A CN202110225563 A CN 202110225563A CN 113008163 B CN113008163 B CN 113008163B
Authority
CN
China
Prior art keywords
target
formula
fringe pattern
fringe
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110225563.0A
Other languages
Chinese (zh)
Other versions
CN113008163A (en
Inventor
齐召帅
刘晓霖
刘晓君
杨佳琪
张艳宁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwestern Polytechnical University filed Critical Northwestern Polytechnical University
Priority to CN202110225563.0A priority Critical patent/CN113008163B/en
Publication of CN113008163A publication Critical patent/CN113008163A/en
Application granted granted Critical
Publication of CN113008163B publication Critical patent/CN113008163B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/254Projection of a pattern, viewing through a pattern, e.g. moiré
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes

Abstract

The invention discloses a coding and decoding method based on frequency shift stripes in a structured light three-dimensional reconstruction system, which comprises the steps of firstly generating a target stripe graph group according to target parameters, and coding projector image coordinates into the target stripe graph group, wherein the target parameters at least comprise the frequency shift amount and the frequency shift times N of the stripes, and the N is an integer which is more than or equal to 3; sequentially projecting the target stripe pattern group to a scene to be detected, and shooting projected images to obtain N first stripe patterns; and finally, decoding the image coordinates of the projector from the N first stripe images. Compared with the existing encoding and decoding method based on the phase shift fringes, the method disclosed by the invention can be used for calculating the projector image coordinate of each pixel independently in the decoding process, so that the method is not limited by any space prior information, and can be better suitable for complex scenes such as discontinuous surfaces.

Description

Encoding and decoding method based on frequency shift stripes in structured light three-dimensional reconstruction system
Technical Field
The invention belongs to the technical field of computer vision, and particularly relates to an encoding and decoding method.
Background
The structured light three-dimensional reconstruction technology is always a hotspot of research in the fields of three-dimensional computer vision, three-dimensional sensing and measurement, and is widely applied to the aspects of industrial detection, reverse engineering, human three-dimensional modeling, cultural relic protection and the like. The method has the characteristics of non-contact, non-damage, high speed, high precision and the like, so that the method becomes the most ideal means for accurate three-dimensional reconstruction and profile morphology measurement of scenes.
In the structured light three-dimensional reconstruction process, a projector or a light source projects a series of fringe patterns into a scene, then a camera shoots the fringe patterns reflected and modulated by the surface of the scene, the obtained fringe patterns are processed by a fringe analysis technology to obtain the phase of fringes, and then the fringe phase is converted into a three-dimensional reconstruction result according to a triangulation principle and known system parameters. The method comprises the steps of obtaining accurate fringe pattern phase based on a certain coding and decoding strategy, and obtaining corresponding projector image coordinates through calculation. At present, an encoding and decoding method based on single-frequency fringe projection is based on phase space continuity assumption, or accurate acquisition of image coordinates of a projector is realized by introducing a plurality of cameras or adopting a light field camera and the like. However, the former cannot adapt to a surface with steps because of the dependency relationship between adjacent pixels in the decoding process based on the assumption of phase space continuity, while the latter can implement independent encoding and decoding between pixels, but increases the system cost and complexity because of the addition of hardware. Compared with the prior art, the coding and decoding method based on the phase shift stripes can accurately and robustly acquire the image coordinates of the projector by projecting a plurality of stripe images with different frequencies and carrying out multiple phase shifts on the stripe images with each frequency, and adjacent pixels in the decoding process have independence and can be suitable for non-continuous surfaces. However, the efficiency of three-dimensional reconstruction is reduced due to the need to project and shoot a plurality of fringe patterns with different frequencies. Meanwhile, the methods are all based on a two-step strategy, namely, the wrapping phase is firstly obtained, then the wrapping phase is unwrapped based on an unwrapping algorithm to obtain a final result, and the operation is complicated.
In summary, a faster, simpler and simpler encoding and decoding method needs to be designed, so as to realize stable, robust and inter-pixel independent operation projector image coordinate encoding and decoding under the premise of requiring fewer fringe patterns and not needing to increase hardware. Therefore, it is necessary to develop a codec method capable of coordinating or solving the above problems.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention provides a coding and decoding method based on frequency shift stripes in a structured light three-dimensional reconstruction system, which comprises the steps of firstly generating a target stripe graph group according to target parameters, coding projector image coordinates into the target stripe graph group, wherein the target parameters at least comprise the frequency shift amount and the frequency shift times N of the stripes, and the N is an integer which is greater than or equal to 3; sequentially projecting the target fringe pattern group to a scene to be detected, and shooting a projected image to obtain N first fringe patterns; and finally, decoding the image coordinates of the projector from the N first stripe images. Compared with the existing coding and decoding method based on the phase shift stripes, the invention calculates the projector image coordinate of each pixel independently in the decoding process, so that the method is not limited by any space prior information, and can better adapt to complex scenes such as discontinuous surfaces and the like.
The technical scheme adopted by the invention for solving the technical problem comprises the following steps:
the structured light three-dimensional reconstruction system comprises a computer, a projector and a camera; the projector and the camera are connected with the computer; the computer is used for generating a target fringe pattern group and decoding the image coordinates of the projector from a first fringe pattern shot by the camera; the projector is used for projecting a target fringe pattern group to a scene; the camera is used for shooting a target fringe pattern group reflected by the surface of a scene to obtain N fringe patterns, the fringe patterns shot by the camera are called as first fringe patterns, and N is an integer greater than or equal to 3;
step 1: defining parameters of a target fringe pattern group, generating the target fringe pattern group by a target computer according to the parameter coding of the target fringe pattern group, and simultaneously coding image coordinates of a projector into the target fringe pattern group, wherein the target fringe pattern group comprises N target fringe patterns;
defining the average brightness of the target stripe as A, the modulation degree of the target stripe as B, and the initial phase of the target stripe as
Figure GDA0003781188390000025
Initial frequency of target fringe is f 0 (ii) a The specific encoding steps are as follows:
step 1-1: generating a 1 st target fringe pattern by adopting a trigonometric function;
if the generated target fringe pattern is a vertical fringe pattern, encoding by adopting any one of the formula (1) and the formula (2):
Figure GDA0003781188390000021
Figure GDA0003781188390000022
if the generated target fringe pattern is a horizontal fringe pattern, encoding by adopting any one of the formulas (3) and (4):
Figure GDA0003781188390000023
Figure GDA0003781188390000024
wherein, I p1 (x p ,y p ) Is the 1 st target stripe pattern, x p As a coordinate of the horizontal direction of the projector image, x p =1,2,3,…,M p -1,M p The width of the target stripe image in the horizontal direction; y is p As a coordinate of the vertical direction of the projector image, y p =1,2,3,…,N p -1,N p The height of the target stripe pattern in the vertical direction is shown;
if the formula (1) is selected for coding, the formula (5), the formula (9) and the formula (13) are sequentially selected when the vertical stripe graph is calculated in the subsequent step; if the formula (2) is selected for encoding, the formula (6), the formula (10) and the formula (14) are sequentially selected when the vertical stripe pattern is calculated in the subsequent step;
if the formula (3) is selected for encoding, the formula (7), the formula (11) and the formula (15) are sequentially selected when the horizontal fringe pattern is calculated in the subsequent step; if the formula (4) is selected for encoding, the formula (8), the formula (12) and the formula (16) are sequentially selected when the horizontal fringe pattern is calculated in the subsequent step;
step 1-2: generating target fringe graphs from the 2 nd to the Nth;
the frequency of the (i + 1) th target fringe pattern is f 0 + i Δ f, Δ f is the amount of frequency shift, i ═ 1,2,3, …, N-1;
if the generated target fringe image is a vertical fringe image, selecting an equation (5) or an equation (6) according to the step 1-1 by the calculation formula of the (i + 1) th target fringe image:
Figure GDA0003781188390000031
Figure GDA0003781188390000032
if the generated target fringe image is a horizontal fringe image, selecting an equation (7) or an equation (8) according to the step 1-1 by the calculation formula of the (i + 1) th target fringe image:
Figure GDA0003781188390000033
Figure GDA0003781188390000034
and 2, step: the projector sequentially projects N target fringe patterns in the target fringe pattern group to a scene to be measured, the camera sequentially shoots the projected images to obtain N first fringe patterns, and the target fringe patterns and the first fringe patterns are in one-to-one correspondence;
and step 3: the computer decodes from N first stripe images to obtain the image coordinates of the projector;
step 3-1: assuming that each first fringe pattern in the N first fringe patterns comprises H multiplied by W pixel points; the coordinates of the pixel points are represented by (p, q), wherein p is 1,2,3, …, H, q is 1,2,3, …, W; the coordinates of the pixel points at the same position in each first stripe image are the same as the coordinates of the pixel points at the same position in the other N-1 first stripe images;
step 3-2: with pixels as units, decoding projector image coordinates of each pixel point in H multiplied by W pixel points pixel by pixel from N first stripe images;
step 3-2-1: establishing a tth parameter estimation equation related to a jth pixel point in the first fringe image according to the image brightness of the jth pixel point in the tth first fringe image, wherein t is 1,2,3, …, N, j is 1,2,3, … and H multiplied by W;
if the target fringe image corresponding to the first fringe image in the t-th frame is a vertical fringe image, selecting an expression (9) or an expression (10) according to the step 1-1 by using the parameter estimation equation:
Figure GDA0003781188390000035
Figure GDA0003781188390000041
if the target fringe image corresponding to the first fringe image in the t-th frame is a horizontal fringe image, selecting an equation (11) or an equation (12) according to the step 1-1 by using the parameter estimation equation:
Figure GDA0003781188390000042
Figure GDA0003781188390000043
wherein, I ct (p, q) is the image brightness of the jth pixel point in the tth first stripe image, A c Representing the average brightness of the stripes as the parameters to be estimated; b is c Representing the modulation degree of the stripes as parameters to be estimated; (x' p ,y′ p ) The projector image coordinate corresponding to the jth pixel point in the tth first stripe image;
step 3-2-2: obtaining the parameter estimation equations of the jth pixel point in the 1 st to Nth first stripe images, N in total, and simultaneously obtaining N parameter estimation equations to solve the parameter A to be estimated at the jth pixel c 、B c And (x' p ,y′ p );
Establishing an optimization objective function:
Figure GDA0003781188390000044
Figure GDA0003781188390000045
Figure GDA0003781188390000046
Figure GDA0003781188390000047
if the target fringe pattern corresponding to the first fringe pattern is a vertical fringe pattern, selecting the optimization objective function E (A) of the formula (13) or the formula (14) according to the step 1-1 c ,B c ,x′ p ) Minimum, solving N parameter estimation equations to obtain (x' p ,y′ p ) The image coordinate of the projector corresponding to the jth pixel point in the first fringe image is obtained;
if the target fringe pattern corresponding to the first fringe pattern is a horizontal fringe pattern, the optimization objective function E (A) of the formula (15) or the formula (16) is selected according to the step 1-1 c ,B c ,y′ p ) Minimum, solving N parameter estimation equations to obtain (x' p ,y′ p ) The image coordinate of the projector corresponding to the jth pixel point in the first fringe image is obtained;
step 3-2-3: and repeating the step 3-2-1 and the step 3-2-2 to obtain projector image coordinates corresponding to all pixel points in the first fringe pattern.
Preferably, the method for establishing the optimization objective function in the step 3-2-2 is a least square method.
The invention has the following beneficial effects:
compared with the existing encoding and decoding method based on the phase shift fringe, the target fringe pattern group generated according to the frequency shift quantity and the frequency shift times is adopted, the periodicity in the projector image coordinate decoding process can be eliminated by the frequency shift quantity and the frequency shift times, so that the corresponding projector image coordinate can be directly obtained without phase unwrapping, and the frequency shift times N can be 3 at the minimum theoretically, and the number of the required fringe patterns is less than that of a multi-frequency conventional phase shift fringe pattern, so that the efficiency is higher; in addition, compared with other space phase unwrapping methods, the method provided by the invention can be used for independently calculating the projector image coordinate of each pixel in the decoding process, so that the method is not limited by any space prior information, and can be better suitable for complex scenes such as discontinuous surfaces.
Drawings
Fig. 1 is a schematic diagram of a structured light three-dimensional reconstruction system according to the present invention.
FIG. 2 is a flow chart of the method of the present invention.
Fig. 3 is a schematic diagram of design parameters required by the frequency-shifted fringe pattern set generated by the present invention.
FIG. 4 is a schematic diagram of a set of frequency-shifted fringe patterns generated by the present invention: FIG. 4(A) is a frequency-shifted fringe pattern with a vertical fringe pattern; fig. 4(B) is a frequency-shifted fringe pattern with a horizontal fringe pattern.
Fig. 5 is an example of a set of N-step frequency-shifted stripes with a vertical stripe pattern of the present invention when N is 4: fig. 5(a) to 5(D) are 1,2,3, and 4-step frequency-shift stripe patterns, respectively; fig. 5(E) is a section line of the 1,2,3, 4-th step frequency shift stripe pattern.
Detailed Description
The invention is further illustrated with reference to the following figures and examples.
As shown in fig. 2, a coding and decoding method based on frequency shift stripes in a structured light three-dimensional reconstruction system includes the following steps:
as shown in fig. 1, the structured light three-dimensional reconstruction system comprises a computer, a projector and a camera; the projector and the camera are connected with the computer; the computer is used for generating a target fringe pattern group and decoding the image coordinates of the projector from a first fringe pattern shot by the camera; the projector is used for projecting a target fringe pattern group to a scene; the camera is used for shooting a target fringe pattern group reflected by the surface of a scene to obtain N fringe patterns, the fringe patterns shot by the camera are called as first fringe patterns, and N is an integer greater than or equal to 3;
step 1: defining parameters of a target fringe pattern group, generating the target fringe pattern group by a target computer according to the parameter coding of the target fringe pattern group, and simultaneously coding image coordinates of a projector into the target fringe pattern group, wherein the target fringe pattern group comprises N target fringe patterns;
as shown in fig. 3, the average brightness of the target stripe is defined as a, the modulation degree of the target stripe is defined as B, and the initial phase of the target stripe is defined as
Figure GDA0003781188390000051
Initial frequency of target fringe is f 0 (ii) a The specific encoding steps are as follows:
step 1-1: generating a 1 st target fringe pattern by adopting a trigonometric function;
as shown in fig. 4, if the generated target fringe pattern is a vertical fringe pattern, encoding is performed by using any one of the following equations (1) and (2):
Figure GDA0003781188390000061
Figure GDA0003781188390000062
if the generated target fringe pattern is a horizontal fringe pattern, encoding is performed by adopting any one of the formula (3) and the formula (4):
Figure GDA0003781188390000063
Figure GDA0003781188390000064
wherein, I p1 (x p ,y p ) Is the 1 st target stripe pattern, x p As a coordinate of the horizontal direction of the projector image, x p =1,2,3,…,M p -1,M p The width of the target stripe image in the horizontal direction; y is p As the coordinate of the vertical direction of the projector image, y p =1,2,3,…,N p -1,N p The height of the target stripe pattern in the vertical direction is shown;
if the formula (1) is selected for coding, the formula (5), the formula (9) and the formula (13) are sequentially selected when the vertical stripe graph is calculated in the subsequent step; if the formula (2) is selected for encoding, the formula (6), the formula (10) and the formula (14) are sequentially selected when the vertical fringe pattern is calculated in the subsequent step;
if the formula (3) is selected for encoding, the formula (7), the formula (11) and the formula (15) are sequentially selected when the horizontal fringe pattern is calculated in the subsequent step; if the formula (4) is selected for encoding, the formula (8), the formula (12) and the formula (16) are sequentially selected when the horizontal fringe pattern is calculated in the subsequent step;
step 1-2: generating target fringe graphs from the 2 nd to the Nth;
the frequency of the (i + 1) th target fringe pattern is f 0 + i Δ f, Δ f is the amount of frequency shift, i ═ 1,2,3, …, N-1;
if the generated target fringe image is a vertical fringe image, selecting an equation (5) or an equation (6) according to the step 1-1 by the calculation formula of the (i + 1) th target fringe image:
Figure GDA0003781188390000065
Figure GDA0003781188390000066
if the generated target fringe pattern is a horizontal fringe pattern, the calculation formula of the (i + 1) th target fringe pattern selects the formula (7) or the formula (8) according to the step 1-1:
Figure GDA0003781188390000067
Figure GDA0003781188390000068
and 2, step: the projector sequentially projects N target fringe patterns in the target fringe pattern group to a scene to be measured, the camera sequentially shoots the projected images to obtain N first fringe patterns, and the target fringe patterns and the first fringe patterns are in one-to-one correspondence;
and step 3: the computer decodes from N first stripe images to obtain the image coordinates of the projector;
step 3-1: assuming that each first fringe pattern in the N first fringe patterns comprises H multiplied by W pixel points; the coordinates of the pixel points are represented by (p, q), wherein p is 1,2,3, …, H, q is 1,2,3, …, W; the coordinates of the pixel points at the same position in each first stripe image are the same as the coordinates of the pixel points at the same position in the other N-1 first stripe images;
step 3-2: with pixels as units, decoding projector image coordinates of each pixel point in H multiplied by W pixel points pixel by pixel from N first stripe images;
step 3-2-1: establishing a tth parameter estimation equation related to a jth pixel point in the first fringe image according to the image brightness of the jth pixel point in the tth first fringe image, wherein t is 1,2,3, …, N, j is 1,2,3, … and H multiplied by W;
if the target fringe image corresponding to the first fringe image in the t-th frame is a vertical fringe image, selecting an expression (9) or an expression (10) according to the step 1-1 by using the parameter estimation equation:
Figure GDA0003781188390000071
Figure GDA0003781188390000072
if the target fringe pattern corresponding to the first fringe pattern in the t-th frame is a horizontal fringe pattern, selecting an expression (11) or an expression (12) according to the step 1-1 by using the parameter estimation equation:
Figure GDA0003781188390000073
Figure GDA0003781188390000074
wherein, I ct (p, q) is the image brightness of the jth pixel point in the tth first stripe image, A c Representing the average brightness of the stripes as the parameters to be estimated; b is c Representing the modulation degree of the stripes as parameters to be estimated; (x' p ,y′ p ) For the jth image in the tth first stripe imageProjector image coordinates corresponding to the pixel points;
step 3-2-2: obtaining the parameter estimation equations of the jth pixel point in the 1 st to Nth first stripe graphs, N in total, and simultaneously obtaining N parameter estimation equations to solve the parameter A to be estimated at the jth pixel point c 、B c And (x' p ,y′ p );
The parameter estimation equation establishment processes of each pixel point are mutually independent, namely the projector image coordinate decoding processes of any two different pixels are independent, and the decoding process is not constrained by spatial smoothness prior information, so that the method is very suitable for three-dimensional reconstruction of discontinuous surfaces.
And (3) establishing an optimized objective function by adopting a least square or other optimization algorithm:
Figure GDA0003781188390000075
Figure GDA0003781188390000076
Figure GDA0003781188390000081
Figure GDA0003781188390000082
if the target fringe pattern corresponding to the first fringe pattern is a vertical fringe pattern, selecting the optimization objective function E (A) of formula (13) or formula (14) according to step 1-1 c ,B c ,x′ p ) Minimum, solving N parameter estimation equations to obtain (x' p ,y′ p ) The image coordinate of the projector corresponding to the jth pixel point in the first stripe image is obtained;
if the target fringe pattern corresponding to the first fringe pattern is a horizontal fringe pattern, the optimization objective function E (A) of the formula (15) or the formula (16) is selected according to the step 1-1 c ,B c ,y′ p ) Minimum, solving N parameter estimation equations to obtain (x' p ,y′ p ) The image coordinate of the projector corresponding to the jth pixel point in the first fringe image is obtained;
step 3-2-3: and repeating the step 3-2-1 and the step 3-2-2, and calculating to obtain a group of parameters to be estimated with the minimum optimization objective function, namely the solution result of the equation set, wherein the image coordinates of the projector in the result are the decoding result. And obtaining the projector image coordinates corresponding to all the pixel points in the first stripe image.
According to the established optimization objective function, because a fixed period T does not exist, the image coordinate of the projector is added or subtracted with the period T, and the value of the optimization objective function is kept unchanged, namely only one unique solution can be formed in an equation set formed by simultaneous N parameter estimation equations, the method does not have the phase wrapping problem, and can directly decode the image coordinate of the projector without calculating the phase of the stripes to convert the phase of the stripes to obtain a decoding result. Meanwhile, theoretically, only three parameters to be estimated are included in an equation set formed by the simultaneous N parameter estimation equations, so that the equation set can be solved by at least three parameter estimation equations, namely the minimum number N of the required fringe patterns is 3; the number of the fringe patterns required for acquiring the wrapping phase by the phase-shifting fringe pattern is 3, and at least 6 fringe patterns are required to obtain the final decoding result by adding the fringe patterns with at least two frequencies. In comparison, theoretically, the coding and decoding method based on the frequency shift stripes needs fewer stripe patterns, so that the efficiency is higher, and the method is more suitable for the three-dimensional reconstruction of a rapid and dynamic scene.
An experiment is performed by taking N-4, and the experimental result is as shown in fig. 5, which is an example of a group of N-step frequency shift fringe patterns when N-4 is used with a vertical fringe pattern: fig. 5(a) to 5(D) are i-th to 1,2,3, 4-th step frequency-shift stripe diagrams, respectively; fig. 5(E) is a section line of the i-th-1, 2,3, 4-step frequency shift stripe pattern.
As can be seen from the figure, the method achieves better effect.

Claims (2)

1. A coding and decoding method based on frequency shift stripes in a structured light three-dimensional reconstruction system is characterized in that the structured light three-dimensional reconstruction system comprises a computer, a projector and a camera; the projector and the camera are connected with the computer; the computer is used for generating a target fringe pattern group and decoding the image coordinates of the projector from a first fringe pattern shot by the camera; the projector is used for projecting a target fringe pattern group to a scene; the camera is used for shooting a target fringe pattern group reflected by the surface of a scene to obtain N fringe patterns, the fringe patterns shot by the camera are called as first fringe patterns, and N is an integer greater than or equal to 3; the method comprises the following steps:
step 1: defining parameters of a target fringe pattern group, generating the target fringe pattern group by a target computer according to the parameter coding of the target fringe pattern group, and simultaneously coding image coordinates of a projector into the target fringe pattern group, wherein the target fringe pattern group comprises N target fringe patterns;
defining the average brightness of the target stripe as A, the modulation degree of the target stripe as B, and the initial phase of the target stripe as
Figure FDA0003781188380000015
Initial frequency of target fringe is f 0 (ii) a The specific encoding steps are as follows:
step 1-1: generating a 1 st target fringe pattern by adopting a trigonometric function;
if the generated target fringe pattern is a vertical fringe pattern, encoding by adopting any one of the formula (1) and the formula (2):
Figure FDA0003781188380000011
Figure FDA0003781188380000012
if the generated target fringe pattern is a horizontal fringe pattern, encoding is performed by adopting any one of the formula (3) and the formula (4):
Figure FDA0003781188380000013
Figure FDA0003781188380000014
wherein, I p1 (x p ,y p ) Is the 1 st target stripe image, x p As a coordinate of the horizontal direction of the projector image, x p =1,2,3,…,M p -1,M p The width of the target stripe image in the horizontal direction; y is p As the coordinate of the vertical direction of the projector image, y p =1,2,3,…,N p -1,N p The height of the target stripe pattern in the vertical direction is shown;
if the formula (1) is selected for coding, the formula (5), the formula (9) and the formula (13) are sequentially selected when the vertical stripe graph is calculated in the subsequent step; if the formula (2) is selected for encoding, the formula (6), the formula (10) and the formula (14) are sequentially selected when the vertical stripe pattern is calculated in the subsequent step;
if the formula (3) is selected for encoding, the formula (7), the formula (11) and the formula (15) are sequentially selected when the horizontal fringe pattern is calculated in the subsequent step; if the formula (4) is selected for encoding, the formula (8), the formula (12) and the formula (16) are sequentially selected when the horizontal fringe pattern is calculated in the subsequent step;
step 1-2: generating target fringe graphs from the 2 nd to the Nth;
the frequency of the (i + 1) th target fringe pattern is f 0 + i Δ f, Δ f is the amount of frequency shift, i ═ 1,2,3, …, N-1;
if the generated target fringe pattern is a vertical fringe pattern, selecting formula (5) or formula (6) according to step 1-1 for the calculation formula of the (i + 1) th target fringe pattern:
Figure FDA0003781188380000021
Figure FDA0003781188380000022
if the generated target fringe pattern is a horizontal fringe pattern, the calculation formula of the (i + 1) th target fringe pattern selects the formula (7) or the formula (8) according to the step 1-1:
Figure FDA0003781188380000023
Figure FDA0003781188380000024
and 2, step: the projector sequentially projects N target fringe patterns in the target fringe pattern group to a scene to be measured, the camera sequentially shoots the projected images to obtain N first fringe patterns, and the target fringe patterns and the first fringe patterns are in one-to-one correspondence;
and step 3: the computer decodes from N first stripe pictures to obtain the image coordinate of the projector;
step 3-1: assuming that each first fringe pattern in the N first fringe patterns comprises H multiplied by W pixel points; the coordinates of the pixel points are represented by (p, q), wherein p is 1,2,3, …, H, q is 1,2,3, …, W; the coordinates of the pixel points at the same position in each first stripe image are the same as the coordinates of the pixel points at the same position in the other N-1 first stripe images;
step 3-2: with pixels as units, decoding projector image coordinates of each pixel point in H multiplied by W pixel points pixel by pixel from N first stripe images;
step 3-2-1: establishing a tth parameter estimation equation related to a jth pixel point in the first fringe image according to the image brightness of the jth pixel point in the tth first fringe image, wherein t is 1,2,3, …, N, j is 1,2,3, … and H multiplied by W;
if the target fringe pattern corresponding to the first fringe pattern in the t-th frame is a vertical fringe pattern, selecting an expression (9) or an expression (10) according to the step 1-1 by using the parameter estimation equation:
Figure FDA0003781188380000025
Figure FDA0003781188380000026
if the target fringe pattern corresponding to the first fringe pattern in the t-th frame is a horizontal fringe pattern, selecting an expression (11) or an expression (12) according to the step 1-1 by using the parameter estimation equation:
Figure FDA0003781188380000027
Figure FDA0003781188380000028
wherein, I ct (p, q) is the image brightness of the jth pixel point in the tth first stripe image, A c Representing the average brightness of the stripes as the parameters to be estimated; b is c Representing the modulation degree of the stripes as parameters to be estimated; (x' p ,y′ p ) The projector image coordinate corresponding to the jth pixel point in the tth first stripe image;
step 3-2-2: obtaining the parameter estimation equations of the jth pixel point in the 1 st to Nth first stripe images, N in total, and simultaneously obtaining N parameter estimation equations to solve the parameter A to be estimated at the jth pixel c 、B c And (x' p ,y′ p );
Establishing an optimization objective function:
Figure FDA0003781188380000031
Figure FDA0003781188380000032
Figure FDA0003781188380000033
Figure FDA0003781188380000034
if the target fringe pattern corresponding to the first fringe pattern is a vertical fringe pattern, selecting the optimization objective function E (A) of formula (13) or formula (14) according to step 1-1 c ,B c ,x′ p ) Minimum, solving N parameter estimation equations to obtain (x' p ,y′ p ) The image coordinate of the projector corresponding to the jth pixel point in the first fringe image is obtained;
if the target fringe pattern corresponding to the first fringe pattern is a horizontal fringe pattern, the optimization objective function E (A) of the formula (15) or the formula (16) is selected according to the step 1-1 c ,B c ,y′ p ) Minimum, solving N parameter estimation equations to obtain (x' p ,y′ p ) The image coordinate of the projector corresponding to the jth pixel point in the first fringe image is obtained;
step 3-2-3: and repeating the step 3-2-1 and the step 3-2-2 to obtain projector image coordinates corresponding to all pixel points in the first fringe pattern.
2. The encoding and decoding method based on frequency-shift fringes in the structured light three-dimensional reconstruction system according to claim 1, wherein the method for establishing the optimized objective function in step 3-2-2 is a least square method.
CN202110225563.0A 2021-03-01 2021-03-01 Encoding and decoding method based on frequency shift stripes in structured light three-dimensional reconstruction system Active CN113008163B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110225563.0A CN113008163B (en) 2021-03-01 2021-03-01 Encoding and decoding method based on frequency shift stripes in structured light three-dimensional reconstruction system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110225563.0A CN113008163B (en) 2021-03-01 2021-03-01 Encoding and decoding method based on frequency shift stripes in structured light three-dimensional reconstruction system

Publications (2)

Publication Number Publication Date
CN113008163A CN113008163A (en) 2021-06-22
CN113008163B true CN113008163B (en) 2022-09-27

Family

ID=76387036

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110225563.0A Active CN113008163B (en) 2021-03-01 2021-03-01 Encoding and decoding method based on frequency shift stripes in structured light three-dimensional reconstruction system

Country Status (1)

Country Link
CN (1) CN113008163B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114061489A (en) * 2021-11-15 2022-02-18 资阳联耀医疗器械有限责任公司 Structured light coding method and system for three-dimensional information reconstruction
CN114792345B (en) * 2022-06-27 2022-09-27 杭州蓝芯科技有限公司 Calibration method based on monocular structured light system

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106643562A (en) * 2016-10-27 2017-05-10 天津大学 Time domain and space domain hybrid coding based structured light fringe projection method

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100480627C (en) * 2007-10-26 2009-04-22 北京航空航天大学 Steel rail wearing integrative parameter vehicle-mounted dynamic measuring device and method
US20160094830A1 (en) * 2014-09-26 2016-03-31 Brown University System and Methods for Shape Measurement Using Dual Frequency Fringe Patterns
CN105890546A (en) * 2016-04-22 2016-08-24 无锡信捷电气股份有限公司 Structured light three-dimensional measurement method based on orthogonal Gray code and line shift combination
CN107169952B (en) * 2017-03-07 2021-07-23 广东顺德中山大学卡内基梅隆大学国际联合研究院 Stripe recognition and information detection method for visible light imaging positioning

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106643562A (en) * 2016-10-27 2017-05-10 天津大学 Time domain and space domain hybrid coding based structured light fringe projection method

Also Published As

Publication number Publication date
CN113008163A (en) 2021-06-22

Similar Documents

Publication Publication Date Title
CN111563564B (en) Speckle image pixel-by-pixel matching method based on deep learning
CN109186476B (en) Color structured light three-dimensional measurement method, device, equipment and storage medium
CN106705855B (en) A kind of high dynamic performance method for three-dimensional measurement based on adaptive optical grating projection
CN113008163B (en) Encoding and decoding method based on frequency shift stripes in structured light three-dimensional reconstruction system
Zhang et al. High-resolution, real-time 3D shape acquisition
Zhang Recent progresses on real-time 3D shape measurement using digital fringe projection techniques
CN107967697B (en) Three-dimensional measurement method and system based on color random binary coding structure illumination
CN111563952B (en) Method and system for realizing stereo matching based on phase information and spatial texture characteristics
CN108596008B (en) Face shake compensation method for three-dimensional face measurement
CN108613637A (en) A kind of structured-light system solution phase method and system based on reference picture
CN109523627B (en) Three-dimensional reconstruction method of profile structured light based on Taylor index expression
Lu et al. Reconstruction of isolated moving objects with high 3D frame rate based on phase shifting profilometry
CN110692084B (en) Apparatus and machine-readable storage medium for deriving topology information of a scene
CN109945802A (en) A kind of structural light three-dimensional measurement method
Je et al. Color-phase analysis for sinusoidal structured light in rapid range imaging
CN113379818A (en) Phase analysis method based on multi-scale attention mechanism network
Wu et al. A novel phase-shifting profilometry to realize temporal phase unwrapping simultaneously with the least fringe patterns
CN111307066A (en) Phase unwrapping method for interval processing
CN114170345A (en) Fringe pattern design method for structured light projection nonlinear correction
Wang et al. A 3D shape measurement method based on novel segmented quantization phase coding
Li et al. An improved 2+ 1 phase-shifting algorithm
JP2001330417A (en) Three-dimensional shape measuring method and apparatus using color pattern light projection
CN116385653A (en) Three-dimensional imaging self-supervision method and device based on single-view high-frequency stripes
CN115839677A (en) Method and system for measuring three-dimensional topography of surface of object with high dynamic range
CN111023999B (en) Dense point cloud generation method based on spatial coding structured light

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant