CN116429014A - Orthogonal multi-projection aliasing image non-separation structured light three-dimensional measurement method - Google Patents

Orthogonal multi-projection aliasing image non-separation structured light three-dimensional measurement method

Info

Publication number
CN116429014A
CN116429014A
Authority
CN
China
Prior art keywords
projection
image
phase
point cloud
structured light
Prior art date
Legal status
Pending
Application number
CN202310431122.5A
Other languages
Chinese (zh)
Inventor
邓高旭 (Deng Gaoxu)
马立东 (Ma Lidong)
姬小峰 (Ji Xiaofeng)
牛健 (Niu Jian)
郭瑞 (Guo Rui)
Current Assignee
Taiyuan University of Science and Technology
Original Assignee
Taiyuan University of Science and Technology
Priority date
Filing date
Publication date
Application filed by Taiyuan University of Science and Technology filed Critical Taiyuan University of Science and Technology
Priority to CN202310431122.5A priority Critical patent/CN116429014A/en
Publication of CN116429014A publication Critical patent/CN116429014A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/2433: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures for measuring outlines by shadow casting
    • G01B11/30: Measuring arrangements characterised by the use of optical techniques for measuring roughness or irregularity of surfaces

Abstract

The invention discloses an orthogonal multi-projection aliased-image non-separation structured light three-dimensional measurement method, which comprises the following steps: constructing a multi-projection structured light three-dimensional measurement platform; performing independent projection and capture with each projection device and its corresponding camera device to generate phase-shift images, solving the gamma value of the measurement platform from these phase-shift images, and solving the frequency and gray values of the phase-shift images; projecting preset coded images onto the surface of the object to be measured to obtain an aliased image, solving the relative and absolute phases of the aliased image, solving depth information from the internal and external parameters to obtain point cloud information, and fusing the point cloud information according to the calibration parameters of the measurement platform to obtain a point cloud image of the measured surface, which is used to measure the three-dimensional shape of the object surface and the dimensions of the object. The invention ensures effective measurement of large-size objects, can be applied to fields such as flatness and surface-quality inspection of large metal surfaces, and has great practical application value.

Description

Orthogonal multi-projection aliasing image non-separation structured light three-dimensional measurement method
Technical Field
The invention relates to the technical field of structured light three-dimensional measurement, in particular to an orthogonal multi-projection aliasing image non-separation structured light three-dimensional measurement method.
Background
Machine vision is widely used in modern industrial manufacturing, particularly in high-risk environments and under conditions that are not accessible to human operators. Visual three-dimensional reconstruction refers to photographing real-world objects and scenes with a camera and processing the images with computer-vision techniques to obtain a three-dimensional model of the object. Three-dimensional reconstruction technology is widely used in 3D face recognition, virtual reality, augmented reality, robot navigation, robot grasping, autonomous driving and other fields. Compared with other three-dimensional measurement technologies, structured light measurement is simple to implement, offers high measurement accuracy and speed, and is little affected by ambient light. However, a single-projection structured light system generally cannot capture the full surface of the measured object in one shot, which is a limitation in highly real-time scenarios where the three-dimensional shape must be obtained at once. A multi-camera, multi-projection structured light system not only enables one-shot overall reconstruction but can also use multiple projections at the same viewing angle to overcome the light occlusion that affects single-view reconstruction.
The image aliasing caused by multiple projectors simultaneously projecting fringe images is unavoidable. The most straightforward remedy is to separate the aliased images captured by the camera and then reconstruct in three dimensions as in a single-projection structured light system. However, existing multi-camera, multi-projection measurement methods suffer from low measurement efficiency and poor separation accuracy of the overlapped images, which seriously hinders their practical application and popularization.
Disclosure of Invention
To solve these technical problems, the invention provides an orthogonal multi-projection aliased-image non-separation structured light three-dimensional measurement method. The method uses the solved complex boundary conditions to constrain the frequency and intensity values, which avoids the brightness overexposure and interference-like blurring that can occur when images are aliased. It studies the temporal and spatial characteristics of the aliased images, transforms the time-sequenced phase-shift images with a one-dimensional discrete Fourier transform, obtains the phase corresponding to each projector through phase-spectrum analysis, and then obtains the three-dimensional shape of the object surface with a phase-to-depth calculation method.
To achieve the above object, the present invention provides an orthogonal multi-projection aliased-image non-separation structured light three-dimensional measurement method, comprising:
building a multi-projection structured light three-dimensional measurement platform and debugging the multi-projection structured light three-dimensional measurement platform; the multi-projection structured light three-dimensional measurement platform comprises a camera device, a projection device, a network card and computer equipment;
the method comprises the steps of performing independent projection and shooting through a projection device and a corresponding camera device, generating a phase shift image, solving a gamma value of the measurement platform based on the phase shift image, encoding the gamma value into a phase shift image projected subsequently, solving the frequency and gray value of the phase shift image, and determining internal and external parameters of the camera device and the projection device;
projecting a preset coding image onto the surface of an object to be measured to obtain an aliasing image, solving the relative phase and the absolute phase of the aliasing image, carrying out depth information solving according to the internal and external parameters to obtain point cloud information, and then fusing the point cloud information according to the calibration parameters of the measurement platform to obtain a point cloud image of the surface of the object to be measured, wherein the point cloud image is used for three-dimensional shape measurement of the surface of the object and measurement of the size of the object.
Preferably, the number of the camera devices in the multi-projection structured light three-dimensional measurement platform is not less than two, the number of the projection devices is not less than three, and a consistency trigger mechanism is established between the camera devices and the projection devices; the consistency triggering mechanism is used for keeping consistency projection for the projection device, triggering the camera device to acquire after the projection is completed, and triggering the projection device to project the next image after the camera device acquires the aliased image each time.
Preferably, the debugging the multi-projection structured light three-dimensional measurement platform comprises:
starting the computer equipment, connecting a camera device, the projection device and the network card, testing an external trigger function between the projection device and the camera device, testing a facula projection function of the projection device, and testing an image acquisition function of the camera device;
the projection angle of the projection device is adjusted, the overlapping part between the projected images is ensured, the shooting view field range of the camera device is adjusted, and the camera device can shoot the projection view field of the projection device completely.
Preferably, solving the gamma value of the measurement platform based on the phase-shifted image includes:
solving the gamma value based on a duty-cycle phase chord distribution coding method, calculating the gamma value corresponding to each wrapped-phase chord curve, summing and averaging these gamma values, and taking the average as the gamma value of each wrapped-phase period.
Preferably, determining the internal and external parameters of the image capturing device and the projection device according to a polygonal prism calibration block calibration method includes:
acquiring initial estimation of a transformation matrix between the camera devices through a polyhedral prismatic calibration block, and minimizing errors based on three-dimensional constraint through reconstructed three-dimensional standard component point clouds; and establishing a world coordinate system on a reference camera coordinate system in the reconstruction process, realizing multi-view point cloud fusion based on a position conversion matrix from other camera devices to the reference camera device, and acquiring internal and external parameters of the camera device and the projection device.
Preferably, the aliased image I_c is:

I_c = a + Σ_{k=1}^{P} b_k cos(ω_k x_k + δ_k)

wherein b_k, ω_k and δ_k respectively represent the contrast, spatial angular frequency and phase shift of the fringe image projected by the k-th projector, the value a comprises the ambient illumination and the total contribution of the plurality of projectors, P is the number of aliased images (projectors), and x_k represents the lateral coordinate of the projected image.
Preferably, the relative phase of the aliased image is solved based on a temporal discrete Fourier transform, and the absolute phase of the aliased image is solved by using a robust spatial phase unwrapping method;
the spatial phase unwrapping is performed by a weighted least squares phase unwrapping method, including:
firstly, preprocessing a shot aliasing image, sampling the preprocessed image according to the density of sampling points, and calculating the phase value and the signal-to-noise ratio of each sampling point;
and constructing a weight matrix based on the signal-to-noise ratio and the phase difference information, and carrying out weighted average on the phase difference between adjacent pixels to obtain a continuous phase expansion result.
Preferably, fusing the point cloud information according to the calibration parameters of the measurement platform includes:
performing coarse matching on the point cloud information with the PCA algorithm, reducing the dimensionality of the data set and representing the point cloud information by the retained features of greatest contribution;
calculating the corresponding rotation and translation vectors from the principal directions of the point cloud information, then iteratively updating the closest-point matching between the source point cloud and the target point cloud with the ICP matching algorithm and solving the transformation to obtain a globally optimal transformation matrix;
and transforming the source point cloud into the target point cloud frame based on the globally optimal transformation matrix, evaluating the distance between adjacent points of the fused clouds by the Euclidean distance, and deleting points below a given threshold, and so on until all the point clouds are fused.
Compared with the prior art, the invention has the following advantages and technical effects:
the method mainly solves the defects that multiple images are required to be separated in the aliasing process and the problems of overexposure and interference of images in the aliasing process in the prior art, can directly increase a projector to enlarge the measurement area of an object, can effectively ensure the effective measurement of a large-size object, can be applied to the fields of detection of the surface flatness and the surface quality of large-size metal such as steel plates, magnesium-aluminum alloy plates and the like in the metallurgical industry, has certain guarantee on the production quality of metal plates, and has important theoretical significance and great practical application value in the quality detection of the metal plates.
The method can effectively ensure the effective measurement of large-size objects, can be applied to the fields of large-size metal surface flatness, surface quality detection and the like of steel plates, magnesium-aluminum alloy plates and the like in the metallurgical industry, and has important theoretical significance and great practical application value.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application, illustrate and explain the application and are not to be construed as limiting the application. In the drawings:
FIG. 1 is a flow chart of a three-dimensional measurement method of an orthogonal multi-projection aliasing image non-separated structured light according to an embodiment of the invention;
FIG. 2 is a schematic diagram illustrating decoding of a multi-camera multi-projection structured light system according to an embodiment of the present invention;
FIG. 3 is a flowchart of a calibration method based on a polygonal prism-shaped calibration block according to an embodiment of the present invention;
fig. 4 is a schematic diagram of an exemplary system composition structure according to an embodiment of the present invention.
Detailed Description
It should be noted that, in the case of no conflict, the embodiments and features in the embodiments may be combined with each other. The present application will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
It should be noted that the steps illustrated in the flowcharts of the figures may be performed in a computer system such as a set of computer executable instructions, and that although a logical order is illustrated in the flowcharts, in some cases the steps illustrated or described may be performed in an order other than that illustrated herein.
An orthogonal multi-projection aliasing image non-separation structured light three-dimensional measurement method comprises theoretical research, basic calibration and online measurement. The theoretical research comprises interference-like and overexposure boundary conditions, the space-time characteristics of the aliasing grating image and a space phase unfolding method. The basic calibration process comprises gamma correction based on duty cycle coding, discontinuous position identification and segmentation of phase diagrams, and calibration of internal and external parameters of a projector and a camera. The online measurement mainly comprises a phase solving method of an aliased image, generation of a synchronously projected multi-aliased image, spatial unfolding of a phase, a phase-depth mapping method and a point cloud fusion method. The general technical route is shown in the following figure 1.
The method specifically comprises the following steps:
1) Constructing a multi-projection structured light three-dimensional measurement platform comprising a camera, a lens, a projector, a POE network card and a computer.
2) Starting a computer, connecting a camera, a projector and a network card, and enabling a debugging system to work in a linkage way;
3) Testing an external trigger function between the projector and the camera, testing a facula projection function of the projector, and testing an image acquisition function of the camera;
4) The projection angle of the projector is adjusted to ensure that overlapping parts exist between the projected fringe images, the shooting view field range of the camera is adjusted, and the camera can shoot the projection view field of the projector completely;
5) Each projector and the corresponding camera independently project and shoot the generated phase shift image, solve the gamma value of the system based on a duty ratio phase chord distribution coding method, and code the gamma value in a phase shift diagram of the subsequent projection;
6) Encoding the frequency and gray values of the image according to the frequency and gray values solved by the complex boundary conditions;
in this embodiment, the sum of the maximum gray values of all the projection images does not exceed 255 and the orthogonality condition is satisfied between the projection frequencies.
7) Determining internal and external parameters of each camera and projector in the system according to a multi-sided prism calibration block calibration method;
8) Projecting a preset coding image to the surface of an object to be measured to obtain an aliasing image;
9) Solving the relative phase based on the temporal discrete Fourier transform, and solving the absolute phase by using a robust spatial phase unwrapping method;
10) According to the determined internal and external parameters of the system, carrying out depth information solving by combining the solved phase information;
11) Obtaining point cloud information after solving, and then fusing the point clouds according to the system calibration parameters;
12) Finally, obtaining a point cloud image of the surface of the measured object.
In the step 1), the number of cameras is more than or equal to 2, and the number of projectors is more than or equal to 3;
in step 3), a consistency trigger mechanism is established between the cameras and the projectors, all the projectors keep consistent projection, all the cameras are triggered to collect after the projection is completed, and all the cameras trigger all the projectors to project the next image after each time of collecting the aliased image.
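As a minimal illustration of this consistency trigger mechanism, the following Python sketch shows one possible software-trigger loop. The Projector and Camera classes are hypothetical stand-ins for the actual hardware SDK objects, which are not specified here; the capture method simply models the additive aliasing of all projected patterns.

```python
# Hypothetical sketch of the consistency trigger mechanism described above.
# Projector/Camera are placeholder stubs, not the API of any specific SDK.
import numpy as np

class Projector:
    def project(self, pattern):          # stand-in for load + hardware trigger
        self.current = pattern

class Camera:
    def capture(self, projectors):       # stand-in for a triggered exposure
        # the captured frame is modeled as an additive aliasing of all patterns
        return np.sum([p.current for p in projectors], axis=0)

def acquire_sequence(projectors, cameras, pattern_sets):
    """pattern_sets: one list of patterns (one per projector) for each step."""
    frames = []
    for pattern_set in pattern_sets:
        for proj, pat in zip(projectors, pattern_set):
            proj.project(pat)            # all projectors project consistently
        frames.append([cam.capture(projectors) for cam in cameras])
        # only after every camera has acquired the aliased image does the loop
        # move on and trigger the next projection
    return frames
```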
Interference-like and overexposure boundary conditions:
In a dual-projection structured light system, if the frequency and gray-scale intensity of the phase-shift images projected by the two projectors are the same, the fringe images produce interference-like blurring and overexposure when aliased.
For a one-dimensional time signal, the condition for any two sub-signals to be orthogonal over the duration T_s is

∫_0^{T_s} cos(2π f_m t + φ_m) cos(2π f_n t + φ_n) dt = 0, m ≠ n (1)

The minimum sub-carrier frequency interval for which the above is satisfied is

Δf_min = 1/T_s (2)

Using this property of one-dimensional orthogonal signals in phase-shift image coding, let the two phase-shift images have frequencies f_1 and f_2, periods T_1 and T_2, and image width Co; then

T_1 = Co/f_1, T_2 = Co/f_2 (3)

In the above, T_s is the least common multiple of T_1 and T_2, and the minimum frequency interval is

Δf_min = Co/T_s (4)

Thus, the phase-shift pattern frequencies projected by the two projectors can be expressed as f_1 and f_1 + Δf_min.
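As a numerical illustration of these boundary conditions (a sketch under assumptions, not code from the patent), the snippet below assumes each projector uses an integer number of fringe periods across the image width Co, so that the common period T_s equals Co and the minimum orthogonal spacing is Δf_min = Co/T_s = 1 period per image. The gray range is split so that the sum of the maxima of the aliased patterns stays below 255, as required in the embodiment.

```python
import numpy as np

def design_fringe_set(Co=1024, f1=16, num_proj=3, ambient=20.0):
    """Sketch of the frequency / intensity boundary conditions (illustrative values)."""
    # frequencies spaced by the minimum orthogonal interval (1 period per image here)
    freqs = [f1 + k for k in range(num_proj)]
    # split the 8-bit gray range so the sum of pattern maxima plus ambient <= 255
    b = (255.0 - ambient) / (2.0 * num_proj)      # per-projector modulation amplitude
    x = np.arange(Co)
    patterns = [b * (1.0 + np.cos(2.0 * np.pi * f * x / Co)) for f in freqs]
    assert np.max(np.sum(patterns, axis=0)) + ambient <= 255.0 + 1e-6
    # verify pairwise orthogonality of the fringe carriers over one image width
    for i in range(num_proj):
        for j in range(i + 1, num_proj):
            dot = np.sum(np.cos(2 * np.pi * freqs[i] * x / Co) *
                         np.cos(2 * np.pi * freqs[j] * x / Co))
            assert abs(dot) < 1e-6 * Co
    return freqs, patterns
```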
The gamma solving method of the duty-cycle chord distribution coding is as follows:
Each phase period in the wrapped phase solved by the three-step phase-shift method contains three chord distribution curves. In theory, each chord distribution curve can yield the corresponding duty-cycle coding value and hence a gamma value, and these gamma values are equal. In practice, the gamma values obtained individually from each chord distribution curve differ slightly. Therefore, to calculate more accurately the gamma of the phase-shift image corresponding to each wrapped-phase period, the three gamma values corresponding to the chord curves of that period are all calculated. As shown in the dashed box of FIG. 1, the gamma calculation of the duty-cycle phase chord distribution curves sums and averages the three gamma values obtained from the three chord distribution curves, and the average is used as the gamma of each wrapped-phase period. Finally, in the computer-generated phase-shift fringe images, the gamma value of each wrapped-phase period is encoded into the corresponding fringe image. In this way, the gamma effect of the projector is eliminated during projection of the phase-shift fringe images.
Discontinuity identification and segmentation of the phase map:
The estimated map is used as an indicator of the wrapped-phase quality. To obtain the weight mask, the solved map must be further segmented before the phase is unwrapped. Since the difference between the two categories is small, i.e. the values of the continuous and discontinuous regions differ little, it is difficult to determine a hard threshold. A new post-processing solution is therefore proposed that obtains the weight mask without using a hard threshold; it can be expressed by the following formula:
w_{i,j} = PF(Φ_{i,j}) (5)
where PF(·) represents the post-processing function.
The post-processing function used to obtain the weight mask comprises three steps: bilateral filtering, STDS, and numerical re-labeling. In fact, direction jumps between adjacent periods seriously affect the segmentation of internal boundaries (discontinuities in the phase map), causing the phase to be unwrapped erroneously. To suppress such jumps, the anomalous directions are first eliminated with a bilateral filter. Then, considering that useful edges may be hidden in texture, STDS, which has image-smoothing capability, is used to extract the edges and remove the texture of the bilateral filter output. The output of the STDS is explicitly labeled as two regions: a continuous region with value 0 and a discontinuous region with value 0.9. Finally, the weight mask is obtained by re-labeling the two regions, i.e. a weight-mask value of 0 corresponds to a continuous region and a value of 1 corresponds to a discontinuous region.
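The sketch below illustrates the three-step post-processing function PF(·) of equation (5). The exact STDS operator and its two-level labelling are not specified here, so TV smoothing and an Otsu split are used purely as stand-ins; only the bilateral filtering step maps directly onto the description above.

```python
import numpy as np
import cv2
from skimage.restoration import denoise_tv_chambolle  # generic smoother standing in for STDS

def weight_mask_from_quality_map(phi):
    """Illustrative sketch of PF(.) in equation (5); phi is the map estimated
    from the wrapped phase. TV smoothing and the Otsu split below are stand-ins
    for the STDS operator and its two-level output."""
    phi32 = np.float32(phi)
    # 1) bilateral filtering suppresses direction jumps between adjacent periods
    smoothed = cv2.bilateralFilter(phi32, 9, 0.1, 5)
    # 2) edge-preserving smoothing (stand-in for STDS) removes residual texture
    smoothed = denoise_tv_chambolle(smoothed, weight=0.1)
    # 3) split into a continuous region (0) and a discontinuous region (0.9),
    #    then re-label the two regions to the final weight-mask values {0, 1}
    u8 = cv2.normalize(smoothed, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    _, two_level = cv2.threshold(u8, 0, 1, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    region = 0.9 * two_level.astype(np.float32)        # STDS-style 0 / 0.9 labelling
    return (region > 0).astype(np.uint8)               # 1 = discontinuous, 0 = continuous
```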
Non-separation phase solving of orthogonally encoded aliased images:
The aliased image captured by the camera can be written as an additive combination of multiple orthogonal fringe images. Superimposing the images projected by the other projectors onto the image projected by the first projector, the aliased image captured by the camera is expressed as

I_c = a + Σ_{k=1}^{P} b_k cos(ω_k x_k + δ_k) (6)

where b_k, ω_k and δ_k respectively represent the contrast, spatial angular frequency and phase shift of the fringe image projected by the k-th projector, and the value a contains the ambient illumination and the overall contribution of the plurality of projectors.
If φ_k = ω_k x_k denotes the wrapped phase of the k-th projector, then

I_c[n] = a + Σ_{k=1}^{P} b_k cos(φ_k + δ_k[n]) (7)

where I_c[n] represents the n-th phase-shifted image captured by the camera and δ_k[n] is the phase shift applied by the k-th projector in the n-th image.
The phase values can be obtained efficiently by applying a discrete Fourier transform to I_c[n] along the time dimension n. FIG. 2 shows the phase-decoding technical route for multiple aliased images.

Î_c[m] = Σ_{n=0}^{N-1} I_c[n] e^{-j2πmn/N} (8)

It can be seen that Î_c[m] is the discrete spectrum of I_c[n]. The DC spectral component represents the ambient illumination and the average intensities of the projectors, while each additional spectral component represents one projector, the index k corresponding to the k-th projector. In this way the N captured images are decomposed into the contributions of the individual projectors.
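The snippet below is a self-contained numerical sketch of this temporal decoding. It assumes the k-th projector advances its phase by 2πk/N per frame, which is what makes its contribution fall into temporal-frequency bin k of the DFT; the actual shift schedule used by the method may differ.

```python
import numpy as np

def decode_wrapped_phases(I_stack, num_proj):
    """I_stack: (N, H, W) stack of aliased phase-shift images I_c[n].
    Assumes projector k advances its phase by 2*pi*k/N per frame, so that its
    fringe lands in temporal-frequency bin k of the DFT along the time axis."""
    spectrum = np.fft.fft(I_stack, axis=0)     # DFT along the time dimension n
    # bin 0 (DC) holds ambient light plus the projectors' average intensities;
    # the angle of bin k is the wrapped phase of the k-th projector
    return [np.angle(spectrum[k]) for k in range(1, num_proj + 1)]

if __name__ == "__main__":
    # tiny self-check on synthetic data
    N, H, W, P = 12, 4, 256, 3
    x = np.arange(W)
    true_phi = [2 * np.pi * (16 + k) * x / W for k in range(P)]
    frames = np.zeros((N, H, W))
    for n in range(N):
        for k in range(P):
            frames[n] += 40.0 * (1 + np.cos(true_phi[k] + 2 * np.pi * (k + 1) * n / N))
    phis = decode_wrapped_phases(frames, P)
    err = np.angle(np.exp(1j * (phis[0][0] - true_phi[0])))   # wrap-aware error, row 0
    print(np.max(np.abs(err)))                                # ~0 up to numerical noise
```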
Spatial unwrapping of the phase:
Spatial phase unwrapping rapidly unwraps the wrapped phase using the structural characteristics of the wrapped-phase map and the spatial characteristics of neighbouring points. The weighted least-squares phase unwrapping method is one such spatial method and can effectively handle noise interference, uneven sampling and similar problems. It is based on the weighted least-squares principle and introduces a weight matrix into the phase-unwrapping process. Specifically, the image is first preprocessed (background removal, filtering and segmentation). The image is then sampled at a certain sampling-point density, and the phase value and signal-to-noise ratio of each sampling point are calculated. A weight matrix is then constructed from the signal-to-noise ratio and phase-difference information, and the phase differences between adjacent pixels are weighted and averaged to obtain a continuous unwrapped-phase result. In the weighted least-squares method the choice of the weight matrix is critical: different weight matrices can be designed for different application scenarios to achieve a better unwrapping effect. For example, at high signal-to-noise ratio a larger weight can be used to reduce the influence of noise, whereas at low signal-to-noise ratio the weights can be reduced appropriately to avoid over-smoothing and information loss. The method not only locates the discontinuities of the wrapped phase but also minimizes the influence of noise in the least-squares sense, improving robustness to noise.
Calibration method based on a polyhedral prismatic calibration block
An initial estimate of the position transformation matrices among the multiple cameras is obtained through the three-dimensional calibration block, the world coordinate system is established on the reference camera coordinate system during reconstruction, and multi-view point cloud fusion is realized based on the position transformation matrices from the other cameras to the reference camera. Meanwhile, a customized polyhedral prismatic calibration block replaces the traditional calibration plate for the cameras shooting from multiple viewing angles; the calibration patterns seen on the block are identical and only the viewing angles differ, which makes the calibration procedure convenient. Because the multiple views project overlapping phase-shift images, the system parameters have a certain correlation in the overlap regions, and system equipment can be added flexibly when measuring large-size objects. The technical route is shown in FIG. 3.
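The calibration itself (prism-block imaging and the 3D-constraint error minimization) is not reproduced here; the sketch below only shows how the estimated camera-to-reference transforms are applied for multi-view fusion once the world frame has been placed on the reference camera, as described above.

```python
import numpy as np

def fuse_to_reference(clouds, extrinsics):
    """clouds: list of (N_i, 3) point arrays, one per camera, in that camera's frame.
    extrinsics: list of (R, t) pairs mapping each camera frame into the reference
    camera frame (identity rotation and zero translation for the reference camera).
    Returns a single (sum N_i, 3) cloud in the reference / world frame."""
    fused = []
    for pts, (R, t) in zip(clouds, extrinsics):
        R = np.asarray(R, dtype=float); t = np.asarray(t, dtype=float).reshape(1, 3)
        fused.append(np.asarray(pts, dtype=float) @ R.T + t)
    return np.vstack(fused)
```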
Depth solving based on absolute phase
The method uses only the abscissa of each pixel point in the projector image coordinate system; because the ordinate is missing, the distortion-correction operation cannot be performed directly. To introduce the distortion parameters into the triangulation for correction, a multi-iteration method is adopted in which the ordinate obtained by re-projection is brought back into the calculation. First, the three-dimensional coordinates in the world coordinate system are obtained by triangulation without considering projector distortion; they are then transformed into the projector coordinate system and projected onto the two-dimensional image plane to obtain the ordinate. After several iterations, the projector distortion correction achieves a good effect.
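A loose sketch of this iterative re-projection idea is given below using OpenCV. The initial ordinate guess (the projector principal point), the number of iterations, and the exact place where projector distortion is applied are assumptions of the sketch rather than details fixed by the description above.

```python
import numpy as np
import cv2

def triangulate_camera_projector(cam_px, up, K_c, K_p, R, t, dist_p, n_iter=3):
    """cam_px: (N, 2) undistorted camera pixels; up: (N,) projector abscissas
    recovered from the absolute phase; (R, t): projector pose w.r.t. the camera;
    dist_p: projector distortion coefficients. Returns (N, 3) points."""
    P_c = K_c @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P_p = K_p @ np.hstack([R, t.reshape(3, 1)])
    vp = np.full(up.shape, K_p[1, 2], dtype=float)   # start from the principal-point ordinate
    rvec, _ = cv2.Rodrigues(R)
    for _ in range(n_iter):
        proj_px = np.stack([up.astype(float), vp], axis=0)          # 2 x N projector pixels
        X_h = cv2.triangulatePoints(P_c, P_p, cam_px.T.astype(float), proj_px)
        X = (X_h[:3] / X_h[3]).T                                    # N x 3 triangulated points
        # re-project into the distorted projector image to recover the ordinate
        reproj, _ = cv2.projectPoints(X, rvec, t.reshape(3, 1), K_p, dist_p)
        vp = reproj.reshape(-1, 2)[:, 1]
    return X
```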
Multi-projection point cloud fusion
The adopted method combines coarse matching based on principal component analysis (PCA) with fine matching based on the iterative closest point (ICP) algorithm; duplicate points are deleted and the clouds are then fused into a single point cloud. The point cloud registration method is described below.
First, coarse matching based on the PCA algorithm is performed on the point clouds. The main role of PCA is to simplify the data, reducing the dimensionality of the data set while representing the point cloud data by the retained features of greatest contribution. The corresponding rotation and translation vectors are calculated from the principal directions of the point clouds. The ICP-based matching algorithm is a nonlinear iterative process that continuously updates the closest-point matches and solves for the transformation, finally yielding the globally optimal transformation matrix. With this transformation matrix the source point cloud is transformed into the target point cloud frame; to ensure that no points are duplicated, the Euclidean distance is used to evaluate the distance between adjacent points of the fused clouds, and points below a given threshold are deleted. In the same way, the point clouds of the whole multi-projection system are fused.
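The following self-contained numpy/scipy sketch mirrors that description: PCA-based coarse alignment, ICP-based fine alignment, and Euclidean-distance de-duplication of the fused cloud. The distance threshold and the handling of the PCA axis-sign ambiguity are illustrative assumptions; a point cloud library could equally be used.

```python
import numpy as np
from scipy.spatial import cKDTree

def pca_coarse_align(src, dst):
    """Coarse matching: align principal directions and centroids of two clouds."""
    def frame(P):
        c = P.mean(axis=0)
        _, _, Vt = np.linalg.svd(P - c, full_matrices=False)
        return c, Vt.T                               # columns = principal directions
    cs, As = frame(src); cd, Ad = frame(dst)
    R = Ad @ As.T
    if np.linalg.det(R) < 0:                         # keep a proper rotation
        Ad[:, -1] *= -1; R = Ad @ As.T
    return R, cd - R @ cs

def icp_refine(src, dst, R, t, n_iter=30):
    """Fine matching: iteratively update closest-point matches and re-solve the transform."""
    tree = cKDTree(dst)
    for _ in range(n_iter):
        moved = src @ R.T + t
        _, idx = tree.query(moved)
        cs, cm = moved.mean(axis=0), dst[idx].mean(axis=0)
        H = (moved - cs).T @ (dst[idx] - cm)
        U, _, Vt = np.linalg.svd(H)
        dR = Vt.T @ U.T
        if np.linalg.det(dR) < 0:
            Vt[-1] *= -1; dR = Vt.T @ U.T
        R, t = dR @ R, dR @ (t - cs) + cm            # compose with the current transform
    return R, t

def fuse(src, dst, dist_thresh=0.5):
    """Transform src into dst's frame, then drop points closer than dist_thresh to dst."""
    R, t = pca_coarse_align(src, dst)
    R, t = icp_refine(src, dst, R, t)
    moved = src @ R.T + t
    d, _ = cKDTree(dst).query(moved)
    return np.vstack([dst, moved[d > dist_thresh]])  # Euclidean de-duplication
```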
With this method, a new approach is established for solving the phase map and performing depth calculation without separating the orthogonal multi-projection aliased images. Based on this approach, an accurate three-dimensional measurement system suitable for large-size and dynamically moving object surfaces can be developed by combining hardware such as projectors, cameras and network cards. The number of projectors and cameras in the multi-projection structured light measurement platform can be increased flexibly, provided that the image projected by each added projector has a certain aliasing region with the images of the existing projectors. Effective measurement of large-size objects is thereby ensured, and the method can be applied to flatness and surface-quality inspection of large metal surfaces such as steel plates and magnesium-aluminium alloy plates in the metallurgical industry; it has important theoretical significance and great practical application value.
The invention is further illustrated by the following examples:
FIG. 4 shows an aliased-image non-separation structured light three-dimensional measurement system based on orthogonal multi-projection. It comprises three industrial cameras with lenses, three projectors, a computer, a network card and the necessary power cables. The cameras are ME2P-1230-9GC-P colour cameras of the Daheng Mercury series, the projectors are Acer K137i DLP projectors, the network card is a POE-powered gigabit card, and the computer has an i7 processor and 32 GB of memory. The projectors project the coded images onto the surface of the measured object, the cameras capture the aliased images, the absolute phase map corresponding to each projector is solved by the temporal discrete Fourier transform, the unwrapped phase is obtained by the robust spatial phase unwrapping method, the three-dimensional point cloud information is obtained through the phase-to-depth mapping model, and finally the structural dimensions of the measured object are obtained. This example is a preferred embodiment of the present application, but the scope of protection is in no way limited to it; the number and types of projectors and cameras, the number and types of network cards, and the computer configuration may all be varied, and all conceivable changes are covered by the scope of protection of the present application.
The foregoing is merely a preferred embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions easily conceivable by those skilled in the art within the technical scope of the present application should be covered in the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (8)

1. An orthogonal multi-projection aliased image non-separated structured light three-dimensional measurement method, comprising:
building a multi-projection structured light three-dimensional measurement platform and debugging the multi-projection structured light three-dimensional measurement platform; the multi-projection structured light three-dimensional measurement platform comprises a camera device, a projection device, a network card and computer equipment;
the method comprises the steps of performing independent projection and shooting through a projection device and a corresponding camera device, generating a phase shift image, solving a gamma value of the measurement platform based on the phase shift image, encoding the gamma value into a phase shift image projected subsequently, solving the frequency and gray value of the phase shift image, and determining internal and external parameters of the camera device and the projection device;
projecting a preset coding image onto the surface of an object to be measured to obtain an aliasing image, solving the relative phase and the absolute phase of the aliasing image, carrying out depth information solving according to the internal and external parameters to obtain point cloud information, and then fusing the point cloud information according to the calibration parameters of the measurement platform to obtain a point cloud image of the surface of the object to be measured, wherein the point cloud image is used for three-dimensional shape measurement of the surface of the object and measurement of the size of the object.
2. The three-dimensional measurement method of the orthogonal multi-projection aliased image non-separated structured light according to claim 1, wherein the number of the image pickup devices in the multi-projection structured light three-dimensional measurement platform is not less than two, the number of the projection devices is not less than three, and a consistency trigger mechanism is established between the image pickup devices and the projection devices; the consistency triggering mechanism is used for keeping consistency projection for the projection device, triggering the camera device to acquire after the projection is completed, and triggering the projection device to project the next image after the camera device acquires the aliased image each time.
3. The orthogonal multi-projection aliased image non-split structured light three-dimensional measurement method of claim 1, wherein tuning the multi-projection structured light three-dimensional measurement platform comprises:
starting the computer equipment, connecting a camera device, the projection device and the network card, testing an external trigger function between the projection device and the camera device, testing a facula projection function of the projection device, and testing an image acquisition function of the camera device;
the projection angle of the projection device is adjusted, the overlapping part between the projected images is ensured, the shooting view field range of the camera device is adjusted, and the camera device can shoot the projection view field of the projection device completely.
4. The orthogonal multi-projection aliased image non-split structured light three-dimensional measurement method of claim 1, wherein solving the gamma value of the measurement platform based on the phase-shifted image comprises:
solving the gamma value based on a duty-cycle phase chord distribution coding method, calculating the gamma value corresponding to each wrapped-phase chord curve, summing and averaging these gamma values, and taking the average as the gamma value of each wrapped-phase period.
5. The orthogonal multi-projection aliased image non-split structured light three-dimensional measurement method according to claim 1, wherein determining the internal and external parameters of the image pickup device and the projection device according to a polygonal prism-shaped calibration block calibration method comprises:
acquiring initial estimation of a transformation matrix between the camera devices through a polyhedral prismatic calibration block, and minimizing errors based on three-dimensional constraint through reconstructed three-dimensional standard component point clouds; and establishing a world coordinate system on a reference camera coordinate system in the reconstruction process, realizing multi-view point cloud fusion based on a position conversion matrix from other camera devices to the reference camera device, and acquiring internal and external parameters of the camera device and the projection device.
6. The orthogonal multi-projection aliased image non-split structured light three-dimensional measurement method of claim 1, wherein the aliased image I_c is:

I_c = a + Σ_{k=1}^{P} b_k cos(ω_k x_k + δ_k)

wherein b_k, ω_k and δ_k respectively represent the contrast, spatial angular frequency and phase shift of the fringe image projected by the k-th projector, the value a comprises the ambient illumination and the total contribution of the plurality of projectors, P is the number of aliased images (projectors), and x_k represents the lateral coordinate of the projected image.
7. The orthogonal multi-projection aliased image non-separated structured light three-dimensional measurement method of claim 1, wherein the relative phase of the aliased image is solved based on a temporal discrete Fourier transform, and the absolute phase of the aliased image is solved using a robust spatial phase unwrapping method;
the spatial phase unwrapping is performed by a weighted least squares phase unwrapping method, including:
firstly, preprocessing a shot aliasing image, sampling the preprocessed image according to the density of sampling points, and calculating the phase value and the signal-to-noise ratio of each sampling point;
and constructing a weight matrix based on the signal-to-noise ratio and the phase difference information, and carrying out weighted average on the phase difference between adjacent pixels to obtain a continuous phase expansion result.
8. The orthogonal multi-projection aliased image non-separated structured light three-dimensional measurement method according to claim 1, wherein fusing the point cloud information according to calibration parameters of the measurement platform comprises:
performing coarse matching on the point cloud information with the PCA algorithm, reducing the dimensionality of the data set and representing the point cloud information by the retained features of greatest contribution;
calculating the corresponding rotation and translation vectors from the principal directions of the point cloud information, then iteratively updating the closest-point matching between the source point cloud and the target point cloud with the ICP matching algorithm and solving the transformation to obtain a globally optimal transformation matrix;
and transforming the source point cloud into the target point cloud frame based on the globally optimal transformation matrix, evaluating the distance between adjacent points of the fused clouds by the Euclidean distance, and deleting points below a given threshold, and so on until all the point clouds are fused.
CN202310431122.5A 2023-04-21 2023-04-21 Orthogonal multi-projection aliasing image non-separation structured light three-dimensional measurement method Pending CN116429014A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310431122.5A CN116429014A (en) 2023-04-21 2023-04-21 Orthogonal multi-projection aliasing image non-separation structured light three-dimensional measurement method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310431122.5A CN116429014A (en) 2023-04-21 2023-04-21 Orthogonal multi-projection aliasing image non-separation structured light three-dimensional measurement method

Publications (1)

Publication Number Publication Date
CN116429014A true CN116429014A (en) 2023-07-14

Family

ID=87090648

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310431122.5A Pending CN116429014A (en) 2023-04-21 2023-04-21 Orthogonal multi-projection aliasing image non-separation structured light three-dimensional measurement method

Country Status (1)

Country Link
CN (1) CN116429014A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination