CN115876083A - Mobile projection type three-dimensional measurement method and device - Google Patents


Info

Publication number
CN115876083A
CN115876083A (application CN202211540888.9A)
Authority
CN
China
Prior art keywords
light, plane, structured light, line structured, line
Prior art date
Legal status
Pending
Application number
CN202211540888.9A
Other languages
Chinese (zh)
Inventor
张效栋
王倩雯
房长帅
陈杨
Current Assignee
Tianjin University
Original Assignee
Tianjin University
Priority date
Filing date
Publication date
Application filed by Tianjin University filed Critical Tianjin University
Priority to CN202211540888.9A priority Critical patent/CN115876083A/en
Publication of CN115876083A publication Critical patent/CN115876083A/en
Pending legal-status Critical Current

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to the field of mobile projection type three-dimensional measurement, and in particular to a mobile projection type three-dimensional measurement method and device. The method comprises: acquiring a multi-line structured light stripe image of the object to be measured with a DLP projector; obtaining the pixel coordinates of the object from the multi-line structured light stripe image by the gray-scale centroid method; obtaining the three-dimensional coordinates of the object from those pixel coordinates and a set of pre-calibrated light planes; and taking the three-dimensional coordinates as the three-dimensional measurement result. The multiple light planes projected by the DLP projector are calibrated from a small number of individually calibrated light planes, so the method achieves high measurement efficiency, a simple structure and low system cost while maintaining adequate accuracy.

Description

Mobile projection type three-dimensional measurement method and device
Technical Field
The invention relates to the field of mobile projection type three-dimensional measurement, in particular to a mobile projection type three-dimensional measurement method and a mobile projection type three-dimensional measurement device.
Background
Many three-dimensional measurement methods for industrial parts exist, but few are simultaneously fast, accurate and inexpensive. Contact methods, such as the coordinate measuring machine, can accurately measure the size and form of complex workpieces, but they rely on point-by-point probe scanning, which is slow and inefficient, may damage the object surface, and requires bulky equipment and a demanding application environment. Among non-contact methods, optical measurement is the main focus of research and is divided into active and passive approaches. The structured light method is the most important and widely used active non-contact technique, offering many feature points, high measurement accuracy and strong immunity to interference. It projects structured light with known features onto the object surface to form a light field there and, combined with the triangulation principle, resolves the light field information into depth information, yielding the three-dimensional structure of the surface. Depending on the light source, structured light measurement is classified into point, stripe and area types. Point-type structured light projects a single point onto the surface to obtain depth information, so recovering the whole surface requires point-by-point scanning. The stripe-type (line structured light) method uses a line source, reducing the scanning dimension to one compared with the point type.
The accuracy of the conventional stripe-type structured light method is of the same order of magnitude as the point type, but line structured light is limited by the laser source: three-dimensional measurement of the object can only be completed with a high-precision displacement stage, so the method suffers from low measurement speed, large size and high cost.
Disclosure of Invention
To address the shortcomings of the prior art, the present invention provides a mobile projection type three-dimensional measurement method and device that measure a three-dimensional object rapidly by means of a novel multi-light-plane calibration method.
In order to achieve the above object, the present invention provides a mobile projection type three-dimensional measurement method, including:
S1, acquiring a multi-line structured light stripe image of an object to be measured with a DLP (digital light processing) projector;
S2, obtaining pixel coordinates of the object to be measured from the multi-line structured light stripe image by the gray-scale centroid (center-of-gravity) method;
S3, obtaining the three-dimensional coordinates of the object to be measured from its pixel coordinates and pre-calibrated multiple light planes;
S4, taking the three-dimensional coordinates of the object to be measured as its three-dimensional measurement result.
Preferably, acquiring the multi-line structured light stripe image of the object to be measured with the DLP projector includes:
projecting multi-line structured light stripes onto the surface of the object with the DLP projector to obtain a first multi-line structured light stripe image of the object;
shifting the pixel columns at gray level 255 in the projected pattern of the DLP projector and capturing again to obtain a second multi-line structured light stripe image of the object;
using the first and second multi-line structured light stripe images together as the multi-line structured light stripe image of the object to be measured.
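The two-image acquisition above can be illustrated with a small sketch that generates an equally spaced stripe pattern (columns at gray level 255) and a column-shifted copy of it. This is a minimal illustration, not the patent's projector control code; the period, stripe width and shift values are arbitrary assumptions:

```python
import numpy as np

def stripe_pattern(width, height, period, stripe_width, shift=0):
    """Binary multi-line stripe pattern: columns whose (shifted) position
    modulo the period falls inside the stripe width are set to gray 255."""
    img = np.zeros((height, width), dtype=np.uint8)
    bright = (np.arange(width) - shift) % period < stripe_width
    img[:, bright] = 255
    return img

# First pattern and a copy translated by one pixel column
first = stripe_pattern(64, 8, period=10, stripe_width=2)
second = stripe_pattern(64, 8, period=10, stripe_width=2, shift=1)
```

Projecting both patterns doubles the sampled stripe positions on the object surface without any mechanical motion.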
Preferably, obtaining the pixel coordinates of the object to be measured from the multi-line structured light stripe image by the gray-scale centroid method includes:
extracting, by the gray-scale centroid method, the stripe center lines of the multi-line structured light stripe image as modulated by the object surface;
taking the stripe center lines of the multi-line structured light stripe image as the pixel coordinates of the object to be measured.
Preferably, the pre-calibration of the multiple light planes comprises:
S3-1, projecting equally spaced multi-line structured light stripes with the DLP projector;
S3-2, establishing an initial multi-light-plane relation from the pixel position of each stripe in the projected pattern and the light plane formed by the projection of each stripe in the equally spaced multi-line structured light stripes;
S3-3, selecting a subset of the projected light planes as basic light planes;
S3-4, performing single-light-plane calibration on each of the basic light planes;
S3-5, substituting the single-light-plane parameters into the initial multi-light-plane relation and solving it, completing the initial multi-light-plane calibration;
S3-6, computing a light-plane optimization formula from standard-sphere diameter data measured with the initially calibrated light planes;
S3-7, obtaining the final multi-light-plane relation from the light-plane optimization formula and the initial relation, completing the multi-light-plane calibration.
Further, the single-light-plane calibration includes:
S3-5-1, performing basic calibration of the double telecentric lens to obtain a first conversion relation;
S3-5-2, capturing a target image when the checkerboard on the calibration target surface is sharply imaged;
S3-5-3, projecting single-line structured light onto the calibration target surface with the DLP projector to obtain the intersection line of the light plane and the target plane;
S3-5-4, capturing the stripe image of the single-line structured light and the corresponding checkerboard image;
S3-5-5, raising the calibration target surface by d mm and repeating S3-5-2 to S3-5-4 to obtain n groups of single-light-plane initial images;
S3-5-6, obtaining the rotation matrix and translation vector from the homography matrix;
S3-5-7, obtaining a second homography matrix from the world coordinates of the checkerboard corners and the corresponding corner pixel coordinates in the single-light-plane initial images, based on the DLT (direct linear transform) algorithm;
S3-5-8, obtaining a second conversion relation from the second homography matrix;
S3-5-9, fitting the target plane from the checkerboard corners by least squares;
S3-5-10, obtaining the camera coordinates of the stripe center line from the pixel coordinates of the single-line structured light, using the camera intrinsic parameters and the target plane equation corresponding to that single-line structured light;
S3-5-11, fitting the camera coordinates of the stripe center lines to obtain a single light plane;
S3-5-12, obtaining the plane equation of the single light plane, completing the single-light-plane calibration.
Further, the basic calibration of the double telecentric lens yields a first conversion relation of the form:

[u_1, v_1, 1]^T = H_1 · [X_w1, Y_w1, 1]^T

where (X_w1, Y_w1) are world coordinates, (u_1, v_1) are pixel coordinates, R_1 is the rotation matrix and T_1 the translation vector (absorbed into H_1 together with the lens parameters), and H_1 is the first homography matrix.
Further, the initial multi-light-plane relation established from the pixel position of each stripe in the projected pattern and the light plane formed by the projection of each stripe in the equally spaced multi-line structured light stripes is:

A = f_1(u) = a_1·u^n + a_2·u^(n-1) + … + a_{n+1}
B = f_2(u) = b_1·u^n + b_2·u^(n-1) + … + b_{n+1}
C = f_3(u) = c_1·u^n + c_2·u^(n-1) + … + c_{n+1}
D = f_4(u) = d_1·u^n + d_2·u^(n-1) + … + d_{n+1}

where A, B, C, D are the four parameters of the light plane equation Ax + By + Cz + D = 0, the a_i, b_i, c_i, d_i are polynomial coefficients, u is the pixel column in the projected pattern where the projected stripe lies, and n is the degree of the polynomials.
Further, the light-plane optimization formula computed from the standard-sphere diameter data measured with the initially calibrated light planes is:

F = || R − R_i(a_j, b_j, c_j, d_j) ||²,  j = 1, 2, …, 6

where R is the nominal radius of the standard sphere, R_i is the sphere radius computed from the coefficients obtained by polynomial fitting, and a_j, b_j, c_j, d_j are the polynomial coefficients.
Preferably, obtaining the three-dimensional coordinates of the object to be measured from its pixel coordinates and the pre-calibrated multiple light planes includes:
substituting the pixel coordinates of the object to be measured into the corresponding pre-calibrated light-plane relation to obtain the three-dimensional coordinates of the object.
Based on the same inventive concept, the invention also provides a mobile projection type three-dimensional measurement device, comprising a DLP projector, a camera with a double telecentric lens, a DLP mounting plate, a camera mounting plate, an adapter plate, a camera base plate, a base plate and a gantry;
the DLP projector is fixed to the base plate through the DLP mounting plate, the camera and double telecentric lens are fixed to the base plate through the camera mounting plate, the adapter plate and the camera base plate, and the base plate is fixed to the gantry.
Compared with the closest prior art, the invention has the following beneficial effects:
the mobile projection method projects a large number of light stripes, which can be translated to achieve full-resolution measurement, effectively improving measurement efficiency; the multiple light planes are calibrated on the basis of the DLP projector, so the method achieves high measurement efficiency, a simple structure and low system cost while maintaining adequate accuracy.
Drawings
FIG. 1 is a flow chart of a mobile projection type three-dimensional measurement method provided by the present invention;
FIG. 2 is a schematic diagram of a single optical plane calibration process of a mobile projection type three-dimensional measurement method provided by the invention;
FIG. 3 is a schematic diagram of a mobile projection-type three-dimensional measuring device provided by the present invention;
FIG. 4 is a schematic view of a light bar translation of a mobile projection type three-dimensional measuring apparatus according to the present invention;
Reference numerals:
1. DLP projector; 2. camera with double telecentric lens; 3. DLP mounting plate; 4. camera mounting plate; 5. adapter plate; 6. camera base plate; 7. base plate; 8. gantry; 9. standard sphere; 10. support; 11. part to be measured.
Detailed Description
The following describes embodiments of the present invention in further detail with reference to the accompanying drawings.
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example 1:
The invention provides a mobile projection type three-dimensional measurement method, as shown in FIG. 1, comprising the following steps:
S1, acquiring a multi-line structured light stripe image of the object to be measured with a DLP projector;
S2, obtaining the pixel coordinates of the object to be measured from the multi-line structured light stripe image by the gray-scale centroid method;
S3, obtaining the three-dimensional coordinates of the object to be measured from its pixel coordinates and the pre-calibrated multiple light planes;
S4, taking the three-dimensional coordinates of the object to be measured as its three-dimensional measurement result.
S1 specifically comprises:
S1-1, projecting multi-line structured light stripes onto the surface of the object with the DLP projector to obtain a first multi-line structured light stripe image of the object;
S1-2, shifting the pixel columns at gray level 255 in the projected pattern and capturing again to obtain a second multi-line structured light stripe image of the object;
S1-3, using the first and second multi-line structured light stripe images together as the multi-line structured light stripe image of the object to be measured.
S2 specifically comprises:
S2-1, extracting, by the gray-scale centroid method, the stripe center lines of the multi-line structured light stripe image as modulated by the object surface;
S2-2, taking the stripe center lines of the multi-line structured light stripe image as the pixel coordinates of the object to be measured.
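The gray-scale centroid extraction of S2-1 can be sketched as follows: for each image row, the sub-pixel column of the stripe center is the intensity-weighted mean of the columns whose gray level clears a threshold. This is a minimal single-stripe sketch; the threshold value and the handling of one stripe per row are illustrative assumptions:

```python
import numpy as np

def gray_centroid_centerline(img, threshold=30):
    """Sub-pixel stripe center per row via the gray-scale centroid method.

    img: 2-D array of gray levels with one (roughly vertical) light stripe.
    Returns one center column per row (NaN where no pixel clears threshold).
    """
    img = np.asarray(img, dtype=float)
    centers = np.full(img.shape[0], np.nan)
    for r, row in enumerate(img):
        cols = np.nonzero(row >= threshold)[0]
        if cols.size:
            w = row[cols]                       # gray levels act as weights
            centers[r] = (cols * w).sum() / w.sum()
    return centers

# A synthetic stripe peaking at column 3 yields a center of exactly 3.0
row = [0, 0, 10, 20, 10, 0, 0]
print(gray_centroid_centerline([row], threshold=5)[0])  # 3.0
```

For the multi-line case, each stripe region would first be segmented (e.g. by connected runs of bright pixels) and the same centroid computed per run.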
S3 specifically comprises:
S3-1, projecting equally spaced multi-line structured light stripes with the DLP projector;
S3-2, establishing an initial multi-light-plane relation from the pixel position of each stripe in the projected pattern and the light plane formed by the projection of each stripe;
S3-3, selecting a subset of the projected light planes as basic light planes;
S3-4, performing single-light-plane calibration on each of the basic light planes;
S3-5, substituting the single-light-plane parameters into the initial multi-light-plane relation and solving it, completing the initial multi-light-plane calibration;
S3-6, computing a light-plane optimization formula from standard-sphere diameter data measured with the initially calibrated light planes;
S3-7, obtaining the final multi-light-plane relation from the light-plane optimization formula and the initial relation, completing the multi-light-plane calibration.
S3-8, acquiring the multi-line structured light stripe image modulated by the object surface and extracting the pixel coordinates of each stripe center line to obtain the pixel coordinates of the object to be measured.
S3-9, substituting the surface pixel coordinates of the object into the corresponding pre-calibrated light-plane relation to obtain the three-dimensional coordinates of the object.
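Step S3-9 — back-projecting a center-line pixel onto its calibrated light plane — can be sketched as below. Since the device uses a double telecentric lens, the sketch assumes a parallel-ray (affine) camera model in which x and y follow directly from the pixel and z comes from the plane equation; the principal point and scale values are illustrative assumptions, not values from the patent:

```python
def pixel_to_xyz(u, v, plane, u0=0.0, v0=0.0, scale=1.0):
    """Back-project a stripe-center pixel onto its calibrated light plane
    A*x + B*y + C*z + D = 0 under a telecentric (parallel-ray) model."""
    A, B, C, D = plane
    x = (u - u0) * scale               # lateral position from the pixel
    y = (v - v0) * scale
    z = -(A * x + B * y + D) / C       # requires C != 0 (plane not parallel to z axis)
    return x, y, z

# A pixel viewed against the plane z = 10 (A=B=0, C=1, D=-10)
pt = pixel_to_xyz(2.0, 3.0, (0.0, 0.0, 1.0, -10.0))  # -> (2.0, 3.0, 10.0)
```

With a pinhole camera the same idea holds, except the line of sight is a ray through the optical center rather than a parallel ray.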
In this embodiment, the single-light-plane calibration flow of the mobile projection three-dimensional measurement method is shown in FIG. 2.
In this embodiment, because of factors such as the accuracy of the displacement stage, the calibration result deviates somewhat from the theoretical value; the calibrated light planes are therefore used to measure the diameter of the standard sphere, and the relation is optimized nonlinearly with the Levenberg-Marquardt algorithm.
The formulas of S3-2 are as follows:

A = f_1(u) = a_1·u^n + a_2·u^(n-1) + … + a_{n+1}
B = f_2(u) = b_1·u^n + b_2·u^(n-1) + … + b_{n+1}
C = f_3(u) = c_1·u^n + c_2·u^(n-1) + … + c_{n+1}
D = f_4(u) = d_1·u^n + d_2·u^(n-1) + … + d_{n+1}

where A, B, C, D are the four parameters of the light plane equation Ax + By + Cz + D = 0, the a_i, b_i, c_i, d_i are polynomial coefficients, u is the pixel column in the projected pattern where the projected stripe lies, and n is the degree of the polynomials.
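The four polynomial relations A(u) … D(u) can be evaluated with numpy's `polyval`, whose coefficient ordering (highest power first) matches the a_1·u^n + … + a_{n+1} convention used here. The sample coefficients below are illustrative only, not calibration values from the patent:

```python
import numpy as np

def plane_params(u, coeff_rows):
    """Evaluate the four light-plane parameters A, B, C, D at projector
    column u. coeff_rows: four coefficient sequences, each ordered
    highest power first, i.e. [a_1, a_2, ..., a_{n+1}]."""
    return tuple(float(np.polyval(c, u)) for c in coeff_rows)

# Degree-1 example: A = 2u + 1, B = 0, C = 1, D = -u
coeffs = [[2.0, 1.0], [0.0, 0.0], [0.0, 1.0], [-1.0, 0.0]]
A, B, C, D = plane_params(5.0, coeffs)  # -> (11.0, 0.0, 1.0, -5.0)
```

Conversely, fitting the coefficients from a few individually calibrated planes is a call to `np.polyfit` per parameter, which is what the multi-light-plane initial calibration amounts to.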
S3-5 specifically comprises:
S3-5-1, performing basic calibration of the double telecentric lens to obtain a first conversion relation;
S3-5-2, capturing a target image when the checkerboard on the calibration target surface is sharply imaged;
S3-5-3, projecting single-line structured light onto the calibration target surface with the DLP projector to obtain the intersection line of the light plane and the target plane;
S3-5-4, capturing the stripe image of the single-line structured light and the corresponding checkerboard image;
S3-5-5, raising the calibration target surface by d mm and repeating S3-5-2 to S3-5-4 to obtain n groups of single-light-plane initial images;
S3-5-6, obtaining the rotation matrix and translation vector from the homography matrix;
S3-5-7, obtaining a second homography matrix from the world coordinates of the checkerboard corners and the corresponding corner pixel coordinates in the single-light-plane initial images, based on the DLT (direct linear transform) algorithm;
S3-5-8, obtaining a second conversion relation from the second homography matrix;
S3-5-9, fitting the target plane from the checkerboard corners by least squares;
S3-5-10, obtaining the camera coordinates of the stripe center line from the pixel coordinates of the single-line structured light, using the camera intrinsic parameters and the target plane equation corresponding to that single-line structured light;
S3-5-11, fitting the camera coordinates of the stripe center lines to obtain a single light plane;
S3-5-12, obtaining the plane equation of the single light plane, completing the single-light-plane calibration.
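The plane fit of S3-5-11 — fitting the camera-frame center-line points collected at several target heights to one light plane — is a standard total-least-squares problem. A minimal sketch using SVD (the normal of the best-fit plane is the singular vector with the smallest singular value):

```python
import numpy as np

def fit_plane(points):
    """Fit A*x + B*y + C*z + D = 0 to an (N, 3) array of points by total
    least squares: SVD of the centered points gives the unit normal."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    A, B, C = vt[-1]                     # unit normal of the plane
    D = -float(vt[-1] @ centroid)        # plane passes through the centroid
    return float(A), float(B), float(C), D

# Four points on the plane z = 5 recover (0, 0, ±1, ∓5)
pts = [(0, 0, 5), (1, 0, 5), (0, 1, 5), (1, 1, 5)]
A, B, C, D = fit_plane(pts)
```

The sign of the normal is arbitrary; the plane equation itself is identical either way.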
The formula of S3-5-1 is of the form:

[u_1, v_1, 1]^T = H_1 · [X_w1, Y_w1, 1]^T

where (X_w1, Y_w1) are world coordinates, (u_1, v_1) are pixel coordinates, R_1 is the rotation matrix and T_1 the translation vector (absorbed into H_1 together with the lens parameters), and H_1 is the first homography matrix.
In this embodiment, in the mobile projection three-dimensional measurement method, after the plane position of the calibration target surface is adjusted to move up by 0.25mm, S3-5-2 to S3-5-3 are repeated to obtain 25 sets of single-light plane initial images.
In this embodiment, the relation of the second homography matrix in S3-5-7 is of the form:

[u_2, v_2, 1]^T = H_2 · [X_w2, Y_w2, 1]^T

where (X_w2, Y_w2) are world coordinates, (u_2, v_2) are pixel coordinates, R_2 is the rotation matrix, T_2 the translation vector, and H_2 the second homography matrix.
In this embodiment, the second conversion relation in S3-5-8 is:

[x_c, y_c, z_c]^T = R_2 · [X_w2, Y_w2, Z_w2]^T + T_2

where (x_c, y_c, z_c) are the coordinates of the checkerboard corners on the calibration target surface in the camera coordinate system, (X_w2, Y_w2, Z_w2) are the world coordinates of the corners, R_2 is the rotation matrix and T_2 the translation vector.
In this embodiment, the target plane equation in S3-5-10 is:

A_i·x_c + B_i·y_c + C_i·z_c + D_i = 0

where A_i, B_i, C_i, D_i are the target plane equation parameters.
In this embodiment, the camera coordinates of the stripe center line of the single-line structured light are computed from its pixel coordinates by an affine relation of the form:

x_Lc = (u_L − u_0) / k_x,  y_Lc = (v_L − v_0) / k_y

where (x_Lc, y_Lc) are the camera coordinates of each point on the stripe center line, (u_L, v_L) are the pixel coordinates of each point, and the principal point (u_0, v_0) and scale factors (k_x, k_y) come from the camera intrinsic parameters (the affine form follows the telecentric imaging model).
In this embodiment, the single-light-plane calibration formula of S3-5-12 is:

A·x_Lc + B·y_Lc + C·z_Lc + D = 0

where (x_Lc, y_Lc, z_Lc) are the complete camera coordinates of the stripe center line and A, B, C, D are the single-light-plane parameters.
The formula of S3-7 is:

F = || R − R_i(a_j, b_j, c_j, d_j) ||²,  j = 1, 2, …, 6

where R is the nominal radius of the standard sphere, R_i is the sphere radius computed from the coefficients obtained by polynomial fitting, and a_j, b_j, c_j, d_j are the polynomial coefficients.
In this embodiment, R_i and R are substituted into the initial stripe light-plane relation to obtain the final light-plane relation.
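The Levenberg-Marquardt refinement against the standard sphere can be sketched with `scipy.optimize.least_squares` (method="lm"). Here the residual is the distance of each measured point from the sphere surface; in the patent the free parameters are the light-plane polynomial coefficients rather than the sphere center, so this is only an illustration of the optimization step, not the patent's objective:

```python
import numpy as np
from scipy.optimize import least_squares

def fit_sphere_lm(points, r_nominal):
    """Refine sphere center and radius by Levenberg-Marquardt, starting
    from the point centroid and the nominal standard-sphere radius."""
    pts = np.asarray(points, dtype=float)

    def residuals(p):
        center, r = p[:3], p[3]
        # distance of each point from the current sphere surface
        return np.linalg.norm(pts - center, axis=1) - r

    x0 = np.append(pts.mean(axis=0), r_nominal)
    sol = least_squares(residuals, x0, method="lm")
    return sol.x[:3], sol.x[3]

# Six exact points on a radius-2 sphere recover center (0,0,0) and r = 2
pts = [(2, 0, 0), (-2, 0, 0), (0, 2, 0), (0, -2, 0), (0, 0, 2), (0, 0, -2)]
center, r = fit_sphere_lm(pts, r_nominal=1.5)
```

Minimizing the radius error over the plane coefficients works the same way: the residual function would recompute each sphere fit from the candidate coefficients and return R − R_i.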
Example 2:
The invention provides a mobile projection type three-dimensional measurement device, comprising a DLP projector 1, a camera 2 with a double telecentric lens, a DLP mounting plate 3, a camera mounting plate 4, an adapter plate 5, a camera base plate 6, a base plate 7 and a gantry 8;
the DLP projector 1 is fixed to the base plate 7 through the DLP mounting plate 3, the camera 2 with the double telecentric lens is fixed to the base plate 7 through the camera mounting plate 4, the adapter plate 5 and the camera base plate 6, and the base plate 7 is fixed to the gantry 8; the standard sphere 9 is placed on the support 10, and the part to be measured 11 is placed below the camera 2 with the double telecentric lens.
The working process of this embodiment is as follows:
Step 1, place the part to be measured 11 on the stage;
Step 2, as shown in FIG. 4, the DLP projector 1 projects multi-line structured light stripes onto the surface of the part; during measurement, the pixel columns at gray level 255 in the projected pattern are shifted so that the stripes translate continuously across the surface, increasing the sampling rate, and the camera captures an image at each position, denoted p;
Step 3, extract the stripe center lines in p by the gray-scale centroid method to obtain the stripe pixel coordinates, then obtain the three-dimensional coordinates of the stripes in the camera coordinate system from the multi-light-plane calibration model, completing the dimensional measurement.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solutions of the present invention and not for limiting the same, and although the present invention is described in detail with reference to the above embodiments, those of ordinary skill in the art should understand that: modifications and equivalents may be made to the embodiments of the invention without departing from the spirit and scope of the invention, which is to be covered by the claims.

Claims (10)

1. A mobile projection type three-dimensional measurement method, characterized by comprising the following steps:
S1, acquiring a multi-line structured light stripe image of an object to be measured with a DLP projector;
S2, obtaining pixel coordinates of the object to be measured from the multi-line structured light stripe image by the gray-scale centroid method;
S3, obtaining the three-dimensional coordinates of the object to be measured from its pixel coordinates and pre-calibrated multiple light planes;
S4, taking the three-dimensional coordinates of the object to be measured as its three-dimensional measurement result.
2. The method of claim 1, wherein acquiring the multi-line structured light stripe image of the object to be measured with the DLP projector comprises:
projecting multi-line structured light stripes onto the surface of the object with the DLP projector to obtain a first multi-line structured light stripe image of the object;
shifting the pixel columns at gray level 255 in the projected pattern of the DLP projector and capturing again to obtain a second multi-line structured light stripe image of the object;
using the first and second multi-line structured light stripe images together as the multi-line structured light stripe image of the object to be measured.
3. The method of claim 1, wherein obtaining the pixel coordinates of the object to be measured by the gray-scale centroid method comprises:
extracting, by the gray-scale centroid method, the stripe center lines of the multi-line structured light stripe image modulated by the object surface;
obtaining the pixel coordinates of the object to be measured from those stripe center lines.
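The gray-scale centroid (center-of-gravity) extraction named in this claim can be sketched as follows; a minimal NumPy version that assumes a roughly vertical stripe processed column by column, with a hypothetical background-suppression threshold not specified in the patent:

```python
import numpy as np

def stripe_centerline(gray, threshold=30):
    """Extract the sub-pixel center line of a vertical light stripe,
    column by column, with the gray-scale centroid method.
    Returns (col, row_center) pairs for columns with signal above threshold."""
    h, w = gray.shape
    rows = np.arange(h, dtype=np.float64)
    centers = []
    for col in range(w):
        g = gray[:, col].astype(np.float64)
        g = np.where(g >= threshold, g, 0.0)        # suppress background
        s = g.sum()
        if s > 0:
            # intensity-weighted mean of the row index = centroid
            centers.append((col, float((g * rows).sum() / s)))
    return centers
```

For a real multi-line image the columns would first be partitioned per stripe; this sketch handles one stripe per column.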
4. The mobile projection-type three-dimensional measurement method of claim 1, wherein the pre-calibration of the multiple light planes comprises:
S3-1. projecting equally spaced multi-line structured light stripes with the DLP optical engine;
S3-2. establishing an initial multi-light-plane relation from the pixel position of each stripe in the projection pattern and the light plane formed by projecting that stripe;
S3-3. taking a subset of the projected light planes as base light planes;
S3-4. performing single-light-plane calibration on each base light plane;
S3-5. substituting the single-light-plane parameters into the initial multi-light-plane relation and solving it, completing the initial multi-light-plane calibration;
S3-6. computing a light-plane optimization formula from standard-sphere diameter data measured with the initially calibrated light planes;
S3-7. combining the light-plane optimization formula with the initial multi-light-plane relation to obtain the final multi-light-plane relation, completing the multi-light-plane calibration.
5. The mobile projection-type three-dimensional measurement method of claim 4, wherein the single-light-plane calibration comprises:
S3-5-1. performing basic calibration of the double telecentric lens to obtain a first conversion relation;
S3-5-2. capturing a target image when the checkerboard on the calibration target surface is in sharp focus;
S3-5-3. projecting single-line structured light onto the calibration target surface with the DLP optical engine to obtain the intersection line of the light plane and the target plane;
S3-5-4. capturing the light-stripe image of the single-line structured light and the corresponding checkerboard image;
S3-5-5. raising the calibration target surface by d mm and repeating S3-5-2 to S3-5-3 to obtain n groups of initial single-light-plane images;
S3-5-6. obtaining a rotation matrix and a translation vector from the homography matrix;
S3-5-7. obtaining a second homography matrix with the DLT algorithm from the world coordinates of the checkerboard corners and the corresponding pixel coordinates in the initial single-light-plane images;
S3-5-8. obtaining a second conversion relation from the second homography matrix;
S3-5-9. fitting the target plane to the checkerboard corners by least squares;
S3-5-10. computing the camera coordinates of the light-stripe center line from the pixel coordinates of the single-line structured light, using the camera intrinsics and the target-plane equation corresponding to that single-line structured light;
S3-5-11. fitting the camera coordinates of the light-stripe center line to obtain a single light plane;
S3-5-12. deriving the plane equation of that single light plane, completing the single-light-plane calibration.
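Step S3-5-11, fitting a single light plane to the center-line camera coordinates, can be sketched with an SVD-based least-squares plane fit; the patent does not name the fitting method, so orthogonal (total) least squares is assumed here:

```python
import numpy as np

def fit_light_plane(points):
    """Least-squares fit of a plane A x + B y + C z + D = 0 to the
    light-stripe center-line points (N x 3 array of camera coordinates).
    The right singular vector with the smallest singular value of the
    mean-centered points is the plane normal."""
    pts = np.asarray(points, dtype=np.float64)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]                    # direction of least variance
    a, b, c = normal
    d = -float(normal @ centroid)      # plane passes through the centroid
    return float(a), float(b), float(c), d
```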
6. The method of claim 5, wherein the first conversion relation obtained by the basic calibration of the double telecentric lens is:

(u1, v1, 1)^T = H1 (Xw1, Yw1, 1)^T

where (Xw1, Yw1) are world coordinates, (u1, v1) are pixel coordinates, R1 is the rotation matrix, T1 is the translation vector, and H1, composed of R1 and T1, is the first homography matrix.
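Under the relation of claim 6, a target-plane world coordinate can be recovered from a pixel by inverting H1; a sketch assuming H1 is a generic invertible 3x3 homography (the published text only reproduces the symbol list, not the exact factorization of H1 from R1 and T1):

```python
import numpy as np

def pixel_to_world(H1, u, v):
    """Map a pixel (u1, v1) back to a target-plane world coordinate
    (Xw1, Yw1) by inverting the first homography H1."""
    p = np.linalg.inv(H1) @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]    # de-homogenize
```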
7. The method of claim 4, wherein the initial multi-light-plane relation established from the pixel position of each stripe in the projection pattern and the light plane formed by projecting that stripe is:

A = f1(u) = a_1 u^n + a_2 u^(n-1) + … + a_(n+1)
B = f2(u) = b_1 u^n + b_2 u^(n-1) + … + b_(n+1)
C = f3(u) = c_1 u^n + c_2 u^(n-1) + … + c_(n+1)
D = f4(u) = d_1 u^n + d_2 u^(n-1) + … + d_(n+1)

where A, B, C, D are the four parameters of the light-plane equation Ax + By + Cz + D = 0; a_i, b_i, c_i, d_i are the polynomial coefficients; u is the pixel column of the projected stripe in the projection pattern; and n is the degree of the polynomials.
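Claim 7's per-parameter polynomials can be fitted from the few calibrated base planes and then evaluated at any stripe column u; a sketch using NumPy's polynomial routines (function names are illustrative):

```python
import numpy as np

def fit_param_polys(u_cols, planes, deg):
    """Fit one degree-`deg` polynomial per plane parameter, so that the
    plane A x + B y + C z + D = 0 of any stripe can be interpolated from
    its pixel column u (the relations A = f1(u) ... D = f4(u)).
    `planes` holds (A, B, C, D) for each calibrated base plane."""
    planes = np.asarray(planes, dtype=np.float64)
    return [np.polyfit(u_cols, planes[:, k], deg) for k in range(4)]

def plane_at(coeffs, u):
    """Evaluate A, B, C, D for a stripe at pixel column u."""
    return tuple(float(np.polyval(c, u)) for c in coeffs)
```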
8. The mobile projection-type three-dimensional measurement method of claim 4, wherein the light-plane optimization formula computed from the standard-sphere diameter data measured with the initially calibrated light planes is:

F = || R − R_i(a_j, b_j, c_j, d_j) ||^2,  j = 1, 2, …, 6

where R is the certified radius of the standard sphere, R_i is the sphere radius computed from the polynomial-fitted coefficients, and a_j, b_j, c_j, d_j are the polynomial coefficients.
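Evaluating the objective of claim 8 requires reconstructing the radius R_i from measured points on the standard sphere; a sketch using an algebraic least-squares sphere fit (the patent does not state which fitting method it uses):

```python
import numpy as np

def fitted_sphere_radius(points):
    """Algebraic least-squares sphere fit; returns the fitted radius Ri.
    Rearranging |p - c|^2 = R^2 gives the linear system
    |p|^2 = 2 c . p + k with k = R^2 - |c|^2."""
    pts = np.asarray(points, dtype=np.float64)
    A = np.hstack([2 * pts, np.ones((len(pts), 1))])
    b = (pts ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    c, k = sol[:3], sol[3]
    return float(np.sqrt(k + c @ c))

def residual(R_standard, points):
    """F = ||R - Ri||^2 for one set of measured sphere points."""
    return (R_standard - fitted_sphere_radius(points)) ** 2
```

Minimizing F over the polynomial coefficients then refines the light-plane relation, as in steps S3-6 and S3-7.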
9. The mobile projection-type three-dimensional measurement method of claim 1, wherein obtaining the three-dimensional coordinates of the object to be measured from its pixel coordinates based on the pre-calibrated multiple light planes comprises:
substituting the pixel coordinates of the object to be measured into the corresponding pre-calibrated multi-light-plane relation to obtain the three-dimensional coordinates of the object to be measured.
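The substitution in claim 9 amounts to intersecting the viewing ray of each stripe pixel with that stripe's calibrated light plane; a simplified pinhole-model sketch (an assumption for illustration only — the claimed device uses a double telecentric lens, whose viewing rays are parallel rather than passing through a pinhole):

```python
import numpy as np

def pixel_to_3d(u, v, K, plane):
    """Intersect the camera ray through pixel (u, v) with the calibrated
    light plane A x + B y + C z + D = 0 (pinhole model with intrinsics K)."""
    a, b, c, d = plane
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])   # ray direction, camera frame
    t = -d / (a * ray[0] + b * ray[1] + c * ray[2])  # ray-plane intersection
    return t * ray                                   # 3-D point on the plane
```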
10. A mobile projection-type three-dimensional measurement device, characterized by comprising a DLP optical engine (1), a camera (2) with a double telecentric lens, a DLP mounting plate (3), a camera mounting plate (4), an adapter plate (5), a camera base plate (6), a base plate (7), and a gantry (8);
the DLP optical engine (1) is fixed to the base plate (7) through the DLP mounting plate (3); the camera with the double telecentric lens is fixed to the base plate (7) through the camera mounting plate (4), the adapter plate (5), and the camera base plate (6); and the base plate (7) is fixed to the gantry (8).
CN202211540888.9A 2022-12-02 2022-12-02 Mobile projection type three-dimensional measurement method and device Pending CN115876083A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211540888.9A CN115876083A (en) 2022-12-02 2022-12-02 Mobile projection type three-dimensional measurement method and device

Publications (1)

Publication Number Publication Date
CN115876083A true CN115876083A (en) 2023-03-31

Family

ID=85765660

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211540888.9A Pending CN115876083A (en) 2022-12-02 2022-12-02 Mobile projection type three-dimensional measurement method and device

Country Status (1)

Country Link
CN (1) CN115876083A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116817796A (en) * 2023-08-23 2023-09-29 武汉工程大学 Method and device for measuring precision parameters of curved surface workpiece based on double telecentric lenses
CN116817796B (en) * 2023-08-23 2023-11-24 武汉工程大学 Method and device for measuring precision parameters of curved surface workpiece based on double telecentric lenses


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination