CN108981604B - Line laser-based three-dimensional full-view measurement method for precision part - Google Patents


Info

Publication number: CN108981604B (application number CN201810767127.4A)
Authority: CN (China)
Legal status: Active (granted)
Prior art keywords: point cloud, point, dimensional, formula, profile
Other languages: Chinese (zh)
Other versions: CN108981604A
Inventors: 宋丽梅, 孙思远, 郭庆华
Original assignee: Tianjin Polytechnic University
Current assignee: Qingyan Zhongdian (Tianjin) Intelligent Equipment Co., Ltd.
Application filed by Tianjin Polytechnic University; priority to CN201810767127.4A; publication of CN108981604A, followed by grant and publication of CN108981604B.

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01B — MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 — Measuring arrangements characterised by the use of optical techniques
    • G01B 11/002 — Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G01B 11/02 — Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B 11/03 — Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by measuring coordinates of points
    • G01B 11/24 — Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures

Abstract

The invention belongs to the field of three-dimensional machine vision and relates to a line laser-based three-dimensional full-view measurement method for precision parts. The method is based on laser triangulation: a high-speed camera captures changes in the reflected laser stripe, which encode height changes of the object surface, so that the cross-section profile of the object at that instant is acquired. Translation scanning and rotation scanning are then combined, the two-dimensional profile data from different viewing angles are converted into point-cloud data in a single coordinate system using the computed rotation-translation matrices between viewing angles, and finally surface reconstruction is performed on the point cloud to obtain true three-dimensional full-view data of the measured object. Compared with other non-contact measurement methods, the proposed three-dimensional full-view measurement method offers a more flexible measurement mode suited to diverse working conditions, higher measurement accuracy, better point-cloud quality, and improved computational efficiency.

Description

Line laser-based three-dimensional full-view measurement method for precision part
Technical Field
The invention relates to a laser-scanning-based three-dimensional full-view measurement method for precision parts, in particular to a method that converts laser profile information into an actual three-dimensional point cloud.
Background
With the maturing of photoelectric sensors and computer technology, optical three-dimensional measurement is widely applied in industrial inspection, reverse engineering, medical scanning, cultural-relic preservation, military science and technology, and other fields, offering fast acquisition and high precision in the inspection of free-form surfaces. By imaging principle, optical three-dimensional measurement techniques can be classified into laser triangulation, time-of-flight, binocular vision, and spectral confocal methods. For fine part details and complex working environments, laser triangulation is usually adopted. Because each sample from a line laser is one cross-section profile of the object, three-dimensional reconstruction generally synchronizes the line laser with a motion device, determines a world coordinate system, converts the profile data into point-cloud data, and finally registers the point clouds using the computed rotation-translation matrices and fuses them into a complete three-dimensional full-view model of the object. Owing to the surface-material characteristics of the measured object, the profile data contain spikes and blind angles, which complicate point-cloud processing and directly degrade the three-dimensional reconstruction result. To solve the problem of three-dimensional full-view measurement of precision parts, the invention designs a three-dimensional full-view measurement method for precision parts based on the laser-triangulation measurement principle.
Disclosure of Invention
The invention designs a line laser-based three-dimensional full-view measurement method for precision parts that can be applied to high-precision three-dimensional measurement and overcomes the poor point-cloud quality of traditional optical modeling.
The hardware system of the three-dimensional full-view measurement method comprises the following components:
the line laser profile scanner is used for acquiring the profile of the part;
a computer for precision control, contour acquisition and data processing;
the grating ruler and the encoder are used for synchronously triggering the acquisition signals;
a multi-axis scanning platform for fixing the scanner probe and motion control;
a standard gauge block and a calibration ball for calibration.
The invention designs a line laser-based three-dimensional full-view measurement method for precision parts, characterized by comprising the following steps:
step 1: start the line laser sensor used to acquire the precision-part profile and the multi-axis scanning platform used for motion control; acquire the measured object once with the line laser sensor to obtain its cross-section profile P; perform translation scanning of the measured object, i.e. acquire the profile multiple times along one axis, to obtain batch profile data B;
step 2: apply tilt correction to the cross-section profile P of step 1 using the least-squares method; let the measured cross-section profile be P(x_i, y_i), i = 1, 2, …, n, and obtain the compensated cross-section profile P′(x_i, y_i − a·x_i − b) according to formula (1);

a = (n·Σx_i·y_i − Σx_i·Σy_i) / (n·Σx_i² − (Σx_i)²),  b = (Σy_i − a·Σx_i) / n  formula (1)
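Step 2's least-squares tilt correction can be sketched in a few lines. This is a hedged illustration: the function name and the sample profile are ours, and formula (1) is taken as the standard least-squares line fit implied by the compensated profile P′(x_i, y_i − a·x_i − b).

```python
import numpy as np

def tilt_correct(profile):
    """Least-squares tilt correction of a cross-section profile (step 2).

    profile: (n, 2) array of samples (x_i, y_i). Fits y = a*x + b by least
    squares and returns the compensated profile P'(x_i, y_i - a*x_i - b)
    together with (a, b).
    """
    x, y = profile[:, 0], profile[:, 1]
    n = len(x)
    a = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (n * np.sum(x**2) - np.sum(x)**2)
    b = (np.sum(y) - a * np.sum(x)) / n
    return np.column_stack([x, y - a * x - b]), a, b

# A profile lying exactly on the line y = 0.5*x + 2 should flatten to y = 0.
p = np.array([[0.0, 2.0], [1.0, 2.5], [2.0, 3.0], [3.0, 3.5]])
pc, a, b = tilt_correct(p)
```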
step 3: calibrate the x, y, and z axes of the sensor coordinate system using a step gauge block; let the standard profile of the step gauge block be S, and obtain the profile measurement error e_p according to formula (2), taking its minimum over different compensation proportions c;

e_p = min Σ(c·P′ − S)  formula (2)
And 4, step 4: obtaining calibrated profile data cP' through the steps, determining a world coordinate system and a unit of the calibrated profile data according to the coordinate system of the line laser sensor in the step 3, converting the calibrated profile data into three-dimensional point cloud data, fixing an x-axis coordinate corresponding to each point, determining a z-axis coordinate by the size of the step gauge block in the step 3, and determining a y-axis coordinate by the acquisition frequency and the step length of the line laser sensor in the step 1;
step 5: apply bilateral filtering to the three-dimensional point cloud according to formula (3) to remove point-cloud noise; the computed d is the adjustment distance of the point being processed along its normal vector. The required weights W_c and W_s are computed according to formula (4). Let p be a point of P, N(p) the k-neighborhood of p, ‖p − p_i‖ the length of the vector from point p_i to point p, n_p the normal vector at p, and ⟨n_p, p − p_i⟩ the inner product of n_p and the vector from p_i to p;

d = Σ_{p_i∈N(p)} W_c(‖p − p_i‖)·W_s(⟨n_p, p − p_i⟩)·⟨n_p, p − p_i⟩ / Σ_{p_i∈N(p)} W_c(‖p − p_i‖)·W_s(⟨n_p, p − p_i⟩)  formula (3)

W_c(x) = exp(−x²/2σ_c²),  W_s(y) = exp(−y²/2σ_s²)  formula (4)
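Step 5's bilateral filter moves each point by a distance d along its normal. A sketch under the assumption that W_c and W_s are Gaussians (the patent names the weights but their exact form is not recoverable from the scan):

```python
import numpy as np

def bilateral_filter_point(p, n_p, neighbors, sigma_c=1.0, sigma_s=0.1):
    """Bilateral filtering of one point of the cloud (step 5).

    p: point to denoise; n_p: its unit normal; neighbors: (k, 3) array N(p).
    W_c weighs the spatial distance |p - p_i|, W_s the offset along the
    normal <n_p, p_i - p>; both are taken as Gaussians (an assumption).
    Returns p shifted by the adjustment distance d along n_p.
    """
    diff = neighbors - p                     # vectors from p to each p_i
    dist = np.linalg.norm(diff, axis=1)      # |p - p_i|
    h = diff @ n_p                           # offset along the normal
    w = np.exp(-dist**2 / (2 * sigma_c**2)) * np.exp(-h**2 / (2 * sigma_s**2))
    d = np.sum(w * h) / np.sum(w)            # weighted mean offset, formula (3)
    return p + d * n_p

# Neighbours sitting 0.5 above the tangent plane pull the point up by 0.5.
q = bilateral_filter_point(np.zeros(3), np.array([0.0, 0.0, 1.0]),
                           np.array([[1.0, 0.0, 0.5], [-1.0, 0.0, 0.5]]))
```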
step 6: calibrate the rotating shaft of the multi-axis scanning platform using the standard sphere; place a calibration ball of radius R_B on the rotary table of the scanning platform and acquire point-cloud data of part of the spherical surface by the translation-scanning method of step 1, obtaining spherical-surface coordinates (x_i, y_i, z_i); construct a nonlinear equation system according to formula (5);

(x_i − x_n)² + (y_i − y_n)² + (z_i − z_n)² = R_B²  formula (5)
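Step 6's sphere fit can be prototyped with a linear least-squares formulation. This is an assumption about the solver: the patent poses a nonlinear system with R_B known, while here centre and radius are recovered together from the identity |p|² = 2·c·p + (R² − |c|²).

```python
import numpy as np

def fit_sphere(points):
    """Fit a sphere to scanned surface points (step 6), returning (centre, radius)."""
    # Linearised sphere equation: x^2+y^2+z^2 = 2*c.p + (R^2 - |c|^2)
    A = np.column_stack([2 * points, np.ones(len(points))])
    rhs = np.sum(points**2, axis=1)
    sol, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    centre = sol[:3]
    return centre, np.sqrt(sol[3] + centre @ centre)

# Synthetic patch of a sphere centred at (1, 2, 3) with radius 10.
rng = np.random.default_rng(0)
v = rng.normal(size=(200, 3))
v /= np.linalg.norm(v, axis=1, keepdims=True)
centre, radius = fit_sphere(np.array([1.0, 2.0, 3.0]) + 10.0 * v)
```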
step 7: solve for the sphere-centre coordinates O_n(x_n, y_n, z_n) of the spherical surface of step 6; drive the rotary table of step 6 to rotate, measure the three-dimensional data of the spherical surface of step 6 at N (N ≥ 4) positions, obtain N sets of sphere-centre coordinates using step 6, and construct a system of N linear equations from the sphere-centre coordinates according to formula (6);

A·x_n + B·y_n + C·z_n + D = 0  formula (6)

step 8: solve the linear equation system of step 7 and fit the equation P_B of the plane containing the rotation trajectory of the sphere centres O_n of step 6; compute the normal vector u of P_B according to formula (7); substitute the sphere-centre coordinates O_n into formula (8) to obtain the intersection point O_n′(x_n′, y_n′, z_n′) of the rotating shaft with the plane containing the sphere centres;

u = (A, B, C) / √(A² + B² + C²)  formula (7)

O_n′ = O_n − ((A·x_n + B·y_n + C·z_n + D) / (A² + B² + C²))·(A, B, C)  formula (8)
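Steps 7-8 recover the turntable axis from the circle traced by the ball centres. The sketch below uses an SVD plane fit plus a 2-D circle fit; the patent instead solves the linear system A·x_n + B·y_n + C·z_n + D = 0, so treat this as an equivalent-in-spirit assumption, not the patent's exact solver.

```python
import numpy as np

def fit_rotation_axis(centres):
    """Fit axis direction u and axis/plane intersection O' from ball centres (steps 7-8).

    centres: (N, 3) sphere centres measured at N >= 4 turntable positions.
    The centres lie on a circle; its plane normal is the axis direction and
    the circle centre is where the axis pierces that plane.
    """
    mean = centres.mean(axis=0)
    q = centres - mean
    u = np.linalg.svd(q)[2][-1]              # plane normal: smallest singular direction
    e1 = np.cross(u, [1.0, 0.0, 0.0])        # build an in-plane basis (e1, e2)
    if np.linalg.norm(e1) < 1e-8:
        e1 = np.cross(u, [0.0, 1.0, 0.0])
    e1 /= np.linalg.norm(e1)
    e2 = np.cross(u, e1)
    xy = np.column_stack([q @ e1, q @ e2])   # centres expressed in the plane
    A = np.column_stack([2 * xy, np.ones(len(xy))])
    sol, *_ = np.linalg.lstsq(A, np.sum(xy**2, axis=1), rcond=None)
    return u, mean + sol[0] * e1 + sol[1] * e2

# Ball centres on a radius-5 circle about the axis through (1, 1, 2) along z.
t = np.linspace(0.0, 2.0 * np.pi, 6, endpoint=False)
centres = np.column_stack([1 + 5 * np.cos(t), 1 + 5 * np.sin(t), np.full(6, 2.0)])
u, O = fit_rotation_axis(centres)
```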
step 9: let R_u be the rotation matrix of the point cloud to be converted, T(x_n′, y_n′, z_n′) the translation matrix of the point cloud to be converted, and θ the rotation angle; combining u and O_n′ computed in step 8, calculate the coordinates of each view's point cloud in the common world coordinate system using formulas (9) and (10);

R_u = cos θ·I + (1 − cos θ)·u·uᵀ + sin θ·[u]_×  formula (9)

P_w = R_u·(P − T) + T  formula (10)

where I is the 3×3 identity matrix and [u]_× is the cross-product (skew-symmetric) matrix of u;
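Step 9's rotation about the calibrated axis, written with the Rodrigues form of R_u. The exact matrix layout of the patent's formula (9) is not recoverable from the scan, so the standard axis-angle form below is an assumption.

```python
import numpy as np

def rotate_about_axis(points, u, origin, theta):
    """Rotate points by theta about the axis through `origin` with unit direction u (step 9).

    R_u = cos(t)*I + (1 - cos(t))*u*u^T + sin(t)*[u]_x  (Rodrigues formula);
    the axis is first translated to the coordinate origin and back.
    """
    ux, uy, uz = u
    K = np.array([[0.0, -uz, uy],            # [u]_x, the cross-product matrix
                  [uz, 0.0, -ux],
                  [-uy, ux, 0.0]])
    R = np.cos(theta) * np.eye(3) + (1 - np.cos(theta)) * np.outer(u, u) + np.sin(theta) * K
    return (points - origin) @ R.T + origin

# Quarter turn about the vertical axis through (1, 1, 0): (2, 1, 0) -> (1, 2, 0).
out = rotate_about_axis(np.array([[2.0, 1.0, 0.0]]),
                        np.array([0.0, 0.0, 1.0]),
                        np.array([1.0, 1.0, 0.0]), np.pi / 2)
```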
step 10: convert all partial point clouds of step 9 into the same coordinate system according to the scanning order of the rotating shaft, completing coarse registration of the point clouds, then fine-register the coarsely registered point clouds by iterative closest-point search; let the two point clouds to be registered be P_i, Q_i, define the objective function ε according to formula (11), take the rotation-translation matrices R_u, T(x_n′, y_n′, z_n′) computed in step 9 as the initial value, and obtain a rigid transformation matrix R, T satisfying the threshold by a traversal search method;

ε = Σ_i ‖Φ(p_i) − (R·p_i + T)‖²  formula (11)

where Φ(p_i) is the corresponding point of p_i of P_i on the fitted surface;
step 11: substitute the point-cloud data P_i to be transformed in step 10 and the rigid transformation matrix R, T of step 10 into formula (12) to obtain the registered point cloud Q_i of step 10;

Q_i = R·P_i + T  formula (12)
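Steps 10-11's fine registration can be prototyped with one nearest-neighbour ICP iteration. The closed-form SVD (Kabsch) solve below stands in for the patent's fitted-surface correspondence Φ and traversal search, so treat it as an illustrative assumption rather than the patent's method.

```python
import numpy as np

def icp_step(P, Q):
    """One ICP iteration: match P to Q by nearest neighbour, solve rigid R, T (steps 10-11).

    Minimises sum ||R p_i + T - q_i||^2 in closed form via SVD of the
    cross-covariance (Kabsch method), with a determinant guard against
    reflections. Returns (R, T) such that Q ~ R P + T.
    """
    d2 = ((P[:, None, :] - Q[None, :, :])**2).sum(-1)   # brute-force distances
    matched = Q[np.argmin(d2, axis=1)]                  # nearest neighbour of each p_i
    cp, cq = P.mean(axis=0), matched.mean(axis=0)
    H = (P - cp).T @ (matched - cq)                     # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    S = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])
    R = Vt.T @ S @ U.T
    return R, cq - R @ cp

# A small known motion is recovered exactly when correspondences are right.
c0, s0 = np.cos(0.1), np.sin(0.1)
R0 = np.array([[c0, -s0, 0.0], [s0, c0, 0.0], [0.0, 0.0, 1.0]])
T0 = np.array([0.5, -0.3, 0.2])
P = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 10.0, 0.0], [0.0, 0.0, 10.0]])
R, T = icp_step(P, P @ R0.T + T0)
```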
Step 12: repeating the step 10 and the step 11 to finish the fine registration of all the visual angle point clouds to be converted to obtain the three-dimensional full-looking point cloud data of the measured object;
step 13: down-sample the point cloud by building a voxel grid to simplify the point-cloud data of step 12 according to formula (13), where D is the voxel-grid edge length, α is a scale factor, N is the number of points in the point cloud of step 12, and (D_x, D_y, D_z) are the extents of the point cloud of step 12 along the x, y, and z axes, obtained from the maximum (x_max, y_max, z_max) and minimum (x_min, y_min, z_min) coordinate values according to formula (14);

D = α·∛(D_x·D_y·D_z / N)  formula (13)

(D_x, D_y, D_z) = (x_max − x_min, y_max − y_min, z_max − z_min)  formula (14)
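Step 13's voxel-grid simplification, with the edge length D computed as in formula (13). Replacing the points of each voxel by their centroid is our assumption about the simplification rule.

```python
import numpy as np

def voxel_downsample(points, alpha=1.0):
    """Voxel-grid point-cloud simplification (step 13).

    Edge length D = alpha * cbrt(Dx * Dy * Dz / N) from the bounding-box
    extents; all points falling in one voxel are replaced by their centroid.
    """
    lo = points.min(axis=0)
    ext = points.max(axis=0) - lo                       # (Dx, Dy, Dz)
    D = alpha * np.cbrt(np.prod(ext) / len(points))
    keys = np.floor((points - lo) / D).astype(np.int64)
    _, inv = np.unique(keys, axis=0, return_inverse=True)
    inv = inv.ravel()
    counts = np.bincount(inv).astype(float)
    out = np.empty((len(counts), 3))
    for axis in range(3):                               # centroid per voxel
        out[:, axis] = np.bincount(inv, weights=points[:, axis]) / counts
    return out

# Two tight clusters collapse to their two centroids.
pts = np.array([[0.0, 0.0, 0.0], [0.01, 0.0, 0.0], [1.0, 1.0, 1.0], [1.0, 1.0, 1.01]])
out = voxel_downsample(pts)
```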
step 14: convert the simplified point-cloud data of step 13 into polygons for surface reconstruction by constructing a neural network; build the network kernel function φ_i according to formula (15) and determine the radial range of action σ and the centres of action P_i; let ω_i be the network weights to be trained, and interpolate the simplified point cloud P of step 13 according to formula (16) to obtain the full-view high-precision three-dimensional reconstruction f(P) of the measured object;

φ_i(P) = exp(−‖P − P_i‖² / 2σ²)  formula (15)

f(P) = Σ_i ω_i·φ_i(P)  formula (16)
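Step 14's network interpolation in miniature: taking a Gaussian kernel for formula (15) (the radial form is an assumption), the weights ω_i solve a linear system and f then reproduces the simplified cloud exactly at the centres.

```python
import numpy as np

def rbf_fit(centres, values, sigma=1.0):
    """Solve for the network weights omega_i so that f interpolates `values` (step 14)."""
    d2 = ((centres[:, None, :] - centres[None, :, :])**2).sum(-1)
    Phi = np.exp(-d2 / (2 * sigma**2))       # kernel matrix, formula (15)
    return np.linalg.solve(Phi, values)

def rbf_eval(p, centres, weights, sigma=1.0):
    """f(p) = sum_i omega_i * phi_i(p), formula (16)."""
    d2 = ((p - centres)**2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2)) @ weights

# Three height samples reproduced exactly at their own (x, y) centres.
centres = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
values = np.array([1.0, 2.0, 3.0])
w = rbf_fit(centres, values)
```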
This completes the acquisition of the measured object's three-dimensional full-view information; f(P) is the three-dimensional full-view information of the measured object. A flow chart of the three-dimensional full-view measurement method of this patent is shown in FIG. 1.
The beneficial effects of the invention are: the three-dimensional point-cloud full-view measurement method introduced here addresses the large point-cloud noise and low matching precision typical of small complex workpieces, achieves high-precision three-dimensional full-view measurement of complete point-cloud data for small workpieces without spraying a developer, and avoids the shortcomings of traditional structured-light three-dimensional measurement methods.
Drawings
FIG. 1: flow chart of the line laser three-dimensional full-view measurement method;
FIG. 2: three-dimensional point-cloud data of the gauge-block ladder;
FIG. 3: schematic diagram of the rotating-shaft calibration method;
FIG. 4: three-dimensional full-view rendering of a workpiece measured by the method.
Detailed Description
Laser triangulation projects a laser beam onto the surface of the measured object and computes the object's cross-section shape from the change of the laser stripe captured by a camera.
As determined for the experimental scene, the line laser sensor captures a stripe width (x direction) of 16 mm, a batch-processing length (y direction) of 100 mm, a height-difference range (z direction) of 16 mm, and a maximum acquisition frequency of 500 Hz.
The gauge-block ladder is assembled from standard gauge blocks of thickness 1.08 mm, 1.5 mm, and 2 mm; the scanned point-cloud data of the gauge-block ladder are shown in FIG. 2. Edge data are removed to reduce error; the calibrated tilt-compensation line is y = 1.167×10⁻⁵·x + 1.078, and the compensation proportion is c = 0.993.
The calibration ball used to calibrate the rotating shaft is a ceramic ball of radius R_B = 10 mm. It is rotated clockwise by 60° each time and measured 6 times in total, yielding point-cloud data of 6 spheres; the 6 sphere-centre coordinates (x_n, y_n, z_n) are computed, and the normal vector u of the rotating shaft and the equation P_B of the rotation plane are obtained. FIG. 3 is a schematic diagram of the rotating-shaft calibration method.
Based on the rotating-shaft calibration result and the fine-registration method, the 6 partial point clouds are coordinate-transformed with rotation-translation matrices built from θ_i = i × 60° (i = 1, 2, …, 6).
Surface reconstruction of the point-cloud data via the constructed neural network finally yields the complete three-dimensional full-view data of the workpiece; the rendering of the workpiece's three-dimensional full-view measurement is shown in FIG. 4.
the flow chart of the three-dimensional overall measurement method related to the patent of the invention is shown in figure 1.
The biggest difference between this method and existing three-dimensional reconstruction methods is the following: existing methods suffer from insufficient reconstruction precision on small workpieces and strong requirements on surface material, causing point-cloud loss, high point-cloud noise, and low stitching precision, problems that can otherwise be avoided only by spraying a developer. In the method designed here, the extracted profiles and the fused point cloud contain not only richer detail but also, through the two calibration procedures, the complete global information of the whole object, fundamentally resolving the problems of existing methods. The designed method can therefore solve the high-precision three-dimensional full-view measurement problem for small workpieces.
In summary, the three-dimensional overall measurement method of the present invention has the following advantages:
(1) The contour-extraction correction method is more accurate, so the designed three-dimensional full-view measurement better matches the contour dimensions of the actual model.
(2) Thanks to the double registration of rotating-shaft calibration and closest-point iteration, the designed three-dimensional reconstruction method achieves high-precision point-cloud registration.
(3) Bilateral filtering and point-cloud simplification solve the measurement of objects with different surface materials; no coloring material such as sprayed developer is needed, so the measurement process is environmentally friendly and the material cost of measurement is saved.
The invention and its embodiments have been described above schematically and without limitation; the drawings show only one embodiment of the invention. Persons skilled in the art, informed by the teachings herein, may adopt similar components or other component arrangements without departing from the spirit of the invention, and may design similar technical solutions and embodiments without departing from its scope.

Claims (1)

1. A three-dimensional full-view measuring method of a precision part based on line laser is characterized by comprising the following steps:
step 1: start a line laser sensor used to acquire the precision-part profile and a multi-axis scanning platform used for motion control; acquire the measured object once with the line laser sensor to obtain its cross-section profile P; perform translation scanning of the measured object, i.e. acquire the profile multiple times along one axis, to obtain batch profile data B;
step 2: apply tilt correction to the cross-section profile P of step 1 using the least-squares method: let the measured cross-section profile be P(x_i, y_i), i = 0, 1, 2, …, n, and obtain the compensated cross-section profile P′(x_i, y_i − a·x_i − b) according to formula (1);

a = (n·Σx_i·y_i − Σx_i·Σy_i) / (n·Σx_i² − (Σx_i)²),  b = (Σy_i − a·Σx_i) / n  formula (1)
step 3: calibrate the x, y, and z axes of the sensor coordinate system using a step gauge block; let the standard profile of the step gauge block be S, and obtain the profile measurement error e_p according to formula (2), taking its minimum over different compensation proportions c;

e_p = min Σ(c·P′ − S)  formula (2)
step 4: steps 1 to 3 yield the calibrated profile data c·P′; determine its world coordinate system and units from the line-laser-sensor coordinate system of step 3 and convert it into three-dimensional point-cloud data: the x coordinate of each point is fixed, the z coordinate is determined by the step-gauge-block size of step 3, and the y coordinate is determined by the acquisition frequency and step length of the line laser sensor of step 1;
step 5: apply bilateral filtering to the three-dimensional point cloud according to formula (3) to remove point-cloud noise; the computed d is the adjustment distance of the point being processed along its normal vector. The required weights W_c and W_s are computed according to formula (4). Let p be a point of P, N(p) the k-neighborhood of p, ‖p − p_i‖ the length of the vector from point p_i to point p, n_p the normal vector at p, and ⟨n_p, p − p_i⟩ the inner product of n_p and the vector from p_i to p;

d = Σ_{p_i∈N(p)} W_c(‖p − p_i‖)·W_s(⟨n_p, p − p_i⟩)·⟨n_p, p − p_i⟩ / Σ_{p_i∈N(p)} W_c(‖p − p_i‖)·W_s(⟨n_p, p − p_i⟩)  formula (3)

W_c(x) = exp(−x²/2σ_c²),  W_s(y) = exp(−y²/2σ_s²)  formula (4)
step 6: calibrate the rotating shaft of the multi-axis scanning platform using a standard sphere; place a calibration ball of radius R_B on the rotary table of the scanning platform and acquire point-cloud data of part of the spherical surface by the translation-scanning method of step 1, obtaining spherical-surface coordinates (x_i, y_i, z_i); construct a nonlinear equation system according to formula (5);

(x_i − x_n)² + (y_i − y_n)² + (z_i − z_n)² = R_B²  formula (5)
step 7: solve for the sphere-centre coordinates O_n(x_n, y_n, z_n) of the spherical surface of step 6; drive the rotary table of step 6 to rotate, measure the three-dimensional data of the spherical surface of step 6 at N (N ≥ 4) positions, obtain N sets of sphere-centre coordinates using step 6, and construct a system of N linear equations from the sphere-centre coordinates according to formula (6);

A·x_n + B·y_n + C·z_n + D = 0  formula (6)

step 8: solve the linear equation system of step 7 and fit the equation P_B of the plane containing the rotation trajectory of the sphere centres O_n of step 7; compute the normal vector u of P_B according to formula (7); substitute the sphere-centre coordinates O_n into formula (8) to obtain the intersection point O_n′(x_n′, y_n′, z_n′) of the rotating shaft with the plane containing the sphere centres of step 7;

u = (A, B, C) / √(A² + B² + C²)  formula (7)

O_n′ = O_n − ((A·x_n + B·y_n + C·z_n + D) / (A² + B² + C²))·(A, B, C)  formula (8)
Step 9: let R_u be the rotation matrix of the point cloud to be transformed, T(x_n′, y_n′, z_n′) the translation matrix of the point cloud to be transformed, and θ the rotation angle; using the u and O_n′ calculated in step 8, compute the coordinates of the point cloud of each viewing angle in the common world coordinate system by formulas (9) and (10);

R_u = I + sin θ · K + (1 − cos θ) · K²,  where K is the skew-symmetric matrix of u   formula (9)
q = R_u (p − O_n′) + O_n′   formula (10)
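Formulas (9)–(10) rotate each view about the calibrated axis (u, O_n′). A minimal sketch using Rodrigues' rotation formula, assuming numpy; this is the standard axis-angle form, not necessarily the patent's literal matrix notation.

```python
import numpy as np

def rotate_about_axis(points, u, o, theta):
    """Rotate points by angle theta about the axis through point o with
    unit direction u (Rodrigues' formula): q = R_u (p - o) + o.
    Maps a view scanned at turntable angle theta into the world frame.
    """
    u = np.asarray(u, dtype=float)
    u = u / np.linalg.norm(u)
    K = np.array([[0.0, -u[2], u[1]],
                  [u[2], 0.0, -u[0]],
                  [-u[1], u[0], 0.0]])  # skew-symmetric matrix of u
    R = np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)
    P = np.asarray(points, dtype=float)
    return (P - o) @ R.T + o
```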
Step 10: transform the point clouds to be converted of step 9 into the same coordinate system in the scanning order of the rotation axis, completing the coarse registration of the point clouds, and then finely register the coarsely registered point clouds by iterative closest-point search; let the two point clouds to be registered be P_i and Q_i, define an objective function ε according to formula (11), and, starting from the rotation matrix R_u and translation T(x_n′, y_n′, z_n′) of step 9, obtain the rigid transformation matrices R and T satisfying the threshold by a traversal search;

ε(R, T) = Σ_i ‖Φ(p_i) − (R·p_i + T)‖²   formula (11)

where Φ(p_i) is the point on the fitted surface corresponding to point p_i of P_i;
Step 11: substitute the rigid transformation matrices R and T of step 10 into formula (12) for the point cloud data P_i to be transformed of step 10, obtaining the registered point cloud Q_i of step 10;

Q_i = R·P_i + T   formula (12)
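Steps 10–11 amount to iterative closest point with a rigid-transform update at each iteration. The sketch below substitutes the standard closed-form SVD (Kabsch) update and brute-force point-to-point correspondences for the patent's surface-based correspondence Φ and traversal search; it assumes numpy, and the function names are illustrative.

```python
import numpy as np

def best_fit_transform(P, Q):
    """Closed-form rigid (R, T) minimizing sum ||Q_i - (R P_i + T)||^2
    for paired points (Kabsch / SVD)."""
    P, Q = np.asarray(P, dtype=float), np.asarray(Q, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    T = cq - R @ cp
    return R, T

def icp(P, Q, iters=30):
    """Point-to-point ICP: brute-force closest points in Q for each point
    of P, closed-form transform, iterated; returns accumulated (R, T)."""
    src = np.asarray(P, dtype=float).copy()
    Q = np.asarray(Q, dtype=float)
    R_acc, T_acc = np.eye(3), np.zeros(3)
    for _ in range(iters):
        d2 = ((src[:, None, :] - Q[None, :, :]) ** 2).sum(axis=-1)
        matched = Q[d2.argmin(axis=1)]  # nearest neighbor of each point
        R, T = best_fit_transform(src, matched)
        src = src @ R.T + T
        R_acc, T_acc = R @ R_acc, R @ T_acc + T
    return R_acc, T_acc
```

A production system would use a spatial index (k-d tree) for the correspondence search; the O(N²) distance matrix here is only for brevity.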
Step 12: repeat step 10 and step 11 to complete the fine registration of the point clouds of all viewing angles to be converted, obtaining the three-dimensional full-view point cloud data of the measured object;
Step 13: down-sample the point cloud by building a voxel grid to simplify the point cloud data of step 12; simplify the point cloud obtained in step 12 according to formula (13), where D is the side length of a voxel grid cell, α is a scale factor, N is the number of points in the point cloud of step 12, and (D_x, D_y, D_z) are the extents of the point cloud data of step 12 along the three coordinate axes x, y and z, obtained from the maximum (x_max, y_max, z_max) and minimum (x_min, y_min, z_min) coordinate values according to formula (14);

D = α · ∛(D_x·D_y·D_z / N)   formula (13)
(D_x, D_y, D_z) = (x_max − x_min, y_max − y_min, z_max − z_min)   formula (14)
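A minimal numpy sketch of the voxel-grid simplification of step 13, under the reconstructed reading of formulas (13)–(14): the voxel side length scales with the average point spacing, and one centroid is kept per occupied voxel.

```python
import numpy as np

def voxel_downsample(points, alpha=1.0):
    """Voxel-grid point cloud simplification.

    Side length D = alpha * cbrt(Dx*Dy*Dz / N), where (Dx, Dy, Dz) are the
    axis-aligned extents of the cloud and N its point count; all points
    falling in the same voxel are replaced by their centroid.
    """
    P = np.asarray(points, dtype=float)
    lo = P.min(axis=0)
    span = P.max(axis=0) - lo                 # (Dx, Dy, Dz), formula (14)
    D = alpha * np.cbrt(span.prod() / len(P))  # formula (13)
    idx = np.floor((P - lo) / D).astype(int)   # voxel index of each point
    keys, inv = np.unique(idx, axis=0, return_inverse=True)
    out = np.zeros((len(keys), 3))
    np.add.at(out, inv, P)                     # sum points per voxel
    counts = np.bincount(inv, minlength=len(keys)).reshape(-1, 1)
    return out / counts                        # centroid per occupied voxel
```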
Step 14: convert the simplified point cloud data of step 13 into polygons for surface reconstruction by constructing a neural network; build the network kernel function according to formula (15) and determine the radial range of action σ and the action centers P_i; let ω_i be the network weights to be trained, and interpolate the point cloud P simplified in step 13 according to formula (16) to obtain the full-view high-precision three-dimensional reconstruction f(P) of the measured object;

g_i(P) = exp(−‖P − P_i‖² / (2σ²))   formula (15)
f(P) = Σ_i ω_i · g_i(P)   formula (16)
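Step 14's network interpolation can be sketched as Gaussian radial-basis interpolation: training the weights ω_i reduces to solving a linear system over the kernel's Gram matrix. The Gaussian kernel is an assumption here (matching the "radial range of action σ"), since formula (15) is not legible in this extraction; assumes numpy.

```python
import numpy as np

def rbf_fit(centers, values, sigma):
    """Solve for the weights w_i so that
    f(p) = sum_i w_i * exp(-||p - P_i||^2 / (2 sigma^2))
    interpolates the given values at the action centers P_i."""
    C = np.asarray(centers, dtype=float)
    d2 = ((C[:, None, :] - C[None, :, :]) ** 2).sum(axis=-1)
    G = np.exp(-d2 / (2.0 * sigma ** 2))  # Gram matrix of the kernel
    return np.linalg.solve(G, np.asarray(values, dtype=float))

def rbf_eval(p, centers, w, sigma):
    """Evaluate the trained network f(p) at a query point p."""
    d2 = ((np.asarray(p, dtype=float) - np.asarray(centers, dtype=float)) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2)) @ w
```

For large point clouds the dense Gram matrix becomes impractical; compactly supported kernels or a subsampled set of centers are the usual remedies.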
completing the measurement of the full three-dimensional information of the measured object.
CN201810767127.4A 2018-07-11 2018-07-11 Line laser-based three-dimensional full-view measurement method for precision part Active CN108981604B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810767127.4A CN108981604B (en) 2018-07-11 2018-07-11 Line laser-based three-dimensional full-view measurement method for precision part

Publications (2)

Publication Number Publication Date
CN108981604A CN108981604A (en) 2018-12-11
CN108981604B (en) 2020-06-09

Family

ID=64537168

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810767127.4A Active CN108981604B (en) 2018-07-11 2018-07-11 Line laser-based three-dimensional full-view measurement method for precision part

Country Status (1)

Country Link
CN (1) CN108981604B (en)

Families Citing this family (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109615644B (en) * 2018-12-25 2021-05-04 北京理工大学 Surface type matching method for precision assembly of bowl matching parts
CN109848790A (en) * 2019-01-23 2019-06-07 宁波东力传动设备有限公司 A kind of large gear automatic chamfering on-line measuring device and method
CN110111378B (en) * 2019-04-04 2021-07-02 贝壳技术有限公司 Point cloud registration optimization method and device based on indoor three-dimensional data
CN110425998B (en) * 2019-06-05 2021-02-02 中北大学 Three-dimensional measurement method for component with height of coupling characteristic points of gray level image
CN110280906B (en) * 2019-07-18 2022-04-19 深圳晶尹阳光电有限公司 Fine 3D laser processing method and device for marking rotating target
CN110715618A (en) * 2019-09-29 2020-01-21 北京天远三维科技股份有限公司 Dynamic three-dimensional scanning method and device
CN111273704B (en) * 2020-01-21 2023-09-15 上海万物新生环保科技集团有限公司 Method and device for automatically inserting external equipment hole
CN111539446B (en) * 2020-03-04 2023-10-03 南京航空航天大学 Template matching-based 2D laser hole site detection method
CN111486821A (en) * 2020-04-28 2020-08-04 苏州江腾智能科技有限公司 Quick calibration method based on multi-dimensional position data
CN111609811A (en) * 2020-04-29 2020-09-01 北京机科国创轻量化科学研究院有限公司 Machine vision-based large-size plate forming online measurement system and method
CN111444630A (en) * 2020-05-19 2020-07-24 上汽大众汽车有限公司 Method for assisting optical measurements using virtual adapters
CN111578866B (en) * 2020-06-16 2021-04-20 大连理工大学 Spatial pose calibration method for multi-line laser sensor combined measurement
CN111750804B (en) * 2020-06-19 2022-10-11 浙江华睿科技股份有限公司 Object measuring method and device
CN111895921B (en) * 2020-08-05 2022-03-11 珠海博明视觉科技有限公司 Compensation method for improving measurement precision of system to height difference
CN112528464A (en) * 2020-11-06 2021-03-19 贵州师范大学 Method for reversely solving slotting forming grinding wheel truncation based on pixel matrix method
CN112833818B (en) * 2021-01-07 2022-11-15 南京理工大学智能计算成像研究院有限公司 Single-frame fringe projection three-dimensional surface type measuring method
CN113048886B (en) * 2021-05-31 2021-08-17 山东捷瑞数字科技股份有限公司 Method and apparatus for measuring size of irregular body of workpiece
CN113358059B (en) * 2021-06-08 2023-06-02 西安交通大学 Off-axis aspheric surface type error measurement method based on line laser scanning
CN114166153B (en) * 2021-11-29 2024-02-27 浙江工业大学 Straight shank twist drill coaxiality error measurement method
CN114413788B (en) * 2022-01-21 2024-04-09 武汉惟景三维科技有限公司 Part surface pattern detection method based on binocular vision and reverse model reconstruction
CN115046474B (en) * 2022-05-20 2023-03-10 浙江大学 System and method for measuring inner and outer surfaces of tubular part
CN115420196B (en) * 2022-08-30 2023-11-14 天津大学 General object three-dimensional measurement method and device
CN115493523B (en) * 2022-11-21 2023-04-25 三代光学科技(天津)有限公司 High-speed measuring method and device for three-dimensional morphology of wafer surface
CN117011349B (en) * 2023-08-01 2024-03-19 奥谱天成(厦门)光电有限公司 Line laser image optimization method, system and medium based on two-dimensional motion platform
CN116878419B (en) * 2023-09-06 2023-12-01 南京景曜智能科技有限公司 Rail vehicle limit detection method and system based on three-dimensional point cloud data and electronic equipment
CN117272522B (en) * 2023-11-21 2024-02-02 上海弥彧网络科技有限责任公司 Portable aircraft curved surface skin rivet hole profile measurement system and method thereof

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1900651A (en) * 2006-07-27 2007-01-24 西安交通大学 Three dimension object contour phase measuring method based on double frequency color strip projection
CN101551918A (en) * 2009-04-28 2009-10-07 浙江大学 Acquisition method of large scene based on line laser
CN102661724A (en) * 2012-04-10 2012-09-12 天津工业大学 RGBPSP (red green blue phase shift profilometry) three-dimensional color reconstruction method applied to online detection for fabric defects
CN102853783A (en) * 2012-09-18 2013-01-02 天津工业大学 High-precision multi-wavelength three-dimensional measurement method
CN104748679A (en) * 2015-03-19 2015-07-01 中国矿业大学(北京) Space point three dimension coordinate measuring method based on rotation sector laser angle measurement

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7428061B2 (en) * 2002-08-14 2008-09-23 Metris Ipr N.V. Optical probe for scanning the features of an object and methods thereof



Similar Documents

Publication Publication Date Title
CN108981604B (en) Line laser-based three-dimensional full-view measurement method for precision part
Ebrahim 3D laser scanners’ techniques overview
Luhmann Close range photogrammetry for industrial applications
Luna et al. Calibration of line-scan cameras
Genta et al. Calibration procedure for a laser triangulation scanner with uncertainty evaluation
El-Hakim et al. Comparative evaluation of the performance of passive and active 3D vision systems
Beraldin et al. Metrological characterization of 3D imaging systems: progress report on standards developments
Lai et al. Registration and data merging for multiple sets of scan data
Yang et al. Modeling and calibration of the galvanometric laser scanning three-dimensional measurement system
CN111612768A (en) Method for detecting blade by adopting structured light space positioning and two-dimensional industrial CT
CN109323665B (en) Precise three-dimensional measurement method for line-structured light-driven holographic interference
CN112525106B (en) Three-phase machine cooperative laser-based 3D detection method and device
Dubreuil et al. Mesh-Based Shape Measurements with Stereocorrelation: Principle and First Results
CN112002013A (en) Three-dimensional overlapping type modeling method
Yu et al. Surface modeling method for aircraft engine blades by using speckle patterns based on the virtual stereo vision system
CN109035238A (en) A kind of machining allowance off-line analysis method towards Free-form Surface Parts
CN110260817B (en) Complex surface deflection measurement self-positioning method based on virtual mark points
CN114170321A (en) Camera self-calibration method and system based on distance measurement
Isa et al. Laser triangulation
Voicu et al. 3D MEASURING OF COMPLEX AUTOMOTIVE PARTS USING VIDEO-LASER SCANNING.
Qi et al. A novel method for Aero engine blade removed-material measurement based on the robotic 3D scanning system
Isheil-Bubaker et al. 3D displacements and strains solid measurement based on the surface texture with a scanner laser
Geometric inspection of 3D production parts in shipbuilding–comparison and assessment of current optical measuring methods
Zhang et al. A Study on Refraction Error Compensation Method for Underwater Spinning Laser Scanning 3D Imaging
Tang et al. 3-step-calibration of 3D vision measurement system based-on structured light

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20201113

Address after: 300051 1603, block B, Kangning building, Xikang Road, Heping District, Tianjin

Patentee after: Qingyan Zhongdian (Tianjin) Intelligent Equipment Co.,Ltd.

Address before: 300387 Tianjin city Xiqing District West Binshui Road No. 399

Patentee before: TIANJIN POLYTECHNIC University

CP02 Change in the address of a patent holder

Address after: Room 1-104, Block A, No. 196 Hongqi Road, Nankai District, Tianjin 300110

Patentee after: Qingyan Zhongdian (Tianjin) Intelligent Equipment Co.,Ltd.

Address before: Room 1603, Block B, Kangning Building, Xikang Road, Heping District, Tianjin 300051

Patentee before: Qingyan Zhongdian (Tianjin) Intelligent Equipment Co.,Ltd.