CN114066996A - Calibration method of three-dimensional measurement equipment - Google Patents

Calibration method of three-dimensional measurement equipment Download PDF

Info

Publication number
CN114066996A
CN114066996A CN202111375763.0A CN202111375763A CN114066996A CN 114066996 A CN114066996 A CN 114066996A CN 202111375763 A CN202111375763 A CN 202111375763A CN 114066996 A CN114066996 A CN 114066996A
Authority
CN
China
Prior art keywords
calibration
determining
coordinate
coordinates
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111375763.0A
Other languages
Chinese (zh)
Inventor
张晓元
房徐
张勇
姚毅
杨艺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Luster LightTech Co Ltd
Beijing Luster LightTech Co Ltd
Original Assignee
Luster LightTech Co Ltd
Beijing Luster LightTech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Luster LightTech Co Ltd, Beijing Luster LightTech Co Ltd filed Critical Luster LightTech Co Ltd
Priority to CN202111375763.0A priority Critical patent/CN114066996A/en
Publication of CN114066996A publication Critical patent/CN114066996A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30172Centreline of tubular or elongated structure
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30204Marker
    • G06T2207/30208Marker matrix

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application provides a calibration method of three-dimensional measurement equipment, which comprises the steps of obtaining calibration image data of a calibration object and space coordinates of feature points on the calibration object, wherein the space coordinates of the feature points on the calibration object are obtained according to the size of the calibration object; determining the pixel coordinates of the characteristic points of the imaging surface corresponding to the characteristic points on the calibration object according to the calibration image data; determining distortion parameters through the distortion model and pixel coordinates of the characteristic points of the imaging surface; determining undistorted pixel coordinates corresponding to the pixel coordinates of the imaging surface feature points through a distortion model; determining a transformation matrix of the space coordinate and the undistorted pixel coordinate according to a homography matrix, wherein the transformation matrix is used for calibrating the coordinate of the object to be measured; according to the method and the device, the transformation matrix for calibration is determined by obtaining the space coordinates of the characteristic points on the calibration object and the corresponding undistorted pixel coordinates, and the calibration precision of the three measuring devices is improved.

Description

Calibration method of three-dimensional measurement equipment
Technical Field
The application relates to the field of machine vision, in particular to a calibration method of three-dimensional measurement equipment.
Background
The three-dimensional measuring equipment has important application in the aspects of industrial automatic measurement, positioning, detection and the like; the three-dimensional measuring equipment maps the space points on the surface of the measured object to pixel points in the camera imaging image based on the line laser camera, so that three-dimensional geometric information and gray texture information of the space points corresponding to the pixel points are determined.
The calibration of the existing three-dimensional measuring equipment mainly comprises internal and external reference calibration and light plane calibration of a camera. As shown in fig. 1, the calibration of the internal and external parameters of the camera is completed by the aid of the optical assembly 30, and within the clear imaging range of the camera 10, feature extraction and calibration of the internal and external parameters of the camera are required to be performed by continuously adjusting the posture of the calibration plate 20 with feature points and geometric information; correcting image distortion through camera internal parameters, and then calibrating an optical plane by combining camera external parameters; the camera internal parameters comprise focal length, principal point and lens distortion coefficient, and the camera external parameters comprise a rotation matrix and a translation matrix of a camera coordinate system relative to a space coordinate system.
However, in the calibration of the existing three-dimensional measurement device, the internal and external parameters of the camera need to be calibrated first to obtain the calibration images of the calibration plates in different postures under the optical assembly, and it is difficult to ensure the definition of the calibration image in each posture, thereby reducing the calibration precision.
Disclosure of Invention
The application provides a calibration method of three-dimensional measurement equipment, which aims to solve the technical problems that in the prior art, when internal and external parameters of a camera are calibrated, calibration images of calibration plates in different postures under an optical assembly are obtained, the definition of the calibration images in each posture is difficult to ensure, and the calibration precision is reduced.
In order to achieve the above purpose, the embodiments of the present application adopt the following technical solutions:
the application provides a calibration method of three-dimensional measurement equipment, which comprises the following steps:
acquiring calibration image data of a calibration object and space coordinates of feature points on the calibration object, wherein the space coordinates of the feature points on the calibration object are acquired according to the size of the calibration object;
determining the pixel coordinates of the characteristic points of the imaging surface corresponding to the characteristic points on the calibration object according to the calibration image data;
determining distortion parameters through the distortion model and pixel coordinates of the characteristic points of the imaging surface;
determining undistorted pixel coordinates corresponding to the pixel coordinates of the imaging surface feature points through the distortion model and the distortion parameters;
and determining a transformation matrix of the space coordinate and the undistorted pixel coordinate according to the homography matrix, wherein the transformation matrix is used for calibrating the coordinate of the object to be measured.
In a possible implementation manner, after determining the transformation matrix, the calibration method further includes:
acquiring an imaging coordinate graph of a standard gauge block and the distance between the standard gauge block and an adjacent standard surface;
determining the real coordinates of the standard gauge block according to the imaging coordinates of the standard gauge block and the transformation matrix;
determining a true value of the distance between the adjacent standard surfaces of the standard gauge blocks according to the real coordinates of the standard gauge blocks;
and determining a correction coefficient according to the distance between the adjacent standard surfaces of the standard gauge blocks and the distance truth value, wherein the correction coefficient is used for correcting the longitudinal coordinate of the object to be measured.
In one possible implementation, a three-dimensional measurement device includes a camera, a calibration object, and a mobile station; the mobile station moves along the direction perpendicular to the camera line laser, and the calibration object is placed on the mobile station.
The calibration object comprises a platform base, a cushion block and a calibration plate; one side of the platform base is provided with a groove, the other side of the platform base is provided with a baffle, and the groove is used for placing the cushion block; the calibration plate is obliquely arranged between the baffle of the platform base and the cushion block and forms an inclination angle theta with the platform base.
This application provides calibration plate under the different gestures for camera imaging through calibration thing and mobile station, and the calibration image under the different gestures is in the clear imaging range of camera, the precision of the formation of image that improves.
The application provides a calibration method of three-dimensional measurement equipment, which comprises the steps of obtaining calibration image data of a calibration object and space coordinates of feature points on the calibration object, wherein the space coordinates of the feature points on the calibration object are obtained according to the size of the calibration object; determining the pixel coordinates of the characteristic points of the imaging surface corresponding to the characteristic points on the calibration object according to the calibration image data; determining distortion parameters through the distortion model and pixel coordinates of the characteristic points of the imaging surface; determining undistorted pixel coordinates corresponding to the pixel coordinates of the imaging surface feature points through a distortion model; determining a transformation matrix of the space coordinate and the undistorted pixel coordinate according to a homography matrix, wherein the transformation matrix is used for calibrating the coordinate of the object to be measured; according to the method and the device, the transformation matrix for calibration is determined by obtaining the space coordinates of the characteristic points on the calibration object and the corresponding undistorted pixel coordinates, and the calibration precision of the three measuring devices is improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
FIG. 1 is a schematic diagram of the calibration of internal and external reference of a conventional camera of the three-dimensional measuring device of the present application;
fig. 2 is a schematic structural diagram of a three-dimensional measurement apparatus according to an embodiment of the present application;
FIG. 3 is a schematic structural diagram of a calibration object according to an embodiment of the present application;
FIG. 4 is a schematic view of a calibration plate according to an embodiment of the present application;
fig. 5 is a schematic flowchart of a calibration method for a three-dimensional measurement device according to an embodiment of the present application;
fig. 6 is a schematic flowchart of a calibration method for calibrating a three-dimensional measurement device according to an embodiment of the present application;
FIG. 7 is a schematic structural diagram of a standard gauge block according to an embodiment of the present disclosure;
wherein: 1-a camera; 2-calibration object; 21-a platform base; 211-grooves; 212-a baffle; 22-cushion block; 23-calibration plate; 3-mobile station.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only partial embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The three-dimensional measuring equipment acquires high-resolution geometric information and gray texture information of the surface of a measured object based on a line laser camera; the calibration of the internal and external parameters of the camera can be realized by a Zhang camera calibration method, images at a plurality of positions and different postures are acquired in a camera imaging area by utilizing a calibration plate with characteristic points and geometric information, such as a plane checkerboard or a dot grid, namely, images containing laser lines and the characteristics of the calibration plate are acquired, the characteristics of the images are further extracted, the internal and external parameters of the camera are calibrated, the image distortion is corrected through the internal parameters of the camera, and then the external parameters of the camera are combined to calibrate the light plane. The camera internal and external parameter calibration process is carried out by continuously adjusting the posture of the calibration plate, the operation is complex, and the automation realization difficulty is high.
For a camera of line laser, the camera imaging can adopt the Schlemm's law, so that a camera lens is focused in a light plane, and for calibration plates in different postures, because the camera imaging makes the clear imaging range of a vertical line laser plane small through the Schlemm's law, the imaging definition of the calibration plate in each posture is difficult to ensure, and the calibration precision is reduced.
In order to solve the above problem, some embodiments of the present application provide a three-dimensional measurement apparatus, as shown in fig. 2, the three-dimensional measurement apparatus may include a camera 1 based on a line laser, a calibration object 2 and a mobile station 3, where the calibration object 2 is placed on the mobile station 3, the calibration object 2 is adjusted within a clear measurement range of the camera 1, and when the line laser corresponds to features on a same line of a calibration board, the mobile station 3 is controlled to move in a direction perpendicular to the line laser.
In one embodiment, a spatial coordinate system O-XZ is established and a pixel coordinate system O-uv is established at the imaging plane. In the space coordinate system, X is perpendicular to the moving direction, Z is vertical upwards, and the origin is arranged at the pattern space point of the calibration plate corresponding to the characteristic point at the upper left of the calibration data. In the imaging plane establishing pixel coordinate system, u is horizontally right, v is vertically downward, and the origin is located at the first pixel on the upper left.
As shown in fig. 3 and 4, the calibration object may be a calibration assembly, the calibration assembly may include a platform base 21, a spacer 22, and a calibration plate 23, one side of the platform base 21 has a groove 211, the other side has a baffle 212, the groove 211 is used for placing the spacer 22, the calibration plate 23 is obliquely placed between the baffle 212 and the spacer 22 of the platform base 21, and the spacer 22 with different heights may be replaced according to the requirement of a calibrated three-dimensional measurement device, in the present application, the two-dimensional calibration plate 23 is converted into a target with three-dimensional feature points by a mechanical automatic control manner, that is, the target has depth direction information, and calibration image data of the calibration object may be obtained within a clear imaging range of a camera.
The calibration board 23 is a flat board with a characteristic pattern, which may be a dot diagram of white dots on a black matrix, with the center of the dot being the characteristic used for calibration. In some embodiments, the pattern of the calibration plate may be a checkerboard, by extracting the corner points of the checkerboard as the features for distortion correction and system calibration; or even grid line pattern, and the intersection points of the grid lines and the grid are extracted as the characteristics of distortion correction and system calibration.
In some embodiments, the feature point space coordinates may also be obtained by a calibration object with three-dimensional feature information, such as a height block with feature points, a standard sphere, a sawtooth block, and the like.
The application provides a three-dimensional measuring device, which comprises a camera, a calibration object and a mobile station; the mobile station moves along the direction vertical to the camera line laser, and the calibration object is placed on the mobile station; the calibration object comprises a platform base, a cushion block and a calibration plate; one side of the platform base is provided with a groove, the other side of the platform base is provided with a baffle, and the groove is used for placing the cushion block; the calibration plate is obliquely arranged between the baffle and the cushion block of the platform base; this application provides calibration plate under the different gestures for camera imaging through calibration thing and mobile station, and the calibration image under the different gestures is in the clear imaging range of camera, the precision of the formation of image that improves.
In combination with the three-dimensional measurement device provided in the foregoing embodiments, some embodiments of the present application provide a calibration method for a three-dimensional measurement device, as shown in fig. 5, the calibration includes the following steps:
s101, obtaining calibration image data of a calibration object and space coordinates of feature points on the calibration object.
The calibration image data is scanned by the moving calibration object to obtain a plurality of groups of image data to be processed; preprocessing the multiple groups of image data to be processed to obtain calibration image data; the preprocessing algorithm determines a set of calibration image data corresponding to a plurality of sets of image data to be processed. The calibration image data comprises a coordinate graph and a gray scale graph, wherein the coordinate graph comprises pixel coordinates and line coordinate values; the gray value comprises a pixel coordinate and a gray value; the specific method for determining the line coordinate value and the gray value is as follows:
the calibration object is placed on the mobile platform, the calibration object is adjusted within the clear measurement range of the camera, and when the line laser corresponds to the features on the same line of the calibration plate, the mobile platform is controlled to move along the direction vertical to the line laser; during the movement of the mobile station, the camera scans the calibration plate for N times to obtain N groups of linear laser light stripe images on the surface of the calibration plate, wherein each group of linear stripe images corresponds to the images of the calibration object at different positions. The light stripe pattern is an image of the same row of features on the calibration plate scanned by a line laser based camera. When the calibration plate moves along the direction perpendicular to the line laser according to the preset speed, the camera scans the light stripe images on the calibration plate according to the preset time to obtain N groups of light stripe images, wherein the preset time can be determined according to the preset speed.
And determining corresponding line coordinate values and brightness values of each group of light stripe images through a light stripe center extraction algorithm. And for each group of light stripe images, extracting the central row coordinate and the brightness of each column by a light stripe center extraction algorithm to obtain the row coordinate value and the brightness value of the group of light stripe images. The light stripe center extraction algorithm can be a traditional light stripe center extraction algorithm, can be an improved algorithm based on traditional light stripe center extraction, and can also be a neural network light stripe center extraction algorithm.
And converting the N groups of brightness values into corresponding N groups of gray values. Thus, the coordinate map in the calibration image data includes N sets of row coordinate values and the grayscale map in the calibration image data includes N sets of grayscale values.
The space coordinate of the feature point on the calibration object is determined according to the size of the calibration object, the calibration object can be a calibration component, the calibration component can comprise a platform base, a cushion block and a calibration plate, one side of the platform base is provided with a groove, the other side of the platform base is provided with a baffle plate, the groove is used for placing the cushion block, the calibration plate is obliquely placed between the baffle plate and the cushion block of the platform base, the cushion block with different heights can be replaced according to the requirement of calibrated three-dimensional measuring equipment, the two-dimensional calibration plate is converted into a target with three-dimensional feature points, and the target has depth direction information.
The calibration board is a flat board with a characteristic pattern, the pattern can be a dot diagram of white dots with black background, the circle centers of the dots are used as the characteristics for calibration, and the interval in the row direction of the centers of the dots is delta phThe interval in the central column direction of the dots is Δ pvAs shown in the figure, the dimension of the calibration assembly is schematically shown, and the inclination angle θ of the calibration plate is determined according to the dimension of the calibration assembly:
Figure BDA0003363747320000051
in the formula, h is the height of the cushion block, d is the depth of the groove, W is the distance between the baffle plate of the platform base and the groove, and the space coordinate P (X) of the characteristic point of the I-th column and the J-th row on the calibration plate is further determinedP,ZP) The space coordinates of the feature points are obtained by calculation according to the following formula:
Figure BDA0003363747320000052
in some embodiments, the spatial coordinates of the feature points on the calibration object may be determined according to three-dimensional feature information of the calibration object, for example, a height block with feature points, a standard sphere, and a sawtooth block, and the spatial coordinates of the feature points may be determined according to the geometric structure thereof.
And S102, determining the pixel coordinates of the characteristic points of the imaging surface corresponding to the coordinates of the spatial characteristic points according to the calibration image data.
Determining spatial coordinates P (X) of characteristic points of a light planeP,ZP) Pixel coordinate p (u) of corresponding imaging surface characteristic pointp,vp). Firstly, acquiring spatial feature points, extracting feature coordinates in a gray scale image as gray scale image feature points pin(uin,vin) As shown in FIG. 2And in the calibration board, extracting characteristic coordinates by taking the circle center as a characteristic, detecting dots in the gray-scale image by using a dot detection algorithm, and extracting the coordinates of the circle center.
According to the characteristic point coordinates in the gray scale image corresponding to the characteristic points of the calibration object, determining the column coordinates u of the characteristic points of the imaging surfacep(ii) a Determining coordinates in a coordinate graph according to the coordinates of the feature points in the gray scale graph; performing interpolation algorithm f on neighborhood coordinates of the coordinate valuesinterDetermining the line coordinates v of the characteristic points of the imaging planep(ii) a The pixel coordinates of the feature points corresponding to the imaging plane, that is, the feature points corresponding to the jth row and the ith column in the imaging plane are expressed as:
Figure BDA0003363747320000061
s103, determining distortion parameters through the distortion model and the pixel coordinates of the imaging surface characteristic points.
The distortion parameter comprises a principal point (u)c,vc) Radial distortion coefficient k1、k2Sum angle τx、τy
The principal point is determined through cross ratio invariance, and cross ratio CR of the imaging surface feature point and the neighborhood feature point in n directions is determinedpAnd cross ratio CR corresponding to the feature points on the calibration objectT(ii) a Determining deviation E according to N characteristic points of an imaging surfacep
Figure BDA0003363747320000062
In the formula, wnThe direction distance weight, N the number of directions and N the number of feature points.
By analyzing the error distribution curve of the cross ratio deviation of all the characteristic points, the distortion of the attachment of the principal point is minimum, so the principal point (u) is determined by the coordinate corresponding to the minimum extreme point of the error distribution curvec,vc) That is, the principal point is the imaging surface feature point corresponding to the minimum deviation.
The diameterDetermining the distortion coefficients and the Samm angles by feature point collinearity, including determining the ideal feature point p based on the collinearity of the feature points in the rows or columnsij *(ii) a And establishing an objective optimization function according to the distance from the ideal characteristic points to the fitted straight line:
F(k1,k2xy)=∑∑|pij *-Lm|2
in the formula, k1、k2Is the radial distortion coefficient, τx、τyIs angle of Samm, pij *Is an ideal feature point, LmTo fit a straight line.
And carrying out nonlinear optimization on the target optimization function through an optimization algorithm to obtain a radial distortion coefficient and the Samm angle.
And S104, determining undistorted pixel coordinates corresponding to the pixel coordinates of the imaging surface feature points through a distortion model.
The distortion parameters in the distortion model comprise principal points, radial distortion coefficients and Samm angles, and the distortion parameters are distributed and calibrated by analyzing the distribution rule of the characteristic points of the imaging surface. The characteristic point evenly distributed at equal interval in space on the calibration board, the space characteristic point that imaging surface characteristic point and correspond satisfies the cross ratio invariance to, the equal collineation of characteristic point that is located same row or same column confirms distortion parameter, and the rethread distortion model confirms the undistorted pixel coordinate that the pixel coordinate of imaging surface characteristic point corresponds includes:
s1041, determining a standard image plane distortion point coordinate, wherein the standard image plane distortion point p'xyCoordinate (x) ofp′,yp') is calculated according to the following formula:
Figure BDA0003363747320000063
in the formula, s1Is a scale factor, TSAccording to a rotation matrix R (tau)xy) Determined conversion relation, τx、τyIs angle of Samm, uc,vcAs a principal point coordinate;up,vpIs the pixel coordinate, x, of a characteristic point of the imaging planep′,yp' is a standard image plane distortion point coordinate.
S1042, determining the standard image surface distortion point-free coordinates
Figure BDA0003363747320000064
Coordinate (x) ofp *,yp *) The compound is obtained by calculation according to the following formula:
xp *=xp′+δu(xp′,yp′)
yp *=yp′+δv(xp′,yp′)
in the formula, xp *,yp *Is a standard image plane distortion point-free coordinate deltauAnd deltavThe model parameters all include radial distortion coefficient k1、k2
S1043, determining a pixel coordinate of an imaging plane distortion-free point, wherein the pixel coordinate of the imaging plane distortion-free point is obtained by calculation according to the following formula:
Figure BDA0003363747320000071
in the formula (I), the compound is shown in the specification,
Figure BDA0003363747320000072
the pixel coordinate of the imaging plane without distortion point.
And S105, determining a transformation matrix of the space coordinate and the undistorted pixel coordinate according to the homography matrix.
Establishing a transformation model of the space coordinates and the undistorted pixel coordinates according to the multiple groups of corresponding space coordinates and undistorted pixel coordinates and the homography matrix of the space coordinates and the undistorted pixel coordinates, as follows:
Figure BDA0003363747320000073
in the formula, s2Is a scale factor, H3×3Is a transformation matrix.
Determining a transformation matrix H through a plurality of groups of corresponding space coordinates and distortion-free pixel coordinates and through an optimization algorithm3×3And the method is used for calibrating the coordinates of the object to be measured, wherein the optimization algorithm can be a least square algorithm.
Transformation matrix H obtained by the above method3×3In actual measurement, the real space coordinates (X, Z) can be three-dimensionally reconstructed through the pixel coordinates obtained by the imaging plane, and are calculated according to the following formula:
Figure BDA0003363747320000074
Figure BDA0003363747320000075
in the formula, u and v are pixel coordinates of an imaging plane.
The application provides a calibration method of three-dimensional measurement equipment, which comprises the steps of determining calibration image data of a calibration object and space coordinates of feature points on the calibration object, wherein the calibration image data is determined by scanning the moving calibration object, and the space coordinates of the feature points on the calibration object are determined according to the size of the calibration object; determining the pixel coordinates of the characteristic points of the imaging surface corresponding to the characteristic points on the calibration object according to the calibration image data; determining undistorted pixel coordinates corresponding to the pixel coordinates of the imaging surface feature points through a distortion model; determining a transformation matrix of the space coordinate and the undistorted pixel coordinate according to a homography matrix, wherein the transformation matrix is used for calibrating the coordinate of the object to be measured; according to the method and the device, the transformation matrix for calibration is determined by obtaining the space coordinates of the characteristic points on the calibration object and the corresponding undistorted pixel coordinates, and the calibration precision of the three measuring devices is improved.
On the basis of the calibration method for the three-dimensional measurement device provided above, some embodiments of the present application further provide a method for improving calibration accuracy by using a high-accuracy standard gauge block, as shown in fig. 6, after determining the transformation matrix:
s106, obtaining an imaging coordinate graph of the standard gauge block and the distance between the standard gauge block and the adjacent standard surface.
As shown in FIG. 7, in the standard gauge block 4, each step surface is a high-precision plane, and the height difference between the adjacent step surfaces is known, i.e. the distance d between two standard surfaces on the standard gauge blockmn
In some embodiments, other configurations of standard blocks may be used to correct the data.
And S107, determining the real coordinates of the standard gauge block according to the imaging coordinates of the standard gauge block according to the transformation matrix.
In some embodiments, the imaging coordinates (u)t,vt) Calculating corresponding real space coordinates (X, Z); and calculating the coordinate Y in the motion direction by combining the motion information of the mobile station, including the preset speed and the like, so as to obtain the real point cloud data (X, Y, Z) of the standard gauge block.
And S108, determining a true distance value of the adjacent standard surface of the standard gauge block according to the real coordinate of the standard gauge block.
Obtaining a true value D of the distance between two standard surfaces of the standard gauge block through real point cloud datamn
And S109, determining a correction coefficient according to the distance between the adjacent standard surfaces of the standard gauge blocks and the distance truth value.
The correction coefficient is used for correcting the longitudinal coordinate of the object to be detected. Obtaining the combination of all two standard surfaces on the standard gauge block and the distance d between the two corresponding standard surfacesmnTrue value D of the distance between two standard surfacesmnThe formula for calculating the correction coefficient γ is as follows:
Figure BDA0003363747320000081
in the formula, N is the combined number of two standard surfaces.
Correcting the space coordinate of three-dimensional reconstruction after the three-dimensional measuring equipment is calibrated, and correcting the coordinate Z*γ · Z. In the application, the requirement on Z-direction coordinate calibration is high, so that the standard gauge blocks in the height direction are adopted for analysis.
According to the method and the device, the high-precision standard gauge block is used for correcting the data of the three-dimensional reconstructed space coordinate after calibration, and the calibration precision is further improved.
The above-mentioned contents are only for explaining the technical idea of the present application, and the protection scope of the present application is not limited thereby, and any modification made on the basis of the technical idea presented in the present application falls within the protection scope of the claims of the present application.
Additionally, the order in which elements and sequences of the processes described herein are processed, the use of alphanumeric characters, or the use of other designations, is not intended to limit the order of the processes and methods described herein, unless explicitly claimed. While various presently contemplated embodiments have been discussed in the foregoing disclosure by way of example, it should be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments herein. For example, although the system components described above may be implemented by hardware devices, they may also be implemented by software-only solutions, such as installing the described system on an existing server or mobile device.
Similarly, it should be noted that in the preceding description of embodiments of the application, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the embodiments. This method of disclosure, however, is not intended to require more features than are expressly recited in the claims. Indeed, the embodiments may be characterized as having less than all of the features of a single embodiment disclosed above.
The entire contents of each patent, patent application publication, and other material cited in this application, such as articles, books, specifications, publications, documents, and the like, are hereby incorporated by reference into this application. Except where the application is filed in a manner inconsistent or contrary to the present disclosure, and except where the claim is filed in its broadest scope (whether present or later appended to the application) as well. It is noted that the descriptions, definitions and/or use of terms in this application shall control if they are inconsistent or contrary to the statements and/or uses of the present application in the material attached to this application.

Claims (10)

1. A calibration method of three-dimensional measurement equipment is characterized by comprising the following steps:
acquiring calibration image data of a calibration object and space coordinates of feature points on the calibration object, wherein the space coordinates of the feature points on the calibration object are acquired according to the size of the calibration object;
determining the pixel coordinates of the characteristic points of the imaging surface corresponding to the characteristic points on the calibration object according to the calibration image data;
determining distortion parameters through the distortion model and pixel coordinates of the characteristic points of the imaging surface;
determining undistorted pixel coordinates corresponding to the pixel coordinates of the imaging surface feature points through the distortion model and the distortion parameters;
and determining a transformation matrix of the space coordinate and the undistorted pixel coordinate according to the homography matrix, wherein the transformation matrix is used for calibrating the coordinate of the object to be measured.
2. The calibration method of a three-dimensional measuring device according to claim 1, wherein after determining the transformation matrix, the calibration method further comprises:
acquiring an imaging coordinate graph of a standard gauge block and the distance between the standard gauge block and an adjacent standard surface;
determining the real coordinates of the standard gauge block according to the imaging coordinates of the standard gauge block and the transformation matrix;
determining a true value of the distance between the adjacent standard surfaces of the standard gauge blocks according to the real coordinates of the standard gauge blocks;
and determining a correction coefficient according to the distance between the adjacent standard surfaces of the standard gauge blocks and the distance truth value, wherein the correction coefficient is used for correcting the longitudinal coordinate of the object to be measured.
3. Calibration method for a three-dimensional measuring device according to claim 1, characterized in that said distortion parameters comprise the principal point (u)c,vc) Radial distortion coefficient k1、k2Sum angle τx、τy
4. The method for calibrating a three-dimensional measuring device according to claim 3, wherein the principal point is determined by cross-ratio invariance, comprising:
determining the intersection ratio CR of the feature point of the imaging surface in n directions and the feature point of the neighborhoodpAnd cross ratio CR corresponding to the feature points on the calibration objectT
Determining deviation E according to N characteristic points of an imaging surfacep
Figure FDA0003363747310000011
In the formula, wnThe direction distance weight is used, N is the number of directions, and N is the number of characteristic points;
determining a principal point (u)c,vc) And the principal point is the corresponding imaging surface characteristic point when the deviation is minimum.
5. The method for calibrating a three-dimensional measuring device according to claim 3, wherein the radial distortion coefficient and the Samm angle are determined by collinear feature points, and the method comprises the following steps:
determining ideal characteristic point p according to collinear characteristic points of rows or columnsij *
And establishing an objective optimization function according to the distance from the ideal characteristic points to the fitted straight line:
F(k1,k2,τx,τy)=∑∑|pij *-Lm|2
in the formula, k1、k2Is the radial distortion coefficient, τx、τyIs angle of Samm, pij *Is an ideal feature point, LmIs a fitting straight line;
and carrying out nonlinear optimization on the target optimization function, and determining the radial distortion coefficient and the Samm angle.
6. The calibration method of the three-dimensional measurement equipment according to claim 1, wherein the determining of the undistorted pixel coordinate corresponding to the pixel coordinate of the imaging surface feature point through the distortion model and the distortion parameter comprises:
determining a standard image plane distortion point coordinate, wherein the standard image plane distortion point coordinate is obtained by calculation according to the following formula:
Figure FDA0003363747310000021
in the formula, s1Is a scale factor, TsAccording to a rotation matrix R (tau)x,τy) Determined conversion relation, τx、τyIs angle of Samm, uc,vcIs a principal point coordinate; u. ofp,vpIs the pixel coordinate, x, of a characteristic point of the imaging planep′,yp' is a standard image plane distortion point coordinate;
determining a standard image surface distortion point-free coordinate, wherein the standard image surface distortion point-free coordinate is obtained by calculation according to the following formula:
xp *=xp′+δu(xp′,yp′)
yp *=yp′+δv(xp′,yp′)
in the formula, xp *,yp *No distortion of standard image planeCoordinate of change point, deltauAnd deltavThe model parameters all include radial distortion coefficient k1、k2
Determining the pixel coordinate of an imaging plane distortion-free point, wherein the pixel coordinate of the imaging plane distortion-free point is obtained by calculation according to the following formula:
Figure FDA0003363747310000022
in the formula (I), the compound is shown in the specification,
Figure FDA0003363747310000023
the pixel coordinate of the imaging plane without distortion point.
7. The calibration method of a three-dimensional measurement device according to claim 1, wherein the calibration image data includes a coordinate map and a gray scale map, the coordinate map includes pixel coordinates and row coordinate values, the gray scale map includes pixel coordinates and gray scale values, and the determination of the row coordinate values and gray scale values includes:
acquiring a plurality of groups of light stripe images of the calibration object, wherein each group of light stripe images correspond to images of the calibration object at different positions;
and determining corresponding line coordinate values and gray values of each group of light stripe images through a light stripe center extraction algorithm.
8. The method for calibrating three-dimensional measurement equipment according to claim 7, wherein determining the pixel coordinates of the feature point corresponding to the feature point of the imaging surface comprises:
according to the characteristic point coordinates in the gray scale image corresponding to the characteristic points of the calibration object, determining the column coordinates u of the characteristic points of the imaging surfacep
Determining coordinates in a coordinate graph according to the coordinates of the feature points in the gray scale graph;
determining the line coordinate v of the characteristic point of the imaging surface by an interpolation algorithm for the neighborhood coordinates of the coordinate valuesp
9. The calibration method of the three-dimensional measurement device according to claim 1, wherein the obtaining of the calibration image data includes:
scanning the moving calibration object to obtain a plurality of groups of image data to be processed;
and preprocessing the plurality of groups of image data to be processed to obtain the calibration image data.
10. The calibration method of the three-dimensional measurement equipment according to claim 1, wherein the calibration object comprises a platform base (21), a cushion block (22) and a calibration plate (23);
one side of the platform base is provided with a groove (211), the other side of the platform base is provided with a baffle (212), and the groove (211) is used for placing the cushion block (22); the calibration plate (23) is obliquely arranged on the baffle (212) and the cushion block (22) and forms an inclination angle theta with the platform base (21);
and determining the space coordinates of the characteristic points on the calibration object according to the inclination angle theta.
CN202111375763.0A 2021-11-19 2021-11-19 Calibration method of three-dimensional measurement equipment Pending CN114066996A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111375763.0A CN114066996A (en) 2021-11-19 2021-11-19 Calibration method of three-dimensional measurement equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111375763.0A CN114066996A (en) 2021-11-19 2021-11-19 Calibration method of three-dimensional measurement equipment

Publications (1)

Publication Number Publication Date
CN114066996A true CN114066996A (en) 2022-02-18

Family

ID=80278574

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111375763.0A Pending CN114066996A (en) 2021-11-19 2021-11-19 Calibration method of three-dimensional measurement equipment

Country Status (1)

Country Link
CN (1) CN114066996A (en)

Similar Documents

Publication Publication Date Title
CN109454634B (en) Robot hand-eye calibration method based on plane image recognition
CN108562250B (en) Keyboard keycap flatness rapid measurement method and device based on structured light imaging
CN109443209B (en) Line structured light system calibration method based on homography matrix
CN110443879B (en) Perspective error compensation method based on neural network
CN107121093A (en) A kind of gear measurement device and measuring method based on active vision
CN105783711B (en) Three-dimensional scanner correction system and correction method thereof
CN108716890A (en) A kind of high-precision size detecting method based on machine vision
CN103905719A (en) Correcting sheet for correcting multiple image capturing devices and correcting method thereof
CN111508027A (en) Method and device for calibrating external parameters of camera
CN109272555B (en) External parameter obtaining and calibrating method for RGB-D camera
CN106323286B (en) A kind of robot coordinate system and the transform method of three-dimensional measurement coordinate system
CN112614188A (en) Dot-matrix calibration board based on cross ratio invariance and identification method thereof
CN114359405A (en) Calibration method of off-axis Samm 3D line laser camera
CN112489137A (en) RGBD camera calibration method and system
CN111047586A (en) Pixel equivalent measuring method based on machine vision
CN112802123A (en) Binocular linear array camera static calibration method based on stripe virtual target
CN111476844B (en) Calibration method for multiple linear array camera array systems
WO2018168757A1 (en) Image processing device, system, image processing method, article manufacturing method, and program
CN109472778B (en) Appearance detection method for towering structure based on unmanned aerial vehicle
CN109506629B (en) Method for calibrating rotation center of underwater nuclear fuel assembly detection device
CN113989393A (en) Scanning device objective table flatness correction method based on image information processing
CN112381888A (en) Dynamic compensation method for H-shaped steel cutting path
CN112767494A (en) Precise measurement positioning method based on calibration algorithm
CN114066996A (en) Calibration method of three-dimensional measurement equipment
CN114295056B (en) Rapid correction method and application of visual positioning system of laser processing equipment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination