CN113959362B - Calibration method and inspection data processing method of structured light three-dimensional measurement system - Google Patents


Info

Publication number
CN113959362B
Authority
CN
China
Prior art keywords
calibration
point cloud
structured light
measurement system
dimensional
Prior art date
Legal status
Active
Application number
CN202111107578.3A
Other languages
Chinese (zh)
Other versions
CN113959362A (en)
Inventor
邓成呈
张猛
丁祥宇
蔡晓君
Current Assignee
Hangzhou Shenhao Technology Co Ltd
Original Assignee
Hangzhou Shenhao Technology Co Ltd
Priority date
Application filed by Hangzhou Shenhao Technology Co Ltd filed Critical Hangzhou Shenhao Technology Co Ltd
Priority to CN202111107578.3A
Publication of CN113959362A
Application granted
Publication of CN113959362B

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2545: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, with one projection direction and several detection directions, e.g. stereo
    • G01B11/2504: Calibration devices


Abstract

The application belongs to the technical field of track detection, and in particular relates to a calibration method for a structured light three-dimensional measurement system of an inspection robot, where the structured light three-dimensional measurement system comprises two structured light sensors arranged on the two sides of the inspection robot. The method comprises the following steps: S10, acquiring initial point cloud images, collected by the structured light three-dimensional measurement system, of two calibration blocks arranged side by side, wherein each calibration block comprises a plurality of trapezoidal stages on the same plane; S20, extracting the three-dimensional corner points of each trapezoidal stage from the initial point cloud images as feature points, and taking the three-dimensional corner points at corresponding positions on the two calibration blocks as feature point pairs of the left and right calibration blocks; S30, constructing the objective function of an ICP algorithm from the extracted feature point pairs of the left and right calibration blocks and solving it iteratively to obtain the transformation relation between the two structured light sensor coordinate systems. The method can calibrate the structured light sensors accurately and efficiently, improving the accuracy and efficiency of structured light measurement.

Description

Calibration method and inspection data processing method of structured light three-dimensional measurement system
Technical Field
The application belongs to the technical field of track detection, and in particular relates to a calibration method for a structured light three-dimensional measurement system of an inspection robot.
Background
The inspection robot uses left and right structured light units to measure the profile of each rail separately and then calculates the track gauge. Owing to machining tolerances and manual installation errors, the spatial relationship between the left and right laser-camera assemblies is unknown: it cannot be determined whether the two laser planes are coplanar, and the left and right structured light measuring points are misaligned along the running direction, which introduces measurement error.
Patent CN112785654A acquires multiple images of plane calibration plates, held in different calibration-target poses, through the cameras of the left- and right-side laser-camera assemblies of a track geometry detection system, and derives the transformation between the camera coordinate systems of the two assemblies. However, this approach requires that the left and right cameras can simultaneously capture the complete corresponding calibration plates, and it needs images of the calibration targets translated along the three axes of a motion coordinate system and rotated about those three axes. In practice this condition is often not met, because the left and right structured light units cannot capture the plates at the same time.
The existing methods therefore cannot calibrate the laser-camera assemblies accurately and efficiently, which reduces the accuracy and efficiency of structured light measurement.
Disclosure of Invention
(I) Technical problem to be solved
In view of the above shortcomings of the prior art, the present application provides a calibration method and an inspection data processing method for a structured light three-dimensional measurement system.
(II) Technical solution
To achieve the above purpose, the application adopts the following technical solution:
In a first aspect, an embodiment of the present application provides a calibration method for a structured light three-dimensional measurement system of an inspection robot, where the structured light three-dimensional measurement system includes two structured light sensors disposed on the two sides of the inspection robot, and the method includes:
S10, acquiring initial point cloud images, collected by the structured light three-dimensional measurement system, of two calibration blocks arranged side by side, wherein each calibration block comprises a plurality of trapezoidal stages on the same plane;
S20, extracting the three-dimensional corner points of each trapezoidal stage from the initial point cloud images as feature points, and taking the three-dimensional corner points at corresponding positions on the two calibration blocks as feature point pairs of the left and right calibration blocks;
S30, constructing the objective function of an ICP algorithm from the extracted feature point pairs of the left and right calibration blocks and solving it iteratively to obtain the transformation relation between the two structured light sensor coordinate systems.
Optionally, before S10, the method further includes:
placing the inspection robot on a calibration platform to ensure that the inspection robot travels consistently;
scanning, with the left and right structured light sensors of the inspection robot, two calibration blocks arranged side by side and fixed on the calibration platform, the two calibration blocks being placed within the working range of the left and right structured light sensors;
and acquiring point cloud images of the two calibration blocks through the left and right structured light sensors.
Optionally, the calibration block consists of 16 trapezoidal stages arranged in an array on a reference plane.
Optionally, extracting the three-dimensional corner points of each trapezoidal stage from the initial point cloud image in S20 includes:
S21, extracting plane mathematical models of the side faces, the upper surface and the bottom face of the trapezoidal stage from the initial point cloud image using a preset point cloud image segmentation algorithm;
S22, solving the 128 three-dimensional corner coordinates of each calibration block from the plane mathematical models of the three planes adjacent to each three-dimensional corner point.
Optionally, S30 includes:
s31, constructing a standard stl model of two calibration blocks arranged side by side through solidWorks, and generating standard point cloud images of the two calibration blocks;
s32, registering two initial point cloud images acquired by two structural light sensors on the basis of an ICP algorithm with the standard point cloud images respectively to obtain a rotation and translation matrix, wherein the rotation and translation matrix comprises a left calibration block rotation matrix, a left calibration block translation matrix, a right calibration block rotation matrix and a right calibration block translation matrix;
s33, obtaining a transformation relation between the two groups of structural light coordinate systems according to the relative position relation between the rotation and translation matrix and the two calibration blocks.
Optionally, the objective function is:

$E(R,t)=\frac{1}{n}\sum_{i=1}^{n}\left\|q_i-(Rp_i+t)\right\|^2$

where $n$ is the number of nearest point pairs, $p_i$ is a point in the target point cloud $P$, $q_i$ is the nearest point in the source point cloud $Q$ corresponding to $p_i$, $R$ is the rotation matrix and $t$ is the translation vector.
Optionally, the transformation relation between the two structured light sensor coordinate systems includes the rotation matrix and translation matrix from the camera coordinate system of the left structured light sensor to the camera coordinate system of the right structured light sensor.
In a second aspect, an embodiment of the present application provides a method for processing inspection data of a track line inspection robot, the method comprising:
obtaining the transformation relation between the two structured light sensor coordinate systems by the calibration method for a structured light three-dimensional measurement system of an inspection robot according to any one of the first aspect;
acquiring, through the structured light three-dimensional measurement system, two point cloud images collected by the track line inspection robot during line inspection;
adjusting the two point cloud images based on the transformation relation;
and carrying out three-dimensional reconstruction based on the adjusted point cloud images.
In a third aspect, an embodiment of the present application provides an electronic device, comprising a memory, a processor, and a computer program stored on the memory and runnable on the processor, where the computer program, when executed by the processor, implements the steps of the calibration method for a structured light three-dimensional measurement system of an inspection robot according to any one of the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer readable storage medium on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the calibration method for a structured light three-dimensional measurement system of an inspection robot according to any one of the first aspect.
(III) Beneficial effects
The beneficial effects of the application are as follows: the application provides a calibration method for a structured light three-dimensional measurement system of an inspection robot, where the structured light three-dimensional measurement system comprises two structured light sensors arranged on the two sides of the inspection robot. The method comprises the following steps: S10, acquiring initial point cloud images, collected by the structured light three-dimensional measurement system, of two calibration blocks arranged side by side, wherein each calibration block comprises a plurality of trapezoidal stages on the same plane; S20, extracting the three-dimensional corner points of each trapezoidal stage from the initial point cloud images as feature points, and taking the three-dimensional corner points at corresponding positions on the two calibration blocks as feature point pairs of the left and right calibration blocks; S30, constructing the objective function of an ICP algorithm from the extracted feature point pairs of the left and right calibration blocks and solving it iteratively to obtain the transformation relation between the two structured light sensor coordinate systems. By this method, the structured light sensors can be calibrated accurately and efficiently.
Further, the application also provides a method for processing the inspection data of a track line inspection robot, comprising: obtaining the transformation relation between the two structured light sensor coordinate systems by the above calibration method; acquiring, through the structured light three-dimensional measurement system, two point cloud images collected by the track line inspection robot during line inspection; adjusting the two point cloud images based on the transformation relation; and carrying out three-dimensional reconstruction based on the adjusted point cloud images. By this method, the measurement error caused by the misalignment of the left and right structured light measuring points along the running direction is greatly reduced, and the accuracy and efficiency of structured light measurement are improved.
Drawings
The application is described with the aid of the following figures:
FIG. 1 is a schematic flow chart of a calibration method for a structured light three-dimensional measurement system of an inspection robot in an embodiment of the application;
FIG. 2 is a schematic diagram of the calibration block according to an embodiment of the present application;
FIG. 3 is a schematic view of a trapezoidal stage and its corner points in one embodiment of the present application;
fig. 4 is a schematic flow chart of a method for processing inspection data of a track line inspection robot according to another embodiment of the present application;
fig. 5 is a schematic diagram of an architecture of an electronic device according to another embodiment of the application.
Detailed Description
The application will be better explained by the following detailed description of the embodiments with reference to the drawings. It is to be understood that the specific embodiments described below are merely illustrative of the related application, and not restrictive of the application. In addition, it should be noted that, in the case of no conflict, the embodiments of the present application and the features in the embodiments may be combined with each other; for convenience of description, only parts related to the application are shown in the drawings.
Example 1
Fig. 1 is a schematic flow chart of the calibration method for a structured light three-dimensional measurement system of an inspection robot in an embodiment of the application. As shown in fig. 1, in the calibration method of this embodiment, the structured light three-dimensional measurement system includes two structured light sensors disposed on the two sides of the inspection robot, and the method includes:
s10, acquiring initial point cloud images of two calibration blocks which are arranged side by side and acquired by a structured light three-dimensional measurement system, wherein the calibration blocks comprise a plurality of trapezoid tables on the same plane;
s20, extracting three-dimensional corner points of each trapezoidal table from the initial point cloud image to serve as characteristic points, and taking the three-dimensional corner points of the corresponding positions of the two calibration blocks as characteristic point pairs of the left calibration block and the right calibration block;
s30, constructing an objective function of an ICP algorithm according to the extracted characteristic point pairs of the left calibration block and the right calibration block, and carrying out iterative solution to obtain a transformation relation between two structural light sensor coordinate systems.
The calibration method of the structured light three-dimensional measurement system of the inspection robot can accurately and efficiently calibrate the structured light sensor, and improves the accuracy and efficiency of structured light measurement.
In order to better understand the present application, each step in this embodiment is explained below.
In S10, the inspection robot may be a railway line inspection robot.
In S10, each calibration block includes a plurality of trapezoidal stages on the same plane, the number of stages being at least 2. Specifically, in this embodiment the calibration block may consist of 16 trapezoidal stages arranged in an array on a reference plane. FIG. 2 is a schematic diagram of the calibration block in an embodiment of the present application; as shown in FIG. 2, 4×4 standard trapezoidal stages share a common reference plane, the 16 stages being distributed in an equally spaced array.
This embodiment merely illustrates one possible number of trapezoidal stages and does not limit that number.
Fig. 3 is a schematic view of a trapezoidal stage and its corner points in an embodiment of the present application. As shown in fig. 3, taking the upper-left trapezoidal stage of a calibration block: A is the reference plane, B, C, D and E are the four side faces of the stage, and F is the upper surface of the stage, parallel to reference plane A. Corner a is the unique intersection of faces A, B and E; b is the unique intersection of faces A, B and C; c is the unique intersection of faces A, D and C; d is the unique intersection of faces A, D and E; e is the unique intersection of faces E, B and F; f is the unique intersection of faces C, B and F; g is the unique intersection of faces E, D and F; h is the unique intersection of faces D, C and F.
In this embodiment, extracting the three-dimensional corner points of each trapezoidal stage from the initial point cloud image in S20 includes:
S21, extracting plane mathematical models of the side faces, the upper surface and the bottom face of the trapezoidal stage from the initial point cloud image using a preset point cloud image segmentation algorithm;
specifically, the preset point cloud image segmentation algorithm may be the random sample consensus (RANSAC) algorithm.
Each plane of the trapezoidal stage, namely the side faces, the upper surface and the bottom face, is extracted from the initial point cloud image with the RANSAC algorithm, specifically through the following steps:
s211, randomly selecting a sample subset from the samples, and then calculating model parameters for the subset by using a minimum variance estimation algorithm;
the planar model is adapted to the assumed local points, i.e. all unknown parameters can be calculated from the assumed and sufficient number of local points.
S212, calculating the deviation between all samples and the model, comparing the deviation with a preset threshold value, and marking the point as an intra-local point when the deviation is smaller than the threshold value, otherwise, eliminating the point;
all other data is tested with the model obtained in S211 and if a certain point is suitable for the estimated model, it is considered as an intra-local point as well. If there are enough points to be classified as hypothetical local points, then the estimated model is reasonable enough.
S213, re-estimating the model by using all assumed local points, and further obtaining more accurate model parameters.
S214, evaluating the model by estimating the error rate of the local points and the model.
S215, repeating the steps S211-S214 until the end condition is met, wherein the obtained model parameters are the optimal plane model; wherein the end condition includes: the model accords with the constraint condition of hypothesis, namely the error rate is smaller than the expected error rate, and the iteration times reach the preset times.
In this way, the plane models of the side faces, the upper surface and the bottom face of the trapezoidal stage are each extracted from the initial point cloud image by the RANSAC algorithm, as the sketch below illustrates.
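To make steps S211 to S215 concrete, the following Python sketch fits a single plane by random sample consensus (NumPy is assumed; function and parameter names, and the threshold and iteration defaults, are illustrative rather than taken from the patent):

```python
import numpy as np

def ransac_plane(points, threshold=0.5, iterations=200, rng=None):
    """Fit a plane a*x + b*y + c*z + d = 0 to an (N, 3) point array by
    random sample consensus; returns (coefficients [a, b, c, d], inlier mask)."""
    rng = np.random.default_rng() if rng is None else rng
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(iterations):
        # S211: minimal random sample (3 non-collinear points define a plane)
        sample = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        if np.linalg.norm(normal) < 1e-9:
            continue                        # collinear sample, draw again
        normal /= np.linalg.norm(normal)
        d = -normal @ sample[0]
        # S212: point-to-plane deviation vs. threshold -> hypothetical inliers
        inliers = np.abs(points @ normal + d) < threshold
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    if best_inliers.sum() < 3:
        raise ValueError("no plane model found")
    # S213: re-estimate the plane from all inliers (least squares via SVD)
    pts = points[best_inliers]
    centroid = pts.mean(axis=0)
    normal = np.linalg.svd(pts - centroid)[2][-1]
    return np.append(normal, -normal @ centroid), best_inliers
```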
S22, solving the 128 three-dimensional corner coordinates of each calibration block from the plane mathematical models of the three planes adjacent to each three-dimensional corner point.
In this embodiment, a three-dimensional corner point of the trapezoidal stage is uniquely determined as the common intersection of its three adjacent faces; the process of solving for this unique common intersection is described below.
The three plane equations are assumed to be:

$a_1x+b_1y+c_1z+d_1=0$
$a_2x+b_2y+c_2z+d_2=0$
$a_3x+b_3y+c_3z+d_3=0$

where $a_1, a_2, a_3, b_1, b_2, b_3, c_1, c_2, c_3, d_1, d_2, d_3$ are the plane equation coefficients.

Establishing the matrix equation

$\begin{pmatrix}a_1 & b_1 & c_1\\ a_2 & b_2 & c_2\\ a_3 & b_3 & c_3\end{pmatrix}\begin{pmatrix}x\\ y\\ z\end{pmatrix}=-\begin{pmatrix}d_1\\ d_2\\ d_3\end{pmatrix}$

it can be solved as

$(x, y, z)^{T}=-A^{-1}d$

where $A$ is the coefficient matrix above and $d=(d_1, d_2, d_3)^{T}$, the solution being unique provided $\det(A)\neq 0$, i.e. the three faces meet in a single point.
In this way the 8 × 16 = 128 corner points of each calibration block, each being the intersection of three faces, can be obtained. Meanwhile, since the absolute positions of the 128 corner points of the calibration block are known, 128 groups of corresponding ordered points are obtained.
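The linear solve above translates directly into code; this NumPy sketch (with illustrative names) recovers one corner point from three fitted plane models:

```python
import numpy as np

def plane_intersection(planes):
    """planes: sequence of three rows [a, b, c, d] with a*x + b*y + c*z + d = 0.
    Returns the unique intersection point, assuming det(A) != 0."""
    planes = np.asarray(planes, dtype=float)
    A, d = planes[:, :3], planes[:, 3]   # A holds the three plane normals
    return np.linalg.solve(A, -d)        # x = -A^{-1} d

# e.g. corner e of Fig. 3 from the fitted models of faces E, B and F:
# corner_e = plane_intersection([coeffs_E, coeffs_B, coeffs_F])
```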
In S30, the method includes:
s31, constructing a standard stl model of two calibration blocks arranged side by side through solidWorks, and generating standard point cloud images of the two calibration blocks;
in this embodiment, since the two calibration blocks are arranged side by side, the two standard point clouds, i.e. the ideal state of the left standard point cloud and the right standard point cloud, are completely parallel, and only the translation amount in the x direction exists, so there are:
P Lbase =P Rbase +T base
wherein ,PLbase Is a left standard point cloud P Lbase ,P Rbase Is the right standard point cloud, T base Is a translation vector between two standard point clouds.
S32, registering the two initial point cloud images acquired by the two structured light sensors against the standard point cloud images respectively, based on the Iterative Closest Point (ICP) point cloud matching algorithm, to obtain rotation and translation matrices comprising a left calibration block rotation matrix, a left calibration block translation matrix, a right calibration block rotation matrix and a right calibration block translation matrix.
The ICP registration algorithm is widely used in many fields, but the original ICP algorithm is computationally expensive, sensitive to the initial transformation, and prone to falling into a local optimum. If a surface carries too many points, downsampling can reduce the point count and speed up the computation, but downsampling disturbs the positions of the original registration points and increases the matching error. In this embodiment, weighing registration speed against registration accuracy, the 16 × 8 three-dimensional corner points of the calibration block are used as anchor feature points to accelerate convergence, and each of the 128 corner points is generated as the intersection of the three faces associated with it, which guarantees both the accuracy of the extracted corner points and the accuracy of registration.
The ICP algorithm is described below.
The final purpose of point cloud registration is to unify two or more sets of point cloud data, expressed in different coordinate systems, into the same reference coordinate system through rotation and translation transformations. This process can be described by a mapping. Assuming the mapping is a rigid transformation matrix H, six unknowns α, β, γ, tx, ty and tz must be found in the rigid transformation matrix. Solving for six unknown parameters requires six linear equations, i.e. at least 3 pairs of corresponding points, not all collinear, must be found between the point clouds to be matched. S20 yields 128 pairs of data points in total, which completes the parameter estimation of the rigid matrix; using more than 3 pairs further improves the estimation accuracy of the rigid transformation matrix.
The basic principle of the ICP algorithm is as follows: in the target point cloud P and the source point cloud Q to be matched, find the nearest point pairs $(p_i, q_i)$ according to certain constraints, and then compute the optimal matching parameters R and t such that the error function is minimized. The error function $E(R,t)$ is:

$E(R,t)=\frac{1}{n}\sum_{i=1}^{n}\left\|q_i-(Rp_i+t)\right\|^2$

where $n$ is the number of nearest point pairs, $p_i$ is a point in the target point cloud $P$, $q_i$ is the nearest point in the source point cloud $Q$ corresponding to $p_i$, $R$ is the rotation matrix and $t$ is the translation vector.
The steps of the ICP algorithm are:
(1) Take a point subset $p_i \in P$ from the target point cloud P;
(2) Find the corresponding point set $q_i \in Q$ in the source point cloud Q such that $\|q_i - p_i\|$ is minimal;
(3) Compute the rotation matrix R and translation vector t that minimize the error function;
(4) Apply the rotation matrix R and translation vector t obtained in the previous step to $p_i$, giving the new point set $p_i' = Rp_i + t$, $p_i \in P$;
(5) Compute the average distance d between $p_i'$ and the corresponding point set $q_i$;
(6) If d is smaller than a given threshold, or the number of iterations exceeds the preset maximum, stop the iteration; otherwise return to step (2) until the convergence condition is met.
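As an illustration of steps (1) to (6), here is a minimal point-to-point ICP sketch in Python (NumPy and SciPy assumed; the SVD-based closed-form pose solution used for step (3) is one standard choice, and all names are illustrative, not the patent's implementation):

```python
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Closed-form R, t minimizing sum ||dst_i - (R @ src_i + t)||^2 (SVD/Kabsch)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:        # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, dst_c - R @ src_c

def icp(target, source, max_iter=50, tol=1e-6):
    """Align the target point cloud P onto the source point cloud Q;
    returns the accumulated rotation R and translation t."""
    R_total, t_total = np.eye(3), np.zeros(3)
    pts = target.copy()
    tree = cKDTree(source)
    prev_err = np.inf
    for _ in range(max_iter):
        dist, idx = tree.query(pts)                     # steps (1)-(2)
        R, t = best_rigid_transform(pts, source[idx])   # step (3)
        pts = pts @ R.T + t                             # step (4)
        R_total, t_total = R @ R_total, R @ t_total + t
        err = dist.mean()                               # step (5)
        if abs(prev_err - err) < tol:                   # step (6)
            break
        prev_err = err
    return R_total, t_total
```

With the 128 known corner correspondences per block, the nearest-neighbour search degenerates to a fixed pairing, which is precisely why the corner anchors accelerate convergence.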
Each of the two structured light sensors acquires a point cloud image of the two calibration blocks in the world coordinate system, each calibration block carrying 128 corner points. Performing the registration above against the ideal point cloud of two exactly parallel calibration blocks yields the rotation and translation between the calibration block pose acquired in real time and the standard calibration block pose, namely: the left calibration block rotation matrix $R_L$, the left calibration block translation matrix $T_L$, the right calibration block rotation matrix $R_R$ and the right calibration block translation matrix $T_R$.
S33, obtaining the transformation relation between the two structured light coordinate systems from the rotation and translation matrices and the relative positional relation between the two calibration blocks.
In this embodiment, the transformation relation between the two structured light sensor coordinate systems includes the rotation matrix and translation matrix from the camera coordinate system of the left structured light sensor to the camera coordinate system of the right structured light sensor.
In this embodiment, since:

$P_{Rbase}=Q_{Rbase}R_{Rinit}+Tpc_{Rbase}$
$P_{Lbase}=Q_{Lbase}R_{Linit}+Tpc_{Lbase}$

where $P_{Rbase}$, $P_{Lbase}$ are the right and left initial point clouds after the rotation and translation transformation onto the standard point clouds, $Q_{Rbase}$, $Q_{Lbase}$ are the right and left initial point clouds, $R_{Rinit}$, $R_{Linit}$ are the rotation-vector initialization matrices formed from the right and left point cloud normal vectors, and $Tpc_{Rbase}$, $Tpc_{Lbase}$ are the right and left translation initialization vectors.
Combining the coordinate relation of the standard point clouds, $P_{Lbase}=P_{Rbase}+T_{base}$, then

$P_{Lbase}=Q_{Lbase}R_{Linit}+Tpc_{Lbase}=P_{Rbase}+T_{base}=Q_{Rbase}R_{Rinit}+Tpc_{Rbase}+T_{base}$

namely:

$Q_{Lbase}R_{Linit}+Tpc_{Lbase}=Q_{Rbase}R_{Rinit}+Tpc_{Rbase}+T_{base}$

from which the transformation relation between the two structured light coordinate systems can be obtained.
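One consistent reading of the chain of equations above is that each registration returns a rotation R and translation t mapping sensor-frame points onto the corresponding standard point cloud; the right-to-left sensor transform is then obtained by composition. The sketch below (NumPy; variable names hypothetical, not from the patent) expresses this:

```python
import numpy as np

def right_to_left_transform(R_L, t_L, R_R, t_R, T_base):
    """Compose the right-sensor -> left-sensor rigid transform, assuming
    x_std = R_L @ x_left + t_L and x_std = R_R @ x_right + t_R for the
    left/right standard poses, with T_base the known offset between them."""
    R_RL = R_L.T @ R_R                       # rotation right -> left
    t_RL = R_L.T @ (t_R + T_base - t_L)      # translation right -> left
    return R_RL, t_RL
```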
Preferably, in S30, S31 further includes:
the calculation formula of the point cloud centroid coordinates is as follows:
wherein Pc is the centroid coordinates of the point cloud, n is the number of points in the point cloud, and x i 、y i 、z i The i-th point coordinates, respectively.
And respectively extracting the angular point centroid and the point cloud normal vector according to the above formula, and carrying out translation and rotation to serve as registration initial gestures.
$P_{Rbase}=Q_{Rbase}R_{Rinit}+Tpc_{Rbase}$
$P_{Lbase}=Q_{Lbase}R_{Linit}+Tpc_{Lbase}$

where $P_{Rbase}$, $P_{Lbase}$ are the right and left point clouds after the rotation and translation transformation onto the standard point clouds, $Q_{Rbase}$, $Q_{Lbase}$ are the right and left original point clouds, $R_{Rinit}$, $R_{Linit}$ are the rotation-vector initialization matrices formed from the right and left point cloud normal vectors, and $Tpc_{Rbase}$, $Tpc_{Lbase}$ are the right and left translation initialization vectors.
Because the corner points of the calibration block are unique and their positions are deterministic, while the ICP comparison depends on the initial transformation value, the initial registration value can be given by the centroid position of the 128 corner points, which further strengthens the registration constraint.
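A centroid-and-normal initialization consistent with this description might be sketched as follows (NumPy; the SVD plane fit for the dominant normal and the Rodrigues formula for the aligning rotation are implementation assumptions, not details given in the patent):

```python
import numpy as np

def init_pose(corners_meas, corners_ref):
    """Initial registration pose: rotate the dominant plane normal of the
    measured corner set onto that of the reference set, then map centroid
    onto centroid."""
    def dominant_normal(pts):
        # normal of the best-fit plane through the corners (SVD)
        return np.linalg.svd(pts - pts.mean(axis=0))[2][-1]

    n_s, n_d = dominant_normal(corners_meas), dominant_normal(corners_ref)
    v, c = np.cross(n_s, n_d), float(n_s @ n_d)
    if np.linalg.norm(v) < 1e-9:
        R = np.eye(3)   # normals already parallel (antiparallel case omitted)
    else:
        K = np.array([[0., -v[2], v[1]],
                      [v[2], 0., -v[0]],
                      [-v[1], v[0], 0.]])
        R = np.eye(3) + K + K @ K * ((1 - c) / (v @ v))   # Rodrigues formula
    t = corners_ref.mean(axis=0) - R @ corners_meas.mean(axis=0)
    return R, t
```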
In this embodiment, before S10, the method may further include:
the inspection robot is placed on a calibration platform to ensure the consistency of walking of the inspection robot; meanwhile, the absolute position between the calibration blocks can be ensured;
scanning two calibration blocks which are arranged side by side and fixed on a calibration platform through the left and right side structure light sensors in the inspection robot, wherein the two calibration blocks are arranged in the working range of the left and right side structure light sensors;
and acquiring point cloud images of the two calibration blocks through the structural light sensors at the left side and the right side.
The acquisition process requires a calibration platform, two high-precision calibration blocks, and the relative positions and the respective three-dimensional forms of the calibration blocks are known. Since the left and right structured light pose and distance are always kept constant, the next set of point cloud data of the respective reference coordinate system is acquired by the left and right cameras, respectively, and the two sets of point clouds are associated together in pairs. Thus, the calibration problem of two camera coordinate systems can be translated into the registration problem of two sets of point cloud data.
According to the method, two groups of standard calibration terraces fixed on calibration tables with higher machining precision are collected through the left and right side structure light sensor assemblies in the line inspection robot, the standard calibration terraces are arranged in the working range of the left and right side structure light sensor assemblies, intersection points of three adjacent sides of the three-dimensional 4*4 terraces are extracted, 4 x 8 corner feature points are taken as a total, an ICP objective function is constructed according to the extracted feature point pairs of the left and right reference terraces, high-precision iterative solution is carried out, transformation relations between coordinate systems of the left and right side structure light sensor assemblies without common vision field are calibrated, namely, absolute positions in the coordinate systems of the inspection robot are achieved, accurate and efficient calibration of the structure light sensors is achieved, and therefore the problem that the position and the posture of the left and right structure light of the line inspection robot are inconsistent during installation can be solved.
Example 2
In a second aspect of the present application, another embodiment provides a method for processing inspection data of a track line inspection robot. Fig. 4 is a schematic flow chart of the method for processing inspection data of a track line inspection robot according to this embodiment; as shown in fig. 4, the method includes:
S100, obtaining the transformation relation between the two structured light sensor coordinate systems by the calibration method for a structured light three-dimensional measurement system of an inspection robot of the first embodiment;
S200, acquiring, through the structured light three-dimensional measurement system, two point cloud images collected by the track line inspection robot during line inspection;
S300, adjusting the two point cloud images based on the transformation relation;
S400, carrying out three-dimensional reconstruction based on the adjusted point cloud images.
This inspection data processing method calibrates the transformation relation between the left and right structured light coordinate systems and thus obtains the relative pose of the two structured light units in the coordinate system of the line inspection robot, solving the relative coordinate system problem in the absence of a common field of view and improving the accuracy and efficiency of the inspection robot's cross-view structured light measurement.
Example 3
In a third aspect, a third embodiment of the present application provides an electronic device, comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the calibration method for a structured light three-dimensional measurement system of an inspection robot according to any one of the above embodiments.
Fig. 5 is a schematic diagram of an architecture of an electronic device according to another embodiment of the application.
The electronic device shown in fig. 5 may include: at least one processor 101, at least one memory 102, at least one network interface 104, and other user interfaces 103. The various components in the electronic device are coupled together by a bus system 105. It is understood that the bus system 105 is used to enable connected communications between these components. The bus system 105 includes a power bus, a control bus, and a status signal bus in addition to a data bus. But for clarity of illustration the various buses are labeled as bus system 105 in fig. 5.
The user interface 103 may include, among other things, a display, a keyboard, or a pointing device (e.g., a mouse, a trackball (trackball), or a touch pad, etc.).
It will be appreciated that the memory 102 in this embodiment may be either volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or a flash memory, among others. The volatile memory may be a random access memory (RAM) that acts as an external cache. By way of example and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct Rambus RAM (DRRAM). The memory 102 described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
In some implementations, the memory 102 stores the following elements, executable units or data structures, or a subset thereof, or an extended set thereof: an operating system 1021, and application programs 1022.
The operating system 1021 includes various system programs, such as a framework layer, a core library layer and a driver layer, for implementing various basic services and processing hardware-based tasks. The application programs 1022 contain various applications, such as an industrial control device operation management system, for implementing various application services. A program implementing the method of the embodiment of the present application may be included in the application programs 1022.
In an embodiment of the present application, the processor 101 is configured to execute the method steps provided in the first aspect by calling a program or an instruction stored in the memory 102, specifically, a program or an instruction stored in the application 1022.
The method disclosed in the above embodiment of the present application may be applied to the processor 101 or implemented by the processor 101. The processor 101 may be an integrated circuit chip with signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in the processor 101 or instructions in the form of software. The processor 101 described above may be a general purpose processor, a digital signal processor, an application specific integrated circuit, an off-the-shelf programmable gate array or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component. The disclosed methods, steps, and logic blocks in the embodiments of the present application may be implemented or performed. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present application may be embodied directly in the execution of a hardware decoding processor, or in the execution of a combination of hardware and software elements in a decoding processor. The software elements may be located in a random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, registers, etc. as well known in the art. The storage medium is located in the memory 102, and the processor 101 reads information in the memory 102, and in combination with its hardware, performs the steps of the method described above.
In addition, in combination with the calibration method of the structured light three-dimensional measurement system of the inspection robot in the above embodiment, the embodiment of the application may provide a computer readable storage medium, on which a computer program is stored, and when the computer program is executed by a processor, the calibration method of the structured light three-dimensional measurement system of the inspection robot in any one of the above method embodiments is implemented.
It should be noted that in the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The application may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. The use of the terms first, second, third, etc. is for convenience of description only and does not denote any order. These terms may be understood as part of the component name.
Furthermore, it should be noted that in the description of the present specification, the terms "one embodiment," "some embodiments," "example," "specific example," or "some examples," etc., refer to a specific feature, structure, material, or characteristic described in connection with the embodiment or example being included in at least one embodiment or example of the present application. In this specification, schematic representations of the above terms are not necessarily directed to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, the different embodiments or examples described in this specification and the features of the different embodiments or examples may be combined and combined by those skilled in the art without contradiction.
While preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art upon learning the basic inventive concepts. Therefore, the appended claims should be construed to include preferred embodiments and all such variations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present application without departing from the spirit or scope of the application. Thus, the present application should also include such modifications and variations provided that they come within the scope of the following claims and their equivalents.

Claims (8)

1. A calibration method for a structured light three-dimensional measurement system of an inspection robot, wherein the structured light three-dimensional measurement system comprises two structured light sensors arranged on two sides of the inspection robot, the method comprising:
S10, acquiring initial point cloud images, collected by the structured light three-dimensional measurement system, of two calibration blocks arranged side by side, wherein each calibration block comprises a plurality of trapezoidal stages on the same plane;
S20, extracting the three-dimensional corner points of each trapezoidal stage from the initial point cloud images as feature points, and taking the three-dimensional corner points at corresponding positions on the two calibration blocks as feature point pairs of the left and right calibration blocks;
wherein extracting the three-dimensional corner points of each trapezoidal stage from the initial point cloud image comprises:
S21, extracting plane mathematical models of the side faces, the upper surface and the bottom face of the trapezoidal stage from the initial point cloud image using a preset point cloud image segmentation algorithm;
S22, solving the 128 three-dimensional corner coordinates of each calibration block from the plane mathematical models of the three planes adjacent to each three-dimensional corner point;
S30, constructing the objective function of an ICP algorithm from the extracted feature point pairs of the left and right calibration blocks and solving it iteratively to obtain the transformation relation between the two structured light sensor coordinate systems, comprising:
S31, constructing a standard STL model of the two calibration blocks arranged side by side in SolidWorks, and generating standard point cloud images of the two calibration blocks;
S32, registering, based on the ICP algorithm, the two initial point cloud images acquired by the two structured light sensors against the standard point cloud images respectively to obtain rotation and translation matrices, including a left calibration block rotation matrix, a left calibration block translation matrix, a right calibration block rotation matrix and a right calibration block translation matrix;
S33, obtaining the transformation relation between the two structured light coordinate systems from the rotation and translation matrices and the relative positional relation between the two calibration blocks.
2. The calibration method for a structured light three-dimensional measurement system of an inspection robot according to claim 1, further comprising, before S10:
placing the inspection robot on a calibration platform to ensure that the inspection robot travels consistently;
scanning, with the left and right structured light sensors of the inspection robot, two calibration blocks arranged side by side and fixed on the calibration platform, the two calibration blocks being placed within the working range of the left and right structured light sensors;
and acquiring point cloud images of the two calibration blocks through the left and right structured light sensors.
3. The calibration method for a structured light three-dimensional measurement system of an inspection robot according to claim 2, wherein the calibration block consists of 16 trapezoidal stages arranged in an array on a reference plane.
4. The calibration method for a structured light three-dimensional measurement system of an inspection robot according to claim 1, wherein the objective function is:

$E(R,t)=\frac{1}{n}\sum_{i=1}^{n}\left\|q_i-(Rp_i+t)\right\|^2$

where $n$ is the number of nearest point pairs, $p_i$ is a point in the target point cloud $P$, $q_i$ is the nearest point in the source point cloud $Q$ corresponding to $p_i$, $R$ is the rotation matrix, and $t$ is the translation vector.
5. The calibration method for a structured light three-dimensional measurement system of an inspection robot according to claim 1, wherein the transformation relation between the two structured light sensor coordinate systems includes the rotation matrix and translation matrix from the camera coordinate system of the left structured light sensor to the camera coordinate system of the right structured light sensor.
6. A method for processing inspection data of a track line inspection robot, comprising:
obtaining the transformation relation between the two structured light sensor coordinate systems by the calibration method for a structured light three-dimensional measurement system of an inspection robot according to any one of claims 1 to 5;
acquiring, through the structured light three-dimensional measurement system, two point cloud images collected by the track line inspection robot during line inspection;
adjusting the two point cloud images based on the transformation relation;
and carrying out three-dimensional reconstruction based on the adjusted point cloud images.
7. An electronic device, comprising: a memory, a processor and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the calibration method for a structured light three-dimensional measurement system of an inspection robot according to any one of claims 1 to 5.
8. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the calibration method for a structured light three-dimensional measurement system of an inspection robot according to any one of claims 1 to 5.
CN202111107578.3A 2021-09-22 2021-09-22 Calibration method and inspection data processing method of structured light three-dimensional measurement system Active CN113959362B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111107578.3A CN113959362B (en) 2021-09-22 2021-09-22 Calibration method and inspection data processing method of structured light three-dimensional measurement system


Publications (2)

Publication Number Publication Date
CN113959362A (en) 2022-01-21
CN113959362B (en) 2023-09-12

Family

ID=79462355

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111107578.3A Active CN113959362B (en) 2021-09-22 2021-09-22 Calibration method and inspection data processing method of structured light three-dimensional measurement system

Country Status (1)

Country Link
CN (1) CN113959362B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115035195B (en) * 2022-08-12 2022-12-09 歌尔股份有限公司 Point cloud coordinate extraction method, device, equipment and storage medium

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008164493A (en) * 2006-12-28 2008-07-17 Pulstec Industrial Co Ltd Method for measuring three-dimensional shape, and calibrating body
CN102364299A (en) * 2011-08-30 2012-02-29 刘桂华 Calibration technology for multiple structured light projected three-dimensional profile measuring heads
CN105180830A (en) * 2015-09-28 2015-12-23 浙江大学 Automatic three-dimensional point cloud registration method applicable to ToF (Time of Flight) camera and system
CN105953747A (en) * 2016-06-07 2016-09-21 杭州电子科技大学 Structured light projection full view three-dimensional imaging system and method
CN107843208A (en) * 2017-10-27 2018-03-27 北京矿冶研究总院 Mine roadway contour sensing method and system
WO2018103694A1 (en) * 2016-12-07 2018-06-14 苏州笛卡测试技术有限公司 Robotic three-dimensional scanning device and method
CN108225216A (en) * 2016-12-14 2018-06-29 中国科学院深圳先进技术研究院 Structured-light system scaling method and device, structured-light system and mobile equipment
WO2018195986A1 (en) * 2017-04-28 2018-11-01 SZ DJI Technology Co., Ltd. Calibration of laser sensors
CN110058237A (en) * 2019-05-22 2019-07-26 中南大学 InSAR point Yun Ronghe and three-dimensional deformation monitoring method towards High-resolution SAR Images
CN110440692A (en) * 2019-08-27 2019-11-12 大连理工大学 Laser tracker and structured light 3D scanner combined type measure scaling method
CN111121628A (en) * 2019-12-31 2020-05-08 芜湖哈特机器人产业技术研究院有限公司 Calibration method of three-dimensional scanning system of carriage container based on two-dimensional laser radar
CN111429490A (en) * 2020-02-18 2020-07-17 北京林业大学 Agricultural and forestry crop three-dimensional point cloud registration method based on calibration ball
CN112561966A (en) * 2020-12-22 2021-03-26 清华大学 Sparse point cloud multi-target tracking method fusing spatio-temporal information
CN112785654A (en) * 2021-01-21 2021-05-11 中国铁道科学研究院集团有限公司基础设施检测研究所 Calibration method and device for track geometry detection system
CN112950562A (en) * 2021-02-22 2021-06-11 杭州申昊科技股份有限公司 Fastener detection algorithm based on line structured light

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
邓成呈. "面向服务机器人三维地图创建的大规模点云分割" (Large-scale point cloud segmentation for three-dimensional map building of service robots). 《机电一体化》 (Mechatronics). 2012. *

Also Published As

Publication number Publication date
CN113959362A (en) 2022-01-21

Similar Documents

Publication Publication Date Title
CN108346165B (en) Robot and three-dimensional sensing assembly combined calibration method and device
US10979694B2 (en) Information processing apparatus, method, and storage medium
Zhang Camera calibration with one-dimensional objects
Zhou et al. Complete calibration of a structured light stripe vision sensor through planar target of unknown orientations
Vasconcelos et al. A minimal solution for the extrinsic calibration of a camera and a laser-rangefinder
US9928595B2 (en) Devices, systems, and methods for high-resolution multi-view camera calibration
Zeller et al. Depth estimation and camera calibration of a focused plenoptic camera for visual odometry
CN111127422A (en) Image annotation method, device, system and host
CN112183171A (en) Method and device for establishing beacon map based on visual beacon
JP7151879B2 (en) Camera calibration device, camera calibration method, and program
JP7462769B2 (en) System and method for characterizing an object pose detection and measurement system - Patents.com
WO2020188799A1 (en) Camera calibration device, camera calibration method, and non-transitory computer-readable medium having program stored thereon
Chen et al. A self-recalibration method based on scale-invariant registration for structured light measurement systems
CN116433737A (en) Method and device for registering laser radar point cloud and image and intelligent terminal
CN111210478A (en) Method, medium and system for calibrating external parameters of common-view-free multi-camera system
CN110675440A (en) Confidence evaluation method and device for three-dimensional depth data and computer equipment
CN113959362B (en) Calibration method and inspection data processing method of structured light three-dimensional measurement system
Wei et al. Flexible calibration of a portable structured light system through surface plane
CN112629565B (en) Method, device and equipment for calibrating rotation relation between camera and inertial measurement unit
Tahri et al. Efficient iterative pose estimation using an invariant to rotations
Yang et al. A novel camera calibration method based on circle projection model
Li et al. Extrinsic calibration of non-overlapping multi-camera system with high precision using circular encoded point ruler
Duan et al. Calibrating effective focal length for central catadioptric cameras using one space line
CN113706635B (en) Long-focus camera calibration method based on point feature and line feature fusion
CN113012279B (en) Non-contact three-dimensional imaging measurement method and system and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant