CN111735390A - Calibration block for line laser sensor and hand-eye calibration method - Google Patents

Calibration block for line laser sensor and hand-eye calibration method

Info

Publication number
CN111735390A
CN111735390A (application CN202010881809.5A; granted as CN111735390B)
Authority
CN
China
Prior art keywords
checkerboard
laser sensor
line laser
coordinate system
calibration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010881809.5A
Other languages
Chinese (zh)
Other versions
CN111735390B (en)
Inventor
江文松
罗哉
赵洪楠
朱志远
陈艺文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Jiliang University
Original Assignee
China Jiliang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Jiliang University filed Critical China Jiliang University
Priority to CN202010881809.5A priority Critical patent/CN111735390B/en
Publication of CN111735390A publication Critical patent/CN111735390A/en
Application granted granted Critical
Publication of CN111735390B publication Critical patent/CN111735390B/en
Priority to JP2021000997A priority patent/JP7075145B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/002 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/0095 Means or methods for testing manipulators
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B25J9/1689 Teleoperation

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention provides a calibration block for a line laser sensor and a hand-eye calibration method, and relates to laser sensor calibration technology. It addresses the shortcomings of existing laser sensor calibration techniques. The surface of the calibration block carries a checkerboard formed by intersecting longitudinal and transverse equidistant checkerboard lines; a circular truncated cone hole is formed in the middle of the calibration block, its axis perpendicular to the surface of the checkerboard and passing through an intersection point of two intersecting checkerboard lines. Because the calibration block only requires checkerboard lines and a circular truncated cone hole to be marked on a plane, machining is simpler and machining precision is easier to improve; the robot has a wider pose transformation range during calibration, the fitting and extraction of the feature points is more convenient, easier to operate, fast and accurate, and the coordinates of the calibration block in the world coordinate system do not need to be measured.

Description

Calibration block for line laser sensor and hand-eye calibration method
Technical Field
The invention belongs to the technical field of optical detection, and particularly relates to a calibration technology of a laser sensor.
Background
With the continuous improvement of manufacturing processes, the industrial inspection industry places ever higher requirements on measurement methods. Traditional contact measurement can no longer meet these requirements, and non-contact, high-precision, high-sensitivity, digital and portable measurement has become the main trend of the inspection industry. The line laser sensor measures by means of laser technology; as a branch of photoelectric detection, non-contact laser measurement offers high speed, high precision, a wide measuring range, strong reliability and a wide application range, and is widely used for measuring length, distance, three-dimensional topography and the like.
To meet the requirements of efficient, high-precision measurement, combined measurement systems in which a line laser sensor is mounted on the end effector of a robot arm are increasingly used in modern industry. At installation, the positional relationship between the robot and the line laser sensor is unknown; to measure accurately, this relationship must be solved and corrected before actual measurement, which is known as hand-eye calibration. The coordinates of an object in the image coordinate system are first converted into the sensor coordinate system using the intrinsic parameters of the sensing device, and the coordinates in the sensor coordinate system are then converted into the world coordinate system using the extrinsic parameters of the measurement system. The intrinsic parameters of the sensing device are fixed at the factory and cannot be modified in use; the extrinsic parameters can be adjusted and calibrated, and an important step is the hand-eye calibration of the robot vision system, which determines the relative position of the sensor and the robot.
In brief, the purpose of hand-eye calibration is to obtain the transformation between the base (or world) coordinate system and the visual coordinate system of the line laser sensor, so that data measured by the line laser sensor in its visual coordinate system can be unified into the base or world coordinate system.
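Written out, this goal is the single transform chain below; the coordinate-frame names are the ones used in the calibration equations later in the description, while the compact notation itself is only an editorial summary:

$$P_{Base} = {}^{Base}T_{End}\cdot {}^{End}T_{Sensor}\cdot P_{Sensor}$$

where $P_{Sensor}$ is a point measured by the line laser sensor in its own coordinate system, ${}^{Base}T_{End}$ is the end-effector pose reported by the robot, and ${}^{End}T_{Sensor}$ is the fixed hand-eye transform that calibration must determine.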
At present, hand-eye calibration methods generally include the limited scene point method, the plane target method and the standard sphere method. In the limited scene point method, the same spatial point P is photographed from different camera poses, and the hand-eye relationship between the camera and the robot is obtained by solving for the position of point P in the different images. Because the line laser sensor has a limited measurement range and a different measurement principle, this method is not suitable for its hand-eye calibration.
The standard sphere method is currently the usual choice for hand-eye calibration of a line laser sensor, but its operating procedure is cumbersome, it places high demands on the machining precision of the standard sphere, and the circle-fitting error of the tangential section when computing the sphere-centre coordinates degrades the accuracy of the calibration result, so the precision requirement of the hand-eye calibration is difficult to guarantee.
Disclosure of Invention
The invention aims to solve the problems in the prior art, and provides a calibration block for a line laser sensor and a hand-eye calibration method.
The purpose of the invention can be realized by the following technical scheme: a calibration block for a line laser sensor and a hand-eye calibration method are provided. The surface of the calibration block is provided with a checkerboard formed by intersecting longitudinal and transverse equidistant checkerboard lines; a circular truncated cone hole is formed in the middle of the calibration block, the axis of the circular truncated cone hole is perpendicular to the surface of the checkerboard, and the axis of the circular truncated cone hole passes through an intersection point of two intersecting lines on the checkerboard.
In some embodiments, the diameter of the circular opening edge where the wall surface of the circular truncated cone hole intersects with the surface of the checkerboard is smaller than the side length of the smallest square on the checkerboard, i.e. the distance between two adjacent parallel lines.
In some embodiments, the included angle between the truncated cone generatrix and the bottom surface is 45 °.
In some embodiments, the checkerboard lines are intaglio (engraved) lines.
In some embodiments, a scanning positioning device is arranged below the circular truncated cone hole. The scanning positioning device comprises photoelectric detectors capable of receiving the scanning light; the photoelectric probes of the detectors are arranged on a deformation frame. The deformation frame comprises 12 arc-shaped deformation units, each consisting of two arc-shaped rods whose middle portions are rotatably connected by a middle hinge shaft; the arc-shaped rods of the deformation units are connected end to end by end hinge shafts into a closed, ring-shaped deformation frame. The ring-shaped deformation frame is perpendicular to the central axis of the circular truncated cone hole, its centre lies on that axis, and the two end hinge shafts of a given deformation unit lying in the same radial direction slide on a radial guide rail. A linear motor drives the end hinge shafts or the middle hinge shafts along the radial guide rails, thereby controlling the contraction of the deformation frame and changing the distance between the photoelectric detectors and the central axis of the circular truncated cone hole; the photoelectric detectors are arranged on the 12 end hinge shafts on the inner side of the deformation frame.
A hand-eye calibration method for a line laser sensor comprises the following steps:
(1) sensor setting: a line laser sensor is fixedly mounted on the end effector of the mechanical arm, and the calibration block is fixed so that its checkerboard lies within the measurement range of the line laser sensor;
(2) acquiring a pose-point cloud data set: the pose of the end effector is adjusted by the mechanical arm so that, at each pose, the laser scanning surface emitted by the line laser sensor intersects the checkerboard on the calibration block and forms a light bar straight line on the checkerboard surface; the light bar straight line passes through at least two checkerboard line intersection points that are symmetric about the centre of the circular truncated cone hole, so that the centre of the hole lies on the light bar straight line. The sensor acquires the coordinates, in the sensor coordinate system, of the 3D point cloud reflected from the laser scanning surface, the mechanical arm teach pendant reads and records the pose information of the end effector at each pose, and 3D point cloud data are acquired for at least 6 different poses of the end effector;
(3) calculating the coordinates of the centre of the circular truncated cone hole in the sensor coordinate system: the 3D point cloud data acquired at each pose are fitted with a straight line to construct a fitted straight-line equation; two symmetric points that are symmetric about the centre of the circular truncated cone hole are obtained, their first coordinate values on one coordinate axis are substituted into the fitted straight-line equation to obtain their second coordinate values on the other coordinate axis, and half of the sum of the corresponding coordinate values of the two symmetric points (i.e., their midpoint) gives the coordinates of the centre of the circular truncated cone hole in the sensor coordinate system.
(4) Carrying out hand-eye calibration on the robot: since the calibration block and the base of the mechanical arm are both fixed, the translation and rotation matrix of the sensor coordinate system relative to the end effector coordinate system can be solved from the following matrix equation (in homogeneous coordinates):
$${}^{Base}T_{End_i}\cdot {}^{End}T_{Sensor}\cdot P_i^{Sensor} = P_{Base}, \qquad i = 1, 2, \ldots, n$$
where n represents the total number of scans in which data are acquired, $P_{Base}$ represents the coordinates of the centre of the circular truncated cone hole in the base coordinate system, $P_i^{Sensor}$ represents the fitted coordinates of the centre of the circular truncated cone hole in the sensor coordinate system at the i-th scan, ${}^{Base}T_{End_i}$ represents the translation and rotation matrix of the end effector coordinate system relative to the base coordinate system at the i-th scan, and ${}^{End}T_{Sensor}$ represents the translation and rotation matrix of the sensor coordinate system relative to the end effector coordinate system, i.e., the hand-eye relationship that needs to be solved.
In some embodiments, the two symmetric points are two points with abrupt changes in coordinates in the 3D point cloud data corresponding to each light bar straight line.
In some embodiments, the 3D point cloud data comprise two parts, surface 3D point cloud data reflected from the checkerboard surface and bottom 3D point cloud data reflected from the bottom surface inside the circular truncated cone hole, and the bottom 3D point cloud data are removed by exploiting the abrupt jump between the coordinates of the bottom points and those of the surface points.
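As a minimal illustration of this rejection step, the sketch below (Python with NumPy; the function name, array layout and jump threshold are assumptions, not taken from the patent) splits one light-bar profile into checkerboard-surface points and hole-bottom points by looking for the two abrupt jumps in the z coordinate:

```python
import numpy as np

def split_light_bar_points(points, jump_threshold=1.0):
    """Split one light-bar profile into checkerboard-surface points and
    circular-truncated-cone bottom points.

    points: (N, 3) array of sensor-frame coordinates ordered along the
    light bar (x is 0 for a line laser). The hole edges are the two places
    where the z coordinate jumps by more than jump_threshold (an assumed
    value in mm).
    """
    z = points[:, 2]
    jumps = np.where(np.abs(np.diff(z)) > jump_threshold)[0]
    if len(jumps) < 2:
        return points, points[:0]            # this profile does not cross the hole
    a, b = jumps[0], jumps[-1]               # edge points: A = points[a], B = points[b + 1]
    surface = np.vstack([points[:a + 1], points[b + 1:]])
    bottom = points[a + 1:b + 1]
    return surface, bottom
```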
In some embodiments, the line laser sensor is driven so that the intersection points of two longitudinal and transverse checkerboard lines that are symmetric about the centre of the circular truncated cone hole both lie on the light bar straight line.
In some embodiments, the two symmetric points are the intersection points of the light bar straight line with two checkerboard lines symmetric about the circular truncated cone hole.
In some embodiments, when the line laser sensor is driven, part of the scanning light enters the circular truncated cone hole. The deformation of the deformation frame is controlled by the linear motor so that the scanning light is detected by the photoelectric detectors on two hinge shafts; the incident angle and direction of the scanning light can then be calculated from the extension of the linear motor and the positions of the hinge shafts whose detectors register the light, so that the scanning position of the light on the checkerboard is known.
Compared with the prior art, the calibration block for the line laser sensor and the hand-eye calibration method have the advantages that:
the calibration block only needs to mark checkerboard lines and circular truncated cone holes on a plane, so that the machining is simpler, and the machining precision is easy to improve; the robot has wider pose transformation range during calibration, the fitting and extracting process of the characteristic points is more convenient, easy to operate, quick and accurate, and the coordinates of the calibration block in a world coordinate system do not need to be measured.
Drawings
In the drawings, which are not necessarily drawn to scale, like reference numerals may describe similar components in different views. Like reference numerals having different letter suffixes may represent different examples of similar components. The drawings illustrate generally, by way of example, but not by way of limitation, various embodiments discussed herein.
FIG. 1 is a schematic view of a calibration apparatus according to a first embodiment;
FIG. 2 is a perspective view of a calibration block;
FIG. 3 is a checkerboard front view of a calibration block;
FIG. 4 is a cross-sectional view of a calibration block;
FIG. 5 is a schematic diagram showing the cross state of two laser scanning surfaces and a checkerboard surface of a calibration block;
FIG. 6 is a schematic distribution diagram of the collected 3D point cloud data;
FIG. 7 is a schematic flow chart of hand-eye calibration;
FIG. 8 is a schematic view of a third embodiment;
FIG. 9 is a schematic view of a scanning and positioning apparatus according to a third embodiment;
FIG. 10 is a schematic view of a scanning positioning apparatus according to a third embodiment after deformation;
in the figure: 101. mechanical arm; 201. end effector; 301. line laser sensor; 302. laser scanning surface; 401. calibration block; 402. checkerboard; 403. feature point P; 404. circular truncated cone hole; 405. pair of intersection points symmetric about the feature point P (example one); 406. pair of intersection points symmetric about the feature point P (example two); 501. operation platform; 6. middle hinge shaft; 7. end hinge shaft; 8. photoelectric probe; 9. radial guide rail; 10. linear motor; 11. swivel; 12. arc-shaped rod.
Detailed Description
The following are specific examples of the present invention, and the technical solutions of the present invention are further described with reference to the drawings, but the present invention is not limited to these examples, and the following embodiments do not limit the invention according to the claims. Moreover, all combinations of features described in the embodiments are not necessarily essential to the solution of the invention.
It will be understood by those of ordinary skill in the art that all directional references (e.g., above, below, upward, downward, top, bottom, left, right, vertical, horizontal, etc.) are used illustratively in the figures to aid the reader's understanding and do not imply a limitation (e.g., of position, orientation, or use) on the scope of the invention, which is defined by the claims appended hereto. Additionally, the term "substantially" may refer to slight imprecision or deviation in conditions, amounts, values, or dimensions, some of which may be within manufacturing or tolerance limits.
Example one
As shown in fig. 1, 2, 3 and 4, a position fixed relative to the world coordinate system is chosen on an operation platform 501 beside the mechanical arm and a calibration block 401 is fixed there. The calibration block 401 is a cuboid of 220.0 mm × 220.0 mm × 30.0 mm, with a dimensional machining precision of 0.01 mm and a surface roughness of Ra 1.6, and is made of ceramic. The surface of the calibration block is provided with a checkerboard formed by intersecting longitudinal and transverse equidistant checkerboard lines; the checkerboard 402 is located at the centre of the 220 mm × 220 mm block surface and measures 200 mm × 200 mm, being composed of 21 equally spaced horizontal and 21 equally spaced vertical line segments with a spacing of 10.0 mm between adjacent parallel segments and a scribing precision of 0.01 mm. A circular truncated cone hole (i.e., a frustum-shaped hole) is formed in the middle of the calibration block; the central axis of the circular truncated cone hole 404 is coaxial with the central axis of the calibration block 401 and located at the centre of the calibration block 401. The height of the circular truncated cone hole 404 is 30 mm, the radius of its upper surface circle is 5 mm, the radius of its lower surface circle is 35 mm, and the included angle between the generatrix of the circular truncated cone and the bottom surface is 45°; the axis of the hole is perpendicular to the surface on which the checkerboard lies, the feature point P403 is located at the central intersection point of the checkerboard, and the axis of the hole passes through that intersection point, so the feature point P403 is also the centre of the circular opening of the hole. The diameter of the circular opening edge where the hole wall intersects the checkerboard surface is smaller than the side length of the smallest square on the checkerboard, i.e., the spacing between two adjacent parallel lines.
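These dimensions are mutually consistent: with a hole height of 30 mm and upper and lower radii of 5 mm and 35 mm, the generatrix indeed makes the stated 45° angle with the bottom surface, since

$$\tan\theta = \frac{h}{r_{lower}-r_{upper}} = \frac{30\ \text{mm}}{35\ \text{mm}-5\ \text{mm}} = 1 \quad\Rightarrow\quad \theta = 45^{\circ}.$$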
Basic calibration principle: the feature point P403 is located at the centre of the surface of the calibration block 401. The line laser sensor 301 is fixedly mounted on the end effector 201 of the mechanical arm; by adjusting the end effector 201 through the mechanical arm 101, the line laser sensor 301 is moved with it and scans the feature point P403 from different poses, acquiring position information about it. Because the relative position of the line laser sensor 301 and the end effector 201 does not change, the relative position of the sensor coordinate system and the end effector coordinate system does not change either; determining the relation between these two coordinate systems is the hand-eye calibration of the robot.
As shown in fig. 5, 6, and 7, the specific calibration method is as follows:
(1) sensor setting: the line laser sensor 301 is fixedly arranged on the end effector 201 of the mechanical arm 101, the calibration block 401 is fixedly arranged on the operation platform 501, and the calibration block 401 and the checkerboard 402 are arranged in the measurement range of the line laser sensor 301;
(2) acquiring a pose-point cloud data set: the pose of the end effector 201 is adjusted through the mechanical arm 101 so that, at each pose, the laser scanning surface emitted by the line laser sensor 301 intersects the checkerboard 402 on the calibration block 401 and forms a light bar straight line on the checkerboard surface; the light bar straight line passes through at least two checkerboard line intersection points symmetric about the centre of the circular truncated cone hole (i.e., the feature point), so that the centre of the hole lies on the light bar straight line. The sensor acquires the coordinates, in the sensor coordinate system, of the 3D point cloud reflected from the laser scanning surface, the teach pendant of the mechanical arm reads and records the pose information of the end effector 201 at each pose, 3D point cloud data are acquired for at least 6 different poses, and the pose data and the 3D data of the calibration block 401 acquired in this process are packaged into a "pose-point cloud data set";
(3) calculating the coordinates of the centre of the circular truncated cone hole in the sensor coordinate system:
Straight-line fitting is performed on the 3D point cloud data acquired at each pose to construct a fitted straight-line equation, and two symmetric points that are symmetric about the centre of the circular truncated cone hole are obtained. Their first coordinate values on one coordinate axis are substituted into the fitted straight-line equation to obtain their second coordinate values on the other coordinate axis, and half of the sum of the corresponding coordinate values of the two symmetric points (i.e., their midpoint) gives the coordinates of the centre of the circular truncated cone hole in the sensor coordinate system. The two symmetric points can be taken as the two points where the coordinates in the 3D point cloud data of each light bar straight line change abruptly, i.e., the two points corresponding to the edge of the circular truncated cone hole in the sensor coordinate system: because the 3D point cloud data comprise surface 3D point cloud data reflected from the surface of the checkerboard 402 and bottom 3D point cloud data reflected from the bottom surface inside the circular truncated cone hole 404, and there is an abrupt jump between the coordinates of the two parts, the points where the coordinates jump are the two points where the light bar straight line crosses the edge of the hole, and their midpoint is the centre of the circular opening, namely the feature point P403. More specifically:
the 3D point cloud data measured by the line laser sensor 301 for the calibration block 401 are all in the yz plane in the sensor coordinate system, and include N data points (0, y)i,zi) The laser scanning lines are distributed on the straight line of the crossed light bars of the laser scanning surface 302 emitted by the laser sensor 301 and the calibration block 401; the 3D point cloud data comprises two parts, namely 3D point cloud data on the checkerboard 402 and 3D point cloud data on the bottom surface of the circular truncated cone hole 404, and A, B points are two points which are closest to a characteristic point P403 in the 3D point cloud data on the checkerboard 402; processing the 3D point cloud data on the light strip straight lines, removing the 3D point cloud data where the light strip straight lines on the bottom surface of the circular table hole 404 are located, and obtaining the 3D point cloud data where the light strip straight lines on the checkerboard 402 are located; fitting the 3D point cloud data where the straight lines of the light stripes on the chessboard 402 are positioned in a yz plane in a sensor coordinate system by a least square method, and constructing a function Z (y) to express a fitted straight line equation
Figure DEST_PATH_IMAGE007
Acquiring two 3D point cloud data closest to the feature point P403 as A (0, y)1,z1) And B (0, y)2,z2) Taking the y-axis value and substituting the y-axis value into a fitting linear equation
Figure 98355DEST_PATH_IMAGE007
The fitted Z-axis coordinate values of the A, B two points are respectively:
Figure DEST_PATH_IMAGE008
;
Figure DEST_PATH_IMAGE009
since the feature point P403 is the midpoint of the line segment AB, the coordinates of the feature point P403 after fitting in the sensor coordinate system can be expressed as:
Figure DEST_PATH_IMAGE010
. Thus, the coordinates P (0, y) of the feature point P403 in the sensor coordinate system can be obtained0,z0);
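A minimal sketch of this fitting-and-midpoint computation is given below (Python with NumPy; the function name and array layout are editorial assumptions, not from the patent):

```python
import numpy as np

def fit_feature_point(surface_points, idx_A, idx_B):
    """Fit the light-bar line in the sensor yz plane and take the midpoint
    of the two hole-edge points A and B as the feature point P.

    surface_points: (N, 3) array of checkerboard-surface points (x = 0 for
    a line laser); idx_A and idx_B index the two points nearest the hole
    edge, symmetric about the hole centre.
    """
    y, z = surface_points[:, 1], surface_points[:, 2]
    a1, a0 = np.polyfit(y, z, 1)                  # fitted line Z(y) = a1*y + a0
    y1, y2 = surface_points[idx_A, 1], surface_points[idx_B, 1]
    y0 = 0.5 * (y1 + y2)                          # midpoint of A and B in y
    z0 = 0.5 * ((a1 * y1 + a0) + (a1 * y2 + a0))  # average of the fitted z values
    return np.array([0.0, y0, z0])                # P(0, y0, z0) in the sensor frame
```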
(4) Carrying out hand-eye calibration on the robot: since both the calibration block 401 and the base of the robot arm 101 are fixed, the following matrix equation holds:
$${}^{Base}T_{End_i}\cdot {}^{End}T_{Sensor}\cdot P_i^{Sensor} = P_{Base}, \qquad i = 1, 2, \ldots, n \qquad (5)$$
where n represents the total number of scans in which data are acquired, $P_{Base}$ represents the coordinates of the centre of the circular truncated cone hole in the base coordinate system, $P_i^{Sensor}$ represents the fitted coordinates of the centre of the circular truncated cone hole in the sensor coordinate system at the i-th scan, ${}^{Base}T_{End_i}$ represents the translation and rotation matrix of the end effector coordinate system relative to the base coordinate system at the i-th scan, and ${}^{End}T_{Sensor}$ represents the translation and rotation matrix of the sensor coordinate system relative to the end effector coordinate system, i.e., the hand-eye relationship to be solved. Solving for ${}^{End}T_{Sensor}$ completes the hand-eye calibration. The specific calculation process is as follows:
Since $P_{Base}$ is the same for every scan, the following equation can be derived from equation (5):
$${}^{Base}T_{End_i}\cdot {}^{End}T_{Sensor}\cdot P_i^{Sensor} = {}^{Base}T_{End_{i+1}}\cdot {}^{End}T_{Sensor}\cdot P_{i+1}^{Sensor}, \qquad i = 1, 2, \ldots, n-1 \qquad (6)$$
Let
$${}^{Base}T_{End_i} = \begin{bmatrix} R_i & t_i \\ 0 & 1 \end{bmatrix}, \qquad {}^{End}T_{Sensor} = \begin{bmatrix} R_x & T_x \\ 0 & 1 \end{bmatrix}.$$
Substituting into formula (6) and expanding yields:
$$R_i R_x P_i^{Sensor} + R_i T_x + t_i = R_{i+1} R_x P_{i+1}^{Sensor} + R_{i+1} T_x + t_{i+1} \qquad (7)$$
Then the following relations are established:
$$A = \begin{bmatrix} (P_1^{Sensor})^{T}\otimes R_1 - (P_2^{Sensor})^{T}\otimes R_2 & R_1 - R_2 \\ \vdots & \vdots \\ (P_{n-1}^{Sensor})^{T}\otimes R_{n-1} - (P_n^{Sensor})^{T}\otimes R_n & R_{n-1} - R_n \end{bmatrix} \qquad (8)$$
$$x = \begin{bmatrix} \operatorname{vec}(R_x) \\ T_x \end{bmatrix} \qquad (9)$$
$$b = \begin{bmatrix} t_2 - t_1 \\ \vdots \\ t_n - t_{n-1} \end{bmatrix} \qquad (10)$$
In the above formulas, $P_i^{Sensor}$ represents the fitted coordinates of the feature point P403 in the sensor coordinate system at the i-th scan. Equations (7)-(10) can be rewritten as the matrix equation
$$Ax = b \qquad (11)$$
The solution for x is obtained by the least squares equation:
$$x = (A^{T}A)^{-1}A^{T}b \qquad (12)$$
$R_x$ and $T_x$ are obtained by solving for x, the hand-eye relation matrix is assembled from these parameter values, and the hand-eye calibration is completed.
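The sketch below assembles the linear system (8)-(10) as reconstructed above and solves (11)-(12) numerically (Python with NumPy; the function name, the pairing of consecutive scans, and the final orthogonality projection of $R_x$ are editorial choices, not stated in the patent):

```python
import numpy as np

def solve_hand_eye(poses, points):
    """Least-squares hand-eye solution from n scans of the fixed hole centre.

    poses:  list of 4x4 matrices [R_i t_i; 0 1], end effector relative to
            the base, read from the teach pendant at each scan.
    points: list of 3-vectors P_i, the fitted hole-centre coordinates in
            the sensor frame for the same scans.
    Returns (R_x, T_x): sensor frame relative to the end-effector frame.
    """
    rows, rhs = [], []
    for i in range(len(poses) - 1):
        Ri, ti = poses[i][:3, :3], poses[i][:3, 3]
        Rj, tj = poses[i + 1][:3, :3], poses[i + 1][:3, 3]
        Pi, Pj = np.asarray(points[i]), np.asarray(points[i + 1])
        # R_i R_x P_i + R_i T_x + t_i = R_j R_x P_j + R_j T_x + t_j
        # vectorise R_x using  R R_x P = (P^T kron R) vec(R_x)
        rows.append(np.hstack([np.kron(Pi, Ri) - np.kron(Pj, Rj), Ri - Rj]))
        rhs.append(tj - ti)
    A = np.vstack(rows)                              # shape (3(n-1), 12)
    b = np.hstack(rhs)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)        # x = [vec(R_x); T_x]
    R_x = x[:9].reshape(3, 3, order="F")             # column-stacked vec()
    T_x = x[9:]
    # the raw least-squares R_x is not exactly orthogonal; project onto SO(3)
    U, _, Vt = np.linalg.svd(R_x)
    R_x = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt
    return R_x, T_x
```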
Example two
When the line laser sensor 301 is driven, the intersection points of the laser scanning surface with the longitudinal and transverse checkerboard lines on the checkerboard 402 that are symmetric about the centre of the circular truncated cone hole all lie on the light bar straight line, so that the centre of the hole lies on the laser scanning surface and the intersection points are easy to identify.
When calculating the coordinates of the centre of the circular truncated cone hole in the sensor coordinate system, the intersection points of two longitudinal and transverse checkerboard lines symmetric about the centre of the hole lie on the light bar straight line. Because the checkerboard lines are intaglio (engraved) lines, the coordinates of the points where they intersect the light bar straight line change noticeably relative to the other points of the 3D point cloud in the sensor coordinate system, so the two symmetric points are easy to find; that is, the two symmetric points are the intersection points of the light bar straight line with the two checkerboard lines symmetric about the circular truncated cone hole.
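As an illustration of how the engraved lines can be located in the profile, the sketch below (Python with NumPy; the function name and dip threshold are assumptions, not from the patent) flags the points where the z coordinate dips slightly below its neighbours, from which the pair symmetric about the hole centre can then be chosen:

```python
import numpy as np

def grid_line_hits(points, dip_threshold=0.05):
    """Indices where the light bar crosses engraved (intaglio) checkerboard lines.

    points: (N, 3) array of surface points ordered along the light bar;
    at an engraved line the z coordinate dips below its neighbours by more
    than dip_threshold (an assumed value in mm).
    """
    z = points[:, 2]
    neighbours = 0.5 * (z[:-2] + z[2:])              # mean of the two neighbours
    dips = np.where(neighbours - z[1:-1] > dip_threshold)[0] + 1
    return dips
```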
When the line laser sensor 301 is driven, the mechanical arm 101 is controlled to adjust the pose of the end effector 201 so that the feature point P403 lies close to the centre of the light bar straight line, which helps the coordinate data collected by the line laser sensor 301 to be processed more accurately.
Example three
As shown in fig. 8, 9 and 10, and differently from the above embodiments, a scanning positioning device is arranged below the circular truncated cone hole. The scanning positioning device comprises photoelectric detectors capable of receiving the scanning light; the photoelectric probes 8 of the detectors are arranged on a deformation frame. The deformation frame comprises 12 arc-shaped deformation units, each consisting of two arc-shaped rods 12 whose middle portions are rotatably connected by a middle hinge shaft 6; the arc-shaped rods of the deformation units are connected end to end by end hinge shafts 7 into a closed, ring-shaped deformation frame. The ring-shaped deformation frame is perpendicular to the central axis of the circular truncated cone hole, its centre lies on that axis, and the two end hinge shafts of a given deformation unit lying in the same radial direction slide on a radial guide rail 9. A linear motor 10 drives the end hinge shafts or the middle hinge shafts along the radial guide rails, thereby controlling the contraction of the deformation frame and changing the distance between the photoelectric detectors and the central axis of the circular truncated cone hole; the photoelectric detectors are arranged on the 12 end hinge shafts on the inner side of the deformation frame. Although several linear motors are shown in the figure, a single linear motor is sufficient, and other telescopic control devices can also be used.
During detection, part of the scanning light enters the circular truncated cone hole. The deformation of the deformation frame is controlled by the linear motor so that the scanning light is detected by the photoelectric detectors on two hinge shafts; the incident angle and direction of the scanning light can then be calculated from the extension of the linear motor and the positions of the hinge shafts whose detectors register the light, so that the scanning position of the light on the checkerboard is known.
The radial guide rails can be fixed on a swivel 11 that is coaxial with the axis of the circular truncated cone hole; a stepping motor rotates the swivel about this axis so that scanning light incident from any direction can be detected, and in general two photoelectric detectors in that direction detect the light simultaneously.
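As an assumption-heavy sketch of how two detector hits constrain the light, the function below (Python with NumPy; all names and the circular-contraction assumption are editorial) recovers only the trace of the scanning light in the plane of the deformation frame from the contraction radius and the two hinge-shaft angles at which light is detected:

```python
import numpy as np

def light_trace_in_detector_plane(r, theta_1, theta_2):
    """Trace of the scanning light in the plane of the deformation frame.

    r:                radius to which the frame has contracted, set by the
                      linear-motor extension.
    theta_1, theta_2: hinge-shaft angles (radians about the hole axis) at
                      which the photoelectric detectors register the light.
    Only the chord through the two hit points is recovered here.
    """
    p1 = r * np.array([np.cos(theta_1), np.sin(theta_1)])
    p2 = r * np.array([np.cos(theta_2), np.sin(theta_2)])
    direction = (p2 - p1) / np.linalg.norm(p2 - p1)  # direction of the trace
    offset = 0.5 * (p1 + p2)                         # its offset from the hole axis
    return direction, offset
```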
Although some terms are used more frequently herein than others, the possibility of using other terms is not excluded. These terms are used merely to describe and explain the nature of the invention more conveniently; they are not to be construed as imposing any additional limitation contrary to the spirit of the present invention. The order of execution of the operations, steps and the like in the apparatuses and methods shown in the specification and drawings may be changed as long as the output of a preceding process is not used in a subsequent process, unless otherwise specified. Descriptions using "first", "next", etc. for convenience do not imply that the steps must be performed in that order.
The specific embodiments described herein are merely illustrative of the spirit of the invention. Various modifications or additions may be made to the described embodiments or alternatives may be employed by those skilled in the art without departing from the spirit or ambit of the invention as defined in the appended claims.

Claims (10)

1. A calibration block for a line laser sensor, characterized in that the surface of the calibration block is provided with a checkerboard formed by intersecting longitudinal and transverse equidistant checkerboard lines, a circular truncated cone hole is formed in the middle of the calibration block, the axis of the circular truncated cone hole is perpendicular to the surface of the checkerboard, and the axis of the circular truncated cone hole passes through an intersection point of two intersecting lines on the checkerboard.
2. The calibration block for the line laser sensor as claimed in claim 1, wherein the diameter of the circular opening edge where the wall surface of the circular truncated cone hole intersects with the surface of the checkerboard is smaller than the side length of the smallest square on the checkerboard.
3. The calibration block for a line laser sensor according to claim 1, wherein the included angle between the generatrix of the circular truncated cone and the bottom surface is 45 °.
4. Calibration block for a line laser sensor according to claim 1, characterized in that the checkerboard is an intaglio line.
5. The calibration block for the line laser sensor according to claim 1, wherein a scanning positioning device is arranged below the circular truncated cone hole, the scanning positioning device comprises a photoelectric detector capable of receiving scanning light, a photoelectric probe of the photoelectric detector is arranged on a deformation frame, the deformation frame comprises 12 arc-shaped deformation units, each deformation unit comprises two arc-shaped rods, the middle portions of the two arc-shaped rods are rotatably connected through a middle hinge shaft, the arc-shaped rods of the deformation units are connected end to end through end hinge shafts into a closed circular deformation frame, the circular deformation frame is perpendicular to the central axis of the circular truncated cone hole, the centre of the deformation frame is located on the central axis of the circular truncated cone hole, and the two end hinge shafts in the same radial direction of the same deformation unit are slidably fitted on the radial guide rail; the end hinge shafts or the middle hinge shafts are controlled by the linear motor to move along the radial guide rail, so that the contraction of the deformation frame is controlled and the distance between the photoelectric detector and the central axis of the circular truncated cone hole is changed, and the photoelectric detectors are arranged on the 12 end hinge shafts on the inner side of the deformation frame.
6. A hand-eye calibration method for a line laser sensor is characterized by comprising the following steps:
(1) sensor setting: fixedly mounting a line laser sensor on an end effector of the mechanical arm, wherein the calibration block and the checkerboard are mounted in the measurement range of the line laser sensor;
(2) acquiring a pose-point cloud data set: adjusting the pose of the end effector by a mechanical arm, enabling a laser scanning surface emitted by a line laser sensor under the corresponding pose to intersect with a checkerboard on a calibration block, forming a light strip straight line on the checkerboard surface, enabling the light strip straight line to pass through at least two intersection points of checkerboard lines which are symmetrical about the center of a circular table hole, enabling the center of the circular table hole to be located on the light strip straight line, acquiring coordinate data of 3D point clouds reflected by the laser scanning surface under a sensor coordinate system by a sensor, reading and recording pose information data of the end effector under different poses by a mechanical arm demonstrator, and acquiring 3D point cloud data corresponding to at least 6 different poses of the end effector;
(3) calculating the coordinates of the centre of the circular truncated cone hole in the sensor coordinate system:
performing linear fitting on the corresponding 3D point cloud data under different poses, constructing a fitting linear equation, acquiring two symmetrical points which are symmetrical about the center of the circular truncated cone hole, substituting first coordinate values of the two symmetrical points corresponding to the same coordinate axis into the fitting linear equation, and obtaining second coordinate values of the two symmetrical points corresponding to the other coordinate axis on the fitting linear equation, wherein the half of the sum of the corresponding coordinate values of the two symmetrical points is the coordinate of the center of the circular truncated cone hole under a sensor coordinate system;
(4) carrying out hand-eye calibration on the robot: since the calibration block and the base of the robotic arm are both fixed, the translation and rotation matrix of the sensor coordinate system relative to the end effector coordinate system can be solved from the following matrix equations
$${}^{Base}T_{End_i}\cdot {}^{End}T_{Sensor}\cdot P_i^{Sensor} = P_{Base}, \qquad i = 1, 2, \ldots, n$$
wherein n represents the total number of scans in which data are collected, $P_{Base}$ represents the coordinates of the centre of the circular truncated cone hole in the base coordinate system, $P_i^{Sensor}$ represents the fitted coordinates of the centre of the circular truncated cone hole in the sensor coordinate system at the i-th scan, ${}^{Base}T_{End_i}$ represents the translation and rotation matrix of the end effector coordinate system relative to the base coordinate system at the i-th scan, and ${}^{End}T_{Sensor}$ represents the translation and rotation matrix of the sensor coordinate system relative to the end effector coordinate system, i.e., the hand-eye relationship that needs to be solved.
7. The hand-eye calibration method for the line laser sensor as recited in claim 6, wherein the two symmetrical points are two points with abrupt change of coordinates in the 3D point cloud data corresponding to each light bar straight line.
8. The hand-eye calibration method for line laser sensor as claimed in claim 6, wherein the intersection points of two longitudinal and transverse checkerboard lines symmetric about the center of the circular truncated cone hole on the checkerboard are all on the straight line of the light bar when the line laser sensor is driven.
9. The hand-eye calibration method for line laser sensor as claimed in claim 6, wherein said two symmetric points are the intersection points of two checkerboard lines symmetric about the circular truncated cone hole and the straight line of the light bar.
10. The hand-eye calibration method for the line laser sensor as recited in claim 6, wherein when the line laser sensor is driven, a portion of the scanning light is projected into the circular truncated cone hole, the deformation of the deformable frame is controlled by the linear motor, so that the scanning light is detected by the photodetectors on two hinge axes, and the incident angle and direction of the scanning light can be calculated according to the amount of expansion and contraction of the linear motor and the position of the hinge axis where the photodetector detecting the scanning light is located, thereby knowing the scanning position of the scanning light on the checkerboard.
CN202010881809.5A 2020-08-28 2020-08-28 Calibration block for line laser sensor and hand-eye calibration method Active CN111735390B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010881809.5A CN111735390B (en) 2020-08-28 2020-08-28 Calibration block for line laser sensor and hand-eye calibration method
JP2021000997A JP7075145B2 (en) 2020-08-28 2021-01-06 Calibration block and hand eye calibration method for line laser sensors

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010881809.5A CN111735390B (en) 2020-08-28 2020-08-28 Calibration block for line laser sensor and hand-eye calibration method

Publications (2)

Publication Number Publication Date
CN111735390A true CN111735390A (en) 2020-10-02
CN111735390B CN111735390B (en) 2020-12-11

Family

ID=72658914

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010881809.5A Active CN111735390B (en) 2020-08-28 2020-08-28 Calibration block for line laser sensor and hand-eye calibration method

Country Status (2)

Country Link
JP (1) JP7075145B2 (en)
CN (1) CN111735390B (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113334383A (en) * 2021-06-22 2021-09-03 华中科技大学 Robot tail end tool offset calibration method based on line laser measuring instrument
CN115014257A (en) * 2022-06-30 2022-09-06 南京航空航天大学 Two-dimensional line laser sensor calibration device and method
CN115371564A (en) * 2022-10-24 2022-11-22 南京航空航天大学 Method and system for calibrating relative pose of linear laser sensor and robot flange plate
CN116100564A (en) * 2023-04-12 2023-05-12 深圳广成创新技术有限公司 High-precision calibration method and device for calibrating manipulator
CN116117810A (en) * 2023-01-10 2023-05-16 群滨智造科技(苏州)有限公司 Method and device for secondary calibration after data conversion among different robots
CN116222385A (en) * 2023-05-08 2023-06-06 东莞市兆丰精密仪器有限公司 Laser center position calibration method and measurement system
CN117036510A (en) * 2023-09-18 2023-11-10 合肥埃科光电科技股份有限公司 Calibration method, system and equipment for line spectrum confocal sensor

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114689106B (en) * 2022-03-31 2024-03-08 上海擎朗智能科技有限公司 Sensor calibration method, robot and computer readable storage medium
CN114782513B (en) * 2022-04-22 2023-11-10 成都飞机工业(集团)有限责任公司 Point laser sensor mounting pose calibration method based on plane
CN115046436B (en) * 2022-06-15 2024-03-12 重庆大学 Fragment distribution planning and paving control method suitable for complex-configuration fragment array design
CN115241110B (en) * 2022-08-15 2023-12-08 魅杰光电科技(上海)有限公司 Wafer motion control method and wafer motion control system
CN115319754A (en) * 2022-09-16 2022-11-11 唐山松下产业机器有限公司 Robot and laser sensor hand-eye calibration method and device
CN115249267B (en) * 2022-09-22 2022-12-30 海克斯康制造智能技术(青岛)有限公司 Automatic detection method and device based on turntable and robot position and posture calculation
CN115615355B (en) * 2022-10-13 2023-05-16 中国科学院声学研究所 Contour measurement compensation method in line laser liquid medium based on data
CN116000927B (en) * 2022-12-29 2024-06-18 中国工程物理研究院机械制造工艺研究所 Measuring device and method for spatial position guiding precision of robot vision system
CN116079729B (en) * 2023-01-13 2024-08-13 香港中文大学深港创新研究院(福田) Autonomous hand-eye calibration algorithm for mechanical arm
CN116117818A (en) * 2023-02-28 2023-05-16 哈尔滨工业大学 Robot sensor hand-eye calibration method and system
CN116359891B (en) * 2023-06-01 2023-09-12 季华实验室 Multi-sensor rapid calibration method and system
CN117140535B (en) * 2023-10-27 2024-02-02 南湖实验室 Robot kinematics parameter calibration method and system based on single measurement
CN117576227B (en) * 2024-01-16 2024-04-19 中铁科工集团有限公司 Hand-eye calibration method, device and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1999010136A1 (en) * 1997-08-28 1999-03-04 Proteus Corporation Laser calibration of robotics systems
JP2011220794A (en) * 2010-04-08 2011-11-04 Seiko Epson Corp Calibration jig and imaging apparatus calibration method using the same
CN107253190A (en) * 2017-01-23 2017-10-17 梅卡曼德(北京)机器人科技有限公司 The device and its application method of a kind of high precision machines people trick automatic camera calibration
CN110161485A (en) * 2019-06-13 2019-08-23 同济大学 A kind of outer ginseng caliberating device and scaling method of laser radar and vision camera
CN111156925A (en) * 2019-12-19 2020-05-15 南京理工大学 Three-dimensional measurement method for large component based on line structured light and industrial robot
CN111366912A (en) * 2020-03-10 2020-07-03 上海西井信息科技有限公司 Laser sensor and camera calibration method, system, device and storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4443497B2 (en) 2005-11-02 2010-03-31 パルステック工業株式会社 Multi-degree-of-freedom robot positioning error correction apparatus and positioning error correction method
JP6017213B2 (en) 2012-07-20 2016-10-26 株式会社神戸製鋼所 Calibration method for robot with optical sensor
JP2017003537A (en) 2015-06-16 2017-01-05 大同特殊鋼株式会社 Three dimensional shape measurement method

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1999010136A1 (en) * 1997-08-28 1999-03-04 Proteus Corporation Laser calibration of robotics systems
JP2011220794A (en) * 2010-04-08 2011-11-04 Seiko Epson Corp Calibration jig and imaging apparatus calibration method using the same
CN107253190A (en) * 2017-01-23 2017-10-17 梅卡曼德(北京)机器人科技有限公司 The device and its application method of a kind of high precision machines people trick automatic camera calibration
CN110161485A (en) * 2019-06-13 2019-08-23 同济大学 A kind of outer ginseng caliberating device and scaling method of laser radar and vision camera
CN111156925A (en) * 2019-12-19 2020-05-15 南京理工大学 Three-dimensional measurement method for large component based on line structured light and industrial robot
CN111366912A (en) * 2020-03-10 2020-07-03 上海西井信息科技有限公司 Laser sensor and camera calibration method, system, device and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
陆艺 等 (Lu Yi et al.): "线结构光视觉传感器机器人手眼关系标定" [Calibration of the robot hand-eye relationship for a line structured light vision sensor], 《中国测试》 (China Measurement & Test) *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113334383A (en) * 2021-06-22 2021-09-03 华中科技大学 Robot tail end tool offset calibration method based on line laser measuring instrument
CN115014257A (en) * 2022-06-30 2022-09-06 南京航空航天大学 Two-dimensional line laser sensor calibration device and method
CN115371564A (en) * 2022-10-24 2022-11-22 南京航空航天大学 Method and system for calibrating relative pose of linear laser sensor and robot flange plate
CN115371564B (en) * 2022-10-24 2023-03-07 南京航空航天大学 Method and system for calibrating relative pose of linear laser sensor and robot flange plate
CN116117810A (en) * 2023-01-10 2023-05-16 群滨智造科技(苏州)有限公司 Method and device for secondary calibration after data conversion among different robots
CN116117810B (en) * 2023-01-10 2023-12-29 群滨智造科技(苏州)有限公司 Method and device for secondary calibration after data conversion among different robots
CN116100564A (en) * 2023-04-12 2023-05-12 深圳广成创新技术有限公司 High-precision calibration method and device for calibrating manipulator
CN116222385A (en) * 2023-05-08 2023-06-06 东莞市兆丰精密仪器有限公司 Laser center position calibration method and measurement system
CN117036510A (en) * 2023-09-18 2023-11-10 合肥埃科光电科技股份有限公司 Calibration method, system and equipment for line spectrum confocal sensor
CN117036510B (en) * 2023-09-18 2024-05-07 合肥埃科光电科技股份有限公司 Calibration method, system and equipment for line spectrum confocal sensor

Also Published As

Publication number Publication date
JP7075145B2 (en) 2022-05-25
JP2022039903A (en) 2022-03-10
CN111735390B (en) 2020-12-11

Similar Documents

Publication Publication Date Title
CN111735390B (en) Calibration block for line laser sensor and hand-eye calibration method
CN104567679B (en) A kind of system of turbo blade vision-based detection
CN109341546B (en) Light beam calibration method of point laser displacement sensor at any installation pose
CN109357631B (en) Measuring system center calibration method based on laser displacement sensor
CN111982019B (en) High-precision blade section contour detection method based on line-structured light sensor
CN107607059B (en) One-key type 3D contour measurement equipment and measurement calculation method thereof
CN107121093A (en) A kind of gear measurement device and measuring method based on active vision
CN111536902A (en) Galvanometer scanning system calibration method based on double checkerboards
CN107270833A (en) A kind of complex curved surface parts three-dimension measuring system and method
Lu et al. On-line measurement of the straightness of seamless steel pipes using machine vision technique
CN101509759B (en) Self-demarcating system and method for vision detecting system
CN113465513B (en) Laser sensor inclination angle error measurement compensation method and system based on cylindrical angle square
CN102538707B (en) Three dimensional localization device and method for workpiece
CN110645911A (en) Device and method for obtaining complete outer surface 3D contour through rotary scanning
TW201803706A (en) Robot correction system and method thereof
Zhu et al. Noncontact 3-D coordinate measurement of cross-cutting feature points on the surface of a large-scale workpiece based on the machine vision method
CN204514271U (en) A kind of system of turbo blade vision-based detection
JP7353757B2 (en) Methods for measuring artifacts
CN116572077B (en) Method for normal measurement of large-curvature free-form surface
CN116105638A (en) U-shaped turntable coaxiality detection system and detection method
CN112485805A (en) Laser triangular displacement sensor and measuring method thereof
Hui-yuan et al. Accurate extrinsic calibration method of a line structured-light sensor based on a standard ball
CN115179323A (en) Machine end pose measuring device based on telecentric vision constraint and precision improving method
CN117824502A (en) Laser three-dimensional scanning-based non-contact detection method for assembling complex assembled workpiece
JP3293830B2 (en) Apparatus and method for measuring and calculating geometric parameters of an object

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant