CN116619350A - Robot error calibration method based on binocular vision measurement

Info

Publication number
CN116619350A
CN116619350A (application CN202210132480.1A)
Authority
CN
China
Prior art keywords
robot
error
model
binocular vision
binocular
Prior art date
Legal status
Pending
Application number
CN202210132480.1A
Other languages
Chinese (zh)
Inventor
江小辉
孙翼飞
丁子珊
郭维诚
魏雨希
Current Assignee
University of Shanghai for Science and Technology
Original Assignee
University of Shanghai for Science and Technology
Priority date
Filing date
Publication date
Application filed by University of Shanghai for Science and Technology filed Critical University of Shanghai for Science and Technology
Priority to CN202210132480.1A
Publication of CN116619350A
Legal status: Pending


Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00: Programme-controlled manipulators
    • B25J 9/16: Programme controls
    • B25J 9/1628: Programme controls characterised by the control loop
    • B25J 9/1653: Programme controls characterised by the control loop; parameters identification, estimation, stiffness, accuracy, error analysis
    • B25J 9/163: Programme controls characterised by the control loop; learning, adaptive, model based, rule based expert control
    • B25J 9/1656: Programme controls characterised by programming, planning systems for manipulators
    • B25J 9/1661: Programme controls characterised by programming, planning systems for manipulators; task planning, object-oriented languages
    • B25J 9/1679: Programme controls characterised by the tasks executed
    • B25J 9/1692: Calibration of manipulator
    • B25J 9/1694: Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors; perception control, multi-sensor controlled systems, sensor fusion
    • B25J 9/1697: Vision controlled systems
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The invention discloses a robot error calibration method based on binocular vision measurement, which comprises the following steps: step 1, building and debugging a robot error calibration platform based on binocular vision measurement, deriving a robot kinematic model, and establishing a robot error compensation model; step 2, establishing a robot distance error model based on the robot kinematic model and the robot error compensation model; step 3, selecting a suitable binocular camera according to the working range of the robot and measuring the end position to obtain the binocular camera system data; step 4, processing the binocular camera system data and performing binocular vision image correction and matching to obtain binocular vision images, then obtaining the distance error term of the robot distance error model from the binocular vision images; and step 5, identifying the geometric parameter errors of the robot connecting rods based on the distance error model, and compensating the error parameters obtained by calibration into the initial geometric parameter error model to achieve the kinematic calibration effect.

Description

Robot error calibration method based on binocular vision measurement
Technical Field
The invention belongs to the technical field of robots, and particularly relates to a robot error calibration method based on binocular vision measurement.
Background
Because industrial robot joints have weak stiffness and the internal kinematic model contains small deviations, industrial robots cannot guarantee a high absolute positioning accuracy. Robot kinematic calibration is one of the key technologies for improving the working accuracy of a robot: based on the kinematic model, advanced measuring means and parameter identification methods are applied to identify the accurate kinematic geometric parameters of the robot model, thereby improving the absolute accuracy of the robot.
The positioning accuracy of a robot is divided into repeated positioning accuracy and absolute positioning accuracy. The repeated positioning accuracy of industrial robots can now reach 0.1 mm or better, but the absolute positioning accuracy only reaches the millimetre level, which is insufficient in certain high-end manufacturing fields such as hole-making in aircraft assembly lines and blade polishing. In an aircraft manufacturing environment, a general industrial robot with only millimetre-level precision cannot meet the requirements of aerospace manufacturing. Error calibration is an effective means of improving the absolute positioning accuracy of an industrial robot, so research on robot error calibration technology is of great importance.
To measure the positioning error of the robot end, the methods commonly adopted at present are as follows:
1. Using a laser tracker with a matched target ball, or a specific calibration device provided by the robot manufacturer. The laser tracker target ball is mounted on the robot base, the plane of the robot base is determined after several calibration measurements, and the target ball is then fixed on the robot end flange. The robot base coordinate system is calibrated through circumferential rotation of the first and second robot axes, and the calibrated base coordinates are then selected to obtain the absolute position of the robot end flange relative to the robot base coordinate system.
2. Using a wire-draw displacement sensor with corresponding test software. One end of the sensor is fixed to the base and the other end to the robot end; the displacement value output by the sensor's host software while the robot moves is read and compared with the coordinates generated inside the robot.
3. Calibrating the robot in a contact manner with a ballbar or double ballbar. The two ends of the ballbar consist of high-precision steel balls; one end is fixed and the other end is connected to a high-precision displacement sensor. The two steel balls are located by three-point magnetic seats, one attached to the spindle and the other to the worktable. This method is mainly used to measure the accuracy of the robot end position in the X-Y, X-Z and Y-Z planes. When the worktable performs circular interpolation motion relative to the spindle, the robot traces a simulated arc trajectory; the sensor collects the change of the distance between the two steel balls and transmits it to the computer, and through iterative calculation and analysis the roundness accuracy of the robot end motion is obtained and individual errors, such as the geometric parameter errors of each robot axis and the end movement distance errors, are separated.
The laser tracker and target ball method has high precision but is expensive, and the laser tracker and its software require professional training. Equipment such as wire-draw displacement sensors occupies a large volume, the process of arranging the measuring platform is complex, and the measuring space is limited. The ballbar method also has high precision, but its calibration range is limited, and the higher the manufacturing precision, the higher the cost of the ballbar.
Disclosure of Invention
The present invention has been made to solve the above problems, and an object of the present invention is to provide a robot error calibration method based on binocular vision measurement.
The invention provides a robot error calibration method based on binocular vision measurement, which comprises the following steps: step 1, building and debugging a robot error calibration platform based on binocular vision measurement, deriving a robot kinematic model, and establishing a robot error compensation model; step 2, establishing a robot distance error model based on the robot kinematic model and the robot error compensation model; step 3, selecting a suitable binocular camera according to the working range of the robot and measuring the end position to obtain binocular camera system data; step 4, processing the binocular camera system data and performing binocular vision image correction and matching to obtain binocular vision images, then obtaining the distance error term of the robot distance error model from the binocular vision images; and step 5, identifying the geometric parameter errors of the robot connecting rods based on the distance error model, and compensating the error parameters obtained by calibration into the initial geometric parameter error model to achieve the kinematic calibration effect.
The robot error calibration method based on binocular vision measurement provided by the invention can also have the following characteristics: in step 1, a robot kinematic model is deduced according to the overall dimension of the robot, and a differential motion principle is introduced to build a robot error compensation model.
The robot error calibration method based on binocular vision measurement provided by the invention can also have the following characteristics: in the step 2, a robot distance error model is deduced and established by adopting an M-DH method.
The robot error calibration method based on binocular vision measurement provided by the invention can also have the following characteristics: in step 3, the binocular camera system data include the binocular camera distortion, the intrinsic parameters and the world coordinate system; a camera and lens with suitable resolution, field of view and focal length are selected according to the size of the robot body and its working movement range, the high-precision optical calibration plate is clamped by the robot and moved within the working range of the cameras, the positions of the binocular cameras are adjusted at the same time so that the cameras can accurately capture the corner points of the calibration plate, and the binocular camera distortion, intrinsic parameters and world coordinate system are calibrated by an algorithm.
The robot error calibration method based on binocular vision measurement provided by the invention can also have the following characteristics: step 4 is divided into the following sub-steps: step 4-1, a motion control program of the robot is written in the robot control software and drives the robot to move, and a robot upper-computer program is written to process the binocular camera system data; step 4-2, the robot is started and the motion control program is executed so that the end effector moves along a circle, the initial point is set as the circle centre and serves as the starting point of the subsequent trajectory, the radius of the circle is determined within the field of view of the cameras, a second trajectory point is selected on the circle, and the theoretical distance of the trajectory is calculated through the robot error compensation model; step 4-3, under the control of the robot upper-computer program and trigger signals, the cameras are controlled to capture images when the robot reaches the trajectory start point and end point, and the binocular images are corrected with the camera distortion values and intrinsic parameters to obtain the binocular vision images; step 4-4, the circular flange at the robot end is taken as the feature, the feature points in the binocular vision images are extracted, and the feature points are calculated in the world coordinate system by combining the binocular calibration result to obtain the feature point position information; and step 4-5, the feature point position information is stored in MATLAB and combined with the binocular camera system data to obtain the distance error term of the robot distance error model.
The robot error calibration method based on binocular vision measurement provided by the invention can also have the following characteristics: in step 5, starting from the D-H model, correcting theoretical parameters in the D-H parameter table, and using the corrected D-H model to describe the actual robot spatial motion, so as to implement robot motion error compensation, and repeatedly correcting the model.
Function and effects of the invention
According to the robot error calibration method based on binocular vision measurement of the invention, the calibration steps are as follows: step 1, building and debugging a robot error calibration platform based on binocular vision measurement, deriving a robot kinematic model, and establishing a robot error compensation model; step 2, establishing a robot distance error model based on the robot kinematic model and the robot error compensation model; step 3, selecting a suitable binocular camera according to the working range of the robot and measuring the end position to obtain binocular camera system data; step 4, processing the binocular camera system data and performing binocular vision image correction and matching to obtain binocular vision images, then obtaining the distance error term of the robot distance error model from the binocular vision images; and step 5, identifying the geometric parameter errors of the robot connecting rods based on the distance error model, and compensating the error parameters obtained by calibration into the initial geometric parameter error model to achieve the kinematic calibration effect.
Therefore, the measuring and calibration tool of the robot error calibration method based on binocular vision measurement is a binocular vision camera, whose cost is greatly reduced compared with the commonly used laser tracker. At the same time, with binocular vision as the calibration tool, the robot control cabinet can trigger image capture through signals, so the calibration process can easily be automated, and the cameras can further be used for subsequent vision guidance, making the whole robot operation procedure compact.
In addition, compared with calibration using a laser tracker or a high-precision double ballbar, the robot error calibration method based on binocular vision measurement saves a large amount of cost and field arrangement time.
In addition, with the robot error calibration method based on binocular vision measurement, only the camera needs to be recalibrated before the robot is put into subsequent vision-guided applications, which improves the application efficiency of the robot.
In addition, the robot error calibration method based on binocular vision measurement can be applied to precision robot applications such as robot polishing, drilling and welding.
Drawings
FIG. 1 is a flow chart of a robot error calibration method based on binocular vision measurement in an embodiment of the present invention;
FIG. 2 is a schematic diagram of a robot error calibration platform based on binocular vision measurement in an embodiment of the present invention;
FIG. 3 is a schematic illustration of the connection of a robot end flange clamping calibration plate in an embodiment of the present invention;
FIG. 4 is a diagram depicting a robot distance error calibration trajectory in an embodiment of the invention;
fig. 5 is a schematic diagram of distance error calibration in an embodiment of the invention.
Detailed Description
To make the technical means, creative features, objectives and effects of the invention easy to understand, the following embodiment describes a robot error calibration method based on binocular vision measurement with reference to the accompanying drawings.
In this embodiment, a robot error calibration method based on binocular vision measurement is provided.
Fig. 1 is a flowchart of a robot error calibration method based on binocular vision measurement in the present embodiment.
As shown in fig. 1, the robot error calibration method based on binocular vision measurement in the present embodiment includes the following steps:
step S1, a robot error calibration platform based on binocular vision measurement is built and debugged, a robot kinematic model is deduced, and a robot error compensation model is built.
Fig. 2 is a schematic diagram of a robot error calibration platform based on binocular vision measurement in the present embodiment.
As shown in fig. 2, the robot error calibration platform 1000 based on binocular vision measurement in the present embodiment includes a robot module 100, a binocular vision measurement module 200, and a data processing module 300.
The robot module 100 includes a robot body 10, a robot control cabinet 2, and a high-precision calibration plate 9 mounted at the end of the robot for camera calibration.
The binocular vision measurement module 200 comprises a signal connection line 1, two identical high-precision CMOS cameras 5 and 7, a tripod 3, ball-joint heads 4 and a strip-shaped rotary table 6.
The tripod 3 can rotate and be raised or lowered. The strip-shaped rotary table 6 is mounted on the tripod 3, and the ball-joint heads 4 are mounted on the rotary table 6; the ball-joint heads 4 can move along the transverse direction of the rotary table 6 to adjust the camera positions. The binocular cameras mounted on the two ball-joint heads 4 can rotate in three degrees of freedom so that the camera lenses can conveniently be adjusted to a suitable position. According to the focal length of the lenses, the tripod 3 is preferably arranged directly in front of the robot 10, and the camera lenses on the ball-joint heads 4 are placed as parallel as possible to reduce the calculation errors generated in the calibration process.
The support head in this embodiment thus consists of the rotary table 6 arranged on the lifting tripod 3, with the binocular vision cameras 5 and 7 arranged on the three-degree-of-freedom heads 4 on the rotary table 6; the lifting tripod 3 is a floor-standing tripod.
The data processing module 300 comprises a tooling table 11, an upper display 12 and a computer 13.
Fig. 3 is a schematic connection diagram of the robot end flange clamping calibration plate in this embodiment.
In this embodiment, the two high-precision CMOS cameras 5 and 7 measure circumferential trajectory position points in a planar spatial region directly in front of the robot 10, and their fields of view completely cover this planar region, to avoid measurement points falling outside the field of view or large measurement errors that cannot be eliminated. As shown in fig. 3, in this embodiment it must be ensured that the end flange surface of the robot 10 can be effectively measured as a feature point under any motion condition within the target plane. The two high-precision CMOS cameras 5 and 7 together serve as the binocular vision cameras.
The upper display 12, the computer 13 and the binocular vision cameras 5 and 7 perform data transmission through the signal connecting line 1, so that the stable transmission of data at high speed and high frame rate under the working condition of the multi-camera system is ensured.
In this embodiment, a robot kinematic model is derived from the robot physical dimensions, and a differential motion principle is introduced to build a robot error compensation model.
All measurement coordinate systems in the invention are based on a spatial Cartesian coordinate system. The robot kinematic model is established by the D-H method: the joint values $[q_1, q_2, q_3, q_4, q_5, q_6]$ of the robot joint axes are the input, and the end pose $T_p$ reached by the robot is the output of the kinematic system. With this method, the forward kinematic expression from the $i$-th link coordinate system to the $(i+1)$-th link coordinate system of the robot is:

$$^{i}_{i+1}T = \mathrm{Rot}(Z,\theta_i)\,\mathrm{Trans}(Z,d_i)\,\mathrm{Trans}(X,a_i)\,\mathrm{Rot}(X,\alpha_i)$$

Expanded into a homogeneous transformation matrix, this equation takes the form

$$^{i}_{i+1}T = \begin{bmatrix} c\theta_i & -s\theta_i c\alpha_i & s\theta_i s\alpha_i & a_i c\theta_i \\ s\theta_i & c\theta_i c\alpha_i & -c\theta_i s\alpha_i & a_i s\theta_i \\ 0 & s\alpha_i & c\alpha_i & d_i \\ 0 & 0 & 0 & 1 \end{bmatrix}$$

where $c\theta_i$ is the abbreviation of $\cos\theta_i$, $s\theta_i$ of $\sin\theta_i$, and the remaining similar terms follow the same convention.
According to the serial connection between the robot links and the principle of the D-H method, the kinematic expression of the industrial six-axis robot from the base to the end effector is obtained:

$$^{0}_{6}T = {}^{0}_{1}T\;{}^{1}_{2}T\;{}^{2}_{3}T\;{}^{3}_{4}T\;{}^{4}_{5}T\;{}^{5}_{6}T$$

Written as a homogeneous transformation matrix, this expression takes the form

$$^{0}_{6}T = \begin{bmatrix} n_x & o_x & a_x & p_x \\ n_y & o_y & a_y & p_y \\ n_z & o_z & a_z & p_z \\ 0 & 0 & 0 & 1 \end{bmatrix}$$

where the first three columns give the orientation of the end effector and the last column gives its position.
and S2, deducing and establishing a robot distance error model by adopting an M-DH method based on the robot kinematics model and the robot error compensation model. And obtaining the mapping relation between the error of the distance between two adjacent points on the theoretical track of the robot and the distance between two adjacent points on the actual track and the geometrical parameters of the robot body.
And step S3, selecting a proper binocular camera according to the working range of the robot to measure the end position so as to obtain binocular camera system data.
The binocular camera system data comprise the binocular camera distortion, the intrinsic parameters and the world coordinate system. A camera and lens with suitable resolution, field of view and focal length are selected according to the size of the robot body and its working movement range; the robot clamps a high-precision optical calibration plate (with an accuracy of 0.01 mm) and moves it within the working range, while the positions of the binocular cameras are adjusted so that the cameras can accurately capture the corner points of the calibration plate, and the binocular camera distortion, intrinsic parameters and world coordinate system are calibrated by an algorithm.
First, the calibration plate 9 is mounted on the flange clamping plate 8 at the robot end, and pictures of the calibration plate under different robot postures are acquired with the binocular camera; 10 to 20 groups of pictures are needed in total.
In this embodiment, a binocular camera calibration and measurement program is written on the basis of the OpenCV 4.1.0 vision algorithm library in the Visual Studio 2019 development environment, and the relation between the intrinsic parameters of the binocular camera and the world coordinate system is obtained.
The intrinsic parameters obtained from the camera calibration in this example take the following form, where $f_x$ and $f_y$ denote the focal length along the x-axis and y-axis of the pixel coordinate system, and $\sigma_x$ and $\sigma_y$ denote the actual position of the principal point in the pixel plane (since the camera lens used in this example is not a fisheye lens, tangential distortion can be disregarded and only the radial distortion is calibrated):

$$K = \begin{bmatrix} f_x & 0 & \sigma_x \\ 0 & f_y & \sigma_y \\ 0 & 0 & 1 \end{bmatrix}$$
The world coordinate system of the camera calibration in this example takes the left camera as the reference coordinate system; the spatial position of the right camera relative to the left camera is expressed as the homogeneous transform

$$\begin{bmatrix} R & t \\ 0 & 1 \end{bmatrix}$$

where $R$ is the rotation and $t$ the translation of the right camera with respect to the left camera.
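As an illustration of how such a binocular calibration can be carried out with OpenCV, the following Python sketch (not the patent's actual Visual Studio program; the file names, checkerboard geometry and calibration flags are assumptions) estimates the per-camera intrinsics, the radial distortion and the stereo extrinsics R, T relating the left and right camera frames from the captured calibration-plate image pairs:

```python
import glob
import cv2
import numpy as np

# Hypothetical checkerboard geometry of the calibration plate (inner corners, square size in mm).
pattern, square = (11, 8), 15.0
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square
criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-6)

obj_pts, left_pts, right_pts, size = [], [], [], None
for lf, rf in zip(sorted(glob.glob("left_*.png")), sorted(glob.glob("right_*.png"))):
    gl = cv2.imread(lf, cv2.IMREAD_GRAYSCALE)
    gr = cv2.imread(rf, cv2.IMREAD_GRAYSCALE)
    okl, cl = cv2.findChessboardCorners(gl, pattern)
    okr, cr = cv2.findChessboardCorners(gr, pattern)
    if okl and okr:
        size = gl.shape[::-1]
        obj_pts.append(objp)
        left_pts.append(cv2.cornerSubPix(gl, cl, (11, 11), (-1, -1), criteria))
        right_pts.append(cv2.cornerSubPix(gr, cr, (11, 11), (-1, -1), criteria))

# Per-camera intrinsics K = [[fx,0,cx],[0,fy,cy],[0,0,1]] and distortion coefficients.
_, K1, D1, _, _ = cv2.calibrateCamera(obj_pts, left_pts, size, None, None)
_, K2, D2, _, _ = cv2.calibrateCamera(obj_pts, right_pts, size, None, None)
# Stereo extrinsics R, T: map points expressed in the left-camera frame into the right-camera frame.
_, K1, D1, K2, D2, R, T, _, _ = cv2.stereoCalibrate(
    obj_pts, left_pts, right_pts, K1, D1, K2, D2, size,
    flags=cv2.CALIB_FIX_INTRINSIC)
```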
Step S4: the binocular camera system data are processed, binocular vision image correction and matching are carried out to obtain the binocular vision images, and the distance error term of the robot distance error model is obtained from the binocular vision images. This step comprises the following sub-steps:
Step S4-1: a motion control program of the robot, i.e. the given motion trajectory and operating method of the robot, is written in the robot control software and drives the robot to move; in addition, a robot upper-computer program and a MATLAB data processing program are written in combination with the robot control software, the upper-computer program being responsible for processing the binocular camera system data.
Step S4-2: the robot is started and the motion control program is executed so that the end effector moves along a circle. The initial point is set as the circle centre and serves as the starting point of each subsequent trajectory; the radius of the circle is determined within the field of view of the cameras, a second trajectory point is selected on the circle, and the theoretical distance of the trajectory is calculated through the robot error compensation model.
Step S4-3: under the control of the robot upper-computer program and trigger signals, the cameras are controlled to capture images when the robot reaches the trajectory start point and end point, and the binocular images are corrected with the camera distortion values and intrinsic parameters to obtain the binocular vision images.
The binocular images are rectified according to the obtained intrinsic and extrinsic camera parameters so that their epipolar lines are aligned, which facilitates the subsequent calculation of the world coordinates of the feature points.
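A minimal Python sketch of this epipolar rectification step is given below; it assumes the outputs of the previous calibration sketch (K1, D1, K2, D2, R, T and the image size) and is only one possible realisation, not the patent's program:

```python
import cv2

def rectify_pair(left_img, right_img, K1, D1, K2, D2, R, T, size):
    """Align the epipolar lines of a stereo pair using the calibrated parameters."""
    R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K1, D1, K2, D2, size, R, T)
    m1x, m1y = cv2.initUndistortRectifyMap(K1, D1, R1, P1, size, cv2.CV_32FC1)
    m2x, m2y = cv2.initUndistortRectifyMap(K2, D2, R2, P2, size, cv2.CV_32FC1)
    left_rect = cv2.remap(left_img, m1x, m1y, cv2.INTER_LINEAR)
    right_rect = cv2.remap(right_img, m2x, m2y, cv2.INTER_LINEAR)
    # P1, P2 are the rectified 3x4 projection matrices needed later for triangulation.
    return left_rect, right_rect, P1, P2
```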
Step S4-4: the circular flange at the robot end is taken as the feature, the feature points in the binocular vision images are extracted, and the feature points are calculated in the world coordinate system by combining the binocular calibration result to obtain the feature point position information.
In this example the circular end flange of the robot is used as the feature for extracting the centre coordinates of the image feature points; since industrial robots generally have a circular end flange, this measurement method can be applied to end position measurement of all robots with this feature.
The feature points in the left and right images are then matched to ensure that they express the same point in real space, and the radius and roundness of the circular end flange are used as constraints to filter out other redundant feature points in the captured pictures. Three image thresholds F, F_y and L need to be determined during matching: F is the picture division threshold, which prevents an excessive number of redundant points during full-picture matching; F_y is the error threshold in the pixel Y direction and is used to screen matched points that correspond to the same point in the left and right images; and L is the epipolar line length error threshold. Because the binocular images have been epipolar-aligned, the matching line length of each feature point should be equal, and this value is used to further screen the matching points.
Finally, using the calibrated intrinsic and extrinsic parameters of the binocular camera together with the extracted and matched feature points, the coordinates of the centre of the circular end flange relative to the world coordinate system are calculated, giving the feature point position information.
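The following Python sketch illustrates one plausible realisation of this step, using a Hough circle transform as a stand-in for the patent's flange feature extraction and OpenCV triangulation to recover the flange centre in the left-camera (world) frame; the detector parameters and function names are assumptions, not the patent's actual program:

```python
import cv2
import numpy as np

def flange_center(gray):
    """Detect the circular end flange in a rectified grayscale image and return its pixel centre.
    Parameters are illustrative; real use would add the radius/roundness constraints described above."""
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.5, minDist=200,
                               param1=120, param2=60, minRadius=30, maxRadius=200)
    if circles is None:
        return None
    x, y, _ = circles[0, 0]
    return np.array([x, y], dtype=np.float64)

def triangulate(P1, P2, pt_left, pt_right):
    """World coordinates of the matched flange centre from the rectified projection matrices."""
    X = cv2.triangulatePoints(P1, P2, pt_left.reshape(2, 1), pt_right.reshape(2, 1))
    return (X[:3] / X[3]).ravel()   # homogeneous -> Cartesian, left-camera/world frame
```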
Step S4-5: the feature point position information is stored in MATLAB and combined with the binocular camera system data to obtain the distance error term of the robot distance error model.
In this embodiment, differential kinematics is first applied on the premise that the actual pose error of the robot is small, so that the differential motion of a given link of the robot can be regarded as the combined result of differential translations along and differential rotations about the three axes of the Cartesian coordinate system. The differential motion can be expressed as:

$$dT = T \cdot \mathrm{Rot}(X,Y,Z) \cdot \mathrm{Trans}(X,Y,Z)$$

Then the differential operators are introduced into all six serially connected links of the robot. Because the D-H model is discontinuous at parallel joints, a rotation angle $\beta$ about the Y axis is introduced there, and the M-DH method is used to establish the mapping relation between the differential motion operator of each joint and the geometric motion parameter errors of each joint:

$$P(dx,dy,dz) = G(i)\,\xi(\theta_i, d_i, a_i, \alpha_i, \beta_i)$$

Next, the differential errors of all links are transformed to the end coordinate system through the differential kinematics principle and combined with the mapping relation between the position differential operator and the geometric kinematic parameter errors, giving the identification equation between the position error in the end coordinate system and the geometric parameter errors of all robot connecting rods:

$$P(Dx,Dy,Dz) = F(i)_{3\times 30}\,\xi(i)_{30\times 1}$$

Finally, to facilitate the subsequent identification of the geometric parameter errors of the robot connecting rods, the distances between the start point O and the end points P of the successive measured positions should be kept as equal as possible. A circular plane is therefore generated by taking a point within the field of view of the binocular camera as the circle centre O and selecting a radius R that remains inside the field of view; this plane is parallel to the camera imaging plane.
Fig. 4 is a diagram illustrating a robot distance error calibration trajectory in the present embodiment.
The robot is controlled to move from the circle centre O as the starting point to an arbitrary distinct point $P_i$ on the circle as the end point, and a set of parameter identification equations is established for each such segment; the specific point positions and the robot control flow are shown in fig. 4. This ensures that the distance l is the same in every set of parameter identification equations, and more than 30 sets of parameter identification equations are obtained in total to form an equation system.
Every time the robot moves to a point $P_i$ on the circle, an IO rising edge is triggered to make the cameras record an image of the current robot position, the upper computer reads the current TCP pose information from the robot controller, and the data are then stored into MATLAB. The robot repeatedly moves back to the circle centre O and then out to the circle again; combined with the signal-controlled camera image acquisition, the calibration process can be automated, which greatly improves the robot error calibration efficiency.
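For illustration, the circular set of target points and the constant theoretical segment length could be generated as in the following Python sketch; the start point, radius, point count and the assumption that the circle lies in the base X-Y plane are hypothetical choices, not values from the patent:

```python
import numpy as np

def circle_targets(center, radius, n_points=32):
    """Target points P_i evenly spaced on the calibration circle around the start point O.
    `center` is the TCP position at O in the robot base frame; the circle is assumed here
    to lie in the base X-Y plane so that it stays parallel to the camera imaging plane."""
    angles = np.linspace(0.0, 2.0 * np.pi, n_points, endpoint=False)
    return [center + radius * np.array([np.cos(a), np.sin(a), 0.0]) for a in angles]

O = np.array([800.0, 0.0, 600.0])           # illustrative start point, mm
for P in circle_targets(O, radius=100.0):
    l_theory = np.linalg.norm(P - O)         # equal to the radius for every segment
    # ...command the move O -> P here and trigger the IO rising edge so that
    #    both cameras capture images at the start point and the end point.
```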
Step S5: the geometric parameter errors of the robot connecting rods are identified based on the distance error model, and the error parameters obtained by calibration are compensated into the initial geometric parameter error model to achieve the kinematic calibration effect.
To make the spatial error compensation of the robot universal, the method starts from models such as the D-H model that can describe the robot structural parameters: the theoretical parameters in the D-H parameter table are corrected, and the corrected D-H model is used to describe the actual spatial motion of the robot, thereby realizing robot motion error compensation; the model is corrected repeatedly.
Fig. 5 is a schematic diagram of the distance error calibration in this embodiment.
In this embodiment, any motion path of the robot in space can first be regarded as the result of linear motions interpolated between a number of points, so two points on the robot motion path are extracted as shown in fig. 5, where $P_1$ and $P_1'$ denote the actual and theoretical start points and $P_2$ and $P_2'$ the actual and theoretical end points. The actual start and end points form the vector $\overrightarrow{P_1P_2}$, and the theoretical start and end points form the vector $\overrightarrow{P_1'P_2'}$. The difference between the actual movement distance and the theoretical movement distance is expressed as:

$$\Delta l = \left|\overrightarrow{P_1P_2}\right| - \left|\overrightarrow{P_1'P_2'}\right|$$

Secondly, the relation between the position error and all geometric parameter errors in the robot end coordinate system obtained in step S4-5 is substituted into the expression of the distance error, the distance error is decomposed into components along the X, Y and Z directions of the spatial coordinate system, and the identification equation between the distance error and the geometric parameter errors of the robot connecting rods is established as:

$$\Delta l = (F(2) - F(1))_{3\times 30}\,\xi(i)_{30\times 1}$$

where $\Delta l$ is the trajectory distance error, $(F(2)-F(1))_{3\times 30}$ is the difference of the error identification matrices between the start point and the end point, and $\xi(i)_{30\times 1}$ is the geometric parameter error of the robot connecting rods to be identified. Since the number of robot error parameters based on the M-DH model is 30, at least 30 groups of spatial-point distance errors need to be measured.
Finally, the data field from the robot upper computer and the data obtained by processing the binocular vision images are integrated and stored into MATLAB, and a parameter identification program is established according to the least-squares parameter identification principle. To make the robot spatial motion error compensation universal, the method starts from the D-H model, which can describe the robot structural parameters: the robot D-H parameter table is established as below, the geometric parameter error corrections obtained from the parameter identification are put into the D-H model, and the corrected D-H model is used to describe the actual spatial motion of the robot, thereby realizing robot motion error compensation.
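A compact Python sketch of such a least-squares identification stage is given below. It assumes the 3x30 error-mapping matrices F for the start and end point of each trajectory segment and the XYZ-decomposed distance errors are already available from the measurement pipeline; the stacking scheme and variable names are illustrative, not the patent's MATLAB program:

```python
import numpy as np

def identify_link_errors(F_pairs, dl_xyz):
    """Least-squares identification of the 30 M-DH link parameter errors xi.
    F_pairs : list of (F_start, F_end) pairs of 3x30 mapping matrices, one pair per segment.
    dl_xyz  : list of 3-vectors, the measured-minus-theoretical distance error of each
              segment decomposed along X, Y, Z."""
    A = np.vstack([F_end - F_start for F_start, F_end in F_pairs])   # (3N) x 30
    b = np.concatenate([np.asarray(d, dtype=float) for d in dl_xyz])  # (3N,)
    xi, *_ = np.linalg.lstsq(A, b, rcond=None)   # least-squares solution, shape (30,)
    return xi
```

Once the stacked system contains at least 30 independent rows, the 30 M-DH parameter errors can be solved in the least-squares sense and compensated back into the D-H table.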
Table 1 shows the robot D-H parameter table.
TABLE 1
It should be noted that the explanation of the method embodiment is also applicable to the device of the similar embodiment, and will not be repeated here.
In summary, the industrial robot error calibration method based on binocular vision measurement provided in this embodiment first determines the geometric error model of the robot; secondly, a suitable binocular camera is selected according to the working range of the robot to measure the end position; then the acquired images and the binocular camera system data are processed; and finally the geometric parameter errors of the robot connecting rods are identified based on the distance error model, and the error parameters obtained by calibration are compensated into the initial geometric model to achieve the kinematic calibration effect and further improve the positioning accuracy of the robot. At the same time, the binocular vision image acquisition module controlled by programmable IO signals automates the whole error calibration process, which improves the robot error calibration efficiency and makes the method more economical and easier to operate than traditional methods.
Function and effects of the embodiments
According to the robot error calibration method based on binocular vision measurement of this embodiment, the calibration steps are as follows: step 1, building and debugging a robot error calibration platform based on binocular vision measurement, deriving a robot kinematic model, and establishing a robot error compensation model; step 2, establishing a robot distance error model based on the robot kinematic model and the robot error compensation model; step 3, selecting a suitable binocular camera according to the working range of the robot and measuring the end position to obtain binocular camera system data; step 4, processing the binocular camera system data and performing binocular vision image correction and matching to obtain binocular vision images, then obtaining the distance error term of the robot distance error model from the binocular vision images; and step 5, identifying the geometric parameter errors of the robot connecting rods based on the distance error model, and compensating the error parameters obtained by calibration into the initial geometric parameter error model to achieve the kinematic calibration effect.
Therefore, the measuring and calibration tool of the robot error calibration method based on binocular vision measurement in this embodiment is a binocular vision camera, whose cost is greatly reduced compared with the commonly used laser tracker. At the same time, with binocular vision as the calibration tool, the robot control cabinet can trigger image capture through signals, so the calibration process can easily be automated, and the cameras can further be used for subsequent vision guidance, making the whole robot operation procedure compact.
In addition, compared with calibration using a laser tracker or a high-precision double ballbar, the robot error calibration method based on binocular vision measurement saves a large amount of cost and field arrangement time.
In addition, with the robot error calibration method based on binocular vision measurement, only the camera needs to be recalibrated before the robot is put into subsequent vision-guided applications, which improves the application efficiency of the robot.
In addition, the robot error calibration method based on binocular vision measurement of this embodiment can be applied to precision robot applications such as robot polishing, drilling and welding.
The above embodiments are preferred examples of the present invention, and are not intended to limit the scope of the present invention.

Claims (6)

1. A robot error calibration method based on binocular vision measurement, characterized by comprising the following steps:
step 1, a robot error calibration platform based on binocular vision measurement is built and debugged, a robot kinematic model is deduced, and a robot error compensation model is built;
step 2, a robot distance error model is established based on the robot kinematics model and the robot error compensation model;
step 3, selecting a proper binocular camera according to the working range of the robot to measure the end position to obtain binocular camera system data;
step 4, processing binocular camera system data, correcting and matching binocular vision images to obtain binocular vision images, and obtaining distance error items of the robot distance error model according to the binocular vision images;
and 5, identifying the geometric parameter error of the robot connecting rod based on the distance error item, and compensating the error parameter obtained by calibration into a geometric parameter error model to realize the kinematic calibration effect.
2. The robot error calibration method based on binocular vision measurement according to claim 1, wherein:
in step 1, the robot kinematic model is deduced according to the overall dimension of the robot, and a differential motion principle is introduced to build the robot error compensation model.
3. The robot error calibration method based on binocular vision measurement according to claim 1, wherein:
in the step 2, an M-DH method is adopted to deduce and establish the robot distance error model.
4. The robot error calibration method based on binocular vision measurement according to claim 1, wherein:
wherein in step 3, the binocular camera system data includes binocular camera distortion, internal parameters and world coordinate system,
according to the size of the robot body and its working movement range, a camera and a camera lens with suitable resolution, field of view and focal length are selected, the high-precision optical calibration plate is clamped by the robot and moved within the working range of the camera, the position of the binocular camera is adjusted at the same time to ensure that the camera can accurately capture the corner points of the calibration plate, and the binocular camera distortion, internal parameters and world coordinate system are calibrated by an algorithm.
5. The robot error calibration method based on binocular vision measurement according to claim 1, wherein:
wherein, the step 4 is divided into the following sub-steps:
step 4-1, a motion control program of the robot is written on the robot control software, the robot is driven to move by the motion control program, and in addition, an upper computer program of the robot is written to process the binocular camera system data;
step 4-2, starting the robot and executing the motion control program, enabling the end effector to move according to a circle, setting an initial point as a circle center, enabling the initial point to be a starting point of a subsequent track, determining the radius of the circle in the visual field range of the camera, and calculating the theoretical distance of the track through the error compensation model of the robot at a second point of the track searched on the circle;
step 4-3, under the control of the robot upper computer program and signals, controlling a camera to capture images when the robot reaches the track start point and the track end point, and correcting the binocular images with the camera distortion values and the internal parameters to obtain binocular vision images;
step 4-4, taking the round flange at the tail end of the robot as a feature, extracting feature points in the binocular vision images, and calculating the feature points in a world coordinate system by combining the binocular calibration result to obtain feature point position information;
and 4-5, storing the position information of the characteristic points in MATLAB and combining the binocular camera system data to obtain the distance error item of the robot distance error model.
6. The robot error calibration method based on binocular vision measurement according to claim 1, wherein:
in step 5, starting from the D-H model, correcting theoretical parameters in the D-H parameter table, and using the corrected D-H model to describe the actual robot spatial motion, so as to implement robot motion error compensation, and repeatedly correcting the model.
CN202210132480.1A (filed 2022-02-14, priority 2022-02-14): Robot error calibration method based on binocular vision measurement. Status: Pending. Publication: CN116619350A (en).

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210132480.1A CN116619350A (en) 2022-02-14 2022-02-14 Robot error calibration method based on binocular vision measurement

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210132480.1A CN116619350A (en) 2022-02-14 2022-02-14 Robot error calibration method based on binocular vision measurement

Publications (1)

Publication Number Publication Date
CN116619350A true CN116619350A (en) 2023-08-22

Family

ID=87608581

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210132480.1A Pending CN116619350A (en) 2022-02-14 2022-02-14 Robot error calibration method based on binocular vision measurement

Country Status (1)

Country Link
CN (1) CN116619350A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
• CN117490571A * (中国石油大学(华东)): Double-plane mirror installation error measurement method for mirror image vision measurement system; priority 2024-01-02, published 2024-02-02
• CN117490571B * (中国石油大学(华东)): Double-plane mirror installation error measurement method for mirror image vision measurement system; priority 2024-01-02, published 2024-03-22

Similar Documents

Publication Publication Date Title
CN111775146B (en) Visual alignment method under industrial mechanical arm multi-station operation
CN109794938B (en) Robot hole-making error compensation device and method suitable for curved surface structure
JP4021413B2 (en) Measuring device
Tsai et al. A new technique for fully autonomous and efficient 3 d robotics hand/eye calibration
US5297238A (en) Robot end-effector terminal control frame (TCF) calibration method and device
TWI670153B (en) Robot and robot system
CN110666798B (en) Robot vision calibration method based on perspective transformation model
CN111801198B (en) Hand-eye calibration method, system and computer storage medium
CN113001535B (en) Automatic correction system and method for robot workpiece coordinate system
CN111127568B (en) Camera pose calibration method based on spatial point location information
CN112070133B (en) Three-dimensional space point positioning method based on distance measuring instrument and machine vision
WO2009059323A1 (en) A method and system for finding a tool center point for a robot using an external camera
Hu et al. Automatic calibration of hand–eye–workspace and camera using hand-mounted line laser
US20200262080A1 (en) Comprehensive model-based method for gantry robot calibration via a dual camera vision system
CN113146620A (en) Binocular vision-based double-arm cooperative robot system and control method
CN114643578B (en) Calibration device and method for improving robot vision guiding precision
CN112917513A (en) TCP calibration method of three-dimensional dispensing needle head based on machine vision
CN112658643A (en) Connector assembly method
CN116026252A (en) Point cloud measurement method and system
CN116619350A (en) Robot error calibration method based on binocular vision measurement
CN111591474A (en) Alignment type hand-eye calibration method for spacecraft on-orbit operating system
JP2682763B2 (en) Automatic measurement method of operation error of robot body
CN110962127A (en) Auxiliary calibration device for tail end pose of mechanical arm and calibration method thereof
CN112907682A (en) Hand-eye calibration method and device for five-axis motion platform and related equipment
CN110533727B (en) Robot self-positioning method based on single industrial camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination