CN113580137A - Mobile robot base-workpiece relative pose determination method based on vision measurement

Mobile robot base-workpiece relative pose determination method based on vision measurement

Info

Publication number
CN113580137A
CN113580137A (application CN202110923264.4A)
Authority
CN
China
Prior art keywords
robot
relative
workpiece
error
pose
Prior art date
Legal status: Granted
Application number
CN202110923264.4A
Other languages
Chinese (zh)
Other versions
CN113580137B (en)
Inventor
丁雅斌
付津昇
黄田
刘海涛
宋咏傧
肖聚亮
Current Assignee
Tianjin University
Original Assignee
Tianjin University
Priority date
Filing date
Publication date
Application filed by Tianjin University
Priority to CN202110923264.4A
Publication of CN113580137A
Application granted
Publication of CN113580137B
Legal status: Active
Anticipated expiration

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/1605 Simulation of manipulator lay-out, design, modelling of manipulator
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00 Manipulators not otherwise provided for
    • B25J11/005 Manipulators for mechanical processing tasks
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/1607 Calculation of inertia, jacobian matrixes and inverses
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems

Abstract

The invention discloses a method for determining the relative pose between the base of a mobile robot and a workpiece based on vision measurement. First, mathematical models of the relative pose relationship between the two coordinate systems are established from the measurement information of a laser tracker and of a vision sensor, respectively. Then, taking the former as the reference and minimizing the deviation of the latter from the former over a finite set of configurations, the coefficients of a multiple linear regression model of the driving-joint and virtual-joint motion errors of the robot are identified. Finally, using this model, a strategy and an algorithm are provided for determining the relative pose between the two coordinate systems from only a single online vision measurement. The method replaces the laser tracker with a vision measurement system while keeping the positioning accuracy of the robot unchanged, thereby significantly improving the machining efficiency of the robot and reducing equipment cost.

Description

Mobile robot base-workpiece relative pose determination method based on vision measurement
Technical Field
The invention relates to a method for determining the relative pose between the base of a mobile robot and a workpiece, and in particular to a pose deviation compensation method for the base and workpiece of a mobile robot that constructs a multiple regression model from vision measurement information.
Background
At present, mobile workstations formed by mounting an industrial robot on an AGV (automated guided vehicle) are becoming an important development trend for completing on-site local milling, hole making, grinding, assembly, and similar operations on very large components, and they are gradually being applied in aerospace, rail transit, power-generation equipment, and other fields.
Quickly and accurately determining the pose of the mobile robot base coordinate system relative to the workpiece coordinate system at each station of a machining site is essential to guaranteeing the machining accuracy of the mobile robot. The mainstream methods available can be roughly divided into two categories according to the type of external measurement sensor used. The first category is based on laser tracker measurements. Its measurement accuracy is high, but when the overall structure is complex in shape, the number of mobile stations is large, or the measurement range is wide, several laser trackers must be deployed simultaneously, or a single tracker must be relocated station by station, to overcome line-of-sight obstruction. Obviously, the former scheme is too costly, while the latter is less efficient and accumulates larger errors. The second category is machine vision measurement, in which the coordinates of target points mounted on the workpiece are measured by a three-dimensional vision sensor mounted on the robot end effector, a homogeneous transformation matrix of the workpiece coordinate system relative to the vision coordinate system is constructed, and the relationship between the robot base coordinate system and the workpiece coordinate system is then constructed indirectly through hand-eye transformation and similar operations. Compared with a laser tracker, a vision sensor offers reasonable measurement accuracy at lower cost, and multi-station automatic measurement is easy to realize through integration with a numerical control system. Note, however, that the robot body still has pose errors even after calibration, and hand-eye calibration is usually performed on this basis, so their accumulated error causes the pose determined by the indirect method to differ from that determined by the direct method. In other words, if the pose determined by the laser tracker measurement system is taken to be correct, the pose measured by the vision system carries an error, and the adjustable parameters in the accuracy transmission path of the vision measurement system must therefore be identified and corrected to ensure that the two results agree well.
Disclosure of Invention
Aiming at the deficiencies of the prior art, the invention provides a method for determining the relative pose of the base of a mobile robot and a workpiece based on vision measurement.
The method for determining the relative pose of the base of the mobile robot and the workpiece based on vision measurement comprises the following steps:
Step one, establishing the relative pose relationship between the robot base coordinate system and the workpiece coordinate system constructed from the measurement information of a laser tracker and of a vision sensor, comprising the following steps:
(1) Establishing coordinate systems: at a given station of the mobile robot, a body-fixed workpiece coordinate system {W} is established on the workpiece, a body-fixed base coordinate system {B} on the robot base, a body-fixed tool coordinate system {T} on the tool, and a body-fixed vision sensor coordinate system {V} on the three-dimensional vision sensor;
(2) Let $^B\!A_{W,L}$ denote the homogeneous transformation matrix of {W} relative to {B} determined with the laser tracker at the given station of the mobile robot;
(3) Given a configuration of the robot at the given station of the mobile robot, solve for $^B\!A_{W,V}$, the homogeneous transformation matrix of {W} relative to {B} constructed from the machine vision measurements;
Step two, constructing a configuration-dependent deviation model of the robot joint variable motion:

$$^B\xi_{W/B,V} \approx {}^B\xi_{T/B} = \xi, \qquad \xi = T_a\rho_a + T_c\rho_c = T\rho$$

$$\rho_a = \eta_a + W_a^{\mathrm T}\zeta, \qquad \rho_c = W_c^{\mathrm T}\zeta$$

where $^B\xi_{W/B,V}$ denotes the measure in {B} of the 6-dimensional pose deviation between the pose $^B\!A_{W,V}$ of {W} relative to {B} obtained by machine vision and its reference value $\overline{{}^B\!A_W}$, called the pose deviation twist of {W} relative to {B}; $^B\xi_{T/B}$ denotes the measure in {B} of the 6-dimensional pose error between the pose of {T} relative to {B} and its nominal value obtained with the inverse kinematics model, called the pose error twist of {T} relative to {B}; $T_a$ is the permitted-motion Jacobian of the robot; $T_c$ is the restricted-motion Jacobian of the robot; $T = [T_a\ T_c]$ is the generalized motion Jacobian; $\rho = [\rho_a^{\mathrm T}\ \rho_c^{\mathrm T}]^{\mathrm T}$ denotes the joint motion error vector of the robot; $\rho_a$ denotes the motion error vector of the driving joints of the robot; $\rho_c$ denotes the motion error vector of the virtual joints of the robot; $\zeta$ denotes the sum of the error twists of {T} relative to {B} caused by unmodeled error sources in the robot body other than the zero-return errors of the driving joints; $\eta_a$ denotes the zero-return error vector of the driving joints; $W_a$ is the driving-force Jacobian of the robot; $W_c$ is the constraint-force Jacobian of the robot;
Step three, according to the deviation model of step two, constructing an error fitting model of {T} relative to {B} and estimating its coefficient vector β, with the following specific steps:

First, construct the error fitting model:

$$\xi = TZ\beta$$

where ξ denotes the pose error twist of the robot, Z denotes the motion error fitting matrix of the driving joints and virtual joints, and β denotes the coefficient vector of the driving-joint and virtual-joint motion error fit;

Second, identify the coefficient vector β in the regression model $y = X\beta + \varepsilon$ by a damped least-squares algorithm to obtain a reliable estimate $\hat\beta$ of β, where:

$$y = \begin{bmatrix}\xi_1\\ \xi_2\\ \vdots\\ \xi_K\end{bmatrix}, \qquad X = \begin{bmatrix}T_1 Z_1\\ T_2 Z_2\\ \vdots\\ T_K Z_K\end{bmatrix}$$

Here y denotes the vector formed by the observations of ξ; $\xi_1, \xi_2, \ldots, \xi_K$ denote the observed values of ξ in configurations 1, 2, ..., K; X denotes the regression design matrix (identification matrix); $T_1, T_2, \ldots, T_K$ denote the computed values of T in configurations 1, 2, ..., K, and $Z_1, Z_2, \ldots, Z_K$ the computed values of Z in configurations 1, 2, ..., K; $\varepsilon \sim N(0, \sigma^2 I_{6K})$ denotes measurement noise following a zero-mean normal distribution, and $I_{6K}$ denotes the 6K × 6K identity matrix;
Step four, determining the relative pose of the robot base coordinate system and the workpiece coordinate system through a single online vision measurement, with the following specific steps:

First, a region to be machined on the workpiece is given, three target points are installed in this region to form the workpiece coordinate system {W}, and the robot is placed in the reference configuration, i.e., the reference point of the robot moving platform is brought to the midpoint of the task space;

Second, a measurement configuration of {T} relative to {B} for robot vision measurement is planned such that the center of the three-dimensional vision sensor is aligned with the center of the circumscribed circle of the triangle formed by the centers of the three target balls mounted on the workpiece in the first step; the coordinates of the three target-ball centers under {V} are measured, and the nominal homogeneous transformation matrix $\overline{{}^B\!A_T}$ of {T} relative to {B} is constructed from them;

Fourth, the nominal driving joint variables q in the measurement configuration are determined with the inverse position model of the robot, and the motion error fitting matrix Z(q) and the generalized motion Jacobian T(q) at q are computed;

Fifth, the robot is driven by q from the reference configuration to the measurement configuration and held there; the AGV is moved so that the vision sensor carried on the A-axis of the A/C swivel head enters the preset position; the coordinates of the three target-ball centers are measured with the vision sensor, and the homogeneous transformation matrix $^V\!A_W$ of {W} relative to {V} is constructed;

Sixth, from $\overline{{}^B\!A_T}$, $^T\!A_V$, and $^V\!A_W$ constructed in the second and fifth steps, the homogeneous transformation matrix $^B\!A_{W,V}$ of {W} relative to {B} in the measurement configuration is computed;

Seventh, based on the error fitting model, $\hat\xi = T(q)Z(q)\hat\beta$ is used to construct the homogeneous transformation $\Delta{}^B\!A_W = (I_4 + \hat{\tilde\xi})^{-1}$, which is right-multiplied by $^B\!A_{W,V}$, so that the homogeneous transformation matrix of {W} relative to {B} determined by the compensated machine vision measurement is

$$\widehat{{}^B\!A_{W,V}} = \Delta{}^B\!A_W\,{}^B\!A_{W,V}$$

where $\hat\xi$ denotes the estimate of $^B\xi_{W/B,V}$, $\widehat{{}^B\!A_{W,V}}$ denotes the compensated result of $^B\!A_{W,V}$, and $\Delta{}^B\!A_W$ denotes the compensation applied to $^B\!A_{W,V}$.
The invention has the following advantages: first, mathematical models of the relative pose relationship between the two coordinate systems constructed from the measurement information of a laser tracker and of a vision sensor are established, respectively; then, taking the former as the reference and minimizing the deviation of the latter from the former over a finite set of configurations, the coefficients of a multiple linear regression model of the driving-joint and virtual-joint motion errors of the robot are identified; finally, using this model, a strategy and an algorithm are provided for determining the relative pose between the two coordinate systems from only a single online vision measurement. The vision measurement system replaces the laser tracker while the positioning accuracy of the robot is kept unchanged, so the machining efficiency of the robot is significantly improved and the equipment cost is reduced.
Drawings
FIG. 1 is a schematic diagram of the vision-based robot and workpiece pose measurement principle;
FIG. 2 shows the coordinate-system relationships for the vision-based determination of the robot and workpiece pose;
FIG. 3 is a schematic diagram of the measurement configuration and coordinate-system representation.
Detailed Description
The invention is described in detail below with reference to the figures and specific embodiments.
The invention provides a method for determining a relative pose of a base and a workpiece of a mobile robot based on vision measurement, which comprises the following steps:
Step one, establishing the relative pose relationship between the mobile robot base coordinate system and the workpiece coordinate system constructed from the measurement information of a laser tracker and of a vision sensor.
as a specific embodiment of the present invention, the method may include the steps of:
(1) As shown in FIG. 2, the coordinate systems are established. At a given station of the mobile robot, a body-fixed workpiece coordinate system {W} is established on the workpiece 1, a body-fixed base coordinate system {B} on the robot base 2, a body-fixed tool coordinate system {T} on the tool 3, and a body-fixed vision sensor coordinate system {V} on the three-dimensional vision sensor 4. Each coordinate system can be established by existing methods.
(2) Let $^B\!A_{W,L}$ denote the homogeneous transformation matrix of {W} relative to {B} determined with the laser tracker at the given station of the mobile robot.
As an embodiment of the present invention, $^B\!A_{W,L}$ is obtained as follows: the coordinates, in the laser tracker coordinate system {L}, of at least three non-collinear target points on the workpiece and on the robot base are measured with the laser tracker, respectively, and $^B\!A_{W,L}$ is fitted by least squares.
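A minimal sketch of this least-squares fitting step, assuming corresponding point sets are available in the frames to be related; the SVD (Kabsch) method used here is one standard choice, and all function names are illustrative rather than taken from the patent:

```python
import numpy as np

def fit_rigid_transform(p_src, p_dst):
    """Least-squares rigid transform mapping p_src onto p_dst.

    p_src, p_dst: (N, 3) arrays of corresponding points (N >= 3,
    non-collinear).  Returns a 4x4 homogeneous matrix A such that
    p_dst ~= R @ p_src + t, computed with the SVD (Kabsch) method.
    """
    c_src, c_dst = p_src.mean(axis=0), p_dst.mean(axis=0)
    H = (p_src - c_src).T @ (p_dst - c_dst)            # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                                 # proper rotation, det(R) = +1
    t = c_dst - R @ c_src
    A = np.eye(4)
    A[:3, :3], A[:3, 3] = R, t
    return A

# Frames {W} and {B}, each fitted in the tracker frame {L} from its own
# target points, combine into B_A_WL = inv(L_A_B) @ L_A_W.
```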
(3) Given a configuration of the robot at the given station of the mobile robot, solve for $^B\!A_{W,V}$, the homogeneous transformation matrix of {W} relative to {B} constructed from the machine vision measurements.
The solution process of $^B\!A_{W,V}$ is as follows:
$^B\!A_{W,V}$ can be further expressed as $^B\!A_{W,V} = {}^B\!A_T\,{}^T\!A_V\,{}^V\!A_W$, where $^V\!A_W$ is the homogeneous transformation matrix of frame {W} relative to frame {V}, determined by measuring with the vision sensor the coordinates in {V} of three non-collinear target spheres affixed to the workpiece; $^T\!A_V$ is the homogeneous transformation matrix of {V} relative to {T}, which can be determined as a constant matrix through offline hand-eye calibration of the robot; and $^B\!A_T$ is the homogeneous transformation matrix of {T} relative to {B}, whose nominal value can be obtained by offline programming with an inverse kinematics model (Reference 1: Inverse kinematics of a 5-axis hybrid robot with non-singular tool path generation. Robotics and Computer-Integrated Manufacturing, 2019, 140-148). Theoretically, $^B\!A_{W,L} = {}^B\!A_{W,V}$ should hold. The robot in the method may be any type of robot.
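As an illustration of how $^V\!A_W$ and the chain above might be assembled, the sketch below builds a frame from three measured target-ball centers under one common convention (the patent does not fix one) and composes the three factors; the names are ours:

```python
import numpy as np

def frame_from_three_points(p1, p2, p3):
    """Homogeneous matrix of a right-handed frame built from three
    non-collinear target-ball centers (an assumed convention):
    origin at p1, x-axis toward p2, z-axis normal to the plane of
    the three points."""
    x = (p2 - p1) / np.linalg.norm(p2 - p1)
    z = np.cross(p2 - p1, p3 - p1)
    z /= np.linalg.norm(z)
    y = np.cross(z, x)                # completes the orthonormal triad
    A = np.eye(4)
    A[:3, 0], A[:3, 1], A[:3, 2], A[:3, 3] = x, y, z, p1
    return A

def vision_pose(B_A_T, T_A_V, V_A_W):
    """^B A_{W,V} as the plain chain of homogeneous matrices given
    in the text: nominal tool pose, constant hand-eye matrix, and
    the vision-fitted workpiece frame."""
    return B_A_T @ T_A_V @ V_A_W
```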
Step two, constructing a configuration-dependent deviation model of the robot joint variable motion:

$$^B\xi_{W/B,V} \approx {}^B\xi_{T/B} = \xi \qquad (1)$$

where $\xi = T_a\rho_a + T_c\rho_c = T\rho$, and

$$\rho_a = \eta_a + W_a^{\mathrm T}\zeta, \qquad \rho_c = W_c^{\mathrm T}\zeta \qquad (2)$$

In these formulas, $^B\xi_{W/B,V}$ denotes the measure in {B} of the 6-dimensional pose deviation between the pose $^B\!A_{W,V}$ of {W} relative to {B} obtained by machine vision and its reference value $\overline{{}^B\!A_W}$, called the pose deviation twist of {W} relative to {B}; $^B\xi_{T/B}$ denotes the measure in {B} of the 6-dimensional pose error between the pose of {T} relative to {B} and its nominal value obtained with the inverse kinematics model (see Reference 1), called the pose error twist of {T} relative to {B}. The physical meaning of formula (1) can be interpreted as follows: if $^B\!A_{W,L}$ is taken as the reference, and $^V\!A_W$ and $^T\!A_V$ are accurate, then $^B\xi_{W/B,V}$ can be approximately replaced by $^B\xi_{T/B}$. For convenience of notation, let $\xi = {}^B\xi_{T/B}$.

$T_a$ is the permitted-motion Jacobian of the robot; $T_c$ is the restricted-motion Jacobian; $T = [T_a\ T_c]$ is called the generalized motion Jacobian; $\rho = [\rho_a^{\mathrm T}\ \rho_c^{\mathrm T}]^{\mathrm T}$ denotes the joint motion error vector of the robot.

Regarding formula (2), $\rho_a$ denotes the motion error vector of the driving joints and $\rho_c$ the motion error vector of the virtual joints. $\zeta$ denotes the sum of the error twists of {T} relative to {B} caused by unmodeled error sources in the robot body other than the zero-return errors of the driving joints (such as structural errors between adjacent body-fixed frames described by D-H parameters, elastic deformation of members under gravity, and the like). $\eta_a$ denotes the zero-return error vector of the driving joints. $W_a$ is the driving-force Jacobian of the robot; $W_c$ is the constraint-force Jacobian. Here $W_a$ and $W_c$ are known while $\zeta$ and $\eta_a$ are unknown; to compute $\rho_a$ and $\rho_c$, a fit is performed using formula (13).
The derivation of the deviation model is as follows:

Because the robot body has errors, $^B\!A_{W,L} \neq {}^B\!A_{W,V}$ in step two, and this deviation is related to the configuration of the robot. To this end, $^B\!A_{W,L}$ is taken as the reference and $\overline{{}^B\!A_W}$ as the expected value of $^B\!A_{W,V}$, so that $^B\!A_{W,V}$ and its reference can be expressed as

$${}^B\!A_{W,V} = {}^B\!A_T\,{}^T\!A_V\,{}^V\!A_W, \qquad \overline{{}^B\!A_W} = \overline{{}^B\!A_T}\,{}^T\!A_V\,{}^V\!A_W \qquad (3)$$

Here $^T\!A_V$ and $^V\!A_W$ are known, having been determined beforehand by vision measurement and hand-eye calibration, and the joint zero-point errors and other error sources of the robot are small quantities relative to their nominal values after kinematic calibration. Performing a first-order Taylor expansion of $^B\!A_{W,V}$ about $\overline{{}^B\!A_T}$, substituting formula (3), and right-multiplying both ends by $\overline{{}^B\!A_W}^{-1}$ yields the linearized error model in Lie-algebra format:

$${}^B\!A_{W,V}\,\overline{{}^B\!A_W}^{-1} \approx I_4 + {}^B\tilde\xi_{T/B} \qquad (4)$$

which is further adapted to the linearized error model in Plücker axis-coordinate format:

$$^B\xi_{W/B,V} \approx {}^B\xi_{T/B} \qquad (5)$$

where $^B\xi_{W/B,V}$ denotes the measure in {B} of the 6-dimensional pose deviation between the pose $^B\!A_{W,V}$ of {W} relative to {B} obtained by machine vision and its reference value $\overline{{}^B\!A_W}$, called the pose deviation twist of {W} relative to {B}, and $^B\xi_{T/B}$ denotes the measure in {B} of the 6-dimensional pose error between the pose of {T} relative to {B} and its nominal value obtained with the inverse kinematics model, called the pose error twist of {T} relative to {B}.
For convenience, let $\xi = {}^B\xi_{T/B}$. On the one hand, when the zero-return errors of the driving joints of the robot body and the structural errors of all joints are small quantities, ξ can further be expressed as:

$$\xi = J_a\eta_a + \zeta \qquad (6)$$

where $\eta_a = [\eta_{a,1}\ \eta_{a,2}\ \cdots\ \eta_{a,f}]^{\mathrm T}$ denotes the zero-return error vector of the driving joints, $\eta_{a,i}$ denotes the zero-return error of the $i$-th driving joint ($i = 1, 2, \ldots, f$), and $f$ denotes the number of degrees of freedom of the robot. $J_a$ is the error Jacobian of the driving joints; its $i$-th column represents the error twist of frame {T} relative to frame {B} when $\eta_{a,i} = 1$ and $\eta_{a,j} = 0$ ($j \neq i$). $\zeta$ denotes the sum of the error twists of {T} relative to {B} caused by unmodeled error sources in the robot body other than the zero-return errors of the driving joints (such as structural errors between adjacent body-fixed frames described by D-H parameters, elastic deformation of members under gravity, and the like).
On the other hand, as is known from screw theory, the twist space of the rigid variational motion of the robot end can be decomposed into the direct sum of two subspaces, of permitted and of restricted variational motions, so ξ can also be expressed as:

$$\xi = \xi_a + \xi_c = T_a\rho_a + T_c\rho_c = T\rho \qquad (7)$$

where $\rho = [\rho_a^{\mathrm T}\ \rho_c^{\mathrm T}]^{\mathrm T}$ denotes the joint motion error vector of the robot, $\rho_a$ the motion error vector of the driving joints, and $\rho_c$ the motion error vector of the virtual joints; $\xi_a$ denotes the permitted pose error twist and $\xi_c$ the restricted pose error twist; $T_a$ is the permitted-motion Jacobian of the robot, $T_c$ the restricted-motion Jacobian, and $T = [T_a\ T_c]$ is called the generalized motion Jacobian. The column vectors of $T$ span a basis of the twist space, and $T$ is a nonlinear function of the nominal scale parameters of the mechanism and the nominal driving joint variables.
Comparing the two expressions, and considering that the driving joints are the motion-generating elements of the robot mechanism, take $T_a = J_a$, i.e., use the column vectors of $J_a$ to construct a basis of the permitted-motion subspace. The physical meanings of $\rho_a$ and $\rho_c$ are then understood as the motion error vectors of the driving joints and of the virtual joints, respectively. Further, the force Jacobian $W$ of the system is constructed by means of dual-space theory such that

$$W = [W_a\ W_c] = T^{-\mathrm T} \qquad (8)$$

from which the following relationship is established:

$$\rho_a = \eta_a + W_a^{\mathrm T}\zeta, \qquad \rho_c = W_c^{\mathrm T}\zeta \qquad (9)$$

where $W_a$ denotes the driving-force Jacobian of the robot and $W_c$ the constraint-force Jacobian. Both $W_a^{\mathrm T}\zeta$ and $W_c^{\mathrm T}\zeta$ vary with configuration, so $\rho_a$ and $\rho_c$ can both be regarded as nonlinear functions of the nominal driving joint variables $q$; $\eta_a$ denotes the zero-return error vector of the driving joints.
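Equations (7)-(9) amount to projecting a measured pose error twist into joint space through $W = T^{-\mathrm T}$; a minimal sketch of that projection (function names are ours):

```python
import numpy as np

def decompose_pose_error(T, xi, f):
    """Split a 6-D pose error twist xi into driving-joint and
    virtual-joint motion errors via the generalized motion Jacobian
    T = [T_a T_c] (6x6, invertible away from singularities).
    With W = [W_a W_c] = inv(T).T, rho = W.T @ xi = inv(T) @ xi;
    the first f entries are rho_a, the remaining 6-f are rho_c."""
    rho = np.linalg.solve(T, xi)   # equivalent to W.T @ xi, better conditioned
    return rho[:f], rho[f:]
```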
Step three, according to the deviation model of step two (formulas (1) and (2)), constructing an error fitting model of {T} relative to {B} and estimating its coefficient vector β.

From formula (1), $^B\xi_{T/B}$ and $^B\xi_{W/B,V}$ are approximately equal; the purpose of establishing the error fitting model is to compensate $^B\xi_{W/B,V}$ in the deviation model.

First, construct the error fitting model:

$$\xi = TZ\beta \qquad (10)$$

where ξ denotes the pose error twist of the robot (obtained from the deviation model of formula (1)), Z denotes the motion error fitting matrix of the driving joints and virtual joints (a known quantity), $T = [T_a\ T_c]$ is the generalized motion Jacobian of the robot (a known quantity), and β denotes the coefficient vector of the driving-joint and virtual-joint motion error fit (the coefficients to be fitted).
The derivation of the error fitting function is as follows:

In any given configuration of the robot, the command of the driving joints may be modified to $q_m$:

$$q_m = q + \Delta q \qquad (11)$$

with the correction Δq chosen to satisfy

$$\Delta q = -\rho_a \qquad (12)$$

so that the permitted pose error twist $\xi_a$ induced by $\rho_a$ is theoretically compensated.

Although $\rho_a$ and $\rho_c$ are unknown, they are nonlinear functions of the nominal driving joint variables q; therefore $\rho_a$ and $\rho_c$ are fitted by quadratic polynomials in q, i.e.

$$\rho_{a,i} = \alpha_{a,0,i} + \sum_{j=1}^{f}\alpha_{a,j,i}\,q_j + \sum_{j=1}^{f}\sum_{k=j}^{f}\alpha_{a,jk,i}\,q_j q_k, \qquad \rho_{c,1} = \alpha_{c,0,1} + \sum_{j=1}^{f}\alpha_{c,j,1}\,q_j + \sum_{j=1}^{f}\sum_{k=j}^{f}\alpha_{c,jk,1}\,q_j q_k \qquad (13)$$

where $\rho_{a,i}$ denotes the $i$-th component of the driving-joint motion error vector, $\rho_{c,1}$ the 1st component of the virtual-joint motion error vector, and $q_j$ and $q_k$ the motion variables of the $j$-th and $k$-th driving joints; $\alpha_{a,0,i}$ and $\alpha_{c,0,1}$ denote the zeroth-order coefficients of the driving-joint and virtual-joint motion error fits; $\alpha_{a,j,i}$ (resp. $\alpha_{c,j,1}$) denotes the first-order coefficient associated with the $j$-th driving joint variable in the fit of the $i$-th driving-joint (resp. 1st virtual-joint) motion error; $\alpha_{a,jk,i}$ (resp. $\alpha_{c,jk,1}$) denotes the second-order coefficient associated with the $j$-th and $k$-th driving joint variables in the fit of the $i$-th driving-joint (resp. 1st virtual-joint) motion error.

Performing a variable substitution on formula (13), it can be written in the form of a multivariate linear function:

$$\rho_{a,i} = z^{\mathrm T}\beta_{a,i}, \qquad \rho_{c,1} = z^{\mathrm T}\beta_{c,1} \qquad (14)$$

where $\beta_{a,j,i}$ ($j = 0, 1, \ldots, n$) denote the coefficients of the $j$-th polynomial term in the fit of the $i$-th driving-joint motion error, and $\beta_{c,j,1}$ ($j = 0, 1, \ldots, n$) the coefficients of the $j$-th polynomial term in the fit of the 1st virtual-joint motion error; $z_j$ ($j = 1, 2, \ldots, n$) denotes the $j$-th polynomial term formed from the driving joint variables, and z the vector composed of the $z_j$; $\beta_{a,i}$ is the vector composed of the $\beta_{a,j,i}$, and $\beta_{c,1}$ the vector composed of the $\beta_{c,j,1}$; n denotes the number of coefficients of the multivariate linear function constructed from the quadratic polynomial, here n = 21.
Writing formula (13) in matrix form gives

$$\rho_a = Z_a\beta_a, \qquad \rho_c = Z_c\beta_c \qquad (15)$$

where $Z_a = \operatorname{diag}(z^{\mathrm T}, \ldots, z^{\mathrm T})$ (with $f$ diagonal blocks), $Z_c = z^{\mathrm T}$, $\beta_a = [\beta_{a,1}^{\mathrm T}\ \cdots\ \beta_{a,f}^{\mathrm T}]^{\mathrm T}$, and $\beta_c = \beta_{c,1}$.

Substituting formula (15) into formula (7) yields the error fitting model

$$\xi = TZ\beta \qquad (16)$$

where

$$Z = \begin{bmatrix} Z_a & 0\\ 0 & Z_c \end{bmatrix}$$

denotes the motion error fitting matrix of the driving joints and virtual joints, and $\beta = [\beta_a^{\mathrm T}\ \beta_c^{\mathrm T}]^{\mathrm T}$ denotes the coefficient vector of the driving-joint and virtual-joint motion error fit.
Second, identify the coefficient vector β in the regression model $y = X\beta + \varepsilon$ by a damped least-squares algorithm to obtain a reliable estimate $\hat\beta$ of β.

Let K configurations covering the robot working area be traversed, e.g., K = 150 configurations. According to the error fitting model (16), the following multiple linear regression model is constructed:

$$y = X\beta + \varepsilon \qquad (17)$$

where

$$y = \begin{bmatrix}\xi_1\\ \xi_2\\ \vdots\\ \xi_K\end{bmatrix}, \qquad X = \begin{bmatrix}T_1 Z_1\\ T_2 Z_2\\ \vdots\\ T_K Z_K\end{bmatrix}$$

Here y denotes the vector formed by the observations of ξ; $\xi_1, \xi_2, \ldots, \xi_K$ denote the observed values of ξ in configurations 1, 2, ..., K, which can be obtained from formulas (3)-(5); X denotes the regression design matrix (identification matrix); $T_1, T_2, \ldots, T_K$ and $Z_1, Z_2, \ldots, Z_K$ denote the computed values of T and Z in configurations 1, 2, ..., K; $\varepsilon \sim N(0, \sigma^2 I_{6K})$ denotes measurement noise following a zero-mean normal distribution, with $I_{6K}$ the 6K × 6K identity matrix.

Although an unbiased estimate of β could be obtained by ordinary least squares, note that β contains 126 coefficients. With this many parameters to identify, ill-conditioning may arise from a rank-deficient or multicollinear identification matrix. Therefore, a damped least-squares algorithm is adopted to obtain a reliable estimate $\hat\beta$ of β.
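A minimal sketch of this identification step, assuming the K observed twists and the corresponding $T_k$, $Z_k$ have been collected; the closed-form damped least-squares estimate $\hat\beta = (X^{\mathrm T}X + \mu I)^{-1}X^{\mathrm T}y$ is the standard one, and the damping value is our assumption (the patent does not state one):

```python
import numpy as np

def identify_beta(xi_list, T_list, Z_list, mu=1e-6):
    """Damped least-squares identification of beta in y = X beta + eps:
        beta_hat = inv(X^T X + mu*I) @ X^T @ y,
    with y the K stacked observed twists and X the stacked T_k @ Z_k
    blocks.  The damping mu guards against rank deficiency and
    multicollinearity of the 126-column identification matrix."""
    y = np.concatenate(xi_list)                             # (6K,)
    X = np.vstack([T @ Z for T, Z in zip(T_list, Z_list)])  # (6K, 126)
    m = X.shape[1]
    return np.linalg.solve(X.T @ X + mu * np.eye(m), X.T @ y)
```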
Step four, determining the relative pose of the mobile robot base coordinate system and the workpiece coordinate system through a single online vision measurement, with the following specific steps:

First, a region to be machined on the workpiece is given, and three target points are installed in this region to form the workpiece coordinate system {W}; meanwhile, the robot is placed in the reference configuration, i.e., the reference point P of the robot moving platform is brought to the midpoint $P_0$ of the task space.

Second, as shown in FIG. 3, a measurement configuration of {T} relative to {B} for robot vision measurement is planned such that the center V of the three-dimensional vision sensor is aligned with the center $O_W$ of the circumscribed circle of the triangle formed by the centers of the three target balls mounted on the workpiece in the first step; the coordinates of the three target-ball centers under {V} are measured, and the nominal homogeneous transformation matrix $\overline{{}^B\!A_T}$ of {T} relative to {B} is constructed from them.

Fourth, the nominal driving joint variables q in the measurement configuration are determined with the inverse position model of the robot, and the motion error fitting matrix Z(q) and the generalized motion Jacobian T(q) at q are computed (for the computation method see Reference 2: Research on parametric modeling and integrated design methods for a novel five-degree-of-freedom hybrid robot [D], Tianjin University).

Fifth, the robot is driven by q from the reference configuration to the measurement configuration and held there; the AGV (of existing construction) is moved so that the vision sensor carried on the A-axis of the A/C swivel head enters the preset position; the coordinates of the three target-ball centers are measured with the vision sensor, and the homogeneous transformation matrix $^V\!A_W$ of {W} relative to {V} is constructed.

Sixth, from $\overline{{}^B\!A_T}$, $^T\!A_V$, and $^V\!A_W$ constructed in the second and fifth steps, the homogeneous transformation matrix $^B\!A_{W,V}$ of {W} relative to {B} in the measurement configuration is computed.

Seventh, based on the error fitting model (formula (10)), $\hat\xi = T(q)Z(q)\hat\beta$ is used to construct the homogeneous transformation $\Delta{}^B\!A_W = (I_4 + \hat{\tilde\xi})^{-1}$, which is right-multiplied by $^B\!A_{W,V}$, so that the homogeneous transformation matrix of {W} relative to {B} determined by the compensated machine vision measurement is

$$\widehat{{}^B\!A_{W,V}} = \Delta{}^B\!A_W\,{}^B\!A_{W,V}$$

where $\hat\xi$ denotes the estimate of $^B\xi_{W/B,V}$, $\widehat{{}^B\!A_{W,V}}$ denotes the compensated result of $^B\!A_{W,V}$, and $\Delta{}^B\!A_W$ denotes the compensation applied to $^B\!A_{W,V}$.

Thus the relative pose of the mobile robot base coordinate system and the workpiece coordinate system is determined through a single online vision measurement; the complete compensation process is shown in FIG. 1.
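The seventh step can be summarized in code form as below, reusing the conventions of the earlier sketches (a sketch under those assumptions, not the patent's implementation):

```python
import numpy as np

def tilde(xi):
    """4x4 matrix form of a twist xi = [dtheta; dp] (component
    ordering as assumed in the earlier sketches)."""
    dth, dp = xi[:3], xi[3:]
    M = np.zeros((4, 4))
    M[:3, :3] = [[0.0, -dth[2], dth[1]],
                 [dth[2], 0.0, -dth[0]],
                 [-dth[1], dth[0], 0.0]]
    M[:3, 3] = dp
    return M

def compensate_pose(B_A_WV, T_q, Z_q, beta_hat):
    """One-shot online compensation:
        xi_hat = T(q) @ Z(q) @ beta_hat     estimated deviation twist
        Delta  = inv(I4 + tilde(xi_hat))    compensation matrix
        return Delta @ B_A_WV               compensated ^B A_W
    """
    xi_hat = T_q @ (Z_q @ beta_hat)
    Delta = np.linalg.inv(np.eye(4) + tilde(xi_hat))
    return Delta @ B_A_WV
```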
Although the functions and operations of the present invention have been described above with reference to the accompanying drawings, the invention is not limited to these specific functions and operations; the embodiments described above are merely illustrative and not restrictive. Those skilled in the art may make many modifications within the scope of the invention without departing from its spirit and scope as defined by the appended claims.

Claims (3)

1. A method for determining the relative pose of the base of a mobile robot and a workpiece based on vision measurement, characterized by comprising the following steps:
Step one, establishing the relative pose relationship between the robot base coordinate system and the workpiece coordinate system constructed from the measurement information of a laser tracker and of a vision sensor, comprising the following steps:
(1) Establishing coordinate systems: at a given station of the mobile robot, a body-fixed workpiece coordinate system {W} is established on the workpiece, a body-fixed base coordinate system {B} on the robot base, a body-fixed tool coordinate system {T} on the tool, and a body-fixed vision sensor coordinate system {V} on the three-dimensional vision sensor;
(2) Let $^B\!A_{W,L}$ denote the homogeneous transformation matrix of {W} relative to {B} determined with the laser tracker at the given station of the mobile robot;
(3) Given a configuration of the robot at the given station of the mobile robot, solve for $^B\!A_{W,V}$, the homogeneous transformation matrix of {W} relative to {B} constructed from the machine vision measurements;
Step two, constructing a configuration-dependent deviation model of the robot joint variable motion:

$$^B\xi_{W/B,V} \approx {}^B\xi_{T/B} = \xi, \qquad \xi = T_a\rho_a + T_c\rho_c = T\rho$$

$$\rho_a = \eta_a + W_a^{\mathrm T}\zeta, \qquad \rho_c = W_c^{\mathrm T}\zeta$$

where $^B\xi_{W/B,V}$ denotes the measure in {B} of the 6-dimensional pose deviation between the pose $^B\!A_{W,V}$ of {W} relative to {B} obtained by machine vision and its reference value $\overline{{}^B\!A_W}$, called the pose deviation twist of {W} relative to {B}; $^B\xi_{T/B}$ denotes the measure in {B} of the 6-dimensional pose error between the pose of {T} relative to {B} and its nominal value obtained with the inverse kinematics model, called the pose error twist of {T} relative to {B}; $T_a$ is the permitted-motion Jacobian of the robot; $T_c$ is the restricted-motion Jacobian of the robot; $T = [T_a\ T_c]$ is the generalized motion Jacobian; $\rho = [\rho_a^{\mathrm T}\ \rho_c^{\mathrm T}]^{\mathrm T}$ denotes the joint motion error vector of the robot; $\rho_a$ denotes the motion error vector of the driving joints of the robot; $\rho_c$ denotes the motion error vector of the virtual joints of the robot; $\zeta$ denotes the sum of the error twists of {T} relative to {B} caused by unmodeled error sources in the robot body other than the zero-return errors of the driving joints; $\eta_a$ denotes the zero-return error vector of the driving joints; $W_a$ is the driving-force Jacobian of the robot; $W_c$ is the constraint-force Jacobian of the robot;
Step three, according to the deviation model of step two, constructing an error fitting model of {T} relative to {B} and estimating its coefficient vector β, with the following specific steps:

First, construct the error fitting model:

$$\xi = TZ\beta$$

where ξ denotes the pose error twist of the robot, Z denotes the motion error fitting matrix of the driving joints and virtual joints, and β denotes the coefficient vector of the driving-joint and virtual-joint motion error fit;

Second, identify the coefficient vector β in the regression model $y = X\beta + \varepsilon$ by a damped least-squares algorithm to obtain a reliable estimate $\hat\beta$ of β, where:

$$y = \begin{bmatrix}\xi_1\\ \xi_2\\ \vdots\\ \xi_K\end{bmatrix}, \qquad X = \begin{bmatrix}T_1 Z_1\\ T_2 Z_2\\ \vdots\\ T_K Z_K\end{bmatrix}$$

Here y denotes the vector formed by the observations of ξ; $\xi_1, \xi_2, \ldots, \xi_K$ denote the observed values of ξ in configurations 1, 2, ..., K; X denotes the regression design matrix (identification matrix); $T_1, T_2, \ldots, T_K$ denote the computed values of T in configurations 1, 2, ..., K, and $Z_1, Z_2, \ldots, Z_K$ the computed values of Z in configurations 1, 2, ..., K; $\varepsilon \sim N(0, \sigma^2 I_{6K})$ denotes measurement noise following a zero-mean normal distribution, and $I_{6K}$ denotes the 6K × 6K identity matrix;
Step four, determining the relative pose of the robot base coordinate system and the workpiece coordinate system through a single online vision measurement, with the following specific steps:

First, a region to be machined on the workpiece is given, three target points are installed in this region to form the workpiece coordinate system {W}, and the robot is placed in the reference configuration, i.e., the reference point of the robot moving platform is brought to the midpoint of the task space;

Second, a measurement configuration of {T} relative to {B} for robot vision measurement is planned such that the center of the three-dimensional vision sensor is aligned with the center of the circumscribed circle of the triangle formed by the centers of the three target balls mounted on the workpiece in the first step; the coordinates of the three target-ball centers under {V} are measured, and the nominal homogeneous transformation matrix $\overline{{}^B\!A_T}$ of {T} relative to {B} is constructed from them;

Fourth, the nominal driving joint variables q in the measurement configuration are determined with the inverse position model of the robot, and the motion error fitting matrix Z(q) and the generalized motion Jacobian T(q) at q are computed;

Fifth, the robot is driven by q from the reference configuration to the measurement configuration and held there; the AGV is moved so that the vision sensor carried on the A-axis of the A/C swivel head enters the preset position; the coordinates of the three target-ball centers are measured with the vision sensor, and the homogeneous transformation matrix $^V\!A_W$ of {W} relative to {V} is constructed;

Sixth, from $\overline{{}^B\!A_T}$, $^T\!A_V$, and $^V\!A_W$ constructed in the second and fifth steps, the homogeneous transformation matrix $^B\!A_{W,V}$ of {W} relative to {B} in the measurement configuration is computed;

Seventh, based on the error fitting model, $\hat\xi = T(q)Z(q)\hat\beta$ is used to construct the homogeneous transformation $\Delta{}^B\!A_W = (I_4 + \hat{\tilde\xi})^{-1}$, which is right-multiplied by $^B\!A_{W,V}$, so that the homogeneous transformation matrix of {W} relative to {B} determined by the compensated machine vision measurement is

$$\widehat{{}^B\!A_{W,V}} = \Delta{}^B\!A_W\,{}^B\!A_{W,V}$$

where $\hat\xi$ denotes the estimate of $^B\xi_{W/B,V}$, $\widehat{{}^B\!A_{W,V}}$ denotes the compensated result of $^B\!A_{W,V}$, and $\Delta{}^B\!A_W$ denotes the compensation applied to $^B\!A_{W,V}$.
2. The vision measurement-based mobile robot base-workpiece relative pose determination method according to claim 1, wherein:
The solution process of $^B\!A_{W,V}$ is as follows: $^B\!A_{W,V}$ is further expressed as $^B\!A_{W,V} = {}^B\!A_T\,{}^T\!A_V\,{}^V\!A_W$, where $^V\!A_W$ is the homogeneous transformation matrix of frame {W} relative to frame {V}, determined by measuring with the vision sensor the coordinates in {V} of three non-collinear target spheres affixed to the workpiece; $^T\!A_V$ is the homogeneous transformation matrix of {V} relative to {T}, determined as a constant matrix through offline hand-eye calibration of the robot; and $^B\!A_T$ is the homogeneous transformation matrix of {T} relative to {B}, whose nominal value is obtained by offline programming with an inverse kinematics model.
3. The vision-measurement-based method for determining the relative pose of the base of a mobile robot and a workpiece according to claim 1, characterized in that $^B\!A_{W,L}$ is obtained as follows: the coordinates, in the laser tracker coordinate system {L}, of at least three non-collinear target points on the workpiece and on the robot base are measured with the laser tracker, respectively, and $^B\!A_{W,L}$ is fitted by least squares.
CN202110923264.4A 2021-08-12 2021-08-12 Mobile robot base-workpiece relative pose determining method based on vision measurement Active CN113580137B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110923264.4A CN113580137B (en) 2021-08-12 2021-08-12 Mobile robot base-workpiece relative pose determining method based on vision measurement

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110923264.4A CN113580137B (en) 2021-08-12 2021-08-12 Mobile robot base-workpiece relative pose determining method based on vision measurement

Publications (2)

Publication Number Publication Date
CN113580137A (en) 2021-11-02
CN113580137B CN113580137B (en) 2023-09-22

Family

ID=78257429

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110923264.4A Active CN113580137B (en) 2021-08-12 2021-08-12 Mobile robot base-workpiece relative pose determining method based on vision measurement

Country Status (1)

Country Link
CN (1) CN113580137B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114131595A (en) * 2021-11-12 2022-03-04 清华大学 Robot 6D pose estimation system and method

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010003289A1 (en) * 2008-07-11 2010-01-14 中国科学院沈阳自动化研究所 Apparatus and method for robots tracking appointed path with high precision
CN105773609A (en) * 2016-03-16 2016-07-20 南京工业大学 Robot kinematics calibration method based on vision measurement and distance error model
CN107175660A (en) * 2017-05-08 2017-09-19 同济大学 A kind of six-freedom degree robot kinematics scaling method based on monocular vision
CN107741224A (en) * 2017-08-28 2018-02-27 浙江大学 A kind of AGV automatic-posture-adjustment localization methods of view-based access control model measurement and demarcation
CN109794963A (en) * 2019-01-07 2019-05-24 南京航空航天大学 A kind of robot method for rapidly positioning towards curved surface member
US20200206945A1 (en) * 2018-12-29 2020-07-02 Ubtech Robotics Corp Ltd Robot pose estimation method and apparatus and robot using the same

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010003289A1 (en) * 2008-07-11 2010-01-14 中国科学院沈阳自动化研究所 Apparatus and method for robots tracking appointed path with high precision
CN105773609A (en) * 2016-03-16 2016-07-20 南京工业大学 Robot kinematics calibration method based on vision measurement and distance error model
CN107175660A (en) * 2017-05-08 2017-09-19 同济大学 A kind of six-freedom degree robot kinematics scaling method based on monocular vision
CN107741224A (en) * 2017-08-28 2018-02-27 浙江大学 A kind of AGV automatic-posture-adjustment localization methods of view-based access control model measurement and demarcation
US20200206945A1 (en) * 2018-12-29 2020-07-02 Ubtech Robotics Corp Ltd Robot pose estimation method and apparatus and robot using the same
CN109794963A (en) * 2019-01-07 2019-05-24 南京航空航天大学 A kind of robot method for rapidly positioning towards curved surface member

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
李耀贵: "Error modeling and verification of a class of parallel robots with an articulated moving platform for posture adjustment", Journal of Chongqing University of Technology (Natural Science), no. 08
杨守瑞; 尹仕斌; 任永杰; 邾继贵; 叶声华: "Improved calibration method for robotic flexible vision measurement systems", Optics and Precision Engineering, vol. 22, no. 12

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114131595A (en) * 2021-11-12 2022-03-04 清华大学 Robot 6D pose estimation system and method
CN114131595B (en) * 2021-11-12 2023-09-12 清华大学 Robot 6D pose estimation system and method

Also Published As

Publication number Publication date
CN113580137B (en) 2023-09-22

Similar Documents

Publication Publication Date Title
CN109848983B (en) Method for guiding robot to cooperatively work by high-compliance person
CN102591257B (en) Parameter curve cutter path oriented numerical control system contour error control method
Peng et al. An enhanced kinematic model for calibration of robotic machining systems with parallelogram mechanisms
Masory et al. Kinematic modeling and calibration of a Stewart platform
CN111037542B (en) Track error compensation method for linear machining of inverse dynamics control robot
Kubela et al. Assessment of industrial robots accuracy in relation to accuracy improvement in machining processes
CN113580148B (en) Parallel robot kinematics calibration method based on equivalent kinematic chain
CN114147726A (en) Robot calibration method combining geometric error and non-geometric error
CN113580137A (en) Mobile robot base-workpiece relative pose determination method based on vision measurement
CN114505865A (en) Pose tracking-based mechanical arm path generation method and system
CN112587237A (en) Method for reducing operation error of orthopedic operation robot
Qi et al. Accuracy improvement calibrations for the double-position 4-PPPS aircraft docking system
US6785624B2 (en) Method for calibrating of machine units located in machine tools or robotic devices
Kubela et al. High accurate robotic machining based on absolute part measuring and on-line path compensation
Kim et al. Joint compliance error compensation for robot manipulator using body frame
Zhao et al. An efficient error prediction and compensation method for coordinated five-axis machine tools under variable temperature
Kvrgic et al. A control algorithm for a vertical five-axis turning centre
Zhao et al. Calibration and Compensation of Rotary Axis Angular Positioning Deviations on a SCARA-Type Industrial Robot Using a Laser Tracker
Li Visual Calibration, Identification and Control of 6-RSS Parallel Robots
Zhao et al. Kinematic modeling and inverse kinematics solution of a new six-axis machine tool for oval hole drilling in aircraft wing assembly
Lounici et al. Tool Positioning Error Minimization during Robotic Milling Based on the Genetic Algorithm Technique
CN110940351A (en) Robot precision compensation method based on parameter dimension reduction identification
Lounici et al. Cutting Forces Impact on the Spindle Path during Robotic Milling
Newman et al. Towards a new real-time metrology guidance method for robotized machining of aerostructures robust against cutting fluids and debris
Wu et al. Volumetric error modeling and accuracy improvement by parameter identification of a compound machine tool

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant