CN113580137A - Mobile robot base-workpiece relative pose determination method based on vision measurement - Google Patents
- Publication number
- CN113580137A (application number CN202110923264.4A)
- Authority
- CN
- China
- Prior art keywords
- robot
- relative
- workpiece
- error
- pose
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
- B25J9/1605—Simulation of manipulator lay-out, design, modelling of manipulator
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/005—Manipulators for mechanical processing tasks
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
- B25J9/1607—Calculation of inertia, jacobian matrixes and inverses
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
Abstract
The invention discloses a method for determining the relative pose of the base and the workpiece of a mobile robot based on vision measurement. First, mathematical models of the relative pose relationship between the two coordinate systems are established from the measurement information of a laser tracker and of a vision sensor, respectively. Then, taking the former as the reference and minimizing the deviation of the latter from the former over a finite set of configurations, the coefficients of a multiple linear regression model of the motion errors of the robot's driving joints and virtual joints are identified. Finally, using this model, a strategy and algorithm are provided for determining the relative pose between the two coordinate systems from a single online vision measurement. The method replaces the laser tracker with a vision measurement system while keeping the positioning accuracy of the robot unchanged, thereby significantly improving the machining efficiency of the robot and reducing equipment cost.
Description
Technical Field
The invention relates to a deviation compensation method for determining the relative pose of the base and the workpiece of a mobile robot, and in particular to a base-workpiece pose deviation compensation method for a mobile robot that constructs a multiple regression model from vision measurement information.
Background
At present, mobile workstations formed by mounting an industrial robot on an AGV (automated guided vehicle) are becoming an important development trend for on-site local milling, hole making, polishing, assembly and similar operations on ultra-large components, and are gradually being applied in aerospace, rail transit, power generation equipment and other fields.
Quickly and accurately determining the pose of the mobile robot's base coordinate system relative to the workpiece coordinate system at each station of a machining site is essential to guaranteeing the machining accuracy of the mobile robot. The mainstream methods can be roughly divided into two categories according to the type of external measurement sensor used. The first is based on laser tracker measurement information. It offers high measurement accuracy, but when the component has a complex overall shape, the number of mobile stations is large, or the measurement range is wide, either several laser trackers must be deployed simultaneously or a single tracker must be moved from station to station in order to avoid obstruction of the light path. The former scheme is clearly too costly, while the latter is less efficient and accumulates larger errors. The second is machine vision measurement: a three-dimensional vision sensor mounted on the robot end effector measures the coordinates of target points attached to the workpiece, a homogeneous transformation matrix of the workpiece coordinate system relative to the vision coordinate system is constructed, and the relationship between the robot base coordinate system and the workpiece coordinate system is then built indirectly through hand-eye transformation and similar operations. Compared with a laser tracker, a vision sensor offers reasonable measurement accuracy at lower cost, and multi-station automatic measurement is easy to realize through integration with a numerical control system. Note, however, that the robot body retains a pose error even after calibration, and hand-eye calibration is usually performed on top of it; the accumulated error of the two therefore causes the pose determined by the indirect method to differ from that determined by the direct method.
In other words, if the pose determined by the laser tracker measurement system is taken as correct, the pose measured by the vision system carries an error, so the adjustable parameters in the precision transmission path of the vision measurement system must be identified and corrected to ensure that the two results agree well.
Disclosure of Invention
To address the shortcomings of the prior art, the invention provides a method for determining the relative pose of the base and the workpiece of a mobile robot based on vision measurement.
The method for determining the relative pose of the base and the workpiece of the mobile robot based on vision measurement is characterized by comprising the following steps of:
Step one, establish the relative pose relationship between the robot base coordinate system and the workpiece coordinate system as constructed from the measurement information of a laser tracker and of a vision sensor, comprising the following steps:
(1) Establish the coordinate systems: given a station of the mobile robot, establish a workpiece body-fixed coordinate system {W} on the workpiece, a base body-fixed coordinate system {B} on the robot base, a tool body-fixed coordinate system {T} on the tool, and a vision sensor body-fixed coordinate system {V} on the three-dimensional vision sensor;
(2) Let $^{B}A_{W,L}$ denote the homogeneous transformation matrix of {W} relative to {B} determined with the laser tracker at the mobile robot station;
(3) Given a configuration of the robot at the mobile robot station, solve for $^{B}A_{W,V}$, the homogeneous transformation matrix of {W} relative to {B} constructed from machine vision measurements;
Step two, construct a configuration-dependent deviation model of the robot joint variable motion:
$^{B}\xi_{W/B,V} \approx {}^{B}\xi_{T/B} = \xi$, where $\xi = T_a\rho_a + T_c\rho_c = T\rho$
In the formula, $^{B}\xi_{W/B,V}$ denotes the measure in {B} of the 6-dimensional pose deviation between the pose $^{B}A_{W,V}$ of {W} relative to {B} obtained by machine vision and its reference value, and is called the pose deviation twist of {W} relative to {B}; $^{B}\xi_{T/B}$ denotes the measure in {B} of the 6-dimensional pose error between the pose of {T} relative to {B} and its nominal value obtained from the inverse kinematics model, and is called the pose error twist of {T} relative to {B}; $T_a$ is the permitted-motion Jacobian of the robot; $T_c$ is the restricted-motion Jacobian of the robot; $T = [T_a\ T_c]$ is the generalized motion Jacobian; $\rho$ represents the motion error vector of the robot joints;
$\rho_a$ represents the motion error vector of the robot driving joints; $\rho_c$ represents the motion error vector of the robot virtual joints; $\zeta$ denotes the sum of the error twists of {T} relative to {B} caused by other unmodelled error sources in the robot body when the zero-return errors of the driving joints are not counted; $\eta_a$ denotes the zero-return error vector of the driving joints; $W_a$ is the driving-force Jacobian of the robot; $W_c$ is the constraint-force Jacobian of the robot;
Step three, according to the deviation model in step two, construct the error fitting model of {T} relative to {B} and estimate its coefficient vector $\beta$, as follows:
First, construct the error fitting model:
$\xi = TZ\beta$
where $\xi$ represents the pose error twist of the robot, $Z$ represents the motion error fitting matrix of the driving joints and virtual joints, and $\beta$ represents the coefficient vector of the driving-joint and virtual-joint motion error fit;
Second, identify the coefficient vector $\beta$ in the regression model $y = X\beta + \varepsilon$ with a damped least-squares algorithm to obtain a reliable estimate $\hat\beta$ of $\beta$, wherein:
$y$ denotes the vector formed by the observations of $\xi$; $\xi_1, \xi_2, \ldots, \xi_K$ denote the observed values of $\xi$ in the 1st, 2nd, ..., $K$-th configuration; $X$ denotes the regression design matrix, or identification matrix; $T_1, T_2, \ldots, T_K$ denote the computed values of $T$ in the 1st, 2nd, ..., $K$-th configuration; $Z_1, Z_2, \ldots, Z_K$ denote the computed values of $Z$ in the 1st, 2nd, ..., $K$-th configuration; $\varepsilon \sim N(0, \sigma^2 I_{6K})$ represents measurement noise following a zero-mean normal distribution, where $I_{6K}$ denotes the $6K \times 6K$ identity matrix;
Step four, determine the relative pose of the robot base coordinate system and the workpiece coordinate system through a single online machine vision measurement, specifically comprising the following steps:
First, given a region to be machined on the workpiece, install three target points in the region to form the workpiece coordinate system {W}; meanwhile, put the robot in the reference configuration, i.e. with the reference point of the robot moving platform at the midpoint of the task space;
Second, plan a measurement configuration of {T} relative to {B} for robot vision measurement such that the center of the three-dimensional vision sensor is aligned with the center of the circumscribed circle of the triangle formed by the centers of the three target balls mounted on the workpiece; measure the coordinates in {V} of the centers of the three target balls on the workpiece from step one, and construct from them the nominal homogeneous transformation matrix of {T} relative to {B};
Fourth, determine the nominal driving joint variable $q$ in the measurement configuration with the robot inverse position model, and compute the motion error fitting matrix $Z(q)$ and the generalized motion Jacobian $T(q)$ at $q$;
Fifth, drive the robot with $q$ from the reference configuration to the measurement configuration and keep the measurement configuration unchanged; move the AGV so that the vision sensor mounted on the A-axis of the A/C swivel head reaches the preset position, measure the coordinates of the centers of the three target balls with the vision sensor, and construct the homogeneous transformation matrix of {W} relative to {V};
Sixth, from the matrices constructed in the second and fifth steps, compute the homogeneous transformation matrix $^{B}A_{W,V}$ of {W} relative to {B} in the measurement configuration;
Seventh, based on the error fitting model, use $\hat\beta$ to construct the estimate $\hat\xi_{W/B,V}$ of $^{B}\xi_{W/B,V}$; through the isomorphic transformation and right multiplication of $^{B}A_{W,V}$ at both ends, construct the compensation value $\Delta^{B}A_{W}$, such that the compensated homogeneous transformation matrix of {W} relative to {B} determined by the machine vision measurement is $^{B}\hat{A}_{W,V}$, where $\hat\xi_{W/B,V}$ denotes the estimate of $^{B}\xi_{W/B,V}$, $^{B}\hat{A}_{W,V}$ denotes the compensated result of $^{B}A_{W,V}$, and $\Delta^{B}A_{W}$ denotes the compensation value of $^{B}A_{W,V}$.
The invention has the following advantages: first, mathematical models of the relative pose relationship between the two coordinate systems are established from the measurement information of a laser tracker and of a vision sensor, respectively; then, taking the former as the reference and minimizing the deviation of the latter from the former over a finite set of configurations, the coefficients of a multiple linear regression model of the motion errors of the robot's driving joints and virtual joints are identified; finally, using this model, a strategy and algorithm are provided for determining the relative pose between the two coordinate systems from a single online vision measurement. The vision measurement system replaces the laser tracker while the positioning accuracy of the robot is kept unchanged, significantly improving the machining efficiency of the robot and reducing equipment cost.
Drawings
FIG. 1 is a schematic diagram of a vision-based robot and workpiece pose measurement principle;
FIG. 2 shows the coordinate-system relationships for vision-based determination of the robot-to-workpiece pose;
FIG. 3 is a schematic diagram of measurement configuration and coordinate system representation.
Detailed Description
The invention is described in detail below with reference to the figures and specific embodiments.
The invention provides a method for determining a relative pose of a base and a workpiece of a mobile robot based on vision measurement, which comprises the following steps:
Step one, establish the relative pose relationship between the mobile robot base coordinate system and the workpiece coordinate system as constructed from the measurement information of a laser tracker and of a vision sensor;
as a specific embodiment of the present invention, the method may include the steps of:
(1) As shown in fig. 2, establish the coordinate systems. Given a station of the mobile robot, establish a workpiece body-fixed coordinate system {W} on the workpiece 1, a base body-fixed coordinate system {B} on the robot base 2, a tool body-fixed coordinate system {T} on the tool 3, and a vision sensor body-fixed coordinate system {V} on the three-dimensional vision sensor 4. Each coordinate system can be established by an existing method.
(2) Let $^{B}A_{W,L}$ denote the homogeneous transformation matrix of {W} relative to {B} determined with the laser tracker at the mobile robot station.
As an embodiment of the present invention, $^{B}A_{W,L}$ is obtained as follows: measure with the laser tracker the coordinates, in the laser tracker coordinate system {L}, of at least three non-collinear target points on the workpiece and on the robot base respectively, and fit $^{B}A_{W,L}$ by least squares.
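The least-squares fitting of the tracker-measured frames can be sketched as follows. This is a minimal illustration, not the patent's implementation: it assumes the classic SVD-based (Kabsch) solution for the rigid transform between corresponding point sets, and all target layouts and poses below are hypothetical.

```python
import numpy as np

def fit_rigid_transform(p_src, p_dst):
    """Least-squares rigid transform (Kabsch/SVD) mapping p_src onto p_dst.
    p_src, p_dst: (N, 3) corresponding points, N >= 3, not collinear.
    Returns a 4x4 homogeneous transformation matrix."""
    c_src, c_dst = p_src.mean(axis=0), p_dst.mean(axis=0)
    H = (p_src - c_src).T @ (p_dst - c_dst)            # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    A = np.eye(4)
    A[:3, :3] = R
    A[:3, 3] = c_dst - R @ c_src
    return A

def pose(rz_deg, t):
    """Helper: homogeneous transform with a rotation about z (illustrative)."""
    th = np.deg2rad(rz_deg)
    A = np.eye(4)
    A[:3, :3] = [[np.cos(th), -np.sin(th), 0], [np.sin(th), np.cos(th), 0], [0, 0, 1]]
    A[:3, 3] = t
    return A

def in_L(A, pts):
    """Express local points pts in {L} via pose A."""
    return (A[:3, :3] @ pts.T).T + A[:3, 3]

# Hypothetical target-point layouts on the workpiece and on the base.
W_pts = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.0, 0.1, 0.0]])
B_pts = np.array([[0.0, 0.0, 0.0], [0.2, 0.0, 0.0], [0.0, 0.2, 0.0]])

# Ground-truth poses, used only to simulate tracker readings in {L}.
L_A_W = pose(30.0, [1.0, 2.0, 0.5])
L_A_B = pose(-10.0, [0.2, 0.1, 0.0])

A_W = fit_rigid_transform(W_pts, in_L(L_A_W, W_pts))   # fitted {W} in {L}
A_B = fit_rigid_transform(B_pts, in_L(L_A_B, B_pts))   # fitted {B} in {L}
B_A_W_L = np.linalg.inv(A_B) @ A_W                     # {W} relative to {B}
```

On noiseless simulated measurements the fitted result coincides with the composed ground-truth pose; with real tracker data the same construction gives the least-squares estimate.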
(3) Given a configuration of the robot at the mobile robot station, solve for $^{B}A_{W,V}$, the homogeneous transformation matrix of {W} relative to {B} constructed from machine vision measurements.
The solution process of $^{B}A_{W,V}$ is as follows:
$^{B}A_{W,V}$ can be further expressed as $^{B}A_{W,V} = {}^{B}A_{T}\,{}^{T}A_{V}\,{}^{V}A_{W}$, where $^{V}A_{W}$ denotes the homogeneous transformation matrix of {W} relative to {V}, determined by measuring with the vision sensor the coordinates in {V} of three non-collinear target balls fixed to the workpiece; $^{T}A_{V}$ denotes the homogeneous transformation matrix of {V} relative to {T}, which can be determined as a constant matrix through off-line hand-eye calibration of the robot; $^{B}A_{T}$ denotes the homogeneous transformation matrix of {T} relative to {B}, whose nominal value can be obtained by off-line programming with an inverse kinematics model (document 1: Inverse kinematics of a 5-axis hybrid robot with non-singular tool path generation. Robotics and Computer-Integrated Manufacturing, 2019, 140-148). Theoretically, $^{B}A_{W,L} = {}^{B}A_{W,V}$ should hold. The robot in the method may be any type of robot.
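The indirect chain $^{B}A_{W,V} = {}^{B}A_{T}\,{}^{T}A_{V}\,{}^{V}A_{W}$, and the way a small error in the nominal tool pose propagates into it, can be illustrated with a short sketch (all numeric poses here are hypothetical, not from the patent):

```python
import numpy as np

def rot_z(deg):
    th = np.deg2rad(deg)
    return np.array([[np.cos(th), -np.sin(th), 0.0],
                     [np.sin(th),  np.cos(th), 0.0],
                     [0.0,         0.0,        1.0]])

def hom(R, t):
    A = np.eye(4)
    A[:3, :3], A[:3, 3] = R, t
    return A

B_A_T = hom(rot_z(45.0), [0.8, 0.0, 1.2])    # nominal tool pose (inverse kinematics)
T_A_V = hom(rot_z(0.0), [0.0, 0.05, 0.1])    # hand-eye calibration result (constant)
V_A_W = hom(rot_z(-30.0), [0.3, 0.1, 0.9])   # vision measurement of {W} in {V}

B_A_W_V = B_A_T @ T_A_V @ V_A_W              # indirect chain of the vision method

# A small residual error on the nominal tool pose (0.2 deg, 1 mm here)
# shifts the whole chain, which is why B_A_W_V generally deviates from
# the tracker result B_A_W_L and motivates the compensation below.
B_A_T_err = hom(rot_z(45.2), [0.801, 0.0, 1.2])
B_A_W_V_err = B_A_T_err @ T_A_V @ V_A_W
```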
Step two, construct a configuration-dependent deviation model of the robot joint variable motion:
$^{B}\xi_{W/B,V} \approx {}^{B}\xi_{T/B} = \xi$ (1)
where $\xi = T_a\rho_a + T_c\rho_c = T\rho$ (2)
In the formula, $^{B}\xi_{W/B,V}$ denotes the measure in {B} of the 6-dimensional pose deviation between the pose $^{B}A_{W,V}$ of {W} relative to {B} obtained by machine vision and its reference value, and is called the pose deviation twist of {W} relative to {B}; $^{B}\xi_{T/B}$ denotes the measure in {B} of the 6-dimensional pose error between the pose of {T} relative to {B} and its nominal value obtained from the inverse kinematics model (see document 1), and is called the pose error twist of {T} relative to {B}. The physical meaning of formula (1) can be interpreted as follows: if $^{B}A_{W,L}$ is taken as the reference, and $^{W}A_{V}$ and $^{T}A_{V}$ are accurate, then $^{B}\xi_{W/B,V}$ can be approximately replaced by $^{B}\xi_{T/B}$. For convenience of notation, let $\xi = {}^{B}\xi_{T/B}$. $T_a$ is the permitted-motion Jacobian of the robot; $T_c$ is the restricted-motion Jacobian; $T = [T_a\ T_c]$ is called the generalized motion Jacobian; $\rho$ represents the motion error vector of the robot joints.
With respect to formula (2), $\rho_a$ represents the motion error vector of the robot driving joints, and $\rho_c$ represents the motion error vector of the robot virtual joints. $\zeta$ represents the sum of the error twists of {T} relative to {B} caused by other unmodelled error sources in the robot body (such as structural errors between adjacent body-fixed frames described by D-H parameters, elastic deformation of members under gravity, etc.) when the zero-return errors of the driving joints are not counted. $\eta_a$ represents the zero-return error vector of the driving joints. $W_a$ is the driving-force Jacobian of the robot; $W_c$ is the constraint-force Jacobian. $W_a$ and $W_c$ are known, while $\zeta$ and $\eta_a$ are unknown; in order to compute $\rho_a$ and $\rho_c$, they are fitted using equation (13).
The derivation of the deviation model is explained as follows:
Because the robot body has errors, in step two $^{B}A_{W,L} \neq {}^{B}A_{W,V}$, and this deviation is related to the configuration of the robot. To this end, $^{B}A_{W,L}$ is taken as the reference, i.e. the expected value of $^{B}A_{W,V}$, and the deviation of $^{B}A_{W,V}$ relative to this expected value can be expressed as formula (3).
In the formula, $^{W}A_{V}$ and $^{T}A_{V}$ are known, having been determined beforehand by vision measurement and hand-eye calibration, and the joint zero-point errors and other error sources of the robot are small quantities relative to their nominal values after kinematic calibration. Then, performing a first-order Taylor expansion of $^{B}A_{W,V}$ about its expected value, substituting formula (3), and right-multiplying both ends by the inverse of the expected value yields the linearized error model in Lie algebra format (4),
which is further adapted into a linearized error model in Plücker axis-coordinate format (5).
In the formula, $^{B}\xi_{W/B,V}$ denotes the measure in {B} of the 6-dimensional pose deviation between the pose $^{B}A_{W,V}$ of {W} relative to {B} obtained by machine vision and its reference value, called the pose deviation twist of {W} relative to {B}; $^{B}\xi_{T/B}$ denotes the measure in {B} of the 6-dimensional pose error between the pose of {T} relative to {B} and its nominal value obtained from the inverse kinematics model, called the pose error twist of {T} relative to {B}.
For convenience, let $\xi = {}^{B}\xi_{T/B}$. On the one hand, when the zero-return error of each driving joint of the robot body and the structural errors of all joints are small quantities, $\xi$ can be further expressed as:
$\xi = J_a\eta_a + \zeta$ (6)
In the formula, $\eta_a$ represents the zero-return error vector of the driving joints, $\eta_{a,i}$ denotes the zero-return error of the $i$-th ($i = 1, 2, \ldots, f$) driving joint, and $f$ denotes the number of degrees of freedom of the robot. $J_a$ is the error Jacobian of the driving joints, whose $i$-th column is the error twist of frame {T} relative to frame {B} when $\eta_{a,i} = 1$ and $\eta_{a,j} = 0$ ($j \neq i$). $\zeta$ represents the sum of the error twists of {T} relative to {B} caused by other unmodelled error sources in the robot body (such as structural errors between adjacent body-fixed frames described by D-H parameters, elastic deformation of members under gravity, etc.) when the zero-return errors of the driving joints are not counted.
On the other hand, as known from screw algebra, because the twist space of the rigid-body infinitesimal motion of the robot end can be decomposed into the direct sum of two subspaces, the permitted motions and the restricted motions, $\xi$ can also be expressed as:
$\xi = \xi_a + \xi_c = T_a\rho_a + T_c\rho_c = T\rho$ (7)
In the formula, $\rho$ represents the motion error vector of the robot joints, $\rho_a$ the motion error vector of the driving joints, and $\rho_c$ the motion error vector of the virtual joints. $\xi_a$ denotes the permitted pose error twist, and $\xi_c$ the restricted pose error twist. $T_a$ is the permitted-motion Jacobian of the robot; $T_c$ is the restricted-motion Jacobian; $T = [T_a\ T_c]$ is called the generalized motion Jacobian. The column vectors of $T$ span a basis of the twist space, and $T$ is a nonlinear function of the nominal scale parameters of the mechanism and the nominal driving joint variables.
Comparing the two expressions, and considering that the driving joints are the motion-generating elements of the robot mechanism, take $T_a = J_a$, i.e. use the column vectors of $J_a$ to construct a basis of the permitted-motion subspace. The physical meanings of $\rho_a$ and $\rho_c$ are then understood as the motion error vectors of the driving joints and the virtual joints. Further, the force Jacobian $W$ of the system is constructed using dual-space theory such that
$W = [W_a\ W_c] = T^{-\mathrm{T}}$ (8)
and the following relationship is established:
$\rho_a = W_a^{\mathrm{T}}\xi = \eta_a + W_a^{\mathrm{T}}\zeta, \qquad \rho_c = W_c^{\mathrm{T}}\xi = W_c^{\mathrm{T}}\zeta$ (9)
In the formula, $W_a$ denotes the driving-force Jacobian of the robot, and $W_c$ denotes the constraint-force Jacobian. Both vary with the configuration, so $\rho_a$ and $\rho_c$ can both be regarded as nonlinear functions of the nominal driving joint variable $q$. $\eta_a$ represents the zero-return error vector of the driving joints.
Step three, according to the deviation model (formulas (1) and (2)) in step two, construct the error fitting model of {T} relative to {B} and estimate its coefficient vector $\beta$.
From formula (1), $^{B}\xi_{T/B}$ and $^{B}\xi_{W/B,V}$ are approximately equal, and the purpose of establishing the error fitting model is to compensate $^{B}\xi_{W/B,V}$ in the deviation model.
First, construct the error fitting model:
$\xi = TZ\beta$ (10)
where $\xi$ represents the robot pose error twist (obtained from the deviation model of formula (1)), $Z$ represents the motion error fitting matrix of the driving joints and virtual joints (a known quantity), $T$ is the generalized motion Jacobian of the robot (a known quantity), and $\beta$ represents the coefficient vector of the driving-joint and virtual-joint motion error fit (the coefficients to be fitted);
the derivation process of the error fitting function is:
under any given configuration of the robot, e.g. modifying the command for driving the joints to qm,
qm=q+Δq (11)
And the correction amount Deltaq is made to satisfy
Then the theoretical compensation by paInduced allowable motion pose error momentum xia。
Although $\rho_a$ and $\rho_c$ are unknown, they are both nonlinear functions of the nominal values $q$ of the robot driving joint variables; for this purpose, $\rho_a$ and $\rho_c$ are fitted with a quadratic polynomial in $q$, i.e.
$\rho_{a,i} = \alpha_{a,0,i} + \sum_{j=1}^{f}\alpha_{a,j,i}\,q_j + \sum_{j=1}^{f}\sum_{k=j}^{f}\alpha_{a,jk,i}\,q_j q_k, \qquad \rho_{c,1} = \alpha_{c,0,1} + \sum_{j=1}^{f}\alpha_{c,j,1}\,q_j + \sum_{j=1}^{f}\sum_{k=j}^{f}\alpha_{c,jk,1}\,q_j q_k$ (13)
In the formula, $\rho_{a,i}$ represents the $i$-th component of the driving joint motion error vector, $\rho_{c,1}$ represents the 1st component of the virtual joint motion error vector, $q_j$ represents the motion variable of the $j$-th driving joint, and $q_k$ that of the $k$-th driving joint. $\alpha_{a,0,i}$ represents the zeroth-order coefficient of the driving joint motion error fit, and $\alpha_{c,0,1}$ the zeroth-order coefficient of the virtual joint motion error fit. $\alpha_{a,j,i}$ represents the first-order coefficient associated with the $j$-th driving joint variable in the fit of the $i$-th driving joint motion error, and $\alpha_{c,j,1}$ the first-order coefficient associated with the $j$-th driving joint variable in the fit of the 1st virtual joint motion error. $\alpha_{a,jk,i}$ represents the second-order coefficient associated with the $j$-th and $k$-th driving joint variables in the fit of the $i$-th driving joint motion error, and $\alpha_{c,jk,1}$ the corresponding second-order coefficient in the fit of the 1st virtual joint motion error.
Performing a variable substitution on formula (13), it can be written in the form of a multivariate linear function:
$\rho_{a,i} = z^{\mathrm{T}}\beta_{a,i}, \qquad \rho_{c,1} = z^{\mathrm{T}}\beta_{c,1}$ (14)
In the formula, $\beta_{a,j,i}$, $j = 0, 1, \ldots, n$, denote the coefficients of the $j$-th polynomial term in the fit of the $i$-th driving joint motion error, and $\beta_{c,j,1}$, $j = 0, 1, \ldots, n$, the coefficients of the $j$-th polynomial term in the fit of the 1st virtual joint motion error. $z_j$, $j = 1, 2, \ldots, n$, denotes the $j$-th polynomial term associated with the driving joint variables, and $z$ the vector composed of the $z_j$. $\beta_{a,i}$ is the vector composed of the $\beta_{a,j,i}$, and $\beta_{c,1}$ the vector composed of the $\beta_{c,j,1}$. $n$ represents the number of coefficients in the multivariate linear function constructed from the quadratic polynomial; here $n = 21$.
Writing formula (13) in matrix form gives
$\rho_a = Z_a\beta_a, \qquad \rho_c = Z_c\beta_c$ (15)
where $Z_a$ and $Z_c$ are block matrices built from the regressor vector $z$, and $\beta_a$ and $\beta_c$ stack the coefficient vectors $\beta_{a,i}$ and $\beta_{c,1}$, respectively.
Substituting formula (15) into formula (7) yields the error fitting model
$\xi = TZ\beta$ (16)
In the formula, $Z = \mathrm{diag}(Z_a, Z_c)$ represents the motion error fitting matrix of the driving joints and virtual joints, and $\beta = [\beta_a^{\mathrm{T}}\ \beta_c^{\mathrm{T}}]^{\mathrm{T}}$ represents the coefficient vector of the driving-joint and virtual-joint motion error fit.
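The variable substitution from the quadratic polynomial (13) to the multivariate linear form (14) can be sketched as a feature-vector construction. The helper below is hypothetical, but it reproduces the count $n = 1 + f + f(f+1)/2 = 21$ stated in the text for $f = 5$ driving joints:

```python
import numpy as np
from itertools import combinations_with_replacement

def quad_features(q):
    """Quadratic-polynomial regressor z(q) for the linear model
    rho_i = z(q)^T beta_i: constant, linear, and quadratic terms of the
    nominal driving-joint variables (equation (13) after substitution)."""
    q = np.asarray(q, dtype=float)
    z = [1.0]                                              # zeroth-order term
    z.extend(q)                                            # q_j terms
    z.extend(q[j] * q[k] for j, k in
             combinations_with_replacement(range(len(q)), 2))  # q_j q_k terms
    return np.array(z)

q = np.array([0.1, -0.2, 0.3, 0.05, -0.1])   # hypothetical f = 5 joint variables
z = quad_features(q)
# len(z) = 1 + 5 + 15 = 21, matching n = 21 in the text.
```

With 5 driving-joint error components and 1 virtual-joint error component, 6 × 21 = 126 coefficients result, matching the 126 coefficients of $\beta$ mentioned in the identification step.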
Second, identify the coefficient vector $\beta$ in the regression model $y = X\beta + \varepsilon$ with a damped least-squares algorithm to obtain a reliable estimate $\hat\beta$ of $\beta$.
The robot working area is traversed with $K$ configurations, for example $K = 150$ configurations in the robot working area. According to the error fitting model, formula (16), the following multiple linear regression model is constructed:
$y = X\beta + \varepsilon$ (17)
wherein:
$y = [\xi_1^{\mathrm{T}}\ \xi_2^{\mathrm{T}}\ \cdots\ \xi_K^{\mathrm{T}}]^{\mathrm{T}}$ denotes the vector formed by the observations of $\xi$; $\xi_1, \xi_2, \ldots, \xi_K$ are the observed values of $\xi$ in the 1st, 2nd, ..., $K$-th configuration, obtainable from formulas (3)-(5). $X = [(T_1Z_1)^{\mathrm{T}}\ (T_2Z_2)^{\mathrm{T}}\ \cdots\ (T_KZ_K)^{\mathrm{T}}]^{\mathrm{T}}$ denotes the regression design matrix, or identification matrix, where $T_1, T_2, \ldots, T_K$ and $Z_1, Z_2, \ldots, Z_K$ are the computed values of $T$ and $Z$ in the 1st, 2nd, ..., $K$-th configuration. $\varepsilon \sim N(0, \sigma^2 I_{6K})$ represents the measurement noise, following a zero-mean normal distribution, where $I_{6K}$ is the $6K \times 6K$ identity matrix.
Although an unbiased estimate of $\beta$ can be obtained by ordinary least squares, note that $\beta$ contains 126 coefficients. With this many identification parameters, an ill-conditioning problem may arise from rank deficiency or multicollinearity of the identification matrix. A damped least-squares algorithm is therefore adopted to obtain a reliable estimate $\hat\beta$ of $\beta$.
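A minimal sketch of the damped least-squares identification, assuming the standard ridge-type normal equations $\hat\beta = (X^{\mathrm{T}}X + \mu I)^{-1}X^{\mathrm{T}}y$; the damping factor and the synthetic data below are illustrative, not from the patent:

```python
import numpy as np

def damped_least_squares(X, y, mu=1e-6):
    """Damped (ridge-type) least-squares estimate of beta in y = X beta + eps.
    The damping term mu*I keeps the normal equations well conditioned when X
    is rank deficient or nearly collinear, as can happen with the 126
    identification coefficients."""
    n = X.shape[1]
    return np.linalg.solve(X.T @ X + mu * np.eye(n), X.T @ y)

# Synthetic check: recover known coefficients from noiseless observations
# over K = 150 configurations (6 twist components each), 126 coefficients.
rng = np.random.default_rng(1)
X = rng.normal(size=(6 * 150, 126))
beta_true = rng.normal(size=126)
y = X @ beta_true
beta_hat = damped_least_squares(X, y, mu=1e-10)
```

With noisy observations a larger damping factor trades a small bias for a much lower estimate variance; in practice $\mu$ would be tuned, e.g. by an L-curve or cross-validation.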
Fourthly, determining the relative position and posture of the base coordinate system and the workpiece coordinate system of the mobile robot through one-time online vision measurement of machine vision, and specifically comprising the following steps:
firstly, a region to be processed on a workpiece is given, three target points are installed in the region to be processed to form a workpiece coordinate system { W }, and meanwhile, the robot is in a reference configuration, namely, a robot moving platform reference point P reaches a task space midpoint P0;
Secondly, as shown in FIG. 3, a measuring configuration of { T } relative to { B } for robot vision measurement is planned, so that the center V of the three-dimensional vision sensor is aligned with the center O of a triangle circumscribed circle formed by the centers of three target balls mounted on the workpieceWMeasuring the coordinates of the centers of the three target balls on the workpiece under the { V } in the step one, and constructing a nominal homogeneous transformation matrix of { T } relative to { B } according to the coordinates
Fourthly, a nominal driving joint variable q under the measured configuration is determined by using a robot position inverse solution model, and a motion error fitting matrix Z (q) and a generalized motion jacobian T (q) under the nominal driving joint variable q are calculated, wherein the calculation method is shown in a reference 2: research on parametric modeling and integrated design method of novel five-degree-of-freedom hybrid robot [ D ], Tianjin university ";
fifthly, the robot is driven by q to move from the reference configuration to the measurement configuration, the measurement configuration is kept unchanged, the AGV trolley (the existing structure) is moved, a visual sensor carried on an A/C swing angle head A shaft of the AGV trolley enters a preset position, coordinates of the centers of three target balls are measured through the visual sensor respectively, and a homogeneous transformation matrix of { W } relative to { V } is constructed
Sixth step, constructed according to the second and fifth stepsComputing a homogeneous transformation matrix of { W } versus { B } at the measured configurationBAW,V;
Seventh, the model (equation (10)) is fitted based on the error, usingStructure of the deviceIsomorphic transformation and right multiplication at both endsBAW,VConstruction DeltaBAWSuch that the { W } relative { V } homogeneous transformation matrix determined by the compensated machine vision measurement isWhereinTo representBξW/B,VIs determined by the estimated value of (c),presentation pairBAW,VCompensated result, ΔBAWPresentation pairBAW,VA compensation value.
Therefore, the relative pose of the base coordinate system and the workpiece coordinate system of the mobile robot can be determined through a single online vision measurement; the complete compensation process is shown in FIG. 1.
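The compensation step right-multiplies ᴮA_{W,V} by a homogeneous transformation built from the estimated twist. A minimal sketch, assuming the compensator is the SE(3) exponential of a 6-vector twist with translation part first and rotation part second; this ordering and the exponential construction are assumptions about the convention, not statements of the patent's exact formula:

```python
import numpy as np

def skew(w):
    """3x3 skew-symmetric matrix of a 3-vector."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def se3_exp(xi):
    """Closed-form SE(3) exponential of a twist xi = (v, w):
    returns the 4x4 homogeneous transformation exp(hat(xi))."""
    v, w = np.asarray(xi[:3], float), np.asarray(xi[3:], float)
    th = np.linalg.norm(w)
    W = skew(w)
    if th < 1e-12:
        R, Vm = np.eye(3), np.eye(3)          # small-angle limit
    else:
        a = np.sin(th) / th
        b = (1.0 - np.cos(th)) / th**2
        c = (th - np.sin(th)) / th**3
        R = np.eye(3) + a * W + b * (W @ W)
        Vm = np.eye(3) + b * W + c * (W @ W)  # left Jacobian of SO(3)
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, Vm @ v
    return T

def compensate(B_A_WV, xi_est):
    """Right-multiply the vision-measured pose by the compensator
    Delta = exp(hat(xi_est))."""
    return B_A_WV @ se3_exp(xi_est)
```

For the small pose deviations targeted by the calibration, the exponential is close to I + hat(ξ̂), so a first-order compensator would behave almost identically.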
Although the present invention has been described above in terms of its functions and operations with reference to the accompanying drawings, the invention is not limited to the specific functions and operations described; the above embodiments are merely illustrative rather than restrictive, and those skilled in the art may make many modifications without departing from the spirit of the invention and the scope of the appended claims.
Claims (3)
1. A method for determining the relative pose of the base and the workpiece of a mobile robot based on vision measurement, characterized by comprising the following steps:
step one, establishing the relative pose relationship between the robot base coordinate system and the workpiece coordinate system using measurement information from a laser tracker and a vision sensor, comprising the following steps:
(1) establishing coordinate systems: for a given station of the mobile robot, a workpiece coordinate system { W } is established on the workpiece, a base coordinate system { B } on the robot base, a tool coordinate system { T } on the tool, and a vision sensor coordinate system { V } on the three-dimensional vision sensor;
(2) letting ᴮA_{W,L} denote the homogeneous transformation matrix of { W } relative to { B } determined with the laser tracker at this station of the mobile robot;
(3) for a given configuration of the robot at the given mobile-robot station, solving for ᴮA_{W,V}, the homogeneous transformation matrix of { W } relative to { B } constructed by machine-vision measurement;
step two, constructing a configuration-dependent deviation model of the robot joint motions:
ᴮξ_{W/B,V} ≈ ᴮξ_{T/B} = ξ, where ξ = T_a p_a + T_c p_c = T p
in the formula, ᴮξ_{W/B,V} denotes the measure in { B } of the 6-dimensional pose deviation between the pose ᴮA_{W,V} of { W } relative to { B } obtained by machine vision and its reference value, and is called the pose deviation twist of { W } relative to { B }; ᴮξ_{T/B} denotes the measure in { B } of the 6-dimensional pose error between the pose of { T } relative to { B } and its nominal value obtained from the inverse kinematics model, and is called the pose error twist of { T } relative to { B }; T_a is the permitted-motion Jacobian of the robot; T_c is the restricted-motion Jacobian of the robot; T is the generalized motion Jacobian; p denotes the joint motion error vector of the robot;
p_a denotes the motion error vector of the driving joints of the robot; p_c denotes the motion error vector of the virtual joints of the robot, where ζ denotes the sum of the error twists of { T } relative to { B } caused by the unmodeled error sources in the robot body when the return-to-zero errors of the driving joints are excluded, and η_a denotes the return-to-zero error vector of the driving joints; W_a is the driving-force Jacobian of the robot; W_c is the constraint-force Jacobian of the robot;
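The "measure of the 6-dimensional pose deviation" between a measured and a reference homogeneous transformation can, to first order, be extracted as a twist. The sketch below assumes a right-perturbation model, A_meas ≈ A_ref (I + hat(ξ)); the perturbation side and the (v, w) ordering are illustrative assumptions rather than the patent's stated convention.

```python
import numpy as np

def hat(xi):
    """4x4 matrix form of a 6-vector twist xi = (v, w)."""
    v, w = xi[:3], xi[3:]
    X = np.zeros((4, 4))
    X[:3, :3] = [[0.0, -w[2], w[1]],
                 [w[2], 0.0, -w[0]],
                 [-w[1], w[0], 0.0]]
    X[:3, 3] = v
    return X

def deviation_twist(A_ref, A_meas):
    """First-order pose deviation twist xi such that
    A_meas ≈ A_ref @ (I + hat(xi))."""
    D = np.linalg.inv(A_ref) @ A_meas - np.eye(4)
    v = D[:3, 3]
    S = 0.5 * (D[:3, :3] - D[:3, :3].T)   # skew-symmetric part of rotation residue
    w = np.array([S[2, 1], S[0, 2], S[1, 0]])
    return np.concatenate([v, w])
```

For small deviations this recovers the twist exactly; larger deviations would call for the SE(3) logarithm instead of the first-order extraction.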
step three, according to the deviation model in step two, constructing the error fitting model of { T } relative to { B } and estimating its coefficient vector β, specifically as follows:
firstly, constructing an error fitting model, wherein the formula is as follows:
ξ=TZβ
in the formula, ξ denotes the pose error twist of the robot, Z denotes the motion-error fitting matrix of the driving and virtual joints, and β denotes the coefficient vector of the motion-error fitting of the driving and virtual joints;
secondly, identifying the coefficient vector β in the regression model y = Xβ + ε by a damped least-squares algorithm to obtain a reliable estimate β̂ of β, wherein:
in the formula, y denotes the vector formed by the observations of ξ; ξ_1, ξ_2, ..., ξ_K denote the observed values of ξ at the 1st, 2nd, ..., Kth configurations; X denotes the regression design matrix (identification matrix); T_1, T_2, ..., T_K denote the computed values of T at the 1st, 2nd, ..., Kth configurations; Z_1, Z_2, ..., Z_K denote the computed values of Z at the 1st, 2nd, ..., Kth configurations; ε ~ N(0, σ²I_{6K}) denotes measurement noise following a zero-mean normal distribution, and I_{6K} denotes the 6K × 6K identity matrix;
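The damped least-squares identification of β from the stacked regression y = Xβ + ε can be sketched as follows. The damping factor mu and the stacking of the per-configuration blocks T_k Z_k are illustrative assumptions about how the K configurations are assembled.

```python
import numpy as np

def damped_least_squares(X, y, mu=1e-8):
    """Damped least-squares estimate of beta in y = X beta + eps:
    beta_hat = (X^T X + mu I)^{-1} X^T y."""
    n = X.shape[1]
    return np.linalg.solve(X.T @ X + mu * np.eye(n), X.T @ y)

def stack_identification(T_list, Z_list, xi_list):
    """Stack per-configuration blocks T_k @ Z_k into the identification
    matrix X and the observed twists xi_k into y (illustrative stacking)."""
    X = np.vstack([T @ Z for T, Z in zip(T_list, Z_list)])
    y = np.concatenate(xi_list)
    return X, y
```

The damping term regularizes the normal equations when some fitting coefficients are weakly excited by the chosen measurement configurations.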
step four, determining the relative pose of the robot base coordinate system and the workpiece coordinate system through a single online machine-vision measurement, specifically comprising the following steps:
firstly, a region to be machined on the workpiece is given, and three target points are installed in this region to form the workpiece coordinate system { W }; meanwhile, the robot is placed in the reference configuration, i.e., the reference point of the robot moving platform is brought to the midpoint of the task space;
secondly, a measurement configuration of { T } relative to { B } for the robot vision measurement is planned, such that the center of the three-dimensional vision sensor is aligned with the center of the circumscribed circle of the triangle formed by the centers of the three target balls mounted on the workpiece in step one; the coordinates of the centers of the three target balls are measured in { V }, and the nominal homogeneous transformation matrix ᴮA_T of { T } relative to { B } is constructed from them;
fourthly, determining the nominal driving joint variables q at the measurement configuration with the inverse position model of the robot, and calculating the motion-error fitting matrix Z(q) and the generalized motion Jacobian T(q) at q;
fifthly, the robot is driven by q from the reference configuration to the measurement configuration; with the measurement configuration held fixed, the AGV is moved so that the vision sensor carried on the A-axis of its A/C swivel head reaches the preset position; the coordinates of the centers of the three target balls are then measured with the vision sensor, and the homogeneous transformation matrix ᵛA_W of { W } relative to { V } is constructed;
sixthly, from the nominal matrix ᴮA_T constructed in the second step and the matrix ᵛA_W constructed in the fifth step, the homogeneous transformation matrix ᴮA_{W,V} of { W } relative to { B } at the measurement configuration is computed;
seventhly, based on the error fitting model, the estimate β̂ is used to construct the estimated pose deviation twist ᴮξ̂_{W/B,V}; the compensating homogeneous transformation ΔᴮA_W is built from it and right-multiplied onto ᴮA_{W,V}, so that the homogeneous transformation matrix of { W } relative to { B } determined by the compensated machine-vision measurement is ᴮÂ_{W,V} = ᴮA_{W,V} ΔᴮA_W, where ᴮξ̂_{W/B,V} denotes the estimate of ᴮξ_{W/B,V}, ᴮÂ_{W,V} denotes the compensated ᴮA_{W,V}, and ΔᴮA_W denotes the compensation applied to ᴮA_{W,V}.
2. The vision measurement-based mobile robot base-workpiece relative pose determination method according to claim 1, wherein:
ᴮA_{W,V} is solved as follows: ᴮA_{W,V} is further expressed as ᴮA_{W,V} = ᴮA_T ᵀA_V ᵛA_W, wherein ᵛA_W denotes the homogeneous transformation matrix of { W } relative to { V }, determined by measuring with the vision sensor the coordinates in { V } of three non-collinear target balls fixed on the workpiece; ᵀA_V denotes the homogeneous transformation matrix of { V } relative to { T }, a constant matrix determined by off-line hand-eye calibration of the robot; ᴮA_T denotes the homogeneous transformation matrix of { T } relative to { B }, whose nominal value is obtained by off-line programming with the inverse kinematics model.
3. The vision measurement-based mobile robot base-workpiece relative pose determination method according to claim 1, wherein ᴮA_{W,L} is obtained as follows: using the laser tracker, the coordinates, in the laser tracker coordinate system { L }, of at least three non-collinear target points on the workpiece and on the robot base are measured respectively, and ᴮA_{W,L} is fitted by least squares.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110923264.4A CN113580137B (en) | 2021-08-12 | 2021-08-12 | Mobile robot base-workpiece relative pose determining method based on vision measurement |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113580137A true CN113580137A (en) | 2021-11-02 |
CN113580137B CN113580137B (en) | 2023-09-22 |
Family
ID=78257429
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110923264.4A Active CN113580137B (en) | 2021-08-12 | 2021-08-12 | Mobile robot base-workpiece relative pose determining method based on vision measurement |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113580137B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114131595A (en) * | 2021-11-12 | 2022-03-04 | 清华大学 | Robot 6D pose estimation system and method |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2010003289A1 (en) * | 2008-07-11 | 2010-01-14 | 中国科学院沈阳自动化研究所 | Apparatus and method for robots tracking appointed path with high precision |
CN105773609A (en) * | 2016-03-16 | 2016-07-20 | 南京工业大学 | Robot kinematics calibration method based on vision measurement and distance error model |
CN107175660A (en) * | 2017-05-08 | 2017-09-19 | 同济大学 | A kind of six-freedom degree robot kinematics scaling method based on monocular vision |
CN107741224A (en) * | 2017-08-28 | 2018-02-27 | 浙江大学 | A kind of AGV automatic-posture-adjustment localization methods of view-based access control model measurement and demarcation |
CN109794963A (en) * | 2019-01-07 | 2019-05-24 | 南京航空航天大学 | A kind of robot method for rapidly positioning towards curved surface member |
US20200206945A1 (en) * | 2018-12-29 | 2020-07-02 | Ubtech Robotics Corp Ltd | Robot pose estimation method and apparatus and robot using the same |
Non-Patent Citations (2)
Title |
---|
LI Yaogui: "Error modeling and verification of a posture-adjusting parallel robot with an articulated moving platform", Journal of Chongqing University of Technology (Natural Science), no. 08 *
YANG Shourui; YIN Shibin; REN Yongjie; ZHU Jigui; YE Shenghua: "Improvement of the calibration method for robotic flexible vision measurement systems", Optics and Precision Engineering, vol. 22, no. 12 *
Also Published As
Publication number | Publication date |
---|---|
CN113580137B (en) | 2023-09-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109848983B (en) | Method for guiding robot to cooperatively work by high-compliance person | |
CN102591257B (en) | Parameter curve cutter path oriented numerical control system contour error control method | |
Peng et al. | An enhanced kinematic model for calibration of robotic machining systems with parallelogram mechanisms | |
Masory et al. | Kinematic modeling and calibration of a Stewart platform | |
CN111037542B (en) | Track error compensation method for linear machining of inverse dynamics control robot | |
Kubela et al. | Assessment of industrial robots accuracy in relation to accuracy improvement in machining processes | |
CN113580148B (en) | Parallel robot kinematics calibration method based on equivalent kinematic chain | |
CN114147726A (en) | Robot calibration method combining geometric error and non-geometric error | |
CN113580137A (en) | Mobile robot base-workpiece relative pose determination method based on vision measurement | |
CN114505865A (en) | Pose tracking-based mechanical arm path generation method and system | |
CN112587237A (en) | Method for reducing operation error of orthopedic operation robot | |
Qi et al. | Accuracy improvement calibrations for the double-position 4-PPPS aircraft docking system | |
US6785624B2 (en) | Method for calibrating of machine units located in machine tools or robotic devices | |
Kubela et al. | High accurate robotic machining based on absolute part measuring and on-line path compensation | |
Kim et al. | Joint compliance error compensation for robot manipulator using body frame | |
Zhao et al. | An efficient error prediction and compensation method for coordinated five-axis machine tools under variable temperature | |
Kvrgic et al. | A control algorithm for a vertical five-axis turning centre | |
Zhao et al. | Calibration and Compensation of Rotary Axis Angular Positioning Deviations on a SCARA-Type Industrial Robot Using a Laser Tracker | |
Li | Visual Calibration, Identification and Control of 6-RSS Parallel Robots | |
Zhao et al. | Kinematic modeling and inverse kinematics solution of a new six-axis machine tool for oval hole drilling in aircraft wing assembly | |
Lounici et al. | Tool Positioning Error Minimization during Robotic Milling Based on the Genetic Algorithm Technique | |
CN110940351A (en) | Robot precision compensation method based on parameter dimension reduction identification | |
Lounici et al. | Cutting Forces Impact on the Spindle Path during Robotic Milling | |
Newman et al. | Towards a new real-time metrology guidance method for robotized machining of aerostructures robust against cutting fluids and debris | |
Wu et al. | Volumetric error modeling and accuracy improvement by parameter identification of a compound machine tool |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||