CN111546344A - Mechanical arm control method for alignment - Google Patents

Mechanical arm control method for alignment

Info

Publication number
CN111546344A
CN111546344A (application CN202010422260.3A)
Authority
CN
China
Prior art keywords: mechanical arm, image, space, obtaining, alignment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010422260.3A
Other languages
Chinese (zh)
Inventor
贾庆轩
段嘉琪
陈钢
王一帆
潘广堂
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing University of Posts and Telecommunications
Original Assignee
Beijing University of Posts and Telecommunications
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing University of Posts and Telecommunications filed Critical Beijing University of Posts and Telecommunications
Priority to CN202010422260.3A
Publication of CN111546344A
Legal status: Pending

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1602: Programme controls characterised by the control system, structure, architecture
    • B25J9/1607: Calculation of inertia, jacobian matrixes and inverses
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1656: Programme controls characterised by programming, planning systems for manipulators

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)

Abstract

The embodiment of the invention provides a mechanical arm control method for alignment, which comprises the following steps: obtaining an image Jacobian matrix reflecting the mapping relation between the image feature space and the mechanical arm joint space; obtaining a target feature depth value based on the measurement data of the mechanical arm and the image motion; obtaining the dynamic equation of the mechanical arm and the target load; and obtaining a mechanical arm control equation for alignment according to the image Jacobian matrix, the target feature depth value, and the dynamic equation of the mechanical arm and the target load. With the technical scheme provided by the embodiment of the invention, the alignment operation in the assembly process can be completed even when the hand-eye relationship of the mechanical arm cannot be effectively calibrated, the depth information of the target feature can be identified online, and on this basis robust control of assembly with unknown inertial parameters is realized.

Description

Mechanical arm control method for alignment
[Technical Field]
The invention relates to automatic control technology, and in particular to a mechanical arm control method for performing servo alignment on an assembly with unknown inertial parameters when the hand-eye relationship cannot be effectively calibrated.
[Background of the Invention]
With the continuous deepening of space development, large-scale space structures are a major trend in the future aerospace industry. Many research programs require large space structures as their foundation: a space station needs a kilometer-scale solar cell array to provide sufficient energy; high-resolution Earth observation requires hundred-meter satellite-borne antennas to improve precision; and deep-space exploration craft need huge solar sails made of ultra-light films to use solar photons for propulsion. However, because of the size limits of launch-vehicle fairings, a large space structure cannot be sent into space in one piece by a launch vehicle and must be assembled on orbit from trusses. During on-orbit truss assembly, a space manipulator inevitably has to complete a large amount of alignment work by visual servoing. However, accurate and effective calibration of the system cannot be performed during on-orbit assembly. For example, the U.S. Skyworker robot developed for on-orbit assembly has an arm dedicated to carrying a camera; before each assembly pairing, the arm moves to the relevant position to complete the "look" action. The hand-eye relationship therefore differs for every alignment operation: recalibration is cumbersome, hard to realize, and lowers assembly efficiency, while continuing to use an old calibration result produces large errors. In addition, the shape and mass of the held truss change as assembly proceeds; it may be a simple truss that has not yet been connected, or a partially built large truss. The inertial parameters of the object controlled by the on-orbit visual servo are therefore uncertain, and this uncertainty also affects alignment accuracy. Consequently, it is important to study how to control a mechanical arm to align and assemble an assembly with unknown inertial parameters without calibration.
Existing visual servo control methods for mechanical arms mainly control the arm to finish alignment tasks on a two-dimensional plane, which limits their applicability. Uncalibrated visual servoing that assumes the target depth information is a slowly varying value cannot obtain the depth accurately, and its control precision degrades when the depth varies strongly. Hybrid force controllers designed with adaptive algorithms achieve servo control with undetermined inertial parameters of the manipulated object, but they require complex parameter estimation, take a long time, and cannot meet the demand for rapid assembly of large space structures. The existing control methods are therefore unsuitable for servo alignment control of a mechanical arm in on-orbit assembly.
[Summary of the Invention]
In view of this, embodiments of the present invention provide a mechanical arm control method for alignment, so as to achieve alignment of an assembly with unknown inertial parameters, without calibration, during on-orbit assembly.
The embodiment of the invention provides a mechanical arm control method for alignment, which comprises the following steps:
obtaining an image Jacobian matrix of the mapping relation between the image characteristic space and the mechanical arm joint space;
obtaining a target feature depth value based on the measurement data of the mechanical arm and the image motion;
Obtaining a kinetic equation of the mechanical arm and the target load;
and obtaining a mechanical arm control equation for alignment according to the image Jacobian matrix, the target characteristic depth value and a kinetic equation of the mechanical arm and the target load.
In the method, the image Jacobian matrix reflecting the mapping relation between the image feature space and the mechanical arm joint space is obtained as:

$$\dot{f} = J\dot{q}, \qquad J = J_r J_q$$

where $f = [f_1^T\ f_2^T \cdots f_m^T]^T \in \mathbb{R}^{2m}$ collects the 2-dimensional image feature vectors of the m feature points, $J_r$ is the Jacobian matrix reflecting the mapping relation between the image feature space and the task space, $\dot{r} \in \mathbb{R}^6$ is the generalized velocity of the camera in the task space, $J_q$ is the manipulator Jacobian matrix reflecting the mapping relation between the task space and the manipulator joint space, $\dot{q} \in \mathbb{R}^n$ is the joint velocity of the mechanical arm, m is the number of selected feature points, and n is the degree of freedom of the mechanical arm.
In the method, the Jacobian matrix $J_r$ reflecting the mapping relation between the image feature space and the task space is obtained using the following formula:

$$J_r = [J_{r1}, J_{r2}, \dots, J_{rm}]^T$$

$$J_{ri} = \begin{bmatrix} -\frac{\lambda}{z} & 0 & \frac{u}{z} & \frac{uv}{\lambda} & -\frac{\lambda^2+u^2}{\lambda} & v \\ 0 & -\frac{\lambda}{z} & \frac{v}{z} & \frac{\lambda^2+v^2}{\lambda} & -\frac{uv}{\lambda} & -u \end{bmatrix}$$

where $J_{ri}$ is the Jacobian matrix of the i-th feature point, λ is the focal length of the camera, z is the depth of the target feature point, and $[u\ v]^T$ are the projection coordinates of the target feature point in the image space.
In the above method, the target feature depth value obtained based on the measurement data of the mechanical arm and the image motion satisfies

$$\frac{1}{z}J_t\,v = \dot{f} - J_\omega\,\omega$$

where v is the translational velocity of the camera, ω is the rotational velocity of the camera, $J_t$ reflects the influence of the translational camera motion on the image feature vector, and $J_\omega$ reflects the influence of the rotational camera motion on the image feature vector; both can be derived from the Jacobian matrix $J_r$ reflecting the mapping relation between the image feature space and the task space. Solving by least squares gives the depth estimate

$$\hat{z} = \frac{(J_t v)^T (J_t v)}{(J_t v)^T (\dot{f} - J_\omega\,\omega)}$$
In the above method, the dynamic equation of the mechanical arm and the target load is obtained as:

$$H(q)\ddot{q} + C(q,\dot{q})\dot{q} + J_q^T F_e = \tau$$

where H(q) is the positive definite mass inertia matrix, $C(q,\dot{q})\dot{q}$ is the Coriolis and centrifugal force term, τ is the mechanical arm joint torque, $J_q$ is the manipulator Jacobian matrix, $F_e$ is the force output at the end-effector during the assembly operation, and $\ddot{q}$ is the joint acceleration of the mechanical arm.
In the above method, the mechanical arm control equation for alignment is obtained from the image Jacobian matrix, the target feature depth value, and the dynamic equation of the mechanical arm and the target load as follows:

(1) Using PI control theory, the alignment control equation of the mechanical arm in the task space is obtained as

$$u(k+1) = \hat{J}_r^{+}\Big(c_1 e_f(k) + c_2 \sum_{i=0}^{k} e_f(i)\Big)$$

where u(k+1) is the servo motion quantity of the camera at time k+1, $u(k+1) = \Delta r(k+1) = r(k+1) - r(k)$, and r(k) is the coordinate parameter of the camera in the task space at time k; $e_f(t)$ is the difference between the expected and actual image feature values, $e_f(t) = f^*(t) - f(t)$, where $f^*(t) = (u^*, v^*)^T$ is the expected projection of the feature points on the image plane and f(t) is the actual projection; $c_1$ and $c_2$ are respectively the proportional and integral coefficients of the PI controller.

(2) Using sliding mode control based on the computed torque method, the alignment control equation of the mechanical arm in the joint space is obtained as

$$\tau = \hat{H}(q)\big(\ddot{q}_d + \Lambda\dot{e} + \eta\,\mathrm{sgn}(s)\big) + \hat{C}(q,\dot{q})\dot{q} + J_q^T F_e$$

where τ is the control torque of the mechanical arm joints; $\hat{H}(q)$ and $\hat{C}(q,\dot{q})$ are the estimates of the positive definite mass inertia matrix H(q) and of the Coriolis and centrifugal force term $C(q,\dot{q})$, calculated with the estimated inertial parameters; $J_q$ is the manipulator Jacobian matrix reflecting the mapping relation between the task space and the manipulator joint space; $F_e$ is the end-effector output force during assembly; $\ddot{q}_d + \Lambda\dot{e}$ is the desired joint acceleration of the mechanical arm, Λ is a positive diagonal matrix, and $e = q_d - q$ is the difference between the desired and actual joint angles; the sliding mode surface of the sliding mode control is defined as $s = \dot{e} + \Lambda e$; η is the approach rate of the sliding mode control; and sgn(s) is the componentwise sign function.
According to the technical scheme, the embodiment of the invention has the following beneficial effects:
according to the technical scheme of the embodiment of the invention, the image Jacobian matrix of the mapping relation between the image feature space and the mechanical arm joint space is obtained, the depth value of the target feature is obtained according to the measurement data of the mechanical arm and the image motion, and the control equation applicable to the mechanical arm to align and assemble the assembly with unknown inertial parameters is obtained by combining the kinetic equation of the mechanical arm and the target load, so that the mechanical arm can be controlled to realize the servo alignment of the assembly with unknown inertial parameters under the condition of no calibration, a complicated calibration procedure of the hand-eye relation is omitted, the influence of uncertain inertial parameters of the operated object on the assembly precision is overcome, the robustness of hand-eye coordination is improved, and the mechanical arm can complete the on-orbit assembly task under the complicated space environment.
[Description of the Drawings]
In order to illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings needed in the embodiments are briefly described below. Obviously, the drawings in the following description show only some embodiments of the present invention, and other drawings can be obtained from them by those skilled in the art without creative effort.
FIG. 1 is a schematic flow chart of a mechanical arm control method for alignment according to an embodiment of the present invention;
FIG. 2-A is the projection of the feature points on the image plane at the initial moment in an embodiment of the present invention;
FIG. 2-B is a diagram illustrating the configuration of the mechanical arm at the initial moment in an embodiment of the present invention;
FIG. 3-A is the projection of the feature points on the image plane at the final moment in an embodiment of the present invention;
FIG. 3-B is a diagram illustrating the configuration of the mechanical arm at the final moment in an embodiment of the present invention;
FIG. 4-A is a diagram illustrating the motion trajectory of the feature points in the image space during alignment according to an embodiment of the present invention;
FIG. 4-B is a graph comparing the depth estimate with the true value during alignment according to an embodiment of the present invention;
FIG. 4-C is a graph of the image feature errors during alignment according to an embodiment of the present invention;
FIG. 4-D is a diagram illustrating the joint angles of the mechanical arm during alignment according to an embodiment of the present invention;
FIG. 5-A is the motion trajectory of the feature points in the image space when z = 1 in an embodiment of the present invention;
FIG. 5-B is the motion trajectory of the feature points in the image space when z = 10 in an embodiment of the present invention;
FIG. 5-C is a graph of the image feature errors under traditional visual servoing when the inertial parameters of the object to be assembled are uncertain, in an embodiment of the present invention.
[Detailed Description of the Embodiments]
For better understanding of the technical solutions of the present invention, the following detailed descriptions of the embodiments of the present invention are provided with reference to the accompanying drawings.
It should be understood that the described embodiments are only some embodiments of the invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
An embodiment of the present invention provides a mechanical arm control method for alignment. FIG. 1 is a schematic flow chart of the method; as shown in FIG. 1, the method includes the following steps:
Step 101: obtain an image Jacobian matrix reflecting the mapping relation between the image feature space and the mechanical arm joint space.
Specifically, a Jacobian matrix reflecting the mapping relation between the image space and the task space is established first, then a manipulator Jacobian matrix reflecting the mapping relation between the task space and the manipulator joint space is established, and an image Jacobian matrix reflecting the mapping relation between the image space and the manipulator joint space is established on the basis.
For the Jacobian matrix reflecting the mapping relation between the image space and the task space, the following expression is established:

$$\dot{f}_i = J_{ri}\,\dot{r} \qquad (1)$$

which expands to

$$\begin{bmatrix} \dot{u} \\ \dot{v} \end{bmatrix} = \begin{bmatrix} -\frac{\lambda}{z} & 0 & \frac{u}{z} & \frac{uv}{\lambda} & -\frac{\lambda^2+u^2}{\lambda} & v \\ 0 & -\frac{\lambda}{z} & \frac{v}{z} & \frac{\lambda^2+v^2}{\lambda} & -\frac{uv}{\lambda} & -u \end{bmatrix} \dot{r}$$

where $f_i = [u\ v]^T$ is the 2-dimensional image feature vector of feature point i, $\dot{r} = [v^T\ \omega^T]^T$ is the generalized velocity of the camera in the task space, $[u\ v]^T$ are the projection coordinates of the target feature point in the image space, λ is the focal length of the camera, and z is the depth of the target feature point.
Extending to m feature points:

$$\dot{f} = J_r\,\dot{r}, \qquad J_r = [J_{r1}, J_{r2}, \dots, J_{rm}]^T \qquad (2)$$

where $J_r$ is the Jacobian matrix reflecting the mapping relation between the image space and the task space.
For the manipulator Jacobian matrix, the following expression is established:

$$\dot{r} = J_q\,\dot{q} \qquad (3)$$

where $J_q$ is the manipulator Jacobian matrix reflecting the mapping relation between the task space and the mechanical arm joint space, and $\dot{q}$ is the joint velocity of the mechanical arm.
From formula (2) and formula (3), the following expression is obtained:

$$\dot{f} = J_r J_q\,\dot{q} = J\,\dot{q} \qquad (4)$$

where $f \in \mathbb{R}^{2m}$ is the stacked 2-dimensional image feature vector of the m feature points, m is the number of selected feature points, n is the degree of freedom of the mechanical arm, and J is the image Jacobian matrix reflecting the mapping relation between the image space and the mechanical arm joint space:

$$J = J_r J_q \in \mathbb{R}^{2m \times n} \qquad (5)$$
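To make the construction above concrete, the following sketch stacks the per-point interaction matrices of formula (2) and composes them with the manipulator Jacobian as in formula (5). It is an illustrative sketch only, not part of the patent: Python with NumPy is assumed, and the function and variable names are the editor's.

```python
import numpy as np

def interaction_matrix(u, v, z, lam):
    """J_ri of formula (1): maps the camera's generalized velocity
    [v; w] (a 6-vector) to the pixel velocity [du, dv].
    lam is the camera focal length, z the feature depth."""
    return np.array([
        [-lam / z, 0.0, u / z, u * v / lam, -(lam**2 + u**2) / lam, v],
        [0.0, -lam / z, v / z, (lam**2 + v**2) / lam, -u * v / lam, -u],
    ])

def image_jacobian(features, depths, lam, J_q):
    """Full image Jacobian J = J_r J_q of formula (5).
    features : (m, 2) array of pixel coordinates [u, v]
    depths   : length-m array of feature depths z
    J_q      : 6 x n manipulator Jacobian of formula (3)"""
    J_r = np.vstack([interaction_matrix(u, v, z, lam)
                     for (u, v), z in zip(features, depths)])  # (2m, 6)
    return J_r @ J_q                                           # (2m, n)
```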
Step 102: obtain a depth estimator that estimates the depth value of the target feature online, based on the measurement data of the mechanical arm and the image motion.
Specifically, in the Jacobian matrix established by formula (1), which reflects the mapping relation between the image space and the task space, the terms related to the target feature depth are rearranged so as to separate the influence of the camera motion on the image features, and the target feature depth value is then estimated online by the least squares method.
First, the mapping from the image space of a single feature point to the operation space is written from formula (1). The first three columns of the image Jacobian matrix depend on the depth value of the target feature point, so the terms are rearranged as:

$$\frac{1}{z}J_t\,v = \dot{f}_i - J_\omega\,\omega$$

where

$$J_t = \begin{bmatrix} -\lambda & 0 & u \\ 0 & -\lambda & v \end{bmatrix}, \qquad J_\omega = \begin{bmatrix} \frac{uv}{\lambda} & -\frac{\lambda^2+u^2}{\lambda} & v \\ \frac{\lambda^2+v^2}{\lambda} & -\frac{uv}{\lambda} & -u \end{bmatrix}$$

$J_t$ captures the influence of the translational camera motion on the image feature vector and $J_\omega$ the influence of the rotational camera motion. The right-hand side of the equation is the observed optical flow minus the optical flow caused by the camera rotation; the residual optical flow is the flow caused by the camera translation. Solving the above formula for the target feature depth value by least squares gives:

$$\hat{z} = \frac{(J_t v)^T (J_t v)}{(J_t v)^T(\dot{f}_i - J_\omega\,\omega)} \qquad (6)$$
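As a minimal sketch of the estimator of formula (6) (again Python with NumPy assumed; names are the editor's, not the patent's), the depth of one feature point is recovered from the measured optical flow and the camera twist:

```python
import numpy as np

def estimate_depth(u, v, lam, flow, v_cam, w_cam):
    """Least-squares depth estimate of formula (6).
    flow  : measured pixel velocity [du, dv] of the feature point
    v_cam : camera translational velocity (3-vector)
    w_cam : camera rotational velocity (3-vector)"""
    J_t = np.array([[-lam, 0.0, u],
                    [0.0, -lam, v]])
    J_w = np.array([[u * v / lam, -(lam**2 + u**2) / lam, v],
                    [(lam**2 + v**2) / lam, -u * v / lam, -u]])
    a = J_t @ v_cam            # flow per unit inverse depth (translation)
    b = flow - J_w @ w_cam     # observed flow minus rotational flow
    return float(a @ a) / float(a @ b)   # z_hat = (a'a) / (a'b)
```

In practice the estimate can only be updated while the camera translates (a is nonzero), which matches the derivation: the depth is observable only through the translational component of the optical flow.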
and 103, obtaining a kinetic equation of the mechanical arm and the target load.
The dynamic equation of the mechanical arm and the target load is:

$$H(q)\ddot{q} + C(q,\dot{q})\dot{q} + J_q^T F_e = \tau \qquad (7)$$

where H(q) is the positive definite mass inertia matrix, $C(q,\dot{q})\dot{q}$ is the Coriolis and centrifugal force term, τ is the mechanical arm joint torque, $J_q$ is the manipulator Jacobian matrix, $F_e$ is the force output at the end-effector during the assembly operation, and $\ddot{q}$ is the joint acceleration of the mechanical arm.
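For simulation purposes, formula (7) can be integrated forward in time. The sketch below (Python assumed; H, C, J_q and F_e must come from a robot model, which the patent does not prescribe, and the names are the editor's) solves the dynamic equation for the joint acceleration and takes one explicit Euler step:

```python
import numpy as np

def dynamics_step(H, C, J_q, F_e, tau, q, dq, dt):
    """One explicit-Euler step of formula (7):
    H(q) ddq + C(q, dq) dq + J_q^T F_e = tau."""
    ddq = np.linalg.solve(H, tau - C @ dq - J_q.T @ F_e)
    return q + dq * dt, dq + ddq * dt   # next (q, dq)
```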
Step 104: obtain a controller, applicable to alignment assembly by the mechanical arm of an assembly with unknown inertial parameters, from the image Jacobian matrix, the depth estimator, and the dynamic equation.
Specifically, PI control is adopted for the visual servo outer loop, continuously reducing the error between the expected and actual image features to realize alignment tracking; sliding mode control based on the computed torque method is adopted for the visual servo inner loop, controlling the joint torques and reducing the influence of the undetermined inertial parameters of the assembly on the system.
(1) The error of the mechanical arm visual servo alignment system is

$$e_f(t) = f^*(t) - f(t)$$

where $f^*(t) = (u^*, v^*)^T$ is the desired projection of the feature points on the image plane and f(t) is the actual projection. The purpose of the visual servo outer loop control is to obtain a control quantity u(t) that minimizes the objective function

$$E(t) = \frac{1}{2}\,e_f(t)^T e_f(t)$$

Discretizing the control quantity yields the optimal control quantity at time k+1:

$$u(k+1) = \hat{J}_r^{+}\Big(c_1 e_f(k) + c_2 \sum_{i=0}^{k} e_f(i)\Big)$$

where u(k+1) is the servo motion quantity of the camera at time k+1, $u(k+1) = \Delta r(k+1) = r(k+1) - r(k)$, and r(k) is the coordinate parameter of the camera in the task space at time k; $e_f(t)$ is the difference between the expected and actual image feature values; and $c_1$, $c_2$ are respectively the proportional and integral coefficients of the PI controller.
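A minimal sketch of this discrete PI outer loop follows (Python assumed; the reconstruction of the update law above and all names here are the editor's, with the gains c1 = 1.5 and c2 = 0.1 of the embodiment used as defaults):

```python
import numpy as np

class PIVisualServo:
    """Outer-loop PI law: drives the image-feature error e_f to zero
    by commanding a camera motion u(k+1) in the task space."""
    def __init__(self, c1=1.5, c2=0.1):
        self.c1, self.c2 = c1, c2
        self.err_sum = None            # running sum of e_f (integral term)

    def step(self, J_r, f_desired, f_actual):
        e_f = f_desired - f_actual     # e_f(k) = f*(k) - f(k)
        if self.err_sum is None:
            self.err_sum = np.zeros_like(e_f)
        self.err_sum += e_f
        # pseudo-inverse of the (estimated) image Jacobian
        return np.linalg.pinv(J_r) @ (self.c1 * e_f + self.c2 * self.err_sum)
```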
(2) When the inertial parameters of the mechanical arm and the magnitude of the end-effector output force are unknown, the control law is taken as

$$\tau = \hat{H}(q)\,a + \hat{C}(q,\dot{q})\,\dot{q} + J_q^T F_e \qquad (8)$$

where τ is the control torque of the mechanical arm joints; a is the commanded joint acceleration and $\ddot{q}$ the actual output acceleration of the mechanical arm joints; $\hat{H}(q)$ and $\hat{C}(q,\dot{q})$ are the estimates of the positive definite mass inertia matrix H(q) and of the Coriolis and centrifugal force term $C(q,\dot{q})$, calculated with the estimated inertial parameters; $J_q$ is the manipulator Jacobian matrix reflecting the mapping relation between the task space and the mechanical arm joint space; and $F_e$ is the end-effector output force during the assembly operation.
From formula (7) and formula (8):

$$H(q)\ddot{q} + C(q,\dot{q})\dot{q} = \hat{H}(q)\,a + \hat{C}(q,\dot{q})\,\dot{q}$$

namely

$$H(q)\ddot{q} = \hat{H}(q)\,a - \tilde{C}(q,\dot{q})\,\dot{q}$$

where $\tilde{H}(q) = H(q) - \hat{H}(q)$ and $\tilde{C}(q,\dot{q}) = C(q,\dot{q}) - \hat{C}(q,\dot{q})$ each represent the difference between the actual value and the estimated value.
Since $\hat{H}(q)$, computed from the estimated inertial parameters, is invertible, the above formula can be written as

$$a = \ddot{q} + \hat{H}^{-1}(q)\big(\tilde{H}(q)\,\ddot{q} + \tilde{C}(q,\dot{q})\,\dot{q}\big)$$
Defining $e = q_d - q$ and $\dot{e} = \dot{q}_d - \dot{q}$, where $q_d$ and $\dot{q}_d$ are respectively the desired angle and the desired angular velocity of the joints, the sliding mode surface of the sliding mode control is

$$s = \dot{e} + \Lambda e \qquad (9)$$

where $s = [s_1 \dots s_n]^T$ and Λ is a positive diagonal matrix.
Taking

$$a = \ddot{q}_d + \Lambda\dot{e} + \eta\,\mathrm{sgn}(s)$$

then

$$\dot{s} = \ddot{e} + \Lambda\dot{e} = -\eta\,\mathrm{sgn}(s) + \hat{H}^{-1}(q)\big(\tilde{H}(q)\,\ddot{q} + \tilde{C}(q,\dot{q})\,\dot{q}\big)$$

so that, for a sufficiently large approach rate η, $s^T\dot{s} < 0$.
Therefore, the whole control system tends to the sliding mode surface from any initial state, and the reachability condition of the sliding mode is met. The visual servo sliding mode control law obtained from formula (8) and formula (9) is

$$\tau = \hat{H}(q)\big(\ddot{q}_d + \Lambda\dot{e} + \eta\,\mathrm{sgn}(s)\big) + \hat{C}(q,\dot{q})\,\dot{q} + J_q^T F_e \qquad (10)$$

where η is the approach rate of the sliding mode control and sgn(s) is the componentwise sign function.
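The inner loop of formula (10) can be sketched as follows (Python assumed; H_hat and C_hat come from a dynamics model evaluated with the estimated inertial parameters, and all names are the editor's; in practice a boundary-layer or tanh version of sgn is often substituted to reduce chattering):

```python
import numpy as np

def sliding_mode_torque(H_hat, C_hat, J_q, F_e,
                        q, dq, q_d, dq_d, ddq_d, Lam, eta):
    """Computed-torque sliding mode law of formula (10):
    tau = H_hat (ddq_d + Lam de + eta sgn(s)) + C_hat dq + J_q^T F_e
    with sliding surface s = de + Lam e and e = q_d - q."""
    e = q_d - q                         # joint angle error
    de = dq_d - dq                      # joint velocity error
    s = de + Lam @ e                    # sliding surface (formula (9))
    ddq_r = ddq_d + Lam @ de + eta * np.sign(s)
    return H_hat @ ddq_r + C_hat @ dq + J_q.T @ F_e
```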
according to the method provided by the embodiment of the invention, the mechanical arm control method for alignment is simulated, simulation experiment researches are respectively carried out on the traditional visual servo under different depth values, and the results are compared with the method provided by the embodiment of the invention so as to verify the effectiveness of the method.
The simulation uses a 6-degree-of-freedom mechanical arm whose D-H parameters are shown in Table 1. The target object is assumed to be a square with a side length of 0.4 m centered at (0, 0, 2) in the yoz plane, and the pixel coordinates of its 4 vertices are taken as the image features. The joint angles at the initial moment are q = [0, 0.7854, 3.1416, 0, 0.7854, -0.7854] rad. The camera focal length is λ = 8 mm with a resolution of 1024 × 1024 pixels; it is mounted at the end of the mechanical arm with an unknown positional relationship between the two. The desired location of the image features on the image plane is a 250 × 250 pixels square centered at the principal point. The PI controller coefficients are c1 = 1.5 and c2 = 0.1. The desired joint position and velocity for the visual servo sliding mode control are obtained from the image Jacobian (the corresponding expressions are given as images in the original), and the constants on the diagonal of the matrix Λ are 25.
Table 1: D-H parameters of the space manipulator
(The D-H parameter values are provided as an image in the original document.)
Using the technical scheme of the embodiment of the invention to simulate these tasks, the projection of the target feature points on the image plane at the initial moment of alignment assembly is shown in FIG. 2-A, and the initial configuration of the mechanical arm is shown in FIG. 2-B.

At the end of the alignment assembly, the projection of the target feature points on the image plane is shown in FIG. 3-A, and the final configuration of the mechanical arm is shown in FIG. 3-B.

The motion trajectory of the feature points in the image space throughout the alignment assembly process is shown in FIG. 4-A, and the curve of the target depth over time under the depth estimator is shown in FIG. 4-B. The initial value of the estimated depth is assumed to be 0; as can be seen from the figure, the estimated depth rises rapidly, then tracks the actual target depth, and accurately follows the true depth during the convergence of the controller. Since the depth estimate differs greatly from the actual value in the initial stage, the motion direction of the feature points deviates at the start of the trajectory, which is consistent with the actual behavior of a mechanical arm performing visual servo alignment. The image feature error of the target during the visual servo alignment is shown in FIG. 4-C, and the motion angle of each joint of the mechanical arm is shown in FIG. 4-D. Analysis shows that the feature point errors rapidly tend to 0 within 5 s and all stabilize around 0 at 8 s. The final feature point error is on the order of 10^-8 pixels; errors of this order are negligible in actual assembly alignment.
If the conventional visual servo control method is adopted, the simulation results for z = 1 and z = 10 are shown in FIG. 5-A and FIG. 5-B. The motion trajectory of the feature points on the image is no longer a straight line, because the fixed image Jacobian matrix fails to accurately reflect the relationship between the camera motion and the motion of the image feature points. Convergence is much slower for z = 1 than for z = 10: the image Jacobian matrix with z = 1 overestimates the optical flow, so its inverse underestimates the required camera velocity. With z = 10, the camera moves relatively far in each step, forming a jagged path. Comparing the feature point trajectories based on the depth estimator (FIG. 4-A) with those under fixed depth values (FIG. 5-A, FIG. 5-B) shows that the control method provided by the embodiment of the invention markedly improves the motion performance of uncalibrated visual servoing.
FIG. 5-C is the image feature error graph of traditional visual servoing when the inertial parameters of the object to be assembled are uncertain. Comparing FIG. 4-C with FIG. 5-C shows that, when the inertial parameters of the object to be assembled are undetermined, the traditional control method causes a large deviation of the 4 feature points, with a peak error of 127 pixels, far larger than the peak error of the control method designed herein. At 14 s each feature point substantially reaches the specified position, but an error of 2 pixels remains. Owing to the closed-loop nature of the visual servoing, the error of each feature point finally tends to 0 at 16 s, but convergence is much slower than with the control method designed herein. This demonstrates that the control method provided by the embodiment of the invention achieves fast and stable alignment tracking of the target during assembly.
The technical scheme of the embodiment of the invention has the following beneficial effects:
the established image Jacobian matrix model reflects the mapping relation between the image characteristic space and the mechanical arm joint space, the target characteristic can reach the expected position in the image space by controlling the mechanical arm movement, the complicated hand-eye calibration process is omitted, and the calibration result of the internal and external parameters of the camera has strong robustness; the depth estimator for estimating the depth value of the target feature on line based on the measurement data of the mechanical arm and the image motion can quickly and accurately realize the on-line estimation of the depth value of the target feature, solves the problem that the depth information is nonlinearly presented in an image Jacobian matrix in a reciprocal form, and provides powerful theoretical support for self-adaptive control for visual servo; the sliding mode control method based on the calculated torque can realize robust control on the inertial parameter unknown assembly, and has the advantages of quick response, insensitivity to parameter change and disturbance and simple physical realization; the mechanical arm control method for alignment can be widely applied to other on-orbit assembly operation tasks and research fields.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.
Those skilled in the art will appreciate that those matters not described in detail in the present specification are well known in the art.

Claims (6)

1. A robotic arm control method for alignment, the method comprising:
obtaining an image Jacobian matrix of the mapping relation between the image characteristic space and the mechanical arm joint space;
obtaining a target characteristic depth value based on the mechanical arm and the measurement data of the image motion;
obtaining a kinetic equation of the mechanical arm and the target load;
and obtaining a mechanical arm control equation for alignment according to the image Jacobian matrix, the target characteristic depth value and a kinetic equation of the mechanical arm and the target load.
2. The method according to claim 1, wherein the image Jacobian matrix reflecting the mapping relation between the image feature space and the mechanical arm joint space is obtained as:

$$\dot{f} = J\dot{q}, \qquad J = J_r J_q$$

where $f \in \mathbb{R}^{2m}$ collects the 2-dimensional image feature vectors of the m feature points, $J_r$ is the Jacobian matrix reflecting the mapping relation between the image feature space and the task space, $\dot{r} \in \mathbb{R}^6$ is the generalized velocity of the camera in the task space, $J_q$ is the manipulator Jacobian matrix reflecting the mapping relation between the task space and the manipulator joint space, $\dot{q} \in \mathbb{R}^n$ is the joint velocity of the mechanical arm, m is the number of selected feature points, and n is the degree of freedom of the mechanical arm.
3. The method according to claim 2, wherein the Jacobian matrix $J_r$ reflecting the mapping relation between the image feature space and the task space is obtained using the following formula:

$$J_r = [J_{r1}, J_{r2}, \dots, J_{rm}]^T$$

$$J_{ri} = \begin{bmatrix} -\frac{\lambda}{z} & 0 & \frac{u}{z} & \frac{uv}{\lambda} & -\frac{\lambda^2+u^2}{\lambda} & v \\ 0 & -\frac{\lambda}{z} & \frac{v}{z} & \frac{\lambda^2+v^2}{\lambda} & -\frac{uv}{\lambda} & -u \end{bmatrix}$$

where λ is the focal length of the camera, z is the depth of the target feature point, and $[u\ v]^T$ are the projection coordinates of the target feature point in the image feature space.
4. The method according to claim 1 or 3, wherein the target feature depth value obtained based on the measurement data of the mechanical arm and the image motion satisfies

$$\frac{1}{z}J_t\,v = \dot{f} - J_\omega\,\omega$$

where v is the translational velocity of the camera, ω is the rotational velocity of the camera, $J_t$ reflects the influence of the translational camera motion on the image feature vector, and $J_\omega$ reflects the influence of the rotational camera motion on the image feature vector, both derivable from the Jacobian matrix $J_r$ reflecting the mapping relation between the image feature space and the task space; solving by least squares gives

$$\hat{z} = \frac{(J_t v)^T (J_t v)}{(J_t v)^T (\dot{f} - J_\omega\,\omega)}$$
5. The method according to claim 1, wherein the dynamic equation of the mechanical arm and the target load is obtained as:

$$H(q)\ddot{q} + C(q,\dot{q})\dot{q} + J_q^T F_e = \tau$$

where H(q) is the positive definite mass inertia matrix, $C(q,\dot{q})\dot{q}$ is the Coriolis and centrifugal force term, τ is the mechanical arm joint torque, $J_q$ is the manipulator Jacobian matrix, $F_e$ is the force output at the end-effector during the assembly operation, and $\ddot{q}$ is the joint acceleration of the mechanical arm.
6. The method according to claim 1, wherein the mechanical arm control equation for alignment is obtained from the image Jacobian matrix, the target feature depth value, and the dynamic equation of the mechanical arm and the target load as follows:

(1) using PI control theory, the alignment control equation of the mechanical arm in the task space is obtained as

$$u(k+1) = \hat{J}_r^{+}\Big(c_1 e_f(k) + c_2 \sum_{i=0}^{k} e_f(i)\Big)$$

where u(k+1) is the servo motion quantity of the camera at time k+1, $u(k+1) = \Delta r(k+1) = r(k+1) - r(k)$, and r(k) is the coordinate parameter of the camera in the task space at time k; $e_f(t)$ is the difference between the expected and actual image features, $e_f(t) = f^*(t) - f(t)$, where $f^*(t) = (u^*, v^*)^T$ is the expected projection of the feature points on the image plane and f(t) is the actual projection; and $c_1$, $c_2$ are respectively the proportional and integral coefficients of the PI controller;

(2) using sliding mode control based on the computed torque method, the alignment control equation of the mechanical arm in the joint space is obtained as

$$\tau = \hat{H}(q)\big(\ddot{q}_d + \Lambda\dot{e} + \eta\,\mathrm{sgn}(s)\big) + \hat{C}(q,\dot{q})\dot{q} + J_q^T F_e$$

where τ is the control torque of the mechanical arm joints; $\hat{H}(q)$ and $\hat{C}(q,\dot{q})$ are the estimates of the positive definite mass inertia matrix H(q) and of the Coriolis and centrifugal force term $C(q,\dot{q})$, calculated with the estimated inertial parameters; $J_q$ is the manipulator Jacobian matrix reflecting the mapping relation between the task space and the manipulator joint space; $F_e$ is the end-effector output force during assembly; $\ddot{q}_d + \Lambda\dot{e}$ is the desired joint acceleration of the mechanical arm, Λ is a positive diagonal matrix, and $e = q_d - q$ is the difference between the desired and actual joint angles; the sliding mode surface of the sliding mode control is defined as $s = \dot{e} + \Lambda e$; η is the approach rate of the sliding mode control; and sgn(s) is the componentwise sign function.
CN202010422260.3A 2020-05-18 2020-05-18 Mechanical arm control method for alignment Pending CN111546344A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010422260.3A CN111546344A (en) 2020-05-18 2020-05-18 Mechanical arm control method for alignment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010422260.3A CN111546344A (en) 2020-05-18 2020-05-18 Mechanical arm control method for alignment

Publications (1)

Publication Number Publication Date
CN111546344A true CN111546344A (en) 2020-08-18

Family

ID=71998855

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010422260.3A Pending CN111546344A (en) 2020-05-18 2020-05-18 Mechanical arm control method for alignment

Country Status (1)

Country Link
CN (1) CN111546344A (en)


Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030220756A1 (en) * 2002-03-01 2003-11-27 Siemens Aktiengesellschaft Method for calibrating of machine units located in machine tools or robotic devices
CN101430192A (en) * 2007-11-07 2009-05-13 三菱电机株式会社 Method and system for determining 3D pose of object in scene
US20100246899A1 (en) * 2009-03-26 2010-09-30 Rifai Khalid El Method and Apparatus for Dynamic Estimation of Feature Depth Using Calibrated Moving Camera
US20130079928A1 (en) * 2011-09-28 2013-03-28 Universal Robots A/S Calibration and Programming of Robots
CN102944241A (en) * 2012-11-15 2013-02-27 北京理工大学 Spacecraft relative attitude determining method based on multicell liner differential inclusion
CN107097231A (en) * 2017-07-06 2017-08-29 哈尔滨工业大学深圳研究生院 A kind of concentric tube robot precise motion control method of view-based access control model servo
CN108297093A (en) * 2017-12-29 2018-07-20 中国海洋大学 A kind of step identification method of Manipulator Dynamics parameter
CN108748149A (en) * 2018-06-04 2018-11-06 上海理工大学 Based on deep learning without calibration mechanical arm grasping means under a kind of complex environment
CN110722533A (en) * 2018-07-17 2020-01-24 天津工业大学 External parameter calibration-free visual servo tracking of wheeled mobile robot
CN109848984A (en) * 2018-12-29 2019-06-07 芜湖哈特机器人产业技术研究院有限公司 A kind of visual servo method controlled based on SVM and ratio
CN109993113A (en) * 2019-03-29 2019-07-09 东北大学 A kind of position and orientation estimation method based on the fusion of RGB-D and IMU information

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
YUN-HUI LIU; HESHENG WANG; CHENGYOU WANG; KIN KWAN LAM: "Uncalibrated Visual Servoing of Robots Using a Depth-Independent Interaction Matrix", IEEE Transactions on Robotics *
王梓诚: "Image-based uncalibrated target servo control" (in Chinese), China Master's Theses Full-text Database, Information Science and Technology *
王耀兵: Space Robots (in Chinese), Beijing Institute of Technology Press, 31 May 2018 *
P. Corke (科克): Robotics, Vision and Control: Fundamental Algorithms in MATLAB (Chinese edition), Publishing House of Electronics Industry, 31 May 2016 *
钱江, 苏剑波, 席裕庚: "Online estimation of the image Jacobian matrix and uncalibrated 3D motion tracking of robots" (in Chinese), 16th Youth Academic Conference of the Chinese Association of Automation *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112734823A (en) * 2020-12-30 2021-04-30 东北大学 Jacobian matrix depth estimation method based on visual servo of image
CN112734823B (en) * 2020-12-30 2023-10-20 东北大学 Image-based visual servo jacobian matrix depth estimation method
CN112894823A (en) * 2021-02-08 2021-06-04 珞石(山东)智能科技有限公司 Robot high-precision assembling method based on visual servo
CN112894823B (en) * 2021-02-08 2022-06-21 珞石(山东)智能科技有限公司 Robot high-precision assembling method based on visual servo

Similar Documents

Publication Publication Date Title
WO2021180128A1 (en) Rgbd sensor and imu sensor-based positioning method
CN105259786B (en) The inertial parameter discrimination method and device of target to be identified
CN109048890A (en) Coordination method for controlling trajectory, system, equipment and storage medium based on robot
CN111993417B (en) Mechanical arm self-adaptive impedance control method based on RBF neural network
CN107457783B (en) Six-degree-of-freedom mechanical arm self-adaptive intelligent detection method based on PD controller
CN104898642A (en) Integrated test simulation system for spacecraft attitude control algorithm
CN107414827B (en) Six-degree-of-freedom mechanical arm self-adaptive detection method based on linear feedback controller
CN112327942B (en) Automatic leveling method for triaxial air-bearing satellite simulation platform
CN106777656A (en) A kind of industrial robot absolute precision calibration method based on PMPSD
CN108469737B (en) Dynamics control method and system for space non-cooperative target navigation capture
CN107450317A (en) A kind of space manipulator self-adapting power control method for coordinating
CN108267952B (en) Self-adaptive finite time control method for underwater robot
CN109426147B (en) Adaptive gain adjustment control method for combined spacecraft after satellite acquisition
CN111546344A (en) Mechanical arm control method for alignment
CN111618861A (en) Double-follow-up intelligent arm control method based on four-axis structure
CN109623812A (en) Consider the mechanical arm method for planning track of spacecraft ontology attitude motion
CN114942593A (en) Mechanical arm self-adaptive sliding mode control method based on disturbance observer compensation
Shi et al. Modeling and simulation of space robot visual servoing for autonomous target capturing
CN112990549B (en) Space non-cooperative target near-around flight observation track optimization method
CN109764876A (en) The multi-modal fusion localization method of unmanned platform
CN108247636B (en) Parallel robot closed-loop feedback control method, system and storage medium
CN115755939A (en) Four-rotor underwater vehicle forward motion state estimation method
Liu et al. Vision-based path following of snake-like robots
Li et al. Vision-based dynamic positioning system design and experiment for an underwater vehicle
Shi et al. Study on intelligent visual servoing of space robot for cooperative target capturing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication (application publication date: 20200818)