CN115454097A - Robot end effector working space boundary generation method based on linear programming - Google Patents

Robot end effector working space boundary generation method based on linear programming

Info

Publication number
CN115454097A
CN115454097A (application CN202211236206.5A)
Authority
CN
China
Prior art keywords
joint
robot
end effector
constraint
boundary
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202211236206.5A
Other languages
Chinese (zh)
Inventor
邱蜀伟
邓文平
黄坤
诸明翰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan Shibite Robot Co Ltd
Original Assignee
Hunan Shibite Robot Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan Shibite Robot Co Ltd filed Critical Hunan Shibite Robot Co Ltd
Priority to CN202211236206.5A priority Critical patent/CN115454097A/en
Publication of CN115454097A publication Critical patent/CN115454097A/en
Priority to CN202310209843.1A priority patent/CN116141331A/en
Withdrawn legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1628Programme controls characterised by the control loop
    • B25J9/163Programme controls characterised by the control loop learning, adaptive, model based, rule based expert control
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1612Programme controls characterised by the hand, wrist, grip control
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • B25J9/1666Avoiding collision or forbidden zones
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Numerical Control (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to a robot end effector working space boundary generation method based on linear programming, which comprises the following steps: S1, constructing a kinematic model of the robot motion by representing the joint motions parametrically according to the joint characteristics of the robot; S2, constructing the linear equations of all the constraints required for generating the working space boundary of the robot end effector; S3, determining the linear constraint equation set that the working space boundary of the robot end effector must satisfy; S4, obtaining all robot joint motion parameters that satisfy the linear equations in the constraint equation set; S5, finding, among all robot joint motion parameters satisfying the linear equations of the constraint equation set, the boundary joint motion parameters that also satisfy the assembly, rank deficiency and end effector pose constraints; and S6, identifying and determining the working space boundary points of the robot end effector. When the robot and an industrial production line are laid out, the layout work is reduced to overlapping the industrial production line with the working space of the robot, which greatly simplifies robot deployment.

Description

Robot end effector working space boundary generation method based on linear programming
Technical Field
The invention relates to the technical field of automatic control, in particular to a method for generating a working space boundary of a robot.
Background
With the continuous development of automatic control technology, robots are widely used in many industries, and the various scenes and tasks in which robots operate place different requirements on the positions and postures that the robot end effector must reach. When a robot is laid out together with an industrial production line, the conventional approach is to roughly estimate, from the design drawings of the robot, the range within which the end effector can move while meeting the task requirements. With such an estimate, however, the end effector may still fail to fully meet the task requirements, and since it cannot be determined in advance whether a given robot can satisfy a given task, a technician has to carry out a full-process test by manually teaching points after the robot has been deployed.
Such an approach is only suitable for robots with few joints; for multi-joint robots it requires an extremely large amount of calculation, much of which is redundant.
Therefore, there is a need in the art for a method of obtaining the working space of the end effector from the joint characteristics of the robot, so that, when laying out the robot and an industrial production line, it can be determined whether the robot can reach a given working point and complete a given task.
Disclosure of Invention
In order to solve at least one of the technical problems, the invention provides a robot end effector work space boundary generating method based on linear programming.
The purpose of the invention is realized by the following technical scheme:
the invention provides a robot end effector working space boundary generating method based on linear programming, which comprises the following steps:
s1, representing the joint motion by parameterization according to the joint characteristics of the robot, and constructing a robot kinematics model;
s2, respectively constructing robot assembly constraints, matrix rank deficiency constraints, constraints on the pose of the robot end effector and linear equations of the constraints generated by introducing intermediate variables, which are required by generating the boundary of the working space of the robot end effector;
s3, determining a linear constraint equation set which needs to be met by generating a working space boundary of the robot end effector by combining joint motion parameters and the constructed robot assembly constraint, matrix rank deficiency constraint, constraint on the pose of the robot end effector and a linear equation of constraint generated by introducing intermediate variables;
s4, obtaining all robot joint motion parameters meeting the linear equations in the constraint equation set according to the linear constraint equation set;
s5, finding out boundary joint motion parameters which simultaneously satisfy robot assembly constraint, matrix rank deficiency constraint and robot end effector pose constraint from all robot joint motion parameters which satisfy linear equations of a linear constraint equation set;
and S6, identifying and determining the working space boundary points of the robot end effector from the boundary joint motion parameters by combining the robot assembly constraint.
As a further improvement, in the step S1, a kinematics modeling of the robot motion is constructed by using a parameterized representation of the joint motion according to the joint characteristics of the robot, and the method specifically includes the following steps:
s11, taking a joint point on each joint of the robot, and abstracting all joints of the robot into an abstraction model formed by points and line segments;
s12, representing the motion generated by the current joint by the rotation posture or position change of the next joint point connected with the current joint by parameterization;
and S13, repeating the step S12 for each joint of the robot until the motion of each joint is represented by parameterization to construct kinematic modeling of the motion of the robot.
As a further improvement, in step S12, the parameterization is used to represent the motion generated by the current joint by the change of the rotational posture or the position of the next joint point connected to the current joint, and specifically includes the following steps:
s121, when the current joint is a rotary joint, the motion parameter of the current joint is represented by the rotation posture change of the next joint point by adopting a quaternion parameter;
and S122, when the current joint is a telescopic joint, representing the motion parameter of the current joint by adopting a three-dimensional vector parameter according to the position change of the next joint point.
As a further improvement, in step S2, constructing the robot assembly constraint for generating the working space boundary is specifically: examining each joint of the robot; when the current joint is a rotary joint, the robot assembly constraint condition to be met by the current joint is that, as the rotation posture of the next joint point changes, one direction of that rotation posture always remains aligned with the rotation axis of the current joint; when the current joint is a telescopic joint, the robot assembly constraint condition to be met by the current joint is that, as the position of the next joint point changes, the two direction vectors that are not aligned with the positive telescopic direction of the current joint always remain unchanged.
As a further improvement, in step S2, the linear equation of the constructed matrix rank deficiency constraint is:

(∂A(g_e, g_r)/∂g_r)^T · λ = 0

wherein (∂A(g_e, g_r)/∂g_r)^T denotes the transpose of the matrix of partial derivatives of A(g_e, g_r) with respect to g_r, A(g_e, g_r) denotes the linear equations of the robot assembly constraint, g_e denotes the joint motion parameters of the robot end effector, g_r denotes the joint motion parameters other than those of the robot end effector, and λ denotes a random vector.
As a further improvement, in step S3, the linear constraint equation set to be satisfied for generating the working space boundary of the robot end effector is:

A(g_e, g_r) = 0
(∂A(g_e, g_r)/∂g_r)^T · λ = 0
E(g_e) = 0
C(g_e, g_r, s, b) <= 0

wherein E(g_e) denotes the linear equation of the constraint on the pose of the robot end effector, C(g_e, g_r, s, b) denotes the linear constraints generated by introducing intermediate variables, s denotes the intermediate variables introduced for the quadratic terms composed of joint motion parameters, and b denotes the intermediate variables introduced for the bilinear terms composed of joint motion parameters.
As a further improvement, in the step S4, all robot joint motion parameters satisfying the linear equations in the constraint equation set are obtained according to the linear constraint equation set, which includes the following steps:
S41, finding the maximum and minimum values that each joint motion parameter can take on the premise of satisfying the linear constraint equation set;
S42, setting a threshold and judging whether the widest value range among all the joint motion parameters is wider than the set threshold; if so, bisecting the value range of that joint motion parameter at its midpoint, storing the two resulting groups of parameter value ranges in the queue of joint motion parameters, and keeping the value ranges of the other joint motion parameters unchanged;
and S43, repeating steps S41 and S42 for the next group of parameter value ranges in the joint motion parameter queue until the value ranges of all the joint motion parameters are narrower than the set threshold.
As a further improvement, in step S5, boundary joint motion parameters simultaneously satisfying the robot assembly constraint, the matrix rank deficiency constraint and the constraint on the pose of the robot end effector are found, by a Newton iteration method, from all robot joint motion parameters satisfying the linear equations of the linear constraint equation set.
As a further improvement, in step S6, identifying and determining the working space boundary points of the robot end effector from the boundary joint motion parameters in combination with the robot assembly constraint includes the following steps:
S61, for each group of boundary joint motion parameters, finding the joint motion normal, relative to the robot end effector joint motion parameters, of the high-dimensional geometric figure formed by the robot assembly constraint linear equations at the current boundary joint motion parameters, by the following formula:

n = (∂A(g_e, g_r)/∂g_e)^T · λ*

wherein n denotes the joint motion normal, relative to the robot end effector joint motion parameters, of the high-dimensional geometric figure formed by the robot assembly constraint linear equations at the current boundary joint motion parameters, ∂A(g_e, g_r)/∂g_e is the partial derivative of A(g_e, g_r) with respect to g_e, and λ* denotes the random vector corresponding to the current boundary joint motion parameters;
and S62, setting a judgment function by combining the joint motion normal, and judging the joint motion parameters of each robot end effector through the judgment function to identify and determine the working space boundary points of the end effector.
The invention provides a robot end effector working space boundary generating method based on linear programming, which is characterized by comprising the following steps: s1, constructing kinematic modeling of robot motion by representing joint motion in a parameterization mode according to joint characteristics of the robot; s2, respectively constructing robot assembly constraints, matrix rank deficiency constraints, constraints on the pose of the robot end effector and linear equations of the constraints generated by introducing intermediate variables, which are required by generating the boundary of the working space of the robot end effector; s3, determining a linear constraint equation set which needs to be met by generating a working space boundary of the robot end effector by combining joint motion parameters and the constructed robot assembly constraint, matrix rank deficiency constraint, constraint on the pose of the robot end effector and a linear equation of constraint generated by introducing intermediate variables; s4, obtaining all robot joint motion parameters meeting the linear equations in the constraint equation set according to the linear constraint equation set; s5, finding out boundary joint motion parameters which simultaneously satisfy robot assembly constraint, matrix rank deficiency constraint and robot end effector pose constraint from all robot joint motion parameters which satisfy linear equations of a linear constraint equation set; and S6, identifying and determining the working space boundary point of the robot end effector from the boundary joint motion parameters by combining the robot assembly constraint. In the application process, based on the requirements of the structural size, joint limit and task target of the robot, the boundary points of the working space obtained by the robot end effector on the premise of meeting the task requirements are connected to obtain the working space boundary of the robot end effector, and all constraint conditions for the robot end effector are introduced. When the robot and the industrial production line are arranged, the arrangement work is simplified into the work of overlapping the industrial production line and the robot working space, so that the arrangement work of an automatic scene can be greatly simplified, and the arrangement accuracy can be greatly improved.
Drawings
FIG. 1 is a flow chart of the present invention;
FIG. 2 is a diagram of an abstraction model according to one embodiment of the invention;
fig. 3 is a schematic view of the workspace boundary of the robotic end effector of the present invention.
Detailed Description
Referring to fig. 1, an embodiment of the present invention provides a method for generating a boundary of a working space of a robot end effector based on linear programming, which specifically includes the following steps:
S1, representing the joint motions parametrically according to the joint characteristics of the robot, and constructing a kinematic model of the robot. In this embodiment an articulated robot is taken as an example, but the method is not limited thereto. Specifically, the robot may, but is not limited to, include a plurality of joint axes, and the specific positions, number and types of the joint axes may be set by those skilled in the art according to the application range, field and working performance of the robot. Depending on the functions the robot is to implement, a joint axis may, but is not limited to, be a rotation axis and/or a telescopic axis; two joint axes are connected by a joint arm (i.e., a joint link), and the joint characteristics of the robot include rotation and extension. More specifically, the end effector is an actuating member mounted at the end of the last joint, and may be, but is not limited to, a multi-finger gripper, a paint spray gun, a welding tool or another work tool. More specifically, each joint of the robot may, but is not limited to, be represented mathematically by parameters to realize mathematical modeling; the mathematical modeling may, but is not limited to, use linear programming (LP), a basic mathematical theory and method for studying extreme-value problems of linear objective functions under linear constraints. More specifically, step S1 may, but is not limited to, include the following:
S11, as shown in FIG. 2, a joint point is taken on each joint of the robot; in this embodiment the joint point taken is the joint axis of the robot. All joints of the robot are thereby abstracted into an abstract model composed of points and line segments: each joint comprises a joint point and a joint arm, the joint point (i.e., the location of the joint axis) is represented by a point in the abstract model, and the joint arm is represented by a line segment;
S12, representing the motion generated by the current joint parametrically, through the change of the rotation posture or position of the next joint point connected to the current joint; specifically, this may, but is not limited to, include the following steps:
S121, when the current joint is a rotary joint, because the position of the next joint point connected to the current joint is determined by the rotation posture of that joint point, the motion parameter of the current joint is represented by the change of the rotation posture of the next joint point, using quaternion parameters. As shown in FIG. 2, the joint motion parameters of joint 1 are the change of the rotation posture of joint point P1, expressed with quaternion parameters; the joint motion parameters of joint 2 are the change of the rotation posture of joint point P2, expressed with quaternion parameters; and so on.
And S122, when the current joint is a telescopic joint, the rotating posture of the next joint point connected with the current joint cannot be changed, so that the motion parameter of the current joint is represented by the position change of the next joint point by adopting a three-dimensional vector parameter.
And S13, repeating the step S12 for each joint of the robot until the motion of each joint is represented by parameterization, and constructing a kinematic model of the robot.
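For illustration only (not part of the patent text), the point-and-segment abstraction and the parameterization of steps S11-S13 can be sketched as follows; the class and function names are hypothetical, and the joint layout is an assumed example rather than the robot of FIG. 2.

```python
import numpy as np

# Illustrative sketch of the abstract joint model of steps S11-S13 (assumed names).
# A rotary joint is parameterized by the quaternion of its next joint point;
# a telescopic joint is parameterized by the 3-D position vector of its next joint point.

class JointNode:
    def __init__(self, kind, position):
        assert kind in ("rotary", "telescopic")
        self.kind = kind
        self.position = np.asarray(position, dtype=float)  # joint point (location of the joint axis)
        if kind == "rotary":
            self.params = np.array([1.0, 0.0, 0.0, 0.0])   # quaternion (q0, q1, q2, q3)
        else:
            self.params = self.position.copy()             # three-dimensional vector parameter

def quat_to_rot(q):
    """Rotation matrix of a unit quaternion (q0, q1, q2, q3)."""
    q0, q1, q2, q3 = q
    return np.array([
        [1 - 2*(q2**2 + q3**2), 2*(q1*q2 - q0*q3),     2*(q1*q3 + q0*q2)],
        [2*(q1*q2 + q0*q3),     1 - 2*(q1**2 + q3**2), 2*(q2*q3 - q0*q1)],
        [2*(q1*q3 - q0*q2),     2*(q2*q3 + q0*q1),     1 - 2*(q1**2 + q2**2)],
    ])

# Assumed example chain: two rotary joint points and one telescopic joint point.
chain = [JointNode("rotary", [0.0, 0.0, 0.0]),
         JointNode("rotary", [0.0, 0.0, 0.3]),
         JointNode("telescopic", [0.0, 0.0, 0.6])]
print(quat_to_rot(chain[1].params)[:, 1])   # Y axis of the second joint point's posture
```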
And S2, respectively constructing robot assembly constraints, matrix rank deficiency constraints, constraints on the pose of the robot end effector and linear equations of the constraints generated by introducing intermediate variables, which are required by generating the boundary of the working space of the robot end effector.
Constructing the robot assembly constraints for generating the working space boundary: since the robot is formed by connecting different components through its joints, and each joint connection allows only a limited number of degrees of freedom of motion between adjacent components, the parameters representing the joint motions must satisfy the assembly constraint conditions introduced by the joints, which restrict how those parameters may vary. Specifically, each joint of the robot is examined. When the current joint is a rotary joint, the robot assembly constraint condition to be met by the current joint is that, as the rotation posture of the next joint point changes, one direction of that rotation posture always remains aligned with the rotation axis of the current joint, the direction being defined in the coordinate system of the next joint point. For example, let the rotation postures of joint points P1 and P2 in FIG. 2 be the quaternions (q10, q11, q12, q13) and (q20, q21, q22, q23); their corresponding rotation matrices are:

R1 = [ 1 - 2(q12^2 + q13^2)   2(q11 q12 - q10 q13)   2(q11 q13 + q10 q12)
       2(q11 q12 + q10 q13)   1 - 2(q11^2 + q13^2)   2(q12 q13 - q10 q11)
       2(q11 q13 - q10 q12)   2(q12 q13 + q10 q11)   1 - 2(q11^2 + q12^2) ]

R2 = [ 1 - 2(q22^2 + q23^2)   2(q21 q22 - q20 q23)   2(q21 q23 + q20 q22)
       2(q21 q22 + q20 q23)   1 - 2(q21^2 + q23^2)   2(q22 q23 - q20 q21)
       2(q21 q23 - q20 q22)   2(q22 q23 + q20 q21)   1 - 2(q21^2 + q22^2) ]

Since joint 2 can only rotate about the Y axis of P1, the constraint introduced by joint 2 is that the Y axis of the rotation posture of P2 always coincides with the Y axis of P1, that is:

2(q11 q12 - q10 q13) = 2(q21 q22 - q20 q23)
1 - 2(q11^2 + q13^2) = 1 - 2(q21^2 + q23^2)
2(q10 q11 + q12 q13) = 2(q20 q21 + q22 q23)
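As a purely illustrative numerical check of the rotary-joint assembly constraint above (not part of the patent), the sketch below compares the Y columns of the two quaternion rotation matrices; the function names are hypothetical.

```python
import numpy as np

def y_axis(q):
    """Second column (Y axis) of the rotation matrix of a unit quaternion q = (q0, q1, q2, q3)."""
    q0, q1, q2, q3 = q
    return np.array([2*(q1*q2 - q0*q3),
                     1 - 2*(q1**2 + q3**2),
                     2*(q2*q3 + q0*q1)])

def rotary_assembly_residual(q_prev, q_next):
    """Residual of the constraint that the Y axes of two joint-point postures coincide."""
    return y_axis(q_prev) - y_axis(q_next)

# Example: P2 rotated about the Y axis of P1 by 30 degrees -> residual is (numerically) zero.
theta = np.deg2rad(30.0)
q_P1 = np.array([1.0, 0.0, 0.0, 0.0])
q_P2 = np.array([np.cos(theta / 2), 0.0, np.sin(theta / 2), 0.0])  # rotation about Y
print(rotary_assembly_residual(q_P1, q_P2))  # ~[0. 0. 0.]
```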
When the current joint is a telescopic joint, the robot assembly constraint condition to be met by the current joint is that, as the position (X, Y, Z) of the next joint point changes, the two direction vectors that are not aligned with the positive telescopic direction of the current joint always remain unchanged.
After the assembly constraints introduced by all joints of the robot have been derived, the linear equations of the robot assembly constraint are denoted A(g_e, g_r) = 0, where g_e denotes the joint motion parameters of the robot end effector (for example, the quaternion parameters of joint point P6 in FIG. 2) and g_r denotes the joint motion parameters other than those of the end effector (for example, the quaternion parameters of P1...P5 in FIG. 2). Among all joint motion parameters satisfying A(g_e, g_r) = 0 there are both joint motion parameters corresponding to working space boundary points and joint motion parameters corresponding to working space interior points; most of the joint motion parameters corresponding to interior points need to be removed by introducing the matrix rank deficiency constraint condition.
The linear equation of the constructed matrix rank deficiency constraint is:

(∂A(g_e, g_r)/∂g_r)^T · λ = 0

wherein (∂A(g_e, g_r)/∂g_r)^T denotes the transpose of the matrix of partial derivatives of A(g_e, g_r) with respect to g_r, A(g_e, g_r) denotes the linear equations of the robot assembly constraint, and λ denotes a random vector.
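To make the rank deficiency condition concrete, the sketch below (illustrative only; the Jacobian shown is a stand-in, not the actual assembly-constraint Jacobian) finds a vector λ in the null space of the transposed Jacobian via an SVD; a nonzero λ exists exactly when the matrix is rank-deficient.

```python
import numpy as np

def rank_deficiency_vector(J_r, tol=1e-8):
    """Return a unit vector lam with J_r.T @ lam ~ 0, or None if no such vector exists.

    J_r plays the role of the partial-derivative matrix of the assembly constraints with
    respect to the non-end-effector joint motion parameters; lam is the 'random vector'
    appearing in the rank deficiency constraint.
    """
    U, s, Vh = np.linalg.svd(J_r.T)          # SVD of the transposed Jacobian
    rank = int(np.sum(s > tol))
    if rank == Vh.shape[0]:                  # trivial null space: J_r is not rank-deficient
        return None
    return Vh[rank]                          # any row of Vh beyond the rank spans the null space

# Toy stand-in Jacobian with rank 2 (NOT the patent's assembly-constraint Jacobian):
J_r = np.array([[1.0, 0.0, 0.0],
                [0.0, 1.0, 0.0],
                [2.0, 3.0, 0.0]])
lam = rank_deficiency_vector(J_r)
print(lam, J_r.T @ lam)                      # J_r.T @ lam ~ [0, 0, 0]
```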
The linear equation of the constructed constraint on the pose of the end effector is denoted E(g_e) = 0.
constructing a linear equation of the constraint brought by the introduced intermediate variables: since linear programming requires all constraint conditions to be linear equations, secondary variables such as the order of matrix and pose of the end effector of the robot can be introduced when constructing assembly constraint of the robot, matrix rank deficiency constraint and constraint on the pose of the end effector of the robot
Figure BDA00038835521700000816
And bilinear variablesSuch as g i g j . To linearize all the system of constraint equations, we introduce intermediate variables and some constraint conditions to specify the relationship between the newly introduced intermediate variables and the original parameters.
For the second order variable
Figure BDA00038835521700000814
We introduce an intermediate variable s i Make it
Figure BDA00038835521700000815
Suppose g i Is in the value range of [ l i ,u i ]The following three constraint equations need to be introduced to constrain s i And g i In relation to each other, i.e.
Figure BDA0003883552170000082
Figure BDA0003883552170000083
For bilinear variables g i g j Introducing an intermediate variable b ij Let b be ij =g i g j Let g be i And g j Are respectively [ l i ,u i ]And [ l j ,u j ]The following four constraint equations need to be introduced to constrain b ij And g i And g i In relation to each other, i.e. b ij =g i g j
Figure BDA0003883552170000091
From the above description, all linear equations introducing constraints generated by intermediate variables are written as:
Figure BDA0003883552170000092
wherein the content of the first and second substances,
Figure BDA0003883552170000093
intermediate variables that represent the introduction of secondary variables composed of joint motion parameters (e.g.,
Figure BDA0003883552170000094
),
Figure BDA0003883552170000095
representing intermediate variables introduced by bilinear variables composed of joint-motion parameters (e.g. b) ij =g i g j )。
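The sketch below (illustrative, not part of the patent) builds the rows of an inequality system A_ub · x <= b_ub for one quadratic and one bilinear intermediate variable; the constraint forms used are the standard McCormick-type envelopes, assumed here to match the three and four constraints described above, and the function names are hypothetical.

```python
import numpy as np

def square_envelope_rows(l, u):
    """Rows A, b with A @ (g, s) <= b linearizing s = g**2 for g in [l, u]
    (one secant upper bound and two tangent lower bounds)."""
    A = np.array([[-(l + u),  1.0],    # s <= (l + u) g - l u
                  [ 2.0 * l, -1.0],    # s >= 2 l g - l^2
                  [ 2.0 * u, -1.0]])   # s >= 2 u g - u^2
    b = np.array([-l * u, l * l, u * u])
    return A, b

def bilinear_envelope_rows(li, ui, lj, uj):
    """Rows A, b with A @ (gi, gj, bij) <= b linearizing bij = gi * gj
    for gi in [li, ui], gj in [lj, uj] (four McCormick rows)."""
    A = np.array([[ lj,  li, -1.0],    # bij >= lj gi + li gj - li lj
                  [ uj,  ui, -1.0],    # bij >= uj gi + ui gj - ui uj
                  [-uj, -li,  1.0],    # bij <= uj gi + li gj - li uj
                  [-lj, -ui,  1.0]])   # bij <= lj gi + ui gj - ui lj
    b = np.array([li * lj, ui * uj, -li * uj, -ui * lj])
    return A, b

# Example: a quaternion element g in [-1, 1] and its square s.
A_sq, b_sq = square_envelope_rows(-1.0, 1.0)
print(A_sq, b_sq)
```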
S3, determining a linear constraint equation set which needs to be met by generating a working space boundary of the robot end effector by combining joint motion parameters, the constructed robot assembly constraint, the matrix rank deficiency constraint, the constraint on the pose of the robot end effector and a linear equation of the constraint generated by introducing an intermediate variable, and setting an initial value range for each parameter respectively, for example, in the embodiment, the joint motion parameters for representing the rotary posture are quaternion parameters, wherein the initial value range of each element can be set to be [ -1,1];
the linear constraint equation set is:

A(g_e, g_r) = 0
(∂A(g_e, g_r)/∂g_r)^T · λ = 0
E(g_e) = 0
C(g_e, g_r, s, b) <= 0

wherein E(g_e) denotes the linear equation of the constraint on the pose of the robot end effector and C(g_e, g_r, s, b) denotes the linear constraints generated by introducing intermediate variables.
S4, obtaining, according to the linear constraint equation set, all robot joint motion parameters that satisfy the linear equations in the constraint equation set, comprising the following steps:
S41, narrowing the value ranges of the joint motion parameters step by step with a pruning method based on linear programming, that is, finding the maximum and minimum values that each joint motion parameter can take on the premise of satisfying the linear constraint equation set. Specifically, for each joint motion parameter, two linear programming problems are defined to find, respectively, the minimum and the maximum value that the current parameter can take while satisfying all the constraint conditions in the constraint equation set. For example, to prune the value range of a joint motion parameter g_i, the following two linear programming problems are solved:

minimize g_i subject to the linear constraint equation set and the current value ranges of all parameters;

maximize g_i subject to the linear constraint equation set and the current value ranges of all parameters.

By solving the above minimization and maximization problems, the lower and upper limits of the value range of g_i are updated respectively.
And S42, setting a threshold and judging whether the widest value range among all the joint motion parameters is wider than the set threshold. Specifically, after all the joint motion parameters have been pruned, if the widest value range among them is still wider than the set threshold, the value range of that joint motion parameter is bisected at its midpoint with a branching method, the two resulting groups of parameter value ranges are stored in the queue of joint motion parameters, and the value ranges of the other joint motion parameters are kept unchanged;
and S43, repeating the steps S41 and S42 for the value range of the next group of parameters in the joint motion parameter queue until the value ranges of all the joint motion parameters are lower than the set threshold value.
S5, after the pruning and branching operations, a number of different groups of value ranges of all the joint motion parameters satisfying the linear constraint equation set (that is, of g_e and g_r) are obtained. A Newton iteration method is then used to find, among all robot joint motion parameters satisfying the linear equations of the linear constraint equation set, the boundary joint motion parameters that simultaneously satisfy the robot assembly constraint, the matrix rank deficiency constraint and the constraint on the pose of the robot end effector. The Newton iteration method is a method for approximately solving equations over the real and complex domains. In this step, because the value range of each joint motion parameter has already been reduced below the set threshold in step S4, a solution satisfying all the constraint conditions can be obtained rapidly by the Newton iteration.
And S6, identifying and determining the working space boundary points of the robot end effector from the boundary joint motion parameters in combination with the robot assembly constraint. The matrix rank deficiency constraint condition is only a necessary condition for a group of joint motion parameter values to be a working space boundary point, so in this step each group of joint motion parameter values must be examined more precisely to determine whether it is a working space boundary point of the robot end effector. The step specifically comprises the following:
S61, for each group of boundary joint motion parameters, this embodiment taking one group (g_e*, g_r*) as an example, finding the joint motion normal n, relative to the robot end effector joint motion parameters g_e, of the high-dimensional geometric figure formed by the robot assembly constraint linear equations A(g_e, g_r) = 0 at the current boundary joint motion parameters (g_e*, g_r*). The specific formula is:

n = (∂A(g_e, g_r)/∂g_e)^T · λ*

wherein ∂A(g_e, g_r)/∂g_e is the partial derivative of A(g_e, g_r) with respect to g_e, and λ* is the random vector corresponding to the current boundary joint motion parameters, obtained from the matrix rank deficiency constraint equation (∂A(g_e, g_r)/∂g_r)^T · λ = 0 evaluated at the current boundary joint motion parameters;
S62, setting a judgment function in combination with the joint motion normal. In this embodiment, let g_e' be any group of end effector joint motion parameters adjacent to g_e* that satisfies the robot assembly constraint linear equations, and define the judgment function F(g_e') = n · (g_e' - g_e*). Each group of robot end effector joint motion parameters is judged through this function to identify and determine the working space boundary points of the end effector: if the sign of F(g_e') is determined, that is, F(g_e') is always positive or always negative over the adjacent parameters g_e', then g_e* is a working space boundary point of the robot end effector; if the sign of F(g_e') cannot be determined, then g_e* is not a working space boundary point of the robot end effector.
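A sketch of the boundary test of steps S61-S62 (illustrative only; the normal formula and the judgment function follow the reconstruction given above, which is an assumption about the original formulas, and all names are hypothetical).

```python
import numpy as np

def boundary_normal(J_e, lam):
    """Joint motion normal n = (dA/dg_e)^T @ lam at a candidate boundary point."""
    return J_e.T @ lam

def is_boundary_point(g_e_star, neighbor_params, normal, eps=1e-9):
    """Sign test of the judgment function F(g_e') = n . (g_e' - g_e*): if the sign is
    determined (all non-negative or all non-positive) over the adjacent feasible
    end-effector parameters, g_e* is treated as a working space boundary point."""
    values = [float(np.dot(normal, np.asarray(g) - np.asarray(g_e_star)))
              for g in neighbor_params]
    return all(v >= -eps for v in values) or all(v <= eps for v in values)

# Toy 2-D example: all neighbors lie on one side of the normal direction -> boundary point.
normal = np.array([0.0, 1.0])
g_e_star = np.array([0.0, 0.0])
neighbors = [np.array([0.3, -0.2]), np.array([-0.4, -0.1]), np.array([0.1, 0.0])]
print(is_boundary_point(g_e_star, neighbors, normal))   # True
```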
In the application process, based on the requirements of the structure size, the joint limit and the task target of the robot, the boundary points of the working space obtained by the robot end effector on the premise of meeting the task requirement are connected to obtain the working space boundary of the robot end effector, and all constraint conditions for the robot end effector are introduced. When the robot and the industrial production line are arranged, the arrangement work is simplified into the work of overlapping the industrial production line and the robot working space, so that the arrangement work of an automatic scene can be greatly simplified, and the arrangement accuracy can be greatly improved.
The present invention is also applicable to any robot including, but not limited to, industrial robots, parallel robots, redundant robots, dual arm robots, or multi-fingered robots, among others.
All possible combinations of the technical features of the above embodiments may not be described for the sake of brevity, but should be considered as within the scope of the present disclosure as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments only express several embodiments of the present invention, and the description thereof is more specific and detailed, but not construed as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the inventive concept, which falls within the scope of the present invention. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (9)

1. A robot end effector work space boundary generation method based on linear programming is characterized by comprising the following steps:
S1, representing the joint motion by parameterization according to the joint characteristics of the robot, and constructing a robot kinematic model;
s2, respectively constructing robot assembly constraints, matrix rank deficiency constraints, constraints on the pose of the robot end effector and linear equations of the constraints generated by introducing intermediate variables, which are required by generating the boundary of the working space of the robot end effector;
s3, determining a linear constraint equation set which needs to be met by generating a working space boundary of the robot end effector by combining joint motion parameters and the constructed robot assembly constraint, matrix rank deficiency constraint, constraint on the pose of the robot end effector and a linear equation of constraint generated by introducing intermediate variables;
s4, acquiring robot joint motion parameters meeting linear equations in a constraint equation set according to the linear constraint equation set;
s5, boundary joint motion parameters which simultaneously satisfy robot assembly constraint, matrix rank deficiency constraint and robot end effector pose constraint are found out from robot joint motion parameters which satisfy linear equations of a linear constraint equation set;
and S6, identifying and determining the working space boundary point of the robot end effector from the boundary joint motion parameters by combining the robot assembly constraint.
2. The method for generating boundaries of a robot end effector working space based on linear programming according to claim 1, wherein in the step S1, the kinematics modeling of the robot motion is constructed by using a parameterized representation of the joint motion according to the joint characteristics of the robot, and specifically comprises the following steps:
s11, taking a joint point on each joint of the robot, and abstracting all joints of the robot into an abstraction model consisting of points and line segments;
s12, representing the motion generated by the current joint by parameterization according to the rotation attitude or position change of the next joint point connected with the current joint;
and S13, repeating the step S12 for each joint of the robot until the motion of each joint is represented by parameterization, and constructing a kinematic model of the motion of the robot.
3. The method for generating boundaries of a working space of a robot end effector based on linear programming according to claim 2, wherein in the step S12, the motion generated by the current joint is represented by parameterization through the rotation attitude or position change of the next joint point connected to the current joint, which comprises the following steps:
s121, when the current joint is a rotary joint, the motion parameter of the current joint is represented by the rotation attitude change of the next joint point by adopting a quaternion parameter;
and S122, when the current joint is a telescopic joint, the motion parameter of the current joint is represented by the position change of the next joint point by adopting a three-dimensional vector parameter.
4. The method for generating the boundary of the working space of the robot end effector based on the linear programming according to claim 3, wherein in the step S2, constructing the robot assembly constraint for generating the boundary of the working space is specifically: judging each joint of the robot; when the current joint is a rotary joint, the robot assembly constraint condition to be met by the current joint is that, when the rotary attitude of the next joint point changes, one direction of that attitude always remains consistent with the rotary axis of the current joint; when the current joint is a telescopic joint, the robot assembly constraint condition to be met by the current joint is that, when the position of the next joint point changes, the two vectors which are not aligned with the positive telescopic direction of the current joint always remain unchanged.
5. The method for generating the boundary of the working space of the robot end effector based on the linear programming according to claim 1, wherein the linear equation of the constructed matrix rank deficiency constraint in the step S2 is:

(∂A(g_e, g_r)/∂g_r)^T · λ = 0

wherein (∂A(g_e, g_r)/∂g_r)^T denotes the transpose of the matrix of partial derivatives of A(g_e, g_r) with respect to g_r, A(g_e, g_r) denotes the linear equations of the robot assembly constraint, g_e denotes the joint motion parameters of the robot end effector, g_r denotes the other joint motion parameters in addition to the robot end effector, and λ denotes a random vector.
6. The method for generating the boundary of the working space of the robot end effector based on the linear programming according to claim 1, wherein in the step S3, the linear constraint equation set to be satisfied for generating the boundary of the working space of the robot end effector is as follows:

A(g_e, g_r) = 0
(∂A(g_e, g_r)/∂g_r)^T · λ = 0
E(g_e) = 0
C(g_e, g_r, s, b) <= 0

wherein E(g_e) denotes the linear equation of the constraint on the pose of the robot end effector, C(g_e, g_r, s, b) denotes the linear constraints generated by introducing intermediate variables, s denotes the intermediate variables introduced for the quadratic terms composed of joint motion parameters, and b denotes the intermediate variables introduced for the bilinear terms composed of joint motion parameters.
7. The method for generating the boundary of the working space of the robot end effector based on the linear programming according to claim 1, wherein in the step S4, all the robot joint motion parameters satisfying the linear equations in the constraint equation set are obtained according to the linear constraint equation set, and the method comprises the following steps:
S41, finding the maximum and minimum values that each joint motion parameter can take on the premise of satisfying the linear constraint equation set;
S42, setting a threshold, and judging whether the widest value range among all the joint motion parameters is wider than the set threshold; if so, bisecting the value range of that joint motion parameter at its midpoint, storing the two resulting groups of parameter value ranges in the queue of joint motion parameters, and keeping the value ranges of the other joint motion parameters unchanged;
and S43, repeating the steps S41 and S42 for the next group of parameter value ranges in the joint motion parameter queue until the value ranges of all the joint motion parameters are narrower than the set threshold.
8. The method for generating the boundary of the working space of the robot end effector based on the linear programming according to claim 7, wherein in the step S5, boundary joint motion parameters simultaneously satisfying the robot assembly constraint, the matrix rank deficiency constraint and the constraint on the pose of the robot end effector are found out, by a Newton iteration method, from all the robot joint motion parameters satisfying the linear equations of the linear constraint equation set.
9. The method for generating the boundary of the working space of the robot end effector based on the linear programming according to claim 7, wherein the step S6 of identifying and determining the boundary points of the working space of the robot end effector from the boundary joint motion parameters in combination with the robot assembly constraint comprises the following steps:
S61, for each group of boundary joint motion parameters, finding the joint motion normal, relative to the robot end effector joint motion parameters, of the high-dimensional geometric figure formed by the robot assembly constraint linear equations at the current boundary joint motion parameters, by the following formula:

n = (∂A(g_e, g_r)/∂g_e)^T · λ*

wherein n denotes the joint motion normal, relative to the robot end effector joint motion parameters, of the high-dimensional geometric figure formed by the robot assembly constraint linear equations at the current boundary joint motion parameters, ∂A(g_e, g_r)/∂g_e is the partial derivative of A(g_e, g_r) with respect to g_e, and λ* denotes the random vector corresponding to the current boundary joint motion parameters;
and S62, setting a judgment function by combining the joint motion normal, and judging the joint motion parameters of each robot end effector through the judgment function to identify and determine the working space boundary points of the end effector.
CN202211236206.5A 2022-10-10 2022-10-10 Robot end effector working space boundary generation method based on linear programming Withdrawn CN115454097A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202211236206.5A CN115454097A (en) 2022-10-10 2022-10-10 Robot end effector working space boundary generation method based on linear programming
CN202310209843.1A CN116141331A (en) 2022-10-10 2023-03-07 Robot end effector working space boundary generation method based on linear programming

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211236206.5A CN115454097A (en) 2022-10-10 2022-10-10 Robot end effector working space boundary generation method based on linear programming

Publications (1)

Publication Number Publication Date
CN115454097A true CN115454097A (en) 2022-12-09

Family

ID=84309033

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202211236206.5A Withdrawn CN115454097A (en) 2022-10-10 2022-10-10 Robot end effector working space boundary generation method based on linear programming
CN202310209843.1A Pending CN116141331A (en) 2022-10-10 2023-03-07 Robot end effector working space boundary generation method based on linear programming

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202310209843.1A Pending CN116141331A (en) 2022-10-10 2023-03-07 Robot end effector working space boundary generation method based on linear programming

Country Status (1)

Country Link
CN (2) CN115454097A (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117207200A (en) * 2023-11-09 2023-12-12 湖南视比特机器人有限公司 Method and device for generating working space of mechanical arm and computer equipment

Also Published As

Publication number Publication date
CN116141331A (en) 2023-05-23

Similar Documents

Publication Publication Date Title
CN109895101B (en) Unique solution method for inverse kinematics numerical value of joint type mechanical arm
CN107443375B (en) Robot origin calibration method, apparatus, storage medium and computer equipment
CN113319857B (en) Mechanical arm force and position hybrid control method and device, electronic equipment and storage medium
Chirikjian et al. Pose changes from a different point of view
JP2008546099A (en) Kinematic singularity compensation system and method
CN116038702B (en) Seven-axis robot inverse solution method and seven-axis robot
Celikag et al. Cartesian stiffness optimization for serial arm robots
Lambert et al. A novel parallel haptic device with 7 degrees of freedom
CN115454097A (en) Robot end effector working space boundary generation method based on linear programming
Wen et al. Singularities in three-legged platform-type parallel mechanisms
CN113715016A (en) Robot grabbing method, system and device based on 3D vision and medium
CN114818165A (en) Planning method and system for milling large complex parts by robot
CN109366486B (en) Flexible robot inverse kinematics solving method, system, equipment and storage medium
CN116330267A (en) Control method based on industrial robot wrist singular point calculation
Djuric et al. Graphical representation of the significant 6R KUKA robots spaces
CN113954070B (en) Mechanical arm motion control method and device, storage medium and electronic equipment
CN115933374A (en) Industrial robot load parameter static identification and pose identification optimization method
CN117325143A (en) Redundant mechanical arm singular position type lower kinematics optimization method
CN114536351A (en) Redundant double-arm robot teaching method and device, electronic equipment and system
CN108555904B (en) Method for optimizing operation performance of surface modification robot
CN112847441A (en) Six-axis robot coordinate offset detection method and device based on gradient descent method
Aydin et al. Genetic algorithm based redundancy resolution of robot manipulators
Parkin An interactive robotic simulation package
Fraczek et al. Calibration of multi-robot system without and under load using electronic theodolites
Guo Multi-degree-of-freedom robot arm motion simulation based on MATLAB

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication (application publication date: 20221209)