CN112297007A - Linear motion planning method under external reference coordinate system of robot - Google Patents
- Publication number
- CN112297007A (application CN202011137069.0A)
- Authority
- CN
- China
- Prior art keywords
- coordinate system
- external reference
- reference coordinate
- point
- robot
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Numerical Control (AREA)
- Manipulator (AREA)
Abstract
The invention discloses a linear-motion planning method under an external reference coordinate system of a robot. Based on the idea of inverting the workpiece and planning the pose with a virtual external reference coordinate system, the method decomposes a non-intuitive curvilinear motion in space into the superposition of two simple motions. The two simple motions are planned separately and then synthesized into the target curvilinear motion, realizing linear motion of the workpiece relative to the external reference coordinate system. The planning method is simple to implement, enables linear motion under an external reference coordinate system, and expands the field of application of the robot.
Description
Technical Field
The invention relates to a robot motion-trajectory planning method, and in particular to a linear-motion planning method under an external reference coordinate system of a robot.
Background
Generally, Cartesian-space trajectory planning for a robot is referenced to the world coordinate system or a user coordinate system. Such planning specifies the pose of the robot TCP (tool center point), which traces a regular straight-line or circular-arc trajectory in space relative to the world coordinate system. In typical operation, a tool is mounted at the end of the robot, the workpiece is fixed within the robot's reachable workspace, and the robot works on the workpiece according to the application requirements. For some applications, however, such as glue application, the workpiece must be mounted at the end of the robot and moved relative to a stationary glue gun. No matter how the posture of the workpiece changes, its motion trajectory relative to the fixed glue gun must always follow the straight-line or circular-arc contour of the workpiece; the robot TCP then traces a non-obvious space curve in the world coordinate system, for which trajectory planning cannot be performed directly. For gluing applications, motion planning must therefore be carried out relative to an external fixed reference coordinate system, with both the position and the posture of the workpiece expressed relative to that external frame.
Posture planning in the world or user coordinate system is relative to the world coordinate system. In that planning mode, when the postures of the workpiece at the taught start and end points differ, moving the straight-line contour of the workpiece along the external reference coordinate system (the glue gun) requires teaching many intermediate points to fit the contour. The more intermediate points are taught, the better the fit, but teaching becomes more complex and difficult, and a uniform gluing speed cannot be guaranteed. Posture planning in the external reference coordinate system, by contrast, is relative to that external frame: motion planning is performed in the external reference coordinate system, and no matter how the start and end postures of the workpiece change, a linear motion requires teaching only the start point and the end point, which simplifies the teaching operation. When the taught trajectory is reproduced, the workpiece held by the robot moves relative to the external reference coordinate system, so uniform speed is ensured, as shown in Fig. 1.
Linear motion is the most basic trajectory motion of a robot, and realizing linear motion under an externally fixed reference coordinate system extends the robot's field of application, for example to the gluing application above. The paper "A grinding and polishing method for an industrial robot based on dynamic external TCP" (2018, 31(2):20-23) automatically generates dynamic external TCP points by extracting surface-feature information of the workpiece and then obtains the final robot trajectory from the pose-transformation relation between the discrete trajectory points and the dynamic external TCP. The external TCP in that paper is essentially an external reference coordinate system, but the paper does not describe in detail how to plan a straight line under such a coordinate system.
Disclosure of Invention
The invention aims to overcome the above deficiencies of the prior art and provides a linear-motion planning method under an external reference coordinate system of a robot. Based on the idea of inverting the workpiece about the origin of the external reference coordinate system and planning the pose with a virtual external reference coordinate system, the method decomposes a complex motion into the superposition of a translation and a rotation, plans the position and the posture separately, and synthesizes the target motion from them, thereby realizing linear planning under an external fixed reference coordinate system and thus linear motion of the workpiece relative to that frame.
The basic technical idea of the invention is as follows:
the motion of the workpiece is decomposed into linear motion and rotational motion. The linear motion means that the workpiece keeps unchanged in posture with the teaching starting point and translates along the original point of the external reference coordinate system et1 to the middle point of the contact position of the workpiece and the original point of the external reference coordinate system at the teaching end point, in the process, the external reference coordinate system et1 is unchanged, and the TCP point of the robot is converted to the middle point t2' from the teaching starting point t 1. The rotary motion refers to that the workpiece rotates around the original point of the external reference coordinate system to the position with the same posture as the teaching end point at the intermediate point, the TCP point of the robot is transformed to the teaching end point t2 from the intermediate point t2', the position of the original point of the external reference coordinate system is unchanged, but the posture of the external reference coordinate system changes along with the rotation of the workpiece, and when the workpiece rotates to the teaching end point, the external reference coordinate system is transformed from et1 to et 2. et2 is a virtual external reference coordinate system with no practical significance, only for the introduction of interpolation calculations. And the starting point and the end point of the linear motion under the external reference coordinate system are obtained by teaching. 
The known conditions are: the pose of the TCP point t1 of the robot at the taught start point relative to the external reference coordinate system et1 is $^{et1}_{t1}T$, with attitude $^{et1}_{t1}R$ and position $^{et1}p_{t1}$; the pose of the TCP point t2 of the robot at the taught end point relative to et1 is $^{et1}_{t2}T$, with attitude $^{et1}_{t2}R$ and position $^{et1}p_{t2}$; the description of the external reference coordinate system et1 is obtained by coordinate-system calibration in the controller, the pose of et1 relative to the world coordinate system being $^{w}_{et1}T$, with attitude $^{w}_{et1}R$ and position $^{w}p_{et1}$.
The invention discloses a linear motion planning method under an external reference coordinate system of a robot, which comprises the following steps:
Step 1. Inverse transformation for motion decomposition
At the taught end point, the posture of the robot is inverted around the origin of the external reference coordinate system et1 back to an intermediate point having the same posture as the taught start point; the robot TCP point t2 is thereby inverted to a point t2', and the inversion eliminates the posture change of the robot, so that after inversion the robot differs between the intermediate point and the taught start point only in position. A virtual external reference coordinate system et2 is introduced at the taught end point; et2 shares its origin with et1, and when the posture of the robot at the taught end point is inverted around that origin to the intermediate point, et2 coincides with et1.
Step 2. Determine the attitude interpolation start and end points

From the motion characteristics of rigid bodies, $^{et1}_{t2}R = {}^{et1}_{et2}R \, {}^{et2}_{t2}R$, which is solved as $^{et1}_{et2}R = {}^{et1}_{t2}R \, ({}^{et1}_{t1}R)^{-1}$, where $^{et1}_{et2}R$ is the attitude of the virtual external reference coordinate system et2 relative to et1, and $^{et2}_{t2}R = {}^{et1}_{t1}R$ is the attitude of the TCP point t2 of the robot at the taught end point relative to et2. The attitude interpolation start point is the unit matrix $E$ and the attitude interpolation end point is $^{et1}_{et2}R$.
Step 3. Determine the position interpolation start and end points
The linear position interpolation start point is the position of t1 in the external reference coordinate system et1, $^{et1}p_{t1}$, obtained by teaching. The linear position interpolation end point is t2', described in et1 by the position $^{et1}p_{t2'}$. Since $^{et1}_{t2'}T = ({}^{et1}_{et2}T)^{-1}\,{}^{et1}_{t2}T$, where $^{et1}_{t2'}T$ is the pose of the position-interpolation end point t2' in et1, $^{et2}_{t2}T = {}^{et1}_{t2'}T$ is the pose of the taught end point t2 in et2, and $^{et1}_{et2}T$ is the pose of the virtual external reference coordinate system et2 relative to et1, and since et1 and et2 share an origin (a pure rotation), the position-interpolation end point is obtained as $^{et1}p_{t2'} = ({}^{et1}_{et2}R)^{-1}\,{}^{et1}p_{t2}$, the position of the interpolation end point t2' in the external reference coordinate system et1.
Step 4. Calculate the attitude and position interpolation intermediate points

The attitude and the position are interpolated separately, both relative to the external reference coordinate system et1. The changing attitude of et2 relative to et1 is interpolated to obtain the intermediate attitude $^{et1}_{et2}R(i)$, a dynamically changing attitude matrix that varies over the interval from the unit matrix $E$ to $^{et1}_{et2}R$. The space segment from $^{et1}p_{t1}$ to $^{et1}p_{t2'}$ is interpolated to obtain the intermediate position $^{et1}p(i)$. A velocity-planning mode may be chosen freely for both interpolations, e.g., the common trapezoidal or S-curve velocity profiles.
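The velocity planning mentioned above can be folded into a single normalized path parameter s(t) in [0, 1] that drives both the attitude and the position interpolation. Below is a minimal pure-Python sketch of a trapezoidal profile under the assumption of symmetric acceleration and deceleration phases; the function name and parameterization are illustrative, not taken from the patent.

```python
def trapezoidal_s(t, T, t_acc):
    """Normalized trapezoidal path parameter s(t) in [0, 1].

    T     : total motion time
    t_acc : acceleration (= deceleration) time, t_acc <= T / 2
    The cruise rate v is chosen so that s(T) = 1.
    """
    v = 1.0 / (T - t_acc)                     # cruise rate of s
    if t <= 0.0:
        return 0.0
    if t >= T:
        return 1.0
    if t < t_acc:                             # constant-acceleration ramp up
        return 0.5 * v * t * t / t_acc
    if t <= T - t_acc:                        # constant-velocity cruise
        return 0.5 * v * t_acc + v * (t - t_acc)
    tau = T - t                               # ramp down, mirror of ramp up
    return 1.0 - 0.5 * v * tau * tau / t_acc
```

An S-curve profile would replace the constant-acceleration ramps with jerk-limited segments; the interpolation steps themselves are unchanged.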
Step 5. Calculate the TCP pose by motion synthesis
The attitude of the external reference coordinate system at the attitude-interpolation intermediate point, expressed in the world coordinate system, is $^{w}_{et}R(i) = {}^{w}_{et1}R\,{}^{et1}_{et2}R(i)$. The linearly interpolated point position is superposed onto this interpolated attitude, and the attitude and position motions are synthesized to obtain the real-time output pose of the robot TCP point. The position of the final output TCP point relative to the world coordinate system is $^{w}p(i) = {}^{w}_{et}R(i)\,{}^{et1}p(i) + {}^{w}p_{et1}$, where $^{w}_{et}R(i)$ is the attitude of the external reference coordinate system of the intermediate point relative to the world coordinate system. From the motion characteristics of rigid bodies, the attitude of the final output TCP point relative to the world coordinate system is $^{w}R(i) = {}^{w}_{et}R(i)\,{}^{et1}_{t1}R$.
According to the geometric characteristics of linear motion under an external reference coordinate system, the method plans the pose based on the idea of inverting the workpiece and on a virtual external reference coordinate system: it decomposes the non-intuitive curvilinear motion in space into the superposition of two simple motions, plans the two separately, and synthesizes them into the target curvilinear motion, thereby realizing linear motion of the workpiece relative to the external reference coordinate system. The planning method is simple to implement, enables linear motion under an external reference coordinate system, and expands the field of application of the robot.
Drawings
Fig. 1 shows the difference between teaching and reproduction of linear motion in an external reference coordinate system versus the world coordinate system. Diagram a shows teaching in the world coordinate system, where 1 denotes the workpiece and 2 the glue gun; diagram b shows teaching in the external reference coordinate system.
Fig. 2 is a process of resolving linear motion under an external reference coordinate system.
FIG. 3 is a schematic and enlarged illustration of linear motion interpolation in an external reference coordinate system.
Fig. 4 is a flowchart of linear interpolation in the external reference coordinate system.
FIG. 5 shows the position interpolation points for each direction component. Diagram a shows the interpolated positions in the X and Y directions, and diagram b in the X and Z directions.
FIG. 6 shows the position components of the TCP point in the world coordinate system. Diagram a shows the components in the X and Y directions, and diagram b in the X and Z directions.
Fig. 7 shows the posture change of the TCP point in the world coordinate system. The graph a is a change graph of the posture alpha of the TCP point in the world coordinate system, the graph b is a change graph of the posture beta of the TCP point in the world coordinate system, and the graph c is a change graph of the posture gamma of the TCP point in the world coordinate system.
Detailed Description
The method of the present invention will be described in further detail with reference to a specific example.
The invention uses a general six-joint robot as the control object: the workpiece to be glued is fixed on a gripper at the end of the robot, and a glue gun is fixed relative to the ground within the robot's workspace. The robot holds the workpiece and performs the gluing operation at the glue gun, as shown in Fig. 1. An external reference coordinate system is established at the glue gun; the taught start point of the linear motion under this frame is P1 and the end point is P2. During teaching, the straight-line contour of the workpiece to be glued is kept tightly against the origin of the external reference coordinate system.
As shown in Fig. 2, the description of the external reference coordinate system et1 relative to the world coordinate system is obtained by coordinate-system calibration in the controller. The pose of et1 relative to the world coordinate system is $^{w}_{et1}T$ in matrix form, or equivalently the position/Euler-angle tuple $(x_{et1}, y_{et1}, z_{et1}, \alpha_{et1}, \beta_{et1}, \gamma_{et1})$; the two representations are equivalent. Here $x_{et1}, y_{et1}, z_{et1}$ are the position components of the origin of the external reference coordinate system along the coordinate axes of the world coordinate system, in mm; $\alpha_{et1}, \beta_{et1}, \gamma_{et1}$ are ZYX Euler angles of the external reference coordinate system in the world coordinate system, in degrees. The position description is $^{w}p_{et1}$ and the attitude description is $^{w}_{et1}R$.
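As a concrete illustration of the frame description above, the position/ZYX-Euler tuple can be converted to a homogeneous transformation matrix with a short numpy sketch; the function names and the calibration values are invented for illustration, not taken from the patent.

```python
import numpy as np

def rot_zyx(alpha, beta, gamma):
    """ZYX Euler angles (degrees) to rotation matrix: Rz(alpha) @ Ry(beta) @ Rx(gamma)."""
    a, b, g = np.radians([alpha, beta, gamma])
    Rz = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0,        0.0,       1.0]])
    Ry = np.array([[ np.cos(b), 0.0, np.sin(b)],
                   [ 0.0,       1.0, 0.0],
                   [-np.sin(b), 0.0, np.cos(b)]])
    Rx = np.array([[1.0, 0.0,        0.0],
                   [0.0, np.cos(g), -np.sin(g)],
                   [0.0, np.sin(g),  np.cos(g)]])
    return Rz @ Ry @ Rx

def make_transform(x, y, z, alpha, beta, gamma):
    """Homogeneous 4x4 transform from a position (mm) and ZYX Euler angles (deg)."""
    T = np.eye(4)
    T[:3, :3] = rot_zyx(alpha, beta, gamma)
    T[:3, 3] = [x, y, z]
    return T

# Illustrative calibration result for et1 in the world frame (values invented)
T_w_et1 = make_transform(800.0, 0.0, 600.0, 90.0, 0.0, 180.0)
```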
The pose of the robot TCP point t1 at the taught start point P1 relative to the external reference coordinate system et1 is $^{et1}_{t1}T$, where $x_{t1}, y_{t1}, z_{t1}, \alpha_{t1}, \beta_{t1}, \gamma_{t1}$ are the position and Euler-angle representation of t1 in et1; the attitude description is $^{et1}_{t1}R$, the position description is $^{et1}p_{t1}$, and the description of point t1 in the world coordinate system is $^{w}_{t1}T = {}^{w}_{et1}T\,{}^{et1}_{t1}T$. The pose of the robot TCP point t2 at the taught end point P2 relative to et1 is $^{et1}_{t2}T$, where $x_{t2}, y_{t2}, z_{t2}, \alpha_{t2}, \beta_{t2}, \gamma_{t2}$ are the position and Euler-angle representation of t2 in et1; the position description is $^{et1}p_{t2}$, the attitude description is $^{et1}_{t2}R$, and the description of point t2 in the world coordinate system is $^{w}_{t2}T = {}^{w}_{et1}T\,{}^{et1}_{t2}T$.
To realize linear motion in the external reference coordinate system, the idea is to decompose the motion of the workpiece into a linear motion and a rotational motion, as shown in Fig. 2. The linear motion means that the workpiece, keeping the same posture as at the taught start point, translates until the point of the workpiece contour that contacts the origin of the external reference coordinate system et1 at the taught end point reaches that origin; during this process et1 is unchanged and the robot TCP moves from the taught start point t1 to an intermediate point t2'. The rotational motion means that, at the intermediate point, the workpiece rotates around the origin of the external reference coordinate system to the same posture as at the taught end point; the robot TCP moves from t2' to the taught end point t2. The position of the origin is unchanged, but the posture of the external reference coordinate system rotates with the workpiece: when the workpiece reaches the taught end point, the frame has been transformed from et1 to et2. et2 is a virtual external reference coordinate system with no physical significance; it is introduced only for the interpolation calculation.
The specific steps of the invention are as follows, and the flow chart is shown in figure 4.
Step 1. Inverse transformation for motion decomposition
At the taught end point, the posture of the robot is inverted around the origin of the external reference coordinate system et1 back to an intermediate point having the same posture as the taught start point; the robot TCP point t2 is thereby inverted to a point t2', and the inversion eliminates the posture change of the robot, so that after inversion the robot differs between the intermediate point and the taught start point only in position. A virtual external reference coordinate system et2 is introduced at the taught end point; et2 shares its origin with et1, and when the posture of the robot at the taught end point is inverted around that origin to the intermediate point, et2 coincides with et1.
Step 2. Determine the attitude interpolation start and end points

From the kinematic characteristics of rigid bodies, $^{et1}_{t2}R = {}^{et1}_{et2}R\,{}^{et2}_{t2}R$, which is solved as $^{et1}_{et2}R = {}^{et1}_{t2}R\,({}^{et1}_{t1}R)^{-1}$, where $^{et1}_{et2}R$ is the attitude of the virtual external reference coordinate system et2 relative to et1, and $^{et2}_{t2}R = {}^{et1}_{t1}R$ is the attitude of the TCP point t2 of the robot at the taught end point relative to et2. The attitude interpolation start point is the unit matrix $E$ (Euler angles (0,0,0)) and the attitude interpolation end point is $^{et1}_{et2}R$.
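The relation just derived can be sketched in a few lines of numpy; the function and variable names (`R_et1_t1`, etc.) are my own shorthand for the attitudes defined above, not the patent's notation.

```python
import numpy as np

def attitude_interp_endpoint(R_et1_t1, R_et1_t2):
    """Attitude of the virtual frame et2 relative to et1.

    From R_et1_t2 = R_et1_et2 @ R_et2_t2 with R_et2_t2 = R_et1_t1, the
    attitude interpolation end point is R_et1_et2 = R_et1_t2 @ R_et1_t1^T
    (the inverse of a rotation matrix is its transpose).
    """
    return R_et1_t2 @ R_et1_t1.T
```

At the start of the motion the residual rotation is the identity, which is exactly the attitude interpolation start point E.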
Step 3. Determine the position interpolation start and end points
The linear position interpolation start point is the position of t1 in the external reference coordinate system et1, $^{et1}p_{t1}$, obtained by teaching. The linear position interpolation end point is t2', described in et1 by the position $^{et1}p_{t2'}$. Since $^{et1}_{t2'}T = ({}^{et1}_{et2}T)^{-1}\,{}^{et1}_{t2}T$, where $^{et1}_{t2'}T$ is the pose of the position-interpolation end point t2' in et1, $^{et2}_{t2}T = {}^{et1}_{t2'}T$ is the pose of the taught end point t2 in et2, and $^{et1}_{et2}T$ is the pose of the virtual external reference coordinate system et2 relative to et1, and since et1 and et2 share an origin (a pure rotation), the position-interpolation end point is obtained as $^{et1}p_{t2'} = ({}^{et1}_{et2}R)^{-1}\,{}^{et1}p_{t2}$, the position of the interpolation end point t2' in the external reference coordinate system et1.
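Because et1 and et2 share an origin, the position-interpolation end point reduces to rotating the taught end position back through the inverse of the attitude computed in step 2; a minimal numpy sketch, with variable names that are assumptions of mine:

```python
import numpy as np

def position_interp_endpoint(R_et1_et2, p_et1_t2):
    """Position of the inverted point t2' in et1.

    The transform between et1 and et2 is a pure rotation (shared origin),
    so p_et1_t2prime = R_et1_et2^-1 @ p_et1_t2 = R_et1_et2^T @ p_et1_t2.
    """
    return R_et1_et2.T @ p_et1_t2
```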
Step 4. Calculate the attitude and position interpolation intermediate points

As in Fig. 3, the attitude and the position are interpolated separately, both relative to the external reference coordinate system et1. The changing attitude of et2 relative to et1 is interpolated to obtain the intermediate attitude $^{et1}_{et2}R(i)$, a dynamically changing attitude matrix that varies over the interval from the unit matrix $E$ to $^{et1}_{et2}R$. The space segment from $^{et1}p_{t1}$ to $^{et1}p_{t2'}$ is interpolated to obtain the intermediate position $^{et1}p(i)$. A velocity-planning mode may be chosen freely for both interpolations, e.g., the common trapezoidal or S-curve velocity profiles. The position interpolation curve is a straight line relative to the changing external reference coordinate system, see Fig. 5.
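One concrete way to generate the intermediate points — interpolating the attitude from the unit matrix to its end value through the equivalent axis-angle representation, and the position linearly along the segment — is sketched below. The axis-angle scheme is a common choice for single-axis attitude interpolation; it is assumed here rather than prescribed by the patent, and it presumes a rotation angle below 180 degrees.

```python
import numpy as np

def rotvec_from_matrix(R):
    """Rotation vector (axis * angle) of a rotation matrix; assumes angle < pi."""
    angle = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if angle < 1e-12:
        return np.zeros(3)
    axis = np.array([R[2, 1] - R[1, 2],
                     R[0, 2] - R[2, 0],
                     R[1, 0] - R[0, 1]]) / (2.0 * np.sin(angle))
    return axis * angle

def matrix_from_rotvec(v):
    """Rodrigues' formula: rotation matrix from a rotation vector."""
    angle = np.linalg.norm(v)
    if angle < 1e-12:
        return np.eye(3)
    k = v / angle
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def interpolate_pose(R_et1_et2, p_start, p_end, s):
    """Intermediate attitude R(s) from identity to R_et1_et2, and position p(s)
    along the segment, for a normalized path parameter s in [0, 1]
    (s itself may come from a trapezoidal or S-curve velocity profile)."""
    R_s = matrix_from_rotvec(s * rotvec_from_matrix(R_et1_et2))
    p_s = (1.0 - s) * p_start + s * p_end
    return R_s, p_s
```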
Step 5. Calculate the TCP pose by motion synthesis
The attitude of the external reference coordinate system at the attitude-interpolation intermediate point, expressed in the world coordinate system, is $^{w}_{et}R(i) = {}^{w}_{et1}R\,{}^{et1}_{et2}R(i)$. The linearly interpolated point position is superposed onto this interpolated attitude, and the attitude and position motions are synthesized to obtain the real-time output pose of the robot TCP point. The position of the final output TCP point relative to the world coordinate system is $^{w}p(i) = {}^{w}_{et}R(i)\,{}^{et1}p(i) + {}^{w}p_{et1}$, where $^{w}_{et}R(i)$ is the attitude of the external reference coordinate system of the intermediate point relative to the world coordinate system. From the motion characteristics of rigid bodies, the attitude of the final output TCP point relative to the world coordinate system is $^{w}R(i) = {}^{w}_{et}R(i)\,{}^{et1}_{t1}R$. The curve traced by the TCP point in the world coordinate system is an irregular curve; the position is shown in Fig. 6 and the attitude in Fig. 7.
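The synthesis step can be sketched as follows, combining the world-frame description of et1 with the two interpolated quantities; the variable names are illustrative, and the relations follow the reconstruction above.

```python
import numpy as np

def synthesize_tcp(R_w_et1, p_w_et1, R_et1_et2_s, R_et1_t1, p_et1_s):
    """World-frame TCP pose at one interpolation step.

    Frame attitude: R_w_et(s) = R_w_et1 @ R_et1_et2(s)
    TCP position:   p_w(s)    = R_w_et(s) @ p_et1(s) + p_w_et1
    TCP attitude:   R_w(s)    = R_w_et(s) @ R_et1_t1
    (the TCP posture rides on the rotating virtual frame, so its attitude
    relative to that frame stays fixed at the taught start attitude).
    """
    R_w_et = R_w_et1 @ R_et1_et2_s
    p_w = R_w_et @ p_et1_s + p_w_et1
    R_w = R_w_et @ R_et1_t1
    return R_w, p_w
```

At s = 0 this reproduces the taught start pose and at s = 1 the taught end pose, which is a quick sanity check for an implementation.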
Claims (3)
1. A linear motion planning method under an external reference coordinate system of a robot, comprising the following steps:
the pose of the TCP point t1 of the robot at the taught start point relative to the external reference coordinate system et1 being $^{et1}_{t1}T$, with attitude $^{et1}_{t1}R$ and position $^{et1}p_{t1}$; the pose of the TCP point t2 of the robot at the taught end point relative to et1 being $^{et1}_{t2}T$, with attitude $^{et1}_{t2}R$ and position $^{et1}p_{t2}$; the description of et1 being obtained by coordinate-system calibration in the controller, the pose of et1 relative to the world coordinate system being $^{w}_{et1}T$, with attitude $^{w}_{et1}R$ and position $^{w}p_{et1}$;
Step 1. Inverse transformation for motion decomposition
at the taught end point, inverting the posture of the robot around the origin of the external reference coordinate system et1 back to an intermediate point having the same posture as the taught start point, whereby the robot TCP point t2 is inverted to a point t2' and the inversion eliminates the posture change of the robot, so that after inversion the robot differs between the intermediate point and the taught start point only in position; introducing a virtual external reference coordinate system et2 at the taught end point, et2 sharing its origin with et1, such that when the posture of the robot at the taught end point is inverted around that origin to the intermediate point, et2 coincides with et1;
Step 2. Determine the attitude interpolation start and end points
wherein $^{et1}_{et2}R = {}^{et1}_{t2}R\,({}^{et1}_{t1}R)^{-1}$ is the attitude of the virtual external reference coordinate system et2 relative to et1, and $^{et2}_{t2}R = {}^{et1}_{t1}R$ is the attitude of the TCP point t2 of the robot at the taught end point relative to et2; the attitude interpolation start point is the unit matrix $E$ and the attitude interpolation end point is $^{et1}_{et2}R$;
Step 3. Determine the position interpolation start and end points
the linear position interpolation start point being the position of t1 in the external reference coordinate system et1, $^{et1}p_{t1}$, obtained by teaching; the linear position interpolation end point being t2', described in et1 by the position $^{et1}p_{t2'}$; since $^{et1}_{t2'}T = ({}^{et1}_{et2}T)^{-1}\,{}^{et1}_{t2}T$, where $^{et1}_{t2'}T$ is the pose of the position-interpolation end point t2' in et1, $^{et2}_{t2}T = {}^{et1}_{t2'}T$ is the pose of the taught end point t2 in et2, and $^{et1}_{et2}T$ is the pose of the virtual external reference coordinate system et2 relative to et1, and since et1 and et2 share an origin, the position-interpolation end point is obtained as $^{et1}p_{t2'} = ({}^{et1}_{et2}R)^{-1}\,{}^{et1}p_{t2}$, the position of the interpolation end point t2' in the external reference coordinate system et1;
Step 4. Calculate the attitude and position interpolation intermediate points
interpolating the attitude and the position separately, both relative to the external reference coordinate system et1; interpolating the changing attitude of et2 relative to et1 to obtain the intermediate attitude $^{et1}_{et2}R(i)$, a dynamically changing attitude matrix varying over the interval from the unit matrix $E$ to $^{et1}_{et2}R$; interpolating the space segment from $^{et1}p_{t1}$ to $^{et1}p_{t2'}$ to obtain the intermediate position $^{et1}p(i)$;
Step 5. Calculate the TCP pose by motion synthesis
the attitude of the external reference coordinate system of the attitude-interpolation intermediate point in the world coordinate system being $^{w}_{et}R(i) = {}^{w}_{et1}R\,{}^{et1}_{et2}R(i)$; superposing the linearly interpolated point position onto this interpolated attitude and synthesizing the attitude and position motions to obtain the real-time output pose of the robot TCP point; wherein $^{w}_{et1}R$ is the attitude of the external reference coordinate system et1 relative to the world coordinate system;
the position of the final output TCP point relative to the world coordinate system being $^{w}p(i) = {}^{w}_{et}R(i)\,{}^{et1}p(i) + {}^{w}p_{et1}$, where $^{w}_{et}R(i)$ is the attitude of the external reference coordinate system of the intermediate point relative to the world coordinate system; and, from the motion characteristics of rigid bodies, the attitude of the final output TCP point relative to the world coordinate system being $^{w}R(i) = {}^{w}_{et}R(i)\,{}^{et1}_{t1}R$.
2. The linear motion planning method under an external reference coordinate system of a robot according to claim 1, characterized in that:
in step 4, a velocity-planning mode is adopted when interpolating the attitude and the position.
3. The linear motion planning method under an external reference coordinate system of a robot according to claim 2, characterized in that:
the velocity-planning mode is trapezoidal velocity planning or S-curve velocity planning.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011137069.0A CN112297007B (en) | 2020-10-22 | 2020-10-22 | Linear motion planning method under external reference coordinate system of robot |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112297007A true CN112297007A (en) | 2021-02-02 |
CN112297007B CN112297007B (en) | 2021-10-26 |
Family
ID=74328438
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011137069.0A Active CN112297007B (en) | 2020-10-22 | 2020-10-22 | Linear motion planning method under external reference coordinate system of robot |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112297007B (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS62259675A (en) * | 1986-05-06 | 1987-11-12 | Hitachi Ltd | Controller for robot |
CN101053953A (en) * | 2004-07-15 | 2007-10-17 | 上海交通大学 | Method for rapid calibrating hand-eye relationship of single eye vision sensor of welding robot |
CN106079317A (en) * | 2016-04-22 | 2016-11-09 | 苏州超群智能科技有限公司 | A kind of coordinate truss-like injection molding mechanical arm with vision guide positioning function |
CN106671079A (en) * | 2015-11-06 | 2017-05-17 | 中国科学院沈阳计算技术研究所有限公司 | Motion control method for welding robot in coordination with positioner |
CN109454642A (en) * | 2018-12-27 | 2019-03-12 | 南京埃克里得视觉技术有限公司 | Robot coating track automatic manufacturing method based on 3D vision |
US20190193269A1 (en) * | 2017-12-27 | 2019-06-27 | Hanwha Precision Machinery Co., Ltd. | Robot control system and method of controlling a robot |
CN111230880A (en) * | 2020-02-24 | 2020-06-05 | 西安交通大学 | Complex curved surface processing track generation method in offline programming |
-
2020
- 2020-10-22 CN CN202011137069.0A patent/CN112297007B/en active Active
Also Published As
Publication number | Publication date |
---|---|
CN112297007B (en) | 2021-10-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113524157A (en) | Robot system, method, robot arm, and storage medium for configuring copy function | |
CN103934528B (en) | A kind of six-axis linkage interpolating method for spark machined | |
CN113601512B (en) | General avoidance method and system for singular points of mechanical arm | |
CN112453648B (en) | Off-line programming laser welding seam tracking system based on 3D vision | |
JP2006236031A (en) | Robot trajectory controlling method, system, and program for robot trajectory controlling method | |
JP2006099474A (en) | Method for controlling robot locus | |
CN105856231B (en) | A kind of motion control method of particular configuration six-shaft industrial robot | |
CN109648230B (en) | Swing welding method for expanding double-rotation cooperative function shaft based on six-degree-of-freedom robot | |
CN105598975B (en) | A kind of method for determining industrial robot motion track | |
CN109015652A (en) | A kind of control method of robot and the positioner coordinated movement of various economic factors | |
CN109648229B (en) | Swing welding method for expanding double-straight-line cooperative function shaft based on six-degree-of-freedom robot | |
CN111890349A (en) | Four-degree-of-freedom mechanical arm motion planning method | |
CN109773376B (en) | Sine swing welding method of welding robot | |
KR20150142796A (en) | Method and system for controlling elbow of robot | |
CN112297007B (en) | Linear motion planning method under external reference coordinate system of robot | |
CN109129469B (en) | Mechanical arm kinematics inverse solution method and device and mechanical arm | |
JP2009166164A (en) | Industrial robot | |
JP2020171989A (en) | Robot teaching system | |
JP7199073B2 (en) | Teaching data creation system for vertical articulated robots | |
CN112684793B (en) | Trajectory tracking control method for zero-radius over-bending of robot in discrete system | |
CN115808904A (en) | Industrial robot arc auxiliary point passing track planning method | |
CN111761586B (en) | Remote control robot based on big data | |
KR100818059B1 (en) | Method for controlling human arm motion reproduction of humanoid robot | |
JPH06236206A (en) | Control method for industrial robot mounted on turn table | |
KR101312003B1 (en) | Interpolation method of robot comprising r-axis |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||