CN112847366A - Force and position hybrid teaching robot system and teaching method - Google Patents
- Publication number
- CN112847366A (application number CN202110018489.5A)
- Authority
- CN
- China
- Prior art keywords
- robot
- force
- information
- teaching
- moment
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Automation & Control Theory (AREA)
- Manipulator (AREA)
Abstract
The invention provides a force and position hybrid teaching robot system and a teaching method. The teaching robot system includes: a robot body; an end sensor connected to the end of the robot body for measuring force or moment; and a control system configured to collect, when the robot body is taught, position information of the robot body and force/torque information obtained by the end sensor, and to establish a correspondence between the position information and the force/torque information. The beneficial effects of the invention are: (1) the teaching efficiency of the robot is higher, especially for parts with complex curved surfaces; (2) the experience of a skilled operator, in particular the contact force between the tool and the workpiece during operation, can be taught, giving the robot the skilled operator's experience.
Description
Technical Field
The invention relates to a teaching robot system and a teaching method, in particular to a force and position hybrid teaching robot system and a teaching method, and belongs to the technical field of robots.
Background
Patent No. CN103909516A discloses a robot teaching system including a sensor, a screen generating section, an adjusting section, and a task generating section. A force/torque sensor is mounted at the end of the robot to acquire and store the contact force signal during teaching and to serve as a feedback signal for controlling the robot's motion during operation. However, teaching is still completed through a teach pendant rather than by simpler dragging.
Patent application No. CN109483556A discloses a robot teaching system comprising a mechanical arm, a six-dimensional force/torque sensor, a polishing head and a PC host computer. In its teaching process, the polishing trajectory is first taught manually; the robot then repeats the taught trajectory while collecting the contact force during the motion of the polishing head, and a force-position relation model of the robot polishing process is established. This system does not provide hybrid teaching of force and position, and the polishing force is not taught by a technician skilled in the polishing process.
The invention patent with application number CN109434843A discloses a device and method for robot force-controlled blade polishing based on drag teaching; the system comprises a robot, a six-dimensional force sensor, a polishing mechanism, a positioner and a robot control system. The polishing trajectory is taught by dragging, while the polishing force is realized by a force control module; there is no force/position hybrid teaching mode.
Existing robot systems realize force/position hybrid control through the following teaching methods:
(1) Offline programming: the three-dimensional model of the part to be processed is imported into offline programming software and the robot is programmed offline; according to the process requirements of the part, force control is realized in one or more directions and position control in the remaining directions.
(2) Online programming: after the trajectory is programmed online, the robot realizes force control in one or more directions during operation.
(3) Drag teaching: the robot trajectory is programmed by drag teaching, and the force in one or more directions is controlled online, as in the three patents mentioned above.
These are the three existing teaching modes for realizing force/position hybrid control. Method (1) is relatively simple at the programming stage, but the absolute accuracy of the robot must be calibrated and the relative relation between the robot coordinate system and the workpiece coordinate system must be accurate. Method (2) spends a large amount of time in the online programming and teaching stage; in particular, for parts with complex curved surfaces, the surfaces must be discretized to teach the trajectory. Method (3) alleviates the problems of methods (1) and (2). However, all three methods teach only the robot trajectory, that is, the robot position; they cannot teach the robot's applied force, which must still be programmed by a programmer, so convenience of use needs further improvement.
All three methods require the person doing the teaching to have a certain level of skill with the robot, the teaching content varies with the task, and complicated re-programming is needed whenever the workpiece to be operated changes. Meanwhile, the experience of a skilled worker regarding workpiece processing force is difficult to introduce into the robot system, and force control requires many trials to obtain suitable force data.
Disclosure of Invention
The technical problem to be solved by the invention is that the experience of skilled workers with workpiece processing forces is difficult to introduce into robot systems.
In order to solve the above-described technical problem, a first aspect of the present invention provides a teaching robot system including:
a robot body;
an end sensor connected to the end of the robot body for measuring force or moment;
a control system configured to: when the robot body is taught, the control system collects position information of the robot body and force/torque information obtained by the end sensor, and establishes a corresponding relation between the position information and the force/torque information.
In some embodiments, when the robot body is taught, a drag teaching process is included.
In some embodiments, the robot body includes a plurality of robot joints, each robot joint mounting a joint motor and a joint sensor, each joint sensor measuring a moment of the corresponding joint.
In some embodiments, the joint motor and joint sensor are configured to control the corresponding joint motor to generate a resisting torque according to the torque measured by the joint sensor, so as to counteract the gravity torque of the robot.
In some embodiments, the robot body comprises a plurality of robot joints, each robot joint is provided with a joint motor, a reducer and encoders, and the input end of the reducer is connected to the output end of the joint motor; the joint motor and the reducer are configured so that the difference between the encoder reading at the output end of the joint motor and the encoder reading at the output end of the reducer is acquired, and the torque of the joint is calculated from that difference according to the stiffness of the reducer.
In some embodiments, the robot body comprises a plurality of robot joints, each robot joint is provided with a joint motor and a harmonic reducer, and a strain gauge is attached to the flexspline (flexible gear) of the harmonic reducer for measuring the torque of the joint.
In a second aspect of the present invention, there is provided a second teaching robot system comprising:
a robot body;
an end sensor connected to the end of the robot body for measuring force or moment;
a control system configured to: when the robot body is taught, the control system collects position information and time information of the robot body and force/moment information acquired by the end sensor, and establishes a correspondence among the position information, the time information and the force/moment information.
In a third aspect of the present invention, there is provided a third teaching robot system comprising:
a robot body comprising a plurality of robot joints, each robot joint being provided with a joint motor and a joint sensor;
an end effector connected to the end of the robot body;
an end sensor for measuring a force or moment exerted on the end effector;
a control system configured to: when the robot is drag-taught, the control system collects position information from each joint sensor and force/moment information obtained by the end sensor, and establishes a correspondence between the position information and the force/moment information.
In some embodiments, the end sensor is integrated into the end effector.
In some embodiments, the end effector comprises one of a deburring tool, a grinding tool, a polishing tool, or a screwing tool.
In some embodiments, the control system comprises:
a data acquisition module for acquiring force/moment information from the end sensor and position information of the robot body during teaching, a plurality of position signals forming trajectory information;
a data processing module for processing the force/moment information and the position information, obtaining trajectory information from the plurality of pieces of position information, and then establishing the relation between the trajectory information and the force/moment information;
a data storage module for storing the trajectory information, the force/moment information, and the relation between the trajectory information and the force/moment information;
a trajectory generation module for converting the trajectory information stored by the data storage module into a trajectory executable by the robot;
and a position control module for controlling the motion trajectory of the robot body according to the executable trajectory.
In some embodiments, the teaching robotic system further comprises a robotic interaction device comprising:
the operation command module is used for changing the state of the robot and sending a robot action command;
and the display module is used for displaying the pose, force/moment information and track information of the robot.
In a fourth aspect of the present invention, there is provided a robot teaching method, including the steps of:
performing drag teaching on the robot body, acquiring the force/torque information applied to the end effector during the teaching process, and acquiring trajectory information of the robot body;
processing the force/torque information and the trajectory information, and establishing a correspondence between the trajectory and the force/torque;
storing the force/torque information, the trajectory information, and the correspondence between the trajectory and the force/torque.
In some embodiments, the force/torque information is acquired by a force/torque sensor mounted between the end of the robot body and the end effector, and a teaching grasping part is provided on the robot body.
The invention has the beneficial effects that:
(1) the teaching efficiency of the robot is higher, especially for parts with complex curved surfaces;
(2) the experience of a skilled operator, in particular the contact force between the tool and the workpiece during operation, can be taught, giving the robot the skilled operator's experience.
Drawings
FIG. 1 is a schematic diagram of the overall structure of a teaching robot system according to a preferred embodiment of the present invention;
FIG. 2 is a block and flow diagram of a teaching robotic system in accordance with a preferred embodiment of the present invention;
FIG. 3 is a schematic diagram of the teaching of a teaching robotic system in a preferred embodiment of the present invention;
FIG. 4 is a block diagram and flow diagram of a teaching robotic system in accordance with another preferred embodiment of the present invention.
The reference numbers of the above figures are as follows:
100 robot body
110 robot connecting rod
120 robot joint
121 joint servo driver
122 joint position sensor
123 joint torque sensor
130 end force/torque sensor
140 teaching grasping part
200 robot control system
201 data acquisition module
202 data processing module
203 data storage module
204 track generation module
205 position control module
206 position conversion module
207 drag teaching module
208 force control module
209 force conversion module
210 force generating module
300 robot interaction device
310 operation command module
320 display module
400 end operating tool (end effector)
410 end force/torque sensor
500 operated target
600 operator
Detailed Description
Unless otherwise defined, technical or scientific terms used in the claims and the specification of this patent shall have the ordinary meaning as understood by those of ordinary skill in the art to which this patent belongs. As used in this specification and the appended claims, the terms "first," "second," and the like do not denote any order, quantity, or importance, but rather are used to distinguish one element from another. The terms "a" or "an," and the like, do not denote a limitation of quantity, but rather denote the presence of at least one. In the description of this patent, unless otherwise indicated, "a plurality" means two or more. The word "comprising" or "having", and the like, means that the element or item appearing before "comprises" or "having" covers the element or item listed after "comprising" or "having" and its equivalent, but does not exclude other elements or items.
In the description of the present invention, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "connected" are to be construed broadly, e.g., as meaning either a fixed connection, a removable connection, or an integral connection; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements.
Different embodiments of the invention may be substituted or combined, and the invention is therefore to be construed as encompassing all possible combinations of the same and/or different embodiments described. Thus, if one embodiment includes features A, B and C and another embodiment includes features B and D, the invention should also be construed as including embodiments that contain one or more of all other possible combinations of A, B, C and D, even though such embodiments may not be explicitly recited in the following text.
The invention adopts a force/position hybrid teaching method to realize force/position hybrid control of the robot system. Teaching of the position or trajectory of the robot end is realized through force control of the robot joints, while a single- or multi-dimensional force/torque sensor at the robot end acquires the force or torque signal acting on the operated target during teaching. During operation, the robot reproduces the taught position and force (or torque) signals and the taught force-position relation, thereby realizing force/position hybrid control.
The conception, the specific structure and the technical effects of the present invention will be further described with reference to the accompanying drawings to fully understand the objects, the features and the effects of the present invention.
Example 1
Teaching system
The robot force/position hybrid teaching system provided by this embodiment is shown in FIG. 1. The hybrid teaching system mainly consists of three parts: a robot body 100, a robot control system 200 and a robot interaction device 300. Further, the hybrid teaching system may include an end operating tool 400 installed at the end of the robot body 100; the end operating tool 400 may also be referred to as an end effector. An end force/torque sensor 130 is mounted between the robot body 100 and the end operating tool 400 for measuring the force or torque applied to the end operating tool 400. The end force/torque sensor 130 is a single- or multi-dimensional force/torque sensor. During force/position hybrid teaching, the end force/torque sensor 130 collects the contact force between the end operating tool 400 and the operated target 500, for example the force in the normal direction of the polishing surface during polishing. The end force/torque sensor 130 is a key component for force/position hybrid teaching.
The robot body 100 is composed of a plurality of robot links 110 and robot joints 120; it generally includes six or more robot joints 120, but is not limited thereto. If the robot body needs to provide fewer degrees of freedom, the number of robot links 110 and robot joints 120 can be reduced accordingly.
The robot joint 120 is composed of a joint servo driver 121, a joint position sensor 122 and a joint torque sensor 123. The joint servo driver 121 is preferably a servo motor. A joint torque sensor 123 is installed at each robot joint 120 to measure the torque of that joint, so that the current of the joint servo motor can be controlled such that the torque generated by the servo motor is sufficient to overcome the gravity torque produced by the weight of the robot links and joints. In this way, the operator 600 can easily drag the end of the robot body 100 and move the robot body 100 as desired. For convenient grasping, a teaching grasping part 140 is provided on the robot body 100. The joint position sensor 122 provides the position information of the corresponding robot joint.
The robot control system 200 mainly includes a data acquisition module 201, a data processing module 202, a data storage module 203, a trajectory generation module 204 and a position control module 205. When the robot body 100 is dragged for teaching, the robot control system 200 collects the position information of the robot body 100 and the force/torque information acquired by the end force/torque sensor 130, and establishes a correspondence between the position information and the force/torque information. The robot body 100 may be dragged for the whole teaching process, or part of the process may be drag teaching and the other part programming teaching.
The data acquisition module 201 acquires the signals of the end force/torque sensor 130 and the position signals of the robot body 100 during teaching; a series of position signals forms the trajectory signal of the robot. The data processing module 202 smooths the force/torque signals collected from the end force/torque sensor 130 and the robot trajectory signals, and establishes the relation between the robot trajectory and the force/torque. The data storage module 203 stores the robot trajectory, the force/torque values and the relation between them. The trajectory generation module 204 converts the robot trajectory stored by the data storage module 203 into a trajectory the robot can execute. The position control module 205 controls the robot body 100 to follow the trajectory generated by the trajectory generation module 204.
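To make the data flow through these modules concrete, the following minimal recording sketch (Python) shows synchronized position and force/torque samples being accumulated into one record, which is what establishes their correspondence. The class and field names are illustrative assumptions rather than an interface defined by this patent; the time stamp corresponds to the optional time information of Embodiment 3.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TaughtSample:
    """One synchronized sample recorded during drag teaching (illustrative fields)."""
    timestamp: float              # seconds; the optional time information of Embodiment 3
    joint_positions: List[float]  # rad, from the joint position sensors 122
    wrench: List[float]           # [Fx, Fy, Fz, Mx, My, Mz] from the end force/torque sensor 130

@dataclass
class TaughtRecord:
    """Trajectory, force curve and their correspondence, as kept by the data storage module 203."""
    samples: List[TaughtSample] = field(default_factory=list)

    def add(self, timestamp: float, joint_positions, wrench) -> None:
        # Recording position and force at the same instant is what establishes
        # the correspondence between the trajectory and the force curve.
        self.samples.append(TaughtSample(timestamp, list(joint_positions), list(wrench)))

    def trajectory(self) -> List[List[float]]:
        return [s.joint_positions for s in self.samples]

    def force_curve(self) -> List[List[float]]:
        return [s.wrench for s in self.samples]
```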
The robot interaction device 300 includes an operation command module 310 and a display module 320. The operation command module 310 is used to change the state of the robot and to send robot action commands. The display module 320 displays the pose of the robot, force/torque curve data, robot trajectory data and the like. The robot interaction device 300 is generally a dedicated robot teach pendant, but may also be a handheld smart device such as a mobile phone or tablet computer, or a personal computer; robot operation software is installed on the robot interaction device 300.
The end operating tool 400 may be a polishing tool, a grinding tool, a deburring tool, a screw-locking tool, etc., but is not limited to these; any application having the force/position hybrid teaching feature falls within the scope of this patent.
Teaching process
A teaching process using the above robot force/position hybrid teaching system is shown in FIG. 2. To start teaching, the operator 600 presses a virtual button on the software interface displayed on the display module 320 of the robot interaction device 300, or a physical button on the robot interaction device 300, to change the working state of the robot and put it into drag teaching mode. Next, the operator 600 drags the teaching grasping part 140 so that the end operating tool 400 moves along the desired trajectory (for example, along the surface to be polished during polishing), thereby completing the teaching of the trajectory of the end operating tool 400.
The teaching grasping part 140 must be located above the end force/torque sensor 130, for example between the robot end flange and the end force/torque sensor 130. While teaching the trajectory, the operator 600 applies a force in one or more directions, for example a polishing force perpendicular to the polishing surface. While the robot body 100 is being dragged, the robot control system 200 simultaneously collects the position signal of the robot, the force/torque signal output by the end force/torque sensor 130, and the correspondence between the two, thereby completing the force/position hybrid teaching of the robot.
The end force/torque sensor 130 connected to the robot body 100 senses the contact force between the robot body 100 and the operated target 500 during teaching. The data acquisition module 201 acquires the signals output by the end force/torque sensor 130; the acquired force/torque signals pass through the data processing module 202, where a digital filter removes the noise and smooths the force/torque signals. From the filtered force/torque signals, the force curve data is generated by the force generation module 210.
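The patent does not name a specific digital filter; as one possible realization, the sketch below smooths each of the six force/torque channels with a simple moving-average window (the window length is a hypothetical choice).

```python
import numpy as np

def smooth_force_signal(samples: np.ndarray, window: int = 5) -> np.ndarray:
    """Moving-average digital filter for a stream of force/torque samples.

    samples: array of shape (N, 6) -- Fx, Fy, Fz, Mx, My, Mz per sample.
    window:  averaging window length (hypothetical; the patent only states
             that a digital filter removes noise from the signals).
    """
    kernel = np.ones(window) / window
    # Filter each of the six force/torque channels independently.
    return np.column_stack(
        [np.convolve(samples[:, ch], kernel, mode="same") for ch in range(samples.shape[1])]
    )
```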
A joint torque sensor 123 is installed at each robot joint 120, and the force conversion module 209 computes the end force from the signals output by the joint torque sensors 123. Assume the end of the robot body 100 is subjected to an external force f and a torque m, combined into the six-dimensional vector

F = [f m]^T,

and let the driving torques of the n joints form the n-dimensional vector

τ = [τ1 τ2 … τn]^T.

The two are related by

τ = J^T F,

where J is the Jacobian matrix of the robot arm. With the joint torques τ known, the external force F acting on the robot end can be solved, usually by solving this system of equations. When the inverse of J^T exists,

F = (J^T)^{-1} τ.
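As a numerical illustration of this relation (not the patent's implementation), the sketch below recovers the end wrench from measured joint torques with a least-squares solve of τ = J^T F, which reduces to F = (J^T)^{-1} τ when J is square and invertible; the helper names in the usage comment are assumed.

```python
import numpy as np

def end_wrench_from_joint_torques(jacobian: np.ndarray, tau: np.ndarray) -> np.ndarray:
    """Solve tau = J^T F for the external end wrench F = [fx, fy, fz, mx, my, mz].

    jacobian: 6 x n manipulator Jacobian expressed in the base frame.
    tau:      length-n vector of measured joint torques.
    Uses least squares, which equals (J^T)^{-1} tau for a square, invertible J.
    """
    F, *_ = np.linalg.lstsq(jacobian.T, tau, rcond=None)
    return F

# Hypothetical usage with a 6-joint arm:
# J = compute_jacobian(q)             # assumed helper from the robot's kinematic model
# tau = read_joint_torque_sensors()   # assumed helper returning the 6 joint torques
# F = end_wrench_from_joint_torques(J, tau)
```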
The force control module 208 controls the forces in the six directions of Cartesian space, for example with a proportional-integral-derivative (PID) control algorithm, but is not limited thereto. During Cartesian force control, the gravity load of the robot to be overcome is resolved in real time; when the resolved force deviates from the gravity value to be overcome, the robot control system 200 considers that the end is being dragged, and the robot body 100 moves along the direction of the dragging force under the command issued by the drag teaching module 207.
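A minimal sketch of this drag-detection logic follows, under assumptions the patent does not state: an admittance-style velocity command proportional to the non-gravity force component, and a fixed detection threshold with hypothetical gains.

```python
import numpy as np

FORCE_THRESHOLD = 2.0      # N, hypothetical dead band for drag detection
ADMITTANCE_GAIN = 0.002    # m/s per N, hypothetical compliance gain

def drag_teaching_step(resolved_force: np.ndarray, gravity_force: np.ndarray) -> np.ndarray:
    """Return a Cartesian velocity command for one drag-teaching control cycle.

    resolved_force: 3-vector of force resolved at the robot end (from joint torques).
    gravity_force:  3-vector of the gravity load expected at the current pose.
    If the residual exceeds the threshold, the end is considered dragged and the
    robot moves along the dragging force; otherwise it holds position.
    """
    residual = resolved_force - gravity_force
    if np.linalg.norm(residual) < FORCE_THRESHOLD:
        return np.zeros(3)             # not dragged: hold position
    return ADMITTANCE_GAIN * residual  # move along the dragging force
```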
The joint position sensor 122 collects the real-time rotation angle of each robot joint and transmits the data to the robot control system 200 through the joint servo driver 121 over a bus; the bus may be a CAN bus or an EtherCAT bus, but is not limited to these. The position conversion module 206 converts the joint angles into a Cartesian-space pose, using a transformation method analogous to the Cartesian-space force transformation above. The converted Cartesian poses along the trajectory dragged by the end operating tool 400 are collected and processed by the trajectory generation module 204 and then stored in the data storage module 203. The processing method is as follows:
suppose that a given trajectory acquires n control points d according to a certain sampling frequencyi(i=0,1,…,n),
The B-spline curve equation is:
the interval of the parameter u satisfies u0≤u1≤…≤un+k+1,Ni,kAnd (u) is a k-th order piecewise polynomial basis function.
Due to the local nature of B-splines, parametric transformations may be performed
u=u(t)=(1-t)ui+tui+1,0≤t≤1,i=k,k+1,...,n
The curve equation can be in the form of a parameter matrix
p(u)=p(u(t))=[1 t … tk]Mk[di-k di-k+1 … di]T,0≤t≤1,i=k,k+1,...,n
Wherein the 3-order coefficient matrix is
The data storage module 203 thus stores the force curve generated from the end force/torque sensor 130, the robot end trajectory curve, and the correspondence between them.
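For illustration, the sketch below fits and resamples a cubic B-spline through the recorded Cartesian points using SciPy. Note that splprep interpolates through the samples rather than treating them literally as the control points d_i of the matrix form above; this is a simplification for the example, not the patent's procedure.

```python
import numpy as np
from scipy.interpolate import splprep, splev

def fit_trajectory_bspline(points: np.ndarray, num_samples: int = 200) -> np.ndarray:
    """Fit a cubic B-spline through sampled end positions and resample it smoothly.

    points: array of shape (N, 3) -- Cartesian positions recorded during dragging (N >= 4).
    Returns an array of shape (num_samples, 3) of points on the smooth trajectory.
    """
    # splprep expects one array per coordinate; s=0 makes the spline pass through the samples.
    tck, _ = splprep([points[:, 0], points[:, 1], points[:, 2]], s=0.0, k=3)
    u_new = np.linspace(0.0, 1.0, num_samples)
    x, y, z = splev(u_new, tck)
    return np.column_stack([x, y, z])
```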
Force/position hybrid control process
Force/position hybrid control of the robot is realized from the B-spline trajectory generated by the robot control system 200 and the filtered force/torque sensor signal; the principle is shown in FIG. 3.
The force curve generated during teaching serves as the reference value for force control, while the signal of the robot end force/torque sensor 130, processed by the data acquisition module 201 and the data processing module 202, serves as the feedback value of the contact force between the robot and the operated target 500. When the force feedback deviates from the reference value, the position control module 205 adjusts the pose of the end effector 400 (shown in FIG. 1) along the force direction, thereby realizing force regulation.
The generated B-spline curve serves as the reference value for robot position control. When the position of the end operating tool 400 deviates from the reference value, the position control module 205 adjusts only the position components in the plane normal to the force-controlled direction, without changing the position along the force-controlled direction. The resulting change in robot pose is processed by the position conversion module 206: the pose of the end effector 400 is converted into the angles of the robot joints 120 (shown in FIG. 1) according to the inverse kinematics of the robot and sent to each joint servo driver 121 (shown in FIG. 1) over the bus, thereby realizing joint control.
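One control cycle of such a hybrid loop can be sketched as follows, under assumptions not stated in the patent: a single force-controlled axis given by a unit normal n, a simple PI force law, and hypothetical gains.

```python
import numpy as np

KP_FORCE, KI_FORCE = 0.0005, 0.0001   # hypothetical force-control gains (m/N, m/(N*s))

class HybridController:
    """One force-controlled direction (unit normal n); position control elsewhere."""

    def __init__(self, normal: np.ndarray):
        self.n = normal / np.linalg.norm(normal)
        self.force_error_integral = 0.0

    def step(self, p_ref: np.ndarray, p_meas: np.ndarray,
             f_ref: float, f_meas: float, dt: float) -> np.ndarray:
        """Return the commanded Cartesian position for this control cycle."""
        # Position control: follow the B-spline reference only in the plane normal to n.
        pos_error = p_ref - p_meas
        pos_correction = pos_error - np.dot(pos_error, self.n) * self.n

        # Force control: PI correction along n toward the taught force-curve reference.
        f_error = f_ref - f_meas
        self.force_error_integral += f_error * dt
        force_correction = (KP_FORCE * f_error + KI_FORCE * self.force_error_integral) * self.n

        return p_meas + pos_correction + force_correction
```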
Example 2
In Embodiment 1, the end force/torque sensor is mounted on the robot body as a separate component, or integrated into the robot body. The benefit of this arrangement is that it is easy to exchange different end operating tools.
In this embodiment, as shown in FIG. 4, the end force/torque sensor 410 is integrated into the end operating tool 400 and forms one piece with it. The advantage is that the end force/torque sensor 410 measures the force or torque of the end operating tool 400 more accurately. However, better matching is required between the end operating tool 400, the robot body 100, and the hardware and software of the robot control system 200.
The end operating tool 400 is attached to the end of the robot body 100, and the end force/torque sensor 410 is connected to the data acquisition module 201. When the robot body is taught, the data acquisition module 201 acquires the signals of the end force/torque sensor 410 and the position signals of the robot body 100 during the teaching process; a series of position signals forms the trajectory signal of the robot. The subsequent processing and control steps are similar to those in Embodiment 1 and are not repeated here.
Example 3
This embodiment is a further improvement of the scheme of Embodiment 1. In Embodiment 1, the data acquisition module 201 acquires the force or torque signal of the end force/torque sensor 130 and the position signal of the robot body 100 during teaching, and the robot control system 200 then establishes the correspondence between the position information and the force/torque information.
In this embodiment, the robot control system 200 is additionally provided with a time module. The data acquisition module 201 acquires time information in addition to the force or torque signal and the position signal, and a correspondence among the position information, the time information and the force/torque information is established. This broadens the range of application scenarios for the robot.
Example 4
The robot body 100 includes essential parts such as the robot links 110 and the robot joints 120 and therefore has considerable weight. To facilitate drag teaching of the robot body, its gravity torque must be compensated in advance. During drag teaching, the gravity torque changes in real time with the robot pose, so the robot control system must sample and calculate the gravity torque and cancel it in real time. In the scheme of Embodiment 1, each robot joint 120 is provided with a joint torque sensor 123, which accurately monitors the gravity torque of the robot and feeds it back to the robot control system 200; the robot control system 200 then controls the joint servo drivers to generate the cancelling torque. This is a reliable solution, but its cost is high.
In practical applications, the implementation is not limited to joint torque sensors; the following methods may also be used (a gravity-compensation sketch based on method (1) is given after this list):
(1) The robot body comprises a plurality of robot joints, each provided with a joint motor, a reducer and encoders, the input end of the reducer being connected to the output end of the joint motor. The difference Δθ between the encoder reading at the output end of the joint motor and the encoder reading at the output end of the reducer (for example, a harmonic reducer) is collected, and the joint torque is then calculated from the stiffness K of the reducer as T = Δθ·K. When the robot body is stationary, this torque equals the gravity torque. When the robot is dragged for teaching, the gravity-torque component of this torque is calculated, and the robot control system controls the joint motor to cancel the gravity torque so that drag teaching can proceed smoothly.
(2) The robot body comprises a plurality of robot joints, each provided with a joint motor and a harmonic reducer, and a strain gauge is attached to the flexspline (flexible gear) of the harmonic reducer to measure the joint torque. When the robot is dragged for teaching, the gravity-torque component of the measured torque is calculated, and the robot control system controls the joint motor to cancel the gravity torque so that drag teaching can proceed smoothly.
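A minimal sketch of method (1) follows; the per-joint stiffness values and the gravity-torque model are hypothetical placeholders, since neither is specified in the patent.

```python
import numpy as np

# Hypothetical per-joint reducer stiffness values K (Nm/rad) for a 6-joint arm.
JOINT_STIFFNESS = np.array([8000.0, 8000.0, 5000.0, 3000.0, 2000.0, 2000.0])

def joint_torques_from_deflection(theta_motor: np.ndarray, theta_output: np.ndarray) -> np.ndarray:
    """Estimate joint torques from the reducer's elastic deflection: T = K * delta_theta."""
    delta_theta = theta_motor - theta_output   # motor-side vs output-side encoder readings
    return JOINT_STIFFNESS * delta_theta

def split_gravity_and_drag(theta_motor, theta_output, gravity_torque_model):
    """Split the sensed torque into a gravity component (to be cancelled) and the drag.

    gravity_torque_model: callable returning the gravity torque per joint at the
    current pose (assumed to come from the robot's dynamic model).
    """
    measured = joint_torques_from_deflection(theta_motor, theta_output)
    gravity = gravity_torque_model(theta_output)   # torque the motors must cancel
    drag = measured - gravity                      # residual attributed to the operator's pull
    return gravity, drag
```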
The foregoing detailed description of the preferred embodiments of the invention has been presented. It should be understood that numerous modifications and variations could be devised by those skilled in the art in light of the present teachings without departing from the inventive concepts. Therefore, the technical solutions available to those skilled in the art through logic analysis, reasoning and limited experiments based on the prior art according to the concept of the present invention should be within the scope of protection defined by the claims.
Claims (14)
1. A teaching robotic system, comprising:
a robot body;
an end sensor connected to the end of the robot body for measuring force or moment;
a control system configured to: when the robot body is taught, the control system collects position information of the robot body and force/moment information obtained by the end sensor, and establishes a corresponding relation between the position information and the force/moment information.
2. The teaching robot system according to claim 1, wherein the teaching of the robot body includes a drag teaching process.
3. A teaching robot system according to claim 2, wherein the robot body comprises a plurality of robot joints, each of the robot joints mounting a joint motor and a joint sensor, each of the joint sensors measuring a moment of the corresponding joint.
4. The teaching robot system of claim 3, wherein the joint motor and the joint sensor are configured to control the joint motor to generate a resisting torque according to the torque measured by the joint sensor, so as to counteract the gravity torque of the robot.
5. The teaching robot system of claim 2, wherein the robot body comprises a plurality of robot joints, each robot joint is provided with a joint motor, a reducer and encoders, the input end of the reducer is connected to the output end of the joint motor, and the joint motor and the reducer are configured so that the difference between the encoder reading at the output end of the joint motor and the encoder reading at the output end of the reducer is acquired and the joint torque is calculated from it according to the stiffness of the reducer.
6. The teaching robot system of claim 2, wherein the robot body comprises a plurality of robot joints, each robot joint is provided with a joint motor and a harmonic reducer, and a strain gauge is attached to the flexspline (flexible gear) of the harmonic reducer for measuring the torque of the joint.
7. A teaching robotic system, comprising:
a robot body;
an end sensor connected to the end of the robot body for measuring force or moment;
a control system configured to: when the robot body is taught, the control system collects position information and time information of the robot body and force/moment information obtained by the terminal sensor, and establishes a corresponding relation between the position information, the time information and the force/moment information.
8. A teaching robotic system, comprising:
a robot body comprising a plurality of robot joints, each robot joint being provided with a joint motor and a joint sensor;
an end effector connected to an end of the robot body;
an end sensor for measuring a force or moment exerted on the end effector;
a control system configured to: when the robot is drag-taught, the control system collects position information from each joint sensor and force/moment information obtained by the end sensor, and establishes a correspondence between the position information and the force/moment information.
9. The teaching robot system of claim 8, wherein the end sensor is integrated into the end effector.
10. The teach robot system of claim 8, wherein the end effector comprises one of a deburring tool, a grinding tool, a polishing tool, or a screwing tool.
11. The teaching robot system according to any one of claims 1 to 10, wherein the control system comprises:
a data acquisition module for acquiring force/moment information from the end sensor and position information of the robot body during teaching, a plurality of position signals forming trajectory information;
a data processing module for processing the force/moment information and the position information, obtaining trajectory information from the plurality of pieces of position information, and then establishing the relation between the trajectory information and the force/moment information;
a data storage module for storing the trajectory information, the force/moment information, and the relation between the trajectory information and the force/moment information;
a trajectory generation module for converting the trajectory information stored by the data storage module into a trajectory executable by the robot;
and a position control module for controlling the motion trajectory of the robot body according to the executable trajectory.
12. The teaching robot system of claim 11, further comprising a robot interaction device, the robot interaction device comprising:
the operation command module is used for changing the state of the robot and sending a robot action command;
and the display module is used for displaying the pose, force/moment information and track information of the robot.
13. A robot teaching method is characterized by comprising the following steps:
performing drag teaching on a robot body, acquiring the force/torque information applied to an end effector during the teaching process, and acquiring trajectory information of the robot body;
processing the force/torque information and the trajectory information, and establishing a correspondence between the trajectory and the force/torque;
and storing the force/torque information, the trajectory information, and the correspondence between the trajectory and the force/torque.
14. The robot teaching method according to claim 13, wherein the force/torque information is acquired by a force/torque sensor installed between the end of the robot body and the end effector, and a teaching grasping point is provided on the robot body.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110018489.5A CN112847366B (en) | 2021-01-07 | 2021-01-07 | Force-position hybrid teaching robot system and teaching method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110018489.5A CN112847366B (en) | 2021-01-07 | 2021-01-07 | Force-position hybrid teaching robot system and teaching method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112847366A true CN112847366A (en) | 2021-05-28 |
CN112847366B CN112847366B (en) | 2023-07-25 |
Family
ID=76004944
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110018489.5A Active CN112847366B (en) | 2021-01-07 | 2021-01-07 | Force-position hybrid teaching robot system and teaching method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112847366B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113442115A (en) * | 2021-06-09 | 2021-09-28 | 配天机器人技术有限公司 | Method and system for reproducing robot dragging teaching track and computer storage medium |
CN114571491A (en) * | 2022-03-04 | 2022-06-03 | 天津新松机器人自动化有限公司 | Robot control device based on force sensor and teaching method |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS59157715A (en) * | 1983-02-25 | 1984-09-07 | Hitachi Ltd | Direct teaching method of robot |
US20130338832A1 (en) * | 2012-06-13 | 2013-12-19 | Fanuc Corporation | Robot controller which conducts a force control by using a three-axial force sensor |
CN103925879A (en) * | 2014-04-24 | 2014-07-16 | 中国科学院合肥物质科学研究院 | Indoor robot vision hand-eye relation calibration method based on 3D image sensor |
CN107696036A (en) * | 2017-08-21 | 2018-02-16 | 北京精密机电控制设备研究所 | A kind of dragging teaching machine of apery mechanical arm |
CN108656112A (en) * | 2018-05-15 | 2018-10-16 | 清华大学深圳研究生院 | A kind of mechanical arm zero-force control experimental system towards direct teaching |
CN108789363A (en) * | 2018-05-25 | 2018-11-13 | 雅客智慧(北京)科技有限公司 | It is a kind of that teaching system and method are directly dragged based on force snesor |
CN109434843A (en) * | 2018-12-10 | 2019-03-08 | 华中科技大学 | A kind of device and method of the Robot Force console keyboard mill blade based on dragging teaching |
CN109483556A (en) * | 2018-10-30 | 2019-03-19 | 武汉大学 | A kind of robot polishing system and method based on learning from instruction |
CN109712507A (en) * | 2018-11-19 | 2019-05-03 | 溱者(上海)智能科技有限公司 | A kind of robot demonstrator |
CN111624941A (en) * | 2020-06-15 | 2020-09-04 | 吉林大学 | Unknown environment-oriented six-degree-of-freedom robot power control method |
- 2021-01-07: application CN202110018489.5A filed; granted as CN112847366B (status: Active)
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS59157715A (en) * | 1983-02-25 | 1984-09-07 | Hitachi Ltd | Direct teaching method of robot |
US20130338832A1 (en) * | 2012-06-13 | 2013-12-19 | Fanuc Corporation | Robot controller which conducts a force control by using a three-axial force sensor |
CN103925879A (en) * | 2014-04-24 | 2014-07-16 | 中国科学院合肥物质科学研究院 | Indoor robot vision hand-eye relation calibration method based on 3D image sensor |
CN107696036A (en) * | 2017-08-21 | 2018-02-16 | 北京精密机电控制设备研究所 | A kind of dragging teaching machine of apery mechanical arm |
CN108656112A (en) * | 2018-05-15 | 2018-10-16 | 清华大学深圳研究生院 | A kind of mechanical arm zero-force control experimental system towards direct teaching |
CN108789363A (en) * | 2018-05-25 | 2018-11-13 | 雅客智慧(北京)科技有限公司 | It is a kind of that teaching system and method are directly dragged based on force snesor |
CN109483556A (en) * | 2018-10-30 | 2019-03-19 | 武汉大学 | A kind of robot polishing system and method based on learning from instruction |
CN109712507A (en) * | 2018-11-19 | 2019-05-03 | 溱者(上海)智能科技有限公司 | A kind of robot demonstrator |
CN109434843A (en) * | 2018-12-10 | 2019-03-08 | 华中科技大学 | A kind of device and method of the Robot Force console keyboard mill blade based on dragging teaching |
CN111624941A (en) * | 2020-06-15 | 2020-09-04 | 吉林大学 | Unknown environment-oriented six-degree-of-freedom robot power control method |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113442115A (en) * | 2021-06-09 | 2021-09-28 | 配天机器人技术有限公司 | Method and system for reproducing robot dragging teaching track and computer storage medium |
CN114571491A (en) * | 2022-03-04 | 2022-06-03 | 天津新松机器人自动化有限公司 | Robot control device based on force sensor and teaching method |
Also Published As
Publication number | Publication date |
---|---|
CN112847366B (en) | 2023-07-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106945043B (en) | Multi-arm cooperative control system of master-slave teleoperation surgical robot | |
JP6924145B2 (en) | Robot teaching method and robot arm control device | |
CN106826769B (en) | A kind of quick teaching apparatus of industrial robot and its implementation | |
US9031699B2 (en) | Kinematic predictor for articulated mechanisms | |
US5880956A (en) | Lead-through robot programming system | |
US9958862B2 (en) | Intuitive motion coordinate system for controlling an industrial robot | |
KR20190075098A (en) | System and method for directing a robot | |
CN109434843A (en) | A kind of device and method of the Robot Force console keyboard mill blade based on dragging teaching | |
Shen et al. | Asymptotic trajectory tracking of manipulators using uncalibrated visual feedback | |
CN111438687A (en) | Determination device | |
CN111775145B (en) | Control system of serial-parallel robot | |
US12005582B2 (en) | Controller and control system | |
CN112847366B (en) | Force-position hybrid teaching robot system and teaching method | |
CN106647529A (en) | Six-axis industrial robot track accurate tracking-and-controlling oriented intelligent teaching system | |
JP4534015B2 (en) | Master / slave robot control information confirmation method | |
Debus et al. | Cooperative human and machine perception in teleoperated assembly | |
Long et al. | Robotic arm simulation by using matlab and robotics toolbox for industry application | |
JP2819456B2 (en) | General purpose polishing equipment | |
Abdelaal | A study of robot control programing for an industrial robotic arm | |
JP6928031B2 (en) | Control device and control system | |
JP3577124B2 (en) | Method of acquiring mating data using force control robot | |
KR101474778B1 (en) | Control device using motion recognition in artculated robot and method thereof | |
JP2791030B2 (en) | Curved copying controller for multi-degree-of-freedom work machine | |
US11904479B2 (en) | Method for controlling drives of a robot, and robot system | |
Lei et al. | Vision-based position/impedance control for robotic assembly task |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||