CN114454144A - Robot teaching method, device and system - Google Patents
Robot teaching method, device and system
- Publication number
- CN114454144A (application CN202210142011.8A)
- Authority
- CN
- China
- Prior art keywords
- poses
- robot
- pose
- teaching
- transition
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/0081—Programme-controlled manipulators with master teach-in means
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Manipulator (AREA)
- Numerical Control (AREA)
Abstract
The invention discloses a robot teaching method, device and system. The teaching method comprises: receiving a plurality of end poses of an execution end manually calibrated by a user; and calculating one or more transition poses of the robot according to the end poses and the working space of the robot, such that all the end poses are contained in the union of the workspaces of the transition poses. In the disclosed method, device and system, the user calibrates the end poses through a teaching unit, and the system groups all the end poses according to the robot's working space and calculates a transition pose for each pose group, so that the robot covers the most task points with the fewest stops, improving its operating efficiency.
Description
Technical Field
The invention relates to the technical field of robots, and in particular to a robot teaching method, device and system.
Background
Robots are now widely used in industry and daily life. When a robot performs a multi-point task, the pose of its execution end must be calibrated manually. In the existing calibration method, the operator controls the position of the robot and the posture of the mechanical arm so that the pose of the arm's execution end meets the use requirement; this is time-consuming, labor-intensive, and rarely achieves good operating efficiency.
Disclosure of Invention
The invention aims to overcome the above defects of the prior art by providing a robot teaching method, device and system that are simple to calibrate and allow the robot to operate more efficiently.
The technical scheme is as follows: to achieve the above object, a robot teaching method according to the present invention includes:
receiving a plurality of end poses of an execution end manually calibrated by a user;
the method further comprises the following steps:
and calculating transition poses of the robot according to the end poses and the working space of the robot, wherein the number of the transition poses is one or more, and all the end poses are contained in the union of the workspaces of the transition poses.
Further, the calculating of the transition poses of the robot according to the end poses and the working space of the robot comprises:
selecting, according to the working space of the robot, a pose group from all the remaining end poses, so as to obtain a plurality of pose groups; each pose group includes the first of the remaining end poses, and all the end poses in the same pose group can be contained in the workspace simultaneously;
for each of the pose groups, a transition pose corresponding to the pose group is calculated.
Further, the receiving of the end poses of the execution end manually calibrated by the user comprises:
acquiring attitude data of a teaching unit;
and calculating the pose of the tip of the teaching unit from the attitude data of the teaching unit, to be used as the end pose.
Further, the acquiring of the attitude data of the teaching unit includes:
acquiring data of a tag on the teaching unit, collected by a positioning element;
calculating the position of the tag from the tag data at the same moment;
and calculating the attitude data of the teaching unit from the positions of all the tags.
A robot teaching device, comprising:
a receiving unit, configured to receive a plurality of end poses of an execution end manually calibrated by a user;
further comprising:
and a calculating unit, configured to calculate one or more transition poses of the robot according to the end poses and the working space of the robot, wherein all the end poses are contained in the union of the workspaces of the transition poses.
A robot teaching system, comprising:
a robot including a controller for implementing the teaching method;
and a teaching unit, used by the user to calibrate the end poses.
Beneficial effects: in the robot teaching method, device and system of the invention, the user calibrates the end poses through the teaching unit, and the system groups all the end poses according to the working space of the robot and calculates a transition pose for each pose group, so that the robot covers the most task points with the fewest stops, improving its operating efficiency.
Drawings
FIG. 1 is a configuration diagram of a teaching system;
FIG. 2 is a schematic flow chart of a robot teaching method;
FIG. 3 is a combination view of the mounting station and the positioning unit;
FIG. 4 is a block diagram of a mounting location;
fig. 5 is a structural view of the positioning unit.
Detailed Description
The present invention will be further described with reference to the accompanying drawings.
The teaching method of the robot of the present invention is based on the teaching system shown in fig. 1, and the teaching system includes:
a robot 1, comprising a controller for implementing the teaching method of the present invention; the robot 1 further comprises a movable base 11 and a mechanical arm 12;
and a teaching unit 2, used by the user to calibrate the end poses and provided with an acquisition button. During teaching, the user holds the teaching unit 2, points its tip at a target position and presses the acquisition button. The pose of the tip of the teaching unit 2 corresponds to the pose of the execution end 13 of the robot 1 when the robot 1 executes the task; the controller of the robot 1 acquires the attitude information of the teaching unit 2 and calculates from it the pose of the tip of the teaching unit 2, i.e. the end pose of the execution end 13.
As shown in fig. 2, the teaching method of a robot of the present invention includes the following steps S101 to S102:
Step S101, receiving a plurality of end poses of the execution end 13 manually calibrated by the user;
in this step, the end pose is calibrated by the user through the teaching unit 2. The end poses are discrete position points with pose information one by one.
Step S102, calculating one or more transition poses of the robot 1 according to the end poses and the working space of the robot 1, such that all the end poses are contained in the union of the workspaces of the transition poses.
In this step, the working space of the robot 1 is the space formed by all poses that the execution end 13 of the robot 1 can reach; that is, with the robot 1 at one position, its execution end 13 can reach every pose in the working space. The transition poses are the waypoints of the path the robot 1 follows when actually executing a task. Each transition pose corresponds to at least one end pose, and when the robot 1 is at a transition pose, its controller can drive the execution end 13 through the corresponding end poses in sequence. During actual execution, the controller moves the robot 1 through the transition poses in order and, at each one, drives the execution end 13 through all the end poses corresponding to that transition pose, so that the robot 1 traverses all the calibrated end poses.
In this process, the user first calibrates the end poses through the teaching unit 2; the controller then groups all the end poses according to the working space of the robot 1 and calculates a transition pose of the robot 1 for each pose group, so that the robot 1 covers the most task points with the fewest stops, improving its operating efficiency. Compared with the traditional teaching mode (controlling the position of the robot 1 and the posture of the mechanical arm so that the pose of the arm's execution end 13 meets the use requirement), the teaching method of the invention markedly improves efficiency.
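The execution flow described above can be illustrated with a minimal sketch. All names, the callback interface, and the sphere-shaped workspace model are assumptions made for illustration; the patent does not disclose a concrete workspace representation:

```python
import math

# Hypothetical workspace model: the execution end can reach any point
# within radius REACH of the robot's base position (a simplification).
REACH = 1.0

def reachable(base, point, reach=REACH):
    """True if `point` lies in the workspace of a robot standing at `base`."""
    return math.dist(base, point) <= reach

def execute_task(transition_poses, groups, move_base, move_arm):
    """Visit each transition pose once and sweep its group of end poses."""
    for base, group in zip(transition_poses, groups):
        move_base(base)          # drive the mobile base to the transition pose
        for end_pose in group:   # traverse every end pose reachable from here
            move_arm(end_pose)
```

Because each group is visited from a single transition pose, the number of base moves equals the number of groups rather than the number of end poses.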
Preferably, the calculating the transition pose of the robot 1 according to the end pose and the working space of the robot 1 in the step S102 includes the following steps S201 to S202:
step S201, a pose group is defined from all the residual end poses according to the working space of the robot 1 to obtain a plurality of pose groups; the set of poses includes a first of all of the remaining end poses, and all of the end poses included in the same set of poses can be included in the workspace at the same time;
In this step, initially, the first end pose is the first calibrated end pose; after one or more pose groups have been marked off, the first of all the remaining end poses is the end pose closest to the transition pose corresponding to the previous pose group.
Step S202, aiming at each pose group, calculating a transition pose corresponding to the pose group.
The steps S201 to S202 are executed circularly until all the end poses are grouped.
Specifically, the selecting of a pose group from all the remaining end poses according to the working space of the robot 1 in step S201 includes the following steps S301 to S302:
Step S301, adjusting the working space so that the number of end poses it contains reaches a maximum while it still contains the first of all the remaining end poses;
Step S302, dividing all the end poses contained in the working space into one group, with the other end poses becoming the remaining end poses.
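The grouping loop of steps S201 to S302 can be sketched as follows. This is an illustrative reconstruction under assumptions, not the patent's actual algorithm: the workspace is modeled as a fixed-radius sphere, and "adjusting the working space" is approximated by trying each remaining end pose as a candidate sphere center:

```python
import math

REACH = 1.0  # assumed workspace radius (sphere model; an illustration only)

def group_end_poses(end_poses, reach=REACH):
    """Greedy grouping per steps S201/S301-S302: each round takes the first
    remaining end pose, places the workspace so that it covers that pose plus
    as many other remaining poses as possible, and splits the covered poses
    off as one pose group."""
    remaining = list(end_poses)
    groups, centers = [], []
    while remaining:
        first = remaining[0]
        # Crude stand-in for "adjust the workspace": try centering it on each
        # remaining pose and keep the placement covering the most poses while
        # still covering `first`.
        best_center, best_group = None, []
        for center in remaining:
            if math.dist(center, first) > reach:
                continue  # this placement cannot cover the first pose
            covered = [p for p in remaining if math.dist(center, p) <= reach]
            if len(covered) > len(best_group):
                best_center, best_group = center, covered
        groups.append(best_group)
        centers.append(best_center)
        remaining = [p for p in remaining if p not in best_group]
        # Next round's "first" pose: the remaining pose closest to this
        # group's placement, mirroring the rule stated above.
        remaining.sort(key=lambda p: math.dist(p, best_center))
    return groups, centers
```

The loop terminates because every round removes at least one end pose (the first remaining pose always covers itself).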
Preferably, the working space is divided into a primary space and a secondary space, where all poses in the primary space are preferred poses that are easy to reach, and the calculating of the transition pose corresponding to each pose group in step S202 includes the following steps S401 to S403:
Step S401, adjusting the position and posture of the working space to obtain a workspace pose meeting a preset condition, the preset condition being: the number of end poses of the pose group located in the primary space reaches a maximum, and all the end poses of the pose group are within the working space;
In this step, the adjustment of the position and posture of the working space is simulated by the controller: a model of the working space is stored in the controller, and the position and posture of the working space are adjusted by changing the position and posture of the model.
Step S402, selecting one workspace pose as the selected pose: when only one pose meets the preset condition, that pose is the selected pose; when several poses meet the preset condition, the one changing least from the selected pose of the previous pose group is taken as the selected pose of the current pose group;
In this step, the pose changing least from the previous group's selected pose is determined as follows: first compare the change in displacement and prefer the pose with the smaller displacement change; if still tied, compare the change in orientation and prefer the pose with the smaller orientation change.
And S403, determining a transition pose according to the selected pose.
In this step, once the pose of the working space is determined, the transition pose of the robot 1 is determined as well.
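Steps S401 to S403 can be sketched with the same assumed sphere model, the primary space taken as an inner sphere and a candidate placement reduced to a base position plus a heading angle. All names and the candidate-enumeration scheme are illustrative assumptions, not the patent's computation:

```python
import math

REACH, PRIMARY = 1.0, 0.5   # assumed outer and "primary" workspace radii

def pick_placement(group, candidates, prev=None,
                   reach=REACH, primary=PRIMARY):
    """Sketch of steps S401-S403: among candidate workspace placements
    (position, heading) covering the whole pose group, prefer the one putting
    the most end poses in the primary space; break ties by smallest
    displacement change, then smallest heading change, from `prev` (the
    previous group's selected pose)."""
    feasible = []
    for pos, heading in candidates:
        dists = [math.dist(pos, p) for p in group]
        if max(dists) > reach:          # every end pose must stay reachable
            continue
        in_primary = sum(d <= primary for d in dists)
        feasible.append((in_primary, pos, heading))
    if not feasible:
        return None
    best = max(n for n, _, _ in feasible)
    ties = [(pos, h) for n, pos, h in feasible if n == best]
    if prev is None or len(ties) == 1:
        return ties[0]
    ppos, ph = prev
    # Tie-break order: displacement change first, heading change second.
    return min(ties, key=lambda c: (math.dist(c[0], ppos), abs(c[1] - ph)))
```

The returned placement plays the role of the selected pose; the corresponding transition pose follows directly from it, as step S403 states.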
Preferably, the receiving of the end pose of the execution end 13 manually calibrated by the user in the step S101 includes the following steps S501 to S502:
step S501, acquiring attitude data of a teaching unit 2;
Step S502, calculating the tip pose of the teaching unit 2 from its attitude data, to be used as the end pose.
Further, the teaching unit 2 carries two tags 21, and the robot 1 carries four positioning elements which are not coplanar, no three of which are collinear; the acquiring of the attitude data of the teaching unit 2 in step S501 includes the following steps S601 to S603:
Step S601, acquiring the data of the tags 21 on the teaching unit 2 collected by the positioning elements;
In this step, after the user determines the calibration point and presses the acquisition button, the two tags 21 simultaneously emit data packets carrying a timestamp and the tag 21 number, and each positioning element receives each data packet and records its reception time. The data acquired by the controller thus comprise the data packets sent by the teaching unit 2 and their reception times.
Step S602, calculating the position of the tag 21 according to the data of the tag 21 at the same time;
In this step, the controller gathers the data packets sharing the same timestamp and tag 21 number together with their reception times. From the timestamp and the reception time of each packet, the distance from each tag 21 to each positioning element can be computed, and from those distances the position of each tag 21 can then be computed.
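Recovering a tag position from its distances to four known, non-coplanar anchor points is a standard multilateration problem. The patent does not disclose its actual computation; a hypothetical sketch, linearizing the sphere equations and solving the resulting 3x3 system:

```python
def trilaterate(anchors, dists):
    """Solve ||x - a_i||^2 = d_i^2 for x by subtracting the first equation
    from the rest, which yields the linear system A x = b with
    A row i = 2*(a_i - a_0) and b_i = d_0^2 - d_i^2 + |a_i|^2 - |a_0|^2."""
    (x0, y0, z0), d0 = anchors[0], dists[0]
    A, b = [], []
    for (xi, yi, zi), di in zip(anchors[1:], dists[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0), 2 * (zi - z0)])
        b.append(d0 ** 2 - di ** 2
                 + xi ** 2 - x0 ** 2 + yi ** 2 - y0 ** 2 + zi ** 2 - z0 ** 2)
    return solve3(A, b)

def solve3(A, b):
    """Cramer's rule for a 3x3 system; the anchors being non-coplanar
    guarantees the determinant is nonzero."""
    def det(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    d = det(A)
    result = []
    for j in range(3):
        m = [row[:] for row in A]      # replace column j with b
        for i in range(3):
            m[i][j] = b[i]
        result.append(det(m) / d)
    return tuple(result)
```

This also shows why the patent requires four non-coplanar elements: three distances leave a two-fold ambiguity, and coplanar anchors make the linear system singular.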
Step S603, calculating the pose data of the teaching unit 2 from the positions of all the tags 21.
In this step, once the positions of the two tags 21 are determined, the position and posture of the teaching unit 2 are determined as well.
In this way, the relative positional relationship between the robot 1 and the teaching unit 2 is determined. The controller knows the position of the robot 1 relative to the world coordinate system through the robot's own positioning and navigation system, and can therefore convert the coordinates and posture (i.e., the pose) of the teaching unit 2 into the world coordinate system.
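A small hypothetical sketch of deriving the unit's pose from the two tag positions (all names are illustrative). Strictly, two points determine the unit's location and pointing direction but not the roll angle about the tag axis, so only position and direction are returned here:

```python
import math

def unit_pose_from_tags(tag_rear, tag_tip):
    """Derive the teaching unit's tip position and pointing direction from
    the positions of its two tags, assumed to lie on the unit's axis with
    `tag_tip` nearer the tip.  Roll about that axis is left undetermined."""
    delta = [t - r for t, r in zip(tag_tip, tag_rear)]
    norm = math.sqrt(sum(c * c for c in delta))
    direction = tuple(c / norm for c in delta)  # unit vector rear -> tip
    return tag_tip, direction
```

For the teaching use case described above, the pointing direction is what maps onto the approach direction of the execution end 13, so the missing roll component is typically harmless.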
Preferably, the positioning elements are detachable from the robot 1, which is provided with mounting positions for them, so that one teaching unit 2 and one positioning unit 15 can serve several robots 1: when the teaching work for one robot 1 is finished, the positioning elements can be detached and mounted on another robot 1 to carry out teaching with the teaching unit 2.
Preferably, the robot 1 has four mounting positions: the line connecting two of them is horizontal, the line connecting the other two is vertical, and the two connecting lines are skew (non-coplanar).
As shown in fig. 4, each mounting position includes a mounting seat 141 and a contact unit 142, the contact unit 142 consisting of a plurality of elastic contacts. As shown in fig. 5, the positioning unit 15 includes a docking seat 151 and a contact unit 152; the mounting seat 141 has a convex-shaped mounting groove, and the docking seat 151 has a matching convex-shaped docking block. After the positioning unit 15 is mounted in the mounting position, the assembled state is as shown in fig. 3, with the contact units 152 and 142 in mutual contact. The contact unit 142 is connected to the controller, through which the controller collects the data of the positioning unit 15.
The present invention also provides a robot teaching apparatus, comprising:
a receiving unit configured to receive a plurality of end poses of the execution end 13 manually calibrated by a user; and
a calculating unit, configured to calculate one or more transition poses of the robot 1 according to the end poses and the working space of the robot 1, wherein all the end poses are contained in the union of the workspaces of the transition poses.
The above description covers only the preferred embodiments of the present invention. It should be noted that various modifications and adaptations apparent to those skilled in the art can be made without departing from the principles of the invention and are intended to fall within its scope.
Claims (6)
1. A method for teaching a robot, comprising:
receiving a plurality of end poses of an execution end manually calibrated by a user;
characterized in that the method further comprises:
and calculating transition poses of the robot according to the end poses and the working space of the robot, wherein the number of the transition poses is one or more, and all the end poses are contained in the union of the workspaces of the transition poses.
2. The robot teaching method according to claim 1, wherein said calculating the transition poses of the robot from the end poses and the workspace of the robot comprises:
selecting, according to the working space of the robot, a pose group from all the remaining end poses, so as to obtain a plurality of pose groups; each pose group includes the first of the remaining end poses, and all the end poses in the same pose group can be contained in the workspace simultaneously;
for each of the pose groups, a transition pose corresponding to the pose group is calculated.
3. The robot teaching method according to claim 2, wherein the receiving of the end poses of the execution end manually calibrated by the user comprises:
acquiring attitude data of a teaching unit;
and calculating the pose of the tip of the teaching unit from the attitude data of the teaching unit, to be used as the end pose.
4. The teaching method of a robot according to claim 3, wherein said acquiring attitude data of the teaching unit includes:
acquiring data of a tag on the teaching unit, collected by a positioning element;
calculating the position of the tag from the tag data at the same moment;
and calculating the attitude data of the teaching unit from the positions of all the tags.
5. A robot teaching device, comprising:
a receiving unit, configured to receive a plurality of end poses of an execution end manually calibrated by a user;
it is characterized by also comprising:
and a calculating unit, configured to calculate one or more transition poses of the robot according to the end poses and the working space of the robot, wherein all the end poses are contained in the union of the workspaces of the transition poses.
6. A robot teaching system, characterized in that it comprises:
a robot comprising a controller for implementing the teaching method of any one of claims 1-4;
and a teaching unit, used by a user to calibrate the end poses.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210142011.8A CN114454144A (en) | 2022-02-16 | 2022-02-16 | Robot teaching method, device and system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114454144A true CN114454144A (en) | 2022-05-10 |
Family
ID=81412629
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210142011.8A Withdrawn CN114454144A (en) | 2022-02-16 | 2022-02-16 | Robot teaching method, device and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114454144A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115476338A (en) * | 2022-07-08 | 2022-12-16 | 深圳市越疆科技有限公司 | Attitude adjustment method and apparatus, robot mechanism, electronic device, and storage medium |
CN115476338B (en) * | 2022-07-08 | 2024-01-26 | 深圳市越疆科技股份有限公司 | Gesture adjustment method and device, robot mechanism, electronic device and storage medium |
Legal Events

Date | Code | Title | Description
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WW01 | Invention patent application withdrawn after publication | Application publication date: 20220510 |