CN111390882A - Robot teaching control method, device and system and electronic equipment - Google Patents

Robot teaching control method, device and system and electronic equipment

Info

Publication number
CN111390882A
Authority
CN
China
Prior art keywords
coordinate system
robot
base coordinate
robot base
angle information
Prior art date
Legal status
Granted
Application number
CN202010487351.5A
Other languages
Chinese (zh)
Other versions
CN111390882B (en)
Inventor
何嘉臻
薛光坛
李一娴
田松坡
Current Assignee
Ji Hua Laboratory
Original Assignee
Ji Hua Laboratory
Priority date
Filing date
Publication date
Application filed by Ji Hua Laboratory filed Critical Ji Hua Laboratory
Priority to CN202010487351.5A priority Critical patent/CN111390882B/en
Publication of CN111390882A publication Critical patent/CN111390882A/en
Application granted granted Critical
Publication of CN111390882B publication Critical patent/CN111390882B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/0081Programme-controlled manipulators with master teach-in means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture
    • B25J9/161Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1612Programme controls characterised by the hand, wrist, grip control

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Numerical Control (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a robot teaching control method, device and system and electronic equipment. First pose data of a robot under a current robot base coordinate system are acquired; position angle information of a demonstrator under the current robot base coordinate system is acquired; the current robot base coordinate system is reconstructed according to the position angle information to obtain a reconstructed robot base coordinate system, so that the position angle of the demonstrator under the reconstructed robot base coordinate system is zero; the pose of the robot in the reconstructed robot base coordinate system is calculated according to the first pose data to obtain second pose data; three-dimensional image information of the robot is generated according to the second pose data; and the three-dimensional image information is sent to the demonstrator for display. This solves the problem that erroneous operation easily occurs because the angle of the three-dimensional image displayed on the demonstrator does not match the angle of the real robot actually seen by the operator.

Description

Robot teaching control method, device and system and electronic equipment
Technical Field
The invention relates to the technical field of industrial robots, in particular to a robot teaching control method, device and system and electronic equipment.
Background
When teaching a robot, an operator generally operates the robot while holding the teach pendant in his/her hand. The operator usually looks at the actual scene of the robot and the workpiece to decide the next operation, then performs that operation on the teach pendant, referring to the three-dimensional image of the robot displayed on the teach pendant while operating.
Normally, the base coordinate system of the robot is already set and the directions of its X, Y and Z axes are fixed; the base coordinate system of the three-dimensional image of the robot on the teach pendant interface is likewise fixed and corresponds to the base coordinate system of the robot. For example, the X1 axis of the base coordinate system of the three-dimensional image displayed on the teach pendant points perpendicularly into the interface, the Y1 axis points to the left in the interface, and the Z1 axis points upward in the interface, where the X1 axis corresponds to the X axis, the Y1 axis to the Y axis, and the Z1 axis to the Z axis. In that case, if the operator actually stands in the positive X direction of the robot base coordinate system and faces along the X axis, the angle of the real robot image the operator sees matches the angle of the three-dimensional image displayed on the teach pendant.
However, in practice the operator often stands at other positions, and then the angle of the real robot image actually seen by the operator is not identical to the angle of the three-dimensional image displayed on the teach pendant. When the operator decides the next operation by viewing the actual robot and workpiece and then operates on the teach pendant while referring to the displayed three-dimensional image, an erroneous operation is easily caused. For example, when the operator sees the front side of the real robot while the teach pendant displays the back side of the three-dimensional image, and the operator judges that the robot should next move a certain distance to the left, the teach pendant should control the three-dimensional image of the robot to move that distance to the right; in actual operation, however, the operator is likely to command the three-dimensional image to move to the left, which causes an error.
Disclosure of Invention
In view of the defects of the prior art, an object of the embodiments of the present application is to provide a robot teaching control method, apparatus, system and electronic device, which aim to solve the problem that an incorrect operation is easily caused because the angle of a three-dimensional image displayed on a teach pendant is not consistent with the angle of a robot real image actually seen by an operator.
In a first aspect, an embodiment of the present application provides a robot teaching control method, which is applied to a robot, and includes the steps of:
acquiring first pose data of the robot under a current robot base coordinate system;
acquiring position angle information of a demonstrator under a current robot base coordinate system;
reconstructing the current robot base coordinate system according to the position angle information to obtain a reconstructed robot base coordinate system, and enabling the position angle of the demonstrator under the reconstructed robot base coordinate system to be zero;
calculating the pose of the robot in the reconstructed robot base coordinate system according to the first pose data to obtain second pose data;
generating three-dimensional image information of the robot according to the second pose data;
and sending the three-dimensional image information to the demonstrator for displaying.
In some embodiments, the step of acquiring the position angle information of the teach pendant under the current robot base coordinate system comprises:
and measuring the position angle information of the demonstrator under the current robot base coordinate system by using a position angle measuring device.
In other embodiments, the step of acquiring the position angle information of the teach pendant under the current robot base coordinate system includes:
acquiring position number information;
inquiring in a preset position angle information table according to the position number information to obtain preset position angle information of the demonstrator in a preset robot base coordinate system;
and calculating the position angle information of the demonstrator under the current robot base coordinate system according to the preset position angle information.
In the robot teaching control method, the position angle information includes azimuth angle information and elevation angle information;
the step of reconstructing the current robot base coordinate system according to the position angle information to obtain a reconstructed robot base coordinate system, and making the position angle of the demonstrator under the reconstructed robot base coordinate system be zero comprises:
and reconstructing the current robot base coordinate system according to the azimuth angle information and the elevation angle information to obtain a reconstructed robot base coordinate system, so that the azimuth angle and the elevation angle of the demonstrator under the reconstructed robot base coordinate system are both zero.
Further, before the step of reconstructing the current robot base coordinate system according to the position angle information and after the step of acquiring the position angle information of the teach pendant under the current robot base coordinate system, the method further includes:
judging whether the azimuth angle of the azimuth angle information is larger than a preset azimuth angle threshold value or not;
if the azimuth angle is not larger than a preset azimuth angle threshold value, setting the azimuth angle information to zero;
judging whether the elevation angle of the elevation angle information is larger than a preset elevation angle threshold value or not;
if the elevation angle is not larger than a preset elevation angle threshold value, setting the elevation angle information to zero;
and if the azimuth angle information and the elevation angle information are both zero, not reconstructing the current robot base coordinate system.
In a second aspect, an embodiment of the present application provides a robot teaching control apparatus, including:
the first acquisition module is used for acquiring first pose data of the robot under the current robot base coordinate system;
the second acquisition module is used for acquiring the position angle information of the demonstrator under the current robot base coordinate system;
the third acquisition module is used for reconstructing the current robot base coordinate system according to the position angle information to obtain a reconstructed robot base coordinate system, so that the position angle of the demonstrator under the reconstructed robot base coordinate system is zero;
the calculation module is used for calculating the pose of the robot in the reconstructed robot base coordinate system according to the first pose data to obtain second pose data;
the generating module is used for generating three-dimensional image information of the robot according to the second pose data;
and the sending module is used for sending the three-dimensional image information to the demonstrator for displaying.
In the robot teaching control device, the position angle information includes azimuth angle information and elevation angle information; and the third acquisition module reconstructs the current robot base coordinate system according to the azimuth angle information and the elevation angle information to acquire a reconstructed robot base coordinate system, so that the azimuth angle and the elevation angle of the demonstrator under the reconstructed robot base coordinate system are both zero.
The robot teaching control device further includes:
the first judgment module is used for judging whether the azimuth angle of the azimuth angle information is larger than a preset azimuth angle threshold value or not;
the first execution module is used for setting the azimuth angle information to zero when the azimuth angle is not larger than a preset azimuth angle threshold value;
the second judgment module is used for judging whether the elevation angle of the elevation angle information is larger than a preset elevation angle threshold value or not;
the second execution module is used for setting the elevation angle information to zero when the elevation angle is not larger than a preset elevation angle threshold value;
and the third acquisition module does not reconstruct the current robot base coordinate system when the azimuth angle information and the elevation angle information are both zero.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor and a memory, where the memory stores a computer program, and the processor is configured to execute the robot teaching control method by calling the computer program stored in the memory.
In a fourth aspect, an embodiment of the present application provides a robot teaching control system, including a robot and a teach pendant that are communicatively connected;
the robot is used for acquiring first position and orientation data of the robot under a current robot base coordinate system, acquiring position angle information of a demonstrator under the current robot base coordinate system, reconstructing the current robot base coordinate system according to the position angle information, acquiring a reconstructed robot base coordinate system, enabling the position angle of the demonstrator under the reconstructed robot base coordinate system to be zero, calculating the position and orientation of the robot in the reconstructed robot base coordinate system according to the first position and orientation data, acquiring second position and orientation data, generating three-dimensional image information of the robot according to the second position and orientation data, and sending the three-dimensional image information to the demonstrator;
the teaching device is used for receiving the three-dimensional image information and displaying a three-dimensional image of the robot according to the three-dimensional image information.
Beneficial effects:
according to the robot teaching control method, the device and the system and the electronic equipment, first position and attitude data of the robot under a current robot base coordinate system are acquired; acquiring position angle information of a demonstrator under a current robot base coordinate system; reconstructing the current robot base coordinate system according to the position angle information to obtain a reconstructed robot base coordinate system, and enabling the position angle of the demonstrator under the reconstructed robot base coordinate system to be zero; calculating the pose of the robot in the reconstructed robot base coordinate system according to the first pose data to obtain second pose data; generating three-dimensional image information of the robot according to the second posture data; sending the three-dimensional image information to the demonstrator for displaying; therefore, the angle of the three-dimensional image displayed on the demonstrator is consistent with the angle of the robot real object image actually seen by the operator, and the problem that misoperation is easy to occur due to the fact that the angle of the three-dimensional image displayed on the demonstrator is inconsistent with the angle of the robot real object image actually seen by the operator is solved.
Drawings
Fig. 1 is a flowchart of a robot teaching control method provided in an embodiment of the present application.
Fig. 2 is a schematic structural diagram of a robot teaching control device according to an embodiment of the present application.
Fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Fig. 4 is a schematic structural diagram of a robot teaching control system according to an embodiment of the present application.
Fig. 5 is a schematic view of a position angle.
Fig. 6 is a schematic diagram of a process of reconstructing a current robot base coordinate system.
Fig. 7 is a schematic diagram of a process of reconstructing a current robot base coordinate system.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
In the description of the present invention, it is to be understood that the terms "center", "longitudinal", "lateral", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise", "counterclockwise", and the like, indicate orientations and positional relationships based on those shown in the drawings and are used only for convenience and simplicity of description; they do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation, and thus should not be considered as limiting the present invention. Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, features defined as "first" or "second" may explicitly or implicitly include one or more of the described features. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
The following disclosure provides embodiments or examples for implementing different configurations of the invention. To simplify the disclosure of the present invention, the components and arrangements of specific examples are described below. Of course, they are merely examples and are not intended to limit the present invention. Furthermore, the present invention may repeat reference numerals and/or letters in the various examples, such repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed. In addition, the present invention provides examples of various specific processes and materials, but one of ordinary skill in the art may recognize applications of other processes and/or uses of other materials.
Referring to fig. 1, a robot teaching control method provided in an embodiment of the present application is applied to a robot, and includes the steps of:
A1. acquiring first pose data of the robot under a current robot base coordinate system;
A2. acquiring position angle information of a demonstrator under a current robot base coordinate system;
A3. reconstructing a current robot base coordinate system according to the position angle information to obtain a reconstructed robot base coordinate system, and enabling the position angle of the demonstrator under the reconstructed robot base coordinate system to be zero;
A4. calculating the pose of the robot in the reconstructed robot base coordinate system according to the first pose data to obtain second pose data;
A5. generating three-dimensional image information of the robot according to the second pose data;
A6. and sending the three-dimensional image information to a demonstrator for displaying.
In some embodiments, step A2 of acquiring position angle information of the teach pendant under the current robot base coordinate system includes:
and measuring the position angle information of the demonstrator under the current robot base coordinate system by using a position angle measuring device.
For example, the position angle measuring device is a radar provided on the robot, and the position angle information of the demonstrator in the radar coordinate system can be measured by the radar, and then the position angle information in the radar coordinate system is converted into the position angle information in the current robot base coordinate system according to the conversion relation between the radar coordinate system and the current robot base coordinate system.
For another example, the position angle measuring device is a vision system provided on the robot; the vision system measures the position angle information of the demonstrator in the vision system coordinate system, and this is then converted into the position angle information under the current robot base coordinate system according to the conversion relationship between the vision system coordinate system and the current robot base coordinate system.
Because erroneous operation is avoided as long as the angle of the three-dimensional image displayed on the demonstrator approximately matches the angle of the real robot image actually seen by the operator, the required measurement precision is not high. The position angle information of the operator can therefore be measured and used as the position angle information of the demonstrator; since the operator is much larger than the demonstrator, this measurement is simpler and quicker.
The position angle measuring device can also be positioning modules (GPS positioning modules, BeiDou positioning modules and the like) arranged on the robot and on the demonstrator: the positioning modules provide positioning information of the robot and of the demonstrator, from which the position angle information of the demonstrator under the current robot base coordinate system is calculated.
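Whichever measuring device is used, the measurement first yields a demonstrator (or operator) position in the sensor's own frame, which must then be expressed in the current robot base coordinate system. The following is a minimal sketch of that conversion, assuming the sensor-to-base transform is already known from calibration; the function and variable names are illustrative and are not part of the patent.

```python
import numpy as np

def pendant_position_in_base(p_sensor: np.ndarray,
                             T_base_from_sensor: np.ndarray) -> np.ndarray:
    """Convert a demonstrator (or operator) position measured in a sensor frame
    (radar, vision system, or a GNSS-derived local frame) into the current robot
    base coordinate system. T_base_from_sensor is the calibrated 4x4 homogeneous
    transform from the sensor frame to the base frame (assumed known here)."""
    p_h = np.append(p_sensor, 1.0)          # homogeneous coordinates
    return (T_base_from_sensor @ p_h)[:3]   # Cartesian position in the base frame
```

The resulting Cartesian position can then be converted into the azimuth and elevation angles defined further below.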
In other embodiments, step A2 of acquiring position angle information of the teach pendant in the current robot base coordinate system includes:
acquiring position number information;
inquiring in a preset position angle information table according to the position number information to obtain preset position angle information of the demonstrator in a preset robot base coordinate system;
and calculating the position angle information of the demonstrator under the current robot base coordinate system according to the preset position angle information.
In this embodiment, a plurality of standing positions are preset around the robot, and each standing position corresponds to a position number; the operator is required to stand in one of these standing positions during teaching. For each standing position, the position angle information of that position under the preset robot base coordinate system is measured in advance and stored in a position angle information table. When the operator performs a teaching operation, the position number information of the selected standing position is sent to the robot through the demonstrator; the robot then queries the position angle information table according to the position number information to quickly obtain the preset position angle information of the demonstrator under the preset robot base coordinate system, and finally calculates the position angle information of the demonstrator under the current robot base coordinate system according to the conversion relationship between the current robot base coordinate system and the preset robot base coordinate system. In this way no position angle measuring device is needed, which reduces cost. (A sketch of this lookup is given below.)
Wherein each standing position may be, but is not limited to, a circular area, a rectangular area, an elliptical area, a sector area, etc.
A sensor can also be provided at each standing position for detecting whether an operator has entered that standing position. When an operator is detected, the sensor sends a trigger signal to the robot, and the robot queries a comparison table of sensor identification numbers and standing-position numbers according to the identification number of the sensor that sent the trigger signal, thereby obtaining the position number information. With this approach the operator does not need to enter the position number manually, and manual input errors are avoided.
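A minimal sketch of the position-number lookup described above follows. The table contents, the function names, and the assumption that the preset and current base frames share an origin and differ only by a known rotation are illustrative; in practice the relationship between the two frames would come from the robot controller.

```python
import numpy as np

# Hypothetical pre-measured table: position number -> (azimuth, elevation) of the
# standing position in the *preset* robot base frame, in radians (placeholder values).
PRESET_POSITION_ANGLE_TABLE = {
    1: (0.0, 0.0),
    2: (np.pi / 2, 0.0),
    3: (np.pi, 0.0),
}

def position_angles_from_number(position_number: int,
                                R_current_from_preset: np.ndarray) -> tuple[float, float]:
    """Look up the preset position angle for a numbered standing position and
    re-express it in the current robot base frame, assuming the two frames share
    an origin and differ by the known rotation R_current_from_preset."""
    azimuth, elevation = PRESET_POSITION_ANGLE_TABLE[position_number]
    # Unit direction toward the demonstrator, expressed in the preset frame.
    d_preset = np.array([np.cos(elevation) * np.cos(azimuth),
                         np.cos(elevation) * np.sin(azimuth),
                         np.sin(elevation)])
    d = R_current_from_preset @ d_preset    # same direction, current frame
    return (float(np.arctan2(d[1], d[0])),
            float(np.arctan2(d[2], np.hypot(d[0], d[1]))))
```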
Further, the position angle information includes azimuth magnitude information and azimuth direction information; taking fig. 5 as an example, the azimuth angle a is taken as positive in the counterclockwise direction and negative in the clockwise direction.
Further, the position angle information includes azimuth angle information and elevation angle information. The azimuth angle is the angle between the line connecting the origin of the robot base coordinate system to the horizontal projection of the object and the positive direction axis of the robot base coordinate system (the positive direction axis being one of the two horizontal principal axes of the robot base coordinate system); the elevation angle is the angle between the line connecting the object to the origin of the robot base coordinate system and the horizontal plane.
Taking fig. 5 as an example, the robot base coordinate system OXYZ includes a vertical axis Z and axes X and Y in the horizontal plane. Assuming that the X axis is the positive direction axis and the projection of the object P onto the horizontal plane is the point P′, the azimuth angle a is the angle between the line P′O and the X axis, and the elevation angle b is the angle between the line PO and the horizontal plane.
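As a minimal sketch of these definitions (assuming the counterclockwise-positive convention of fig. 5; the function name is illustrative), the two angles of a point expressed in the current robot base frame can be computed as follows.

```python
import math

def position_angles(x: float, y: float, z: float) -> tuple[float, float]:
    """Azimuth and elevation (in radians) of a point P = (x, y, z) expressed in the
    current robot base frame OXYZ: the azimuth a is measured from the +X axis to
    OP' (P' being the projection of P onto the XY plane), counterclockwise positive;
    the elevation b is measured from the horizontal plane up to OP."""
    azimuth = math.atan2(y, x)                   # angle of OP' relative to +X
    elevation = math.atan2(z, math.hypot(x, y))  # angle of OP above the XY plane
    return azimuth, elevation
```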
And A3, reconstructing the current robot base coordinate system according to the position angle information to obtain a reconstructed robot base coordinate system, wherein the step of enabling the position angle of the demonstrator under the reconstructed robot base coordinate system to be zero comprises the following steps of:
and reconstructing the current robot base coordinate system according to the azimuth angle information and the elevation angle information to obtain a reconstructed robot base coordinate system, so that the azimuth angle and the elevation angle of the demonstrator under the reconstructed robot base coordinate system are both zero.
In some embodiments, the reconstruction process is as follows: first, the current robot base coordinate system is rotated around its vertical axis according to the azimuth angle information to obtain an intermediate coordinate system, so that the azimuth angle of the demonstrator in the intermediate coordinate system is zero; then, the intermediate coordinate system is rotated around its lateral axis (the other horizontal principal axis, perpendicular to the positive direction axis) according to the elevation angle information to obtain the reconstructed robot base coordinate system, so that the elevation angle of the demonstrator in the reconstructed robot base coordinate system is zero.
Taking figs. 5-7 as an example, the OXYZ system is first rotated by the angle a around the Z axis to obtain the intermediate coordinate system OX′Y′Z, so that the azimuth angle of the demonstrator in the intermediate coordinate system OX′Y′Z is zero; the intermediate coordinate system OX′Y′Z is then rotated by the angle b around the lateral axis (namely the Y′ axis) to obtain the reconstructed robot base coordinate system OX″Y′Z″, so that the elevation angle of the demonstrator in the reconstructed robot base coordinate system is zero.
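A minimal sketch of this two-step reconstruction and of computing the second pose data is given below. The rotation-matrix convention (the columns of the returned matrix are the reconstructed axes expressed in the current base frame) and the sign of the elevation rotation are assumptions made here; they are chosen so that the demonstrator, at azimuth a and elevation b, ends up on the +X axis of the reconstructed frame, consistent with the definitions above.

```python
import numpy as np

def rot_z(t: float) -> np.ndarray:
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_y(t: float) -> np.ndarray:
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def reconstructed_base_frame(azimuth: float, elevation: float) -> np.ndarray:
    """Rotation whose columns are the reconstructed axes expressed in the current
    base frame: rotate about Z by the azimuth, then about the intermediate Y' axis
    so that the demonstrator lies on the reconstructed +X axis."""
    return rot_z(azimuth) @ rot_y(-elevation)

def second_pose_data(T_first: np.ndarray, azimuth: float, elevation: float) -> np.ndarray:
    """Re-express a 4x4 homogeneous robot pose, given in the current base frame,
    in the reconstructed base frame (same origin, rotated axes)."""
    R = reconstructed_base_frame(azimuth, elevation)
    T_change = np.eye(4)
    T_change[:3, :3] = R.T   # maps current-frame coordinates to reconstructed-frame coordinates
    return T_change @ T_first
```

As a quick check, reconstructed_base_frame(a, b).T @ p maps a demonstrator position p with azimuth a and elevation b onto the +X axis, i.e. its position angle in the reconstructed frame is zero, which is the stated requirement.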
Further, before reconstructing the current robot base coordinate system and after step A2 of acquiring the position angle information of the teach pendant under the current robot base coordinate system, the method further includes:
judging whether the azimuth angle of the azimuth angle information is larger than a preset azimuth angle threshold value or not;
if the azimuth angle is not larger than the preset azimuth angle threshold value, setting the azimuth angle information to zero;
judging whether the elevation angle of the elevation angle information is larger than a preset elevation angle threshold value or not;
if the elevation angle is not larger than the preset elevation angle threshold value, setting the elevation angle information to zero;
and if the azimuth angle information and the elevation angle information are zero, not reconstructing the current robot base coordinate system.
In actual operation, erroneous operation is avoided as long as the angle of the three-dimensional image displayed on the demonstrator approximately matches the angle of the real robot image actually seen by the operator. Therefore, if the error between the angle of the three-dimensional image displayed on the demonstrator and the angle of the real robot image actually seen by the operator is within an acceptable range, the current robot base coordinate system need not be reconstructed, which improves working efficiency. The preset azimuth angle threshold value and the preset elevation angle threshold value are set according to the actually acceptable error range.
Because the position where the operator stands may change during operation, when the change does not exceed the preset threshold value the operator's standing position may be considered unchanged; the corresponding position angle information is therefore set to zero, which avoids the loss of efficiency caused by frequently rebuilding the current robot base coordinate system.
Because the error between the angle of the three-dimensional image and the angle of the real image that is acceptable usually differs from operator to operator, corresponding azimuth angle thresholds and elevation angle thresholds can be set for different operators to form an angle threshold table; during teaching, the identity information of the operator is acquired and the angle threshold table is queried according to that identity information to obtain the corresponding azimuth angle threshold and elevation angle threshold.
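A minimal sketch of this thresholding is given below; the table contents, the operator identifiers, and the use of absolute values are assumptions made for illustration.

```python
# Hypothetical per-operator angle thresholds in radians (placeholder values);
# in practice they would be configured from the operator identity information.
ANGLE_THRESHOLDS = {
    "operator_a": {"azimuth": 0.17, "elevation": 0.17},   # roughly 10 degrees
    "default":    {"azimuth": 0.09, "elevation": 0.09},   # roughly 5 degrees
}

def apply_thresholds(azimuth: float, elevation: float,
                     operator_id: str = "default") -> tuple[float, float, bool]:
    """Zero out position angles that fall within the operator's tolerance and
    report whether the base coordinate system needs to be reconstructed at all."""
    th = ANGLE_THRESHOLDS.get(operator_id, ANGLE_THRESHOLDS["default"])
    if abs(azimuth) <= th["azimuth"]:
        azimuth = 0.0
    if abs(elevation) <= th["elevation"]:
        elevation = 0.0
    reconstruct = not (azimuth == 0.0 and elevation == 0.0)
    return azimuth, elevation, reconstruct
```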
It should be noted that an operator generally stands on the ground to operate, so the operator's position changes only in the horizontal plane; since the origin of the robot base coordinate system is set on the base of the robot, the elevation angle is small in this case and can be regarded as always zero. Only the azimuth angle then needs to be considered: the position angle information acquired in step A2 may include only azimuth angle information, and reconstructing the current robot base coordinate system requires only a rotation around the vertical axis, which is efficient.
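Putting the pieces together, the sketch below strings the previous helpers into the method of steps A1-A6. The robot and pendant objects and their methods are hypothetical stand-ins for controller and teach-pendant interfaces that the patent does not specify.

```python
def teaching_display_update(robot, pendant, operator_id: str = "default") -> None:
    """End-to-end sketch of steps A1-A6 using the helpers defined in the sketches above."""
    T_first = robot.current_pose_in_base()                  # A1: first pose data (4x4 matrix)
    p = pendant_position_in_base(pendant.measured_position(),
                                 robot.base_from_sensor_transform())
    a, b = position_angles(*p)                              # A2: position angle information
    a, b, need_reconstruct = apply_thresholds(a, b, operator_id)
    # A3 + A4: reconstruct the base frame (via the angles) and re-express the pose;
    # skipped entirely when both angles were zeroed out by the thresholds.
    T_second = second_pose_data(T_first, a, b) if need_reconstruct else T_first
    image = robot.render_three_dimensional_image(T_second)  # A5: hypothetical renderer
    pendant.display(image)                                  # A6: send to the demonstrator
```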
According to the robot teaching control method, the first pose data of the robot under the current robot base coordinate system are acquired; the position angle information of the demonstrator under the current robot base coordinate system is acquired; the current robot base coordinate system is reconstructed according to the position angle information to obtain the reconstructed robot base coordinate system, so that the position angle of the demonstrator under the reconstructed robot base coordinate system is zero; the pose of the robot in the reconstructed robot base coordinate system is calculated according to the first pose data to obtain the second pose data; the three-dimensional image information of the robot is generated according to the second pose data; and the three-dimensional image information is sent to the demonstrator for display. In this way the angle of the three-dimensional image displayed on the demonstrator is consistent with the angle of the real robot image actually seen by the operator, and the problem of erroneous operation caused by a mismatch between these two angles is solved.
Referring to fig. 2, an embodiment of the present application further provides a robot teaching control device, which includes a first obtaining module 1, a second obtaining module 2, a third obtaining module 3, a calculating module 4, a generating module 5, and a sending module 6;
the first acquisition module 1 is used for acquiring first attitude data of the robot under a current robot base coordinate system;
the second acquisition module 2 is used for acquiring position angle information of the demonstrator under the current robot base coordinate system;
the third obtaining module 3 is configured to reconstruct the current robot base coordinate system according to the position angle information, obtain a reconstructed robot base coordinate system, and make a position angle of the demonstrator under the reconstructed robot base coordinate system zero;
the calculation module 4 is used for calculating the pose of the robot in the reconstructed robot base coordinate system according to the first pose data to obtain second pose data;
the generating module 5 is used for generating three-dimensional image information of the robot according to the second position and posture data;
the sending module 6 is used for sending the three-dimensional image information to the demonstrator for displaying.
In some embodiments, the position angle information includes azimuth angle information and elevation angle information; the third acquisition module 3 reconstructs the current robot base coordinate system according to the azimuth angle information and the elevation angle information to acquire the reconstructed robot base coordinate system, so that the azimuth angle and the elevation angle of the demonstrator under the reconstructed robot base coordinate system are both zero.
In some embodiments, the robot teaching control device further includes a first determining module, a first executing module, a second determining module, and a second executing module;
the first judgment module is used for judging whether the azimuth angle of the azimuth angle information is larger than a preset azimuth angle threshold value;
the first execution module is used for setting the azimuth angle information to zero when the azimuth angle is not larger than a preset azimuth angle threshold value;
the second judging module is used for judging whether the elevation angle of the elevation angle information is larger than a preset elevation angle threshold value or not;
the second execution module is used for setting the elevation angle information to zero when the elevation angle is not larger than a preset elevation angle threshold value;
and the third acquisition module does not reconstruct the current robot base coordinate system when the azimuth angle information and the elevation angle information are both zero.
Therefore, the robot teaching control device acquires the first pose data of the robot under the current robot base coordinate system; acquires the position angle information of the demonstrator under the current robot base coordinate system; reconstructs the current robot base coordinate system according to the position angle information to obtain the reconstructed robot base coordinate system, so that the position angle of the demonstrator under the reconstructed robot base coordinate system is zero; calculates the pose of the robot in the reconstructed robot base coordinate system according to the first pose data to obtain the second pose data; generates the three-dimensional image information of the robot according to the second pose data; and sends the three-dimensional image information to the demonstrator for display. In this way the angle of the three-dimensional image displayed on the demonstrator is consistent with the angle of the real robot image actually seen by the operator, and the problem of erroneous operation caused by a mismatch between these two angles is solved.
Referring to fig. 3, an embodiment of the present application further provides an electronic device 100, which includes a processor 101 and a memory 102, where the memory 102 stores a computer program, and the processor 101 is configured to execute the robot teaching control method by calling the computer program stored in the memory 102.
The processor 101 is electrically connected to the memory 102. The processor 101 is a control center of the electronic device 100, connects various parts of the entire electronic device using various interfaces and lines, and performs various functions of the electronic device and processes data by running or calling a computer program stored in the memory 102 and calling data stored in the memory 102, thereby performing overall monitoring of the electronic device.
The memory 102 may be used to store computer programs and data. The memory 102 stores computer programs containing instructions executable in the processor. The computer program may constitute various functional modules. The processor 101 executes various functional applications and data processing by calling a computer program stored in the memory 102.
In this embodiment, the processor 101 in the electronic device 100 loads instructions corresponding to one or more processes of the computer program into the memory 102, and the processor 101 runs the computer program stored in the memory 102 according to the following steps, so as to implement various functions: acquiring first pose data of the robot under a current robot base coordinate system; acquiring position angle information of a demonstrator under a current robot base coordinate system; reconstructing the current robot base coordinate system according to the position angle information to obtain a reconstructed robot base coordinate system, and enabling the position angle of the demonstrator under the reconstructed robot base coordinate system to be zero; calculating the pose of the robot in the reconstructed robot base coordinate system according to the first pose data to obtain second pose data; generating three-dimensional image information of the robot according to the second pose data; and sending the three-dimensional image information to the demonstrator for displaying.
According to the above, the electronic device acquires the first pose data of the robot under the current robot base coordinate system; acquires the position angle information of the demonstrator under the current robot base coordinate system; reconstructs the current robot base coordinate system according to the position angle information to obtain the reconstructed robot base coordinate system, so that the position angle of the demonstrator under the reconstructed robot base coordinate system is zero; calculates the pose of the robot in the reconstructed robot base coordinate system according to the first pose data to obtain the second pose data; generates the three-dimensional image information of the robot according to the second pose data; and sends the three-dimensional image information to the demonstrator for display. In this way the angle of the three-dimensional image displayed on the demonstrator is consistent with the angle of the real robot image actually seen by the operator, and the problem of erroneous operation caused by a mismatch between these two angles is solved.
Referring to fig. 4, an embodiment of the present application further provides a robot teaching control system, which includes a robot 200 and a teach pendant 300 that are communicatively connected;
the robot 200 is configured to acquire first pose data of the robot under a current robot base coordinate system, acquire position angle information of the demonstrator under the current robot base coordinate system, reconstruct the current robot base coordinate system according to the position angle information, acquire a reconstructed robot base coordinate system, make a position angle of the demonstrator under the reconstructed robot base coordinate system zero, calculate a pose of the robot in the reconstructed robot base coordinate system according to the first pose data, acquire second pose data, generate three-dimensional image information of the robot according to the second pose data, and send the three-dimensional image information to the demonstrator 300;
the teach pendant 300 is configured to receive the three-dimensional image information and display a three-dimensional image of the robot according to the three-dimensional image information.
In some embodiments, the position angle information includes azimuth angle information and elevation angle information;
when the robot 200 reconstructs the current robot base coordinate system according to the position angle information and obtains the reconstructed robot base coordinate system, the current robot base coordinate system is reconstructed according to the azimuth angle information and the elevation angle information to obtain the reconstructed robot base coordinate system, so that the azimuth angle and the elevation angle of the demonstrator under the reconstructed robot base coordinate system are both zero.
In some embodiments, before reconstructing the current robot base coordinate system, the robot 200 first determines whether the azimuth angle of the azimuth angle information is greater than a preset azimuth angle threshold, and sets the azimuth angle information to zero if the azimuth angle is not greater than the preset azimuth angle threshold; and judging whether the elevation angle of the elevation angle information is larger than a preset elevation angle threshold value or not, and setting the elevation angle information to be zero if the elevation angle is not larger than the preset elevation angle threshold value.
According to the robot teaching control system, the first pose data of the robot under the current robot base coordinate system are acquired; the position angle information of the demonstrator under the current robot base coordinate system is acquired; the current robot base coordinate system is reconstructed according to the position angle information to obtain the reconstructed robot base coordinate system, so that the position angle of the demonstrator under the reconstructed robot base coordinate system is zero; the pose of the robot in the reconstructed robot base coordinate system is calculated according to the first pose data to obtain the second pose data; the three-dimensional image information of the robot is generated according to the second pose data; and the three-dimensional image information is sent to the demonstrator for display. In this way the angle of the three-dimensional image displayed on the demonstrator is consistent with the angle of the real robot image actually seen by the operator, and the problem of erroneous operation caused by a mismatch between these two angles is solved.
In summary, although the present invention has been described with reference to the preferred embodiments, the above-described preferred embodiments are not intended to limit the present invention; those skilled in the art can make various changes and modifications without departing from the spirit and scope of the present invention, and such equivalent changes and modifications likewise fall within the scope of the present invention.

Claims (10)

1. A robot teaching control method is applied to a robot and is characterized by comprising the following steps:
acquiring first pose data of the robot under a current robot base coordinate system;
acquiring position angle information of a demonstrator under a current robot base coordinate system;
reconstructing the current robot base coordinate system according to the position angle information to obtain a reconstructed robot base coordinate system, and enabling the position angle of the demonstrator under the reconstructed robot base coordinate system to be zero;
calculating the pose of the robot in the reconstructed robot base coordinate system according to the first pose data to obtain second pose data;
generating three-dimensional image information of the robot according to the second pose data;
and sending the three-dimensional image information to the demonstrator for displaying.
2. The robot teaching control method according to claim 1, wherein the step of acquiring positional angle information of the teach pendant in a current robot base coordinate system includes:
and measuring the position angle information of the demonstrator under the current robot base coordinate system by using a position angle measuring device.
3. The robot teaching control method according to claim 1, wherein the step of acquiring positional angle information of the teach pendant in a current robot base coordinate system includes:
acquiring position number information;
inquiring in a preset position angle information table according to the position number information to obtain preset position angle information of the demonstrator in a preset robot base coordinate system;
and calculating the position angle information of the demonstrator under the current robot base coordinate system according to the preset position angle information.
4. The robot teaching control method according to claim 1, wherein the positional angle information includes azimuth angle information and elevation angle information;
the step of reconstructing the current robot base coordinate system according to the position angle information to obtain a reconstructed robot base coordinate system, and making the position angle of the demonstrator under the reconstructed robot base coordinate system be zero comprises:
and reconstructing the current robot base coordinate system according to the azimuth angle information and the elevation angle information to obtain a reconstructed robot base coordinate system, so that the azimuth angle and the elevation angle of the demonstrator under the reconstructed robot base coordinate system are both zero.
5. The robot teaching control method according to claim 4, wherein after the step of acquiring the position angle information of the teach pendant in the current robot base coordinate system, the method further comprises:
judging whether the azimuth angle of the azimuth angle information is larger than a preset azimuth angle threshold value or not;
if the azimuth angle is not larger than a preset azimuth angle threshold value, setting the azimuth angle information to zero;
judging whether the elevation angle of the elevation angle information is larger than a preset elevation angle threshold value or not;
if the elevation angle is not larger than a preset elevation angle threshold value, setting the elevation angle information to zero;
and if the azimuth angle information and the elevation angle information are both zero, not reconstructing the current robot base coordinate system.
6. A robot teaching control device, comprising:
the first acquisition module is used for acquiring first pose data of the robot under the current robot base coordinate system;
the second acquisition module is used for acquiring the position angle information of the demonstrator under the current robot base coordinate system;
the third acquisition module is used for reconstructing the current robot base coordinate system according to the position angle information to obtain a reconstructed robot base coordinate system, so that the position angle of the demonstrator under the reconstructed robot base coordinate system is zero;
the calculation module is used for calculating the pose of the robot in the reconstructed robot base coordinate system according to the first pose data to obtain second pose data;
the generating module is used for generating three-dimensional image information of the robot according to the second pose data;
and the sending module is used for sending the three-dimensional image information to the demonstrator for displaying.
7. The robot teaching control device according to claim 6, wherein the position angle information includes azimuth angle information and elevation angle information; and the third acquisition module reconstructs the current robot base coordinate system according to the azimuth angle information and the elevation angle information to acquire a reconstructed robot base coordinate system, so that the azimuth angle and the elevation angle of the demonstrator under the reconstructed robot base coordinate system are both zero.
8. The robot teaching control device according to claim 7, further comprising:
the first judgment module is used for judging whether the azimuth angle of the azimuth angle information is larger than a preset azimuth angle threshold value or not;
the first execution module is used for setting the azimuth angle information to zero when the azimuth angle is not larger than a preset azimuth angle threshold value;
the second judgment module is used for judging whether the elevation angle of the elevation angle information is larger than a preset elevation angle threshold value or not;
the second execution module is used for setting the elevation angle information to zero when the elevation angle is not larger than a preset elevation angle threshold value;
and the third acquisition module does not reconstruct the current robot base coordinate system when the azimuth angle information and the elevation angle information are both zero.
9. An electronic device comprising a processor and a memory, the memory having stored therein a computer program, the processor being configured to execute the robot teaching control method according to any one of claims 1 to 5 by calling the computer program stored in the memory.
10. A robot teaching control system is characterized by comprising a robot and a teaching device which are in communication connection;
the robot is used for acquiring first position and orientation data of the robot under a current robot base coordinate system, acquiring position angle information of a demonstrator under the current robot base coordinate system, reconstructing the current robot base coordinate system according to the position angle information, acquiring a reconstructed robot base coordinate system, enabling the position angle of the demonstrator under the reconstructed robot base coordinate system to be zero, calculating the position and orientation of the robot in the reconstructed robot base coordinate system according to the first position and orientation data, acquiring second position and orientation data, generating three-dimensional image information of the robot according to the second position and orientation data, and sending the three-dimensional image information to the demonstrator;
the teaching device is used for receiving the three-dimensional image information and displaying a three-dimensional image of the robot according to the three-dimensional image information.
CN202010487351.5A 2020-06-02 2020-06-02 Robot teaching control method, device and system and electronic equipment Active CN111390882B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010487351.5A CN111390882B (en) 2020-06-02 2020-06-02 Robot teaching control method, device and system and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010487351.5A CN111390882B (en) 2020-06-02 2020-06-02 Robot teaching control method, device and system and electronic equipment

Publications (2)

Publication Number Publication Date
CN111390882A 2020-07-10
CN111390882B 2020-08-18

Family

ID=71418580

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010487351.5A Active CN111390882B (en) 2020-06-02 2020-06-02 Robot teaching control method, device and system and electronic equipment

Country Status (1)

Country Link
CN (1) CN111390882B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120268369A1 (en) * 2011-04-19 2012-10-25 Microsoft Corporation Depth Camera-Based Relative Gesture Detection
CN103279206A (en) * 2013-06-15 2013-09-04 苏州时运机器人有限公司 Robot control system with gesture-sensing teaching machine
CN105528789A (en) * 2015-12-08 2016-04-27 深圳市恒科通多维视觉有限公司 Robot vision positioning method and device, and visual calibration method and device
CN106783712A (en) * 2015-11-24 2017-05-31 沈阳新松机器人自动化股份有限公司 The method that dynamic wafer centre deviation position is verified in AWC systems
JP2019014011A (en) * 2017-07-06 2019-01-31 株式会社不二越 Method of correcting teaching position of robot
CN110006361A (en) * 2019-03-12 2019-07-12 精诚工科汽车系统有限公司 Part automated detection method and system based on industrial robot
JP2019119005A (en) * 2018-01-05 2019-07-22 株式会社Fdkエンジニアリング Calibration method of component assembly device

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120268369A1 (en) * 2011-04-19 2012-10-25 Microsoft Corporation Depth Camera-Based Relative Gesture Detection
CN103279206A (en) * 2013-06-15 2013-09-04 苏州时运机器人有限公司 Robot control system with gesture-sensing teaching machine
CN106783712A (en) * 2015-11-24 2017-05-31 沈阳新松机器人自动化股份有限公司 The method that dynamic wafer centre deviation position is verified in AWC systems
CN105528789A (en) * 2015-12-08 2016-04-27 深圳市恒科通多维视觉有限公司 Robot vision positioning method and device, and visual calibration method and device
JP2019014011A (en) * 2017-07-06 2019-01-31 株式会社不二越 Method of correcting teaching position of robot
JP2019119005A (en) * 2018-01-05 2019-07-22 株式会社Fdkエンジニアリング Calibration method of component assembly device
CN110006361A (en) * 2019-03-12 2019-07-12 精诚工科汽车系统有限公司 Part automated detection method and system based on industrial robot

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022100648A1 (en) * 2020-11-13 2022-05-19 苏州艾利特机器人有限公司 Industrial robot system, teaching method and storage medium

Also Published As

Publication number Publication date
CN111390882B (en) 2020-08-18

Similar Documents

Publication Publication Date Title
US11911914B2 (en) System and method for automatic hand-eye calibration of vision system for robot motion
US9199379B2 (en) Robot system display device
CN107972070B (en) Method and system for testing performance of robot and computer readable storage medium
JP4021413B2 (en) Measuring device
US10618166B2 (en) Teaching position correction device and teaching position correction method
US9519736B2 (en) Data generation device for vision sensor and detection simulation system
US20150261899A1 (en) Robot simulation system which simulates takeout process of workpieces
CN108527360B (en) Position calibration system and method
US11192249B2 (en) Simulation device for robot
CN111256732B (en) Target attitude error measurement method for underwater binocular vision
JP2012240174A (en) Calibration device and calibration method
CN111390882B (en) Robot teaching control method, device and system and electronic equipment
CN112603542B (en) Hand-eye calibration method and device, electronic equipment and storage medium
JPH09128549A (en) Relative position attitude detecting method for robot system
CN113840695B (en) Calibration inspection assembly, robot system, inspection method and calibration method
US20240316756A1 Teaching device (18/270,398)
JPH07237158A (en) Position-attitude detecting method and device thereof and flexible production system
CN113240745A (en) Point cloud data calibration method and device, computer equipment and storage medium
KR102174035B1 (en) Object inspection method using an augmented-reality
CN115200475B (en) Rapid correction method for arm-mounted multi-vision sensor
CN114918723B (en) Workpiece positioning control system and method based on surface detection
US20230249341A1 (en) Robot teaching method and robot working method
CN115327480A (en) Sound source positioning method and system
CN114485386B (en) Workpiece coordinate system calibration method, device and system
EP3627099B1 (en) A method of calibrating an apparatus for pointing spatial coordinates as well as a corresponding apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant