CN114670212B - IMU and vision-based robot guiding handle and use method thereof - Google Patents


Info

Publication number
CN114670212B
Authority
CN
China
Prior art keywords
handle
robot
linear
guiding
imu
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210444260.2A
Other languages
Chinese (zh)
Other versions
CN114670212A (en)
Inventor
孟祥敦
谭彬彬
谷阳正
高增桂
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nantong Xinlan Robot Technology Co ltd
Original Assignee
Nantong Xinlan Robot Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nantong Xinlan Robot Technology Co ltd filed Critical Nantong Xinlan Robot Technology Co ltd
Priority to CN202210444260.2A priority Critical patent/CN114670212B/en
Publication of CN114670212A publication Critical patent/CN114670212A/en
Application granted granted Critical
Publication of CN114670212B publication Critical patent/CN114670212B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/0081 Programme-controlled manipulators with master teach-in means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 Controls for manipulators
    • B25J13/02 Hand grip control means
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The application relates to an IMU- and vision-based robot guiding handle, which comprises a handle; the front end of the handle is provided with a planar datum and the bottom of the handle is provided with a linear datum. The linear datum is used to realize robot guiding in linear application scenes, and the planar datum is used to realize robot guiding in planar application scenes; linear application scenes cover deburring, chamfering and gluing, while planar application scenes cover grinding and polishing. The front part of the handle carries a 3D vision camera with a downward field of view; the 3D vision camera has an environment sensing and tracking function, and an IMU (inertial measurement unit) capable of detecting 6-degree-of-freedom motion states is integrated in it. In this IMU- and vision-based robot guiding handle, the IMU cooperates with a vision camera that has environment sensing and tracking capabilities, so the guiding handle and the robot can be separated and operated remotely. The device also provides trajectory following, trajectory recording and trajectory reproduction functions, and adapts better to complex curved surfaces and arc-shaped working conditions.

Description

IMU and vision-based robot guiding handle and use method thereof
Technical Field
The invention belongs to the technical field of robot automation application, and particularly relates to a robot guiding handle based on an IMU and vision and a use method thereof.
Background
With the spread of robots in industrial production, the demands on robot operability and safety keep rising. In most traditional robot applications, debugging engineers commission the robot on site to meet production requirements, which is inefficient. Later, auxiliary equipment such as force sensors mounted at the robot end made drag teaching possible; compared with teaching trajectories through a teach pendant, this improves efficiency to some extent. However, both application modes require the operator to work beside the robot, and drag teaching in particular requires zero-distance contact with the robot equipment, which increases the safety risk of robot operation to a certain extent. Meanwhile, in both modes the teaching work is completed by inserting point information and selecting different interpolation modes, which has clear limitations for complex curved surfaces and arc-shaped workpieces and cannot guarantee the surface consistency of the produced parts.
Disclosure of Invention
The invention aims to provide an IMU-based robot guiding handle that combines vision, sensors and a gyroscope to sense and identify the surrounding environment, realizing functions such as remote operation, operation in complex environments, rapid trajectory guiding, trajectory recording and trajectory reproduction for a robot, and a use method thereof.
In order to solve the above technical problems, the invention discloses an IMU- and vision-based robot guiding handle, which comprises a handle; a plurality of buttons are arranged on the upper side of the handle, a planar datum is arranged at the front end of the handle, a linear datum is arranged at the bottom of the handle, and the planar datum is perpendicular to the linear datum; the linear datum is used to realize robot guiding in linear application scenes, and the planar datum is used to realize robot guiding in planar application scenes; linear application scenes at least cover deburring, chamfering and gluing processes, and planar application scenes at least cover grinding and polishing processes;
the buttons at least comprise a button A, a button B, a button C and a button D, and the 4 buttons are respectively used for controlling functions of robot Movl point position record, movj point position record, full track record and program issuing;
the front part of the handle is provided with a 3D vision camera with a downward field of view, the 3D vision camera has an environment sensing and tracking function, and an inertial measurement unit capable of detecting 6-degree-of-freedom motion states is integrated in the 3D vision camera;
an integrated circuit board is arranged in the handle, and a first communication cable connected with the 3D vision camera and a second communication cable connected with the integrated circuit board are arranged on the side face of the handle.
Preferably, the planar datum and the linear datum are both detachably connected to the handle.
Preferably, the plane-type reference and the line-type reference are connected by a connector of the same type so that positions of the plane-type reference and the line-type reference are interchangeable.
Preferably, the front end of the handle is provided with a cover for protecting the camera, and the cover is provided with sealing shockproof cotton.
Preferably, gyroscopes and various sensors for image stabilization are integrated into the inertial measurement unit to detect rotation and translation along the three axes, as well as pitch, yaw and roll states.
Preferably, a wireless communication module for realizing remote guidance of the robot is integrated on the integrated circuit board.
Preferably, the lower end of the linear datum is tapered.
Preferably, the lower end of the linear datum is provided with a groove, and a ball is arranged in the groove.
Preferably, the upper part of the groove communicates with an accommodating groove, and a color indicator is arranged in the accommodating groove; an injection hole communicating with the accommodating groove is formed in the lower side wall of the linear datum; the ball is used to draw a running track on the reference datum plane.
The method of using the IMU- and vision-based robot guiding handle comprises: obtaining the pose of the current datum in the world coordinate system through the 3D vision camera, resolving the calibrated relative pose relationship between the robot and the guiding handle, and converting the trajectory of the guiding handle into the trajectory of the robot end tool, thereby realizing the trajectory reproduction function;
in the trajectory guiding process, the operator moves the linear datum or the planar datum along the reference datum plane, selecting the linear datum for linear application scenes and the planar datum for planar application scenes; according to the process requirements, the operator presses button A or button B to record point information and the interpolation mode, and after the continuous point recording is finished, presses button D to send the recorded program information to the robot, completing the trajectory guiding and teaching work of the robot;
for continuous complex curved-surface working conditions, the operator can select the linear datum as the reference, press button C to start the full-trajectory-following recording function, move the tip of the linear datum uniformly along the curved surface, press button C again after trajectory guiding is completed to stop the full-trajectory-following recording function, and press button D to send the recorded program information to the robot, completing the guiding and teaching work of the robot under complex curved-surface working conditions.
Compared with the traditional approach of teaching the robot through a teach pendant, or the emerging approach of generating robot programs by drag teaching with a force sensor, the invention uses an inertial measurement unit (IMU) combining multiple sensors and a gyroscope and, together with a vision camera that has environment sensing and tracking capabilities, realizes decoupled operation of the guiding handle and the robot through Ethernet communication. Meanwhile, the guiding handle provides trajectory following, trajectory recording and trajectory reproduction functions and adapts better to complex curved surfaces and arc-shaped working conditions. In addition, the ball design at the end of the linear datum effectively protects the reference datum plane.
Drawings
FIG. 1 is a front view of a robotic guiding handle based on IMU and vision;
FIG. 2 is a left side view of the IMU and vision based robotic guiding handle of FIG. 1;
FIG. 3 is a partial cross-sectional view of the IMU and vision based robotic guiding handle of FIG. 1;
FIG. 4 is an internal block diagram of the IMU and vision based robotic guiding handle of FIG. 1;
fig. 5 is a schematic diagram of an end structure of another linear datum.
The reference numerals in the drawings are: 1. handle; 2. button A; 3. button B; 4. button C; 5. button D; 6. cover; 7. first communication cable; 8. second communication cable; 9. double-hole flat-head bolt; 10. linear datum; 11. planar datum; 12. group A studs; 13. sealing shockproof cotton; 14. group B studs; 15. hexagonal stud; 16. integrated circuit board; 17. lock nut; 18. fixing block A; 19. bolt; 20. fixing block B; 21. 3D vision camera; 22. groove; 23. ball; 24. accommodating groove; 25. color indicator; 26. injection hole.
Detailed Description
The present invention is described in further detail below by way of examples to enable those skilled in the art to practice the same by reference to the specification.
It will be understood that terms, such as "having," "including," and "comprising," as used herein, do not preclude the presence or addition of one or more other elements or groups thereof.
Example 1
As shown in fig. 1-2, the IMU- and vision-based robot guiding handle comprises a handle 1; a plurality of buttons are arranged on the upper side of the handle 1, a planar datum 11 is arranged at the front end of the handle 1, a linear datum 10 is arranged at the bottom of the handle 1, and the planar datum 11 is perpendicular to the linear datum 10; the linear datum 10 is used to realize robot guiding in linear application scenes, and the planar datum 11 is used to realize robot guiding in planar application scenes; linear application scenes at least cover deburring, chamfering and gluing processes, and planar application scenes at least cover grinding and polishing processes;
the buttons at least comprise a button A2, a button B3, a button C4 and a button D5, and the 4 buttons are respectively used for controlling the functions of robot Movl point position record, movj point position record, full track record and program issuing; therefore, the track guidance of the entity robot can be simply and rapidly realized, and the real-time and high-efficiency performance can be realized.
The front part of the handle 1 is provided with a 3D vision camera 21 with a downward field of view; the 3D vision camera 21 has an environment sensing and tracking function, and an inertial measurement unit (IMU) capable of detecting 6-degree-of-freedom motion states is integrated in it;
as shown in fig. 4, an integrated circuit board 16 is disposed in the handle 1, and a first communication cable 7 connected to a 3D vision camera 21 and a second communication cable 8 connected to the integrated circuit board 16 are disposed on the side of the handle 1.
The surface type datum 11 and the linear datum 10 are detachably connected with the handle 1.
As shown in fig. 3, the planar datum 11 and the linear datum 10 are connected by connecting pieces of the same type, so the positions of the planar datum 11 and the linear datum 10 can be interchanged and the user can choose a suitable installation form according to the actual application. In this embodiment, the cover 6 is fixedly connected to the handle 1 through double-hole flat-head bolts 9; the linear datum 10 and the planar datum 11 are fixedly connected to the handle 1 through group A studs 12 and group B studs 14 respectively; the sealing shockproof cotton 13 is adhered to the cover 6 and is used for sealing the guiding handle and cushioning the 3D vision camera 21; the hexagonal stud 15 is strongly adhered to the protruding part on the inner side of the handle 1 and is not detachable, and the PCB integrated circuit board 16 is mounted on it and fixed to the hexagonal stud 15 through lock nuts 17; the 3D vision camera 21 is pressed and fixed to the inner side of the handle 1 by fixing block A 18 and fixing block B 20 through bolts 19.
The front end of the handle 1 is provided with a cover 6 for protecting the camera, and the cover 6 is provided with a sealing shockproof cotton 13.
The inertial measurement unit (IMU) integrates gyroscopes and various sensors for image stabilization, so as to detect rotation and translation along the three axes as well as pitch, yaw and roll states.
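The patent does not state which fusion algorithm the IMU uses to derive pitch and roll; a complementary filter is one common choice, and the minimal Python sketch below illustrates that approach under that assumption (the function name and the alpha blending factor are illustrative only).

import math

def complementary_filter(pitch, roll, gyro, accel, dt, alpha=0.98):
    """Fuse gyroscope and accelerometer readings into pitch/roll estimates.

    gyro  -- (gx, gy, gz) angular rates in rad/s
    accel -- (ax, ay, az) accelerations in m/s^2
    dt    -- sample interval in seconds
    """
    gx, gy, _ = gyro
    ax, ay, az = accel

    # Integrate gyro rates for a responsive short-term estimate (drifts over time).
    pitch_gyro = pitch + gy * dt
    roll_gyro = roll + gx * dt

    # Derive a drift-free but noisy estimate from the gravity direction.
    pitch_acc = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll_acc = math.atan2(ay, az)

    # Blend the two estimates.
    pitch = alpha * pitch_gyro + (1 - alpha) * pitch_acc
    roll = alpha * roll_gyro + (1 - alpha) * roll_acc
    return pitch, roll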
A wireless communication module for realizing remote guidance of the robot is integrated on the integrated circuit board 16. In this embodiment, the guiding handle connects to and communicates with the robot through Ethernet, can communicate with robots of multiple brands, and can realize functions such as robot trajectory guiding, recording and reproduction; it makes remote operation possible, can be used in fields such as operation in hazardous environments, and supports the move toward replacing manual operation with robots. The integrated circuit board design integrates the reception and processing of switching-value, analog-value and other data, which is finally output to the terminal equipment via the RS422 transmission protocol.
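As a minimal sketch of the program-issuing path, the Python snippet below assumes the handle pushes a recorded program to the robot controller over a plain TCP connection; the JSON payload, port number 9000 and length-prefixed framing are assumptions for illustration, since the patent only states that Ethernet communication is used.

import json
import socket

def send_program_to_robot(program: dict, robot_ip: str, port: int = 9000) -> None:
    """Push a recorded program to the robot controller over Ethernet (assumed framing)."""
    payload = json.dumps(program).encode("utf-8")
    with socket.create_connection((robot_ip, port), timeout=2.0) as sock:
        sock.sendall(len(payload).to_bytes(4, "big"))  # simple length-prefixed framing
        sock.sendall(payload)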
The lower end of the linear datum 10 is machined into a tapered point.
The method of using the IMU- and vision-based robot guiding handle comprises: obtaining the pose of the current datum in the world coordinate system through the 3D vision camera 21, resolving the calibrated relative pose relationship between the robot and the guiding handle, and converting the trajectory of the guiding handle into the trajectory of the robot end tool, thereby realizing the trajectory reproduction function;
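A minimal sketch of that conversion is given below, assuming the calibration is expressed as 4x4 homogeneous transforms; the function and parameter names are hypothetical, as the patent does not specify how the calibration data is represented.

import numpy as np

def handle_pose_to_tool_pose(T_world_handle: np.ndarray,
                             T_world_base: np.ndarray,
                             T_handle_tip: np.ndarray) -> np.ndarray:
    """Convert a handle pose measured by the 3D camera (world frame) into a robot
    end-tool target pose in the robot base frame. All inputs are 4x4 homogeneous matrices.

    T_world_handle -- handle pose in the world frame (from the 3D vision camera / IMU)
    T_world_base   -- robot base pose in the world frame (from calibration)
    T_handle_tip   -- fixed offset from the handle frame to the datum tip (from calibration)
    """
    # Pose of the datum tip in the world frame.
    T_world_tip = T_world_handle @ T_handle_tip
    # Express the tip pose in the robot base frame; the robot tool is driven to this pose.
    T_base_tool = np.linalg.inv(T_world_base) @ T_world_tip
    return T_base_tool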
in the trajectory guiding process, the operator moves the linear datum 10 or the planar datum 11 along the reference datum plane, selecting the linear datum 10 for linear application scenes and the planar datum 11 for planar application scenes; according to the process requirements, the operator presses button A 2 or button B 3 to record point information and the interpolation mode, and after the continuous point recording is finished, presses button D 5 to send the recorded program information to the robot, completing the trajectory guiding and teaching work of the robot;
for continuous complex curved-surface working conditions, the operator can select the linear datum 10 as the reference, press button C 4 to start the full-trajectory-following recording function, move the sharp tip of the linear datum 10 uniformly along the curved surface, press button C 4 again after trajectory guiding is completed to stop the full-trajectory-following recording function, and press button D 5 to send the recorded program information to the robot, completing the guiding and teaching work of the robot under complex curved-surface working conditions.
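A minimal Python sketch of such a full-trajectory recording loop is shown below, assuming hypothetical helpers get_tip_pose() and is_recording() that expose the camera/IMU pose stream and the button C toggle; the sampling period and spacing threshold are illustrative values, not taken from the patent.

import time
import numpy as np

def record_trajectory(get_tip_pose, is_recording, period=0.02, min_step=0.5):
    """Record the tip path while full-trajectory following is active.

    get_tip_pose() -- returns the current tip position (x, y, z) in mm
    is_recording() -- reflects the button C toggle
    Poses closer than min_step mm to the last stored one are skipped so the
    recorded path stays roughly evenly spaced.
    """
    trajectory = []
    while is_recording():
        p = np.asarray(get_tip_pose(), dtype=float)
        if not trajectory or np.linalg.norm(p - trajectory[-1]) >= min_step:
            trajectory.append(p)
        time.sleep(period)  # fixed sampling period
    return trajectory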
Example 2
Similar to embodiment 1, the difference is that the lower end of the linear datum 10 is provided with a groove 22 in which a ball 23 is mounted. Compared with the tapered (sharp) lower end of the linear datum in embodiment 1, this ballpoint-pen-tip-like structure is less likely to scratch the reference surface, slides more smoothly and produces less harsh noise.
Example 3
As shown in fig. 5, similar to embodiment 2, the difference is that the upper part of the groove communicates with an accommodating groove 24, and a color indicator 25 is arranged in the accommodating groove; an injection hole 26 communicating with the accommodating groove is formed in the lower side wall of the linear datum 10; the injection hole is used to inject a small amount of color indicator into the accommodating groove, and the ball draws a running track on the reference datum plane. The color indicator can be an ink matched to the workpiece material; as the ball passes over the reference datum plane it leaves a track line, which facilitates subsequent data adjustment, calibration, repeated testing and the like.
Although embodiments of the present invention have been disclosed above, the invention is not limited to the details and embodiments shown; it is well suited to the various fields of use for which it is appropriate, and further modifications may readily be made by those skilled in the art without departing from the general concept defined by the claims and their equivalents; the invention is therefore not limited to the specific details and examples shown and described herein.

Claims (8)

1. The IMU- and vision-based robot guiding handle is characterized by comprising a handle (1), wherein a plurality of buttons are arranged on the upper side of the handle (1), a planar datum (11) is arranged at the front end of the handle (1), a linear datum (10) is arranged at the bottom of the handle (1), and the planar datum (11) is perpendicular to the linear datum (10); the linear datum (10) is used to realize robot guiding in linear application scenes, and the planar datum (11) is used to realize robot guiding in planar application scenes; linear application scenes at least cover deburring, chamfering and gluing processes, and planar application scenes at least cover grinding and polishing processes;
the buttons at least comprise a button A (2), a button B (3), a button C (4) and a button D (5), and the 4 buttons are respectively used for controlling functions of robot Movl point position record, movj point position record, full track record and program issuing;
a 3D vision camera (21) with a downward field of view is arranged at the front part of the handle (1), the 3D vision camera (21) has an environment sensing and tracking function, and an inertial measurement unit (IMU) capable of detecting 6-degree-of-freedom motion states is integrated in the 3D vision camera;
an integrated circuit board (16) is arranged in the handle (1), and a first communication cable (7) connected with the 3D vision camera (21) and a second communication cable (8) connected with the integrated circuit board (16) are arranged on the side surface of the handle (1);
the lower end of the linear datum (10) is provided with a groove (22), and a ball (23) is arranged in the groove;
the upper part of the groove is communicated with a containing groove (24), and a color-containing indicator (25) is arranged in the containing groove; an injection hole (26) communicated with the accommodating groove is formed in the side wall of the lower part of the linear standard (10); the ball is used for drawing a running track on a reference datum plane.
2. The IMU- and vision-based robot guiding handle according to claim 1, wherein the planar datum (11) and the linear datum (10) are both detachably connected to the handle (1).
3. The IMU- and vision-based robot guiding handle according to claim 2, wherein the planar datum (11) and the linear datum (10) are connected by connecting pieces of the same type, such that the positions of the planar datum (11) and the linear datum (10) are interchangeable.
4. The IMU- and vision-based robot guiding handle according to claim 1, wherein the front end of the handle (1) is provided with a cover (6) for protecting the camera, and the cover (6) is provided with sealing shockproof cotton (13).
5. The IMU- and vision-based robot guiding handle according to claim 1, wherein gyroscopes and various sensors for image stabilization are integrated into the inertial measurement unit (IMU) to detect rotation and translation along the three axes, as well as pitch, yaw and roll states.
6. The IMU- and vision-based robot guiding handle according to claim 1, wherein a wireless communication module for enabling remote guidance of the robot is integrated on the integrated circuit board (16).
7. The IMU- and vision-based robot guiding handle according to claim 1, wherein the lower end of the linear datum (10) is tapered.
8. A method for using the IMU- and vision-based robot guiding handle according to any one of claims 1-7, characterized in that the pose of the current datum in the world coordinate system is obtained by the 3D vision camera (21), the calibrated relative pose relationship between the robot and the guiding handle is resolved, and the trajectory of the guiding handle is converted into the trajectory of the robot end tool, so as to realize the trajectory reproduction function;
in the track guiding process, an operator moves the linear type datum (10) or the plane type datum (11) along a reference datum plane, and selects the linear type datum (10) for the linear application scene and the plane type datum (11) for the plane type application scene; the operator selects the button A (2) or the button B (3) to record the point position information and the interpolation mode according to the process requirement, and after the continuous point position recording is finished, the operator clicks the button D (5) to send the recorded program information to the robot, so that the track guiding and teaching work of the robot can be finished;
for a continuously-changing complex curved surface working condition, an operator can select a linear reference (10) as a reference, click a button C (4), open a full track following recording function, uniformly move along a curved surface through the tail end sharp point of the linear reference (10), click the button C (4) again after track guiding is completed, close the full track following recording function, click a button D (5) to send recorded program information to the robot, and the guiding teaching work of the robot under the complex curved surface working condition can be completed.
CN202210444260.2A 2022-04-26 2022-04-26 IMU and vision-based robot guiding handle and use method thereof Active CN114670212B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210444260.2A CN114670212B (en) 2022-04-26 2022-04-26 IMU and vision-based robot guiding handle and use method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210444260.2A CN114670212B (en) 2022-04-26 2022-04-26 IMU and vision-based robot guiding handle and use method thereof

Publications (2)

Publication Number Publication Date
CN114670212A CN114670212A (en) 2022-06-28
CN114670212B (en) 2023-04-21

Family

ID=82081056

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210444260.2A Active CN114670212B (en) 2022-04-26 2022-04-26 IMU and vision-based robot guiding handle and use method thereof

Country Status (1)

Country Link
CN (1) CN114670212B (en)

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2665897B2 (en) * 1995-08-08 1997-10-22 川崎重工業株式会社 Robot work teaching device
CN104766527B (en) * 2015-03-25 2018-04-10 淮安信息职业技术学院 Integrate the industrial robot instructional device that polynary post capability is trained
CN107309882B (en) * 2017-08-14 2019-08-06 青岛理工大学 A kind of robot teaching programming system and method
CN110039520B (en) * 2018-04-11 2020-11-10 陈小龙 Teaching and processing system based on image contrast
US11279044B2 (en) * 2018-08-03 2022-03-22 Yaskawa America, Inc. Robot instructing apparatus, teaching pendant, and method of instructing a robot
CN110053045A (en) * 2019-04-08 2019-07-26 佛山市宸卡机器人科技有限公司 Workpiece surface contour line acquisition methods, interference detection method and relevant apparatus
CN110919626B (en) * 2019-05-16 2023-03-14 广西大学 Robot handheld teaching device and method based on stereoscopic vision
CN111347431B (en) * 2020-04-16 2023-05-23 广东工业大学 Robot teaching spraying method and device for teaching hand-held tool

Also Published As

Publication number Publication date
CN114670212A (en) 2022-06-28

Similar Documents

Publication Publication Date Title
US11440179B2 (en) System and method for robot teaching based on RGB-D images and teach pendant
CN109571487B (en) Robot demonstration learning method based on vision
US9772173B2 (en) Method for measuring 3D coordinates of a surface with a portable articulated arm coordinate measuring machine having a camera
CN106826769B (en) A kind of quick teaching apparatus of industrial robot and its implementation
CN110170995B (en) Robot rapid teaching method based on stereoscopic vision
US10569419B2 (en) Control device and robot system
CN106182001B (en) A kind of workpiece coordinate system automatic calibration device based on robot
CN110171009B (en) Robot handheld teaching device based on stereoscopic vision
CN106647529B (en) A kind of intelligent teaching system towards the accurate tracing control in six-shaft industrial robot track
CN111347431A (en) Robot teaching spraying method and device for teaching handheld tool
CN110125944B (en) Mechanical arm teaching system and method
CN107953333B (en) Control method and system for calibrating tool at tail end of manipulator
CN103192386A (en) Image-vision-based automatic calibration method of clean robot
CN107796276A (en) A kind of device and method for estimating industrial robot absolute fix precision
CN107671838B (en) Robot teaching recording system, teaching process steps and algorithm flow thereof
CN111823100A (en) Robot-based small-curvature polishing and grinding method
CN114670212B (en) IMU and vision-based robot guiding handle and use method thereof
CN110193816B (en) Industrial robot teaching method, handle and system
Han et al. Grasping control method of manipulator based on binocular vision combining target detection and trajectory planning
CN110428496B (en) Handheld tool operation guiding method based on virtual-real fusion
CN113752236B (en) Device, calibration rod and method for teaching mechanical arm
CN108527405A (en) A kind of cooperation robot guiding teaching system
JP4170530B2 (en) Robot teaching method and apparatus
CN112181135B (en) 6-DOF visual and tactile interaction method based on augmented reality
JP5797761B2 (en) Measuring device acting non-linearly to analyze and improve the adjustment of the orientation device acting on the sphere

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant