CN114670212A - Robot guide handle based on IMU and vision and use method thereof - Google Patents

Robot guide handle based on IMU and vision and use method thereof

Info

Publication number
CN114670212A
Authority
CN
China
Prior art keywords
robot
handle
linear
imu
datum
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210444260.2A
Other languages
Chinese (zh)
Other versions
CN114670212B (en)
Inventor
孟祥敦
谭彬彬
谷阳正
高增桂
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nantong Xinlan Robot Technology Co ltd
Original Assignee
Nantong Xinlan Robot Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nantong Xinlan Robot Technology Co ltd filed Critical Nantong Xinlan Robot Technology Co ltd
Priority to CN202210444260.2A
Publication of CN114670212A
Application granted
Publication of CN114670212B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/0081 Programme-controlled manipulators with master teach-in means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 Controls for manipulators
    • B25J13/02 Hand grip control means
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The application relates to a robot guide handle based on an IMU and vision. The handle carries a surface datum at its front end and a linear datum at its bottom; the linear datum provides the guiding function of the robot in linear application scenarios, and the surface datum provides the guiding function in surface application scenarios. Linear scenarios include deburring, chamfering and gluing; surface scenarios include grinding and polishing. A downward-facing 3D vision camera with real-sense tracking capability is mounted at the front of the handle, and an IMU capable of detecting motion in six degrees of freedom is integrated in the camera. By combining the IMU with a vision camera that can sense and track the environment, the guide handle can be operated remotely, separated from the robot; it also provides trajectory following, trajectory recording and trajectory reproduction, and is therefore well suited to complex curved-surface and arc-shaped working conditions.

Description

Robot guide handle based on IMU and vision and use method thereof
Technical Field
The invention belongs to the technical field of robot automation application, and particularly relates to a robot guide handle based on an IMU and vision and a use method thereof.
Background
With the spread of robots in industrial production, demands on the controllability and safety of robots keep rising. In traditional robot applications, commissioning engineers mostly debug the robot on site to meet production requirements, which is inefficient. Later, auxiliary equipment such as a force sensor mounted on the robot end made drag teaching possible, which improved efficiency somewhat compared with teaching trajectories through a teach pendant. In both of these modes, however, a person must work next to the robot; in drag teaching in particular the operator is in direct contact with the robot, which to some extent increases the safety risk of robot operation. Moreover, in both modes the teaching is completed by inserting point information and selecting different interpolation forms, which limits the application to complex curved and arc-shaped workpieces and cannot guarantee the surface continuity of the produced parts.
Disclosure of Invention
The invention aims to provide a robot guide handle, based on an IMU that senses and identifies the surrounding environment in combination with vision, sensors, a gyroscope and the like, which realizes remote operation, operation in complex environments, rapid trajectory guiding, trajectory recording and trajectory reproduction of a robot, together with a method of using the handle.
In order to solve the above technical problem, the invention discloses a robot guide handle based on an IMU and vision, comprising a handle with a plurality of buttons arranged on its upper side, a surface datum arranged at the front end of the handle and a linear datum arranged at the bottom of the handle, the surface datum being perpendicular to the linear datum; the linear datum is used to realize the guiding function of the robot in linear application scenarios, and the surface datum is used to realize the guiding function in surface application scenarios; the linear scenarios cover at least deburring, chamfering and gluing processes, and the surface scenarios cover at least grinding and polishing processes;
the buttons comprise at least a button A, a button B, a button C and a button D, which are respectively used for controlling MovL point recording, MovJ point recording, full-track recording and program issuing of the robot;
the front part of the handle is provided with a downward-facing 3D vision camera with real-sense tracking capability, in which an inertial measurement unit capable of detecting motion in six degrees of freedom is integrated;
an integrated circuit board is arranged inside the handle, and the side of the handle is provided with a first communication cable connected to the 3D vision camera and a second communication cable connected to the integrated circuit board.
Preferably, the surface datum and the linear datum are both detachably connected to the handle.
Preferably, the surface datum and the linear datum are attached by the same type of connecting member, so that their positions can be interchanged.
Preferably, the front end of the handle is provided with a cover for protecting the camera, and sealing shock-absorbing foam is arranged on the cover.
Preferably, a gyroscope for image stabilization and various sensors are integrated in the inertial measurement unit to detect rotation and movement about three axes as well as pitch, yaw and roll.
Preferably, a wireless communication module for remotely guiding the robot is integrated on the integrated circuit board.
Preferably, the lower end of the linear datum is machined into a cone.
Preferably, the lower end of the linear datum is provided with a groove, and a ball is arranged in the groove.
Preferably, the groove communicates at its upper end with an accommodating groove in which a colored indicator is held; an injection hole communicating with the accommodating groove is machined in the lower side wall of the linear datum; the ball is used to mark the running track on the reference datum plane.
A method for using the robot guide handle based on the IMU and vision comprises obtaining the pose of the current reference datum in the world coordinate system through the 3D camera, applying the calibrated relative pose between the robot and the guide handle, and converting the trajectory of the guide handle into the trajectory of the robot end tool, thereby realizing trajectory reproduction;
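As an illustration only (this sketch is not part of the original disclosure), the coordinate conversion just described can be written in a few lines of Python with NumPy; the frame names, the calibration matrix T_handle_tool and the example numbers are assumptions made for the example:

    import numpy as np

    def pose_to_matrix(position, rotation):
        """Build a 4x4 homogeneous transform from a position (3,) and a 3x3 rotation."""
        T = np.eye(4)
        T[:3, :3] = rotation
        T[:3, 3] = position
        return T

    def handle_to_tool_trajectory(handle_poses_world, T_handle_tool):
        """Map recorded guide-handle poses (world frame) to robot end-tool poses.

        handle_poses_world : list of 4x4 transforms of the handle in the world frame,
                             as measured by the 3D camera and IMU
        T_handle_tool      : constant 4x4 transform from the handle frame to the tool
                             frame, obtained from the robot/handle calibration step
        """
        return [T_world_handle @ T_handle_tool for T_world_handle in handle_poses_world]

    # Minimal usage with made-up numbers: one measured handle pose becomes one tool pose.
    T_handle_tool = np.eye(4)           # assume the two frames coincide for the demo
    p = np.array([0.30, 0.10, 0.05])    # measured handle position, in metres
    R = np.eye(3)                       # measured handle orientation
    tool_trajectory = handle_to_tool_trajectory([pose_to_matrix(p, R)], T_handle_tool)

In practice T_handle_tool would come from the one-off calibration of the relative pose between the robot and the guide handle mentioned above.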
during trajectory guiding, the operator moves the linear datum or the surface datum along the reference datum surface, choosing the linear datum for linear application scenarios and the surface datum for surface application scenarios; according to the process requirements, the operator presses button A or button B to record point information and the interpolation mode, and after the points have been recorded one after another, presses button D to send the recorded program information to the robot, which completes the trajectory guiding and teaching work of the robot;
for continuously changing complex curved surfaces, the operator can select the linear datum as the reference, press button C to start full-track recording, move the sharp tip of the linear datum evenly along the curved surface, press button C again when the trajectory guiding is finished to stop full-track recording, and then press button D to send the recorded program information to the robot, which completes the guiding and teaching work for the complex curved surface.
Compared with the traditional way of teaching a robot through a teach pendant, or the emerging practice of programming the robot by drag teaching through a force sensor, the robot guide handle based on the IMU and vision of the invention uses an inertial measurement unit (IMU) combining various sensors and a gyroscope together with a vision camera that has environment sensing and tracking capabilities, and through Ethernet communication it allows the guide handle to operate separately from the robot. At the same time, the guide handle provides trajectory following, trajectory recording and trajectory reproduction, so it is better suited to complex curved-surface and arc-shaped working conditions; in addition, the ball at the end of the linear datum effectively protects the reference datum plane.
Drawings
FIG. 1 is a front view of a robot guide handle based on IMU and vision;
FIG. 2 is a left side view of the IMU and vision based robot guide handle of FIG. 1;
FIG. 3 is a partial cross-sectional view of the IMU and vision based robot guide handle of FIG. 1;
FIG. 4 is an internal block diagram of the IMU and vision based robotic guide handle of FIG. 1;
FIG. 5 is a schematic view of the end structure of another linear datum.
The reference numbers in the drawings are: 1 handle; 2 button A; 3 button B; 4 button C; 5 button D; 6 cover; 7 first communication cable; 8 second communication cable; 9 double-hole flat-head bolt; 10 linear datum; 11 surface datum; 12 stud group A; 13 sealing shock-absorbing foam; 14 stud group B; 15 hexagonal stud; 16 integrated circuit board; 17 locking nut; 18 fixing block A; 19 bolt; 20 fixing block B; 21 3D camera; 22 groove; 23 ball; 24 accommodating groove; 25 colored indicator; 26 injection hole.
Detailed Description
The present invention is further described in detail below with reference to examples so that those skilled in the art can practice the invention with reference to the description.
It will be understood that terms such as "having," "including," and "comprising," when used herein, do not preclude the presence or addition of one or more other elements or groups thereof.
Example 1
As shown in fig. 1-2, a robot guide handle based on an IMU and vision comprises a handle 1 with a plurality of buttons arranged on its upper side, a surface datum 11 arranged at the front end of the handle 1 and a linear datum 10 arranged at the bottom of the handle 1, the surface datum 11 being perpendicular to the linear datum 10; the linear datum 10 is used to realize the guiding function of the robot in linear application scenarios, and the surface datum 11 is used to realize the guiding function in surface application scenarios; the linear scenarios cover at least deburring, chamfering and gluing processes, and the surface scenarios cover at least grinding and polishing processes;
the buttons at least comprise a button A2, a button B3, a button C4 and a button D5, wherein 4 buttons are respectively used for controlling the Movl point location record, the Movj point location record, the full track record and the program issuing function of the robot; therefore, the track guidance of the entity robot can be simply and quickly realized, and the real-time and high-efficiency effects are realized.
The front part of the handle 1 is provided with a downward-facing 3D vision camera 21 with real-sense tracking capability, in which an inertial measurement unit (IMU) capable of detecting motion in six degrees of freedom is integrated;
as shown in fig. 4, an integrated circuit board 16 is disposed in the handle 1, and a first communication cable 7 connected to the 3D vision camera 21 and a second communication cable 8 connected to the integrated circuit board 16 are disposed on a side surface of the handle 1.
The surface datum 11 and the linear datum 10 are both detachably connected to the handle 1.
As shown in fig. 3, the surface datum 11 and the linear datum 10 are attached by connecting pieces of the same type, so that their positions can be interchanged and the customer can choose a suitable mounting form for the actual application. In this embodiment, the cover 6 is fixed to the handle 1 by a double-hole flat-head bolt 9; the linear datum 10 and the surface datum 11 are fixed to the handle 1 by stud group A 12 and stud group B 14 respectively; the sealing shock-absorbing foam 13 is stuck on the cover 6 to seal the guide handle and cushion the 3D camera 21; the hexagonal stud 15 is permanently bonded to the boss on the inner side of the handle 1 and is not detachable, and the PCB integrated circuit board 16 is mounted on the hexagonal stud 15 and fixed with a locking nut 17; the fixing block A 18 and the fixing block B 20 press the 3D camera 21 against the inside of the handle 1 and are fastened by bolts 19.
The front end of the handle 1 is provided with a cover 6 for protecting the camera, and sealing shock-absorbing foam 13 is arranged on the cover 6.
The inertial measurement unit IMU incorporates a gyroscope for image stabilization and various sensors to detect rotation and movement about three axes as well as pitch, yaw and roll.
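For illustration only, a minimal Python/NumPy sketch of how gyroscope samples from such an IMU could be integrated into an orientation estimate and converted to pitch, yaw and roll is given below; the 100 Hz sampling rate, the axis convention and the plain first-order integration are assumptions, and a real unit would additionally fuse accelerometer data and filter the result:

    import numpy as np

    def integrate_gyro(R, gyro_rad_s, dt):
        """Propagate a 3x3 orientation matrix by one body-rate gyroscope sample (rad/s)."""
        wx, wy, wz = gyro_rad_s
        omega = np.array([[0.0, -wz,  wy],
                          [ wz, 0.0, -wx],
                          [-wy,  wx, 0.0]])
        return R @ (np.eye(3) + omega * dt)   # first-order integration of R' = R * omega

    def rpy_from_matrix(R):
        """Extract roll, pitch, yaw (rad) from a rotation matrix (Z-Y-X convention)."""
        pitch = -np.arcsin(R[2, 0])
        roll = np.arctan2(R[2, 1], R[2, 2])
        yaw = np.arctan2(R[1, 0], R[0, 0])
        return roll, pitch, yaw

    # Example: 100 Hz samples, constant yaw rate of 0.1 rad/s for one second.
    R = np.eye(3)
    for _ in range(100):
        R = integrate_gyro(R, (0.0, 0.0, 0.1), 0.01)
    print(rpy_from_matrix(R))   # yaw approaches 0.1 rad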
The integrated circuit board 16 carries a wireless communication module for remotely guiding the robot. In this embodiment, the guide handle is connected to and communicates with the robot over Ethernet; it can communicate with robots of multiple brands and realizes trajectory guiding, recording, reproduction and similar functions. Remote operation thus becomes possible, so the handle can be used for work in hazardous environments and the like, achieving the step from robots assisting people to robots replacing people. The integrated circuit board is designed to integrate the reception and processing of data such as switching (digital) and analog quantities, and finally outputs them to the terminal device through an RS422 transmission protocol.
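As an illustration only, the bundling of button (switching) states, analog values and pose data into a binary frame sent to the controller over Ethernet could be sketched as follows in Python; the frame layout, IP address and port are placeholders, since the patent does not publish the actual protocol:

    import socket
    import struct

    def pack_frame(buttons, analog_values, pose_xyzrpy):
        """Pack the handle state: 1 byte of button bits, four 16-bit analog channels
        and six float32 pose components (x, y, z, roll, pitch, yaw)."""
        button_bits = sum(1 << i for i, pressed in enumerate(buttons) if pressed)
        return struct.pack("<B4H6f", button_bits, *analog_values, *pose_xyzrpy)

    def send_frame(frame, host="192.168.1.10", port=30000):
        """Send one frame to the robot controller over TCP (address and port are placeholders)."""
        with socket.create_connection((host, port), timeout=1.0) as sock:
            sock.sendall(frame)

    frame = pack_frame(
        buttons=[True, False, False, False],            # button A pressed
        analog_values=(0, 0, 0, 0),
        pose_xyzrpy=(0.30, 0.10, 0.05, 0.0, 0.0, 0.0),
    )
    # send_frame(frame)   # uncomment only when a controller is actually listening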
The lower end of the linear datum 10 is machined into a cone.
A method for using the robot guide handle based on the IMU and vision comprises obtaining the pose of the current reference datum in the world coordinate system through the 3D camera 21, applying the calibrated relative pose between the robot and the guide handle, and converting the trajectory of the guide handle into the trajectory of the robot end tool, thereby realizing trajectory reproduction;
during trajectory guiding, the operator moves the linear datum 10 or the surface datum 11 along the reference datum surface, choosing the linear datum 10 for linear application scenarios and the surface datum 11 for surface application scenarios; according to the process requirements, the operator presses button A2 or button B3 to record point information and the interpolation mode, and after the points have been recorded one after another, presses button D5 to send the recorded program information to the robot, which completes the trajectory guiding and teaching of the robot;
for continuously changing complex curved surfaces, the operator can select the linear datum 10 as the reference, press button C4 to start full-track recording, move the tip of the linear datum 10 evenly along the curved surface, press button C4 again when the trajectory guiding is finished to stop full-track recording, and then press button D5 to send the recorded program information to the robot, which completes the guiding and teaching work for the complex curved surface.
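Purely for illustration (the 5 mm spacing and the helper function are assumptions, not part of the disclosure), the densely recorded full track could be resampled at an approximately uniform spacing before being converted into a robot program, for example as follows in Python/NumPy:

    import numpy as np

    def resample_track(points, spacing):
        """Resample a recorded polyline of 3D points at roughly uniform arc-length spacing,
        so that unevenly timed samples become an evenly spaced robot path."""
        points = np.asarray(points, dtype=float)
        seg = np.linalg.norm(np.diff(points, axis=0), axis=1)
        s = np.concatenate(([0.0], np.cumsum(seg)))        # cumulative arc length
        targets = np.arange(0.0, s[-1], spacing)
        resampled = [np.array([np.interp(t, s, points[:, k]) for k in range(3)])
                     for t in targets]
        return np.vstack(resampled + [points[-1]])

    # Example: a quarter circle of radius 100 mm sampled unevenly, resampled every 5 mm.
    rng = np.random.default_rng(0)
    theta = np.sort(rng.uniform(0.0, np.pi / 2, 50))
    raw = np.c_[0.1 * np.cos(theta), 0.1 * np.sin(theta), np.zeros_like(theta)]
    path = resample_track(raw, 0.005)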
Example 2
Similar to Example 1, except that a groove 22 is provided at the lower end of the linear datum 10 and a ball 23 is arranged in the groove. Compared with Example 1, in which the lower end of the linear datum is machined into a cone (a sharp tip), the ballpoint-pen-like tip of this example is less likely to scratch the reference datum plane, slides more smoothly and produces less harsh noise.
Example 3
As shown in fig. 5, this example is similar to Example 2, except that the groove communicates at its upper end with an accommodating groove 24 in which a colored indicator 25 is held, and an injection hole 26 communicating with the accommodating groove is formed in the lower side wall of the linear datum 10. The injection hole is used to inject a small amount of colored indicator into the accommodating groove, and the ball then marks the running track on the reference datum plane. The indicator ink can be chosen according to the workpiece material, and the track line left as the ball slides across the reference datum plane is helpful for subsequent data adjustment, calibration, repeated testing and the like.
While embodiments of the invention have been described above, the invention is not limited to the applications set forth in the description and the embodiments; it can be applied in various other fields, and further modifications will readily occur to those skilled in the art. The invention is therefore not limited to the details given herein or to the embodiments shown and described, provided the generic concept defined by the claims and their equivalents is not departed from.

Claims (10)

1. A robot guide handle based on an IMU and vision, characterized by comprising a handle (1), wherein a plurality of buttons are arranged on the upper side of the handle (1), a surface datum (11) is arranged at the front end of the handle (1), a linear datum (10) is arranged at the bottom of the handle (1), and the surface datum (11) is perpendicular to the linear datum (10); the linear datum (10) is used for realizing the guiding function of the robot in linear application scenarios, and the surface datum (11) is used for realizing the guiding function of the robot in surface application scenarios; the linear application scenarios cover at least deburring, chamfering and gluing processes, and the surface application scenarios cover at least grinding and polishing processes;
the buttons comprise at least a button A (2), a button B (3), a button C (4) and a button D (5), which are respectively used for controlling MovL point recording, MovJ point recording, full-track recording and program issuing of the robot;
the front part of the handle (1) is provided with a downward-facing 3D vision camera (21) with real-sense tracking capability, and an inertial measurement unit (IMU) capable of detecting motion in 6 degrees of freedom is integrated in the 3D vision camera;
an integrated circuit board (16) is arranged in the handle (1), and the side of the handle (1) is provided with a first communication cable (7) connected to the 3D vision camera (21) and a second communication cable (8) connected to the integrated circuit board (16).
2. The IMU and vision based robot guide handle according to claim 1, characterized in that the surface datum (11) and the linear datum (10) are both detachably connected to the handle (1).
3. The IMU and vision based robot guide handle according to claim 2, characterized in that the surface datum (11) and the linear datum (10) are connected by the same type of connecting member, so that the positions of the surface datum (11) and the linear datum (10) are interchangeable.
4. The IMU and vision based robot guide handle according to claim 1, characterized in that the front end of the handle (1) is provided with a cover (6) for protecting the camera, and sealing shock-absorbing foam (13) is provided on the cover (6).
5. The IMU and vision based robot guide handle according to claim 1, characterized in that a gyroscope for image stabilization and various sensors are integrated within the inertial measurement unit (IMU) to detect rotation and movement about three axes as well as pitch, yaw and roll.
6. The IMU and vision based robot guide handle according to claim 1, characterized in that a wireless communication module for realizing remote guidance of the robot is integrated on the integrated circuit board (16).
7. The IMU and vision based robot guide handle according to claim 1, characterized in that the lower end of the linear datum (10) is machined into a cone.
8. The IMU and vision based robot guide handle according to claim 1, characterized in that the lower end of the linear datum (10) is provided with a groove (22) in which a ball (23) is arranged.
9. The IMU and vision based robot guide handle according to claim 8, characterized in that the groove communicates at its upper end with an accommodating groove (24) in which a colored indicator (25) is held; an injection hole (26) communicating with the accommodating groove is machined in the lower side wall of the linear datum (10); the ball is used to mark the running track on the reference datum plane.
10. A method of using the robot guide handle based on the IMU and vision, characterized in that the 3D camera (21) is used to obtain the pose of the current reference datum in the world coordinate system, the calibrated relative pose between the robot and the guide handle is applied, and the trajectory of the guide handle is converted into the trajectory of the robot end tool, thereby realizing trajectory reproduction;
during trajectory guiding, the operator moves the linear datum (10) or the surface datum (11) along the reference datum surface, choosing the linear datum (10) for linear application scenarios and the surface datum (11) for surface application scenarios; according to the process requirements, the operator presses button A (2) or button B (3) to record point information and the interpolation mode, and after the points have been recorded one after another, presses button D (5) to send the recorded program information to the robot, thereby completing the trajectory guiding and teaching work of the robot;
for continuously changing complex curved surfaces, the operator can select the linear datum (10) as the reference, press button C (4) to start full-track recording, move the tip of the linear datum (10) evenly along the curved surface, press button C (4) again when the trajectory guiding is finished to stop full-track recording, and then press button D (5) to send the recorded program information to the robot, thereby completing the guiding and teaching work of the robot for the complex curved surface.
CN202210444260.2A 2022-04-26 2022-04-26 IMU and vision-based robot guiding handle and use method thereof Active CN114670212B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210444260.2A CN114670212B (en) 2022-04-26 2022-04-26 IMU and vision-based robot guiding handle and use method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210444260.2A CN114670212B (en) 2022-04-26 2022-04-26 IMU and vision-based robot guiding handle and use method thereof

Publications (2)

Publication Number Publication Date
CN114670212A (en) 2022-06-28
CN114670212B (en) 2023-04-21

Family

ID=82081056

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210444260.2A Active CN114670212B (en) 2022-04-26 2022-04-26 IMU and vision-based robot guiding handle and use method thereof

Country Status (1)

Country Link
CN (1) CN114670212B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0947989A (en) * 1995-08-08 1997-02-18 Kawasaki Heavy Ind Ltd Robot work teaching device
CN104766527A (en) * 2015-03-25 2015-07-08 淮安信息职业技术学院 Industrial robot teaching device integrated with diversified post ability training
CN107309882A (en) * 2017-08-14 2017-11-03 青岛理工大学 A kind of robot teaching programming system and method
CN110039520A (en) * 2018-04-11 2019-07-23 陈小龙 A kind of teaching based on image comparison, system of processing
CN110053045A (en) * 2019-04-08 2019-07-26 佛山市宸卡机器人科技有限公司 Workpiece surface contour line acquisition methods, interference detection method and relevant apparatus
US20200039082A1 (en) * 2018-08-03 2020-02-06 Yaskawa America, Inc. Robot instructing apparatus, teaching pendant, and method of instructing a robot
CN110919626A (en) * 2019-05-16 2020-03-27 广西大学 Robot handheld teaching device and method based on stereoscopic vision
CN111347431A (en) * 2020-04-16 2020-06-30 广东工业大学 Robot teaching spraying method and device for teaching handheld tool


Also Published As

Publication number Publication date
CN114670212B (en) 2023-04-21


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant