CN113580108A - Robot-assisted teaching system based on optical tracking - Google Patents

Robot-assisted teaching system based on optical tracking

Info

Publication number
CN113580108A
CN113580108A (application number CN202110905055.7A)
Authority
CN
China
Prior art keywords
light
robot
emitting ball
processor
optical tracking
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110905055.7A
Other languages
Chinese (zh)
Inventor
兰锦春
王磊
郭振杰
段体清
马静昊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Mingtu Intelligent Technology Co ltd
Original Assignee
Suzhou Mingtu Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Mingtu Intelligent Technology Co ltd filed Critical Suzhou Mingtu Intelligent Technology Co ltd
Priority to CN202110905055.7A
Publication of CN113580108A
Current legal status: Pending

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/0081 Programme-controlled manipulators with master teach-in means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 Controls for manipulators
    • B25J13/06 Control stands, e.g. consoles, switchboards
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to a robot-assisted teaching system based on optical tracking. An observer and an upper computer are arranged inside a space positioning module; the teaching pen is provided with a key, a processor, a wireless communication module, a first light-emitting ball, a second light-emitting ball, a third light-emitting ball and a probe; infrared LEDs are mounted inside the three light-emitting balls. The key and the processor, the processor and the wireless communication module, the processor and the infrared LEDs, and the processor and the probe are connected by internal wires; the observer and the infrared LEDs communicate via infrared beams; the upper computer exchanges data with the wireless communication module and with the robot over a network. The robot-assisted teaching system based on optical tracking is simple to operate, achieves high positioning accuracy, and suits a wide range of application scenarios.

Description

Robot-assisted teaching system based on optical tracking
Technical Field
The invention belongs to the technical field of teaching system correlation, and particularly relates to an optical tracking-based robot-assisted teaching system.
Background
At present, industrial robots mainly set their motion trajectory and posture through a teach-and-reproduce mode: because a robot's repeat positioning accuracy is high, the taught trajectory and posture can be reproduced accurately each time. In the traditional teaching process, the robot's spatial motion is controlled through keys on a teach pendant. This approach is simple and convenient, but it still has several problems in industrial applications. First, the teach pendant cannot adjust the robot's posture intuitively, which places demands on the skill and proficiency of field operators. Second, this teaching mode is unsuitable for application scenarios with high precision requirements and many control points. Finally, the traditional teach-and-reproduce mode depends on the robot's repeat positioning accuracy and requires high-precision fixtures, making it difficult to meet the flexible, multi-variety, small-batch production requirements of industry.
To address the shortcomings of traditional robot teach pendants, various new teaching modes have been proposed, including: force-feedback traction teaching, whose accuracy is hard to guarantee and whose sensors are expensive, making it difficult to popularize in industrial applications; and gesture-recognition teaching, which cannot complete complicated teaching trajectories.
Disclosure of Invention
The invention aims to provide a robot-assisted teaching system based on optical tracking that is simple to operate, achieves high positioning accuracy, and suits a wide range of application scenarios.
To this end, the invention provides the following technical solution: a robot-assisted teaching system based on optical tracking comprises a space positioning module, a teaching pen, a robot, an observer, an upper computer, a key, a processor, a wireless communication module, a first light-emitting ball, a second light-emitting ball, a third light-emitting ball, an infrared LED and a probe. An observer and an upper computer are arranged inside the space positioning module; the teaching pen is provided with a key, a processor, a wireless communication module, a first light-emitting ball, a second light-emitting ball, a third light-emitting ball and a probe; infrared LEDs are arranged inside the first, second and third light-emitting balls. The key and the processor, the processor and the wireless communication module, the processor and the infrared LED, and the processor and the probe are connected together through internal wires; the observer and the infrared LED carry out signal transmission through infrared beams; the upper computer and the wireless communication module exchange data through a network; and the upper computer and the robot exchange data through a network.
As a further improvement of the present invention, the observer employs a binocular camera whose field of view covers the robot's range of motion.
As a further improvement of the invention, the processor is a micro signal processor.
As a further improvement of the invention, the specification and model of the first light-emitting ball, the second light-emitting ball and the third light-emitting ball are the same.
As a further improvement of the invention, the space positioning module contains a signal transmitter inside.
As a further improvement of the invention, the robot comprises a signal receiver inside.
Compared with the prior art, the invention has the following beneficial effects. The observer employs a binocular camera, which effectively ensures that the field of view fully covers the robot and avoids spatial positioning errors. Infrared LEDs are mounted inside the first, second and third light-emitting balls, so the position and posture of the teaching pen can be determined in real time by the binocular camera in the observer. The observer captures the light emitted by the infrared LEDs to achieve spatial positioning, which is more accurate and simpler to operate than positioning by the naked eye.
Drawings
Fig. 1 is a schematic view of the overall structure of the present invention.
FIG. 2 is a schematic diagram of a teaching pen according to the present invention.
In the figure: 1. a spatial positioning module; 101. an observer; 102. an upper computer; 2. a teaching pen; 201. pressing a key; 202. a processor; 203. a wireless communication module; 204. a first light-emitting ball; 205. a second light emitting ball; 206. a third light emitting ball; 207. an infrared LED; 208. a probe; 3. a robot.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1 to 2, the present invention provides a technical solution: a robot-assisted teaching system based on optical tracking comprises a space positioning module 1, a teaching pen 2, a robot 3, an observer 101, an upper computer 102, a key 201, a processor 202, a wireless communication module 203, a first light-emitting ball 204, a second light-emitting ball 205, a third light-emitting ball 206, an infrared LED 207 and a probe 208. An observer 101 and an upper computer 102 are arranged inside the space positioning module 1; the teaching pen 2 is provided with a key 201, a processor 202, a wireless communication module 203, a first light-emitting ball 204, a second light-emitting ball 205, a third light-emitting ball 206 and a probe 208; infrared LEDs 207 are mounted inside the first light-emitting ball 204, the second light-emitting ball 205 and the third light-emitting ball 206. The key 201 and the processor 202, the processor 202 and the wireless communication module 203, the processor 202 and the infrared LED 207, and the processor 202 and the probe 208 are connected together through internal wires; the observer 101 and the infrared LED 207 perform signal transmission through infrared beams; the upper computer 102 and the wireless communication module 203 exchange data through a network; the upper computer 102 and the robot 3 exchange data through a network. The observer 101 adopts a binocular camera, and the field of view of the binocular camera covers the range of motion of the robot 3; the processor 202 is a micro signal processor; the first light-emitting ball 204, the second light-emitting ball 205 and the third light-emitting ball 206 have the same specification and model; the space positioning module 1 comprises a signal transmitter inside; and the robot 3 comprises a signal receiver inside.
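Because the pose solution described below needs all three light-emitting balls in view of the binocular camera, a tracker must first decide whether a frame is usable. The helper below is a minimal illustrative sketch of that check (the function and ball names are assumptions, not from the patent):

```python
def valid_frame(detections):
    """Return True only when all three light-emitting balls were
    detected in the frame; a pen pose cannot be solved otherwise.

    detections: dict mapping ball id -> (x, y, z) or None if occluded.
    """
    required = ("ball1", "ball2", "ball3")
    return all(detections.get(b) is not None for b in required)

# One ball occluded -> the frame cannot be used for pose estimation.
frame = {"ball1": (0.1, 0.2, 1.5), "ball2": (0.2, 0.2, 1.5), "ball3": None}
print(valid_frame(frame))  # False
```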
When the system is used, the probe 208 first touches the product to be processed, and the key 201 on the teaching pen 2 is pressed. At this moment, the light emitted by the infrared LEDs 207 inside the first light-emitting ball 204, the second light-emitting ball 205 and the third light-emitting ball 206 is captured by the binocular camera in the observer 101, and a three-dimensional coordinate system XYZ is established. Let the coordinates of the first, second and third light-emitting balls be A, B and C respectively:

A = (x_A, y_A, z_A), B = (x_B, y_B, z_B), C = (x_C, y_C, z_C)

The points A, B and C determine a plane ABC. Let O be the centroid of triangle ABC; its value is:

O = ((x_A + x_B + x_C)/3, (y_A + y_B + y_C)/3, (z_A + z_B + z_C)/3)

The centroid O gives the spatial coordinate of the teaching pen. The orientation of the teaching pen 2 is determined by the normal direction of the plane ABC; the unit normal vector n is:

n = (AB × AC) / ||AB × AC||

where × is the vector (cross) product and || · || is the modulus of a vector. By tracking the teaching pen 2 in real time with the binocular camera, its position and posture can be determined, thereby realizing position-and-posture teaching of the robot 3.
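The centroid and unit-normal computation above can be sketched in plain Python; the function name and the sample coordinates are illustrative assumptions, not from the patent:

```python
import math

def pen_pose(A, B, C):
    """Compute the teaching pen's position (centroid of triangle ABC)
    and orientation (unit normal of plane ABC) from the three
    light-emitting-ball coordinates returned by the binocular camera."""
    # Position: centroid O of triangle ABC.
    O = tuple((A[i] + B[i] + C[i]) / 3.0 for i in range(3))
    # Orientation: unit normal n = (AB x AC) / ||AB x AC||.
    AB = tuple(B[i] - A[i] for i in range(3))
    AC = tuple(C[i] - A[i] for i in range(3))
    cross = (AB[1] * AC[2] - AB[2] * AC[1],
             AB[2] * AC[0] - AB[0] * AC[2],
             AB[0] * AC[1] - AB[1] * AC[0])
    norm = math.sqrt(sum(c * c for c in cross))
    if norm == 0.0:
        raise ValueError("balls are collinear; plane ABC is undefined")
    n = tuple(c / norm for c in cross)
    return O, n

# Example: three balls in the XY plane -> normal along +Z.
O, n = pen_pose((0, 0, 0), (3, 0, 0), (0, 3, 0))
print(O)  # (1.0, 1.0, 0.0)
print(n)  # (0.0, 0.0, 1.0)
```

Note the degenerate case: if the three balls ever appeared collinear, the plane (and hence the pen's orientation) would be undefined, which is one reason the markers are mounted in a fixed triangular arrangement.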
In operation, the operator holds the teaching pen and places it at the desired machining position and posture of the robot; pressing the key on the teaching pen records the current position point; the key is then released, the pen is moved, and the position and posture of subsequent teaching points are recorded in the same way; pressing the end key on the teaching pen automatically generates the robot's motion instructions, which are sent to the robot so that it moves along the taught trajectory.
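The record-then-generate workflow above can be summarized as a small state holder: each key press appends the current pen pose, and the end key turns the accumulated waypoints into motion instructions. This is a minimal sketch under assumed names and an assumed (illustrative) instruction format, not the patent's actual protocol:

```python
class TeachingSession:
    """Minimal sketch of the teaching workflow: each key press stores
    the current pen pose; the end key emits one move command per pose."""

    def __init__(self):
        self.waypoints = []

    def record(self, position, normal):
        # Called when the key on the teaching pen is pressed:
        # store the pen's position (centroid O) and orientation (normal n).
        self.waypoints.append((position, normal))

    def finish(self):
        # Called when the end key is pressed: generate one motion
        # instruction per recorded waypoint (format is illustrative).
        return [f"MOVE to {p} orient {n}" for p, n in self.waypoints]

session = TeachingSession()
session.record((100.0, 50.0, 20.0), (0.0, 0.0, 1.0))
session.record((120.0, 50.0, 20.0), (0.0, 0.0, 1.0))
for cmd in session.finish():
    print(cmd)
```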
Although the present invention has been described in detail with reference to the foregoing embodiments, it will be apparent to those skilled in the art that changes may be made in the embodiments and/or equivalents thereof without departing from the spirit and scope of the invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (6)

1. A robot-assisted teaching system based on optical tracking, characterized in that it comprises a space positioning module (1), a teaching pen (2), a robot (3), an observer (101), an upper computer (102), a key (201), a processor (202), a wireless communication module (203), a first light-emitting ball (204), a second light-emitting ball (205), a third light-emitting ball (206), an infrared LED (207) and a probe (208); an observer (101) and an upper computer (102) are arranged inside the space positioning module (1); the teaching pen (2) is provided with a key (201), a processor (202), a wireless communication module (203), a first light-emitting ball (204), a second light-emitting ball (205), a third light-emitting ball (206) and a probe (208); infrared LEDs (207) are arranged inside the first light-emitting ball (204), the second light-emitting ball (205) and the third light-emitting ball (206); the key (201) and the processor (202), the processor (202) and the wireless communication module (203), the processor (202) and the infrared LED (207), and the processor (202) and the probe (208) are connected together through internal wires; the observer (101) and the infrared LED (207) carry out signal transmission through infrared beams; the upper computer (102) and the wireless communication module (203) exchange data through a network; and the upper computer (102) and the robot (3) exchange data through a network.
2. The optical tracking-based robot-assisted teaching system according to claim 1, wherein: the observer (101) adopts a binocular camera, and the visual angle of the binocular camera covers the movable range of the robot (3).
3. The optical tracking-based robot-assisted teaching system according to claim 1, wherein: the processor (202) is a micro-signal processor.
4. The optical tracking-based robot-assisted teaching system according to claim 1, wherein: the specification and model of the first light-emitting ball (204), the second light-emitting ball (205) and the third light-emitting ball (206) are the same.
5. The optical tracking-based robot-assisted teaching system according to claim 1, wherein: the space positioning module (1) comprises a signal transmitter inside.
6. The optical tracking-based robot-assisted teaching system according to claim 5, wherein: the robot (3) comprises a signal receiver inside.
CN202110905055.7A (filed 2021-08-08, priority date 2021-08-08) Robot-assisted teaching system based on optical tracking, published as CN113580108A, status: Pending

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110905055.7A CN113580108A (en) 2021-08-08 2021-08-08 Robot-assisted teaching system based on optical tracking

Publications (1)

Publication Number Publication Date
CN113580108A 2021-11-02

Family

ID=78256156

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110905055.7A Pending CN113580108A (en) 2021-08-08 2021-08-08 Robot-assisted teaching system based on optical tracking

Country Status (1)

Country Link
CN (1) CN113580108A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09193064A (en) * 1996-01-22 1997-07-29 Toyoda Mach Works Ltd Robot teaching device
KR20140008658A * 2012-07-11 2014-01-22 대우조선해양 주식회사 Welding point automatic recognition system and recognition method
CN106142092A (en) * 2016-07-26 2016-11-23 张扬 A kind of method robot being carried out teaching based on stereovision technique
CN108214495A (en) * 2018-03-21 2018-06-29 北京无远弗届科技有限公司 A kind of industrial robot teaching system and method
CN110171009A (en) * 2019-05-09 2019-08-27 广西安博特智能科技有限公司 A kind of robot handheld teaching apparatus based on stereoscopic vision
CN110900609A (en) * 2019-12-11 2020-03-24 浙江钱江机器人有限公司 Robot teaching device and method thereof
CN110919626A (en) * 2019-05-16 2020-03-27 广西大学 Robot handheld teaching device and method based on stereoscopic vision
CN113119077A (en) * 2021-04-30 2021-07-16 哈尔滨工业大学 Industrial robot handheld teaching device and teaching method

Similar Documents

Publication Publication Date Title
CN110865704B (en) Gesture interaction device and method for 360-degree suspended light field three-dimensional display system
CN109240496B (en) Acousto-optic interaction system based on virtual reality
CN101963871A (en) Optical touch control system based on infrared spotlight recognition and realization method thereof
CN201859401U (en) Optical touch pen
CN109079794B (en) Robot control and teaching method based on human body posture following
CN110125944B (en) Mechanical arm teaching system and method
CN102169366A (en) Multi-target tracking method in three-dimensional space
CN103246350A (en) Man-machine interface device and method for achieving auxiliary information prompting based on regions of interest
WO2021227628A1 (en) Electronic device and interaction method therefor
CN109129492A (en) A kind of industrial robot platform that dynamic captures
CN108828996A (en) A kind of the mechanical arm remote control system and method for view-based access control model information
CN109240522A (en) A kind of active pointers and human-computer interaction device
CN113741699A (en) Gesture input device based on intelligent ring and system and method thereof
CN113580108A (en) Robot-assisted teaching system based on optical tracking
CN111862170A (en) Optical motion capture system and method
US11947726B2 (en) Multi-orientation fingertip planar tactile feedback device
CN108733232B (en) Input device and input method thereof
JPH11211414A (en) Position detecting system
CN207096576U (en) A kind of intelligent glasses system
CN116301321A (en) Control method of intelligent wearable device and related device
CN109901714A (en) A kind of electronics paper pen system and its control method
CN212623993U (en) Intelligent interactive pen and virtual reality system
RU110845U1 (en) MOBILE DEVICE MANAGEMENT SYSTEM BY USER TURNING THE USER'S HEAD
CN215240869U (en) Intelligent interactive drawing robot
CN112043388B (en) Touch man-machine interaction device for medical teleoperation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination