CN110919626B - Robot handheld teaching device and method based on stereoscopic vision - Google Patents


Info

Publication number
CN110919626B
CN110919626B
Authority
CN
China
Prior art keywords
coordinate system
pose
teaching
robot
teaching device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910408931.8A
Other languages
Chinese (zh)
Other versions
CN110919626A (en)
Inventor
陈琳
田硕
刘吉刚
王耀玮
潘海鸿
梁旭斌
蒲明辉
Current Assignee
Guangxi University
Original Assignee
Guangxi University
Priority date
Filing date
Publication date
Application filed by Guangxi University
Priority to CN201910408931.8A
Publication of CN110919626A
Application granted
Publication of CN110919626B
Legal status: Active
Anticipated expiration

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 - Programme-controlled manipulators
    • B25J9/0081 - Programme-controlled manipulators with master teach-in means
    • B25J13/00 - Controls for manipulators
    • B25J9/16 - Programme controls
    • B25J9/1612 - Programme controls characterised by the hand, wrist, grip control
    • B25J9/1656 - Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 - Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 - Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The invention belongs to the field of robot teaching and stereoscopic vision, and relates to a device and method for quickly teaching a robot based on stereoscopic vision. The device comprises an information sending module, a signal triggering device, at least three light sources, a handheld grip and a pose measurement rod. Within the effective field of view of the stereoscopic vision system, the end of the pose measurement rod of the handheld teaching device is moved, in a set posture, to a set point on the planned path; image information of the feature identification unit is acquired; the pose matrix from the stereoscopic vision coordinate system to the three light source coordinate systems is obtained through data processing; combined with the calibrated geometric dimensions of the handheld teaching device, the pose matrix from the rod-end coordinate system to the robot base coordinate system is then constructed; the pose of the teaching point in the robot base coordinate system is calculated; and finally the robot reproduces the trajectory of the teaching points. The device is simple, flexible, widely applicable, little affected by the environment, and efficient; it enables rapid teaching and improves the usability of the robot.

Description

Robot handheld teaching device and method based on stereoscopic vision
Technical Field
The invention relates to robot teaching equipment, and in particular to a handheld teaching device and method based on stereoscopic vision and an attitude sensor.
Background Art
Industrial robots are applied more and more widely in various industries, and a robot teaching process is an important step in a robot working process.
At present, robot teaching mainly uses two methods: manual teaching and off-line teaching. In manual teaching, an operator guides the robot end effector directly, guides a mechanical simulation device, or uses a teach pendant to lead the robot through the desired operation; the robot is programmed by this real-time on-line teaching, stores the motion, and can then reproduce it repeatedly. In off-line teaching, teaching points are collected, programming is simulated on a computer, and a trajectory is formed through path planning. The conventional teaching process consumes a significant amount of operator time, so a quick and simple teaching method is needed.
In recent years, robot vision technology has brought new possibilities for solving the robot teaching problem. Currently, the more mature vision technologies are based on time of flight (TOF), structured light, binocular vision and light fields. These technologies can obtain depth information of the measured object through appropriate algorithms; feeding this depth information back to the robot system solves the depth problem in robot teaching.
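As a general illustration of how binocular vision recovers depth (not a step of the patented method), the pinhole relation Z = f·B/d can be sketched as follows; all rig numbers below are hypothetical:

```python
import numpy as np

def stereo_depth(disparity_px, focal_px, baseline_m):
    """Depth from binocular disparity under the pinhole model: Z = f * B / d."""
    d = np.asarray(disparity_px, dtype=float)
    return focal_px * baseline_m / d

# Hypothetical rig: 700 px focal length, 0.10 m baseline, 35 px disparity
z = stereo_depth(35.0, 700.0, 0.10)  # -> 2.0 m
```

Larger disparities correspond to closer points, which is why a wider baseline improves depth resolution at range.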
For the visual teaching of industrial robots, Chinese patent application CN201610595202.4 discloses a method for teaching a robot based on stereoscopic vision technology, comprising the following steps: markers are placed on tools (such as a welding gun or a spray head); a stereoscopic vision camera identifies the tool and continuously records a depth map of each frame; the tool is mounted at the robot end; the pose transformation matrix from the robot-end tool coordinate system to the welding-gun marker coordinate system is calibrated; and teaching and reproduction then follow. With that method the robot arm does not need to be dragged during teaching, which is light and convenient, saves space, and keeps the system simple to build. However, the approach has limitations. It uses a real welding gun or other tool, and the marker must be fixed to each different tool, so the method is not universal. Moreover, each time a tool is mounted at the robot end after teaching, the transformation from the tool pose in the camera coordinate system to the arm-end pose in the robot coordinate system must be recalibrated, which is time-consuming and labor-intensive when different planned paths must be taught many times, and limits the usability of the robot.
Disclosure of Invention
Aiming at the existing problems, the invention provides a handheld teaching device and a handheld teaching method based on stereoscopic vision, and aims to solve the problems of tedious teaching process and long teaching time of a robot. The invention provides a simple teaching device structure and a teaching method, which can simplify the robot teaching process, shorten the teaching time, enable the robot teaching process to be convenient and fast, and improve the teaching efficiency and the usability of a robot.
In order to achieve the above object, the main technical solution of the present invention is as follows:
according to a first aspect of the present invention, there is provided a robot fast teaching device based on stereoscopic vision, comprising: a signal sending module 1, a mode selection unit 2, a signal trigger device 3, a power supply module 4, a light spot 5, a teaching device body 6, an attitude sensor 7 and a pose measurement rod 8. The light spot 5 is formed by an electronic component, such as a light emitting diode, that emits light when energized. The power module 4 supplies power to the light spot 5 and the signal trigger device 3. The signal trigger device 3 controls the light spot 5 to emit light and simultaneously sends a signal to trigger the stereo camera to take a picture. The signal sending module 1, the power module 4 and the attitude sensor 7 are fixed inside the teaching device body 6; the signal trigger device 3 and the light spot 5 are fixed on the teaching device body 6; and the pose measurement rod 8 is fixed at the front end of the teaching device body 6.
The mode selection unit 2 comprises at least two trajectory fitting modes, straight-line fitting and circular-arc fitting, from which the user selects the trajectory fitting mode for the set points on the planned trajectory being taught.
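A minimal sketch of what the straight-line fitting mode might do between two taught set points (the function name and sampling scheme are assumptions for illustration, not the patent's implementation):

```python
import numpy as np

def interpolate_line(p_start, p_end, n_points):
    """Generate n_points evenly spaced positions on the straight segment
    from p_start to p_end (both endpoints included)."""
    t = np.linspace(0.0, 1.0, n_points)[:, None]
    return (1.0 - t) * np.asarray(p_start, dtype=float) + t * np.asarray(p_end, dtype=float)

# Two hypothetical taught positions (metres), densified into 5 waypoints
waypoints = interpolate_line([0.0, 0.0, 0.0], [0.1, 0.2, 0.0], 5)
```

The circular-arc mode would analogously pass a circle through three taught set points and sample along it.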
The position relation between the axis of the pose measurement rod 8 of the handheld teaching device and the axis of the attitude sensor 7 is fixed and known.
The pose measurement rod 8 can be replaced with different end measurement rods according to different teaching environment requirements, and the pose transformation matrix T4 between the coordinate system of the light spot 5 and the end coordinate system of the pose measurement rod 8 needs to be recalibrated after each replacement. The teaching environment at least comprises points, lines, planes and curved surfaces in space.
The method comprises the following specific steps:
first, an initial value of the attitude sensor 7 is acquired: the attitude sensor 7 is fixed relative to the stereo camera, the rotation relation between the coordinate system of the attitude sensor 7 and the coordinate system of the stereo camera is determined, and the attitude information DD1 measured by the attitude sensor 7 at this moment is recorded as the initial attitude value;
secondly, performing robot hand-eye calibration to obtain the pose transformation matrix T2 between the robot end gripping tool coordinate system and the stereo camera coordinate system;
Thirdly, calibrating a pose transformation matrix T of a light spot point 5 coordinate system to a pose measurement rod 8 end coordinate system 4
Fourthly, the posture state of the rod piece 8 which is measured by the posture is intuitively reached to the required teaching point, and the teaching work is carried out, specifically:
firstly, determining a track fitting mode of teaching points through a mode selection unit 2; secondly, operating the handheld teaching device to enable the light spot 5 on the handheld teaching device to be within the effective view field range of the stereo camera; then, moving the terminal point of the pose measurement rod 8 of the handheld teaching device to a set point on the planned path; finally, under the condition that the position of the tail end point of the pose measurement rod 8 is kept unchanged, the pose measurement rod 8 is adjusted to a set pose;
and fifthly, acquiring information, specifically:
the signal trigger device 3 is operated to trigger the light spot 5 to emit light, and simultaneously the signal sending module 1 sends a signal to control the stereo camera to take a picture; the stereo camera transmits the captured image information PD to the computer, and at the same time the attitude sensor 7 synchronously transmits the attitude information DD2 measured at this moment to the computer;
sixthly, the computer processes data, specifically:
firstly, based on the attitude information DD1 and DD2, and combining the position information of the light spot 5 in the stereo camera coordinate system, the computer constructs the pose transformation matrix T3 between the stereo camera coordinate system and the light spot 5 coordinate system; secondly, the computer acquires the pose transformation matrix T1 between the robot base coordinate system and the robot end gripping tool coordinate system from the robot control module; finally, using T5 = T1·T2·T3·T4, the computer calculates the pose transformation matrix T5 between the robot base coordinate system and the end coordinate system of the pose measurement rod 8 of the handheld teaching device, converts it into the pose information PP of the rod end point (i.e. the set point) in the robot base coordinate system, and stores PP;
and seventhly, repeating the fourth step to the sixth step until teaching work of the set points on all the planned paths is completed, obtaining pose information PP of the set points on all the planned paths under a robot base coordinate system, and controlling the robot terminal clamping tool to reproduce the taught pose state of the set points by the robot control module according to the pose information PP.
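The transform chain in the sixth step can be sketched with 4x4 homogeneous matrices; the rotations and translations below are hypothetical placeholders (identity rotations for readability), not calibrated values:

```python
import numpy as np

def make_T(R, t):
    """Assemble a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical values:
T1 = make_T(np.eye(3), [0.0, 0.0, 0.5])    # robot base -> end gripping tool
T2 = make_T(np.eye(3), [0.0, 0.1, 0.0])    # tool -> stereo camera (hand-eye)
T3 = make_T(np.eye(3), [0.2, 0.0, 0.3])    # camera -> light spot frame
T4 = make_T(np.eye(3), [0.0, 0.0, 0.15])   # light spot -> rod end frame

T5 = T1 @ T2 @ T3 @ T4                     # base -> rod end (teach point)
PP_position = T5[:3, 3]                    # teach point position in base frame
```

With identity rotations the chain reduces to summing the translations, which makes the closed kinematic chain easy to verify by hand.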
The invention has the beneficial effects that:
1. according to the robot handheld teaching device and method based on stereoscopic vision, the handheld teaching device with the light spot and the attitude sensor is used, the flexibility of the human hand is fully utilized, an operator can rapidly and intuitively move the handheld teaching device to reach the position and the attitude of a set point on a planned path, and compared with the traditional teaching device, the working efficiency is greatly improved.
2. According to the handheld robot teaching device and method based on stereoscopic vision, the pose measurement rod piece provides real-time pose feedback, so that an operator can evaluate and adjust the motion position and the pose of a clamping tool at the tail end of a robot in real time, and the planning track can run safely and stably in a complex environment.
3. The robot handheld teaching device and method based on stereoscopic vision greatly reduce the teaching difficulty of a space continuous curve planning path and have good adaptability to complex teaching working conditions.
Drawings
Fig. 1 is a block diagram of a handheld teaching device based on stereoscopic vision.
Fig. 2 is a schematic diagram of a robot teaching method based on stereoscopic vision.
Fig. 3 is a schematic diagram of a coordinate system of a handheld teaching device based on stereoscopic vision.
In the figure, 1-a signal sending module; 2-a mode selection unit; 3-a signal triggering device; 4-a power supply module; 5-light spot; 6-a teaching device body; 7-attitude sensor; 8-a pose measurement rod; A-a robot; B-a stereo camera; C-a handheld teaching device; the dotted lines between the signal triggering device 3, the power module 4 and the signal sending module 1 represent circuit connections; the dotted arrow between the stereo camera B and the handheld teaching device C indicates the shooting relationship between them.
Detailed Description
The invention provides a handheld teaching device and a track teaching method based on binocular vision and an attitude sensor.
The invention is further described below with reference to the accompanying drawings.
In an embodiment, as shown in fig. 1, a handheld teaching device C is composed of a signal sending module 1, a mode selecting unit 2, a signal triggering device 3, a power supply module 4, a light spot point 5, a teaching device body 6, an attitude sensor 7 and a pose measuring rod 8. The signal sending module 1 and the power supply module 4 are fixed in the rear interior of the teaching device body 6, the attitude sensor 7 is fixed in the front interior of the teaching device body 6, the signal trigger device 3 is fixed on the lower surface of the teaching device body 6, the mode selection unit 2 is fixed on the upper surface of the teaching device body 6, the light spot 5 is fixed on the upper surface of the teaching device body 6, the pose measurement rod 8 is fixed at the front end of the teaching device body 6, and the axis of the pose measurement rod 8 is parallel to the axis of the attitude sensor 7.
During installation, the light spot 5, the attitude sensor 7 and the pose measurement rod 8 must be fastened to the teaching device body 6; to improve measurement stability and measurement accuracy, the light spot 5, the attitude sensor 7 and the pose measurement rod 8 of the handheld teaching device C are rigidly connected.
An operator moves the handheld teaching device C into the position and posture of a set point on the planned path and presses the signal trigger device 3; the light spot image information PD captured by the stereo camera B and the attitude information DD2 measured by the attitude sensor 7 are then transmitted to a computer (not shown) for data processing. In this way the pose information PP of the end point of the pose measurement rod 8 can be acquired and stored in real time, and the robot A can reproduce the pose states of all teaching points according to the pose information PP.
The invention does not limit the pose relationship between the robot A and the stereo camera B; this embodiment takes as an example an eye-in-hand configuration in which the stereo camera B is mounted on the gripping tool at the end of the robot A. The specific operation steps are as follows:
in a first step, a stereo camera B is mounted on a robot end gripping tool.
And secondly, completing the calibration of the internal and external parameters of the stereo camera B and the robot hand-eye calibration.
Thirdly, calibrating the pose transformation matrix T4 between the light spot coordinate system 5-1 and the pose measurement rod end coordinate system 8-1.
As shown in fig. 3, a coordinate system diagram of the handheld teaching device C is shown, a light spot coordinate system 5-1 is established, and XYZ axes of the light spot coordinate system 5-1, the attitude sensor coordinate system 7-1 and the pose measurement rod end coordinate system 8-1 are parallel to each other with the pose measurement rod end coordinate system 8-1 as a reference. At this time, the attitude information DD measured by the attitude sensor 7 may represent the attitude information of the spot point coordinate system 5-1 and the pose measurement rod end coordinate system 8-1.
In the teaching process, the relation between the stereo camera B and the handheld teaching device C is established by the pose transformation matrix T3 between the stereo camera coordinate system and the light spot coordinate system 5-1. The specific method for solving T3 is as follows:
First, the attitude sensor 7 is placed on one side of the stereo camera B by a jig (not shown) so that the XYZ axes of the stereo camera coordinate system and the attitude sensor coordinate system 7-1 are parallel; the angle value measured by the attitude sensor 7 at this time is recorded as the initial angle value DD1 = [θ11, θ12, θ13]. Then, after the handheld teaching device C is operated into the pose state of a set point, the measured value of the attitude sensor 7 is DD2 = [θ21, θ22, θ23]. The difference of the two measured angles is ΔDD = [θ1, θ2, θ3] = [θ21, θ22, θ23] - [θ11, θ12, θ13]. Let θ1, θ2, θ3 be the rotation angles of the light spot coordinate system 5-1 around the X, Y and Z axes of the stereo camera coordinate system respectively, and let Rt be the rotation matrix between the stereo camera coordinate system and the light spot coordinate system 5-1, as shown in equation (1):
$$R_t = R_z(\theta_3)R_y(\theta_2)R_x(\theta_1)=\begin{bmatrix}\cos\theta_3&-\sin\theta_3&0\\\sin\theta_3&\cos\theta_3&0\\0&0&1\end{bmatrix}\begin{bmatrix}\cos\theta_2&0&\sin\theta_2\\0&1&0\\-\sin\theta_2&0&\cos\theta_2\end{bmatrix}\begin{bmatrix}1&0&0\\0&\cos\theta_1&-\sin\theta_1\\0&\sin\theta_1&\cos\theta_1\end{bmatrix}\quad(1)$$
Let [x_g, y_g, z_g]^T be the position coordinates of the light spot 5 in the stereo camera coordinate system; the pose transformation matrix T3 between the stereo camera coordinate system and the light spot coordinate system 5-1 is finally obtained as shown in equation (2):

$$T_3=\begin{bmatrix}R_t&t_g\\\mathbf{0}^\mathsf{T}&1\end{bmatrix},\qquad t_g=\begin{bmatrix}x_g\\y_g\\z_g\end{bmatrix}\quad(2)$$
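Equations (1)-(2) can be sketched in code as follows; the Z·Y·X composition order is an assumption for illustration, since the patent does not spell the order out:

```python
import numpy as np

def rot_xyz(th1, th2, th3):
    """R_t = Rz(th3) @ Ry(th2) @ Rx(th1): rotations about the camera X, Y, Z axes
    (assumed composition order)."""
    c1, s1 = np.cos(th1), np.sin(th1)
    c2, s2 = np.cos(th2), np.sin(th2)
    c3, s3 = np.cos(th3), np.sin(th3)
    Rx = np.array([[1, 0, 0], [0, c1, -s1], [0, s1, c1]])
    Ry = np.array([[c2, 0, s2], [0, 1, 0], [-s2, 0, c2]])
    Rz = np.array([[c3, -s3, 0], [s3, c3, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def build_T3(dd1, dd2, spot_xyz):
    """Equation (2): T3 from the attitude difference Delta DD = DD2 - DD1
    and the spot position [x_g, y_g, z_g] in the stereo camera frame."""
    th1, th2, th3 = np.asarray(dd2, float) - np.asarray(dd1, float)
    T = np.eye(4)
    T[:3, :3] = rot_xyz(th1, th2, th3)
    T[:3, 3] = spot_xyz
    return T

# Hypothetical readings: no attitude change, spot seen at (0.05, 0.0, 0.4) m
T3 = build_T3([0.1, 0.2, 0.3], [0.1, 0.2, 0.3], [0.05, 0.0, 0.4])
```

When DD2 equals DD1 the rotation part is the identity, which is a quick sanity check on the construction.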
let T g A transformation relation shown in formula (3) exists for a translation vector of a stereo camera coordinate system to a position and posture measuring rod end coordinate system 8-1, namely position information of a position and posture measuring rod end point under the stereo camera coordinate system:
Figure BSA0000183250220000052
where [x y z]^T is the translation vector from the light spot coordinate system 5-1 to the pose measurement rod end coordinate system 8-1; it is the quantity to be determined and is denoted T_t.
Further matrix operation yields the transformation relation shown in equation (4), where R_{t,i} and t_{g,i} denote the rotation and translation parts of T3 when the rod end is placed on the i-th feature point:

$$\begin{bmatrix}x_{ni}\\y_{ni}\\z_{ni}\end{bmatrix}=R_{t,i}\begin{bmatrix}x\\y\\z\end{bmatrix}+t_{g,i}\quad(4)$$
where i denotes the number of feature points used for calibration, with i ≥ 3, and [x_ni, y_ni, z_ni]^T is the position of the corresponding feature point in the stereo camera coordinate system.
Substituting the position information (x_n1, y_n1, z_n1), (x_n2, y_n2, z_n2), ..., (x_ni, y_ni, z_ni) of the i feature points into equation (4) gives the stacked system shown in equation (5):

$$\begin{bmatrix}R_{t,1}\\R_{t,2}\\\vdots\\R_{t,i}\end{bmatrix}\begin{bmatrix}x\\y\\z\end{bmatrix}=\begin{bmatrix}[x_{n1}\;y_{n1}\;z_{n1}]^\mathsf{T}-t_{g,1}\\\vdots\\ [x_{ni}\;y_{ni}\;z_{ni}]^\mathsf{T}-t_{g,i}\end{bmatrix}\quad(5)$$
for matrix formats with the form of A.X = B and the matrix A is not a square matrix, the matrix can be obtained by the least square method
Figure BSA0000183250220000061
I.e. the transformation relation described by equation (6):
Figure BSA0000183250220000062
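Equations (4)-(6) amount to a linear least-squares problem for T_t. A self-checking sketch with synthetic poses (all data below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
T_t_true = np.array([0.0, 0.0, 0.12])        # hypothetical spot -> rod-tip offset

Rs, tgs, pts = [], [], []
for _ in range(3):                            # i >= 3 calibration touches
    Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))   # random rotation per pose
    if np.linalg.det(Q) < 0:
        Q[:, 0] = -Q[:, 0]                    # keep a proper rotation (det = +1)
    t_g = rng.normal(size=3)                  # spot position in camera frame
    Rs.append(Q)
    tgs.append(t_g)
    pts.append(Q @ T_t_true + t_g)            # feature point in camera frame, eq. (4)

# Stack into A @ X = B (A is 3i x 3, not square), eq. (5), and solve eq. (6)
A = np.vstack(Rs)
B = np.concatenate([p - t for p, t in zip(pts, tgs)])
T_t, *_ = np.linalg.lstsq(A, B, rcond=None)
```

With noiseless synthetic data the solver recovers the offset exactly; with real measurements the residual of the fit indicates calibration quality.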
The translation vector T_t from the light spot coordinate system 5-1 to the pose measurement rod end coordinate system 8-1 is thereby obtained, and R_t is constructed from the angle information measured by the attitude sensor 7. Finally, since the axes of the two coordinate systems are parallel, the pose transformation matrix T4 from the light spot coordinate system 5-1 to the pose measurement rod end coordinate system 8-1 is constructed according to

$$T_4=\begin{bmatrix}I_3&T_t\\\mathbf{0}^\mathsf{T}&1\end{bmatrix}$$
And fourthly, operating the handheld teaching device to carry out teaching: the object to be processed is placed within the effective field of view of the stereo camera B, the trajectory fitting mode of the teaching points is determined by operating the mode selection unit 2, and the handheld teaching device C is moved to the pose of a set point on the planned path.
Fifthly, acquiring information: the signal trigger device 3 is pressed to trigger the light spot 5 to emit light; simultaneously the signal sending module 1 sends a signal to control the stereo camera B to capture the image information PD of the light spot 5 completely and clearly and transmit it to a computer; meanwhile, the attitude sensor 7 synchronously transmits the attitude information DD2 measured at this moment to the computer, and the computer obtains the pose transformation matrix T3 from the stereo camera coordinate system to the light spot coordinate system 5-1 through data processing (T3 is solved by the same method as in the third step).
Sixthly, the computer processes the data to obtain the pose transformation matrix T5 between the robot base coordinate system and the pose measurement rod end coordinate system 8-1 of the handheld teaching device C.
According to the coordinate system transformation relations shown in fig. 3 and the closed kinematic chain of pose transformations, the pose transformation from the robot base coordinate system to the pose measurement rod end coordinate system 8-1 (denoted T5) is related to: the pose transformation matrix from the robot base coordinate system to the robot end gripping tool coordinate system (denoted T1, acquired from the robot control module); the pose transformation matrix between the robot end gripping tool coordinate system and the stereo camera coordinate system (denoted T2, obtained by robot hand-eye calibration); the pose transformation matrix T3 between the stereo camera coordinate system and the light spot coordinate system 5-1; and the pose transformation matrix T4 between the light spot coordinate system 5-1 and the pose measurement rod end coordinate system 8-1 on the handheld teaching device C. The relationship is shown in equation (7):
T5 = T1 · T2 · T3 · T4    (7)
According to the pose transformation relation T5 between the robot base coordinate system and the pose measurement rod end coordinate system 8-1, the taught set point can be converted into pose information PP in the robot base coordinate system. The computer stores PP; after teaching of all set points is completed, all pose information PP in the base coordinate system is transmitted to the robot control module, which controls the robot to reproduce the pose states of all set points, forming the planned trajectory.
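One way T5 might be converted into the stored pose PP is a position plus Euler angles; the [x, y, z, rx, ry, rz] format with ZYX angles below is an assumed convention for illustration, since the patent does not specify the PP format:

```python
import numpy as np

def pose_from_T(T):
    """Extract [x, y, z, rx, ry, rz] from a 4x4 transform, assuming
    R = Rz(rz) @ Ry(ry) @ Rx(rx) (ZYX Euler, no gimbal-lock handling)."""
    R = T[:3, :3]
    ry = np.arcsin(-R[2, 0])
    rx = np.arctan2(R[2, 1], R[2, 2])
    rz = np.arctan2(R[1, 0], R[0, 0])
    return np.array([T[0, 3], T[1, 3], T[2, 3], rx, ry, rz])

# Hypothetical T5: no rotation, rod end at (0.3, -0.1, 0.6) m in the base frame
T5 = np.eye(4)
T5[:3, 3] = [0.3, -0.1, 0.6]
PP = pose_from_T(T5)
```

A robot controller would consume such pose vectors point by point when reproducing the taught trajectory.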
The above is only a specific application example of the present invention, and the protection scope of the present invention is not limited in any way. In addition to the above embodiments, the present invention may have other embodiments. All technical solutions formed by using equivalent substitutions or equivalent transformations fall within the scope of the present invention.

Claims (5)

1. A robot handheld teaching device based on stereoscopic vision, characterized by at least comprising a signal sending unit (1), a mode selection unit (2), a signal trigger device (3), a power supply module (4), a light spot (5), a teaching device body (6), an attitude sensor (7) and a pose measurement rod (8); the signal sending unit (1) and the power supply module (4) are fixed in the teaching device body (6), the attitude sensor (7) is fixed in the teaching device body (6), the signal trigger device (3) is fixed on the teaching device body (6), and the light spot (5) is fixed on the teaching device body (6); the pose measurement rod (8) is fixed at the front end of the teaching device body (6); the signal trigger device (3), the power supply module (4) and the light spot (5) are connected through a circuit, and the signal trigger device (3), the power supply module (4) and the signal sending unit (1) are connected through a circuit;
the pose measurement rod (8) can be replaced by different pose measurement rods according to different teaching environments; after replacement, the position relation between the axis of the pose measurement rod (8) and the axis of the attitude sensor (7) is fixed and known; the teaching environment at least comprises points, lines, planes and curved surfaces in space;
and establishing a light spot point coordinate system (5-1) of the light spot (5), and enabling XYZ axes of the light spot point coordinate system (5-1), the attitude sensor coordinate system (7-1) and the pose measurement rod end coordinate system (8-1) to be parallel to each other by taking the pose measurement rod end coordinate system (8-1) as a reference.
2. The device according to claim 1, wherein the mode selection unit (2) comprises at least two trajectory fitting modes, straight-line fitting and circular-arc fitting, from which the user selects the trajectory fitting mode for the set points on the planned trajectory being taught.
3. The robot handheld teaching device based on stereoscopic vision according to claim 1, wherein the light spot (5) is a light spot formed by a light emitting diode or an electronic component which is energized to emit light; the position of the light spot (5) fixed on the teaching device body (6) needs to satisfy the following requirements: when the handheld teaching device is shot by a stereo camera, the light spot (5) can be shot completely and clearly by the stereo camera.
4. The teaching method of the robot handheld teaching device based on stereoscopic vision according to claim 1, comprising the steps of:
first, an initial value of the attitude sensor (7) is acquired: the attitude sensor (7) is fixed relative to the stereo camera, the rotation relation Rd between the coordinate system of the attitude sensor (7) and the stereo camera coordinate system is determined, the attitude information DD measured by the attitude sensor (7) at this moment is recorded and, according to the rotation relation Rd, converted into the stereo camera attitude DD1 as the initial attitude value;
second step, performing robot hand-eye calibration and obtaining the pose transformation matrix T2 between the robot end gripping tool coordinate system and the stereo camera coordinate system;
Thirdly, calibrating a pose transformation matrix T between the coordinate system of the light spots (5) and the end coordinate system of the pose measurement rod (8) 4
Fourthly, teaching work is carried out by using the pose measuring rod piece (8) to intuitively reach the pose state of the required teaching point, and specifically, firstly, a track fitting mode of the teaching point is determined through the mode selection unit (2); secondly, operating the handheld teaching device to enable the light spot (5) to be placed in the effective visual field range of the stereoscopic vision system; then, the tail end of the pose measurement rod (8) of the handheld teaching device is placed on a set point on a planned path; finally, under the condition that the position of the tail end point of the pose measurement rod piece (8) is kept unchanged, the pose measurement rod piece (8) is adjusted to a set posture;
fifthly, acquiring information: the signal trigger device (3) is operated to trigger the light spot (5) to emit light, and simultaneously the signal sending unit (1) sends a signal to control the stereo camera to take a picture; the stereo camera transmits the captured image information PD to a computer, and at the same time the attitude sensor (7) synchronously transmits the attitude information DD2 measured at this moment to the computer;
sixthly, the computer processes the data: firstly, based on the attitude information DD1 and DD2, and combining the position information of the light spot (5) in the stereo camera coordinate system, the computer constructs the pose transformation matrix T3 between the stereo camera coordinate system and the coordinate system of the light spot (5); secondly, the computer acquires the pose transformation matrix T1 between the robot base coordinate system and the robot end gripping tool coordinate system from the robot control module; finally, according to T5 = T1·T2·T3·T4, the computer calculates the pose transformation matrix T5 between the robot base coordinate system and the end coordinate system of the pose measurement rod (8) of the handheld teaching device, converts it into the pose information PP of the rod end point (i.e. a set point) in the robot base coordinate system, and stores PP;
and seventhly, the fourth step to the sixth step are repeated until the teaching of all set points on the planned path is completed, so that the pose information PP of all set points on the planned path under the robot base coordinate system is obtained, and the robot control module controls the robot end clamping tool to reproduce the taught pose states of the set points according to the pose information PP.
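The pose chain T5 = T1·T2·T3·T4 in the sixth step is a plain composition of 4x4 homogeneous transforms. The following Python sketch is illustrative only and not part of the patent text; the numeric values of T1 through T4 are hypothetical placeholders for the calibrated links of the chain (robot base to end clamping tool, end tool to stereo camera, camera to light spot, light spot to rod end):

```python
import numpy as np

def make_transform(rotation, translation):
    """Build a 4x4 homogeneous pose matrix from a 3x3 rotation and a 3-vector."""
    t = np.eye(4)
    t[:3, :3] = rotation
    t[:3, 3] = translation
    return t

def compose_chain(*transforms):
    """Compose a chain of 4x4 homogeneous transforms left to right."""
    result = np.eye(4)
    for t in transforms:
        result = result @ t
    return result

# Hypothetical example values for the four calibrated links of the chain.
T1 = make_transform(np.eye(3), [0.0, 0.0, 0.5])   # base -> end clamping tool
T2 = make_transform(np.eye(3), [0.1, 0.0, 0.0])   # end tool -> stereo camera
T3 = make_transform(np.eye(3), [0.0, 0.2, 0.0])   # camera -> light spot
T4 = make_transform(np.eye(3), [0.0, 0.0, 0.15])  # light spot -> rod end

T5 = compose_chain(T1, T2, T3, T4)  # base -> rod-end pose
position_PP = T5[:3, 3]             # taught set-point position in the base frame
```

With identity rotations the translations simply accumulate, which makes it easy to sanity-check a calibration pipeline before inserting real rotation matrices.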
5. The teaching method of the robot handheld teaching device based on stereoscopic vision according to claim 4, characterized in that in the third step, the pose transformation matrix T4 between the coordinate system of the light spot (5) on the handheld teaching device and the end coordinate system of the pose measurement rod (8) is established by the following steps:
a. placing a calibration object containing at least three feature points within the field of view of the stereoscopic vision module, and acquiring the spatial position information of these feature points in the stereo camera coordinate system;
b. operating the handheld teaching device so that the end point of its pose measurement rod (8) is aligned with a feature point, and adjusting the pose of the pose measurement rod (8) so that the light spot (5) on the handheld teaching device can be captured completely and clearly by the stereo camera;
c. then collecting an image of the light spot (5) on the handheld teaching device through the stereo camera, transmitting the image information of the light spot (5) to the computer, and simultaneously having the attitude sensor (7) synchronously transmit the currently measured attitude information DD2 to the computer;
d. processing the data through the computer to obtain the pose transformation matrix T3 between the stereo camera coordinate system on the handheld teaching device and the coordinate system of the light spot (5) under the current pose of the pose measurement rod (8);
e. based on the pose transformation matrix T3 obtained each time a feature point is taught and the corresponding position information of that feature point, establishing the pose transformation matrix T4 between the coordinate system of the light spot (5) on the handheld teaching device and the end coordinate system of the pose measurement rod (8).
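Steps a–e amount to recovering a rigid transform from at least three non-collinear point correspondences. The patent does not specify the solver; a standard choice for this kind of fitting problem is the SVD-based Kabsch method, sketched below under that assumption with hypothetical data (the variable names and the 90-degree test pose are illustrative, not from the patent):

```python
import numpy as np

def fit_rigid_transform(src, dst):
    """Least-squares rigid transform (Kabsch/SVD) such that dst ≈ R @ src + t.

    src, dst: (N, 3) arrays of corresponding 3D points, N >= 3, non-collinear.
    Returns a 4x4 homogeneous pose transformation matrix.
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    src_c = src - src.mean(axis=0)          # centre both point sets
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical correspondences: feature-point positions (dst) versus the same
# points expressed in the light-spot frame (src), related by a known pose.
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])       # 90 degrees about z
t_true = np.array([0.1, -0.2, 0.3])
src = np.array([[0.0, 0.0, 0.0],
                [0.1, 0.0, 0.0],
                [0.0, 0.1, 0.0],
                [0.0, 0.0, 0.1]])
dst = src @ R_true.T + t_true
T4_est = fit_rigid_transform(src, dst)      # recovers R_true and t_true
```

Each taught feature point contributes one correspondence, so teaching more than the minimum three points averages out measurement noise in the least-squares fit.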
CN201910408931.8A 2019-05-16 2019-05-16 Robot handheld teaching device and method based on stereoscopic vision Active CN110919626B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910408931.8A CN110919626B (en) 2019-05-16 2019-05-16 Robot handheld teaching device and method based on stereoscopic vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910408931.8A CN110919626B (en) 2019-05-16 2019-05-16 Robot handheld teaching device and method based on stereoscopic vision

Publications (2)

Publication Number Publication Date
CN110919626A CN110919626A (en) 2020-03-27
CN110919626B true CN110919626B (en) 2023-03-14

Family

ID=69855713

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910408931.8A Active CN110919626B (en) 2019-05-16 2019-05-16 Robot handheld teaching device and method based on stereoscopic vision

Country Status (1)

Country Link
CN (1) CN110919626B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111152230B (en) * 2020-04-08 2020-09-04 季华实验室 Robot teaching method, system, teaching robot and storage medium
CN111360465A (en) * 2020-04-22 2020-07-03 杭州国辰机器人科技有限公司 Welding teaching gun based on intelligent teaching technology
CN111843997A (en) * 2020-07-29 2020-10-30 上海大学 Handheld general teaching system for mechanical arm and operation method thereof
CN113119112B (en) * 2021-03-18 2022-08-09 上海交通大学 Motion planning method and system suitable for vision measurement of six-degree-of-freedom robot
CN113070876A (en) * 2021-03-19 2021-07-06 深圳群宾精密工业有限公司 Manipulator dispensing path guiding and deviation rectifying method based on 3D vision
CN113319854B (en) * 2021-06-25 2023-01-20 河北工业大学 Visual demonstration method and system for bath robot
CN113524157A (en) * 2021-06-30 2021-10-22 深圳市越疆科技有限公司 Robot system, method, robot arm, and storage medium for configuring copy function
CN113580108A (en) * 2021-08-08 2021-11-02 苏州明图智能科技有限公司 Robot-assisted teaching system based on optical tracking
CN114670212B (en) * 2022-04-26 2023-04-21 南通新蓝机器人科技有限公司 IMU and vision-based robot guiding handle and use method thereof
CN115139283B (en) * 2022-07-18 2023-10-24 中船重工鹏力(南京)智能装备系统有限公司 Robot hand-eye calibration method based on random mark dot matrix
CN115741714A (en) * 2022-11-28 2023-03-07 山东大学 Teaching robot control system and method based on image guidance
CN116852359A (en) * 2023-07-04 2023-10-10 无锡斯帝尔科技有限公司 TCP (Transmission control protocol) quick calibration device and method based on robot hand teaching device

Citations (7)

Publication number Priority date Publication date Assignee Title
US5748854A (en) * 1994-06-23 1998-05-05 Fanuc Ltd Robot position teaching system and method
CN103921265A (en) * 2013-01-16 2014-07-16 株式会社安川电机 Robot Teaching System And Robot Teaching Method
CN106142092A (en) * 2016-07-26 2016-11-23 张扬 A kind of method robot being carried out teaching based on stereovision technique
CN106217349A (en) * 2015-06-02 2016-12-14 精工爱普生株式会社 Teaching apparatus and robot system
CN107756408A (en) * 2017-11-22 2018-03-06 浙江优迈德智能装备有限公司 A kind of robot trajectory's teaching apparatus and method based on active infrared binocular vision
CN108214495A (en) * 2018-03-21 2018-06-29 北京无远弗届科技有限公司 A kind of industrial robot teaching system and method
CN109571487A (en) * 2018-09-12 2019-04-05 河南工程学院 A kind of robotic presentation learning method of view-based access control model

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
JP4101677B2 (en) * 2003-02-26 2008-06-18 三菱電機株式会社 Motion simulation device
CN104175031B (en) * 2014-08-20 2016-02-17 北京工业大学 A kind of welding robot system with autonomous centering capacity carries out the method for welding
CN105234943B (en) * 2015-09-09 2018-08-14 大族激光科技产业集团股份有限公司 A kind of industrial robot teaching device and method of view-based access control model identification


Non-Patent Citations (3)

Title
Zhou Haopeng. "Research on a robot teaching method based on machine vision." China Master's Theses Full-text Database, Information Science and Technology. 2018. *
Pan Haihong. "A rapid robot teaching system based on binocular vision." Modular Machine Tool & Automatic Manufacturing Technique. 2021. *
Pan Haihong et al. "Design of a handheld teaching device based on an attitude sensor and light-emitting points." Modular Machine Tool & Automatic Manufacturing Technique. 2021. *

Also Published As

Publication number Publication date
CN110919626A (en) 2020-03-27

Similar Documents

Publication Publication Date Title
CN110919626B (en) Robot handheld teaching device and method based on stereoscopic vision
CN110171009B (en) Robot handheld teaching device based on stereoscopic vision
CN110170995B (en) Robot rapid teaching method based on stereoscopic vision
CN107756408B (en) Robot track teaching device and method based on active infrared binocular vision
US20210402590A1 (en) Robotic navigation system and method
CN110170996B (en) Robot rapid teaching system based on stereoscopic vision
EP3068607B1 (en) System for robotic 3d printing
US20220063098A1 (en) Remote robotic welding with a handheld controller
CN108214495A (en) A kind of industrial robot teaching system and method
EP3272473B1 (en) Teaching device and method for generating control information
CN114043087B (en) Three-dimensional trajectory laser welding seam tracking attitude planning method
CN111347431A (en) Robot teaching spraying method and device for teaching handheld tool
CN110125944B (en) Mechanical arm teaching system and method
CN113246142B (en) Measuring path planning method based on laser guidance
EP3789151B1 (en) Gas tungsten arc welding training system, and a method of operating a gas tungsten arc welding system
CN114434059A (en) Automatic welding system and method for large structural part with combined robot and three-dimensional vision
CN110948467A (en) Handheld teaching device and method based on stereoscopic vision
CN210361314U (en) Robot teaching device based on augmented reality technology
CN110193816B (en) Industrial robot teaching method, handle and system
CN111843997A (en) Handheld general teaching system for mechanical arm and operation method thereof
CN111307155A (en) Double-cooperative-robot initial positioning measuring device and initial positioning method
CN114227681A (en) Robot off-line virtual teaching programming method based on infrared scanning tracking
CN117047237B (en) Intelligent flexible welding system and method for special-shaped parts
CN113001142B (en) Automatic double-mechanical-arm assembling system for large-scale block optical assembly
CN111360789B (en) Workpiece processing teaching method, control method and robot teaching system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant