CN109352633B - Industrial robot teaching method and system and computer storage medium - Google Patents


Info

Publication number
CN109352633B
Authority
CN
China
Prior art keywords
point
industrial robot
indicating
picture
robot teaching
Prior art date
Legal status
Active
Application number
CN201811473973.1A
Other languages
Chinese (zh)
Other versions
CN109352633A (en)
Inventor
吴曼玲
刘向东
万小丽
刘景亚
Current Assignee
CISDI Engineering Co Ltd
CISDI Chongqing Information Technology Co Ltd
Original Assignee
CISDI Engineering Co Ltd
CISDI Chongqing Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by CISDI Engineering Co Ltd and CISDI Chongqing Information Technology Co Ltd
Priority to CN201811473973.1A
Publication of CN109352633A
Application granted
Publication of CN109352633B

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/0081 Programme-controlled manipulators with master teach-in means
    • B25J 13/00 Controls for manipulators
    • B25J 13/08 Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J 9/16 Programme controls
    • B25J 9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J 9/161 Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B25J 9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J 9/1697 Vision controlled systems
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Human Computer Interaction (AREA)
  • Numerical Control (AREA)
  • Manipulator (AREA)

Abstract

The present invention provides an industrial robot teaching system, comprising: an indicating end, arranged at the end of the robot and used for forming an indication point; a vision end, arranged at the end of the robot and used for acquiring an image that includes at least the indication point; a presentation end, used for presenting the image; a first operation end, used for determining a target point on the image presented by the presentation end; and a second operation end, used for sending a control instruction to the indicating end so that the indication point coincides with the target point. The industrial robot teaching system simplifies the complicated operation of a conventional teach pendant and reduces the demands on operator experience and robot expertise; compared with off-line programming, it does not require a workpiece model or the working environment to be imported in advance, and it is suitable for machining of varied, small-batch and non-standard workpieces.

Description

Industrial robot teaching method and system and computer storage medium
Technical Field
The invention relates to the technical field of industrial robot control and teaching, and in particular to an industrial robot teaching method, an industrial robot teaching system and a computer storage medium.
Background
With the continuous improvement of industrial automation, industrial robots are widely used in automobile and automobile-parts manufacturing, heavy machinery, aerospace, shipbuilding, the chemical industry, the electronics industry and other fields. Generally, industrial robot teaching is carried out in two main modes: teach-pendant teaching and off-line programming.
In teach-pendant teaching, an operator moves the robot end to a target position with a teach pendant, and the robot joint angles corresponding to that position are recorded in memory. When the taught operation is to be reproduced, the robot controller reads the taught positions in sequence and replays the taught trajectory. Teach-pendant teaching places certain demands on operators, who need professional knowledge and a programming background, and the training period is relatively long. It suits high-volume repetitive production, where one teaching session can be reused many times, but it is not well suited to machining of varied, small-batch and non-standard parts, for which the teaching workload is large.
Off-line programming is an extension of robot programming languages. It uses computer graphics to build geometric models of the robot and its working environment, plans and programs the robot's task off line, performs dynamic simulation and collision detection on the result, and finally generates robot motion code that is transmitted to the robot control cabinet so that the robot can complete the assigned task. Off-line programming requires the robot model, working-environment parameters and workpiece dimensions to be imported in advance, and is therefore unsuitable when the working environment is unknown or the machined part is not a standard part.
Disclosure of Invention
In view of the above shortcomings of the prior art, an object of the present invention is to provide an industrial robot teaching method, an industrial robot teaching system and a computer-readable storage medium that overcome the high difficulty of manual teaching in the prior art.
To achieve the above and other related objects, the present invention provides an industrial robot teaching system comprising at least:
an indicating end, arranged at the end of the robot and used for forming an indication point;
a vision end, arranged at the end of the robot and used for acquiring an image that includes at least the indication point;
a presentation end, used for presenting the image;
a first operation end, used for determining a target point on the image presented by the presentation end; and
a second operation end, used for sending a control instruction to the indicating end so that the indication point coincides with the target point.
Optionally, the teaching system further comprises a storage end for storing the three-dimensional space coordinates of the target point.
Optionally, the indicating end is a cursor indicator for emitting a laser beam to form a laser spot.
Optionally, the vision end comprises at least a camera arranged at the end of the robot.
To achieve the above and other related objects, the present invention also provides an industrial robot teaching method, which includes at least the steps of:
forming an indication point;
acquiring an image that includes at least the indication point;
presenting the image;
determining a target point on the image; and
sending a control instruction to make the indication point coincide with the target point.
Optionally, after the step of sending a control instruction to make the indication point coincide with the target point, the method further includes storing the three-dimensional space coordinates of the target point.
Optionally, the indication point is a laser spot formed by a laser beam emitted by a cursor indicator.
Optionally, the image of the indication point is acquired by a vision system, and the vision system comprises at least a camera arranged at the end of the robot.
Optionally, before the step of forming the indication point, at least a calibration step is further included for calibrating the camera parameters, the relative position of the camera and the cursor indicator, and the relative position of the camera coordinate system and the robot base coordinate system.
To achieve the above and other related objects, the present invention also provides a computer-readable storage medium storing a computer program which, when executed by a processor, performs the industrial robot teaching method.
As described above, the industrial robot teaching method and system of the present invention have the following beneficial effects:
the industrial robot teaching method provided by the invention obtains a picture containing light spots (indication points) by a camera, manually clicks the points on the picture as 'guide points', sends a motion instruction to a robot controller to enable the tail end of the robot to move towards the direction of the 'guide points', so that the guide light spots are superposed with a teaching target point to obtain the coordinates of the teaching points.
The industrial robot teaching method and the system simplify the complicated operation of the traditional teaching device, reduce the experience requirements of operators and the mastering of robot knowledge, simultaneously do not need to introduce a workpiece model and an operation environment in advance compared with an off-line programming mode, and are suitable for the processing conditions of various, small-batch and nonstandard workpieces. In addition, aiming at teaching scenes with severe environment and damage to the health of workers, the indication end and the display end can be arranged at two places, so that remote teaching operation can be realized.
Drawings
Fig. 1 shows a flow chart of an industrial robot teaching method according to an embodiment of the invention;
Fig. 2 shows a schematic structural diagram of an industrial robot teaching system according to an embodiment of the present invention.
Detailed Description
The embodiments of the present invention are described below with reference to specific examples; those skilled in the art will readily understand other advantages and effects of the present invention from the disclosure of this specification.
The invention may also be implemented or applied through other different embodiments, and the details in this specification may be modified or changed in various ways without departing from the spirit and scope of the present invention. It should be noted that, in the absence of conflict, the features in the following embodiments and examples may be combined with one another.
It should be noted that the drawings provided with the following embodiments only illustrate the basic idea of the invention; they show only the components related to the invention rather than the number, shape and size of the components in an actual implementation, in which the type, quantity and proportion of the components may vary and the layout may be more complicated.
As shown in Fig. 1, the present invention provides an industrial robot teaching method, which includes at least the following steps.
s1 forms an indication point.
In one embodiment, the indication point is a laser spot formed by a laser beam emitted by the cursor indicator. The laser spot formed by the cursor indicator is always within the visual field of the camera.
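The patent does not specify how the spot is located in the camera image. Purely as an illustration (not part of the original disclosure), a minimal Python/OpenCV sketch that finds a red laser spot as the brightest red-dominant region might look as follows; the threshold value is an assumed tuning parameter.

```python
# Illustrative sketch only: locate a red laser spot in a BGR image with OpenCV.
import cv2
import numpy as np

def find_laser_spot(image_bgr):
    """Return (u, v) pixel coordinates of the brightest red-dominant blob, or None."""
    red = image_bgr[:, :, 2].astype(np.float32) - 0.5 * (
        image_bgr[:, :, 1].astype(np.float32) + image_bgr[:, :, 0].astype(np.float32))
    red = cv2.GaussianBlur(red, (9, 9), 0)       # suppress pixel noise
    _, max_val, _, max_loc = cv2.minMaxLoc(red)
    return max_loc if max_val > 60 else None     # 60 is an assumed threshold
```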
S2: acquire an image that includes at least the indication point.
In one embodiment, the image of the indication point is acquired by a vision system, and the vision system comprises at least a camera arranged at the end of the robot and a bracket on which the camera is mounted.
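As a non-limiting illustration (not part of the patent), grabbing a frame from the end-mounted camera could be done as follows, assuming the camera is exposed as an OpenCV video device; the device index is an assumption.

```python
# Illustrative sketch: grab one frame from the end-mounted camera via OpenCV.
import cv2

cap = cv2.VideoCapture(0)                  # device index 0 is an assumed example
if not cap.isOpened():
    raise RuntimeError("end-mounted camera not found")
ok, frame = cap.read()                     # BGR image expected to contain the laser spot
if not ok:
    raise RuntimeError("failed to grab a frame")
```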
S3: present the image.
The image is presented by vision software installed on a PC.
S4: determine a target point on the image.
Specifically, on the image displayed by the vision software, a point is selected as a "guide point", and this guide point is the target point. The guide point may be selected by clicking the image with a mouse, in which case the clicked point is the guide point. The guide point may also be selected by touch, in which case the aforementioned PC has a touch-screen function.
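By way of illustration only, selecting the guide point with a mouse click could be implemented with an OpenCV window and mouse callback as sketched below; the window name and variable names are illustrative, and `frame` refers to the image grabbed in the previous sketch.

```python
# Illustrative sketch: pick the guide (target) point by clicking the displayed image.
import cv2

guide_point = None

def on_mouse(event, x, y, flags, param):
    global guide_point
    if event == cv2.EVENT_LBUTTONDOWN:     # left click marks the guide point
        guide_point = (x, y)

cv2.namedWindow("teaching view")
cv2.setMouseCallback("teaching view", on_mouse)
while guide_point is None:
    cv2.imshow("teaching view", frame)     # 'frame' from the capture sketch above
    if cv2.waitKey(30) == 27:              # Esc aborts the selection
        break
```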
S5: send a control instruction to make the indication point coincide with the target point.
Specifically, a robot motion instruction is sent so that the indication point moves toward the point clicked by the mouse; when the indication point coincides with the target point, the position teaching of this target point is completed and the three-dimensional space coordinates of the indication point are recorded.
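The patent leaves the control law open; as an assumption-laden sketch, the motion toward the clicked point can be driven by a simple image-space proportional loop. The send_jog()/get_tcp_pose() robot interface, the gain and the pixel tolerance are illustrative assumptions, not part of the disclosure.

```python
# Illustrative sketch: move the spot toward the clicked guide point until they coincide.
import numpy as np

PIXEL_TOLERANCE = 3       # spot and target treated as coincident within 3 px (assumed)
GAIN = 0.0005             # metres of end motion per pixel of image error (assumed)

def teach_one_point(cap, guide_point, robot):
    while True:
        ok, frame = cap.read()
        if not ok:
            continue
        spot = find_laser_spot(frame)             # from the detection sketch above
        if spot is None:
            continue
        error = np.subtract(guide_point, spot)    # pixel error (du, dv)
        if np.linalg.norm(error) < PIXEL_TOLERANCE:
            return robot.get_tcp_pose()           # record the taught 3-D coordinates
        # Map pixel error to a small Cartesian jog of the robot end; the sign and axis
        # mapping depend on the hand-eye calibration and are assumed aligned here.
        robot.send_jog(dx=GAIN * error[0], dy=GAIN * error[1], dz=0.0)
```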
In one embodiment, before the step of forming the indication point, at least a calibration step is performed for calibrating the camera parameters, the relative position of the camera and the cursor indicator, and the relative position of the camera coordinate system and the robot base coordinate system.
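As an illustration of what such a calibration step could involve (not prescribed by the patent), the sketch below uses OpenCV's chessboard-based intrinsic calibration and hand-eye calibration; object_points, image_points, image_size and the robot flange poses R_gripper2base/t_gripper2base are assumed to have been collected beforehand.

```python
# Illustrative calibration sketch, assuming OpenCV >= 4.1 and pre-collected data.
import cv2

# 1) Intrinsic camera parameters from several chessboard views.
ret, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    object_points, image_points, image_size, None, None)

# 2) Hand-eye calibration: pose of the camera relative to the robot flange; together
#    with the robot pose this fixes the camera coordinate system in the base frame.
R_cam2flange, t_cam2flange = cv2.calibrateHandEye(
    R_gripper2base, t_gripper2base, rvecs, tvecs,
    method=cv2.CALIB_HAND_EYE_TSAI)

# 3) The camera-to-cursor-indicator (laser) offset is a fixed mechanical transform,
#    assumed here to be measured once and stored as a constant.
```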
Specifically, in this embodiment the target point is a stemming center point: the position coordinates of different stemming center points are obtained by moving the light spot (indication point), thereby completing the teaching process. In the vision software, a path is planned through the acquired target points (the path consists of the line segment from the indication point to the target point and the line segments between successive target points), motion code is generated from the path and sent to the robot controller, and the robot end completes the stemming-adding action.
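Purely to illustrate this last step, taught coordinates could be turned into a simple motion program as sketched below; the MoveL text format and the controller interface are placeholders, not a vendor API.

```python
# Illustrative sketch: generate motion code from the indication point and taught points.
def build_motion_code(indication_point, target_points, speed_mm_s=100):
    path = [indication_point] + list(target_points)   # segments between successive points
    lines = [f"MoveL X={x:.1f} Y={y:.1f} Z={z:.1f} V={speed_mm_s}" for x, y, z in path]
    return "\n".join(lines)

# Example use (names are placeholders):
# program = build_motion_code(spot_xyz, taught_points)
# controller.send(program)   # hand the generated program to the robot controller
```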
In the industrial robot teaching method provided by the invention, the camera acquires an image containing the light spot (indication point), the operator clicks a point on the image as the guide point, and a motion instruction is sent to the robot controller so that the robot end moves toward the guide point until the light spot coincides with the teaching target point, at which time the coordinates of the taught point are obtained.
As shown in Fig. 2, the present invention also provides an industrial robot teaching system, which comprises at least an indicating end, a vision end, a presentation end, a first operation end and a second operation end.
The indicating end 30 is arranged at the end of the robot and is used for forming an indication point.
In one embodiment, the indication point is a laser spot formed by a laser beam emitted by the cursor indicator, and the laser spot formed by the cursor indicator always remains within the field of view of the camera.
The vision end 31 is arranged at the end of the robot and is used for acquiring an image that includes at least the indication point.
In one embodiment, the vision end comprises a camera and a camera mounting bracket arranged at the end of the robot.
The presentation end 32 is used for presenting the image.
In one embodiment, the image is presented by vision software installed on a PC.
In one embodiment, the indicating end and the presentation end can be located at two separate places and connected in a wired or wireless manner.
The first operation end 33 is used for determining a target point on the image presented by the presentation end.
Specifically, the first operation end selects a point on the image displayed by the vision software as a "guide point", and this guide point is the target point. More specifically, the first operation end may be a mouse; the guide point is then selected by clicking the image with the mouse, and the clicked point is the guide point. The guide point may also be selected by touch, in which case the aforementioned PC has a touch-screen function.
The second operation end 34 is used for sending a control instruction to the indicating end so that the indication point coincides with the target point.
Specifically, the second operation end sends a robot motion instruction to the robot so that the indication point moves toward the point clicked by the mouse; when the indication point coincides with the target point, the position teaching of this target point is completed, and the three-dimensional space coordinates of the indication point are recorded and stored by the storage end 35.
In one embodiment, before the indication point is formed, at least a calibration step is performed for calibrating the camera parameters, the relative position of the camera and the cursor indicator, and the relative position of the camera coordinate system and the robot base coordinate system.
Specifically, in this embodiment the target point is a stemming center point: the position coordinates of different stemming center points are obtained by moving the light spot (indication point), thereby completing the teaching process. In the vision software, a path is planned through the acquired target points (the path consists of the line segment from the indication point to the target point and the line segments between successive target points), motion code is generated from the path and sent to the robot controller, and the robot end completes the stemming-adding action.
The present invention also provides a computer-readable storage medium storing a computer program which, when executed by a processor, performs the industrial robot teaching method shown in Fig. 1.
The foregoing embodiments merely illustrate the principles and utilities of the present invention and are not intended to limit the invention. Any person skilled in the art may modify or change the above embodiments without departing from the spirit and scope of the present invention. Accordingly, all equivalent modifications or changes made by those of ordinary skill in the art without departing from the spirit and technical teachings disclosed by the present invention shall still be covered by the claims of the present invention.

Claims (10)

1. An industrial robot teaching system, characterized in that it comprises at least:
an indicating end, arranged at the end of the robot and used for forming an indication point;
a vision end, arranged at the end of the robot and used for acquiring an image that includes at least the indication point;
a presentation end, used for presenting the image;
a first operation end, used for determining a target point on the image presented by the presentation end; and
a second operation end, used for sending a control instruction to the indicating end so that the indication point moves toward the target point and coincides with the target point;
wherein the indicating end and the presentation end are arranged at two separate places and are connected in a wired or wireless manner.
2. The industrial robot teaching system according to claim 1, further comprising a storage end for storing the three-dimensional space coordinates of the target point.
3. The industrial robot teaching system according to claim 1, wherein the indicating end is a cursor indicator for emitting a laser beam to form a laser spot.
4. The industrial robot teaching system according to claim 1, wherein the vision end comprises at least a camera arranged at the end of the robot.
5. An industrial robot teaching method using the industrial robot teaching system according to any one of claims 1 to 4, characterized in that the industrial robot teaching method comprises at least the steps of:
forming an indication point;
acquiring an image that includes at least the indication point;
presenting the image;
determining a target point on the image; and
sending a control instruction to the indicating end so that the indication point moves toward the target point and coincides with the target point.
6. The industrial robot teaching method according to claim 5, further comprising, after the step of sending a control instruction to make the indication point coincide with the target point, storing the three-dimensional space coordinates of the target point.
7. The industrial robot teaching method according to claim 5, wherein the indication point is a laser spot formed by a laser beam emitted by a cursor indicator.
8. The industrial robot teaching method according to claim 5, wherein the image of the indication point is acquired by a vision system comprising at least a camera arranged at the end of the robot.
9. The industrial robot teaching method according to claim 8, further comprising, before the step of forming an indication point, at least a calibration step for calibrating the camera parameters, the relative position of the camera and the cursor indicator, and the relative position of the camera coordinate system and the robot base coordinate system.
10. A computer-readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, performs the industrial robot teaching method according to any of claims 5-9.
CN201811473973.1A 2018-12-04 2018-12-04 Industrial robot teaching method and system and computer storage medium Active CN109352633B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811473973.1A CN109352633B (en) 2018-12-04 2018-12-04 Industrial robot teaching method and system and computer storage medium

Publications (2)

Publication Number Publication Date
CN109352633A CN109352633A (en) 2019-02-19
CN109352633B 2022-08-09

Family

ID=65331137

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811473973.1A Active CN109352633B (en) 2018-12-04 2018-12-04 Industrial robot teaching method and system and computer storage medium

Country Status (1)

Country Link
CN (1) CN109352633B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113510680A (en) * 2021-07-06 2021-10-19 河钢工业技术服务有限公司 Mud adding robot and method for blast furnace mud gun

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3366433B1 (en) * 2017-02-09 2022-03-09 Canon Kabushiki Kaisha Method of controlling robot, method of teaching robot, and robot system

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102135776A (en) * 2011-01-25 2011-07-27 解则晓 Industrial robot control system based on visual positioning and control method thereof
CN106113050A (en) * 2016-07-05 2016-11-16 昆山华恒机器人有限公司 The teaching method of industrial robot, control method and device, system
CN108127668A (en) * 2017-12-19 2018-06-08 合肥欣奕华智能机器有限公司 The teaching apparatus and robot system of a kind of teaching method of robot, robot

Also Published As

Publication number Publication date
CN109352633A (en) 2019-02-19


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 400013 No. 1, Shuang Steel Road, Yuzhong District, Chongqing, China

Applicant after: CISDI ENGINEERING Co.,Ltd.

Applicant after: MCC CCID information technology (Chongqing) Co.,Ltd.

Address before: 400013 No. 1, Shuang Steel Road, Yuzhong District, Chongqing, China

Applicant before: CISDI ENGINEERING Co.,Ltd.

Applicant before: CISDI CHONGQING INFORMATION TECHNOLOGY Co.,Ltd.

GR01 Patent grant