CN110732814A - intelligent welding robot based on vision technology - Google Patents

Intelligent welding robot based on vision technology

Info

Publication number
CN110732814A
CN110732814A
Authority
CN
China
Prior art keywords
sensor
welding
end effector
vision
central control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910933131.8A
Other languages
Chinese (zh)
Inventor
廖建飞
黄进财
梁炳东
戴冰鸿
万安
郑婉君
褚云云
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhuhai Gongchuang Future Artificial Intelligence Research Institute Co Ltd
Zhuhai Zhong Chuang Chuang Hui Technology Co Ltd
Original Assignee
Zhuhai Gongchuang Future Artificial Intelligence Research Institute Co Ltd
Zhuhai Zhong Chuang Chuang Hui Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhuhai Gongchuang Future Artificial Intelligence Research Institute Co Ltd and Zhuhai Zhong Chuang Chuang Hui Technology Co Ltd
Priority to CN201910933131.8A
Publication of CN110732814A
Legal status: Pending

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B23MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23KSOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K37/00Auxiliary devices or processes, not specially adapted to a procedure covered by only one of the preceding main groups
    • B23K37/02Carriages for supporting the welding or cutting element
    • B23K37/0252Steering means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • B25J19/04Viewing devices
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Robotics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Optics & Photonics (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses an intelligent welding robot based on vision technology, which comprises an end effector, a vision sensor, a six-axis mechanical arm, a central control board and a neural network computing unit. The end effector is connected to the end joint of the six-axis mechanical arm; the vision sensor is fixed on the end effector and moves with it; the neural network computing unit is arranged on the central control board; and the end effector, the vision sensor and the six-axis mechanical arm are each electrically connected to the central control board.

Description

Intelligent welding robot based on vision technology
Technical Field
The invention relates to the field of intelligent processing equipment, and in particular to an intelligent welding robot based on vision technology.
Background
Existing automatic welding equipment often has to handle different welding objects. Each time a new object is to be welded, the welding path or welding points must be set manually, after which the equipment works strictly according to the preset program. It cannot adapt when the welding position changes significantly; manual intervention is then required, and the equipment cannot quickly adjust to a new welding environment. This places high demands on operator skill, increases labor cost, and is inconvenient.
Disclosure of Invention
To solve the above problems, the present invention provides an intelligent welding robot based on vision technology, which uses a neural network computing unit to automatically learn the characteristics of welding objects and can directly identify a learned object in a new environment, thereby reducing the need for manual intervention.
The technical scheme adopted by the invention for solving the problems is as follows:
An intelligent welding robot based on vision technology, comprising:
an end effector for contacting a target object to perform welding;
a vision sensor for identifying the welding position of the target object from images;
a six-axis mechanical arm for moving the end effector to a specified position;
a central control board for receiving control instructions and system settings;
a neural network computing unit for autonomously learning the welding images returned by the vision sensor, so as to quickly identify a new welding environment;
wherein the end effector is connected to the end joint of the six-axis mechanical arm, the vision sensor is fixed on the end effector and moves with it, the neural network computing unit is arranged on the central control board, and the end effector, the vision sensor and the six-axis mechanical arm are each electrically connected to the central control board.
Further, the robot comprises a control panel and a manual rocker for manually controlling the six-axis mechanical arm; the control panel and the manual rocker are arranged independently beside the six-axis mechanical arm and are connected to the central control board by cables.
Further, the central control board comprises:
a point positioning module for periodically acquiring coordinate points of the six-axis mechanical arm and plotting them into a motion path, the coordinate points being related to the deflection position of the motor at each joint of the six-axis mechanical arm;
a storage module for storing the motion path of the end effector and the images from the vision sensor.
Further, the end effector is a welding gun with a straight head; the vision sensor lies on the same line as the welding gun head, and the vision sensor is oriented in the same direction as the welding gun head.
Further, the vision sensor comprises a CMOS image sensor and a laser ranging sensor; the CMOS image sensor sends images of the welding environment to the neural network computing unit, the laser ranging sensor sends distance information to the neural network computing unit, and the CMOS image sensor and the laser ranging sensor are oriented in the same direction.
The technical schemes provided by the embodiments of the invention have the following beneficial effects: the neural network computing unit identifies the images of the welding object obtained by the vision sensor and automatically extracts feature values to obtain the common characteristics of welding objects; when a new welding object needs to be welded, weld point identification is completed automatically and the weld points are displayed to a worker for confirmation and adjustment; during welding, the position of the end effector can also be adjusted in real time according to the feature values. This reduces manual intervention and labor cost.
Drawings
The invention is further illustrated in the following description with reference to the figures and examples.
FIG. 1 is a schematic diagram of a module connection according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of the structural connections of an embodiment of the present invention;
FIG. 3 is a schematic view of a vision sensor configuration according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of the central control board structure according to an embodiment of the present invention.
Detailed Description
To make the objects, technical solutions and advantages of the present invention clearer, the present invention is described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are for illustration only and are not intended to limit the invention.
Referring to fig. 1 and 2, an embodiment of the present invention provides an intelligent welding robot based on vision technology, comprising:
an end effector 100 for contacting a target object to perform welding;
a vision sensor 200 for recognizing a welding position of a target object through an image;
a six-axis mechanical arm 300 for moving the end effector 100 to a specified position;
a central control board for receiving control instructions and system settings;
a neural network computing unit for autonomously learning the welding image transmitted back by the vision sensor 200, thereby rapidly identifying a new welding environment;
the end effector 100 is connected to the end joints of the six-axis robot 300, the visual sensor 200 is fixed on the end effector 100, the visual sensor 200 moves along with the end effector 100, the neural network computing unit is arranged on the central control board, and the end effector 100, the visual sensor 200 and the six-axis robot 300 are respectively and electrically connected to the central control board.
The neural network computing unit identifies the images sent by the vision sensor 200: feature values are extracted from the positions of the weld points and the shapes and sizes of the weld seams, and training is completed with a neural network algorithm. When the end effector 100 moves to a new welding environment, the neural network computing unit can rapidly identify it by comparing feature values, so that automatic welding is realized and manual intervention is reduced.
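The patent does not give an implementation of this feature extraction and comparison. As a rough sketch of the kind of processing described, the following Python example (using PyTorch) maps a weld image to a feature vector with a small convolutional network and compares it against feature vectors stored for previously learned welding objects; the network layout, names and similarity measure are illustrative assumptions, not part of the disclosure.

```python
# Illustrative sketch only (not from the patent): a small CNN that maps a weld-area
# image to a feature vector, plus a cosine-similarity comparison against feature
# vectors stored for previously learned welding objects.
import torch
import torch.nn as nn
import torch.nn.functional as F

class WeldFeatureNet(nn.Module):
    """Maps a grayscale weld image (1xHxW) to a fixed-length feature vector."""
    def __init__(self, feature_dim: int = 64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),          # global average pooling
        )
        self.fc = nn.Linear(32, feature_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.conv(x).flatten(1)
        return F.normalize(self.fc(h), dim=1)  # unit-length feature vector

def best_match(query: torch.Tensor, known: dict) -> tuple:
    """Return the learned welding object whose stored feature is most similar."""
    scores = {name: float(query @ feat) for name, feat in known.items()}
    name = max(scores, key=scores.get)
    return name, scores[name]

if __name__ == "__main__":
    net = WeldFeatureNet().eval()
    with torch.no_grad():
        new_image = torch.rand(1, 1, 128, 128)          # stand-in for a camera frame
        library = {"butt_joint": net(torch.rand(1, 1, 128, 128))[0],
                   "fillet_joint": net(torch.rand(1, 1, 128, 128))[0]}
        name, score = best_match(net(new_image)[0], library)
        print(f"closest learned object: {name} (similarity {score:.2f})")
```

In a real system the network would of course be trained on labelled weld images rather than used with random weights; the sketch only shows the embed-and-compare structure implied by "comparing the feature values".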
Preferably, the robot further comprises a control panel 400 and a manual rocker 500 for manually controlling the six-axis mechanical arm 300; the control panel 400 and the manual rocker 500 are arranged independently beside the six-axis mechanical arm 300 and are connected to the central control board by cables. The control panel 400 typically includes start/stop buttons, a display screen, operation status indicator lamps and an input device. The manual rocker 500 is linked to the six-axis mechanical arm 300, so that a worker can operate the manual rocker 500 to adjust the pose of the arm; during this process the central control board collects and stores the movement path, and the welding path taught by the worker is reproduced during batch processing.
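As a minimal sketch of this teach-and-replay behaviour (not taken from the patent), the following Python code samples the joint positions periodically while the worker jogs the arm and replays the stored waypoints afterwards; `read_joint_positions` and `command_joint_positions` are hypothetical stand-ins for whatever interface the central control board exposes.

```python
# Illustrative sketch only (not from the patent): periodic sampling of joint
# positions during manual teaching, and replay of the stored path afterwards.
import time
from typing import Callable, List, Sequence

JointPose = Sequence[float]  # six joint angles, e.g. in degrees

def record_path(read_joint_positions: Callable[[], JointPose],
                duration_s: float, period_s: float = 0.05) -> List[JointPose]:
    """Sample the six joint angles at a fixed period while the worker jogs the arm."""
    path: List[JointPose] = []
    t_end = time.monotonic() + duration_s
    while time.monotonic() < t_end:
        path.append(tuple(read_joint_positions()))
        time.sleep(period_s)
    return path

def replay_path(path: List[JointPose],
                command_joint_positions: Callable[[JointPose], None],
                period_s: float = 0.05) -> None:
    """Send the stored waypoints back to the arm at the same period."""
    for pose in path:
        command_joint_positions(pose)
        time.sleep(period_s)

if __name__ == "__main__":
    fake_arm = [0.0] * 6                                  # placeholder joint readings
    taught = record_path(lambda: fake_arm, duration_s=0.2)
    replay_path(taught, lambda p: print("move to", p))
```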
Referring to fig. 4, preferably, the central control board includes:
a point positioning module for periodically acquiring coordinate points of the six-axis mechanical arm 300 and plotting them into a motion path, the coordinate points being related to the deflection position of the motor at each joint of the six-axis mechanical arm 300;
a storage module for storing the motion path of the end effector 100 and the image of the vision sensor 200.
With this path-recording function, the motion path can be acquired not only by the worker's manual adjustment and positioning but also by the worker entering a target position in a program: the central control board trial-runs the six-axis mechanical arm 300 to the target position and, after fine adjustment, converts the deflection positions of the joint motors into a motion path. The final motion path is stored in the storage module and can be recalled directly during production.
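The patent does not specify how joint motor positions are converted into coordinate points. One common approach, shown below purely as an illustrative sketch, is standard Denavit-Hartenberg forward kinematics: each set of six joint angles is chained through per-joint transforms to give an end-effector position, and a logged sequence of joint readings becomes a Cartesian motion path. The DH parameters in the sketch are placeholders, not those of any particular arm.

```python
# Illustrative sketch only (not from the patent): converting the six joint angles
# (motor deflection positions) into an end-effector coordinate point with standard
# Denavit-Hartenberg forward kinematics.
import numpy as np

# Placeholder DH table: (a, alpha, d) per joint, lengths in metres.
DH = [(0.0, np.pi/2, 0.4), (0.5, 0.0, 0.0), (0.1, np.pi/2, 0.0),
      (0.0, -np.pi/2, 0.5), (0.0, np.pi/2, 0.0), (0.0, 0.0, 0.1)]

def dh_matrix(theta: float, a: float, alpha: float, d: float) -> np.ndarray:
    """Homogeneous transform for one joint in DH convention."""
    ct, st, ca, sa = np.cos(theta), np.sin(theta), np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st*ca,  st*sa, a*ct],
                     [st,  ct*ca, -ct*sa, a*st],
                     [0.0,    sa,     ca,    d],
                     [0.0,   0.0,    0.0,  1.0]])

def end_effector_xyz(joint_angles_rad) -> np.ndarray:
    """Chain the six joint transforms and return the tool position (x, y, z)."""
    T = np.eye(4)
    for theta, (a, alpha, d) in zip(joint_angles_rad, DH):
        T = T @ dh_matrix(theta, a, alpha, d)
    return T[:3, 3]

if __name__ == "__main__":
    # A recorded sequence of joint readings becomes a Cartesian motion path.
    joint_log = [np.zeros(6), np.radians([10, -20, 30, 0, 15, 0])]
    motion_path = [end_effector_xyz(q) for q in joint_log]
    print(np.round(motion_path, 3))
```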
Referring to fig. 3, the vision sensor 200 comprises a CMOS image sensor 210 and a laser ranging sensor 220. The CMOS image sensor 210 transmits images of the welding environment to the neural network computing unit, the laser ranging sensor 220 transmits distance information to the neural network computing unit, and the two sensors are oriented in the same direction. Using a conventional welding gun 110 as the end effector 100 facilitates welding positioning; in practice different end effectors 100 can be fitted for different welding objects, but in general the end effector 100 faces the welding object, and the vision sensor 200 is oriented in the same direction as the end effector 100 to facilitate image and distance acquisition. In this embodiment, the combination of the CMOS image sensor 210 and the laser ranging sensor 220 achieves 3D mapping of the workspace, which improves the adaptability to complex welding objects to a certain extent.
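How the image and the range reading are combined into 3D information is not detailed in the patent. A minimal sketch of one possible fusion, assuming a pinhole camera model with placeholder intrinsics and assuming the laser range can be taken along the pixel's viewing ray, is:

```python
# Illustrative sketch only (not from the patent): fusing a pixel position from the
# CMOS image sensor with a distance reading from the laser ranging sensor to get a
# 3D point in the camera frame. The intrinsics (FX, FY, CX, CY) are placeholders.
import numpy as np

FX, FY = 800.0, 800.0   # focal lengths in pixels (assumed)
CX, CY = 320.0, 240.0   # principal point (assumed)

def pixel_and_range_to_3d(u: float, v: float, distance_m: float) -> np.ndarray:
    """Back-project pixel (u, v) along its viewing ray, scaled to the measured range."""
    ray = np.array([(u - CX) / FX, (v - CY) / FY, 1.0])
    ray /= np.linalg.norm(ray)          # unit viewing direction
    return distance_m * ray             # 3D point in the camera frame (metres)

if __name__ == "__main__":
    # Detected weld-point pixel at (400, 260), laser range 0.35 m.
    print(np.round(pixel_and_range_to_3d(400, 260, 0.35), 4))
```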
In the embodiment of the invention, the neural network computing unit identifies the images of the welding object obtained by the vision sensor 200 and automatically extracts feature values to obtain the common characteristics of welding objects; when a new welding object needs to be welded, weld point identification is completed automatically and the weld points are displayed to a worker for confirmation and adjustment; during welding, the position of the end effector 100 can also be adjusted in real time according to the feature values, reducing manual intervention and labor cost.
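The real-time adjustment of the end effector during welding is not spelled out in the patent; as an illustrative sketch only, it could take the form of a simple proportional correction loop, where `detect_weld_point_offset` and `move_end_effector_by` are hypothetical stand-ins for the vision pipeline and the arm command interface:

```python
# Illustrative sketch only (not from the patent): a simple proportional correction
# loop that nudges the end effector toward the weld point detected in each frame.
from typing import Callable, Tuple

Offset = Tuple[float, float]  # lateral / vertical offset of the torch from the seam, in mm

def correction_loop(detect_weld_point_offset: Callable[[], Offset],
                    move_end_effector_by: Callable[[float, float], None],
                    steps: int, gain: float = 0.5) -> None:
    """Each cycle, move a fraction of the measured offset to keep the torch on the seam."""
    for _ in range(steps):
        dx, dy = detect_weld_point_offset()
        move_end_effector_by(gain * dx, gain * dy)

if __name__ == "__main__":
    offsets = iter([(2.0, -1.0), (1.0, -0.5), (0.5, -0.2)])
    correction_loop(lambda: next(offsets),
                    lambda dx, dy: print(f"correct by ({dx:+.2f}, {dy:+.2f}) mm"),
                    steps=3)
```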
The above is only a preferred embodiment of the present invention, and the invention is not limited to this embodiment; any solution that achieves the technical effects of the invention by substantially the same means shall fall within the protection scope of the invention.

Claims (5)

1. An intelligent welding robot based on vision technology, characterized in that the robot comprises:
An end effector (100) for contacting a target object to weld;
a vision sensor (200) for recognizing a welding position of a target object through an image;
a six-axis mechanical arm (300) for moving the end effector (100) to a specified position;
a central control board for receiving control instructions and system settings;
a neural network computing unit for autonomously learning the welding image transmitted back by the vision sensor (200) so as to quickly identify a new welding environment;
wherein the end effector (100) is connected to the end joint of the six-axis mechanical arm (300), the vision sensor (200) is fixed on the end effector (100) and moves with it, the neural network computing unit is arranged on the central control board, and the end effector (100), the vision sensor (200) and the six-axis mechanical arm (300) are each electrically connected to the central control board.
2. The intelligent welding robot based on vision technology according to claim 1, further comprising a control panel (400) and a manual rocker (500) for manually controlling the six-axis mechanical arm (300), the control panel (400) and the manual rocker (500) being arranged independently beside the six-axis mechanical arm (300) and connected to the central control board by cables.
3. The intelligent welding robot based on vision technology according to claim 1, wherein the central control board comprises:
a point positioning module for periodically acquiring coordinate points of the six-axis mechanical arm (300) and plotting them into a motion path, the coordinate points being related to the deflection position of the motor at each joint of the six-axis mechanical arm (300);
a storage module for storing the motion path of the end effector (100) and the images of the vision sensor (200).
4. The intelligent welding robot based on vision technology according to claim 1, wherein the end effector (100) is a welding gun (110) with a straight head, the vision sensor (200) lies on the same line as the head of the welding gun (110), and the vision sensor (200) is oriented in the same direction as the head of the welding gun (110).
5. The intelligent welding robot based on vision technology according to claim 4, wherein the vision sensor (200) comprises a CMOS image sensor (210) and a laser ranging sensor (220), the CMOS image sensor (210) sends images of the welding environment to the neural network computing unit, the laser ranging sensor (220) sends distance information to the neural network computing unit, and the CMOS image sensor (210) and the laser ranging sensor (220) are oriented in the same direction.
CN201910933131.8A 2019-09-29 2019-09-29 intelligent welding robot based on vision technology Pending CN110732814A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910933131.8A CN110732814A (en) 2019-09-29 2019-09-29 intelligent welding robot based on vision technology

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910933131.8A CN110732814A (en) 2019-09-29 2019-09-29 intelligent welding robot based on vision technology

Publications (1)

Publication Number Publication Date
CN110732814A

Family

ID=69269752

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910933131.8A Pending CN110732814A (en) 2019-09-29 2019-09-29 intelligent welding robot based on vision technology

Country Status (1)

Country Link
CN (1) CN110732814A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112404658A (en) * 2020-10-19 2021-02-26 中国石油天然气集团有限公司 Remote control-based in-service pipeline arc 3D printing repair system and method

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106874914A (en) * 2017-01-12 2017-06-20 华南理工大学 A kind of industrial machinery arm visual spatial attention method based on depth convolutional neural networks
CN206632567U (en) * 2017-04-11 2017-11-14 池州职业技术学院 A kind of Intelligent welding robot device based on machine vision
CN107999955A (en) * 2017-12-29 2018-05-08 华南理工大学 A kind of six-shaft industrial robot line laser automatic tracking system and an automatic tracking method
CN108406091A (en) * 2017-02-09 2018-08-17 发那科株式会社 Laser Machining head and the laser-processing system for having filming apparatus
CN108544153A (en) * 2018-04-23 2018-09-18 哈尔滨阿尔特机器人技术有限公司 A kind of vision robot's system for Tube-sheet Welding
CN109175608A (en) * 2018-09-30 2019-01-11 华南理工大学 Weld bead feature points position On-line Measuring Method and seam track automatic measurement system
CN110039158A (en) * 2019-05-15 2019-07-23 上海应用技术大学 A kind of welder and welding procedure of flat wire armature
CN110135513A (en) * 2019-05-22 2019-08-16 广东工业大学 A kind of weld joint recognition method of the welding robot based on deep learning

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106874914A (en) * 2017-01-12 2017-06-20 华南理工大学 A kind of industrial machinery arm visual spatial attention method based on depth convolutional neural networks
CN108406091A (en) * 2017-02-09 2018-08-17 发那科株式会社 Laser Machining head and the laser-processing system for having filming apparatus
CN206632567U (en) * 2017-04-11 2017-11-14 池州职业技术学院 A kind of Intelligent welding robot device based on machine vision
CN107999955A (en) * 2017-12-29 2018-05-08 华南理工大学 A kind of six-shaft industrial robot line laser automatic tracking system and an automatic tracking method
CN108544153A (en) * 2018-04-23 2018-09-18 哈尔滨阿尔特机器人技术有限公司 A kind of vision robot's system for Tube-sheet Welding
CN109175608A (en) * 2018-09-30 2019-01-11 华南理工大学 Weld bead feature points position On-line Measuring Method and seam track automatic measurement system
CN110039158A (en) * 2019-05-15 2019-07-23 上海应用技术大学 A kind of welder and welding procedure of flat wire armature
CN110135513A (en) * 2019-05-22 2019-08-16 广东工业大学 A kind of weld joint recognition method of the welding robot based on deep learning

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
张明文 (Zhang Mingwen), Principles and Applications of Industrial Robots (DELTA Parallel Robots), Harbin Institute of Technology Press, 30 April 2018 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112404658A (en) * 2020-10-19 2021-02-26 中国石油天然气集团有限公司 Remote control-based in-service pipeline arc 3D printing repair system and method

Similar Documents

Publication Publication Date Title
US10905508B2 (en) Remote control robot system
CN111823223B (en) Robot arm grabbing control system and method based on intelligent stereoscopic vision
CN110170995B (en) Robot rapid teaching method based on stereoscopic vision
CN106853639A (en) A kind of battery of mobile phone automatic assembly system and its control method
CN106584093A (en) Self-assembly system and method for industrial robots
US20090234502A1 (en) Apparatus for determining pickup pose of robot arm with camera
WO2018043525A1 (en) Robot system, robot system control device, and robot system control method
KR102403716B1 (en) robot system
CN113954085A (en) Intelligent positioning and control method of welding robot based on binocular vision and linear laser sensing data fusion
CN111421528A (en) Industrial robot's automated control system
CN110170996B (en) Robot rapid teaching system based on stereoscopic vision
US10175683B2 (en) Teaching data preparation device and teaching data preparation method for articulated robot
CN111975200A (en) Intelligent welding method and intelligent welding system based on visual teaching technology
CN113333998A (en) Automatic welding system and method based on cooperative robot
CN110732814A (en) intelligent welding robot based on vision technology
US20230286143A1 (en) Robot control in working space
CN110605720A (en) Industrial robot vision system and teaching method thereof
WO2020032263A1 (en) Robot system
CN107363831B (en) Teleoperation robot control system and method based on vision
JP7190552B1 (en) Robot teaching system
CN111899629B (en) Flexible robot teaching system and method
JP6832408B1 (en) Production system
CN111015675A (en) Typical robot vision teaching system
CN205571674U (en) Soldering joint tracking vision sensory control system based on structured light
Yong RESEARCH ON PATH RECOGNITION OF WELDING MANIPULATOR BASED ON AUTOMATIC CONTROL ALGORITHM.

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200131