CN114055438A - Visual guide workpiece follow-up sorting system and method - Google Patents

Visual guide workpiece follow-up sorting system and method Download PDF

Info

Publication number
CN114055438A
CN114055438A
Authority
CN
China
Prior art keywords
workpiece
grabbing
encoder
coordinate
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210046020.7A
Other languages
Chinese (zh)
Inventor
冀春锟
朱本旺
吴云
方彬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan Shibite Robot Co Ltd
Original Assignee
Hunan Shibite Robot Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan Shibite Robot Co Ltd filed Critical Hunan Shibite Robot Co Ltd
Priority to CN202210046020.7A priority Critical patent/CN114055438A/en
Publication of CN114055438A publication Critical patent/CN114055438A/en
Pending legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/0093Programme-controlled manipulators co-operating with conveyor means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B23MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23QDETAILS, COMPONENTS, OR ACCESSORIES FOR MACHINE TOOLS, e.g. ARRANGEMENTS FOR COPYING OR CONTROLLING; MACHINE TOOLS IN GENERAL CHARACTERISED BY THE CONSTRUCTION OF PARTICULAR DETAILS OR COMPONENTS; COMBINATIONS OR ASSOCIATIONS OF METAL-WORKING MACHINES, NOT DIRECTED TO A PARTICULAR RESULT
    • B23Q7/00Arrangements for handling work specially combined with or arranged in, or specially adapted for use in connection with, machine tools, e.g. for conveying, loading, positioning, discharging, sorting
    • B23Q7/04Arrangements for handling work specially combined with or arranged in, or specially adapted for use in connection with, machine tools, e.g. for conveying, loading, positioning, discharging, sorting by means of grippers
    • B23Q7/048Multiple gripper units
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/0084Programme-controlled manipulators comprising a plurality of manipulators
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture
    • B25J9/161Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • B25J9/1682Dual arm manipulator; Coordination of several manipulators
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to the technical field of vision-based intelligent manufacturing, and discloses a vision-guided workpiece follow-up sorting system and method that aim to improve production efficiency and prolong equipment service life. The method comprises: after a workpiece is detected entering a first area, determining the time at which it will enter a visual darkroom; after the workpiece enters the visual darkroom, acquiring an image of the workpiece to identify its type, calculating its grabbing coordinate in the acquired-image coordinate system, and recording the second encoder parameter of the encoder at the image acquisition time; and, when the workpiece enters a robot's grabbing area, having that robot calculate the workpiece's flow coordinate from the encoder's real-time dynamic data, the second encoder parameter and the grabbing coordinate, control its own mechanical arm to follow the workpiece according to the flow coordinate, and grab the workpiece once the arm's coordinate is aligned with the flow coordinate.

Description

Visual guide workpiece follow-up sorting system and method
Technical Field
The invention relates to the technical field of intelligent manufacturing based on vision, in particular to a vision-guided workpiece follow-up sorting system and method.
Background
With the rapid development and maturation of technologies such as industrial software and machine vision in recent years, industry has gradually entered the intelligent era: manual labour is being replaced step by step, machine vision has become the new favourite of intelligent production lines, and flexible manufacturing has become a core business strategy. In production and manufacturing, the assembly line is an indispensable link, and sorting on the conveyor line in particular is an important step that directly affects the yield of the whole production line and even of the enterprise.
At present, most sorting on conveyor lines is still done manually; a small number of advanced operations instead stop the conveyor line, perform vision recognition, and then have a robot sort the parts. This approach has the following drawbacks:
(1) The camera must be installed within the robot's arm reach, so its mounting position is restricted.
(2) The conveyor line starts and stops frequently, placing a heavy load on the equipment and wearing it easily; the production takt drops and production efficiency suffers.
(3) The conveyor cannot stop with high accuracy, so parts may lie outside the camera's field of view and go unrecognized, frequently requiring manual intervention.
Disclosure of Invention
The invention aims to disclose a vision-guided workpiece follow-up sorting system and method that improve production efficiency and prolong equipment service life.
In order to achieve the above object, the present invention discloses a vision-guided workpiece follow-up sorting method, comprising:
step S1, detecting whether the workpiece enters a first area of the conveyor belt;
step S2, after the workpiece is detected to enter the first area, determining a first encoder parameter of the workpiece entering the first area through an encoder linked with the conveyor belt; determining the time for the workpiece to enter the visual darkroom according to the first encoder parameter, the lengths of the first area and the visual darkroom and the real-time dynamic data of the encoder;
step S3, after the workpiece enters the visual darkroom, acquiring an image of the workpiece to identify its type, calculating the grabbing coordinate of the workpiece in the acquired-image coordinate system, and recording the second encoder parameter corresponding to the encoder at the image acquisition time;
step S4, determining whether the workpiece enters the grabbing area corresponding to the robot or not according to the second encoder parameters, the length between the vision darkroom and the grabbing area corresponding to the robot and the real-time dynamic data of the encoder;
step S5, when the workpiece enters the grabbing area corresponding to the robot, the corresponding robot calculates the flow coordinate of the workpiece through the real-time dynamic data of the encoder, the second encoder parameter and the grabbing coordinate, controls the mechanical arm of the corresponding robot to carry out follow-up operation according to the flow coordinate, and carries out workpiece grabbing operation after the coordinate of the mechanical arm is aligned with the flow coordinate.
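The encoder-based gating in steps S2 and S4 can be sketched as follows. This is an illustrative Python sketch only; the patent gives no formulas, and all names and units (pulses per millimetre, lengths in millimetres) are assumptions:

```python
def entry_encoder_count(first_count, pulses_per_mm, travel_mm):
    """Encoder count at which a workpiece, first seen at `first_count`,
    has travelled `travel_mm` along the belt (e.g. the first-area length
    plus the distance to the darkroom, or the darkroom-to-grabbing-area
    length). Units are illustrative assumptions."""
    return first_count + travel_mm * pulses_per_mm

def has_arrived(current_count, target_count):
    """Compare the encoder's real-time dynamic data against the
    precomputed target count."""
    return current_count >= target_count
```

In this reading, "determining the time for the workpiece to enter the visual darkroom" reduces to precomputing a target encoder count and polling the live count against it.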
To achieve the above object, the present invention further discloses a vision-guided workpiece follow-up sorting system, comprising:
a sensor for detecting whether a workpiece enters a first region of the conveyor belt;
a Programmable Logic Controller (PLC) configured to determine, by an encoder linked with the conveyor belt, a first encoder parameter when the workpiece enters the first area after detecting that the workpiece enters the first area; determining the time for the workpiece to enter the visual darkroom according to the first encoder parameter, the lengths of the first area and the visual darkroom and the real-time dynamic data of the encoder;
the vision processing module is used for carrying out image acquisition on the workpiece after the workpiece enters the vision darkroom so as to identify the type of the workpiece, calculating a grabbing coordinate of the workpiece in an acquired image coordinate system, and recording a second encoder parameter corresponding to the encoder at the image acquisition time;
the PLC is also used for determining whether the workpiece has entered the grabbing area corresponding to a robot according to the second encoder parameter, the length between the visual darkroom and that grabbing area, and the real-time dynamic data of the encoder; and for forwarding the workpiece's type, grabbing coordinate and second encoder parameter, acquired from the vision processing module, together with the encoder's real-time dynamic data, to the corresponding robot;
and the robot is used for calculating, after the workpiece enters its grabbing area, the flow coordinate of the workpiece from the real-time dynamic data of the encoder, the second encoder parameter and the grabbing coordinate, controlling its own mechanical arm to carry out the follow-up operation according to the flow coordinate, and grabbing the workpiece once the arm's coordinate is aligned with the flow coordinate.
Preferably, the method and system of this embodiment may further include: dividing the grabbing area into a corresponding number of grabbing sub-areas for at least two robots; if the robot of one grabbing sub-area fails to grab the workpiece in follow-up mode, the robot of the next grabbing sub-area is notified to perform a supplementary grab, the notification carrying the workpiece's type, grabbing coordinate and second encoder parameter.
The invention has the following beneficial effects:
1. The camera's mounting position is flexible, and installing it in a visual darkroom gives better-controlled lighting; moreover, the time at which the workpiece enters the darkroom can be predicted, ensuring optimal image acquisition quality, better visual recognition and a higher sorting success rate.
2. While the vision system dynamically identifies the workpiece type, the encoder accurately records the conveyor belt's travel distance and speed, so the robot can grab in follow-up mode; this effectively improves production efficiency and extends equipment service life.
3. The tight logical association of the related data pinpoints the workpiece's exact flow coordinate on the conveyor belt, ensuring highly reliable robot grabbing.
The present invention will be described in further detail below with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the invention and, together with the description, serve to explain the invention and not to limit the invention. In the drawings:
fig. 1 is a schematic diagram of a vision-guided workpiece follow-up sorting system according to an embodiment of the present invention.
Detailed Description
The embodiments of the invention will be described in detail below with reference to the drawings, but the invention can be implemented in many different ways as defined and covered by the claims.
Example 1
The present embodiment first discloses a vision-guided workpiece follow-up sorting system, as shown in fig. 1, including: sensor 1, conveyor belt 2, encoder 3, visual darkroom module 4 and robot 5. The sensor is in communication connection with the PLC, and the PLC is also in communication connection with the encoder, the visual darkroom module and the robot. Optionally, the encoder in this embodiment may be mounted with a friction wheel pressed against the conveyor belt, working on the following principle: the wheel drives the encoder shaft, which outputs a fixed number of pulses per revolution; from the counted pulses the number of revolutions is obtained, and from that the travel of the conveyor belt can be back-calculated.
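The pulse-to-travel back-calculation described above can be sketched as follows. This is an illustrative Python sketch; the pulses-per-revolution and wheel-diameter values are assumptions, not figures from the patent:

```python
import math

def belt_travel_mm(pulse_count, pulses_per_rev, wheel_diameter_mm):
    """Back-calculate belt travel from friction-wheel encoder pulses:
    pulses -> wheel revolutions -> circumference distance rolled.
    Assumes the friction wheel rolls on the belt without slipping."""
    revolutions = pulse_count / pulses_per_rev
    return revolutions * math.pi * wheel_diameter_mm
```

For example, a 1024-pulse-per-revolution encoder on a 50 mm wheel reports one wheel circumference (about 157 mm of belt travel) per 1024 pulses.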
Wherein the sensor is used for detecting whether the workpiece enters the first area of the conveyor belt.
The vision processing module is used for acquiring an image of the workpiece after it enters the visual darkroom so as to identify its type, calculating the grabbing coordinate of the workpiece in the acquired-image coordinate system, and recording the second encoder parameter corresponding to the encoder at the image acquisition time. The grabbing coordinate mainly locates the specific area the workpiece covers on the conveyor belt.
The PLC is used, on the one hand, for determining, after the workpiece is detected entering the first area, the first encoder parameter at that moment via the encoder linked with the conveyor belt, and for determining the time at which the workpiece will enter the visual darkroom from the first encoder parameter, the lengths of the first area and the visual darkroom, and the encoder's real-time dynamic data; on the other hand, it determines whether the workpiece has entered a robot's grabbing area from the second encoder parameter, the length between the visual darkroom and that grabbing area, and the encoder's real-time dynamic data, and forwards the workpiece's type, grabbing coordinate and second encoder parameter, acquired from the vision processing module, to the corresponding robot in real time.
In this embodiment, the robot is configured to calculate, after the workpiece enters its grabbing area, the flow coordinate of the workpiece from the real-time dynamic data of the encoder, the second encoder parameter and the grabbing coordinate, to control its own mechanical arm to carry out the follow-up operation according to the flow coordinate, and to grab the workpiece once the arm's coordinate is aligned with the flow coordinate.
The "flow coordinate" is a dynamically changing coordinate corresponding to different times after the capture coordinate in a static state at the image acquisition time moves along with the conveyor belt. It is obvious to those skilled in the art that when the coordinate system for determining the grasping coordinate is not consistent with the coordinate system of the robot itself, a corresponding coordinate conversion operation is performed. In other words, the alignment process is based on the alignment process in the same coordinate system.
As shown in fig. 1, in this embodiment two robots may divide the grabbing area into a corresponding number of grabbing sub-areas: if the robot of the previous grabbing sub-area fails to grab the workpiece in follow-up mode, the robot of the next grabbing sub-area is notified to perform a supplementary grab, the notification carrying the workpiece's type, grabbing coordinate and second encoder parameter. If a grab succeeds, the corresponding robot returns a grab-success command to the PLC.
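The hand-off between grabbing sub-areas can be sketched as follows. This is illustrative Python only; the `Robot` stub, its attributes and the job payload shape are hypothetical stand-ins for the real robot controllers and PLC messages:

```python
class Robot:
    """Minimal stand-in for one grabbing sub-area's robot."""
    def __init__(self, name, reachable):
        self.name = name
        self.reachable = reachable  # stand-in for a real follow-up grab attempt

    def try_grab(self, job):
        return self.reachable

def dispatch(job, robots_in_belt_order):
    """Offer the job to each sub-area's robot in belt order; on a
    follow-up grab failure, the next robot receives the same payload
    (type, grabbing coordinate, second encoder parameter)."""
    for robot in robots_in_belt_order:
        if robot.try_grab(job):
            return robot.name  # grab success, reported back to the PLC
    return None  # workpiece passed the last sub-area ungrabbed

job = {"type": "bracket", "grab_xy": (120.0, 40.0), "second_count": 2000}
```

The design point is that the supplementary-grab notification carries everything needed to recompute the flow coordinate, so the next robot needs no new image acquisition.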
Example 2
Corresponding to the system shown in fig. 1, the present embodiment discloses a vision-guided workpiece follow-up sorting method, comprising the following steps:
step S1, detecting whether the workpiece enters the first area of the conveyor belt.
Step S2, after the workpiece is detected to enter the first area, determining a first encoder parameter of the workpiece entering the first area through an encoder linked with the conveyor belt; and determining the time for the workpiece to enter the visual darkroom according to the first encoder parameter, the lengths of the first area and the visual darkroom and the real-time dynamic data of the encoder.
And step S3, after the workpiece enters the visual darkroom, acquiring an image of the workpiece to identify its type, calculating the grabbing coordinate of the workpiece in the acquired-image coordinate system, and recording the second encoder parameter corresponding to the encoder at the image acquisition time.
And step S4, determining whether the workpiece enters the grabbing area corresponding to the robot or not according to the second encoder parameters, the length between the vision darkroom and the grabbing area corresponding to the robot and the real-time dynamic data of the encoder.
And step S5, when the workpiece enters the corresponding grabbing area of the robot, the corresponding robot calculates the flow coordinate of the workpiece through the real-time dynamic data of the encoder, the second encoder parameter and the grabbing coordinate, controls the mechanical arm of the corresponding robot to carry out follow-up operation according to the flow coordinate, and carries out workpiece grabbing operation after the coordinate of the mechanical arm is aligned with the flow coordinate.
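The align-then-grab logic of step S5 can be sketched as a simple pursuit loop. This is an illustrative Python sketch under assumptions not in the patent: real servo control, kinematics and timing are abstracted away, `flow_xy` is a hypothetical callable returning the current flow coordinate, and the step size and tolerance are invented values:

```python
def follow_and_grab(arm_xy, flow_xy, step_mm=5.0, tol_mm=0.5, max_steps=1000):
    """Step the arm toward the (possibly moving) flow coordinate each
    cycle; once within tolerance, return the aligned grab point. Returns
    None if the arm never catches up (e.g. belt faster than the arm)."""
    x, y = arm_xy
    for _ in range(max_steps):
        tx, ty = flow_xy()  # re-read the flow coordinate every cycle
        dx, dy = tx - x, ty - y
        dist = (dx * dx + dy * dy) ** 0.5
        if dist <= tol_mm:
            return (x, y)  # aligned: perform the grab here
        scale = min(step_mm, dist) / dist
        x, y = x + dx * scale, y + dy * scale
    return None
```

Because the target is re-read every cycle, the loop naturally tracks the workpiece as the belt carries it, matching the "follow-up operation" described in the step.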
Similarly, the method of this embodiment may also divide the grabbing area into a corresponding number of grabbing sub-areas for at least two robots; if the robot of the previous grabbing sub-area fails to grab the workpiece in follow-up mode, the robot of the next grabbing sub-area is notified to perform a supplementary grab, the notification carrying the workpiece's type, grabbing coordinate and second encoder parameter.
In summary, the vision-guided workpiece follow-up sorting system and method disclosed in the above embodiments of the invention have at least the following advantages:
1. The camera's mounting position is flexible, and installing it in a visual darkroom gives better-controlled lighting; moreover, the time at which the workpiece enters the darkroom can be predicted, ensuring optimal image acquisition quality, better visual recognition and a higher sorting success rate.
2. While the vision system dynamically identifies the workpiece type, the encoder accurately records the conveyor belt's travel distance and speed, so the robot can grab in follow-up mode; this effectively improves production efficiency and extends equipment service life.
3. The tight logical association of the related data pinpoints the workpiece's exact flow coordinate on the conveyor belt, ensuring highly reliable robot grabbing.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (4)

1. A vision-guided workpiece follow-up sorting method, comprising:
step S1, detecting whether the workpiece enters a first area of the conveyor belt;
step S2, after the workpiece is detected to enter the first area, determining a first encoder parameter of the workpiece entering the first area through an encoder linked with the conveyor belt; determining the time for the workpiece to enter the visual darkroom according to the first encoder parameter, the lengths of the first area and the visual darkroom and the real-time dynamic data of the encoder;
step S3, after the workpiece enters the visual darkroom, acquiring an image of the workpiece to identify the type of the workpiece, calculating a grabbing coordinate of the workpiece in an acquired-image coordinate system, and recording a second encoder parameter corresponding to the encoder at the image acquisition time;
step S4, determining whether the workpiece enters the grabbing area corresponding to the robot or not according to the second encoder parameters, the length between the vision darkroom and the grabbing area corresponding to the robot and the real-time dynamic data of the encoder;
step S5, when the workpiece enters the grabbing area corresponding to the robot, the corresponding robot calculates the flow coordinate of the workpiece through the real-time dynamic data of the encoder, the second encoder parameter and the grabbing coordinate, controls the mechanical arm of the corresponding robot to carry out follow-up operation according to the flow coordinate, and carries out workpiece grabbing operation after the coordinate of the mechanical arm is aligned with the flow coordinate.
2. The method of claim 1, further comprising:
dividing the grabbing area into a corresponding number of grabbing subareas by at least two robots;
and if the robot of the previous grabbing sub-area fails to grab the workpiece in follow-up mode, notifying the robot of the next grabbing sub-area to perform a supplementary grab, the corresponding notification carrying the type, grabbing coordinate and second encoder parameter of the workpiece.
3. A vision-guided workpiece follow-up sorting system, comprising:
a sensor for detecting whether a workpiece enters a first region of the conveyor belt;
the PLC is used for determining a first encoder parameter of the workpiece entering the first area through an encoder linked with the conveyor belt after the workpiece entering the first area is detected; determining the time for the workpiece to enter the visual darkroom according to the first encoder parameter, the lengths of the first area and the visual darkroom and the real-time dynamic data of the encoder;
the vision processing module is used for carrying out image acquisition on the workpiece after the workpiece enters the vision darkroom so as to identify the type of the workpiece, calculating a grabbing coordinate of the workpiece in an acquired image coordinate system, and recording a second encoder parameter corresponding to the encoder at the image acquisition time;
the PLC is also used for determining whether the workpiece enters the grabbing area corresponding to the robot according to the second encoder parameter, the length between the visual darkroom and the grabbing area corresponding to the robot, and the real-time dynamic data of the encoder; and for forwarding the type, the grabbing coordinate and the second encoder parameter of the workpiece, acquired from the vision processing module, together with the real-time dynamic data of the encoder, to the corresponding robot;
and the robot is used for calculating the flow coordinate of the workpiece from the real-time dynamic data of the encoder, the second encoder parameter and the grabbing coordinate after the workpiece enters the corresponding grabbing area, controlling its own mechanical arm to carry out the follow-up operation according to the flow coordinate, and carrying out the workpiece grabbing operation after the arm's coordinate is aligned with the flow coordinate.
4. The system of claim 3, wherein the grabbing area is divided into a corresponding number of grabbing sub-areas by at least two robots, so that if the robot of the previous grabbing sub-area fails to grab the workpiece in follow-up mode, the robot of the next grabbing sub-area is notified to perform a supplementary grab, with the type, grabbing coordinate and second encoder parameter of the workpiece carried in the corresponding notification.
CN202210046020.7A 2022-01-17 2022-01-17 Visual guide workpiece follow-up sorting system and method Pending CN114055438A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210046020.7A CN114055438A (en) 2022-01-17 2022-01-17 Visual guide workpiece follow-up sorting system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210046020.7A CN114055438A (en) 2022-01-17 2022-01-17 Visual guide workpiece follow-up sorting system and method

Publications (1)

Publication Number Publication Date
CN114055438A true CN114055438A (en) 2022-02-18

Family

ID=80230962

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210046020.7A Pending CN114055438A (en) 2022-01-17 2022-01-17 Visual guide workpiece follow-up sorting system and method

Country Status (1)

Country Link
CN (1) CN114055438A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114619429A (en) * 2022-04-24 2022-06-14 广东天太机器人有限公司 Mechanical arm control method based on recognition template
CN114782367A (en) * 2022-04-24 2022-07-22 广东天太机器人有限公司 Control system and method for mechanical arm
CN114986051A (en) * 2022-06-14 2022-09-02 广东天太机器人有限公司 Industrial robot welding control system and method based on template recognition

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN206132660U (en) * 2016-10-21 2017-04-26 泉州装备制造研究所 Small dimension ceramic tile vision sorting device
CN107618030A (en) * 2016-07-16 2018-01-23 深圳市得意自动化科技有限公司 The Robotic Dynamic tracking grasping means of view-based access control model and system
CN108789414A (en) * 2018-07-17 2018-11-13 五邑大学 Intelligent machine arm system based on three-dimensional machine vision and its control method
CN110743818A (en) * 2019-11-29 2020-02-04 苏州嘉诺环境工程有限公司 Garbage sorting system and garbage sorting method based on vision and deep learning
CN110841927A (en) * 2019-11-15 2020-02-28 上海威士顿信息技术股份有限公司 Sorting device, system and method and electronic equipment
CN113878576A (en) * 2021-09-28 2022-01-04 浙江大学 Robot vision sorting process programming method

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107618030A (en) * 2016-07-16 2018-01-23 深圳市得意自动化科技有限公司 The Robotic Dynamic tracking grasping means of view-based access control model and system
CN206132660U (en) * 2016-10-21 2017-04-26 泉州装备制造研究所 Small dimension ceramic tile vision sorting device
CN108789414A (en) * 2018-07-17 2018-11-13 五邑大学 Intelligent machine arm system based on three-dimensional machine vision and its control method
CN110841927A (en) * 2019-11-15 2020-02-28 上海威士顿信息技术股份有限公司 Sorting device, system and method and electronic equipment
CN110743818A (en) * 2019-11-29 2020-02-04 苏州嘉诺环境工程有限公司 Garbage sorting system and garbage sorting method based on vision and deep learning
CN113878576A (en) * 2021-09-28 2022-01-04 浙江大学 Robot vision sorting process programming method

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114619429A (en) * 2022-04-24 2022-06-14 广东天太机器人有限公司 Mechanical arm control method based on recognition template
CN114782367A (en) * 2022-04-24 2022-07-22 广东天太机器人有限公司 Control system and method for mechanical arm
CN114986051A (en) * 2022-06-14 2022-09-02 广东天太机器人有限公司 Industrial robot welding control system and method based on template recognition

Similar Documents

Publication Publication Date Title
CN114055438A (en) Visual guide workpiece follow-up sorting system and method
CN204777053U (en) Automatic logistics system of flexible manufacturing
CN205437563U (en) Portable welding robot
CN102730369B (en) Flexible working section of assembly line and working method thereof
CN105397812B (en) Mobile robot and the method that product is changed based on mobile robot
EP0104270A1 (en) Robot control apparatus
CN108788630A (en) Petrol engine wheel hub full automatic processing device and process
JP2008178919A (en) Working system of sheet metal
CN111285112A (en) Artificial intelligence feeding equipment of wheel production line rolling machine
CN110005659B (en) Real-time monitoring method for air cylinder
CN105965113A (en) Wire electrical discharge machine
CN113916285A (en) Visual detection device and detection process for washing machine roller fastening screw
CN208304351U (en) A kind of automatic package system of four axis robots
CN109597382A (en) Coal machine and manufactures intelligence blanking production system
CN107414450B (en) Visual guidance-based multi-joint robot assembly machine
CN103692351A (en) Numerical control system of sanding equipment and part polishing method based on system
CN115319762B (en) Robot control method for production line, production line and numerical control machine tool
CN106966181A (en) A kind of punching press metal plate factory thin flat plate material is said good-bye code fetch unit
CN116786429A (en) Intelligent sorting and distribution system for warehouse goods
CN218087727U (en) Moving and identifying system for tool
JP2004174500A (en) Bending device and sheeting system
CN213280552U (en) Full-automatic mechanical arm scanning device for SMT (surface mounting technology)
CN212665421U (en) Flexible manufacturing production line for pins
US20030231317A1 (en) System and device for detecting and separating out of position objects during manufacturing process
CN210647950U (en) Automatic punching robot production line

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20220218