CN108500979B - Robot grabbing method and system based on camera communication connection - Google Patents

Robot grabbing method and system based on camera communication connection

Info

Publication number
CN108500979B
CN108500979B (application CN201810197156.1A)
Authority
CN
China
Prior art keywords
robot
grabbing
image
nearby
main control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810197156.1A
Other languages
Chinese (zh)
Other versions
CN108500979A (en)
Inventor
彭惠平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangling Motors Corp Ltd
Original Assignee
彭惠平
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 彭惠平 filed Critical 彭惠平
Priority to CN201810197156.1A priority Critical patent/CN108500979B/en
Publication of CN108500979A publication Critical patent/CN108500979A/en
Application granted granted Critical
Publication of CN108500979B publication Critical patent/CN108500979B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture
    • B25J9/161Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1669Programme controls characterised by programming, planning systems for manipulators characterised by special application, e.g. multi-arm co-operation, assembly, grasping
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Automation & Control Theory (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a robot grabbing method and system based on camera communication connection, and relates to the technical field of robots. The method comprises the following steps: step one, a main control computer establishes a temporary matching relation between a camera and a nearby robot; step two, the robot grabs the target workpiece; step three, the camera acquires the grabbing attitude image of the robot and sends it to the main control computer; step four, the main control computer matches the grabbing attitude image with a template grabbing image; step five, the main control computer calculates the position offset and the difference value between the grabbing attitude image and the template grabbing image to generate a matching degree; step six, the robot adjusts its grabbing attitude according to the matching degree; and step seven, the robot transfers the target workpiece. By establishing communication between the nearby robot and the camera and cooperating with the database, the control time is shortened and the grabbing efficiency is improved; real-time matching between the grabbing attitude image and the template grabbing image guarantees the accuracy of the grabbing attitude and at the same time protects the target workpiece.

Description

Robot grabbing method and system based on camera communication connection
Technical Field
The invention relates to the technical field of robots, in particular to a robot grabbing method and system based on camera communication connection.
Background
With continuing technological progress, the kinds of processing that require human participation are gradually decreasing. Robots are widely involved in factory processing systems, and various types of robots are designed for different processing environments, processing modes and other processing requirements.
Chinese patent publication No. CN107009391A proposes a robot grabbing positioning device, a robot grabbing system and a grabbing method, comprising a camera and a camera moving mechanism. The camera moving mechanism is arranged corresponding to the material distribution table and carries the camera; it drives the camera to move along a preset track relative to the material distribution table, so that the camera collects images of a plurality of preset positions on the material distribution table one by one.
The robot in that patent is only suitable for fixed-point positions and can only grab and clamp specific workpieces; its application range is small, so robots of other models are difficult to utilize effectively. In actual production, workpieces placed at arbitrary positions in a factory need to be matched and transferred by a robot. A prior-art robot can hardly identify the shapes of different workpieces accurately or apply the optimal grabbing mode, so the grabbing efficiency is low and the production efficiency of the factory suffers.
Disclosure of Invention
Aiming at the problems that the robot in the prior art is difficult to accurately recognize the shapes of different workpieces and adopts a reasonable grabbing mode, the invention aims to provide a robot grabbing method based on camera communication connection, which has the advantages of real-time control, accurate grabbing and high grabbing efficiency.
In order to achieve the purpose, the invention provides the following technical scheme:
a robot grabbing method and system based on camera communication connection comprises the following steps:
firstly, a camera collects image information of a target workpiece and each robot and outputs the image information to a main control computer, the main control computer identifies the robot close to the target workpiece according to the image information, a temporary matching relation between the camera and the robot is established, and a grabbing signal is output to the robot;
responding to the grabbing signal by the robot and grabbing the target workpiece;
step three, shooting and collecting a grabbing attitude image of the robot to the target workpiece in real time by a camera to a main control computer;
step four, the main control computer receives the grabbing attitude image acquired in real time, calls the template grabbing image of the corresponding target workpiece from the database, and matches the grabbing attitude image with the template grabbing image by a shape-based template matching method;
step five, the main control computer calculates the position offset between the captured attitude image and the template captured image and the difference value of the two images to generate the matching degree;
step six, the main control computer judges the accuracy of the grabbing posture of the robot according to the matching degree, and when the grabbing posture of the robot is judged to be inaccurate, the robot adjusts the grabbing posture until the grabbing posture is matched;
and step seven, transferring the target workpiece by the robot.
According to the above technical scheme, the main control computer recognizes the robot nearest the target workpiece through the camera and establishes a communication connection with it. The robot automatically approaches the target workpiece after receiving the grabbing signal; because the appearance of the target workpiece is irregular, the robot needs to grab it at a specific angle and height in three-dimensional space. The camera and the main control computer monitor the grabbing action of the nearby robot, and when that action is inaccurate, the accurate grabbing attitude is fed back to the robot until the robot finally finishes accurately grabbing and transferring the target workpiece. The invention identifies the nearby robot and connects it to the camera, shortening the operation time and improving the grabbing efficiency. The database stores the best grabbing mode between each workpiece and each robot and provides the best grabbing information to the robot during grabbing; real-time matching between the grabbing attitude image and the template grabbing image guarantees an accurate grabbing attitude and at the same time protects the target workpiece.
Further, the step one also comprises the following steps:
A. establishing a control network by taking the main control computer as a core, wherein the camera and the plurality of robots are accessed into the control network and are in control connection with the main control computer;
B. the camera shoots images of a target workpiece and a plurality of robots, the distance between the position of the target workpiece and each robot is obtained through image processing, and robot information closest to the target workpiece is output to the main control computer according to the distance;
C. the main control machine receives the robot information closest to the target workpiece and outputs the grabbing signal to the robot;
D. and establishing communication connection between the IP of the nearby robot and the IP of the camera.
Through the technical scheme, the master control machine controls the camera and the multiple robots after establishing the control network, communication connection between the camera and the nearby robots is achieved in the control network, and after grabbing is finished, communication connection is correspondingly interrupted.
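As a rough sketch of steps A to C, the nearest-robot selection reduces to a distance comparison over the camera-derived positions. The function name, the coordinate layout and the use of robot codes as dictionary keys are illustrative assumptions, not taken from the patent:

```python
import math

def nearest_robot(target_xy, robot_positions):
    """Return the code of the robot closest to the target workpiece.

    target_xy: (x, y) workpiece position obtained by image processing.
    robot_positions: dict mapping each robot code to its (x, y) position.
    """
    tx, ty = target_xy
    # Screen out the robot with the minimum straight-line distance.
    return min(
        robot_positions,
        key=lambda code: math.hypot(robot_positions[code][0] - tx,
                                    robot_positions[code][1] - ty),
    )
```

The main control computer would then mark the returned code as the nearby robot and establish the temporary matching relation between it and the camera.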
Further, the second step further comprises the following steps:
A. the main control machine marks the position of the target workpiece and the current position of the nearby robot according to the image information;
B. the main control computer calculates the linear distance and the offset angle between the robot and a target workpiece by adopting an image calibration technology, and feeds the linear distance and the offset angle back to the robot;
C. the robot approaches to the target workpiece according to the linear distance and the offset angle;
D. steps A to C are repeated until the robot reaches the pre-grabbing position.
Through the technical scheme, the robot selects the optimal moving mode to automatically approach the target workpiece according to the moving distance and the moving angle, the preparation time before grabbing is shortened, and the grabbing efficiency of the target workpiece is improved.
Further, the fourth step further includes:
according to the image information of the target workpiece and the robot information acquired by the camera, the main control computer automatically identifies template capture images which are stored in a database in advance and correspond to the target workpiece and the robot.
By the technical scheme, template capture images of different workpieces by different robots are stored in the database, and the main control computer can quickly call the best template capture image after recognizing the target workpiece and the robot.
Aiming at the problems that the robot in the prior art is difficult to accurately recognize the shapes of different workpieces and adopts a reasonable grabbing mode, the invention aims to provide a robot grabbing system based on camera communication connection, which has the advantages of real-time control, accurate grabbing and high grabbing efficiency.
In order to achieve the purpose, the invention provides the following technical scheme:
a robot grasping system based on camera communication connection includes:
the workbench is used for placing a target workpiece;
the robot is movably arranged around the workbench and is used for grabbing the target workpiece to a set position in response to a grabbing signal of the main control machine;
the camera is arranged above the workbench and used for searching the nearby robot, shooting the grabbing gesture of the robot and outputting a grabbing gesture image;
a database for storing template capture images between the workpiece and the robot;
and the main control computer is used for receiving the grabbing attitude image acquired by the camera, calling the template grabbing image of the target workpiece from the database, calculating the position offset and the difference value between the grabbing attitude image and the template grabbing image to generate the matching degree, and generating control information according to the matching degree to control the robot's actions.
Through the above technical scheme, because there are a plurality of robots around the workpiece, the main control computer identifies the nearby robot and connects it to the camera, forming a temporary matching relation between the camera and the nearby robot. This avoids spending time controlling the other robots and improves work efficiency; the camera guides the robot in real time while it grabs the target workpiece, improving the grabbing precision and at the same time protecting the target workpiece.
Further, the main control computer comprises a processor for processing position offset and difference values between the captured attitude image and the template captured image, and a control panel for realizing robot management, monitoring and communication.
Through the technical scheme, the processor processes the data, and an operator can monitor and perform manual intervention on the whole grabbing process through the control panel.
Furthermore, the system also comprises a safety protection device which is used for detecting the running safety state of the robot, dynamically monitoring the motion process of the robot, judging whether the moving route has an obstacle or not, and avoiding when the obstacle exists.
Through the above technical scheme, in order to avoid collisions between the moving robot and temporary obstacles, the safety protection device monitors the robot's own operation and protects the robot.
Compared with the prior art, the invention has the following beneficial effects:
(1) aiming at the transfer of the target workpiece, the main control machine establishes a temporary matching relation between the robot and the camera nearby, so that the rapid transfer is realized, the control time is shortened, and the grabbing efficiency of the target workpiece is improved;
(2) after the matching degree between the captured posture image of the robot and the template captured image is obtained, the main control computer selectively intervenes and guides the robot to capture, so that the accuracy of the captured posture of the robot is guaranteed, and meanwhile, a target workpiece is protected;
(3) all template capture images between different workpieces and any robot are stored in the database, and the best capture information is provided for the robot in real time in the capture process.
Drawings
FIG. 1 is a schematic diagram of a robot grabbing method based on camera communication connection;
fig. 2 is a schematic diagram of a robot gripping system based on camera communication connection.
Reference numerals: 1. a robot; 2. a camera; 3. a main control machine; 4. a database; 5. a processor; 6. a control panel; 7. a safety protection device.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the following provides a detailed description of the present invention with reference to examples and drawings, but the embodiments of the present invention are not limited thereto.
As shown in fig. 1, a robot grabbing method based on camera communication connection includes the following steps:
Step one (S10), the camera 2 acquires image information of the target workpiece and each robot 1 and outputs it to the main control computer 3; the main control computer 3 identifies the robot 1 closest to the target workpiece from the image information, establishes a temporary matching relationship between the camera 2 and that robot 1, and outputs a grabbing signal to it. After the main control computer 3 receives a grabbing instruction from an operator, the camera 2 collects image information of the target workpiece and each robot 1 and outputs it to the main control computer 3. The main control computer 3 marks each robot 1 and the target workpiece through image recognition and identifies the type of the target workpiece and the information of each robot 1, which includes the robot 1 code. The distance between the target workpiece and each robot 1 is obtained through data processing, the robot 1 with the minimum distance is selected by comparison and marked as the nearby robot 1, and the main control computer 3 establishes a temporary matching relationship between this robot 1 and the camera 2. The camera 2, the main control computer 3 and the nearby robot 1 thus form a temporary matching relationship, which is released after grabbing is finished.
The first step also comprises the following steps:
A. a control network is established by taking a main control machine 3 as a core, and the camera 2 and the plurality of robots 1 are accessed into the control network and are in control connection with the main control machine 3. The main control machine 3, the camera 2 and the robots 1 are all brought into the same control network, the main control machine 3 can monitor and control the camera 2 and the robots 1 in the control network simultaneously and independently, and the control network is an Ethernet.
B. The camera 2 captures images of the target workpiece and the robots 1, obtains distances between the positions of the target workpiece and the robots 1 through image processing, and outputs robot information closest to the target workpiece to the main control computer 3 according to the distances. The main control machine 3 calculates the distances between all the robots 1 and the target workpiece, compares the distances, screens the robot 1 code with the minimum distance, marks the robot 1 as a nearby robot, and establishes a temporary matching relationship between the robot 1 and the camera 2 and the main control machine 3.
C. The main control machine 3 receives the robot information closest to the target workpiece and outputs a grabbing signal to the robot 1. After the robot 1 establishes a temporary matching relationship with the camera 2, the main control computer 3 sends a grabbing signal to the robot 1, moves to a target workpiece and carries out grabbing.
D. A communication connection is established between the IP of the nearby robot 1 and the IP of the camera 2. Within the Ethernet, a communication handshake protocol is adopted: the robot 1 connects to the camera 2 through its own IP, a control relationship is formed among the nearby robot 1, the camera 2 and the main control computer 3, and after grabbing is finished the connection is correspondingly released.
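A minimal sketch of the IP-to-IP link of step D, assuming a plain TCP connection stands in for the handshake the patent mentions; the camera address, port and function name are hypothetical:

```python
import socket

def open_temporary_link(camera_ip, camera_port, timeout=2.0):
    """Connect the nearby robot to the camera over TCP.

    The caller closes the returned socket once grabbing finishes,
    which mirrors releasing the temporary matching relation.
    """
    return socket.create_connection((camera_ip, camera_port), timeout=timeout)
```

Calling `sock.close()` after the transfer corresponds to the connection being "correspondingly interrupted" when grabbing ends.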
In step two (S20), the robot 1 performs gripping on the target workpiece in response to the gripping signal. The robot 1 receives the grabbing signal through a control system of the robot 1, the robot 1 calls a PLC program in the robot 1, and the robot 1 approaches to a target workpiece and carries out conventional grabbing.
The second step also comprises the following steps:
A. and the main control machine 3 marks the position of the target workpiece and the current position of the nearby robot 1 according to the image information. The main control machine 3 acquires the position of the target workpiece and the position of the nearby robot 1 from the image information of the target workpiece and each robot 1 acquired by the camera 2, establishes a mathematical model, marks the target workpiece as an origin coordinate point X (0, 0), and marks the nearby robot 1 as a movable coordinate point Y (a, b).
B. The main control computer 3 calculates the linear distance and the offset angle between the robot 1 and the target workpiece by an image calibration technique and feeds them back to the robot 1. The main control computer 3 calculates a distance parameter and an angle parameter between the origin coordinate point and the movable coordinate point: the distance parameter L satisfies L^2 = a^2 + b^2, the angle parameter theta satisfies tan theta = a/b, and the robot 1 approaches the target workpiece according to L and theta.
C. The robot 1 approaches to the target workpiece according to the linear distance and the offset angle. The robot 1 moves by referring to the linear distance and the offset angle fed back by the main control machine 3.
D. Steps A to C are repeated until the robot 1 reaches the pre-grabbing position. When the robot 1 meets an obstacle, it automatically stops and adjusts its position according to the situation; repeating steps A to C finally brings it to the pre-grabbing position, i.e. a position from which the robot 1 can grab the target workpiece.
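The mathematical model of steps A to C can be written out directly: with the workpiece at the origin X(0, 0) and the nearby robot at Y(a, b), the distance parameter satisfies L^2 = a^2 + b^2 and the angle parameter satisfies tan theta = a/b. A sketch (the function name is an assumption):

```python
import math

def approach_step(a, b):
    """Distance L and offset angle theta from the robot at Y(a, b)
    to the target workpiece at the origin X(0, 0)."""
    L = math.hypot(a, b)      # L^2 = a^2 + b^2
    theta = math.atan2(a, b)  # tan(theta) = a / b
    return L, theta
```

The robot would move by L along the direction given by theta, and the loop recomputes both values from fresh image information on every pass.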
And step three (S30), the camera 2 shoots and collects the grabbing attitude image of the robot 1 to the target workpiece in real time to the main control computer 3. The grabbing state of the robot 1 is monitored in real time by the camera 2.
And step four (S40), the main control computer 3 receives the captured attitude image acquired in real time, calls the template captured image corresponding to the target workpiece from the database 4, and matches the captured attitude image with the template captured image by adopting a shape-based template matching method. The camera 2 shoots 24 frames of images per second, the main control computer 3 calls the template grabbing images of the robot 1 and the target workpiece in the database 4 after acquiring the codes of the nearby robot 1, and grabbing attitude images generated in each grabbing state of the robot 1 are matched with the template grabbing images.
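The patent relies on shape-based template matching, as found in industrial vision libraries; as a loose stand-in, the sketch below scores the grabbing attitude image against the template at every offset with a sum of squared differences and returns the best position. Plain 2-D lists of grey values replace real image buffers, and the function name is an assumption:

```python
def best_match_offset(image, template):
    """Exhaustive SSD template match.

    Returns (score, row_offset, col_offset) for the placement of the
    template inside the image with the lowest sum of squared differences;
    a score of 0 means a pixel-perfect match at that offset.
    """
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best = None
    for dy in range(ih - th + 1):
        for dx in range(iw - tw + 1):
            ssd = sum(
                (image[dy + y][dx + x] - template[y][x]) ** 2
                for y in range(th) for x in range(tw)
            )
            if best is None or ssd < best[0]:
                best = (ssd, dy, dx)
    return best
```

A production system would use an optimized, rotation-tolerant shape-matching routine rather than this brute-force scan, but the offset it reports plays the same role as the position offset of step five.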
The fourth step also comprises:
according to the image information of the target workpiece and the robot information acquired by the camera 2, the main control computer 3 automatically identifies template capture images corresponding to the target workpiece and the robot 1, which are stored in the database 4 in advance. Because the types of the robots 1 are various and the styles of the target workpiece are various, template capture images of different workpieces by different robots 1 are stored in the database 4, and the main control computer 3 can quickly call the best template capture image after recognizing the target workpiece and the robot 1.
And step five (S50), the main control computer 3 calculates the position offset between the grabbing attitude image and the template grabbing image and the difference value of the two images to generate the matching degree. While matching the grabbing attitude image of each grabbing state of the robot 1 with the template grabbing image, the main control computer 3 generates a matching degree that combines the offset and the difference value, calculated from the position offset between each frame of image and the template grabbing image and the grabbing-angle difference between them. Each frame generates an offset r and a difference value s, combined as f = (r + s)/2, and the final matching degree is h = (f1 + f2 + … + fn) − (f1 + f2 + … + f(n−1)). The lower the matching degree value, the more accurate the grabbing attitude of the robot 1.
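Read literally, the matching-degree formula h = (f1 + … + fn) − (f1 + … + f(n−1)) reduces to the latest frame's combined score fn. A direct transcription (names are assumptions):

```python
def matching_degree(frames):
    """Matching degree h from step five.

    frames: list of (r, s) pairs, one per captured frame, where r is the
    position offset and s the grabbing-angle difference value.
    """
    f = [(r + s) / 2 for r, s in frames]  # per-frame combined score
    # h = (f1 + ... + fn) - (f1 + ... + f(n-1)), i.e. the latest f.
    return sum(f) - sum(f[:-1])
```

Lower values of h indicate that the current grabbing attitude is closer to the template.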
And step six (S60), the main control computer 3 judges the accuracy of the grabbing attitude of the robot 1 according to the matching degree, and when the attitude is judged inaccurate, the robot 1 adjusts it until it matches. The main control computer 3 evaluates the matching degree value: as long as the value of h keeps decreasing, the main control computer 3 does not intervene in the robot 1; when the matching degree value fluctuates, the main control computer 3 feeds the matching information back to the robot 1 and the robot 1 adjusts its grabbing attitude.
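Step six's rule, i.e. no intervention while h keeps decreasing and intervention once it fluctuates, can be sketched as a monotonicity check over the sequence of matching-degree values (names are assumptions):

```python
def needs_intervention(h_values):
    """True when the matching degree stops strictly decreasing,
    meaning the main control computer should feed the matching
    information back to the robot."""
    return any(b >= a for a, b in zip(h_values, h_values[1:]))
```

In practice the check would run once per captured frame, on the h values generated so far during the current grab.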
Step seven (S70), the robot 1 transfers the target workpiece. After finishing grabbing, the robot 1 transfers the target workpiece to the set position.
As shown in fig. 2, a robot grabbing system based on camera communication connection comprises a workbench, a robot 1, a camera 2, a database 4, a main control computer 3 and a safety protection device 7. The workbench is used for placing the target workpiece. The robot 1 is movably arranged around the workbench and grabs the target workpiece to a set position in response to a grabbing signal of the main control computer 3. The camera 2 is arranged above the workbench, has a wide shooting range, and is used for finding the nearby robot 1, shooting its grabbing attitude and outputting a grabbing attitude image. The database 4 stores template grabbing images between each workpiece and each robot 1. The main control computer 3 receives the grabbing attitude image acquired by the camera 2, calls the template grabbing image of the target workpiece from the database 4, calculates the position offset and the difference value between the two images to generate the matching degree, and generates control information according to the matching degree to control the robot 1. Because there are a plurality of robots 1 around the workpiece, the main control computer 3 identifies the nearby robot 1 and connects it to the camera 2, forming a temporary matching relation between them; this avoids spending time controlling the other robots 1 and improves the efficiency of workpiece transfer. The camera 2 guides the robot 1 in real time while it grabs the target workpiece, improving the grabbing precision and at the same time protecting the target workpiece.
The main control machine 3 comprises a processor 5 for processing position offset and difference values between the captured attitude image and the template captured image, and a control panel 6 for realizing management, monitoring and communication of the robot 1. The control panel 6 includes a display screen, an emergency stop button, control keys, and the like. The captured attitude image and the template captured image are subjected to data processing through the processor 5, and an operator can monitor and perform manual intervention on the whole capturing process through the control panel 6.
In order to avoid collisions between the moving robot 1 and temporary obstacles, the safety protection device 7 comprises a distance measurement sensor installed on the robot 1, which detects the safety state of the robot 1 during operation, dynamically monitors its motion, judges whether there is an obstacle on the moving route, and triggers avoidance when an obstacle exists. The safety protection device 7 thus monitors the robot's operation and protects the robot 1.
The working principle and the beneficial effects of the invention are as follows:
the workpiece arranged on the workbench is a target workpiece, the main control machine 3 recognizes the robot 1 close to the target workpiece through the camera 2 and then can be in communication connection with the robot 1, the robot 1 is marked as the close robot 1, and a temporary matching relation is formed among the close robot 1, the camera 2 and the main control machine 3. The nearby robot 1 receives the grabbing signal and then automatically approaches to the target workpiece, and due to the irregularity of the appearance shape of the target workpiece, the robot 1 needs to grab the target workpiece at a specific angle and height in a three-dimensional space. At this time, the camera 2 and the main control computer 3 monitor the capturing action of the robot 1 nearby, and the captured posture image and the template captured image are compared by the processor 5 to be processed, so that the matching degree is generated. And judging whether the grabbing action of the robot 1 is accurate or not according to the matching degree. When the grabbing action of the robot 1 is inaccurate, the main control machine 3 feeds back the accurate grabbing posture to the robot 1, and when the grabbing action of the robot 1 is accurate, the main control machine 3 does not intervene on the robot 1 until the robot 1 finally finishes accurate grabbing and transferring of a target workpiece.
The invention identifies the nearby robot 1 and connects it to the camera 2, and establishes the temporary matching relation among the nearby robot 1, the camera 2 and the main control computer 3 for transferring the target workpiece, thereby shortening the operation time and improving the grabbing efficiency. The database 4 stores the best grabbing modes and template grabbing images between each workpiece and each robot 1, guaranteeing that the best grabbing information is provided to the robot 1 in real time during grabbing. The processor 5 processes the matching degree between the grabbing attitude image and the template grabbing image and selectively intervenes to guide the robot 1, which ensures that the grabbing attitude of the robot 1 is accurate; the optimal grabbing mode also protects the target workpiece.
The above description is only a preferred embodiment of the present invention; the protection scope of the present invention is not limited to the above embodiment, and all technical solutions within the idea of the present invention belong to its protection scope. It should be noted that modifications and refinements made by those skilled in the art without departing from the principle of the invention are also considered within its protection scope.

Claims (2)

1. A robot grabbing method based on camera communication connection is characterized by comprising the following steps:
firstly, a camera (2) collects image information of a target workpiece and each robot (1) and outputs the image information to a main control computer (3), the main control computer (3) identifies the robot (1) nearby the target workpiece according to the image information, a temporary matching relation between the camera (2) and the nearby robot (1) is established, and a grabbing signal is output to the nearby robot (1);
step two, the nearby robot (1) responds to the grabbing signal and grabs the target workpiece;
step three, the camera (2) shoots the grabbing attitude of the nearby robot (1) toward the target workpiece in real time and outputs the captured attitude images to the main control machine (3);
step four, the main control machine (3) receives the attitude images acquired in real time, retrieves the template captured image corresponding to the target workpiece from the database (4), and matches the captured attitude image against the template captured image using a shape-based template matching method; the camera (2) shoots 24 frames per second, and after acquiring the code of the nearby robot (1), the main control machine (3) retrieves from the database (4) the template captured image for the nearby robot (1) and the target workpiece, and matches the attitude image generated in each grabbing state of the nearby robot (1) against the template captured image;
step five, the main control machine (3) calculates the position offset and the pixel difference between the captured attitude image and the template captured image to generate the matching degree;
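One way to realize step five — a position offset plus a pixel-level difference combined into a single matching degree — is sketched below with plain NumPy. The brute-force search window, the scoring formula and the 0..1 normalization are illustrative assumptions, not the patent's method.

```python
import numpy as np

def matching_degree(pose_img, template_img, search=5):
    """Slide the template over a small window around the image centre.

    Returns (best_offset, degree): the (dx, dy) position offset of the
    best match and a 0..1 matching degree, where 1.0 is a perfect
    pixel-for-pixel match (scoring scheme assumed for illustration).
    """
    th, tw = template_img.shape
    ph, pw = pose_img.shape
    cy, cx = (ph - th) // 2, (pw - tw) // 2  # centred reference placement
    best, best_diff = (0, 0), float("inf")
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = cy + dy, cx + dx
            if y < 0 or x < 0 or y + th > ph or x + tw > pw:
                continue  # candidate placement falls outside the frame
            region = pose_img[y:y + th, x:x + tw].astype(float)
            # Mean absolute pixel difference, normalized to 0..1.
            diff = np.abs(region - template_img).mean() / 255.0
            if diff < best_diff:
                best_diff, best = diff, (dx, dy)
    return best, 1.0 - best_diff
```

A production system would more likely use a shape-based matcher (as claim 1 names) or normalized cross-correlation, but the offset-plus-difference structure is the same.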
step six, the main control machine (3) judges the accuracy of the grabbing gesture of the nearby robot (1) according to the matching degree, and when the grabbing gesture of the nearby robot (1) is judged to be inaccurate, the nearby robot (1) adjusts the grabbing gesture until the grabbing gesture is matched;
step seven, transferring the target workpiece by the nearby robot (1);
the first step also comprises the following steps:
A. a control network is established by taking the main control machine (3) as a core, and the camera (2) and each robot (1) are accessed into the control network and are in control connection with the main control machine (3);
B. the camera (2) shoots images of the target workpiece and each robot (1), the distance between the position of the target workpiece and each robot (1) is obtained through image processing, and robot information close to the target workpiece is output to the main control computer (3) according to the distance;
C. the main control machine (3) receives robot information close to the target workpiece and outputs the grabbing signal to the close robot (1);
D. establishing a communication connection between the IP address of the nearby robot (1) and the IP address of the camera (2);
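Sub-step B above reduces to a nearest-neighbour selection over positions extracted from the camera image. A minimal sketch, assuming robot positions are already available as image coordinates keyed by robot code (the data shape is an assumption):

```python
import math

def select_nearby_robot(workpiece_xy, robot_positions):
    """Step B: from per-robot image coordinates, return the code of the
    robot closest to the target workpiece.

    robot_positions: dict mapping robot code -> (x, y) coordinates
    obtained from the camera image (hypothetical format).
    """
    return min(
        robot_positions,
        key=lambda code: math.dist(workpiece_xy, robot_positions[code]),
    )
```

The selected code is what the main control machine would then use in step C to address the grabbing signal, and in step D to pair the robot's IP with the camera's IP.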
the second step also comprises the following steps:
A. the main control machine (3) marks the position of a target workpiece and the current position of the nearby robot (1) according to the image information;
B. the main control machine (3) calculates the linear distance and the offset angle between the nearby robot (1) and a target workpiece by adopting an image calibration technology, and feeds the linear distance and the offset angle back to the nearby robot (1);
C. the nearby robot (1) approaches to a target workpiece according to the linear distance and the offset angle;
D. repeating steps A to C until the nearby robot (1) reaches a pre-grabbing position;
detecting the running safety state of the nearby robot (1), dynamically monitoring the motion process of the nearby robot (1), judging whether an obstacle exists in a moving route, and performing avoidance processing when the obstacle exists;
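The measure-feed-back-move cycle of sub-steps A to D can be sketched as a closed loop. The `robot.move` interface, the `get_positions` callback and the tolerance are assumptions for illustration; the patent's image calibration step is abstracted into `get_positions`.

```python
import math

def approach_target(robot, get_positions, tol=1.0):
    """Loop sub-steps A-C until the pre-grabbing position is reached.

    get_positions: returns (robot_xy, target_xy) as measured from the
    calibrated camera image each cycle (hypothetical callback).
    """
    while True:
        robot_xy, target_xy = get_positions()        # A: mark both positions
        dx = target_xy[0] - robot_xy[0]
        dy = target_xy[1] - robot_xy[1]
        distance = math.hypot(dx, dy)                # B: linear distance
        angle = math.degrees(math.atan2(dy, dx))     # B: offset angle
        if distance <= tol:                          # D: pre-grab position reached
            return distance, angle
        robot.move(distance, angle)                  # C: approach the target
```

Re-measuring from the image on every cycle (rather than dead-reckoning from the first measurement) is what makes the loop robust to imperfect robot motion.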
the fourth step also comprises:
according to the image information of the target workpiece and the nearby robot information acquired by the camera (2), the main control machine (3) automatically identifies the template captured image, stored in advance in the database (4), corresponding to the target workpiece and the nearby robot (1).
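The template lookup in the step above is keyed by the pair (robot code, workpiece); an in-memory stand-in for database (4) makes the indexing explicit. The keys, codes and file names here are placeholders, not values from the patent.

```python
# Hypothetical stand-in for database (4): template grab images indexed
# by (robot code, workpiece identifier).
template_db = {
    ("R1", "bracket"): "bracket_R1_template.png",  # placeholder paths
    ("R2", "bracket"): "bracket_R2_template.png",
}

def fetch_template(robot_code, workpiece_id):
    """Step four's retrieval: the main control machine looks up the
    template grab image for this robot/workpiece pair.
    Returns None when no template has been stored for the pair."""
    return template_db.get((robot_code, workpiece_id))
```

Keying by the pair, rather than by workpiece alone, reflects the claim's point that the best grabbing mode differs per robot as well as per workpiece.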
2. A robot grabbing system based on camera communication connection, using the method of claim 1, comprising:
the workbench is used for placing a target workpiece;
the robot (1) is movably arranged around the workbench and is used for grabbing the target workpiece to a set position in response to a grabbing signal of the main control machine (3);
the camera (2) is arranged above the workbench and used for searching the nearby robot (1), shooting the grabbing gesture of the nearby robot (1) and outputting a grabbing gesture image;
a database (4) for storing template capture images between the workpiece and the nearby robot (1);
the main control machine (3) is used for receiving the captured attitude image acquired by the camera (2), retrieving from the database (4) the template captured image of the target workpiece, calculating the position offset and the difference value between the captured attitude image and the template captured image to generate a matching degree, and generating control information according to the matching degree to control the action of the nearby robot (1);
the main control machine (3) comprises a processor (5) for processing position offset and difference values between the captured attitude image and the template captured image, and a control panel (6) for realizing management, monitoring and communication of the robot (1) nearby; the camera (2) shoots 24 frames of images every second, the main control computer (3) calls the nearby robot (1) in the database (4) and the template grabbing image of the target workpiece after acquiring the nearby robot (1) code, and the grabbing attitude image generated in each grabbing state of the nearby robot (1) is matched with the template grabbing image;
the system also comprises a safety protection device (7) which is used for detecting the running safety state of the nearby robot (1), dynamically monitoring the motion process of the nearby robot (1), judging whether the moving route has an obstacle or not and carrying out avoidance processing when the obstacle exists.
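The safety protection device (7)'s route check could, under the assumption of an occupancy-grid representation of the workspace (a common choice, not specified by the patent), look like:

```python
def route_is_clear(route_points, occupancy):
    """Return True when no sampled point on the planned route falls on
    an occupied cell (True in `occupancy` marks an obstacle)."""
    return all(not occupancy[y][x] for x, y in route_points)

def monitor_motion(route_points, occupancy, robot):
    """Dynamic monitoring: trigger avoidance when the route is blocked.
    `robot.avoid()` is a hypothetical avoidance-handling interface."""
    if not route_is_clear(route_points, occupancy):
        robot.avoid()
        return False
    return True
```

In a running system this check would be re-evaluated every frame against the camera's latest view, so that obstacles entering the route mid-motion are also caught.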
CN201810197156.1A 2018-03-10 2018-03-10 Robot grabbing method and system based on camera communication connection Active CN108500979B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810197156.1A CN108500979B (en) 2018-03-10 2018-03-10 Robot grabbing method and system based on camera communication connection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810197156.1A CN108500979B (en) 2018-03-10 2018-03-10 Robot grabbing method and system based on camera communication connection

Publications (2)

Publication Number Publication Date
CN108500979A CN108500979A (en) 2018-09-07
CN108500979B true CN108500979B (en) 2020-10-27

Family

ID=63376434

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810197156.1A Active CN108500979B (en) 2018-03-10 2018-03-10 Robot grabbing method and system based on camera communication connection

Country Status (1)

Country Link
CN (1) CN108500979B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109309791B (en) * 2018-11-09 2021-01-29 珠海格力智能装备有限公司 Method and system for controlling camera to take pictures
CN110275532B (en) * 2019-06-21 2020-12-15 珠海格力智能装备有限公司 Robot control method and device and visual equipment control method and device
CN110303498B (en) * 2019-07-03 2020-10-09 广东博智林机器人有限公司 Carrying system, control method thereof and floor tile paving system
CN110450129B (en) * 2019-07-19 2022-06-24 五邑大学 Carrying advancing method applied to carrying robot and carrying robot thereof
CN111113410B (en) * 2019-12-05 2021-11-30 珠海格力电器股份有限公司 Robot motion control method for visual adaptive detection, computer readable storage medium and robot
CN111251302B (en) * 2020-03-10 2021-12-17 三一机器人科技有限公司 Workpiece grabbing method and device based on vision system
CN111531544B (en) * 2020-05-13 2023-05-23 深圳赛动生物自动化有限公司 Robot control system based on image geometric matching and control method thereof
CN113109365A (en) * 2021-04-16 2021-07-13 中国科学院自动化研究所 Defect detection system and method suitable for various workpieces
CN113298975B (en) * 2021-05-13 2022-05-17 南京艾尔普再生医学科技有限公司 Full-automatic quality control system

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4864363B2 (en) * 2005-07-07 2012-02-01 東芝機械株式会社 Handling device, working device, and program
JP5685027B2 (en) * 2010-09-07 2015-03-18 キヤノン株式会社 Information processing apparatus, object gripping system, robot system, information processing method, object gripping method, and program
CN106737665B (en) * 2016-11-30 2019-07-19 天津大学 Based on binocular vision and the matched mechanical arm control system of SIFT feature and implementation method
CN106774315B (en) * 2016-12-12 2020-12-01 深圳市智美达科技股份有限公司 Autonomous navigation method and device for robot
CN106527239B (en) * 2016-12-30 2018-12-21 华南智能机器人创新研究院 A kind of method and system of multirobot cooperative mode
CN106931945B (en) * 2017-03-10 2020-01-07 上海木木机器人技术有限公司 Robot navigation method and system
CN107414832A (en) * 2017-08-08 2017-12-01 华南理工大学 A kind of mobile mechanical arm crawl control system and method based on machine vision

Also Published As

Publication number Publication date
CN108500979A (en) 2018-09-07

Similar Documents

Publication Publication Date Title
CN108500979B (en) Robot grabbing method and system based on camera communication connection
CN107618030B (en) Robot dynamic tracking grabbing method and system based on vision
US10232512B2 (en) Coordinate system setting method, coordinate system setting apparatus, and robot system provided with coordinate system setting apparatus
US9604364B2 (en) Picking apparatus and picking method
JP2017042859A (en) Picking system, and processing device and method therefor and program
EP4019200A1 (en) Production system
CN106853639A (en) A kind of battery of mobile phone automatic assembly system and its control method
WO2015121767A1 (en) Automatic calibration method for robot systems using a vision sensor
EP4013578A1 (en) Robot-mounted moving device, system, and machine tool
CN106044570A (en) Steel coil lifting device automatic identification device and method adopting machine vision
WO2013111964A1 (en) Method for setting up vision-based structure
JP2003211381A (en) Robot control device
CN112775975A (en) Vision-guided multi-station robot welding deviation correcting device and method
CN110980276A (en) Method for implementing automatic casting blanking by three-dimensional vision in cooperation with robot
US11235463B2 (en) Robot system and robot control method for cooperative work with human
US20220241982A1 (en) Work robot and work system
CN114139857A (en) Workpiece finishing process correcting method, system, storage medium and device
CN107199423B (en) Non-programming teaching-free intelligent welding robot
CN110039520B (en) Teaching and processing system based on image contrast
CN114074331A (en) Disordered grabbing method based on vision and robot
CN114670189A (en) Storage medium, and method and system for generating control program of robot
CN111992895A (en) Intelligent marking system and method
CN113015604B (en) Robot control system and robot control method
KR20210014033A (en) Method for picking and place object
EP4144494A1 (en) Image processing method, image processing device, robot mounted-type conveyance device, and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TA01 Transfer of patent application right

Effective date of registration: 20201013

Address after: 330001 No. 509 North Avenue, Yingbin Road, Jiangxi, Nanchang

Applicant after: JIANGLING MOTORS Co.,Ltd.

Address before: 330000 room 2, unit 12, 2 Jinggangshan Avenue, Qingyun District, Nanchang, Jiangxi, 204

Applicant before: Peng Huiping
